
China Telecom develops MoE AI models exclusively on Huawei Chips

China Telecom has developed the country’s first artificial intelligence models built on the Mixture-of-Experts (MoE) architecture and trained entirely on advanced chips from Huawei Technologies.

According to a technical paper published last month by China Telecom’s Institute of Artificial Intelligence (TeleAI), the TeleChat3 models, ranging from 105 billion to trillions of parameters, were trained on Huawei’s Ascend 910B chips using its open-source deep-learning framework, MindSpore.

TeleAI researchers stated that the Huawei stack met the “severe demands” of training large-scale MoE models across a range of sizes. “These contributions collectively address critical bottlenecks in frontier-scale model training, establishing a mature full-stack solution tailored to domestic computational ecosystems,” they added.

China Telecom’s model lags behind OpenAI’s GPT-OSS-120B

MoE architecture distributes tasks across multiple specialized submodels, or “experts,” so models built with it can scale up capacity without a proportional increase in computational overhead; a minimal routing sketch follows below. MoE was popularized by DeepSeek’s V3 model, released in December 2024, and has since become the norm for leading-edge Chinese AI models.
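
To illustrate the idea, here is a minimal, generic sketch of top-k MoE routing in plain NumPy. It is not China Telecom’s TeleChat3 or MindSpore code; the expert count, layer widths, and function names are illustrative assumptions. It shows why capacity grows with the number of experts while per-token compute stays roughly fixed: each token only activates `top_k` of the `num_experts` feed-forward experts.

```python
# Illustrative top-k MoE routing sketch (assumed shapes/sizes; not TeleChat3 code).
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 64, 256      # hidden and expert feed-forward widths (assumed)
num_experts, top_k = 8, 2    # only top_k of num_experts run per token

# Each "expert" is a small feed-forward net: W1 (d_model x d_ff), W2 (d_ff x d_model).
experts = [(rng.standard_normal((d_model, d_ff)) * 0.02,
            rng.standard_normal((d_ff, d_model)) * 0.02)
           for _ in range(num_experts)]
router = rng.standard_normal((d_model, num_experts)) * 0.02  # gating weights

def moe_layer(x):
    """x: (tokens, d_model). Routes each token to its top_k experts."""
    logits = x @ router                              # (tokens, num_experts)
    top = np.argsort(-logits, axis=-1)[:, :top_k]    # indices of chosen experts
    # Softmax over only the selected experts' logits
    sel = np.take_along_axis(logits, top, axis=-1)
    gate = np.exp(sel - sel.max(-1, keepdims=True))
    gate /= gate.sum(-1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):                      # token loop kept explicit for clarity
        for k in range(top_k):
            w1, w2 = experts[top[t, k]]
            out[t] += gate[t, k] * (np.maximum(x[t] @ w1, 0.0) @ w2)
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 64): full model capacity, but only 2 of 8 experts ran per token
```

In a production system the routing and expert computation are batched and sharded across accelerators, and balancing load across experts is one of the “severe demands” the TeleAI paper refers to.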

MoE models, however, are considered more technically demanding to train and run. China Telecom’s self-reported performance scores for its TeleChat3 models showed that they lagged behind those of OpenAI’s GPT-OSS-120B, released in August, on several benchmarks.

Last week, Tsinghua University spin-off Zhipu AI said its new image-generation model was trained on Huawei chips, making it the first open-source model developed on an entirely domestic training stack to achieve industry-leading scores in image generation.

Beijing-based Zhipu AI was blacklisted by Washington last January. The US has placed several Chinese technology companies, including Huawei and iFlytek, on export-control blacklists, effectively barring them from receiving US-origin chips, semiconductor tools, and other advanced technology.

Researchers at Ant Group, a fintech affiliate of Alibaba Group Holding, also said they had successfully trained a 300-billion-parameter MoE model “without premium GPUs”. However, they did not specify whether they had exclusively used domestically designed chips.

Meanwhile, as reported by Cryptopolitan, a Nasdaq-style index of local Chinese tech stocks has jumped nearly 13% just this month. A second gauge tracking Hong Kong-listed Chinese tech firms is up 6%, and both are leaving the Nasdaq 100 behind.

Nvidia stock tanks as Beijing declares self-reliance

Nvidia has said that its advanced GPUs and machine-learning frameworks are the best tools in the world for training large-scale MoE models. Beijing, however, has made self-reliance across the entire AI stack a key national priority for the next five years, a response to US trade restrictions that block Chinese firms’ access to advanced US chips.

The US government recently gave the go-ahead for Nvidia to sell the H200, the firm’s second-most-powerful chip, to China. Beijing, however, has moved to block shipments of the advanced chips. Cryptopolitan reported that Beijing could be weighing the restrictions to advance local chip development or to strengthen its negotiating position with the US.

Suppliers paused production of H200 components after the block. Nvidia had expected more than 1 million orders from Chinese customers, with suppliers gearing up for March deliveries, but customs officials reportedly refused entry for the chips.

Nvidia shares have since slid about 3% on the reports. According to analysts, Nvidia faces a clear risk: if China continues blocking H200 shipments, the stock could break key near-term support. Should approvals ease, the boost could come fast, but policy uncertainty swings both ways.

Other chipmakers showed mixed moves: AMD climbed 1.7% and Intel fell 2.8%, while the S&P 500 ETF SPY dipped roughly 0.1%. Meanwhile, market watchers are looking ahead to NVDA’s February 25 quarterly earnings and any fresh details on its China export situation.


Source: https://www.cryptopolitan.com/china-telecom-develops-moe-ai-huawei-chips/

