The post NVIDIA’s $20B chip could make ChatGPT look slow appeared on BitcoinEthereumNews.com.

NVIDIA’s $20B chip could make ChatGPT look slow


Chip giant NVIDIA is preparing to unveil a powerful new artificial intelligence processor designed to speed up how chatbots and other AI tools generate responses, potentially making today’s systems like ChatGPT appear sluggish by comparison.

The new platform, expected to debut at NVIDIA’s annual GTC developer conference, is optimized for AI inference, the stage when trained models produce answers to user prompts. Unlike traditional GPUs built to handle both training and inference, the upcoming processor focuses specifically on delivering responses faster and more efficiently.

The product, if launched, would mark the first tangible result of December’s deal that brought Groq’s founders, whose company specializes in high-speed AI processing hardware, into the fold.

Late last year, NVIDIA reportedly spent about $20 billion to license technology from the chip startup Groq and recruit key personnel, including its CEO. Around the same time, NVIDIA CEO Jensen Huang told employees, “We plan to integrate Groq’s low-latency processors into the NVIDIA AI factory architecture, extending the platform to serve an even broader range of AI inference and real-time workloads.”

Now, the new inference chip is expected to handle complex AI queries at high speed, with OpenAI and other leading clients likely to adopt it, according to The Wall Street Journal. The report also indicated that the new chip may handle close to 10% of OpenAI’s inference workload.

The Groq-style chip will use SRAM, sources say

During a recent earnings call, NVIDIA CEO Jensen Huang hinted that several new products will be unveiled at the upcoming GTC event, often described as the “Super Bowl of AI.” He remarked, “I’ve got some great ideas that I’d like to share with you at GTC.”

Most analysts agree the Groq-style chip could be part of the lineup. They also noted that its design could shed light on how NVIDIA aims to address memory constraints in inference computing. Such platforms typically run on high-bandwidth memory (HBM), which has been difficult to source lately.

Insiders have claimed the firm plans to use SRAM in the chip rather than the dynamic RAM that HBM is built from. SRAM is easier to come by than supply-constrained HBM and, with its lower latency, can improve the performance of AI reasoning workloads.

If the chip is unveiled, it could be a significant step forward for the chip company and for AI models in deployment. However, speaking on its possible launch, Sid Sheth, founder and CEO of d-Matrix, expressed skepticism. He noted that while NVIDIA remains the clear leader in AI training, inference represents a very different landscape. He shared: “Developers can turn to competitors other than NVIDIA because running finished AI models doesn’t require the same kind of programming as training them.”

Nevertheless, other tech giants are also advancing inference computing. Meta this week unveiled four processors tailored for inference, prompting a Silicon Valley investor to say the industry may be entering a non–“NVIDIA-dominant” phase.

More recently, June Paik, chief executive of NVIDIA rival FuriosaAI, while touting the benefits of easily deployable inference computing, cautioned that most data centers can’t accommodate the latest liquid-cooled GPUs.

Nonetheless, despite such concerns, Bank of America analysts expect inference workloads to represent 75% of AI data center spending by 2030, when the market reaches about $1.2 trillion, up from about 50% last year. Ben Bajarin, a tech analyst at Creative Strategies, also asserted that data centers of the future won’t conform to a one-size-fits-all model, anticipating that companies will take different approaches to chip and facility development.

NVIDIA is expected to release the Vera Rubin chips later in 2026

NVIDIA has also recently announced its next-generation Vera Rubin AI chips, anticipating that the rise of reasoning AI platforms such as DeepSeek will fuel even greater computing demand. It claimed the chips would help train larger AI models and provide more sophisticated outputs to a broader user base.

According to Huang, Rubin will hit the market in the second half of 2026, with a high-end “ultra” version coming in 2027.

He also explained that a single Rubin system would combine 576 individual GPUs, compared with the 72 GPUs that NVIDIA’s current Blackwell NVL72 system clusters together, meaning Rubin will also require more advanced memory.

Source: https://www.cryptopolitan.com/nvidias-chip-could-make-chatgpt-look-slow/
