Bittensor Training Milestone Gains Backing From Nvidia CEO

• Bittensor’s decentralized AI model draws attention from Nvidia CEO Jensen Huang
• Chamath Palihapitiya highlights the project as a real-world AI breakthrough
• TAO token rises as interest in decentralized AI accelerates
A decentralized artificial intelligence project is gaining traction beyond crypto circles after drawing attention from leading technology figures.
Bittensor’s latest training milestone has sparked discussion about whether distributed AI systems can compete with centralized models.
The development comes as the race to define the future of AI infrastructure intensifies.
Bittensor Training Milestone Gains Industry Attention
The Bittensor training milestone entered the spotlight after Chamath Palihapitiya discussed the project during a recent episode of the All-In Podcast.
He pointed to Bittensor’s Covenant 72B model as a working example of decentralized AI moving beyond theory. According to Palihapitiya, contributors trained a large-scale model using distributed compute rather than centralized infrastructure.
He described the effort as a network-driven system in which individuals contribute excess computing power in exchange for incentives.
Meanwhile, Jensen Huang acknowledged the concept, framing decentralized and centralized AI as complementary rather than competing approaches.
“These two things are not A or B; it’s A and B,” Huang said during the discussion, emphasizing that both models will likely coexist.
Decentralized AI Model Challenges Traditional Infrastructure
Bittensor operates as a blockchain-based network where developers exchange machine learning models and compute resources.
Its Covenant 72B model represents one of the largest decentralized training efforts to date. The system reportedly used over 70 contributors and coordinated training across standard internet connections.
Technically, the model includes 72 billion parameters and was trained on roughly 1.1 trillion tokens, according to project details.
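To put those figures in perspective, a rough sketch of the implied training compute can be made with the widely used C ≈ 6·N·D approximation (compute scales with roughly six floating-point operations per parameter per token). This is an illustrative estimate based only on the publicly stated parameter and token counts, not an official Bittensor figure:

```python
# Back-of-envelope training-compute estimate using the common
# C ≈ 6 * N * D heuristic, where N = parameters and D = training tokens.
# Figures below come from the reported Covenant 72B details; the result
# is an approximation, not a measured value.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return 6 * params * tokens

flops = training_flops(72e9, 1.1e12)  # 72B parameters, 1.1T tokens
print(f"~{flops:.2e} FLOPs")  # on the order of 5 x 10^23
```

By this estimate, the run sits in the same broad compute class as other recent large open models, which is why coordinating it over standard internet connections is notable.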
This approach contrasts with traditional AI systems, which rely on centralized data centers controlled by large technology firms.
However, Huang noted that most users will continue relying on polished, general-purpose AI systems rather than building custom models.
Still, he added that industries requiring data control and customization may benefit from open or decentralized models.
Growing Divide Between Open and Closed AI
The AI sector has increasingly split into two camps over the past two years.
On one side are closed systems such as ChatGPT and other proprietary platforms, which offer ease of use and strong performance.
On the other are open and decentralized models that allow developers to modify systems and retain control over data.
This divide has become more relevant as enterprises seek tailored AI solutions.
As a result, decentralized networks like Bittensor are positioning themselves as alternatives for specialized use cases.
Market Impact and Token Reaction
Market response to the Bittensor training milestone has been noticeable.
The project’s native token, TAO, has risen approximately 24% since the discussion featuring Palihapitiya and Huang circulated on social media.
The price movement reflects growing investor interest in AI-focused crypto assets, particularly those linked to infrastructure and compute networks.
However, broader adoption remains uncertain and depends on whether decentralized models can scale effectively.
Industry Perspective
Huang’s comments suggest that the future of AI may not rely on a single framework.
Instead, he outlined a dual approach in which open systems support innovation while proprietary platforms deliver user-friendly products.
For startups, he noted a trend toward building on open source models before adding proprietary layers.
This strategy reflects a broader shift in how companies approach AI development and commercialization.
Bittensor’s training milestone marks a notable moment for decentralized AI, drawing attention from both investors and industry leaders.
While centralized systems continue to dominate mainstream use, open and distributed models are gaining relevance in specialized areas.
The long-term outcome will likely depend on performance, scalability, and real-world adoption.
Covering startup news, AI, technology, and business at YCryptoNews. Delivering accurate, in-depth reporting on the stories that shape the future.