Bittensor Subnet Completes 72 Billion Parameter LLM Pretraining, TAO Rises 54.8% in Two Weeks


Gate News, March 16 — According to official sources, the Bittensor subnet Templar (SN3) completed the largest decentralized LLM pretraining run to date, Covenant-72B, on March 10. Covenant-72B is a 72-billion-parameter language model pretrained by the Templar team on Bittensor Subnet 3 entirely over the open internet, without centralized data centers. The model scored 67.1 on MMLU (zero-shot), surpassing centralized baseline models such as LLaMA-2-70B and LLM360 K2 under the same evaluation conditions. It is the largest fully permissionless, collaboratively trained language model to date, with over 70 different nodes contributing compute throughout the training run. The team has released all weights and checkpoints under the Apache license. Following the news, Bittensor (TAO) and its subnet tokens surged: TAO is up 54.8% over the past two weeks, and the Templar subnet token has risen 194% in the past 7 days, currently trading at $19.3.
