Latest News

Date: 24/04/2026
According to industry sources, Meituan's new-generation foundation model, LongCat-2.0-Preview, has entered open testing. The model's total parameter count exceeds one trillion, placing it among the largest models worldwide. The same sources say that DeepSeek's new-generation V4 model, released the same day, has roughly the same total and activated parameter counts as LongCat-2.0-Preview. Beyond sheer scale, the key breakthrough of Meituan's new foundation model is that both its training and inference run entirely on domestic computing clusters. According to these sources, Meituan used between 50,000 and 60,000 compute cards during training, making this the largest model training run completed on domestic computing power to date (Jiemian).