Meituan's trillion-parameter large model opens for testing; the entire training process was completed on a domestic computing cluster.
On April 24, according to industry sources, Meituan's new-generation foundation model LongCat-2.0-Preview opened for testing. The model's total parameter count exceeds one trillion, placing it among the largest models in the world. According to insiders, DeepSeek released its new-generation V4 model on the same day, with total and activated parameter counts roughly equal to those of Meituan's LongCat-2.0-Preview. Beyond parameter scale, the bigger breakthrough of Meituan's new foundation model is that both its training and inference were completed entirely on domestic computing clusters. According to the same source, Meituan used between 50,000 and 60,000 compute cards in the training phase, making this the largest model-training task completed on domestic computing hardware to date.