Meituan is internally testing a new generation of trillion-parameter large models, trained entirely on domestically produced computing clusters.

24/04/2026
Reporters have learned that Meituan has quietly begun invitation-only testing of a new generation of large models. The model's parameter count reaches the trillion scale, and notably, according to informed sources, it was trained entirely on domestically produced computing clusters. This suggests that Meituan may have achieved a breakthrough in training trillion-parameter models on domestic computing power. At present, the model is open only to invited users.