Lenovo's AI computing power infrastructure has been fully upgraded, with AI inference performance improved by 5-10 times.

Date: 09/05/2025
At the 2025 Lenovo Innovation Technology Conference, Lenovo China's infrastructure business announced a full product-line upgrade, unveiling six major technology innovations and two major upgrades. The six innovations are: an AI inference acceleration algorithm suite, an AI compilation optimizer, a fault prediction and self-healing system for AI training faults and slow nodes, an expert-parallel communication algorithm, a "flying fish" bionic heat-dissipation design with an immersion cooling system, and a "cost and performance" dual-optimization operations system for computing power services.

Building on the Wanquan heterogeneous intelligent computing platform 3.0, which was fully rolled out last year on the basis of five differentiated technologies, continued innovation across four heterogeneous computing technologies is helping deliver a new breakthrough in the computing efficiency of local infrastructure. For example, the AI inference acceleration algorithm suite can improve AI inference performance by 5-10x; the AI compilation optimizer cuts training and inference computing costs by at least 15%; the fault prediction and self-healing system achieves fault self-healing on millisecond, minute, and ten-minute scales; and the expert-parallel communication algorithm reduces inference latency by at least 3x.
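For readers unfamiliar with the term, "expert parallelism" refers to Mixture-of-Experts inference where each expert lives on a different device and tokens must be routed to the right expert and then gathered back, a communication step whose latency such algorithms aim to reduce. The sketch below is a generic, single-process illustration of that dispatch/combine pattern, not Lenovo's algorithm; the function names (dispatch_tokens, combine_outputs) and the use of NumPy are assumptions for illustration only.

```python
# Generic illustration of expert-parallel token dispatch in a Mixture-of-Experts
# layer. This is NOT Lenovo's algorithm; it only sketches the communication
# pattern (all-to-all style routing) that expert-parallel optimizations target.
import numpy as np

def dispatch_tokens(tokens, gate_logits, num_experts):
    """Route each token to its top-1 expert and group tokens per expert."""
    expert_ids = np.argmax(gate_logits, axis=-1)              # top-1 routing
    buckets = {e: tokens[expert_ids == e] for e in range(num_experts)}
    return expert_ids, buckets

def combine_outputs(expert_ids, expert_outputs, hidden, num_experts):
    """Scatter per-expert outputs back to the original token order."""
    out = np.zeros((len(expert_ids), hidden))
    for e in range(num_experts):
        out[expert_ids == e] = expert_outputs[e]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    batch, hidden, num_experts = 8, 16, 4
    tokens = rng.normal(size=(batch, hidden))
    gate_logits = rng.normal(size=(batch, num_experts))

    expert_ids, buckets = dispatch_tokens(tokens, gate_logits, num_experts)
    # Stand-in "experts": each simply scales its tokens. In a real deployment,
    # each bucket would be sent to the device hosting that expert, and the
    # return trip gathered back -- the all-to-all exchange whose latency
    # expert-parallel communication algorithms try to overlap or compress.
    expert_outputs = {e: 0.5 * buckets[e] for e in range(num_experts)}
    combined = combine_outputs(expert_ids, expert_outputs, hidden, num_experts)
    print(combined.shape)  # (8, 16)
```

In a distributed setting the dictionary of per-expert buckets would be replaced by an all-to-all collective across devices, which is where latency reductions of the kind Lenovo claims would have to come from.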