China–U.S. AI Valuation Dynamics: Hong Kong Tech Leaders Quietly Reshape the Game

29/09/2025 | GMT Eight
Alibaba (09988.HK) has risen more than 100%, up 102.34% as of publication and trading at HK$182.60 with turnover of HK$12.84 billion, while Baidu Group-SW (09888.HK) has climbed over 60%, driven by strong southbound capital inflows and Ark ETF accumulation.

In the second half of 2025, Hong Kong’s AI sector has been undergoing a subtle yet profound realignment. Shares of Alibaba (09988.HK) have more than doubled year-to-date, while Baidu Group-SW (09888.HK) has rallied over 60%. Southbound capital has consistently targeted both names, and Cathie Wood’s Ark ETF has also substantially increased its positions in these stocks.

Beneath these impressive price moves, a striking valuation gap persists between Chinese and American AI giants. Wall Street assigns forward P/E ratios as high as 60.4x for fiscal 2025 and 42.04x for fiscal 2026 to Nvidia (NVDA.US), while Alphabet (GOOG.US), Microsoft (MSFT.US) and Amazon (AMZN.US) also trade at lofty multiples. By contrast, Baidu (BIDU.US) and Alibaba (BABA.US) each command forward earnings multiples near 20x.

Analysts often point to PEG ratios—the forward P/E divided by expected long-term growth—to justify this discrepancy. U.S. technology leaders typically exhibit PEGs above unity, reflecting optimism about their growth trajectories, whereas Chinese peers trade below one, implying more cautious forecasts. This divergence rests on the assumption that American firms will outpace their Chinese counterparts in the AI revolution. Should that premise fail to materialize, the existing valuation differential may require significant recalibration.
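To make the PEG arithmetic concrete, here is a minimal sketch using purely hypothetical forward P/E and growth inputs (not analyst estimates for any of the companies named above), showing how a lower multiple paired with moderate growth yields a PEG below one:

```python
# Illustrative PEG arithmetic with hypothetical inputs.
def peg_ratio(forward_pe: float, expected_growth_pct: float) -> float:
    """PEG = forward P/E divided by expected annual earnings growth (in %)."""
    return forward_pe / expected_growth_pct

# Hypothetical figures, not actual estimates:
us_peg = peg_ratio(forward_pe=60.0, expected_growth_pct=40.0)  # 1.50 -> above unity
cn_peg = peg_ratio(forward_pe=20.0, expected_growth_pct=25.0)  # 0.80 -> below unity

print(f"Hypothetical US-style PEG: {us_peg:.2f}")
print(f"Hypothetical China-style PEG: {cn_peg:.2f}")
```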

The recent announcement of a $100 billion strategic partnership between Nvidia and OpenAI was heralded as the creation of an ideal compute-capital-equity loop. Yet this arrangement risks entrapping both parties in a cycle of immense investment and uncertain returns. Nvidia’s staged funding will support the purchase of four to five million next-generation GPUs for a planned 10 GW data center network, but with each gigawatt costing $50–60 billion, total capital requirements could approach $600 billion—far exceeding Nvidia’s commitment.
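As a rough cross-check of the figures above, the sketch below simply multiplies the cited per-gigawatt cost by the planned 10 GW build-out; all inputs restate the article's numbers rather than independent estimates:

```python
# Back-of-the-envelope restatement of the capex figures cited above.
planned_capacity_gw = 10                  # planned 10 GW data center build-out
cost_per_gw_usd_bn = (50, 60)             # cited $50-60 billion per gigawatt
nvidia_commitment_usd_bn = 100            # announced $100 billion partnership

total_low = planned_capacity_gw * cost_per_gw_usd_bn[0]   # 500
total_high = planned_capacity_gw * cost_per_gw_usd_bn[1]  # 600
gap_low = total_low - nvidia_commitment_usd_bn             # 400
gap_high = total_high - nvidia_commitment_usd_bn           # 500

print(f"Estimated total capex: ${total_low}-{total_high} billion")
print(f"Capital needed beyond Nvidia's commitment: ${gap_low}-{gap_high} billion")
```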

OpenAI's projected 2025 revenue of $13 billion (including $10 billion from ChatGPT) pales against these extraordinary capital outlays, underscoring the challenge of aligning costs with income. Both Chinese and U.S. AI firms share a common commercial model: turning breakthroughs in core technology into enhanced efficiency and, ultimately, revenue. OpenAI monetizes inference through paid subscriptions and API access to its GPT models, while Baidu deploys its Wenxin model to optimize search results, each translating AI capability into productivity gains.

However, their monetization strategies diverge. U.S. companies rely on direct subscription fees and hardware commissions, whereas Chinese firms attract users with free AI services and monetize through core businesses such as advertising and cloud platforms. OpenAI’s proprietary model remains accessible only via paid APIs, while Baidu Wenxin and Alibaba Tongyi publish their model weights to foster developer ecosystems.

Startup economics also differ markedly. U.S. AI ventures depend on revenue to offset rising compute costs, often operating at a loss. Hong Kong-listed counterparts, by contrast, reinvest profits from established operations into AI research, embedding intelligent features within platforms like DingTalk and Taobao so that AI costs are spread across existing revenue streams.

Although U.S. tech giants integrate AI into consumer applications, their principal growth driver remains infrastructure sales to hyperscale clients like OpenAI. China’s open-source initiatives, such as DeepSeek, offer cost-effective base models and enterprise customization, leveraging vast datasets and localized use cases that align more closely with end-user needs.

DeepSeek's R1 model, reportedly trained for just $294,000, with foundational development costs of about $6 million, contrasts sharply with GPT-4's estimated $100 million training cost and GPT-5's rumored $500 million outlay. These gaps suggest that steep compute expenses could erode the margins of closed-source providers.

China’s infrastructure advantage further reinforces this edge. The “East Data West Compute” initiative has established the world’s largest integrated AI compute network, utilizing ultra-high-voltage corridors to deliver clean energy directly to data centers at scale, driving down costs and improving utilization.

In the U.S., data center expansion is hampered by capital intensity and grid limitations. Microsoft’s $3.3 billion Wisconsin campus, for example, consumes nearly 10% of local power capacity, straining transmission networks and often requiring costly diesel backups. China’s exabyte-scale data lakes provide low-cost storage, whereas U.S. commercial architectures and privacy regulations hinder data mobility, and high bandwidth fees constrain cloud storage economics.

China's coordinated compute-storage-energy framework creates a closed-loop ecosystem from data generation to application, an advantage unmatched by the fragmented U.S. model. Short-term direct monetization may yield clear returns for American firms, but, as Netflix (NFLX.US) illustrates, subscription growth eventually plateaus, especially when facing free open-source alternatives.

Hong Kong-listed AI leaders anticipated this dynamic, embedding “free-to-ecosystem” models since inception. Tencent (00700.HK) and Alibaba pioneered these strategies to build large user bases and data pools, enabling iterative model improvements. Vast user engagement on Baidu Search and enterprise data from DingTalk have driven model refinement, and integrating Tongyi Qianwen into DingTalk has significantly boosted paid, value-added service revenue.

China’s “Inclusive AI” strategy, combined with policy support for infrastructure, reduces operating expenses and accelerates commercial adoption. If OpenAI’s revenue fails to offset its high capital requirements, Nvidia’s growth projections may come under pressure. Advanced compute alone, without supportive, cost-effective infrastructure, may struggle to justify valuation premiums.

The global AI sector now faces a critical valuation inflection point. Wall Street’s apprehension regarding stretched U.S. tech multiples stems from concerns over the sustainability of the $100 billion closed-loop model amid open-source competition and infrastructure bottlenecks. Meanwhile, Hong Kong’s AI champions, with undervalued price tags, ecosystem-driven monetization, and China’s integrated infrastructure support, await a potential value renaissance.

The nature of the China–U.S. AI contest is shifting from sheer technological prowess to the ability to sustain and scale operations cost-efficiently. China’s synchronized compute-storage-energy network under “East Data West Compute” offers a foundational enabler for large-scale AI deployment, while Hong Kong-listed firms’ deep integration of AI within existing businesses aligns with global trends toward inclusive intelligence. As the market prepares for the next wave of investment themes, those enterprises that balance commercialization, cost control, and infrastructure leverage are best positioned to lead the long-term AI evolution.