Google targets inference chips to challenge industry leader Nvidia.
In just a few months, Google's artificial intelligence chips have become some of the hottest commodities in the tech industry. Leading AI developers, including some of Google's biggest competitors, are stockpiling them. Now Google is looking to go a step further and may introduce a new chip built specifically for inference, that is, running AI models after they have been trained. With this move, Google hopes to mount a stronger challenge to market leader Nvidia in a semiconductor field growing rapidly on the back of surging AI software.

The company plans to announce a new generation of its custom chips, called Tensor Processing Units, at the Google Cloud Next conference in Las Vegas this week. Amin Vahdat, who oversees Google's AI infrastructure and chip work, declined to comment on plans for an inference chip that could accelerate AI output, but said more information may be disclosed "in the relatively near future".

