OpenAI, a leading AI startup, has dismissed reports that it plans to use Google's custom chips to power its products. A spokesperson clarified that while OpenAI is running early tests on Google's Tensor Processing Units (TPUs), it has no immediate plans to deploy them at scale. The company currently relies heavily on NVIDIA's graphics processing units (GPUs) and AMD's AI chips to meet growing demand.
OpenAI is also developing its own chips, with the design expected to be finalized and ready for production this year. Reports earlier this month indicated that OpenAI had partnered with Google Cloud to meet its growing computing needs, a surprising collaboration between two AI industry competitors. Most of OpenAI's computing power, however, will come from GPU servers operated by CoreWeave, a GPU-focused cloud computing company.
Google has expanded the external availability of its AI chips, which were initially reserved for internal use. The strategy has attracted customers including Apple, as well as startups such as Anthropic and Safe Superintelligence, both of which compete with OpenAI.