[Market Trends] How Google Makes Custom Cloud Chips That Power Apple AI And Gemini | CNBC
Google’s Secret Weapon: Custom Chips Powering AI for Apple and Gemini
At Google’s Silicon Valley lab, engineers develop custom-built chips called Tensor Processing Units (TPUs) to power AI models and services such as Google Search and YouTube. First launched in 2015, TPUs were created to meet Google’s growing computational demands, particularly for voice recognition and other AI applications. Because they are application-specific chips built solely for AI workloads rather than general-purpose processing, they run those workloads more efficiently than conventional processors. TPUs also play a crucial role in training large AI models, including Google’s chatbot Gemini and even Apple’s AI models.

Despite its early AI innovations, some believe Google has fallen behind competitors like Amazon and Microsoft. Even so, Google holds the largest market share for custom AI cloud chips, which supply the immense compute power that generative AI requires. To develop and manufacture the chips, and to mitigate geopolitical risks in the semiconductor supply chain, Google partners with companies such as Broadcom, which helps design them, and Taiwan Semiconductor Manufacturing Company (TSMC), which fabricates them.