The government will bolster the nation’s computational infrastructure by acquiring 10,000 graphics processing units (GPUs) within the next 18 months, former NITI Aayog CEO and India’s G20 Sherpa Amitabh Kant said.
“This is expected to enhance India’s processing power, bringing its computational capabilities in line with its data generation,” Kant wrote in a post on X (formerly Twitter).
“India generates 20 percent of the world’s data. India also holds the position of having the second-highest number of GitHub AI projects globally, accounting for 19 percent of worldwide AI projects. This demonstrates our vibrant and active engagement in AI development on an international scale,” Kant said in his post.
“AI will be a defining opportunity for India to transform learning & health outcomes and improve human development indices,” he added.
A recent report from CBRE South Asia Pvt Ltd found that India has emerged as the leading country in the Asia Pacific region (excluding China) for data centre capacity, with a capacity of 950 MW, surpassing other major nations such as Australia, Hong Kong SAR, Japan, Singapore and Korea.
The report also forecasts that India will add 850 MW of data centre capacity between 2024 and 2026, the highest increase in the region.
Data centres are large groups of networked servers, which are used for remote data storage and distribution. According to a March report by CareEdge Ratings, India’s data centre industry is expected to double its capacity from approximately 0.9 GW in 2023 to 2 GW by 2026.
This growth is projected to require an estimated capital expenditure of Rs 50,000 crore over the next three years.
India now has over 100 generative AI startups; however, investment into the space has been comparatively small. The US saw nearly $250 billion in private investment in AI startups between 2013 and 2022, while investment in India stood at just $8 billion.
Over the same period, China saw $95 billion in investments, while for the UK the figure stood at $18 billion, as per data from the AI Index 2023 Annual Report.
Role of GPUs in AI
GPUs play an important role in AI today, providing high performance for AI training and inference. They also offer significant benefits across a diverse array of applications that demand accelerated computing.
Tesla currently has 35,000 H100 chips — among the most powerful and most expensive graphics processing units designed for AI applications — CEO Elon Musk said during the company’s first-quarter earnings call for 2024.
In January, Mark Zuckerberg said that Meta plans to have about 3.5 lakh (350,000) H100 chips by the end of 2024, taking Meta’s total GPU stock to about 600,000 H100 equivalents by year-end.
Microsoft was next in line, with 1.5 lakh (150,000) GPUs on order in 2023, according to technology research firm Omdia, followed by Oracle, Tencent, Google, and Amazon, which reportedly bought 50,000 each.