Qualcomm arrived at the India AI Impact Summit in New Delhi last week carrying the most concentrated set of India commitments in its history: a USD 150 million technology ecosystem fund, a collaboration with Tata Electronics for automotive silicon modules, a 2-nanometer tape-out accomplished in material part by Indian engineers, and active discussions with prospective data centre partners modelled on its 200-megawatt Aramco deployment in Saudi Arabia. Taken together, they amount to a declaration that India is no longer a market Qualcomm serves — it is a geography Qualcomm is building with. The summit itself underscored that direction: on its final day, India formally joined Pax Silica, the US-led coalition of trusted nations committed to securing the full silicon stack from critical minerals through to AI deployment infrastructure, becoming its tenth member and signalling, unmistakably, which side of the technology supply chain it intends to occupy.
The ambition was underscored by Qualcomm's own presence on stage. Chief executive Cristiano Amon delivered a keynote at the summit, and later the same day Durga Malladi — Executive Vice President and General Manager for Technology Planning, Edge Solutions and Data Centre, IEEE Fellow, and the company's most architecturally consequential executive — delivered one of his own. It was between these two appearances, in a conference room on the sidelines of Bharat Mandapam, that Malladi sat down with BW Businessworld. The symmetry was not incidental. When a company's chief executive and its leading technology architect both take the stage at the same summit, it signals where the company believes the next decade will be won.
Malladi is a nearly three-decade veteran of the San Diego-based fabless chip designer, a man who has shaped Qualcomm's technology strategy from the silicon up — and in New Delhi, sandwiched between two keynotes, he made an argument that cuts against the prevailing consensus of an industry currently building GPU clusters as fast as power grids will permit: that the entire model of centralising AI compute in data centres is, for a country like India, the wrong answer.
"If you want to scale AI, you must ensure compute is distributed across the entire network," he said. "You run inference on devices when you can. If someone asks why run inference on devices, I would ask, 'Why not?'"
It is a position with enormous commercial implications — and, as India's government absorbs the summit's resolutions on sovereign AI infrastructure, potentially enormous policy ones too.
The Summit India Has Been Earning
The India AI Impact Summit, which concluded with the adoption of a Leaders' Declaration affirming India's commitment to sovereign AI infrastructure, inclusive international AI governance, and AI literacy at population scale, drew nearly 300,000 participants across five days at Bharat Mandapam, with delegations from over 100 countries, more than 20 heads of state, and 60 ministers. Over 500 global AI leaders attended — among them more than 100 founders and chief executives and 150 academicians and researchers — alongside nearly 400 chief technology officers. The event, inaugurated by Prime Minister Narendra Modi on 19 February, was the first global AI summit in this series to be hosted by a Global South nation — a distinction India wore with deliberate intent.
The preceding evening had produced what may be the defining image of the global AI moment: Prime Minister Modi convening Sam Altman and Dario Amodei on the same stage, where the founders of OpenAI and Anthropic declined, with what observers variously described as professional caution or low-key theatre, to raise their hands together when asked whether they were building artificial general intelligence. The moment circulated across the internet within minutes.
Malladi, characteristically, had more interest in the summit's structural meaning than its celebrity theatre. "One thing I realised is that the amount of energy and enthusiasm for AI is massive, and this straddles practically every part of the ecosystem," he said. "This includes those who are in the business of making AI models, those creating applications that run on top of those models, those migrating towards developing hardware platforms for it, and those who want to get involved in data centre construction and where things go beyond devices. It's been absolutely amazing."
He was not surprised by the scale. "I was told it was going to be big, so I came in with those expectations, and I wasn't surprised. We have seen this narrative build up over a period of time." The arc was legible from outside: "If I go back to the 2022 and 2023 timeframe, a lot of the buzz was just the fact that generative AI was something new — even though it had already been around for a while. It reached a point where an average consumer, who didn't know anything about AI, could suddenly realise this was something different — not just taking a picture and cleaning it up, but a lot more than that. However, a lot of that activity was centred around a couple of countries. Over the last two years, AI has become a national narrative in every country."
On whether India's developer energy compares in measure with that of the United States or China, he was carefully calibrated. "I want to be cautious about what I say here, but first of all, the scale of this event is just massive. In the US and China, it's already big, and I can see right here in India that it's pretty big as well. But this is just the first step in a journey; we have to see a lot of things actually happen after this in terms of applications."
It was, delivered without drama, the most important sentence of the conversation. Enthusiasm is abundant. What India must now manufacture, at scale, is output.
The Distributed Intelligence Thesis
The most intellectually substantive argument Malladi made in New Delhi concerns where AI computation should actually happen — a question the industry's dominant players have strong incentives to answer incorrectly.
Minister Ashwini Vaishnaw had said, the previous morning, that Edge AI would be critical to scaling AI in India. Malladi unpacked this with the methodical satisfaction of a man whose long-held position has just received ministerial endorsement. "I would argue that you need all of the above," he began — a concessive opening that was in fact the setup for a thorough structural argument.
He laid out the full computational spectrum with meticulous care: wearable devices such as glasses operate at sub-1-billion or 1-to-2-billion parameter models; "a smartphone today can run a 10-billion parameter model without breaking a sweat — that's impressive, it wasn't possible three or four years back"; personal computers sit at approximately 30 billion parameters; automotive systems run between 70 and 100 billion; enterprise on-premises servers — air-cooled rather than liquid-cooled, sparing policymakers the water supply problem that haunts large data centre deployments — can handle 100 to 500 billion parameters. Only at the apex, for genuinely foundational workloads at "an order of magnitude of a trillion parameters," do data centres become indispensable. "Our solutions at the data centre can host trillion-parameter models, while our on-device solutions cater to the exact needs of those specific devices."
The India-specific policy argument is direct. "If you are a policymaker thinking about installing gigawatts of capacity, putting all of it in a data centre means you have to worry about land, power supply, and water supply. For enterprises, a good fraction of that installed base already exists. It's easier — you only have to worry about power supply, not water. When you look at a smartphone, it runs on a 4,500 milliamp-hour battery operating at 4 to 5 watts, whereas a data centre uses hundreds of kilowatts. It requires the usage of all of the above. When we talk of Edge AI, we have a much more balanced view of distributing workloads across the network."
This is what makes Qualcomm's portfolio its strategic argument rather than just hype. "We are one of the few companies — if not the only one — that has a very diverse portfolio going all the way from doorbells to data centres," Malladi noted. "Our technology goes into wearables and consumer IoT — earbuds and watches are at one extreme. As you move higher up, we go towards smartphones, PCs, and XR devices, then industrial IoT equipment, automotive, and finally into data centres. That's a pretty massive portfolio. For each of these segments, there is always someone thinking about their role. They could be developers creating apps or agents, part of the hardware solutions, or just in the business of local manufacturing. Everyone has a role to play over here, and it's amazing to see that in action."
Indian Engineers at the 2nm Frontier
The 2-nanometer tape-out represents the absolute frontier of semiconductor fabrication — transistors packed at densities that require tolerances measured in atoms. That Indian engineers contributed materially to Qualcomm reaching it is, in Malladi's telling, the visible culmination of a long-building structural commitment rather than a new development.
"The question you're asking is almost independent of the 2-nanometer question because it's been going on for a while, but the 2-nanometer chip is the most recent announcement made," he said. "We have a substantial engineering presence right here in India, where the workforce is focused on hardware design and a lot of the software we bring along with that on our platforms. The most recent state-of-the-art technology today for smartphones, moving towards PCs and XR devices, is based upon the 2-nanometer process technology, and it's cutting edge." The work is distributed globally — "We have teams in North America" — but the Indian contribution is integral, not incidental. "The team right here in India was instrumental in bringing about that tape-out that was just recently done. It's a joint effort, but it's an important piece of the puzzle right here in India for us."
For a government that has invested considerable political capital in building a domestic semiconductor design ecosystem, this is the best available evidence that the ambition is not merely aspirational. The work is happening now, at the frontier.
Taking the Fight to NVIDIA
The data centre business, where NVIDIA's dominance is most entrenched, is where Qualcomm's ambitions require the clearest explanation. CUDA, NVIDIA's proprietary computing platform, represents two decades of accumulated developer intuition — not merely a software layer but the professional grammar of an entire generation of AI researchers. Competing against it demands more than better specifications. It demands a credible developer experience.
Qualcomm's answer is structured around openness. "If you are a developer today looking at Hugging Face, you have 2.5 million model choices," Malladi said. "How do you take a model from there and try it on one of our AI 200 or AI 250 racks? We have a one-click deployment model. If you go to Hugging Face right now, you will see a Qualcomm repo. We installed a transformers library there, which means you can pick up that library, click once, and your model immediately maps onto one of our racks." The second pillar is a wholehearted embrace of open-source Linux. "We have fully embraced the open-source Linux methodology, upstreaming a lot of our code back into the open-source Linux community. There is no black box here. We offer compilers, math libraries, debuggers, and profilers that most developers crave. We are looking forward to developers gradually latching onto these tools."
The cloud services layer is handled by partners. "In some instances, we provide racks to a data centre, but we aren't the ones providing the cloud services. Someone else provides the cloud services on top of our hardware and one-click deployment tools, adding their own software layer. Together, we feel we have a very competitive solution."
The Saudi Arabia deployment — an initial contract for 200 megawatts with Aramco, with racks shipping at the time of our conversation — provides the proof point at scale. "When we announced our solutions in Saudi Arabia, we got our first contract for the initial 200 megawatts. They have a much bigger rollout coming, and we expect to be part of that as well. That journey has already started, and we are shipping racks to the region as we speak." The AI 250's memory architecture is the competitive differentiator. "It tilts the scales on the total cost of ownership, making it far more favourable towards Qualcomm racks. That has definitely caught people's attention, similar to what happened last year when we first announced it and began talking to hyperscalers. These racks are shipping now, and we are identifying the right partners. Stay tuned, because more announcements will be coming later."
India is the logical next theatre, given the government's Rs 10,000 crore tax holiday for data centres, with approximately Rs 8,000 crore specifically earmarked. "We are right in the midst of discussions with potential cloud service providers and those building out data centres. There isn't a whole lot to talk about yet, beyond the fact that there's a lot of incoming interest."
The 30-Year Architecture Bet
The acquisition of Ventana Micro Systems, built entirely upon RISC-V — the open-source instruction set architecture and the only credible long-term alternative to Arm's dominance of computing — is the most architecturally profound move Qualcomm announced around its New Delhi presence. It received less fanfare than the USD 150 million fund. It may matter more.
Malladi framed it in the deepest available historical arc. "Instruction Set Architecture transitions occur only once every 20 years or so. In the late '70s and early '80s, all we had was x86. Then, the Arm ISA came in during the '90s. It's been about 30 years." The arithmetic is not subtle. If history's periodicity holds, the window is open.
"RISC-V is a very interesting technology based on open-source standards. For a long time, we've been contributing to RISC-V standards as they are built up, alongside a large number of our partners. Looking at the maturity of the specifications, about a year and a half ago, we asked ourselves if it made sense to start thinking about products in this space. When we looked at RISC-V, we found that it's a clean start that brings benefits. It has the right attributes in some market segments that are still relatively greenfield, allowing us to experiment with new ISAs. Our analysis concluded that it has the potential to bring very competitive products, so we went ahead with the acquisition."
On whether the primary advantage is performance or efficiency: "The answer is both. When we looked at the power-performance curves for what we can do with RISC-V in the segments we analysed, it was very competitive. It is competitive with existing products across the board, so when we ask if it moves the needle further, the answer is yes."
It is, above all, a technology bet. "We are still in the planning stages, so there isn't much more to say, but it is a technology-based bet. At the end of the day, we are a technology company, and we will go with the best possible technology for any given segment. We identified a couple of segments where we thought this would be the best bet."
Whether Qualcomm could be producing products across two separate instruction set architectures fifteen to twenty years hence: "In the technology sector, it's very hard to make predictions for anything more than a year out, let alone 10 to 20 years. There is no real way to answer that question right now; I think it will just play itself out."
It will. Qualcomm has ensured it will be positioned when it does.
The Robot at India's Factory Gate
The closing thread — robotics — places Qualcomm's ambitions at the intersection of India's most charged economic reality. At CES earlier this year, the company demonstrated a full physical AI stack, including a robot dog, built upon work originally developed for automotive driver-assistance systems and extended through Vision-Language-Action models — the generative architectures that enable machines to translate visual perception and language instruction into physical movement.
"We have placed our bets and investments in physical AI, including robotics and humanoids," Malladi said. "Right now, we are trying to ensure our products align with customer needs, which can be extremely diverse in robotics. It is still early, but we have made significant investments here. You are right that we are leveraging work we previously did for automotive, but it is now carved out into its own domain with new generative AI models — specifically Vision-Language-Action models, or VLAs. As commercial efforts begin to mature, we will be in a better position to answer your question. We do have a few customers lined up for this."
Asked whether those customers include Indian ones: "They are emerging in India as well, in addition to other parts of the world."
For a country simultaneously scaling its manufacturing base and confronting the prospect of its automation, this is not a small sentence.
The Architecture of What Comes Next
The India AI Impact Summit's Leaders' Declaration — sovereign AI infrastructure, inclusive governance, mass AI literacy — states the right ambitions at the right moment. India's entry into Pax Silica on the summit's final day gives those ambitions a geopolitical skeleton. What realises them fully is neither declarations nor coalitions but the accumulation of decisions by companies like Qualcomm: where to commission the engineering, where to deploy the hardware, which markets to treat as structurally central rather than commercially peripheral.
That Cristiano Amon chose New Delhi for a keynote, and that the man widely regarded as his most technically formidable lieutenant delivered one of his own on the same day, before sitting down to discuss the full architecture of Qualcomm's India strategy, is itself a signal. Companies keynote where they believe the future is being written.
Durga Malladi delivered his arguments without embellishment, in the precise and unhurried manner of someone who has been building towards this moment for three decades. He does not train foundational models. He does not operate a consumer chatbot. He designs the chips inside the devices — billions of them — through which most of the world's population will eventually encounter artificial intelligence. In a country of 1.4 billion people, that is the more durable position.
India declared its intent to be an architect of the AI era. Qualcomm, with its chief executive on one stage and its most foundational technology architect on another, is already laying the foundations. |