The exponential growth of artificial intelligence has propelled an unprecedented demand for computational power, pushing the limits of existing data center infrastructure. Increasingly, the primary constraint on scaling these AI facilities is not merely the availability of processing units, but the sheer volume of electricity required to run and cool them. This challenge has drawn significant attention from the venture capital community, leading Peak XV Partners to back C2i Semiconductors, an Indian startup building integrated power solutions designed to cut energy losses and improve the economics of large-scale AI operations.
The Escalating Energy Demands of Artificial Intelligence
Artificial intelligence, particularly deep learning models, relies on extensive training and inference processes that consume prodigious amounts of energy. Modern AI models, characterized by billions or even trillions of parameters, necessitate massive parallel computation, primarily executed on specialized hardware like Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs). The continuous operation of these powerful accelerators, coupled with the cooling systems required to prevent overheating, transforms data centers into veritable energy sponges.
Historically, data centers have been significant energy consumers, but the advent of generative AI and large language models has accelerated this trend to an alarming degree. Industry analyses underscore the escalation: a December 2025 report from BloombergNEF projects that electricity usage by data centers could nearly triple by 2035, and Goldman Sachs Research estimates a 175% surge in data center power demand by 2030 compared to 2023 levels, an increase roughly equivalent to adding the electricity consumption of a country that ranks among the world's top ten consumers. This trajectory poses substantial challenges not only for the profitability of AI ventures but also for global energy grids and environmental sustainability efforts.
The Hidden Drain: Power Conversion Inefficiency
While the overall energy demand is monumental, a significant portion of this consumption is lost within the data center itself, specifically during the intricate process of power conversion. Electricity enters a data center at high voltages, often hundreds of volts, and must be stepped down multiple times to the much lower voltages (typically less than one volt) required by sensitive microprocessors like GPUs. This multi-stage conversion process, involving AC-to-DC rectification and subsequent DC-to-DC transformations, is inherently inefficient.
According to Preetam Tadeparthy, co-founder and CTO of C2i, this current power delivery architecture results in approximately 15% to 20% of energy being wasted as heat. This inefficiency not only inflates electricity bills but also necessitates more robust and energy-intensive cooling systems, creating a compounding effect on operational costs and environmental impact. The trend toward higher voltage inputs, with systems already transitioning from 400 volts to 800 volts and likely higher, further complicates the challenge of efficient power conversion, demanding more sophisticated and robust solutions.
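To see how those losses accumulate, consider a back-of-the-envelope sketch of a conventional multi-stage conversion chain. The stage breakdown and per-stage efficiencies below are illustrative assumptions, not measurements of C2i's architecture or any specific product, but multiplying them through lands in the 15% to 20% loss range Tadeparthy describes.

```python
# Illustrative model of cascaded power-conversion losses in a data center rack.
# Stage efficiencies are assumed, representative values for illustration only.

stage_efficiencies = {
    "AC-to-DC rectification": 0.96,    # facility AC to high-voltage DC
    "Intermediate DC-DC step": 0.96,   # e.g. ~400-800 V down to ~48 V
    "Board-level DC-DC step": 0.95,    # ~48 V down to ~12 V
    "Point-of-load regulation": 0.94,  # ~12 V down to the sub-1 V core rail
}

overall = 1.0
for stage, eff in stage_efficiencies.items():
    overall *= eff
    print(f"{stage:26s} efficiency {eff:.0%} -> cumulative {overall:.1%}")

print(f"\nEnd-to-end efficiency: {overall:.1%}")
print(f"Energy lost as heat:   {1 - overall:.1%}")
```

With these assumed numbers, roughly 18% of the incoming power never reaches the processor, illustrating why each additional conversion stage matters.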
C2i’s Integrated "Grid-to-GPU" Solution
Founded in 2024 by a team of seasoned power-management executives from Texas Instruments (Ram Anant, Vikram Gakhar, Preetam Tadeparthy, and Dattatreya Suryanarayana, alongside Harsha S. B. and Muthusubramanian N. V.), C2i Semiconductors has embarked on a mission to fundamentally reimagine power delivery for AI infrastructure. The company's approach centers on a unified, "plug-and-play" system that manages power from the incoming grid connection all the way to the individual GPU. This "grid-to-GPU" strategy aims to integrate power conversion, control, and packaging into a single, cohesive platform.
By addressing the entire power delivery chain holistically, C2i projects a reduction in end-to-end energy losses by approximately 10%. This seemingly modest percentage translates into substantial savings: roughly 100 kilowatts conserved for every megawatt consumed. The ripple effects of such efficiency gains are profound, impacting cooling requirements, enhancing GPU utilization rates, and ultimately improving the overall economic model of data center operations. The promise of this integrated system is not just about saving electricity, but about optimizing the total cost of ownership, boosting revenue potential, and improving the profitability of AI-driven enterprises.
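The headline figure follows from simple arithmetic, and the short sketch below makes the compounding with cooling explicit. The cooling overhead factor here is an assumption chosen for illustration, not a number reported by C2i.

```python
# Back-of-the-envelope savings from a ~10% reduction in end-to-end losses.
# The cooling overhead factor is an assumption for illustration only.

facility_draw_mw = 1.0      # per megawatt drawn from the grid
loss_reduction = 0.10       # ~10% cut in end-to-end losses, per the article
cooling_w_per_w_heat = 0.3  # assumed: ~0.3 W of cooling per W of waste heat avoided

direct_savings_kw = facility_draw_mw * 1000 * loss_reduction
cooling_savings_kw = direct_savings_kw * cooling_w_per_w_heat

print(f"Direct conversion savings:  {direct_savings_kw:.0f} kW per MW")
print(f"Avoided cooling load:       {cooling_savings_kw:.0f} kW per MW (assumed overhead)")
print(f"Total illustrative benefit: {direct_savings_kw + cooling_savings_kw:.0f} kW per MW")
```

Under these assumptions, the 100 kW of avoided conversion loss per megawatt also removes tens of kilowatts of cooling load, which is where the knock-on economic benefits come from.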
C2i recently secured $15 million in a Series A funding round led by Peak XV Partners, with additional participation from Yali Deeptech and TDK Ventures. This latest investment brings the two-year-old Bengaluru-based startup’s total funding to $19 million, providing crucial capital to advance its ambitious technological roadmap.
Strategic Investment Amidst Global Energy Concerns
The investment by Peak XV Partners underscores a growing recognition within the venture capital community that power efficiency is no longer a peripheral concern but a central pillar of AI infrastructure economics. Rajan Anandan, managing director at Peak XV Partners, highlighted that while upfront capital expenditures for servers and facilities are significant, energy costs quickly become the dominant ongoing expense for data centers. In this context, even marginal improvements in efficiency can yield immense financial benefits. A reduction in energy costs by 10% to 30%, as Anandan suggested, could amount to tens of billions of dollars in savings across the industry, demonstrating the profound market opportunity C2i aims to capture.
This strategic funding also reflects a broader shift in investment priorities towards "deep tech" solutions that tackle fundamental infrastructure challenges. As the world grapples with climate change and the imperative to decarbonize, energy-efficient computing solutions are becoming increasingly vital. The environmental footprint of AI is a growing concern, and innovations like C2i’s could play a critical role in mitigating the carbon emissions associated with the rapid expansion of AI capabilities.
The Economic and Environmental Imperative
The economic impact of inefficient power delivery in data centers extends far beyond electricity bills. The heat generated by wasted energy requires robust and costly cooling systems, which themselves consume significant power. This creates a vicious cycle where inefficiency begets further inefficiency. By reducing heat generation at the source, C2i’s solutions promise a dual benefit: lower direct power consumption and reduced cooling overheads. This translates directly to a healthier bottom line for data center operators and hyperscalers, who are constantly seeking ways to optimize their vast and complex infrastructures.
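One way to quantify that compounding effect is through PUE (power usage effectiveness), the ratio of total facility power to IT power. The figures below are assumptions chosen purely for illustration, not reported data from C2i or any operator, but they show how trimming waste heat pulls down facility-level draw.

```python
# Illustrative effect of reduced waste heat on facility-level efficiency,
# expressed through PUE (total facility power / IT power).
# All numbers are assumptions for illustration, not reported figures.

it_load_mw = 1.0
baseline_overhead = 0.5   # assumed: cooling + power losses add 50% on top of IT load
overhead_reduction = 0.2  # assumed: 20% of that overhead eliminated

baseline_pue = 1 + baseline_overhead
improved_pue = 1 + baseline_overhead * (1 - overhead_reduction)

baseline_total = it_load_mw * baseline_pue
improved_total = it_load_mw * improved_pue

print(f"Baseline PUE {baseline_pue:.2f} -> facility draw {baseline_total:.2f} MW per MW of IT load")
print(f"Improved PUE {improved_pue:.2f} -> facility draw {improved_total:.2f} MW per MW of IT load")
print(f"Facility-level saving: {(baseline_total - improved_total) * 1000:.0f} kW per MW of IT load")
```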
Beyond the immediate financial gains, the environmental implications are substantial. As AI becomes more ubiquitous, its energy demands threaten to undermine global efforts to combat climate change. Developing "green AI" that is both powerful and sustainable is a critical challenge for the technology industry. Innovations that make AI infrastructure more energy-efficient contribute directly to reducing the carbon footprint of digital technologies, aligning with broader societal goals for sustainability and responsible technological advancement. The global energy transition towards renewables also places pressure on data centers to become more flexible and efficient users of electricity, as grid stability can be impacted by large, inflexible loads.
Navigating a Challenging Market Landscape
C2i’s ambition to revolutionize data center power delivery positions it in a market historically dominated by established incumbents. The power delivery segment of the data center stack is characterized by long qualification cycles, deep balance sheets, and a preference for proven, reliable technologies. Many existing companies focus on incremental improvements to individual components rather than a complete architectural overhaul. C2i’s integrated, end-to-end approach, which simultaneously addresses silicon design, packaging, and system architecture, is a capital-intensive undertaking that few startups attempt. Proving the efficacy and reliability of such a comprehensive system in real-world production environments can take years.
The true test for C2i will lie in its execution. The startup expects its initial two silicon designs to be fabricated and returned between April and June, marking a crucial milestone. Following this, C2i plans to rigorously validate the performance of its solutions with leading data center operators and hyperscalers who have expressed interest in reviewing the data. This validation phase will be critical in demonstrating the tangible benefits of their technology and building confidence among potential customers. With a team of approximately 65 engineers based in Bengaluru, C2i is also strategically establishing customer-facing operations in the U.S. and Taiwan to facilitate early deployments and forge key industry partnerships. Anandan of Peak XV Partners noted that the feedback loop for C2i should be relatively short, indicating that the efficacy of their thesis will be tested within the next six months through silicon validation and initial customer feedback.
India’s Emergence in Deep Tech and Semiconductor Design
C2i’s rise also signifies the increasing maturity and global competitiveness of India’s semiconductor design ecosystem. Rajan Anandan likened the current state of semiconductors in India to the early days of e-commerce in 2008, suggesting a nascent but rapidly accelerating growth phase. India boasts a deep pool of engineering talent, with a growing proportion of the world’s chip designers based in the country. This intellectual capital provides a strong foundation for innovative deep tech ventures.
Furthermore, government-backed initiatives, such as design-linked incentives, are playing a pivotal role in fostering a conducive environment for semiconductor startups. These incentives help mitigate the high costs and risks associated with "tape-outs" (the final design stage before manufacturing), making it more feasible for Indian companies to develop globally competitive semiconductor products. This support enables startups like C2i to operate as independent innovators rather than merely captive design centers for multinational corporations. The ability to design, develop, and potentially even manufacture advanced semiconductor solutions within India marks a significant step towards national self-reliance and global leadership in critical technological domains.
As C2i moves forward with validating its system-level power solutions, the coming months will reveal whether these favorable conditions translate into a truly disruptive product that can redefine energy efficiency in the AI era. The success of such ventures is paramount, not only for the future of AI but also for shaping a more sustainable technological landscape.