
DCVC DTOR 2024: It’s time to start making AI greener

Unless we find smarter ways to manage the burgeoning power requirements of data centers, including those supporting the newest machine-learning models in AI, deep-tech innovators could end up enlarging humanity's net carbon footprint rather than shrinking it.

The 2024 edition of the DCVC Deep Tech Opportunities Report explains the guiding principles behind our investing and how our portfolio companies contribute to deep tech's counteroffensive against climate change and the other threats to prosperity and abundance. Four of the opportunities described in the report relate to computing; this is the third.

Between 2010 and 2020, the major data center operators held power requirements roughly flat through energy-efficiency improvements and renewable energy purchases. Then came AI's great breakthrough in the form of large transformer-based "foundation models" such as GPT‑3, which entailed a shocking increase in electricity and cooling requirements. By 2030, according to projections from Boston Consulting Group, data centers running generative AI models could be sucking up 7.5 percent of all electricity in the U.S. — as much power as is used by 40 million homes.

The essence of the problem is that AI-focused data centers use specialized graphics processing units (GPUs) and tensor processing units (TPUs), which have more processing cores than traditional central processing units (CPUs) and consume 10 to 15 times more electricity. Newmark, a research firm that studies commercial property markets, says that as "hyperscalers" such as Amazon, Google, Meta, and Microsoft build more AI-focused data centers, total data center energy demand is likely to rise from about 17 gigawatts of power in 2022 to 35 gigawatts by 2030. In a world where nations are getting serious about decarbonization, AI will be competing for that electricity with battery-powered vehicles, hydrogen production, new industrial facilities, and many other needs.

The situation demands innovation along two tracks. First, we must keep building clean new generating capacity, in the form of both variable, renewable sources like wind and solar power and 24/7 firm power from sources like nuclear, geothermal, and grid-scale storage. Second, as part of any vision for the responsible, ethical rollout of new AI tools, we need a plan for limiting and eventually reducing the energy requirements of foundation models themselves. The AI companies' current strategy for improving foundation models is to keep feeding them larger and larger amounts of training data. They must also invest in techniques that reduce their models' training requirements, such as transfer learning (using knowledge gained from one task or domain to perform better on another).
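To make the energy-saving logic of transfer learning concrete, here is a minimal sketch in PyTorch. It is an illustration only, not the method of any company named in this report: the "pretrained" backbone is a stand-in for a large model trained once on generic data, and the point is simply that fine-tuning updates a small fraction of the parameters, so far less compute (and electricity) is spent than training from scratch.

```python
import torch.nn as nn

# Stand-in for a network pretrained on a large, generic dataset.
# (A hypothetical toy model; real backbones have billions of parameters.)
backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))

# Freeze the backbone: its weights receive no gradient updates,
# so fine-tuning pays no training cost for them.
for param in backbone.parameters():
    param.requires_grad = False

# Small new "head" for the target task (here, 4-way classification).
head = nn.Linear(32, 4)
model = nn.Sequential(backbone, head)

# Only the head's parameters are trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"training {trainable} of {total} parameters")
# Only about 1% of the weights are updated in this toy example.
```

In a realistic setting the ratio is even more lopsided, which is why fine-tuning a pretrained model can cost orders of magnitude less energy than pretraining one.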

At the same time, we'd like to see semiconductor manufacturers work harder to optimize next-generation GPUs and TPUs for energy efficiency. The same goes for on-device chips, since we'll need to offload more AI inference tasks to edge devices such as laptops and phones. Here, a company in the DCVC portfolio called Mythic is one of the leading players. Its analog silicon architecture combines processing and memory on a single chip, making it thousands of times faster and more energy-efficient than traditional digital chips. The company's technology may ultimately be the only affordable option for running high-performance computer vision models or large language models on distributed devices such as smart-city cameras or smart-home appliances.

The hyperscalers themselves must also stop relying on renewable energy certificates to offset their consumption of electricity generated from fossil fuels, and instead find around-the-clock sources of carbon-free electricity. Microsoft's plan to buy the entire generating capacity of a rebooted Three Mile Island Unit 1 from the mothballed plant's owner, Constellation Energy, is just such a step — and a harbinger of the future.

To date, companies have built larger and more powerful AI models without much consideration for their training costs or carbon footprints. Now it's time to work on making AI more inherently carbon-friendly. As information systems scholars Thomas Davenport and Ajay Kumar wrote in Harvard Business Review in 2023, "Generative models in particular…need to become greener before they become more pervasive." We'll look to all our portfolio companies, present and future, to do their part.
