
Watt Matters in AI: Who’s responsible for AI’s energy hunger?

As AI grows more powerful, researchers urge users, developers, and organizations to rethink their roles in its environmental impact.

Published on April 14, 2025


Bart, co-founder of Media52 and Professor of Journalism, oversees IO+, events, and Laio. A journalist at heart, he keeps writing as many stories as possible.

In anticipation of the "Watt Matters in AI" conference, IO+ describes the current situation, the societal needs, and the scientific progress. Today, we dig into some behavioral aspects of the challenge.


Watt Matters in AI

Watt Matters in AI is an initiative of Mission 10-X, in collaboration with the University of Groningen, University of Twente, Eindhoven University of Technology, Radboud University, and the Convention Bureau Brainport Eindhoven. IO+ is responsible for marketing, communication, and conference organization.
More information on the Conference website.

Artificial intelligence, despite its remarkable capabilities, comes with a hidden cost: energy consumption. From generative AI tools that require massive computational resources to smaller models embedded in everyday devices, the energy required to develop, train, and deploy AI is skyrocketing. But as the environmental toll becomes more evident, a pressing question arises: Who bears responsibility for reducing AI’s energy footprint?

In 2024 and 2025, a wave of studies began to address this very question, not from the usual engineering or technical angle, but with ethical and behavioral perspectives. What these researchers argue is both timely and uncomfortable: we need to take responsibility. Whether as users, developers, corporate leaders, or policymakers, our actions - or inactions - shape the environmental consequences of AI. Below, we explore some of the most thought-provoking findings.

Environmental ethics enters the AI debate

Historically, AI ethics has focused on fairness, transparency, bias, and accountability in decision-making systems. But researchers like the authors of Towards an Environmental Ethics of Artificial Intelligence (2024) are pushing to broaden that scope. Their central claim is clear: if we care about justice in AI, we must also care about its environmental impact.

The study draws from environmental justice theory, introducing three key dimensions to consider:

  • Distributive justice: Who bears the ecological burden of AI?
  • Procedural justice: Who gets a say in how and where AI is developed and deployed?
  • Recognition justice: Whose voices - especially from vulnerable or underrepresented communities - are included in environmental decision-making?

These ethical lenses challenge tech developers not only to create efficient algorithms but also to think critically about why and for whom these tools are being built. The framework also calls on institutions to democratize the sustainability conversation around AI, inviting participation from civil society, environmental scientists, and marginalized groups.

Watt Matters in AI: Hardware-based views on energy efficiency

Corporate AI: Profitable, powerful, and opaque

While public interest in AI’s environmental impact grows, much of the computing power remains in the hands of a few corporations. These organizations often treat energy use as a private concern or a matter of data center optimization. But the 2025 paper Exploring the Sustainable Scaling of AI Dilemma pushes back on this status quo.

The authors argue that corporations must move beyond vague sustainability pledges. Instead, they should develop transparent, standardized frameworks for evaluating the environmental costs of their AI systems. A suggested metric is “Return on Environment” (RoE), a counterpart to ROI that prioritizes energy and carbon efficiency alongside profits and performance.
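The paper proposes RoE as a counterpart to ROI but a concrete formula depends on how an organization quantifies performance and environmental cost. As a minimal sketch, assuming RoE is measured as performance gain per kilogram of CO2 emitted (the function name, parameters, and the 0.4 kg/kWh grid-intensity default are illustrative assumptions, not the authors' definition):

```python
def return_on_environment(perf_gain: float, energy_kwh: float,
                          carbon_intensity_kg_per_kwh: float = 0.4) -> float:
    """Hypothetical Return on Environment: performance gain per kg of CO2.

    This formula is an illustrative assumption; the paper proposes the
    metric but organizations would need to standardize its definition.
    """
    carbon_kg = energy_kwh * carbon_intensity_kg_per_kwh
    return perf_gain / carbon_kg

# A large scaling step with a marginal accuracy gain scores far lower
# than a modest but energy-efficient improvement.
big_scale = return_on_environment(perf_gain=0.5, energy_kwh=10_000)
efficient = return_on_environment(perf_gain=0.3, energy_kwh=100)
```

Under this toy definition, the efficient upgrade scores orders of magnitude higher than the brute-force one, which is exactly the kind of comparison the authors want to make explicit before deployment decisions.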

More provocatively, the paper questions whether AI scaling should even continue unchecked. If an AI system yields marginal gains but significantly increases energy consumption, is its deployment justifiable? In short, the authors call for hard choices and more open debate about the real value of AI systems.

From awareness to accountability: engaging AI users

Responsibility doesn’t stop at developers or corporations. Another key insight from recent research is that end-users of AI systems - researchers, engineers, and even everyday consumers - play a role in energy consumption, whether they are aware of it or not.

The Coca4ai project, introduced in 2024, explores this behavioral dimension by enabling user-level energy monitoring in AI data centers. When individuals submit jobs to the cluster, they receive feedback on the energy their code consumes. The early findings are promising: once aware of their consumption, users begin to adjust their behavior by running lighter models, optimizing code, or batch-processing tasks more efficiently.

This reflects a broader behavioral science principle: feedback leads to change. When users understand their impact, they are more likely to act responsibly. By integrating energy metrics into AI development tools and platforms, researchers hope to create a new norm, one where sustainability is part of the default workflow.
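The feedback loop described above can be sketched in a few lines. Coca4ai's actual instrumentation is not detailed here, so this hypothetical wrapper simply multiplies wall-clock time by an assumed device power draw to produce a rough per-job energy estimate (function name, power figure, and reporting format are all assumptions):

```python
import time

def run_with_energy_feedback(job, device_power_watts: float = 300.0):
    """Run a job and report a rough energy estimate back to the user.

    Assumes a fixed average power draw; real monitoring would read
    hardware counters instead of using wall-clock time alone.
    """
    start = time.perf_counter()
    result = job()
    elapsed_s = time.perf_counter() - start
    energy_wh = device_power_watts * elapsed_s / 3600.0
    print(f"Job finished in {elapsed_s:.2f}s, "
          f"estimated energy: {energy_wh:.4f} Wh")
    return result, energy_wh

# Example: wrap a toy compute job to see its estimated cost.
value, energy = run_with_energy_feedback(
    lambda: sum(i * i for i in range(1_000_000)))
```

Even this crude estimate is enough to trigger the behavioral effect the research describes: users who see a number attached to each job start comparing jobs, which is the first step toward optimizing them.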

Watt Matters in AI: Software-based views on energy efficiency

Institutional guidelines: Europe sets a tone

Institutional leadership is also emerging, particularly in Europe. In its 2024 Guidelines on the Responsible Use of Generative AI in Research, the European Commission urges universities, research laboratories, and funding bodies to integrate sustainability into their AI policies.

The guidelines emphasize:

  • Training researchers to be aware of the environmental impact
  • Implementing monitoring and reporting mechanisms
  • Prioritizing energy-efficient AI in public funding decisions

This represents a shift from individual good intentions to systemic responsibility. When organizations set expectations and provide infrastructure, sustainability becomes more accessible and more enforceable.

Public awareness and the “invisible energy bill”

Outside academic and institutional contexts, the general public remains largely unaware of AI’s energy demands. A 2024 article from Wharton, The Hidden Cost of AI Energy Consumption, attempts to bridge that gap by making the issue relatable: using ChatGPT, Claude, or Mistral for a few hours might not feel energy-intensive, but in aggregate, such usage contributes significantly to global electricity demand.

The authors argue for a cultural shift akin to earlier movements in recycling or climate-conscious travel. Just as we’ve learned to think twice about plastic straws or short-haul flights, so too should we consider the energy cost of AI prompts, image generations, or unnecessarily large models. Raising awareness, they suggest, is the first step toward behavior change.

A collective burden, a collective opportunity

The energy hunger of AI is not a problem with a single villain or a single solution. As the research shows, responsibility is shared across the ecosystem:

  • Developers must design with efficiency and necessity in mind
  • Corporations must report and reduce their energy footprint
  • Users must understand and moderate their digital behavior
  • Institutions must create policies that support sustainable choices
  • Governments must provide regulations and incentives for greener AI

This isn’t just a technological challenge; it’s a cultural one as well. If AI is to be a tool for solving global problems, it must not become one of them. The frameworks, metrics, and motivation to act are already in place. What’s needed now is the will to distribute responsibility fairly and the discipline to follow through.

AI is here to stay, but so is our planet. As artificial intelligence continues to grow in capability, our ethical and behavioral maturity must grow alongside it. Recent research offers both a mirror and a roadmap. It reflects the imbalance between AI’s power and its accountability, while providing a path forward grounded in responsibility, equity, and sustainability.

Watt Matters in AI

Watt Matters in AI is a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes a series of articles that describe the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.

View Watt Matters in AI Series

Disclaimer: In finding and analyzing relevant studies for this article, artificial intelligence was used.