FarrPoint's Kate Milne has just come back from the Economist's AI Compute Summit in Copenhagen, where she explored the intersection of economics, AI, data centres and energy. Read about the latest developments in AI in her key takeaways from this flagship event.
Author: Kate Milne, Economist at FarrPoint
As the race to scale AI intensifies, the real challenge isn’t just how fast we can grow, but how sustainably we can do it. The Economist's AI Compute Summit in Copenhagen explored these critical questions, highlighting both the challenges and the emerging strategies shaping a more sustainable path for AI.
1. AI is no longer just a tool; it’s a geopolitical force. Compute capacity is now a lever of national power, with countries like Denmark and Finland setting sustainability benchmarks through projects such as the Gefion and LUMI supercomputers, while still facing limitations in cost, scale, and geography. Digital sovereignty and regionally trained models are gaining traction, alongside growing concerns over strategic regional dependencies and data security.
2. Less than 1% of data centres currently reuse heat, which is a staggering missed opportunity. While technologies like liquid cooling can reduce energy use, the heat they produce is harder to repurpose without the right incentives, coordination, and viable business models. Location is therefore crucial: to make heat reuse viable, data centres should be sited near where the heat can actually be used, such as cities, industries, or campuses.
3. It’s not just the compute that consumes power; it’s also the data movement. Running AI at the edge can potentially reduce energy consumption and lower latency. Inference, reasoning, and agentic AI should occur as locally as possible to the consumer, shifting how and where we design compute infrastructure.
4. AI optimisation must happen at every level:
- Model – downsize, specialise, and use inference closer to the user.
- Infrastructure – retrofitting data centres can often be harder than rebuilding, depending on size, scale and cost; adopting liquid cooling is also essential.
- Grid participation – data centres must become active partners in energy systems.
5. AI’s power use often peaks in the wrong places and at the wrong times, putting added pressure on already-stretched energy grids. Addressing this challenge will require more innovative collaboration between utilities and data centres, enabled by supportive regulation and underpinned by greater transparency.
- Demand-side innovation: smarter load balancing, battery integration and local generation
- Incentives: support for renewables, heat reuse, and reporting, as demand for social value initiatives continues to grow
- Grid resilience: building capacity for AI surges without disruption
6. Despite demand, a growing skills gap exists in liquid cooling, infrastructure deployment, and AI operations. Europe also suffers from a fragmented regulatory landscape, which limits investment and hinders deep tech progress. Upskilling through vocational and university training is a clear priority. The AI innovation ecosystem needs trained engineers as much as cutting-edge models.
7. Lastly, here are some interesting examples of innovation in the AI sector:
- DMI: hyper-accurate weather forecasting – more
- Teton: AI for hospital care companions – more
- University of Copenhagen: 100x faster quantum simulations – more
As AI continues to evolve from a technical tool into a critical layer of our economic, environmental and strategic systems, the choices we make today will shape not just the future of technology, but also the resilience of our infrastructure and the sustainability of our planet.
Get in touch or visit our data centre consultancy services page.