How DeepL delivers sustainable Language AI with EcoDataCenter

By DeepL Team

Artificial intelligence is broadening our intellectual curiosity and capacity for productivity, but it’s also taking a serious toll on the environment. Consider that a search using ChatGPT consumes 10x the energy of a traditional search on Google, then multiply that by the proliferating uses that the world has for AI — both from consumers themselves and from businesses of all types. Then add in the demands of ever-more advanced large language models (LLMs). You quickly get a sense of why the energy demands of AI are one of the most urgent issues on the table for tech companies, investors and society as a whole. Goldman Sachs predicts that data center power consumption will grow 160% by 2030. At that point, data centers will consume 8% of all power in the US, up from 3% in 2022.

Ensuring a healthy relationship between AI and energy is essential for ensuring that the technology’s impact on the planet is a positive one. That healthy relationship certainly isn’t inevitable, and it can’t be taken for granted. In fact, it’s more difficult to retrofit sustainability into AI than it is to design AI models and operations with sustainability at the center.

One of the many elements that distinguishes DeepL’s approach is an AI model that’s been consciously developed with energy efficiency in mind.

Designing AI for sustainability

It starts with the models powering Language AI themselves. Model efficiency has a huge role to play in reducing the energy footprint of LLMs. That efficiency improves dramatically when you train a model in a focused way, using relevant data and the right setup. DeepL’s approach of training Language AI on proprietary linguistic data sets, rather than scouring the internet as a whole, embeds an inherently efficient approach. It’s not just more accurate and less prone to hallucinations; it’s more sustainable.

When it came to providing the compute power and data storage for training DeepL, we chose EcoDataCenter, one of the most advanced and sustainable data centers in the world. Like DeepL, EcoDataCenter prioritizes energy efficiency and sustainability — from the choice of its location through to the design of its buildings and systems, its supply and use of energy, and its strategy for scale. Today, it represents the largest data center used by DeepL, and houses Mercury, the supercomputer cluster powering Language AI.

Setting new efficiency standards for data center design

We are proud to now lift the veil on our EcoDataCenter, located in northern Sweden, where we’re able to leverage the cold Nordic climate to support energy-efficient cooling and more effective heat recovery systems. It uses solely renewable energy, powered by a combination of wind and the region’s meltwater-fed rivers, and it recycles excess heat from its servers to warm nearby homes and greenhouses, creating a circular energy economy. Its unique wood-beam construction minimizes the use of steel and cement, both of which come with a substantial carbon footprint. In fact, the wood used to build EcoDataCenter can be replenished by Sweden’s forests in just three minutes.


This rigorously sustainable design creates a very different energy footprint, compared to many of the conventional data centers supporting AI systems. Its power usage effectiveness (PUE) rating is 30% better than that of its best-performing peers, and the amount of energy used for cooling (which typically represents 40% of a data center’s consumption) is significantly lower. What’s more, EcoDataCenter runs on a smart grid that coordinates its energy use with fluctuations in demand, and balances the load to ensure a stable energy supply for everyone.
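For readers unfamiliar with the metric: PUE is simply a facility's total energy consumption divided by the energy that actually reaches its IT equipment, so a value close to 1.0 means almost no overhead for cooling and power distribution. The sketch below illustrates the calculation; the figures are illustrative examples, not EcoDataCenter's actual numbers.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the Power Usage Effectiveness ratio; lower is better, 1.0 is ideal."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative comparison: a conventional facility vs. a highly efficient one.
conventional = pue(total_facility_kwh=1580, it_equipment_kwh=1000)
efficient = pue(total_facility_kwh=1100, it_equipment_kwh=1000)

print(f"conventional PUE: {conventional:.2f}")  # 1.58
print(f"efficient PUE:    {efficient:.2f}")     # 1.10
```

In this example, the efficient facility spends only 10% of its IT load on overhead such as cooling, versus 58% for the conventional one.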

Scaling sustainability alongside AI

As DeepL continues to grow, it’s vital that our commitment to sustainable AI scales at the same rate. EcoDataCenter itself has a modular design that enables it to scale its operations efficiently, in response to demand. It’s one of several data centers where DeepL self-hosts its operations, keeping control over environmental impact, as well as security and data privacy.

And when DeepL’s growth requires us to scale operations even beyond this, we will choose our suppliers to meet our high sustainability standards. That includes moving beyond carbon offsetting to deliver genuinely net-zero data centers, and interrogating the real footprint of any energy source that’s claimed to be zero carbon. As advances in chip design give us specialized chips that can reduce the energy cost of AI inference, we’ll adopt them to reduce our future energy needs, and ensure that our data center suppliers do the same.

We are proud of the impact DeepL has had on companies, individuals and society to date. But we are even more proud of the impact our AI development doesn’t have on the environment. As we enter the next exciting phase of growth for DeepL, we are determined to maintain this balance.