It was as if he had just given himself the power to cancel transatlantic flights, or boost the number of vegetarians in the world. All by deleting a few lines of code.
In early 2020, programmer Danny van Kooten published an eye-catching blog post. In it, he explained why he had removed some code from a Mailchimp newsletter plugin that he had made for the WordPress blogging platform. At the time, his plugins were installed on more than two million websites, and the small segment of code in question added 20 kilobytes of data to every page load.
After making some broad assumptions about how many people visited those websites, and the carbon intensity of electricity consumed by their computers, he estimated that this small adjustment, if replicated widely, could slash computing-related emissions by the equivalent of 59,000 kilograms of CO2 every month. In principle, over one year, that’s as much as stopping 1,200 people flying from Amsterdam to New York.
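A back-of-envelope version of this kind of estimate can be sketched in a few lines. Every constant below is an illustrative assumption of mine, not one of van Kooten's actual figures:

```python
# Sketch of a data-transfer emissions estimate. All parameter values
# are assumed, illustrative numbers, not figures from the blog post.
BYTES_SAVED_PER_VIEW = 20 * 1024        # 20 KB no longer sent per page view
SITES = 2_000_000                       # sites running the plugins
VIEWS_PER_SITE_PER_MONTH = 10_000       # assumed average traffic
KWH_PER_GB = 0.06                       # assumed energy per GB transferred
KG_CO2_PER_KWH = 0.5                    # assumed grid carbon intensity

gb_saved = BYTES_SAVED_PER_VIEW * SITES * VIEWS_PER_SITE_PER_MONTH / 1e9
kg_co2_per_month = gb_saved * KWH_PER_GB * KG_CO2_PER_KWH
print(f"~{gb_saved:,.0f} GB and ~{kg_co2_per_month:,.0f} kg CO2 avoided per month")
```

As the gap between this toy figure and van Kooten's own estimate suggests, such results swing by orders of magnitude depending on the traffic and carbon-intensity assumptions plugged in.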
Born to run
The blog post soon went viral. Van Kooten's assertion was so tantalising that it has since been discussed in many other blogs as well as in mainstream media coverage. And yet, calculations like this are surprisingly rare. Given the many millions of pieces of software running on machines around the world right now, a flurry of minor programming adjustments across many of them could plausibly have a significant global impact.
“We could save huge amounts over what we currently use,” says Laurence Tratt, a professor of software development at King’s College London.
One problem is that this area of programming remains frustratingly murky. “Very few people frame optimisation work in terms of energy and most organisations don’t make the data they do have on this very visible or clear,” he adds. We often don’t understand how computer programs consume energy well enough to assess them accurately, or to decide on better approaches, says Tratt.
There are other examples of energy efficient programming interventions out there besides van Kooten’s, however. In 2021, Hambleton District Council in England – now part of North Yorkshire Council – responded to a Twitter user who pointed out that the image files on the council’s website could be smaller and, therefore, quicker to load. The council agreed and made the necessary changes. One analysis of this move, which I have not independently verified, suggested that the alteration could save the equivalent of the energy needed to boil water for 57,000 cups of tea per month.
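The conversion from data saved to cups of tea can itself be sketched. The constants here are my own rough assumptions rather than the figures behind the analysis above:

```python
# Turning data savings into "cups of tea" – the comparison used in
# coverage of the council's change. Both constants are illustrative
# assumptions, not values from the original analysis.
KWH_PER_GB_TRANSFERRED = 0.06   # assumed network + device energy per GB
KWH_TO_BOIL_ONE_CUP = 0.03      # roughly 250 ml of water, tap-cold to boiling

def cups_of_tea_equivalent(gb_saved_per_month: float) -> float:
    """Energy no longer spent moving data, expressed as kettle boils."""
    return gb_saved_per_month * KWH_PER_GB_TRANSFERRED / KWH_TO_BOIL_ONE_CUP

print(round(cups_of_tea_equivalent(100.0)))  # 100 GB/month saved ≈ 200 cups
```

Reaching 57,000 cups this way requires very large traffic or per-gigabyte energy figures, which is exactly why such claims deserve the caveat given above.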
Nearer my content to thee
“The biggest thing we can do is host pages near to somebody,” says Andy Eva-Dale, chief technology officer at Tangent, a creative agency.
He explains that web developers have a key role to play in reducing emissions. For example, they can store web content on servers in multiple countries, bringing it physically closer to users there. Data then travels a shorter distance, which can use less energy depending on the network involved, and the page also loads more quickly.
Eva-Dale says Tangent’s clients are increasingly asking about energy efficiency, which he adds is “encouraging”.
Among the firm’s customers is UK Power Networks, an organisation that is responsible for maintaining electricity infrastructure in much of England. Eva-Dale notes that by using techniques including local content delivery, he and his team were able to lower emissions for a single homepage visit to the UK Power Networks website from an estimated 1.44 grams to 0.22 grams of CO2 equivalent – a reduction of roughly 85%.
Even the programming language you choose could have an impact, says Rui Rua at the Institute for Systems and Computer Engineering, Technology and Science in Porto. He and colleagues published papers on this topic in 2017 and 2021. Rua says that a program written in an older language such as C could use as little as one-eightieth of the energy that the same program would use if written in a more modern language such as Python.
“[The older languages] are way more low-level and closer to the language that the processor itself understands,” he explains.
As some commentators have pointed out – and Rua acknowledges – this does not mean you cannot write an efficient program in Python. But his and his colleagues’ analysis does suggest that your choices about how you write a program, including which language you use, may ultimately influence its energy consumption.
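A small experiment along these lines can be run without leaving Python. The built-in sum() executes its loop in C inside the interpreter, while an explicit for-loop runs interpreted bytecode on every iteration, and CPU time is a reasonable proxy for energy:

```python
import timeit

N = 1_000_000
data = list(range(N))

def manual_sum(values):
    # Interpreted loop: each iteration executes several bytecode operations
    total = 0
    for v in values:
        total += v
    return total

t_loop = timeit.timeit(lambda: manual_sum(data), number=5)
t_builtin = timeit.timeit(lambda: sum(data), number=5)  # loop runs in C

print(f"explicit loop: {t_loop:.3f}s  built-in sum: {t_builtin:.3f}s")
```

On typical CPython builds the built-in version finishes several times faster. The same reasoning, scaled up to a whole program, is what makes a C implementation so much cheaper to run than a line-for-line Python equivalent.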
Tratt agrees, while stressing that there is still considerable uncertainty about which specific programming decisions reliably increase or reduce energy use. “We should be encouraging people to look deeper into this,” he says.
A drop in the ocean?
The elephant in the room, though, is generative AI. As I reported for BBC News back in May, the meteoric rise of this technology has raised questions about how much energy large language models (LLMs) are guzzling. Sasha Luccioni from Hugging Face, a machine learning company, told me that, for a given task, a generative AI approach could be 33 times more energy hungry than a comparatively traditional algorithm. And yet, for many tasks, generative AI is quickly becoming the default.
If AI is such a big, energy-gobbling beast, does it really matter whether you build an app in Python or C; or if you slim down the file size of images on your home page? It might seem like small fry in comparison.
For one thing, the extent of adoption matters. Rua notes that his research has touched on the energy consumed by virtual keyboards in Android, an operating system widely used on smartphones and tablets. He has estimated that switching to a lean, energy efficient version of these keyboards could cut the energy use associated with that part of the software by 18%. Imagine such a choice repeated across the two or three billion Android smartphones in use worldwide, he says.
“I’ve been interested in taking care of our planet for maybe more than 50 years,” says Lasse Natvig at the Norwegian University of Science and Technology. Now, more conscious of climate change impacts than ever, he is adjusting the course he teaches in order to give it an even greater focus on energy efficient programming techniques.
Tiny tweaks to software are like commuters deciding to take a bike to work instead of driving their car, he suggests. The impact of multiple, individual interventions can soon add up.
“We have this saying in Norwegian – mange bekker små gjør en stor å,” he tells me. “Many small streams make a big river.”
Further reading on this week’s story
Earlier this week, Hannah Ritchie at Our World in Data analysed current estimates of how much energy generative AI might use by 2030. Some of the gloomiest projections have improved in recent months, partly because data centres keep getting more efficient.
Check out Rui Rua and colleagues’ 2021 paper on energy efficient programming languages here.
While some of the details of studies like this remain hotly debated by programming experts, there are lots of resources that back up the idea that languages such as Python, for example, are energy expensive when compared to C. YouTuber Bryan Lunduke posted an informative video on this point a few months ago.
Scientist readers of this newsletter might also think about the energy costs of large computer simulations they run during research and modelling work. This paper in Nature Computational Science introduces a set of principles for greener computational science.
This article originally featured on The Reengineer and is available here.