15 December 2022 | By Yelle Lieder
Life cycle assessment software – why an electricity meter is not enough
Anyone who looks into the environmental impact of digital solutions will quickly come across alarming statistics on the CO2 emissions they indirectly cause. Estimates range up to six per cent of global CO2 emissions, although a more realistic figure is probably somewhere between two and four per cent. Some publications even estimate that digital infrastructure will drive these emissions up to 23 per cent of global emissions by 2030. Unfortunately, ‘estimates’ is the key word here. In fact, neither the CO2 emissions nor the power consumption of global IT as a whole can actually be measured. Despite a historically high level of networking and automation, there is often a lack of reliable data to back these estimates up with facts. In this blog post, we will look at why such estimates are difficult to make and what can already be used today to measure the CO2 emissions of digital solutions.
How does life cycle assessment software work?
There is currently no uniform standard for measuring the environmental impact of software. The common practice in companies is to take IT expenditure and extrapolate CO2 emissions from it. Similar methods can be seen in mobile phone apps that let private individuals calculate their personal footprint based on their spending. The weaknesses of this approach are obvious, but it at least has the virtue of being practical.
More advanced techniques include using electricity meters at the main points of consumption within the system – as envisaged by the Blue Angel ecolabel – or measuring CPU time, as in this example from the Öko-Institut (German Institute for Applied Ecology). The load on the central processing unit (CPU) is a good proxy for the electrical power consumption of software.
Running the test under realistic conditions is decisive here. A single visit to a single website is not a realistic usage scenario. To obtain reliable data, the measurement must therefore always specify one or more standard usage scenarios that generate a representative load on the system under test.
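The combination of both ideas – a scripted standard usage scenario plus CPU time as a proxy for power consumption – can be sketched in a few lines. This is a minimal illustration, not a calibrated measurement: the wattage figure below is an assumption that would have to be determined for the actual hardware, and the function name `standard_usage_scenario` is a hypothetical stand-in for a real scripted user journey.

```python
import time

# Assumed average power draw of the CPU under load, in watts.
# Purely illustrative - a real measurement would calibrate this
# value against the specific machine being tested.
CPU_POWER_WATTS = 65.0

def estimate_energy_wh(workload) -> float:
    """Run a workload and estimate its energy use from consumed CPU time."""
    cpu_start = time.process_time()   # CPU seconds used by this process
    workload()
    cpu_seconds = time.process_time() - cpu_start
    # Energy (Wh) = power (W) x time (h); assumes the CPU draws full
    # power whenever our process occupies it.
    return CPU_POWER_WATTS * cpu_seconds / 3600.0

def standard_usage_scenario():
    # Hypothetical stand-in for a representative load,
    # e.g. a scripted sequence of typical user interactions.
    sum(i * i for i in range(1_000_000))

print(f"Estimated energy use: {estimate_energy_wh(standard_usage_scenario):.6f} Wh")
```

The same scaffolding works for any workload function, which is exactly why the choice of scenario matters: the number it produces is only as representative as the script you feed in.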
Has the problem surrounding life cycle assessment software been solved?
Even for less complex systems, such as a local desktop application, measurement methods are still at a relatively early stage. The consumption of a desktop application running on a single physical computer can be measured with sufficient precision using a standard household electricity meter. However, most systems also use an Internet connection, which means that other physical devices in the network infrastructure, such as routers and switches, must be taken into account. On top of that, the endpoint of this data transfer is another computer or server. In complex distributed systems, these additional devices also tie up resources during production and consume power at runtime. With mobile devices, different types of transmission networks – Wi-Fi, LTE, 5G – whose power consumption can differ by a factor of up to 40, also come into play. Unless an electricity meter is connected to every consumer in a distributed system, the measurements must always be supplemented by estimates.
Is there a link between data transmission and CO2 emissions?
Popular free tools such as websitecarbon.com or digitalbeacon.co take a website’s URL and return an estimate of how much CO2 is emitted when the page is viewed. The calculation basis is transparent and shows that the data volume transferred per page view is the main input. The power consumption is inferred from the measured data volume, and the CO2 emissions are in turn inferred from the power consumption.
The simplified assumption that transferring one gigabyte of data always consumes the same amount of power is problematic. For example, these tools make no distinction as to which digital infrastructure is at play, whether content delivery networks are interposed or how complex the calculations on the servers are. These tools are also unable to take into account whether several geo-redundant data centres are holding the requested data in the background or whether a single-board computer is answering the request.
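The model behind such tools can be sketched in a few lines. Both constants below are illustrative assumptions, roughly in the vicinity of figures published by data-volume-based calculators, not authoritative values; the point is that the entire estimate hangs on a single linear factor per gigabyte.

```python
# Both constants are illustrative assumptions, not authoritative figures.
KWH_PER_GB = 0.81         # assumed end-to-end energy intensity of data transfer
GRID_G_CO2_PER_KWH = 442  # assumed average grid carbon intensity

def page_co2_grams(page_bytes: int) -> float:
    """Estimate CO2 per page view from the transferred data volume alone.

    Note what this model cannot see: CDNs, server-side computation,
    geo-redundant storage, caching or the type of end device.
    """
    gigabytes = page_bytes / 1e9
    return gigabytes * KWH_PER_GB * GRID_G_CO2_PER_KWH

# A 2 MB page under this purely linear model:
print(f"{page_co2_grams(2_000_000):.2f} g CO2 per view")
```

Whatever happens behind the URL – a single-board computer or a fleet of geo-redundant data centres – this model returns the same number, which is precisely the weakness described above.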
Moreover, power consumption on physical hardware does not always scale linearly with data transmission. Routers and switches, for instance, have a fairly high base load whose power consumption barely changes when small amounts of data are transmitted. Power consumption often only increases in steps once the amount of data exceeds certain thresholds.
There are countless other aspects – such as caching or the type of device used – which mean that looking solely at the volume of data is not enough. If you optimise based on these tools alone, your incentives will be misguided: those who focus solely on data transmission quickly get lost in low-impact measures and disregard important aspects such as hosting, the power source, the service life of hardware and optimisation in the backend. Despite all the criticism, however, it must be said that these are unfortunately currently the best tools freely available to most people. And in the end, a byte not transferred is almost always an improvement.
Other useful tools that can help make the aforementioned estimates more meaningful include:
- thegreenwebfoundation.org to check where the website is hosted
- aremythirdpartiesgreen.com to analyse the third-party services used
Is cloud computing the solution to digital sustainability?
The question of whether moving to the cloud might be the solution comes up regularly in discussions about digital sustainability – after all, many cloud providers advertise that they are carbon neutral. However, if you take a closer look at the providers’ sustainability reports, you will see that carbon neutrality is a very elastic term. Even though most providers have their reports audited by independent third parties, as long as there are no uniform standards for such audits, the significance of these reports is mostly symbolic at best – or they simply serve advertising purposes.
When a data centre purchases ‘green power’, it usually means they are paying a higher price for electricity that promotes the expansion of renewable energies. In this blog post we will show why CO2 is emitted despite electricity supposedly being green.
Amazon Web Services, for example, reports on a market basis in many regions, which means that its emissions are offset with purchased CO2 certificates or with power purchase agreements (PPAs). Customers therefore mostly see emissions of zero grams on their statement; what was actually emitted is not always made transparent to them.
Google, in comparison, reports on a location basis in addition to the market-based approach. This makes the actual emissions caused by running its services approximately transparent. However, the reports are only made available with a significant delay, which makes it difficult to base decisions on them.
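The difference between the two reporting approaches can be illustrated with a small calculation. All figures below are hypothetical; the point is that the same workload can appear as several tonnes of CO2 or as zero, depending purely on the accounting basis.

```python
# Hypothetical figures for illustration only.
consumed_kwh = 10_000.0        # electricity consumed by a workload
grid_g_co2_per_kwh = 350.0     # assumed carbon intensity of the local grid
contracted_green_share = 1.0   # provider covers 100% via certificates/PPAs

def location_based_grams(kwh: float, grid_intensity: float) -> float:
    """Location-based: what was physically emitted on the local grid."""
    return kwh * grid_intensity

def market_based_grams(kwh: float, grid_intensity: float, green_share: float) -> float:
    """Market-based: emissions remaining after contractual instruments are applied."""
    return kwh * grid_intensity * (1.0 - green_share)

print(location_based_grams(consumed_kwh, grid_g_co2_per_kwh) / 1e6, "t CO2")  # 3.5 t
print(market_based_grams(consumed_kwh, grid_g_co2_per_kwh, contracted_green_share), "g CO2")  # 0.0 g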
Microsoft currently provides the most comprehensive report among the hyperscalers, which, as opposed to those previously mentioned, also includes the proportional emissions for the hardware used.
With on-premise infrastructure, in contrast, you have significantly more options for obtaining unadulterated figures, even if this involves more effort. Hyperscalers often have specialised hardware manufactured for which there are no public specifications, whereas for standard hardware, the Boavizta database, for example, contains emission data for a wide range of common models. In a privately owned data centre or a colocation centre, there is also usually the option of simply reading the electrical power consumption off the back of the server, whereas measuring the CPU time of a container in the cloud is often difficult.
Despite all the criticism of its image as the supposed saviour of digital sustainability, the cloud is nevertheless often a proven means of reducing the environmental impact of digital solutions. However, this has less to do with allegedly carbon-neutral services than with efficient cloud-native solutions, higher hardware efficiency and a better negotiating position with energy suppliers. One tool that attempts to shed some light on the darkness of sustainability reporting is Cloud Carbon Footprint, which provides an aggregated view of emissions data across multiple providers.
Is it worth it to measure the environmental impact of software?
Even though we mainly looked at CO2 emissions in this blog post, digital infrastructure has much more far-reaching consequences for people and the environment. The mining of rare earths to manufacture hardware and indirect water consumption via electricity generation are just two of the many factors that a comprehensive life cycle assessment would have to take into account. ISO 14040/44 provides the appropriate framework for doing so. That said, performing a life cycle assessment in accordance with the ISO standard involves a lot of manual effort and can never compensate for all the missing data. It does, however, ensure a uniform approach and thus makes an important contribution to combating greenwashing.
Now is not the time to stop measuring, despite all the aforementioned limitations regarding measurement. Even though there are still many uncertainties due to vague estimates and poor data, in many cases, a poor measurement is certainly better than no measurement at all. As long as the data remains insufficient, we have to make do with pragmatic solutions and do everything we can to avoid greenwashing.