Green Computing: Data Centers and Energy Use

How Do Data Centers Manage Their Energy and Optimize Efficiency?

Introduction

Information technology is one of the fastest-growing sectors and plays an essential role in modern life and global progress. In 2024 there were 5.44 billion people with internet access. [1] That means about two thirds of the world's population are connected to the World Wide Web. The number of internet users has doubled in the last decade, and this growth probably won't stop anytime soon. It is estimated that the number of connected devices will reach 75 billion by 2025. [2]

Data centers play a key role in the functioning of the internet and IT infrastructure. They are essential for processing, storing and transmitting vast amounts of data. [3] These centers enable communication through social media platforms, power the huge storage and processing capability of cloud computing, and handle online transactions. Beyond supporting connectivity, they are essential to the world's digitalization. They allow businesses to adopt cloud-based systems and modernize their operations. [4] These facilities provide companies' IT infrastructure: they host the digital data and applications that businesses need to operate efficiently. They also power innovative technologies such as artificial intelligence, machine learning, and the Internet of Things (IoT). [5]

It comes as no surprise that the IT sector, driven by data centers, has significant energy demands and CO₂ emissions and heavily impacts the environment. These issues need to be recognized and managed, which is why green IT exists.

What is green computing?

Green computing, also known as green IT, is an approach to reducing the environmental impact of information and communication technology (ICT) while maintaining the same level of performance. It focuses on manufacturing, designing, using and disposing of electronic devices in ways that ensure a minimal ecological footprint. [6] Green IT is a broad concept because it covers various practices, such as using sustainable materials, recycling electronic waste and extending the lifespan of devices. Data centers are crucial in this context, as they account for a significant share of the energy consumption and environmental footprint of modern IT infrastructure. [7]

Development of green data centers

Data centers are specialized facilities that house computer systems and related components. They are designed to process, store and manage large volumes of data. As technology advances each year, data centers adapt to meet new demands; they have become increasingly efficient, powerful and vital to our digital world. Data centers have changed a lot since their beginnings in the 1940s. Here's an overview of how they and their environmental impact have evolved. [11], [8]

The world's first digital, programmable, general-purpose computer was the Electronic Numerical Integrator and Computer (ENIAC). It was designed for the U.S. military to process complex numeric calculations; its role was to calculate the optimal settings needed to aim artillery guns. [9] The computer was completed in 1945. ENIAC was enormous: it weighed over 27 tons and covered an area larger than 139 square meters. It is reported that ENIAC drew about 150 kW of power when it was running. [10]

The first ever data center was a room established to hold and support ENIAC. While it might not fit the modern definition of a data center, it was an important step in the development of this technology. It was built in 1945 at the University of Pennsylvania with high levels of security and secrecy, as it housed a vital military device. This facility differed significantly from today's data centers. It was much smaller, only a single room filled with tall racks and metal cabinets, yet it faced challenges similar to those of modern facilities, such as cooling. In the 1950s, additional data centers were established at key locations like West Point, the Pentagon, and CIA headquarters. All these facilities were also used for military purposes. [11], [13]

To manage the considerable heat output of early computers, data centers were equipped with huge cooling systems; large fans and vents were used to manage overheating. [17] Early computing centers were highly inefficient by today's standards. The machines were only capable of performing basic numerical calculations and required significant energy and maintenance resources to function. [13]

In the 1950s, the invention of the transistor allowed businesses to have “computer rooms” in office buildings – their own version of a data center. This was possible because computers were becoming increasingly compact, cost-effective and energy efficient. [11] IBM's first fully transistorized computer required 90 % less power than comparable vacuum tube systems and was half their size. [15]

By the 1960s, large, powerful machines called mainframes, which could support all of a business's operations, became common. These early, small data centers had to be reliable, since all infrastructure ran on one system. [13]

In the 1970s, minicomputers and microcomputers began to replace mainframes. These smaller machines were not only adopted by businesses but also became accessible to universities, research institutions and government agencies. They consumed tens of kilowatts, a fraction of the power needed by mainframes. Smaller computers were able to work together to handle tasks, transforming the job of a single mainframe into a distributed system. This shift made computing technology more widely available and played a key role in shaping the data centers we rely on today. This model became increasingly common throughout the 1980s and 1990s. [15]

The first personal computers made their debut in 1981. After that, PCs spread quickly and were set up in homes, offices and educational institutions without much attention paid to the environment. At this time computers started to connect to networks and servers – machines that receive requests and respond to them over a network. [11]

Big changes came in the 1990s, which marked the rapid rise of the Internet and the dot-com boom. [15] Client-server computing became the foundation for most online services at this time. This network architecture separates tasks between clients and servers: clients (such as personal computers) request data or services, and servers provide them. A simple example is accessing a webpage, where the client device requests the page and the server returns the website data. A server, then, is a powerful computer or system that provides data or services such as storing files, hosting websites, or running applications. [18]
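
To make the request/response split concrete, here is a minimal, self-contained Python sketch (not taken from any real data center software) in which a tiny HTTP server answers a client's request for a page:

```python
# Minimal client-server sketch: a tiny HTTP server answers a client's request.
from http.server import HTTPServer, BaseHTTPRequestHandler
import threading
import urllib.request

class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server's job: respond to the client's request with data.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from the server")

server = HTTPServer(("localhost", 8080), PageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client's job: request the page and use the response.
with urllib.request.urlopen("http://localhost:8080/") as response:
    print(response.read().decode())   # -> "Hello from the server"

server.shutdown()
```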

These changes dramatically increased online activity and internet traffic, which required the construction of more data centers with hundreds of servers to meet demand. At this time servers started to be organized in cabinets called racks, rather than spread loosely around server rooms. This allowed for better management, easier maintenance, and more compact storage of the growing number of servers. [13]

The increased server capacity of data centers during the 1990s brought a significant challenge: the heat generated by so many servers had to be managed. At first, mainly inefficient methods like air conditioning and ventilation were used, which made operating data centers expensive and unsustainable. Power was drawn from non-renewable sources, which contributed to a rise in CO₂ emissions. [15]

The rapid expansion of personal computers also came with significant environmental challenges. Electronic waste began to grow as newer computers replaced older ones. Until the 2000s, e-waste recycling wasn't very common; devices were discarded without proper disposal methods and were often sent to landfills. [48] That caused environmental harm due to the hazardous materials they contained, such as mercury and lead. [49]

The 2000s saw the rise of cloud computing services, which allow businesses and individuals to access computing resources over the internet instead of relying on physical hardware they own. Major companies like Amazon, Google, and Microsoft began building cloud data centers that offered businesses the ability to rent computing power and storage instead of constructing and maintaining their own data centers. Amazon Web Services (AWS), launched in 2006, allows businesses to rent computing resources as needed. Google Docs was introduced the same year. It is a cloud-based tool used for document creation, editing and collaboration. It allows users to work on a document in real time and share it. [15]

These companies contributed to the rise of cloud technologies, which significantly changed the role and design of data centers. Instead of hosting physical servers for each business, data centers evolved to support virtualized environments. As demand for cloud services grew, data centers transformed into hyperscale facilities starting in the 2010s. These hyperscale data centers are engineered to handle massive computing loads with high efficiency and serve hundreds of thousands of clients.

By 2012, about 38 % of organizations were using cloud services; by 2023 it was approximately 94 %. [15], [14] As cloud computing expanded, companies kept focusing on reducing data center costs as well as improving efficiency. One of the first IT companies to focus on environmental impact was Google, which achieved carbon neutrality in its data centers in 2007, primarily by investing in renewable energy, purchasing carbon offsets, and improving energy efficiency in its operations. [47] Meta (Facebook) launched the Open Compute Project in 2011 to share energy-efficient data center designs and encourage broader adoption of sustainable practices. [50] Microsoft committed to achieving carbon neutrality across its data centers, laboratories and offices by implementing an internal carbon fee to fund renewable energy and efficiency projects. [53] There are many more examples of how various companies have approached green computing, some of which I will highlight later in this paper. What's important is that, over the past 20 years, minimizing environmental impact has become a focus in IT.

Adopting green IT practices helps companies reduce their environmental impact while also saving costs in the long term. Companies are becoming greener not only because of regulation: being recognized as a green company is also becoming a big marketing tool. [20] Customers and investors are more likely to support businesses that prioritize sustainability, which can lead to a stronger brand image and increased customer loyalty. [22], [41]

How can data centers drive down their energy consumption?

Energy monitoring and metrics  

First, data centers must understand their energy consumption, which is done by monitoring energy use. Sensors and specialized tools track the energy use of individual equipment and of the infrastructure as a whole. For instance, power monitoring sensors such as smart power distribution units (PDUs) measure energy consumption at the rack level. This helps identify which parts of a data center should be optimized. [27], [19]
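
As a rough illustration of the idea, here is a minimal Python sketch that aggregates hypothetical per-rack readings from smart PDUs to highlight the racks drawing the most power; the rack names and wattages are invented for the example:

```python
# Aggregate made-up per-rack power readings to spot the heaviest consumers.
from collections import defaultdict

# Hypothetical readings: (rack_id, watts) reported by smart PDUs.
pdu_readings = [
    ("rack-01", 4200), ("rack-01", 4350),
    ("rack-02", 6100), ("rack-02", 6050),
    ("rack-03", 2900),
]

power_per_rack = defaultdict(list)
for rack, watts in pdu_readings:
    power_per_rack[rack].append(watts)

# Average draw per rack, sorted so the heaviest consumers show up first.
for rack, samples in sorted(power_per_rack.items(),
                            key=lambda kv: -sum(kv[1]) / len(kv[1])):
    avg = sum(samples) / len(samples)
    print(f"{rack}: {avg:.0f} W average")
```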

The most common way to determine the efficiency of a data facility is Power Usage Effectiveness (PUE), an index that measures the energy efficiency of data centers. It is computed as the total power supplied to the facility divided by the power used to run the IT equipment. The formula is introduced below. [27]
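
Written out from the definition above, the formula is:

PUE = total energy supplied to the facility ÷ energy used by the IT equipment

For example, a facility that draws 1.2 MW in total while its IT equipment consumes 1.0 MW has a PUE of 1.2; an ideal facility would approach 1.0.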

PUE essentially measures how much energy is “wasted” on non-IT functions – how much power went to cooling, lighting and power distribution rather than being used directly by IT equipment like servers, storage, and networking devices.

Google’s large-scale data centers maintain an average PUE of 1.10. Similarly, Meta (Facebook) achieves an average PUE of 1.09 across its data centers and Microsoft’s data centers also operate efficiently, with an average PUE of 1.12. These companies lead the industry in balancing performance and energy efficiency. [23], [16], [59]

Regular maintenance

In a data center, regular maintenance is crucial to keeping systems running smoothly and ensuring that operations are as efficient and cost-effective as possible. Constant use of hardware components leads to wear and tear, which affects performance, increases energy consumption and can even cause system failures. By checking and replacing outdated components, data centers can improve performance and avoid costly emergency repairs. Older components should be recycled or repurposed to minimize waste. Nowadays, various AI-driven models are used to predict failures and identify components that need attention, which saves both time and costs. [28], [20]

One of the most crucial types of components, used for storing and retrieving data, are storage drives. Over time, constant use makes them likely to wear out. Hard disk drives (HDDs) used in data centers become less reliable as they age, which is why they are increasingly being replaced by solid-state drives (SSDs). [22] These more modern drives consume less power, are faster, and have a longer lifespan because they are less prone to physical damage. Even SSDs must be checked and replaced after some time to keep them operating efficiently. [28]

Power supply units (PSUs), cooling systems, networking equipment and other infrastructure components in data centers also require regular maintenance and replacement. PSUs, which provide consistent power, degrade over time; replacing them with energy-efficient models reduces energy consumption and ensures stable power delivery. Similarly, cooling systems such as air conditioners, fans, and liquid cooling units can lose efficiency as they age. Outdated networking equipment – routers, which direct data traffic between networks, and switches, which connect multiple devices – can slow down data transmission and increase power use. Batteries, which provide backup power during outages, and fans also experience wear and need to be replaced to maintain reliability and prevent inefficiencies. [33]

Optimizing Servers

Data centers rely heavily on servers to run their many operations efficiently. Servers store, process, and deliver data or services to other devices over a network. Managing server tasks ensures that no server is overwhelmed with work or left with too few tasks. [26]

To distribute work across servers, load balancing is used. A load balancer acts as a traffic manager: it monitors servers and decides where to send incoming traffic (from accessing a website, using an app, and so on) based on multiple factors, such as server capacity, current load, availability, health and response time. Load balancers use several techniques to distribute traffic. One of them is called Round Robin, which distributes traffic evenly across servers in a fixed rotation. Round Robin is easy to implement and ensures equal traffic distribution; however, it doesn't consider that servers may have different speeds or workloads, which can lead to inefficiency when traffic changes. [29]

The Least Connections technique routes each request to the server with the fewest active connections. This method ensures that servers with lighter workloads are prioritized. By directing traffic based on active connections, it aims to prevent any single server from becoming overloaded, improving overall performance and response time. However, it may not always consider other factors like server capacity or performance, which can affect efficiency in some situations.

Another method is resource-based load balancing, which directs traffic to the servers with the most available resources. This approach ensures that the servers with the greatest capacity to handle additional tasks are prioritized. By considering the actual resources available on each server, it aims to maintain efficiency and prevent any server from being overwhelmed. [30]

There are many other load-balancing methods beyond the ones mentioned above, and each data center may use different techniques based on its specific strategy and requirements. Larger data centers tend to use resource-based load balancing to optimize efficiency and ensure an even distribution of work, while smaller data centers, with fewer resources and less complex traffic patterns, typically rely on Round Robin for its simplicity. [29]
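
As a rough illustration of the difference between these strategies, here is a minimal Python sketch (with made-up server names and loads) of Round Robin and Least Connections selection:

```python
# Two simple load-balancing strategies over an in-memory model of servers.
from itertools import cycle

servers = ["server-a", "server-b", "server-c"]
active_connections = {s: 0 for s in servers}

# Round Robin: hand out servers in a fixed rotation, ignoring current load.
round_robin = cycle(servers)

def pick_round_robin():
    return next(round_robin)

# Least Connections: send the request to the server that currently has
# the fewest active connections.
def pick_least_connections():
    return min(active_connections, key=active_connections.get)

for request_id in range(5):
    rr_target = pick_round_robin()
    lc_target = pick_least_connections()
    active_connections[lc_target] += 1   # the chosen connection stays open for a while
    print(f"request {request_id}: round robin -> {rr_target}, "
          f"least connections -> {lc_target}")
```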

Nowadays, artificial intelligence is increasingly used to support load balancing. It helps optimize load distribution by analyzing real-time traffic patterns, predicting server loads, and dynamically adjusting the allocation of resources, allowing for smarter, more adaptive traffic management.

Virtualization is a powerful technique used by data centers to create virtual replicas of physical hardware and software resources. These virtual replicas, known as virtual machines (VMs), function as independent machines, each with its own operating system, applications, and resources. Multiple VMs can run on a single physical server, which saves cost and energy by reducing the need for physical space, power, and cooling, as fewer servers are required. Virtual machines can also be backed up, replicated, and restored more easily than traditional physical servers, which means data can be recovered quickly in the event of a failure. [25], [28]
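
To illustrate where the savings come from, here is a minimal Python sketch of consolidating VM workloads onto as few physical servers as possible using a simple first-fit rule; the capacities and demands are invented for the example:

```python
# First-fit consolidation: pack VM workloads onto as few servers as possible.
# Capacities and demands below are illustrative, not real measurements.
SERVER_CAPACITY = 32   # e.g. 32 GB of RAM per physical server

vm_demands = [8, 4, 16, 2, 12, 6, 4]   # RAM needed by each VM, in GB
servers = []                            # each entry = remaining free capacity

for demand in vm_demands:
    for i, free in enumerate(servers):
        if free >= demand:              # first server with enough room
            servers[i] -= demand
            break
    else:
        servers.append(SERVER_CAPACITY - demand)   # power on a new server

print(f"{len(vm_demands)} VMs fit on {len(servers)} physical servers")
```

With these numbers, seven VMs fit on two physical servers, which is exactly the kind of reduction in powered-on hardware that translates into energy savings.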

Virtualization also supports scalability, which allows data centers to easily adjust their resources based on demand: infrastructure can automatically scale up (add more resources) or down (shut down servers or VMs). Adding or removing virtual machines is much easier because data centers do not need to modify physical hardware and can quickly adapt to changing needs, ensuring optimal performance. In the past, data centers would predict busier times, such as peak hours or seasonal traffic, and manually scale resources ahead of them. Modern auto-scaling technology, often driven by AI, makes this process more dynamic: it adjusts resources in real time based on actual demand and traffic, ensuring efficiency and cost savings throughout the day and year. [31]
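
A very simplified sketch of threshold-based auto-scaling, with made-up thresholds and load readings, might look like this in Python:

```python
# Threshold-based auto-scaling: add VMs when average load is high,
# remove them when it is low. Thresholds and loads are invented.
def scale(current_vms: int, avg_cpu_load: float) -> int:
    """Return the new VM count for the observed average CPU load (0.0-1.0)."""
    if avg_cpu_load > 0.75:
        return current_vms + 1          # scale up: demand is near capacity
    if avg_cpu_load < 0.25 and current_vms > 1:
        return current_vms - 1          # scale down: capacity is being wasted
    return current_vms                  # load is in the comfortable range

vms = 3
for load in [0.80, 0.85, 0.60, 0.20, 0.15]:   # simulated hourly load readings
    vms = scale(vms, load)
    print(f"load {load:.0%} -> {vms} VMs running")
```

Real auto-scalers use richer signals (queue lengths, response times, predicted demand), but the principle of reacting to measured load rather than fixed schedules is the same.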

Efficient cooling

Heat management has always been a major aspect of data center design. It is estimated that 60–75 % of the non-IT energy load is consumed by cooling systems, because most equipment generates heat while operating. The recommended data center temperature is between 21 and 24 °C. [32] However, some experiments have shown that maintaining slightly higher temperatures (around 30–35 °C) can reduce cooling needs while hardware continues to operate safely. Cooling techniques depend on various factors such as the type of data center, its location, design, and the specific requirements of the equipment it houses. These considerations influence the choice of cooling methods to ensure optimal performance and energy efficiency. [33]

Companies test and develop new cooling techniques all the time. Older cooling designs were highly energy-intensive, but modern air conditioning systems, such as in-row cooling and hot aisle/cold aisle configurations, have significantly reduced energy demands. Let’s look at some of those methods.

In-row cooling is a technology where cooling units are placed between server racks. That way the cooling targets heat at its source instead of managing heat in the entire room. Proper airflow management in this system is critical to ensure that hot and cold air do not mix. [34], [36] That is often achieved by a hot aisle/cold aisle configuration: server racks are arranged in alternating rows so that cold-air intakes face each other and hot-air exhausts face each other. Typically, the cold-air intakes are at the front of the server racks and the hot-air exhausts at the rear. [35]

Many cooling techniques are more efficient with a hot aisle/cold aisle configuration. It usually has higher upfront costs than room cooling but much lower operating costs. Studies show that this type of cooling can reduce energy consumption by up to 40 % compared to room-based cooling, primarily by reducing unnecessary airflow and fan power consumption.

Free cooling replaces traditional cooling systems by venting hot air from the servers outdoors and bringing in cooler air from outside. This approach can significantly cut energy costs, especially in areas with long cool seasons. [33]

Evaporative cooling uses water evaporation to cool air. Hot air passes through moistened pads and when it meets the water on the pads, the heat causes the water to evaporate. This evaporation cools the air, which is then circulated into the room. Compared to traditional air conditioning, it uses less energy, though it requires a consistent water source to operate efficiently. This technique works especially well in dry climates. [37]

Phase-change cooling is similar to evaporative cooling in that it uses a natural process to absorb heat. Instead of water evaporating, it relies on a material's transition from liquid to gas: as the material evaporates, it absorbs heat, cooling the surrounding area.

Liquid cooling systems use water or other coolants to directly absorb heat from equipment. In methods like direct-to-chip cooling, liquid circulates through pipes to cool processors. In cold plate cooling, plates are attached to the equipment to extract heat. 

In immersion cooling, servers are fully placed in a special liquid that effectively absorbs heat. This method removes the need for traditional air circulation systems like fans, reducing overall energy consumption.

There are other, more specific ways to cool a data center's equipment. One of them is geothermal cooling, which uses the earth's natural temperature: pipes buried underground circulate water, which is naturally cooler than the air above and can be used to regulate temperatures inside the facility. Another is thermal energy storage, where cooling capacity is stored in the form of ice or chilled water produced during off-peak hours and then used during peak hours. This reduces the load on cooling systems during times of high energy demand and can help reduce overall energy costs. [32], [24]

Efficient data storage

Storing data more efficiently is crucial for reducing energy consumption in data centers. By optimizing how and where data is stored, data centers can minimize energy use and costs while maintaining performance. Firstly, reducing the volume of stored data is an essential strategy.

The less data stored, the less energy is required to maintain and manage it. One of the most basic methods is data reduction, which simply means shrinking data volumes by deleting unnecessary parts. While it can be challenging to implement, because keeping more data is often seen as valuable, reducing data volume can lead to significant savings. Data deduplication is another key technique: only one instance of a piece of data is kept, and duplicates are replaced with references to the original. This way storage space is saved without losing the data itself.
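
As a rough illustration of deduplication, here is a minimal Python sketch that stores each unique block once and keeps only references (content hashes) for the duplicates:

```python
# Content-based deduplication: keep one copy of each unique block and
# replace duplicates with a reference (the block's hash).
import hashlib

def store(blocks):
    unique = {}        # hash -> actual block contents (stored once)
    references = []    # what each original block now points to
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        unique.setdefault(digest, block)   # store only if not seen before
        references.append(digest)
    return unique, references

blocks = [b"invoice-2024", b"backup-image", b"invoice-2024", b"invoice-2024"]
unique, refs = store(blocks)
print(f"{len(blocks)} blocks written, {len(unique)} actually stored")
```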

Another method for optimizing data storage is compression, which reduces the size of data to conserve space. This is typically done by software that identifies patterns within the data to minimize its size. Files that are rarely accessed are more likely to be compressed than frequently used ones. Compressing data helps save storage, speeds up file transfers, and lowers costs for both storage and bandwidth. However, compressing and decompressing data consumes CPU and memory resources, which itself uses energy. [38]
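
The trade-off can be seen in a small Python sketch using the standard-library zlib module; the sample data is artificially repetitive, so it compresses extremely well, and real-world savings will differ:

```python
# Storage vs. CPU trade-off behind compression, using Python's built-in zlib.
import zlib

data = b"sensor_reading=21.5C;" * 10_000
compressed = zlib.compress(data, level=6)

print(f"original:   {len(data):>7} bytes")
print(f"compressed: {len(compressed):>7} bytes")

# Reading the data back requires CPU work to decompress it first.
restored = zlib.decompress(compressed)
assert restored == data
```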

Snapshots capture the state of data at a specific moment. They allow access to data while backups are made. Instead of duplicating entire datasets, snapshots store only the changes, which saves storage space and reduces downtime. By creating virtual copies of only modified data, snapshots ensure efficient, low-impact backups that help optimize both resources and system uptime. [39]
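
A minimal Python sketch of the copy-on-write idea behind snapshots, using an invented block-level volume, might look like this:

```python
# Copy-on-write snapshot: record only the blocks that change after the
# snapshot is taken, not a full copy of the data.
volume = {0: b"alpha", 1: b"beta", 2: b"gamma"}   # block number -> contents
snapshot = {}                                      # old contents of changed blocks

def write_block(block_no, new_data):
    # Preserve the original contents the first time a block is overwritten.
    if block_no in volume and block_no not in snapshot:
        snapshot[block_no] = volume[block_no]
    volume[block_no] = new_data

write_block(1, b"BETA-v2")
print("live volume:", volume)
print("snapshot stores only the changed block:", snapshot)
```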

Storage tiering is a method where data is organized and stored on different types of storage based on how often it is accessed and how quickly it needs to be available. Frequently accessed data is placed on high-performance, fast-access storage, while less frequently accessed data is stored on slower, more cost-effective storage. Automated storage tiering automates this by dynamically moving data between the different storage levels based on its usage patterns. By keeping rarely used data on lower-performance storage, data management becomes more efficient, which helps reduce energy consumption and storage costs. This approach ensures that resources are used optimally without compromising performance for high-priority data.
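
As a simplified illustration, here is a Python sketch that assigns objects to a hot or cold tier based on an arbitrary access-count threshold; the object names, counts and threshold are invented:

```python
# Simplified automated tiering: frequently read objects stay on the fast
# (hot) tier, rarely read objects are demoted to the cheaper cold tier.
HOT_THRESHOLD = 10   # arbitrary example: accesses per period to stay "hot"

objects = {                      # object name -> accesses in the last period
    "checkout-db": 540,
    "monthly-report-2019": 1,
    "product-images": 87,
    "old-log-archive": 0,
}

placement = {
    name: ("hot-ssd" if accesses >= HOT_THRESHOLD else "cold-hdd")
    for name, accesses in objects.items()
}

for name, tier in placement.items():
    print(f"{name:>22} -> {tier}")
```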

Thin provisioning optimizes storage allocation by providing only the capacity an application actually needs. It avoids over-allocation of storage: just enough storage is given to an operation, no more and no less. It reduces the energy consumption associated with unused storage resources by only powering storage that is actively in use. In this way, thin provisioning manages resources efficiently, helping to reduce both storage waste and energy consumption. [40]
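
A minimal Python sketch of the idea, with an invented ThinVolume class, might look like this:

```python
# Thin provisioning: the volume advertises a large logical size, but physical
# blocks are allocated only when they are actually written.
class ThinVolume:
    def __init__(self, logical_blocks):
        self.logical_blocks = logical_blocks
        self.allocated = {}              # block number -> data, grown on demand

    def write(self, block_no, data):
        if not 0 <= block_no < self.logical_blocks:
            raise IndexError("write outside the volume")
        self.allocated[block_no] = data  # physical space is consumed only here

vol = ThinVolume(logical_blocks=1_000_000)   # looks like a huge volume
vol.write(0, b"boot record")
vol.write(42, b"config")
print(f"logical size: {vol.logical_blocks} blocks, "
      f"physically allocated: {len(vol.allocated)} blocks")
```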

How do data centers become greener?

Driving down electricity consumption is an important way to make data centers greener, but it is not the only approach. Achieving sustainability in data centers requires addressing various factors such as greenhouse gas emissions, electronic waste management and the use of renewable energy sources. [41]

Renewable energy

Many major companies are turning to renewable energy sources, such as solar, wind, and hydroelectric power, to lower their environmental impact; solar and wind energy are the most used. Google has been using 100 % renewable energy for its global operations since 2017. Microsoft achieved entirely renewable energy for its data centers in 2014 and has committed to becoming carbon negative by 2030, aiming to remove more carbon from the atmosphere than it emits. [64], [53] Amazon Web Services aims to power its data centers entirely with renewable energy by 2025 and invests in solar and wind projects to reach this goal. Apple has been carbon neutral for its entire business since 2020. [52] Meta (Facebook) recycled 91 % of its data center construction waste in 2023 and has been using 100 % renewable energy for its global data centers since 2020. These companies are leading the way in adopting renewable energy to reduce their carbon footprints. [50], [46]

Solar energy

Solar energy is increasingly popular among data centers due to its environmental and economic benefits; it is also reliable in the sense that it will not run out. One of the most common methods for harnessing solar energy is photovoltaic (PV) panels, which transform sunlight directly into electricity. They are often placed on the roofs of smaller data centers or nearby. Larger facilities build large solar farms with PV panels in sunny areas, which can spread across hundreds of square meters. Battery storage allows the energy to be used during less sunny days. [45]

Concentrated Solar Power (CSP) is a thermal system that uses the sun's heat to generate power. The heat is used to create steam, which powers a turbine connected to an electricity generator. This technology is usually used in areas with strong direct sunlight. The system can also store energy in the form of heat, enabling continuous electricity generation. It is more suited to large-scale facilities due to its implementation costs. Solar thermal systems need little maintenance and have long durability. [42]

Hybrid systems combining photovoltaic and concentrated solar power technologies are becoming more popular among large companies. For example, Google began a large solar project in Nevada in 2020, and Microsoft is building a solar thermal plant in Arizona. These systems use both PV panels and CSP to maximize energy generation and efficiency, benefiting from the strengths of each technology.

Wind energy is a renewable power source harnessed by turbines. One wind turbine can generate as much energy as 50,000 solar panels. [43] Wind turbines are also much bigger than solar panels: these huge structures are typically 80 to 120 meters high, with rotor blades about 40 to 60 meters long. The largest turbines can be taller than 200 meters, with blades over 80 meters. Due to their size and constant noise, they are situated far from residential areas. Their lifespan is usually 20 to 25 years, during which they operate for around 120,000 hours. [44]

Wind farms can be built offshore or onshore. Onshore wind farms are located on land with favorable wind conditions, like open fields or mountain ridges. Offshore wind farms, located in bodies of water, can harness stronger and more consistent winds. Offshore farms are usually more expensive to install but are also more reliable due to stable windy weather.

Data centers rely on wind energy to reduce their carbon footprint. By using on-site turbines or purchasing power from wind farms, data centers benefit from wind’s efficiency and lower environmental impact.

Hydropower is an effective renewable energy solution for data centers, particularly in regions rich in water resources. Data centers may establish direct connection to hydroelectric plants or invest in building them themselves. These dedicated installations are typically large-scale hydroelectric facilities that generate power using water flow, which can be supplied directly to the data center. [46]

Many data centers in Scandinavia leverage hydropower thanks to the region’s water resources. Data Center Light in Switzerland operates almost entirely on hydropower. While hydropower is not as common as solar or wind energy, its role is growing as part of the increasing trend toward renewable energy sources.

Geothermal energy harnesses heat from the Earth's interior. Geothermal plants can be built only in areas with geothermal sources, such as Iceland, parts of the United States, or New Zealand. This energy can be used directly for cooling, as mentioned before, and also for electricity generation by converting underground heat into power. Verne Global is a data center company based in Iceland that uses geothermal energy to provide both electricity and cooling for its facilities.

For data centers, geothermal energy has the advantage of being a more stable and reliable source than solar or wind. However, the infrastructure investment required can be very high, and it is feasible only in a limited number of locations. [47], [46]

Waste Management

Data centers generate significant electronic waste as equipment becomes outdated. To minimize waste, it’s crucial to prioritize maintenance to extend the lifespan of hardware.

When equipment no longer meets high-performance needs, it can be repurposed – for example, older hardware can be used for tasks that require lower performance, such as converting servers into backup systems. If the equipment can no longer be used within the data center, it can be resold to businesses or individuals with less demanding requirements. When reuse isn't an option, recycling should be prioritized: it recovers valuable materials for manufacturing new products, reduces the need for mining, and helps protect natural resources. Disposing of electronic waste in landfills must be avoided to protect the environment. [48], [51]

Many companies have implemented sustainable waste management practices. Google, for example, has committed to zero landfill waste: it reuses older hardware and partners with recycling companies to responsibly dispose of outdated equipment, ensuring that components are securely recycled or repurposed. During the construction of new data centers, Google uses recycled steel and concrete as building materials. Other companies, such as Amazon, Microsoft, Meta, and Apple, are making comparable efforts; Amazon focuses on minimizing packaging waste, and Apple recycles lithium from data center batteries. [48], [52], [64]

Despite these efforts, only about 20–25 % of all electronic waste is recycled. [51] It is estimated that electronic waste accounts for about 2 % of the world's solid waste yet contributes roughly 70 % of the hazardous waste in landfills, due to the harmful chemicals and heavy metals it contains. Improper disposal of e-waste can release toxic substances into the environment, leading to significant pollution of soil, water, and air. [49]

Future of green computing in data centers

Examples of significant green data centers

The future of green computing in data centers looks bright, as more companies use new ideas to improve energy efficiency and sustainability. Some data centers are already achieving remarkable efficiency due to their location and design. [57]

For example, Google's Hamina data center in Finland has a near-perfect PUE; it uses seawater from the Gulf of Finland for cooling. [54] Microsoft's Project Natick in Scotland is a prototype data center placed underwater. [55] Meta operates an efficient data center in Sweden, where the cold climate allows the use of outside air for cooling. The Green Mountain data center in Norway, located in a former NATO bunker, uses cold water for cooling and is powered entirely by hydroelectric energy. [56] These facilities show the potential of green computing and set an example for other data centers. [58], [60]

Artificial intelligence

Artificial intelligence has quickly become one of the biggest IT phenomena of recent years. Generative AI is growing rapidly and is transforming industries like medicine, education, finance and entertainment. However, this progress comes with challenges: data centers powering AI are seeing a huge rise in energy use. [61] For example, a single ChatGPT query uses nearly ten times the electricity of a Google search, and more than 100 million users use ChatGPT weekly. The demand for computational power to support AI is doubling approximately every 100 days. If this growth continues at the same rate as in 2022, data center power demand could rise by 160 % and data centers' CO₂ emissions may double by 2030. [62]

Microsoft, which has invested in OpenAI, reported a nearly 30 % increase in carbon dioxide emissions since 2020, largely due to data center expansion. Similarly, Google's greenhouse gas emissions in 2023 were almost 50 % higher than in 2019, driven by the energy demands of its data centers. Training generative AI models requires a significant amount of electricity: training GPT-3 required about 1,300 megawatt-hours, roughly the annual energy usage of 130 homes in the United States. [63] Training the newer GPT-4 model used about 50 times more electricity, and there are hundreds more generative AI models that consume comparable amounts of electricity.

AI also has the potential to bring positive impacts: it could help reduce greenhouse gas emissions by 5–10 %. AI systems are good at spotting patterns in large amounts of data and can use this knowledge to optimize future tasks, helping to automate and simplify many routine operations. This process is called predictive analytics, and it is increasingly used to improve cooling systems. Google reduced the energy used for cooling its data centers by 40 % after implementing AI. [65]

The future of green computing in data centers is uncertain but promising. As AI grows, companies are working to balance expansion with sustainability. By adopting green computing, they aim to increase efficiency, lower costs, and support a sustainable future. Success depends on continuous innovation and energy-efficient solutions to reduce IT's environmental impact.

Conclusion

There are over 10,000 data centers in the world, and each is different. Differences in size, influence, location, and the technologies they use mean no single solution works for all. This paper focused on some of the most commonly used technologies and those I consider important for improving sustainability in data centers. It covered energy monitoring and metrics and techniques for server optimization, such as load balancing, virtualization, and scalability. It also looked at current cooling methods and their best uses, introduced ways to optimize data storage to save energy, mentioned the shift to greener energy sources like solar, wind, and hydropower, and discussed ways data centers can better handle waste.


The future of green IT is not certain; unexpected technological breakthroughs happen, as history has shown many times. Recently, generative AI has been changing and optimizing computing, and I believe it will continue to play a significant role. While no one can predict what will shape green computing in the coming decades or even years, one thing is certain: green IT practices are essential for sustainability and will continue to grow as technology and environmental needs develop. As technology evolves, it will be interesting to see what new innovations further improve the sustainability of data centers.

Sources

[1] Topic: Internet usage worldwide. Statista. Retrieved 10 November 2024, from https://www.statista.com/topics/1145/internet-usage-worldwide/

[2] IoT devices installed base worldwide 2015-2025. Statista. Retrieved 10 November 2024, from https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/

[3] What is a Data Center? – Cloud Data Center Explained – AWS. Amazon Web Services, Inc. Retrieved 10 November 2024, from https://aws.amazon.com/what-is/data-center/

[4] What Is a Data Center? | IBM. (2022, March 22). Retrieved 10 November 2024, from https://www.ibm.com/topics/data-centers

[5] What is a Data Center? (n.d.). Palo Alto Networks. Retrieved 12 November 2024, from https://www.paloaltonetworks.com/cyberpedia/what-is-a-data-center

[6] What Is Green Computing? | NVIDIA Blog. Retrieved 12 November 2024, from https://blogs.nvidia.com/blog/what-is-green-computing/

[7] Advancing green computing: Practices, strategies, and impact in modern software development for environmental sustainability. (2024). ResearchGate. https://doi.org/10.30574/wjaets.2024.11.1.0052

[8] Martin, C. D. (1995). ENIAC: Press conference that shook the world. IEEE Technology and Society Magazine, 14(4), 3–10. IEEE Technology and Society Magazine. https://doi.org/10.1109/44.476631

[9] The world’s first general purpose computer turns 75. (2021, February 11). Penn Today. https://penntoday.upenn.edu/news/worlds-first-general-purpose-computer-turns-75

[11] A brief history of data centres. Digital Realty. (n.d.). Retrieved 16 November 2024, from https://www.digitalrealty.com/resources/articles/a-brief-history-of-data-centers?t=1732740103582?latest

[12] Foote, K. D. (2021, December 17). A Brief History of Cloud Computing. DATAVERSITY. https://www.dataversity.net/brief-history-cloud-computing/

[13] The Evolution of Data Centers. Lifeline Data Centers. (2012, July 2). https://lifelinedatacenters.com/data-center/the-evolution-of-data-centers/

[14] 25 Amazing Cloud Adoption Statistics [2023]: Cloud Migration, Computing, And More. (2023, June 22). Zippia. https://www.zippia.com/advice/cloud-adoption-statistics/

[15] Enconnex. (2024). Data Center History and Evolution. Enconnex Blog. Retrieved 3 December 2024, from https://blog.enconnex.com/data-center-history-and-evolution

[16] Infographic: How Energy Intensive Are Data Centers? (2024, July 23). Statista Daily Data. https://www.statista.com/chart/32689/estimated-electricity-consumption-of-data-centers-compared-to-selected-countries

[17] Data Center Equipment | ENERGY STAR. (n.d.). Retrieved 17 November 2024, from https://www.energystar.gov/products/data_center_equipment

[18] Nyabuto, G. (2024). Client-server Architecture, a Review. International Journal of Advanced Science and Computer Applications, 3(2), Article 2. https://doi.org/10.47679/ijasca.v3i1.48

[19] Concepts and Techniques for the Green Data Center. (n.d.). Device42. Retrieved 16 November 2024, from https://www.device42.com/data-center-infrastructure-management-guide/data-center-capacity-planning/

[20] Raja, S. P. (2021). Green Computing and Carbon Footprint Management in the IT Sectors. IEEE Transactions on Computational Social Systems, 8(5), 1172–1177. IEEE Transactions on Computational Social Systems. https://doi.org/10.1109/TCSS.2021.3076461

[22] Reddy, V. D., Setz, B., Rao, G. S. V. R. K., Gangadharan, G. R., & Aiello, M. (2017). Metrics for Sustainable Data Centers. IEEE Transactions on Sustainable Computing, 2(3), 290–303. https://doi.org/10.1109/TSUSC.2017.2701883

[23] Data center average annual PUE worldwide 2024. (n.d.). Statista. Retrieved 17 November 2024, from https://www.statista.com/statistics/1229367/data-center-average-annual-pue-worldwide/

[24] Green Technology, Cloud Computing and Data Centers: The Need for Integrated Energy Efficiency Framework and Effective Metric. (2024). ResearchGate. https://doi.org/10.14569/IJACSA.2014.050513

[25] Fernandez, R. (2023, September 22). What Is Data Center Virtualization? How It Works and Its Benefits. ServerWatch. https://www.serverwatch.com/virtualization/data-center-virtualization/

[26] Green Computing: An Era of Energy Saving Computing of Cloud Resources. (2024). ResearchGate. https://doi.org/10.5815/ijmsc.2021.02.05

[27] What Is Data Center PUE (Power Usage Effectiveness)? (n.d.). Retrieved 17 November 2024, from https://www.datacenterknowledge.com/sustainability/what-is-data-center-pue-defining-power-usage-effectiveness

[28] Srinath, S. (2024, November 6). An introduction to Green IT and the benefits of Green software [2024]. SIG. https://www.softwareimprovementgroup.com/green-it-introduction/

[29] Comparing Load Balancing Algorithms | JSCAPE. (n.d.). Retrieved 23 November 2024, from https://www.jscape.com/blog/load-balancing-algorithms

[30] Load Balancing Algorithms: Round-Robin, Least Connections, and Beyond. (2024, June 2). https://30dayscoding.com/blog/load-balancing-algorithms-round-robin-least-connections-and-beyond

[31] Application Scaling—AWS Auto Scaling—AWS. (n.d.). Retrieved 23 November 2024, from https://aws.amazon.com/autoscaling/

[32] Sustainability | Special Issue: Energy Efficient Sustainable Cooling Systems. (n.d.). Retrieved 23 November 2024, from https://www.mdpi.com/journal/sustainability/special_issues/Sustainable_Cooling_Systems

[33] Best Practices Guide for Energy-Efficient Data Center Design. (n.d.). Energy.Gov. Retrieved 16 November 2024, from https://www.energy.gov/femp/articles/best-practices-guide-energy-efficient-data-center-design

[34] Install In-rack or In-row Cooling | ENERGY STAR. (n.d.). Retrieved 16 November 2024, from https://www.energystar.gov/products/data_center_equipment/16-more-ways-cut-energy-waste-data-center/install-rack-or-row

[35] Move to a Hot Aisle/Cold Aisle Layout | ENERGY STAR. (n.d.). Retrieved 16 November 2024, from https://www.energystar.gov/products/data_center_equipment/16-more-ways-cut-energy-waste-data-center/move-hot-aislecold-aisle-layout

[36] admin. (2024, July 2). Understanding In-Row Cooling Systems in Data Centres. Blog. https://blog.ibitstech.com/?p=274

[37] Evaporative Cooling for Data Centers—Pros and Cons—AIRSYS North America. (n.d.). Retrieved 17 November 2024, from https://airsysnorthamerica.com/evaporative-cooling-for-data-centers-pros-and-cons/

[38] Data Reduction—What Is It, Techniques, Examples, Advantages. (n.d.). Retrieved 20 November 2024, from https://www.wallstreetmojo.com/data-reduction/

[39] What is Data Compression & What Are The Benefits | Barracuda Networks. (n.d.). Retrieved 20 November 2024, from https://www.barracuda.com/support/glossary/data-compression

[40] lorihollasch. (2024, September 25). Thin Provisioning—Windows drivers. https://learn.microsoft.com/en-us/windows-hardware/drivers/storage/thin-provisioning

[41] Johnston, R. (2023, April 5). How data centers can use renewable energy to increase sustainability and reduce costs. Device42 – Official Blog. https://www.device42.com/blog/2023/04/05/how-data-centers-can-use-renewable-energy-to-increase-sustainability-and-reduce-costs/

[42] HeliosCSP. (2023, August 7). Powering Data Centers with Concentrated Solar Power. HeliosCSP – Portal de noticias de energía termosolar. https://helioscsp.com/powering-data-centers-with-concentrated-solar-power/

[43] The Pros and Cons of Wind Power for Data Center Sustainability. (n.d.). Retrieved 20 November 2024, from https://www.datacenterknowledge.com/energy-power-supply/the-pros-and-cons-of-wind-power-for-data-center-sustainability

[44] Wind Turbine Facts | Loeriesfontein Wind Farm | Sustainable Wind Energy. (n.d.). Retrieved 20 November 2024, from https://loeriesfonteinwind.co.za/wind-energy-library/wind-turbine-facts/

[45] Solar Power for Data Centers and IT Infrastructure. (n.d.). Retrieved 20 November 2024, from https://green.org/2024/01/30/solar-power-for-data-centers-and-it-infrastructure/

[46] How data centers can be powered by renewable electricity. (n.d.). Retrieved 20 November 2024, from https://www.ecohz.com/sustainability-solutions/data-centers

[47] 24/7 Clean Energy – Data Centers – Google. (n.d.). Google Data Centers. Retrieved 16 November 2024, from https://www.google.com/about/datacenters/cleanenergy/

[48] Walbank, J. (2022, October 7). Navigating and addressing the data centre e-waste crisis. https://datacentremagazine.com/articles/navigating-and-addressing-the-data-centre-e-waste-crisis

[49] The toxicological implications of e-waste. (2024, October 9). Open Access Government. https://www.openaccessgovernment.org/article/toxicological-implications-of-e-waste-recycling-uc-davis/163103/

[50] Data Centers. (n.d.). Meta Sustainability. Retrieved 24 November 2024, from https://sustainability.atmeta.com/data-centers/

[51] Electronic waste (e-waste). (n.d.). Retrieved 24 November 2024, from https://www.who.int/news-room/fact-sheets/detail/electronic-waste-(e-waste)

[52] AWS Cloud—Amazon Sustainability. (n.d.). Retrieved 24 November 2024, from https://sustainability.aboutamazon.com/products-services/aws-cloud.html

[53] andreabichsel. (2024, November 12). Waste data model overview – Microsoft Cloud for Sustainability. https://learn.microsoft.com/cs-cz/industry/sustainability/data-model-waste-overview

[54] Hamina, Finland – Data Centers – Google. (n.d.). Google Data Centers. Retrieved 24 November 2024, from https://www.google.com/about/datacenters/locations/hamina/

[55] Natick. (n.d.). Microsoft Research. Retrieved 24 November 2024, from https://www.microsoft.com/en-us/research/project/natick/

[56] Green mountain. (2024, November 1). Green Mountain Data Center. https://greenmountain.no/

[57] Mondal, S., Faruk, F. B., Rajbongshi, D., Efaz, M. M. K., & Islam, M. M. (2023). GEECO: Green Data Centers for Energy Optimization and Carbon Footprint Reduction. Sustainability, 15(21), Article 21. https://doi.org/10.3390/su152115249

[58] Clancy, H. (2013, July 24). 12 green data centers worth emulating, from Apple to Verne. Trellis. https://trellis.net/article/12-green-data-centers-worth-emulating-apple-verne/

[59] Menear, H. (2021, October 10). Pushing the limits of data centre efficiency. https://datacentremagazine.com/critical-environments/pushing-limits-data-centre-efficiency

[60] Swallow, T. (2023, May 3). Top 10: Green Energy Data Centres. https://energydigital.com/top10/top-10-green-energy-data-centres

[61] AI and energy: Will AI reduce emissions or increase demand? (2024, July 22). World Economic Forum. https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/

[62] AI is poised to drive 160% increase in data center power demand. (n.d.). Retrieved 27 November 2024, from https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand

[63] Frequently Asked Questions (FAQs)—U.S. Energy Information Administration (EIA). (n.d.). Retrieved 27 November 2024, from https://www.eia.gov/tools/faqs/faq.php

[64] Efficiency – Data Centers – Google. (n.d.). Google Data Centers. Retrieved 27 November 2024, from https://www.google.com/about/datacenters/efficiency/
