AI Sustainability

Artificial intelligence (AI) refers to computer systems designed to simulate human thought, in whole or in part. The latest generation of products, such as ChatGPT, draws on vast amounts of web content, processes user prompts, and is trained to respond ever more effectively to user input. AI typically runs in the cloud on large servers, which consume significant amounts of energy and water.

The discussion about AI has primarily focused on its potential to replace human workers, its privacy implications, and the ethical ramifications of its use. However, the environmental and social impacts of our consumption of AI also need to be considered. The availability of building sites and energy, together with the management of thermal risk, are the main challenges for the future design of data centers, and the resulting externalities, including industrial pollution, negatively affect people and the environment. The Shift Project estimates that digital technologies already produce more carbon emissions than the aviation industry, and while AI currently contributes less than 1% of total carbon emissions, the AI market is predicted to grow ninefold by 2030. These factors are therefore crucial when discussing AI’s potential impact on our lives. At the same time, AI can help us better understand climate phenomena by providing far greater analytical capacity, improving the efficiency of industrial plants, and managing renewable electricity networks. The emissions produced by AI must therefore be weighed against these benefits in any cost-benefit analysis.

Data Centers and AI Demand

The rise of AI applications is driving increased demand for computing power and storage. As a result, it has become more critical than ever for businesses, individuals, and governments to process and store data securely. This need is boosting demand for advanced data centers and presenting investment opportunities in the infrastructure that powers them. Data centers are now essential for handling massive AI workloads. Over the past decade, developers have steadily added capacity through new colocation facilities (shared facilities hosting multiple companies’ servers and networks) and hyperscale data centers designed to meet the scale and performance needs of cloud providers and large technology companies. With the rise of generative AI (GenAI), the need for capacity continues to grow, driving higher investment in data center infrastructure. To ensure secure information storage, that infrastructure must evolve to handle the additional power AI requires and to safeguard against data loss from power disruptions, overheating, and fires.

Energy Consumption Challenges

Data center construction must address two main constraints: the availability of land and of energy. The rapid evolution of GenAI has created a situation in which demand exceeds available supply, a trend expected to persist. Power availability has emerged as one of the most challenging aspects: until recently, power demand grew roughly linearly, but the increase required by AI data centers now demands more robust solutions. By 2027, AI-related electricity consumption worldwide could more than double, rising by 85.4 to 134.0 TWh per year from newly manufactured servers, comparable to the annual power consumption of countries such as the Netherlands, Argentina, or Sweden. Moreover, AI servers are more energy-dense than traditional central processing unit (CPU) servers: they rely on graphics processing unit (GPU) servers, which produce roughly five times as much heat and demand roughly five times the power of traditional servers. Such scenarios suggest that AI could consume 80% of data center energy within the next 15 years, making access to energy a key differentiator, as discussed by researcher Alex de Vries. This places enormous pressure on electrical infrastructure, as data center operators will have to adapt to much higher demand for computing power, a process that will take several years and could fail to keep pace with the growing needs of GenAI. The very design of data centers will have to change.
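
To make figures of this scale concrete, the following back-of-envelope sketch converts a fleet size and per-server power draw into annual terawatt-hours. The fleet size and power figures used in the example are illustrative assumptions, not numbers from the estimate above.

```python
# Back-of-envelope estimate of annual electricity use for a fleet of AI servers.
# All figures below are illustrative assumptions, not measured data.

def annual_energy_twh(num_servers: int, avg_power_kw: float, utilization: float = 1.0) -> float:
    """Annual energy in TWh for a server fleet running year-round."""
    hours_per_year = 8760
    kwh = num_servers * avg_power_kw * utilization * hours_per_year
    return kwh / 1e9  # 1 TWh = 1e9 kWh

# Hypothetical fleet: 1.5 million AI servers drawing ~6.5 kW each at full load
print(f"{annual_energy_twh(1_500_000, 6.5):.1f} TWh")  # -> 85.4 TWh, the low end of the 2027 range
```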

Evolving Data Center Design

AI applications generate significant heat, and data centers must maintain optimal operating temperatures to prevent hardware failure and ensure reliable performance. Today, most data centers use air cooling. The energy consumption increase driven by the rapid emergence of GenAI will require additional, or vastly more efficient, HVAC (heating, ventilation, and air conditioning) equipment to cope with the extra heat, as server densities outpace the capabilities of current air-based systems, which typically remain effective only up to rack densities of about 15 to 25 kW. As power density continues to rise, operators will need to consider liquid cooling technologies; the total available market (TAM) for liquid thermal management in data centers could increase roughly fivefold over the next five years. Investors will note that companies specializing in the design and manufacture of cooling systems offer attractive opportunities. The importance of effective cooling for data center security was highlighted by the 2021 fire at a data center in Strasbourg, which brought down millions of websites (including government sites) and resulted in substantial data loss.
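
As a rough illustration of how rack power density maps onto cooling choices, the sketch below encodes the 15-25 kW per rack air-cooling range mentioned above as a simple decision rule; the GPU rack configuration in the usage example is a hypothetical assumption.

```python
# Illustrative decision rule mapping rack power density to a cooling approach.
# The 15-25 kW/rack air-cooling range comes from the text above; the GPU rack
# configuration in the example is a hypothetical assumption.

AIR_COOLING_LIMIT_KW = 25  # upper bound of typical air-cooling effectiveness

def cooling_strategy(rack_power_kw: float) -> str:
    if rack_power_kw <= 15:
        return "conventional air cooling"
    if rack_power_kw <= AIR_COOLING_LIMIT_KW:
        return "optimized air cooling (hot/cold aisle containment)"
    return "liquid cooling (direct-to-chip or immersion)"

# A hypothetical GPU rack: 8 servers at roughly 5 kW each, about 40 kW per rack
print(cooling_strategy(8 * 5.0))  # -> liquid cooling (direct-to-chip or immersion)
```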

Market Trends and Growth

In 2023, data center demand grew considerably. Global rental volume increased by more than 6 gigawatts (GW), with the majority in North America; this represents a doubling from 2022 and nearly eight times the 2019 level. Given that the rules of capitalism and greed have not changed, the combination of supply constraints and strong demand has encouraged data center operators to raise rents: after an 18.6% annual rate increase in 2023, real estate experts forecast a further double-digit percentage increase in 2024. The data center industry will therefore grow in tandem with the widespread adoption of AI technologies, which depend both on ever-increasing data processing capacity and on computing infrastructure protected by ever more efficient and reliable cooling, able to manage the higher risk of overheating that comes with higher energy consumption. Only those data center companies able to upgrade their infrastructure to meet AI’s more demanding needs will gain from the rise in its adoption.

Environmental Sustainability Considerations

Sustainability in AI means adopting practices and technologies that minimize environmental, social, and economic impacts. This includes responsible resource management, the promotion of equity, and a focus on sustainable design. To enhance sustainability in AI, it is crucial to optimize models to reduce resource consumption, for example by optimizing machine learning algorithms or applying model compression techniques (a minimal sketch follows at the end of this section). Efficient and environmentally friendly algorithms, such as faster search and learning algorithms, also contribute. Companies can adopt sustainability-focused corporate policies, such as responsible purchasing of AI hardware, data management policies, and carbon reduction policies, to promote environmentally friendly practices. Investment in research and development of sustainable AI technology is essential for driving innovation, including the development of energy-efficient algorithms and machine learning techniques. AI itself can also reduce energy consumption, for instance by filtering out spam and unwanted email. Examining these strategies makes clear that there are numerous opportunities to improve sustainability in AI. The next chapter will explore the ethical challenges associated with sustainability in AI and discuss effective solutions.
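
Before moving on, the following minimal sketch illustrates one of the model compression techniques mentioned above: post-training dynamic quantization with PyTorch, which stores weights in 8-bit integers to shrink a model and cut the cost of inference. The toy model is purely illustrative; actual savings depend on the model and deployment hardware.

```python
# Minimal sketch of one model compression technique: post-training dynamic
# quantization with PyTorch. The toy model is purely illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Replace Linear layers with int8-weight equivalents; activations are quantized on the fly.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same interface as the original model, smaller and cheaper at inference
```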
