Author Topic: Why do AI data centers use so many resources?
HCK
Global Moderator
« on: October 06, 2025, 04:05:01 pm »

Why do AI data centers use so many resources?

With the AI boom, construction of new data centers [url=https://www.reuters.com/business/us-data-center-build-hits-record-ai-demand-surges-bank-america-institute-says-2025-09-10/]has skyrocketed[/url], and not without consequence: [url=https://www.nytimes.com/2025/07/14/technology/meta-data-center-water.html]some communities[/url] that count these facilities as neighbors [url=https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/]are now[/url] facing [url=https://www.bbc.com/news/articles/cy8gy7lv448o]water shortages[/url] and [url=https://www.bloomberg.com/graphics/2024-ai-power-home-appliances/]strained power supplies[/url]. While tech's data center footprint has been growing for decades, generative AI has seemingly shifted the impact of these operations toward the catastrophic. What exactly makes these new data centers such a burden on the environment and existing infrastructure, and is there anything we can do to fix it?

Chips

The industry believes AI will work its way into every corner of our lives, and so needs to build sufficient capacity to meet that anticipated demand. But the hardware that makes AI work is so much more resource-intensive than standard cloud computing equipment that it requires a dramatic shift in how data centers are engineered.

Typically the most important part of a computer is its "brain," the central processing unit (CPU). It's designed to handle a wide variety of tasks, tackling them one at a time. Imagine a CPU as a one-lane motorway on which every vehicle, no matter the size, gets from A to B at extraordinary speed. AI instead relies on graphics processing units (GPUs), which are clusters of smaller, more specialized processors all running in parallel. In this analogy, a GPU is a thousand-lane motorway with a speed limit of just 30 mph. Both try to get a huge number of figurative vehicles to their destinations in a short amount of time, but they take diametrically opposed approaches to the problem.

Phil Burr is head of product at Lumai, a British company looking to replace traditional GPUs with optical processors. "In AI, you repeatedly perform similar operations," he explained, "and you can do that in parallel across the data set." This gives GPUs an advantage over CPUs in large but fundamentally repetitive tasks, like graphics, executing AI models and crypto mining. "You can process a large amount of data very quickly, but it's doing the same amount of processing each time," he said.
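That serial-versus-parallel split is easy to demonstrate in code. The Python sketch below is a loose illustration of the idea, not a benchmark of real silicon: it uses NumPy on an ordinary CPU, with arbitrary array sizes, to contrast handling one repetitive operation element by element against applying it across the whole data set at once.

[code]
# Loose illustration only: element-at-a-time work (the one-lane motorway)
# versus the same repetitive operation applied across an entire data set
# at once (the thousand-lane motorway). Sizes are arbitrary.
import time
import numpy as np

x = np.random.rand(1_000_000)

# "CPU style": one element at a time.
start = time.perf_counter()
out_serial = np.empty_like(x)
for i in range(x.size):
    out_serial[i] = x[i] * 2.0 + 1.0
serial_s = time.perf_counter() - start

# "GPU style": the identical operation over the whole array in one pass.
start = time.perf_counter()
out_parallel = x * 2.0 + 1.0
parallel_s = time.perf_counter() - start

assert np.allclose(out_serial, out_parallel)
print(f"element by element: {serial_s:.3f}s | whole array: {parallel_s:.4f}s")
[/code]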
In the same way that a thousand-lane highway would be pretty wasteful, the more powerful GPUs get, the more energy hungry they become. "In the past, as [CPUs evolved] you could get a lot more transistors on a device, but the overall power [consumption] remained about the same," Burr said. CPUs are also equipped with "specialized units that do [specific] work faster so the chip can return to idle sooner." By comparison, "every iteration of a GPU has more and more transistors, but the power jumps up every time because getting gains from those processes is hard." Not only are GPUs physically larger, which drives power demands up, but they "generally activate all of the processing units at once," Burr said.

In 2024, the Lawrence Berkeley National Laboratory published a congressionally mandated report into the [url=https://escholarship.org/content/qt32d6m0d1/qt32d6m0d1.pdf]energy consumption of data centers[/url]. The report identified a sharp increase in the amount of electricity data centers consumed as GPUs became more prevalent. Power use held stable at around 60 TWh from 2014 to 2016, but climbed to 76 TWh by 2018 and leapt to 176 TWh by 2023. In just five years, data center energy use more than doubled from 1.9 percent of the US' total to nearly 4.4 percent, and the report projects that figure could reach up to 12 percent by the start of the 2030s.
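Those percentages are easy to sanity-check against the report's absolute numbers. In the sketch below, the roughly 4,000 TWh US total is inferred from the report's own figures (176 TWh being about 4.4 percent), not taken from an independent source.

[code]
# Back-of-envelope check of the Lawrence Berkeley figures quoted above.
# Assumption: a ~4,000 TWh US total, inferred from 176 TWh being ~4.4%.
dc_twh = {2014: 60, 2018: 76, 2023: 176}
US_TOTAL_TWH = 4_000

for year in (2018, 2023):
    share = dc_twh[year] / US_TOTAL_TWH * 100
    print(f"{year}: {dc_twh[year]} TWh ~ {share:.1f}% of US electricity")
# 2018: ~1.9%, 2023: ~4.4%, matching the report's percentages.

print(f"12% of {US_TOTAL_TWH} TWh is {0.12 * US_TOTAL_TWH:.0f} TWh")  # ~480 TWh
[/code]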
Heat

As electricity moves through the silicon of a computer chip it encounters resistance and, like a lightbulb filament, generates heat. To extend the lighting comparison: CPUs are closer to modern LEDs here, while GPUs, like old incandescent bulbs, shed a huge share of their power as waste heat. The newest generation of AI data centers are filled with rack after rack of GPUs, depending on the owner's needs and budget, each one kicking out what Burr described as "a massive amount of heat."

Heat isn't just an unwelcome byproduct: if chips aren't kept cool, they'll suffer performance and longevity issues. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) [url=https://xp20.ashrae.org/datacom1_4th/ReferenceCard.pdf]publishes guidelines[/url] for data center operators, recommending that server rooms be kept between 18 and 27 degrees Celsius (64.4 to 80.6 degrees Fahrenheit). Given the sheer volume of heat GPUs kick out, maintaining that temperature requires some intensive engineering, and a lot of energy.

Most data centers use a handful of methods to keep their hardware within the optimal temperature range. One of the oldest ways to maximize the efficiency of air conditioning is hot and cold aisle containment: cold air is pushed through the server racks to keep them cool, while the hot air those servers expel is drawn away to be cooled and recirculated.

[Embedded video: https://www.youtube.com/embed/PaYxIFFWf2w?rel=0]

Many data centers, especially in the US, rely on the cooling effect that occurs as water changes from a liquid to a gas. Hot air is drawn through a wet medium to facilitate evaporation, and the resulting cooled air is blown into the server room, a method known as direct evaporative cooling. There's also indirect evaporative cooling, which works similarly but adds a heat exchanger, a device that transfers heat between different mediums. In this type of setup, the heat from the warm air is transferred and cooled separately from the server room to avoid raising the humidity indoors.
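The physics explains the water bill: every unit of server heat removed by evaporation boils off a fixed amount of water. A minimal Python sketch, assuming an idealized system in which all heat is rejected through evaporation (real facilities also lose water to blowdown and drift):

[code]
# Idealized estimate of water evaporated per unit of server heat.
# Assumes every joule leaves via evaporation; real systems differ.
LATENT_HEAT_MJ_PER_KG = 2.45   # approximate heat of vaporization of water
MJ_PER_MWH = 3_600

liters_per_mwh = MJ_PER_MWH / LATENT_HEAT_MJ_PER_KG  # 1 kg of water ~ 1 L
print(f"~{liters_per_mwh:,.0f} L evaporated per MWh of heat")  # ~1,469 L

# At that rate, a 100 MW facility rejecting all of its heat evaporatively
# would go through roughly 3.5 million liters per day.
print(f"~{liters_per_mwh * 100 * 24 / 1e6:.1f} million L/day at 100 MW")
[/code]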
Due in part to their cooling needs, data centers have a tremendous water footprint. The [url=https://escholarship.org/content/qt32d6m0d1/qt32d6m0d1.pdf]Lawrence Berkeley report[/url] found that in 2014, US-based data centers consumed 21.2 billion liters of water. By 2018 that figure had leapt to 66 billion liters, much of it attributed to what the report collectively terms "hyperscale" facilities, which include AI-focused operations. In 2023, traditional US data centers reportedly consumed 10.56 billion liters of water while AI facilities used around 55.4 billion liters. The report projects that by 2028, AI data centers will likely consume as much as 124 billion liters of water.

"Collectively, data centers are among the top-ten water consuming industrial or commercial industries in the US," according to a 2021 [url=https://iopscience.iop.org/article/10.1088/1748-9326/abfba1]study[/url] published in the journal Environmental Research Letters. About one-fifth of these data centers draw water from stressed watersheds, i.e. areas where the demand for water may be greater than the natural supply.

Most of the [url=https://arxiv.org/pdf/2304.03271]water consumed by data centers[/url] evaporates and won't be immediately replenished, while the rest goes to wastewater treatment plants. As a trio of academics explained in an op-ed for [url=https://www.dallasnews.com/opinion/commentary/2024/05/06/data-centers-are-draining-resources-in-water-stressed-communities/]The Dallas Morning News[/url], data centers are "effectively removing [drinking water] from the local water cycle." Water used in the cooling process is typically treated with chemicals such as corrosion inhibitors and biocides, which prevent bacterial growth. The resulting wastewater often contains pollutants, so it can't be recycled for human consumption or agriculture.

And data centers' water use goes well beyond cooling. A much bigger portion of their footprint is indirect, mainly through the electricity generated by power plants but also through wastewater utilities; together, these account for about three-quarters of a data center's water footprint, the study notes. Power plants [url=https://www.eia.gov/todayinenergy/detail.php?id=56820]use water[/url] in a number of ways, primarily for cooling and to produce the steam that spins their electricity-generating turbines. According to the authors, 1 megawatt-hour of energy consumed by data centers in the US requires, on average, 7.1 cubic meters of water.
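Combining that 7.1 m³/MWh average with the Berkeley report's 176 TWh of 2023 consumption gives a sense of scale. Pairing the two figures is our extrapolation, not a number either source reports:

[code]
# Rough scale of the indirect water footprint. Combining these two
# sources' numbers is an extrapolation, not a figure either one reports.
M3_PER_MWH = 7.1        # average water per MWh (2021 ERL study)
TWH_2023 = 176          # US data center consumption (Berkeley report)

indirect_m3 = M3_PER_MWH * TWH_2023 * 1e6       # 1 TWh = 1e6 MWh
billion_liters = indirect_m3 * 1_000 / 1e9      # m³ -> liters -> billions

print(f"~{billion_liters:,.0f} billion liters of indirect water use")
# ~1,250 billion liters, roughly twenty times the direct on-site
# consumption figures cited above.
[/code]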
"Data centers are indirectly dependent on water from every state in the contiguous US, much of which is sourced from power plants drawing water from subbasins in the eastern and western coastal states," the authors explain. To adequately address the water issue, energy consumption must be reined in too.

Exploring the alternatives

One major approach to reducing the massive water footprint of these systems is closed-loop liquid cooling. This is already ubiquitous on a smaller scale in high-end PCs, where heat-generating components such as the CPU and GPU are fitted with large heat exchangers that a liquid is pumped through. The liquid draws away the heat, is cooled back down via another heat exchanger or a refrigeration unit, and is then recirculated.

Liquid cooling is becoming more and more common, especially in AI data centers, given the heat that GPUs generate. Barring mechanical issues such as leaks, and setting aside the water needed to operate the facility more generally, closed-loop systems don't lose water, and so make far more reasonable demands on local water resources. Direct-to-chip liquid cooling drastically cuts a data center's potential water use and removes heat more efficiently than traditional air-cooling systems. In recent years, companies including [url=https://cloud.google.com/blog/topics/systems/enabling-1-mw-it-racks-and-liquid-cooling-at-ocp-emea-summit]Google[/url], [url=https://blogs.nvidia.com/blog/blackwell-platform-water-efficiency-liquid-cooling-data-centers-ai-factories/]NVIDIA[/url] and [url=https://www.microsoft.com/en-us/microsoft-cloud/blog/2024/12/09/sustainable-by-design-next-generation-datacenters-consume-zero-water-for-cooling/]Microsoft[/url] have been championing liquid cooling systems as a more sustainable way forward. And [url=https://www.gatech.edu/news/2025/04/16/liquid-cooling-technology-developed-georgia-tech-awarded-us-patent-company-raising]researchers[/url] are looking into ways to apply this approach at an even more granular level, tackling the heat right at the source.
Whereas cold plates (metal slabs with tubing or internal channels for coolant to flow through) are commonly used in liquid cooling systems to transfer heat away from the electronics, Microsoft has been testing a microfluidics-based cooling system in which liquid coolant travels through tiny channels on the back of the chip itself. In the lab, this system performed "up to three times better than cold plates at removing heat," and the company said it "can effectively cool a server running core services for a simulated Teams meeting." A [url=https://news.microsoft.com/source/features/innovation/microfluidics-liquid-cooling-ai-chips/]blog post[/url] about the findings noted, "microfluidics also reduced the maximum temperature rise of the silicon inside a GPU by 65 percent, though this will vary by the type of chip."

Another option is "free" cooling: making use of the natural environmental conditions at the data center site to cool the operation. Air-based free cooling uses the outdoor air in cold locales, while water-based free cooling relies on cold water sources such as seawater.
Some facilities couple this with [url=https://news.microsoft.com/source/emea/features/how-microsofts-new-datacenter-region-in-sweden-incorporates-the-companys-sustainability-commitments/]rainwater harvesting[/url] for their other water needs, like humidification.

[Image: A map of Start Campus (https://d29szjachogqwa.cloudfront.net/videos/user-uploaded/StartCampusMap.jpg). Credit: Start Campus]

Start Campus, a data center project in Portugal, is located on the site of an old coal-fired power station and will reuse much of its infrastructure. Rather than a simple closed loop, the high temperatures involved require the closed-loop system to interact with an open loop: when the campus is fully operational, its heat will be passed into around 1.4 million cubic meters of seawater per day. Omer Wilson, CMO at Start Campus, said that by the time the water has returned to its source, its temperature will be the same as the surrounding sea. Start Campus has also pledged that there will be no meaningful water loss from this process.

There is another, novel cooling method: immersion, in which computing equipment is (you guessed it) immersed in a non-conductive liquid suited to drawing away heat. Wilson described it as a relatively niche approach, used in some crypto mining applications but not by industrial-scale facilities.

To meet both energy and cooling needs, some researchers say the industry must look to renewable resources. "Directly connecting data center facilities to wind and solar energy sources ensures that water and carbon footprints are minimized," wrote the authors of the aforementioned Environmental Research Letters study.
Even purchasing renewable energy certificates, each of which represents one megawatt-hour of electricity generated from a renewable source and delivered to the grid, could help shift the grid toward these sources over time, they added. "Data center workloads can be migrated between data centers to align with the portion of the grid where renewable electricity supplies exceed instantaneous demand."
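That migration idea reduces to a scheduling policy: place flexible jobs wherever renewable supply currently exceeds demand. A toy Python sketch; the region names and surplus figures are invented for illustration:

[code]
# Toy "carbon-aware" placement: send each flexible job to the region
# with the most spare renewable generation. All numbers are invented.
renewable_surplus_mw = {
    "us-west": 120.0,
    "us-east": -35.0,   # negative: renewables already fully consumed
    "eu-north": 410.0,
}

def pick_region(job_mw):
    """Region with the largest surplus that can absorb the job, else None."""
    best = max(renewable_surplus_mw, key=renewable_surplus_mw.get)
    if renewable_surplus_mw[best] < job_mw:
        return None     # no green headroom anywhere; defer the job
    renewable_surplus_mw[best] -= job_mw
    return best

print(pick_region(50.0))    # eu-north
print(pick_region(400.0))   # None -> defer until supply recovers
[/code]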
Geothermal resources have begun to look especially promising. According to a recent report by the [url=https://rhg.com/research/geothermal-data-center-electricity-demand/]Rhodium Group[/url], geothermal energy could meet up to 64 percent of data centers' projected power demand growth in the US "by the early 2030s." In the Western US, geothermal could meet 100 percent of demand growth in areas such as Phoenix, Dallas-Fort Worth and Las Vegas.

For cooling, geothermal heat pumps can be used to "leverage the consistently cool temperatures" found hundreds of feet beneath the surface. Or, in locations where shallow aquifers are present, data centers can make use of geothermal absorption chillers. These rely on the low-grade heat at shallower depths "to drive a chemical reaction that produces water vapor," the report explains. "This water vapor cools as it is run through a condenser and cools the IT components of a data center using evaporation."

Iron Mountain Data Centers operates a [url=https://betterbuildingssolutioncenter.energy.gov/showcase-projects/iron-mountain-data-centers-geothermal-cooling-system]geothermally cooled data center[/url] in Boyers, Pennsylvania, on the site of an old limestone mine, where a 35-acre underground reservoir provides a year-round supply of cool water. Geothermal may not be a widespread solution just yet, but it's catching on. In 2024, [url=https://about.fb.com/news/2024/08/new-geothermal-energy-project-to-support-our-data-centers/]Meta[/url] announced a partnership with Sage Geosystems to supply its data centers with up to 150 megawatts (MW) of geothermal power starting in 2027.

Beyond the hardware

While novel cooling methods will undoubtedly help curb some of AI data centers' excessive resource demands, the first step to meaningful change is transparency, according to Vijay Gadepally, a senior scientist at MIT's Lincoln Laboratory Supercomputing Center. AI companies need to be upfront about the emissions and resource use associated with their operations to give people a clear view of their footprints.

Then there is the hardware to consider. Incorporating more intelligent chip design, i.e. processors with better performance characteristics, could go a long way toward making data centers more sustainable. "That's a huge area of innovation right now," Gadepally said. And large data centers are often "running underutilized," with a lot of power that isn't being allocated efficiently. Rather than leaning into the push to build more such facilities, the industry should first make better use of existing data centers' capacity.

Similarly, many of today's AI models are vastly overpowered for the tasks they're being given. The current approach is "like cutting a hamburger with a chainsaw," Gadepally said. "Does it work? Sure… but it definitely is overkill." This doesn't need to be the case. "We have found in many instances that you can use a smaller but tuned model, to achieve similar performance to a much larger model," Gadepally said, noting that this is especially true for new "agentic" systems. "You're often trying thousands of different parameters, or different combinations of things to discover which is the best one, and by being a little bit more intelligent, we could dismiss or essentially terminate a lot of the workloads or a lot of those combinations that weren't getting you towards the right answer."
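In practice, that pruning idea resembles successive halving: give every candidate configuration a small training budget, keep the best fraction, and reinvest the saved compute in the survivors. A minimal sketch, in which partial_train_score is a stand-in for a real truncated training run:

[code]
# Sketch of early-terminating hyperparameter search (successive halving).
# partial_train_score is a placeholder for a short, cheap training run.
import random

def partial_train_score(config, budget_steps):
    """Stand-in for training `config` for `budget_steps` and validating."""
    random.seed(hash((tuple(sorted(config.items())), budget_steps)))
    return random.random()

configs = [{"lr": lr, "width": w}
           for lr in (1e-4, 1e-3, 1e-2) for w in (256, 512, 1024, 2048)]
budget = 100
while len(configs) > 1:
    ranked = sorted(configs, reverse=True,
                    key=lambda c: partial_train_score(c, budget))
    configs = ranked[:max(1, len(configs) // 2)]  # drop the losing half early
    budget *= 2                                   # reinvest in the survivors

print("surviving configuration:", configs[0])
[/code]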

Each of those unnecessary parameters isn't just a computational dead end; it's another nudge toward rolling blackouts, less potable water and rising utility costs for surrounding communities. As Gadepally said, "We're just building bigger and bigger without thinking about, 'Do we actually need it?'"

This article originally appeared on Engadget at https://www.engadget.com/why-do-ai-data-centers-use-so-many-resources-171500010.html?src=rss

Source: Why do AI data centers use so many resources?