2 min read
illuminem summarizes for you the essential news of the day. Read the full piece on The Washington Post or enjoy below:
🗞️ Driving the news: AI chatbots like ChatGPT are putting significant strain on the environment, with each query consuming water and energy to cool the data centers that run them
• A 100-word response requires roughly 519 milliliters of water and 0.14 kilowatt-hours of electricity, leading to hidden environmental costs
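The per-response figures above scale quickly. A minimal sketch of that arithmetic, assuming a hypothetical query volume (the one-million-response count below is an illustration, not a figure from the article):

```python
# Back-of-envelope scaling of the article's per-response figures:
# 519 mL of water and 0.14 kWh of electricity per 100-word response.
WATER_ML_PER_RESPONSE = 519      # milliliters, from the article
ENERGY_KWH_PER_RESPONSE = 0.14   # kilowatt-hours, from the article

def footprint(num_responses: int) -> tuple[float, float]:
    """Return (liters of water, kWh of electricity) for a given query count."""
    water_liters = num_responses * WATER_ML_PER_RESPONSE / 1000
    energy_kwh = num_responses * ENERGY_KWH_PER_RESPONSE
    return water_liters, energy_kwh

# Hypothetical example: one million 100-word responses
water, energy = footprint(1_000_000)
print(f"{water:,.0f} L of water, {energy:,.0f} kWh")  # 519,000 L, 140,000 kWh
```

At that hypothetical volume, the water cost alone would roughly match the 700,000 liters the article attributes to training GPT-3.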
🔭 The context: Data centers housing AI models generate substantial heat, requiring water- or electricity-based cooling systems
• As AI use grows, so do environmental concerns, from increased water consumption in drought-prone areas to rising electricity demand at energy-intensive data centers
🌍 Why it matters for the planet: The environmental footprint of AI is becoming increasingly visible, with models like GPT-3 consuming 700,000 liters of water during training
• Addressing these resource-intensive processes is vital for sustainability efforts, as tech companies struggle to meet green pledges
⏭️ What's next: Companies like Google, Microsoft, and Meta are exploring greener cooling technologies, though meeting ambitious sustainability goals remains a challenge as AI demands continue to surge
💬 One quote: "AI can be energy-intensive and that's why we are constantly working to improve efficiency," said OpenAI spokesperson Kayla Wood
📈 One stat: Microsoft’s data center used 700,000 liters of water to train GPT-3, equivalent to producing 100 pounds of beef