A recent study published in _Joule_ finds that by the end of 2025, the energy consumption of artificial intelligence (AI) could surpass that of Bitcoin mining, with AI potentially accounting for up to 49% of global data center electricity use. The research, by Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam, estimates that AI power demand could reach 23 gigawatts by January 1, 2026, roughly 201 terawatt-hours annually; Bitcoin, by comparison, currently consumes about 176 TWh per year. While Bitcoin's energy use is relatively transparent, large tech companies such as Google and Microsoft do not disclose AI-specific energy consumption, making their overall impact difficult to assess. De Vries-Gao also notes that Nvidia alone accounted for an estimated 44-48% of Taiwan Semiconductor Manufacturing Company's chip packaging capacity, and projects that AI's resource demands will grow further as demand for advanced AI chips continues to rise.
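The article's two headline figures are consistent with each other: a continuous draw of 23 GW sustained over a year works out to roughly 201 TWh. A minimal sketch of that conversion, using only the figures quoted above (the function name is illustrative, not from the study):

```python
# Check the article's conversion of continuous power draw (GW)
# into annual energy consumption (TWh).
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def gw_to_twh_per_year(gigawatts: float) -> float:
    """Convert a continuous power draw in GW to annual energy in TWh."""
    return gigawatts * HOURS_PER_YEAR / 1000  # 1 TWh = 1000 GWh

ai_annual_twh = gw_to_twh_per_year(23)  # article's projected AI demand
print(round(ai_annual_twh))             # ≈ 201 TWh, matching the article
print(ai_annual_twh > 176)              # exceeds Bitcoin's ~176 TWh
```

This also confirms the comparison with Bitcoin: 201 TWh would put AI ahead of Bitcoin's estimated 176 TWh annual consumption.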

Source 🔗