AI's energy consumption is growing, but demand for more power is also coming from manufacturing and electric cars.
A recent study presents a radiative transfer model-driven machine learning technique for retrieving carbon monoxide from the world's first hyperspectral Geostationary Interferometric Infrared Sounder (GIIRS), onboard the Fengyun-4B (FY-4B) satellite.
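For readers unfamiliar with the approach, the sketch below shows the general idea behind a radiative-transfer-model-driven ML retrieval: simulate sensor radiances for known gas amounts with a forward model, then train a regressor to invert that mapping. The toy forward model, the random-forest choice, and every name here are illustrative assumptions, not the study's actual GIIRS/FY-4B pipeline.

```python
# Minimal sketch of an RTM-driven ML retrieval on synthetic data.
# The "forward model" below is a crude stand-in for a real radiative
# transfer model; all values are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def toy_forward_model(co_column, n_channels=32):
    """Map a CO column amount to a synthetic radiance spectrum."""
    channels = np.linspace(0, 1, n_channels)
    absorption = np.exp(-co_column * channels)            # Beer-Lambert-like attenuation
    return absorption + rng.normal(0, 0.01, n_channels)   # add instrument noise

# Build (spectrum, CO column) training pairs from the forward model.
co_true = rng.uniform(0.5, 3.0, 2000)
spectra = np.array([toy_forward_model(c) for c in co_true])

X_train, X_test, y_train, y_test = train_test_split(spectra, co_true, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

print("R^2 on held-out synthetic spectra:", round(model.score(X_test, y_test), 3))
```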
According to the U.S. Department of Energy, data centers consume anywhere from 10 to 50 times more energy than a typical commercial building and account for about 2% of total U.S. electricity consumption. Driven by AI, data center energy demand is expected to nearly double or even triple in the next few years.
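A rough back-of-the-envelope reading of those figures, assuming for illustration that total U.S. electricity consumption stays flat at roughly 4,000 TWh per year:

```python
# If data centers are ~2% of U.S. electricity today and AI pushes that demand
# to double or triple, what share would they represent? The 4,000 TWh total is
# an assumed round figure, and total consumption is held flat for simplicity.
TOTAL_US_TWH = 4000
current_share = 0.02                     # ~2% of total, per the DOE estimate

current_twh = TOTAL_US_TWH * current_share
for multiplier in (2, 3):                # "nearly double or even triple"
    projected_twh = current_twh * multiplier
    projected_share = projected_twh / TOTAL_US_TWH
    print(f"x{multiplier}: {projected_twh:.0f} TWh, about {projected_share:.0%} of today's total")
```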
The AI in Energy Market is expected to grow from USD 8.91 billion in 2024 to USD 58.66 billion by 2030, at a Compound Annual Growth Rate (CAGR) of 36.9% during 2024–2030, according to a new report by MarketsandMarkets™.
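Those figures are internally consistent; a quick check of the implied growth rate:

```python
# Growing from USD 8.91B in 2024 to USD 58.66B in 2030 over six years implies
# a compound annual growth rate of (58.66 / 8.91)**(1/6) - 1, which should
# land near the reported 36.9%.
start, end, years = 8.91, 58.66, 2030 - 2024
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ~36.9%
```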
Chip designer Nvidia, which skyrocketed this year to become one of the most valuable companies in the world, has also ramped up efforts to become more energy efficient. Its next-generation AI chip, Blackwell, unveiled in March, has been marketed as being twice as fast as its predecessor, Hopper, and significantly more energy efficient.
Planet Labs PBC (NYSE: PL), a leading provider of daily data and insights about Earth, today announced that it has signed a multi-year, seven-figure deal with Laconic, a company leading a global shift in climate finance.
Microsoft is working to advance the sustainability of the datacenter infrastructure that delivers cloud and AI innovations.
Amazon Web Services (NASDAQ:AMZN) has partnered with the startup Orbital Materials, which uses its artificial intelligence platform to design advanced materials aimed at reducing emissions and the carbon footprint of data centers.
In the US, demand for power from AI data centers is skyrocketing, driven by the intensive computational requirements of AI models. Will it let up?