AI won't use as much electricity as we are told (2024)

Original link: https://johnquigginblog.substack.com/p/ai-wont-use-as-much-electricity-as

## AI and Energy: A History of Over-Prediction

Recent concerns about the energy demands of generative AI, including estimates that it could account for 25 per cent of US electricity use by 2032, echo past predictions that never came true. Similar alarms were raised in the late 1990s with the rise of the personal computer, when it was forecast that PCs would consume 50 per cent of US electricity; rapid efficiency gains proved that prediction absurd. Data centres prompted another round of concern, yet the IT sector has consistently accounted for only 1-2 per cent of global electricity consumption and less than 1 per cent of greenhouse gas emissions, far below industries such as cement production. While AI is growing, its share of IT budgets (5-10 per cent) and of overall electricity use remains small. Even dramatic growth, say a tenfold increase by 2030, would only *double* IT's electricity consumption, and would likely be offset by continued efficiency improvements and by curbing wasteful practices such as cryptocurrency mining. These recurring overestimates are often driven by vested interests: fossil fuel companies promoting the idea of continuing demand, or "degrowth" advocates questioning the sustainability of the information economy. Ultimately, the historical record suggests that AI's energy impact is manageable and that a shortage of electricity will not be the factor that limits its development.

## AI and Electricity Use: Summary of the Hacker News Discussion

A recent Hacker News thread discussed the argument that AI may not consume as much electricity as predicted. Some commenters agreed, arguing that past technology booms over-estimated energy demand, but many pushed back. The core objection is the *scale of current investment*: unlike earlier predictions, billions of dollars are now being poured into actual power capacity for data centres, and these companies clearly believe AI's energy needs will be high. Several links highlighted major nuclear power and energy-infrastructure deals struck by big tech companies (Microsoft, Google, Amazon, OpenAI). Rebuttals acknowledged the historical "rebound effect", where efficiency gains lead to increased use, and pointed out that AI differs from earlier technologies such as cryptocurrency, which inherently *requires* growing energy consumption. Others stressed the need to consider broader impacts, such as the energy demands of electric vehicles and the clean-energy transition. Ultimately, the discussion highlighted uncertainty: efficiency gains are possible, but AI's rapid development and potentially broad adoption suggest that substantial energy may be required, and current investment patterns support that view.

Original article

From Renew Economy

The recent rise of “generative AI” models has led to a lot of dire predictions about the associated requirements for energy.  It has been estimated that AI will consume anything from 9 to 25 per cent of all US electricity by 2032.

But we have been here before.  Predictions of this kind have been made ever since the emergence of the Internet as a central part of modern life, often tied to claims and counterclaims about the transition to renewable energy.

Back in 1999, long before generative AI, Forbes magazine ran a piece headlined "Dig more coal — the PCs are coming". This article claimed that personal computers would use 50 per cent of US electricity within a decade. The unsubtle implication was that any attempt to reduce carbon dioxide emissions was doomed to failure.

Of course, this prediction wasn't borne out. Computing power has increased a thousand-fold since the turn of the century. But far from demanding more electricity, personal computers have become more efficient, with laptops mostly replacing large standalone boxes and software improvements reducing waste.

A typical modern computer consumes around 30-60 watts when it is operating, less than a bar fridge or an incandescent light bulb.

The rise of large data centres and cloud computing produced another round of alarm. A US EPA report in 2007 predicted a doubling of demand every five years.  Again, this number fed into a range of debates about renewable energy and climate change.
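To put that projection in perspective, here is a minimal sketch of what demand doubling every five years compounds to. The 2007 starting share used below is an assumed round number for illustration, not a figure from the EPA report.

```python
# What "doubling every five years" implies, compounded over time.
# The 2007 starting share is an illustrative assumption, not from the EPA report.

def projected_share(start_share: float, years: float, doubling_period: float = 5.0) -> float:
    """Share of electricity after `years`, if demand doubles every `doubling_period` years."""
    return start_share * 2 ** (years / doubling_period)

start = 1.5  # assumed data-centre share of US electricity in 2007, per cent
for horizon in (5, 10, 15):
    print(f"{2007 + horizon}: {projected_share(start, horizon):.1f}% of US electricity")
# Prints 3.0% by 2012, 6.0% by 2017 and 12.0% by 2022, whereas the IT sector's
# actual share stayed in the 1-2 per cent range throughout.
```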

Yet throughout this period, the actual share of electricity use accounted for by the IT sector has hovered between 1 and 2 per cent, accounting for less than 1 per cent of global greenhouse gas emissions. By contrast, the unglamorous and largely disregarded business of making cement accounts for around 7 per cent of global emissions.

Will generative AI change this pattern? Not for quite a while. Although most business organizations now use AI for some purposes, it typically accounts for only 5 to 10 per cent of IT budgets.

Even if that share doubled or tripled, the impact would be barely noticeable. Looking at the other side of the market, OpenAI, the maker of ChatGPT, is bringing in around $3 billion a year in sales revenue, and has spent around $7 billion developing its model. Even if every penny of that was spent on electricity, the effect would be little more than a blip.
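As a quick sanity check on the "blip" claim, here is a back-of-envelope calculation, a sketch only: the $3 billion revenue figure is from the article, while the electricity price and total US consumption are assumed round numbers.

```python
# Upper bound: suppose OpenAI's entire annual revenue were spent on electricity.
# Revenue figure is from the article; price and US consumption are assumed round numbers.

revenue_usd = 3e9             # OpenAI annual sales revenue (article figure)
price_usd_per_kwh = 0.08      # assumed industrial electricity price
us_consumption_twh = 4000     # assumed annual US electricity consumption, TWh

implied_twh = revenue_usd / price_usd_per_kwh / 1e9  # kWh converted to TWh
share = implied_twh / us_consumption_twh

print(f"Implied use: {implied_twh:.0f} TWh/year, about {share:.1%} of US consumption")
# Roughly 38 TWh/year, or about 0.9% of US consumption, even on this extreme
# assumption; the share actually spent on electricity is far smaller.
```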

Of course, AI is growing rapidly. A tenfold increase in expenditure by 2030 isn't out of the question. But that would only double the total use of electricity in IT.

And, as in the past, this growth will be offset by continued increases in efficiency. Most of the increase  could be fully offset if the world put an end to the incredible waste of electricity on cryptocurrency mining (currently 0.5 to 1 per cent of total world electricity consumption, and not normally counted in estimates of IT use).
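The arithmetic behind those two paragraphs can be sketched from the ranges quoted above; the assumption that IT's electricity use scales in proportion to IT spending is mine, a simplification rather than anything the article claims.

```python
# Scaling and offset sketch, using ranges quoted in the article.
# Assumes IT electricity use scales with IT spending (my simplification).

scenarios = [
    # (IT share of world electricity %, AI fraction of IT budgets, crypto share %)
    (1.0, 0.05, 0.5),   # low end of the quoted ranges
    (2.0, 0.10, 1.0),   # high end of the quoted ranges
]

for it_share, ai_frac, crypto_share in scenarios:
    # A tenfold rise in AI spending adds about 9x its current fraction of IT,
    # so total IT spending (and, by assumption, electricity) grows by 1 + 9*ai.
    new_share = it_share * (1 + 9 * ai_frac)
    increase = new_share - it_share
    print(f"IT: {it_share}% -> {new_share:.2f}% of world electricity "
          f"(+{increase:.2f} points); crypto mining today: {crypto_share} points")
```

On the high-end figures IT's footprint roughly doubles, from 2 to about 3.8 per cent of world electricity, and eliminating cryptocurrency mining's 0.5-1 per cent would cancel out a large slice of that increase, which is the comparison being drawn here.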

If predictions of massive electricity use by the IT sector have been so consistently wrong for decades, why do they keep being made, and believed? 

The simplest explanation, epitomised by the Forbes article from 1999, is that coal and gas producers want to claim that there is a continuing demand for their products, one that can’t be met by solar PV and wind. That explanation is certainly relevant today, as gas producers in particular seize on projections of growing demand to justify new plants.

At the other end of the policy spectrum, advocates of “degrowth” don’t want to concede that the explosive growth of the information economy is sustainable, unlike the industrial economy of the 20th century.  The suggestion that electricity demand from AI will overwhelm attempts to decarbonise electricity supply supports the conclusion that we need to stop and reverse growth in all sectors of the economy.

Next there is the general free-floating concern about everything to do with computers, which are both vitally necessary and mysterious to most of us. The rise of AI has heightened those concerns. But whereas no one can tell whether an AI apocalypse is on the way, or what it would entail, an electricity crisis is a much more comprehensible danger.

And finally, people just love a good story. The Y2K panic, supposedly based on the use of two-digit years in computer dates, was obviously false (if it had been true, we would have seen widespread failures well before 1 January 2000).

But the appeal of the story was irresistible, at least in the English-speaking world, and billions of dollars were spent on problems that could have been dealt with using a “fix on failure” approach.

For what it’s worth, it seems likely that the AI boom is already reaching a plateau, and highly likely that such a plateau will be reached sooner or later. But when and if this happens, it won’t be because we have run out of electricity to feed the machines.
