Jan writes,

> Computer assisted decision making is valuable if the emphasis is on
> 'assisted'. And even then, the final decisions need to be controlled by
> humans, otherwise we get unchallenged nonsense. IMO, intelligence is
> a human attribute, not a mechanical or robotic one.  Jan


Techscape
<[email protected]>

By: Chris Stokel-Walker

Turns out there’s another problem with AI – its environmental toll.  AI uses 
huge amounts of electricity and water to work, and the problem is only going to 
get worse.

Technology never exists in a vacuum, as the rise of cryptocurrency in the last 
few years shows.

While plenty of people were making extraordinary amounts of money from 
investing in bitcoin and its competitors, there was consternation about the 
impact those get-rich-quick speculators had on the environment.

Mining cryptocurrency was environmentally taxing.

People began working on an industrial scale, snapping up the high-powered 
computer chips, GPUs, that could mine for crypto faster than your 
off-the-shelf computer.

And those computer chips required more electricity to power them; bitcoin 
mining alone uses more electricity than Norway and Ukraine combined. The 
environmental cost of the crypto craze is still being tallied.

The AI environmental footprint

A booming part of tech – one that uses the same GPUs at least as intensely as 
crypto mining – has got away with comparatively little scrutiny of its 
environmental impact.

We are, of course, talking about the AI revolution.

Generative AI tools such as ChatGPT and Google Bard are powered by GPUs. 
(Google also uses its own similar technology, called tensor processing units, 
or TPUs.)

There should be more conversation about the environmental impact of AI, says 
Sasha Luccioni, a researcher in ethical and sustainable AI at Hugging Face, 
which has become the de facto conscience of the AI industry. (Meta recently 
released its Llama 2 open-source large language model through Hugging Face.)

“Fundamentally speaking, if you do want to save the planet with AI, you have to 
consider the environmental footprint [of AI first],” she says. “It doesn’t make 
sense to burn a forest and then use AI to track deforestation.”

Counting the carbon cost

Luccioni is one of a number of researchers trying – with difficulty – to 
quantify AI’s environmental impact. It’s difficult for a number of reasons, 
among them that the companies behind the most popular tools, as well as the 
companies selling the chips that power them, aren’t very willing to share 
details of how much energy their systems use.

There’s also an intangibility to AI that stymies proper accounting of its 
environmental footprint. “I think AI is not part of these pledges or 
initiatives, because people think it’s not material, somehow,” she says.

“You can think of a computer or something that has a physical form, but AI is 
so ephemeral. For companies trying to make efforts, I don’t typically see AI on 
the radar.”

That ephemerality also exists for end users.

Let’s start with water use. Training GPT-3 used 3.5m litres of water through 
datacentre usage, according to one academic study, and that is assuming it 
used the more efficient US datacentres.

If it was trained in Microsoft’s datacentres in Asia, the water usage balloons 
to closer to 5m litres.

Prior to the integration of GPT-4 into ChatGPT, researchers estimated that the 
generative AI chatbot would use up 500ml of water – a standard-sized water 
bottle – for every 20 questions and corresponding answers.

And ChatGPT was only likely to get thirstier with the release of GPT-4, the 
researchers forecast.
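To put that rate in more familiar terms, here is a minimal back-of-the-envelope 
sketch in Python. The daily query volume used below is a purely hypothetical 
figure chosen for illustration, not something reported by the researchers.

# Back-of-the-envelope water estimate, based on the researchers' figure of
# roughly 500 ml of water per 20 questions and answers.
WATER_ML_PER_20_EXCHANGES = 500

water_per_exchange_ml = WATER_ML_PER_20_EXCHANGES / 20   # 25 ml per question-and-answer

hypothetical_daily_exchanges = 10_000_000                # assumed volume, for illustration only
daily_water_litres = hypothetical_daily_exchanges * water_per_exchange_ml / 1000

print(f"{water_per_exchange_ml:.0f} ml of water per exchange")
print(f"~{daily_water_litres:,.0f} litres per day at 10m exchanges/day (hypothetical)")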

Estimating energy use, and the resulting carbon footprint, is trickier. One 
third-party analysis by researchers estimated that training of GPT-3, a 
predecessor of ChatGPT, consumed 1,287 MWh, and led to emissions of more than 
550 tonnes of carbon dioxide equivalent, similar to flying between New York and 
San Francisco on a return journey 550 times.
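As a rough cross-check of that equivalence, here is a small back-of-the-envelope 
calculation in Python. The grid carbon intensity and the per-passenger emissions 
of a New York to San Francisco return flight are assumed round figures, not 
values taken from the analysis.

# Rough conversion of estimated training energy into CO2-equivalent and
# flight equivalents, using assumed emission factors.
TRAINING_ENERGY_MWH = 1287            # estimated GPT-3 training energy (from the analysis above)
GRID_KG_CO2E_PER_KWH = 0.43           # assumed average grid carbon intensity (kg CO2e per kWh)
FLIGHT_TONNES_CO2E_PER_RETURN = 1.0   # assumed per-passenger NY-SF return flight emissions

emissions_tonnes = TRAINING_ENERGY_MWH * 1000 * GRID_KG_CO2E_PER_KWH / 1000  # kWh -> kg -> tonnes
equivalent_flights = emissions_tonnes / FLIGHT_TONNES_CO2E_PER_RETURN

print(f"~{emissions_tonnes:.0f} tonnes CO2e, roughly {equivalent_flights:.0f} NY-SF return flights")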

Reporting suggests GPT-4 is trained on around 570 times more parameters than 
GPT-3. That doesn’t mean it uses 570 times more energy, of course – hardware 
and training methods become more efficient – but it does suggest that these 
models are getting more energy intensive, not less.

For better or for worse

Tech boffins are trying to find ways to maintain AI’s intelligence without the 
huge energy use. But it’s difficult.

One recent study, published earlier this month, suggests that many of the 
workarounds already tabled end up trading off performance for environmental 
good.

It leaves the AI sector in an unenviable position. Users are already antsy 
about what they see as the worsening performance of generative AI tools like 
ChatGPT.

Sacrificing performance to reduce ecological impact seems unlikely. But we need 
to rethink AI’s use – and fast.

Technology analysts Gartner believe that by 2025, unless a radical rethink 
takes place in how we develop AI systems to better account for their 
environmental impact, the energy consumption of AI tools will be greater than 
that of the entire human workforce.

By 2030, machine learning training and data storage could account for 3.5% of 
all global electricity consumption. Before the AI revolution, datacentres 
accounted for 1% of the world’s electricity demand in any given year.

So what should we do? Treating AI more like cryptocurrency – with an increased 
awareness of its harmful environmental impacts, alongside awe at its seemingly 
magical powers of deduction – would be a start.

_______________________________________________
Link mailing list
[email protected]
https://mailman.anu.edu.au/mailman/listinfo/link
