How AI will destroy the planet
So intelligence can be artificial, and laziness can abound.
The human brain is an incredible computational device. It is incredible insofar as the best computers on the planet have yet to beat it: despite running many, many servers and GPUs, no AI has yet superseded the intelligence of a single human brain.
The marvel of the brain does not come from this fact. The marvel comes from the efficiency with which it performs this feat.
The human brain is the most energy-hungry organ in your body, taking up about 20% of the body's glucose. Garry Kasparov reportedly burned about 6,000 Calories in a day while engaged in tournament chess. To put that in perspective, burning about 7,700 Calories uses up roughly 1 kg of body fat, so Kasparov could have got up from the table nearly 800 grams lighter.
While that might seem like a lot, convert it into electrical units:
1 Calorie (kcal) = 4,184 Ws
6,000 Calories = 25,104,000 Ws, or about 6.97 kWh
To translate this into bulb terms: that energy could keep a 100 W bulb lit for about 70 hours, nearly three days. And that total covered an entire day of chess computation while his body was also digesting his food, growing his nails, growing his hair (whatever little), regulating his breathing, regulating the release of tens of hormones, pumping his heart, and performing the myriad other functions needed to keep him alive. The brain's share, at roughly 20% of that budget, comes to about 1.4 kWh, an average power draw of under 60 W.
A 100 W bulb left on for a whole day would use 2,400 Wh, or 2.4 units of electricity, just to stay lit. The brain that played world-championship chess ran on less.
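The arithmetic above can be checked with a quick script. All figures are the article's estimates (a 6,000-Calorie day, a ~20% brain share), not measurements:

```python
# Energy bookkeeping for the Kasparov-vs-bulb comparison.
KCAL_TO_WS = 4184            # 1 food Calorie (kcal) = 4184 watt-seconds (joules)
SECONDS_PER_HOUR = 3600

daily_kcal = 6000            # Kasparov's reported burn on a tournament day
body_wh = daily_kcal * KCAL_TO_WS / SECONDS_PER_HOUR  # whole-body energy, in Wh
brain_wh = 0.20 * body_wh    # brain's ~20% share of the energy budget

bulb_watts = 100
bulb_hours_on_body = body_wh / bulb_watts  # how long that energy runs a 100 W bulb
bulb_day_wh = bulb_watts * 24              # bulb left on for a full day

print(f"whole body: {body_wh:.0f} Wh (~{body_wh / 1000:.2f} kWh)")
print(f"brain share: {brain_wh:.0f} Wh, average power {brain_wh / 24:.0f} W")
print(f"100 W bulb would run for {bulb_hours_on_body:.0f} h on that energy")
print(f"bulb left on for a day: {bulb_day_wh} Wh")
```

The contrast survives the unit-keeping: the bulb burns 2.4 kWh a day just sitting there, while the hardest-working brain on record came in under 1.4 kWh.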
Intelligence Energy Gluttony
Of late, ChatGPT and other Large Language Models (LLMs) have been assumed to be precursors to Artificial General Intelligence (AGI). Take a look at their energy footprint:
With little information available at the time about ChatGPT's user base, my estimate was based on the assumption that the ChatGPT service was running on 16 A100 GPUs. In light of recent reports estimating that ChatGPT had 590 million visits in January, it's likely that ChatGPT requires far more GPUs to service its users.
The table shows that ChatGPT's electricity consumption in January 2023 was estimated at between 1,168,200 kWh and 23,364,000 kWh.
Training ChatGPT's underlying language model, GPT-3, consumed an estimated 1,287 MWh.
So at the low end of the estimated range for ChatGPT’s electricity consumption in January 2023, training GPT-3 and running ChatGPT for one month took roughly the same amount of energy. At the high end of the range, running ChatGPT took 18 times the energy it took to train.
Source: Towards Data Science
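Taking the quoted figures at face value, the training-versus-inference claim checks out arithmetically (the consumption range is the cited article's estimate, not a measurement):

```python
# Sanity-check: one month of running ChatGPT vs. training GPT-3 once.
train_mwh = 1287                 # estimated energy to train GPT-3, in MWh
monthly_low_kwh = 1_168_200      # low-end January 2023 estimate
monthly_high_kwh = 23_364_000    # high-end January 2023 estimate

low_ratio = (monthly_low_kwh / 1000) / train_mwh    # kWh -> MWh, then compare
high_ratio = (monthly_high_kwh / 1000) / train_mwh

print(f"low end:  one month of inference ~ {low_ratio:.1f}x the training energy")
print(f"high end: one month of inference ~ {high_ratio:.1f}x the training energy")
```

At the low end the two are roughly equal; at the high end, inference for a single month is about 18 times the one-off training cost, matching the quoted claim.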
All energy consumed leaves a carbon footprint, and ChatGPT's is impressive as well.
For large language models (LLMs) like ChatGPT, we’ve gone from around 100 million parameters in 2018 to 500 billion in 2023 with Google’s PaLM model. The theory behind this growth is that models with more parameters should have better performance, even on tasks they were not initially trained on, although this hypothesis remains unproven.
A study from Carnegie Mellon University professor Emma Strubell about the carbon footprint of training LLMs estimated that training a 2019 model called BERT, which has only 213 million parameters, emitted 280 metric tons of carbon emissions, roughly equivalent to the emissions from five cars over their lifetimes. Since then, models have grown and hardware has become more efficient, so where are we now?
Depending on the energy source used for training and its carbon intensity, training a 2022-era LLM emits at least 25 metric tons of carbon equivalents if you use renewable energy, as we did for the BLOOM model. If you use carbon-intensive energy sources like coal and natural gas, which was the case for GPT-3, this number quickly goes up to 500 metric tons of carbon emissions, roughly equivalent to over a million miles driven by an average gasoline-powered car.
And the models are retrained from scratch every time a new version with more parameters is released, adding an ever-growing amount of pollution across the globe. Nor does it end there:
When looking into how much water is needed to cool the data processing centers employed by companies like OpenAI and Google, the researchers found that just in training GPT-3 alone, Microsoft, which is partnered with OpenAI, consumed a whopping 185,000 gallons of water — which is, per their calculations, equivalent to the amount of water needed to cool a nuclear reactor.
What's more: "ChatGPT needs to 'drink' [the equivalent of] a 500 ml bottle of water for a simple conversation of roughly 20-50 questions and answers," the paper notes. "While a 500ml bottle of water might not seem too much, the total combined water footprint for inference is still extremely large, considering ChatGPT’s billions of users."
So while OpenAI is busy releasing APIs that let companies ask ChatGPT hundreds of questions every minute, the environmental impact of these models is growing rapidly. When parts of the world are facing historic droughts, using up water to run a "word regurgitator" is not the best use of a very limited resource.
AI is a crime
So look at the staggering amount of energy used up just to train this code, and then think of how much would be consumed if it were ever to reach anywhere close to the level of general intelligence that humans possess.
Look at the energy efficiency of the human brain. At Kasparov-level effort, the brain's roughly 20% share of a 6,000-Calorie day comes to about 1.4 kWh. Assuming a person lives 100 years and keeps working that hard, the brain would have used about 51 MWh over a lifetime. That covers going from a helpless infant to a fully developed adult, learning many languages and skills along the way, applying them, and even creating new things. ChatGPT's estimated consumption in January 2023 alone was 20 to 450 times that lifetime figure, to spit out words that even it did not understand.
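A quick sketch of that comparison, using the article's estimates throughout (a 6,000-Calorie day with a ~20% brain share, and the January 2023 consumption range quoted above):

```python
# Brain-lifetime energy vs. one month of ChatGPT, per the article's estimates.
brain_wh_per_day = 1400                       # brain at Kasparov-level effort
lifetime_days = 100 * 365                     # a 100-year life, ignoring leap days
brain_lifetime_mwh = brain_wh_per_day * lifetime_days / 1e6   # Wh -> MWh

chatgpt_month_low_mwh = 1_168_200 / 1000      # low-end January 2023 estimate
chatgpt_month_high_mwh = 23_364_000 / 1000    # high-end estimate

print(f"one brain, 100 years: {brain_lifetime_mwh:.1f} MWh")
print(f"ChatGPT, one month:   {chatgpt_month_low_mwh:.0f} to "
      f"{chatgpt_month_high_mwh:.0f} MWh")
print(f"ratio: {chatgpt_month_low_mwh / brain_lifetime_mwh:.0f}x to "
      f"{chatgpt_month_high_mwh / brain_lifetime_mwh:.0f}x")
```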
Just imagine taking the food that dozens of people, hundreds at the high-end estimate, would eat in 100 years and giving it to one machine in a month. That is the sort of energy allocation we are making.
All that for stupid AI. The amount of energy that would be required to reach AGI would be an order of magnitude greater.
We will all die not because AI will take over the planet, but because we used up a nuclear reactor's worth of cooling water to answer 5 million stupid questions that never needed answering.
The most efficient computing device ever to have existed is the brain. It does far more than any of the computers out there, using far less. In a less selfish world, we would be working on optimising the output of what we already have.
But capitalism demands that one person hoard everything, and hence AI.