    Generative AI could soon use more power than a country

A Dutch researcher has highlighted the enormous energy use associated with the new generation of tools powered by generative artificial intelligence. If these tools were to be adopted widely, they could eventually end up using as much energy as an entire country, or even several countries combined.

Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has published research in the journal Joule on the environmental impact of emerging technologies such as generative artificial intelligence.

The arrival, in less than a year, of tools such as ChatGPT (from OpenAI), Bing Chat (Microsoft) and Bard (Google), as well as Midjourney and others in the image sector, has greatly boosted demand for servers, and consequently for the energy required to keep them running smoothly. This development inevitably raises concerns about the environmental impact of a technology that is already being used by large numbers of people.

    In recent years, excluding cryptocurrency mining, electricity consumption by data centers has been relatively stable, at around 1% of global consumption. However, the expansion of AI, which is unavoidable in many fields, is likely to change the game.

According to Alex de Vries, training the GPT-3 language model alone consumed an estimated 1,287 MWh. After this training phase, the tool is put to work on inference – in the case of ChatGPT, generating responses to the prompts submitted by Internet users.

At the start of the year, SemiAnalysis estimated that OpenAI needed 3,617 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, corresponding to an energy demand of some 564 MWh per day.
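
That daily figure can be reproduced with a quick back-of-the-envelope calculation, assuming each server draws roughly 6.5 kW – the per-server figure the article cites further down for the Google scenario:

```python
# Rough check of SemiAnalysis' ChatGPT estimate.
# Assumption (not stated in this paragraph): ~6.5 kW average draw per server,
# the figure quoted later in the article for the Google scenario.
servers = 3_617
gpus = 28_936
power_per_server_kw = 6.5

gpus_per_server = gpus / servers                       # 8 GPUs per server
daily_energy_mwh = servers * power_per_server_kw * 24 / 1_000

print(f"{gpus_per_server:.0f} GPUs per server")        # 8
print(f"~{daily_energy_mwh:.0f} MWh per day")          # ~564 MWh
```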

And that, of course, is just the beginning. Again according to SemiAnalysis, implementing a ChatGPT-like AI in every Google search would require 512,821 dedicated servers, or a total of over 4 million GPUs.

With a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh (a terawatt-hour being one billion kilowatt-hours). In this most pessimistic scenario, AI deployed on a mass scale by Google would consume as much electricity as a country like Ireland (29.3 TWh per year).
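
The figures in this scenario follow directly from the server count and the per-server power; a minimal sketch of the arithmetic:

```python
# Reproducing the Google-scale scenario described above.
servers = 512_821
power_per_server_kw = 6.5

continuous_demand_gw = servers * power_per_server_kw / 1e6    # ~3.3 GW
daily_consumption_gwh = continuous_demand_gw * 24             # ~80 GWh per day
annual_consumption_twh = daily_consumption_gwh * 365 / 1_000  # ~29.2 TWh per year

print(f"{continuous_demand_gw:.1f} GW, {daily_consumption_gwh:.0f} GWh/day, "
      f"{annual_consumption_twh:.1f} TWh/year")
# Ireland's annual electricity consumption, as cited in the article: 29.3 TWh
```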

Alphabet has already confirmed that an interaction with a language model can consume up to 10 times more electricity than a standard keyword search, rising from around 0.3 Wh to around 3 Wh. As for Nvidia, the leading supplier of AI servers, it could be selling more than 1.5 million units a year by 2027, representing a total consumption of between 85 and 134 TWh per year.
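
That range is consistent with those servers running around the clock; a hedged sanity check, with the per-unit power figures assumed rather than taken from the article:

```python
# Rough check of the Nvidia-based range. The per-unit power values below are
# assumptions chosen to bracket typical AI server draw; the article itself
# does not state them.
units = 1_500_000
hours_per_year = 24 * 365                     # 8,760 hours

for kw_per_unit in (6.5, 10.2):               # assumed low/high draw per server
    annual_twh = units * kw_per_unit * hours_per_year / 1e9
    print(f"{kw_per_unit} kW per unit -> ~{annual_twh:.0f} TWh per year")
# ~85 and ~134 TWh per year, matching the quoted range
```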

In conclusion, AI-related electricity consumption is fast becoming a major concern. There are, however, a number of ways it could be reduced. The first would obviously be to prioritise renewable energy sources to power data centers.

Next comes the need to develop algorithms that consume less energy. Finally, Internet users could be educated to use AI responsibly, and without excess. – AFP Relaxnews

