It seems that, in the space of just a few months, almost no one can stop talking about generative artificial intelligence (AI) tools, and the ways in which they could upend not only how we engage with content online, but also the functioning of our societies.
February alone saw Google, and then Microsoft, confirm that their search engines would undergo major overhauls, as the two tech titans looked to plough major investment into buying or building generative AI tools. Chinese search giant Baidu has said it will do much the same.
These technologies draw upon large language models (LLMs) in order to understand and respond to complicated queries. It should be no great surprise, then, that efforts are intensifying to incorporate these tools into the online search experience, in the hope that users will benefit from greater richness and accuracy in the results when they next call upon a search engine.
But headaches loom over the computing power AI-powered search could need
There is, however, at least one factor that will likely serve as a major counterpoint to the recent hype: in order to make high-performance, AI-powered search engines possible, there is likely to be a need for a steep rise in computing power.
As a consequence, tech companies will probably need to escalate the amount of energy that they use, with all the implications this could have for their carbon emissions.
In the words of Alan Woodward, a cybersecurity professor at the University of Surrey in the UK who was recently quoted on the issue by WIRED: “There are already huge resources involved in indexing and searching Internet content, but the incorporation of AI requires a different kind of firepower.
“It requires processing power as well as storage and efficient search. Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centres. I think this could be such a step.”
The training of LLMs, like those that provide the foundations for OpenAI’s ChatGPT, entails parsing and computing linkages within monumental volumes of data, which helps explain why it is firms with especially deep resources that usually develop them.
As WIRED contributor Chris Stokel-Walker noted, third-party analysis has estimated that the training of GPT-3 – on which ChatGPT is partly based – consumed 1,287 MWh. This process also resulted in emissions equivalent to more than 550 tons of carbon dioxide, roughly the amount that a single person would incur by taking 550 round trips between New York and San Francisco.
That is before one considers that while ChatGPT is estimated to have about 13 million daily users, Bing processes half a billion searches a day.
Martin Bouchard, who cofounded the Canadian data centre company QScale, said that the addition of generative AI to the search processes of Microsoft and Google would – based on his reading of the plans – likely require “at least four or five times more computing per search”.
Can Google and Microsoft meet their existing sustainability commitments?
The ambitious plans that the two tech giants have set out for generative AI tools make it all the more fascinating to look again at the sustainability milestones they are aspiring to reach in the years ahead.
Microsoft, for instance, has committed to becoming carbon negative by the middle of this century. The firm’s plans include the purchase this year of 1.5 million metric tons’ worth of carbon credits. Meanwhile, Google is aiming to achieve net-zero emissions across its operations and value chain by 2030.
However, there is hope that the process of integrating AI into search could be made less energy-intensive by such steps as the movement of data centres onto cleaner energy sources, and the redesign of neural networks to bolster their efficiency. Through such work, it could be possible to lower what is known as the “inference time”, which refers to the amount of computing power an algorithm needs in order to work on new data.
Google spokesperson Jane Park also said to WIRED that the company was initially releasing a version of Bard – its AI chatbot tool – that was powered by a lighter-weight LLM.
She also explained that according to research the search giant had published, “combining efficient models, processors, and data centres with clean energy sources can reduce the carbon footprint of a [machine learning] system by as much as 1,000 times.”
Look no further for the assistance that will help power your high-end brand’s growth in 2023
Doubtless, the story of generative AI tools in the broader context of how we search for and engage with content – as well as the impacts it will likely have on digital sustainability – is one that will continue to evolve.
In the meantime, when you are seeking out the services of professionals who are best-placed to support your luxury, lifestyle or fashion brand’s efforts to realise its full potential in 2023, Skywire London stands ready to help.
We are a digital agency based in London, but with a global mindset, which could go a long way towards helping your brand achieve its growth goals in the months and years ahead.