The Fragility of Artificial Intelligence
You could be forgiven for thinking that in this Big Bang moment for Artificial Intelligence (AI), the technology is going to change everything and cannot be stopped. That view is both true and wrong. AI is actually quite fragile, and that fragility is both good and bad.
Artificial Intelligence is also an umbrella term for multiple tools. Natural Language Processing, Machine Learning and Neural Networks are some of them. And now, of course, there is Generative AI (GAI), which drives services like DALL-E 2, Midjourney and ChatGPT. These too are more fragile than we think.
Why Is AI So Fragile?
There are three primary reasons. The first is energy. The second is related to energy: AI can be unplugged. The third is that AI is disembodied from society. Current investment in AI tools is focused more on development and business models than on the sustainability of the tools themselves.
AI is a disembodied technology. Unlike the devices and physical tools we handle every day, AI isn't really embodied within our daily interactions. When a technology is disembodied from culture, non-tactile and reliant largely on our imagination for its existence, we feel less connected to it, which makes it easier to change course if, as a society, we decide we don't like it. This is a point of fragility for AI.
When we’re barking commands at our voice assistants or asking ChatGPT to write a new Tolstoy novel, we don’t think about the incredible amount of energy it took to build that AI tool, from training the models to powering the data centres that run them. The reason OpenAI runs ChatGPT on Microsoft’s Azure platform is that the cost of building its own infrastructure wouldn’t make ChatGPT viable as a business.
It is not only the immense energy costs that make AI fragile; it is also the underlying power grid itself. There’s a reason that industrial-scale data centres have massive backup systems in place. But they’re only designed for short periods of time, a few days at most. Some operators have invested in renewable energy sources to reduce reliance on the main grids.
But power grids that form part of national infrastructure, even in developed nations, suffer from significant underinvestment. Many are at additional risk from the impacts of climate change, such as massive storms and wild temperature fluctuations.
Today, any AI tool can also be unplugged. This could happen for multiple reasons, from a company shutting down to changes in laws or regulations driven by concerns such as privacy, human rights or national security.
The upside of being able to unplug AI is that if it goes rogue, if it actually becomes intelligent or suddenly develops some form of awareness and turns aggressive towards us, we can pull the plug. As energy storage improves at scale and power grids are upgraded, this will become less true. We may want to ensure we keep an easy off switch somewhere.
Another reason AI is fragile is that we may well run out of data on which to train it. This may seem odd in a world where we generate massive volumes of data, numerical, textual and visual, in the course of a day. But we could run out of data, and not just because we don’t generate enough.
One of the big concerns arising out of GAI tools like ChatGPT and DALL-E 2 is copyright. These tools scraped the internet and used other sources of content. But did OpenAI obtain all the necessary permissions? Did they have to? This will be battled out in the courts for a few years.
But what we may start to see is the development of tools, such as software apps, that stop AI from accessing content. Companies and other organisations may put up blocks similar to the “do not follow” directives websites already give to search-engine crawlers through robots.txt. As well, there is far more content on the Dark Web than there is on the open internet.
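This kind of blocking already exists in a simple, voluntary form. A site can refuse known AI crawlers in its robots.txt file; OpenAI’s documented crawler identifies itself as GPTBot, and Common Crawl’s (a major source of LLM training corpora) as CCBot. A minimal sketch, using an illustrative site:

```
# robots.txt — served at the site root, e.g. https://example.com/robots.txt

# Refuse OpenAI's training-data crawler
User-agent: GPTBot
Disallow: /

# Refuse Common Crawl's crawler
User-agent: CCBot
Disallow: /

# Still allow all other crawlers, including conventional search engines
User-agent: *
Allow: /
```

Compliance is voluntary: a crawler has to choose to honour robots.txt, which is one reason the disputes over training data are also heading to the courts.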
So legal battles, anti-AI activism over intellectual-property protections and simply running out of new data could create a problem for tools like Large Language Models (LLMs).
The looming reality of the Splinternet, which some suggest is already here, could also limit or adversely affect some AI tools. More walled gardens, a trend already underway, mean more difficulty accessing data and higher prices for access. This is where infonomics comes into play.
So AI is, right now, rather fragile. More than we realise. These fragilities may lessen over time, or they may get worse. Our world is rather messy right now and it’s likely to get messier yet before it settles into a new period of prosperity. So if you’re of the mind that our AI overlords are looming larger, they’re not. For now.