The Artificial Intelligence Arms Race Is Here.
It’s probably not unreasonable to say that our world is in a bit of a great churn: populism and nationalism are on the rise, the liberal world order is topsy-turvy and global trade mechanisms are in the midst of some turbulence. When social media came on the scene it afforded us the ability to communicate at a scale never before achieved in humanity’s short, but long to us, time on this planet. Democratic leaders around the world proclaimed that a new era of global egalitarianism and a democratic boom was underway. Until it wasn’t. We are, in some ways, more connected and more divided at the same time. Into this heady mix comes another game-changing technology: Artificial Intelligence (AI).
As we know, any technology can be used for purposes good and bad. Alarms about the dangers of AI have been sounding for many years. In democratic countries around the world, academics, governments, non-profit think tanks and institutions, and even some business leaders are discussing the ethics of AI, exploring its legal ramifications and its impact on individuals and societies, race, gender and culture. This is not so much the case in autocratic nations.
AI as a technology sits firmly in that uncomfortable place, smack in the middle of geopolitics, in a world that today is dividing, not coming together.
On the democratic side of the world, leadership comes from America along with Canada, the EU states (although Hungary and Poland aren’t playing nicely with democracy these days), the Scandinavian nations and, in the Asian sphere, Japan, Taiwan and South Korea. On the autocratic side, China and Russia dominate, joined by a handful of tin-pot dictatorships and autocratic outlier North Korea.
Where democracies fret over the ethics and legal ramifications of AI, including its weaponization, autocratic countries have no such qualms. China has employed AI for facial recognition in Xinjiang, the region where the Uyghurs live, and in other parts of the country. AI backs up the Chinese social credit system, in which behaviours that fit the communist party’s idea of being good are monitored. Don’t want to play by the rules? It may be impossible to get a loan or a mortgage, or perhaps even to board a bus or train. One can be fairly confident AI is being weaponized in autocratic nations.
American anthropologist Darren Byler, who lived in the Xinjiang region, has noted how the Chinese government has co-opted AI to repress the Uyghur people through mass surveillance and the harvesting of personal data, a system he calls “terror capitalism” in his book. Chinese tech companies set up in Xinjiang and use these massive data sets to train their AI tools. The data is free for them and, of course, collecting it this way is completely unethical and a human rights violation. These companies then sell their products, most often to autocratic countries.
AI is being weaponized in democratic countries as well. Israel used AI to target and kill an Iranian scientist just over a year ago. AI is used by American, British and European militaries. We’ve already arrived at that point. We are, effectively, already in an AI arms race.
We should not forget, too, that much of modern AI came out of Silicon Valley and that the biggest advances in neural networks came out of Canada. Blame rests on both sides, except that democracies have the rule of law and human rights legislation. Autocracies do not.
In terms of human conflict, AI represents a risk as high as, or potentially higher than, nuclear weapons. While it is easy to understand the concept of MAD (Mutually Assured Destruction), this becomes more of a mental challenge for many when it comes to AI. Unlike a missile, where the outcomes are immediately visible, AI is a quiet, almost invisible technology, its damage often indirect or abstract. The physical footprint of AI is massive data centres, but AI is then nestled within the devices we use personally and billions of others, from satellites to the thermostats in our homes.
Human brains are well adapted to reacting to physical dangers: a house on fire, a car accident, bombs going off, a hot stove, a bear. They are not well adapted to intangible threats. We could not see the dangers of social media, for example. There is much dialogue going on around the dangers of AI, of course, in Hollywood and science fiction novels as much as anywhere.
Ideally, nations would come together, perhaps through the UN or another body, or simply among themselves, to create treaties as was done for nuclear weapons during the Cold War. But the difficulty may rest in the abstraction of AI as a technology of war.
AI is a fundamental part of life in the Cognitive Age as we increasingly become digital sapiens. It can make lives better and save lives. It can also destroy. As with any digital technology, we have a choice: either we adapt to AI or it adapts us to it.