Why Robots Are Evolutionary, Not Progressive
We are just now truly entering the age of robots. Science fiction, over 50 years ago, promised we’d all have a nice personal robot in our homes by now, attending to our every whim with, if not outright glee, then perhaps witty sarcasm. It also promised us flying cars a la The Jetsons. It’s taking a little longer than expected. But that promise of robots is finally starting to be fulfilled.
The sticky wicket, however, is that we’ve been looking at robots, like AI, from a problem-solving point of view, as a necessity for progress. This is not helpful. Robots and AI are evolutionary technologies. We need to look at them differently to ensure they work for us and not us for them.
The technocrats of Silicon Valley put it all down to progress. So do the capitalists financing these technologies. They are wrong, or at least they misunderstand why we need these technologies. They see them only in the light of solving problems. This is largely why we tend to make mistakes with technology.
Human survival is deeply intertwined with technology. We have co-evolved for hundreds of thousands of years, and the code we use to make this symbiotic relationship work is culture. Through traditions, rituals, language, and the arts, we teach our societies when, where, how, and why to use technologies, from the stone axe to robots.
Evolution happens through ongoing, dynamic interactions, both biological and cultural. Technology advances in lockstep with us, combining and recombining through evolutionary processes that parallel human evolution.
Robots are an evolutionary technology, able to work because we have evolved our understanding of the world and how we can interact with it.
For the past 50 or so years, we have imagined robots in our lives in both practical and fanciful ways. The first use of robots came in the mid-1950s, with a robotic arm for manufacturing, and that is largely where robots have stayed. In manufacturing, robots solve problems, and this is what has driven robotics development ever since. That is a problem.
Robots have come a long way since then. Now they can open doors, make pizzas, wash dishes, fight, and help rescue people in natural disasters. In Japan, robots are being used to help combat loneliness among senior citizens.
If you’re of a mind that robot armies are coming to fight wars, that is a distinct possibility, and we may or may not be doing enough to stop it. There was an attempt at the UN to get nations to agree not to build autonomous weapons. It failed. The other fear is job losses, though there is far more to fear from combat use than from job losses. This is another example of why the problem-solving approach to robotics in broader society is, well, a problem.
Today, we are in the early stages of using culture to figure out just how we want to use robots, and how we will. It’s an evolutionary cycle. Some of this experimentation is not obvious and is not explicitly framed as an experiment.
There are robotics scientists working to make robots much more human-centric. At the forefront is the brilliant Japanese scientist Hiroshi Ishiguro, who has made astounding progress. His is important work in finding the boundaries society will set around just how human we want robots to be. Dr. Ishiguro’s work is probably the clearest example of seeing robots as evolutionary, not just in terms of progress.
When we see robots through an evolutionary lens and not just as progress for progress’ sake, we see them in a different light and we can make them better.
Our early experimentation happens through the aesthetic elements of culture: movies, series like Westworld, books, and art. This is one of the most important ways we use culture to explore technologies, because storytelling is a fundamental communication tool for humans. Aesthetic elements in culture wrap around everything else.
We are telling many more stories about robots now, from those YouTube clips of Boston Dynamics robots to their growing presence in TV shows, films, and science fiction literature. Technology journalists are telling stories in the media as well.
Robots are also evolutionary because they express something we’ve been doing for a while now: trying to replicate ourselves mechanically. We’re doing the same with Artificial Intelligence, trying to replicate our cognitive selves. The only thing is, as I wrote before, machines will never think like humans. Robots represent a physical extension of humans, whereas AI represents a cognitive one.
Just as biological evolution is messy, often weird, and prone to mistakes, so is the human development of evolutionary technology. The development of AI has already been messy, as the recent roll-out of ChatGPT has shown us, along with many other instances.
Why does this even matter? If we see the development of robots and AI only as progress, then we fail to apply critical thinking and fall back on problem-solving approaches alone. While the problem-solving mental model is useful and necessary, technologies like AI and robotics require critical thinking because they are more holistic and are part of complex systems. They impact so many areas of our societies that we can’t just see them as progressive.
When we look at robots through an evolutionary lens, we inherently think in more complex, holistic terms. We tend to respect and be informed by the past, and to see robots on a longer time frame. This could lead to much better robots that do more and fit into society better.
Seeing robots only through a problem-solving lens is technological reductionism. Neither AI nor robots can be seen through the lens of return on investment alone. They’re supposed to deliver a social good, which is the good side of capitalism: if a product or service delivers a social good well enough, not just cheaply enough, then profit should naturally follow.
Not all robots need to be seen through an evolutionary lens; it largely depends on the application. In manufacturing, it’s very much about problem solving. But for robots that will interact with society in more complex ways, that may use AI as part of their function, and that operate in broader sociocultural terms, we should use an evolutionary lens.