How We’re Adapting to Avoid Algorithms


Image by Gerd Altmann from Pixabay

The algorithms are everywhere: social media, eCommerce platforms, news sites and everything in between, including voice assistants like Siri, Google Assistant and Alexa. They surface topics in our feeds, nudge us into making purchases, track how we move around the internet and enable censorship. They feed predictive analytics and Automated Decision Systems (ADS), their results like honey to data scientists and marketing analytics wonks. They are unavoidable. Or are they?

As consumers become increasingly aware of what algorithms are and how they're used to manipulate our behaviours, people are evolving ways of evading, avoiding and messing with them. Humans have been finding ways to reject technologies, systems of governance and rules they don't like for many thousands of years. In many non-Western societies, when a leader was disliked, people would often move away from that leader's centre of power, reducing his ability to project it. Sometimes they'd abandon cities altogether. Other times, revolutions happened, and leaders were imprisoned or had their heads cut off.

Such bloody revolutionary approaches, though, are rarer than one might think. Usually, people find ways to work around a system. Ancient cultures often created very complex rules that effectively limited the power of the leader. In many Native American cultures, leaders had to use persuasion rather than dictate in order to lead.

All this to say that there is a quiet citizen revolution underway, one that companies reliant on algorithms, even when aware of it, probably can't do much about. From civil society to business, people are starting to fight back against the algorithms. This is about privacy and freedoms. It's not a clear movement with an organised leader or leaders. It's organic. A groundswell. The actions of many, in many small ways. So what's happening?

Language

When it comes to censorship, we've been adapting workarounds for some time: txtspk, memes, emojis and what is known as Leet (1337 or 133tsp34k). Leet, also called eleet or leetspeak, is a system of character replacement that plays on the similarity of glyphs: the way a word looks versus the way you read and say it. It's been around since the 1980s, but is gaining popularity as people realize it can mess with algorithms.

Most often, various forms of leet or txtspk are used to avoid algorithms that censor speech. This is proving popular with more extreme groups on the far right and far left when they use platforms where they'd normally be banned. It is also growing in popularity with younger netizens who want to avoid being targeted by ads. We're creating euphemistic language, too, to slip past the censorship algorithms: instead of saying "dead," we say "not alive" or "unalive."

Facial Recognition

You may think this technology is used only for fighting crime and terrorism. It isn't. A shopping mall in Canada was using it to analyse customers; fortunately, that programme was shut down. Amazon uses it in its cashier-less stores, and in Japan it's tied into vending machines that recommend beverages you might enjoy based on your face. Walmart is looking at using it to understand shoppers' moods.

Now, it's not just activists developing ways to block facial recognition technology, but also fashion designers. Online fashion brand Adversarial has hoodies, skirts, backpacks, scarves and more, all designed to mess with facial recognition technology. Then there's Stealth Wear, which is developing various types of clothing, including a hijab, that block not just facial recognition algorithms but body heat too.

Yes, these are interesting concepts, but facial recognition companies are aware of them and in some cases have figured out how to deal with them. Humans are nothing if not good at adapting. Other workarounds will be found.

What Does This All Mean?

We often don't think about how we as humans adapt to technology, rather than the technology adapting to us. This is especially so with digital communications technology. To be active and engaged in the digital world, humans adapted to the algorithms quickly. But now people are growing increasingly distrustful of them, because there have been few ethical rules around their use, and the resulting manipulation of behaviour is now understood by more consumers than ever before.

As mistrust in any system grows, humans always find workarounds. Always. Algorithms are no exception. It's not just privacy that tech companies are going to have to get better at, but how they use algorithms. Just as Apple has turned privacy into a profitable message, other tech companies can, if they choose, make algorithms better and profit from it. That means a little more work, but the payback can be huge when it's done right.


Giles Crouch | Digital Anthropologist