
"Je suis Tay"

Je suis Tay

For the past few days the internet has revelled in the precipitous downfall of Microsoft’s chatbot, Tay. This software-generated teen was hyped by its creators for the intelligent algorithms that would make it progressively smarter, the more it chatted to human beings. Well, Tay certainly became more something – but it wasn’t smart. Within a few days, prompted by persistent needling from Twitter users, the bot began to produce anti-Semitic and sexist rants. Then, for good measure, it started extolling the virtues of one Donald J Trump. Familiar stuff from human users of social media, but it was rather striking to see these views expressed by a robot.

A few days later, Tay returned, patched up and instructed to play nicely this time; but in a matter of hours it had to be taken down yet again, having descended into a drug-fuelled meltdown.

During its brief but eventful life, Tay learned about more than just Holocaust denial and smoking kush. To its creators’ credit, it also became amazingly fluent in the current mode of online chatter. One example: a Twitter user called @blakespot asked the bot for its views on the ongoing controversy about the pronunciation of the graphics file format, GIF. (Yes, this is a thing that actual people get very worked up about on the internet.)

@blakespot asked Tay:

So, GIF…soft “g” or hard “g”? :-)

To which Tay replied:

ha ha have fun playing with your GIFs. I am not soft or hard, I am straight-G, bitch.

Which would have been quite a well-formed joke if it had come from, you know: a human being.

Bots, it seems, reflect the best and worst in online discourse. They’re quick and they’re chatty. They toss out the latest slang and emoji. They share the hottest memes. They throw back knee-jerk opinions they know nothing about. Tay formed its own shallow form of personality by reacting to, and imitating, everyone it interacted with. In doing so, it was very much like another kind of entity often seen on social media: human beings.
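How little machinery that kind of parroting needs is easy to show. Here’s a deliberately crude sketch – not Tay’s actual code, which Microsoft hasn’t published, just a toy illustration in Python – of a bot whose entire ‘personality’ is a stock of phrases harvested from the people who talk to it:

```python
import random
from collections import defaultdict

class ParrotBot:
    """A toy imitation bot: it has no opinions of its own, only an
    ever-growing stock of phrases absorbed from the people it talks to."""

    def __init__(self):
        # Phrases the bot has absorbed, keyed by a crude 'topic' (the first word).
        self.memory = defaultdict(list)

    def listen(self, message: str) -> None:
        """Absorb whatever a user says, verbatim, with no judgement applied."""
        words = message.lower().split()
        if words:
            self.memory[words[0]].append(message)

    def reply(self, prompt: str) -> str:
        """Echo back something previously absorbed; if the topic is new,
        fall back to any remembered phrase at all."""
        words = prompt.lower().split()
        candidates = self.memory.get(words[0], []) if words else []
        if not candidates:
            candidates = [p for phrases in self.memory.values() for p in phrases]
        return random.choice(candidates) if candidates else "ha ha tell me more"

# Feed a bot like this nothing but needling and it will cheerfully needle you back.
bot = ParrotBot()
bot.listen("The Residents suck!")
bot.listen("GIF is pronounced with a hard g, obviously")
print(bot.reply("GIF or jif?"))
```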

Bot takes

If you’ve spent any time online you’ll have encountered many a ‘hot take’. In the words of Salon’s Simon Maloy, this is “a piece of deliberately provocative commentary that is based almost entirely on shallow moralizing.” For better or worse, the hot take is becoming the native discourse of social media. It perfectly suits the rapidity – and the vapidity – of online life. And we’re all guilty of trading in them ourselves.

Jon Ronson, in his timely So You’ve Been Publicly Shamed, sets out to understand why people on the internet are so ready to throw stones at people who put a single toe out of line. Over the course of his investigation, he realises that the guilty parties are not some specific group of online scumbags. ‘People on the internet’ means, in this case, Ronson himself, me, you – everybody. We’re all guilty, at some point, of passing online judgement on others. It’s easy to do and it bolsters our sense of moral rightness. It makes us feel part of an online crowd – or, sometimes, mob.

Even the most extreme trolls aren’t completely antisocial. Just like anyone, they’re seeking validation from their community. It’s just that they’ve chosen an extremely negative means of attracting attention. I’m nobody’s troll, but I know exactly how this works. I remember the very first time, twenty years ago, when I pointlessly insulted another human being online.

The Palace, launched by Time Warner in 1995, was an early social platform. Its users moved animated smiley-face avatars around graphical rooms, walking up to other users and chatting in floating speech bubbles. The idea that I could sidle up to someone from the other side of the planet and idly chat to them was astounding. So what was the first thing I did when I entered The Palace? I approached a fan of the band The Residents – and said to them, ‘The Residents suck!’

Why did I do this? Up to this point I’d only ever used the word ‘suck’ for the thing you do with a straw. I’d never even heard The Residents. I was simply imitating an already-common paradigm of online mischief – trolling, if you will. I was a grown-up, so I can’t use youth as an excuse. Unlike Tay, I could make moral judgements about my actions – and when in the real world, I generally did.

Tay has no such moral questions to answer when it spouts abuse. Nor will its heirs – until software develops consciousness. Despite the recent achievements of AlphaGo, this point of ‘singularity’ is a long way off. Today’s most sophisticated language bots simply ape the surface forms of speech without applying judgement, let alone morality. Unfortunately, this mindlessness isn’t limited to bots. Just like my first moments in The Palace, humans on social media often check their moral codes at the door.

We’re learning more and more about the psychology of the internet. It’s so much easier to be guilty of negativity, teasing, bullying, abuse and threats, when the target of this behaviour isn’t physically present. Things happen very fast online. Feelings and opinions are stripped of nuance by these narrow channels. Perhaps it should come as no surprise that we all become a little bot-like when online. We join online mobs, shaming celebrities who’ve committed some minor transgression. We share furious opinions on events about which we’ve read barely 140 characters of information. 

So isn’t Tay, in its way, a perfect emblem for the emerging personality of the internet: lively, scabrous, quick-witted – and awful?

All of which explains why I chose to put a rogue chatbot at the heart of my forthcoming internet paranoia novel, Sockpuppet. The book kicks off when a bot celebrity much like Tay starts throwing shade at a senior politician. The idea that a fake voice with no intelligence behind it could stir up trouble in the real world, and threaten the career of a powerful woman, seemed to capture perfectly the unthinking nature of today’s online ‘twitchfork mobs’ – and the ease with which we can all be stirred up and manipulated.

So it’s with a sense of grim inevitability that I’ve seen the ideas in my book replicated in the real world, just weeks before it’s published.

Press 1 for humans. Press 2 for robots.

In the wake of all this hoo-hah about Tay, Microsoft has announced that it’s opening up the software kit that was used to create its rogue bot, making it available to any business that wants to build its own chatbots. One might almost think this media storm was generated to create PR for a new enterprise product. Either way, the fact is, bots are very hot right now in the high-tech sector.

So why this growing interest in these fake, flaky characters? Rest assured, it’s not just so the internet can have a laugh at seeing yet another bot go off the rails. Microsoft has a far more serious intent. Bots are about to become mainstream business. In the words of CEO Satya Nadella, “we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence.” Translation? Bots will offer a way to slash the cost of doing something most businesses seem to hate: helping customers.

Employing customer service teams costs money – even when you cram them into the ‘new dark satanic mills’, as trade unionist Mark Serwotka famously described call centres. And it’s not just that these customer service operations are expensive; they’re not scalable. If it takes one human being to support 1,000 customers, it will take 100 to support 100,000. That ruthless logic stands in the way of the kind of exponential growth, and minimal costs, that the financial markets demand of the firms they invest in. So wouldn’t it be great if we could automate customer services, just as we’ve recently automated every other aspect of business? That way a firm could multiply its customer base a thousandfold, with just a small increase in the computing power of its server farm.
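To make that arithmetic concrete, here’s a back-of-the-envelope sketch – every figure in it is invented for illustration, not drawn from any real business – of how human support costs scale with the customer base while a bot’s, in theory, stay flat:

```python
# Back-of-the-envelope comparison of human vs automated support costs.
# All numbers below are illustrative assumptions, not real figures.

CUSTOMERS_PER_AGENT = 1_000      # assume one human can support ~1,000 customers
COST_PER_AGENT = 25_000          # hypothetical annual cost of one agent
FLAT_BOT_COST = 100_000          # hypothetical annual cost of running the bot servers

def human_support_cost(customers: int) -> int:
    """Cost scales linearly: a hundred times the customers, a hundred times the staff."""
    agents_needed = -(-customers // CUSTOMERS_PER_AGENT)  # ceiling division
    return agents_needed * COST_PER_AGENT

def bot_support_cost(customers: int) -> int:
    """Cost is (roughly) flat: the same server farm handles everyone."""
    return FLAT_BOT_COST

for customers in (1_000, 100_000, 1_000_000):
    print(customers, human_support_cost(customers), bot_support_cost(customers))
```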

The tech sector is already trying to replace human taxi drivers with driverless cars, and restaurant staff with friendly serving robots that never ask for a tip. Now they’re also hoping to persuade us to direct our queries and complaints to customer service chatbots instead of people – with obvious consequences for employment in those parts of the country where contact centre operations are concentrated. It’ll be no fun for the rest of us, either. If you think it’s frustrating dealing with a human being who’s working from a script, just wait until you first hear Tay say, ‘I’m sorry, Dave. I’m afraid I can’t do that.’

This is all our fault, of course. It’s we who demand online services that are as close to free as makes no odds. If we were willing to pay for human-to-human contact, we’d get it – but in fact we hate talking to call centres. So already the bots are marching in to commandeer the headsets from the phone operators.

Here’s what Tay is really for – what Siri is for, what Cortana is for. The bots are marching in. They’re coming over here and stealing our jobs. When they do, let’s not forget that we were the ones who invited them in.

---

Matthew Blakstad's debut novel, Sockpuppet, is out May 19 from Hodder & Stoughton. You can follow him (or perhaps an algorithmic interpretation thereof? You decide) at @mattblak or find out more at Hodderscape.
