Jane Bradley: Pioneer of AI will be turning in his grave

The modern versions of the 'virtual assistant' – Apple's Siri or Amazon's Alexa – have been given female voices, presumably after much market research and focus group planning. Picture: Contributed

The developers of Artificial Intelligence are ignoring the warnings of one of its pioneers, writes Jane Bradley.

In the 1980s, an MS-DOS computer program called Eliza came automatically installed on our home PC.

Originally created to attempt the Turing Test in 1966, she was coded to mimic the conversation a psychotherapist might have with a patient.

Purportedly able to have a real time – and realistic – conversation with the user, Eliza offered to discuss your problems and replied to whatever you told her with a seemingly relevant response.

“Come, come, elucidate your thoughts,” she told my friends and me, soothingly, whether we told her that we didn’t like our dinner that night or that we thought our teacher was actually a vampire.

Of course, even as a fairly young child, I knew the program was just that – programmed. And once I had used it a few times, I discovered that the same responses came out time and time again, no matter what we put in. In short, Eliza was clearly not real.
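Eliza's trick – scanning the user's input for keywords and slotting fragments of it into canned reply templates, with a stock fallback when nothing matches – can be sketched in a few lines. The rules below are hypothetical simplifications for illustration, not Weizenbaum's original script:

```python
import re

# Toy ELIZA-style responder. These rules are illustrative inventions,
# not the patterns from Weizenbaum's 1966 program.
RULES = [
    # "I am X" / "I feel X" -> reflect the feeling back as a question
    (re.compile(r"\bI (?:am|feel) (.+)", re.IGNORECASE),
     "How long have you been {0}?"),
    # "my X" -> prompt for more about X
    (re.compile(r"\bmy (\w+)", re.IGNORECASE),
     "Tell me more about your {0}."),
]
# Stock response when no keyword rule matches
FALLBACK = "Come, come, elucidate your thoughts."

def respond(utterance: str) -> str:
    """Return the first matching canned reply, else the fallback."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return FALLBACK

print(respond("I am unhappy"))              # How long have you been unhappy?
print(respond("my teacher is a vampire"))   # Tell me more about your teacher.
print(respond("hello there"))               # Come, come, elucidate your thoughts.
```

Because the same fixed templates fire on every matching input, repeated use quickly exposes the repetition the author describes.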

Yet, it turns out that we cynical children were fairly unusual. The program’s creator, German-American computer scientist Joseph Weizenbaum, who himself regarded Eliza as a method to show the superficiality of communication between man and machine, ended up surprised by the number of people who attributed human-like feelings to the inanimate creation.

In an interview, Weizenbaum explained: “My secretary, who had watched me work on the program for many months and therefore surely knew it to be merely a computer program, started conversing with it. After only a few interchanges with it, she asked me to leave the room.”

Weizenbaum’s secretary, who logically knew that Eliza was an inanimate creation, found her connection to the bot so hugely personal that she wanted to keep their conversations private.

And therein lies the danger of a world where machines are given human attributes. The modern version is the “virtual assistant” – Apple’s Siri or Amazon’s Alexa. Both are, presumably after much market research and focus group planning, given female names and soothing voices to boot.

Earlier this week, Amazon announced plans to introduce a virtual “butler” service in hotels through Alexa. The technology tie-up – initially with the Marriott hotel brand – will allow guests to use the Amazon Echo in their room to ask Alexa for hotel and tourist information, contact hotel staff such as the concierge to request guest services, make dinner reservations and play music in their room.

Cards in the rooms, pictured to promote the new initiative, will prompt guests to ask “Alexa, turn on the TV” or “Alexa, show me the front door”. These are all things which, with just a tiny amount of research, or by moving just a few feet from our seats, we could easily accomplish or discover.

Of course, these things would require just an iota of either brain power, physical movement or human interaction. And while all these things are repeatedly found to be good for us, both physically and mentally, they have somehow become the things that we, as a modern society, most want to avoid.

Weizenbaum himself, in his 1976 book Computer Power and Human Reason, flatly denied that human intelligence could ever be formulated by machine-responsive equations and rules. As a pioneer of Artificial Intelligence, he realised what damage its creation could do to society.

One of the things he was most appalled by was that psychiatrists suggested the Eliza program might be an acceptable substitute for human therapy. Now, computer programs and apps offering Cognitive Behavioural Therapy (CBT) are regularly offered up as an alternative to patients with depression who would otherwise be on long waiting lists to see a counsellor face to face. Weizenbaum, who died in 2008, six years before the invention of Alexa, would turn in his grave.

On a daily basis, we are now bombarded with conversations which may or may not be with real people. Phone the bank and you are greeted by a recorded voice. It takes quite a few presses of the button and often a requirement to answer multiple questions put to you verbally before you can speak to a living, breathing, human being. Similarly, online chats which pop up in the corner of many a company’s website require you to converse with what is often a set of automatically generated phrases – please input your user name, account number, describe in ten words what your problem is – before you actually get the paid-for time of a real-life person.

We apparently go out of our way to avoid human contact yet various studies have shown the mental and physical impact of loneliness. We are urged to speak to our elderly neighbours, make sure that people are not left without someone to talk to, even if it is just to pass the time of day. We are happy to share these articles on social media, but in real life, for some reason we want to actually interact with other humans as little as possible.

Alexa is not real; she is not your friend. She is no more “intelligent”, and has no more human compassion, than the original Eliza. Weizenbaum was right: there is no alternative to human contact.