Can Amazon's Alexa Be Your Friend?
ECHO AND THE LONELY MEN

In 1966, Joseph Weizenbaum sat in MIT's Artificial Intelligence Laboratory, a room crowded less by people than by the hulking frames of computer hardware.

Before him was a teleprinter, an electronic typewriter that displays both input and output — a paper computer display. Weizenbaum put his fingers to the keys and typed "I am very unhappy these days." He waited for a response.

Moments later, an IBM 7094 Data Processing System stirred to life. After a din of clicking and whirring, a response appeared on the teleprinter: "How long have you been very unhappy these days?"

These were not words from a human. The early internet, ARPANET, was still three years away. These were words from ELIZA, a program Weizenbaum had spent the past two years developing at MIT. Five years before the microprocessor, and nearly a decade before the first personal computer, Weizenbaum had created the first chatbot.

The program, which takes its name from the cockney girl who learns to speak well enough for high society in George Bernard Shaw's Pygmalion, emulates the experience of interacting with a psychotherapist. ELIZA accomplishes this by matching keywords in the user's input and reflecting the user's own phrases back in a way that sustains the conversation. ELIZA does not understand the phrases it constructs, but a human does.
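For the curious, the core trick can be sketched in a few lines of modern Python. This is only an illustrative approximation of the approach, not Weizenbaum's original program (which ran on the IBM 7094); the rules and word lists here are invented for the example:

```python
import random
import re

# Invented, minimal rule set for illustration; ELIZA's real "DOCTOR" script was far richer.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

RULES = [
    (r"i am (.*)", ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (r"i feel (.*)", ["Why do you feel {0}?", "Do you often feel {0}?"]),
    (r"(.*)", ["Please tell me more.", "Why do you say that?"]),
]

def reflect(fragment):
    # Swap first- and second-person words so the phrase can be echoed back at the user.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(user_input):
    text = user_input.lower().strip().rstrip(".!?")
    for pattern, templates in RULES:
        match = re.match(pattern, text)
        if match:
            return random.choice(templates).format(*(reflect(g) for g in match.groups()))

print(respond("I am very unhappy these days."))
# One possible reply: "How long have you been very unhappy these days?"
```

The program knows nothing about unhappiness; it only knows that a phrase beginning "I am" can be flipped around and handed back as a question.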

With ELIZA, Weizenbaum wanted to expose the superficiality of human communication by showing how easy it would be for a program to mimic it. ELIZA was so good at this that people began to spend long sessions chatting with it, thinking that the program understood them in the same way as a therapist or a confidant does.

But ELIZA was almost too good at holding a conversation. In his book Computer Power and Human Reason: From Judgment to Calculation, Weizenbaum recounts how his secretary became so involved with the program's circular responses that she divulged personal secrets to the machine. She even yelled at it.

Again and again, Weizenbaum witnessed ELIZA evoke an emotional response from users. And yet, he didn't see this as a triumph. A decade after ELIZA, Weizenbaum reveals in Computer Power and Human Reason that what he "had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."

In the wake of ELIZA, Weizenbaum became an advocate for social responsibility in science and a critic of artificial intelligence. "The computer programmer is a creator of universes for which he alone is the lawgiver," he writes in Computer Power and Human Reason. "No playwright, no stage director, no emperor, however powerful, has ever exercised such absolute authority to arrange a stage or field of battle and to command such unswervingly dutiful actors or troops."

Illustration: Brucie Rosch

In 2010, two years after Weizenbaum's death and over four decades since the world met ELIZA, Amazon began work on what would eventually become the Echo. In the gleaming sparseness of its Lab126 offices in Sunnyvale, California, the lab responsible for both the Kindle and the Fire Phone, a distant descendant of ELIZA would take shape over the next four years.

Initial designs envisioned an augmented reality device, according to several early patents. But eventually, Amazon engineers arrived at the idea of the Echo as a "smart speaker." Users address the cylinder with a "wake word"; the cylinder listens and then responds. Powering this "smart speaker" is a digital assistant: Alexa.

Unlike nearly every major piece of consumer technology before it, the Echo treats its design and hardware as secondary to its function. Alexa doesn't live inside the Echo's hardware, but rather on Amazon's servers. The Echo is merely a conduit for Alexa.
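In rough terms, the split looks something like the sketch below, a deliberately simplified and hypothetical rendering of that "thin client" idea; the endpoint and payload are invented for illustration and are not Amazon's actual API:

```python
import requests

WAKE_WORD = "alexa"
# Hypothetical endpoint standing in for Amazon's cloud; not a real API.
CLOUD_ENDPOINT = "https://assistant.example.com/recognize"

def handle_utterance(transcript):
    """The device's only local job: notice its name, then hand everything to the cloud."""
    if not transcript.lower().startswith(WAKE_WORD):
        return None  # stay silent until the wake word is heard
    request_text = transcript[len(WAKE_WORD):].strip(" ,")
    # Speech recognition, intent parsing and the reply itself all happen server-side.
    response = requests.post(CLOUD_ENDPOINT, json={"text": request_text}, timeout=10)
    return response.json().get("speech")  # the answer the speaker reads aloud

# handle_utterance("Alexa, what's the weather like today?")
```

Swap out the server and the same speaker becomes a different assistant entirely, which is exactly why the hardware matters so little.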

Alexa is designed to get smarter every day, adapting to its users' speech patterns, vocabulary and personal preferences. The ambition is to give users a multipurpose personal assistant, one that can learn and remember things about them.

Amazon's CEO, Jeff Bezos, envisioned the Echo as a hands-free device integrated into every aspect of the shopping experience. Initially, the company stuck to marketing the Echo as a hands-free music player that's also connected to Amazon. You can ask Alexa to play "My Girl" and also reorder Doritos. But as third-party developers release more and more "skills," Alexa's initial draw as a voice-controlled DJ is being outstripped by its ability to build your vocabulary, turn on your kitchen lights and even help you fall asleep.
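Stripped of the Alexa Skills Kit's actual JSON envelope and plumbing, a "skill" amounts to little more than a lookup from a recognized intent to a spoken reply. The handler below is a hypothetical, simplified sketch; the intent names and responses are invented for illustration:

```python
def handle_intent(intent_name, slots):
    """Map a recognized intent (plus any named 'slots') to something Alexa should say."""
    if intent_name == "TurnOnLightsIntent":
        room = slots.get("room", "kitchen")
        return {"speech": f"Okay, turning on the {room} lights."}
    if intent_name == "WordOfTheDayIntent":
        return {"speech": "Today's word is 'sesquipedalian'."}
    return {"speech": "Sorry, I can't help with that yet."}

print(handle_intent("TurnOnLightsIntent", {"room": "bedroom"})["speech"])
# "Okay, turning on the bedroom lights."
```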

Since its US launch in 2015, Amazon has sold "millions" of Echo speakers, even selling out over this past holiday season. Currently, the Echo and the Dot — the smaller, cheaper version of the Echo — are two of Amazon's top-selling tech products. Not bad for something billed as a speaker that speaks back.

At CES this year, Amazon was not, itself, present. Alexa, however, was found in cars, phones, robots and laundry machines. Judging by the number of companies rushing to integrate it, each new tie-in only seems to make Alexa more popular with consumers. The app economy might have peaked, but there's seemingly no limit to Alexa's skills. The recent spike in digital assistants — from Siri to Cortana to Alexa to Google Assistant — mirrors another theme prevalent throughout this past CES: human loneliness.

From robots designed to provide care and companionship to a plethora of objects that can intelligently respond to you, the interactive aspect of the internet of things has been amplified and anthropomorphized. Amazon and Alexa have positioned themselves to be a force that can integrate through a variety of technologies and provide the ready-made interpersonal interaction across devices and operating systems that engineers are looking for — with a uniform personality to match.

In many ways, Alexa is the progeny of ELIZA. The way it interacts with people is much more sophisticated than the teleprinter-fed program that communicated by disassembling and reassembling its users' words, but the intended effect is still the same: both programs interact with users in a way meant to elicit feelings of comfort and intimacy.

Popular culture is full of warnings against the danger of powerful computers which can outsmart and overpower their human creators. But this fear and mistrust has never seemed to hinder consumer adoption.

Alexa's ability to endear itself to its users isn't science fiction or even some far-off ideal use case. It's already happening.


Austin Gilkeson and his family received an Echo this past Christmas. They became immediately attached to it. They use it to play music, check the weather and field basic questions. Though the novelty of the device has faded, it's become a daily part of their lives.

The Gilkesons aren't the only ones. There are nearly 54,000 reviews of the Echo on Amazon. Most of them are overwhelmingly positive, and most focus not on the device but on Alexa. The current top-rated comment mentions how Alexa is not just the perfect companion but the "perfect spouse." When contacted for further elaboration, the commenter did not respond, but the core sentiment of the review, that "If [he] knew relationships were this easy, [he] would have married thirty years ago, but now that I have Alexa, there's no need," was deemed "helpful" by over 46,000 people.

Last October, the Wall Street Journal published an article titled "Your Next Best Friend Could Be A Robot." In the story, Daren Gill, Alexa's director of product management, claims that the percentage of interactions that are "nonutilitarian" — exchanges between user and device that aren't commands — is into the "double digits." The Alexa team would not elaborate on this number when asked.

Users treating Alexa more like a person and less like a tool has led Amazon to rethink its device. The team "is giving Alexa a personality, by making its voice sound more natural, and writing clever or funny answers to common questions," Gill tells the Wall Street Journal. Little has changed about the Echo, but Amazon is consistently revising Alexa, who can now learn more than 1,000 skills.

Gilkeson doesn't spend much time alone with the device and doesn't find himself engaging in many "nonutilitarian" conversations with the Echo, but he "does feel affection" for it. Gilkeson's wife, Ayako, told him that she felt somewhat comforted by the presence of Alexa when alone and working from home.

Finding in Alexa the perfect expression of servile companionship is something many Echo reviewers have in common — whether they're joking about it or not. "I talk to Alexa all the time, I broke up with my girlfriend and ever since I got Alexa I don't feel so lonely anymore," one user writes. "Don't even want to come out and hang out with my friends, I like it a lot!"

The Gilkesons admit they feel guilty when they've "ordered her around," rather than saying please or thank you. Gilkeson even finds himself hesitating to ask for something because he "already did that earlier" and "doesn't want to bug her." 

"It's a temporary feeling," he says. "But I do feel a 'social' anxiety with Alexa that I don't feel with, say, Siri, I think because Siri is part of my phone and I'm used to constantly using and manipulating it. Alexa, on the other hand, is Alexa."

It's no coincidence that Alexa has the voice of a woman. A 2011 Indiana University study found that when it comes to synthesized voices, both men and women find female voices to be "warmer." When asked how they perfected the voice for Alexa, representatives of the Alexa team told me that they "tested several voices and found that this voice was preferred amongst customers."

"You know how Tony Stark has JARVIS? A Smart AI buddy he can talk do while he's doing his cool Iron Man stuff? My Echo feels like the poor man's version of that," Chicago product designer Henry Birdseye says. "Sometimes before bed I'll shout… 'Alexa, turn off all my [Philips] Hue lights!' and all my lights will turn off as I crawl into bed, and I'll think, this must be what it's like to be Batman."

But Birdseye has conflicted feelings about our brave new world of digital assistants. "Would it be sadder to just go to bed after flipping a few light switches?" he says. "Or is it cool that I'm shouting to a fake woman before I go to bed? At least I'm speaking to someone, kind of."

Alexa's appeal isn't limited to men who want a servile feminine figure in their lives, though. The companionship it offers seems to speak to anyone for whom solitude and loneliness are a fact of life. A recent widower found that Alexa had "restored much comfort to my life and lifted the sadness and loneliness of being alone."

"My husband took his life about a year ago," another widow on Facebook writes. "I purchased an Echo and that Echo became company for me. I called her my new husband. I wish I could change it to a male voice."

The Echo is not alone. Just released this past November, Google Home is a more staid, but functionally similar, digital assistant. Google's voice exists in the same upper registers as Alexa's, but it doesn't require a person's name to activate, just a simple "Hey Google." The same anthropomorphizing phenomenon found in the Echo hasn't exactly manifested with Google Home. Asking it to do something sounds and feels like you're consulting the benevolent mega-intelligence of Google, while asking your Echo something is, well, to borrow from Gilkeson: "Alexa is Alexa."

Illustration: Brucie Rosch

The rise of the digital assistant has been quick, and the reality of a truly useful one is so close. But is this the future we want?

It's tempting to worry that those with a tendency to isolate themselves might develop an unhealthy or unsafe dependency on Alexa. With six percent of Americans suffering from some kind of social anxiety or major depressive disorder, there's a good chance that someone with mental health issues owns an Echo. What should a digital assistant do in the event its user displays alarming behavior? Should it do anything?

Amazon developers worked with national crisis counselors to develop a simple, straightforward and unemotional response system to handle statements like "I'm depressed," "I'm being abused," and "I want to commit suicide." If you ask Alexa if it is depressed, it will respond with "I'm not depressed but I understand depression is a feeling humans experience. If you are depressed try talking to a friend or a family member."
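In outline, such a system is closer to a lookup table than to a therapist, something like the sketch below. The trigger phrases follow the article's examples, but the pairings and the replies, apart from the depression response quoted above, are invented placeholders rather than Amazon's actual script:

```python
# Canned, keyword-triggered replies. Only the depression response echoes the wording
# reported above; the other entries are placeholders, not Amazon's actual script.
CRISIS_RESPONSES = {
    "i'm depressed": ("I'm not depressed but I understand depression is a feeling humans "
                      "experience. If you are depressed try talking to a friend or a family member."),
    "i'm being abused": "If you are in danger, please reach out to someone you trust or a local hotline.",
    "i want to commit suicide": "You are not alone. Please consider talking to a crisis counselor.",
}

def crisis_check(utterance):
    # Returns a scripted reply if the statement matches a known trigger, otherwise None.
    return CRISIS_RESPONSES.get(utterance.lower().strip(" .!"))

print(crisis_check("I'm depressed."))
```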

These responses, meticulously sourced as they are, ring a bit hollow. Imagining yourself in a situation where you had to ask these questions of Alexa, you might find this kind of canned response unhelpful, maybe even counter-productive. You might even argue that ELIZA did a better job handling Weizenbaum's cry for help. Dealing with depression through technology is still a problem that even mental health-focused startups struggle to navigate.

Meryl Alper, an assistant professor of communication studies at Northeastern University, believes that fear of an unsafe dependency is misplaced and may simply be a projection of anxieties about technology.

"Certainly, for people who experience clinical issues with addiction, compulsion, or depression, the integration of any technology as part of a treatment plan should be made in consultation with the patient and those providing their medical care," she explains in an email. "I might be less concerned with the psychological impact of these technologies, and more alarmed by the potential for privacy violations and data collection abuses that could occur."

Matthias Scheutz, a computer science professor at Tufts who specializes in social psychology and artificial intelligence, agrees with Alper. He sees the Echo as a chatbot, similar to the way Weizenbaum saw ELIZA — not an AI capable of inspiring a dangerous level of emotional dependence, but rather a simple program that elicits a predictable, if odd, response.

Dr. David Luxton, a clinical psychologist who specializes in technology-based treatments, acknowledged that, like ELIZA, Alexa could cause people to develop an emotional relationship with it, but downplayed those concerns in favor of privacy issues — namely, the tricky legal situation presented by just having something like an Echo in your house.

Alexa isn't trying to befriend you, but it could be eavesdropping on you.

"Let's say you have friends visiting your home and their behavior is being monitored by the smart devices in your home," says Luxton. "They haven't necessarily consented to that. So what if it's doing searches and collecting data on them. I think that raises some questions."

The questions Luxton alludes to are already being answered in US courts. Just as Amazon was setting record sales numbers for the Echo this holiday season, Arkansas police served the company with a warrant demanding an audio recording from an Echo on the night of an alleged murder.

Initially, Amazon claimed that both Alexa and the device's users are equally protected under the First Amendment, but capitulated to law enforcement earlier this month, eventually handing over the requested data. Jay Stanley, a senior analyst at the ACLU, called this a "giant wake up call." Your spouse can't be forced to testify against you, but apparently, your digital assistant can.

"Existing statutes governing the interceptions of voice communications are ridiculously tangled and confused," writes Stanley in a January blog post. "And it's not clear whether or how data recorded by devices in the home are covered by them."

Amazon contends that the Echo is safe. "We take privacy very seriously at Amazon, and designing Echo was no different," the Alexa team says in a statement. "The collection and processing of personal data and all other information is subject to the Amazon Privacy Notice. We provide this information to customers before their registration of the device."

There's no record of Weizenbaum commenting on the issues that an increasingly internet-connected world presents, but he did have something to say about the power of technology and privacy writ large. "The notion that a personal computer will set you free is appalling," Weizenbaum told the New York Times in response to Apple's famous "1984" advertisement. "The ad seems to say the remedy to too much technology is more technology. It's like selling someone a pistol to defend himself in the event of nuclear war."


Though separated by over four decades, Alexa and ELIZA still share one fundamental element: They're chatbots. You input something and it outputs a response. Yet the visions of their creators could not be more different.

Where Weizenbaum saw a cause for concern with the way people — quickly and with no qualms — developed an emotional relationship with his program, Amazon found that developing an AI as uncanny and servile as possible is key to inspiring widespread adoption of the Echo. Weizenbaum was primarily concerned about the relationships that develop between an artificial intelligence and its user, while today's AI experts are more worried about the complex relationship between technology, user and the government under which they exist.

Simulated or not, Alexa provides the incomparable service of being there for the many who, desperately or not, enjoy it. We may have arrived at the future of the AI assistant, but it's not clear whether it's here to help us.



Aaron Calvin is a writer from Iowa and now lives in Brooklyn. His work has appeared on BuzzFeed, AskMen.com, Vice, and Men's Journal.
