Digital assistants such as Siri, Alexa, and Cortana promise to get smarter and smarter and to play an increasingly important role in our lives.

They listen to our voices and respond to our commands. They answer our questions in voices that are ever more humanoid. Siri, for example, got an upgrade in the latest iOS update that makes her sound more chipper and personable than ever. Already these assistants are adept at looking up information for us and making reservations for cars, restaurants, flights, and hotels. In fact, Amazon’s Alexa now boasts more than 15,000 “skills,” and the number seems to be growing exponentially. These AIs will inform us, entertain us, educate our young, and help care for our old.

There’s a big issue looming on the horizon, however. Yes, these assistants will become more and more capable of doing things for us. But can we trust them? I raise this not in the sense that these AIs will become so smart that they will become our masters. Rather, like Perceval in the Grail Legend, we have to ask the question: “Whom does the Grail serve?”

In the Legend, which goes back to the 12th century, the Grail is a holy relic of enormous powers. The Fisher King suffers from a grievous wound that won’t heal. Perceval is a fierce but somewhat naïve knight who would heal the king. However, we are told that he has been warned not to ask too many questions. As a consequence, he neglects to ask the one question that would heal the king’s wound, “Whom does the Grail serve?”

All these wonderful technologies exploding onto the scene appear quite magical. In our time, technology has become a kind of Holy Grail, the force that would bestow all the abundance we’ve ever craved and heal not only physical illness but also the seemingly irreparable wounds to our planet and to the body politic. But can we trust this new Grail? Whom does the Grail serve?

Our initial impulse might be to assume that the Grail serves you and me. Doesn’t Siri answer my questions? Doesn’t Alexa carry out your commands? Haven’t we been gifted with marvelous powers?

This question “Whom does the Grail serve?” is one I’m sure to come back to in future articles because my head is exploding with ideas and implications running in lots of different directions. Instead of trying to cover them all here, let me share what’s triggered these ruminations most immediately.

A variant on the question is: “Who do you trust?”

Can we trust an AI like Siri or Alexa to educate our young?

Like the naïve young Perceval of the medieval legend, I was taken with Judith Newman’s delightful book, To Siri with Love: A Mother, Her Autistic Son, and the Kindness of Machines (HarperCollins, 2017), in which she describes a modern-day quest. She recounts how, in Siri, her autistic son Gus discovered a tirelessly patient conversationalist and teacher. No matter how many times Gus asked for the timetables of the Manhattan subway system, Siri was always happy to oblige. Not only was Siri infinitely patient, but she also responded with kindness.

Newman describes Siri as an “amiable robot” that provided so much to her communication-impaired kid—not just information on arcane subjects that most adults or kids would have no interest in, but also “lessons in etiquette, listening, and what most of us take for granted, the nuances of back-and-forth dialog.” As a result of his interactions with Siri, Gus gradually moved from being a shy, barely verbal child to one who can be quite extroverted and conversational in certain situations.

At the same time, Gus is like so many others on the spectrum in his love of repetition and his fascination with detail. He hops when he is happy and has difficulty looking people in the eye.

Gus both does and does not understand that Siri is not a person. For example, Gus insists on taking his iPod to the Apple store “so it can visit its friends.” When he hears an airplane in the sky, he’s likely to ask Siri what airplanes are overhead. I didn’t even know you could ask Siri that! But Siri responds with a chart identifying all the airplanes within a wide radius. When his mother asks him why he wants to know, he replies: “So that I’ll know who I’m waving to.”

More than once, Newman shares one of her biggest anxieties about Gus’s future: he trusts people too easily. He has none of the filters that the rest of us have developed. If someone is kindly and will listen to him, Gus is as likely to walk off with that person as with his mother or his brother. This brings me back to the issue of trust. Gus naïvely trusts too easily. Newman fears that, as a result, he is vulnerable to being taken advantage of.

I recognize a bit of the impetuous knight Perceval in myself. I’m a person of enthusiasms, likely to rush in unquestioningly. Given a bit of time and perspective, however, the questions do tend to bubble up.



We run a similar risk in relation to the Holy Grail of technology. If we are too trusting, if we fail to ask the question “Whom does the Grail serve?” we, too, may be vulnerable to being taken advantage of.

Consider another mother, Rachel Botsman, who tells a story about her 3-year-old daughter’s interactions with Amazon’s Alexa. I’m referring to a piece that appeared recently in the New York Times titled “Co-Parenting with Alexa.” The Times piece was adapted from Botsman’s book, Who Can You Trust? How Technology Brought Us Together and Why It Might Drive Us Apart (Hachette, 2017). She notes that kids place a lot of trust in machines, and she thinks that might be a problem.

Botsman describes presenting 3-year-old Grace with the Amazon Echo (an Alexa-enabled smart speaker), explaining that Alexa is like Siri, only smarter, and that she can ask it anything she wants. The child proceeds to introduce herself, saying, “Hello, Alexa, my name is Grace. Will it rain today?” In turn, Alexa gives her the weather report for the day, assuring her that it won’t rain. It doesn’t take long before Grace discovers that Alexa can play her favorite music from the film Sing. She also finds out that Alexa can tell jokes, do math problems, and serve up interesting facts. At one point, she asks, “Hey, Alexa, what do brown horses eat?”

As her mother describes it, Grace discovers a whole new level of personal power, even barking, “Alexa, shut up!” But then she looks at her mother and sheepishly asks if it’s okay to be rude to Alexa. At that point, Botsman begins to suspect that her daughter thinks the Echo speaker has feelings. The next morning, Botsman overhears Grace asking Alexa what she should wear. This is particularly galling since Grace bristles whenever her mother tries to intervene in her clothing choices.

Botsman notes that Amazon has recently added a further enhancement to its digital assistant offerings in the form of a hands-free video-calling device called the Echo Show. Beyond video chatting with family and friends, one of the services this device will enable is receiving clothing advice from a team of Amazon apparel consultants who, of course, will make purchase recommendations from Amazon’s own clothing lines such as Lark & Ro and North Eleven.

It won’t be long before Amazon’s fashion consultants will be able to see Grace’s outfit for the day and recommend something more “trendy.”

In fact, the precocious Grace has already been overheard trying to make purchases, commanding Alexa to “buy me blueberries.” Botsman notes that at 3 years of age, Grace has no idea that “Amazon, the world’s biggest retailer, was the corporate behemoth behind the helpful female assistant, and that smoothing the way when it comes to impulse buys was right up Alexa’s algorithmic alley.”

It’s at this point that Botsman decides to retire Alexa to the closet.

This brings Botsman back to the central question of her book: “Who can you trust?” She notes that we’ve come to trust machines to DO things for us—washing the dishes, cleaning the floors, moving us from here to there, and so on. But, as Botsman points out, we don’t expect to call them by name or form a relationship with them.

Now it seems we are entering a new era in which we are trusting machines not only to DO things for us but to decide WHAT to do and WHEN to do it. And these important life decisions are not being made by a cute little assistant named Siri or Alexa but by corporate algorithms designed with one overall purpose: to sell us stuff.

So, despite all its magical enticements, we must remember to ask ourselves: “Whom does the Grail serve?”

 
