Possible Problems of Persona Politeness

One of my AIs is funnier than the other. This is proving to be a problem.

But first, consider how the amazing becomes normal very quickly. It feels like I've been using Siri on my phone my entire life, Siri on the iPad charging by my bed since forever, and Siri on my watch since last summer. I've not, of course. She's only four years old this October. But nevertheless, as with any new life-spanning tech, she's become background-banal, in a good way, remarkably quickly. Voice interfaces are, without a doubt, A Thing.

And so it is with Alexa, the persona of the Amazon Echo, living in my kitchen for the past fortnight. She became a completely integrated part of family life almost immediately. Walking into the kitchen in the morning, ten-month-old daughter in one hand, making my wife tea with the other, I can turn on the lights, listen to the latest news from the radio, check my diary, and order more milk, just by speaking aloud, then turn it all off again as I leave. It's a technological sprezzatura sequence that never fails to make me smile. Thanks, Alexa, I say. Good morning.

But there's the rub. Alexa doesn't acknowledge my thanks. There's no banter, no trill of mutual appreciation, no silly little, "it is you who must be thanked" line. She just sits there sullenly, silently, ignoring my pleasantries. 

And this is starting to feel weird, and makes me wonder if there's an uncanny valley for politeness. Not one based on listening comprehension, or natural language parsing, but one based on the little rituals of social interaction. If I ask a person, say, what the weather is going to be, and they answer, I thank them, and they reply back to that thanks, and we part happy. If I ask Alexa what the weather is, and thank her, she ignores my thanks. I feel, insanely but even so, snubbed. Or worse, that I've snubbed her.

It's a little wrinkle in what is really a miraculous device, but it's a serious thing: The Amazon Echo differs from Siri in that it's a communally available service. Interactions with Alexa are available to, and obvious to, everyone in the house, and my inability to be polite with her has a knock-on effect. My daughter is too young to speak yet, but she does see and hear all of our interactions with Alexa. I worry what sort of precedent we are setting for her, in terms of her own future interactions with bots and AIs as well as with people, if she hears me being forced into impolite conversations because of the limitations of her household AI's interface. It's the computing equivalent of being rude to waitresses. We shouldn't allow it, and certainly not by lack of design. Worries about toddler screen time are nothing compared to future worries about inadvertently teaching your child to be rude to robots.

It's not an outlandish thought. I, myself, am already starting to distinguish between the personalities of the different bots in my life. Phone Siri is funnier than Watch Siri; Slackbot is cheeky, and might need reeducating; iPad Siri seems shy and isolated. From these personalities, from these interactions, we'll take our cues. Not only in how to interact, but when and where, and what we can do together. If Watch Siri was funnier, I'd talk to her more. If Phone Siri was more pre-emptive, our relationship might change. And it's in the little, non-critical interactions that their character comes through.

All this is, of course, easier said than done by someone who isn't a member of the Amazon design team - hey there Lab126, beers on me if you're in LA soon - but there's definitely interesting scope to grow the seemingly extraneous stuff that actually makes all the difference. Personality Design for AIs. That's a fun playground. Is anyone playing there?
