It’s normal to have bickering in households. Spouses and kids, in any combination, can squabble over just about anything. But you know what never gets into an argument in my home? My toaster and my microwave. Also, my washing machine and vacuum get along great. Never a harsh word between them.

That may be changing. Possible appliance strife is a real concern to many consumers. At least when it comes to virtual assistants.

Tools like Siri, Alexa and Cortana are, after all, appliances. Ericsson’s Consumer & Industry Lab conducted a recent survey of what they call “advanced internet users”. From it, they generated a list of 2019’s top 10 trends. Number two: smart quarrels.

In the survey, 47% of virtual assistant users said that different devices may well give them different answers to even the most basic of questions. Two-thirds of those surveyed believe that, within a few years, the virtual assistants in their household will start sniping at each other just like family members.

That’s one reason why 41% of people Ericsson surveyed think it will be important for couples to have compatible virtual assistants.

A way to express control

Thinking of Siri and Alexa as if they have human characteristics – whether the ability to argue or the need to get along – isn’t really surprising. After all, we give these tools human names, we talk to them, and they talk back.

As an article in Quartz notes, anthropomorphism (ascribing flesh-and-blood traits to inanimate objects) is hardly new. The piece cites Homer's epics, in which people named their swords and ships. Today, people still give human names to their boats, and often to their cars, guitars, bikes, laptops, even furniture.

Quartz says there are three primal reasons to anthropomorphize an object: 1) it looks like it has a face; 2) we can’t explain its unpredictable behaviour; and 3) we’d like to be friends with it.

Another article in The Atlantic says that giving things a human name can be a way to express affection, acknowledge that they do a job that people once did, and show faith. It can also be a way to exert control, even when we lack it. (The article notes how we give human names to other powerful forces, like hurricanes for instance.)

Are virtual assistants our pals? While they do keep us company, some critics worry that they might get in the middle of relationships.

What happens if we enjoy talking to them more than to the people who share our homes? Or when your Siri disagrees with my Alexa? Or when a virtual assistant and spouse have different takes? Whose side are you on?

Also, who do you confide in and trust the most?

The Ericsson survey found that half of those currently using virtual assistants think the devices will soon grasp their emotions. Four in 10 think smartphones will understand them better than their friends do. And twice as many people trust an AI device more than a human to keep their secrets. We name them because we want to trust them.

Becoming friends with virtual assistants? That’s the least of it. There’s a report that over a one-year period more than 1 million users asked Alexa to marry them. Maybe for fun, to see how it would answer, but who knows.

What are we learning about relationships?

How we communicate with virtual assistants could have other impacts on our non-virtual relationships.

For instance, might children use these devices to replace their parents as sources of knowledge and advice? Will we become impatient with people when they don’t give us answers as quickly as Siri? Will speaking rudely to our devices cost us?

The Harvard Business Review weighed in, with a piece that urged users not to swear at their machines or raise their voices at them.

“If adaptive bots learn from every meaningful human interaction they have, then mistreatment and abuse become technological toxins,” HBR suggested.

Describing the use of these virtual assistants at work, HBR said: “Just as one wouldn’t ridicule a subordinate, the idea of mistreating ever-more-intelligent devices becomes unacceptable. These inanimate objects are explicitly trained to anticipate and respond to workplace needs. Verbally or textually abusing them in the course of one’s job seems gratuitously unprofessional and counterproductive. Crudely put, smashing your iPhone means you have a temper; calling your struggling Siri inappropriate names gets you called before HR.”

Maybe we should focus on stemming actual inappropriate workplace conduct. Just a thought.

Regarding tone of voice, some worry that speaking too politely to a piece of technology could be the thing that poses problems.

An article in Fast Company noted a manners feature in Alexa (called Magic Word). If children say “please” in a request, Alexa will offer a “thank you”, encouraging polite behaviour. Sounds sort of innocent, but consider the downside.

“In the process, what are children learning about their relationship to intelligent machines?” asked Fast Company. “By extending human social norms to software and cloud services, are we teaching children that machines have sensibilities to be considered the same way we consider human feelings?”

Moreover, “In teaching children to treat machines like people, we may also be treating people like machines. Telling kids to say ‘please’ and ‘thank you’ to software, knowing that no feelings are involved, could be construed to be telling them to run their courtesy routines automatically regardless of meaning, effect or purpose.”

There’s a broader lesson for all of us. Virtual assistants may seem friendly, with soothing voices. They may be dependable, always there for us. We may give them a special place in our homes. But never forget, as Fast Company says, they are “machines designed to create the illusion of humanity.”

Virtual assistants and various smart devices are amazing innovations. I appreciate them. It’s just that I don’t want to be besties with my thermostat. Or anything else that doesn’t have a beating heart.

I’m not anti-technology. I’m not even pro-human all of the time. I just like to remember the difference between the two.

Stuart Foxman is a Toronto-based freelance writer, who helps clients’ products, services, ideas and organizations to come alive. Follow me on Twitter @StuartFoxman, connect with me here on LinkedIn, or check me out at foxmancommunications.com. I would love to hear from you. More articles like this coming, with original posts every week about communications, writing, branding, creativity, media, marketing, persuasion, messages, etc., etc.

January 16, 2019
