Voice Activated Personal Assistants are Fun and a Little Scary
We made fun of Siri, but then we learned to love her once we found it easier (and safer) to talk into our iPhones instead of typing as we drove. But Siri was just the beginning for voice-activated personal assistants. When Siri first came out, we felt like mini-James Bonds speaking to our robotic butlers, sometimes asking inappropriate questions just to see what the artificially intelligent being would say in response. We felt, after all, that it would go no further than our smartphones.
NOW THERE ARE MILLIONS OF DIGITAL ASSISTANTS HIDDEN IN POCKETS OR GRACING KITCHEN COUNTERS:
- Microsoft’s Cortana
- Amazon’s Alexa
- and now Google’s Assistant, advertised everywhere
They analyze what we say or type, and in return, offer us useful information. Even more recently, they have learned to anticipate what we want, such as:
- notifying us of traffic jams at just the right time
- finding an address without our having to touch our phones
- searching Wikipedia for information on an old movie we’re watching on TV
THE ONLY PROBLEM?
CONVENIENCE COMES AT A PRICE. PRIVACY AND SECURITY CONCERNS EXPOSE USERS TO MORE THAN THEY BARGAINED FOR.
That intimate relationship you have with the voice that shuffles music and plays Billy Joel songs isn’t so intimate. Your innocent query gets sent to a giant server somewhere, where it’s analyzed, answered, and returned to your device. And in Hansel and Gretel fashion, your requests leave a trail of breadcrumbs behind, just like everything else you do online. So what about the not-so-innocent questions? Your queries and commands are kept on that server for months, the audio portion even longer. It’s all in the fine print you probably skipped when you hit the “agree” button for any high tech device.
Now consider that when your question is location-based, your human-sounding phone or countertop assistant can keep track of your habits, travels, and preferences. When’s the last time your iPhone flashed a message telling you (without prompting) how long it would take to get home? Keep in mind that these assistants couldn’t return such useful results without also reading your email and accessing your search history. The resulting data portraits are available to law enforcement officers, and to any hackers who gain access to those sensitive servers, something you simply don’t want to think about, because, hey, you have nothing to hide, right?
- Amazon’s Echo is more popular than ever, dominating three-quarters of the virtual assistant market. But critics are still skeptical.
- Alexa’s non-threatening female voice fronts a device that is always listening, recording, and storing what we say. Owners worry that strangers could hear their daily activities.
IF YOU ALREADY OWN AN ECHO, KNOW THAT IT RECORDS MORE THAN YOU EVER THOUGHT POSSIBLE.
If you are considering purchasing an Echo, however, take heart. You can adjust your security settings to limit what gets saved by the device and its server minions; search YouTube for tutorials to get a complete picture of what you can do. For one, you can turn off Echo’s microphone, its most vulnerable part. The mic picks up all the sound in the room, potentially capturing private conversations. Muting it lessens the fun, however, since it renders the device useless as a personal assistant unless you get into the habit of switching the mic on and off each time you need it. (It’s being able to yell a command from the kitchen table to turn down the music when you get a phone call that makes you want to keep it on all the time.)
There are a host of other things you can do to safeguard your privacy, such as turning off voice purchasing and setting up PIN codes. Telling Alexa to buy more dog food is pretty cool, but a single security breach could cost you dearly.
Many of us figure our lives are already an open book, since being online or on our smartphones most of the day exposes us to many of the same dangers. But these voice-activated assistants take us one step closer, perhaps, to Big Brother (whoever that may be) doing the watching.