Will The Lure Of Convenience Give Way To The Surveillance State?

(Kevin Bhagat/Unsplash)

A few weeks ago, after yet another failed attempt to fix the pull cord on our closet light, my husband bought a Google Home. Using this $29 device to turn on a smart light bulb, he argued, was cheaper than hiring an electrician to fix the balky wiring. I mocked this ridiculous First World solution up until the moment I used it. But the first time it worked — the first time I didn’t have to clamp a small flashlight between my teeth to find the black pants hanging in the equally black closet — I became an instant convert.

Now, like so many people I know, when my hands and eyes are otherwise engaged, I casually invoke Google to tell me the traffic conditions or play me a song. I have — hallelujah — found yet another way to multitask.

But a few days after this miniature speaker came to live in our bedroom, I asked my husband, “What would happen if you said, ‘Hey Google, put away my laundry?’”

Before he could offer a justifiably wounded response about how much better he’d gotten about performing this particular household task, the Google Home spoke up.

“Sorry,” she said kindly, “I’m not sure how to help with that, but I’m still learning.”

And so now, even though we no longer have little kids in the house, I find myself reverting to old habits on occasion, lowering my voice and parsing my words so that Google Home won’t hear or misinterpret them.

I’ve also accelerated my tumble down the slippery slope of convenience, trading off a reasonable expectation of privacy in my own home for the rewards of instant gratification.

This paranoia is not unfounded. Anyone who owns a Google Home, an Amazon Echo or a comparable smart speaker has had the experience of inadvertently waking the machine and hearing it respond to questions they didn’t intend to ask or orders they weren’t trying to issue. While these episodes are generally more comic than sinister, they do raise serious privacy concerns that manufacturers and lawyers are scrambling to address. After all, although these devices ostensibly do not record or pass any data to their servers in the cloud until someone in the house has spoken the “wake” word (e.g. “Alexa” or “Hey, Google”), they are perpetually in listening mode, waiting to be woken. And once alert and actively transmitting your questions or commands, chances are good that on at least some occasions, they’ll keep at it even after you thought the “conversation” was over.

Indeed, as journalists Tom Dotan and Reed Albergotti note in their investigation of why Amazon Echo records were being subpoenaed in a criminal investigation, “[T]he [Echo’s seven] microphones can often be triggered inadvertently. And those errant recordings, like ambient sounds or partial conversations, are sent to Amazon’s servers just like any other. A look through the user history in an Alexa app often reveals a trove of conversation snippets that the device picked up and is stored remotely; people have to delete those audio clips manually.”

That’s why as gratifying as it was to be able to ask my Google Home the number of registered voters in Alabama without having to wrench my eyes away from the television screen, I’m alarmed at the prospect that somewhere there’s an archive of my private comments to my husband about the Roy Moore voters among them.

I’m not a conspiracy theorist, and I have no illusions about the sanctity of my commerce-related data. Every time we use a credit card, click on a link or pause to look at an item on a beacon-equipped store shelf, information about our interests and purchasing behavior is being captured and sold. But as the 2016 election so frighteningly proved, big data mining is digging deep into our beliefs, unearthing opinions and discontents that may, in time, be used not just to target, but to persecute.

Sound paranoid? Yes, until you learn about China’s imminent Social Credit System (SCS), which not only uses big data to rate the trustworthiness of its 1.3 billion citizens, but metes out penalties and rewards based on those ratings.

We are not there yet, but I’m starting to see how we might end up sauntering into the surveillance state. Quickly answering our ad hoc requests for recipes or pop culture trivia or cold, hard facts; giving us the weather forecast as we’re selecting clothes from the closet that’s now another node in the Internet of Things; entertaining our children by playing "The Ice Workers’ Song" from "Frozen" or Michael Jackson's "Beat It" over and over again without complaint – these smart speakers ingratiate themselves by fueling our conviction that at least in some aspects of life, we can and should get everything we ask for.

It all feels easy and useful and harmless, but the potential for abuse is enormous.

"Is the trade-off worth it?" I ask myself.

And then comes the increasingly ubiquitous answer: "Sorry, I’m not sure how to help with that, but I’m still learning."

Julie Wittes Schlack Cognoscenti contributor
Julie Wittes Schlack writes essays, short stories and book reviews for various publications, including WBUR's Cognoscenti and The ARTery, and is the author of “This All-at-Onceness” and “Burning and Dodging.”
