Tech Contractors Report Hearing Drug Deals, Sex, and Medical Info Recorded by Virtual Assistants


“Hey Alexa, write this blog for me.”

Tech giants are scrambling to reconfigure their quality control processes after word recently got out that other human beings were listening to recordings from Google, Amazon, and Apple devices. Staff at the world’s most dominant technology companies were tasked with analyzing recordings to determine how well each digital assistant recognized human speech and handled the requests made of it.

“I Only Listen When You’re Talking to Me”

Generally, virtual assistants like Alexa, Siri, and Google Assistant only record audio when someone uses the wake command (e.g., “Hey, Google”). Apple’s Siri has even been programmed to give a jokey response if you ask whether the device is always listening, claiming she only listens when you’re talking to her.
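To make that mechanism concrete, here is a minimal, purely hypothetical sketch, not Amazon’s, Google’s, or Apple’s actual code: short audio chunks are checked on the device and discarded until the trigger phrase is detected, and only then is audio sent on for processing. The wake phrase, helper functions, and string “chunks” below are stand-ins invented for illustration.

```python
# Hypothetical wake-word gate, simulated with strings instead of real audio.
WAKE_WORD = "hey assistant"  # invented trigger phrase for this example

def detect_wake_word(chunk: str) -> bool:
    """Stand-in for a small on-device model that listens for the trigger phrase."""
    return WAKE_WORD in chunk.lower()

def stream_to_cloud(chunk: str) -> None:
    """Stand-in for sending captured audio to the assistant service."""
    print(f"uploading: {chunk!r}")

def listen_loop(chunks):
    recording = False
    for chunk in chunks:
        if not recording:
            # Audio is inspected locally and thrown away unless the wake word is heard.
            recording = detect_wake_word(chunk)
        else:
            # Only after the trigger does audio leave the device, which is also
            # where accidental activations turn into stored recordings.
            stream_to_cloud(chunk)

# Simulated microphone input: nothing is "uploaded" until the wake phrase appears.
listen_loop(["background chatter", "hey assistant", "what's the weather?"])
```

In real devices the on-device detector is a far more sophisticated model, which is exactly why near-miss phrases can set off the accidental recordings described below.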

However, as many of us have discovered, sometimes these devices are triggered inadvertently. In 2017, a young girl in Texas accidentally ordered a $170 dollhouse and four pounds of sugar cookies through her family’s Amazon Echo Dot. And because we’re talking to a device rather than a person, it’s easy to shrug off these mishaps when the assistant pipes up uninvited.

As it turns out, contractors employed by Apple, Google, and Amazon were listening to a small percentage of these recordings. An anonymous whistleblower recently told The Guardian of “countless instances” where contractors heard private discussions between business associates, people talking to their doctor, and people having sex.

Anonymity Not Guaranteed

Earlier this summer, Amazon exec Dave Limp told BBC News that only a “tiny fraction of 1%” of voice recordings was ever listened to by humans. Apple told NBC News that only a small subset, amounting to less than 1% of daily Siri utterances, was heard by quality control staff – and that Siri recordings are not attached to a person’s Apple ID.

However, that doesn’t necessarily mean these recordings are anonymous. The contractors who spoke to The Guardian noted that they sometimes heard full names, bank account numbers, and other sensitive personal information.

No More Human Review, For Now

Both Apple and Google put a stop to human review of audio recordings as of July, while Amazon has updated its privacy policies to allow users to opt out of having their recordings used for quality improvement. Amazon device users can also periodically delete their voice recordings if they choose.

As privacy concerns about digital assistants grow across the globe, all three companies will likely need to overhaul their procedures for handling voice recordings. If human review of recordings is to continue, the companies using it will need clear methods for managing user data, as well as ways for contractors to report the sensitive information they encounter.

