In the beginning was the Word

As more and more of us digitise our homes and equip them with voice assistants, we tend to form a dependency on them. The time is not far when these devices, powered by artificial intelligence that can read our moods from the tone of our voice, will know us better than our family members do

HYDERABAD: A recent report by cyber security firm McAfee has revealed that over 25 million voice assistants connected to home-based Internet of Things (IoT) devices are at an increased risk of hacking. These IoT devices, which control lights, thermostats, locks, safes and other electronic gadgets, tend to be vulnerable, say researchers.

The report also mentions that hackers could easily gain access to the listening stream or microphone of these voice assistants and monitor everything said in their vicinity. Sounds frightening?
A few months ago in Portland, US, a husband and wife were having a discussion in their home about hardwood floors. A few minutes later, the husband’s employee, living in Seattle, called them and informed them that Alexa had recorded their entire conversation and sent the audio file to him! The couple reportedly had their entire home wired with Amazon’s voice assistant to control the heat, lights and security system.

This is what Amazon spokeswoman Shelby Lichliter had to say about the incident: “Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud, ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
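
Read as a sequence of pattern matches, the explanation describes a small state machine: a wake-word detector fires on a sound-alike, the next stretch of speech is matched to a ‘send message’ intent, another to a contact name, and a final one to a confirmation. The Python sketch below is purely illustrative; the matching function, threshold and names are hypothetical and say nothing about how Alexa is actually implemented.

```python
# Illustrative only: a toy state machine showing how a chain of sound-alike
# matches on background speech could end with a recording being sent.
# Every name and threshold here is hypothetical.
import string

def sounds_like(utterance: str, target: str, threshold: float = 0.6) -> bool:
    # Hypothetical stand-in for a speech-recognition confidence score:
    # the fraction of the target's words that also occur in the overheard speech.
    words = set(utterance.lower().translate(
        str.maketrans("", "", string.punctuation)).split())
    target_words = target.lower().split()
    return len(words & set(target_words)) / max(len(target_words), 1) >= threshold

def process(background_speech: list[str], contacts: list[str]) -> str:
    state, recipient = "IDLE", None
    for utterance in background_speech:
        if state == "IDLE" and sounds_like(utterance, "alexa"):
            state = "LISTENING"                      # wake word (mis)detected
        elif state == "LISTENING" and sounds_like(utterance, "send message"):
            state = "ASK_RECIPIENT"                  # device asks "To whom?"
        elif state == "ASK_RECIPIENT":
            recipient = next((c for c in contacts if sounds_like(utterance, c)), None)
            if recipient:
                state = "CONFIRM"                    # device asks "<name>, right?"
        elif state == "CONFIRM" and sounds_like(utterance, "right"):
            return f"recording sent to {recipient}"  # every prompt matched background talk
    return "no message sent"

# Hypothetical background chatter that happens to satisfy each prompt in turn
chatter = ["we should ask Alexa about the hardwood",
           "I'll send a message to the contractor",
           "maybe Sam knows a good installer",
           "right, let's do that"]
print(process(chatter, contacts=["Sam"]))   # -> recording sent to Sam
```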

Amazon’s Echo smart speaker responds to the wake word ‘Alexa’ and begins listening to all conversations until you click the mic button on the top panel or unplug it from the socket. Interestingly, Bloomberg reported last week that thousands of Amazon employees monitor customers’ Alexa audio recordings to ‘eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands’. Apparently, the audio files include sensitive information such as bank account numbers, credit card details, etc.

Will voice assistants such as Alexa, Siri and Google Home completely wean us away from human interaction? Will they, from the tone of our voice, detect our emotions and comfort us better than our family members? Will humans end up confiding their deepest, darkest thoughts and fears to them? What will happen to the mindset of kids who grow up in their midst? Only time will tell.

Protecting your privacy with Alexa
• Don’t automatically add your contacts: During setup, Alexa asks you to link all the contacts in your phone. Don’t do this unless you intend to make calls and send messages through Alexa. Linking contacts with Alexa is where many problems start; there is no danger of Alexa sending voice recordings to your contacts if it doesn’t know who your contacts are.
• Turn off voice purchasing: Go to Settings and turn off the ‘Voice Purchasing’ option. It is enabled by default.

• Don’t enable Drop In access: Enabling the ‘Drop In’ feature on your Alexa device allows people with Drop In permissions to access its mic and speaker, whether they’re in another room, another house or another city. Anyone with these permissions can also use it to listen to what’s going on in the room.
• Use strong passwords: Implement two-factor authentication for your Amazon account, which ensures that, along with your password, a unique code sent to your mobile must also be entered (a rough sketch of such a check follows these tips). And use a strong password for your Wi-Fi network.

• Mute your device using the handy button on top: Turning off Alexa when you want privacy is a good habit; the devices can’t mistakenly record and share if they’re muted.
• Turn up Alexa’s volume: Some Alexa mishaps have occurred because people didn’t hear what Alexa was asking, as the volume was set too low. You want to hear Alexa’s confirmations.
• Delete any recordings you don’t want in your history: Finally, clean up your history if you want to keep some or all of it private. Open the Alexa app, go to your Alexa account, select History, then select a recording and choose the ‘Delete voice recordings’ option.
(Source: Leapfrog Services Inc)
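
For readers curious about what the second factor actually buys you, the toy Python sketch below shows the shape of such a check: the login succeeds only if both the password and a freshly generated one-time code match. It is a minimal illustration with hypothetical function names, not Amazon’s actual mechanism; a real service would store only password hashes, deliver the code by SMS or an authenticator app, and expire it quickly.

```python
# Illustrative only: a minimal two-factor check of the kind the tip describes.
import secrets
import hmac

def generate_one_time_code() -> str:
    # 6-digit code from a cryptographically secure random source
    return f"{secrets.randbelow(10**6):06d}"

def verify_login(supplied_password: str, stored_password: str,
                 supplied_code: str, sent_code: str) -> bool:
    # Both factors must match; compare_digest avoids timing side channels.
    password_ok = hmac.compare_digest(supplied_password, stored_password)
    code_ok = hmac.compare_digest(supplied_code, sent_code)
    return password_ok and code_ok

# Example: the code is "sent" to the user's phone, then checked with the password
code = generate_one_time_code()
print(verify_login("hunter2", "hunter2", code, code))   # True: both factors correct
print(verify_login("wrong", "hunter2", code, code))      # False: password alone fails the login
```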
