If you’ve ever considered buying an Amazon Alexa, you might have been on the receiving end of a skeptical glance or the cautionary warning, “be careful, Alexa is always listening!” However, Alexa does have the alluring ability to make our days easier, dutifully performing thousands of otherwise time-consuming skills, like these things you didn’t know Alexa could do. Herein lies the internal conflict: is it worth trading a bit of personal information for an extra hand (or, better yet, an extra brain) around your Boston condo? Whether you already live with an Alexa or are simply considering shacking up with one, everyone should be aware of the impact this device can have on your privacy. Is Alexa really always listening? And what does it mean for users if she is?
When does Alexa listen?
First and foremost, it is important to differentiate between listening and recording. Although the words are often used interchangeably when discussing Alexa’s capabilities, they actually have very different implications. According to Florian Schaub, Assistant Professor in the University of Michigan School of Information, the microphones in these smart speakers “are always listening, but, by default, they are only listening for the ‘wake word’ or the activation keyword.” Since the whole purpose behind the device is to instantly respond to users’ requests, it makes sense that Alexa is constantly scanning audio for its wake word, which can be “Alexa,” “Computer,” or “Echo.” However, this does not mean that Alexa is always recording what you’re saying in your North End condo.
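If it helps to picture the difference, here is a minimal sketch of that always-listening-but-rarely-recording loop. This is illustrative Python, not Amazon’s actual software: the audio stream is simulated with text, and names like contains_wake_word and listen are hypothetical.

```python
# Illustrative only: a simplified, hypothetical wake-word loop, not Amazon's code.
# Real devices analyze an audio stream on-device; plain text stands in for audio here.

WAKE_WORDS = {"alexa", "computer", "echo"}

def contains_wake_word(audio_chunk: str) -> bool:
    """Check the latest bit of audio for a wake word."""
    text = audio_chunk.lower()
    return any(word in text for word in WAKE_WORDS)

def listen(audio_stream):
    """The microphone is always 'listening', but it only reacts to a wake word."""
    for chunk in audio_stream:
        if contains_wake_word(chunk):
            return chunk  # only at this point would recording and uploading begin
        # anything without a wake word is discarded on the device, not recorded
    return None

# Simulated household chatter: only the last line would wake the device.
print(listen(["what's for dinner", "did you feed the cat", "Alexa, what time is it"]))
```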
When does Alexa record?
Amazon’s list of frequently asked questions says that Alexa only begins recording your conversation upon hearing the device’s wake word. “So, when you say ‘Hey, Alexa,’” Schaub explains, all of “the audio gets analyzed and is being listened to by the microphones on the device, and only if the keyword ‘Alexa’ is detected, then everything that you say after that gets” recorded. After the device records, it uploads the audio to Amazon’s cloud, where they “have algorithms in the server that analyze the speech pattern and try to detect and identify the words you are saying.” While Alexa’s response may seem instantaneous, it actually has to work with Amazon’s cloud to comprehend varying accents, speech clarity, and vocabularies. This means that each time you wake up Alexa, the smart speaker is recording your conversations, “creating an automated transcript of what you are saying, and using that to fulfill your request,” says Schaub.
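To make the round trip Schaub describes concrete, here is a rough sketch of it, again as illustrative Python rather than Amazon’s real cloud API; transcribe_in_cloud and fulfill are made-up stand-ins for the server-side speech recognition and request handling.

```python
# Illustrative only: made-up stand-ins for the cloud round trip, not Amazon's API.

def transcribe_in_cloud(recording: str) -> str:
    """Pretend server-side speech-to-text: produce an automated transcript."""
    return recording.lower().strip()

def fulfill(transcript: str) -> str:
    """Pretend request handling: map the transcript to a spoken response."""
    if "weather" in transcript:
        return "Right now in Boston it's 40 degrees and cloudy."  # made-up answer
    return "Sorry, I'm not sure how to help with that."

def handle_wake(recording: str) -> str:
    transcript = transcribe_in_cloud(recording)  # the audio leaves your home here
    return fulfill(transcript)                   # the answer comes back and is read aloud

print(handle_wake("Alexa, what is the weather?"))
```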
For example, let’s say you’re in your Back Bay condo and you want to use Alexa to check the weather, and that you like to goof off in the “privacy” of your own home by asking funny questions to your Alexa. You might say, in a faux British accent, “Your Royal Highness, Queen Alexa, what is the weather?” Since Alexa is always listening, the device picks up and analyzes all of the audio that you just produced. However, it is only programmed to begin recording your words when it detects its wake word, “Alexa.” The recording is then sent to the cloud, your accent is dissected, and the words are transcribed. Since Amazon’s server knows the location of the speaker, it identifies the weather in your area, sends it to the device, and Alexa reads the response aloud.
The truth is, “there are all kinds of reasons the device might accidentally activate and record in situations where you’re not expecting it,” warns Schaub. Since Alexa’s “voice recognition is somewhat finicky,” Schaub says, “if you say something that sounds like Alexa, or if you just use Alexa in a conversation, then the device will activate.”
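One way to picture why near misses wake the device is to imagine the detector scoring how close a word sounds to “Alexa” and activating above some confidence cutoff. The sketch below uses simple text similarity (Python’s difflib) as a stand-in for a real acoustic model, with a made-up threshold.

```python
# Illustrative only: text similarity standing in for an acoustic confidence score.
from difflib import SequenceMatcher

THRESHOLD = 0.7  # hypothetical confidence cutoff

def wake_confidence(heard: str, wake_word: str = "alexa") -> float:
    """Score how closely a heard word matches the wake word (0.0 to 1.0)."""
    return SequenceMatcher(None, heard.lower(), wake_word).ratio()

for heard in ["Alexa", "Alexis", "election", "a lexicon"]:
    score = wake_confidence(heard)
    verdict = "wakes up" if score >= THRESHOLD else "ignored"
    print(f"{heard!r}: {score:.2f} -> {verdict}")
```

Run it and “Alexis” scores close enough to cross the cutoff while “election” does not, which is exactly the kind of near miss that can wake the real device when you least expect it.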
In 2018, the spine-chilling effects of what seemed like smart-speaker espionage were felt especially keenly by a family in Portland, Oregon. During a private conversation, the family’s Alexa woke up to a sound that resembled its wake word and began recording. Through a series of mistakes, Alexa misinterpreted the family’s conversation as a “send message” request and forwarded the audio recording to someone in the family’s address book, with the family having no idea that Alexa was even on. While this may seem like an isolated incident, it is actually extremely common for Alexa to activate accidentally, begin recording, and upload the “eavesdropped” audio to the cloud. According to Bloomberg’s reporting, Alexa devices record and upload at least 100 transcripts of conversations to the cloud each day without ever being purposely activated.
Bottom line
Although Alexa is programmed to record audio only when it is woken up, there is a strong possibility that your Alexa is activating accidentally in your Beacon Hill apartment. The more you use it, though, the better Alexa’s wake-word detection will get.