Alexa voice recognition audit teams had access to customers' location data
More allegations have surfaced about how owners of Amazon's Echo smart speaker range are having their privacy infringed by the retailer, with the latest claim involving an internal team of Alexa auditors having access to customer data that can reveal a user's home address.

It was revealed earlier in April that Amazon employs thousands of workers to listen to snippets of audio recorded by Echo devices when users speak to Alexa, Amazon's digital assistant, for training purposes. Amazon did not disclose that employees heard recordings of Alexa conversations, including background conversations not connected to the request.
In a new report, it is alleged that those analyzing and transcribing the voice recordings for training purposes have access to far more data than first thought. Five employees familiar with the program told Bloomberg the team has access to location data, allowing it to determine a customer's home address in some cases.
Using the geographic coordinates provided along with a recording, team members can look up the corresponding address in third-party mapping software.
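To illustrate how easily raw coordinates resolve to a street address, here is a minimal reverse-geocoding sketch. It assumes the geopy library and OpenStreetMap's public Nominatim service, and the coordinates are hypothetical; this is an illustration of the general technique, not the internal Amazon tooling described in the report.

# A minimal reverse-geocoding sketch using geopy and the public Nominatim
# (OpenStreetMap) service. Not the internal Amazon tool; for illustration only.
from geopy.geocoders import Nominatim

geolocator = Nominatim(user_agent="reverse-geocode-demo")

# Hypothetical coordinates of the kind said to accompany a recording.
latitude, longitude = 42.3601, -71.0589  # central Boston, as an example

# reverse() maps a (lat, lon) pair to the nearest known address, if any.
location = geolocator.reverse((latitude, longitude))
if location is not None:
    print(location.address)  # a street address near the given point

The point of the sketch is simply that a coordinate pair is only "anonymous" until someone pastes it into any mapping service, which is exactly the concern the team members raised.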
Though there is no sign that Amazon employees have misused the data by tracking down individuals, two team members said they were concerned about the "unnecessarily broad access" to customer data, which would make otherwise anonymous data easy to identify.
The employees working on the recordings are largely based in Boston, Romania, and India, but some are given extra information in an Amazon-produced tool that shows data about the device itself. While most of that data cannot be traced back to a particular user, with items like device ID and customer ID numbers being relatively anonymous, the tool also offers location data.
One demonstration shown to the publication had a recording's coordinates pasted into Google Maps, quickly revealing what could be the user's address.
Entering a customer ID into a separate tool reveals the home and work addresses, as well as the phone numbers entered into the device at setup. For users who share contacts with Alexa, names, numbers, and email addresses also appear in the same tool.
It is thought Amazon is already restricting the number of employees who can access the data, as well as their level of access. One dashboard tool that showed a user's contacts in full last year has been updated to obscure some digits of phone numbers from view.
"Access to internal tools is highly controlled, and is only granted to a limited number of employees who require these tools to train and improve the service by processing an extremely small sample of interactions," an Amazon Spokesperson advises. "Our policies strictly prohibit employee access to or use of customer data for any other reason, and we have a zero tolerance policy for abuse of our systems. We regularly audit employee access to internal tools and limit access whenever and wherever possible."

Comments
To be fair, though, this is a Bloomberg report. Should I believe it any more than the spy chip report?
Business as usual when anybody else does it or far worse.
This is AMAZON, dude, a company known to openly listen in, while Apple is a company known to protect user privacy. No one called Bloomberg "fake news," but it was, in fact.
Now on to the source, I have no idea who the original source is but if you think Amazon is a saint, you're nuts.
It only makes sense that Amazon needs to know your address to send packages to, and that if you're ordering with an Echo, that Echo would be tied to your account and your address. It doesn't make sense that when the Echo records you speaking, whether you've made a request of it or not, the person transcribing what you said could also, potentially, find your name and physical address.
Yeah, I understand that this comes down to what the purpose of an audit is, which in this case is training, and what granularity of records Amazon maintains. Should Amazon filter out any data that is not relevant to the audit from the records exposed to auditors? Sure, but they've chosen to place trust in their audit teams to use professional discretion about which portions of the information contained in the data records to use for their specific audit purposes. It's more of a laziness and cost issue, which are admittedly two principal drivers behind security and privacy breaches. At the end of the day, however, we've placed a large amount of trust in Amazon (and Apple and any vendor or government agency we deal with) as a whole and expect them to safeguard all of our data across all of the internal teams who have access to it. We really don't know who sees what we share with them, because we have no visibility into everything they do internally and no audit trail that shows who sees what. Once we've handed our information and data over to them, it's a potential black hole with an assumption of trust. Perhaps with blockchain technology things will change and we'll retain more control. But until something changes, the issues that are reported are just the tiny tip of the iceberg on a huge personal information security concern that we have to live with to stay connected to others.