“Alexa, are you dangerous for children?” The answer, according to the Scientific Services of the German Bundestag, is yes. In an expert opinion, they conclude that Amazon’s voice assistant poses risks for minors and for uninvolved guests.

The experts criticise that children may reveal personal information or use their voice to access content that is not suitable for minors. Alexa can also collect data from people who are not at all aware that they are being recorded: visitors to a household with a smart speaker do not always know that software in the room is recording sounds.

In the view of the Scientific Services, Amazon does not sufficiently inform users about what data it stores and analyses. The experts criticise that for Alexa users it “will often remain non-transparent when data are actually being collected”.

Activation word required

Amazon’s Echo devices transfer recordings to Amazon’s servers once the user speaks an activation word; users can choose between “Alexa”, “Computer”, “Echo” or “Amazon”. There is a risk, however, that the voice software is activated accidentally and unnoticed. “Then data are collected and processed without the knowledge and consent of the user,” the opinion states.

With regard to the United States, it is unclear “for what other purposes Amazon could use the data in the future”. Amazon might have an interest in linking the collected data with other platforms and third-party providers. Nor can it be ruled out that criminals gain access to the data in the Amazon cloud. So much information is stored that a hack “could hit Alexa users particularly hard”. In the past, however, there have been no indications of security vulnerabilities at Amazon.

Amazon stresses that the “trust of our customers is of the highest priority”. “We know that we must deliver compelling solutions on the subject of data protection to meet the high expectations of our customers,” the company says. Each Echo speaker is equipped with a mute button that electronically disconnects the power supply of the microphones and cameras.

Siri and Google Assistant do it too

“This makes it very easy for customers to tell when Alexa has recognised the activation word,” says a spokesman, adding that the Amazon devices give a clear visual indication when data is being streamed to the cloud. Amazon did not respond to the experts’ criticism that children could gain access to content that is not suitable for them.

The independent Bundestag deputy Uwe Kamann, who had requested the report, sees a need for action. Politicians must insist on a clear declaration of consent that informs users about all risks. This concerns the “transfer and use of the data as well as the data of third parties who happen to be in the room,” said Kamann. Ticking a single box for everything is not enough.

Kamann had asked the Scientific Services explicitly about Alexa, but the problem is not limited to Amazon: “All voice-based recording systems share this critical point.” Digital voice assistants such as Alexa, Apple’s Siri or the Google Assistant can answer questions, play particular music, order food and perform other tasks. They can also be used to control the smart home.

Deleting the recordings

Amazon stores the recordings indefinitely and does not delete the data on its own. Normally the data are evaluated by machines. In April, however, Bloomberg revealed that thousands of audio clips are also reviewed by Amazon employees. According to the report, teams of the company based in different cities around the world process a selection of the recordings. Very private audio messages are reportedly discussed at larger team meetings, and amusing audio clips are shared in internal chat channels.

Anyone who wants to delete the data on Amazon’s servers follows this link, logs in with their account, selects “Alexa Privacy” in the top bar and clicks on “Review voice recording history”. There, individual recordings or all of them can be removed. Deletion can also be triggered by voice command (“Alexa, delete what I just said” or “Alexa, delete everything I said today”).

(red/sih/bix/mri)

Created: 12.07.2019, 07:55 PM