The DPC’s take on digital assistants

2 December 2019

The Data Protection Commission (DPC) attended the Oireachtas Joint Committee on Communications, Climate Action and Environment in November 2019 to discuss the topical issue of digital assistants, which in the run-up to Christmas may be of particular interest to the public. The DPC was represented by Deputy Commissioner Dale Sunderland, responsible for the Consultation, Supervision and Policy functions of the DPC, as well as Assistant Commissioners Cathal Ryan and Ultan O’Carroll. The DPC presented an opening statement to the Committee and answered questions on the data protection implications of these technologies. We have summarised the DPC’s submissions on the topic below.

 

What are digital assistants?

Many phones, online services, and standalone devices these days offer what are variously referred to as digital assistants, virtual assistants or voice assistants: terms commonly used to describe a consumer or in-home device or service that operates by listening for, and interpreting, human voice commands or instructions.

Voice assistants record user audio clips and convert those clips into text that acts as input to online services such as search, weather, shopping, mapping and communications. In some cases, where the devices are home-based, the instructions may also be used to control smart home devices, including those for lighting, TV and media, heating, and security.

Voice assistants often listen continuously for instructions and may in some cases also be set up to recognise individual voices. Keywords such as “Hey Google” or “Siri” can trigger recording of the user’s voice. Voice recordings can be stored alongside their converted text forms on the device or in the cloud. Service providers may also record this data against a user’s profile, along with preferences and choices derived from an analysis of the user’s voice commands.

Users can often ask their assistants questions, conduct searches, plan routes, control home automation devices and media playback, and manage other basic tasks such as email, to-do lists, and calendars with verbal commands.
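
Purely as an illustration of the flow described above, and not any vendor’s actual implementation, the Python sketch below shows a typical wake-word loop: audio is monitored continuously, recording and transcription only begin once a keyword is detected, and the resulting clip and text may then be stored against a user profile. All function and type names here are hypothetical placeholders.

```python
import time
from dataclasses import dataclass, field


@dataclass
class VoiceInteraction:
    """One recorded command and what the assistant derived from it."""
    audio_clip: bytes          # audio captured after the wake word
    transcript: str            # text produced by speech-to-text
    timestamp: float = field(default_factory=time.time)


def detect_wake_word(frame: bytes) -> bool:
    """Placeholder: a real device runs an on-device keyword-spotting model here."""
    return b"hey google" in frame.lower()   # illustrative only


def transcribe(clip: bytes) -> str:
    """Placeholder for an on-device or cloud speech-to-text service."""
    return "<transcribed command>"


def handle_command(transcript: str) -> None:
    """Placeholder: route the text to search, weather, smart-home control, etc."""
    print(f"Acting on: {transcript}")


def assistant_loop(microphone_frames, user_profile: list[VoiceInteraction]) -> None:
    """Listen continuously, but only record and process after a wake word."""
    for frame in microphone_frames:
        if detect_wake_word(frame):
            clip = frame                     # in practice: keep recording until silence
            text = transcribe(clip)
            handle_command(text)
            # Storing the clip and transcript against the user's profile is the
            # step that raises most of the data protection questions discussed below.
            user_profile.append(VoiceInteraction(audio_clip=clip, transcript=text))
```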

Common examples include Google’s Google Assistant, Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana.

 

What are some of the data protection issues raised by digital assistants?

Digital assistants are now very common. Most smartphones have a digital assistant facility and many people have standalone devices in their homes. Whilst these devices and services can provide a host of helpful benefits and shortcuts for users, they can also have significant access to personal data from many sources, and can generate further data by observing and inferring interests from user behaviour.

Concerns have also been raised about digital assistants recording conversations when users are not aware or do not expect it, such as where the recording functionality is triggered accidentally, or where a data protection by design and by default approach has not been followed by the developers and/or operators.

Often, because of the variation in human voices, accents, tone and phrasing, voice assistants will be trained through machine learning, with large volumes of sample voices, to create a model of human speech. These models can be updated over time to refine them and improve quality. In some cases, quality control will require some human review of voice snippets, especially where words are being incorrectly recognised, or to help reduce mis-activations of the device.

Human review of voice data collected and processed by automated means is a common method of reviewing, improving and training the algorithms used in voice assistant technology. While not inherently problematic from a data protection perspective, this kind of processing raises many data protection considerations, which must be carefully assessed by the companies providing these services to ensure that the use of user data is appropriate.
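
By way of a hedged illustration of how such a review step might be gated, the sketch below queues only low-confidence transcriptions, and only from users who have opted in to a product-improvement programme, for human review. The threshold, field names and opt-in flag are assumptions for the example, not a description of any provider’s system.

```python
from dataclasses import dataclass


@dataclass
class Transcription:
    text: str
    confidence: float   # score reported by the speech-to-text model (0.0 to 1.0)
    opted_in: bool      # whether the user joined a product-improvement programme


def select_for_human_review(results: list[Transcription],
                            threshold: float = 0.6) -> list[str]:
    """Queue only snippets the model was unsure about, and only where the
    user has opted in, so that human review stays targeted and minimal."""
    return [r.text for r in results
            if r.opted_in and r.confidence < threshold]
```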

For these reasons, anyone who produces or uses digital assistants should be aware of the risks of these technologies, as well as the rights and obligations associated with these systems. To reduce the risks associated with these technologies, organisations need to start by considering their compliance with all of the ‘principles of data protection’. In particular, those providing these digital assistants need to ensure that:

  • there is an appropriate legal basis to process the personal data (it appears that the processing is often based on user consent or legitimate interests);
  • adequate, clear, and comprehensive transparency information has been provided to users on the type, extent, and purpose of the processing of their data; and
  • effective and appropriate measures are in place in order to safeguard the rights and freedoms of users and ensure the security and confidentiality of their data.

To ensure that the rights and freedoms of users are protected, organisations should adopt an approach that minimises the amount of personal data stored or processed. To ensure the security of the personal data being processed, particularly regarding the potential human review of voice recordings, organisations need to implement data security measures which are appropriate to the type and scale of processing which they undertake. Such measures could include:

  • designing the process with data protection in mind from the start;
  • technical security safeguards such as pseudonymisation and encryption of personal data (illustrated in the sketch after this list);
  • access controls for staff or contractors processing personal data;
  • adequate training for relevant staff;
  • physical security measures;
  • opt-in participation by device owners in product improvement programmes; and
  • easy to use and effective user controls that allow owners to access, review and delete their personal data.
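
To make the pseudonymisation and encryption point above more concrete, the sketch below shows, under assumed names and a deliberately simplified storage model, one way recordings could be pseudonymised, encrypted at rest, and deleted on request. The key handling, the in-memory store and the helper functions are hypothetical; the symmetric encryption comes from the third-party cryptography package.

```python
import hmac
import hashlib

from cryptography.fernet import Fernet   # third-party: pip install cryptography

# In practice both keys would live in a key-management system, never beside the data.
PSEUDONYM_KEY = b"secret-key-held-separately"
fernet = Fernet(Fernet.generate_key())

# Hypothetical store: pseudonym -> encrypted voice clips
voice_store: dict[str, list[bytes]] = {}


def pseudonymise(user_id: str) -> str:
    """Keyed hash so that reviewers never see the real identifier;
    re-identification is only possible for whoever holds PSEUDONYM_KEY."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()


def store_clip(user_id: str, audio_clip: bytes) -> None:
    """Encrypt each recording before it is written to storage."""
    voice_store.setdefault(pseudonymise(user_id), []).append(fernet.encrypt(audio_clip))


def delete_user_data(user_id: str) -> None:
    """An easy-to-use user control: remove all stored recordings on request."""
    voice_store.pop(pseudonymise(user_id), None)
```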

What work is the DPC doing on digital assistants?

As EU Lead Supervisory Authority (LSA) for a number of the companies concerned – Google, Apple, and Microsoft, for example – the DPC is currently engaging with those organisations to establish the manner in which their voice assistant products comply with data protection requirements. In other cases, our colleagues abroad engage in similar work where they are EU LSA, such as the Luxembourg data protection authority with regard to Amazon’s Alexa assistant.

The DPC is continuing to examine the issues identified so far as presenting data protection risks (such as voice recording, or a lack of transparency about the storage of personal data or the purpose of processing) in our ongoing engagement with the companies based in Ireland. We acknowledge and welcome the recent changes by a number of companies to enhance transparency to users concerning the practice of human review of voice data to improve voice assistant technology, as well as the implementation of greater user choice on the use of their data in such contexts.

The DPC, as LSA, also continues to cooperate with our EU data protection colleagues to identify common areas of concern and to determine what further steps, including guidance, may be necessary to bring additional clarity to the application of data protection requirements to the use of voice assistant technology.