Data Protection Commissioner

Commissioner Dixon's Speech

at Data Summit Dublin 2017


Data: A Regulator’s Perspective
Good morning everyone. What a very special occasion it is for me, as the Data Protection Commissioner for Ireland, to give this opening keynote talk at the inaugural Irish Government Data Summit. In 2014, this Government appointed the first Data Protection Minister in Europe, and Dara Murphy as that Minister has brought a whole new energy and focus to the critical debate around the issues of technology and individual identity. Data is, after all, everywhere and everything in today’s world, and it’s both a pleasure and a very useful opportunity for me and the team at the Irish data protection authority to engage with so many key stakeholders and experts at this event. I’m delighted that so many of you could come to Dublin to drive forward the conversation on how we can each play a role in shaping the norms and values we want to underpin the regulation of data-driven technologies, today and tomorrow. Not surprisingly, my short talk this morning is going to touch on the new EU General Data Protection Regulation – or GDPR, as we more frequently abbreviate it – which takes effect at the end of May 2018. But the keyword for my talk today is CONTEXT, and why it really is king in understanding the implications of data protection law.
Technology driving change – some downsides, some benefits
It’s certainly the case that we live in interesting times – interesting times fuelled in part by the fact that we are all agents of a digital revolution that is driving change in our economies, societies and political environments at an unprecedented pace – change well beyond the velocity of the previous global industrial revolutions. In recent years, with the onslaught of social media in particular, we’ve seen a myriad of new challenges arise: cyber-bullying, revenge porn, children suing their parents for posting their baby pictures online, fake news, online radicalisation, thumb injuries from using touch screens, time-wasting from watching videos of cats on YouTube, global cyber-systems vulnerabilities exploited by the likes of WannaCry, marital discord from social media addictions, and so on and so on. We could make a long, long list.
But equally we see the significant benefits of technology too – far fewer people across the world now live in extreme poverty, substantially helped by the links that technology and globalisation have brought to people all over the world, albeit with continuing problems in the distribution of wealth. And we can all think of fantastically positive examples of what data-driven technologies in particular give us – better understanding of climate change, how to combat hospital infections, identification of genetic markers linked to cancer. And on an individual consumer level, I’ve often heard people with challenging disabilities talk about the huge independence and improvement to their quality of life that internet technologies have given them. And for those of us with relatives spread across the globe, we can keep in easy contact with them using over-the-top call services.
Binary arguments about technology being good or bad – context is actually king
But it seems to me that one of the things the democratisation of publishing through social media has led to is a diminution of the status of real expertise and, consequently, a loss of nuance in debate. Individuals increasingly cast themselves on extreme sides of a debate: pro-remain versus pro-leave, pro-left versus pro-right, pro-technology versus pro-privacy. And this really brings me to the crux of what I want to talk to you about today. What we have to understand clearly is that in the world of data protection and technology, context is king. With a largely principles-based law, it can’t be automatically black or white, left or right, innovation or privacy.
Role of data protection authorities
First of all, however, as this is a broad-based audience today, I think it would be useful if I briefly explain the role of data protection authorities. We are responsible for supervising all types of public and private sector bodies in terms of how they collect and process “personal data”. Personal data is information about you – your name, your phone number, your email address, your purchasing records, your health records – anything that is specific to you as an identifiable individual. And it’s worth reminding ourselves of the simplicity and common-sense nature of what data protection principles require organisations to do in handling individuals’ data: obtain information fairly by giving notice of what they are collecting and why; use the data only for the specified purposes; keep it safe, secure and up to date, including not disclosing it to unauthorised third parties; retain it only for as long as necessary; and give a copy of their data to any individual who requests it. But perhaps of all the problems of the internet era, the issue of maintaining control over our digital identity, and of controlling how we are tracked and monitored, is among the most fundamental for us as human beings. The American lawyers Warren and Brandeis, said to be the founding fathers of what they described as the “right to be let alone”, would surely shake their heads in disbelief if they saw our globally connected world, and would be stunned to discover that the portable camera they were so affronted by in 1890 is now an anywhere-anytime recording device carried by a majority of adults and teenagers in the western world. It’s a world where inferences can be drawn about us unbeknownst to us, and where the price we pay for goods changes as we shop online, based on an algorithm that calculates how urgent our need is and what we might be willing to pay, drawing on a profile of our internet usage.
In all of this, data protection authorities have a watchdog role: we identify risks in how data is collected and used (for example, we recently audited a number of insurance companies over new products based on drivers agreeing to use telematics apps that monitor their driving behaviour), and we also handle and investigate complaints from individual members of the public who consider their personal data has been mistreated by organisations. In Europe, of course, each individual enjoys a specific right to have their personal data protected under Article 8 of the EU Charter of Fundamental Rights.
Context in data protection analysis
So let me revert to the points I made earlier about binary arguments of technology good versus bad or technology versus privacy and examine these in light of context. 
Think of genetic testing. I think we could all agree that in appropriate clinical settings – with regulated professionals, ethics committees and counsellors – genetic testing and research can be a good thing where it leads to the identification of markers for breast cancer, for example. In these types of regulated clinical environments, ethical decisions can be made on what information patients should receive, whether the rest of the family should receive certain results where they could be equally affected by what is identified, whether DNA samples should be destroyed, and whether the data can be safely stored in the cloud. But then think of genetic testing in the context of online commercial firms offering, for example, “child talent genetic testing”. Yes indeed, a test that purports to tell you whether it’s a good idea to buy your child a piano and claims to give you a firm idea of your child’s “strengths and weaknesses”. Many of the websites offering these services fail to outline the privacy implications of this type of test – precisely what data is collected, how the data would be used, whether it would be sold, how it would be secured, and how and where it would be stored. Nor are these websites typically clear that, in fact, no DNA test is going to be accurate in predicting whether a child will be good at playing the piano, or in relation to any of the child’s personality traits. Genetic data also presents the privacy complication that family members are identifiable as well. So it’s not that we can say categorically that genetic testing is a positive use of personal data that does not violate data protection laws – context, coupled with compliant implementation, is everything. On that point, it is of course the case that child talent genetic testing could be implemented in compliance with data protection laws; data protection laws cannot otherwise decide on the ethics of such an offering.
Similarly, genetic testing in a clinical setting could be poorly implemented and fail in compliance. Context is king but correct implementation must by the same token be a given.
I know there are many public sector bodies here today. I’ve said this on a number of occasions before – many public sector bodies seem to struggle enormously with the high-level, principles-based nature of data protection law. Frequently, they visit my office and ask me questions like (and this is a fictional example): “Can we roll out a programme requiring every human in Ireland to swallow an ingestible sensor and wear a patch to pick up its signal and transfer the data via Bluetooth to a government database?” When I ask for details of the purpose of the project, the legal basis for the collection, the necessity for this level of interference with privacy rights, the alternatives considered, the results of the privacy impact assessment conducted, the safeguards proposed and so on, it usually draws a blank. When I explain the analysis the public body needs to conduct and give guidance on how to do so, they listen politely and then say: “But can we do that project with the ingestible sensors?” In other words, there is a reluctance to step through the detailed and contextual analysis that is necessary in every case. In this type of scenario, my office will typically hear back that the public body has said “the Data Protection Commissioner doesn’t like ingestible sensors, so we can’t do that project”. But I hope I’ve made clear that Data Protection Commissioners don’t make policy choices and are not there to like or dislike any technologies or policies. Our role is to ensure that appropriate data privacy analysis has been conducted so that any measure can be lawfully implemented. There is no book that tells us the answer to any data protection implementation question – it hangs off the detail of the specific case. And the organisation proposing it is best placed to conduct that analysis and present it to us.
Context for individuals in making choices
And context counts for all of us individually too, in terms of the privacy choices we have to make. Many of you here will be well familiar with the American cryptographer Bruce Schneier. He’s the author of many highly regarded books on data security and privacy, and he talks a lot about the issues of modern internet technologies – in particular, monitoring and tracking by private corporations and government surveillance. He describes surveillance as both a political and a legal problem which needs to be tackled on both of those fronts – which, I think we can see very clearly today, is the case. But I was particularly interested in the chapter in his book “Data and Goliath” where he talks about what we should all be doing ourselves to avoid being subject to surveillance: pay in cash rather than by credit card; alter our driving routes to avoid traffic cameras; leave your smartphone at home; use DuckDuckGo for internet searches; keep cash transactions below a certain threshold to avoid triggering notification to financial regulators; encrypt; refrain from posting identifying details on public sites; turn off location services on your smartphone; put a sticker over your computer’s camera; dress in drag (genuinely); swap loyalty cards with your neighbours; enter random information into webforms; avoid doing online surveys; search for complete strangers on Facebook to confuse it about who you really know. And so on. And I don’t think he’s being ironic or comedic about any of this. In fact, he acknowledges clearly that there would be huge social, time and monetary costs in operating this way, not to mention the psychological burden of total paranoia. But I think his list is useful in tuning us in to the internet world in which we live, what our level of choice is, and sometimes how stark those choices currently are.
So the world and the choices we get to make have certainly changed, and our relationship with how we manage our identity has changed. It’s not a matter of asking “is data privacy dead?” but really “how has data privacy changed?”. It’s pretty much impossible for any of us to live now without generating a digital footprint and being subject to some form of tracking. But as Bruce Schneier advises, we must each individually identify our own “sweet spot” and decide what trade-offs between personal data and a service we are willing to make.
And finally, the GDPR
And finally, this brings me to the GDPR and its importance in allowing all of us to better understand our choices with regard to our personal data – and indeed, in some cases, in driving better choices for us.
The EU Commission is seeking to ensure the protection of the rights of Europeans, in particular their Article 8 Charter right to have their personal data protected. But it is also striving to promote and strengthen the EU digital economy. And it knows this can only be achieved where consumers have trust in the online services – whether public or private – they engage with. Rather than being mutually exclusive, data protection law and the strengthening of the digital economy are complementary.
And I believe the GDPR is going to slowly transform our relationship with digital service providers of all types. Yes, of course, the new law is based on the same data protection principles with which we are familiar. But its real strength lies in the new accountability and transparency requirements it introduces. These will drive the most significant new behaviours by organisations and by data subjects. The very simple idea of having to know your organisation’s personal data processing operations, having to document them, and having to do due diligence on the legal basis for the collection, processing and retention is a “genius” innovation in the law which is going to be revolutionary in compliance terms and revelatory to many boards of organisations. Other requirements – such as appointing a Data Protection Officer, mandatory reporting of breaches to data protection authorities, and conducting a data protection impact assessment when implementing new technologies – will make data protection a central compliance activity in organisations. The requirement to demonstrate data minimisation and privacy by design and by default will force organisations to step carefully through the analysis that must be done. The notice requirements to users under Articles 13 and 14 of the GDPR should see the back of obscure privacy notices buried within other terms and conditions that talk in opaque language about potential sharing of data with affiliates and so on. The new concise and intelligible notices we receive will allow all of us to decide whether Bruce Schneier’s suggestions on how to protect our data privacy are actually quite reasonable in the circumstances – at the moment, we often don’t have a clear enough picture of what is being collected to understand the risks of signing up to an app or not.
The GDPR recognises a risk-based approach to implementation reflective of the fact that context is all important in terms of an analysis of the risks for data subjects and the mitigation measures that can be taken in a given scenario.
And all of these new transparency requirements in the law, we hope, will drive the development of a new market of data privacy differentiation, where companies that can distinguish their product or service by their privacy offering will start to win. And best of all, the GDPR has extra-territorial reach, so any online company targeting sales or services at European users is caught in its net. The GDPR is going to change global behaviours and drive new global standards.
And of course, the only reason data protection compliance is going to become a matter of board-level interest now is the sanctions and fines in the GDPR. As a data protection authority supervising the world’s largest internet companies from Dublin, we are very pleased to see our enforcement toolkit being expanded massively by the EU to allow us to implement punitive sanctions where warranted. As a supervisory authority, we have been preparing for our transformation for the last two years. Our budget has been quadrupled by government since 2014; we have close to trebled our staffing numbers to 70 and will have about 100 people by the end of this year. We’ve hired specialists in the legal, technical, investigative and communications fields. And we’ve ramped up our awareness and outreach programmes considerably in the last six months, most recently with the launch of a new microsite to drive awareness of, and preparation for, the GDPR.
I said at the outset that we live in interesting times, in part because of the pace of change this current industrial revolution imposes on us. The pace is only going to accelerate as we move closer to a merging of the biological and the technological in mainstream artificial intelligence applications, where learning algorithms will dictate the future. The GDPR recognises the potential of innovation and technology, but simply demands that it be pursued in a responsible way, with accountability and transparency about the implications for each of us in controlling our identities and access to our personal data. It won’t solve all of the issues our internet world is throwing up – other laws will be needed to regulate some of these aspects – but the GDPR is overall going to make the world a better place, and at the Irish data protection authority we are getting ready and fully determined to play our part in enforcing that new order.