Case Studies (Annual Report)

The following is a list of case studies, by year, as featured in Annual Reports published by the DPC. These case studies provide an insight into some of the issues that the DPC investigates on a day-to-day basis. For ease of reference, some of the case studies have been indexed by categories below.

  1. Failure to respond to an Access Request
  2. Failure to respond to an Access Request (II)
  3. Fair obtaining complaint made against a Golf Club
  4. Right to be Forgotten (Microsoft)
  5. Access and Erasure request (Pinterest)
  6. Right to be Forgotten (Microsoft)
  7. Restrictions to the Right of Access – files from An Garda Síochána to the Director of Public Prosecutions
  8. Disclosure Without Consent
  9. Disclosure of Sensitive Data
  10. Prosecution of Guerin Media Limited
  11. Prosecution of Vodafone Ireland Limited
  12. Erasure request to Tinder by Greek data subject, handled by the DPC as Lead Supervisory Authority
  13. Law Enforcement Directive (LED)
  14. Disclosure of account statements by a bank to the representative of a joint account holder
  15. Potential high risk resulting from inaccurate Central Credit Register data
  16. Hacking of third party email
  17. Enforcement follow-through: Surveillance Technologies and Data Protection in Limerick
  18. Article 60 decision concerning Twitter International Company – ID Request, Erasure Request
  19. Article 60 decision concerning Airbnb Ireland UC – Delayed response to an Access Request and an Erasure Request
  20. Cross-border complaint resolved through EU cooperation procedure

1)  Case Study 1: Failure to respond to an Access Request

The DPC received a complaint from an individual regarding a subject access request made by him to an organisation (the data controller) for a copy of all information held regarding his engagement with the data controller. The individual did not receive a response to this request.

The DPC intervened to see if the matter could be informally resolved. The complainant was, in particular, dissatisfied that certain documents had not been provided in response to his access request. The data controller's position was that the documents were not provided because the personal data had been provided “in another format”.

Data protection access rights are not about access to documents per se; they are about access to personal data. An access request may be fulfilled by providing the individual with a full summary of their data in an intelligible form. The form in which the data is supplied must be sufficient to allow the applicant to become aware of the personal data being processed and to check that it is accurate and being processed lawfully.

Having examined what data the controller did provide in this case, the DPC was satisfied to advise the complainant that he had been provided with all of the data to which he was entitled under data protection legislation.

 

2)  Case Study 2: Failure to respond to an Access Request (II)

The DPC received a complaint from an individual regarding a subject access request made by her to a service establishment (the data controller) for a copy of CCTV footage relating to her visit to the data controller’s premises on a particular date. The individual did not receive a response to this request.

The DPC intervened to see if the matter could be informally resolved.

By the time the DPC received the complaint, it transpired that the data controller no longer held any information relating to the complainant, as it had not been aware of the access request until it was brought to its attention by this office. This was because the email address to which the access request was sent was not an address that was regularly used, despite it being the email address contained in the data controller’s Privacy Policy. The data controller further stated that CCTV footage is retained for 14 days due to the system’s storage capacity, and that it was therefore not in a position to provide the requested CCTV footage as more than 14 days had elapsed.

Having examined the matter thoroughly, it was apparent to this office that the data controller contravened Article 12(3) of the GDPR, as controllers have an obligation to respond to a subject access request within the statutory timeframe set out in Article 12 of the GDPR, even where the controller does not hold any such data. The failure by the data controller to monitor the inbox associated with the email address in its Privacy Policy resulted in its failure to secure the relevant CCTV footage before it was deleted in line with its retention policy. In this regard, the failure to have relevant organisational measures in place left the data controller unable to fulfil the subject access request.

The DPC issued directions to the data controller reminding it of its obligation to monitor any email mailbox which it provides for data subject requests. The DPC will take enforcement action if a repeat of this issue arises with the same controller.

 

3)  Case Study 3: Fair obtaining complaint made against a Golf Club

An individual made a complaint to the DPC concerning a data controller’s use of CCTV footage to investigate an incident in which the individual was involved.

The individual had organised an event in a leisure facility (the data controller), and displayed signage in relation to Covid-19 procedures to assist attendees. At the end of the event, the individual inadvertently removed a different sign also in relation to Covid-19 procedures when removing the signage they had installed for the event. The data controller reviewed its CCTV footage to establish who had removed the sign.

The complainant was of the opinion that the data controller did not process their personal data in a proportionate or transparent manner, and that it did not comply with its obligations as a data controller in how it investigated the incident. Accordingly, the individual lodged a complaint with the DPC.

The DPC intervened to seek to resolve the matter informally and the parties reached an amicable resolution when the leisure centre agreed to undertake an audit of its use of the CCTV system and to restrict access to review CCTV footage to designated staff members.

The individual thanked the DPC for handling their complaint in a professional and helpful manner and further stated that they had been reluctant to submit the complaint initially, as they were aware of the volume of complaints the DPC deals with and the accompanying constraints on resources. The complainant stated that, as a result of the DPC’s involvement, they felt confident that the issue would not arise in the future, and wished to express their appreciation and acknowledge the DPC’s efficiency in dealing with the matter.

4)  Case Study 4: Right to be Forgotten (Microsoft)

The complaint concerned the individual’s dissatisfaction with the response of Microsoft Ireland Operations Limited (the Data Controller) to their right to be forgotten request pursuant to Article 17 GDPR. The individual requested the delisting of two URLs that were being returned by the Data Controller’s search engine in searches of the individual’s name.

The Data Controller confirmed to the individual that the URLs had been delisted. However, a search of the individual’s name, carried out by their legal representative, showed that the URLs continued to be returned. The DPC reviewed the URLs on receiving the complaint and confirmed that they were still being returned.

The DPC intervened to seek to swiftly and informally resolve the matter.

The DPC corresponded with the Data Controller and noted that, despite confirmation that the URLs had been delisted, they continued to be returned in searches of the individual’s name. The Data Controller investigated the request further and confirmed to the DPC that the URLs had now been delisted.

Following further investigation by the DPC, it was determined that, while the original URLs requested for delisting no longer appeared, a different URL, redirecting to the same content, was now being returned. At the DPC’s request, made on behalf of the individual, the Data Controller delisted this URL as well.

The DPC wrote to the individual and outlined the Data Controller’s actions. The DPC confirmed that all three URLs had been delisted by the Data Controller.

This case demonstrates the importance of Supervisory Authorities, in this case the DPC, carrying out their own investigations and ensuring that individuals’ requests are fulfilled in line with the GDPR. It is also an example of how the DPC took extra measures to ensure that the individual achieved a comprehensively satisfactory outcome, rather than having to submit a new complaint in respect of the new URL.

 

5)  Case Study 5: Access and Erasure request (Pinterest)

The complaint concerned the individual’s dissatisfaction with the response of Pinterest Europe (the Data Controller) to his access and erasure requests pursuant to Article 15 GDPR and Article 17 GDPR, respectively.

The individual submitted his requests following the suspension of his account, in order to obtain a copy of all of his personal data and to have it deleted from the Data Controller’s systems. The individual’s account had been suspended due to a violation of the Data Controller’s policies regarding spam. The Data Controller responded to the requests via an automated response, which stated that it had reviewed the account and decided not to reactivate it because it had noticed activity that violated its spam policy. As a result, the individual was no longer able to access the personal data stored on his account. The individual maintained that this information could not be correct, as he seldom used his account, and he sought a more substantive response to his access and erasure requests.

The DPC took up the complaint with Pinterest.

The DPC outlined the individual’s concerns in relation to his access and erasure requests and requested that the Data Controller address those concerns more substantively. The DPC also requested that the Data Controller indicate whether the individual had been provided with an opportunity to appeal his account suspension and, if so, describe the procedure for such appeals.

The Data Controller responded to the DPC stating that it had investigated the matter and explained that, once an account is suspended on the basis of a spam violation, all correspondence is automatically directed to its Spam Operations team. The Data Controller further explained the appeal process and noted that the individual had corresponded with the Spam Operations team in relation to the appeal of his suspension. The Spam Operations team had failed to identify that this correspondence also included the individual’s access and erasure requests, and these were therefore not addressed in its response.

The Data Controller’s response also noted that, although the Spam Operations team had rejected the individual’s appeal of his account suspension, it had since carried out another review in light of its updated spam policies. Following this review, the Data Controller re-activated the individual’s account.

The Data Controller also acknowledged the delay in responding to the individual and confirmed that it had since taken steps to ensure that such delays would not occur in responding to future requests.

The Data Controller confirmed that it had actioned the individual’s access and erasure requests. It also confirmed that it had reached out to the individual to inform him of the steps it had taken in response to the DPC’s correspondence and provided the individual with the explanations set out above.

The actions taken and explanations given by the Data Controller were also outlined to the individual by the DPC. The individual informed the DPC that he was satisfied with the actions taken by the Data Controller in response to the DPC’s correspondence, as they allowed him to download his data and delete his account.

This case study illustrates how simple matters – such as a request being directed to the wrong unit in an organisation – can become data protection complaints if they are not identified and addressed appropriately.

 

6)  Case Study 6: Right to be Forgotten (Microsoft)

The complaint concerned the individual’s dissatisfaction with the response of Microsoft Ireland (the Data Controller) to their right to be forgotten request pursuant to Article 17 GDPR. The individual requested that seven URLs be delisted from the results returned in a search against their name on the Data Controller’s search engine. The individual stated that their National Identity number was contained in the URLs returned and raised concerns that the availability of their National Identity number increased the risk of identity theft.

The DPC intervened on behalf of the complainant.

The Data Controller originally refused the delisting request, stating that the URLs contained information of public relevance and that the information had been published in an official bulletin of a government body, in this case the Spanish Government.

The DPC corresponded with the Spanish Data Protection Authority in relation to the information published in the URLs. The Spanish Data Protection Authority stated that, following the introduction of the GDPR, Spanish data protection law had been amended and the Government was no longer permitted to disclose citizens’ complete National Identification numbers alongside their names and surnames when publicising administrative acts.

Following this clarification from the Spanish Data Protection Authority, the DPC informed the Data Controller of the change in Spanish data protection law. The Data Controller stated that, based on the update to Spanish data protection law, it would delist all requested URLs from being returned against the individual’s name pursuant to Article 17 GDPR.

This case highlights the importance of communicating with other supervisory authorities during the complaint resolution process. In these circumstances, the DPC was provided with clarification on how Spain has adapted its national legislation to comply with the GDPR. It also allowed the Data Controller to adapt its current procedure to ensure that requests involving the delisting of URLs containing full National Identity numbers are handled in accordance with the updated national legislation.

 

7)  Case Study 7: Restrictions to the Right of Access – files from An Garda Síochána to the Director of Public Prosecutions

An individual complained to the DPC about the restrictions applied by the Director of Public Prosecutions (the DPP) in response to an access request. The person outlined that they had been the victim of a crime but that the DPP had decided not to prosecute.

The DPC noted that the DPP imposed restrictions on access to the investigation file, the statement of a witness, memoranda of interviews taken as part of the investigation, as well as correspondence between the DPP and An Garda Síochána (AGS). The DPC probed the restrictions applied by the DPP further, as any restriction relied upon by data controllers must respect the essence of the fundamental rights and freedoms of individuals.

The data protection rights conferred under the Law Enforcement Directive, as transposed in the Data Protection Act 2018, pertain solely to an individual’s own personal data and do not confer a right of access to the personal data of a third party. In this case, the DPP clarified that it restricted the right of access of the individual in question under Section 94(2)(e) of the Act in order to protect the rights and freedoms of other persons. The DPP also cited section 91(7) of the Act, which provides that a data controller shall not provide individuals with personal data relating to another individual where doing so would reveal, or would be capable of revealing, the identity of the other individual. The only circumstance in which section 91(7) does not apply is where a third party consents to the provision of their information to the individual making the request, as set out in section 91(8) of the Act.

With regard to the investigation file submitted by AGS to the DPP and correspondence between the DPP and AGS, under Section 162 of the Act, individual data subject rights and controller obligations do not apply in so far as they relate to personal data processed for the purpose of seeking, receiving or giving legal advice. Equally, such rights and obligations do not apply to personal data in respect of which a claim of privilege could be made for the purpose of, or in the course of, legal proceedings. The DPC noted that the Act explicitly provides that this includes personal data consisting of communications between a client and his or her legal advisers or between those advisers. After seeking further clarification, it was apparent to the DPC that the restrictions invoked were valid and that legal privilege applied to the data sent between AGS and the DPP.

The DPC handled several similar complaints against the DPP during 2022 in relation to subject access requests. In each complaint examined, the DPC probed further any restrictions applied by the DPP on a case-by-case basis, in addition to querying any privilege claimed in respect of data withheld.

In 2022, a complaint against the DPP examined by the DPC in 2020 was the subject of a challenge by the complainant in Carrick-on-Shannon Circuit Court (civil) with regard to the DPC’s acceptance of the restrictions applied by the DPP. The Court noted that the DPC had queried the reasons given by the DPP to the appellant for withholding certain personal data and that the DPP had provided the DPC with a further detailed response. The Court stated that it was clear from the pleadings that the handling of the complaint was not a rubber-stamping exercise and that the DPC had examined all matters. The Court stepped through each of the three documents withheld by the DPP and the privilege claimed in respect of each and found no error in respect of any of the three categories.

 

8)  Case Study 8: Disclosure Without Consent

An individual complained to the DPC that the Criminal Assets Bureau (CAB) had disclosed his personal financial details, without his consent, to a number of individuals against whom CAB had taken legal proceedings. CAB advised the DPC that the proceedings in question were under the Proceeds of Crime Acts 1996-2016 (PoCA), the purpose of which is to identify and confiscate property established, to the satisfaction of the High Court, to be the proceeds of crime. CAB stated that the information contained in the subject documentation was required to establish the provenance of the property that was the subject matter of the proceedings. CAB outlined that the personal data of the complainant was intertwined with the personal data of the individuals being prosecuted and could not be redacted from the court documents. The DPC noted that such proceedings are governed by section 158(1) of the Data Protection Act 2018 (the Act), which provides that the rights under the GDPR and the Law Enforcement Directive, as transposed in the Act, may be restricted in order to ensure the protection of judicial independence and judicial proceedings.

As set out in Section 101(2) of the Act, the DPC is not competent for the supervision of data processing operations of the courts when acting in their judicial capacity. The DPC advised the complainant that CAB prepared the court documents for the purposes of court proceedings and that supervision of data processing operations of the courts when acting in their judicial capacity is assigned to a Judge appointed by the Chief Justice pursuant to section 157 of the Act. The DPC provided the complainant with the contact details for the assigned judge.

 

9)  Case Study 9: Disclosure of Sensitive Data

An individual complained to the DPC that a Clothing and Food Company had disclosed their personal medical information by issuing postal correspondence with the words “Coeliac Mailing” printed on the outside of the envelope. As part of the Store’s Value Card facility, the individual in question had signed up to receive an ‘Annual Certificate of Expenditure’ on gluten-free products purchased during the year, which could be used for tax purposes. The DPC advised the Store that, under Article 9 of the GDPR, health data is deemed sensitive data and is afforded additional protection, and that displaying the words “Coeliac Mailing” had to be examined in light of Article 9 of the GDPR. In response, the Store advised the DPC that it had instructed its marketing department to cease using this wording on the outside of envelopes for all future mailings. The DPC welcomes the positive outcome of this engagement.

 

10)  Case Study 10: Prosecution of Guerin Media Limited

In January 2022, the DPC received two complaints from two individuals regarding unsolicited marketing emails received from Guerin Media Limited. In response to the DPC’s investigation of the complaints, Guerin Media Limited explained that the two individuals’ email contact details had previously been removed from all marketing lists held by the company, with the exception of a Gmail contact list that it maintained. It stated that, due to human error and the fact that their details remained on the Gmail contact list, both individuals were sent marketing emails from Guerin Media Limited that should not have been sent.

The DPC had previously prosecuted Guerin Media in 2019 for breaching Regulation 13 of the ePrivacy Regulations in relation to previous complaints regarding similar incidents of unsolicited email marketing. Accordingly, the DPC decided to proceed to another prosecution arising from these complaint cases.

At Naas District Court on 5 December 2022, Guerin Media Limited pleaded guilty to three charges under Regulation 13(1) of the ePrivacy Regulations. The District Court convicted Guerin Media Limited on all three charges and it imposed fines totalling €6,000. Guerin Media Limited agreed to pay €1,000 towards the DPC’s legal costs.

 

11)  Case Study 11: Prosecution of Vodafone Ireland Limited

In July 2021, the DPC received a complaint from an individual regarding an unsolicited marketing telephone call received from Vodafone Ireland Limited. In response to the DPC’s investigation of the complaint, Vodafone Ireland Limited explained that the complainant, an existing customer, had opted out of receiving marketing communications in March 2018. Vodafone Ireland Limited had carried out a manual check of preferences in advance of conducting a marketing campaign but, due to human error, the complainant was nevertheless included in the campaign.

The DPC had previously prosecuted Vodafone Ireland Limited in 2021, 2019, 2018, 2013 and 2011 for breaching Regulation 13 of the ePrivacy Regulations in relation to previous complaints. Accordingly, the DPC decided to proceed to another prosecution arising from this complaint case.

At Dublin Metropolitan District Court on 27 June 2022, Vodafone Ireland Limited pleaded guilty to one charge under Regulation 13(6) of the ePrivacy Regulations. The District Court applied the Probation of Offenders Act 1907 in this case, on the basis of a charitable donation of €500 to Little Flower Penny Dinners. Vodafone Ireland Limited agreed to discharge the DPC’s legal costs.

 

12)  Case Study 12: Erasure request to Tinder by Greek data subject, handled by the DPC as Lead Supervisory Authority

This case study concerns a complaint from an individual, received by the DPC via the One-Stop Shop (OSS) mechanism created by the GDPR, regarding an erasure request they had made to MTCH Technology Services Limited (Tinder).

By way of background, the individual’s account had been suspended by Tinder. Following this suspension, the individual submitted a request to Tinder, under Article 17 of the GDPR, seeking the erasure of all personal data held in relation to them. When contacting Tinder, the individual also raised an issue with the lack of a direct channel for contacting Tinder’s DPO. As the individual was not satisfied with the response they received from Tinder, they made a complaint to the Greek Supervisory Authority. The individual asserted that neither their request for erasure nor their concerns about accessing the DPO channels had been properly addressed by Tinder. As the DPC is the Lead Supervisory Authority (LSA) for Tinder, the Greek Supervisory Authority forwarded the complaint to the DPC for handling.

The DPC intervened to seek a swift and informal resolution of the matter in the first instance.

The DPC put the substance of the complaint to Tinder and engaged with it. In response, and by way of a proposed amicable resolution, Tinder offered to conduct a fresh review of the ban at the centre of this case. Following this review, Tinder decided to lift the ban. Lifting a ban allows an individual to access their account on the platform again; the individual can then decide whether they wish to use the self-delete tools to erase their account from within the Tinder platform. In addition to the above, Tinder provided the individual with information in relation to its retention policies.

In relation to the matter of individuals being able to contact its DPO, on foot of the DPC’s engagement with Tinder, the platform agreed to strengthen its existing processes by posting a dedicated FAQ page on its platform. This page now provides enhanced information to individuals on specific issues relating to the processing of personal data and on exercising their rights directly with Tinder’s DPO.

Via the Greek Supervisory Authority, the DPC informed the individual of the actions taken by Tinder. In their response the individual confirmed that they were content to conclude the matter and, as such, the matter was amicably resolved pursuant to section 109(3) of the Data Protection Act 2018 (the Act), and the complaint was deemed to have been withdrawn.

This case study again demonstrates the benefits — to individual complainants — of the DPC’s intervention by way of the amicable resolution process. The DPC’s engagement with the controller also resulted in Tinder improving the information that it makes available to all of its users on its platform.

 

13)  Case Study 13: Law Enforcement Directive (LED)

The Garda Síochána Ombudsman Commission (GSOC) sent a letter containing the outcome of its investigation into a complaint to an address where the person who made the complaint no longer resided. The DPC established that the letter was posted to the address where the individual had lived at the time of a previous complaint that they had made to GSOC. The individual in question had subsequently informed GSOC that they no longer lived at that address and that, with regard to the new complaint, they were only contactable by email.

The DPC liaised extensively with GSOC regarding this complaint. GSOC reported the data breach to the DPC through the normal breach reporting channels. To avoid this type of incident happening again, GSOC advised the DPC that an email had been issued internally to all staff advising them of the importance of ensuring the accuracy of personal data entered onto the Case Management System (CMS). GSOC also outlined that it had sent a separate email to all line management in the GSOC Casework section advising them of the necessity to accurately input personal data on the CMS and to amend this information whenever updated information is received.

 

14)  Case Study 14: Disclosure of account statements by a bank to the representative of a joint account holder

The complainant in this case held a joint bank account with a family member. Following a request from the solicitors of the other joint account holder, the bank (the data controller) disclosed copies of bank statements relating to the account, which included the complainant’s personal data, to those solicitors. The complainant was concerned that this disclosure did not comply with data protection law.

During the course of the DPC’s handling of this complaint, the bank set out its position that any joint account holder is entitled to access the details and transaction information of the joint account as a whole. The bank further took the view that, in relation to solicitors who are acting for its customers, it is sufficient for it to accept written confirmation from a solicitor on their headed paper that the solicitor acts for the customer as authority for the bank to engage with the solicitor in their capacity as a representative of the bank’s customer.

Data protection law requires that personal data be collected or obtained for specified, explicit and legitimate purposes and not be further processed in a manner that is incompatible with those purposes (the “purpose limitation” principle). In this case, the DPC noted that the bank had obtained the complainant’s personal data in order to administer the joint account which the complainant held with the other account holder, including the making of payments, the collection of transaction information and the preparation of bank statements. It appeared to the DPC that it was consistent with the bank’s terms and conditions for the joint account, and the account holder’s signing instructions on the account (which allowed either party to sign for transactions without the consent of the other account holder), that the administration of the account could be completed by one account holder without the consent of the other. In the light of this, the DPC considered that the disclosure of bank statements to the solicitors of the other joint account holder was not incompatible with the specified, explicit and legitimate purpose for which the complainant’s personal data had been obtained by the bank, i.e. for the administration of the joint account.

Second, the DPC considered whether the bank had a lawful basis for the disclosure of the complainant’s personal data, as required under data protection law. In this regard, the DPC was satisfied that the bank was entitled to rely on the “legitimate interests” lawful basis, which permits the processing of personal data where that processing is necessary for the purposes of the legitimate interests pursued by the data controller or by a third party. In this case, the bank had disclosed the complainant’s personal data on the basis that the solicitor was acting for the other joint account holder and was seeking the statements for legitimate purposes, namely to carry out an audit of the other account holder’s financial affairs. In circumstances where, pursuant to the signing instructions on the account, the other account holder would have been entitled to administer the account, the DPC was satisfied that the bank would not have had any reason to suspect that the disclosure would be unwarranted by reason of any prejudice to the complainant’s fundamental rights or freedoms. Accordingly, the DPC considered that the bank had a lawful basis for the disclosure, regardless of whether the complainant had provided consent.

Finally, the DPC considered whether the bank had complied with its obligations under data protection law to take appropriate technical and organisational measures to ensure security of personal data against unauthorised or unlawful disclosure. In this regard, the DPC accepted the position of the bank, set out in its policies, that it was appropriate to accept written confirmation from a solicitor that they were authorised to act on behalf of an account holder, without seeking further proof. The bank’s policy in this regard was based on the fact that a solicitor has professional duties as an officer of the court and as a member of a regulated profession.

 

15)  Case Study 15: Potential high risk resulting from inaccurate Central Credit Register data

The DPC received a notification from a financial sector data controller concerning an individual whose account had been incorrectly reported to the Central Credit Register (CCR). The controller had purchased the individual’s account as part of a portfolio sale in 2015 and was not aware that the individual had been adjudicated bankrupt in 2014. Individuals who have been declared bankrupt fall outside the scope of reporting obligations to the CCR. In addition, accounts with returns prior to the commencement of the CCR on 30 June 2017 are not reportable to it.

The individual experienced difficulty obtaining a loan because their CCR record, which is visible to other lending institutions, had been reported in error by the controller as live and in arrears. The risk to the rights and freedoms of the individual was assessed as high and the breach was accordingly communicated by the controller to the individual under Article 34 of the GDPR.

The DPC confirmed with the controller that the individual’s CCR record had been amended. By way of mitigation, the controller introduced measures which require sellers of portfolios to disclose information on individuals such as bankruptcies.

This case highlights the importance of having systems in place to ensure the security and integrity of personal data under Article 5(1)(f) GDPR. Controllers should be aware of the personal data they hold on individuals and have measures in place to validate and understand that data when acquiring it from other parties. The case also demonstrates that controllers have a duty to prevent any alteration to, or unauthorised disclosure of, personal data, incorrect or otherwise, to the CCR where this poses a risk to individuals.
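To illustrate the kind of validation measure referred to above, the following is a minimal, hypothetical sketch of a pre-submission check that a controller might apply before reporting an account to the CCR. It encodes only the two exclusions described in this case study; the data structure, field names and function are illustrative assumptions rather than part of any real CCR reporting interface.

from dataclasses import dataclass
from datetime import date

# Hypothetical pre-submission check applying the two exclusions described in
# the case study; names and structures are illustrative assumptions only.

CCR_COMMENCEMENT = date(2017, 6, 30)  # date the CCR commenced, per the case study


@dataclass
class Account:
    holder_adjudicated_bankrupt: bool  # account holder declared bankrupt
    last_return_date: date             # date of the most recent return on the account


def reportable_to_ccr(account: Account) -> bool:
    """Return True only if neither exclusion described in the case study applies."""
    if account.holder_adjudicated_bankrupt:
        # Individuals who have been declared bankrupt fall outside CCR reporting obligations.
        return False
    if account.last_return_date < CCR_COMMENCEMENT:
        # Accounts with returns pre-dating the CCR's commencement are not reportable.
        return False
    return True


# The account in this case study (bankruptcy in 2014, portfolio purchase in 2015;
# the return date below is an illustrative assumption) would fail this check and
# should not have been reported as live and in arrears.
print(reportable_to_ccr(Account(holder_adjudicated_bankrupt=True,
                                last_return_date=date(2014, 6, 1))))  # prints False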

 

16)  Case Study 16: Hacking of third party email

A Hospice Care Centre (the Data Controller) used Microsoft Office 365, a cloud-based email service, and also engaged third-party IT Consultants.

An Office 365 audit was conducted by the IT Consultants every quarter, in which a number of recommendations were identified, including, but not limited to, enabling Multi-Factor Authentication (MFA) on all user accounts and disabling forwarding rules on all accounts.

A user’s credentials were subsequently compromised and the IT Consultants established that the credentials had been obtained as a result of a brute force attack, which may have been prevented had the controller introduced Multi-Factor Authentication as recommended at the time of the audit. On the advice of the IT Consultants, the compromised user’s password was reset and MFA was introduced for this user. The controller has since commenced the introduction of MFA for all users.

This breach could likely have been prevented if the recommendations of the audit had been implemented in a timely manner.
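By way of illustration only, the sketch below shows one way an administrator might check a mailbox for forwarding or redirect rules using the Microsoft Graph API, reflecting the audit recommendation to disable forwarding rules on all accounts. It is not drawn from the case study itself: the user address and token are placeholders, token acquisition is not shown, and enforcing MFA would be configured separately in the tenant.

# Illustrative sketch: list inbox rules that forward or redirect mail, via the
# Microsoft Graph API (GET /users/{user}/mailFolders/inbox/messageRules).
# Assumes an access token with the relevant mailbox-settings read permission has
# already been obtained; the user address below is a placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"


def forwarding_rules(user: str, token: str) -> list:
    """Return the inbox rules for a mailbox whose actions forward or redirect mail."""
    url = f"{GRAPH}/users/{user}/mailFolders/inbox/messageRules"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    rules = resp.json().get("value", [])
    return [
        r for r in rules
        if r.get("actions", {}).get("forwardTo") or r.get("actions", {}).get("redirectTo")
    ]


if __name__ == "__main__":
    access_token = "<access-token>"  # placeholder; real acquisition not shown
    for rule in forwarding_rules("user@example.org", access_token):
        print(rule.get("displayName"), rule.get("actions"))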

 

17)  Case Study 17: Enforcement follow-through: Surveillance Technologies and Data Protection in Limerick

As reported in last year’s annual report, the DPC issued a decision to Limerick City and County Council in December 2021 regarding a broad range of issues pertaining to surveillance technologies deployed by the Council. In 2022, the DPC followed up on the twenty-one corrective actions the decision required Limerick City and County Council to take, to ensure that these were implemented within the specified timeframes.

Amongst the issues of concern in the decision were the Council’s use of CCTV cameras for which no authorisation from the Garda Commissioner had been received, the absence of a lawful basis for the use of traffic management CCTV cameras, access from Henry Street Garda Station to the Council’s CCTV cameras in specified locations, and the use of automatic number plate recognition technology and drones in public places for the purposes of prosecuting crime or for other purposes. The DPC in its decision imposed a temporary ban on the Council’s processing of personal data in respect of certain CCTV cameras and ordered the Council to bring its processing into compliance by taking specified actions. The Council was also reprimanded by the DPC in respect of the infringements, and an administrative fine in the amount of €110,000 was imposed.

By way of follow-up enforcement action in respect of the implementation of the corrective actions, the DPC wrote to Limerick City and County Council and met virtually with them on a number of occasions in 2022 in order to monitor progress. On 27 July 2022, the DPC carried out an onsite inspection at Limerick City and County Council to verify that all corrective actions had been carried out. At the end of this process, the DPC was satisfied that Limerick City and County Council had implemented the corrective actions required by the DPC’s decision. Amongst the issues of note in that regard were the following:

  • Authorisation of the Garda Commissioner under Section 38 of the Garda Síochána Act was obtained for 353 CCTV cameras across Limerick City and County;
  • A joint controller agreement between An Garda Síochána and Limerick City and County Council in respect of the authorised cameras was put in place;
  • All automated number plate recognition capability was removed from all sites where it had been in operation;
  • All traffic management cameras were disconnected;
  • CCTV cameras that previously focussed on traveller accommodation sites were removed;
  • The link from some of the Council’s CCTV cameras to Henry Street Garda Station was disconnected;
  • Drones were grounded;
  • New CCTV signage was erected across all CCTV sites;
  • Plans to implement real-time monitoring of CCTV cameras in fourteen towns and villages across Co. Limerick were abandoned; and
  • 126 CCTV cameras are no longer in operation.

In addition, in late November 2022, the Circuit Court confirmed the DPC’s decision to impose an administrative fine of €110,000 on Limerick City and County Council in relation to the GDPR infringements identified in the decision.

 

18)  Case Study 18: Article 60 decision concerning Twitter International Company – ID Request, Erasure Request

A complaint was lodged directly with the DPC on 02 July 2019 against Twitter International Company (“Twitter”), and accordingly was handled by the DPC in its role as lead supervisory authority. The complainant alleged that, following the suspension of their Twitter account, Twitter failed to comply within the statutory timeframe with an erasure request they had submitted to it. Further, the complainant alleged that Twitter had requested a copy of their photographic ID in order to action their erasure request without a legal basis to do so. Finally, the complainant alleged that Twitter had retained their personal data following their erasure request without a legal basis to do so.

The complainant’s Twitter account was suspended as Twitter held that the complainant was in breach of its Hateful Conduct Policy. Once Twitter suspended the account, the complainant sought that all of their personal details, such as email address and phone number, be deleted. They submitted multiple requests to Twitter asking that their data be erased. Twitter asked the complainant to submit a copy of their ID in order to verify that they were, in fact, the account holder. The complainant refused to do so. In the premises, Twitter ultimately complied with the erasure request without the complainant’s photographic ID.

The DPC initially attempted to resolve this complaint amicably by means of its complaint handling process. However, those efforts failed to secure an amicable resolution and the case was opened for further inquiry. The issues for examination and determination by the DPC’s inquiry were as follows: (i) whether Twitter had a lawful basis for requesting photographic ID where an erasure request had been submitted pursuant to Article 17 GDPR, (ii) whether Twitter’s handling of the said erasure request was compliant with the GDPR and Data Protection Act 2018 and (iii) whether Twitter had complied with the transparency requirements of Article 12 GDPR.

In defence of its position, Twitter stated that authenticating that the requester is who they say they are is of paramount importance where a party requests the erasure of their account. It stated that unique identifiers supplied at the time of registration of an account (i.e. email address and phone number) simply associate a user with an account, but these identifiers do not verify the identity of the account holder. Twitter posited that it is cognisant of the fact that email accounts can be hacked and that other interested parties might seek to erase an account, particularly in a situation such as this, where the account was suspended due to numerous alleged violations of Twitter’s Hateful Conduct Policy. The company indicated that it retains basic subscriber information indefinitely in line with its legitimate interest in maintaining the safety and security of its platform and its users.

Twitter further argued that, as it did not actually collect any ID from the complainant, Article 5(1)(c) was not engaged. Notwithstanding this, it stated that the request for photo identification was both proportionate and necessary in this instance. It indicated that a higher level of authentication is required in circumstances where a person is not logged into their account, as will always be the case where a person’s account has been suspended.

Having regard to the complainant’s erasure request and the associated obligation that any such request be processed without ‘undue delay’, Twitter set out a timeline of correspondence pertaining to the erasure request between it and the complainant. Twitter stated that the complainant had made duplicate requests and, as such, had themselves delayed the process of deletion/erasure. Regarding data retention, Twitter advised the DPC that it retained the complainant’s phone number and email address following the completion of their erasure request. It stated that it retains this limited information beyond account deactivation indefinitely in accordance with its legitimate interests in maintaining the safety and security of its platform and users. It asserted that if it were to delete the complainant’s email address or phone number from its systems, the complainant could then use that information to create a new account even though they had been identified and permanently suspended from the platform for various violations of its Hateful Conduct Policy.

Following the completion of its inquiry, on 27 April 2022, the DPC adopted its decision in respect of this complaint in accordance with Article 60(7) of the GDPR. In its decision, the DPC found that the data controller, Twitter International Company, had infringed the General Data Protection Regulation as follows:

  • Article 5(1)(c): Twitter’s requirement that the complainant verify his identity by way of submission of a copy of his photographic ID constituted an infringement of the principle of data minimisation, pursuant to Article 5(1)(c) of the GDPR;
  • Article 6(1): Twitter had not identified a valid lawful basis under Article 6(1) of the GDPR for seeking a copy of the complainant’s photographic ID in order to process his erasure request;
  • Article 17(1): Twitter infringed Article 17(1) of the GDPR, as there was an undue delay in handling the complainant’s request for erasure; and
  • Article 12(3): Twitter infringed Article 12(3) of the GDPR by failing to inform the data subject within one month of the action taken on his erasure request pursuant to Article 17 of the GDPR.

The DPC also found in its decision that Twitter had a valid legal basis in accordance with Article 6(1)(f) for the retention of the complainant’s email address and phone number that were associated with the account. It also found that, without prejudice to its finding above concerning the data minimisation principle with regard to photo ID, Twitter was compliant with the data minimisation principle as the processing of the email address and phone number data was limited to what was necessary in relation to the purposes for which they are processed.

In light of the extent of the infringements, the DPC issued a reprimand to Twitter International Company, pursuant to Article 58(2)(b) of the GDPR. Further, the DPC ordered Twitter International Company, pursuant to Article 58(2)(d), to revise its internal policies and procedures for handling erasure requests to ensure that data subjects are no longer required to provide a copy of photographic ID when making data erasure requests, unless it can demonstrate a legal basis for doing so. The DPC ordered that Twitter International Company provide details of its revised internal policies and procedures to the DPC by 30 June 2022. Twitter complied with this order by the set deadline.

 

19)  Case Study 19: Article 60 decision concerning Airbnb Ireland UC – Delayed response to an Access Request and an Erasure Request

A complaint was lodged with the Berlin Commissioner for Data Protection and Freedom of Information (“Berlin DPA”) against Airbnb Ireland UC (“Airbnb”) and was thereafter transferred to the DPC to be handled in its role as lead supervisory authority.

The complainant alleged that Airbnb failed to comply with an erasure request and a subsequent access request they had submitted to it within the statutory timeframe. Further, the complainant stated that when they submitted their request for erasure, Airbnb requested that they verify their identity by providing a photocopy of their identity document (“ID”), which they had not previously provided to Airbnb.

The DPC initially attempted to resolve this complaint amicably by means of its complaint handling process. However, those efforts failed to secure an amicable resolution and the case was opened for further inquiry. The issues for examination and determination by the DPC’s inquiry were as follows: (i) whether Airbnb had a lawful basis for requesting a copy of the complainant’s ID where they had submitted an erasure request, pursuant to Article 17 GDPR, (ii) whether Airbnb’s handling of the said erasure request was compliant with the GDPR and Data Protection Act 2018 and (iii) whether Airbnb’s handling of the complainant’s access request was compliant with the GDPR and Data Protection Act 2018.

Airbnb responded to the complainant’s allegations, justifying its request for photographic ID given the adverse effects that would flow from a wrongful deletion of an account. Airbnb highlighted that fraudulent deletion of an Airbnb account can lead to significant real-world harm including, in the case of hosts, the economic harm through cancelled bookings and loss of goodwill built up in the account and, in the case of guests, the potential loss of accommodation while travelling abroad. Airbnb stated that these are not trivial risks and appropriate steps must be taken to address them. It further stated that the provision of an ID document to authenticate an erasure request is a reliable proof of identification and that it does not place a disproportionate burden on the individual making the erasure request. It posited that photographic identity can be considered to be an evidential bridge between an online and an offline identity.

Airbnb ultimately complied with the complainant’s erasure request, verifying their identity by providing them with the option of logging into their account, without the necessity to provide ID. Following intervention by the DPC, Airbnb complied with the complainant’s access request. Having completed its inquiry, on 14 September 2022, the DPC adopted its decision in respect of this complaint in accordance with Article 60(7) of the GDPR. In its decision, the Data Protection Commission found that the data controller, Airbnb Ireland UC, had infringed the General Data Protection Regulation as follows:

  • Article 5(1)(c) of the GDPR

The DPC found that Airbnb’s requirement that the complainant verify their identity by way of submission of a copy of their photographic ID constituted an infringement of the principle of data minimisation, pursuant to Article 5(1)(c) of the GDPR. This infringement occurred in circumstances where less data-driven solutions to the question of identity verification were available to Airbnb;

  • Article 6(1) of the GDPR

The DPC found that, in the specific circumstances of this complaint, the legitimate interest pursued by the controller did not constitute a valid lawful basis under Article 6 of the GDPR for seeking a copy of the complainant’s photographic ID in order to process their erasure request; and

  • Article 12(3) of the GDPR

The DPC found that Airbnb infringed Article 12(3) of the GDPR with respect to its handling of the complainant’s access request. This infringement occurred when Airbnb failed to provide the complainant with information on the action taken on their request within one month of the receipt of the access request.

In light of the extent of the infringements, the DPC issued a reprimand to Airbnb Ireland UC, pursuant to Article 58(2)(b) of the GDPR. Further, the DPC ordered Airbnb Ireland UC, pursuant to Article 58(2)(d), to revise its internal policies and procedures for handling erasure requests to ensure that data subjects are no longer required to provide a copy of photographic ID when making data erasure requests, unless it can demonstrate a legal basis for doing so. The DPC ordered that Airbnb Ireland UC provide details of its revised internal policies and procedures to the DPC by 4 November 2022. Airbnb complied with this order by the set deadline.

 

20)  Case Study 20: Cross-border complaint resolved through EU cooperation procedure

Background

In February 2021 a data subject lodged a complaint pursuant to Article 77 GDPR with the Data Protection Commission concerning an Irish-based data controller. The DPC was deemed to be the competent authority for the purpose of Article 56(1) GDPR.

The details of the complaint were as follows:

     a.      The data subject emailed the data controller in January 2021 to request erasure of his personal data.

     b.      The data subject did not receive any response from the data controller.

Following a preliminary examination of the material referred to it by the complainant, the DPC considered that there was a reasonable likelihood of the parties concerned reaching informal resolution of the subject matter of the complaint within a reasonable timeframe.

Informal Resolution

The DPC engaged with both the data subject and the data controller in relation to the subject-matter of the complaint. Further to that engagement, it was established that, during the week in which the data subject sent his erasure request by email to the controller, a new process to better manage erasure requests was being implemented by the controller. The data controller informed the DPC that it was in a transition period during the week the email came in and that it appeared a response was missed. New personnel were being trained on how to manage these types of requests during this transition period. The data controller stated that it was an oversight, possibly due to the technical transition or human error, and that it regretted the error. In the circumstances, the data controller agreed to take the following actions:

1. The data controller agreed to comply with the erasure request; and

2. The data controller sincerely apologised for the error.

In January 2022, the DPC informed the data subject by email of the final outcome of its engagement with the data controller. When doing so, the DPC noted that the actions now taken by the data controller appeared to adequately deal with the concerns raised in his complaint. In the circumstances, the DPC asked the data subject to notify it, within two months, if he was not satisfied with the outcome so that the DPC could consider the matter further.

On the following day, the data subject informed the DPC by email that he agreed with the informal resolution, given that his concerns regarding the data controller had now been addressed. The DPC was subsequently informed by the data controller that the erasure request had been completed and that the personal data of the data subject had been erased.

Confirmation of Outcome

For the purposes of the GDPR consistency and cooperation procedure, the DPC communicated a draft of the outcome to the supervisory authorities concerned, confirming that:

  • The complaint, in its entirety, had been amicably resolved between the parties concerned;
  • The agreed resolution was such that the object of the complaint no longer existed.

No relevant and reasoned objections were received from the concerned supervisory authorities concerning the draft and the DPC subsequently closed the file in this case.

 

 

  1. Content absent from an access request (Amicable Resolution)
  2. Requests for identification when responding to access requests (Amicable Resolution)
  3. Processing of footage of funeral service by parish church
  4. Use of location data to verify expense claims
  5. Unauthorised disclosure in a workplace setting
  6. Lack of appropriate security measures – unauthorised disclosure in a workplace setting
  7. Delisting request made to internet search engine
  8. Department of Employment Affairs and Social Protection – Independence of the DPO
  9. Data restrictions – absence of consent from all parties (Law Enforcement Directive)
  10. Data restrictions – third-party data; opinion given in confidence (Law Enforcement Directive)
  11. Data restrictions – prosecutions pending (Law Enforcement Directive)
  12. Access restrictions (Law Enforcement Directive)
  13. Prosecution of Three Ireland (Hutchison) Limited (ePrivacy)
  14. Prosecution of Vodafone Ireland Limited (ePrivacy)
  15. Request for footage from online meeting (Access Complaints)
  16. Exemptions applied to CCTV footage (Access Complaints)
  17. Amicable resolution in cross-border complaints - access request to Airbnb
  18. Amicable resolution in cross-border complaints: Google (YouTube)
  19. Amicable resolution in cross-border complaints - Yahoo EMEA Limited
  20. Repeated similar breaches
  21. Unauthorised disclosure arising from video conferencing
  22. Disclosure due to misdirected email
  23. Inappropriate disposal of materials by an educational institution
  24. Email addresses disclosed via group mail
  25. Social Engineering Attack
  26. Covid-19 Vaccination Status and Schools
  27. TikTok and cooperation with other EU data protection authorities
  28. Facebook Election Day Information feature
  29. Facebook View (Ray-Ban stories)
  30. Instagram user self-compromise
  31. Facebook Viewpoints

1)  Case Study 1: Content absent from an access request (Amicable Resolution)

The DPC received a complaint from an individual regarding a subject access request made by them to a data controller for a copy of all information relating to them. The data controller was involved in car park management and a dispute had arisen following the clamping of the individual’s vehicle. The clamping incident was the subject of an appeal to the National Transport Authority. The individual did not receive any response from the data controller.

The individual was subsequently provided with their personal data but did not consider that the data provided to them was complete. Following the intervention of the DPC, further searches were undertaken and the data controller identified additional data which was released to the individual.

The individual remained unsatisfied as they had not been provided with a copy of a particular email which they had sent to the data controller. They stated that it was important for their appeal to be able to prove that the data controller had received the email in question. The data controller subsequently provided this office with a report from the company which hosts its email services, showing that the email in question had been received but was quarantined as suspected spam; it did not reach any of the intended mailboxes, nor was it opened by any person within the organisation.

This email was then automatically deleted from their servers after 14 days. The data controller also provided screenshots from searches conducted of each of the intended mailboxes, which did not return the email in question.

Article 12(3) of the GDPR states that “the controller shall provide information on action taken on a request under Articles 15 to 22 to the data subject without undue delay and in any event within one month of receipt of the request”.

Having examined the matter thoroughly, it was apparent to the DPC that the data controller did not comply with its obligations under Article 12(3) of the GDPR, as it had an obligation to provide a response to the individual’s subject access request within the statutory timeframe, and the data provided to the individual in this case was provided outside of this timeframe. Regarding the email which was quarantined by the data controller’s system, it was clear that this email was no longer in existence at the time the access request was made. When making decisions around the quarantine of emails, the controller must have due regard to its security obligations in line with Article 32, but must also ensure that it does not infringe on the rights of individuals. In this case, there was no apparent interference with any right through the initial quarantine and deletion of the email in question.

 

2)  Case Study 2: Requests for identification when responding to access requests (Amicable Resolution)

A complaint was received from an individual who had submitted an access request to a hotel (the data controller) for a copy of all information relating to them. The hotel asked the requester to provide a copy of a utility bill and a copy of photo ID verified by An Garda Síochána. The DPC asked the data controller to set out the particular concerns it had regarding the identity of the requester in circumstances where the postal address and email address being used by the requester were the same as those provided by them during the booking and check-in process at the hotel. The data was subsequently released to the requester.

In relation to the general approach to requesting ID where data subjects seek to exercise their rights, controllers should only request the minimum amount of further information necessary and proportionate in order to prove the requester’s identity. Seeking proof of identity is less likely to be appropriate where there is no real doubt about identity; where there are doubts, or where the information sought is of a particularly sensitive nature, it may be appropriate to request proof.

Bearing in mind the general principle of data minimisation, seeking more information than that already held as a means of proving identity is likely to be disproportionate. A request for official ID is only likely to be proportionate to validate identification where the category of information relating to that individual is sensitive in nature and where the information on the official ID can be corroborated with the personal data already held by the data controller such as a photo, address or date of birth.

The categories of personal data held and the likelihood of the risks associated with its release should be considered on a case-by-case basis to determine the minimum level of information required. Where no special category personal data is held, confirmation of address may be sufficient. In cases where special category personal data is in fact held, additional information may be proportionate, but only that which would be sufficient to confirm identity, having regard to the data already being processed.

 

3)  Case Study 3: Processing of footage of funeral service by parish church (Amicable Resolution)

(Applicable Law – GDPR and Data Protection Act 2018)

An individual made a complaint against a parish church regarding the processing of the individual’s personal data arising from the live streaming and recording of a family member’s funeral service that the individual had attended. The individual also complained about a lack of transparency that the recording was taking place.

The individual complained to the DPC about the parish church’s response to their concern around the use of live streaming and recording for funeral services. In examining the complaint, the DPC engaged with the parish church to ascertain their lawful basis for processing and to seek clarification on their response to the complaint. The parish church informed the DPC that live streaming of funeral services was used during Covid-19 restrictions and that they record funeral services when requested to do so by family members, as happened in this case, usually where a family member cannot attend the funeral. The parish church informed the DPC that they use one camera in a fixed location to make these recordings and for live streaming, and that they remove the recordings from their website at the end of 30 days.

The parish church apologised to the individual for any distress caused, and in particular for not informing the individual of the 30-day retention period. The parish church informs attendees at the beginning of services that they will be live streamed and has signs with this information at their entrance doors. The parish church implemented changes as a result of this complaint, including informing attendees during a service that it is being live streamed, including information on their live streaming and recording in parish newsletters and on their website, only responding to written requests for recordings, and password-protecting the recordings in future.

The DPC wrote to the individual and advised them under section 109(5)(c) of the 2018 Act that the parish church and those unable to attend a funeral service had a legitimate interest in viewing the service by live stream or recording. The DPC noted the 30-day retention period of the footage, the fixed and restricted view of the camera, and the changes the parish church had made arising from this complaint, including requiring a request for a recording to be made in writing and password-protecting these recordings. The DPC advised the individual that the response of the parish church was reasonable in the circumstances of this complaint and noted that the recording was requested by another family member of the deceased. Nevertheless, the DPC recommended under section 109(5)(f) of the 2018 Act that the parish church update the privacy policy available on its website with more information on the live streaming and recording of funeral services.

 

4)  Case Study 4: Use of location data to verify expense claims

The complainant in this case study was a former employee of a statutory service provider, whose work involved driving to locations assigned by his employer. Where this gave rise to claims for overtime or subsistence, the complainant would complete forms provided by the employer, detailing items such as relevant dates and places, dispatch reference numbers, and the amounts claimed. The employer made use of a dispatch system intended to ensure the most efficient use of drivers and vehicles, particularly as they provided responses in emergency situations. This system logged the performance and completion of service calls, when vehicles were out on calls or back at base, and when drivers were on or off duty.

The complainant had made a claim for overtime and subsistence. The employer rejected this because of inconsistencies between the details on the complainant’s claim form and those recorded on the employer’s dispatch system. The complainant objected to the use of data from the dispatch system for this purpose and complained to the DPC.

The DPC considered whether the use of data from the dispatch system to verify overtime and subsistence claims was in line with fair processing requirements. The fairness of the processing was to be assessed by reference to whether the complainant and fellow employees had been made aware of the employer’s use of the data for that purpose, whether that processing was compatible with the purpose for which the data was collected, and whether the employer had a legal basis for that processing.

The employer did not have a written policy on the use of the dispatch system. Instead, it relied on the “general awareness” of employees that the system was used for that purpose. The employer pointed out that such use had been noted in an arrangement with its employees’ trade unions some years previously. The DPC noted that overtime and subsistence claims required employees to include relevant dispatch reference numbers from the dispatch system. The DPC took the view that the inclusion of relevant dispatch system reference numbers in overtime and subsistence claims indicated that employees were aware that the data was used not just for logistical processing but also to verify their claims. Even if the major purpose of the dispatch system was to aid logistics, its use to verify overtime claims was not incompatible with that purpose, as that data was the only means available to the employer to verify claims.

The DPC noted that applicable financial regulations required the employer to verify overtime and subsistence claims. The processing to verify overtime and subsistence claims was necessary not just to comply with that legal obligation, but also to perform the complainant’s employment contract and for the purposes of the employer’s legitimate interests.

This case is an example of when data collected for one legitimate purpose – in this case, logistical control – may be appropriately processed for another, in this case verifying overtime claims. However, controllers should bear in mind the overarching requirement to process personal data fairly and must ensure that data subjects are made aware of what data is collected, and the nature and purpose of the processing. Equally important is that the processing have a legal basis, which in most cases will require that the processing is necessary for the stated purpose.

 

5)  Case Study 5: Unauthorised disclosure in a workplace setting

The complainant alleged that insecure processing by his former employer had made his personal data accessible to unauthorised persons, including former colleagues and external third parties.

The complainant was in a legal dispute with the company arising from his dismissal. In connection with that dispute, the company had prepared documents including an internal investigation report and a legal submission to the Workplace Relations Commission (WRC). While the WRC submission did not contain a great deal of the complainant’s personal data, the internal investigation report did.

Approximately one month before the complainant first contacted the DPC, the company had notified the DPC of a data breach. The notification stated that the WRC submission had been inadvertently stored on a folder accessible by all employees, rather than on one that was accessible only by authorised HR staff. The error was noticed and corrected two days later, and the company notified the DPC shortly thereafter. The company’s systems did not record whether, when or by whom the WRC submission might have been accessed, or whether it had been copied or printed.

In the complaint, the complainant alleged that the breach affected not just the WRC submission but also the internal investigation report, and that these had been accessible from all parts of the company’s intranet, including on a device that could be used by both employees and visitors to the company’s premises. The complainant submitted statements from former colleagues who described having access to documents relating to “the internal investigation.” The company denied that the internal investigation report had ever been accessible by unauthorised persons.

It also maintained that, while the WRC submission had been inappropriately available for a short time on the company’s intranet, it was not on a part of it accessible to non-employees.

The DPC addressed two main issues: what had been the content and extent of the breach, and whether the company’s security measures had met the standard required by applicable data protection legislation.

The complainant’s former colleagues had said that documents concerning “the internal investigation” had been accessible by them. However, these statements had not described in any detail the nature or contents of the documents, did not say when or by whom they had been seen, and did not say that the documents were accessible by non-employees. Against that, the company had consistently maintained that the WRC submission, but not the internal investigation report, had been inappropriately accessible to employees for a number of days. Significantly, the company had notified the DPC of that approximately one month before the complainant had first lodged his complaint. The DPC took the view that there was insufficient evidence to support the claim that the internal investigation report had been disclosed, or that the complainant’s personal data had been accessible by non-employees as well as unauthorised employees.

Concerning the company’s security measures, the DPC noted that the applicable standard had to reflect and mitigate the harm that could be caused by relevant risks including, as in this case, disclosure to unauthorised persons. The company was clearly aware of the risk of disclosure, as it had arranged for confidential documents to be stored in a way that gave access only to authorised HR staff.

However, the company had failed to properly anticipate and mitigate the risk of human error in storing such documents, as had happened to the WRC submission. The DPC also reminded the company of the need to ensure that relevant personnel are aware of the need to handle personal data in accordance with applicable security measures, and to respond to breaches accordingly. This case illustrates how data controllers must consider all risks that can arise when they process personal data, including the risk of human error. The measures that they adopt to address those risks must reflect not just the possible causes of loss or harm, but also the consequences of a breach, and the ways in which those consequences can be minimised or remedied.

 

6)  Case Study 6: Lack of appropriate security measures - unauthorised disclosure in a workplace setting

The DPC received a complaint against an employer, a manufacturing company, asserting that the complainant’s private information, including attendances with the company doctor, details of a personal injury claim being pursued against the company and details of a disciplinary procedure taken against the complainant, had been placed on the company’s shared ‘C-Drive’, available to be viewed by anyone within the company, and that a copy of the data on a CD-ROM had also been left on the complainant’s desk.

It became apparent during the examination of the complaint that a number of workplace computers had been used to access the data on the shared drive, which the company stated was downloaded, copied or sent to an external email address. The organisation advised that it had carried out an investigation of the incident, as a result of which two employees identified as having had a significant role in the incident had their employment terminated, and that An Garda Síochána had been notified about the incident. The company notified the DPC of the breach incident, outlining that certain data had been accessed and viewed by at least two of its employees.

It was stated that the data was being transferred internally from its Human Resources (HR) department to its Legal department due to the imminent departure of one of its HR employees. During the transfer a large volume of electronic files relating to legal cases involving a large number of individuals had the potential to be accessed and viewed by employees who would not ordinarily have access to these.

The implementation of measures to protect and secure personal data is a foundational principle of data protection law, particularly in terms of ensuring there is no unauthorised access to or destruction of personal data.

With regard to this specific complaint, the DPC observed firstly that the information in respect of the complainant which was disclosed as part of the data breach included very sensitive information constituting “special category data”, in circumstances where special category data includes “data concerning health or data concerning a natural person’s sex life”.

The information (examples of which were provided to this office) included details of attendances with the company doctor which revealed very personal and sensitive information about the complainant’s physical health, mental health and personal circumstances. It was noted that this information was being maintained by the company in the context of legal proceedings/claims being taken by the individual. Given the nature of the information, there was a particularly strong onus on the company to ensure that only those who needed access to such information were granted it and were able to access and process it.

The issue in this complaint was the placing of files including the complainant’s personal information on a shared drive accessible to all employees. The DPC considered that due regard was not given to the sensitivity of the information contained in the files and the risks entailed in making them available to any employee of the company, even if this was only for a very short period of time. It would seem that the decision to transfer the files to the shared drive was taken for pragmatic reasons, i.e. the company confirmed it was executed in this manner because the files were too large to be sent by email.

However, this did not justify placing the files somewhere where any employee of the company would be able to access them, particularly given the risk of harm to the data subject if colleagues were able to find out very personal and sensitive information which the complainant may, quite legitimately, not have expected or wanted other employees to know, save to the extent that it was strictly necessary for a limited number of employees to know in relation to the legal proceedings/claims between the data subject and their employer. Moreover, there were a number of alternative options for transferring the files to the Legal department which would not have presented the same risk to the security of the personal data, including placing the files in a folder, whether on the shared drive or otherwise, to which access was restricted to limited individuals. That such alternative options might have been more time-consuming or difficult to implement was no justification for placing the files on the shared drive with unrestricted access for other employees.

The fall-out from the failure to protect personal data in this case was considerable, giving rise to legal proceedings against the company by the affected individual and the loss of two long-term employees who were dismissed, not to mention the impact on the individual whose data was disclosed.

7)  Case Study 7: Delisting request made to internet search engine

(Applicable Law – GDPR & Data Protection Act 2018)

A data subject made a complaint against an internet search engine regarding the search engine’s response to their delisting request. The complaint concerned two URLs that appeared as results to searches of the individual’s name on the search engine. During the handling of this complaint, the individual included one further URL that they sought the search engine to delist.

The criteria to be applied by search engines are that delisting must occur if the results are irrelevant, inadequate or excessive. A case-by-case balancing exercise must be conducted by the search engine, balancing the rights of those seeking access to information against the rights of the individuals affected by the search results.

The individual had originally personally engaged with the search engine seeking delisting of the URLs because the individual argued the URLs contained defamatory content, making it unlawful to process them, and that the URLs were impacting on the individual’s private and professional life given their content. The search engine operator refused to delist the URLs because they related to information about the individual’s professional life and there was a public interest in accessing this information.

The DPC engaged with the search engine operator regarding their refusal to delist. The search engine operator relied on the legitimate interest of third parties to access the information in the URLs. No defamation proceedings had been pursued by the individual against the original publishers of the relevant content and so it was not possible to definitively decide the question of whether content in the URLs was defamatory or not.

That being said, during the course of the handling of this complaint by the DPC, the search engine operator delisted the URLs in Ireland alone based on the defamation arguments of the individual. The individual continued with their DPC complaint seeking delisting across Europe and not just Ireland. Further, the webpages underlying all of the three URLs were deactivated by the webmaster during the handling of this complaint.

Article 17(3)(a) of the GDPR states that the right to be forgotten will not apply where the processing of personal data is necessary “for exercising the right of freedom of expression and information”. In examining this complaint, the DPC noted that the information contained in the webpages - the subject of the individual’s complaint - relates to previous business conduct by them relevant to their professional life. The individual continues to engage in the same professional sphere and activities. The individual accepted this by arguing the content was impacting their professional life. The individual argued the content was inaccurate because it was defamatory. The DPC noted that a significant majority of the content the individual said was inaccurate was a blog post and comments of third parties and related to their professional activities, appearing to be the opinions of third-party commentators.

The DPC concluded that if a third party were to consider the webpages the subject of this complaint, it would be clear that the comments were made as user-generated content and represent third-party opinions rather than appearing as verified fact. The role of the search engine in listing results is not to challenge or censor the opinions of third parties, unless listing the results gives rise to personal data processing on the part of the search engine that is irrelevant, inadequate or excessive.

The DPC concluded that given the individual’s business role and role in public life arising from their professional life, there is a public interest in accessing information regarding their professional life within the European Union. The DPC wrote to the individual and under section 109(5)(b) of the 2018 Act dismissed the individual’s complaint based on the above considerations.

 

8)  Case Study 8: Department of Employment Affairs and Social Protection – Independence of the DPO

(Applicable Law – GDPR & Data Protection Act 2018)

The DPC commenced this Inquiry after receiving a complaint from Digital Rights Ireland alleging interference with the independence of the Data Protection Officer (DPO) in the Department of Employment Affairs and Social Protection (DEASP) (now the Department of Social Protection – D/SP) in the context of the D/SP’s amendment to its Privacy Statement on 6 July 2018, in which it removed the only reference to its processing of biometric data from the Statement. The decision considered whether the Department’s DPO was involved in the issue of amending the Privacy Statement in a proper and timely manner in accordance with Article 38(1) of the GDPR; and whether the DPO received instructions regarding the exercise of his tasks contrary to the requirements of Article 38(3) of the GDPR.

The scope of the inquiry did not concern whether the Department’s amendment complied with its transparency obligations under the GDPR. Having regard to all of the relevant information, the DPC found that the Department involved their DPO, properly and in a timely manner, in the Department’s amendment to its Privacy Statement as implemented on 6 July 2018. Therefore, the Department did not infringe Article 38(1) of the GDPR in the circumstances. The decision also found that the Department did not provide any instructions to the DPO regarding the exercise of the tasks referred to in Article 39 of the GDPR in respect of the Department’s amendment to its Privacy Statement as implemented on 6 July 2018. Therefore, the Department did not infringe Article 38(3) of the GDPR in the circumstances.

 

9)  Case Study 9: Data restrictions – absence of consent from all parties (Law Enforcement Directive)

In one case examined by the DPC, a parent applied to An Garda Síochána for copies of the personal data of his young children.

An Garda Síochána refused to supply the data. The DPC advised the parent that it agreed with the restriction imposed, as the controller in this case had particular knowledge of all of the circumstances pertaining to a shared guardianship arrangement in place and considered that consent of all legal guardians would be required in order to release the data in this case.

 

10)  Case study 10: Data restrictions – third-party data; opinion given in confidence (Law Enforcement Directive)

The DPC examined a case where restrictions to access were imposed by An Garda Síochána on the basis of Sections 91(7) and (8) of the Data Protection Act 2018.

The matter related to an individual seeking copies of allegations of abuse made against him with regard to the welfare of his parents. Having examined this matter, it was clear to the DPC that releasing the information would entail the release of third-party data and would reveal the identity of the person making the allegations. The DPC was satisfied on review that the information sought was provided in the strictest of confidence and considered the provisions of Section 91(9)(a) also applied.

 

11)  Case Study 11: Data restrictions – prosecutions pending (Law Enforcement Directive)

The DPC frequently examines complaints in relation to restrictions imposed by An Garda Síochána and the Director of Public Prosecutions (DPP) due to criminal prosecutions pending. Complaints range from assault cases where documentation such as PULSE records, photographs and An Garda Síochána reports of the incidents are sought, to requests for CCTV footage from within An Garda Síochána stations themselves.

In some cases, An Garda Síochána may supply an individual with a copy of the statement provided by that individual but will withhold other data on the basis of Section 94(3)(a) of the Act, whereby a data controller may restrict access, wholly or partly, for the purposes of “the prevention, detection or investigation of offences, the apprehension or prosecution of offenders or the effectiveness of lawful methods, systems, plans or procedures employed for the purposes of the matters aforesaid.”

Upon confirmation by a data controller that criminal prosecutions are pending, the DPC will advise an individual that once legal matters in relation to those cases are concluded, the individuals may re-apply for a copy of their data as set out in Section 91 of the Data Protection Act 2018.

 

12)  Case Study 12: Access restrictions (Law Enforcement Directive)

The DPC received a complaint from an individual who alleged they were a victim of a crime. The individual requested to have their sensitive personal data processed by An Garda Síochána (AGS) according to their specific terms, namely they requested to have a full copy of the medical results of forensic tests undertaken by Forensic Science Ireland (FSI) made available to them immediately upon receipt of the results by AGS. The individual then sought to have the sample kit split, with this request subsequently amended to seeking the analysis of specific sample vials.

The DPC noted that the entire process of seeking the analysis of forensic samples, following the alleged crime, was initiated by the individual data subject. In order to proceed with the forensic tests, the individual was required to complete a form entitled ‘Consent for Release of Stored Forensic and a Legal Report to the Custody of An Garda Síochána’. The DPC determined that any personal data processed by AGS in the context outlined would fall under the Law Enforcement Directive (EU) 2016/680 as transposed in the Data Protection Act.

AGS advised the DPC that in cases where an individual submits their personal data to AGS and FSI for further testing, any related further processing by AGS and FSI is carried out for the purposes of the prevention, investigation, detection or prosecution of criminal offences, or the execution of criminal penalties.

Thus, a report issued by Forensic Science Ireland to AGS, is governed by the provisions of Section 94 of the Act, which sets out restrictions on access that may be imposed by a data controller, including a restriction to avoid prejudicing an investigation. Having examined the matters raised, the DPC advised the individual that the Law Enforcement Directive (EU) 2016/680 as transposed in Parts 5 and 6 of the Act does not provide for individuals to stipulate the conditions under which data subjects consent to have their personal data processed by a law enforcement authority.

In relation to the processing of forensic samples in a law enforcement context, the DPC was satisfied the processing of sensitive data was in compliance with sections 71 and 73(1)(b)(i) of the Act. The DPC noted the ‘Consent for Release of Stored Forensic and a Legal Report to the Custody of An Garda Síochána’ form specified all the intended recipients of the data, as well as the fact that the findings of the laboratory tests and the legal report could also be released to the courts for use in evidence. The DPC recommended the addition of a Data Protection Notice to the form, to allow data subjects to obtain detailed information on the legislative framework and procedures governing the conditions of processing in relation to forensic samples and AGS investigations.

 

13)  Case Study 13: Prosecution of Three Ireland (Hutchison) Limited (ePrivacy)

In February 2021, the DPC received one complaint from an individual concerning unsolicited marketing electronic mail they had received from the telecommunications company Three Ireland (Hutchison) Limited. The complainant opted out of receiving marketing emails in mid-February 2021. In response to the DPC’s investigation, Three Ireland (Hutchison) Limited explained that when it attempted to execute the opt-out request, an issue arose whereby two records were sent simultaneously and processed out of sequence, resulting in its system not being updated correctly. As a result, three further marketing emails were sent to the complainant in the following weeks. Three Ireland (Hutchison) Limited stated that it remedied the matter by implementing a script to resolve differences between permissions data. It also set up an email alert to monitor the script and raise an alert should the script stop working.
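
The reconciliation measure described by Three Ireland (Hutchison) Limited can be illustrated, in outline only, by the following sketch. The record structures, field names and alerting mechanism are assumptions for the purpose of illustration and are not drawn from the company’s actual systems.

# Illustrative sketch only: reconciles a log of opt-out requests against a
# marketing-permissions table and raises an alert if the job itself fails.
# All structures and names here are assumptions, not the controller's systems.
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class OptOutRequest:
    customer_id: str
    requested_at: datetime

def reconcile(permissions: Dict[str, bool],
              opt_outs: List[OptOutRequest]) -> List[str]:
    """Return customer IDs whose opt-out was recorded but whose
    marketing permission is still (incorrectly) enabled."""
    inconsistent = []
    for request in opt_outs:
        if permissions.get(request.customer_id, False):
            # An opt-out request exists but the permissions table still says
            # "yes": correct the record and note the discrepancy.
            permissions[request.customer_id] = False
            inconsistent.append(request.customer_id)
    return inconsistent

def run_job(permissions, opt_outs, alert):
    try:
        fixed = reconcile(permissions, opt_outs)
        if fixed:
            alert(f"Reconciliation corrected {len(fixed)} permission record(s): {fixed}")
    except Exception as exc:
        # The monitoring alert described in the case study: flag a failed run.
        alert(f"Permissions reconciliation job failed: {exc}")
        raise

if __name__ == "__main__":
    perms = {"C-1001": True, "C-1002": False}            # True = marketing allowed
    requests = [OptOutRequest("C-1001", datetime(2021, 2, 15))]
    run_job(perms, requests, alert=print)                 # prints the discrepancy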

The DPC had previously prosecuted Three Ireland (Hutchison) Limited in 2020 and 2012 for breaching Regulation 13 of the ePrivacy Regulations in relation to previous complaints. Accordingly, the DPC decided to proceed to another prosecution arising from this complaint case.

At Dublin Metropolitan District Court on 6 September 2021, Three Ireland (Hutchison) Limited pleaded guilty to two charges under Regulation 13(1) of the ePrivacy Regulations. The District Court applied the Probation of Offenders Act 1907, on the basis of a charitable donation of €3,000 to Little Flower Penny Dinners. Three Ireland (Hutchison) Limited agreed to discharge the DPC’s legal costs.

 

14)  Case Study 14: Prosecution of Vodafone Ireland Limited (ePrivacy)

In August 2019, March and September 2020, the DPC received three complaints from individuals regarding unsolicited marketing telephone calls, text messages and emails they had received from Vodafone Ireland Limited. In response to the DPC’s investigation of the first complaint, Vodafone Ireland Limited explained that the former customer had called Vodafone Ireland Limited on seven separate occasions to try to opt out of receiving marketing phone calls to their mobile phone. On each occasion the agent they spoke to did not follow proper procedures and this resulted in the former customer not being opted out of marketing and receiving further marketing calls. The complainant closed his account with Vodafone Ireland Limited and switched to another operator due to the marketing phone calls he received.

In the other two cases, the complainants are existing customers of Vodafone Ireland Limited. In one case the customer received a marketing call to their mobile phone number in February 2019 and during that call the customer told the caller that they did not want to receive further marketing calls. Despite this request, Vodafone Ireland Limited subsequently made a further twelve marketing phone calls to the complainant’s mobile phone as its agent did not take any action to change the complainant’s marketing preferences.

In the other case, the complainant completed a transfer of ownership form on which they clearly set out their marketing preferences not to receive any marketing communications from Vodafone Ireland Limited. The agent handling the transaction failed to follow a process to input the customer’s marketing preferences. As a result, the customer subsequently received a further fourteen unsolicited marketing messages – seven emails and seven text messages.

The DPC had previously prosecuted Vodafone Ireland Limited in 2019, 2018, 2013 and 2011 for breaching Regulation 13 of the ePrivacy Regulations in relation to previous complaints. Accordingly, the DPC decided to proceed to another prosecution arising from these complaint cases.

At Dublin Metropolitan District Court on 6 September 2021, Vodafone Ireland Limited pleaded guilty to seven charges under Regulation 13(1) and 13(6)(a) of the ePrivacy Regulations. The District Court convicted Vodafone Ireland Limited on seven charges and imposed fines totalling €1,400. Vodafone Ireland Limited agreed to discharge the DPC’s legal costs.

 

15)  Case Study 15: Request for footage from online meeting (Access Complaints)

An individual participated in a Zoom meeting, the AGM of a sporting club, which was recorded by the data controller. The individual made an access request for a copy of this recording. The data controller refused the request, stating that it did not fall within the remit of the GDPR. The individual believed the data contained in the recording was their personal data. The data controller stated the video recordings of the AGM were no longer accessible due to corruption while saving and the inexperience of the data controller in using this remote video hosting software. However, they stated the minutes of the meeting would be available for viewing within a matter of weeks.

At this time, the DPC proposed the conclusion of this case in light of the apparent inaccessibility of the videos sought by the individual, but the individual did not agree with this approach, stating that video conferencing of the kind used during the AGM had been common practice for the data controller for some time and so it seemed unlikely to the individual that the difficulties described by the data controller would have occurred. Upon further questioning by the DPC, the data controller confirmed that video footage was in fact available, but advanced Article 15(4) of the GDPR as a reason for restricting it. The data controller was now stating that the footage of third parties visible in the recording could be considered third-party data and that the individual was not entitled to this. However, they were willing to provide written transcripts of the footage to the individual. The DPC contested this, coming to the opinion that, in light of the public nature of the original recordings as part of an AGM, they were made with the participants’ understanding that they could be considered accessible at a later date.

Further issues arose when the individual received written transcripts of the video. The individual claimed that the transcripts were inaccurate and did not reflect the contents of the original video.

In light of this, the DPC contacted the data controller once again, both highlighting the DPC’s opinion regarding the advancement of Article 15(4) and seeking sight of the video from which the transcript had been made. The data controller provided the audio of the video only. Upon assessment, it was clear that the transcript was an accurate reflection of the video’s audio content. The DPC recommended that, in order to facilitate an amicable resolution at this stage, the data controller should release the same audio content, previously provided to the DPC, to the individual. The data controller complied, but the individual was still not satisfied, once again restating their request for sight of the video content. Upon further request by the DPC to state the exemption it relied on to restrict access to the video content, the data controller decided to release the full video content to the individual. The DPC did not receive a copy of the full video content, and so was unable to directly assess whether there was any disparity between it and the audio provided. However, upon confirmation of its receipt, the individual stated they were satisfied with its content and this matter was thus concluded amicably.

The above case involved extensive communication between the DPC, the data controller and the individual. This matter could have been resolved by the data controller if they had released the requested video footage on receipt of the access request. If the data controller had been aware of its obligations under the GDPR in the first instance, this case would not have been lodged with the DPC.

 

16)  Case Study 16: Exemptions applied to CCTV footage (Access Complaints)

The DPC received a complaint from an individual regarding an access request made to the data controller, a retailer. The solicitors acting for the individual in relation to a personal injury claim had submitted the access request relating to a two-week period during which the alleged incident had taken place. They were seeking records of the incident, including CCTV footage. Data was released, but the individual identified that the CCTV footage, the accident report form and witness statements had not been released. In responding to the individual’s query in relation to these items, the data controller advised that it was restricting access to the items as this was necessary to avoid any obstruction or impairment of the legal proceedings and/or the operation of legal privilege.

This complaint was identified as potentially being capable of amicable resolution under Section 109 of the Data Protection Act 2018, with both the complainant and data controller agreeing to work with the DPC to try to amicably resolve the matter.

The DPC advised the data controller to prepare a list documenting any items to which the organisation was applying an exemption, together with the exemption on which it was relying. On receipt of the list, the DPC probed the exemptions being used and asked the organisation to demonstrate how it had ensured the restriction was necessary and proportionate. The DPC also requested samples of the documents to be released so it could examine how the exemptions were being applied.

Upon investigation, the DPC identified that the documents did contain some personal data of the individual and requested the data controller to release them with relevant redactions. In relation to the CCTV footage, the DPC stated that the primary reason for capturing the data was for security purposes and not for the defence of a litigation claim, and therefore requested that the footage be released to the individual with relevant redactions. The DPC accepted that the remaining exemptions were being validly applied as provided for by the legislation.

 

17)  Case Study 17: Amicable resolution in cross-border complaints - access request to Airbnb

The DPC received a complaint in September 2020 relating to a request for access (under Article 15 of the GDPR) that the complainant had made to Airbnb Ireland UC (“Airbnb”). The complaint was made directly to the DPC by an individual based in Malta. Upon assessment by the DPC, the complaint was deemed to be a cross-border one because it related to Airbnb’s general operational policies and, as Airbnb is available throughout the EU, the processing complained of was therefore deemed to be of a kind “…which substantially affects or is likely to substantially affect data subjects in more than one Member State” (as per the definition of cross-border processing under Article 4(23) of the GDPR).

The complainant submitted an access request to Airbnb. Airbnb facilitated this access request by providing the complainant with a link to an access file containing his personal data. However, when the complainant tried to use the link, it was not operational. In addition, the complainant was frustrated with the difficulty they faced in contacting Airbnb in relation to this matter. The complainant submitted their complaint to the DPC on this basis.

The DPC contacted Airbnb and asked that it facilitate the complainant’s request. The DPC specified that Airbnb should ensure any links it sends to complainants are fully tested and operational.

In reply, Airbnb explained that once it was informed that the initial link it sent to the complainant was not operational, it sent a renewed link to the complainant and was unaware that the complainant had had any difficulty in accessing this second link. Nonetheless, in the interests of amicably resolving the complaint, Airbnb agreed to provide an additional link to an access file to the complainant and for an encrypted file to be sent to the complainant via secure email.

As a result, the matter was amicably resolved pursuant to section 109(3) of the Data Protection Act 2018 (“the Act”), under which the complaint was deemed to have been withdrawn. This case study demonstrates the benefits to individual complainants of the DPC’s intervention by way of the amicable resolution process.

In this case, the DPC’s involvement led to the complainant being able to access his data. This case study illustrates how simple matters - such as links which do not operate properly - can often become data protection complaints if they are not managed appropriately at the front end by data controllers’ customer service and data protection teams.

 

18)  Case Study 18: Amicable resolution in cross-border complaints: Google (YouTube)

The DPC received a complaint in September 2020, via its complaint webform, against Google Ireland Limited (YouTube). The complaint was made by a parent acting on behalf of their child and concerned a YouTube channel/account. The YouTube channel/account had been set up when the child was ten years old and at a time when they did not appreciate the consequences of posting videos online.

Although the complaint was made directly to the DPC by an Irish resident, upon assessment it was deemed to constitute a cross-border complaint because it related to YouTube’s general operational policies and, as YouTube is available throughout the EU, the processing complained of was therefore deemed to be of a kind “which substantially affects or is likely to substantially affect data subjects in more than one Member State” (as per the definition of cross-border processing under Article 4(23) of the GDPR).

According to the complainant, the child no longer had control over the account as they had lost their passwords and the account was no longer in use. However, classmates of the child had discovered the videos previously posted by the child, which were now a source of embarrassment to the child. The parent of the child had engaged in extensive correspondence with Google, seeking inter alia the erasure of the account from the YouTube platform. The parent had provided the URL for a specific video on the account and for the account itself. The parent was informed by Google, on a number of occasions, that it had taken action and removed the content from the platform. However, the parent repeatedly followed up to note that the content had not in fact been removed and was still available online. As she considered that the complaint had not been appropriately addressed, she raised the matter with the DPC.

This complaint was identified as potentially being capable of amicable resolution under Section 109 of the Data Protection Act 2018, with both the individual and Data Controller agreeing to work with the DPC to try to amicably resolve the matter. The DPC investigated the background to the complaint and noted that it appeared that Google had removed a specific video from the account, for which the URL had been provided, but it had not removed the account in its entirety, with the result that further videos remained online.

The DPC communicated with Google on the matter and informed Google of the particular background of the complaint. Google immediately took action and removed the YouTube account in its entirety. Google confirmed that a misunderstanding had arisen as its support team had incorrectly assessed the URL for a specific video provided by the complainant, rather than the entire account.

The DPC informed the parent of the outcome and proposed an amicable resolution to the complaint. The parent thereafter informed the DPC that she had recently become aware of another YouTube channel that her child had created, which again was no longer in use and which the child wanted deleted. The DPC thus corresponded further with Google, and Google confirmed it had taken immediate action to remove the account and informed the parent of the actions it had taken.

This case highlights that the DPC can assist data subjects during the amicable resolution process in explaining their particular requests to a data controller, often at the appropriate level, when an individual has previously been unsuccessful in initial engagement with the data controller. This further allows the DPC to monitor the compliance of data controllers by taking note of any issues that may be repeated across other complaints.

 

19)  Case Study 19: Amicable resolution in cross-border complaints - Yahoo EMEA Limited

The DPC received a complaint in March 2021 from the Bavarian data protection authority on behalf of a Bavarian complainant against Yahoo EMEA Limited. Under the One-Stop-Shop (OSS) mechanism created by the GDPR, the location of a company’s main EU establishment dictates which EU authority will act as the lead supervisory authority (LSA) in relation to any complaints received. Once the lead authority is established, the authority that received the complaint acts as a concerned supervisory authority (CSA). The CSA is the intermediary between the LSA and the individual. In this case, the DPC is the LSA, as the company complained of has its main establishment in Ireland.

The complainant in this matter had lost access to his email account following an update on his computer. The complainant noted that he had engaged with Yahoo in order to regain access and was asked for information relating to the account in order to authenticate his ownership of it. The complainant asserted that he had provided this information. However, Yahoo informed the complainant that it could not verify his identity with the use of the information that it had been provided. The complainant was unclear which information he had provided was not correct and thus continued to give the same answers to the security questions. As Yahoo could not authenticate the complainant’s ownership of the account, it recommended that he create a new email account.

The complainant was not satisfied with this solution and thus made a complaint to his local supervisory authority, which referred the complaint on to the DPC in its role as Lead Supervisory Authority for Yahoo.

This complaint was identified as potentially being capable of amicable resolution under Section 109 of the Data Protection Act 2018, with both the individual and Data Controller agreeing to work with the DPC to try to amicably resolve the matter.

The DPC contacted Yahoo on the matter, and Yahoo took a proactive approach and immediately noted its desire to reach out to the complainant directly to seek to resolve the issue as soon as possible. Yahoo thereafter quickly confirmed to the DPC that its member services team had made contact with the complainant, who provided alternative information that enabled Yahoo to successfully validate the identity of the requester and subsequently restore their access to the account.

This case highlights that further direct engagement between the parties during the amicable resolution process can often achieve a swift resolution for data subjects. It further highlights that a proactive approach on the part of data controllers in the early stages of a complaint can often resolve matters and avoid the need to engage in a lengthy complaint handling process.

 

20)  Case Study 20: Repeated similar breaches

Over a period of 12 months, the DPC received notifications of a series of similar breaches from a data controller involved in financial matters. The controller sold services through a nationwide retail network owned and operated by a third party, which acted as its processor. The breaches occurred when existing customers of the controller made purchases at the processor’s outlets, but used an address different from the address they had previously registered with the controller. Recent changes to the controller’s customer database systems had not been fully coordinated with those for sales, resulting in sales documents containing personal data being sent to customers’ old addresses rather than their new ones. The controller had instructed the processor not to accept purchase requests until changes of address had been registered, but some counter staff did not consistently follow the correct procedures.

When the DPC flagged the pattern of breaches, the controller agreed that there was a systemic problem that required attention by its senior management. While a technical solution was being designed and tested, the controller and processor adopted interim measures including re-training of staff, increased supervision, and a notice that appeared on screens used by processor staff when effecting sales, prompting them to confirm that the customer’s current registered address was correct. The controller implemented the changes in its IT systems to prevent sales documents being sent to incorrect customer addresses, and the recurring breaches ceased.
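
The interim on-screen prompt and the subsequent system change can be thought of as a consistency check applied before sales documents are generated. The following sketch is a hypothetical illustration under assumed data structures; it is not the controller’s actual implementation.

# Illustrative sketch only: block dispatch of sales documents when the address
# supplied at the point of sale differs from the registered address.
# Field names and structures are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    registered_address: str

@dataclass
class SaleRequest:
    customer_id: str
    supplied_address: str

def normalise(address: str) -> str:
    # Crude normalisation for comparison purposes only.
    return " ".join(address.lower().split())

def can_dispatch_documents(customer: Customer, sale: SaleRequest) -> bool:
    """Only allow sales documents to issue once the supplied address
    matches the address registered with the controller."""
    return normalise(customer.registered_address) == normalise(sale.supplied_address)

customer = Customer("C-42", "1 Old Street, Dublin")
sale = SaleRequest("C-42", "99 New Road, Cork")

if not can_dispatch_documents(customer, sale):
    # In the interim measure described above, counter staff were prompted on
    # screen to confirm and, if necessary, register the change of address first.
    print("Address mismatch: confirm change of address before completing the sale.")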

This case demonstrates how the DPC monitors breaches notified under Article 33 of the GDPR to identify systemic problems, whether in individual controllers, industry types or economic sectors. It also shows how changes intended to improve information systems can have unforeseen side-effects that adversely affect data subjects and the controller. Lastly, it highlights that controllers must monitor the performance of processing agreements to ensure that processors clearly understand and follow procedures for processing personal data.

 

21)  Case Study 21: Unauthorised disclosure arising from video conferencing

An educational institute utilised a video conferencing application to allow students to deliver presentations to lecturers while pandemic restrictions prevented in-person meetings. To enable sharing with external examiners, which is a requirement, the presentations were recorded. All participants were aware of this arrangement, though it was not intended that students would have access to recordings of their presentations.

Two groups of students made presentations to lecturers in separate sessions. After each session, the lecturers discussed the students’ work among themselves. These discussions were also recorded, though the intention was to edit them out before sharing the recordings with external examiners. It was wrongly believed that saved recordings were accessible only to the lecturers. In fact, all invited participants, including the students who presented, had access to recordings of their sessions and were automatically emailed a link to the relevant file on the institution’s server. As a result, students gained access to lecturers’ discussion of other students’ work, which included personal remarks about some of the students.

These were accessed by several students. In the following days, excerpts were circulated on messaging applications and social media.

The organisation reported the breach to the DPC. The DPC confirmed that the recordings accessible to students had been deleted, and clarified the steps taken by the organisation to have the excerpts removed from the social media platforms to which they had been posted. The DPC concluded its assessment of the breach with comprehensive recommendations on the use of IT equipment, including video conferencing, and on measures to ensure that staff and students understood and complied with relevant data protection policies.

This case highlights the potential risks posed by the use of video conferencing and similar technologies. Data controllers should ensure that persons who operate these applications are familiar with how they work and ensure that they do so in compliance with data protection law. Controllers should ensure that data protection policies and procedures fully reflect the practices and technologies that they use when processing personal data.

 

22)  Case Study 22: Disclosure due to misdirected email

A notification was received from a statutory body whose functions include the investigation of complaints concerning experts’ professional conduct, training or competence. The personal data breach occurred when a letter concerning a complaint against a specialist was attached to an email and sent to an incorrect address. The attachment contained personal data of several persons, including health data, and was encrypted. However, the password for the encrypted letter was issued in a separate email to the same incorrect address.

The nature of the personal data and the context all indicated a high risk to data subjects. The DPC accordingly confirmed that all affected persons had been notified of the breach, the risks and measures being taken in response to them, as required by Article 34 of the GDPR. The DPC reminded the organisation of its continuing obligation to secure personal data that was accidentally disclosed, and of the importance of ensuring security when emailing personal data. The statutory body has undertaken a review of all its data protection processes, policies and procedures.

Misaddressed emails are one of the most common causes of breaches reported to the DPC. Encryption is a valuable tool that can help to protect against accidental disclosures. However, it is advisable to use a separate medium – such as a telephone call or SMS message – to send the password, as a single mistake in an email address can negate the benefits.
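
The principle of separating the encrypted file from its password can be illustrated by the following sketch. The addresses, SMTP host and send_sms function are placeholders and assumptions for illustration only; in practice a telephone call or the organisation’s own SMS provider would serve the same purpose.

# Illustrative sketch only: send an already-encrypted attachment by email and
# deliver its password over a separate channel (here, a hypothetical SMS
# gateway function). Addresses, host names and send_sms() are assumptions.
import smtplib
from email.message import EmailMessage
from pathlib import Path

def send_sms(phone_number: str, text: str) -> None:
    """Hypothetical SMS gateway call; a telephone call or the organisation's
    own SMS provider would be used in practice."""
    print(f"SMS to {phone_number}: {text}")

def send_encrypted_report(recipient_email: str, recipient_phone: str,
                          encrypted_file: Path, password: str) -> None:
    msg = EmailMessage()
    msg["From"] = "dpo@example.org"
    msg["To"] = recipient_email
    msg["Subject"] = "Correspondence (encrypted attachment)"
    msg.set_content("Please see the attached encrypted file. "
                    "The password will be sent to you separately by SMS.")
    msg.add_attachment(encrypted_file.read_bytes(),
                       maintype="application", subtype="octet-stream",
                       filename=encrypted_file.name)

    with smtplib.SMTP("smtp.example.org") as smtp:  # placeholder mail server
        smtp.send_message(msg)

    # The password travels over a different medium: a mistake in the email
    # address alone no longer exposes both the file and the means to open it.
    send_sms(recipient_phone, f"Password for {encrypted_file.name}: {password}")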

 

23)  Case Study 23: Inappropriate disposal of materials by an educational institution

A university focused on the health sciences notified the DPC of a breach arising from the inappropriate disposal of materials containing personal data. Due to pandemic restrictions, an employee worked from home on a recruitment project. The employee worked on printed copies of a number of job applications and accompanying CVs. The organisation had instructed employees working from home to minimise printing and to destroy documents before disposal. However, the employee placed the recruitment documents intact into a domestic recycling bin. High winds caused the contents of the bin, including the recruitment documents, to be dispersed.

In concluding its examination of the breach, the DPC made a number of recommendations. These focused not just on the work practices of employees, but most importantly on the technical and organisational measures of the controller. While it is important for staff to understand and implement good data protection practices, it is the responsibility of the controller to ensure that they do so and have the means – including, where appropriate, devices such as shredders - of delivering the required standard of protection.

This case also illustrates how working from home can change people’s work environment or habits in ways that can pose risks to personal data. Office facilities, such as confidential shredding, secure printing or even private rooms for discussions, are not always available or feasible at home. As the number of people working remotely increases, controllers must review and adapt their resources, policies and procedures to ensure that they are adequate for the risks posed and the environment in which they occur.

 

24)  Case Study 24: Email addresses disclosed via group mail

The DPC received a breach notification from a charity that supports people with intellectual disabilities. The breach occurred when an email newsletter was addressed to recipients using the CC field rather than the BCC field. The result was that the email addresses of all recipients were disclosed to those who read the email. This is a common type of personal data breach that is often the result of simple human error and that usually poses low risks. While the risks posed in this instance may not have been significant, further inquiries and an analysis of previous submissions to the DPC indicated poor awareness of data protection issues and responsibilities among the charity’s staff and volunteers.
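
The underlying point about keeping recipient addresses out of the visible headers can be illustrated by the following sketch, in which subscriber addresses are supplied only at the envelope level (the equivalent of using the BCC field). The addresses and SMTP host are placeholders for illustration only.

# Illustrative sketch only: distributing a newsletter without exposing the
# recipient list. Addresses and the SMTP host are placeholders.
import smtplib
from email.message import EmailMessage

def send_newsletter(subject: str, body: str, recipients: list[str]) -> None:
    msg = EmailMessage()
    msg["From"] = "newsletter@example.org"
    # The visible "To" header does not contain any subscriber's address.
    msg["To"] = "newsletter@example.org"
    msg["Subject"] = subject
    msg.set_content(body)

    with smtplib.SMTP("smtp.example.org") as smtp:  # placeholder mail server
        # Recipients are supplied only at the envelope level (equivalent to
        # BCC), so no subscriber sees any other subscriber's email address.
        smtp.send_message(msg, to_addrs=recipients)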

Following engagement with the DPC, the organisation introduced training on data protection for staff and volunteers, and moved to create a new management role with responsibility for data protection compliance across the organisation.

Charities frequently process personal data of vulnerable persons, often including special category data such as information concerning health. Data protection is a fundamental right in the European Union and protecting the rights of vulnerable persons requires care, planning and appropriate organisational measures. The hard work and goodwill of staff and volunteers must be matched by appropriate management and compliance resources to ensure the protection of personal data rights.
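
As a purely illustrative aside, the following minimal sketch (assuming Python's standard email and smtplib modules, which are not referenced in the case study) shows the underlying point: recipients supplied only as envelope addresses, rather than in the To or CC headers, are not visible to one another in the delivered newsletter.

    # Minimal sketch: send a newsletter without disclosing recipients to each other.
    # Addresses, sender and SMTP server are hypothetical placeholders.
    import smtplib
    from email.message import EmailMessage

    recipients = ["first@example.org", "second@example.org"]

    msg = EmailMessage()
    msg["From"] = "newsletter@charity.example"
    msg["To"] = "newsletter@charity.example"      # visible header points back to the sender
    msg["Subject"] = "Monthly newsletter"
    msg.set_content("Newsletter text goes here.")

    with smtplib.SMTP("smtp.charity.example") as smtp:
        # Recipients are passed only as SMTP envelope addresses,
        # so they never appear in the headers the recipients can see.
        smtp.send_message(msg, to_addrs=recipients)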

 

25)  Case Study 25: Social Engineering Attack

A medium-sized law firm reported that it was the victim of a social engineering attack. A staff member opened an email from a malicious third party that secretly installed malware on their computer. The malware enabled the attacker to monitor email communications and to defraud a client of a sum of money. The firm reported the breach to the DPC.

Through its engagement with the firm, the DPC established that the firm used a widely used cloud email service which was managed by a contractor. Basic security settings such as strong passwords were not properly enforced and multi-factor authentication was not implemented. Upon becoming aware of the incident, the firm immediately commissioned a full investigation to establish the root cause and the extent of the breach. Based on the findings of the investigation, the firm responded promptly and implemented further technical security measures, as well as additional cyber security and data protection training for all staff. The DPC requested that updates be provided on the implementation of appropriate organisational and technical security measures to prevent a recurrence of a similar breach.

This case demonstrates in stark terms that an organisation cannot assume that it has adequate measures in place simply because it uses an established service provider for functions such as email, or engages a third party to manage applications. Controllers and processors must still ensure that they have security measures that are appropriate to any risk that may be posed to the personal data for which they are responsible.
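
By way of illustration only, a minimal sketch of the kind of second factor that was missing in this case, assuming Python and the third-party pyotp package (neither of which is mentioned in the case study): a time-based one-time password is verified in addition to the account password.

    # Minimal sketch: verifying a time-based one-time password (TOTP) as a second factor.
    # Assumes the third-party "pyotp" package; enrolment and login flow are simplified.
    import pyotp

    secret = pyotp.random_base32()     # generated once per user at enrolment, stored server-side
    totp = pyotp.TOTP(secret)

    # At login, after the password check succeeds, the user must also supply
    # the current six-digit code from their authenticator app.
    submitted_code = input("Code from authenticator app: ")
    if totp.verify(submitted_code):
        print("Second factor accepted.")
    else:
        print("Second factor rejected.")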

 

26)  Case Study 26: Covid-19 Vaccination Status and Schools

During 2021, a number of teachers contacted the DPC to raise concerns about what they believed was the processing of staff members’ vaccination data by the Department of Education.

Schools across the country required staff members to complete ‘Self-Declaration Forms’ if they had been advised to restrict their movements, and to provide HSE/medical confirmation of this. One of the listed reasons a person could select to explain why they could not come into work was ‘I am a close contact of a confirmed Covid-19 case’. At the time the concern was raised with the DPC, public health advice indicated that a person who was fully vaccinated and had no symptoms did not need to self-isolate if they were a close contact. The concern raised with the DPC was that the Department of Education was seeking vaccination status by proxy.

On foot of the concerns raised, the DPC engaged with the Department of Education on the Self-Declaration Form, the reasons for collecting personal data and the legal basis the Department was relying on to carry out the processing. We were informed by the Department that the Self-Declaration Form was required to determine the absence duration of a staff member and the length of contracts to be offered to a substitute teacher.

The Department also cited the lawful basis for the processing and stated that there was no intention, implied or otherwise, to collect information on the vaccination status of school staff.

Based on the wording of the form and its stated purpose, the DPC concluded that the requirement for employees to complete a Self-Declaration Form did not constitute the processing of special category (health) data by the Department of Education. The DPC’s reason for coming to this conclusion was threefold:

1) The personal data recorded on the form did not constitute special category data.

2) The controllers (schools and Department of Education) were not systematically collecting or processing special category data for an identified purpose.

3) The controllers were not further processing the data collected in a manner that revealed or drew inferences about the health status of an individual (such as combining the data with other personal data).

 

27)  Case Study 27: TikTok and cooperation with other EU data protection authorities

During 2021, GDPR Article 61 mutual assistance requests were received by the DPC from the Dutch and French data protection authorities. Each of these requests asked the DPC to investigate further a number of concerns relating to TikTok’s processing of its users’ personal data, particularly that of child users.

The authorities concerned had been investigating TikTok before the company located its main establishment (EU headquarters) in Ireland in July 2020. In December 2020, the DPC assumed the role of TikTok’s lead supervisory authority, once the other EU supervisory authorities had satisfied themselves that TikTok’s main establishment was in Ireland.

As a result, the Dutch and French authorities concluded that they no longer had competence to investigate TikTok and accordingly transferred their investigation files, requesting that the DPC investigate further. These investigations, coupled with the DPC’s own identification of key concerns through active engagement with TikTok in 2021, led the DPC to commence two own-volition inquiries pursuant to Section 110 of the Data Protection Act 2018 in relation to TikTok’s compliance with the requirements of the GDPR.

 

28)  Case Study 28: Facebook Election Day Information feature

As reported in the DPC’s Annual Report for 2020, Facebook suspended its Election Day Reminder feature following the DPC’s request that Facebook implement a mechanism to ensure that information on how personal data is used be made available to users in an easily accessible form before a user decides whether or not to interact/engage with the feature. Of particular concern to the DPC was the lack of clarity from Facebook on whether any data generated by a user interacting with the feature would be used for targeted advertising and newsfeed personalisation.

In 2021, Facebook outlined to the DPC a number of changes made to the feature, renamed Election Day Information, to take account of the DPC’s recommendations. The changes included the prominent positioning of the ‘Learn More’ link to the feature-specific Help Centre article, and enhanced in-product transparency clarifying that Facebook does not use personal data collected through interactions with Election Day Information for advertising purposes and does not share such data with third parties.

 

29)  Case Study 29: Facebook View (Ray-Ban stories)

During 2021, Facebook, in association with Ray-Ban, launched smart glasses known as ‘Ray-Ban stories’. The glasses allow the wearer to take photos and videos of what they see, activated by a touch or voice command. The images can then be relayed via a Facebook companion app for storage or sharing on social media. While it is acknowledged that many devices, including smartphones, can record third-party individuals, it is generally the case that the camera or the phone is visible as the device by which recording is happening, thereby putting those captured in the recordings on notice. Ray-Ban stories instead rely on a small indicator light which switches on when a photo or video is being taken.

The DPC engaged with Facebook, highlighting issues around the visibility and duration of this light, and requesting that Facebook confirm and demonstrate that the LED indicator light is effective for its purpose. In response, Facebook made software changes which increase the brightness of the external LED. Facebook also responded to the call from the DPC and the Italian Garante to run an information campaign to alert the public as to how this new consumer product may give rise to less obvious recording of their images. Engagement with Facebook will continue into 2022. The DPC also continues to liaise with the Italian Garante in respect of any processing of personal data by Luxottica (the manufacturer of the glasses), which the Garante is competent to supervise.

 

30)  Case Study 30: Instagram user self-compromise

In May 2021, the DPC was made aware of incidents whereby users of Instagram were misled into providing their Instagram credentials to third-party apps, leading to their accounts being compromised.

Although no EEA users were affected by this particular incident, in order to reduce the likelihood of EEA users being misled and of similar incidents occurring, the DPC recommended that the data controller supplement the information provided to users with clear warnings as to the risks posed by these apps.

The controller subsequently updated the Instagram Help Centre articles to provide additional clarity to users about the consequences of allowing such apps to access their accounts and has consolidated all Help Centre articles into a single, dedicated Help Centre page.

 

31)  Case Study 31: Facebook Viewpoints

During the course of 2021, the DPC engaged with Facebook on the planned launch of Viewpoints in the EU.

Facebook states that Viewpoints is a new market research platform that rewards users for participating in research programmes, the results of which Facebook uses to build and/or improve its products and evaluate new market opportunities. As part of the ongoing cooperation between the DPC and the other EU/EEA Supervisory Authorities, including the French, Italian, Hamburg, Norwegian and Dutch authorities, the DPC communicated to Facebook a number of concerns about the GDPR and ePrivacy compliance of the Viewpoints product.

In December 2021, the DPC accordingly requested Facebook to review the schedule for further rollout in the EU/EEA of the Viewpoints app and the associated programmes/surveys, so that the DPC and other data protection authorities could further assess and engage with Facebook on the concerns raised. In response, Facebook agreed to pause the EU/EEA rollout of the programme.

 

 

  1. Unauthorised publication of a photograph (Amicable Resolution)
  2. No response received to subject access request (Amicable Resolution)
  3. Retention of a minor’s personal data by a State Agency (Amicable Resolution) (Applicable Law — Data Protection Acts, 1988 and 2003)
  4. Legal Privilege invoked to withhold personal data (Access Request Complaints)
  5. Attendance Monitoring and Facial Recognition at a secondary school (Direct Intervention)
  6. Handling an Irish data subject’s complaint against German-based Cardmarket using the GDPR One Stop Shop mechanism (Applicable law — GDPR & Data Protection Act 2018)
  7. The Operation of the Article 60 Procedure in Cross Border Complaints: Groupon
  8. Amicable Resolution in Cross Border Complaints: MTCH
  9. Amicable Resolution in Cross Border Complaints: Facebook Ireland
  10. Article 60 Non-response to an Access Request by Ryanair
  11. Purpose Limitation — Law Enforcement Directive
  12. Alleged disclosure of the complainant’s personal data by a local authority (Data Breach Complaint)
  13. Breach Notification (Voluntary Sector) — Ransomware Attack
  14. Breach Notification (Public Sector) Erroneous Publication on Twitter
  15. Breach Notification (Financial Sector) Bank Details sent by WhatsApp
  16. Breach Notification (12 Credit Unions) Processor Coding Error
  17. Vodafone seeks employment details from customers
  18. Facebook Dating
  19. Facebook Suicide and Self-Injury feature
  20. Facebook Election Day Reminder
  21. Google Voice Assistant Technology

1)  Case Study 1: Unauthorised publication of a photograph (Amicable Resolution)

The DPC received a complaint from an individual regarding the publication of their photograph in an article contained in a workplace newsletter without their consent. The data controller, who was the individual’s public sector employer, informed the individual that it should have obtained consent to use the photograph in the workplace newsletter as this was not the purpose for which the photograph was obtained. The data controller also informed the individual that a data breach had occurred in this instance.

This complaint was identified as potentially being amicably resolved under Section 109 of the Data Protection Act 2018, with both the complainant and data controller agreeing to work with the DPC to try to amicably resolve the issue.

The data controller engaged with the DPC on the matter, and advised that it had conducted an internal investigation and determined that a data breach had occurred and that consent should have been obtained to use the individual’s photograph in the workplace newsletter. The purpose(s) for which the photograph was initially obtained did not include publication in a newsletter. An apology from the employer was issued to the individual. However, the complainant did not deem this to be an appropriate resolution to the complaint at hand.

The DPC provided recommendations that a consent information leaflet be distributed to staff in advance of using photography, audio and/or video, and that a consent form for photography, audio and video be completed and signed prior to images or recordings being obtained, which the controller subsequently implemented.

Article 5(1)(b) of the GDPR states that personal data shall be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes (‘purpose limitation’)”. The DPC was satisfied that the data controller further processed the individual’s personal data without their consent (or other legal basis for doing so) when it published the employee’s photograph in the workplace newsletter. The DPC issued an outcome letter advising the complainant of same. The DPC was satisfied with the organisational measures subsequently introduced and, as such, no further action by the controller in this case was warranted.

In this case study, the risks to the fundamental rights and freedoms of the individual could not be deemed significant, but the processing nonetheless upset the individual and was an infringement of the GDPR in the circumstances. This underlines the need for all organisations to train staff, at all levels and in all roles, to be aware of the GDPR and to take account of its principles.

 

2)  Case Study 2: No response received to subject access request (Amicable Resolution)

The DPC received a complaint from an individual regarding a subject access request made by them to a data controller, an auction house whose platform the complainant had used to sell goods, for a copy of all information relating to them. No response was received from the data controller despite the individual issuing two subsequent reminders.

This complaint was identified as potentially being capable of amicable resolution under Section 109 of the Data Protection Act 2018, with both the complainant and data controller agreeing to work with the DPC to try to amicably resolve the matter. The data controller engaged with the DPC on the matter and informed us that, while it previously had a business relationship with the individual in 2016, it did not hold any information relating to them, as it had installed a new system in May 2018 and no data from before that date had been retained. It further informed the DPC that it had shredded all paper files and that its legal advisers had informed it that it was not required to retain them.

The data controller also provided the DPC with screenshots from its electronic system of the results of a search against the individual’s name, which did not identify any results to display. Article 12(3) of the GDPR states that “the controller shall provide information on action taken on a request under Articles 15 to 22 to the data subject without undue delay and in any event within one month of receipt of the request.”

Having examined the matter thoroughly, it was apparent to the DPC that the data controller contravened Article 12(3) of the GDPR as controllers have an obligation to provide a response to the individual’s subject access request within the statutory timeframe as set out in Article 12 of the GDPR, even where the controller is not in possession of any such data.

Regarding the individual’s subject access request, no further action on this matter was warranted as there was no evidence to suggest that any data relating to the individual was held by the data controller. The DPC issued advice to the data controller, reminding it of its obligations specifically under Articles 12 and 15, and of the requirement to provide information on actions taken in relation to a subject access request, even in circumstances where this is to inform an individual that it does not hold any data.

 

3)  Case Study 3: Retention of a minor’s personal data by a State Agency (Amicable Resolution) (Applicable Law — Data Protection Acts, 1988 and 2003)

In this case, the complainants involved had previously requested that an Irish state agency erase a file pertaining to an incident at school involving their young child, which had originally been notified to the agency. However, while the agency had decided that the incident did not warrant further investigation, it refused to erase the minor’s personal data, indicating that such files are retained until the minor in question reaches the age of 25.

The DPC requested that the state agency outline its lawful basis for the retention of the minor’s personal data. The agency provided this and cited its retention policy as stated to the complainants, but the DPC did not consider a blanket retention period applicable in the particular circumstances.

The DPC informed both parties of the amicable resolution process and both expressed a willingness to engage on same. After iterative engagement between the complainants and the controller to discuss the matter, the state agency confirmed to the complainants that the file containing their child’s personal data would be deleted.

 

4)  Case Study 4: Legal Privilege invoked to withhold personal data (Access Request Complaints)

The DPC dealt with a case which concerned an application by an individual to a hospital for their personal data. This individual had instructed their solicitor in relation to a negligence action against the hospital arising from care they received.

By the time the individual made a complaint to the DPC through their solicitor, the hospital had released some medical records, but the individual advised that they were awaiting non-clinical notes which the hospital was refusing to release on the basis that they were subject to litigation privilege. Specifically, the individual (who was represented by their solicitor in the complaint to the DPC) was of the view that various staff statements had been withheld. Through the complaint-handling process the DPC established that the staff statements had been prepared in the course of an internal review by the hospital of the care of the patient.

The DPC requested sight, on a voluntary basis, of the documentation withheld from the individual in response to the access request, in order to satisfy itself that the exemption from release had been validly applied to its contents. In circumstances where the statements had been prepared for the dominant purpose of an internal review, and no litigation had been commenced or threatened at the date of their creation, the DPC was not satisfied that litigation privilege applied and directed that they be released.

 

5)  Case Study 5: Attendance Monitoring and Facial Recognition at a secondary school (Direct Intervention) 

Following media reports regarding a facial recognition trial for attendance monitoring purposes in a secondary school, the DPC met with members of staff and the Board of Management of the school in February 2020.

The DPC outlined the data protection issues surrounding the use of biometrics data, specifically facial recognition technology, in an educational environment, including processing the data of minors. The DPC referred to the Swedish data protection authority’s first fine under GDPR, concerning a trial project in a secondary school where facial recognition technology was used to register student attendance.

The DPC stepped through the definition of biometric data as set out in Article 4(14) of the GDPR and highlighted additional GDPR provisions in Article 5 — Purpose limitation and data minimisation; Article 9 — Sensitive data; and Articles 35 and 36 — Data Protection Impact Assessment (DPIA) and Prior Consultation.

Subsequent to the meeting, the school provided the DPC with a full written report on the matter, including confirmation that it did not proceed to trial the attendance monitoring product in question. European data protection authorities have traditionally adopted strong positions with regard to facial recognition in schools and the use of biometric attendance systems in the education sector. In Ireland, the DPC regularly conducts inspections of schools where reports of biometric attendance systems or trials are received. The DPC considers that exposure to intrusive methods of surveillance without sufficient legal basis or justification can desensitise students at a young age to such technology and lead to them ceding their data protection rights in other contexts also.

 

6)  Case Study 6: Handling an Irish data subject’s complaint against German-based Cardmarket using the GDPR One Stop Shop mechanism (Applicable law — GDPR & Data Protection Act 2018) 

The DPC received a complaint from an Irish individual against Cardmarket, a German e-commerce and trading platform. The individual received an email from Cardmarket, notifying them that it had been hacked and that some of its users’ personal information may have been leaked. The individual alerted the DPC and submitted a complaint in relation to the breach.

Under the One-Stop-Shop (OSS) mechanism created by the GDPR, the location of a company’s main European establishment dictates which European authority will act as the lead supervisory authority in relation to any complaints received. Once the lead authority (LSA) is established, the authority that received the complaint acts as a concerned supervisory authority (CSA). The CSA is the intermediary between the LSA and the individual. Among other things, the reason for this separation is so that supervisory authorities can communicate with individual complainants in their native language. In this case, the Berlin DPA acted as the LSA, as the company had its main establishment in the Berlin territorial area. The DPC acted as a CSA, communicating with the Berlin DPA and transmitting updates in relation to the investigation (once they were translated from German to English) to the individual complainant in Ireland.

The Berlin DPA concluded its investigation into the breach and the individual’s complaint. It uploaded two draft decisions, one in relation to the overall breach which impacted many other users of the platform throughout Europe, and another in relation to the specific complaint which had been lodged by the Irish individual with the DPC and communicated to the Berlin DPA.

An important aspect of the OSS mechanism is that a CSA may comment on a draft decision issued by a lead supervisory authority. This is to ensure that European supervisory authorities apply the GDPR consistently, i.e. that a final decision reached by the Berlin DPA would reach the same conclusion as a decision of the DPC if the company had been located in Ireland and the DPC had investigated the complaint as the lead supervisory authority. The DPC was satisfied with the Berlin DPA’s draft decisions and did not consider it necessary to raise any points of clarification or requests for amendment on this occasion.

The draft decision in relation to the overall breach described a number of measures taken by the platform to address the breach and mitigate its adverse effects. The measures included taking its servers off its network and deleting all the data on them, as well as resetting all user passwords and ensuring that new passwords were stored using up-to-date hashing methods. The draft decision considered that a repetition of the incident was unlikely, and that the mass disclosure of passwords had been rendered practically impossible in light of the measures taken.
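
For illustration only, the following minimal sketch shows what storing passwords as salted one-way hashes can look like, assuming Python and the third-party bcrypt package (the decision itself does not specify which methods the platform used).

    # Minimal sketch: store only a salted hash of the password, never the password itself.
    # Assumes the third-party "bcrypt" package; the example password is hypothetical.
    import bcrypt

    password = "correct horse battery staple".encode("utf-8")

    stored_hash = bcrypt.hashpw(password, bcrypt.gensalt())   # persist only this value

    # At login, the submitted password is checked against the stored hash;
    # the original password cannot be recovered from the hash.
    assert bcrypt.checkpw(password, stored_hash)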

The DPC informed the individual of the outcome of the Berlin DPA’s investigation, providing them with a copy of the overall decision investigating the breach and the decision dealing with their specific complaint.

This case illustrates the challenging handovers involved in the OSS mechanism established by the GDPR. It demonstrates the depth of cooperation between European supervisory authorities required for the consistent application of the GDPR in Europe.

 

7)  Case Study 7: The Operation of the Article 60 Procedure in Cross Border Complaints: Groupon 

The DPC received a complaint in July 2018 from the Polish data protection authority on behalf of a Polish complainant against Groupon International Limited (“Groupon”). The complaint related to the requirements that Groupon had in place at that time to verify the identity of individuals who made data protection rights requests to it. In this case, the complainant alleged that Groupon’s practice of requiring them to verify their identity by way of electronic submission of a copy of a national identity card, in the context of a request they had made for erasure of personal data pursuant to Article 17 of the GDPR, constituted an infringement of the principle of data minimisation as set out in Article 5(1)(c) of the GDPR, in circumstances where there was no requirement to provide an identity document when a Groupon account was created. In addition, the complainant alleged that Groupon’s subsequent failure to act on the erasure request (in circumstances where the individual objected to providing a copy of their national identity card) constituted an infringement of their right to erasure under Article 17.

The DPC commenced an examination of the complaint upon receipt of same. In the course of its correspondence with Groupon on the matter, it became clear that Groupon’s policy of requiring a requester to provide a copy of a national identity card, which had been in place since before the GDPR came into force (and which was in place at the time of the complainant’s erasure request), had been discontinued in October 2018. In its place, Groupon had implemented an email authentication system which allowed Groupon users to verify their account ownership. The DPC attempted to amicably resolve the complaint (pursuant to section 109(2) of the Data Protection Act 2018), but the complainant was unwilling to accept Groupon’s proposals in respect of same. As such, the matter fell to be decided by way of a decision under Article 60 of the GDPR.

 

(i) Initial Draft Decision

The first step in the Article 60 process entailed the DPC preparing a draft decision in respect of the complaint. In its initial draft decision, the DPC made findings of infringements of Articles 5(1)(c) and 12(2) of the GDPR by Groupon. The DPC provided the draft decision to Groupon to allow it to make submissions. Groupon subsequently provided a number of submissions, which (along with the DPC’s analysis thereof) were taken into account in a further version of the draft decision.

 

(ii) Provision of Initial Draft Decision to Concerned Supervisory Authorities

The second stage in the Article 60 process involved the DPC’s initial draft decision being uploaded to the EU’s Internal Market Information System (IMI) to be circulated amongst the Concerned Supervisory Authorities (CSAs), pursuant to Article 60(3) of the GDPR. The DPC’s draft decision was uploaded to the IMI on 25 May 2020 and, pursuant to Article 60(4) of the GDPR, CSAs were thereafter entitled to four weeks in which to submit any relevant and reasoned objections to the decision. The DPC subsequently received a number of relevant and reasoned objections and comments on its decision from CSAs. In particular, certain CSAs argued that additional infringements of the GDPR ought to have been found, and in addition that a reprimand and/or administrative fine ought to have been imposed.

 

(iii) Revised Draft Decision

The next stage of the Article 60 process required the DPC to carefully consider each relevant and reasoned objection and comment received in respect of its draft decision, and incorporate its analysis of same into a revised draft decision. In revising its draft decision, the DPC followed certain relevant and reasoned objections received, and declined to follow certain relevant and reasoned objections. The DPC’s revised draft decision, taking into account its analysis of the relevant and reasoned objections and comments in respect of its draft decision, found additional infringements of Articles 17(1)(a) and 6(1) of the GDPR by Groupon. In addition, the DPC proposed in its revised draft decision to issue a reprimand to Groupon, pursuant to Article 58(2)(b) of the GDPR. The DPC provided its revised draft decision to Groupon to allow it to make final submissions. A number of final submissions were received from Groupon, which (along with the DPC’s analysis thereof) were taken into account in the DPC’s revised draft decision.

 

(iv) Provision of Revised Draft Decision to Concerned Supervisory Authorities

The next stage of the Article 60 process entailed the DPC uploading its revised draft decision to the IMI, for circulation among the CSAs. Under Article 60(5) of the GDPR, CSAs were entitled to two further weeks in which to indicate whether they intended to maintain their objections. A further query was raised by one of the CSAs at this stage, raising the prospect that the dispute resolution procedure under Article 65 of the GDPR would have to be engaged. That procedure would have involved the European Data Protection Board (EDPB) adjudicating on the point(s) of disagreement, and would have further extended the time in which the decision in respect of the case could be completed. However, the additional query was subsequently withdrawn.

 

(v) Adoption of Final Decision

Upon the withdrawal of the final relevant and reasoned objection, and the passing of the deadline for receipt of any further objections, the last stage of the Article 60 process entailed the DPC adopting the final decision, which was uploaded to the IMI and communicated to Groupon. The final decision was uploaded on 16 December 2020. As per Article 60(6) of the GDPR, the CSAs were deemed at this point to be in agreement with the decision and to be bound by it. Pursuant to Article 60(7), the Polish data protection authority with which the complaint was initially lodged was responsible for informing the complainant of the decision.

In summary, the DPC found infringements of the following Articles of the GDPR in respect of this case: Articles 5(1)(c), 12(2), 17(1)(a) and 6(1). This case study demonstrates that, where a cross border data protection complaint cannot be amicably resolved, the Article 60 procedure that follows as a result is particularly involved, complex and time-consuming, especially as the views of other supervisory authorities across the EU/EEA must be taken into account and carefully considered in all such cases. In this case, following the completion of the investigation of the complaint, the initial draft of the DPC’s decision was uploaded to the IMI on 25 May 2020, and the final decision — incorporating submissions from Groupon, relevant and reasoned objections and comments from CSAs, and the DPC’s analysis thereof — was adopted on 16 December 2020, some seven months later.

 

8)  Case Study 8: Amicable Resolution in Cross Border Complaints: MTCH 

The DPC received a complaint in June 2020, via its complaint webforms, against MTCH Technology Services Limited (Tinder). Although the complaint was made directly to the DPC, from an Irish resident, upon assessment it was deemed to constitute a cross border complaint because it related to Tinder’s general operational policies and, as Tinder is available throughout the EU, the processing complained of was therefore deemed to be of a kind “….which substantially affects or is likely to substantially affect data subjects in more than one Member State” (as per the definition of cross border processing under Article 4(23) of the GDPR).

The complaint related to the banning of the complainant from the Tinder platform, subsequent to which the complainant had made a request to Tinder for the erasure of his personal data under Article 17 of the GDPR. In response to his request for erasure, the complainant was referred by Tinder to its privacy policy for information in relation to its retention policies in respect of personal data. In particular, Tinder informed the complainant that “after an account is closed, whatever the reason (deletion by the user, account banned etc.), the user’s data is not visible on the service anymore (subject to allowing for a reasonable delay) and the data is disposed of in accordance with [Tinder’s] privacy policy”. The complainant was dissatisfied with this response and followed up with Tinder, again requesting the erasure of his personal data. Tinder responded by reiterating that personal data is generally deleted “upon deletion of the corresponding account”, further noting that deletion of such personal data is “only subject to legitimate and lawful grounds to retain it, including to comply with our statutory data retention obligations and for the establishment, exercise or defence of legal claims, as permitted under Art. 17(3) of GDPR.” The complainant subsequently made his complaint to the DPC.

Upon the DPC’s engagement with Tinder in respect of this complaint, Tinder informed the DPC that the complainant had been banned from the platform as his login information was tied to another banned profile. Tinder also identified eleven other accounts associated with the complainant’s device ID. All of these accounts had been banned from the Tinder platform as it appeared that an unofficial client was being used to access Tinder (a violation of Tinder’s terms of service). The DPC reverted to the complainant with this information, and the complainant advised that he had used the official Tinder client for Android and the official Tinder website on Firefox. However, it transpired that he had been using a custom Android build on his phone with various security and privacy add-ons. As a result, his phone had a different device ID after each update/reboot. In the complainant’s view, this was the likely cause of the issue that resulted in his being banned from Tinder. In light of such a ban, as per Tinder’s policy on data retention, his personal data would have been retained for an extended period of time. However, in the circumstances, by way of a proposed amicable resolution, Tinder offered to immediately delete the complainant’s personal data so that he could open a new account.

The complainant had certain residual concerns regarding the manner in which Tinder responds to erasure requests. Upon being informed that such matters were being examined by the DPC by way of a separate statutory inquiry, the complainant agreed to accept Tinder’s proposal for the amicable resolution of the complaint. The matter was accordingly resolved amicably and, under section 109(3) of the Data Protection Act 2018, the complaint was deemed to have been withdrawn.

This case study demonstrates that a thorough examination of a seemingly intractable complaint can bring about its amicable resolution, which will often result in a fair and efficacious solution for the affected individual in a timely manner. In this case, the information gleaned by the DPC when it probed in more depth into the circumstances of the complainant’s ban from Tinder — namely the fact that the complainant used a custom Android build with security and privacy add-ons — contributed to a greater understanding between the parties and led to Tinder making its proposal for the resolution of the case, which the complainant accepted.

 

9)  Case Study 9: Amicable Resolution in Cross Border Complaints: Facebook Ireland 

The DPC received a multi-faceted complaint in April 2019 relating to requests for access (under Article 15 of the GDPR), rectification (under Article 16 of the GDPR) and erasure (under Article 17 of the GDPR) that the complainant had made to Facebook Ireland Limited (“Facebook”). The complaint was made directly to the DPC, from a data subject based in the UK. Upon assessment in the DPC, the complaint was deemed to be cross border because it related to Facebook’s general operational policies and, as Facebook is available throughout the EU, the processing complained of was therefore deemed to be of a kind “….which substantially affects or is likely to substantially affect data subjects in more than one Member State” (as per the definition of cross border processing under Article 4(23) of the GDPR).

The complainant initially made his requests to Facebook because his Facebook account had been locked for over a year, without reason in the view of the complainant, and he believed Facebook held inaccurate personal data relating to him. Wishing to ultimately erase all the personal data that Facebook held in relation to him, the complainant was of the view that this inaccurate information was preventing him from being successfully able to log into his Facebook account to begin the erasure process. He had therefore made an access request to Facebook, but had been unable to verify his identity to Facebook’s satisfaction. The complainant subsequently made his complaint to the DPC.

After a considerable amount of engagement by the DPC with both Facebook and the complainant with a view to amicably resolving the complaint, in the course of which the complainant was able to verify his identity to Facebook’s satisfaction, Facebook agreed to provide the complainant with a link containing the personal data that it held in relation to him. The complainant accessed the material at the link, but remained dissatisfied because he claimed that the material provided was insufficient. In particular, the complainant indicated that he wished to be advised of any personal data held in relation to him by Facebook beyond that which was processed in order to operate his Facebook profile. Facebook responded to the DPC indicating that the material provided to the complainant via the link was the totality of the account data that it held in relation to him. The complainant remained dissatisfied with this response, indicating that he wished to obtain information regarding any personal data that Facebook held in relation to him that was not related to his Facebook account. He also reiterated his belief that some of this personal data, allegedly held by Facebook but not related to his Facebook account, may be inaccurate, in which case he wished to have it rectified.

In response, Facebook advised the DPC that, since the commencement of the complaint, it had made certain enhancements to its ‘Download Your Information’ tool. Following this update to its access tools, it had determined that a very small amount of additional personal data existed in relation to the complainant’s Facebook account, and provided the complainant with a new link containing all of the personal data it held in relation to the complainant, including this additional data. The complainant accessed this additional material and, with a view to resolving his complaint, sought confirmation that, once the deletion of his account was effected, Facebook would no longer hold any personal data in relation to him. Facebook reverted to indicate that the material it had provided to the complainant was the totality of the data it held in relation to him that fell within the scope of Article 15, and indicated that it would proceed with the erasure of the complainant’s personal data once he had indicated that he was now satisfied for it to do so.

The complainant was content to conclude the matter on this basis. The matter was accordingly resolved amicably and, under section 109(3) of the Data Protection Act 2018, the complaint was deemed to have been withdrawn.

This case study demonstrates the benefits, to individual complainants, of the DPC’s intervention by way of the amicable resolution process. In this case, the DPC’s involvement led to the complainant being able to verify his identity to Facebook’s satisfaction, and to Facebook providing him with links containing his personal data on two occasions. The DPC’s engagement with the controller also resulted in it confirming, to the complainant’s satisfaction, that all the personal data that fell to be released in response to an Article 15 request had been provided to him. This resulted in a fair outcome that was satisfactory to both parties to the complaint. This case study also illustrates the intense resource investment necessary on the part of DPAs to resolve issues of this nature. The complainant in this case raised an issue of concern to them and was entitled to have it addressed. The question the case raises is whether the controller should have been capable of resolving the matter without the need for extensive DPA resources to mediate the outcome.

 

10)  Case Study 10: Article 60 Non-response to an Access Request by Ryanair 

In this case, the complainant initially submitted their complaint to the Information Commissioner’s Office (ICO) of the UK, and it was thereafter received by the DPC on 2 March 2019. The complaint related to the alleged failure by Ryanair DAC (Ryanair) to comply with a subject access request submitted to it by the complainant on 26 September 2018 in accordance with Article 15 of the GDPR. The ICO provided the DPC with a copy of the complaint form submitted to the ICO by the complainant, a copy of the acknowledgement, dated 26 September 2018, that the complainant had received from the data controller when submitting the access request, and a copy of the complainant’s follow-up email to the data controller requesting an update in relation to their request.

Acting in its capacity as Lead Supervisory Authority, the DPC commenced an examination of the complaint by contacting the data controller, outlining the details of the complaint and instructing the data controller to respond to the access request in full and to provide the DPC with a copy of the cover letter that issued to the complainant. Ryanair provided the complainant with access to copies of their personal data relating to the specific booking reference that the complainant had provided to the ICO, along with data relating to a separate complaint. Ryanair advised that it could not provide the complainant with a copy of the call recording they had requested as, due to the delay on Ryanair’s part in processing the request, the call recording had been deleted in accordance with company policy and it had been unable to retrieve it. Ryanair advised the DPC that it had previously informed the complainant of this via its online portal.

Ryanair stated that, at the time the request was submitted, access requests were not assigned to the relevant department until the data subject had verified their email address, due to the volume of data subjects who did not do so. The complainant had responded to the request and verified their email address, but the agent who was working on the request had ceased working on the online portal and the request had therefore not been assigned to the relevant department. Ryanair asserted that this error was not discovered until some time later, when the request was then assigned to the Customer Services department to provide the necessary data, including the call recording, at which point the call recording had been deleted in accordance with Ryanair’s retention policy. Ryanair provided the DPC with a copy of its retention policy, which states that call recordings are retained for a period of 90 days from the date of the call. Ryanair advised that, as the complainant’s call had been made on 5 September 2018, it would have been automatically deleted on 4 December 2018. Ryanair further stated that it does not have the functionality to retrieve deleted call recordings.

Pursuant to Section 109(2) of the Data Protection Act 2018, the DPC attempted to facilitate the amicable resolution of the complaint. However, the complainant was unwilling to accept Ryanair’s proposals in respect of same. As such, the matter fell to be decided by way of a decision under Article 60 of the GDPR.

 

(i) Initial Draft Decision

As the complaint related to cross border processing, the DPC was obliged, in accordance with the Article 60 process, to make a draft decision in respect of the complaint. In its initial version of the draft decision, the DPC made a finding of infringement of Article 15 of the GDPR, in that Ryanair failed to provide the complainant with a copy of their personal data that was undergoing processing at the time of the request. The DPC also found an infringement of Article 12(3) of the GDPR, in that Ryanair failed to provide the complainant with information on action taken on their request under Article 15 within the statutory timeframe of one month. The DPC provided the draft decision to Ryanair to allow it to make submissions. Ryanair subsequently provided a number of submissions, which (along with the DPC’s analysis thereof) were taken into account in the draft decision.

 

(ii) Provision of Draft Decision to Concerned Supervisory Authorities

In accordance with the Article 60 process, the DPC proceeded to submit its draft decision to the IMI to be circulated amongst the Concerned Supervisory Authorities (CSAs), pursuant to Article 60(3) of the GDPR. The DPC’s draft decision was uploaded to the IMI on 25 May 2020 and, pursuant to Article 60(4) of the GDPR, the CSAs were thereafter entitled to four weeks in which to submit any relevant and reasoned objections to the decision.

The DPC subsequently received a number of relevant and reasoned objections and comments in relation to its draft decision from the CSAs. In particular, certain CSAs argued that additional infringements of the GDPR ought to have been found, and in addition that a reprimand ought to have been imposed.

 

(iii) Revised Draft Decision

In accordance with Article 60(3) of the GDPR, the DPC is obliged to take due account of the views of the CSAs. In light of the objections and comments received from the CSAs, the DPC carefully considered each relevant and reasoned objection and comment received in respect of its draft decision. The DPC revised its draft decision to include a summary and analysis of the objections and comments expressed by the CSAs. In revising its initial draft, the DPC followed certain relevant and reasoned objections received, and declined to follow others. In its revised draft decision, the DPC proposed to issue a reprimand to Ryanair, pursuant to Article 58(2)(b) of the GDPR. The DPC provided its revised draft decision to Ryanair to allow it to make final submissions. Ryanair noted that the DPC had found that it had infringed the GDPR, and that the DPC had exercised its powers in this case in line with Recital 129 and the due process requirements in Article 58 of the GDPR. Ryanair advised the DPC that it accepted the findings and the associated reprimand and did not wish to make any further submissions.

 

(iv) Provision of Revised Draft Decision to Concerned Supervisory Authorities

In accordance with Article 60(5) of the GDPR, once the DPC submitted its revised draft decision to the CSAs for their views, the CSAs were entitled to two further weeks in which to submit any further objections to the decision.

Pursuant to Article 60(5) of the GDPR, the DPC submitted its revised draft decision to the CSAs for their opinion on 20 October 2020. As the DPC received no further objections or comments in relation to the revised draft decision from the CSAs within the statutory period, the CSAs were deemed to be in agreement with the revised draft decision of the DPC and bound by it in accordance with Article 60(6) of the GDPR.

 

(v) Adoption of Final Decision

Upon the passing of the deadline for receipt of any further objections, the DPC proceeded to adopt the final decision, in accordance with Article 60(7) of the GDPR. The DPC then uploaded its final decision to the IMI and communicated it to Ryanair. The final decision was uploaded on 11 November 2020. Pursuant to Article 60(7), the ICO, with whom the complaint was initially lodged, was responsible for informing the complainant of the decision.

 

In summary, the DPC found infringements of Articles 12(3) and 15 of the GDPR in respect of this complaint.

This case study demonstrates that, where a complaint relating to the cross border processing of personal data cannot be amicably resolved, the Article 60 procedure that follows as a result is particularly involved, complex and time-consuming. In this case, the initial draft of the DPC’s decision was uploaded to the IMI on 25 May 2020, and the final decision was not adopted until 11 November 2020, some six months later.

This case study also demonstrates, once again, the intensity of the DPA resources consumed in delivering outcomes on issues that could have been resolved by the controller without recourse to the DPC, raising again the question of whether DPA resources are being unnecessarily drained away from resolving wider systemic issues that would achieve improved outcomes for the maximum number of individuals.

 

11)  Case Study 11: Purpose Limitation — Law Enforcement Directive

The DPC examined a complaint where an individual alleged that data gathered in one particular law enforcement context was being used by the same data controller for another law enforcement purpose. The complaint concerned the prosecution of an individual for offences in the equine and animal remedies area by the Department of Agriculture, Food & the Marine (DAFM) and the separate referral by DAFM of allegations of professional misconduct to the Veterinary Council of Ireland (VCI) in relation to the same person.

Having examined the matters raised, the DPC referred the complainant to Section 71(5) of the Data Protection Act 2018:

Where a controller collects personal data for a purpose specified in section 70 (1)(a), the controller or another controller may process the data for a purpose so specified other than the purpose for which the data were collected, in so far as— (a) the controller is authorised to process such personal data for such a purpose in accordance with the law of the European Union or the law of the State, and (b) the processing is necessary and proportionate to the purpose for which the data are being processed.

With regard to section 70(1)(a) and “the law of the State”, the DPC noted the provisions set out in the Veterinary Practice Act 2005 regarding the conduct of inquiries by the VCI into allegations of professional misconduct. In particular, section 76 of the Veterinary Practice Act 2005 outlines that the VCI or any person may apply for an inquiry with regard to the fitness to practise veterinary medicine of a registered person. On this basis, the DPC did not consider data protection legislation to disallow the separate referral by DAFM of allegations of professional misconduct to the VCI in relation to a person, in tandem with prosecution proceedings by DAFM against the same individual for offences in the equine and animal remedies area.

 

12)  Case Study 12: Alleged disclosure of the complainant’s personal data by a local authority (Data Breach Complaint) 

The DPC received a complaint from an individual concerning an alleged disclosure of the complainant’s personal data by a local authority. The complainant alleged that the local authority had disclosed the complainant’s name, postal address and information relating to the housing assistance payment in error to a third-party. The individual had been informed by the local authority that this disclosure had occurred. However, the individual was dissatisfied with the actions taken by the local authority in response to the disclosure and did not wish to engage further with the local authority with a view to seeking an amicable resolution of the complaint.

The DPC examined the complaint and contacted the local authority in order to seek further information regarding the individual’s allegations. The local authority confirmed to the DPC that a personal data breach had occurred when the complainant’s personal data was included, in error, in a Freedom of Information request response to a third-party. In addition to the information provided by the local authority to the DPC in the context of its examination of the complaint, the incident in question was notified to the DPC by the local authority as a personal data breach, as required by Article 33 of the GDPR. In that context, the DPC engaged extensively with the local authority regarding the circumstances of the personal data breach, the data security measures in place at the time the personal data breach occurred and the mitigating measures taken by the local authority, including the local authority’s ongoing efforts to retrieve the data from the recipient.

On the basis of this information, the DPC concluded its examination of the complaint by advising the individual that the DPC was satisfied that the complainant’s personal data were not processed by the local authority in a manner that ensured appropriate security of the personal data and that an unauthorised disclosure of the complainant’s personal data, constituting a personal data breach, had occurred. On the basis of the actions that had been taken by the local authority in response to the personal data breach and, in particular, the fact that the recipient of the complainant’s personal data had returned the data to the local authority, the DPC did not consider that any further action against the local authority was warranted in relation to the subject matter of the complaint.

 

13)  Case Study 13: Breach Notification (Voluntary Sector) — Ransomware Attack

In May 2020, the DPC received a breach notification from an Irish data processor and subsequently a notification from an Irish data controller operating in the voluntary sector who had engaged this processor to provide webhosting and data management services.

The breach related to a ransomware attack that occurred in the data centre utilised by the data processor, and which was the result of malware gaining access to the server via an RDP* port.

The DPC engaged with both the controller and the processor through a number of communications, including the issuing of technical and organisational questionnaires focusing on areas of potential non-compliance with data protection law. These areas included the processor’s use of a data centre within the US to store back-up data without adequate agreements in place, and insufficient oversight by the controller of its processor, as required under Article 28 of the GDPR. The DPC engaged intensively with both parties and concluded the case by issuing recommendations to both the controller and the processor. Thereafter the DPC continued to engage with both parties to ensure that the recommendations had been implemented.

* RDP — Remote Desktop Protocol
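
As a purely illustrative aside, the following minimal sketch (assuming Python's standard socket module; the host name is a hypothetical placeholder) shows a quick way of checking whether the default RDP port on a server is reachable from a given network, the kind of exposure that contributed to this breach.

    # Minimal sketch: check whether the default RDP port (3389) on a host is reachable.
    # A port that is reachable from an untrusted network suggests RDP may be exposed.
    import socket

    host = "server.example"   # hypothetical host name

    try:
        with socket.create_connection((host, 3389), timeout=3):
            print("Port 3389 is reachable - RDP may be exposed from this network.")
    except OSError:
        print("Port 3389 is not reachable from this network.")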

14)  Case Study 14: Breach Notification (Public Sector) Erroneous Publication on Twitter 

A public sector organisation notified the DPC that it had inadvertently published personal data via its social media platform (Twitter).

The personal data was posted in violation of the organisation’s policy of anonymising all content that could potentially identify an individual data subject. The organisation in question informed the DPC that the root cause of this incident was human error and that the offending tweet was removed without undue delay. Based on the action the data controller had taken to mitigate the risk of this type of incident recurring, the DPC concluded its examination of this matter and issued a number of further recommendations to the organisation, centring on the appropriate use of its social media platforms and on how its social media accounts should be secured and limited to a specified number of authorised personnel.

 

15)  Case Study 15: Breach Notification (Financial Sector) Bank Details sent by WhatsApp

A private financial sector organisation notified the DPC that a customer had made a request to obtain their IBAN and BIC, which were held on file. The customer making the request was personally known to the member of staff dealing with the request. The staff member, deviating from approved practices, used their personal mobile phone to send a picture of what they believed to be the requested information over a messaging platform (WhatsApp). However, the staff member erroneously sent details pertaining to another customer to the requesting customer.

The customer who received this information contacted the organisation to advise that the information received did not relate to their account and that they had undertaken to delete all offending material from their device. The organisation communicated with staff to remind them that only authorised methods of communication should be used when handling future requests of this nature. The organisation also issued an apology to all affected data subjects.

The DPC issued a number of recommendations encompassing the use of only approved organisational communication tools, making staff fully aware of acceptable and unacceptable behaviour when using those tools, and ensuring that staff have undergone appropriate training on their obligations and responsibilities under the GDPR and the Data Protection Act 2018.

 

16)  Case Study 16: Breach Notification (12 Credit Unions) Processor Coding Error 

The DPC received separate breach reports from 12 credit unions that employed the services of the same processor, which was based in the UK. The breach arose from a coding error made by the processor when implementing measures introduced in response to the Covid-19 pandemic.

Credit unions are required to report information to the Central Bank of Ireland concerning their borrowers and the performance of their loans. The Central Bank utilises this information to maintain the Central Credit Register (or CCR). Lenders and credit rating agencies in turn use this information to verify borrowers’ debts and credit histories. A large number of lenders, particularly credit unions, use the services of data processing companies to prepare such CCR returns and forward them to the Central Bank.

During 2020, the Irish Government introduced a series of measures to mitigate financial distress caused by the pandemic and resulting lock-downs. These included measures allowing financial institutions to pause loan repayments without adversely affecting borrowers’ credit ratings. Lenders were instructed to use particular codes in the CCR returns to flag paused loans. This was intended to prevent those loans being interpreted as delinquent or otherwise suggesting that the relevant borrowers’ credit-worthiness had deteriorated.

In this incident the processor employed by the 12 credit unions used incorrect codes on CCR returns dealing with paused loans. The incorrect codes indicated that the borrowers affected had undergone a ‘restructuring event’ — a restructuring event typically occurs when a borrower is unable to repay a loan over the agreed period, and the lender agrees to change the loan’s terms to improve the borrower’s ability to repay. This can greatly reduce a borrower’s credit rating, so an inaccurate CCR record of a restructuring event could have serious consequences for the persons affected.
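
As an illustration of how a mistake of this kind can be caught before returns are submitted, the Python sketch below validates a set of loan records against a simple rule. The field names and code values ("PAYMENT_BREAK", "RESTRUCTURED") are hypothetical and do not reflect the actual CCR file format.

    # Illustrative sketch only: flag paused loans that have been coded as a
    # restructuring event before a CCR return is generated. Field names and
    # code values are hypothetical, not the real CCR specification.
    def validate_ccr_records(records):
        errors = []
        for rec in records:
            if rec.get("covid_payment_break") and rec.get("ccr_event_code") == "RESTRUCTURED":
                errors.append(
                    f"Loan {rec['loan_id']}: paused loan incorrectly coded as a "
                    "restructuring event; expected 'PAYMENT_BREAK'."
                )
        return errors

    loans = [
        {"loan_id": "L001", "covid_payment_break": True, "ccr_event_code": "PAYMENT_BREAK"},
        {"loan_id": "L002", "covid_payment_break": True, "ccr_event_code": "RESTRUCTURED"},
    ]
    for problem in validate_ccr_records(loans):
        print(problem)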

The credit unions in question became aware of the processor’s coding error in relation to their CCR returns several weeks after the processor first sent CCR returns for them using the incorrect codes to the Central Bank. The issue was reported to the DPC as a breach, and the credit unions took the matter up with the processor both directly and through a user group. This allowed affected records to be identified, the appropriate coding procedures to be worked out, and corrected CCR returns to be sent to the Central Bank.

These cases illustrate the importance of processing contracts that properly implement the requirements of Article 28 of the GDPR. Most relevantly to these cases, processing contracts must provide for the processor to assist the controller in meeting its obligations for security of processing, and for reporting and responding to breaches.

 

17)  Case Study 17: Vodafone seeks employment details from customers 

The DPC received a number of queries regarding new or existing customers being requested by Vodafone to produce their employment details and work phone number as a requirement for the provision of service by that company.

The concerns arising were, first, that the requests were excessive and contrary to the Article 5 principle of lawful, fair and transparent processing, as the processing of data relating to customers’ employment status was entirely unrelated to the product or service they were receiving from the telecommunications company, which was for their personal or domestic use only.

Second, there were concerns that the mandatory request for a customer’s occupation, place of work and work phone number did not meet the “data minimisation” requirement that data be adequate, relevant and limited to what is necessary, nor the purpose limitation principle, both set out in Article 5 of the GDPR. Third, there were concerns amongst customers that the company’s data protection/privacy notice did not comply with the transparency requirements of Article 13(1) of the GDPR.

Following engagement with the DPC, Vodafone admitted that it had made an error in the collection of this information. The company stated that the problem was caused by a legacy IT system that had not been updated to remove this requirement, that access to the data was exceptionally limited, and that the data was not used for any additional processing purposes. Vodafone immediately commenced a plan to remediate the problems caused and, at the insistence of the DPC, published details of what had occurred on its website so that customers would be aware of the issue.

 

18)  Case Study 18: Facebook Dating 

In February 2020, the DPC was informed of Facebook’s impending launch of ‘Facebook Dating’ in the EU. A cause for significant concern was the short notice given about the launch, together with the very limited information provided on how Facebook had ensured the Dating feature would comply with data protection requirements. As a result, the DPC undertook an on-site inspection of Facebook’s offices in Dublin to obtain more extensive documentation and information. A number of queries and concerns identified by the DPC on the new product and its features were put to Facebook. As a result, Facebook provided detailed clarifications on the processing of personal data and made a number of changes to the product prior to its eventual launch in the EU in October 2020.

These changes included:

  • clarification on the uses of special category data, which had been very unclear in the original proposal. Facebook agreed that there would be no advertising using special category data and that special category data collected in the Dating feature would not be used by the core Facebook service;
  • changes to the user interface around a user’s selection of religious belief so that the “prefer not to say” option was moved to the top of the list of options;
  • greater transparency to users by making it clear in the sign-up flow that Dating is a Facebook product and that it is covered by Facebook’s terms of service and data policy and the Supplemental Facebook Dating Terms; and
  • revisions to the consent header for the processing of special category data to specifically flag that special category data (in this instance sexual preference and religious belief) will not be processed for the purposes of advertising (targeted or otherwise).

 

19)  Case Study 19: Facebook Suicide and Self-Injury feature 

In early 2019, Facebook approached the DPC and informed it of plans to expand its Suicide and Self-Injury Prevention Tool (SSI), which involved using advanced algorithms to monitor Facebook and Instagram users’ online interactions and posts. Facebook intended that the tool would help identify users at risk of suicide or self-harm. Details of these users would then be notified to external parties (police and voluntary organisations) so that an intervention could be made with the users concerned. The DPC raised a number of concerns during the engagement (2019–2020), including the lawful basis for, and adequate safeguards relating to, the processing of special category data. Facebook took the position that the processing of this data would rely on the public interest exemption under Article 9 of the GDPR.

As part of the DPC assessment it was suggested that Facebook should consult public health authorities in Europe before proceeding. Facebook acknowledged that they had further work to do and would undertake the consultation and further research with public health authorities across Europe on the SSI tool. Facebook has indicated that this engagement will continue to be a long-term initiative given the challenges experienced by Member State Governments and national public health authorities due to the Covid-19 pandemic. The DPC understands this engagement is ongoing. In late 2020, Facebook approached the DPC proposing a more limited use of this tool for the sole purpose of removing content contravening Facebook Community Standards and Instagram Community Guidelines, pending resolution of the concerns raised by the DPC. No significant concerns were identified by the DPC so long as the processing was for the sole purpose of content moderation.

 

20)  Case Study 20: Facebook Election Day Reminder 

In advance of the Irish General Election in February 2020, the DPC notified Facebook that its Election Day Reminder (EDR) feature raised a number of data protection concerns, particularly around transparency to users as to how their personal data would be collected when they interacted with the feature and subsequently used by Facebook.

The DPC requested that Facebook implement a mechanism at the point at which users engage (or will engage) with the EDR function to ensure that the information referenced in Article 13 of the GDPR, including information addressing the specific circumstances and context in which the processing operations are undertaken, be made available to users in an easily accessible form before a user decides whether or not to interact/engage with the EDR function. Of particular concern to the DPC was the lack of clarity from Facebook on whether any data generated by a user interacting with the feature would be used for targeted advertising and newsfeed personalisation.

As it was not possible to implement changes in advance of the Irish election, Facebook responded to the DPC advising that it intended to withdraw the roll-out of the EDR function for the election and that the feature would not be activated during any EU elections pending a response to the DPC which addressed the concerns raised.

 

21)  Case Study 21: Google Voice Assistant Technology 

The DPC’s engagement with Google on the company’s voice assistant product continued in 2020. This engagement commenced following media coverage in the summer of 2019. The DPC sought a response from Google on the further actions that could be taken by Google to mitigate risks to the personal data of users, particularly those arising from misactivations of Google Assistant. Google has implemented a number of changes to address the concerns raised.

These include:

  • A new transparent user engagement and consent flow to include information about the suite of safeguards in place to minimise the risks to data subjects and make user controls more accessible;
  • Measures to decrease misactivations. Users can now adjust how sensitive Google Assistant devices are to prompts like “Hey Google”, giving users more control to reduce unintentional activations or to make it easier to get help in noisy environments. Google is also continuing to improve device- and server-side measures to detect false activations of Google Assistant;
  • Deletion by voice command on Assistant. Users are now able to delete their Assistant interactions from their account by saying things like “Hey Google, delete the last thing I said” or “Hey Google, delete everything I said to you last week.” If users ask to delete more than a week’s worth of interactions from their account, the Assistant will direct them to the page in their account settings to complete the deletion.

 

  1. Right to rectification request to a healthcare group
  2. Unauthorised disclosure of mobile phone e-billing records, containing personal data, by a telecommunications company, to the data subject’s former employer
  3. Reliance on consent in the use of child’s photograph in the form of promotional material by a State Agency
  4. Receivers and fair processing
  5. Prosecution of Vodafone Ireland Limited
  6. Prosecution of Just-Eat Ireland Limited
  7. Prosecution of Cari’s Closet Limited
  8. Prosecution of Shop Direct Ireland Limited t/a Littlewoods Ireland
  9. HSE Hospital/Healthcare Agency
  10. Loss of control of paper files
  11. Ransomware Attack
  12. Disclosure of CCTV footage via social media
  13. Proposals for Fraud Sharing Databases

1)  Case Study 1: Right to rectification request to a healthcare group (Applicable Law — GDPR & Data Protection Act 2018)

We received a complaint against a healthcare group arising from its refusal of a request for rectification under Article 16 of the GDPR. The complainant alleged that the healthcare group was incorrectly spelling his name on its computer system by not including the síneadh fada, an accent that forms part of the written Irish language.

Hospitals under the administration of this healthcare group use a patient administration system (PAS) to initially record patient data, which is then shared with other systems at later points of patient care, i.e. Laboratory, Radiology and Cardiology. The healthcare group informed the complainant that it is not possible to record the síneadh fada because syntax characters are recorded as commands on the PAS, impacting the way data is stored and processed. The healthcare group informed the DPC that the patient administration system was due to be replaced in 2019/2020. However, the group’s new system will not allow for the use of the síneadh fada either. The healthcare group informed the DPC this was for the purpose of enabling a streamlined single point of contact for patient information across different systems. This would enable professionals to access this information across different units within a hospital or hospital group without re-entering the data at a later point, thereby avoiding the potential for later errors. The other systems across the current healthcare group network and/or wider hospital network do not support the use of the síneadh fada. The healthcare group further advised the DPC that it identifies patients by Patient ID numbers rather than by names alone.
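
The underlying difficulty is one of character handling across systems. As a purely illustrative sketch (not a description of the healthcare group’s actual systems), a modern patient record can store a name in Unicode exactly as entered while deriving an ASCII-folded key for matching against legacy systems that cannot handle diacritics:

    import unicodedata

    # Illustrative sketch only: keep the name as entered (with the síneadh fada)
    # and derive an ASCII-folded key for legacy systems that cannot store
    # diacritical marks.
    def ascii_fold(name: str) -> str:
        decomposed = unicodedata.normalize("NFKD", name)
        return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

    patient_name = "Seán Ó Sé"        # stored and displayed form
    legacy_key = ascii_fold(patient_name)
    print(patient_name, "->", legacy_key)  # Seán Ó Sé -> Sean O Se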

The DPC examined this submission and concluded that any update of the computer system would involve significant costs and time, along with a risk of errors in the storage and matching of records. The DPC also engaged with An Coimisinéir Teanga (Irish Language Regulator) about its advice to public sector organisations with respect to computer systems supporting the síneadh fada. An Coimisinéir Teanga advised that no such obligation arises from the Official Languages Act 2003, but that such an obligation can arise from a language scheme, an agreement put in place between a public body and the Minister for Culture, Heritage and the Gaeltacht.

The DPC queried the healthcare group on the existence of a language scheme and was provided a copy. This scheme sets out a respect for patient choices regarding names, addresses and their language of choice. The scheme also provides a commitment to update computer systems to achieve “language compliancy”. There is no timeframe provided for the fulfilment of this commitment in the language scheme.

The healthcare group advised the DPC they are committed to patient safety as a primary, core concern and further advised the DPC of the difficulties associated with sharing and storing information across other systems if they updated their system to allow for the use of the síneadh fada. They also advised that they will be testing the possibility of using the síneadh fada in any update of their computer system.

The DPC had regard to Article 16 and Article 5(1)(d) of the GDPR in examining this complaint. Both provisions frame the rights and obligations concerned by reference to “the purposes of the processing”. The right to rectification under Article 16 of the GDPR is not an absolute right, and organisations that control or process personal data are required to take reasonable steps in the circumstances. The DPC also had regard to case law from the European Court of Human Rights on linguistic rights and naming. This case law reflects that the spelling of names falls within the ambit of Article 8 of the European Convention on Human Rights, but that the Court adopts a restrictive approach in this regard. As such, the DPC reiterated that the purpose of the processing in the circumstances of the complaint was the administration of healthcare to the complainant and involved the use of Patient ID numbers. The name of the complainant was not the sole means of identification and the purpose of the processing was therefore being achieved without the use of diacritical marks.

The DPC also had regard to any risks to the complainant arising from the refusal of their Article 16 request. The DPC noted that the risk to the complainant could in fact increase because of the difficulties associated with cross-system handling of the síneadh fada and the impact this would have on healthcare decision-making for the individual. In the circumstances, the non-use of the síneadh fada did not constitute an interference with the fundamental rights of the individual.

Under section 109(5)(f) of the Data Protection Act 2018 (the 2018 Act), the DPC requested that the healthcare group inform the complainant of its actions towards the implementation of a computer system capable of reflecting the síneadh fada. The DPC also requested that the group add an addendum to the individual’s file to record that the síneadh fada forms part of the individual’s name. Under section 109(5)(c) of the 2018 Act, the DPC advised the complainant that he may contact An Coimisinéir Teanga about the language scheme and any contravention of same.

 

2)  Case Study 2: Unauthorised disclosure of mobile phone e-billing records, containing personal data, by a telecommunications company, to the data subject’s former employer (Applicable law: Data Protection Acts 1988 and 2003 (“the Acts”))

Background

The complainant, during a previous employment, asked the telecommunications company to link her personal mobile phone number to her (then) employer’s account. This enabled the complainant to avail of a discount associated with her (then) employer’s account. While this step resulted in the name on the complainant’s account changing to that of her (then) employer, the complainant’s home address remained associated with the account and the complainant remained responsible for payment of any bills. Following termination of the employment relationship, the complainant contacted the telecommunications company to ask that it (i) restrict her former employer’s access to her mobile phone records; and (ii) separate her account from that of her former employer. Following this request, an account manager took a number of steps in the mistaken belief that this would result in the separation of the complainant’s account from that of her former employer. The complainant, however, became aware that, subsequent to her request, her former employer continued to access her account records. On foot of further inquiries from the complainant, the telecommunications company discovered its error and the complainant’s account was eventually separated from that of her former employer.

The complainant subsequently submitted a complaint to the telecommunications company. Having investigated the complaint, the company informed the complainant that it did not have a record of the original account restriction request. In the circumstances, the complainant referred a complaint to this office.

Investigation

During our investigation, the telecommunications company acknowledged that the initial action taken by its account manager was insufficient as it did not separate the complainant’s account from that of her former employer and neither did it prevent her former employer from accessing her e-billing records. The company further acknowledged that its records were incomplete when it investigated the complainant’s complaint. It confirmed, in this regard, that it had since located the complainant’s initial restriction/separation request.

The issues for determination, therefore, were whether the telecommunication company, as data controller:

  1. implemented appropriate security measures, having regard to Sections 2(1)(d) and 2C(1) of the Acts, in order to protect the complainant’s personal data against unauthorised access by, and disclosure to, a third party (i.e. the complainant’s former employer); and
  2. kept the complainant’s data accurate, complete and up to date, as required by Section 2(1)(b) of the Acts.

Appropriate Security Measures

This office found that the telecommunications company did not implement appropriate security measures to protect the complainant’s personal data from unauthorised access by, and disclosure to, her former employer. This was self-evident from the fact that the complainant’s former employer continued to access her e-billing records despite the initial actions taken by the telecommunications company.

This office further noted the obligation, set out in Section 2C(2) of the Acts, for a data controller to “… take all reasonable steps to ensure that — (a) persons employed by him or her … are aware of and comply with the relevant security measures aforesaid …”. This office found that the telecommunications company had not complied with its obligations in this regard. Again, this was self-evident from the fact that the account manager who initially actioned the complainant’s request was operating on the mistaken belief that the actions taken were sufficient to achieve separation of the complainant’s account from that of her former employer.

Accurate, complete and up to date

This office also considered the fact that, at the time when the complainant referred her complaint to the telecommunications company, the company could not locate her initial account restriction request. The result of this was that the outcome of the company’s own investigation into the individual’s complaint was incorrect. Accordingly, and notwithstanding the subsequent rectification of the position, this office found that the telecommunications company failed to comply with its obligations under Section 2(1)(b) of the Acts in circumstances where the complainant’s records, at the relevant time, were inaccurate, incomplete and not up to date.

Key Takeaways

The above case study highlights the fact that the obligation to keep personal data safe and secure is an ongoing one. Data controllers must ensure that they continuously monitor and assess the effectiveness of their security measures, taking account of the possibility that the circumstances or arrangements surrounding their data processing activities may change from time to time. In this case, the data controller failed to take the required action to reflect the change in circumstances that was notified to it by the complainant when she requested the restriction and separation of her account from that of her former employer. The case study further highlights the importance of effective training for employees in relation to any internal protocols.

 

3)  Case Study 3: Reliance on consent in the use of child’s photograph in the form of promotional material by a State Agency (Applicable law — Data Protection Acts 1988 and 2003)

We received a complaint from a parent in respect of their child. The parent had attended a festival organised by a state agency with their child, where a professional photographer took the child’s photograph. The following year the state agency used this photograph in promotional material. The child’s parent, while accepting that they had conversed with the photographer, had understood at the time of the photograph that they would be contacted prior to any use of the image.

During the investigation, the state agency indicated that it had relied upon consent pursuant to section 2A(1)(a) of the Acts, as the photographer had obtained verbal permission from the child’s parent. However, the state agency also accepted that it was not clear to the child’s parent that the image would be used for media/PR purposes. The state agency further accepted that the parent was not adequately informed regarding the retention of the image. The DPC welcomed the state agency’s indication that it would immediately review its practices and procedures. In conclusion, the DPC found that the state agency had not provided the child’s parent with adequate information in order to consent to the processing of the image used in promotional material.

 

4)  Case Study 4: Receivers and fair processing

We received a complaint against a private receiver who was appointed by a financial institution over the complainant’s property.

The complaint alleged infringements of the Acts on the basis that the receiver:

  • was not registered as a controller pursuant to section 16 of the Acts;
  • had no lawful basis for obtaining the complainant’s personal data from the financial institution;
  • further processed personal data unlawfully by disclosing information to a company appointed by the receiver to manage the receivership (the receiver’s “managing agent”);
  • opened a bank account in the complainant’s name;
  • obtained the property ID and PIN from Revenue which gave the receiver access to the complainant’s personal online Revenue account; and
  • insured the property in the complainant’s name.

Following an investigation pursuant to section 10 of the Acts, the DPC established that the receiver was appointed by the financial institution on foot of a Deed of Appointment of Receiver (DOA), which granted the receiver powers pursuant to the Conveyancing Act 1881 and pursuant to the mortgage deed between the complainant and the financial institution. On being appointed, the receiver wrote to the complainant informing them of their appointment as receiver over the complainant’s property and provided a copy of the DOA. The receiver appointed a separate company as their managing agent to assist in the management of the property. During the receivership, the receiver liaised with Revenue in order to pay any outstanding taxes on the property, such as the Local Property Tax (LPT). It was also established that the receiver opened a bank account for the purpose of managing the income from the property. The bank account name included the name of the complainant. It was further established that an insurance policy was taken out in respect of the property. This insurance policy referred to the complainant’s name.

The DPC first considered whether a receiver was required to register as a data controller in accordance with section 16 the Acts, and whether the exemptions listed in the Data Protection Act 1988 (Section 16(1)) Regulations 2007 (the “Registration Regulations”) applied. The DPC held that a receiver was not required to register, as the exemption under regulation 3(1)(g) of the Registration Regulations applied to the receiver. Regulation 3(1)(g) exempted data controllers who were processing data in relation to its customers. Having considered the relationship between the complainant and the receiver, the DPC held that the exemption applied in respect of the receiver’s activities regarding the complainant.

Next, the DPC considered whether the receiver had a lawful basis for obtaining the personal data from the financial institution and for disclosing it to the managing agent, and whether such processing constituted further processing incompatible with the original purpose for which the data was obtained, pursuant to section 2(1)(c)(ii) of the Acts. The complainant had a mortgage with the financial institution which had fallen into arrears. Under section 19(1)(ii) of the Conveyancing Act 1881, the financial institution could appoint a receiver once the debt on the mortgage had come due. Section 2A(1)(b)(i) of the Acts permits processing of personal data where the processing is necessary “for the performance of a contract to which the data subject is party”. The mortgage deed was a contract between the data subject and the financial institution and, in circumstances where the terms of the contract were not being adhered to, the appointment of the receiver by the financial institution was necessary for the performance of the contract. The DPC held that the receiver had a lawful basis for obtaining the complainant’s personal data from the financial institution.

The DPC also found that the receiver had a lawful basis pursuant to section 2A(1)(b)(i) of the Acts to disclose personal data to its managing agent, to assist in the day-to-day management of the receivership. The DPC found that the financial institution obtained the complainant’s personal data for the purposes of entering into a loan agreement. This was a specific, explicit and legitimate purpose. The disclosure of the complainant’s personal data by the financial institution to the receiver, and by the receiver to the managing agent, was in accordance with the initial purpose for which the personal data was obtained. This processing during the receivership did not constitute further processing pursuant to section 2(1)(c)(ii) of the Acts.

The DPC then assessed whether the receiver had a lawful basis to open a bank account in the complainant’s name. The complainant submitted that this account was opened without their knowledge or consent. Consent is one of the lawful bases for processing personal data under the Acts, but the DPC considered whether the receiver otherwise had a lawful basis for processing under section 2A(1)(d) of the Acts, on the basis of legitimate interests. To assess this lawful basis, the DPC took account of the judgment of the Court of Justice of the European Union (CJEU) in Rīgas satiksme (Case C-13/16)(1), which sets out a three-step test for processing on the basis of legitimate interests, as follows:


  • the processing of personal data must be for the pursuit of a legitimate interest of the controller or a third party;
  • the processing must be necessary for the purpose and legitimate interests pursued; and
  • the fundamental rights and freedoms of the individual concerned do not take precedence.

The DPC held that the opening of the bank account was a reasonable measure to manage the income and expenditure during a receivership. The receiver submitted that referring to the complainant’s name as part of the bank account name was necessary to ensure the receivership was carried out efficiently and to avoid confusion between different receiverships. While it would have been possible to open an account without using the complainant’s name, the DPC took account of the CJEU’s judgment in Huber v Bundesrepublik Deutschland (Case C-524/06)(2), where the Court held that processing could be considered necessary where it allowed the relevant objective to be more effectively achieved. The DPC held that the reference to the complainant’s name on the bank account was therefore necessary, as it allowed for the more effective pursuit of the receiver’s legitimate interests.

With regard to the third element of the legitimate interests test (which requires a balancing exercise, taking into account the fundamental rights and freedoms of the data subject), the DPC held that the reference to the complainant’s name on the account would have identified them to individuals who had access to the bank account or who had been supplied with the bank account name. The DPC balanced these concerns against the administrative and financial costs which would result from the need for the receiver to implement an alternative procedure for naming accounts. On balance, the DPC did not find that the complainant’s fundamental rights took precedence over the legitimate interests of the receiver and, as a result, the receiver had a lawful basis for processing the complainant’s name for the purpose of the receiver’s legitimate interests.

With regard to the allegation that the receiver had gained access to the personal Revenue account of the complainant, the DPC found that the receiver did not gain access to the complainant’s personal online Revenue account as alleged. The receiver was acting as a tax agent in relation to the LPT and this did not allow access to a personal Revenue account. In relation to the insurance policy being taken out in the complainant’s name the DPC held that the receiver did not process personal data in this instance.

During the course of the investigation the DPC also examined whether the receiver had complied with the data protection principles under section 2 of the Acts. In this regard, the DPC examined the initial correspondence the receiver had sent to the complainant notifying them of their appointment. This correspondence consisted of a cover letter and a copy of the DOA. The cover letter and DOA were assessed in order to determine whether the receiver had met their obligation to process the personal data fairly. Section 2D of the Acts required an organisation in control of personal data to provide information on the identity of the data controller, information on the intended purposes for which the data may be processed, the categories of the data concerned as well as any other information necessary to enable fair processing. The DPC held that the correspondence was sufficient in informing the complainant of the identity of the data controller (and original data controller). However, the DPC held that, while a receiver was not required to provide granular information on each purpose for which personal data was to be processed, the receiver should have given a broad outline of the purposes for which the personal data was intended to be processed, and this was not done in this case. It was also held that the receiver should have provided the categories of personal data they held in relation to the complainant, but this was not done. In light of this, the DPC held that the receiver had not complied with section 2D of the Acts.

This decision of the DPC demonstrates that private receivers and their agents may lawfully process personal data of borrowers, where such processing is necessary in order to manage and realise secured assets. Individuals should be aware that their information may be processed without their consent in circumstances where a deed of mortgage provides for the appointment of a receiver. At the same time, receivers must comply with their obligations under the Acts and GDPR to provide individuals with information on processing at the outset of the receivership. The decision is currently the subject of an appeal by the complainant to the Circuit Court.

1. Valsts policijas Rīgas reģiona pārvaldes Kārtības policijas pārvalde v Rīgas pašvaldības SIA ‘Rīgas satiksme’ Case C-13/16

2. Heinz Huber v Bundesrepublik Deutschland Case C-524/06

3. The processing of personal data was considered in a similar case where the same complainant made a complaint against the managing agent in this case. In that decision the DPC held that the managing agent had legitimate interest in processing the complainant’s personal data for the purposes of insuring the property

5)  Case Study 5: Prosecution of Vodafone Ireland Limited

In April 2019 the DPC received two separate complaints from an individual who had received unsolicited direct marketing communications by text and by email from the mobile network operator Vodafone. The individual stated that Vodafone had ignored their customer preference settings, which recorded that they did not wish to receive such marketing.

During our investigation, Vodafone confirmed that the complainant had been opted-out of direct marketing contact but that communications were sent to them due to human error in the case of both the text message and the email marketing campaigns.

In the case of the SMS message, Vodafone confirmed that a text offering recipients the chance to win tickets to an Ireland v France rugby match was sent to approximately 2,436 customers who had previously opted-out of receiving direct marketing by text. This was as a result of a failure to apply a marketing preferences filter to the SMS advertising campaign before it was sent.
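
A marketing preferences filter of the kind referred to here can be as simple as an automated suppression step run immediately before a campaign is dispatched. The Python sketch below is purely illustrative; the data structures are hypothetical and are not Vodafone’s systems.

    # Illustrative sketch only: remove opted-out numbers from a campaign list
    # before any messages are sent.
    def apply_marketing_preferences(recipients, opted_out):
        return [r for r in recipients if r not in opted_out]

    campaign_recipients = ["0851234567", "0867654321", "0839999999"]
    opted_out_numbers = {"0867654321"}

    to_send = apply_marketing_preferences(campaign_recipients, opted_out_numbers)
    print(to_send)  # ['0851234567', '0839999999']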

In the case of the email received by the complainant, an application that was intended to be used to send direct marketing to prospective customers was used in error and the message was sent to existing Vodafone customers. The marketing email was sent to 29,289 existing Vodafone customers. Of the 7,615 of these whose marketing preferences could be checked, 2,523 were confirmed to have been contacted in error. Vodafone was unable to link the remaining 21,674 recipients of the same email to their marketing preferences in its data warehouse, and it was therefore unable to definitively confirm the total number of customers contacted contrary to their preferences.

The DPC had also received a separate complaint in February 2019 from another individual who was a former customer of Vodafone. This customer had ceased to be a Vodafone customer more than five years earlier, yet they continued to receive promotional text messages. In the course of our investigation, Vodafone confirmed that the direct marketing messages were sent to the complainant in error. It said that in this exceptional case, the complainant’s mobile number was not removed from the platform used to send marketing communications when their number was no longer active on the network. As the DPC had previously prosecuted Vodafone in 2011, 2013 and 2018 in relation to direct electronic marketing offences, we decided to initiate prosecution proceedings in relation to these complaints.

At Dublin Metropolitan District Court on 29 July 2019, Vodafone pleaded guilty to five charges of sending unsolicited direct marketing communications in contravention of S.I. No. 336 of 2011 (‘the ePrivacy Regulations’). The company was convicted and fined €1,000 on each of three charges and convicted and fined €750 each in respect of the two remaining charges.

 

6)  Case Study 6: Prosecution of Just-Eat Ireland Limited

We received a complaint from an individual in November 2018 regarding unsolicited direct marketing emails from Just-Eat Ireland Limited. The complainant had unsubscribed from the company’s direct marketing emails but several days later received an unsolicited marketing email. During our investigation of this complaint the company informed us that the complainant’s attempt to unsubscribe was unsuccessful due to a technical issue with its email platform. This issue affected 391 customers in Ireland.

As Just-Eat Ireland Limited had previously been warned by the DPC in 2013 on foot of complaints in relation to unsolicited direct marketing emails, we decided to initiate prosecution proceedings.

At Dublin Metropolitan District Court on 29 July 2019, Just-Eat Ireland Limited pleaded guilty to one charge in relation to sending an unsolicited direct marketing email. The court applied section 1(1) of the Probation of Offenders Act in lieu of a conviction and fine on the basis that the company donate €600 to the Peter McVerry Trust charity.

 

7)  Case Study 7: Prosecution of Cari’s Closet Limited

In May 2018, we received a complaint against the online fashion retailer Cari’s Closet from an individual who had in the past placed an online order with the company. The complaint concerned the receipt of three unsolicited direct marketing emails. The same person had previously complained to the DPC in January 2018 about unsolicited emails from that company. On that occasion, the complainant said they had received over forty marketing emails in one month alone. The person had attempted, without success, to unsubscribe on a couple of occasions.

Cari’s Closet attributed the failure to properly unsubscribe the complainant from emails to a genuine mistake on its part.

As the DPC had issued a warning in April 2018 in relation to the earlier complaint, we decided to initiate prosecution proceedings against the company.

At Dublin Metropolitan District Court on 29 July 2019, Cari’s Closet pleaded guilty to one charge of sending an unsolicited direct marketing email to the complainant. In lieu of a conviction and fine, the court applied section 1(1) of the Probation of Offenders Act on the basis that the company donate €600 to the Little Flower Penny Dinners charity.

 

8)  Case Study 8: Prosecution of Shop Direct Ireland Limited t/a Littlewoods Ireland

In May 2019, the DPC received a complaint from an individual who said they had been receiving direct marketing text messages from Littlewoods since March. The complainant stated that they had followed the instructions to unsubscribe by texting the word ‘STOP’ on five occasions to a designated number known as a short code, but they had not succeeded in opting out and they continued to get marketing text messages.

In the course of our investigations, Shop Direct Ireland Limited (t/a Littlewoods Ireland) confirmed it had a record of the complainant’s opt-out from direct marketing texts submitted through their account settings on the Littlewoods website on 8 May 2019. It did not, however, have a record of their attempts to opt-out of direct marketing texts on previous occasions using the SMS short code. This was due to human error in setting up the content for the SMS marketing messages. The company said that the individual responsible for preparing and uploading content relating to marketing texts had mistakenly included the opt-out keyword ‘STOP’ instead of ‘LWISTOP’ at the end of the marketing texts.
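
A mismatch of this kind can be caught by checking, before a campaign is sent, that the opt-out instruction included in the message text matches a keyword the short-code platform will actually recognise. The Python sketch below is illustrative only; the recognised keyword set is hypothetical and not Littlewoods’ actual configuration.

    # Illustrative sketch only: warn if a marketing text references an opt-out
    # keyword that the short-code platform does not recognise.
    RECOGNISED_OPT_OUT_KEYWORDS = {"LWISTOP"}

    def opt_out_keyword_valid(message_text: str) -> bool:
        words = {w.strip(".,!").upper() for w in message_text.split()}
        return bool(words & RECOGNISED_OPT_OUT_KEYWORDS)

    msg = "Big sale this weekend! Text STOP to opt out."
    if not opt_out_keyword_valid(msg):
        print("Warning: message cites an opt-out keyword the platform will not recognise.")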

Shop Direct Ireland Limited had previously been prosecuted by the DPC in 2016 in relation to a similar issue which resulted in a customer attempting, without success, to unsubscribe from direct marketing emails. On that occasion, the court outcome resulted in the company making a donation of €5,000 to charity in lieu of a conviction and fine.

The DPC decided to prosecute the company in respect of direct electronic marketing offences in relation to the May 2019 complaint.

At Dublin Metropolitan District Court on 29 July 2019, Shop Direct Ireland Limited (t/a Littlewoods Ireland) entered guilty pleas to two charges relating to sending unsolicited direct marketing text messages. The court ruled that the company would be spared a conviction and fine if it donated €2,000 each to the Peter McVerry Trust and the Little Flower Penny Dinners charities and section 1(1) of the Probation of Offenders Act was applied.

 

9)  Case Study 9: HSE Hospital/Healthcare Agency

In 2019, the DPC received a complaint about the disclosure of a patient’s data via Facebook messenger by a hospital porter regarding her attendance at the Early Pregnancy Unit of a hospital. Upon examination of the complaint, the HSE clarified to the DPC that the hospital porter who disclosed the personal information of the patient was in fact employed by a healthcare agency contracted by the HSE. The DPC contacted the agency and sought an update in relation to its internal investigation, details of any remedial action as well as details of any disciplinary action taken against the employee in question. At the same time, the DPC advised the HSE that, as it contracts the company concerned to provide agency staff to work in the hospital, ultimately the HSE is the data controller for the personal data in this instance.

The complaint was subsequently withdrawn by the solicitor acting on behalf of the woman following a settlement being agreed between the affected party and the hospital/healthcare agency. Data controllers/data processors may be liable under Section 117 of the Data Protection Act 2018 to an individual for damages if they fail to observe the duty of care they owe in relation to personal data in their possession.

The DPC has no role whatsoever in dealing with compensation claims and no function in relation to the taking of any such proceedings under Section 117 of the 2018 Act or in the provision of any such legal advice.

What this case illustrates is that ongoing training is necessary for all staff in relation to their obligations under data protection law and that controllers must do due diligence and satisfy themselves that any contractors/processors they engage are fully trained and prepared to comply with data protection laws.

 

10)  Case Study 10: Loss of control of paper files

A public sector health service provider notified the DPC that a number of files containing patient medical information had been found in a storage cabinet on a hospital premises which was no longer occupied. The records were discovered by a person who had illegally gained access to the restricted premises and subsequently posted photographs of the cabinet containing the files on social media. The public sector organisation in question informed the DPC that, having become aware of the breach, a representative of the organisation was sent to locate and secure the files. The files were removed from the premises and secured.

This breach highlights the importance of having appropriate records management policies, including mechanisms for tracking files, appropriate secure storage facilities and full procedures for the retention or deletion of records. The DPC issued a number of recommendations to the organisation to improve its personal data processing practices.

 

11)  Case Study 11: Ransomware Attack

An organisation operating in the leisure industry notified the DPC that it had been the victim of a ransomware attack which potentially encrypted/disclosed the personal data of up to 500 customers and staff stored on the organisation’s server. The route of the infiltration was traced to a modem router that had been compromised (back-up data was, however, stored securely via a cloud server).

Following examination of the incident, the DPC issued a number of recommendations to the organisation. The DPC recommended that the organisation conduct an analysis of its ICT infrastructure to establish whether further malware was present, review and implement appropriate measures to ensure an adequate level of security surrounding the processing of personal data, and conduct employee training encompassing cyber security risks. The DPC has received regular updates from the organisation and is satisfied that significant steps have been taken to improve and implement both organisational and technical measures addressing the shortfalls in the security of its ICT infrastructure, including the development of a training plan for all staff in this area.

 

12)  Case Study 12: Disclosure of CCTV footage via social media

A commercial and residential property management company notified the DPC that an employee of a security company whose services it retained had used their personal mobile phone to record CCTV footage of two members of the public engaged in an intimate act, which had been captured by the management company’s security cameras.

The video was subsequently shared via WhatsApp with a limited number of individuals. The business advised the DPC that it had instructed any staff who may have received the footage to delete it and not to disseminate the video further.

Both the property management company and the security company were able to demonstrate that adequate policies and procedures did exist; however, appropriate oversight and supervision to ensure compliance with these policies and procedures were lacking.

Following recommendations made by the DPC to the property management company, the company has subsequently engaged with its staff to deliver further data protection training with an emphasis on personal data breaches. In addition, further signage was displayed prohibiting the use of personal mobile devices within the confines of the CCTV control room.

 

13)  Case Study 13: Proposals for Fraud Sharing Databases

During 2019 the DPC was consulted on proposals for the creation of two separate fraud information-sharing databases.

The first proposal, from Insurance Ireland, is to expand an existing database, called Insurance Link, to include additional data fields. Insurance Link contains details of insurance claims made by individuals and facilitates the exchange of information between insurance companies when a claim for compensation has been made by a customer, for the purpose of identifying potentially fraudulent claims. One of the proposed additional data sets is third-party personal data, such as that of witnesses to accidents.

The second proposal was from Banking and Payments Federation Ireland (BPFI), on behalf of the main retail banks, who wish to create a fraud information-sharing database that would be operated by an independent trusted third party. Each bank that establishes fraudulent activity would, according to predefined rules, transmit that information to the database, and all participant banks would be permitted to check client details against the database for the purposes of identifying and preventing fraud.

The DPC has emphasised to both Insurance Ireland and BPFI that industry fraud databases, involving the processing of significant volumes of sensitive data, must meet necessity and proportionality requirements under EU law and jurisprudence. We have also emphasised that the operation of each database must, as necessary, have a statutory underpinning to ensure compliance with data protection obligations under the GDPR and the Data Protection Act 2018, such as, for example, where the processing is in the public interest and/or involves data relating to offences or alleged offences.

It is the DPC’s view that both proposals raise significant risks for individuals, in particular to persons who may be wrongly identified as participating in fraudulent activity, or, in the case of insurance claims, to persons who are not directly linked to a claim such as a witness. We have advised the parties that these risks must be fully assessed and mitigated, including by building in very robust safeguards, rules and procedures and ensuring that the principles of data protection such as data minimisation are complied with. Furthermore, we have highlighted the importance of public consultation and awareness on the scope and purpose of these proposals.
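
By way of illustration of how data minimisation can be applied in an information-sharing arrangement of this kind (a general technique, not a description of either proposal), participants could exchange keyed hashes of identifiers rather than the identifiers themselves, so that matches can be checked without the shared database ever holding the raw data. The Python sketch below uses hypothetical values throughout.

    import hashlib
    import hmac

    # Illustrative sketch only: store keyed hashes of identifiers in a shared
    # fraud database so participants can check for a match without the raw
    # identifier being held centrally. The key and identifier are hypothetical.
    SHARED_KEY = b"hypothetical-shared-secret"

    def pseudonymise(identifier: str) -> str:
        return hmac.new(SHARED_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    flagged = {pseudonymise("IE00BANK00001234567890")}   # recorded by one participant

    # Another participant checks a client detail against the shared records.
    print(pseudonymise("IE00BANK00001234567890") in flagged)  # True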

 

  1. Transmission of data by a Government Department via WhatsApp
  2. Provision of CCTV footage by a bar to an employer
  3. Ryanair web-chat transcript sent to another customer
  4. Unlawful processing arising from billing error
  5. Late response to an access request
  6. Access request to golf club for CCTV
  7. Financial information erroneously cc’d to a restaurant
  8. CSO data breach — Disclosure of P45 data
  9. Failure to implement the data protection policies in place
  10. Unencrypted USB device lost in the post
  11. Website phishing
  12. Loss of paper files in transit
  13. SIM swap attack
  14. ‘Mentions in the news’ feature
  15. Prosecution of Viking Direct (Ireland) Limited
  16. Prosecution of Clydaville Investments Limited, T/A The Kilkenny Group
  17. Prosecution of DSG Retail Ireland Limited
  18. Prosecution of Vodafone Ireland Limited
  19. Prosecution of Starrus Eco Holdings Limited, T/A Panda and Greenstar

1)  Case Study 1: Transmission of data by a Government Department via WhatsApp (Applicable law — Data Protection Acts 1988 and 2003 (the Acts))

We received a complaint against the Department of Foreign Affairs and Trade (the DFAT), alleging that the mission in Cairo, Egypt, had shared the complainant’s personal data with a third party (his employer) without his knowledge or consent, and that it had failed to keep the complainant’s personal data safe and secure, having transmitted it via WhatsApp to his employer. This related to processing of the complainant’s personal data contained in a short-term visa application that the complainant had submitted in order to sit an exam in Ireland.

During our investigation, the DFAT informed us that it was standard practice in processing visa applications to check for accuracy, completeness and the validity of supporting documents. According to DFAT, a suspicion had arisen as to the veracity of a supporting document submitted by the complainant, which had purportedly been signed by his employer. In order to verify its validity, a staff member in the Cairo mission had contacted the employer (an official of an Egyptian government agency, whose name and signature appeared on the document) by telephone, as he was best placed to verify the authenticity of the document. The employer confirmed that he would need to see the document to verify it, but that as he did not have an official email address, the only way to receive it was via WhatsApp. The DFAT informed us that prior to sending the data via WhatsApp it had carried out a local risk assessment, including looking at the security/encryption associated with WhatsApp. It had concluded that in light of the end-to-end encryption on WhatsApp, this was the most secure means of transmission available, given the urgency of the visa application, as outlined by the complainant in his application. In this context, DFAT informed us that many government officials and civil servants in Egypt do not have access to official email accounts/systems and often use services like Gmail, Hotmail, WhatsApp and Viber to carry out official business. In this case, the government official in question had confirmed that this was the only method of communication available to him.

The documents had been sent using the mobile phone of the only staff member of the Cairo mission with WhatsApp and had been deleted from the device immediately after being sent. Ultimately, the official informed the Cairo mission that the documents were fraudulent and the visa application was denied. During our investigation, the complainant informed us that he was seeking €3,000 in compensation from the DFAT, representing the lost cost of sitting the exam in Ireland. Upon the DPC informing the complainant that it did not have the power to award compensation, the complainant requested a formal decision from the DPC.

In considering whether a contravention of the Acts had occurred when the complainant’s personal data was sent by DFAT, via WhatsApp, to the official in question, the DPC sought to establish the facts in relation to, first, whether the transmission in question was necessary and, second, whether it was secure, including whether there were more secure methods available to DFAT to transmit the data. On the first issue, the DPC was satisfied that it was necessary for the DFAT to share the complainant’s personal data with the official who, in the application for the short-term visa, was stated to be his employer and who, according to the application documents, had purportedly signed certain supporting documents. We noted in this regard that the relevant privacy policy (for the Irish Naturalisation and Immigration Service) explicitly states that the burden of proof in a visa application is on the applicant and that the visa officer may verify any evidence submitted in support of an application. The policy also states that any information provided in an application form can be disclosed to, among others, foreign governments and other bodies for immigration purposes.

The DPC was satisfied that given the lack of any other secure means to contact the official in question, the transmission via WhatsApp was necessary to process the personal data for the purpose provided (visa eligibility) and that the complainant was on notice that supporting documentation could be shared with third parties to verify authenticity. The DPC also took account of the fact that the local risk assessment carried out by DFAT had established that, in the circumstances, sending the personal data via WhatsApp was the most secure means of transmission. Accordingly the DPC found that DFAT had complied with the Acts.

This was an exceptional case arising from the particular on-the-ground circumstances of the country in question. Here, transmission of information for official purposes via WhatsApp was in fact the most secure method available and the complainant’s employer, while a government official, had no access to an official communications system through which the personal data could have been transmitted. In this case, the key data protection principles of necessity and proportionality, applied against the unique context of the processing in question, resulted in the DPC reaching a finding of compliance with the Acts. Such a finding would likely not have prevailed had the complaint arisen in an equivalent case where other official communication channels had been available to transmit the personal data contained in the supporting documents.

 

 

2)  Case Study 2: Provision of CCTV footage by a bar to an employer (Applicable law — Data Protection Acts 1988 and 2003 (the Acts))

We received a complaint against a city-centre bar, alleging that it had disclosed the complainant’s personal data, contained in CCTV footage, to his employer without his knowledge or consent and that it did not have proper CCTV signage notifying the public that CCTV recording was taking place.

During our investigation, we established that a workplace social event had been hosted by an employer organisation in the bar on the night in question. The complainant was an employee of that organisation and had attended the workplace social event in the bar. An incident involving the complainant and another employee had taken place in the context of that workplace social event and there was an allegation of a serious assault having occurred. An Garda Síochána had been called to the premises on the night in question and the incident had been reported for a second time by the then manager and headwaiter to the local Garda station the following day. We established that the employer organisation had become aware of the incident and had contacted the bar to verify the reports it had received. Ultimately the bar manager had allowed an HR officer from the employer organisation to view the CCTV footage on the premises. The HR officer, upon viewing the CCTV footage, considered it a serious incident and requested a copy of the footage so that the employer organisation could address the issue with the complainant. The bar manager allowed the HR officer to take a copy of the footage on their mobile phone as the footage download facility was not working.

The DPC considered whether there was a legal basis, under the grounds of the ‘legitimate interests’ of the data controller or a third party under Section 2A(1)(d) of the Acts, for the bar to process the complainant’s personal data by providing the CCTV footage to the employer organisation. This provision allows for processing that is ‘necessary for the purposes of the legitimate interests pursued by the data controller or by a third party or parties to whom the data are disclosed except where the processing is unwarranted in any particular case by reason of prejudice to the fundamental rights and freedoms or legitimate interests of the data subject’.

In its analysis of this case, the DPC had regard to the judgment of the CJEU in the Riga regional security police case, in which the CJEU had considered the application of Article 7(f) of the Data Protection Directive (95/46/EC), on which Section 2A(1)(d) of the Acts is based, and identified three conditions that must be met in order for the processing to be justified:

a) There must be the existence of a legitimate interest justifying the processing;

b) The processing of the personal data must be necessary for the realisation of the legitimate interest; and

c) That interest must prevail over the rights and interests of the data subject.

The DPC established during its investigation that, arising from the incident in question, there was an allegation of a serious assault committed by the complainant against a colleague and that the bar had provided a copy of the CCTV footage to the complainant’s employer so that the employer could properly investigate that incident and the allegations made. The DPC took into account that, as the incident had occurred during the employer organisation’s workplace social event, the employer might have been liable for any injuries to any employee that could have occurred during the incident. Accordingly, the CCTV footage was processed in furtherance of the employer organisation’s obligation to protect the health and safety of its employees. As the CJEU has previously held that the protection of health is a legitimate interest, the DPC was satisfied that there was a legitimate interest justifying the processing.

The DPC also considered that the disclosure of the CCTV footage in this instance was necessary for the legitimate interests pursued by the employer organisation so that it could investigate and validate allegations of wrongdoing against the complainant. The DPC considered, in line with the comments of Advocate General Bobek in the Riga regional security police case, that it was important that data protection is not utilised in an obstructive fashion where a limited amount of personal data is concerned. In these circumstances the DPC considered that it would have been unreasonable to expect the bar to refuse a request by the employer organisation to view and take a copy of the CCTV footage, against a backdrop of allegations of a serious assault on its premises, especially where the personal data had been limited to the incident in question and had not otherwise been disclosed.

On the question of balancing the interests of the employer organisation against the complainant’s rights and interests, the DPC had primary regard to the context of the processing, where the bar had received a request to view and provide footage of a serious incident on its premises, which it had deemed grave enough to report to An Garda Síochána. A refusal of the request might have impeded the full investigation of an alleged serious assault, and the employer organisation’s ability to protect the health and welfare of its employees. Accordingly, the DPC considered that it was reasonable, justifiable and necessary for the bar to process the CCTV footage by providing it to the employer organisation, and that the legitimate interest of the employer organisation took precedence over the rights and freedoms of the complainant, particularly given that the processing did not involve sensitive personal data and there had not been excessive processing.

On the facts, the DPC was also satisfied that the bar currently had adequate signage alerting patrons to the use of CCTV for the purpose of protecting staff and customers and preventing crime, and that in the absence of any evidence to the contrary offered by the complainant, the complainant had been on notice of the use of CCTV at the time in question.

In many of the complaints that the DPC handles, data subjects hold the mistaken belief that because they have not consented to the processing of their personal data, the processing is de facto unlawful. However, there are a number of legal bases other than consent that may justify processing, depending on the particular circumstances. With regard to the legitimate interests justification, the DPC will rigorously interrogate whether the circumstances of the processing satisfy the elements that the CJEU has indicated must be present for controllers to rely on this legal basis. Equally, however, the DPC emphasises that where the circumstances genuinely meet the threshold required for this justification then, as per the sentiment of Advocate General Bobek of the CJEU, the protection of personal data should not disintegrate into the obstruction of genuine legitimate interests.

 

3)  Case Study 3: Ryanair web-chat transcript sent to another customer (Applicable law — GDPR & Data Protection Act 2018)

We received a complaint from a data subject whose web-chat with a Ryanair employee was accidentally disclosed by Ryanair in an email to another individual who had also used the Ryanair web-chat service. The transcript of the web-chat contained details of the complainant’s name and that of his partner, his email address, phone number and flight plans. The complainant told us that he had been alerted to the disclosure by the individual who had been erroneously sent the transcript of his web-chat.

In our examination of the complaint, we established that Ryanair’s live web-chat service is provided by a third party, which is a data processor for Ryanair. We also established that the system that sends the web-chat transcripts by email has an auto-fill function that populates the recipient field with the email address of the last customer emailed. On the date in question, the data processor received requests from four Ryanair customers for transcripts of their web-chats, all of which were processed by the same agent. However, the agent did not correctly change the recipient email address when sending each transcript, with the result that the transcripts were sent to the wrong recipients. Ryanair informed us that, in order to prevent a recurrence of this issue, the auto-fill function in the live web-chat system has been disabled by the data processor and refresher GDPR training has been provided to staff.

Many of the complaints that the DPC receives relating to unauthorised disclosure of personal data in an electronic context — e.g. emails containing personal data sent to the wrong recipient — stem from use of the auto-fill functions in software. While data controllers may consider this a useful timesaver tool in a data-entry context, it has inherent risks when it is used to populate recipient details for the purposes of transmitting personal data. Auto-fill functions should therefore be used with caution, and where controllers decide to integrate such a function into their software for data-processing purposes, at a minimum other safeguards should be deployed, such as dummy addresses at the start of the address book, or on-screen prompts to double-check recipient details. The principle of safeguarding the security and confidentiality of personal data goes hand in hand with data protection by design and default so that when data controllers and processors are devising steps in a personal-data-processing programme or software, the highest standards of protection for the personal data are built in, particularly with regard to assuring the integrity, security and confidentiality of personal data.
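By way of illustration only, the following minimal sketch (written in Python, with hypothetical function and field names that are not drawn from Ryanair’s or its processor’s actual systems) shows one way a transcript-sending routine could require the recipient address to match the address recorded on the customer’s own request, rather than relying on an auto-filled default:

    # Illustrative sketch only; all names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TranscriptRequest:
        transcript_id: str
        customer_email: str  # the address supplied by the customer who asked for the transcript

    def send_transcript(request: TranscriptRequest, entered_recipient: str) -> None:
        """Send a web-chat transcript only if the agent-entered recipient matches
        the address recorded on the request itself; there is no auto-filled default."""
        if entered_recipient.strip().lower() != request.customer_email.strip().lower():
            # Refuse to send and prompt the agent to re-check the recipient details.
            raise ValueError(
                f"Recipient {entered_recipient!r} does not match the address on "
                f"request {request.transcript_id}; please re-enter and confirm it."
            )
        body = load_transcript(request.transcript_id)
        dispatch_email(to=entered_recipient, body=body)

    def load_transcript(transcript_id: str) -> str:
        # Retrieval of the stored transcript would be implemented here.
        raise NotImplementedError

    def dispatch_email(to: str, body: str) -> None:
        # Integration with the organisation's mail system would be implemented here.
        raise NotImplementedError

The point of the sketch is simply that the recipient field is validated against data tied to the specific request, so a mistake by a busy agent fails safely instead of sending one customer’s personal data to another.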

 

4)  Case Study 4: Unlawful processing arising from billing error (Applicable law — Data Protection Acts 1988 and 2003 (the Acts))

In April 2018, we received a complaint from a data subject who had ceased to be a customer of the data controller. However, she had discovered that her data was still being processed as she continued to receive bills from the data controller. The complainant had received verbal and written assurances that she did not owe the amount being billed.

However, the complainant subsequently received a text message from a debt-collection company, asking that she contact them. When the complainant phoned the debt-collection company, it refused to provide her with any information regarding the alleged debt until she provided it with personal data verifying her identity, which she refused to do. Later the same day, the complainant received a letter from the debt-collection company confirming that it was seeking to recover monies owed by her to the data controller.

This complaint was identified as potentially capable of amicable resolution under Section 109 of the Data Protection Act 2018, with both the complainant and the data controller agreeing to work with the DPC to try to amicably resolve the matter. The data controller confirmed to the DPC that an error had caused the complainant’s account balance to appear outstanding but that, when the error was identified, the outstanding balance was removed from the account. The data controller also confirmed that it had instructed the debt-collection company to cease any collection activities, and also to delete any data associated with the complainant.

While the complainant was satisfied with the ultimate outcome, the DPC emphasised to the data controller that the complainant had previously been informed on at least two occasions that the matter had been resolved. Despite this, her data had been unfairly processed by being passed to a debt-collection company without there being any justification for such disclosure.

In recognition of its failings, the data controller apologised to the complainant, provided certain assurances to her that the matter would have no effect on her credit rating, and made donations to charities of her choice.

For a controller to lawfully engage a processor to process personal data, there must be a justification for the processing of the personal data in the first place. In this case, the controller had disregarded previous concerns raised by the complainant that bills were being issued to her despite her no longer receiving services from the controller, and had failed to look into the continued use of her personal data for billing purposes in circumstances where she was no longer a customer. The DPC encourages individuals to raise data protection concerns directly with the controller in the first instance so that the controller can address them. However, data controllers frequently ignore or disregard direct attempts made by a data subject to raise complaints until the DPC becomes involved. This is unacceptable and, as part of each organisation’s accountability obligations, it should have meaningful and efficient measures in place to deal with and address data protection complaints when raised directly by a data subject, without the need for the data subject to resort to DPC intervention.

 

5)  Case Study 5: Late response to an access request (Applicable law — GDPR & Data Protection Act 2018)

The GDPR places time limits on data controllers for responding to requests from data subjects exercising their rights. In one case, a data subject had requested a recording of a telephone call between the data subject and the customer-service operator line of a multinational technology company, in order to progress a customer-service complaint. A complaint was subsequently made to the DPC that the access request, submitted pursuant to Article 15 of the GDPR, had not been processed within the timeframe set out in the GDPR.

Upon receipt of the complaint, the DPC contacted the company concerned to make it aware of the complaint and to enquire as to whether there was any action it would like to take on this matter. The company responded to the data subject with a copy of the requested telephone call and, accordingly, the data subject was satisfied for the complaint to be amicably resolved. Based on the circumstances of this individual case, the DPC deemed no further regulatory action necessary.

 

6)  Case Study 6: Access request to golf club for CCTV (Applicable law — GDPR & Data Protection Act 2018)

In November 2018, we received a complaint from a data subject in relation to an access request for his personal data comprising CCTV footage for a particular time and date, made to a golf club, the data controller.

The data subject provided us with initial correspondence from the golf club asking him why he required the footage and subsequent correspondence informing him that it had discovered a problem with the CCTV system software and was unable to provide him with the requested footage.

This complaint was deemed potentially capable of being amicably resolved under Section 109 of the Data Protection Act 2018.

As part of the amicable resolution process, we sought an explanation from the golf club as to why the requested CCTV footage could not be provided to the complainant. The golf club informed us that its CCTV system was not operational on the date for which the data subject had requested footage, and that this had only been discovered when it sought to comply with the access request. The DPC was not satisfied with the generality of this explanation and required a more detailed written explanation of the issues affecting the CCTV system, which could also be shared with the complainant. In response to this request, we were supplied with a letter from the golf club’s security company that outlined the issues with the CCTV system, including the fact that the hard drive on the CCTV system had failed and that the system had not been in use for some time. The DPC was satisfied with the technical explanation provided and the golf club agreed that this letter could be shared with the complainant. The complainant was satisfied with the explanation, leading to an amicable resolution.

This case illustrates that, even when working towards the facilitation or arrangement of an amicable resolution of a complaint, the DPC still expects accountability on the part of the controller or processor, and will scrutinise explanations and reasons given for non-compliance with its obligations in order to ensure that the position put forward is verifiable and demonstrable.

 

7)  Case Study 7: Financial information erroneously cc’d to a restaurant (Applicable law — Data Protection Acts 1988 and 2003 (the Acts))

We received a complaint concerning the alleged disclosure by a motor dealership of the complainants’ personal data to a third party. The complainants had provided the dealership with copies of their driver’s licences and bank details, including bank statements and full account details, in order to purchase a car through a Personal Contract Plan. They were subsequently copied in on an email from the dealership to a third-party email address, believed to be an address associated with a bank, which contained the complainants’ driver’s licences and bank details. The complainants were concerned that the third-party address was that of a restaurant and contacted the dealership about this, but were assured that the email address in question pertained to a bank and was secure.

The complainants remained concerned over the ownership of the email address, conducted online research into the matter, and were confident the email address was that of a restaurant. In order to confirm their suspicions, a friend of the complainants sent an email to the address in question and the response received confirmed it was that of a restaurant.

In the course of our examination, the dealership accepted that the email had been sent in error to the wrong address. Notwithstanding this acknowledgment, it was clear that no attempt had been subsequently made to contact the restaurant in order to request that the information erroneously sent be deleted by the unintended recipient. Upon instruction from this office, we received confirmation that the dealership had contacted the restaurant and requested that the email, including the documents, be deleted. The dealership put forward a proposal for amicable resolution that was accepted by the complainants.

This case demonstrates that it is vital for data controllers (and their employees) to implement and follow precautionary measures when electronically transmitting personal data, particularly financial information. A large proportion of the data-breach notifications that the DPC receives are of the unauthorised-disclosure variety, with a common cause being emails sent in error to the wrong address. Where a data controller identifies that such an incident has occurred, it is not enough to acknowledge it, whether to the data subject or to the DPC. Instead, it is incumbent on the data controller to take all reasonable steps to remedy the breach. This includes recalling the email where possible, asking the unintended recipient to confirm that they have deleted it, and thereafter putting in place measures to prevent a recurrence. Human error by staff presents a high risk of data breaches on an ongoing basis and it is critically important that efforts are made to mitigate those risks by driving data protection awareness throughout the organisation, particularly in regard to new staff.

 

8)  Case Study 8: CSO data breach — Disclosure of P45 data (Applicable law — Data Protection Acts 1988 and 2003)

We received several complaints in late 2017 against the Central Statistics Office (the CSO), each alleging that the CSO had disclosed the respective complainants’ personal data without their consent or knowledge. The complaints related to a data breach that the CSO had previously reported to us (under the voluntary Personal Data Breach Code of Practice) and to the affected individuals.

The data breach originated from actions taken by the CSO in response to three requests over a five-day period from separate former census enumerators seeking their P45 information. Emails with PDF attachments containing their own P45 and P45s of thousands of third parties were sent to the requesting enumerators. The CSO informed us that the data breach had been identified when a member of CSO staff had reviewed the relevant CSO sent-items mailbox, as part of the CSO’s standard due-diligence practices. The CSO confirmed that the disclosed third-party P45 information contained personal data including PPSNs, dates of birth, addresses and details of earnings from employment as census enumerators.

During our investigation, the CSO informed us that upon discovering the breach it had notified the recipients of the error, who had subsequently confirmed in writing that they had deleted the files. The CSO told us that it had also notified the affected individuals of the facts of the breach as they pertained to each individual. The CSO also informed us that following the data breach it had implemented a range of new procedures for handling P45 requests, including a rule that P45 requests were to be answered only by post going forward.

This data breach affected the thousands of individuals whose personal data was contained in the files that were unlawfully disclosed to the three former enumerators. The incident essentially occurred in triplicate because the erroneously disclosed files had been attached to three separate outgoing communications. This incident would have been preventable had the CSO had appropriate processes in place for the oversight of releasing tax-related personal data.

The DPC issued a number of individual decisions in respect of complaints relating to this breach, finding in each case that a contravention of Section 2A(1) of the Data Protection Acts 1988 and 2003 had occurred, in that personal data had been processed without a legal basis, as was clear from the breach report submitted to the DPC by the CSO. Having examined the new measures implemented by the CSO to guard against a recurrence, the DPC was satisfied that they comprehensively addressed the failings that had brought about this incident. However, from the perspective of ensuring the lawfulness of the processing and the security and confidentiality of personal data held by the CSO, those new organisational procedures only served to underline the inadequacy of the previous measures for responding to requests for tax-related information.

 

9)  Case Study 9: Failure to implement the data protection policies in place

An employee of the data controller, a public-sector body, lost an unencrypted USB device containing personal information belonging to a number of colleagues and service users.

The data controller had appropriate policies and procedures in place prohibiting the removal of personal data from its central IT system and its storage on unencrypted devices. However, it lacked the oversight and supervision necessary to ensure that these rules were complied with, and the employee appeared not to have been aware of the policy regarding the use of unencrypted devices. The breach could have been prevented had the organisation fully implemented the policy and made staff aware of it.

 

10)  Case Study 10: Unencrypted USB device lost in the post

A private-sector data controller notified the DPC that a package containing consent forms and an unencrypted USB device had been sent using standard postal services.

However, the package was damaged in transit, causing the USB device to fall out and become lost. The USB device contained pictures of minors participating in an organised educational event. The potential loss/disclosure of the personal data contained on the USB device could have been prevented or mitigated had the data controller put in place and implemented an encryption policy surrounding the use of portable memory devices, and an adequate policy concerning the sending of sensitive material through the post, e.g. by registered post or courier service.
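As a purely illustrative sketch of the kind of measure such an encryption policy might require, the following Python fragment (assuming the widely used third-party cryptography package; file names and paths are hypothetical) encrypts a file before it is copied to a removable device, so that loss of the device alone does not expose the underlying data:

    # Illustrative sketch only; assumes "pip install cryptography"; file names/paths are hypothetical.
    from cryptography.fernet import Fernet

    # The key is generated once and must be stored securely, separately from the USB device.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Read the material to be transported and encrypt it before it touches the device.
    with open("event_photos.zip", "rb") as source:
        ciphertext = fernet.encrypt(source.read())

    with open("/media/usb/event_photos.zip.enc", "wb") as destination:
        destination.write(ciphertext)

    # The intended recipient, holding the key, can later recover the original content:
    # original = Fernet(key).decrypt(ciphertext)

The design point is simply that anyone who finds the device sees only ciphertext; without the separately held key, the personal data on it remains protected.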

 

11)  Case Study 11: Website phishing

A private-sector (educational) data controller reported an incident of phishing, where a staff member had clicked on a suspicious website link and entered their credentials, resulting in their email account becoming compromised.

The data controller had not enabled multi-factor authentication on its email accounts. Had this technical measure, together with appropriate cybersecurity training, been in place from the outset, this data breach might have been prevented.

 

12)  Case Study 12: Loss of paper files in transit

The data controller, a public body, notified the DPC about an incident involving the transportation of hard-copy legal files containing special-category personal data.

The controller had contracted a courier company to transport the files to another department but the files went missing in transit. It transpired that the controller did not retain a backup of the original files, resulting in a loss of personal data. The controller did not have sufficient procedures in place for the secure removal and storage of hard-copy files that contained special-category personal data. The breach could have been prevented had the organisation properly considered its requirements when transporting such materials to another location and the inherent risks involved in such activities, and implemented more secure measures to ensure the protection of personal data.

 

13)  Case Study 13: SIM swap attack

A data subject notified the data controller (a mobile-phone network operator) that a SIM card swap was requested and authorised on her mobile-phone account by an unauthorised third party.

The data subject was concerned because her mobile-phone number had been used to receive text messages for two-factor authentication from her bank in relation to her banking service. Further investigation undertaken by the data controller indicated that an unknown third party had obtained limited personal data belonging to the data subject by some external means and had managed to pass the controller’s identity-validation processes. The customer-service agent for the data controller did not follow the validation process fully, and facilitated a SIM card swap on the customer’s account contrary to the controller’s policy. The breach would not have occurred had the controller had more robust processes preventing access to key account information and had the customer-service agent received sufficient data protection training, including on the risks posed to customer personal data by deviating from the company’s validation policy.

 

14)  Case Study 14: ‘Mentions in the news’ feature (Applicable law — GDPR & Data Protection Act 2018)

In 2018, the DPC received two complaints about a feature on a professional networking platform (the data controller), whereby the data controller sends emails and notifications to a member’s connections and followers to inform them if and when the member is mentioned in the news.

The complaints, one of which was lodged with the DPC in March 2018 (prior to the application of the GDPR) and the second of which was received by the DPC in October 2018, arose as a result of the data controller incorrectly associating members with media articles that were not about them. In one of the complaints, a media article that set out details of the private life and unsuccessful career of a person of the same name as the complainant was circulated to the complainant’s connections and followers by the data controller. The complainant raised the matter with the data controller and, when it was not resolved to their satisfaction, brought the complaint to the DPC. The complainant stated that the article had been detrimental to their professional standing and had resulted in the loss of contracts for their business. The second complaint involved the circulation of an article that the complainant believed could be detrimental to future career prospects, and which the data controller had not vetted correctly.

The key concern arising from these complaints was the failure of the data controller to correctly identify matches between members and those referenced in specific third-party media articles, resulting in members being associated with news stories that were not about them. It was clear from the complaints that matching by name only was insufficient, giving rise to data protection concerns, primarily regarding the lawfulness, fairness and accuracy of the personal data processing utilised by the ‘Mentions in the news’ feature.

As a result of these complaints and the intervention of the DPC, the data controller undertook a review of the feature. The result of this review was to suspend the feature for EU-based members, pending improvements to safeguard its members’ data.

 

15)  Case Study 15: Prosecution of Viking Direct (Ireland) Limited

In April 2017, we received a complaint from a business owner regarding unsolicited marketing emails that her business email address was receiving from Viking Direct (Ireland) Limited. The complainant indicated that she had previously contacted the company to ask for her business email address to be removed from its marketing list but, despite this, further marketing emails continued to be sent.

During our investigation, Viking Direct (Ireland) Limited confirmed that the complainant had asked to be removed from its mailing list several times. It explained that the internal processes of moving the data to the suppression list had failed and the data remained on the mailing list. The company stated that the systems had now been corrected and tested, such that the situation should not recur. It apologised for any inconvenience caused to the complainant. Our investigation found evidence of three opt-out requests sent by the complainant to Viking Direct (Ireland) Limited by email between 30 March 2017 and 11 April 2017.

Viking Direct (Ireland) Limited had been the subject of an investigation in 2012 on foot of a complaint made to the DPC about unsolicited marketing emails. At that time, we concluded that investigation with a warning to the company. In light of that warning, the DPC decided to prosecute the company in respect of the 2017 complaint.

At Dublin Metropolitan District Court on 14 May 2018, the company entered a guilty plea to one charge of sending an unsolicited marketing email to a business email address in contravention of Regulation 13(4) of S.I. No. 336 of 2011. Under this regulation, it is an offence to send an unsolicited direct-marketing communication by electronic mail to a subscriber (which includes business subscribers) where that subscriber has notified the sender that it does not consent to the receipt of such a communication. The case was adjourned for sentencing until 11 June 2018. At the sentencing hearing, the court applied Section 1(1) of the Probation of Offenders Act in lieu of a conviction and fine. The company agreed to cover the prosecution costs incurred by the DPC.

 

16)  Case Study 16: Prosecution of Clydaville Investments Limited, T/A The Kilkenny Group

In November 2017, we received a complaint from an individual who received a marketing email from the Kilkenny Group. The email, which was personally addressed to him, promoted a pre-Christmas sale and informed him that there was up to 50% off and that everything was reduced. The complainant informed us that he did not believe that he had opted into receiving marketing emails.

During our investigation, it emerged that a previous marketing email had been sent to the same complainant one year earlier, in November 2016, inviting him to a corporate event in the company’s Cork store. The complainant subsequently advised us that he recalled replying to that email, asking that his email address be deleted. In September 2012, arising from our investigation of a complaint about unsolicited marketing text messages sent by the Kilkenny Group to a different complainant, we had issued a warning to the company. In light of that, the DPC decided to prosecute the company in respect of the 2017 complaint.

The matter came before Tralee District Court on 15 October 2018. The defendant faced a total of four charges. Two related to alleged contraventions of Regulation 13(1) of S.I. No. 336 of 2011 for the sending of unsolicited marketing emails to the complainant in November 2016 and November 2017 without his consent. Two further charges related to alleged contraventions of Regulation 13(12)(c) of S.I. No. 336 of 2011. This regulation provides that a person shall not send electronic marketing mail that does not have a valid address to which the recipient may send a request that such a communication shall cease. As guilty pleas were not entered to any of the charges, the matter went to a full hearing involving three defence witnesses and two prosecution witnesses, including the complainant. At the end of the proceedings, the court found the facts were proven in relation to two contraventions of Regulation 13(1) in relation to the sending of two marketing emails without consent. On the understanding that the defendant would discharge the prosecution costs of €1,850, the court applied Section 1(1) of the Probation of Offenders Act in respect of both charges in lieu of a conviction and fine. The court dismissed the two charges in respect of Regulation 13(12)(c).

 

17)  Case Study 17: Prosecution of DSG Retail Ireland Limited

DSG Retail Ireland Limited operates under various trading names and registered business names such as Dixons, Currys, PC World and Currys PC World. In November 2017, we received a complaint from a woman who had purchased a television from Currys a year previously. She informed us that she gave her email address to the company for the purposes of receiving a receipt and that she did not consent to receiving marketing emails. She stated she had unsubscribed from receiving further emails but the unsolicited emails continued.

During our investigation, the company told us that the customer had successfully unsubscribed from its mailing list in November 2016. However, when she made a new purchase in January 2017 and once again opted out of receiving marketing communications, a duplicate record was created following the customer’s second transaction. According to the company, this duplicate record, coupled with a system bug arising during an update to its systems in May 2017, resulted in an error regarding the recording of the customer’s marketing preferences. As a result, there was a period between August and November 2017 during which marketing emails were sent to her.

As we had previously issued a warning to the company in November 2014 on foot of a previous complaint from a member of the public concerning an alleged contravention of the regulations in relation to unsolicited marketing emails, the DPC decided to prosecute the company in respect of the latest suspected contravention.

At Dublin Metropolitan District Court on 22 October 2018 the company entered a guilty plea in relation to a charge for contravention of Regulation 13(1) of S.I. No. 336 of 2011 for the sending of an unsolicited marketing email to the complainant without her consent. In lieu of a conviction and fine, the court ordered the company to make a charitable donation of €1,500 to the Peter McVerry Trust. The defendant company agreed to cover the prosecution costs of the DPC. Confirmation of the charitable donation was subsequently provided to the court on 26 November 2018 and the matter was struck out.

 

18)  Case Study 18: Prosecution of Vodafone Ireland Limited

In May 2018, we received a complaint from an individual who stated he was receiving frequent unsolicited calls from Vodafone’s marketing team. He claimed that Vodafone initially called him on 10 May 2018, at which point he said he was not interested in their offer; since then the company had called him every day. He ignored the communications.

During our investigation, we confirmed that a recording of the marketing telephone call on 10 May 2018 included the complainant advising the calling agent that he was not interested in Vodafone’s broadband service. Vodafone told us that the agent should then have removed the telephone number from the marketing campaign by using an appropriate code when closing the call. Human error had led to the phone call being closed with an incorrect code for a call-back, meaning that the complainant’s phone number remained in the campaign, leading to the further calls.

We received a separate complaint in July 2018 from a Vodafone customer. He reported that he had received an unsolicited marketing telephone call from Vodafone in June 2018 despite having opted out of receiving marketing telephone calls during a previous unsolicited marketing telephone call in May 2018, confirmation of which had been sent to him by email shortly afterwards.

In response to our enquiries, Vodafone referred to a data-breach report that it had submitted to the DPC on 21 June 2018. This report notified the DPC that several customers who had opted out of marketing between 18 May and 11 June 2018 had erroneously received marketing communications due to difficulties in the implementation of system changes as part of its GDPR-compliance programme. This resulted in recently changed marketing preferences not being read clearly on all its systems and, accordingly, the customers concerned were wrongly included in marketing campaigns.

The DPC decided to prosecute Vodafone in relation to both cases. At Dublin Metropolitan District Court on 22 October 2018, the company entered guilty pleas in relation to two charges for contraventions of Regulation 13(6)(a) of S.I. No. 336 of 2011 for the making of unsolicited marketing telephone calls to the mobile telephones of the two complainants without their consent. The court convicted Vodafone on the two charges and imposed fines of €1,000 in respect of each of the two charges (a total fine of €2,000). Vodafone agreed to cover the prosecution costs of the DPC.

 

19)  Case Study 19: Prosecution of Starrus Eco Holdings Limited, T/A Panda and Greenstar

In April 2018, a customer of the bin-collection service provider, Panda, complained to us that he had received unsolicited marketing SMS and email messages to which he had not consented, advertising Panda’s electricity business. He stated that the messages did not provide an unsubscribe option.

During our investigation, we were informed by Panda that the complainant should not have received the marketing messages. It said that due to a human error, a staff member of the marketing department had incorrectly believed that the complainant had consented to receiving direct-marketing messages. It regretted the failure to include an opt-out on the messages and explained that its service provider for marketing emails had failed to act in accordance with its instructions to include an opt-out. In May 2018, we received a complaint from a customer of Greenstar, another bin-collection service provider. This individual had previously complained to us in 2011 about unsolicited marketing text messages sent to him without consent. We concluded that previous complaint by issuing a warning to Greenstar in September 2011. The complainant now reported to us that direct marketing from Greenstar by means of SMS messages had started aggressively once again.

In response to our enquiries, Greenstar informed us that, given the lapse of time since the 2011 complaint (which it acknowledged was absolutely no excuse), its records pertaining to the complainant were not what they should have been with respect to his having previously opted out of receiving marketing from the company. Neither the complainant’s details nor the details of the 2011 complaint were accurate and up-to-date, and as a result the company used the complainant’s mobile telephone number for marketing purposes when it should not have done so.

In light of our previous warning, the DPC decided to prosecute Starrus Eco Holdings Limited, T/A Panda and Greenstar, in respect of offences committed in both cases. At Dublin Metropolitan District Court on 24 October 2018, the company entered guilty pleas in relation to charges for contraventions of Regulation 13(1) of S.I. No. 336 of 2011 for the sending of unsolicited marketing SMS messages to the two complainants without their consent. In lieu of a conviction and fine, the court ordered the company to make a charitable donation of €2,000 to the Peter McVerry Trust. The defendant company agreed to cover the prosecution costs of the DPC. Confirmation of the charitable donation was subsequently provided to the court on 15 November 2018 and the matter was struck out.

 

  1. Prosecution of Guerin Media Limited
  2. Prosecution of AA Ireland Limited
  3. The Dublin Mint Office Limited
  4. Access Request made to NAMA
  5. Disclosure of CCTV footage from a direct provision centre
  6. The importance of data controllers having appropriate mechanisms in place to respond to access requests and document compliance

1)  Case Study 1: Prosecution of Guerin Media Limited

The DPC received unrelated complaints from three individuals about unsolicited marketing emails that they had received from Guerin Media Limited. In all cases, the complainants received the marketing emails to their work email addresses. None of the complainants had any previous business relationship with Guerin Media Limited. The marketing emails did not provide the recipients with an unsubscribe function or any other means to opt out of receiving such communications. Some of the complainants replied to the sender requesting that their email address be removed from the company’s marketing list. However, these requests were not actioned and the company continued to send the individuals further marketing emails. In one case, nine marketing emails were sent to an individual’s work email address after he had sent an email request to Guerin Media Limited to remove his email address from its mailing list.

The DPC’s investigation into these complaints established that Guerin Media Limited did not have the consent of any of the complainants to send them unsolicited marketing emails and that it had failed in all cases to include an opt-out mechanism in its marketing emails.

The DPC had previously received four similar complaints against Guerin Media Limited during 2013 and 2014 in which the company had also sent unsolicited marketing emails without having the consent of the recipients to receive such communications and where the emails in question did not contain an opt-out mechanism. On foot of the DPC’s investigations at that time, the DPC warned Guerin Media Limited that it would likely face prosecution by the DPC if there was a recurrence of such breaches of the E-Privacy Regulations. Taking account of the previous warning and the DPC’s findings in its current investigation, the DPC decided to prosecute Guerin Media Limited for 42 separate breaches of the E-Privacy Regulations.

The prosecutions came before Naas District Court on 5 February 2018 and the company pleaded guilty to four sample charges out of the total of 42 charges. Three of the sample charges related to breaches of Regulation 13(1) of the E-Privacy Regulations for sending unsolicited marketing emails to individuals without their consent. The fourth sample charge related to a breach of Regulation 13(12)(c) of the E-Privacy Regulations for failure to include an opt-out mechanism in the marketing emails. The Court convicted Guerin Media Limited on all four charges and imposed four fines of €1,000 each, i.e. a total of €4,000. The company was given a period of six months in which to pay the fines. It also agreed to make a contribution towards the prosecution costs incurred by the DPC.

Marketing to work email addresses

There is a common misconception that the sending of email communications to individuals at a work email address is a form of business-to-business communication for which the consent of the individual is not required. The E-Privacy Regulations allow a carve-out from the default rule (i.e. that the sending organisation must have the consent of the receiving individual) under which such communications may be sent to an email address that reasonably appears to be one used by a person in the context of their commercial or official activity. However, in order to rely on this exception to the general rule requiring consent, the sender must be able to show that the email sent related solely to the recipient’s commercial or official activity, in other words, that it was a genuine business-to-business communication. In effect, this means that marketing material that is directly relevant to the role of the recipient in the context of their commercial or official activity (i.e. within their workplace) may be sent by an organisation without the prior consent of the recipient. That was not the case in the circumstances at issue: the marketing communications sent by Guerin Media Limited related to attempts by that company to sell advertisement space in various publications and to sell stands at exhibitions, yet none of the individual complainants who received those communications had any role in relation to marketing-related matters within their own workplaces.

While not directly applicable here, as the complainants were all individuals, organisations should also take note of a further rule in the E-Privacy Regulations concerning situations where the recipient of an unsolicited direct marketing communication is not an individual (e.g. the email address used is solely a company/corporate one and does not relate to the email account of an individual, whether at work or otherwise). In such a case, where the company/corporate recipient notifies the sender that it does not consent to receiving such emails, it is unlawful for the sender to subsequently send such emails.

This case is an important demonstration that any organisation engaging in electronic direct marketing activities should carefully establish the basis on which it considers that the primary default rule, requiring a sending organisation to have the consent of the recipient, does not apply to it in any given case, and how it can demonstrate this. The case also illustrates the importance of including an opt-out mechanism in each and every electronic direct marketing communication, as failure to do so constitutes a separate offence (in addition to any offences in relation to failure to obtain consent) in respect of each such email/message.

2)  Case Study 2: Prosecution of AA Ireland Limited

In December 2017 the DPC received a complaint from an individual who had received unsolicited marketing text messages from AA Ireland Limited. He informed the DPC that he had recently received his motor insurance renewal quotation from his current insurance provider and had decided to shop around to try to get a more competitive quotation. One of the companies he telephoned for a quotation was AA Ireland Limited. The complainant informed the DPC that he had expressly stated to the agent who answered his call that he wanted an assurance that his details would not be used for marketing purposes and that he had been given that assurance by the agent. The phone call continued with the agent providing a quotation. The complainant noted that the quotation was higher than the renewal quotation from his current insurance provider and he indicated to the agent that he would not be proceeding with the quotation offered by AA Ireland Limited. The complainant informed the DPC that at this point in the call he had reiterated to the agent that he should not receive marketing material and he was once again assured by the agent that this would not happen.

The essence of the complaint, however, was that the day after the phone call in question the complainant had received a marketing text message from AA Ireland Limited offering him €50 off the quote provided. A further similar text message was sent to his mobile phone one day later. The complainant stated in his complaint that he felt this action was a blatant breach of his very clear and precise instructions that he did not wish to receive any marketing communications.

During the course of our investigation, AA Ireland Limited confirmed that it had sent both text messages to the complainant and admitted that it had not obtained consent to send these messages to the complainant. The company acknowledged that the complainant had requested that he not receive marketing messages, that the complainant’s request should have been actioned and that his details should not have been used for marketing purposes. The company claimed that the incident arose as a result of human error. It explained that the correct process had not been followed by the agent so that the complainant’s details had been recorded with an opt-in for him to receive marketing messages therefore resulting in marketing text messages being sent to him.

As the DPC had previously issued a warning in separate circumstances to AA Ireland Limited in relation to unsolicited marketing communications, in this instance the DPC decided to initiate prosecution proceedings. At Dublin Metropolitan District Court on 14 May 2018 AA Ireland Limited entered a guilty plea to one offence. It also agreed to cover the prosecution costs incurred by the DPC. In lieu of a conviction and fine, the Court applied Section 1(1) of the Probation of Offenders Act.

3)  Case Study 3: The Dublin Mint Office Limited

The DPC received a complaint on 13 October 2017 from an individual who had received two marketing telephone calls that same day, one targeted at him and one at his son, from The Dublin Mint Office Limited. The caller in each case had attempted to sell commemorative coins. In his complaint, the complainant explained that he had registered online a few months earlier with the company for an online offer on his own behalf and on behalf of his son, providing the same telephone contact number for both of them during the online registration process. The complainant stated that he ticked the marketing opt-out box during that online registration process.

During the course of the DPC’s investigation, The Dublin Mint Office Limited admitted that it had made the marketing telephone calls. It explained that when the complainant supplied his telephone number during the online application process in May 2017 the order form had only offered an opt-in option for receiving marketing mails and emails. The company confirmed that the complainant had not selected the opt-in option and he was therefore marked as opted out of marketing mails and emails only. The company explained that, due to a gap in the system in place at the time, the form only allowed for an opt-in to marketing mails and emails and did not provide for an opt-out from telesales. As a result, the complainant’s details were included in a list for a follow-up telesales call. The company informed the DPC that it had written to the complainant to apologise for the inconvenience caused to him and to his son by its inadvertent mistake.

The DPC had previously issued a warning to The Dublin Mint Office Limited in September 2017 concerning other complaints which had been made to the DPC concerning unsolicited marketing communications by the company. The DPC therefore decided to prosecute The Dublin Mint Office Limited. At Dublin Metropolitan District Court on 14 May 2018 the company pleaded guilty to two charges in relation to both marketing telephone calls. It also agreed to cover the DPC’s prosecution costs. In lieu of a conviction and fine, the Court applied Section 1(1) of the Probation of Offenders Act.

4)  Case Study 4: Access Request made to NAMA

Background

In February 2018, the DPC issued a decision on a complaint which had been made to it by two individuals against the National Asset Management Agency (NAMA). The complaint concerned allegations of non-compliance with a joint access request which had been made to NAMA in September 2014 by the complainants who were the directors and/or shareholders of a number of companies whose loans had transferred to NAMA. Certain personal loans of those individuals had also transferred to NAMA. The joint access request which had been made to NAMA expressly referenced personal data held by NAMA in connection with both the personal loans and the company loans.

NAMA responded to the complainants in October 2014, asking them to identify which of a number of categories of personal data (which NAMA itself had identified) that they wished to receive. The complainants replied, objecting to the manner in which NAMA’s response had sought to limit the scope of the request. NAMA subsequently provided the complainants with a copy of the personal data which it considered the complainants were entitled to but noted that it was not required to provide personal data which was subject to legal privilege, which comprised confidential expressions of opinion or which would prejudice the interests of NAMA in respect of a claim or which would prejudice the ability of NAMA to recover monies owed to the State. However, NAMA did not identify the personal data in respect of which it considered such exemptions from the right of access applied. While the personal data provided by NAMA to the complainants related to the personal loans of the complainants which had previously transferred to NAMA, it did not include personal data relating to the complainants as directors and/or shareholders in the companies whose loans had transferred to NAMA.

Complaint to the DPC

The data subjects subsequently made a complaint to the DPC which alleged:

  • that NAMA had failed to provide all of the complainants’ personal data to them;
  • that NAMA had incorrectly applied exemptions under the Data Protection Acts 1988 and 2003;
  • that even if NAMA was entitled to rely on one or more exemptions, it was obliged to provide the complainants with a description of the personal data so that they had a reasonable and fair opportunity to consider whether it did fall under an exemption; and
  • that NAMA had failed to conduct searches for personal data relating to ten additional categories of information identified by the complainants.

NAMA’s position on the complaint

NAMA stated that it had fully complied with the access request. Following an exchange of correspondence with the DPC, NAMA contended:

  • that “corporate data”, i.e. information relating to the loans of the companies linked to the complainants, did not fall within the definition of “personal data”;
  • that it was released from its obligations to provide access to personal data contained within the totality of the records held in relation to both the personal loans and the company loans, on the basis that conducting such searches would require ‘disproportionate effort’ on the part of NAMA; and
  • that it was appropriate for NAMA to rely on statutory exemptions from the right of access, as provided under Sections 5(1)(a), 5(1)(f) and 5(1)(g) of the Data Protection Acts 1988 and 2003.

DPC Investigation

In a submission to the DPC, NAMA provided estimates of the number of relevant records it held, and the potential financial cost of completing a comprehensive search for all personal data requested. NAMA confirmed that it had not conducted searches for the complainants’ personal data held in relation to company loans.

In order to substantiate its position, NAMA agreed to conduct sample searches for personal data in respect of a particular two-month period. Authorised officers on behalf of the DPC conducted three on-site investigations at NAMA premises to corroborate NAMA’s position on issues relating to its searches. Following a review of the sample searches carried out, DPC officers were not satisfied that a comprehensive search would involve a disproportionate effort on the part of NAMA, or that information held by NAMA relating to the complainants’ company loans did not also contain personal data of the complainants.

Following engagement between the DPC and NAMA, additional personal data was released to the complainants. However, efforts to resolve this matter informally were to no avail. The DPC subsequently issued a lengthy statutory decision, running to some 67 pages, in relation to the complaint. This decision addressed the three core issues referred to above.

The Commissioner’s Decision

The DPC’s findings on each of these issues were as follows.

(1) The Corporate Data Issue

While NAMA acknowledged that the complainants’ names appeared in records relating to the company loans, reflecting that they were directors and/or shareholders of the companies in question, and while NAMA accepted that the complainants’ names were their personal data, it contended that this did not make the other information in those records their personal data. The complainants’ position, meanwhile, was that there was no doubt but that information relating to a person in their capacity as a company director could constitute personal data. They also pointed to the fact that information referencing an assessment of their performance/conduct or the evaluation of their assets constituted personal data even if it was concerned with company loans or the business of those companies. The complainants also contended that while records in relation to the company loans and their personal loans were held separately, the reality was that all of NAMA’s dealings with them were interconnected.

The DPC in her decision noted that the mere fact of one of the complainants’ names appearing in records relating to the company loans (for example if they had simply signed a commercial agreement in their capacity as director of a company) was not sufficient in and of itself for other information in that agreement to constitute personal data. However, the records which had been identified through the sample searches bore out the complainants’ contentions that those records could not be assumed to contain no personal data at all. The DPC noted by way of example that it was clear from a document, the title of which referred to a NAMA board meeting, that while the board meeting had discussed and considered a business plan referable to one of the companies, there was information in that document relating to the financial assets of the complainants in their personal capacities. The DPC accepted the complainants’ position that the records held by NAMA regarding the company loans contained at least some personal data relating to them. Therefore the DPC considered that NAMA must, at the very least, identify the records or types of records in which the complainants were identified by name or otherwise but which NAMA considered did not constitute personal data, and provide sufficient information for the complainants to understand why it was said that those records or types of records did not constitute or contain personal data.

(2) The Disproportionate Effort Issue

The DPC then considered whether the time and money costs involved in NAMA conducting searches of the records held in relation to the company loans would be disproportionate relative to the amount of personal data that might be found and disclosed to the complainants. The DPC noted that while there is no express obligation on a data controller to search for personal data in order to comply with a properly made access request, she accepted that there was an implied obligation on a data controller to undertake searches so as to identify what personal data it might hold on a requester. The question for consideration concerned the nature and extent of this implied duty. The DPC noted that the disproportionate effort provision found in Section 4(9)(a) of the Data Protection Acts 1988 and 2003, on the face of that provision, applied only to limit the obligation to provide the data constituting the personal data in permanent form. However, it did not limit the earlier steps in the process such as the obligation to search for the data. While the DPC referred to jurisprudence from the UK Courts which has established that the implied obligation to search for personal data is limited to a reasonable and proportionate search, she noted that she was not aware of any judicial authority in Ireland dealing with the nature or extent of a controller’s obligations to conduct searches in order to comply with Section 4 of the Data Protection Acts 1988 and 2003. While accepting that there was no obligation on her to recognise the principles established by the UK authorities, the DPC noted that one particularly pertinent decision to this effect (Deer v. University of Oxford) had previously been referenced by the Irish High Court (in the judgment of Coffey J. delivered on 26 February 2018 in the case of Nowak v. Data Protection Commissioner). The DPC considered that decision to be helpful in interpreting Sections 4(1) and 4(9) of the Data Protection Acts 1988 and 2003, particularly given its analysis of case law from the CJEU. On that basis the DPC accepted NAMA’s contention that the obligation to search for personal data was not without limits but rather that NAMA was required to undertake reasonable and proportionate searches to identify the personal data of the complainants which it held.

The DPC then went on to consider whether NAMA had discharged this obligation, by carrying out the type of balancing exercise which had been contemplated in the UK case law, between upholding the data subject’s right of access and the burden which it would impose on the data controller. In doing so, the DPC considered a number of factors bearing upon this balancing exercise, including the intrinsic significance of the personal data and its relative importance to the requesters. In this regard, the DPC noted that the personal data in question related to the business and financial interests of the complainants both personally and in respect of the companies of which they were directors and/or shareholders. It was also considered relevant (as was evident from the correspondence seen by the DPC’s officers) that the complainants were trying to bring about a situation in which the company loans would be dealt with by NAMA in a way that would ensure the survival of the companies and preserve the complainants’ ability to retain some level of ownership or control in those companies. Consequently, the DPC considered the personal data held by NAMA to be of significant importance to the complainants.

The DPC then considered the countervailing points made by NAMA, including specific estimates (calculated on the basis of the results from the sample searches) provided to the DPC relating to the estimated number of hits produced if searches were to be carried out (approximately 62,000), the estimated number of relevant records which would be identified following a review of those hits (approximately 12,600) and the estimated time which it would take to review, assemble and redact the material for release to the complainants (over 2,700 hours). It was also noted by the DPC that, while NAMA had referred to the potential for technical solutions to reduce the manual input required, NAMA had stated that this was not something it had assessed and that its view was that, should such solutions exist, they would incur a disproportionate cost of implementation.

The DPC found NAMA’s estimates as regards the time and effort involved in carrying out searches over the full period to be speculative in nature and lacking in specific detail, and that it had failed to discharge the burden of proof on it in this regard. This was particularly so in light of the fact that NAMA’s previous position (prior to the sample searches having been conducted), that there was no personal data of the complainants held in the records relating to the company loans, had not been borne out in fact by reference to the results of those sample searches. NAMA had, it was noted, originally agreed to conduct searches for the whole period during which it held the company loans if the sample searches demonstrated that there was personal data of the complainants held in the records relating to the company loans. However, some 14 months later NAMA had changed its position and decided not to undertake any further searches at all despite the sample searches having shown the presence of personal data in the company loans records. The DPC also considered that NAMA’s claims that (1) a technical solution would not be feasible (a claim made in the absence of any assessment to this effect) and (2) the effort involved in carrying out the searches and providing the personal data identified would divert its resources away from its statutory remit (a claim which was unparticularised), did not discharge the burden of proof to which it was subject in respect of its claims of disproportionate effort.

The DPC found that, in refusing to conduct the searches, NAMA had not sought to balance its rights against the complainants’ rights but had set them at nought. NAMA had not discharged its obligation to conduct reasonable and proportionate searches to find relevant personal data and supply it. The DPC was not satisfied, on the basis of the arguments and evidence put forward by NAMA, that conducting the searches would constitute disproportionate effort on its part.

(3) The Statutory Exemptions Issue

The sample searches which had been carried out by NAMA led to the identification of 14 hard copy documents containing the personal data of the complainants, drawn from NAMA’s records relating to both the company loans and the personal loans. However, NAMA withheld or redacted 3 of these documents on the basis of certain exemptions to the right of access. These exemptions related to Section 5(1)(g), Section 5(1)(f) and Section 5(1)(a) of the Data Protection Acts 1988 and 2003. As a preliminary matter the DPC found that NAMA must prove convincingly, and by evidence meeting the civil standard of proof, that each of the exemptions on which it sought to rely did in fact apply in this case and operated to trump the complainants’ rights of access.

In the case of the legal privilege exemption which NAMA claimed applied to an internal email passing between solicitors employed at NAMA, the DPC noted that this document on its face was labelled as attracting litigation privilege. However given that no litigation was in being between the complainants and NAMA at the time of its creation (and in fact the only litigation now in being had only come into existence some 2 to 3 years later), the DPC was not satisfied that NAMA had discharged the burden of proof on it to show that litigation privilege applied to the personal data in question. However, the DPC then went on to consider whether legal advice privilege applied and concluded that it did because the content of the email in question set out the basis on which certain issues relating to the personal loans might be considered and addressed. The DPC was therefore satisfied that the email in question was privileged and exempt from release under Section 5(1)(g) of the Data Protection Acts 1988 and 2003.

With regard to two further documents, NAMA claimed that the exemption in Section 5(1)(a) applied. This provides that the right of access does not apply to personal data kept for the purposes of preventing, detecting or investigating offences, apprehending or prosecuting offenders, or assessing or collecting any tax, duty or other moneys owed or payable to the State, a local authority or health board in any case in which granting access to the personal data would prejudice any such matters. The DPC applied the test for application of this exemption which had been set out in the UK judgment of Guriev & another v. Community Safety Development (UK) Limited [2016] EWHC 643. That case had concerned the equivalent exemption under the UK Data Protection Act 1998. The DPC found that NAMA had simply asserted that in the case of the two records in question, providing access to the personal data would have the effect of disclosing its strategy in dealing with liabilities. However NAMA had made no effort to explain the nature and effect of the prejudice that would be suffered if the personal data in question was released, how the release of it would lead to the prejudice, nor how applying the exemption was a necessary and proportionate interference with the complainants’ rights having regard to the gravity of the threat to the public interest. In light of this lack of evidence, the DPC decided that it was not open to NAMA to rely on this exemption.

The final exemption relied on by NAMA and considered by the DPC was Section 5(1)(f), which provides that the right of access does not apply to personal data consisting of an estimate of, or kept for the purposes of estimating, the amount of the liability of a data controller on foot of a claim in respect of damages or compensation, where granting access would be likely to prejudice the interests of the data controller in relation to the claim. The DPC found that no evidence had been put forward by NAMA as to the factual basis for relying on the exemption. For example, NAMA had not identified the prejudice which it would suffer if it provided the personal data, or how or in what context the prejudice would arise. As NAMA had failed to discharge the burden of proof on it in relation to its claim to this exemption, the DPC found that it was not open to NAMA to rely on it.

Decision

Arising from the DPC’s findings, the DPC concluded that NAMA was in breach of its obligations under Section 4(1)(a) and Section 4(9) of the Data Protection Acts 1988 and 2003.

5) Case Study 5:  Disclosure of CCTV footage from a direct provision centre.

We received a complaint from solicitors for a resident of a direct provision accommodation centre in relation to an alleged disclosure of CCTV footage capturing the complainant’s images. The accommodation centre in question is owned by the State (with responsibility for it resting with the Reception and Integration Agency (RIA) which sits within the Department of Justice and Equality). The centre is managed on a day-to-day basis by Aramark Ireland (Aramark). The alleged disclosure of the complainant’s personal data came to her attention during her participation in a radio programme. The subject matter of that radio show concerned a matter that had arisen between residents of the accommodation centre and its staff. During the course of the radio programme, the radio host claimed that he had a copy of CCTV footage, which was apparently taken from a room in the accommodation centre, which allegedly showed an altercation between the complainant and another resident of the direct provision centre.

The complainant subsequently made complaints to RIA, to Aramark and to the radio station which had aired the radio programme in question. An access request for a description of all recipients to whom the complainant’s personal data had been disclosed was also made on behalf of the complainant under Section 4 of the Data Protection Acts 1988 and 2003 to RIA. However, the complaint noted that RIA had not responded to that access request.

We commenced an investigation into the complaint, seeking information from both Aramark and the RIA. The RIA informed us that it was liaising with Aramark and had requested a report from it. During the investigation, we established that Aramark was a data processor processing personal data on behalf of the RIA. Aramark submitted that CCTV is used for security purposes and to monitor health and safety within the accommodation centre. Aramark informed us that it processes personal data in line with the RIA’s instructions and that access to the storage medium within the accommodation centre was limited to specific authorised personnel, with accompanying username and password requirements.

In relation to the specific allegation of disclosure of the CCTV footage, Aramark told us that CCTV footage of an altercation involving the complainant had been downloaded by authorised personnel from Aramark and transmitted to the RIA. The reason for the download and transmission was that the captured events related to security and health and safety issues. According to Aramark, due to the size of the file in question, the employee had saved the footage to a Google link for onward transmission to the RIA.

Aramark informed us about a detailed forensic IT enquiry that had been conducted across its IT systems in relation to the complaint, to identify whether any other disclosure of the CCTV footage had taken place. It maintained, on the basis of its own investigations, that the link had not been sent from any Aramark email account to an outside party other than the RIA. Amongst other things, as part of the forensic enquiry, Aramark said that it had checked internet logs on the Aramark computer used at the accommodation centre, searched the mailboxes of Aramark staff who worked at the accommodation centre and searched for email correspondence inbound and outbound relating to the incident. A data recovery program had also been installed on the computer in question to review all deleted content on the computer. No activity indicating disclosure of the CCTV footage to any third party had been identified. Aramark further informed us that the Google link no longer existed and was therefore not accessible.

Aramark also maintained that the authorised personnel who had downloaded the footage had confirmed that the footage had not been disclosed to any third party and that it had been deleted following transmission to the RIA.

Separately the RIA confirmed to us during our investigation that the Google link to the CCTV footage which it had received, referenced the complainant and another resident. It stated that a copy of the footage had not been retained by the RIA.

In relation to the management of the CCTV system in the accommodation centre, the RIA furnished us with certain documentation including Aramark’s data protection and CCTV policies and a confidentiality agreement in place with Aramark. However, the RIA acknowledged during our investigation that there were no policies or practice documents in place for the management of CCTV  operating in accommodation centres.

Ultimately neither Aramark nor the RIA was able to definitively confirm that the CCTV footage in question had not been disclosed to the radio station. In relation to its non-compliance with the access request, the RIA’s position was that it was waiting on a detailed report from Aramark and that it could not respond to the access request until it had received that report.

In her decision, the DPC found that the RIA did not respond to the request by the complainant for a description under Section 4 of the Data Protection Acts 1988 and 2003 of all recipients to whom the personal data was disclosed, within the prescribed timeframe of 40 days. This was in direct contravention of RIA’s obligation under that provision.

In relation to the oversight of the processing carried out by Aramark as a processor for RIA, based on the submissions made by both the RIA and Aramark in the course of the DPC’s investigation, there was no evidence of a written contract in place which delineated the respective obligations applicable to the RIA and Aramark in relation to the processing of personal data by Aramark on the RIA’s behalf. This constituted a contravention by the RIA, as the data controller, of Section 2C(3) of the Data Protection Acts 1988 and 2003.

Although the DPC was unable to establish how the CCTV footage in question came to be in the possession of the radio station, the DPC found that ultimately the complainant’s rights were infringed. In this regard both the RIA and Aramark failed in their duty of care to the complainant by failing to ensure that appropriate security measures were taken against the unauthorised disclosure, as required by Section 2(1)(d) of the Data Protection Acts 1988 and 2003. The DPC also decided that a contravention of Section 2C(2) of the Data Protection Acts 1988 and 2003 had occurred. This provision requires a controller to take reasonable measures to ensure that its employees and other persons at the place of work are aware of and comply with security measures. The lack of agreed procedures and in-depth policies in place between the RIA and Aramark relating to the transfer of personal data over a network led to this decision.

This case illustrates the unintended and unforeseen consequences which can result from an absence of clear, documented policies and procedures governing the transmission of personal data over a network. In this case, that failure was compounded by the further failure by the RIA to have a written agreement in place which clearly set out the parameters of Aramark’s instructions to process personal data on behalf of the RIA. As this case demonstrates, such failures by a controller to comply with its data protection obligations are not just administrative or regulatory breaches but can result in grave incursions into an individual’s Charter-protected right to protection of their personal data which should otherwise have been avoidable.

6)  Case Study 6: The importance of data controllers having appropriate mechanisms in place to respond to access requests and document compliance.

We received a complaint from a data subject concerning the alleged failure of eir to comply in full with an access request. The complainant advised us that in response to his access request he had received from eir what he described as “a bundle of random pages of information without any explanation of content”.

In the course of our investigation we established that eir was in fact seeking to rely on certain statutory exemptions to the right of access. However, in its response to the requester’s access request, it had not referred at all to the fact that it had withheld certain personal data. It was only in communications with eir, during the course of our investigation, some five months after eir’s receipt of the access request, that eir indicated that it was withholding personal data based on exemptions and outlined the details of the exemptions relied on by reference to an attached list.

In the course of our investigation it also became apparent that eir was unable to determine what personal data had actually been provided to the complainant as it had not retained a copy of the personal data which had been provided. As a consequence of the lack of records kept on the personal data which had been released, eir was also unable to identify what personal data had been withheld/ not provided either in reliance on an exemption under the Data Protection Acts 1988 and 2003, or otherwise.

We pointed out to eir that it would be difficult to see how eir would be in a position to provide clarification to us as to its purported application of any statutory exemption to this particular access request, given that it was not clear on what personal data had been provided to the complainant in the first place. We accordingly directed eir to re-commence the process of responding to the access request afresh. We specified that in doing so, eir should:

  1. Examine its systems, both manual and electronic and carry out a review of all the personal data held by it relating to the complainant in manual and electronic form;
  2. Write to the complainant within a period of not more than fourteen days of the date of our request, responding to the substance of his access request in accordance with the provisions of Section 4 of the Acts. In so doing, we required that eir provide access to all personal data held or controlled by it, while also explaining to the requester the reason for the re-issue to him of personal data which had already been provided, i.e. that eir was unable to determine what personal data had already been issued to him. We also directed that, in this response, eir provide the requester with a statement of the reasons for the refusal to provide access to any personal data, identifying any statutory exemption relied on by eir and the basis on which eir contended that such exemption(s) applied in this case. Finally, we required that eir’s letter to the requester should be copied to us.

While ultimately the complainant in this case withdrew his complaint against eir, the issues identified during the course of our investigation underline the critical importance of data controllers having adequate organisational and operational mechanisms to allow them to comply with their statutory obligations with regard to access requests. However, it is equally important that a data controller is able to demonstrate post facto (where required by the DPC, such as in the context of a complaint) compliance with its obligations. A data controller must be able to justify decisions it has taken to deny access to personal data in reliance on one or more statutory exemptions. As a basic starting point, in order to be able to justify the position taken in relation to a request by a data subject to exercise a right, data controllers should have appropriate record-keeping systems and processes in place. These mechanisms should allow them to track and produce copies of any correspondence exchanged with a data subject in relation to an access request or request to exercise any other data protection right.

This case study also underscores the fact that the right of an individual to access personal data held about them is not just about being provided with access to the data itself. Importantly it is also concerned with sufficient, meaningful information being given to the data subject so that they can understand, amongst other things, what personal data is processed about them, in what circumstances and for what purposes. In this case the provision of a bundle of unexplained documents in response to the access request failed to satisfy the minimum requirements applicable to eir as a data controller under Section 4 of the Data Protection Acts 1988 and 2003, ultimately causing confusion for the data subject and prompting a complaint to the DPC.

  1. Right to be Forgotten
  2. Prosecution of Eamon O’Mordha & Company Limited and one of Its Directors
  3. Loss of sensitive personal data contained in an evidence file kept by An Garda Síochána
  4. Use of CCTV footage in a disciplinary process.
  5. Disclosure of sensitive personal data by a hospital to a third party
  6. Publication of personal information - journalistic exemption
  7. Compliance with a Subject Access Request & Disclosure of personal data / capture of images using CCTV
  8. Failure to respond fully to an access request
  9. Personal data of a third party withheld from an access request made by the parent of a minor
  10. Disclosure of Personal Data via a Social Media App
  11. Failure by the Department of Justice and Equality to impose the correct access restrictions on access to medical data of an employee
  12. Virgin Media Ireland Limited
  13. Sheldon Investments Limited (trading as River Medical)
  14. Tumsteed Unlimited Company (trading as EZ Living Furniture)
  15. Cunniffe Electric Limited
  16. Argos Distributors (Ireland) Limited
  17. Expert Ireland Retail Limited

1)   Right to be Forgotten

We received a complaint from a Lithuanian national concerning articles about that individual which had been published by a number of Lithuanian news sources ten years earlier. Links to these articles were returned in search results when a search against the individual’s name was carried out using a particular search engine. The articles in question detailed the termination of the individual’s employment as an official in a municipal government department in connection with the individual’s involvement in potentially fraudulent activities. The articles also detailed criminal charges which had been brought against the individual for allegedly accepting bribes in the context of their employment.

During the course of our investigation into this complaint, the search engine operator contended that the information detailed in the articles in question related to serious professional wrongdoing committed by an individual involved in public administration. It maintained that, where such wrongdoing resulted in criminal sanctions, this was sufficiently serious for the information to be considered to be in the public interest and that therefore any interference with the data subject’s rights was justified.

However in the course of our investigation the complainant provided us with official court documents which showed that they had been found not guilty of all the charges which had been referred to in the articles. The complainant also provided us with documents which showed that the termination of their employment with the municipal government department had been on a voluntary basis with the complainant having resigned due to personal reasons. We considered that this documentary information demonstrated that the complainant’s personal data, which was being processed by way of the search engine returning search results to the articles in question, was inaccurate, incomplete and out of date and on that basis we requested that the search engine operator delist the links to the webpages in question from search results which were returned from searches conducted against the complainant’s name. The search engine operator complied with our request and delisted the links in question.

This case illustrates that the onus is on a search engine, as the data controller, to satisfy itself to the appropriate level that the personal data to which search engine results provide links fully accords with the laws on data protection. In this case, it appeared that the search engine operator did not properly examine the complaint but simply took the approach of assuming that, because the complainant had previously been employed in a public official role, the information in question was automatically in the public interest, regardless of whether it was in fact accurate, complete and up to date. The search engine operator had assumed, without apparently even checking the factual background, that the complainant had been convicted of the criminal charges.

2)   Prosecution of Eamon O’Mordha & Company Limited and one of Its Directors

The investigation of this case arose in the context of a wide-ranging investigation of the Private Investigator sector that commenced in 2016. As part of that investigation, the Special Investigations Unit obtained and examined copies of several private investigator reports written in 2014 and 2015 by Eamon O’Mordha & Company Limited (the company) for its clients in the insurance sector. The Special Investigations Unit became suspicious of the origin of some of the personal data in those reports and it immediately commenced an investigation involving the Department of Social Protection and An Garda Síochána.

The investigation subsequently uncovered access by the company to social welfare records held on databases in the Department of Social Protection. An official in that Department was interviewed by Authorised Officers of the Data Protection Commissioner. During the course of that interview, the official revealed that the two directors of the company were friends of hers and she admitted that one of the company directors met with her regularly and asked her to check information on the Department’s database. The official admitted that she carried out those checks and provided personal information to the company director.

Separately, the investigation uncovered access by the company to records held on the PULSE database of An Garda Síochána. Two serving members of An Garda Síochána (who are brothers and nephews of one of the directors of the company) were interviewed by Authorised Officers of the Data Protection Commissioner. During the course of those interviews, both Gardaí confirmed that they had been contacted by their aunt to obtain information from them in relation to individuals and vehicle registration numbers. They both admitted that they had accessed the Garda PULSE database and that they had subsequently passed on personal information to their aunt, the company director.

Eamon O’Mordha & Company Limited was charged with 37 counts of breaches of the Data Protection Acts, 1988 and 2003 (the Acts). All charges related to breaches of section 22 of the Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person. The personal data was kept by the Department of Social Protection and An Garda Síochána. The personal data was disclosed to entities in the insurance sector. Two directors of the company, Eamonn O’Mordha and his wife Ann O’Mordha were separately charged with thirty-seven counts of breaches of section 29 of the Acts for their part in the offences committed by the company. This section of the Acts provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of the company directors or other officers.

On 8 May, 2017 at Dublin Metropolitan District Court, guilty pleas on behalf of the company were entered to twelve charges for offences under section 22 of the Acts. The Court convicted the company on ten charges and it took the further two charges into account. It imposed ten fines of €1,000 on the company (totalling €10,000). All remaining charges were struck out. Company director Ms. Ann O’Mordha pleaded guilty to twelve charges for offences under section 29 of the Acts. The Court convicted Ms. O’Mordha on ten charges and it took the further two charges into account. It imposed ten fines of €1,000 on Ms. O’Mordha (totalling €10,000). All remaining charges were struck out. The charges against her husband, the other company director, were not proceeded with.

3)   Loss of sensitive personal data contained in an evidence file kept by An Garda Síochána

We received a complaint from a couple against An Garda Síochána (AGS), concerning the loss of an evidence file that held, among other things, the couple’s sensitive personal data relating to details of medical treatment. We established that the couple had previously made a criminal complaint to AGS and had subsequently made an access request. However, in response to the access request, they were informed that the evidence file in relation to their complaint, which contained their original statements, a DVD and postal documents containing their sensitive personal data, had been misplaced while in the possession of AGS. The complainants requested that we conduct a formal investigation into the matter.

AGS informed us that upon identifying that the evidence file in question was missing, a comprehensive search had taken place of all files retained at local level in the District Office, and other relevant sections of AGS, in order to try to locate the file. Ultimately, however, the file had not been located.

During the course of our investigation, we studied the chain of custody supplied to us by AGS and established that the last known whereabouts of the file was in the investigating officer’s possession. That officer had been instructed by a superior to update the couple about the criminal complaint and to then return the file to the District Office for filing. However, the officer had failed to return the file to the District Office for filing. AGS informed us that the failure by the officer to return the file to the relevant location in the District Office was in contravention of its policy and procedures at the time and that consequently both an AGS internal investigation and a Garda Síochána Ombudsman Commission investigation had been conducted. Following the latter investigation, the officer in question had been disciplined and sanctioned for the contravention.

One of the central requirements of data protection law is that data controllers have an obligation to have appropriate security measures in place to ensure that personal data in their possession is kept safe and secure. This requires the controller to consider both technical and organisational measures and importantly, to take all reasonable steps to ensure that its employees, amongst others, are aware of and comply with the security measures. In her decision, the Commissioner found that AGS, as data controller, had infringed Section 2(1)(d) of the Data Protection Acts 1988 and 2003, as it failed to take appropriate security measures to ensure the safe storage of the complainants’ sensitive personal data which was contained on the evidence file in question.

This case demonstrates that the obligation on a data controller to maintain appropriate security measures goes beyond simply putting in place procedures regarding the storage and handling of personal data. Such procedures are only effective as a security control if they are consistently adhered to, so data controllers must monitor staff compliance with these measures and take meaningful steps (for example training, auditing and potentially disciplinary measures where non-compliance is identified) to ensure that staff systematically observe such procedures.

4)   Use of CCTV footage in a disciplinary process.

We received a complaint from an individual regarding the use of CCTV footage by their employer in a disciplinary process against them. The complainant informed us that while employed as a security officer, their employer had used their personal data, in the form of CCTV footage, to discipline and ultimately dismiss them. The complainant stated that they had not been given prior notification that CCTV footage could be used in disciplinary proceedings.

In the course of our investigation, the employer informed us that the complainant had worked as a night officer assigned to client premises, and had been required to monitor the CCTV system for the premises from a control room. The employer’s position was that, upon being assigned to the client premises in question, the complainant had been asked to read a set of “Standing Operating Procedures” which indicated that CCTV footage could be used in an investigative process concerning an employee. The employee had also been asked to sign a certificate of understanding to confirm that they had read and understood their responsibilities. The employer maintained that the CCTV system in place at the client premises was not used for supervision of staff as there was a supervisor at the premises during office hours between Monday and Friday.

The employer informed our investigators that it was the complainant’s responsibility, as the sole night security officer on duty at the client premises, to monitor the CCTV system for the premises from the control room. The requirement to have a night security officer on duty in that control room for that purpose was a term of the employer’s contract with its client. The employer was also contractually obligated under its contract with its client to carry out routine audits of employee access cards (which were swiped by the holder to gain access to various locations in the client premises). The employer told us that during such an audit, it had discovered irregularities in data derived from the complainant’s access card which could not be the result of a technical glitch as those irregularities were not replicated in the access card data of the complainant’s fellow night officers. These irregularities suggested that the complainant had been absent from their assigned post in the control room for prolonged periods of time on a number of separate occasions. On the basis of the access card data irregularities and upon noting the apparent absence of the employee from the control room during prolonged periods, the employer had commenced an investigation into the employee’s conduct. During the course of this investigation, the complainant disputed the accuracy of the access card data, and had sought that the employer provide further evidence of their alleged prolonged absences from the control room. The employer had therefore obtained CCTV stills at times when the access card data suggested the complainant was away from their post in order to verify the location of the complainant. The employer maintained that because the CCTV system was independent of the access card data system, it was the only independent way to verify the access card data. The employer also provided us with minutes of a disciplinary meeting with the complainant where they had admitted to being away from the control room for long periods. The employer also informed us that the complainant had later admitted in an email, also provided to us, that the reason for these absences was that the complainant had gone into another room so that they could lie down on a hard surface in order to get relief from back pain arising from a back injury.

We queried with the employer what the legal basis was for processing the complainant’s personal data from the CCTV footage. The employer’s position was that as a result of its contractual obligations to its client (whose premises were being monitored), if an adverse incident occurred during a period of absence of the assigned security officer (the employee) from the control room, that would potentially expose the employer to a breach of contract action by its client which could lead to significant financial and reputational consequences for the employer. On this basis the employer contended that it had a legitimate interest in processing CCTV footage of the employee for the purpose of the disciplinary process. Under Section 2A(1)(d) a data controller may process an individual’s personal data, notwithstanding that the controller does not have the consent of the data subject, where the processing is necessary for the purposes of the legitimate interests pursued by the data controller. However, in order to rely on legitimate interests as a legal basis for processing, certain criteria have to be met as follows:

  • there must be a legitimate interest justifying the processing;
  • the processing of personal data must be necessary for the realisation of the legitimate interest; and
  • the legitimate interest must prevail over the rights and interests of the data subject.

Having considered the three step test above, the Commissioner was satisfied that the employer had a legitimate interest in investigating and verifying whether there was misconduct on the part of the employee (or whether there was a fault in the access card security system). Furthermore, the Commissioner considered that the use of the CCTV footage was necessary and proportionate to the objective pursued in light of the seriousness of the allegation because it was the only independent method of verifying the accuracy of the access card data. The Commissioner noted that the CCTV footage was used in a limited manner to verify other information and that the principle of data minimisation had been respected. Finally, given the potential risk of damage to the employer’s reputation and the need to ensure the security of its client’s premises, the Commissioner was satisfied that the use of CCTV footage for the purpose of investigating potential employee misconduct, which raised potential security issues at a client premises, in these circumstances took precedence over the complainant’s rights and freedoms as a data subject. On the issue of whether the controller had provided the complainant with notice of the fact that their personal data might be processed through the use of CCTV footage, the Commissioner was satisfied that there had been adequate notice of this by way of the SOP document which had been acknowledged by the complainant signing the certificate of understanding.

The Commissioner therefore formed the view that the employer had a legal basis for processing the complainant’s personal data contained in the CCTV footage under Section 2A(1)(d) of the Data Protection Acts 1988 and 2003.

This case demonstrates that the legal basis of legitimate interests will only be available to justify the processing of personal data where, in balancing the respective legitimate interests of the controller against the rights and freedoms of the data subject, the particular circumstances of the case are clearly weighted in favour of prioritising the legitimate interests of the controller. It is essential, in order to justify reliance on this legal basis, that the processing in question is proportionate and necessary to the pursuit of the legitimate interests of the controller.

5)   Disclosure of sensitive personal data by a hospital to a third party.

We received a complaint concerning the alleged unauthorised disclosure of a patient’s sensitive personal data by a hospital to a third party. The complainant had attended the hospital for medical procedures and informed us that the medical reports for these procedures were delivered to their home address in an envelope that had no postage stamp. The envelope had a hand-written address on it which included the name of a General Practitioner (GP) and also included the home address of the complainant’s neighbour. A hand-written amendment had been made to the address, stating that it was the wrong address. The complainant informed us that they had made enquiries with their neighbour in relation to the correspondence and the neighbour had stated that they had received the correspondence a number of days previously but that it had not been delivered by a postman. The neighbour further advised the complainant that they had opened the envelope and viewed the contents in an effort to locate the correct recipient/address.

Following the initial complaint, the complainant provided us with correspondence which they subsequently received from the hospital apologising that correspondence containing the complainant’s medical results had been inadvertently sent to the wrong address. The hospital indicated that this appeared to have been due to a clerical error confusing part of the GP’s address and part of the complainant’s address. We commenced an investigation to establish how the error had happened, what procedures the hospital had in place at the time and what the hospital since had done to avoid repetition of this incident.

The hospital informed us that their normal procedure is to issue medical reports in batches to the relevant GP so that multiple sets of medical reports for different patients are placed in a windowed envelope, which shows the relevant GP’s address in the window. In this case, however, the medical report was put in a non-windowed envelope and the address was hand-written on the front. In doing so, the staff member who had addressed the envelope by hand erroneously intermixed the GP’s name, part of the GP’s address and part of the complainant’s address on the envelope. The hospital also informed us that the envelopes containing results to be dispatched to GPs are franked by the hospital post room. However, in this case because the envelope containing the complainant’s medical information was not franked, the hospital concluded that it was unlikely that it had been sent out directly from their post room and indicated that it could have been sent on via the relevant GP, although they acknowledged that they could not be certain about this. We were unable to establish during the course of the investigation the precise manner in which the envelope containing the complainant’s medical reports came to be delivered to the complainant’s neighbour’s house. The hospital informed us that administrative staff had since been briefed on the correct procedure for issuing medical reports and that non-windowed envelopes would no longer be used for this purpose.

The complainant rejected the apology from the hospital made by way of an offer of amicable resolution and instead requested a formal decision from the Commissioner. In her decision, the Commissioner found that the hospital had contravened Section 2(1)(b) (requirement to keep personal data accurate, complete and up to date), Section 2(1)(d) (requirement to take appropriate security measures) and Section 2B(1) (requirement for a legal basis for processing sensitive personal data) of the Data Protection Acts 1988 and 2003 when it processed the complainant’s sensitive personal data by way of disclosing their personal data inadvertently to a third party.

This case illustrates how a seemingly innocuous deviation by a single staff member from a standard procedure for issuing correspondence can have significant consequences for the data subject concerned. In this case, highly personal medical information was accessed by a third party in circumstances which were entirely avoidable. If the hospital had had in place appropriate quality control and oversight mechanisms to ensure that all staff members rigidly adhered to its standard procedures, it is unlikely that this unauthorised disclosure of sensitive personal data would have occurred.

6)   Publication of personal information - journalistic exemption.

We received a complaint concerning an article published in the Sunday World (in both newspaper and online news forms) which named the complainant and published their photograph. The focus of the article was official complaints made by Irish prisoners under the Prisons Act 2007 concerning their treatment in prison (known as “Category A” complaints) and it included details of the number of “Category A” complaints which had been made by the complainant. It was alleged by the complainant that the Sunday World had gained unauthorised access to their personal data from the Irish Prison Service.

The complainant provided us with a letter which they had written to the editor of the Sunday World asserting that the information contained in the article was inaccurate and violated their right to privacy and requesting that the link to the online article be removed. We were also provided with a previous decision of the Press Ombudsman which dealt with various alleged breaches of the Code of Practice of the Press Council of Ireland (the Code) by the Sunday World, including allegations of breaches arising from the article in question. The Press Ombudsman had decided that there had been a breach of Principle 5 of the Code concerning privacy and that the article could have been written without publishing the complainant’s name or photograph. The position taken by the Press Ombudsman was that as “Category A” complaints are not part of the public record, the complainant’s reasonable expectation of privacy had been breached by the publication of their name and photograph.

In the course of our investigation we queried with the Sunday World why it had not removed the online version of the article from its website in light of the Press Ombudsman’s decision and in light of the complainant’s written request to do so. We also queried how the Sunday World had obtained the complainant’s personal data. In its response, the Sunday World stated its position that the publication was in the public interest as it related to the regimes of care and management of inmates as well as staff of prisons. It also contended that the article had highlighted how the [complaint] system was being deliberately over-used and abused. The Sunday World informed us that the online version of the article had been removed upon receiving the formal request from the complainant. However, the Sunday World relied on the journalistic exemption provision under Section 22A of the Data Protection Acts 1988 & 2003 (the Acts) in relation to the obtaining of the information in relation to the “Category A” complaints and the complainant’s personal data.

The Commissioner issued a formal decision in relation to the complaint and specifically in relation to the application of the Section 22A exemption. The rationale behind the exemption in Section 22A is to reconcile the protection of privacy and freedom of expression. Following the entry into force of the Lisbon Treaty, data protection acquired the status of a fundamental right. The right to freedom of expression is also a fundamental right. Both rights are also recognised in the European Convention on Human Rights and are referred to in the EU’s Data Protection Directive 95/46/EC, which is given effect in Irish law through the Acts.

Section 22A of the Acts specifies that personal data that is processed only for journalistic purposes shall be exempt from compliance with certain provisions of that legislation (including the requirement to have a legal basis for processing the personal data) provided that three cumulative criteria are met. Under Section 22A(1)(b), one of these three criteria is that the data controller, in this instance the Sunday World, must reasonably believe that, having regard in particular to the special importance of the public interest in freedom of expression, such processing (in this case by way of publication in the newspaper) would be in the public interest. The Sunday World claimed that the purpose of the article in question was essentially to highlight what it perceived to be an abuse of process within the Irish Prison Service. In her decision, the Commissioner found that it was not reasonable for the data controller to believe that the processing of the complainant’s personal data by publishing their name and photograph would be in the public interest in achieving the stated objective of the Sunday World. It was the view of the Commissioner that the special importance of the public interest in freedom of expression could have been satisfied had the journalist in question used other means to reach the desired objective, for example by using statistics in relation to the number of prisoners making ‘Category A’ complaints; the public interest was neither enhanced nor diminished by identifying the complainant by means of their name and photograph. As one criterion out of the three cumulative criteria for the application of the journalistic exemption under Section 22A of the Acts had not been satisfied, the Commissioner found that it was not necessary to consider the remaining two criteria.

As the data controller was unable to rely on Section 22A of the Acts as an exemption from the requirement to have a legal basis for processing by publishing the complainant’s personal data, the Commissioner in her decision then went on to consider whether there was in fact such a basis for the processing. While the Commissioner considered that the Sunday World had a legitimate interest in obtaining and processing statistical information in relation to ‘Category A’ complaints for the purpose of research for the article in question, she considered that the Sunday World had contravened Section 2(1)(c)(iii) by further processing the complainant’s personal data, through publishing it. This contravention arose as the processing of the data by publication was excessive and unnecessary for the purpose of the point being made by the Sunday World in the article, i.e. that the system was being abused.

This case illustrates that the journalistic exemption under Section 22A of the Acts is not a blanket exemption that can be routinely relied on by publishers or journalists seeking to justify publishing unnecessary personal data. The mere existence of a published article is not sufficient to come within the scope of this exemption and instead a data controller must be able to demonstrate that they satisfy all three cumulative criteria in this section, as follows:

(i) the processing is undertaken solely with a view to the publication of journalistic, literary or artistic material;

(ii) the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, such publication would be in the public interest; and

(iii) the data controller reasonably believes that, in all the circumstances, compliance with the relevant requirement of the Acts would be incompatible with journalistic, artistic or literary purposes.

7)   Compliance with a Subject Access Request & Disclosure of personal data / capture of images using CCTV

We received a complaint from an individual employed as a service engineer by a company, which was contracted to provide certain services to a company which was the operator of a toll plaza (the Toll Company). The complainant alleged, amongst other things, that the Toll Company had disclosed the complainant’s personal data (consisting of an audio recording and CCTV footage of a conversation between the complainant and an individual operating a tollbooth at the toll plaza) to the complainant’s employer without the complainant’s knowledge or consent.

During our investigation we established that an incident had occurred involving the complainant, resulting in a request being made by the Toll Company to the complainant’s employers that the complainant was not to attend the toll plaza premises again in his capacity as a service engineer. We established that the incident in question involved a dispute at a tollbooth between the complainant and an individual operating the tollbooth, over the price of the toll which the complainant was charged. The Toll Company alleged that during the incident in question (which had been captured on CCTV and by audio recording) the complainant had threatened to “bring down” the toll plaza system. The complainant’s employer had confirmed that it would comply with the Toll Company’s request that the complainant not attend the toll plaza premises again and the Toll Company confirmed to us that at that point it had considered the matter to be concluded. However, approximately two months after the incident had occurred, the complainant’s employers had requested the CCTV footage and audio recording of the alleged incident, which the Toll Company then provided to the employer. It was contended by the Toll Company that it was in its legitimate interests to process the complainant’s personal data as a threat to it had been made by the complainant and that one of its employees had reported the threat to the Gardaí, who had been called to the toll plaza by the complainant at the time of the incident. The Toll Company also claimed that Sections 8(b) and 8(d) of the Data Protection Acts 1988 and 2003 (the Acts) allowed for this processing of the complainant’s personal data as the processing was necessary to prevent damage to the Toll Company’s property. The Toll Company stated that the personal data of the complainant (the CCTV footage and audio recording) had been sent to the complainant’s employer two months after the incident as it had not been requested by the employer prior to that.

As part of our investigation, we noted that signs at the tollbooth notified the public that there was CCTV in operation. We also examined the Toll Company’s data protection policy which was available on its website and which stated that all vehicles using the toll plaza in question are photographed/video recorded and that images are retained for enforcement purposes and to address and resolve any disputes that may arise in relation to a vehicle or account.

In her decision, the Commissioner considered the Toll Company’s purported reliance on the pursuit of its legitimate interests as the legal basis under Section 2A(1)(d) of the Acts for the processing. Taking account of the two-month period which had elapsed between the incident in question and the request for the CCTV footage and audio recording being made by the employer, and also having regard to the confirmation of the Toll Company that (prior to receiving the employer’s request for the CCTV footage and audio recording) it had considered the incident to be concluded, the Commissioner decided that this legal basis could not be relied upon for the processing of the personal data. Consequently, a contravention of Section 2A(1) occurred as there had been no other legal basis (e.g. the consent of the complainant) for the processing of his personal data by disclosing it to his employer. The Commissioner also found that adequate notice of the processing of the personal data had not been given to the complainant, as it was not apparent from the data protection privacy policy, or indeed the public signs at the tollbooth, what the extent of the processing was, that audio recording was in operation, or who the data controller was. Consequently the Toll Company had contravened Section 2D(1) arising from this lack of transparency. Finally, the Commissioner also found that Section 2(1)(c)(ii) of the Acts had been contravened because further processing of the complainant’s personal data had occurred for a purpose (sharing it with the complainant’s employer) which was incompatible with the original purpose for its collection (enforcement purposes and the resolution of disputes relating to a vehicle or account).

This case is indicative of a common trend amongst data controllers to seek to rely on legitimate interests as the legal basis for processing personal data, as something of a catch-all to cover a situation where personal data has been processed reactively and without proper consideration having been given in advance as to whether it is legitimate to carry out the processing. However, as this case illustrates, a data controller must be able to provide evidence to support their assertion as to the legitimate interest relied on. Here, the passage of time since the incident and the fact that the data controller, on its own admission, considered the matter to have been concluded contradicted the purported reliance on this legal basis. This case also underscores the principle of the foreseeability of processing of personal data as an important element of the overarching principle of fair processing in data protection. At its core this means that a data subject should not be taken by surprise at the nature, extent or manner of the processing of their personal data.

8)   Failure to respond fully to an access request.

We received a complaint that an educational organisation had not fully complied with an access request submitted to it by the complainant, who was an employee of that organisation. The complainant informed us that in the access request they had specifically sought CCTV footage from the educational organisation’s premises for a four-hour period during which the complainant had allegedly been assaulted by another employee. The complainant informed us that although there were 8 cameras on the premises, in response to their access request they only received an 11-second clip from the CCTV footage for the premises which ended just as the alleged assault came into view. The complainant told us that they had queried the limited amount of CCTV footage and reminded the educational organisation that the access request had been in respect of all footage within that four-hour period. However, the educational organisation’s response had been that this query would be treated as a new access request. The complainant considered that the CCTV footage had been intentionally withheld and that this approach had been adopted as a delaying tactic so that the CCTV footage would ultimately not have to be released on the grounds that it had been lost or was no longer retained.

In the course of our investigation, we established that the complainant had made a subject access request to the educational organisation which had accepted it as a valid request. The educational organisation’s position was that it understood the complainant’s request to relate to footage of the incident in question only. However, the educational organisation acknowledged that the complainant would have been captured by other CCTV cameras for which the CCTV footage had not been provided. On this basis, we established that, as of the date of the complainant’s access request, additional personal data existed in the form of further CCTV footage which had not been provided to the data subject. The educational organisation informed us that as the CCTV was only retained for 28 days, by the time that the complainant had come back to query the limited amount of CCTV footage received in response to the access request, the additional CCTV footage had been subsequently overwritten without being retained for release to the complainant.

In her decision the Commissioner noted that it was clear that in the complainant’s access request the complainant was specifically seeking access to CCTV footage over a four-hour period and that having received the initial request, the educational organisation should have preserved the footage for that date and sought to clarify with the complainant what CCTV footage exactly they were seeking rather than unilaterally determining that issue itself. The educational organisation therefore contravened Section 4 of the Data Protection Acts 1988 and 2003 in failing to provide the complainant with all of their personal data within the statutory 40-day period.

This case clearly illustrates the position of the DPC which is that upon receipt of an access request relating to CCTV footage from a specific day, a data controller is obliged to preserve any such footage from that day pending resolution of the access request. This obligation applies irrespective of whether any such footage may be ordinarily subject to deletion (whether automated or not) after certain timeframes under the provisions of the data controller’s retention policy. Where a data controller considers that further clarification should be sought from the data subject as to the scope of the personal data requested, that requirement for clarification should not be interpreted as if the access request had not yet been made, as to do so could undermine the data subject’s right to access their personal data or enable a data controller to circumvent its obligations in respect of the access request.

9)   Personal data of a third party withheld from an access request made by the parent of a minor

We received a complaint from an individual who had submitted an access request to a sports club for the personal data of their minor child, for whom the parent was the joint legal guardian. Following intervention from this office, the complainant had received personal data relating to their child from the sports club, which was contained in an application for membership of the sports club which had been submitted on behalf of the child. However, certain information had been redacted from that application form, namely the names of the persons who had been submitted to the sports club as emergency contacts for the child, the signature of the person who consented to images of the child being used on digital media by the sports club, and the address of the minor. The complainant asserted that the third-party details and the address were all the personal data of their child and that the complainant, as joint legal guardian, was therefore entitled to access it. The sports club’s position was that there is no express provision within Section 4 of the Data Protection Acts 1988 and 2003 (the Acts), which relates to the right of access, allowing a person access to another party’s personal data without their consent. The sports club had also checked with the third parties whose personal data was the subject of the redactions on the application form as to whether they consented to the release of the data to the complainant, but they had refused to give their consent.

Section 4(4) of the Acts, which precludes the release of third-party data without that party’s consent, was brought to the attention of the complainant. However, the complainant put forward the argument that, because the information requested pertained to matters concerning the minor’s welfare and because the third party was the legal representative of that minor, this rendered the data the child’s personal data. We outlined the definition of personal data to the complainant and highlighted case law which has established that an individual’s name represents the personal data of that individual. The complainant was also advised that the address of their child could not be provided without also providing the personal data of a third party and that the complainant therefore had no right of access to it.

The complainant sought a decision on their complaint from the Commissioner. In her decision, the Commissioner pointed out that, taking account of Section 8(h) of the Acts (which lifts restrictions on the processing of personal data where the processing is made with the consent of the data subject or a person acting on their behalf), her office’s position is that a parent or legal guardian of a young child has an entitlement to exercise the right of access on that child’s behalf. However, in this case, as the child in question could not be identified from the names of the third parties who were listed as emergency contacts with the sports club, the information to which the complainant sought access was not the personal data of the complainant’s child. The Commissioner in her decision pointed out that, if the complainant’s logic were to be followed and the details of an emergency contact were deemed to be the personal data of the person who nominated them, an adult who has listed another adult as an emergency contact would have a right of access over that third party’s name, telephone number, address, etc. The Commissioner found that no contravention of the Acts had occurred in relation to the redactions made to the documents which had been released by the sports club on foot of the access request.

This case illustrates that irrespective of the relationship, dependency or connection between two parties, the name of a third party cannot be deemed to be the personal data of a data subject. As highlighted in the Commissioner’s decision, to do so would deprive that third party of control over their own personal data and allow another individual to exercise data subject rights, including the right of access, over the personal data of the third party. Such an outcome would run contrary to the core principle of data protection which is that each data subject has the right to determine the use of their own personal data. However, it is important to distinguish this principle from the limited circumstances in which the rights of a data subject may be lawfully exercised by another person who is permitted to do so on their behalf. Even where data subject rights may be exercised by a third party (such as the parent of a young minor child) this does not render the personal data of the data subject to be the personal data of the third party who is authorised to exercise the data subject’s rights on their behalf.

10)   Disclosure of Personal Data via a Social Media App.

We received complaints from two individuals who each claimed that their personal data had been unlawfully disclosed when it was broadcast on “Snapchat”, an instant messaging and multimedia mobile application.

The complainants, who were friends, informed us that they had each submitted their CV with a cover letter to a particular retailer, in person, by way of application for employment with that retailer. The applications had been made by the complainants on the same day and had been received by the same employee of the retailer. Later on the same day the complainants had learned from a third party that a photograph showing both cover letters was appearing on “Snapchat” with a message drawing attention to similarities in the cover letters. It was the complainants’ common understanding that the employee of the retailer to whom they had submitted their CVs had taken this photograph and posted it to “Snapchat”.

During the course of our investigation of these complaints, we established that the employee of the retailer to whom the complainants had handed their CVs and cover letters had been recently notified by the retailer of the termination of their employment. Contrary to the retailer’s policy and the terms of their contract of employment, the employee had a mobile phone on their person during work hours and had used it to take a photograph of both the cover letters and to post it to “Snapchat”. The retailer informed our investigators that the employee was aware that this action was contrary to their contract of employment and the actions of the employee were done in circumstances where the employee was about to leave their employment. The retailer insisted that, in this instance, there was nothing further it could have done to prevent this incident from occurring.

In her decision the Commissioner found that the retailer, as the data controller for the complainants’ personal data, had contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 as the processing of the complainants’ personal data, by way of the taking and posting of the photograph by the retailer’s employee, was incompatible with the purposes for which it had been provided to the retailer by the complainants.

The case should serve as a cautionary reminder to data controllers that, as a general principle under data protection law, they are responsible for the actions of their employees in connection with the processing of personal data for which they are the data controller. The motive of an employee or the deliberate or accidental nature of the actions which they have undertaken in relation to personal data does not absolve data controllers of such responsibility. Data controllers have an obligation to ensure that their employees comply with data protection law in relation to the personal data which they hold irrespective of whether it is the employee’s first or last day of employment with the data controller. Indeed, this obligation will continue even after an employee leaves a data controller’s employment if that employee can still access the personal data controlled by their former employer.

11)   Failure by the Department of Justice and Equality to impose the correct restrictions on access to medical data of an employee

 

We received a complaint from an individual concerning an alleged disclosure of their sensitive personal data by the Department of Justice & Equality (the Department). It was claimed by the complainant, who was an employee of the Department, that a report containing information on the complainant’s health had been uploaded to a general departmental open document management database in 2012 and that the report had remained on that database for up to three years where it could be accessed by approximately 80 employees. The complainant informed us that they had been notified of the accessibility of the report on the database by a colleague. The complainant told us that they had requested an explanation from the Department as to why the report had been placed on an open database but had not received official confirmation that the report had since been removed.

We commenced an investigation into the complaint. The Department confirmed that notes relating to a discussion which had taken place between the complainant and their line manager in 2012 (which included a note concerning the complainant’s health) had been stored to the database in question and marked private. However, the line manager had inadvertently omitted to restrict access to the document with the result that it could be accessed by approximately 80 staff members from the Department. The Department informed us that the document had been removed from the database in question some 3 years after having been saved to it. As the line manager in question had since left the Department, it had been unable to establish exactly why the document had been saved there in the first place but claimed that it was due to human error. The Department was also unable to establish how many staff had actually accessed the document during the 3 year period in which it was accessible as the Department’s IT section had been unable to restore the historic data in question.

The Department made an offer, by way of amicable resolution, to write to the complainant confirming that the document in question had been removed from the database and apologising for any distress caused. The complainant chose not to accept this offer and instead sought a formal decision of the Commissioner. In her decision, the Commissioner concluded that the Department had contravened Section 2A(1) and 2B(1) of the Data Protection Acts 1988 & 2003 by processing the complainant’s sensitive personal data without the required consent or another valid legal basis for doing so and by disclosing the complainant’s sensitive personal data to at least one third party. These contraventions had occurred by way of the placing of a confidential document containing details of the complainant’s health on an open database where it appeared to have remained accessible for 3 years and had been accessed by at least one co-worker.

This case is a stark illustration of the consequences for a data subject, and the general distress which can be caused, where a data controller fails to ensure that its staff have adhered to, and continue to adhere to, proper document management protocols for documents containing personal data and, moreover, sensitive personal data. While the controller in question was unable to identify how many times, and by how many different staff members, the document in question had been accessed during the 3-year period when it was accessible to approximately 80 staff members, the potential for further and continuing interference with the data subject’s fundamental rights and freedoms remained throughout this period. Had the controller in this case had adequate regular audit and review measures in place for evaluating the appropriateness of documents stored on open-access databases, the presence of this confidential document would have been detected much sooner than it was. Further, had the Department had an adequate system in place for training staff managers and ensuring their awareness of basic data protection rules, this issue might not have arisen in the first instance.

12)   Virgin Media Ireland Limited.

We received a complaint in May 2016 from an individual who had received unsolicited marketing telephone calls from Virgin Media Ireland Limited in March and in May 2016 after she had previously asked the company not to call her again. The complainant is a customer of Virgin Media Ireland Limited and she informed us that the calls promoted Virgin Media products. She advised us that when the company first called her in January 2016 she had asked that her details be placed on the “Do Not Call” list as she did not wish to receive any further marketing calls. She stated that when the company called her again in March 2016 she repeated that she wanted her details to be placed on the “Do Not Call” list but despite her two requests she had received a further unsolicited marketing telephone call to her mobile phone on 27 May 2016.

During our investigation of this complaint, Virgin Media Ireland Limited informed us that due to human error the complainant’s account was not updated correctly to record the “Do Not Call” requests. The company advised us that a review had been conducted on all “Do Not Call” requests handled by the team in question for the period from January 2016 to July 2016 to ensure that all opt-out requests had been completed correctly. It confirmed that the complainant’s details had been removed from the marketing database and it apologised for any inconvenience caused to her.

Prior to September 2015 Virgin Media Ireland Limited traded under the name UPC Communications Ireland Limited. That company had previously been prosecuted, convicted and fined in March 2011 and in April 2010 for twenty similar marketing offences involving telephone calls to subscribers who had not consented to the receipt of such marketing calls. The Data Protection Commissioner therefore decided to prosecute Virgin Media Ireland Limited in respect of the offences identified following the investigation of the latest complaint.

At Dublin Metropolitan District Court on 3 July 2017, Virgin Media Ireland Limited pleaded guilty to two charges of making unsolicited marketing telephone calls to its customer after she notified the company that she did not wish to receive such calls. The Court convicted the company on both charges and it imposed fines of €1,500 and €1,000 respectively on the charges. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

13)   Sheldon Investments Limited (trading as River Medical)

In September 2015 we received a complaint against Sheldon Investments Limited, trading as River Medical, from an individual who had received unsolicited marketing emails to which he had not consented and which were subsequent to his attempts to opt out of such emails. In making his complaint, the individual explained that he had previously had a consultation with River Medical during which he was obliged to complete a form. He stated that when completing the form he expressly stated that he did not wish to receive any marketing emails from them. He subsequently received a marketing email from River Medical in April 2015 and he replied to the email with a request that his address be removed from their marketing list immediately. He received confirmation two days later that his contact details were removed. Despite this, he received a further unsolicited marketing email from River Medical in September 2015 which prompted him to submit a complaint to the Data Protection Commissioner.

During our investigation of this complaint, River Medical told us that the failure to respect the complainant’s opt-out request was due to human error. It explained that it had made his file ‘inactive’ on receipt of his opt-out request, but it did not realise that it needed to manually delete his file in order to prohibit the sending of further marketing material to him. It assured us that on foot of our investigation of the complaint, the individual’s email address had been deleted from its systems. We concluded the investigation of that complaint in December 2015 with a warning to the company that it would likely be prosecuted if it committed any further offences under the marketing regulations.

One year later, in December 2016, the individual submitted a new complaint after he received a further unsolicited marketing email from River Medical. We investigated this complaint and we were informed once again that the latest infringement had been caused by human error in the selection of an incorrect mailing list on Newsweaver, the system used by the company to issue emails. The company apologised for the incident.

As we had previously issued a warning to the company, the Data Protection Commissioner decided to prosecute it in respect of the two unsolicited marketing emails issued in December 2016 and in September 2015. At Dublin Metropolitan District Court on 3 July 2017, Sheldon Investments Ireland Limited pleaded guilty to two charges of sending unsolicited marketing emails without consent. The Court sought the payment of €800 in the form of a charitable donation to Focus Ireland and it adjourned the matter. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

14)   Tumsteed Unlimited Company (trading as EZ Living Furniture)

In June 2016 we received a complaint from an individual who received unsolicited marketing text messages from EZ Living Furniture despite having, on three previous occasions, requested them to stop. The complainant informed us that she had made a purchase from the company in the past.

As part of our investigation of this complaint, we asked EZ Living Furniture to show us evidence of the consent of the complainant to receive marketing text messages in the first instance. We also sought an explanation as to why her requests to opt out had not been actioned.

In response to our investigation, EZ Living Furniture stated that, in respect of marketing consent, customers sign up to the company’s terms and conditions printed on the back of receipts. It drew our attention to one of the terms and conditions to the effect that customer information will be retained by the EZ Living marketing department and will be added to its database to be used for mailing lists and text messages. In relation to the complainant’s opt-out requests not being complied with, EZ Living Furniture explained that there had been a changeover of service providers and the new service provider had a different method for opting out. It claimed that it was totally unaware that the opt-out facility was not working until it received our investigation letter. It assured us that the opt-out issue had now been resolved and it said that it had sent an apology to the complainant. In our response to EZ Living Furniture, we advised it, in relation to customer consent, that while it was relying on terms and conditions of sale, it was in fact obliged by law to provide its customers with an opportunity to opt out of receiving marketing communications at the point of collection of their personal data. We pointed out that, in practice, this means that customers must be provided with an opt-out box for them to tick in order to opt out of marketing, if that is their wish. In a subsequent reply, the company informed us that it had examined the matter further and that it had decided to introduce a stamp that would be placed on the sales docket to provide a checkbox to allow customers to opt out of receiving marketing emails and text messages.

The Data Protection Commissioner had previously issued a warning to EZ Living Furniture in April 2015 following the investigation of a complaint from a different individual in relation to sending her unsolicited marketing text messages without consent. Consequently, the Data Protection Commissioner decided to prosecute the company in respect of the offences which came to light arising from the latest complaint.

At Galway District Court on 4 July 2017, Tumsteed Unlimited Company, trading as EZ Living Furniture, pleaded guilty to two charges of sending unsolicited marketing text messages without consent. The Court convicted the company and it imposed fines of €500 on each of the two charges. The company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

15)   Cunniffe Electric Limited.

In December 2016 an individual complained to us that he had recently received unsolicited marketing text messages from Cunniffe Electric Limited of Galway Shopping Centre despite the fact that he had been advised previously on foot of an earlier complaint to us that his mobile phone number had been removed from its marketing database. In early 2015 we had received the complainant’s first complaint in which he informed us that he had given his mobile phone number some years ago to Cunniffe Electric Limited to facilitate the delivery of an electrical appliance which he had purchased from the company. He stated that he did not give the company consent to use his mobile phone number for marketing purposes.

Following our investigation of the first complaint, we received confirmation from Cunniffe Electric Limited that the complainant’s mobile phone number had been removed from its marketing database. We concluded that complaint by issuing a warning to the company that it would likely face prosecution if it breached the marketing regulations again.

On receipt of the complainant’s second complaint, we commenced a new investigation in which we sought from Cunniffe Electric Limited an explanation for the sending of the latest marketing text messages in circumstances where we were previously informed that the complainant’s mobile phone number had been removed from its marketing database. In response, the company admitted that it did not have the consent of the complainant to send him marketing text messages. It said that his mobile number was not on its database but it appeared that there was an error on the part of the service provider that it was using to send marketing text messages and that this error arose from transition issues when the service provider was acquired by another company. It apologised for the inconvenience caused to the complainant.

As the company had previously received a warning, the Data Protection Commissioner decided to prosecute it in relation to the most recent offences. At Galway District Court on 4 July 2017, Cunniffe Electric Limited entered a guilty plea for the sending of an unsolicited marketing text message without consent. In lieu of a conviction and fine, the Court asked the company to make a contribution of €500 to the Court Poor Box and it then struck out the charges. The company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

16)   Argos Distributors (Ireland) Limited

Five individuals lodged complaints with us between December 2016 and February 2017 arising from difficulties they were experiencing in opting out of email marketing communications from Argos Distributors (Ireland) Limited. The complainants had supplied their email addresses in the context of making online purchases and they had not opted out of marketing communications at that point. However, when they subsequently attempted to opt out on receipt of marketing emails, the ‘unsubscribe’ system failed. Some complainants subsequently followed up by email to the company seeking to have their email addresses removed from the marketing database and they received responses by email to inform them that their requests had been actioned. However, they continued to receive further email marketing from Argos Distributors (Ireland) Limited.

In response to our investigation, the company acknowledged that its ‘unsubscribe’ system was not working properly for a period of time. It also discovered an issue in processing ‘unsubscribe’ requests for customers based in Ireland. It found that requests from Irish customers were being added to the ‘unsubscribe’ list for UK marketing. In all cases, it confirmed that the opt-out requests of the individuals concerned were now properly processed.

As the company had been warned previously in 2013 following the investigation of a similar complaint of a breach of the marketing regulations, the Data Protection Commissioner decided to prosecute it in relation to these offences. At Navan District Court on 14 July 2017, Argos Distributors (Ireland) Limited pleaded guilty to five charges of sending unsolicited marketing emails to five individuals without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €5,000 to a charity of the Court’s choosing. The defendant agreed to pay the prosecution costs incurred by the Data Protection Commissioner.

17)   Expert Ireland Retail Limited

In October 2016 an individual complained to us about regular marketing text messages which she received from Expert Ireland Retail Limited. She informed us that in August 2014 she purchased a tumble dryer at the Expert Naas store and she stated that she gave her mobile phone number at the point of sale for the sole purpose of arranging the delivery of the appliance. She stated that she was not asked if she wished to receive marketing text messages and she did not request or agree to same. She informed us that she began receiving regular marketing text messages from December 2015 onwards and despite replying by text message on numerous occasions with the opt-out keyword, further text messages continued to arrive on her phone. She advised us that early in October 2016 her husband called to the Expert store in Naas and he asked the staff there to remove her number from their marketing database. Despite this request the complainant received a further marketing text message about two weeks later, prompting her to lodge a complaint with the Data Protection Commissioner.

In response to our investigation, the company claimed that the complainant would have been asked during the course of the sale if they would like to be contacted by text message for marketing purposes. However, it was unable to provide any evidence that the complainant was given an opportunity to opt out of marketing at the point of sale. Furthermore, it admitted that the sending of the first marketing message after a period of over twelve months had expired was an oversight. The company was unable to explain why no action was taken to remove the complainant’s mobile phone number from the marketing database after her husband called to the Naas store.

As the company had previously been issued with a warning in May 2010 on foot of a similar complaint which we received about unsolicited marketing text messages sent to a different former customer of the Expert store in Naas without her consent, the Data Protection Commissioner decided to prosecute this latest complaint. At Mullingar District Court on 13 October 2017, Expert Ireland Retail Limited pleaded guilty to one charge of sending an unsolicited marketing text message to the complainant without her consent. The Court convicted the company and it imposed a fine of €500. The defendant company agreed to pay the legal costs incurred by the Data Protection Commissioner in respect of this prosecution.

  1. Prosecution of James Cowley Private Investigator
  2. Disclosure of Personal Data to a Third Party in Response to a Subject Access Request
  3. Data Breach at Retail and Online Service Provider
  4. Prosecution of Yourtel for Marketing Offences
  5. Prosecution of Glen Collection Investments Limited and One of its Directors
  6. Prosecution of Shop Direct Ireland Limited T/A Littlewoods Ireland for Marketing Offences
  7. Further Processing of an Individual's Personal Data in an Incompatible Manner
  8. Disclosure of Personal Information to a Third Party by a Data Processor
  9. The Necessity to Give Clear Notice When Collecting Biometric Data at a Point of Entry
  10. Residential Care Home's Legitimate Use of Audio Recording and Photograph of Data Subject Concerning Allegations of Misconduct
  11. Disclosure of Personal Information to a Third Party
  12. Failure of a Data Controller to Keep Individual's Personal Information Accurate and Up to Date Which Resulted in the Disclosure of Personal Data to a Third Party
  13. Failure by BOI to Properly Verify the Identity of Individual on the Phone Which Resulted in the Disclosure of Personal Information to a Third Party
  14.  Data Controller Obliged to Demonstrate Effort Made to Locate Data Within the Statutory 40 Day Period
  15. Personal Data Withheld from an Access Request by Airbnb on the Basis of an Opinion Given in Confidence
  16.  Crypto Ransomware Attack on a Primary School
  17. Data Breach at an Online Retailer
  18.  Incorrect Association of an Individual's Personal Details with Another File
  19. Prosecution of The Irish Times Limited for Marketing Offences
  20. Prosecution of Coopers Marquees Limited for Marketing Offences
  21. Prosecution of Robert Lynch T/A The Energy Centre for Marketing Offences
  22. Prosecution of Paddy Power Betfair Public Limited Company for Marketing Offences
  23.  Prosecution of Trailfinders Ireland Limited for Marketing Offences
  24. Prosecution of Topaz (Local Fuels) Limited for Marketing Offences
  25. Prosecution of Dermaface Limited for Marketing Offences 

1)   Prosecution of James Cowley Private Investigator

James Cowley was charged with sixty-one counts of breaches of the Data Protection Acts, 1988 & 2003. All charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person. The personal data was kept by the Department of Social Protection. The personal data was disclosed to entities in the insurance sector – the State Claims Agency, Zurich Plc and Allianz Plc.

On 13 June 2016, at Dublin Metropolitan District Court, James Cowley pleaded guilty to thirteen sample charges. He was convicted on the first four charges and the Court imposed a fine of €1,000 in respect of each of these four charges. The remaining nine charges were taken into consideration in the sentence imposed.

The investigation in this case uncovered access by the defendant to social welfare records held on databases in the Department of Social Protection. To access these records, the defendant used a staff contact who was known to him. Mr. Cowley then used the information he obtained for the purposes of compiling private investigator reports for his clients. These activities continued for a number of years up to September 2015 when our investigation team first made contact with him about its concerns in relation to his processing of personal data.

 

2) Disclosure of Personal Data to a Third Party in Response to a Subject Access Request

An ex-employee of Stobart Air made a complaint to us in August 2015 regarding the unlawful disclosure of their redundancy details to another member of staff following an access request made by that person to the company. The complainant also informed us that they had themselves received third-party personal information in response to a subject access request which they had made to the company in May 2015.

Stobart Air, on commencement of our investigation, confirmed to us that a breach of the complainant’s data had occurred in November 2014. It stated that it had not initially notified the complainant of the breach when it first learned of it, as it was unaware of the data protection guidelines that advise reporting disclosures to the data subjects involved where the disclosure poses a high risk to the individual’s rights, and requesting the third party in receipt of the information to destroy or return the data involved.

The complainant in this case declined an offer of amicable resolution and requested a formal decision of the Commissioner. In her decision, the Commissioner found that Stobart Air, in including the complainant’s personal data in a letter to ex-employees, had carried out unauthorised processing and disclosure of the complainant’s personal data. This contravened Section 2A(1) of the Data Protection Acts 1988 and 2003, as the complainant’s personal information had been processed without the complainant’s consent or another legal basis under the Acts for doing so.

Stobart Air itself identified that it had inadequate training and safeguards around data protection in place, which it has since sought to rectify.

In a separate complaint received by the DPC in September 2015, we were notified that Stobart Air had disclosed financial data of a third party to the complainant in response to a subject access request. We proceeded to remind Stobart Air of its obligations as a data controller and Stobart Air identified a number of individuals who had been affected by these issues. Stobart Air subsequently notified all affected third parties of the breach of their personal data. However, in trying to comply by notifying the affected individuals, Stobart Air disclosed the complainant’s data, by divulging the fact that the complainant was the recipient of this data, in a letter notifying the individuals whose data was originally disclosed.

Stobart Air had no legal basis to disclose the complainant’s personal data to the third parties involved, nor did it have the consent of the individual affected. The disclosure of the complainant’s identity to the individuals affected by the original breach was unnecessary in the circumstances and in contravention of Section 2A(1) of the Data Protection Acts 1988 and 2003.

 

3) Data Breach at Retail and Online Service Provider

In July 2016, we received a breach report from an organisation providing retail and online services.

The organisation was the victim of a “brute force” attack whereby, over a two-week period, the attackers tried various username/password combinations, some of which were successfully used to gain access to user accounts. When these accounts were accessed, the attackers attempted to withdraw user balances. These withdrawals were enabled by the attacker’s ability to add new payment methods. It was also possible for the attacker to access the personal data associated with the account.

On assessing the breach, we identified that the organisation had deficiencies in the measures it had taken to secure users’ personal data including:

  • Insufficient measures on password policy and user authentication;
  • Insufficient control measures to validate changes to a user’s account; and
  • Insufficient control measures on the retention of dormant user accounts.

We considered that the organisation contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, its users’ personal data.

 

Recommendations were issued to the organisation that it take steps to mitigate the deficiencies identified or face enforcement action. The organisation subsequently informed us that it had taken the following steps based on our recommendations:

  • Implementation of user authentication requiring more than one factor
  • Implementation of a comprehensive data retention policy

This case highlights the need for organisations to ensure that they have appropriate technical, organisational and security measures in place to prevent loss of data through “brute force” or password-reuse attacks. In this scenario, the use of appropriate access and authentication controls, such as multi-factor authentication, network rate limiting and logon alerts, could have mitigated the risks. Further, poor retention policies provide an “attack vector” for hackers, such as the one used as a means of entry in this breach.
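By way of illustration only, the following minimal sketch (in Python, with hypothetical names and thresholds that are not drawn from the organisation involved) shows one of the controls referred to above: a per-account count of failed login attempts with a temporary lockout, which slows or blocks a brute-force attack of the kind described in this case study.

    # Illustrative sketch only: per-account failed-login counting with a
    # temporary lockout. All names and thresholds are hypothetical; a real
    # deployment would also use multi-factor authentication, network rate
    # limiting and logon alerting, as noted in the case study.
    import time

    MAX_FAILURES = 5            # hypothetical threshold
    LOCKOUT_SECONDS = 15 * 60   # hypothetical lockout window

    failed_attempts = {}        # username -> (failure_count, first_failure_time)

    def allow_login_attempt(username: str) -> bool:
        """Return False while an account is locked out after repeated failures."""
        count, first_seen = failed_attempts.get(username, (0, 0.0))
        if count >= MAX_FAILURES and time.time() - first_seen < LOCKOUT_SECONDS:
            return False  # locked out: reject before the password is even checked
        return True

    def record_failed_login(username: str) -> None:
        """Record a failed password attempt for rate-limiting purposes."""
        count, first_seen = failed_attempts.get(username, (0, time.time()))
        failed_attempts[username] = (count + 1, first_seen)

    def record_successful_login(username: str) -> None:
        """Clear the counter once the user authenticates successfully."""
        failed_attempts.pop(username, None)

Such a control addresses only one of the deficiencies identified; it is not a substitute for multi-factor authentication or for the timely removal of dormant accounts that are no longer needed.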

 

4) Prosecution of Yourtel for Marketing Offences

We received a complaint in December 2014 from an individual who received marketing telephone calls from Yourtel Limited, a telephone service provider which entered the Irish market in 2013, after he had instructed the company during a previous call not to call him again. The complainant informed us that the calls related to an offer to switch telephone service providers.

In February 2015 a separate complaint was received on behalf of another individual who received marketing telephone calls from Yourtel Limited after the company had been instructed during a similar marketing call on Christmas Eve 2014 not to call his number again. The marketing calls to this individual also concerned switching telephone service provider.

During our investigation of these complaints Yourtel Limited acknowledged the making of the marketing telephone calls. It claimed that it blocked the telephone numbers from receiving further marketing calls on the occasion of the last call in each case when it was informed by the individuals concerned that they did not wish to be contacted again for marketing purposes. It did not accept in either case that it continued to call the individuals after they had instructed Yourtel Limited not to call them again.

The Data Protection Commissioner decided to prosecute the offences as Yourtel Limited had come to our attention previously in 2014 on foot of a complaint about the making of a marketing telephone call to a telephone number which stood recorded on the National Directory Database (NDD) Opt Out Register. Following the investigation of that complaint, we warned the company that it would likely face prosecution if it committed further offences under Regulation 13 of SI 336 of 2011 (known as the ePrivacy Regulations) at any future time.

At Dublin Metropolitan District Court on 21 January 2016 Yourtel Limited pleaded guilty to two charges of making unsolicited marketing telephone calls after the two individuals it called had notified the company that they did not consent to the receipt of such calls. The Court convicted the company on both charges and it imposed two fines of €2,500 each. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

5) Prosecution of Glen Collection Investments Limited and One of its Directors

The investigation in this case established that the defendant company obtained access to records held on computer databases in the Department of Social Protection over a lengthy period of time and that a company director used a relative who was employed in the Department of Social Protection to access the records. The defendant company had been hired by a Dublin-based firm of solicitors to trace the current addresses of bank customers that the respective banks were interested in pursuing in relation to outstanding debts. Having obtained current address information or confirmed existing addresses of the bank customers concerned from the records held by the Department of Social Protection, the defendant company submitted trace reports containing this information to the firm of solicitors which acted for the banks. The case came to light on foot of a complaint which we received in February 2015 from a customer of AIB bank who alleged that an address associated with him and which was known only to the Department of Social Protection was disclosed by that department to an agent working on behalf of AIB bank.

The Data Protection Commissioner decided to prosecute both the company and the director in question, Mr Michael Ryan. Glen Collection Investments Limited was charged with seventy-six counts of breaches of the Data Protection Acts, 1988 & 2003. Sixty-one charges related to breaches of Section 19(4) of the Data Protection Acts for processing personal data as a data processor while there was no entry recorded for the company in the public register which is maintained by the Data Protection Commissioner under Section 16(2) of the Data Protection Acts. Fifteen charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person.

Mr. Michael Ryan, a director of Glen Collection Investments Limited, was separately charged with seventy-six counts of breaches of Section 29 of the Data Protection Acts, 1988 & 2003 for his part in the offences committed by the company. This Section provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of the company directors or other officers.

The cases against Glen Collection Investments Limited and its director were called in Tuam District Court in January, May and July of 2016 before the defendants eventually entered guilty pleas on 10 October 2016. While the defendant company was legally represented in court on all occasions, the Court issued a bench warrant for the arrest of the company director, Mr Ryan, on 10 May 2016 after he had twice failed to appear. The bench warrant was executed at Tuam District Court on 10 October, 2016 prior to the commencement of that day’s proceedings.

At Tuam District Court on 10 October 2016 Glen Collection Investments Limited pleaded guilty to twenty-five sample charges – thirteen in relation to offences under Section 22 and twelve in relation to offences under Section 19(4). The company was convicted on the first five counts with the remainder taken into consideration. The court imposed five fines of €500 each. Mr. Ryan pleaded guilty to ten sample charges under Section 29. He was convicted on all ten charges and the court imposed ten fines of €500 each. In summary, the total amount of fines imposed in relation to this prosecution was €7,500.

 

6) Prosecution of Shop Direct Ireland Limited T/A Littlewoods Ireland for Marketing Offences

In January 2015 we received a complaint against Shop Direct Ireland Limited T/A Littlewoods Ireland from an individual who received an unsolicited marketing email after she opted out of marketing from the company. The individual, who was a customer of Littlewoods Ireland, complained further a few weeks later when she received a marketing email promoting offers for Mother’s Day from Littlewoods Ireland. We had previously issued a warning to Littlewoods Ireland in December 2014 following the investigation of a complaint received from the same complainant with regard to unsolicited marketing emails which she had received after she opted out of receiving marketing. That previous complaint led to an investigation which found that the customer had not been given the opportunity to opt out of marketing from Littlewoods when she opened her account. (She had been given the opportunity to opt out from third party marketing only – an option which she availed of). Arising from our investigation of that complaint, Littlewoods Ireland informed us that the customer’s email address was opted out of direct marketing from 7 March, 2014.

During the investigation of the 2015 complaints the solicitors acting for Littlewoods Ireland informed us that, following the conclusion of the previous complaint in December 2014, Littlewoods Ireland carried out a review of the customer’s account. It found that while she was correctly opted out of email marketing, she was not opted out of third party marketing. It then took steps to opt the customer out of third party marketing. When the update to the third party marketing preference was applied to the customer’s account in January 2015 a null value was applied to the email marketing field. The intention in applying this null value was to signify that no change was to be made to this field. However, the application of this value had the unintended consequence of opting the customer back into email marketing. Subsequently, as a result of this incorrect update, two marketing emails were sent to the customer in January 2015 and March 2015.
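For illustration only, the following short sketch (in Python, with hypothetical field names not taken from the company’s systems) shows how an update routine that writes a null value intended to mean “no change” can have exactly the effect described above, silently reversing a customer’s marketing opt-out, and how skipping null values avoids it.

    # Illustrative sketch only (hypothetical field names): a null meant to
    # signify "no change" overwrites an existing opt-out flag when updates
    # are applied blindly.
    def apply_update_unsafe(record: dict, update: dict) -> None:
        # Writes every supplied key, so a None meant as "no change" clobbers
        # the stored preference; downstream code then treats the customer
        # as not opted out of email marketing.
        for key, value in update.items():
            record[key] = value

    def apply_update_safe(record: dict, update: dict) -> None:
        # Skips None values, so only fields that genuinely changed are written.
        for key, value in update.items():
            if value is not None:
                record[key] = value

    customer = {"email_opt_out": True, "third_party_opt_out": False}
    apply_update_unsafe(customer, {"third_party_opt_out": True, "email_opt_out": None})
    print(customer["email_opt_out"])   # None -- the opt-out has been lost

    customer = {"email_opt_out": True, "third_party_opt_out": False}
    apply_update_safe(customer, {"third_party_opt_out": True, "email_opt_out": None})
    print(customer["email_opt_out"])   # True -- the opt-out is preserved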

The Data Protection Commissioner decided to prosecute the company. At Dublin Metropolitan District Court on 4 April 2016 Shop Direct Ireland Limited T/A Littlewoods Ireland pleaded guilty to one charge of sending an unsolicited marketing email without consent. The Court ordered the payment of €5,000 in the form of a charitable donation to Pieta House and it adjourned the matter for seven weeks. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charge.

 

7) Further Processing of an Individual's Personal Data in an Incompatible Manner

An individual submitted a complaint regarding the unfair processing of their personal data. The individual stated that they had received letters from Thornton’s Recycling and Oxigen Environmental respectively explaining that there would be a change-over of refuse collection services from Oxigen Environmental to Thornton’s Recycling within a week of the issuing of the letters. The complainant advised that they had not authorised the transfer of their personal details and had not been previously informed of this transfer of ownership.

We raised the matter with Oxigen Environmental, requesting an explanation as to the reason for processing personal data in this manner in light of the data protection requirements of fair obtaining and fair processing of personal data. Oxigen Environmental confirmed that the customer details that were transferred to Thorntons consisted of a name, address and any balance that remained on the customer’s pre-paid account. It advised that no banking details were passed over at any stage. It also alleged that a letter had been sent out to all customers advising them of the transfer and that this letter had been issued before any customer data had been transferred, but it was not able to clarify the date on which this had allegedly occurred.

Oxigen Environmental indicated that the first and only notification that customers received regarding the transfer of services from Oxigen Environmental to Thorntons Recycling was made by way of two letters, one each from Oxigen Environmental and Thorntons Recycling, contained in the same envelope delivered to customers. The interval between this notification and the transfer of services spanned less than four working days. We considered that this was an insufficient timeframe for customers to consider the change-over and to make alternative arrangements to prevent the further processing of personal data. Whilst the issue of takeovers/mergers is often covered by a company’s contractual terms with its customers, we established that Oxigen Environmental’s terms and conditions and Customer Charter did not cover such issues.

Taking into account the short timeframe that had elapsed between the notification of the transfer of services and the date from which the transfer became effective, our view was that the fair processing requirements under the Acts were not fulfilled. Whilst a proposal for amicable resolution was put forward, we were unable to conclude an amicable resolution of the complaint and a formal decision of the Commissioner issued in July 2016. The Commissioner found Oxigen Environmental to be in contravention of Section 2(1)(a) of the Data Protection Acts 1988 and 2003 in that it unfairly processed personal data without sufficient notice to its customers.

The requirement to provide proper notice of processing to data subjects in accordance with Section 2(1)(a) and Section 2D of the Data Protection Acts 1988 and 2003 is an essential pre-requisite to the lawful processing of personal data. A data subject has the right to be properly informed with adequate notice of a change in the ownership of a business holding his or her personal data, in order to be able to withdraw from the services being provided and prevent the further processing of their personal data (including preventing the transfer to a new owner) and to make alternative arrangements. The issue of what constitutes adequate notice will vary from case to case but in any event it must be at minimum a sufficient period that will allow a data subject to have a meaningful opportunity to consider the changes contemplated and to take steps to exercise their preferences in relation to the proposed changes.

 

8) Disclosure of Personal Information to a Third Party by a Data Processor

We received a complaint concerning the alleged unauthorised disclosure of the complainant’s personal information by An Post to a third party. The complainant, who had recently been bereaved, informed us that An Post had erroneously issued a valuation statement in respect of a joint savings deposit account that they had previously held with their late partner, to a solicitor acting on behalf of their late partner’s son. The statement contained the complainant’s personal financial data in relation to their joint State Savings account held with the National Treasury Management Agency (NTMA). Prior to making the complaint to this Office, the complainant had received an apology from An Post, on behalf of the NTMA, who acknowledged that the complainant’s personal information had been disclosed in error. However, because the complainant had received very little information as to how the disclosure had occurred they requested that we investigate this matter.

Although the complainant submitted a complaint against An Post, we established in our preliminary enquiries that An Post offers products and services on behalf of State Savings, which is the brand name used by the NTMA to describe the range of savings products offered by the NTMA to personal savers. An Post is therefore a "data processor" as defined under the Data Protection Acts 1988 and 2003, as it processes customers’ personal data on behalf of the NTMA. The NTMA is the "data controller" as defined under the Data Protection Acts 1988 and 2003, as it controls the content and use of its customers’ personal data for the purposes of managing their State Savings account.

 

We commenced an investigation by writing to the NTMA, which did not contest the fact that the complainant’s personal information had been disclosed. The NTMA stated that, having received a full report from its data processor, An Post, it had confirmed that, contrary to State Savings standard operating procedures, a valuation statement, which included details of an account held jointly by the complainant and their deceased partner, was sent to a solicitor acting on behalf of a third party. The NTMA acknowledged that the information should not have been sent to the third party and that correct procedures were not followed in this instance by the data processor.

The complainant chose not to accept an apology and goodwill gesture from the NTMA as an amicable resolution of their data protection complaint, opting instead to seek a formal decision of the Data Protection Commissioner.

A decision of the Data Protection Commissioner issued in July 2016. In her decision, the Commissioner formed the opinion that the NTMA contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 by processing the complainant’s personal information without their consent by way of the disclosure, by An Post as an agent of the NTMA, of the complainant’s personal information to a third party.

This case illustrates that it is vital for data controllers to ensure that their policies and procedures for the protection of personal data are properly and routinely adhered to by all staff. Staff awareness is key to this issue, but employers should also ensure that regular reviews of how those policies and procedures are applied in practice are carried out, so as to identify potential issues and enable the taking of appropriate remedial actions or changes to the practices, policies and procedures.

 

9) The Necessity to Give Clear Notice When Collecting Biometric Data at a Point of Entry

In October 2015, we received a complaint from a contractor in relation to the alleged unfair obtaining and processing of their personal data. The complainant stated that in the course of attending a data centre for work-related purposes the company had collected their biometric data without their consent and had also retained their passport until they had completed the training course. While the complainant had been advised in advance by the data controller to bring identification on the day of attendance at the data centre for security purposes, they had not been informed at that time that the data controller would be collecting their biometric data upon arrival at the data centre.

 

In the course of our investigation, we established that the data controller had collected the complainant’s biometric data upon their arrival at the data centre by way of a fingerprint scan. However, no information about this process had been provided to the complainant at that time; they were simply told that they could not go through security without this biometric fingerprinting. The data controller confirmed to us that the fingerprint scan data had not been retained; rather, it had been used to generate a numerical template which was then stored in encrypted form, and that numerical information was associated with a temporary access badge provided to the complainant for the duration of their attendance at the data centre. The data controller confirmed that it had deleted this information from its system and back-up files at the data subject’s request upon the data subject’s departure from the data centre. The data controller further confirmed that, while it had retained the complainant’s passport for the duration of the complainant’s attendance at the data centre pursuant to a policy to ensure the return of temporary access badges, it had not taken or retained a copy of the passport.
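
For illustration only, the short Python sketch below shows the kind of processing described above: only a derived numerical value, stored in encrypted form and linked to a temporary badge, is retained, and it is erased on departure. All names are hypothetical, the cryptography library's Fernet cipher stands in for whatever encryption the controller actually used, and a hash is a gross simplification of a real biometric template (real systems use fuzzy, minutiae-based matching); it is a sketch of the principle, not the data controller's implementation.

```python
import hashlib
from cryptography.fernet import Fernet

# Hypothetical in-memory store: badge ID -> encrypted numerical template.
# The raw fingerprint scan itself is never stored.
_fernet = Fernet(Fernet.generate_key())
_badge_store = {}

def enrol_visitor(raw_scan: bytes, badge_id: str) -> None:
    """Reduce the raw scan to a numerical template, encrypt it and associate it
    with the visitor's temporary access badge; discard the raw scan."""
    template = hashlib.sha256(raw_scan).hexdigest()  # simplified stand-in for a biometric template
    _badge_store[badge_id] = _fernet.encrypt(template.encode())

def delete_on_departure(badge_id: str) -> None:
    """Erase the stored template when the visitor leaves (any back-up copies
    would also need to be purged, as the controller confirmed it had done)."""
    _badge_store.pop(badge_id, None)

enrol_visitor(b"<raw fingerprint scan bytes>", badge_id="TEMP-0142")
delete_on_departure("TEMP-0142")
```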

The complainant in this case did not wish to accept the offer of amicable resolution made by the data controller and instead requested that the Commissioner make a formal decision on their complaint.

 

The decision by the Data Protection Commissioner in October 2016 found that the data controller contravened Section 2(1)(a) and Section 2D(1) of the Data Protection Acts 1988 and 2003, as the data controller should have informed the complainant of the purposes of the collection and processing of the biometric data, the period for which it would be held and the manner in which it would be retained, used and, if applicable, disclosed to third parties. This could have been done by the data controller either when it was in contact with the complainant to advise them of the requirement to bring identification to gain entry to the data centre or, at the latest, at the time the complainant arrived at the data centre.

However, in relation to the obtaining and processing of the complainant’s biometric data, having reviewed the information provided by the data controller in the course of the investigation by this office, the Data Protection Commissioner found that the data controller had a legitimate interest under Section 2A(1)(d) of the Acts in implementing appropriate security procedures for the purposes of safeguarding the security of the data centre, in particular for the purposes of regulating and controlling access by third parties to the data centre. Given that the biometric data was used solely for the purposes of access at the data centre, was not transferred to any other party and was deleted in its entirety at the data subject’s request upon departing the data centre, the Data Protection Commissioner’s view was that this did not amount to potential prejudice which outweighed the legitimate interests of the data controller in protecting the integrity of the data centre and preventing unauthorised access to it. Accordingly, the Data Protection Commissioner concluded that the data controller had a legal basis for processing the complainant’s biometric data.

In relation to the retention of the complainant’s passport for the duration of their visit to the data centre, the Commissioner found that this did not give rise to any contravention of the Data Protection Acts 1988 and 2003, as the data controller had a legitimate interest in doing so and the limited processing of the complainant’s passport information (i.e. the retention of the passport itself) did not give rise to any disproportionate interference with the complainant’s fundamental rights.

Transparency is a key principle under data protection law and the giving of notice of processing of personal data to a data subject is a major element of demonstrating compliance with this principle. In particular, the central tenet that individuals whose data is collected and processed should not generally be “surprised” at the collection and processing or its scale or scope, should inform all aspects of a data controller’s data processing operations.

 

10) Residential Care Home's Legitimate Use of Audio Recording and Photograph of Data Subject Concerning Allegations of Misconduct

We received a complaint from a former employee of a residential care home who claimed that photographic evidence and an audio recording of them were used in a disciplinary case against them by their employer resulting in their dismissal.

During our investigation, the complainant’s former employer (the operators of the residential care home) advised us that a formal, externally led investigation had been conducted into allegations that the complainant had been found by a supervisor to be asleep during a night shift on two separate occasions. On the nights in question, the complainant had been the sole staff member on duty responsible for the care of a number of highly vulnerable and dependent adults who had complex medical and care needs and who needed to be checked regularly. Having discovered the complainant asleep on the first occasion, the supervisor had warned the complainant that if it happened again it would be reported in line with the employer’s grievance and disciplinary procedure. On the second occasion, when the supervisor discovered the complainant to be asleep, fully covered by a duvet on a recliner with the lights in the room dimmed and the television off, the supervisor had used their personal phone to take photographs of the complainant sleeping and make a sound recording of the complainant snoring. The allegations had been upheld by the investigation team and a report prepared. This was followed by a disciplinary hearing convened by the employer. The employer had informed the complainant at that hearing that it accepted the verbal and written account given by the supervisor. The employer had found that the act of sleeping on duty constituted gross misconduct in light of the vulnerabilities and dependencies of the clients in the complainant’s care and the complainant had been dismissed.

Having regard to the information supplied to us by the operators of the residential care home and, in particular, the vulnerability of the clients involved and the nature of the complainant’s duties, we formed the view that no breach of the Data Protection Acts 1988 and 2003 had occurred. In this case, we considered that the processing of the complainant’s data, by way of the photograph and audio recording made by the supervisor, and the subsequent disclosure of these to the employer was necessary for the purposes of the legitimate interests pursued by the data controller, the employer, under Section 2A(1)(d) of the Data Protection Acts 1988 and 2003. This legal basis for processing requires the balancing of the data controller’s (or a third party’s or parties’) legitimate interests against the fundamental rights and freedoms or legitimate interests of the data subject, including an evaluation of any prejudice caused to those rights of the data subject.

We considered that the processing of personal data here was limited in nature and scope as it consisted of a one-off taking of a photograph and the making of an audio recording by the supervisor, who acted of their own volition and not in response to any direction or request from the employer. There had been limited further disclosure of the personal data concerned afterwards, i.e. to the employer, while the original photograph and recording were deleted from the supervisor’s phone. A copy of the material had also been provided to the complainant in advance of the complainant meeting the investigation team. We therefore considered that, in the circumstances, the processing was proportionate and that the legitimate interests of the data controller (and indeed the legitimate interests of third parties, being the clients of the residential care home) outweighed the complainant’s right to protection of their personal data.

While the right to protection of one’s personal data attracts statutory protection within the national legal system and, moreover, is a fundamental right under EU law, such rights are not absolute. Accordingly, they must be interpreted to allow a fair balance to be struck between the various rights guaranteed by the EU legal order. In particular, as this case demonstrates, data-protection rights should not be used to ‘trump’ the rights of particularly vulnerable members of society or the legitimate interests pursued by those organisations responsible for safeguarding the health and life of such persons in discharging their duties of care and protection.

 

11) Disclosure of Personal Information to a Third Party

We received two complaints from public servants (a husband and wife) whose personal data was disclosed by PeoplePoint, the human resources and pension shared services for public service employees. The initial complainant, in November 2015, stated that after applying for annual leave, he subsequently made an application to change this request to sick leave. The officer in PeoplePoint responsible for this section proceeded to email the complainant’s line manager at the government department in which the complainant worked. However, on receiving an ‘out-of-office’ reply the officer proceeded to email the complainant’s non-supervisory peer. PeoplePoint had notified us of the breach in June 2015. However, on commencing an investigation and receiving a copy of the email at the centre of the breach, we established that the personal data of the complainant’s spouse, who was also a public servant in a different department, was also contained in the email and that the email had been sent to three third parties. It became apparent that the official in PeoplePoint, when considering the initial complainant’s annual leave, had also accessed his spouse’s personal information without the authorisation of her employer or her consent.

Upon further investigation into this matter it became apparent that the PeoplePoint official had disclosed information relating to the complainant to the complainant’s spouse and their colleagues when they had no legal basis to do so and without any authority from the data controller of that personal data, i.e. the employer.

PeoplePoint was subject to an audit by this Office. In relation to this complaint, it informed us that, upon being made aware of the breach, it acted to retrieve the data and confirmed that the data had been deleted by all parties involved. It also stated that corrective action had been taken to improve the relevant official’s awareness of data privacy. Whilst a proposal for amicable resolution was put forward by PeoplePoint, the complainants declined it and requested a formal decision of the Commissioner.

The Commissioner formed the opinion that Section 21(1) of the Data Protection Acts 1988 and 2003 had been contravened. PeoplePoint is a data processor engaged by the data controller (being the relevant government department which is the employer) and, as such, the data processor owes a duty of care to the data subjects whose personal data it is processing. Under Section 21, a data processor must not disclose personal data without the prior authority of the data controller on behalf of whom the data are processed.

This case is a stark reminder to data processors of the importance of processing data only with the prior consent of the data subject or the data controller. Actions in relation to personal data which may appear innocuous to ill-informed staff can have serious ramifications for data subjects. It is not acceptable for data processors and data controllers to rely on an excuse that an employee did not realise that what they were doing was a breach of data protection law. It is the responsibility of such employers to ensure that all staff are appropriately trained and supervised in relation to the processing of personal data, in order to minimise, to the greatest degree possible, the risks to the fundamental rights and freedoms of data subjects whose personal data they process.

 

12) Failure of a Data Controller to Keep Individual's Personal Information Accurate and Up to Date Which Resulted in the Disclosure of Personal Data to a Third Party

We received a complaint in February 2015 concerning the alleged unauthorised disclosure by Permanent TSB (PTSB) of the data subject’s personal information to a third party. In this complaint the data subject stated that she had lived at a property with her ex-husband, that the mortgage for this property was a joint account in both her and her ex-husband’s names and that she was subsequently removed from this mortgage as part of a divorce settlement. The data subject informed this Office that she subsequently took out a separate mortgage with PTSB, solely in her own name, for a different property. However, PTSB had sent a letter of demand, addressed to her at her new property and also addressed to a third-party property with which she had never been associated. The complainant’s ex-husband had been raised at this property; his stepmother was still living there and she had opened the PTSB letter of demand and notified her stepson (the data subject’s ex-husband), who in turn had notified the data subject. We commenced an investigation and PTSB accepted that the data subject’s personal data had been disclosed to a third party. PTSB informed us that this had occurred because the third-party address (which the data subject had provided to PTSB as a correspondence address when applying for the previous loan which she held with her ex-husband) was incorrectly linked to the entirely separate subsequent mortgage loan in the data subject’s sole name.

We sought an amicable resolution of this complaint but the proposal which PTSB made to the data subject was declined and she instead sought a formal decision of the Commissioner.

The Commissioner found that PTSB had contravened both Section 2A(1) of the Data Protection Acts 1988 and 2003 by processing the data subject’s personal data without her consent or another legitimate basis for doing so and also Section 2(1)(b) by failing to keep her personal data accurate, complete and up to date.

The circumstances of this complaint are a case in point as to the rationale behind the principle that personal data must be kept accurate, complete and up to date. Failure to adhere to this principle, particularly in the context of contact information, perpetuates the risk that further data protection failures (such as unauthorised disclosure to third parties) will flow from such non-compliance.

 

13) Failure by BOI to Properly Verify the Identity of Individual on the Phone Which Resulted in the Disclosure of Personal Information to a Third Party

We received a complaint that Bank of Ireland (BOI) had disclosed the complainant’s personal information to a third party. BOI had notified the complainant of this disclosure, which occurred when, in an attempt to contact him regarding his account, a member of BOI staff called his mobile and did not get an answer. BOI stated that, as the staff member could not contact him on his mobile, they then attempted to contact him via the landline number listed on his account. According to BOI’s notification, the complainant’s mother had answered the phone and the BOI advisor requested to speak with the complainant, who shares his name with his father, and explained to the complainant’s mother that they could not discuss the account with her as she was not listed on the account. Because the advisor referred to the complainant only by his surname, Mr X, his mother mistakenly thought the call was in relation to the account she held with her husband, who is also called Mr X. BOI’s position was that the complainant’s mother was adamant that she was listed on the account and therefore the advisor should speak to her about it. Certain information was then provided to the complainant’s mother regarding his account.

We commenced the investigation of this complaint by writing to BOI asking it to confirm if it had already reported this breach to us, as is considered good practice under our “Personal Data Security Code of Practice”. BOI did not contest the fact that the complainant’s personal data had been disclosed and it confirmed that the breach had been previously reported to us. BOI indicated that some confusion had arisen, due to the complainant’s father having the same name as him and having a banking relationship with the same bank branch, and that, as a result of this confusion, BOI failed to properly identify the person with whom it was dealing and disclosed the complainant’s personal information to a third party. BOI claimed that it was only made aware of the disclosure of his personal information when the complainant’s mother phoned the advisor later that day to inform BOI that the complainant was her son and that the information was in relation to his loan accounts. BOI also advised us that a letter of apology had been issued to the complainant.

The complainant in this case declined the offer of amicable resolution which was made by BOI and requested a formal decision of the Commissioner.

The Commissioner concluded in her June 2016 decision that BOI contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 when it processed the complainant’s personal information without his consent by disclosing it to a third party.

This case is a further demonstration of how a simple failure by a staff member to rigorously adhere to the requirement to verify a data subject’s identity before disclosing their personal data can result in unauthorised disclosure of personal data. While the circumstances of this case involved the verbal unauthorised disclosure of personal data to a family member of the data subject concerned, this in no way makes it any less serious than if it had been a written disclosure to an unrelated third party.

 

14) Data Controller Obliged to Demonstrate Effort Made to Locate Data Within the Statutory 40 Day Period

We received a complaint from an individual concerning an access request which they had submitted to Meteor seeking a copy of their personal data and, in particular, the recordings of calls which they had made to Meteor Customer Care during a particular period. Meteor responded initially to their request by stating that only 10% of calls to its Customer Care line are recorded and retained for 30 days and that there was no guarantee that their calls from the previous 30 days had been recorded. Meteor subsequently replied to the complainant’s access request definitively stating that there were no calls recorded and available in relation to the complainant.

We commenced an investigation of the complaint, requesting information from Meteor in relation to the efforts it had undertaken to retrieve the call recordings which were the subject of the access request, as well as information on the locations and/or business units to which enquiries were made in relation to the requester’s access request. Meteor supplied us with a printout showing the searches undertaken and it responded that it did not hold any calls in relation to the complainant.

In this case, compliance with the 40-day period for responding to an access request under the Data Protection Acts 1988 and 2003 was at issue. The complainant had made a valid access request to Meteor by email dated 24 August 2015. Meteor had finally responded to the requester by email on 29 October 2015 with a substantive answer. This substantive response to the access request fell nearly four weeks outside the 40-day statutory period for responding. Furthermore, Meteor did not provide us with any evidence that it had commenced the search for the call recordings which the complainant had sought within that 40-day period, but instead chose to rely on its policy that only 10% of Customer Care line calls are recorded and simply assumed that the complainant’s calls had not been recorded.
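
For illustration, the deadline arithmetic works out as follows (a minimal sketch, assuming the 40-day period runs from the date the valid request was received):

```python
from datetime import date, timedelta

request_received = date(2015, 8, 24)                     # valid access request sent by email
statutory_deadline = request_received + timedelta(days=40)
substantive_response = date(2015, 10, 29)

print(statutory_deadline)                                # 2015-10-03
print((substantive_response - statutory_deadline).days)  # 26 days late, i.e. nearly four weeks
```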

Despite attempting to amicably resolve this complaint we were unable to do so and the data subject requested a formal decision from the Data Protection Commissioner. In her decision the Data Protection Commissioner concluded that Meteor had contravened the Data Protection Acts 1988 and 2003 by not responding to the complainant’s access request within the 40 day period as provided for under Section 4(1)(a).

This case demonstrates that a data controller must not approach a valid data access request on a simple assumption that it does not hold the personal data which is sought. Irrespective of the circumstances of the request, any policies employed or assumptions held by a data controller, it must take all steps necessary to establish in fact whether the requested data is, or is not, held by the data controller and to respond substantively to the access request within the 40-day statutory period. The right of access of a data subject is one of the cornerstones of the protection of an individual's personal data and this right must not be stymied by the actions of data controllers, whether unintentional or otherwise.

 

15) Personal Data Withheld from an Access Request by Airbnb on the Basis of an Opinion Given in Confidence

We received a complaint in July 2016 from an individual (an Airbnb guest) concerning an access request which he had submitted to Airbnb. The essence of the complaint was that Airbnb had not provided the guest with a particular email about him which had been sent to Airbnb by the host of Airbnb accommodation which the guest had rented. That email related to a complaint by the host about the guest. In responding to the guest’s access request, Airbnb had withheld this email on the basis that it consisted of an expression of opinion given in confidence by the host.

Of relevance here was Section 4(4A)(a) of the Data Protection Acts 1988 and 2003 which allows for personal data which consists of an expression of opinion about the data subject by another person to be disclosed by the data controller to the data subject in response to an access request without the need to obtain the consent of the person who gave the opinion. Equally relevant was Section 4(4A)(b)(ii) of the Data Protection Acts 1988 and 2003 which provides for an exemption from the right of access to personal data where the personal data consists of the expression of an opinion about the data subject by another person which has been given in confidence or on the understanding that it could be treated as confidential.

We commenced an investigation which examined in particular whether the email in question from the host to the data controller, Airbnb, consisted of the expression of a confidential opinion by the host about the guest. We found that the content of the email in question was predominantly factual in nature. While one element of the email comprised an expression of opinion, there was no reference or indication in the email to an expectation on the part of the host that the contents of the email would be kept confidential or not disclosed by Airbnb to the guest. In fact, we noted that in another email directly from the host to the guest, the host had indicated to the guest that they had contacted Airbnb about the guest.

While Airbnb was clearly trying to fairly balance the rights of the guest against the rights of the host in this case, it was our view based on our examination of the issues and communications involved that there was no evidence at all of an expectation or understanding by the host that their email about the guest would not be released to him. In those circumstances no exemption from the right of access applied under Section 4(4A)(b)(ii). Airbnb accepted our position and accordingly released the email in question to the guest. This allowed the complaint to be amicably resolved.

As this case demonstrates, before withholding personal data on the basis that it consists of the expression of an opinion given in confidence or on the understanding that it could be treated as confidential, a data controller must ensure that there is a solid basis for such an assertion. It is not enough for a data controller to simply assume that this was the case in the absence of any indication to this effect from the person who expressed the opinion.

Furthermore, the inclusion of an opinion which attracts this exemption does not mean that all other personal data which is contained within the same document is similarly exempt from the right of access. Rather, in the context of a full document of personal data, the data subject is entitled to access the personal data within it which is not an opinion given in confidence and the data controller may only redact the part or parts to which the exemption validly applies. Opinions about individuals in respect of which no expectation of confidentiality can be shown to apply, or indeed information which is simply confidential, are not exempt from an access request.

As outlined in our published guidance, an opinion given in confidence on the understanding that it will be kept confidential must satisfy a high threshold of confidentiality. Simply placing the word "confidential" at the top of the page, for example, will not automatically render the data confidential. In considering the purported application of this exemption to a right of access, we will examine the data and its context and will need to be satisfied that the data would not otherwise have been given but for this understanding of confidentiality.

 

16) Crypto Ransomware Attack on a Primary School

In October 2016, we received a breach report from a primary school that had been the victim of a “Crypto Ransomware” attack, whereby parts of the school’s information systems had been encrypted by a third party thereby rendering the school’s files inaccessible. These files contained personal details including names, dates of birth and Personal Public Service Numbers (PPSNs). A ransom was demanded from the school to release the encrypted files.

Our assessment of the attack identified that the school had deficiencies in the measures it had taken to secure pupils’ personal data including:

  • No policies or procedures were in place to maintain adequate backups;
  • No procedures or policy documents existed focusing on system attacks such as ransomware or viruses;
  • No contracts with data processors (the ICT services providers) were in place (as is required under Section 2C(3) of the Data Protection Acts 1988 and 2003) setting out their obligations and, as a result, actions taken by the ICT suppliers were inadequate in response to the attack; and
  • A lack of staff training and awareness of the risks associated with opening unknown email attachments or files.

We considered that the school had contravened the provisions of Section 2(1)(d) of the Acts, having failed to ensure that adequate security measures were in place to protect against the unauthorised processing and disclosure of personal data.

Recommendations were issued to the school that it take steps to mitigate the risks identified. The school subsequently informed us that it had taken the following steps based on the recommendations issued:

  • Implementation of a staff training and awareness programme on the risks associated with email and the use of personal USB keys;
  • Implementation of a contract review process to ensure appropriate contracts are in place with its ICT suppliers; and
  • Steps to ensure that any ICT support the school engages, whether on a local basis or as recommended by the Board, is performed by competent data processors.

This case demonstrates that schools, like any other organisation (commercial, public sector or private) operating electronic data storage systems and interacting online, must ensure that they have appropriate technical security and organisational measures in place to prevent loss of personal data, and to ensure that they can restore data in the event of a Crypto Ransomware attack.
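
As an illustration of one such measure, the sketch below shows a basic dated backup with a recorded checksum so that a restore can be verified after an incident. It is a minimal, hypothetical example (the file paths and function names are assumptions), not the remediation the school actually implemented; in practice backup copies should also be kept offline or off-site so that ransomware cannot reach them.

```python
import hashlib
import tarfile
from datetime import date
from pathlib import Path

def back_up(source_dir: str, backup_root: str) -> Path:
    """Create a dated, compressed copy of the records directory and record a
    checksum so that a later restore can be verified."""
    source = Path(source_dir)
    dest = Path(backup_root) / f"{source.name}-{date.today().isoformat()}.tar.gz"
    with tarfile.open(dest, "w:gz") as archive:
        archive.add(source, arcname=source.name)
    digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    dest.with_name(dest.name + ".sha256").write_text(digest)
    return dest

def backup_is_intact(backup_file: str) -> bool:
    """Confirm the archive still matches its recorded checksum before restoring."""
    backup = Path(backup_file)
    recorded = backup.with_name(backup.name + ".sha256").read_text().strip()
    return hashlib.sha256(backup.read_bytes()).hexdigest() == recorded
```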

 

17) Data Breach at an Online Retailer

In July 2016, we received a breach report from an organisation operating retail and online sales. The organisation had been notified by a customer that their credit card had been used in a fraudulent transaction without their knowledge, which they believed arose from having provided their payment details to the organisation online.

 

The organisation engaged an expert third party to conduct an analysis of its website. It was determined that the payments system on the website had been compromised by malware for the previous 6-8 weeks. The malware copied data entered by customers during the online payment stage to an external destination.

 

Our assessment of the breach identified that there were deficiencies in the measures which the organisation had taken to secure users’ personal data, including the following:

  • No contract or service level agreement existed between the data controller and the data processor.
  • No steps were taken to ensure that the data processor was compliant with technical security and organisational measures.
  • Insufficient technical security and organisational measures were in place to:
    • ensure that the server and website platform were maintained and that the software versions were up to date;
    • ensure that appropriate user authentication and access control measures were in place;
    • ensure appropriate technical security was in place, such as secure configuration of the website platform, measures to detect malware, measures to monitor suspicious activity and measures to ensure regular backups were taken; and
    • ensure governance processes were in place such as periodic reviews of the data processor and its technical security and organisational measures.

In light of the above, we considered that the organisation had contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, its users’ personal data.
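
As a hedged illustration of the kind of technical measure referred to above (for example, "measures to detect malware" and "measures to monitor suspicious activity"), the sketch below records checksums of the website platform files when they are known to be clean and reports any later changes. It is a hypothetical baseline file-integrity check, not the organisation's actual controls, and the paths and function names are assumptions.

```python
import hashlib
import json
from pathlib import Path

def snapshot(webroot: str, manifest: str) -> None:
    """Record a checksum for every file under the web root, taken at a point
    when the platform is known to be clean (e.g. just after a patched deployment)."""
    hashes = {
        str(path.relative_to(webroot)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in Path(webroot).rglob("*") if path.is_file()
    }
    Path(manifest).write_text(json.dumps(hashes, indent=2, sort_keys=True))

def changed_files(webroot: str, manifest: str) -> list:
    """List files added or modified since the snapshot; unexpected changes to
    payment-page scripts would warrant immediate investigation."""
    recorded = json.loads(Path(manifest).read_text())
    current = {
        str(path.relative_to(webroot)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in Path(webroot).rglob("*") if path.is_file()
    }
    return sorted(name for name, digest in current.items() if recorded.get(name) != digest)
```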

Recommendations were issued to the organisation that it take steps to mitigate the risks identified. The organisation subsequently informed us that it had taken the following steps to address the recommendations:

  • Contracts are now in place to ensure that the appropriate technical security and organisational measures are in operation;
  • The organisation conducts regular reviews of the server and website platforms to ensure they are maintained and that the software versions are up to date;
  • The organisation conducts annual reviews by a third party expert to ensure compliance and to independently validate that the appropriate technical security and organisational measures are in place.

This case highlights the need for organisations to ensure that they have appropriate technical security and organisational measures for ICT security in place, particularly when engaging a data processor. Organisations should be cognisant of the measures outlined under Section 2C of the Acts to understand their obligations, in particular:

  • To ensure that appropriate security measures are in place;
  • To take reasonable steps to ensure that employees of the data controller and any other persons associated with the processing (for example, employees of a data processor) are aware of their obligations;
  • To ensure that proper contractual agreements are in place governing the processing; and
  • To take reasonable steps to ensure compliance with those measures.

 

18)  Incorrect Association of an Individual's Personal Details with Another File

We received a complaint concerning an alleged breach of an individual’s data protection rights by an insurance company.

During our investigation, the insurer (Insurer X) advised us that the complainant had in the past requested a quotation for household insurance from another insurance company (Insurer Y), the undertakings of which had been transferred to Insurer X. Insurer Y had failed to delete the quotation (the complainant had never proceeded to take out a policy) in line with its own data retention policy. In addition, Insurer Y had mistakenly linked the complainant’s personal details on the quotation to an insurance claim file in respect of a claim it had received from a person with an identical name.

When the transfer of Insurer Y's undertakings to Insurer X was being completed, the insurance claim file which mistakenly included the complainant as the claimant (rather than another individual who had the same name) was transferred to Insurer X. The claim, when later assessed, turned out to be fraudulent, and Insurer X had its solicitors write to the complainant advising that their claim had been found to be fraudulent and indicating the follow-up action which Insurer X intended to pursue to protect its interests.

At its centre, this case concerned sloppy handling of personal data. Many people in Ireland have the same name and there was no reason why the complainant’s personal details collected when the complainant obtained a quotation should have been added to an insurance claim file. Sufficient checks and balances should have existed in Insurer Y's data handling processes. However, the more significant issue that arose for this complainant is that they were unable to ascertain, prior to our involvement, how their details came to be in the possession of Insurer X and how the issue that arose had come about.
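
To make the missing check concrete, the minimal sketch below (hypothetical field names) only permits records to be linked on a unique customer reference; a matching name alone is never treated as sufficient. This is an illustration of the principle rather than either insurer's actual system.

```python
def can_link(quotation: dict, claim: dict) -> bool:
    """Permit linking only where a unique customer reference matches;
    a matching name on its own is not a sufficient basis to merge records."""
    reference = quotation.get("customer_ref")
    return reference is not None and reference == claim.get("customer_ref")

quotation = {"name": "Pat Murphy", "customer_ref": "CUST-10231"}
claim = {"name": "Pat Murphy", "customer_ref": "CUST-88412"}

print(can_link(quotation, claim))   # False: same name, but different customers
```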

A number of contraventions therefore occurred in this case: a breach of the requirement of a reasonable retention period, due to holding onto the quotation data longer than necessary and longer than was set out in the company’s own retention policy; unlawful further processing of the personal data by associating it with a claim file; and failure to respond in a clear and timely manner to the complainant to explain how their data had been sourced and how it came to be processed in the way that it was. The complainant in this case suffered particularly serious consequences, as they incurred significant legal costs in defending the accusation of making a fraudulent claim and the threat by Insurer X of instigating Circuit Court proceedings against them.

 

19) Prosecution of The Irish Times Limited for Marketing Offences

On 28 April 2015 we received a complaint from an individual who received an unsolicited marketing email earlier that day from The Irish Times Limited in the form of a “Get Swimming” newsletter. He explained that he signed up for the “Get Swimming” newsletter some months previously and he told us that he opted out after the receipt of the third or fourth issue by using the unsubscribe instruction at the bottom of the newsletter. However, he claimed that The Irish Times Limited continued to send him the “Get Swimming” newsletter each week thereafter and he continued to unsubscribe using the unsubscribe instruction. He informed us that he also emailed Customer Care in The Irish Times Limited on 21 April 2015 asking to be removed from the newsletter and warning that if not, he would report the matter to the Data Protection Commissioner. Customer Care responded on the same day stating that they would remove him from the newsletter immediately. However, he received a further newsletter one week later.

In response to our investigation, The Irish Times Limited stated that this was a once-off issue that arose from a human error in configuring the unsubscribe process, which had subsequently been fixed. It confirmed that sixty-four other users were affected. It informed us that a procedure had been put in place to prevent a recurrence.

The Data Protection Commissioner had previously issued a warning to The Irish Times Limited in November 2012 following the investigation of a complaint from a different individual in relation to marketing emails which he continued to receive after he had opted out of the receipt of such emails.

The Data Protection Commissioner decided to prosecute the company. At Dublin Metropolitan District Court on 4 April 2016, The Irish Times Limited pleaded guilty to one charge of sending an unsolicited marketing email without consent. The Court ordered the payment of €3,000 in the form of a charitable donation to Pieta House and it adjourned the matter for seven weeks. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charge.

 

20) Prosecution of Coopers Marquees Limited for Marketing Offences

In September 2015 we received a complaint from an individual about a marketing email which she received a few weeks earlier from Coopers Marquees Limited. The same individual had previously complained to us in January 2014 after she received a marketing email from that company which, she stated, she had not consented to receiving. During the course of our investigation of the first complaint, the company undertook to remove the individual’s email address from its marketing database. We concluded that complaint by issuing a warning to the company that the Data Protection Commissioner would likely prosecute it if it re-offended.

In response to our investigation of the second complaint, we were informed that a new marketing executive for the company had used an old version of the marketing database for a marketing campaign. This resulted in the sending of the offending marketing email to the email address of the individual, even though her details had been removed over a year earlier. The company accepted that it did not have consent to contact the individual concerned by email and it claimed that human error on the part of the new staff member had caused the email to be sent. The Data Protection Commissioner decided to prosecute the company.

At Virginia District Court on 7 June 2016, Coopers Marquees Limited pleaded guilty to one charge of sending an unsolicited email without consent. The Court ordered a contribution in the amount of €300 as a charitable donation to Mullagh Scout Troop and it indicated that it would apply the Probation of Offenders Act in lieu of a conviction. The defendant company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

 

21) Prosecution of Robert Lynch T/A The Energy Centre for Marketing Offences

In January 2015 two individuals complained to us about unsolicited marketing calls which they received from The Energy Centre on their landline telephones. In the case of both complainants, their telephone numbers stood recorded on the National Directory Database (NDD) Opt-Out Register. In the case of the first complainant, he informed us that he received an unsolicited marketing call on 5 January 2015 during which the caller offered to arrange to conduct a survey of his home for the purpose of recommending energy saving initiatives that The Energy Centre could sell him. The complainant said that he told the caller not to call him again and he pointed out that his number was on the NDD Opt-Out Register. Three days later, the complainant received a further unsolicited marketing call from The Energy Centre. In the case of the second complainant, he received an unsolicited marketing phone call on 23 January 2015 from a caller from The Energy Centre who told him that there were sales agents in his area and that she wished to book an appointment for one of them to visit his home. The same complainant had previously complained to us in November 2013 having received an unsolicited marketing phone call from the same entity at that time. His first complaint was amicably resolved when he received a letter of apology, a goodwill gesture and an assurance that steps had been taken to ensure that he would not receive any further marketing calls.

By way of explanation during the course of our investigation of the two complaints received in January 2015, The Energy Centre indicated that its IT expert had examined the matter and concluded that there had been human error somewhere along the line when someone transferred some telephone numbers from a non-contact list back into the system to be contacted.

The Data Protection Commissioner had previously issued a warning to The Energy Centre following the investigation of a complaint from a different individual in relation to unsolicited marketing calls which he received on his landline telephone while his number was recorded on the NDD Opt-Out Register.

The Data Protection Commissioner decided to prosecute. At Drogheda District Court on 21 June 2016, Robert Lynch T/A The Energy Centre pleaded guilty to three charges of making unsolicited marketing telephone calls to the telephone numbers of two individuals whose numbers were recorded on the NDD Opt-Out Register. In relation to the first case where the complainant’s number was called on two occasions three days apart, the Court convicted the defendant in respect of the charge for the second telephone call, it applied a fine of €100 and it took the other charge in relation to the first telephone call into account. In relation to the second case, the Court applied the Probation of Offenders Act in respect of that charge. The defendant agreed to pay the prosecution costs incurred by the Data Protection Commissioner.

 

22) Prosecution of Paddy Power Betfair Public Limited Company for Marketing Offences

In June 2016 an individual complained to us about marketing text messages he was receiving from Paddy Power Betfair Plc and he also alleged that the ‘stop’ command at the end of the text messages was not working. He stated that he had never placed a bet with Paddy Power Betfair Plc but he recalled having used its Wi-Fi once.

During our investigation of this case, the company, in relation to the allegation that the ‘stop’ command was not working, admitted that there were technical issues with the opt-out service of its text provider and stated that it had acted immediately to rectify this once it became aware of it. On the matter of marketing consent, the company informed our investigation that the complainant had logged onto the Wi-Fi at its Lower Baggot Street, Dublin outlet in April 2016. It described how a user must enter their mobile phone number on the sign-in page, following which they receive a PIN to their phone which enables the user to proceed. After entering the PIN correctly, the customer is presented with a tick box to accept the terms of service, which include a privacy policy. Having examined the matter, we advised Paddy Power Betfair Plc that we did not see any evidence that the user was given an opportunity to opt out of marketing, as is required by S.I. 336 of 2011 (the ePrivacy Regulations). We formed the view that the company was unable to demonstrate that the complainant had unambiguously consented to the receipt of marketing communications. The company understood our position and it undertook to work with its Wi-Fi providers to add the required marketing consent tick box on its registration page. It also immediately excluded all mobile phone numbers acquired through the Wi-Fi portals from further marketing communications.
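
As a hedged illustration of what the required consent capture might look like, the sketch below treats marketing consent as a separate, unticked choice recorded with a timestamp, so that only numbers that actively opted in are ever added to a marketing list. The function and field names are assumptions for illustration and do not describe Paddy Power Betfair Plc's actual registration page.

```python
from datetime import datetime, timezone

marketing_list = []   # numbers that may lawfully be contacted, with evidence of consent

def register_wifi_user(mobile_number: str, accepted_terms: bool,
                       marketing_opt_in: bool = False) -> None:
    """Accepting the terms of service is required to use the Wi-Fi; marketing
    consent is a separate, unticked choice recorded with a timestamp so that
    the controller can later demonstrate it."""
    if not accepted_terms:
        raise ValueError("Terms of service must be accepted to use the Wi-Fi")
    if marketing_opt_in:   # only numbers actively opted in are ever marketed to
        marketing_list.append({
            "number": mobile_number,
            "consented_at": datetime.now(timezone.utc).isoformat(),
            "source": "wifi-portal sign-up",
        })

# Signing in without ticking the marketing box must not add the number to the list.
register_wifi_user("+353851234567", accepted_terms=True)
print(marketing_list)   # []
```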

The Data Protection Commissioner decided to prosecute the company. A warning had previously been issued to the company in 2015 following the investigation of a complaint from a different individual who continued to receive marketing text messages after opting out.

At Dublin Metropolitan District Court on 28 November 2016, Paddy Power Betfair Plc pleaded guilty to one charge of sending an unsolicited marketing text message without consent and one charge of not providing the recipient with a valid means of opting out of the receipt of further marketing messages. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to the Simon Community by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

 

23) Prosecution of Trailfinders Ireland Limited for Marketing Offences

A complaint was lodged with us in June 2016 by an individual who received unsolicited marketing emails at that time from Trailfinders Ireland Limited despite having been informed previously that her email address had been removed from the company’s marketing database in August 2015. In its response to our investigation, the company acknowledged that the offending emails were sent in error. It explained that it had received a written communication about a customer care issue from the complainant a few days prior to the sending of the marketing emails and that its Customer Care team had updated her case concerning that particular issue. This update triggered an automated process which inserted the complainant’s email address into its marketing database. Trailfinders Ireland Limited apologised for the system error and it said that it should not have happened in any circumstances.

On foot of a previous complaint in 2015 against Trailfinders Ireland Limited from the same complainant concerning unsolicited marketing emails to which she had not consented, the Data Protection Commissioner had issued a warning to the company in January 2016. Following our investigation of the second complaint, the Data Protection Commissioner decided to prosecute the company.

At Dublin Metropolitan District Court on 28 November 2016, Trailfinders Ireland Limited pleaded guilty to two charges of sending unsolicited marketing emails without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to the Simon Community by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

 

24) Prosecution of Topaz (Local Fuels) Limited for Marketing Offences

In July 2016 an individual complained to us about an unsolicited marketing telephone call which he received on his mobile telephone from Topaz (Local Fuels) Limited. He had previously complained to us in November 2015 about marketing text messages which the company sent him without his consent and he informed us that despite attempting to opt out by replying ‘Stop’ he continued to receive more text messages. In its response to our first investigation, the company said that the inclusion of the complainant’s mobile telephone number in its promotional campaign was as a result of a human error and it acknowledged the failure of its system to register his opt out attempts. It informed us in February 2016 that it had removed the mobile phone number concerned from its marketing database. We concluded that complaint at the time with a warning to Topaz (Local Fuels) Limited.

On receipt of the second complaint, we commenced a further investigation by seeking an explanation for the making of a marketing phone call to the individual’s mobile telephone in circumstances where we had previously been advised that the telephone number had been removed from the company’s marketing database. The company said that the number was called by the call centre due to its presence on a list of leads/lapsed customers that was provided to the call centre by another area of the business. It stated that it did not go far enough to ensure that a failure in its systems would not occur again in relation to this individual. It accepted that another marketing contact should not have happened in the absence of the individual’s consent. The Data Protection Commissioner decided to prosecute the company.

At Dublin Metropolitan District Court on 28 November 2016, Topaz (Local Fuels) Limited pleaded guilty to one charge of sending an unsolicited marketing text message without consent and one charge of not providing the recipient with a valid means of opting out of the receipt of further marketing messages. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to Our Lady’s Children’s Hospital, Crumlin by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

 

25) Prosecution of Dermaface Limited for Marketing Offences

In August 2016 we received a complaint from a former customer of Dermaface Limited after she received an unsolicited marketing email. The complainant had previously been informed in 2014, on foot of a previous complaint about unsolicited marketing emails, that Dermaface Limited had removed her details from its marketing list. Our investigation sought an explanation from Dermaface Limited. It informed us that the marketing email which was the subject of the latest complaint was sent through the clinic’s software system which it had purchased. It claimed that the new system contacted patients and former patients who had previously been opted out of receiving marketing communications from it. It admitted that the complainant was one of those patients/former patients who had been sent a marketing email. It sent an apology to the complainant.

Following an investigation in 2011 of a complaint from a different individual who received numerous marketing text messages from Dermaface Limited, the Data Protection Commissioner had issued a warning to the company. The Commissioner decided, therefore, to prosecute the company in respect of the latest offence.

At Dublin Metropolitan District Court on 28 November 2016 Dermaface Limited pleaded guilty to one charge of sending an unsolicited marketing email without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €300 to Our Lady’s Children’s Hospital, Crumlin by 12 December 2016. The Court also indicated that it expected the company to discharge the prosecution costs incurred by the Data Protection Commissioner and it adjourned the matter for two weeks. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Data Protection Commissioner’s costs. The Court struck out the charge.

  1. Marketing offences by MTS Property Management Limited – prosecution
  2. Marketing offences by Greyhound Household – prosecution
  3. Marketing offences by Imagine Telecommunications Business Limited – prosecution
  4. Marketing Offences by Eircom Limited – prosecution
  5. Defence Forces Ireland – failure to keep data safe and secure
  6. Further processing of personal data by a state body
  7. Supermarket’s excessive use of CCTV to monitor member of staff
  8. Disclosure of personal information to a third party by the Department of Social Protection
  9. Covert CCTV installed without management knowledge
  10. Danske Bank erroneously shares account information with third parties
  11. Failure to update customer’s address compromises the confidentiality of personal data
  12. Unfair use of CCTV Data

Case Study 1: Marketing offences by MTS Property Management Limited – prosecution

We received a complaint in February 2013 from an individual who received marketing SMS messages from MTS Property Management Limited advertising the company’s property-management services. The complainant informed us that she had dealt with the company on one occasion over five years previously but she did not consent to her mobile phone number being used for marketing purposes. She also pointed out that the SMS messages that she received did not provide her with a means of opting out.

Our investigation of this complaint became protracted as the company denied knowledge of the mobile number to which the SMS messages were sent and it denied knowledge of the account holder of the sending phone number. However, our investigation established sufficient evidence to satisfy us that MTS Property Management Limited was responsible for the sending of the marketing SMS messages to the complainant. We decided to prosecute the offences.

MTS Property Management Limited had come to our attention previously in the summer of 2010 when two individuals complained about unsolicited marketing SMS messages sent to them without consent and without the inclusion of an opt-out mechanism. Following the investigation of those complaints, we warned the company that it would likely face prosecution if it committed further offences under Regulation 13 of SI 336 of 2011 at any future time.

At Dublin Metropolitan District Court on 23 February 2015, MTS Property Management Limited pleaded guilty to one charge of sending an unsolicited marketing SMS without consent and it pleaded guilty to one charge of failing to include an opt-out mechanism in the marketing SMS. The Court convicted the company on both charges and it imposed two fines of €1,000 each. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

 

Case Study 2: Marketing offences by Greyhound Household – prosecution

In May 2014, we received a complaint against Greyhound Household from an individual who received an unsolicited marketing phone call on his mobile telephone from the company’s sales department. The same individual had previously complained to us in December 2013 as he was receiving marketing SMS messages from Greyhound Household that he had not consented to receiving. He informed us that he had ceased being a customer of the company in May 2013. Arising from the investigation of the previous complaint, Greyhound Household had undertaken to delete the former customer’s details and it apologised in writing to him. On that basis, we concluded the matter with a formal warning to the effect that any future offences would likely be prosecuted.

On receipt of the latest complaint, we commenced a further investigation. Greyhound Household admitted that a telephone call was made to the complainant’s mobile phone number without consent but it was unable to explain why his details had not been deleted in line with the company’s previous undertaking. We decided to prosecute the offence.

At Dublin Metropolitan District Court on 23 February 2015, Greyhound Household pleaded guilty to one charge of making an unsolicited marketing phone call to a mobile phone number without consent. The Court applied Section 1(1) of the Probation of Offenders Act subject to the defendant making a charitable donation of €1,000 to Pieta House. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

 

Case Study 3: Marketing offences by Imagine Telecommunications Business Limited – prosecution

In March 2015, we received a complaint against Imagine Telecommunications Business Limited from a company that had received unsolicited marketing telephone calls. The same company had previously complained to us in 2014 about repeated cold calling to its offices. Despite having submitted an opt-out request to Imagine Telecommunications Business Limited, it continued to receive marketing phone calls. Following our investigation of the first complaint, and having been assured that the phone number of the complainant company had been removed from the marketing database, we issued a formal warning to Imagine Telecommunications Business Limited that any future offences would likely be prosecuted.

On investigating the current complaint, we were informed by Imagine Telecommunications Business Limited that it had failed to mark the telephone number concerned as ‘do not contact’ on the second of two lists on which it had appeared. This led to the number being called again in March and June 2015. It stated that the only reason the number was called after the previous warning was due to this error and it said that it took full responsibility for it.

We prosecuted the offences at Dublin Metropolitan District Court on 2 November 2015. Imagine Telecommunications Business Limited pleaded guilty to one charge of making an unsolicited marketing telephone call without consent. The Court applied Section 1(1) of the Probation of Offenders Act conditional upon a charitable donation of €2,500 being made to the Merchant’s Quay Project. Prosecution costs were recovered from the defendant.

 

Case Study 4: Marketing offences by Eircom Limited – prosecution

We received complaints from two individuals in February and April 2015 concerning marketing telephone calls that they had received on their landline telephones from Eircom Limited. In both cases, and prior to lodging their complaints, the individuals had submitted emails to Eircom Limited requesting that they not be called again. Eircom’s Customer Care Administration Team replied to each request and informed the individuals that their telephone numbers had been removed from Eircom’s marketing database. Despite this, each individual subsequently received a further marketing telephone call in the following months, thus prompting their complaints to this Office.

Eircom informed our investigation that the agents in its Customer Care Administration Team who handled the opt-out requests had not updated the system to record the new marketing preference after sending the replying email to the individuals concerned. It undertook to provide the necessary refresher training to the agents concerned.

Separately, a former customer of Eircom complained in May 2013 that he continued to receive regular unsolicited marketing phone calls from Eircom on his landline telephone despite clearly stating to each caller that he did not wish to receive further calls. He stated that the calls were numerous and that they represented an unwarranted intrusion into his privacy. Eircom made a further ten marketing telephone calls to the individual after the commencement of our investigation of this complaint. Our investigation subsequently established that this former customer had received over 50 marketing contacts from Eircom since 2009, when he ceased to be an Eircom customer. Eircom explained that the continued calls arose from a misunderstanding of which systems the former customer’s telephone number was to be opted out from.

In October 2014, an Eircom customer complained that he had received a marketing SMS from Eircom that did not provide him with a means to opt out of receiving further marketing SMS messages. Eircom informed our investigation of this complaint that the inclusion of an opt-out is the norm in all of its electronic-marketing campaigns but, in this instance, and due to human error, the link to the necessary opt-out had not been set properly. Our investigation established that this error affected over 11,600 marketing messages that were sent in the campaign concerned.

We proceeded to prosecute the offences identified on foot of the complaints received in the aforementioned cases. At Dublin Metropolitan District Court on 2 November 2015, Eircom Limited pleaded guilty to six charges of making unsolicited marketing calls without consent and it pleaded guilty to one charge of sending a marketing SMS without a valid address to which the recipient may send an opt-out request. The Court applied Section 1(1) of the Probation of Offenders Act conditional on the defendant making donations amounting to €35,000 as follows: €15,000 to Pieta House, €10,000 to LauraLynn (Children’s Hospice) and €10,000 to Our Lady’s Children’s Hospital, Crumlin. The company agreed to pay the prosecution costs incurred by this Office.

 

Case Study 5: Defence Forces Ireland – failure to keep data safe and secure

A member of the Defence Forces made a complaint to this Office that certain personal data relating to him was not kept safe and secure by the Defence Forces.

The circumstances of the individual’s complaint to our Office arose when a Military Investigating Officer (MIO) was appointed to review an internal complaint made by him as a member of the Defence Forces. Subsequently, the Defence Forces Ombudsman was appointed to review the handling of that complaint. During the course of its review, it was ascertained that the MIO could not supply the notes of an interview he had conducted with the complainant, as he had stored them at an unsecure location where they were damaged or lost following flooding and a burglary while the MIO was on an overseas mission. The unsecure location was in fact the MIO’s private house.

We raised the matter with the Defence Forces, who confirmed the complainant’s allegation that the notes had been stored at an unsecure location and had been damaged or lost as stated.

The Defence Forces informed us of the measures taken to keep data safe and secure, and referred us to its Administration Instruction, which provides for the prohibition of removal of records.

The Defence Forces further stated that the removal of records from their place of custody to a private residence would breach this instruction and that a breach of this provision may constitute an offence under Section 168 of the Defence Act 1954. It advised that, as the MIO was no longer a serving member of the Defence Forces, he was no longer subject to military law.

The Defence Forces unequivocally acknowledged that the loss of the data in this case should not have occurred and was fully regretted. It informed us that it had recently undertaken a full review of practices and procedures in respect of both the processing and disclosure of data to mitigate the possibility of any future unauthorised or accidental disclosure of personal data.

The Commissioner’s decision on this complaint issued in June 2015, and it found that the Defence Forces contravened Section 2(1)(d) of the Data Protection Acts by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, the complainant’s personal data when it allowed it to be stored at an unsecure location, namely a private house.

This Office acknowledges that the Defence Forces has procedures in place in relation to the protection of personal data, as set out in its Administration Instruction. However, those procedures were not followed in this case: an official record was removed from its place of custody and, because the appropriate security measures were not applied, the complainant’s personal data was lost or stolen.

There are many workplace scenarios where staff and managers, in particular, may need to take files, including personal data, home with them. Extreme caution should always be exercised in such cases to ensure that there is no risk to the security of personal data, either in the transit of the files or while the files are in the employee’s home. Data controllers must ensure that employees act in a responsible manner with regard to the safe custody and handling of workplace files. This demands a proper system that records the taking and returning of files and the following of prescribed procedures for the safe keeping of personal data while the files concerned are absent from the workplace. Likewise, it is critical that employees are prohibited from emailing official files from their workplace email account to their personal email account for after-hours work or for any other reason. In such situations, data controllers lose control of personal data that they are obliged by law to protect.

 

Case Study 6: Further processing of personal data by a state body

In February 2015, we received a complaint from an employee of a state body in relation to the alleged unfair processing of his personal data. The complainant stated that, in the course of a meeting, he had been advised that his manager had requested access to data from his security swipe card in order to compare it with his manually completed time sheets. The complainant explained that this had been carried out without any prior consultation with him or his line manager. By way of background, the complainant informed us that the security swipe cards used by the employees are for accessing the building and secured areas only, and are not used as a time management/attendance system.

We sought an explanation from the body concerned as to how it considered that it had complied with its obligations under the Data Protection Acts in the processing of the complainant’s personal information obtained from his swipe-card data. We also advised it that we had sight of the relevant section of its staff handbook and we noted that there was no reference to the swipe card being used for the purpose of checking attendance.

We received a response explaining that the swipe-card data relating to the complainant was handed over to the complainant’s manager in good faith on the basis that it was corporate rather than personal data. The organisation also confirmed that it checked the staff handbook and any other information that may have been circulated to staff regarding the purposes of the swipe card and that there was no mention of the use of swipe cards in relation to recording time or attendance. It advised that the focus of the information circulated with regard to swipe cards was on security and access only.

After consideration of the response received, along with the content of the complaint, we informed the organisation concerned that we considered that the Data Protection Acts were breached when the employee’s swipe-card details were provided to his manager to verify his working hours. We referred to the provisions of Section 2(1)(c)(ii) of the Data Protection Acts, which state that data shall not be further processed in a manner incompatible with the purpose for which it was obtained. Given that we considered the information concerned had been processed in contravention of the Data Protection Acts 1988 and 2003, we required an assurance that all email records created in relation to the further processing of the swipe-card details concerned be deleted from its systems; this assurance was duly provided.

The complainant in this case agreed, as an amicable resolution to his complaint, that he would accept a written apology from his employer. This apology acknowledged that the complainant’s data protection rights had been breached and it confirmed that the organisation had taken steps to ensure that this type of error did not recur in the future.

This case highlights the temptation organisations face to use personal data at their disposal for a purpose other than that for which it was originally obtained and processed. The scenario outlined above is not uncommon, unfortunately. Time and attendance monitoring may occasionally prove difficult for managers, and contentious issues arise from time to time. The resolution of those issues should not, however, involve an infringement of employees’ data protection rights, whether in circumstances similar to this case or otherwise.

 

Case Study 7: Supermarket’s excessive use of CCTV to monitor member of staff

A former staff member of a supermarket submitted a complaint to this Office regarding her employer’s use of CCTV.

The complainant informed us that she had been dismissed by her employer for placing a paper bag over a CCTV camera in the staff canteen. She informed us that the reason for her covering the CCTV camera was that a colleague was styling her hair while she was on an official break in the staff canteen. The complainant also stated that the camera was placed in the corner of the staff canteen and there was no signage to inform staff that surveillance was taking place. She informed us that she was never officially advised of the existence of the camera, nor had her employer ever informed her of the purpose of the CCTV in the canteen.

In its response to our investigation, the supermarket informed us that the complainant was dismissed for gross misconduct, which occurred when she placed a plastic bag over the camera in the canteen to prevent her actions being recorded, thereby breaching the store’s honesty policy as outlined in the company handbook. The supermarket owner informed us that the operation of CCTV cameras within the retail environment was to prevent shrinkage, which can arise from customer theft, waste and staff theft. He stated that it was also used for health and safety, to counter bullying and harassment and for the overall hygiene of the canteen. In relation to the incident concerning the complainant, the owner informed us that, on the day in question, the store manager noticed some customers acting suspiciously around the off-licence area and that the CCTV footage was reviewed on the following day. It was during the viewing of the footage relating to the suspicious activity in the off-licence area that he noticed the complainant putting a bag over the camera.

Following an inspection by one of our Authorised Officers, we informed the supermarket owner that, in our view, there was no justification from a security perspective for having a camera installed in the canteen area.

The complainant in this case declined an offer of an amicable resolution and she requested a formal decision of the Commissioner.

The decision by the Commissioner in January 2015 found that the supermarket contravened Section 2(1)(c)(iii) of the Data Protection Acts, 1988 and 2003, by the excessive processing of the complainant’s personal data by means of a CCTV camera in a staff canteen.

Data controllers are tempted to use personal information captured on CCTV systems for a whole range of purposes. Many businesses have justifiable reasons, usually related to security, for deploying CCTV systems on their premises. However, any further use of personal data captured in this way is unlawful under the Data Protection Acts unless the data controller has, at a minimum, made it known at the time of recording that the images captured may be used for those additional purposes. Any such use must also be balanced against the fundamental right of employees to privacy at work in certain settings, such as staff canteens and changing rooms.

 

Case Study 8: Disclosure of personal information to a third party by the Department of Social Protection

This Office received a complaint in July 2014 concerning an alleged unauthorised disclosure of the complainant’s personal information by the Department of Social Protection to a third party. The complainant informed us that, in the course of an Employment Appeals Tribunal hearing, her employer produced to the hearing an illness-benefit statement relating to her. The statement contained information such as her name, address, PPSN, date of birth, bank details and number of child dependants. She stated that her employer was asked how he had obtained this illness-benefit statement. He stated that he had phoned the Department of Social Protection and the statement had subsequently been sent to him by email. Prior to making the complaint to this Office, the complainant had, via her solicitors, received an apology from the Department, who acknowledged that her information had been disclosed in error and that proper procedures had not been followed. However, she informed us that she had very little information as to how the disclosure had occurred and that the matter had caused her considerable distress.

We commenced an investigation by writing to the Department of Social Protection. In response, it stated that it accepted that a statement of illness benefit was disclosed to the complainant’s employer in error, on foot of a telephone call from the employer. The Department acknowledged that the information should not have been sent out to the employer and that the correct procedures were not followed on this occasion. It stated that the staff member who supplied the information was new to the Department. It explained that it was not normal practice to issue a screenshot to the employer; the correct procedure was to issue a statement to the employee along with a note informing the employee that the information had been requested by their employer.

The data subject chose not to accept an apology from the Department as an amicable resolution of her data protection complaint, opting instead to seek a formal decision of the Data Protection Commissioner.

A decision of the Data Protection Commissioner issued in October 2015. In her decision, the Commissioner formed the opinion that the Department of Social Protection contravened Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 by the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained. The contravention occurred when the Department of Social Protection disclosed the complainant’s personal data to an unauthorised third party.

This case serves as a reminder to data controllers of the importance of ensuring that new staff are fully trained and closely supervised in all tasks, particularly those that involve the processing of personal data. Errors by staff present a high risk of data breaches on an ongoing basis and it is critically important that efforts are made to mitigate those risks by driving data protection awareness throughout the organisation, with particular focus on new or re-assigned staff.

 

Case Study 9: Covert CCTV installed without management knowledge

This Office received a complaint from staff of Letterkenny General Hospital in relation to the operation of covert CCTV surveillance by management within the Maintenance Department of Letterkenny General Hospital.

We also received a ‘Data-Breach Incident Report’ from the Health Service Executive (HSE) about this matter. This breach report recorded the incident as ‘Unauthorised CCTV Surveillance of Office Area’ and stated that a covert CCTV camera was installed by two maintenance foremen in their two-man office due to concerns they had in relation to the security of their office.

We commenced an investigation of the complaint by writing to the Health Service Executive (HSE), outlining the details of the complaint. We sought information from it in relation to the reporting arrangements between the maintenance staff in Letterkenny General Hospital and the maintenance foremen who installed the covert CCTV; the whereabouts of footage captured by the covert CCTV; the outcome of the internal investigation; how the covert CCTV was installed without notice to the management of Letterkenny General Hospital; and details of any instruction or notification issued to staff on foot of the internal investigation.

In response, the HSE stated that the foremen who had installed the camera were direct supervisors of the maintenance department staff and that the footage recorded was stored on a DVD and secured in a locked safe. It further stated that an internal investigation concluded that two staff had installed the covert CCTV without the authority, consent or knowledge of the management of Letterkenny General Hospital, due to concerns regarding unauthorised access/security in their office. We established that the camera in question was previously installed in a now disused area of the hospital, had been decommissioned and was re-installed in the office in question.

As well as confirming that the footage captured by the covert camera was of normal daily comings and goings to the maintenance office, the HSE stated that this was an unauthorised action by staff in the maintenance section and that it was keenly aware of its duty to all staff to provide a workplace free from unauthorised surveillance. The HSE confirmed that it would initiate steps to ensure that there would be no repetition of this action.

The HSE subsequently issued a written apology to the complainants in which it also confirmed that the recordings had been destroyed.

A decision of the Data Protection Commissioner issued in April 2015. In her decision, the Commissioner formed the opinion that the HSE contravened Section 2(1)(a) of the Data Protection Acts 1988 and 2003 by failing to obtain and process fairly the personal data of individuals whose images were captured and recorded by a covert CCTV camera installed without its knowledge or consent.

Covert surveillance is normally only permitted on a case-by-case basis, where the data is kept for the purpose of preventing, detecting or investigating offences, or apprehending or prosecuting offenders. This implies that a written specific policy must be put in place detailing the purpose, justification, procedures, measures and safeguards that will be implemented in respect of the covert surveillance, with the final objective being an active involvement of An Garda Síochána or other prosecutorial authority. Clearly, any decision by a data controller to install covert cameras should be taken as a last resort after the full exhaustion of all other available investigative steps.

 

Case Study 10: Danske Bank erroneously shares account information with third parties

We received a complaint against Danske Bank alleging that it had disclosed personal data and account information in relation to a mortgage on a property owned by the complainant to third parties. We commenced an investigation of the matter by writing to Danske Bank, outlining the details of the complaint. We received a prompt response from Danske Bank, which stated that the complainant and the individual who received his personal data were joint borrowers on certain loan facilities and that it was during the course of email communications with the other individual in respect of that individual’s loan arrears that the personal data relating to the complainant was disclosed to two third parties. Danske Bank admitted that this was an error on its part and stated that it was unfortunate that it had occurred. It went on to explain that, in dealing with the queries raised by the other individual in respect of his arrears and entire exposure to Danske Bank, the relationship manager also included information on all arrears in respect of that individual’s connections, which included the complainant. The staff member concerned expressed his regret at the incident and Danske Bank confirmed that the staff member was reminded of its procedures with regard to data protection and the need to be vigilant when dealing with the personal data of customers. Danske Bank apologised for the incident and offered reassurance that it would endeavour to prevent a recurrence.

Danske Bank went on to state that it had robust controls in place to ensure that such incidents did not occur; however, it admitted that, despite such controls, this was a case of human error and it did not believe that the disclosure was in any way intentional.

The complainant requested that the Data Protection Commissioner issue a formal decision on his complaint. A decision of the Commissioner issued in January 2015, and it stated that, following the investigation of the complaint, she was of the opinion that Danske Bank contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by disclosing the complainant’s personal data to a number of third parties without his knowledge or consent.

This case is illustrative of the need for financial institutions to be vigilant when dealing with the personal data of individuals who have common banking relationships with others, and to ensure that appropriate safeguards are in place to prevent accidental or erroneous sharing of personal data.

 

Case Study 11: Failure to update customer’s address compromises the confidentiality of personal data

This Office received a complaint that Allied Irish Banks (AIB) failed to keep the complainant’s personal data up-to-date over a prolonged period, despite repeated requests by the individual to do so, and that it failed to maintain the security of the individual’s personal information. The complainant informed us that he had repeatedly asked AIB to update his address details but that it had failed to do so. As a result, his correspondence from AIB continued to be sent to a previous address. The complainant alleged that, arising from the failure of AIB to update his address, his correspondence containing his personal data, which was sent to his previous address by AIB, was disclosed to unknown third parties at this previous address.

We commenced an investigation of the matter by writing to AIB, outlining the details of the complaint. AIB confirmed to us that, due to a breakdown in internal processes, the complainant’s correspondence address had not been updated on all of its systems in a timely manner, resulting in automated arrears letters continuing to issue to an old address.

In circumstances where AIB had been advised that the complainant had changed address, we were satisfied that its continued sending of correspondence intended for the complainant, whether by post or by hand delivery, to the previous address failed to secure the complainant’s personal data against unauthorised access by parties who had access to the letterbox at that address.

Efforts to resolve the complaint by means of an amicable resolution were unsuccessful and the complainant sought a formal decision. In her decision, the Commissioner formed the opinion that AIB contravened Section 2(1)(b) of the Data Protection Acts 1988 and 2003 by failing to keep the complainant’s personal data up to date. This contravention occurred when AIB failed to remove the complainant’s previous address from his account despite notification from him to do so. The Commissioner also formed the opinion that AIB contravened Section 2(1)(d) by failing to take appropriate security measures against unauthorised access to the complainant’s personal data by sending correspondence by post and by hand delivery to an address at which he no longer resided, while knowing that this was no longer his residential address.

This case demonstrates the need for all data controllers to ensure that personal data is kept accurate and up to date at all times. Failure to do so may result in the disclosure of personal data to unauthorised persons, as well as unnecessary distress and worry for data subjects who have supplied the data controller with the most accurate information, only to find that the necessary safeguards were not in place to prevent their personal data being compromised through the use, as in this case, of a previous address.

 

Case Study 12: Unfair use of CCTV data

The subject matter of this complaint was the use by the data controller of CCTV footage in a disciplinary process involving one of its drivers. The data controller, Aircoach, advised that it was reviewing CCTV footage from one of its coaches as part of dealing with an unrelated customer-complaint issue when it happened to observe a driver using her mobile phone while driving a coach.

As is often the case with such complaints, the complainant objected to the use of the CCTV footage as evidence in the disciplinary process taken by Aircoach against her, the basis of the objection being that the footage had been unfairly obtained.

Aircoach informed us that it had introduced CCTV across its fleet in order to further enhance safety and security for both staff and customers. It further advised that all staff are informed that CCTV is installed and of the reasons behind its use, but admitted that it was not until the middle of 2014 that significant efforts were made to fully inform both staff and customers as to the presence of CCTV on its coaches. Aircoach provided us with a copy of its new CCTV policy and it also provided us with photos showing the CCTV signage on the coach entrance doors, adding that the process of putting appropriate signage in place on its coaches commenced in January 2014 and was concluded by October 2014.

The law governing the processing of personal data, including CCTV images, is provided for under Section 2 of the Data Protection Acts 1988 and 2003. Processing includes, among other things, the obtaining and use of personal data by a data controller and it must be legitimate by reference to one of the conditions outlined under Section 2A(1) of the Acts. In addition, a data controller must also satisfy the fair-processing requirements set out under Section 2D(1) of the Acts, which requires that certain essential information is supplied to a data subject before any personal data is recorded.

The investigation in this case established that, at the time of the relevant incident on 19 February 2014, the roll-out of CCTV signage by Aircoach had commenced; however, the company failed to properly or fully inform staff that CCTV footage might be used in disciplinary proceedings. Any monitoring of employee behaviour through the use of CCTV cameras should take place in exceptional cases rather than as a norm and must be a proportionate response by an employer to the risk faced, taking into account the legitimate privacy and other interests of workers. In this case, when processing the complainant’s image, Aircoach was not aware of any particular risk presented and, by its own admission, was investigating an unrelated matter. While it subsequently transpired that the incident in question was indeed a very serious matter, involving alleged use by a driver of a mobile phone while driving, there was no indication at the time of the actual processing that this was the case and the processing therefore lacked justification. In addition, the fair-processing requirements set out in Section 2D were not fully met and fair notice of the processing for the specific purpose of disciplinary proceedings was not given to drivers whose images might be captured and used against them. In those circumstances, the processing could not be said to have been done in compliance with the Acts and the Commissioner found that Section 2(1)(a) had been contravened.

It is important to note that the processing of CCTV images in disciplinary proceedings against an employee is very much circumstance-dependent. Thus, while on this occasion the employer was found to have been in contravention of the Acts because the images were processed without justifiable cause or fair notice to the employee in question, in other circumstances the processing might be regarded as being proportionate and fair, especially if the processing is done in response to an urgent situation and the employer has the correct procedures in place. Employers should therefore be careful to ensure that a comprehensive CCTV policy is in place and followed if they wish to stay within their legal obligations.

  1. Prosecutions: Private Investigators
  2. Prosecutions: Marketing Offences
  3. Excessive Data Collection by An Post
  4. Disclosure of Employee Salary Details by the HSE
  5. Excessive Data Collection by a Letting Agency
  6. Disclosure of Financial Information by a Credit Union
  7. Complaint of Disclosure by Permanent TSB Not Upheld
  8. Patient Denied Right of Access by SouthDoc
  9. Excessive Data Collection by the Department of Agriculture
  10. Personal Data Disclosed by County Council
  11. Eircom Fails to Meet Statutory Timeframe for Processing Access Request
  12. Third-Level Student Data Appeared on Third-Party Website
  13. Data Controller Discloses Personal Data to Business Partner
  14. Employee of Financial Institution Resigns Taking Customer Personal Data
  15. Theft of Unencrypted Laptop
  16. Compromise of Adobe Network

Case Study 1: Prosecutions: Private Investigators

This Office initiated prosecutions in the private investigator/tracing-agent sector for the first time in 2014. These prosecutions arose from a detailed investigation that commenced in the summer of 2013. Arising from audits carried out in a number of credit unions at that time, the Office became concerned about the methods employed by some private investigators hired by credit unions to trace the current addresses of members who had defaulted on their loans. The Office launched a major investigation to identify the sources from which the private investigators had obtained the current address data. This investigation involved a wide range of public bodies and private companies. As a result of our findings, the Office established that personal data on databases kept by the Department of Social Protection, the Primary Care Reimbursement Service of the Health Service Executive, An Garda Síochána and the Electricity Supply Board had been accessed unlawfully and the information was disclosed thereafter to credit unions. Details of the prosecutions that ensued are as follows:

M.C.K. Rentals Limited and its Directors

M.C.K. Rentals Limited (trading as M.C.K. Investigations) was charged with 23 counts of breaches of Section 22 of the Data Protection Acts 1988 and 2003 for obtaining access to personal data without the prior authority of the data controller by whom the data is kept, and disclosing the data to another person. The personal data was kept by the Department of Social Protection (7 cases) and by the Primary Care Reimbursement Service of the Health Service Executive (16 cases). In all cases, the personal data was disclosed to various credit unions in the state.

The two directors of M.C.K. Rentals Limited, Ms Margaret Stuart and Ms Wendy Martin, were separately charged with 23 counts of breaches of Section 29 of the Data Protection Acts 1988 and 2003 for their part in the offences committed by the company. This Section provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of, the company directors or other officers.

At Bray District Court on 6 October 2014, M.C.K. Rentals Limited pleaded guilty to five sample charges for offences under Section 22 of the Data Protection Acts 1988 and 2003. The Court convicted the company in respect of each of the five charges and it imposed a fine of €1,500 per offence. Company Secretary and Director Ms Margaret Stuart pleaded guilty to one sample charge for an offence under Section 29 of the Data Protection Acts 1988 and 2003. The Court convicted Ms Stuart in respect of that offence and imposed a fine of €1,500. Company Director Ms Wendy Martin pleaded guilty to one sample charge for an offence under Section 29 of the Data Protection Acts 1988 and 2003. The Court convicted Ms Martin in respect of that offence and it imposed a fine of €1,500.

This was the first occasion on which company directors were prosecuted by the Data Protection Commissioner for their part in the commission of data-protection offences by their company, and the proceedings in this case send out a strong warning to directors and other officers of bodies corporate that they may be proceeded against and punished in a court of law for criminal offences committed by the body corporate.

The investigation of this company uncovered wholesale and widespread “blagging” techniques used by the offenders, and this was the first prosecution by the Data Protection Commissioner of offenders engaged in such practices. The findings of the investigation carried out in this case expose the constant threat to the security of personal data that is in the hands of large data controllers and the vigilance that is required by front-line staff at all times to prevent unlawful soliciting of personal data, in particular by means of telephone contact, by unscrupulous agents. Data controllers across the state should regularly review their data-protection procedures to maximise the effectiveness of their security protocols in order to counter such criminal activity. They must ensure that all staff, and particularly those at the front line who handle telephone calls, are fully trained in the security protocols in order to be able to recognise and deal with the threat of information blagging or pretext calling if it arises.

Michael J. Gaynor

Michael J. Gaynor (trading as MJG Investigations) was charged with 72 counts of breaches of the Data Protection Acts 1988 and 2003. Twelve charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept, and disclosing the data to another person. The personal data was kept by the Electricity Supply Board (9 cases) and by An Garda Síochána (3 cases). In all cases, the personal data was disclosed to various credit unions in the state. A further 60 charges related to breaches of Section 16(2) of the Data Protection Acts in respect of the processing of personal data of a number of individuals in circumstances where no entry for the accused appeared in the public register maintained by the Data Protection Commissioner. Mr Gaynor is a former member of An Garda Síochána.

On 25 November 2014, at Dublin Metropolitan District Court, Michael J. Gaynor was convicted on two charges for offences under Section 22 of the Data Protection Acts 1988 and 2003. The Court imposed a fine of €2,500 in respect of each of these two charges. Separately, the defendant pleaded guilty to 69 charges (60 of which related to breaches of Section 16(2)) and these were taken into consideration in the sentence imposed.

This was the first prosecution to be completed by the Data Protection Commissioner of a data processor for processing personal data without having registered as a data processor on the public register of the Office of the Data Protection Commissioner. The investigation in this case uncovered access by the defendant to customer data held on databases maintained by the Electricity Supply Board. To access the personal data, the defendant used a contact on the staff of the Electricity Supply Board that he had established during his previous Garda career.

These prosecutions send a strong message to private investigators and tracing agents to comply fully with data-protection legislation in the conduct of their business, and that if they fail to do so they will be pursued and prosecuted for offending behaviour. They also serve to remind all companies and businesses who hire private investigators or tracing agents that they have onerous responsibilities under the Data Protection Acts to ensure that all tracing or other work carried out on their behalf by private investigators or tracing agents is done lawfully. Specifically, in this regard, those operating in the credit union, banking, financial services, legal and insurance sectors should review their engagement of private investigators and tracing agents to ensure they have fully safeguarded all personal data against unlawful forms of data processing.

These investigations uncovered serious issues in relation to the hiring of private investigators or tracing agents by credit unions, particularly in respect of a lack of awareness on their part of how the private investigators were tracing members and, in some cases, in relation to the disclosure of PPS numbers by credit unions to private investigators. This Office has pursued all of these issues with the credit unions concerned and with their representative bodies in recent months. In addition, we have undertaken a range of follow-up work with the Department of Social Protection, the Health Service Executive, An Garda Síochána and the Electricity Supply Board on the implications of the data-security breaches that occurred in their organisations and on the measures required to deal with those breaches and to prevent a recurrence. This Office welcomes the fact that the Private Security Authority has proposed the introduction of regulation of private investigators.

Case Study 2: Prosecutions: Marketing Offences

Pure Telecom Limited

We received a complaint in March 2013 from an individual who received two marketing phone calls from Pure Telecom Limited on his landline telephone. The individual’s telephone number was listed on the National Directory Database opt-out register. It is an offence to make a marketing call to a telephone number listed on that register.

Pure Telecom Limited informed our investigators that it used the services of a third-party representative to make the marketing calls and it explained that the agent sourced the individual’s number themselves rather than using marketing data provided by Pure Telecom Limited. The company admitted that the third-party agent did not have consent to contact the complainant for marketing purposes.

At Dublin District Court on 3 February 2014, Pure Telecom Limited pleaded guilty to two charges concerning breaches of Regulation 13(5)(b) of S.I. 336 of 2011 relating to two marketing phone calls to a phone number listed on the opt-out register. The Court imposed a conviction in respect of both charges and a fine of €500. It further ordered payment of the prosecution costs of the Data Protection Commissioner. The hearing was informed that the defendant had a previous conviction from 2010 for a similar offence.

Next Retail Limited

In February 2013, this Office received a complaint from an individual who received a number of unsolicited marketing emails from Next Retail Limited after she had asked the company not to send her any more such emails. The complainant claimed to have unsubscribed, firstly by using the unsubscribe link provided in a marketing email sent by the company and, following this, in four separate emails to the company requesting not to be contacted with marketing emails again.

Next Retail Limited informed our investigators that, as it no longer used the services of the company it had engaged to process unsubscribe requests, it was unable to explain what happened to the first unsubscribe request. With regard to the emails containing unsubscribe requests, the company confirmed that they did reach its complaints inbox but it was unable to trace where the emails went afterwards.

At Dublin District Court on 3 February 2014, Next Retail Limited pleaded guilty to two charges concerning breaches of Regulation 13(1) of S.I. 336 of 2011 relating to the sending of two unsolicited marketing emails without consent. The Court imposed a conviction in respect of one charge, with the second charge taken into consideration. A fine of €100 was imposed. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

Next Retail Limited subsequently appealed the severity of the sentence. On 19 March 2014, the Circuit Court affirmed the conviction and penalty previously imposed by the District Court and it noted the appellant’s intention to discharge the Data Protection Commissioner’s reasonable costs for the appeal.

Airtricity Limited

In May 2013, this Office received a complaint against Airtricity Limited from a person who received an unsolicited marketing phone call on his landline telephone, which was listed on the National Directory Database opt-out register. The complainant informed us that the purpose of the marketing call was to encourage him to switch energy supplier to Airtricity.

In response to our investigation, Airtricity admitted that the phone call had been made by a third-party contractor acting on its behalf. It explained that the error occurred when an old PC, on which the 2009 phone book was installed, was re-commissioned by the contractor. A spreadsheet containing the complainant’s phone number was still on the old PC and this led to the number being dialled in error.

At Dublin District Court on 3 February 2014, Airtricity Limited pleaded guilty to one charge concerning a breach of Regulation 13(5)(b) of S.I. 336 of 2011 relating to one marketing phone call to a phone number listed on the opt-out register. The Court imposed a conviction in respect of the charge and a fine of €75. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

The Carphone Warehouse Limited

In March 2013, we received a complaint from a customer of The Carphone Warehouse Limited after he received marketing text messages from the company despite having ticked the marketing opt-out box when he had previously made a purchase in one of its stores. The company informed our investigators that a systems error resulted in the customer being incorrectly included in its marketing list.

In April 2013, we received a complaint from another customer of The Carphone Warehouse Limited who received regular offers by text message from the company even though he had called the company on at least three occasions, asking that it stop. The company told our investigators that its system temporarily did not recognise the customer’s preference not to receive marketing due to an internal issue within the electronic filter process and this resulted in the customer’s phone number being accidentally selected for marketing campaigns.

At Dublin District Court on 3 March 2014, The Carphone Warehouse Limited entered a guilty plea in respect of five charges concerning breaches of Regulations 13(1) and 13(4) of S.I. 336 of 2011. The court imposed convictions in respect of four charges, with the fifth charge taken into consideration. It imposed fines of €1,500 in respect of each conviction. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. The hearing was informed that the defendant had two previous convictions from 2012 in relation to the sending of unsolicited marketing emails.

Valterous Limited (trading as Therapie Clinic and/or Therapie)

A former customer of Valterous Limited (trading as Therapie Clinic and/or Therapie) complained to this Office in June 2013 after receiving an unsolicited marketing text message despite having opted out of receiving such communications over three months earlier. Therapie explained to our investigators that the complainant’s contact details were on systems in two branches and that when the opt-out request was made the company removed their details from one database and did not realise they were also on another one, thus leading to a further unsolicited text message being sent to the same contact number.

In July 2013, we received a complaint from another former customer of Therapie who had received marketing text messages on several occasions. The complainant informed us that she sent a text message to opt out but the company continued to send her further marketing text messages. Our investigation found no evidence that Therapie had obtained consent at any time for the sending of marketing text messages to this individual. In relation to the sending of text messages after the former customer had opted out, Therapie explained that the individual should have texted the word “STOP” rather than the word “OPTOUT” at the time of attempting to opt out of the marketing database. We did not accept this as a valid excuse as the opt-out instruction on the marketing text message sent to the individual read “OptOut:086.......”.

At Dublin District Court on 3 March 2014, Valterous Limited (trading as Therapie Clinic and/or Therapie) pleaded guilty in relation to three charges concerning breaches of Regulation 13(1) of S.I. 336 of 2011 concerning the sending of unsolicited marketing text messages without consent. The Court imposed convictions in respect of two charges, with the third charge taken into consideration. It imposed fines of €1,500 in respect of each conviction. The defendant agreed to pay the prosecution costs of the Data Protection Commissioner. The Court was told that in 2012 Therapie Laser Clinics Limited (trading as Therapie Clinic and/or Therapie) was convicted for two offences in relation to the sending of unsolicited marketing text messages.

Case Study 3: Excessive Data Collection by An Post

This Office received two complaints from members of the public concerning new requirements that were introduced in November 2013 by An Post in relation to direct-debit applications for payment of TV licence fees. A mandatory requirement was introduced to provide a recent bank statement with the direct-debit application and mandate form. An Post’s TV licence website explained that a copy of a bank statement was required to verify the bank-account details provided by the licensee for payment of their TV licence fee. It went on to state that the bank statement must show the BIC, IBAN and the full name and address of the bank-account holder. The complainants argued that requesting a copy of confidential financial information that appears on bank statements was excessive.

We investigated these complaints with An Post. By way of background, An Post explained that the new SEPA regulations impose significant new obligations on direct-debit originators such as An Post, which operates the TV Licence Direct Debit Scheme. It said that the commercial risk attached to accepting direct debits is now the sole responsibility of An Post and that An Post therefore has to verify the direct-debit details supplied by the customer. It stated that An Post does not have proof that the bank-account details exist, that they are accurate or that the account is owned by the person stated on the mandate. Accordingly, it developed its new bank-detail verification process to check the mandate details supplied, and in that new process it seeks extra documentation to verify that the bank-account details supplied by the applicant are accurate, complete and up to date. It also pointed out that it cannot process a direct-debit application without having valid BIC and IBAN numbers in respect of the account on which the direct debit is drawn. An Post indicated that, further to our correspondence, it had decided that customers who choose direct-debit payment are no longer required to submit details of their bank balances.

We considered the matter further and we advised An Post that applicants should either be allowed to submit a copy of only the portion of the bank statement containing the name, address, BIC and IBAN numbers or they should be allowed to blacken out all of the transaction information on any copies supplied. An Post agreed to implement our advice. It amended its TV licence direct-debit application form to include the following text: “You should ensure that financial transactions on your bank statement are fully masked or removed before you attach it to your application. All bank statements are destroyed once the first successful payment has gone through.” An Post also amended its website to reflect this change and to clarify that it does not require the balance on the bank statement to be shown. We were satisfied with the changes implemented by An Post and with the manner in which it dealt with the matter expeditiously once we had drawn it to its attention.

Organisations that seek copies of bank statements for purposes such as proof of current address, verification of identity or other similar purposes should bear in mind that such documents contain a range of financial information that is private to the individual to whom they relate. As a general rule, individuals must be permitted to blacken out or otherwise mask those financial details and transactions, as they are irrelevant for the purposes of address verification and the like. This case study should serve as a reminder to organisations to consider all the implications, including the potential to collect an excessive amount of personal data, in circumstances where they seek copies of bank statements from customers or clients.

Case Study 4: Disclosure of Employee Salary Details by the HSE

An employee of the Health Service Executive (HSE) complained in March 2014 concerning the alleged disclosure on two occasions of his salary details to his ex-wife. He informed us in his complaint that the matter came to his attention when his ex-wife went to court in the summer of 2013 in relation to maintenance issues, and in court she provided exact details from his payslips. In December of the same year, his ex-wife went back to court for a review of maintenance and on that occasion she produced a copy of his P60 along with his salary details for the previous four months.

We commenced an investigation of the matter by writing to the HSE. In response, the HSE accepted that on two separate occasions, in May 2013 and in November 2013, personal data relating to its employee was disclosed to a third party without his consent. It acknowledged that there was no legal basis for the disclosure of the personal data. It stated that it had established who, within the HSE, made the first disclosure but that it was not possible to establish who made the second disclosure. It explained that its payroll department had received a number of court orders directing the HSE to make maintenance payments to its employee’s ex-wife. It stated that numerous queries were raised by a firm of accountants and tax professionals called Accountax on behalf of its employee’s ex-wife. Those queries sought clarifications with regard to the payments made. It went on to state that, in relation to the first breach, a specific request was made seeking a copy of its employee’s most recent payslip showing the maintenance deductions from January 2013 to date. The HSE admitted that the requests for constant updates regarding maintenance payments ultimately resulted in the unauthorised disclosure of its employee’s personal data. The HSE accepted that, in hindsight, the only data that should have been released by its payroll department to its employee’s ex-wife (or to a person acting on her behalf) was a summary of the payments made that related to the court orders.

We informed the HSE that we considered that the Data Protection Acts were breached when the personal data of its employee was disclosed to a third party without his consent. The HSE indicated that it wished to pursue an amicable resolution to the complaint and, to this end, it enclosed a letter of apology for the complainant. The data subject considered the letter of apology and he decided that he did not wish to accept it, opting instead to seek a formal decision of the Data Protection Commissioner on his complaint.

A decision of the Data Protection Commissioner was issued in August 2014. In his decision, the Commissioner formed the opinion that the HSE contravened Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 on two occasions by the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained. These contraventions occurred in May 2013 and in November 2013 when the HSE disclosed his personal information to a third party. Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 provides that data shall not be further processed in a manner incompatible with the purpose for which it was obtained. In this case, the HSE acknowledged that on two separate occasions the personal data was disclosed to a third party without the consent or knowledge of the data subject. Such disclosures constitute further processing of personal data.

Case Study 6: Excessive Data Collection by a Letting Agency

In July 2014, a prospective tenant complained about the collection of bank details, PPS numbers and copies of utility bills by a letting agency when applying to rent a property. The complainant stated that this information was in addition to the usual material, such as previous landlord’s reference, which one would expect to submit at application stage. She stated that she believed that if she did not supply all of the sought data up-front, her application would not be seriously considered by the letting agency. The complainant said that the practice of collecting such a broad range of personal data forces prospective tenants who are desperate to rent a property to submit this personal information at application stage even though they do not know if their application will be successful. She pointed out that the majority of applications are unsuccessful given the high demand for a limited supply of available rental properties in the Dublin area.

We commenced an investigation of the matter with the letting agency concerned, seeking an explanation for the collection of such a broad range of personal data at application stage. In response, the letting agency said that it requested PPS numbers from applicants because this verifies that they are entitled to work in the state, and that bank details are required to show that a tenant has a bank account because they would be ineligible if they were not able to pay rent through a bank account. We told the letting agency that we could not see any basis for collecting bank details, PPS numbers or copies of utility bills at application or property-viewing stage and we urged it to cease the practice immediately. We questioned the letting agency further about using the PPS number to verify the applicant’s work status. It replied to the effect that the main reason it requests PPS numbers is that it is required for the Private Residential Tenancies Board (PRTB) registration form and it said that it cannot register a tenant without it. It went on to say that it is only an added assurance that the applicant is working and it stated that it does not verify the PPS number.

We accepted that personal data concerning bank details, PPS numbers and utility bills could be requested once the applicant had been accepted as a tenant. In October 2014, the letting agency confirmed, following our investigation, that it had ceased the requesting of this personal data prior to the property being let and it undertook that it would only request this information once the tenant had been accepted. The complainant informed us that she was very satisfied with the outcome of her complaint.

This case study is a classic example of the temptation of some data controllers to collect a whole range of personal data in case they might need it in the future. In this case, the letting agency collected a significant amount of personal data from every applicant who expressed an interest in renting a property even though, at the end of the process, only one applicant could be accepted as the new tenant and it was only in the case of that successful applicant that the full range of personal data was required. Section 2(1)(c)(iii) places an obligation on data controllers to ensure that personal data which they process is adequate, relevant and not excessive in relation to the purpose or purposes for which it is collected or further processed. Data controllers must be mindful of this requirement and abide by it despite the temptation, for convenience or other reasons, to embark on an unnecessarily broad data-collection exercise.

Case Study 7: Disclosure of Financial Information by a Credit Union

A member of a credit union complained in 2013 in relation to the alleged disclosure of his loan and savings information by the credit union to his daughter. By way of background, the complainant explained that he was a guarantor on a credit union loan to his daughter. He received a letter from the credit union to inform him of difficulties that his daughter was experiencing with her loan. The purpose of the letter was to call on him, as the loan guarantor, to pay the balance of monthly repayments. He outlined that the letter was addressed to him and that it contained his membership number along with his savings and loan details, including balance outstanding. Soon afterwards, his daughter called to his house with a copy of the same letter as the credit union had also sent it to her. The complainant said that he considered this disclosure of his financial information to be a gross violation of his privacy.

We investigated the matter with the credit union concerned. It explained that the error that led to the disclosure occurred when the letter to the guarantor was issued under the guarantor’s membership number and not under the membership number of his daughter, whose loan it referred to. It explained that the computer system automatically brings across the account details of the membership number keyed in. The credit union admitted that a member of its credit-control staff inadvertently typed the letter under the guarantor’s membership number and, as a result, his account details were printed on the letter.

The credit union proposed that, as a means of trying to reach an amicable resolution of the complaint, it would issue a letter of apology to the guarantor. It also carried out staff training in regard to issuing letters to members, in particular letters to guarantors, and it re-circulated its data-protection policy to all staff. The complainant considered the offer and rejected it. He sought a formal decision of the Data Protection Commissioner on his complaint.

In April 2014, a decision issued to the complainant. In his decision, the Commissioner formed the opinion, following the investigation of the complaint, that the credit union contravened Section 2(1)(d) of the Data Protection Acts by providing details of the complainant’s membership account to a third party by means of a letter that was copied to the third party. Section 2(1)(d) obliges data controllers, among other things, to take appropriate security measures against unauthorised disclosure of personal data.

This case highlights the serious consequences for the complainant concerned arising from what appeared to be an innocuous error on the part of the staff member who typed the letter to the complainant under his own membership number rather than under the membership number of his daughter, to whose loan the letter related. It serves as a reminder to data controllers generally to keep data-protection awareness to the forefront, with regular staff training for those whose work involves any form of data processing.

Case Study 8: Complaint of Disclosure by Permanent TSB Not Upheld

A complaint from a customer of Permanent TSB alleged that the bank had violated the Data Protection Acts by discussing their accounts and personal details with a third party, the complainant’s tenant, thereby causing financial loss and stress.

We investigated the allegation with Permanent TSB. In response, the bank informed us that it had made no contact with residents in the properties concerned to discuss the mortgage account details of the complainant concerned. It further stated that all telephone calls received from the tenant concerned had been listened to and at no time did any staff member discuss the details of the mortgage account with her. As part of our investigation we sought a copy of the recordings of phone calls that took place between Permanent TSB and the tenant. We listened to the call recordings and we were satisfied that no personal data relating to the complainant was passed to the tenant during the phone calls with Permanent TSB. Instead, the tenant was repeatedly told that Permanent TSB could not discuss anything with her without the written authority of the account holder. In one instance, the tenant offered to give her contact number to Permanent TSB but she was informed that it was not required as Permanent TSB would not be contacting her. This Office’s investigation found no evidence that Permanent TSB disclosed any personal data relating to the complainant to the third party concerned.

In a separate aspect to the same complaint, it was alleged by the complainant that Permanent TSB had sent correspondence to a previous residential address after it had been notified of a change of address. The complainant supplied us with a copy of a letter sent by them in August 2011 notifying the bank of the new address for correspondence and we were also supplied with copies of letters sent by Permanent TSB to the previous address after that date. In response to our investigation of this matter, Permanent TSB confirmed that it had received the August 2011 letter, which notified it of the new address, but it could offer no explanation as to why its systems had not been updated at that time to reflect this. It informed us that it was not until it received a further letter in January 2012 that the system was updated. To assist with trying to resolve the complaint, the bank offered a goodwill gesture as an acknowledgement of the delay encountered and of any stress the delay may have caused, but this was rejected by the complainant.

The complainant sought a formal decision on the complaint. With regard to the failure to update the contact address, having been requested to do so in August 2011, the Commissioner formed the opinion that Permanent TSB contravened Section 2(1)(b) of the Data Protection Acts. This section obliges data controllers to keep personal data accurate and up to date.

With regard to the allegation of disclosure of the complainant’s personal data to a tenant, the Commissioner was unable to form the opinion that a contravention of the Data Protection Acts occurred in this instance.

Case Study 9: Patient Denied Right of Access by SouthDoc

We received a complaint in June 2014 from a firm of solicitors whose client had made an access request in May 2014 to the Practice Manager at South West Doctors-On-Call Limited (trading as SouthDoc) seeking a copy of his medical notes. In response to the access request, SouthDoc replied to the solicitors advising them to contact the patient’s own GP, who holds a complete record for the patient. The solicitors wrote back to SouthDoc, pointing out that the access request was made to SouthDoc and was separate from any request their client might make to his own GP. The solicitors pointed out that SouthDoc was obliged to comply with the request. In submitting the complaint to this Office, the solicitors informed us that SouthDoc had not replied to their latest letter but had returned it to them unanswered.

We began an investigation by writing to SouthDoc. It responded by return post, indicating that the request for medical records had now been dealt with. Soon afterwards, the solicitors for the complainant supplied us with a copy of a letter they had received from SouthDoc stating that, further to the access request, the patient’s records had been forwarded to his own GP. The solicitors pointed out that SouthDoc had not complied with the access request as it was their client who requested the records, and it was not sufficient for SouthDoc to give them to his GP. We wrote to SouthDoc again, seeking an explanation. A few days later we received from SouthDoc a copy of a letter that it had issued to the patient’s solicitors, enclosing a copy of the patient’s medical records. We then concluded our investigation.

There are a number of after-hours or on-call service providers such as SouthDoc in operation in Ireland, all of which provide an essential medical service for the general public. In doing so, these service providers collect and process both personal data and sensitive personal data (data relating to the physical or mental health of the attending patient). For the purposes of data protection, it is important that patients and service providers understand that when a patient attends one of those services, they provide their personal data to an organisation (data controller) that is entirely separate to their usual GP practice. Accordingly, the records created by the service provider in respect of the patient’s attendance and treatment are new records in respect of which the service provider is the data controller. For that reason, the patient has a right to access those records directly from the service provider by making an access request for a copy of them. This right of access to the records of the service provider exists whether or not the service provider passes on details of the patient’s attendance and treatment to the patient’s GP. Furthermore, the service provider is obliged to supply a copy of the personal data directly to the requesting patient (or to the solicitor acting on his behalf, as in the above case) rather than to the patient’s own GP. (Access to medical records is subject to the provisions of S.I. 82 of 1989, which prohibits the supply of data to a patient in response to an access request if that would cause harm to his or her physical or mental health.)

Case Study 10: Excessive Data Collection by the Department of Agriculture

An individual complained to this Office about a new requirement, introduced by the Department of Agriculture, to produce bank-account details when registering premises in order to comply with the Diseases of Animals Act 1966–2001. He explained that horse owners are required to register the premises in which horses are kept with the Register of Horse Premises and he said he had no difficulty with that requirement. However, he objected to being asked to supply his bank-account details and he pointed out that there was no possibility of this information being needed by the Department as there were no schemes or grants that entitled horse owners to payment. He told us that he and his wife each own a horse and that both horses are kept purely for pleasure purposes. He said that he had expressed his concerns directly to the Department initially but the Department continued to insist that he submit bank details.

We sought an explanation from the Department of Agriculture. In its response, the Department referred to the government’s drive towards e-commerce and the fact that government departments can no longer issue payable orders. It said that payments due by the Department can only be made by way of electronic fund transfer to a bank account. Accordingly, all clients of the Department in receipt of payments are asked to supply bank details as a prerequisite for entry onto the Department’s Corporate Customer System. It said that as most of the Department’s clients are in receipt of payments or could potentially receive payments, it was decided that all new clients (applicants), including those who exceptionally might not currently qualify for payments, would be asked for their bank-account details.

We referred the Department to the provisions of Section 2(1)(c)(iii) of the Data Protection Acts, which places a requirement on data controllers to ensure that personal data shall be adequate, relevant and not excessive in relation to the purpose for which it is collected. We pointed out that the principle established by this provision required that personal data should be collected when required and not on the basis that it might be required at some future point. We received confirmation from the Department in February 2014 that the practice of seeking bank details in anticipation of possible future payments had ceased. We were informed that an information notice had been issued to staff, stating that customer bank details are required only where a customer will be in receipt of payments from the Department.

The complainant in this case raised a very valid complaint with this Office, having failed to resolve the matter directly with the Department himself. Insufficient thought appears to have been given at the outset to the practice of requiring bank details from every customer or potential customer of the Department, whether that information was needed or not. More disappointing, however, was the fact that the Department did not review the situation and fix it after this individual drew the Department’s attention to his circumstances and the circumstances of others who keep horses for pleasure purposes, pointing out that the Department would never need to use his bank-account details as he was not an applicant for a scheme or grant. In the end, it took the intervention of this Office to persuade the Department to cease seeking excessive personal data and to comply with the principle that data collection shall be adequate, relevant and not excessive.

Case Study 11: Personal Data Disclosed by County Council

In April 2014, we received a complaint from an individual who alleged that her private email address was disclosed to third parties without her permission by Dun Laoghaire Rathdown County Council. The complainant had made a submission to the county council in respect of a local area plan. She found out about the disclosure when one of the parties to whom her email address had been disclosed made an unsolicited contact with her using her email address. She indicated that she was worried as she did not know how many people were in possession of her private email address as a result of the disclosure.

We commenced an investigation by writing to Dun Laoghaire Rathdown County Council. In response, the county council by way of background explained that it supplies notices, agendas and minutes of its meetings to parliamentary representatives in accordance with Local Government Act 2001 (Section 237A) Regulations 2003.

It went on to state: “It has been the practice of this Authority heretofore to supply copies of all reports that issue with these agenda, as this is how the agenda issues to our councillors. In accordance with the Planning and Development Act 2000 [as amended], Section 20(3)(c)(ii), a Manager's Report for a Local Area Plan must list the persons who made submissions or observations. In all cases a list of submitters is prepared, for internal use and file, which includes necessary contact details, home address and email address. It is our standard practice, however, to remove the email addresses before circulation to councillors. The home addresses are left on as councillors wish to see who in their constituency made a submission. In this case we inadvertently included the email and home addresses with the list of submitters. This was an error on our part, and not standard practice. What has been placed on our website, however, is the list without the contact details. In order to prevent a recurrence of this, we have reminded all staff not to include the contact details of submitters in reports which are circulated to councillors or placed on the website. Additionally, although as mentioned above the list that went to councillors usually contained the submitter's address for the councillors’ information, we will not include either home address or email address in any reports issuing to councillors. In addition to the above, and to further prevent the inadvertent release of personal information, the Council will cease the practice of issuing reports with the agenda which are supplied to parliamentary representatives.”

The county council stated that it had issued a revised report, with all of the personal contact details removed, to all of the recipients and it asked that they delete the original version. The county council concluded by saying that in this case the information was disclosed accidentally and it said that it would endeavour to ensure that there will be no repeat of this incident by adhering to its standard procedure and by reminding all staff concerned of those procedures.

The complainant sought a formal decision on her complaint.

Section 2(1)(c)(ii) of the Data Protection Acts provides that personal data shall not be further processed in a manner incompatible with the purpose for which it was obtained. The data controller in this case, Dun Laoghaire Rathdown County Council, explained to our investigation that, in accordance with the Planning and Development Act 2000, a County Manager's Report for a Local Area Plan must list the persons who made submissions or observations. The data controller further stated that in all cases a list of submitters is prepared for internal use, which includes contact details, home address and email address, and that it is its standard practice to remove the email addresses from this list before circulation to councillors. However, it was clear that in this particular instance the email addresses of the submitters were not removed from the circulation list. In making his decision, the Commissioner formed the opinion that Dun Laoghaire Rathdown County Council contravened Section 2(1)(c)(ii) of the Data Protection Acts. This contravention occurred through the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained when her email address was disclosed by Dun Laoghaire Rathdown County Council via the circulation of a report to county councillors, TDs and senators in relation to a local area plan.

Case Study 12: Eircom Fails to Meet Statutory Timeframe for Processing Access Request

A staff member of Eircom submitted a complaint to this Office in relation to the alleged failure of Eircom to comply with an access request submitted by him to the company in September 2013. In his access request, he specifically requested a copy of a particular letter that was sent on a date in February 2013 to Eircom's Chief Medical Officer.

We commenced the investigation of the complaint and we asked Eircom to respond to the access request without further delay. We were informed by Eircom that it had already provided the data subject with a copy of the letter that was the subject of his access request, and it subsequently provided us with a copy of its response to an access request. However, on further inspection of Eircom's response to that access request, it was unclear to us whether the response related to the particular access request that was the subject of the current complaint, as that response had issued to the data subject prior to the date of his access request. We asked Eircom to review the matter. Eventually, on 2 May 2014, we received an email from Eircom enclosing a copy of its response of that date to the data subject’s access request of 22 September 2013, supplying a copy of the document that the data subject had sought access to.

The complainant asked for a formal decision of the Data Protection Commissioner on his complaint. In making his decision, the Commissioner formed the opinion that Eircom Limited contravened Section 4(1)(a) of the Data Protection Acts by failing to supply the data subject with a copy of his personal data in response to his access request submitted on 22 September 2013 within the statutory period of 40 days. This contravention occurred when Eircom Limited released a copy of the data subject’s personal data to him on 2 May 2014 – which was outside the statutory period of 40 days.

As outlined elsewhere in this annual report, over half of the complaints received by this Office in 2014 were made by data subjects who experienced difficulties in accessing their personal data. One common theme that emerges in many of these complaints is lateness on the part of the data controller in processing the access request. The Acts lay down a period of 40 days for compliance with an access request and if this is not met, as in the case outlined above, the data controller contravenes the Data Protection Acts. The Office of the Data Protection Commissioner is very concerned about the prevalence of this particular contravention. In some instances, the data controller fails to even acknowledge receipt of the access request within the 40-day period. This means that the requester has no idea whether their access request is being dealt with or ignored. There have been many instances where the data controller has taken no action whatsoever in terms of processing the access request until this Office commences an investigation on foot of receiving a complaint from the data subject. Clearly, that is an undesirable situation. Data subjects have a statutory right to access their personal data held by a data controller by the simple means of submitting an access request, and the data controller has a statutory obligation to comply with that request within 40 days. A data subject should not have to resort to the extra step of lodging a complaint with the Office of the Data Protection Commissioner in order to have their statutory right of access enforced. Unfortunately, as the complaint statistics reveal, far too many data subjects are experiencing barriers and access-denying tactics on the part of data controllers.

In the above case, the data subject’s right of access was severely delayed. There is no justification for such a lengthy delay in any circumstances. Such a delay is particularly unacceptable in a situation where the requester simply sought a copy of personal data contained in one relatively recently created letter and where the data controller is a large telecommunications company that is well aware of the Data Protection Acts and receives and processes subject access requests on a regular basis. Eircom is the subject of several data-protection complaints every year across a range of issues, many of which relate to access requests. The Office of the Data Protection Commissioner expects to see a marked improvement in that company’s data-protection performance in the near future, particularly in the context of processing subject access requests in a timely manner.

Case Study 13: Third-Level Student Data Appeared on Third-Party Website

The Office received a notification from a data controller, in accordance with the Personal Data Security Breach Code of Practice. The notification alerted the Office to the fact that data relating to a large number of students had been discovered on a website that was unrelated to the data controller. The data related to the 2010 academic year.

The Office began an investigation of the matter. The data controller advised the investigation team that the information disclosed on the website included the names, email addresses and passwords of the students. The investigation team confirmed that no financial or sensitive data was involved.

The data controller engaged an external security company to carry out its own investigation into the security breach.

Due to the passage of time, there were no server logs showing when or by whom the data had been uploaded to the website. However, the data controller was able to identify that the data published matched a file created for testing purposes in mid-2011. This file was then sent to a third-party service provider who was engaged in developing a management system for the data controller. The file was sent via unsecured email.

The third-party service provider informed the data controller that, while there was a relationship between its staff and the website on which the data was published, it had conducted a very thorough review of the matter and could find no evidence to show that the file had been posted onto the website as a result of any act or omission on its part.

Our evaluation of the information showed that the data controller, when creating student accounts, assigned a generic password to each account: the student's date of birth. While students could change their passwords, they were never advised to do so.

While it could not be determined exactly how the data appeared on the website, it was evident that there had been a breach of the Data Protection Acts, in that appropriate security measures were not in place to prevent the unauthorised disclosure of personal data.

Our investigation also found that the use of live data for testing purposes was not in accordance with data-protection best practices. Where live data is being used by an organisation for testing purposes, there would have to be a strong justification for such use and we were not aware of any justification applicable in this particular case. The Office recommended that the data controller cease the use of live personal data for testing and either anonymise the data or create a fictitious data set for testing purposes.

The transmission of such student data via an unsecured channel is also inconsistent with the Data Protection Acts. It was found that, during the development of the management system, personal data, including passwords, was exchanged between the data controller and the service provider using an unsecured channel. The data controller advised this Office that it now transmits such data via a secure mechanism. The Office recommended that this mechanism be brought to the attention of all staff.

Another issue discovered during our investigation that caused great concern was the use of a generic password. The fact that the date of birth of the student was assigned as their password meant that any individual who had access to the date of birth of another student could access the user account of that student. The Office recommended that the data controller communicate with students, advising that they change their password and that the new password be a minimum of 12 characters and include upper- and lower-case characters, numerals and special characters, such as a symbol or punctuation mark.
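
By way of illustration only, the following is a minimal sketch, in Python, of the kind of password-policy check reflected in the recommendation above (a minimum of 12 characters with upper- and lower-case letters, numerals and a special character). The function name and example values are hypothetical and do not form part of the recommendation itself.

  import re

  # Illustrative check against the recommended policy: at least 12 characters,
  # with upper-case and lower-case letters, numerals and a special character.
  def meets_recommended_policy(password: str) -> bool:
      if len(password) < 12:
          return False
      required_patterns = [
          r"[A-Z]",          # upper-case letter
          r"[a-z]",          # lower-case letter
          r"[0-9]",          # numeral
          r"[^A-Za-z0-9]",   # special character (symbol or punctuation mark)
      ]
      return all(re.search(p, password) for p in required_patterns)

  if __name__ == "__main__":
      print(meets_recommended_policy("01011995"))           # False: a date of birth
      print(meets_recommended_policy("Correct-Horse-42!"))  # True

A check of this kind addresses only password strength; it does not remove the need to avoid generic, predictable passwords such as dates of birth at account creation.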

Case Study 14: Data Controller Discloses Personal Data to Business Partner

The Office received notification from a data controller advising that an email had been issued to a business partner which included personal data that should not have been disclosed.

The data controller advised the Office that it had entered into a business agreement with a third-party company to provide anonymised data to allow for a feasibility assessment of a proposed business venture. An email was issued to the third-party company which included the names of individuals in addition to the agreed anonymised data. This allowed the third-party company to identify the individuals involved.

The data controller, in notifying this Office, stated that the third-party company had provided assurances that the data had been deleted.

The Office commenced an investigation of a data-security breach, under Section 10 of the Data Protection Acts.

Given the nature of the data involved and additional information received from a third party, this Office decided to visit the premises of the third-party business partner to satisfy ourselves that the data had been deleted and not further processed.

An investigation team, using our powers under Section 24 of the Data Protection Acts, arrived unannounced at the premises of the business partner. The team obtained documents in relation to the business agreement; these showed that only anonymised data had been sought. The team also obtained reports that had been created on foot of the receipt of the personal data. It was evident from these reports that, while personal data was available to the third party, it had not been used in the preparation of the reports and had no impact on the reports.

The team then examined the computer systems of the company and discovered several instances of the email it had received which contained the personal data.

The Commissioner felt it appropriate to issue an Enforcement Notice to the third-party company, requiring it to engage an external IT security company to delete any and all copies of the personal data it had received. The IT security company was to provide this Office with a report on the completion of the work. This report was duly received and this Office was satisfied that all copies of the personal data had been securely deleted.

The investigation found that personal data had been disclosed without consent or a legal basis. It also noted that non-business email accounts had been used by members of the data controller's staff in the conduct of business matters. The data controller was advised to prevent the use of non-business email accounts, as it cannot control data transmitted through such accounts.

Case Study 15: Employee of Financial Institution Resigns Taking Customer Personal Data

The Office received a notification from a data controller, in accordance with the Personal Data Security Breach Code of Practice. The notification stated that an employee had tendered their resignation and the data controller then discovered that the employee had emailed a spreadsheet to their personal email account prior to their resignation. The spreadsheet contained details of customers, including their employment details, salaries, contact details and medical consultant.

The data controller provided the name and home address of the employee.

The Office was also contacted by the umbrella organisation of the data controller seeking assistance on how to advise their member.

The Office verified, through the Companies Registration Office, that a business was operating from the home address of the employee. We then contacted the employee on the basis that they were now operating as a data controller in their own right. We sought clarification from the employee as to the consent they had to process any personal data they obtained from their previous employment.

The employee advised the Office that, as part of their employment, they were asked to use their own laptop and personal phone for all business dealings. The employee also advised that they had not yet started canvassing for clients. The employee also confirmed that they had deleted all the personal data they held in relation to their previous employment.

We also engaged with the data controller who had made the notification in relation to the security procedures that were in place to protect customer data in its possession. The Office noted that the employment contract contained appropriate data-protection clauses. Of concern, however, was the fact that employees were using their own equipment for business purposes; in such circumstances, the data controller has little or no control over the data held on personal equipment.

The data controller introduced further procedures and policies on foot of the issue to prevent a repeat of this type of incident, including the introduction of software to password protect any data records being emailed. Furthermore, all employees must sign an undertaking on termination of employment that all data has been returned and will not be further processed.

Case Study 16: Theft of Unencrypted Laptop

The Office received a data-security breach notification during the year from a medical professional relating to a stolen laptop.

The notification advised that the laptop was password protected, but not encrypted. The notification also advised that the data stored on the laptop related to a medical study that was undertaken in 2009 and included audio files of interviews carried out with the study subjects which contained limited information. It was determined that a file listing the subjects of the study contained an ID number rather than the name of the individual. However, a further file that correlated the ID number with the subject name was also stored on the laptop. This file was also password protected.

It was noted that, before the study began, approval was obtained from the relevant Ethics Committee that covered the storage of data.

This Office advised the data controller of our guidance in relation to the notification of the affected individuals. In this particular case, the data controller advised the Office that it was of the view that notifying the affected individuals would cause them more distress than benefit. This view was offered by the relevant medical professional overseeing the project, and this Office must take account of the opinion of a medical professional who has a professional relationship with the affected individuals. We assume that this decision was taken by weighing the potential effects of an unauthorised disclosure of the data against the potential distress caused to the individuals by being notified of the security breach.

The Office noted, however, that the laptops in question are now being encrypted. This case highlights the fact that data-protection considerations need to be constantly monitored: what may have been an acceptable standard five years previously may not be acceptable now, and security arrangements must be periodically reviewed.

Case Study 17: Compromise of Adobe Network

Adobe Systems Software Ireland Ltd notified this Office in October 2013, in accordance with the Personal Data Security Breach Code of Practice, of a data-security breach regarding an unauthorised access to their systems. Personal data was compromised and the attacker also took Adobe software source-code elements.

Two data controllers were affected: Adobe US and Adobe Systems Software Ireland Ltd (Adobe Irl). We engaged in a coordinated investigation with the Office of the Privacy Commissioner of Canada and were joined in our investigation by the Office of the Australian Information Commissioner.

Nature of Data Compromised

Adobe Irl created three classifications of individuals affected:

  • Payment-card users, i.e. those whose encrypted payment-card numbers were accessed during the breach. The data involved was encrypted payment-card data – approximately 3.65 million payment cards (1 million controlled by Adobe Irl) relating to approximately 3.1 million individuals.
  • Active users, i.e. those who had logged in to Adobe systems at least once in the two years prior to the discovery of the breach. The data involved was: email address and current encrypted password – 41 million (reduces to 33 million, as 8 million email notifications were undeliverable) (20.5 million controlled by Adobe Irl).
  • Non-active users, i.e. those who had not logged in to Adobe in the two years prior to the discovery of the breach. The data involved was: email address and current encrypted password – 71 million (reduces to 46.5 million due to 25 million email notifications undeliverable) (28.5 million controlled by Adobe Irl).

How the Breach Occurred

The attack was a sophisticated and sustained intrusion of Adobe’s computer systems. Attackers identified and removed data from a backup server that stored the compromised data described above. Adobe states it has no evidence to show that unencrypted card details were taken. Forensic consultants engaged by Adobe supported this conclusion.

When Adobe learned of the security breach, they began an investigation of the cause of the issue and also initiated a series of measures including the following:

  • Disconnected the impacted database server from the network
  • Blacklisted IP addresses from which the attacker accessed their systems
  • Reset passwords for all potentially affected users (including active, non-active)
  • Changed passwords for relevant administrator accounts
  • Notified the banks processing customer payments for Adobe, so they could work to protect customers’ accounts
  • Reported the breach to law-enforcement authorities
  • Employed a third-party company to conduct an investigation of the cause of the security breach of its systems and to identify what data may have been compromised
  • Took actions to reduce the risks related to the theft of certain source-code elements
  • Issued notifications to affected individuals, beginning on 3 October 2013, which alerted customers to the security breach

Passwords

At risk: the attacker posted some of the exfiltrated data on a website; it included the email addresses and encrypted passwords of certain Adobe users. A number of research articles have demonstrated that some passwords have been deciphered by reference to password hints and repeated passwords (i.e. the same password used by more than one user). One article highlighted an organisation that had checked the compromised usernames and deciphered passwords against its own platform and found that a significant number of these credentials would have worked on its platform. The organisation contacted some of its affected users, alerting them to the issue, and also confirmed the scenario to this Office. At issue here is that, while Adobe enforced a password change on its own site and advised users to change their passwords elsewhere, it is evident that not all users followed such advice.

Hints: Parts of the data exfiltrated by the attacker were the password hints of a small percentage of users. These hints were stored in clear text and associated with the username (email address). This information, along with an analysis of the encrypted passwords, will allow for the identification of certain simple passwords. However, as previously noted, Adobe reset the passwords for all impacted users.

Storage: The Office queried why passwords were stored in one system in an encrypted manner rather than hashed and salted. Encrypted passwords can be decrypted, which would allow a data controller, or an attacker who gained access, to see the passwords of users. Adobe stated that it had in fact been hashing and salting passwords within a new system for a number of years prior to the discovery of the security breach, but had decided to also keep the database in the old system as a backup measure in case of issues with the new system. Passwords in the old system’s database had been encrypted.
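
To illustrate the distinction drawn above between encryption and hashing with salting, the following is a generic sketch using Python's standard library; it is not a description of Adobe's systems. A salted hash cannot be reversed to reveal the password; verification works by re-computing the hash with the stored salt and comparing the results.

  import hashlib
  import hmac
  import os

  # Generic sketch of salted password hashing (not Adobe's implementation).
  def hash_password(password: str) -> tuple[bytes, bytes]:
      salt = os.urandom(16)  # unique random salt per user
      digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
      return salt, digest

  def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
      candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
      return hmac.compare_digest(candidate, stored_digest)

  if __name__ == "__main__":
      salt, digest = hash_password("example-passphrase")
      print(verify_password("example-passphrase", salt, digest))  # True
      print(verify_password("wrong-guess", salt, digest))         # False

Because each user has a different random salt, identical passwords produce different stored digests, which also frustrates the kind of repeated-password analysis described above.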

Retention of Card Data with Customer Records

Customers who used payment cards to purchase Adobe products or services had their card details (encrypted) stored with the customer account within one particular system. Card numbers have now been replaced with a token system. This process began prior to the discovery of the security breach and was completed shortly thereafter. The token, which is encrypted, represents the payment-card number within the customer record, and Adobe's systems transmit the encrypted token to a third-party service provider, whose systems are located outside Adobe's network, for payment processing.
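
The token approach described above can be illustrated in outline. The sketch below is hypothetical and is not a description of Adobe's payment systems: the customer record holds only an opaque token, while the mapping from token to card number is confined to a separate, tightly controlled store used by the payment-processing path.

  import secrets

  # Hypothetical sketch of payment-card tokenisation (not Adobe's system).
  class TokenVault:
      def __init__(self):
          self._tokens: dict[str, str] = {}

      def tokenise(self, card_number: str) -> str:
          token = "tok_" + secrets.token_hex(16)  # opaque, random token
          self._tokens[token] = card_number
          return token

      def detokenise(self, token: str) -> str:
          # Only the payment-processing path should ever call this.
          return self._tokens[token]

  if __name__ == "__main__":
      vault = TokenVault()
      token = vault.tokenise("4111111111111111")
      customer_record = {"name": "A. Customer", "card": token}  # no card number stored
      print(customer_record)

The benefit of such a design is that a compromise of the customer-record system yields only tokens, which are of no value without access to the separate vault.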

Notifications to Affected Individuals

Adobe provided the Office with a list of when each class of affected individuals was notified, together with the relevant notifications. In addition, Adobe publicly announced the 2013 breach in posts on its website, which included discussion of the theft of source code. The various notifications advised individuals to monitor their credit-card statements and to change their password if it was used on another site.

When we queried why notifications did not issue to those individuals where only contact details were compromised and did not include password or payment-card data, Adobe replied that it believed that notice in this scenario would lead to over-notification and notification fatigue and that there is not a significant risk of harm with respect to a compromise of this type of data element. The Code of Practice recommends that affected users are notified, so that each affected individual can consider the consequences for themselves and take appropriate measures.

This Office would expect that if a similar incident were to occur in the future, Adobe, or any other data controller, would automatically include all individuals for whom personal data had been compromised in its notification process.

Conclusion and Findings

Adobe fully cooperated with our investigation of the security breach reported to us on 2 October 2013. Adobe took appropriate action on discovery of the attack to prevent further access to their systems as required under Section 2(1)(d) of the Data Protection Acts 1988 and 2003. It also enforced a password change for its users to protect against unauthorised access to account data. Adobe’s quick reaction on learning of the security breach prevented the attacker from exfiltrating unencrypted payment-card details.

Adobe's transition from the use of encrypted passwords in the old system to the use of hashed and salted passwords in the new system could have been achieved more effectively and expeditiously than was the case. Of concern to those users who provided password hints is the fact that Adobe stored these hints in plain text rather than in a protected format, and some of them were compromised.

This Office is cognisant of the fact that data controllers such as Adobe will always be a target for attackers and new attack methods are constantly being devised.

This Office found that Adobe was in breach of Section 2(1)(d) of the Acts by failing to have in place appropriate security measures to protect the data under its control, despite its documented security programme. It was also recommended that Adobe engage a third party to carry out an independent review of its systems.

Adobe has since put in place substantial improvements in its security protocols, practices and procedures, and this Office is satisfied that it now has appropriate procedures in place to minimise the possibility of a similar security breach in the future.