Welcome to the November Global Data & Privacy Update. This update is dedicated to covering the latest legislative developments affecting the way data is managed and protected, as well as reporting on the most recent news governing data breaches and industry developments.
The Court of Appeal has upheld the High Court's decision that Morrisons was vicariously liable for the torts committed by its then employee. A rogue employee deliberately took a personal copy of a database containing details of almost 100,000 employees and shared it on the internet and with unauthorised third parties, misusing the access to that data granted to him by Morrisons for the performance of his role.
Morrisons raised three grounds of appeal. The first and second were essentially that vicarious liability was excluded under the Data Protection Act 1998 (DPA 1998). The third was that the wrongful acts of the individual did not occur "during the course of his employment", and that Morrisons was therefore not vicariously liable.
Morrisons maintained that the absence of any provision in the DPA 1998 about the legal accountability of an employer meant that the common law remedy of vicarious liability was expressly or impliedly excluded by the DPA 1998. Morrisons argued that the liability of a controller under the DPA 1998 for the wrongful processing of data by staff is limited to circumstances where that company has failed to take appropriate technical and organisational measures to safeguard the data, as required under data protection principle 7 of the DPA 1998. It was argued that the DPA 1998 did not impose the strict liability of vicarious liability (liability imposed regardless of an employer's fault).
The Court of Appeal disagreed that vicarious liability for misuse of private information, breach of confidence or breach of the DPA 1998 by an employee (including where the employee has acted as a data controller) had been excluded by the DPA 1998.
Morrisons argued that the close connection test, a component required for a finding of vicarious liability, was not met. The Court disagreed, in line with the High Court's finding, and held that on the facts there was no break in the chain of events between the wrongful act and the individual's employed role. The Court considered the individual's motive irrelevant to this finding. Morrisons' submission that this finding exposes companies to a significant financial burden was dismissed; the Court highlighted that insurance can mitigate such risks.
This case may be appealed to the Supreme Court. Like the DPA 1998, the GDPR and the Data Protection Act 2018 do not explicitly address this point.
Click here to read the full judgement.
The Information Commissioner's Office (ICO) has fined Facebook Inc and Facebook Ireland (Facebook) the maximum amount under the DPA 1998 for breach of Principle 1 (the lawful processing of personal data) and Principle 7 (keeping personal data secure).
The Facebook platform allowed third party developers to operate applications on the platform and access information about Facebook users who installed their apps. Crucially, the platform also allowed third party developers to access information about Facebook users who had not themselves installed their apps but who were friends with users who had, although Facebook did have rules in place about how third party developers should collect and use Facebook users' personal data.
A third party app on the Facebook platform called thisisyourdigitallife collected personal data of both Facebook users of the thisisyourdigitallife app and friends of Facebook users that had installed the thisisyourdigitallife app up until May 2015. Thisisyourdigitallife then shared this personal data (both before and after May 2015) on a commercial basis with a number of companies for their own use, including Cambridge Analytica, which has been widely mentioned in the press due to the company's use of personal data in connection with political campaigning.
These friends of Facebook users that had installed the thisisyourdigitallife app were not aware that thisisyourdigitallife was collecting personal data about them and sharing it with third parties. Clearly, this also meant that the friends had not consented to the collection or subsequent use of their personal data by the app.
Thisisyourdigitallife's collection and use of Facebook users' personal data was in breach of Facebook's rules – for example, third party developers were only allowed to collect personal data that was needed to run their app and they were not entitled to sell the personal data they collected to third parties – albeit those rules did not expressly require third party developers to inform Facebook users about the collection and use of their personal data nor obtain consent from the users for such collection and use.
Facebook estimates that the personal data of at least 1 million UK Facebook users was collected by the thisisyourdigitallife app, the majority of whom had not downloaded the app themselves.
The ICO determined that:
The fine levied by the ICO was the maximum sum available under the old legislation. The GDPR provides for significantly higher fines: the maximum is the greater of €20 million or 4% of annual global turnover.
Click here to read the ICO's monetary penalty notice.
On 4 October the European Commission, the Council of the EU and the European Parliament reached a provisional agreement on a regulation on the free flow of non-personal data. The aim of the new regulation is to:
The new regulation will bring aggregated data sets, including big data, into scope. Once the regulation takes effect, all types of data processed within the EU, both non-personal and personal, will be regulated. The text is due to be approved this month by the EU Council of Ministers.
Click here for the European Commission press release.
The ICO plans to create a regulatory sandbox for companies, as part of its 2018 – 2021 technology strategy. The ICO's regulatory sandbox would create a safe space for companies to innovate and experiment with data processing, whilst engaging with the ICO to ensure there are appropriate safeguards. It is a concept developed and used for the past four years by the Financial Conduct Authority with respect to financial regulation.
The ICO has been collecting views on the form, scope and support of a potential regulatory sandbox. The regulator intends to publish detailed plans for a consultation by the end of the year.
Click here for the ICO's latest news release.
Convention 108 (Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data) has been updated, by way of a protocol, to reflect changes to technology and the world since it was first agreed 35 years ago. The Convention is an international treaty, agreed by the Council of Europe and signed up to by a number of other countries, which sets out principles for the processing of personal data. The Convention is not limited, as the title might suggest, to the concept of automated processing as understood under the GDPR.
The protocol makes a number of changes to the Convention, with a number of clear parallels to the GDPR. The changes include a strengthening of obligations regarding transparency of processing and accountability of controllers, a requirement to notify data breaches to regulators, a transnational data transfer regime and a ‘privacy by design’ principle.
The UK Government has released a voluntary code of practice on Consumer Internet of Things (IoT) Security, aimed at manufacturers, IoT service providers, mobile application developers and retailers. IoT devices are sometimes considered to have poor security and have recently been the target of a number of high profile data breaches.
Leading technology companies have emphasised the importance of strengthening security in this area. The code of practice has been introduced to help increase the level of security of IoT devices. HP Inc. and Centrica Hive Ltd have already pledged to commit to the code. The code of practice contains thirteen guidelines for companies to implement in product design. These include no default passwords, keeping software updated, monitoring telemetry data and ensuring that personal data is protected.
Regulation in this area has been growing in response to the increased use of such devices and security issues. California has become the first state in the USA to introduce an IoT cyber security law which will take effect in 2020 and requires all IoT devices to meet certain minimum security standards.
Click here for the UK Code of Practice for Consumer IoT Security.