This Data Protection Day we thought we would look ahead to the legislative changes taking place, the regulatory work in progress, recent judicial landmarks, trends we are seeing our clients grapple with and other areas the privacy community is addressing.
Undoubtedly, data protection continues to be in a state of flux and presents many a practical challenge to global businesses navigating a patchwork of international legislation, and this does not stop with data transfers. As we hope to begin to emerge from a two-year pandemic, the metaverse is developing apace, and both today’s society and the metaverse depend on emerging and innovative technologies. With consumers more familiar with their data protection rights, and with the increased sophistication of artificial intelligence and machine learning, we are beginning to see more focus on, and demand for, data ethics.
Rapid technological change continues whilst institutional chambers around the globe debate and reform legislation. Data protection laws have recently been implemented in countries as diverse as China and Rwanda, and are in progress across US states, India and Australia, to name but a few. The data protection community is aware of how international our subject matter is. Whilst we can hope for a globally harmonised standard, at times it seems nuances abound when implementing a multijurisdictional project. We see this variation presenting a greater challenge, albeit one that can be met with a sound strategy.
As the ePrivacy Directive 2002/58/EC turns 20 years old this year, we continue to await the finalised ePrivacy Regulation, which is now in trilogue. The Programme for the French Presidency of the Council, covering the six months ending 30 June 2022, has confirmed that it will “continue work on the ePrivacy Regulation”. Given that a number of aspects of this Regulation remain to be agreed, and with the Council seeking a two-year implementation period, it could be a while yet before we see the effects of this piece of legislation in practice.
The Draft EU Artificial Intelligence Act
This year we are likely to see the continued formation and development of the European Union’s Artificial Intelligence Act. Whilst we expect a long legislative process, followed by a two-year implementation period, it will be advantageous for developers and users of automated technologies to keep a keen eye on the development of the Act and remember to design with data protection in mind from the outset. Under current proposals put forward by the European Commission, the Act will prohibit certain types of automated decision making; impose GDPR-like transparency and accountability controls on high-risk systems; and provide for fines for non-compliance of up to a maximum of EUR 30m or 6% of global turnover, depending on the seriousness of the violation. At present, the Act is in preparatory stages, ahead of a first reading in the European Parliament under the EU’s ordinary legislative process. This year we are likely to see how close the intentions of the EU Parliament are to those of the EU Commission. Axel Voss, the MEP responsible for commissioning a draft report for Parliament from the Special Committee on Artificial Intelligence in a Digital Age, has spoken about the potential of AI and how the EU could lead the way in promoting “a human-centred approach to AI, based on our core ethical and democratic values”. Mr Voss has stressed how the EU needs to move fast and be agile in order to be an international leader on AI.
Digital Services Act
In 2020, the European Commission proposed an ambitious reform of digital services through the Digital Services Act (“DSA”). The DSA, which will apply to online platforms such as social media and online marketplaces, aims to tackle illegal content and misinformation, and to improve transparency around targeted advertising and algorithms. Under the DSA, online platforms will also face greater obligations to moderate content and to disclose to regulators how their algorithms work. Now ready for the trilogue stage, the amendments tabled in the European Parliament’s finalised text suggest a ban on targeted ads based on sensitive data, including religious beliefs and race, and on “nudging” tactics designed to influence a user’s behaviour. Such restrictions on targeted advertising could reshape the entire tech industry, with the business model of many companies reliant on such popular marketing practices as their leading source of income. Since the Council has expressed little appetite for taking on advertising in the DSA, the final outcome is difficult to predict. However, companies may not have long to wait, with the trilogue imminent and the French Presidency wanting to “move as far forward as possible with talks”. Proposed implementation periods vary widely, however, from the Commission’s three months to the Council’s preferred 18 months.
UK Privacy Reform
Undoubtedly, there was more than a sigh of relief when the UK achieved an adequacy decision from the EU, but “Brexit means Brexit” and, whilst UK legislation is at present closely aligned to the EU’s, is it only a matter of time before the UK starts to diverge? Beyond adequacy, there are also the core principles of data protection themselves, which raises the question: what room does the UK actually have to reform its data protection landscape?
In September 2021 the Department for Digital, Culture, Media and Sport published its consultation Data: A New Direction (the “Consultation”), part of the wider National Data Strategy, which links into the Government’s 10 Tech Priorities. Despite the Government’s stated intention that the Consultation help “create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data”, the content of the document has been met with mixed reactions. Whilst there are some promising proposals for businesses, aimed at reducing some of the compliance hurdles, and the financial and time investment, introduced by the GDPR, any business operating in Europe, or elsewhere under legislation modelled on the GDPR, will need to remain compliant with the EU GDPR.
The preliminary proposals, arguably more commercial and less administrative, include:
- the introduction of white listed grounds for processing on the basis of legitimate interests;
- changes to cookie consent requirements, to make it easier to use analytical cookies;
- the introduction of Freedom of Information Act-style cost ceilings and nominal fees for the handling of data subject rights; and
- raising data breach reporting thresholds, such that only breaches resulting in a material risk to individuals need to be reported to the ICO.
Proposals less well-received include the introduction of a required “privacy management programme” and the replacement of UK DPOs with a “suitable individual responsible for the privacy management programme”. These and other suggestions under the proposals have been criticised as confusing changes that replace one set of administrative requirements with another, with no tangible benefit to UK businesses. It will be interesting to see which proposals survive the consultation process and how soon we will see a report published that consolidates the consultation responses. We expect that data protection reform will feature in the Queen’s Speech in May, with a draft Data Protection Bill following shortly afterwards. The UK adequacy decision will be reviewed in June 2025, which allows time for carefully thought-out changes to the regime that can maintain this invaluable data transfer mechanism.
Unless you are completely new to data protection, you will be more than familiar with the usual suspects that have been, and continue to be, keeping regulators busy. The data transfer conundrum has been eased by the introduction of the new Standard Contractual Clauses, including the longed-for processor-to-sub-processor clauses, and by the European Data Protection Board’s (EDPB) finalised guidance on supplementary measures, i.e. transfer impact assessments (TIAs), although a periodic review of those assessments will need to commence soon enough. With the consultation on the interplay between the application of Article 3 and the provisions on international transfers under Chapter V of the GDPR ending on 31 January 2022, it is likely we will see further change, and perhaps some less onerous obligations for organisations caught by the GDPR’s territorial effect.

As for adtech, it is a whole industry in itself and a business model at the centre of many companies. This paradigm influences discussions, attitudes and possible workable solutions, which all stakeholders will continue to explore in 2022 and beyond. As Google’s delay in its “cookieless” technology demonstrates, this is an extremely complex area. Some variety in this mix comes from work on artificial intelligence, including investigations into Clearview AI Inc, children’s data and the various data collection and data tracking arising from Covid, including vaccination and booster status, infection rates, and test and trace initiatives.
Perhaps this is something of an end of year wrap. In December 2020, Google and Amazon received hefty fines totalling €135 million from the CNIL, the French data protection authority, for violating cookie tracking rules. One year on, Google was fined €150 million and Facebook €60 million for further adtech abuses. The CNIL has stepped up as one of the leading enforcement bodies within Europe, yet the UK has been somewhat absent from taking any regulatory action against the adtech industry. In November 2021, the Information Commissioner published an opinion on data protection and privacy expectations for online advertising proposals, which sets out what developers in this area need to consider to prevent the “excessive collection of personal information”. Whilst adtech did not get an explicit mention in the introductory remarks of John Edwards, the new UK Information Commissioner, he could certainly make his mark in this area.
As mentioned above, this is a notoriously complex area which will continue to evolve this year across all transfer mechanisms, not just Standard Contractual Clauses, as the blog What future for the transfers of personal data discusses. One logistical challenge to prepare for is the repapering of old SCCs before the deadline of 27 December 2022 (allowing for the first Christmas party season in three years, no new Covid variant permitting).
The new EU SCCs have been welcomed as a more comprehensive and flexible tool for transferring data. Whilst organisations seem to be adopting them well, many are still grappling with the practical and legal challenges of carrying out TIAs and, where required, implementing “supplementary measures”, keeping them under review and assessing any new risks. The UK’s ICO is expected to reveal any time now the new transfer mechanisms that will replace the old EU SCCs. We predict (and hope) that, as proposed in the ICO’s consultation, the UK will approve the use of the new EU SCCs in conjunction with the UK Addendum. This would help to reduce the additional complexities created by Brexit.
Recent decisions from the Austrian DPA and the European Data Protection Supervisor on the use of Google Analytics tools and transfers of data to the US suggest that 2022 might be the year of Schrems II enforcement. As multiple European regulators are currently investigating the remaining one hundred complaints filed by noyb in relation to transfers of data to US service providers, it seems inevitable that more regulatory decisions will follow. It will be interesting to see their impact on the operations and business strategies of European organisations. The prospect of looming enforcement is likely to intensify pressure on the EU and the US to negotiate a new data sharing deal to replace the Privacy Shield. Even if such an agreement is reached, organisations might be reluctant to rely solely on a framework that will inevitably be a target of legal challenges.
With the ICO’s Age Appropriate Design Code, aka the Children’s Code, becoming applicable on 2 September 2021 and the DPC’s finalised guidance, The Fundamentals for a Child-Oriented Approach to Data Processing (the “Fundamentals”), published on 17 December 2021, there is a lot happening in the area of children’s data. Both of these regulatory documents embrace recital 38 of the GDPR, which provides that “children merit specific protection with regard to their personal data”. Sweden, France and the Netherlands have also produced guidance on children’s data, which is a priority area of the EDPB’s Work Programme for 2021-2022, with Ireland appointed as the Rapporteur. There have been calls from US Senators for US based companies to apply the Children’s Code to children in the US, such is the high benchmark it sets for children’s data.
As with data protection per se, this area is not static. Upon the Children’s Code becoming applicable, the ICO sent investigatory letters to social media and app messaging platforms, and gaming companies, to determine what work they had undertaken to comply with the Code’s various standards. The overall results of these letters, which asked 50-plus questions, are imminently expected, and the ICO will be reviewing the Code itself before its first anniversary. Age assurance / verification / gating is a tricky area, especially when you need to determine with certainty the age of a child user, although, as the safety tech market matures and technical measures increase in sophistication, this problem is lessening.
Whilst the suggested privacy icons published with the early drafts of the GDPR had little universal appeal, will we finally get some agreed international privacy icons? Transparency is a fundamental principle of data protection and, with respect to explaining privacy content to children, it is being championed more than ever. There is a real push for privacy notices to use “diagrams, cartoons, graphics, video and audio content, and gamified or interactive content”. Beyond its Children’s Code, the ICO has a whole web section dedicated to additional resources, including design guidance and a round-up of examples of good practice from “transparency champions”. Arguably, Lego is setting the bar, but creativity abounds and we imagine the static written privacy notice will soon be complemented, for adults and children alike, by something more contemporary and engaging.
Privacy class actions
The UK Supreme Court decision in Lloyd v Google in November 2021 has left the future of opt-out privacy class actions in the UK somewhat uncertain. It is not yet clear whether the pending privacy class actions (which were previously awaiting the outcome of the decision) will pursue their claims into 2022 and resolve some of the questions left unanswered by Lloyd v Google.
By way of reminder, the case concerned an application to serve proceedings on Google out of the jurisdiction for privacy violations concerning Google’s 2011-2012 Safari workaround. The application was denied on the basis that damages for mere loss of control of data were not available under the UK Data Protection Act 1998 and, even if they had been, the claim could not succeed as class members were unable to meet the “same interest” requirement under Civil Procedure Rule 19.6.
Whilst the decision was good news for business, the door to opt-out privacy class actions is not firmly closed. Notably:
- the position under the UK GDPR has not yet been considered. The UK GDPR contains references to “non-material damage to natural persons such as loss of control” (Recital 85) and to the right of data subjects to bring a properly constituted representative claim under Article 80 GDPR. Both of these provisions could support a different finding in relation to cases brought under the Data Protection Act 2018, rather than the 1998 Act;
- building on the previous point, there are several outstanding references to the CJEU that have the potential to extend the scope of compensation under the EU GDPR and may have a persuasive (non-binding) effect on future UK decisions; and
- litigants will be looking closely at the decision to find creative approaches to distinguish their case from Lloyd v Google. One such approach is to focus on more serious contraventions of UK privacy laws, ensuring that the lowest common denominator of damage between the class members is sufficiently material. Another possibility is to scrutinise categories of class members with a view to including only those with a sufficiently similar interest in the harm suffered and the outcome of the claim.
New Information Commissioner arrives at the ICO
The new year brought with it a change at the helm of the ICO, with John Edwards starting his five-year term as the UK Information Commissioner. Mr Edwards has announced that he will focus on strengthening links with other digital regulators and will actively engage with the Government over the proposed reforms to the Data Protection Act and the introduction of the Online Safety Bill (still in draft, but reported on by the Joint Select Committee in December 2021).
The Information Commissioner is currently (until 24 March) seeking feedback on the ICO’s Regulatory Action Policy, Statutory Guidance on our Regulatory Action and Statutory Guidance on our PECR Powers, which together will set out the approach to be adopted by the new Information Commissioner. Whilst we may see some change of direction, Mr Edwards has already confirmed that he will continue to prioritise work to protect children online through the Age Appropriate Design Code.
It is clear we have our work cut out: data protection is a multifaceted discipline that will continue to be engaging throughout the year(s) ahead. The value of data is more than apparent and the amount of data available is unquantifiable. The future is exciting and, whilst there are a number of moving parts and challenges, it is clear we need workable solutions that protect personal data in a global environment while allowing for innovation, commercial enterprise and economic growth, and benefiting society overall.