Data Protection update – February 2023 – Stephenson Harwood

03 Mar 2023

Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from February 2023.
While February may be the shortest month of the year, it was far from quiet on the data protection front as regulators across the UK, US and EU stepped up enforcement efforts.
In the UK, the Information Commissioner’s Office experienced both wins and setbacks. It helpfully clarified its approach to regulating the use of Facial Recognition Technology, setting out a number of recommendations after a local council was found using the technology to manage “cashless catering” in school canteens. But the regulator also suffered a setback after a tribunal upheld an appeal from Experian over the credit reference agency’s handling of personal data for direct marketing purposes. The First-Tier Tribunal rejected the ICO’s findings that Experian’s privacy notice lacked transparency and that using credit reference data for direct marketing purposes was inherently unfair.
Elsewhere in the US, the Federal Trade Commission announced its first enforcement action under its Health Breach Notification Rule, which requires vendors of personal health records to notify consumers following a data breach revealing unsecured information. It fined digital healthcare platform GoodRx Holdings $1.5 million after it was found to have shared individuals’ health data with Facebook, Google and others.
In the EU, the European Commission helpfully set out a plan to streamline cooperation efforts between data protection authorities of member states. The European Commission has yet to publish details on how the initiative will work in practice, but as it seeks to aid harmony in cross-border cases it could mean big changes to the ‘one-stop-shop’ regime in this area – so watch this space.
In this month’s issue:
Following a complaint by the Irish Council for Civil Liberties (“ICCL“) and subsequent suggestions by the European Ombudsman, the European Commission has confirmed that it will require all EEA national supervisory data protection authorities to share details of large-scale cross-border investigations under the EU General Data Protection Regulation (“EU GDPR“) with it on a bi-monthly basis. All information will be shared on a confidential basis, giving the European Commission greater oversight over the enforcement of the EU GDPR and the protection of data subjects’ rights throughout the EEA.
The European Commission’s enhanced supervision over the actions of data supervisory authorities looks set to accelerate investigations and enforcement action and could ensure a more harmonised approach to enforcement action under the EU GDPR. ICCL senior fellow, Dr Johnny Ryan, has commented that the increased supervision by the European Commission signals “the beginning of true enforcement of the GDPR, and of serious European enforcement against Big Tech“.
The European Commission has also proposed a plan to create a legislative initiative to streamline cooperation between the data protection authorities of member states. The European Commission has stated that this initiative will allow increased harmony in the actions taken by data protection authorities when considering cross-border cases. This could be considered to be an effective overhaul of the ‘one-stop shop’ regime currently in practice, implementing a new regime for consistency across member states. The European Commission has yet to publish details on how this initiative will work in practice but it has opened a ‘call for evidence’ to receive feedback to “develop and fine-tune” the initiative between 24 February 2023 and 24 March 2023.
Feedback can be submitted to the European Commission here.
On 24 February 2023, the Cyberspace Administration of China officially published the Measures on the Standard Contract for Export of Personal Information (“Measures on Standard Contract“). A standard contract for the export of personal data from the People’s Republic of China (“PRC“) to other countries (“PRC Standard Contract“), similar to the Standard Contractual Clauses (“SCCs“) used under the EU GDPR, is attached to the Measures on Standard Contract, which will be effective from 1 June 2023. 
Unlike the SCCs, the PRC Standard Contract only has one universal template regardless of the roles of the data importer and data exporter as controller or processor. The PRC Standard Contract cannot be adopted where the data exporter is a critical information infrastructure operator (such as a telecoms or public utility provider), is processing the personal data of more than 1 million data subjects, has exported the personal data of more than 100,000 individuals (in aggregate) since 1 January of the preceding year, or has exported the sensitive personal data of more than 10,000 individuals (in aggregate) since 1 January of the preceding year. In these instances the data exporter will be instead required to carry out a mandatory data export security assessment.
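For illustration only, the eligibility thresholds described above can be sketched as a simple decision rule. This is not an official test and all names are assumptions; the actual determination under the Measures on Standard Contract will turn on the full legal criteria.

```python
# Hypothetical sketch of the PRC Standard Contract eligibility thresholds
# described above. Function and parameter names are illustrative assumptions.

def may_use_prc_standard_contract(
    is_ciio: bool,                 # critical information infrastructure operator?
    data_subjects_processed: int,  # data subjects whose personal data is processed
    personal_exported: int,        # individuals whose personal data has been
                                   # exported since 1 Jan of the preceding year
    sensitive_exported: int,       # individuals whose sensitive personal data has
                                   # been exported since 1 Jan of the preceding year
) -> bool:
    """Return True if the exporter may rely on the PRC Standard Contract;
    False indicates a mandatory data export security assessment instead."""
    if is_ciio:
        return False
    if data_subjects_processed > 1_000_000:
        return False
    if personal_exported > 100_000:
        return False
    if sensitive_exported > 10_000:
        return False
    return True

# Example: a non-CIIO exporter under all thresholds may use the contract
print(may_use_prc_standard_contract(False, 500_000, 50_000, 5_000))  # True
```

Crossing any single threshold is enough to require the security assessment; the conditions are disjunctive, not cumulative.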
For further detail on the Measures on Standard Contract and the PRC Standard Contract, an alert from our associate firm in the PRC (Wei Tu) is available on our data protection hub.
In December last year, the European Commission published a draft adequacy decision (“Draft Decision“) endorsing the proposed EU-US Data Privacy Framework (“Framework“). A short article on the Draft Decision can be found here. That action came after US president Joe Biden signed Executive Order 14086 on Enhancing Safeguards for U.S. Signals Intelligence Activities (“EO“).
In the last month, there have been two developments in relation to the Draft Decision and the Framework. The first came in the form of a draft motion of the Members of the European Parliament (“MEPs“) making up the Committee on Civil Liberties, Justice and Home Affairs (“LIBE Committee“) on 14 February 2023 (the “Motion“), which states that the Framework “fails to create actual equivalence in the level of protection” and urges the European Commission “not to adopt the adequacy finding“.
The Motion takes the view that the Draft Decision does not offer EU businesses legal certainty, with the LIBE Committee remaining concerned about the Framework becoming subject to future legal challenges. Some of the potential areas for future challenge are identified as follows:
The Motion has since been followed by a non-binding opinion of the European Data Protection Board (“EDPB“) on 28 February 2023 (“EDPB Opinion“) which highlights the “substantial improvement” in the Framework when compared to the old Privacy Shield regime. Unlike the Motion, the EDPB Opinion is overall fairly positive about the Framework and resulting Draft Decision, acknowledging that the test of essential equivalence under the EU GDPR does not require data protection safeguards in the US to be identical to those in the EU. In this regard, the EDPB makes some suggested improvements in relation to how the Framework interacts with US laws, including in relation to data subject rights requests, onward transfers and profiling or automated decision making. Critically, the EDPB recognises that these suggestions are not the issues that were criticised by the CJEU in Schrems II, but instead speak to pre-existing concerns over the overall equivalence of US law to the EU GDPR.
The EDPB Opinion also sets out some areas in which the Framework falls short, as far as access to personal data by US national security authorities is concerned. The EDPB refers to its European Essential Guarantees for surveillance measures from 2021 (“Guarantees“) and proposes that the European Commission adopt the Draft Decision on the condition that US intelligence agencies implement updated policies and procedures that meet the requirements of the Guarantees.
So, what is next?  The European Commission needs the Draft Decision to be approved by a committee composed of representatives of the EU member states. In addition, the European Parliament has a right of scrutiny over adequacy decisions. We now await a final vote by the European Parliament on the Draft Decision (which is expected to take place during the Spring). If approved, the Framework could be operational as soon as July.
The European Commission’s new approach of enhanced supervision (as reported on above) follows recent controversy regarding the Irish Data Protection Commission’s (“DPC“) handling of complaints against Big Tech companies. The DPC responded to recent criticisms of its own decision-making processes by criticising the EDPB’s oversight of the DPC’s enforcement action against Meta Platforms Ireland Limited (“Meta“), as reported in our last bulletin. Subsequently, the DPC has now issued multiple claims against the EDPB in the Court of Justice of the European Union (“CJEU“). Although the details of these claims have not yet been published, they are likely made pursuant to Article 263 of the Treaty on the Functioning of the European Union, which allows the CJEU to examine the legality of legal acts of bodies, offices or agencies of the Union that produce legal effects in a process similar to that of judicial review in the United Kingdom.
As reported in our December 2021/January 2022 bulletin, Meta attempted a similar course of action in relation to the EDPB’s decision on WhatsApp which led the DPC to issue a €225 million fine against Meta. However, in December 2022 the CJEU ruled the claim brought by Meta to be “inadmissible“.
It remains to be seen what the DPC’s precise arguments will be and whether or not these arguments will be enough to convince the CJEU that the EDPB “overreached” its authority by requiring the DPC in its decision to initiate a fresh investigation into Meta’s platforms’ processing of special category personal data.
A group of four of Europe’s largest telecoms providers has announced plans to form a joint venture company aimed at providing a privacy-by-design digital marketing technology platform (“Platform“). Each of the telecoms providers – Deutsche Telekom AG, Orange SA, Telefónica SA and Vodafone Group Plc – will hold a 25% stake in the joint venture.
In a statement Vodafone noted that the Platform has been designed with a focus on compliance with the EU GDPR and European Directive 2002/58/EC (“ePrivacy Directive“), as well as increased transparency for consumers using the Platform. Consumers will only receive communications from brands where they have given affirmative opt-in consent and will also be able to revoke this consent with a single click through the Platform. The Platform is already undergoing a trial in Germany to evaluate the Platform’s transparency and the control it gives to consumers.
In the UK, digital marketing is governed by the UK GDPR, the Data Protection Act 2018 (“DPA 2018“) and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR“), which implements the ePrivacy Directive into UK law. As with their European equivalents, the UK GDPR, DPA 2018 and PECR require organisations carrying out digital marketing to obtain opt-in consent to receiving marketing communications, which most companies do by asking individual users to tick a box if they wish to receive such communications. Similarly, under Regulation 6 PECR, when using cookies or other tracking technologies (to enable targeted or behavioural digital marketing), organisations must tell people that the cookies or tracking technologies are there, explain what they do and why, and get the individual’s consent to store the cookie on their device.
The Platform appears to provide consumers with greater control over their consents by allowing users to manage their consents for multiple brands in one place. Moreover, the Platform is designed so that consumers’ consents are anonymised when data is shared with brands in order to protect the consumers’ personal data.
Whilst the Platform has not yet been adopted for commercial use, it reflects a wider emphasis across the EEA and UK on protecting consumers’ data protection rights in relation to digital marketing, including through the publication of updated guidance from the Information Commissioner’s Office (“ICO“) on e-mail direct marketing (as reported in our October bulletin).
Following a cyber-attack on JD Sports Fashion plc (“JD Sports“), the personal data of ten million customers who made online orders between November 2018 and October 2020 may be at risk. JD Sports made clear that while full payment card details should not be at risk, as it did not hold full card details for customers (only the last 4 digits of their card), personal data such as names, addresses, and phone numbers could be accessed by cyber-attackers.
Neil Greenhalgh, the Chief Operating Officer of JD Sports, apologised to customers affected, advising them to be “vigilant about potential scam emails, calls and texts” and providing details on how to report these.
The cyber-attacks have been reported to the ICO and JD Sports has stated that it is reviewing its cybersecurity with external specialists. This cyber-attack, along with the recent cyber-attack on Royal Mail (as discussed in our December 2022/January 2023 bulletin), is a stark warning to companies holding large amounts of customer personal data. Companies should ensure that they are appropriately safeguarding their systems and have a clear policy in place for preventing and managing cyber-attacks. While limiting the card information held (often by using third party payment processors) helps to reduce the potential financial impact on data subjects, the ancillary information held by consumer-facing companies is generally more than enough to trigger obligations to notify both regulators and data subjects, which invariably leads to consequential reputational damage. Notably, several claimant law firms have posted data breach claim websites looking to bring claims on behalf of impacted data subjects, a common feature of large scale publicly reported breaches.
On 6 December 2022, the Product Security and Telecommunications Infrastructure Act 2022 (the “Act“) received Royal Assent and came into force. As discussed in our December 2021/January 2022 bulletin, the Act’s purpose is to create a new regulatory regime for the security of consumer-connectable products and for the provision of electronic communications infrastructure.
The Act includes obligations and legal duties for manufacturers, authorised representatives, importers and distributors (the “Relevant Persons“).
These duties for the Relevant Persons include duties to:
The Minister for Digital, Culture, Media and Sport stated that the Act and its supporting legislation will also strengthen cyber protection “to make sure the UK has the strongest security regime for smart tech in the world“. The Act will have a practical impact by creating a range of penalties for non-compliance. Potential penalties include a fine of the greater of £10 million or 4% of qualifying worldwide revenue. This may be followed by fines of £20,000 per day for those who fail to remediate and comply with the Act once notified. The authority enforcing the Act will also be able to issue stop, recall or compliance notices, and failing to adhere to these will be a criminal offence.
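The headline penalty arithmetic above can be illustrated with a short sketch. The function names are assumptions for illustration only; the actual penalty calculation will be governed by the Act and its supporting regulations.

```python
# Illustrative sketch of the penalty structure described above: the greater of
# £10 million or 4% of qualifying worldwide revenue, plus up to £20,000 per day
# of continued non-compliance after notification. Names are hypothetical.

def max_initial_penalty(qualifying_worldwide_revenue: float) -> float:
    """Greater of £10m or 4% of qualifying worldwide revenue."""
    return max(10_000_000, 0.04 * qualifying_worldwide_revenue)

def daily_penalty(days_non_compliant: int) -> int:
    """£20,000 per day of continued non-compliance after notification."""
    return 20_000 * days_non_compliant

# Example: for a firm with £500m qualifying worldwide revenue, 4% (£20m)
# exceeds the £10m floor
print(max_initial_penalty(500_000_000))  # 20000000.0
```

The £10 million figure acts as a floor: only businesses with qualifying worldwide revenue above £250 million would face a larger percentage-based fine.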
While the Act is now in force, many of the substantive obligations will be brought into force through regulations in the future. These subsequent regulations are set to introduce obligations for manufacturers, importers and distributors amongst others. The Government has stated that Relevant Persons will have at least 12 months to comply with the Act and any subsequent regulations, but it is advisable for organisations likely to be subject to the Act and this legislation to start building the necessary organisational infrastructure to ensure compliance. Organisations should account for the costs associated with the new obligations under the Act and ensure that records of compliance are maintained, as well as ensuring that remediation plans are feasible.
The Act can be read in full here.
On 9 February 2023, the HM Treasury Office of Financial Sanctions Implementation (“OFSI“) released a notice (the “Notice“) freezing the assets of seven Russian individuals who are alleged members of Russian cybercrime gang Trickbot, and banning these individuals from travelling to the UK. These sanctions have been implemented pursuant to the Cyber (Sanctions) (EU Exit) Regulations 2020 (S.I. 2020/597) which allow for an individual’s funds and economic resources to be frozen if they are “conducting or directing cyber activity that undermines, or is intended to undermine, the integrity, prosperity or security of the United Kingdom or a country other than the United Kingdom; international organisations; and non-governmental organisations whose purposes relate to the governance of international sport or the Internet“.
The sanctioned individuals are said to have been involved in the deployment of ransomware in both the UK and US. Consequently, they have also been sanctioned in the US by the US Department of the Treasury’s Office of Foreign Assets Control, in accordance with Executive Order 13694, as amended by Executive Order 13757. The estimated value of the funds extricated using the ransomware is said to be over £27 million, with entities such as hospitals, local authorities, and schools targeted in attacks. The sanctions have been announced in the midst of a large-scale, ongoing investigation by the National Crime Agency (“NCA“).
The Notice also obliges any person or entity potentially connected financially or economically to the sanctioned individuals to check if they are holding any accounts, funds or economic resources for these individuals and, if relevant, to freeze such accounts.
The OFSI has simultaneously released updated guidance relating to ransomware and sanctions (the “Ransomware Guidance“). The Ransomware Guidance seeks to outline mitigating steps which can be taken to allow the OFSI and NCA to resolve breach cases involving ransomware payment through alternative means to monetary penalties and criminal investigations.
The key takeaways from the Ransomware Guidance are:
Outside of the above, the Ransomware Guidance also offers practical steps to consider if there is a ransomware attack; tools and online platforms/services provided by the NCSC for the purpose of cyber resilience and mitigation; and key contacts for reporting ransomware attacks and obtaining further information.
The Notice can be read here, the US Department of the Treasury’s press release can be read here and the Ransomware Guidance can be read here. The joint letter from the NCSC and ICO to the Law Society can be read here.
Following the ransomware attack on Royal Mail, the negotiation history between Royal Mail International and ransomware gang LockBit has been released. On 14 February 2023, LockBit leaked the full transcript of its live chat with Royal Mail.
The transcript highlighted some of the common negotiation strategies either side employs, notably in response to LockBit’s ransom demand for $80 million made in February 2023. For example, Royal Mail’s negotiator is seen requesting sample decryption evidence by providing files to be decrypted. This is common to obtain ‘proof of life’ that the decryption works. The LockBit threat actor is seen rejecting some on the basis that it believes they could be used to restore functionality without the decryptor. 
On the other hand, the LockBit threat actor focused on the alleged annual revenue of Royal Mail, citing news sources as evidence that the Royal Mail could pay (which the Royal Mail negotiator disputed). The threat actor also referred to the possible fines that regulators could impose, stating that payment would save Royal Mail from a much larger fine. This is a common tactic from threat actors, but as noted elsewhere in this update, regulators have publicly stated they will not take into account payment of ransoms when considering when to issue a fine, or the size of any fine to be issued. Further, payment of a ransom does not obviate the need to report a breach to regulators, as the breach occurred when the threat actor accessed and exfiltrated data, regardless of whether it later publishes it. 
The negotiation appeared to end without achieving a resolution, and on 23 February various outlets reported that some 44GB of Royal Mail data had been leaked onto the dark web, with a further ransom demand issued in respect of the remaining unpublished data in the threat actor’s hands.
The US Federal Trade Commission (“FTC“) has announced the first enforcement action under its Health Breach Notification Rule, which requires vendors of personal health records and related entities to notify consumers following a data breach revealing unsecured information. 
GoodRx Holdings Inc. (“GoodRx“) was found by the FTC to have shared individuals’ health data with Facebook and Google amongst others and agreed a US$1.5 million civil penalty with the FTC for the breach.
The FTC’s proposed order for the breaches (which must be approved by a US federal court to be effective) suggests that, in the US, opt-in consent is required to use any health data for advertising and that health data should not be shared without the relevant individual’s knowledge. This echoes the decision of the ICO to fine Easylife Limited in the UK for using inferred health data to target digital marketing to individuals and the fines issued by a number of European data protection supervisory authorities to Clearview AI Inc. in relation to its collection of biometric data without individuals’ knowledge for similar purposes.
The FTC’s action against GoodRx highlights the importance of ensuring any health or other special category personal data is held and used in accordance with applicable data protection laws. Any special category personal data collected should only be shared with great caution and only when full transparency can be given to the individual whose data is being shared.
Following an ICO investigation, a former RAC employee has been found to have stolen the personal data of at least twenty-one individuals relating to their involvement in road traffic accidents. The former employee, who pleaded guilty to stealing data in breach of the DPA 2018, was fined £5,000 and ordered to pay a victim surcharge and court costs.
In the ICO’s statement on the action, the ICO’s Director of Investigations, Stephen Eckersley, noted that: “receiving nuisance calls can be hugely frustrating and people often wonder how these companies got their details in the first place. This case shows one such way that it happens. But also shows that those who do this crime will be caught, will be convicted and justice will be served.“
Whilst ICO action against individuals is rare, this action illustrates that the ICO will take action against individuals as well as organisations for breaching data protection laws, particularly where these breaches result in distress to data subjects. 
The ICO has made a number of recommendations to North Ayrshire Council (the “Council“) following the Council’s controversial use of Facial Recognition Technology in its schools. The ICO’s recommendations centre around three key requirements when using biometric data and children’s data.
Where biometric data is used to uniquely identify a natural person, the data used will be classified as special category data and require both a lawful basis for processing under Article 6 UK GDPR and a condition for processing special category data under Article 9 UK GDPR to be lawfully processed. In this instance, the Council purported to rely on consent and explicit consent as its lawful bases for processing special category data. Investigating the matter, the ICO found a number of issues with the forms used to collect these consents from data subjects. 
Firstly, the ICO found that the forms did not present the use of Facial Recognition Technology as an option. For a public authority to rely on consent as a lawful basis for processing, it must have given individuals a genuine choice to accept or reject the processing. 
The ICO also found that the forms used a number of technical terms that were unlikely to have been understood by the data subjects. 
The Council’s use of the wording “I do wish to grant consent to participate in the use of facial recognition systems within the school” in its forms was deemed by the ICO to be too vague and broad to properly grant explicit consent. Whenever collecting explicit consent, it is critical that any consent collected is given in both an explicit and unambiguous manner. Such explicit consent should be in a clear oral or written statement, must specify the nature of the special category data the individual is consenting to have collected, and must be separate from any other more general consent.
The ICO noted in its recommendations that the right for individuals to be informed about how their personal data is collected and used is a key transparency requirement under the UK GDPR. Pursuant to Article 12 and Recital 58 UK GDPR, when using children’s data there is a particular need to make efforts to provide this information to the relevant children in a way that they “can easily understand“. Particular care should also be taken to ensure that children understand the potential risks with collecting their data and their rights under the UK GDPR and other applicable data protection legislation.
Any communications with data subjects should also not be misleading. The ICO found in this instance that the Council had underplayed the complexity of the Facial Recognition Technology, giving a misleading impression that this technology was more well-tested and commonplace than it is in reality.
The ICO found the Council had made a number of errors when completing its Data Protection Impact Assessment (“DPIA“) in relation to the Facial Recognition Technology. Before processing is commenced, a comprehensive DPIA, complying with the Article 35 UK GDPR requirements, should be completed by the controller. This should be signed and dated by a senior employee or the data protection officer (“DPO“), prior to the processing commencing. In its recommendations to the Council, the ICO reiterated the importance of consulting and involving the DPO throughout the process of completing the DPIA. Any residual risks highlighted in the DPIA must either be mitigated or the ICO must be consulted before processing commences, so as not to impinge on the rights of data subjects.
The ICO’s report on the recommendations given to the Council can be read in full here.
The First-Tier Tribunal (the “Tribunal“) has upheld an appeal from credit reference agency Experian Limited (“Experian“) in regard to its processing of data from a number of sources for the purpose of direct marketing. Of particular interest were the Tribunal’s findings on legitimate interest as a basis for processing for direct marketing purposes.
The judgment follows enforcement action taken by the ICO against Experian in October 2020 (which we reported on in detail here), following a two-year-long investigation relating to the use of personal data by Experian’s direct marketing arm acquired from a variety of public sources (such as the electoral register), internal sources (from its credit reference business) and third party sources, which it used to build profiles for 51 million adults. It sold this profile data on to third parties for direct marketing purposes. The profiles combined name and address information with up to 13 other attributes. From these Experian produced modelled datapoints which reflected ‘predictions’ about the likelihood of people having certain characteristics.
The ICO’s investigation concluded that Experian was breaching the UK GDPR by processing the personal data of data subjects for direct marketing purposes without consent, and that Experian could not rely on the legitimate interest basis to render its processing lawful. It was not appropriate for Experian to process data for direct marketing purposes on the basis of its legitimate interests when that data was originally obtained on the basis of consent.
While Experian had, in October 2020, introduced a pop-up notice in its Customer Information Platform collating the required privacy information, the ICO held that this was not sufficient. As a result, Experian was conducting ‘invisible processing’, breaching its transparency obligations and not processing data with the consent of data subjects. The ICO’s enforcement notice (“Enforcement Notice“) required that Experian amend its practices within a period of nine months, including by providing a UK GDPR-compliant privacy notice to all data subjects, subject to a fine of £20 million or 4% of its annual global turnover should no changes be implemented.
The Enforcement Notice was appealed by Experian (the “Appeal“). In the Appeal, Experian asserted that the law had been applied incorrectly and flawed conclusions reached on the facts, and that the requirements of the Enforcement Notice were disproportionate. Experian asserted that the Enforcement Notice was an attempt by the ICO to impose subjective preferences as if they were legal requirements under the GDPR, and that the effect of the Enforcement Notice would be that Experian would be forced to adopt an unworkable consent-based model for offline marketing services, which would effectively shut down its business in this area.
There were five grounds of appeal underpinning this broad argument:
The Tribunal struck out the ICO’s Enforcement Notice and confirmed that the legitimate interests ground can be relied upon for direct marketing activities. There are many fact-specific points to the decision, but some of the key areas of note are as follows:
The Tribunal’s decision can be read here. It is not yet known whether the ICO will appeal the decision.
The CJEU has ruled that data protection officers (“DPOs“) may hold other roles and carry out other obligations or tasks so long as they do not result in a conflict of interest.
The 9 February 2023 decision from the CJEU follows a request for a preliminary ruling from the Federal Labour Court of Germany in a claim relating to a former DPO. The former DPO of X-Fab Dresden, who was also chair of the company’s works council, was dismissed, with his former employer arguing that his two roles posed a risk of a conflict of interests.
The CJEU’s decision focused on interpreting Article 38 EU GDPR, which does not allow for the dismissal or penalising of DPOs for performing their tasks. It clarified that the aim of Article 38 EU GDPR is to allow DPOs to be independent from controllers and processors. Article 38 does not prevent member states from placing stricter conditions on the dismissal of a DPO, but such conditions must not be incompatible with the EU GDPR.
In respect of determining the circumstances of a conflict of interest, the CJEU concluded that a DPO “cannot be entrusted with tasks or duties which would result in him or her determining the objectives and methods of processing personal data on the part of the controller or its processor“. Data protection objectives and methods should be carried out independently by the DPO.
The decision sheds some light on what external tasks and duties it is appropriate for DPOs in organisations to carry out alongside their roles as DPOs. The focus should always be on ensuring that DPOs are not placed in a position in which they are determining processing or where their role may conflict with ensuring compliance with the EU GDPR.
The decision from the CJEU can be read in full here.
The conference of German Data Protection Authorities (“DSK“) has published a decision on whether the risk that a parent company based in a third country could instruct an EEA-based subsidiary to grant access to the personal data it holds (“Access Risk“) can be considered a data transfer under Article 44 EU GDPR. The DSK’s decision, which appears to have been influenced by a judgment of the Oberlandesgericht Karlsruhe (in the proceedings 15 Verg 8/22), was that the Access Risk was not to be considered in isolation as a data transfer. However, the DSK noted that an Access Risk should be taken into account by a controller when assessing the reliability of their processor in accordance with Article 28(1) EU GDPR, as it may indicate that the processor is unable to give sufficient guarantees of the implementation of the technical and organisational measures required to protect the rights of data subjects and meet the requirements of the EU GDPR.
Where an Access Risk exists, determining the reliability of the EEA-based processor with a parent company in a third country requires an assessment of all the relevant criteria, such as the likelihood of requests from the parent company to transfer data to the third country or assurances on how conflicts between the laws of the member state and the third country are managed. According to the DSK, for a processor with an Access Risk to be used, there must be particularly stringent safeguards in place to ensure that the Access Risk is mitigated.
The DSK have referred their decision on this topic to the EDPB for further determination. The DSK’s decision can be read (in German) here.
Peter Dalton
T: +44 20 7809 2151 | London
Katie Hewson
T: +44 20 7809 2374 | London
Alison Llewellyn
Managing associate
T: +44 20 7809 2278 | London