Demystifying Schrems II for the cross-border transfer of clinical research data

Abstract The Court of Justice of the European Union (CJEU) held in its July 2020 Schrems II decision that, in order for entities in other countries to import personal data from the European Economic Area (EEA), the importer must be able to provide data protections 'essentially equivalent' to those the EEA offers under its General Data Protection Regulation. The CJEU expressed particular concern that U.S. national security intelligence-gathering laws prevent U.S.-based entities from providing such protections. This decision has sharply limited the sharing of clinical research data from the EEA to the United States. After describing the pertinent aspects of the Schrems II decision, this article evaluates U.S. national security intelligence-gathering frameworks, including Section 702 of the Foreign Intelligence Surveillance Act and Executive Order 12333. The article then leverages recent draft guidance from the European Data Protection Board to explain how entities may be able to adopt widely used contractual and technical measures, such as data pseudonymization, to provide 'essentially equivalent' protections in the clinical research context.

tools, including end-to-end encryption and Certificates of Confidentiality, may provide some additional benefit.

II. DATA EXPORT REQUIREMENTS AFTER SCHREMS II
Both the Charter of Fundamental Rights of the European Union ('the Charter') and the GDPR govern the protection of Europeans' personal data. The Charter provides the rights to respect for private and family life, the protection of personal data and the ability to obtain redress before an independent tribunal. 6 E.U. courts read and interpret E.U. regulations 'in the light of the Charter'. 7 The GDPR extends expansively to all 'personal data' that EEA-based entities process or that foreign entities process in the course of offering goods and services to or monitoring the behavior of persons in the EEA. 8 An EEA-based entity may not send personal data to a country located outside the EEA (referred to in the GDPR as a 'third country') unless the subject of the data receives protections essentially equivalent to those offered under E.U. law. While the European Commission may determine that a particular third country provides protections comparable to the GDPR, 9 a process referred to as granting an 'adequacy decision', the United States and most other countries have not received such a decision. The E.U. had previously granted a partial adequacy decision for the United States under the GDPR's predecessor in the form of the E.U.-U.S. Safe Harbor regime (the 'Safe Harbor'), which permitted for-profit U.S. companies to self-certify compliance with certain data privacy standards, but the CJEU found the Safe Harbor invalid in 2015, in Schrems I. 10 The U.S. subsequently received a partial adequacy decision applicable to for-profit companies that self-certified to the E.U.-U.S. Privacy Shield regime (the 'Privacy Shield'). 11 However, the CJEU invalidated that determination in Schrems II, finding that the Privacy Shield did not guarantee sufficient data protection. 12 In particular, the CJEU found that U.S. 
intelligence gathering laws did not adhere to the GDPR's 'principle of proportionality' in failing to 'lay down clear and precise rules governing the[ir] scope and application' and provided insufficient 'effective and enforceable rights and effective administrative and judicial redress', especially for non-U.S. persons. 13 In the aftermath of Schrems II, some commentators, such as Mr Schrems himself, have called for localizing data in the E.U., while others have critiqued localization as undermining both data privacy and economic development. 14 For these reasons, data exporters seeking to export data from the EEA to the United States or to other countries lacking an adequacy decision must put in place another means of providing protections. One such mechanism, and the most commonly used, is the European Commission's SCCs. The SCCs are European Commission-approved form contracts that permit data to flow between a data exporter located in the EEA and a data importer located outside of the EEA. The European Commission designed the clauses to afford adequate protection to personal data. E.U. authorities recently released proposed updates to the SCCs that are intended to rectify some long-recognized gaps in the coverage of the clauses. 15 The draft SCCs were finalized in June 2021, after this article was accepted for publication; the finalized clauses are substantively similar to the draft. 16 Historically, EEA-based data exporters have generally entered into the SCCs with data importers without performing further analysis of the legal regime and data protection principles of the third country to which data are transferred, though the SCCs contain other important requirements. 17 In the Schrems II decision, however, the CJEU made clear that data exporters cannot simply use the SCCs without additional diligence. 
Rather, the data exporter must evaluate 'the relevant aspects of the legal system of that third country', especially those related to 'any access by the public authorities of that third country to the personal data transferred'. 18 The GDPR permits E.U. Member States to restrict GDPR rights when 'such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard: national security.' 19 In a November 2020 decision referred to as La Quadrature du Net, the CJEU interpreted a similar provision of a different E.U. law, holding that threats of terrorism could justify laws ordering electronic communications providers indiscriminately to retain certain information about, but not the contents of, large quantities of communications; however, retention of such collections must be time limited, and the collections subject to effective judicial oversight. 20 The European Data Protection Board (EDPB), in draft guidance issued in November 2020 in response to the Schrems II decision, 21 noted that data exporters must analyze whether public authorities in third countries access data only through programs which are 'necessary and proportionate in a democratic society'. 22 In performing this analysis, the EDPB wrote that data exporters should consider 'objective factors' rather than 'subjective [factors] such as the likelihood of public authorities' access to your data in a manner not in line with E.U. standards'. 23 Despite this instruction to disregard the likelihood of access, however, the EDPB highlights that the 'objective factors' that may be taken into account include precedents, legislation and practice 'demonstrating that a third country authority will seek to access the data with or without the data importer's knowledge . . . 
[or] will be able to access the data through the data importer or through direct interception of the communication channel.' 24 Yet evaluating these factors calls for reliance on experience and history, rather than on the theoretical ability of a national security apparatus to obtain personal data. Furthermore, the implementing decision for the recently released draft SCCs similarly urges data exporters to 'take into account the specific circumstances of the transfer', including 'practical experience indicating the existence or absence of prior instances of requests for disclosure from public authorities' for that type of data or data recipient. 25 The analysis cannot be static, but rather must be updated from time to time to evaluate changes in the laws of the importing country that may affect the protections afforded to personal data. 26 Furthermore, the analysis must reach any contractor or sub-contractor that touches the data, 27 such as email or cloud computing providers. In essence, then, the data exporter must evaluate an entire foreign legal system for these variables, and this has understandably proved a significant challenge to many EEA institutions, resulting in great confusion and the slowing, or outright suspension, of transnational research collaborations. In response to the Schrems II court's finding that the U.S. legal system fails to afford equivalent protection to EEA residents, the U.S. government has argued that many E.U. Member States have even weaker intelligence-oversight protections than does the United States. In particular, the U.S. government has highlighted how European Union Member States afford individuals more limited judicial review and provide less stringent regulation of both domestic and international clandestine intelligence collection than does the United States. 
28 Indeed, a 2015 report commissioned by the European Parliament found that unclear definitions, the lack of independent oversight and a lack of policies regarding when to inform targets about intelligence collection

The certification must specify targeting, querying and data collection procedures that ensure only non-U.S. persons are targeted, minimize the capture of information about U.S. persons and ensure a 'significant purpose of the acquisition is to obtain foreign intelligence information'. 33 Once the FISC approves a 'certification', the NSA uses the approved methodology to generate lists of account identifiers, such as email addresses, and collects information flowing to or from one of those identifiers. 34 FISA Section 702 provides limited protections for non-U.S. residents. The FISC's annual certification focuses, as does the rest of Section 702, on whether U.S. intelligence agencies are targeting foreigners, as opposed to U.S. residents, not on whether the foreigners targeted actually possess relevant intelligence information. 35 Individuals targeted under Section 702 need not even be notified that the NSA is collecting data about them. 36 The CJEU critiqued Section 702's lack of both 'limitations on the power it confers to implement surveillance programmes' and 'guarantees for non-US persons potentially targeted by those programmes'. 37 Thus, researchers face two important questions before transferring clinical research data from the EEA to the United States: (i) from whom may the U.S. government collect data under FISA Section 702 and (ii) would the U.S. use FISA Section 702 to target clinical research data?
The definition of an 'electronic communications services provider' is potentially expansive. The definition primarily points to three other statutes, one of which is the definition of 'electronic communication service' under the Stored Communications Act (SCA). 38 An electronic communication service, under the SCA, means 'any service which provides to users thereof the ability to send or receive wire or electronic communications'. 39 Courts have found that the SCA definition covers a private employer that merely provides email services to its employees, 40 which would cover the vast majority of entities conducting clinical research.
However, despite the potentially expansive reach of FISA Section 702, it seems likely, based on the limited public sources available, that the primary targets of NSA collection are major technology firms. For example, a 2014 report from the Privacy and Civil Liberties Oversight Board (PCLOB) implied that internet service providers were the key targets of much of the NSA's data collection. 41 The Edward Snowden leaks revealed that Microsoft, Google, Yahoo, AOL and Apple have provided the vast majority of the data that the NSA collected via its 'downstream' collections, 42 formerly known as PRISM. 43 The NSA supplements data collection from these entities with 'upstream' collections from 'communications as they cross the backbone of the internet'. 44 Thus, while firms operating their own email servers could become targets of Section 702 'downstream' data collection under the statute's expansive definitions, such targeting appears infrequent, at least based on the information currently publicly available.
Moreover, the United States is unlikely to target clinical researchers for Section 702 collection. Section 702 authorizes collections when 'a significant purpose . . . is to obtain foreign intelligence information', 45 a term that the statutory framework and court decisions suggest should be interpreted expansively. 46 However, the NSA has, according to the FISC, indicated a narrower focus on queries 'reasonably likely to retrieve foreign intelligence information', 47 a view reflected in the limitations Presidential Policy Directive 28 (PPD-28) placed on the gathering of signals intelligence (i.e., intelligence derived from electronic signals and systems, such as electronic communications). 48 The 2019 National Intelligence Strategy indicated a similarly narrow focus on national security challenges. 49 Clinical research studies are highly unlikely to involve data of these types, thus supporting a conclusion by exporters that an essentially equivalent level of protection can be available in the narrow 'case' of clinical research, without significant risk of disclosure for national security purposes. 50 Additionally, Section 702 has at least some of the qualities that the CJEU has identified as essential to such national security legislation. For example, the FISC provides a level of independent oversight, the FISC's certifications are time-limited and the threats targeted are national-security related, all factors the CJEU indicated were favorable in La Quadrature du Net. 51 However, the CJEU found in Schrems II that 'surveillance programmes based on Section 702 of the FISA and on E.O. 12333 are not covered by requirements ensuring, subject to the principle of proportionality, a level of protection essentially equivalent to that guaranteed' by E.U. law. 52 Thus, entities that the U.S. government successfully targets for data collection under Section 702 could not provide the adequate protections that Schrems II requires.
The other U.S. national security provision highlighted by the Schrems II decision, E.O. 12333, organizes the intelligence community and seeks to facilitate robust communication between intelligence agencies. 53 The Order, which directs intelligence gathering that rests on the President's inherent constitutional authority, 54 seeks to strike 'the proper balance between the acquisition of essential information and protection of individual interests' and is permissive of more intrusive collection when directed against non-U.S. persons located abroad. 55 The NSA Director serves as the U.S. intelligence community's functional manager for collecting signals intelligence. 56 Collecting data with the compelled assistance of private entities requires independent statutory authority and, thus, would not fall under E.O. 12333. 57 Concerns about the volume and type of intelligence collection under E.O. 12333 led directly to President Obama's Presidential Policy Directive 28 (PPD-28), 58 which has remained 'in full force and effect' even after the end of the Obama presidency. 59 PPD-28 imposed 'appropriate safeguards for the personal information of all individuals, regardless of . . . nationality . . . or where that individual resides'. These safeguards include 'minimiz[ing] the dissemination and retention of personal information', limiting data access to those with proper training and a need to know, and providing oversight. 60 PPD-28 limits bulk signals intelligence (i.e., where the agency collects signals, such as phone or email communications, en masse, and then searches through the data for specific targets later 61) to six specific threats: espionage, terrorism, the proliferation of weapons of mass destruction, cybersecurity threats, threats to the military and allies, and transnational criminal threats. 62 The U.S. government has argued that PPD-28 is a substantial protection for the civil liberties of non-U.S. persons. 
63 However, the CJEU found PPD-28's protections inadequate, since the NSA can collect bulk intelligence without specifying a particular target. 64 Notably, however, as the U.S. government has argued, it is not clear that entities put their data at any greater risk of capture or review under programs authorized by E.O. 12333 merely by sending such data to the United States. First, no entity can be compelled to participate in bulk data collection under E.O. 12333, since such action requires separate statutory authority. 65 While Section 702 provides such separate authority, as discussed above, the government is unlikely to use Section 702 to target clinical research data. Second, while the Schrems II court expressed particular concern about the NSA's tapping of undersea cables that transfer communications to and from the United States, 66 the Snowden leaks and other reporting have shown that the NSA collects data worldwide, such as when data flow between Europe, the Middle East, India and East Asia or are stored in China. 67 A risk of data capture remains whenever and wherever data are sent electronically, thus challenging the CJEU's concern that the act of transferring data to the United States increases the risk that such data will be accessed by U.S. authorities. 68 Third, non-state actors, foreign intelligence services and other groups could theoretically seek to capture personal data at any point, suggesting that E.O. 12333 does not pose a unique data capture risk. 69 While not addressed by the CJEU in Schrems II, certain E.U. institutions have highlighted the CLOUD Act as an additional law that increases U.S. authorities' access to data following transfer to the United States. 70 The presence of the CLOUD Act does not increase the risk that the U.S. government will access data transmitted to the United States because the CLOUD Act focuses on access to data held overseas. 
This means that the CLOUD Act would permit access to data stored abroad but have no effect on data stored in the United States, and thus, the CLOUD Act does not provide a rationale for limiting transfers of personal data to the United States.

IV. STRATEGIES FACILITATING DATA TRANSFER
The EDPB guidance on data transfers post-Schrems II requires entities exporting personal data from the EEA to put in place supplemental measures to safeguard the data. Fortunately, pseudonymization, a strategy the EDPB highlighted as an example of a supplemental measure, is commonplace in the clinical research environment. The EDPB explains that Schrems II permits EEA-based data exporters to combine the SCCs with 'supplementary measures' that ensure an 'essentially equivalent' level of protection, 72 while noting that the Schrems II decision 'sets a high bar'. 73 The supplementary measures must 'address[] the specific deficiencies identified in [the data exporter's] assessment of the legal situation in the third country' 74 and 'preclude potentially infringing access by preventing the authorities from identifying the data subjects, inferring information about them, singling them out in another context, or associating the transferred data with other datasets.' 75 Data exporters must ensure that U.S. laws do not undermine their chosen safeguards. 76 Notably, any contractors or vendors that touch the data likely must adopt the SCCs. 77 Furthermore, the EDPB provides examples of potentially useful supplementary measures. For example, pseudonymization of data, where 'the personal data can no longer be attributed to a specific data subject', 'provides an effective supplementary measure' if (i) the key to that data is 'held exclusively by the data exporter and kept separately in a Member State' or a third country with an adequacy decision and (ii) public authorities cannot use other information to re-identify the data. 78 Pseudonymization is a technique routinely employed in clinical research to safeguard data, and typically, if data are collected by an investigator located in Europe and transferred in pseudonymized form to the United States, the key needed to link the pseudonymized data to the identity of the data subject will remain in the EEA. 
Data exporters may wish to consider additional security measures to protect the data key, such as limiting how often and with whom the key is shared and imposing appropriate contractual requirements providing that data recipients will not seek to obtain the key and that, if they do obtain it, they will immediately destroy it.
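In practice, the pseudonymization arrangement the EDPB describes, random codes replacing direct identifiers with the linking key retained exclusively in the EEA, can be sketched in a few lines of Python. The field names and record structure below are hypothetical, for illustration only:

```python
import secrets

def pseudonymize(records, id_field="subject_id"):
    """Replace each direct identifier with a random pseudonym.

    Returns (export_records, key_table). The key_table, which maps
    pseudonyms back to identifiers, stays with the EEA-based exporter
    and is never transferred; only export_records leave the EEA.
    """
    key_table = {}
    export_records = []
    for record in records:
        pseudonym = "P-" + secrets.token_hex(8)  # unguessable random code
        key_table[pseudonym] = record[id_field]
        # Copy the record, dropping the direct identifier.
        exported = {k: v for k, v in record.items() if k != id_field}
        exported["pseudonym"] = pseudonym
        export_records.append(exported)
    return export_records, key_table
```

Because the pseudonyms are random rather than derived from the identifier (e.g., not a hash of a name), a U.S. recipient, or a public authority compelling that recipient, cannot re-identify the data subjects without the key table held in the EEA.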
Of note for clinical researchers, the EDPB warned that 'in many situations, factors specific to physical, physiological, genetic, mental, economic, cultural or social identity of a natural person . . . may allow the identification of that person' even without 'plain identifiers'. 79 As the EDPB's predecessor noted, genetic data inherently uniquely identify an individual, 80 making them more difficult to pseudonymize effectively. However, genotypic data can be re-identified only when linked to a reference database of genetic information containing the names of individuals. Therefore, imposing other technical and administrative safeguards can reduce, but not eliminate, the risk of re-identification of such data; given that it is impossible to eliminate the risk of re-identification entirely, it seems unlikely that transfers may take place only when no such risk exists. Additionally, there are ways to make genotypic data less helpful to, and thus less likely to be seized by, intelligence authorities; for example, removing identifying but non-research-relevant phenotypic information may make genetic data more difficult to connect to a specific individual. This technique is also consistent with 'the GDPR principle of "data minimisation"'. 81 Examples of other supplementary measures include the following: First, end-to-end encryption can reduce the possibility that data will be re-identified, though the EDPB has expressed concern that the United States could use FISA Section 702 to obtain cryptographic keys. 82 Second, since the NSA requests information flowing to or from particular 'identifiers', such as an email address, 83 using separate mechanisms, such as separate servers or email addresses, to transmit personal data and research results would reduce the likelihood that the government will incidentally obtain personal data if it tries to obtain scientific information. 
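The data-minimisation step described above, stripping identifying but non-research-relevant phenotypic fields before export, can likewise be illustrated with a short Python sketch. The field names here are invented; in practice, the set of retained fields would be defined by the study protocol:

```python
# Fields the (hypothetical) research protocol actually requires.
RESEARCH_FIELDS = {"pseudonym", "genotype", "diagnosis", "age_band"}

def minimize(record, keep=RESEARCH_FIELDS):
    """Drop identifying but non-research-relevant fields (e.g., eye
    color, height, postcode) before export, reducing the linkability
    of the remaining genotypic data to a specific individual."""
    return {k: v for k, v in record.items() if k in keep}
```

Applied to each record before transfer, this leaves the genotype available for analysis while removing the phenotypic attributes that could help match it against an external reference database.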
Third, obtaining a Certificate of Confidentiality (CoC) may offer additional protection even though it may not prevent disclosure of data in response to a Section 702 request. CoCs prohibit disclosure of identifiable, sensitive information by researchers except in certain limited circumstances. One such circumstance permits certificate holders to release identifiable, sensitive information if federal, state or local law requires such release, unless the demand for release is part of a 'proceeding'. As a result, a researcher who possesses a CoC and is faced with an applicable subpoena from a court for such identifiable, sensitive information may refuse to comply with the subpoena because the court action is a 'proceeding'. 84 An NSA request under a FISC 'certification' may not be considered a 'proceeding' unless the request recipient has challenged the request. 85 Moreover, the NSA may request data from a vendor, such as an email provider, rather than from the holder of the certificate, thus avoiding application of the CoC. In addition, the request recipient would have a complete release from liability for complying with the NSA's request, reducing its incentive to challenge the NSA request. 86 Nevertheless, the CoC could demonstrate that the data recipient has taken steps to shield the data from disclosure generally.
Pseudonymization, when combined with the SCCs and other supplementary measures, such as encryption and the separation of communication channels, will be helpful in common research scenarios. For example, registry studies may use a data coordinating center in the United States but have clinical sites collecting information across the world, including in the E.U., especially in the case of rare disease research. While the coordinating center often needs detailed medical information to facilitate research, patient-identifying information, such as name, social insurance number, medical record number and other 'direct identifiers', is typically not required, thus making pseudonymization a practical safeguard. Another common scenario is multi-regional drug trials that use a common laboratory in the United States to analyze all samples. While, as we note above, there are some questions as to whether it is possible to pseudonymize specimens that inherently contain genetic material, European data exporters can work with U.S.-based laboratories to impose contractual provisions and create protocols that prohibit U.S.-based researchers from using data to re-identify individuals.
If the data exporter cannot demonstrate the presence of adequate safeguards, an alternative mechanism that permits the cross-border transfer of data is obtaining the explicit consent of the data subject. First, the data exporter must inform the data subject about 'the possible risks of such transfers for the data subject due to the absence of an adequacy decision and appropriate safeguards'. 87 Second, explicit consent requires 'an express statement of consent', such as a written statement. 88 However, 'consent for data transfers [to countries without adequate levels of protection] that occur periodically or on an on-going basis is inappropriate'; 89 thus, explicit consent could be used only for occasional or one-off transfers. Moreover, consent would not be an available basis for cross-border transfer in most secondary research, for which data subjects are generally not asked to provide consent. The GDPR allows data transfers to third countries in other limited situations, such as when export is 'necessary for the performance of a contract' with or in the interest of the data subject or when 'necessary for important reasons of public interest'. 90 However, EDPB guidance construes such situations narrowly: for example, the public interest exception is available only when the public interest is shared by the third country and the EEA and transfers to perform a contract must be both