Abstract
In a data-driven world, people increasingly value the protection and security of their personal data. Many states have adopted, or are in the process of adopting, data protection laws based on principles similar to those of the European Union’s General Data Protection Regulation (GDPR), resulting in a global convergence of data protection rules. Yet doubts may arise about the GDPR’s ability to effectively safeguard data protection principles and rights in the age of big data and related algorithmic decision-making. Against this background, the choice of the risk-based approach as the key law enforcement mode under the GDPR might be interpreted as a result of the recognition of a law/technology lag in this instance, in the sense of an intrinsic difficulty for the law in dealing with the pervasiveness and automation involved in present-day collection, circulation and use of data. The risk-based approach leaves data protection decisions mainly to data controllers. A better explanation for the GDPR’s novel enforcement mode may be found in the “law as technology” proposition, meaning a law intended to liberate the use of digital technologies. In the context of its strategy for a data-driven economy, the EU had shown its concern with European lateness in embracing the data revolution. Accordingly, the EU data protection reform has been meant to reduce the administrative burden on data controllers and processors so as to further the competitiveness of the digital single market. As a consequence, data protection will ultimately depend on how data controllers meet their greater responsibilities.
Notes
- 1. By February 2020, the Commission had recognised 13 countries as providing an adequate level of protection for personal data (European Commission, 2020).
- 2. The Article 29 Data Protection Working Party defined the data controller as the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data, and the data processor as a natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller. See Opinion 1/2010 on the concepts of “controller” and “processor”, adopted on 16 February 2010, https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp169_en.pdf
- 3. According to the International Organization for Standardization (ISO), data is “a reinterpretable representation of information in a formalized manner, suitable for communication, interpretation or processing”. Data can either be created/authored by people or generated by machines/sensors, often as a “by-product”; examples include geospatial information, statistics, weather data and research data, https://www.iso.org/obp/ui/#iso:std:iso-iec:11179:-4:ed-2:en. Big data is a data set (or sets) with characteristics (e.g. volume, velocity, variety, variability, veracity) that, for a particular problem domain at a given point in time, cannot be efficiently processed using current/existing/established/traditional technologies and techniques in order to extract value, https://www.iso.org/files/live/sites/isoorg/files/developing_standards/docs/en/big_data_report-jtc1.pdf
- 4. See “Les assureurs demandent à leurs clients de se mettre à nu. Generali lance une assurance ‘comportementale’ dans la santé. Une première en France” and “Assurance: votre vie privée vaut bien une ristourne”, Le Monde, 7 September 2016. The reuse by Cambridge Analytica of millions of profiles provided by Facebook, made public in 2018, also illustrates this point (Gibney, 2018).
- 5. Note that the term “risk” appears 76 times in the text of the GDPR, whereas it appeared 8 times in the text of Directive 95/46/EC (recitals included).
- 6. A somewhat similar trend towards reliance on the self-regulation and responsibility of operators may be found in Directive (EU) 2019/790 on copyright and neighbouring rights in the digital single market, which has been regarded as “favouring private ordering over public policy” (Quintais, 2020).
- 7. Self-defence by data subjects is strengthened by the novel right to be forgotten (Article 17 GDPR), which empowers data subjects to obtain from the controller the erasure of personal data concerning them without undue delay where one of the grounds enumerated in that article applies. Here too, it falls to the data controller to evaluate whether the values or interests underlying the right to be forgotten are outweighed by other values, notably the right to freedom of expression and information, or by specific duties of the data controller, such as compliance with a legal obligation which requires processing, or the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller.
- 8. Both the EDPS and the Article 29 DPWP demanded that data subjects be given access to their profiles, as well as to the logic used in the algorithms that determine those profiles (Article 29 DPWP, 2013b). Everyday examples where the logic of decision-making should be disclosed include personalised car insurance schemes (using car sensor data to judge driving habits), credit scoring services, and pricing and marketing systems that determine how much discount an individual will receive or what media content to recommend. Transparency could also include informing people about re-identification risks stemming from data collected about them (Narayanan et al., 2016).
- 9. In the USA, a Federal Trade Commission initiative named “Reclaim Your Name” is meant to empower consumers to find out how brokers collect and use their data; give them access to the information data brokers have amassed about them; allow them to opt out if they learn a data broker is selling their information for marketing purposes; and provide them the opportunity to correct errors in information used for substantive decisions such as credit, insurance, employment and other benefits, https://www.ftc.gov/sites/default/files/documents/public_statements/reclaim-your-name/130626computersfreedom.pdf
- 10. Recital 71 states the data subject’s right to obtain an explanation of how the decision was reached.
- 11. The World Economic Forum emphasised the importance of ensuring understanding beyond transparency, in the following terms: “People need to understand how data is being collected, whether with their consent or without – through observations and tracking mechanisms, given the low cost of gathering and analysing data”. It added: “From Passive consent to engaged Individuals: Too often the organizations collecting and using data see their role as a yes-no/on-off degree of consent. New ways are needed to allow individuals to exercise more choice and control over this data that affects their lives”; and “From Black to White to Shades of Gray: the context by which data is collected and used matters significantly”. World Economic Forum, Unlocking the Value of Personal Data: From Collection to Usage, http://www3.weforum.org/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf
References
Andrejevic, M., & Gates, K. (2014). Editorial. Big data surveillance: Introduction. Surveillance & Society, 12(2), 185–196.
Beck, U. (2009). Critical theory of world risk society: A cosmopolitan vision. Constellations, 16(1), 3–22.
Black, J. (2010). Risk-based regulation: Choices, practices and lessons being learnt. Risk and regulatory policy: Improving the governance of risk. OECD Reviews of Regulatory Reform, OECD Publishing. https://doi.org/10.1787/9789264082939-11-en. Accessed 24 August 2020.
Black, J., & Baldwin, R. (2010). Really responsive risk-based regulation. Law and Policy, 32(2), 181–213.
Block, F., & Somers, M. R. (2016). The power of market fundamentalism, Karl Polanyi’s critique. Harvard University Press.
Bobbio, N. (1996). The age of rights. Polity.
Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679.
Cardon, D. (2015). À quoi rêvent les algorithmes: Nos vies à l’heure du big data. Seuil.
Cattan, J. (2012). L’innovation par le droit des communications électroniques. Jurisdoctoria 8, Les Nouvelles Technologies et le Droit. https://old.jurisdoctoria.net/pdf/numero8/aut8_CATTAN.pdf. Accessed 11 September 2020.
CNIL. (2015). Privacy Impact Assessment (PIA), Tools (Templates and Knowledge Bases), Commission Nationale Informatique et Libertés. https://www.cnil.fr/sites/default/files/typo/document/CNIL-PIA-2-Tools.pdf. Accessed 10 September 2020.
Cohen, J. E. (2016). The regulatory state in the information age. Theoretical Inquiries in Law, 17(2). https://ssrn.com/abstract=2714072
Colonna, L. (2014). Data mining and its paradoxical relationship to the purpose limitation principle. In S. Gutwirth, R. Leenes, & P. De Hert (Eds.), Reloading data protection: Multidisciplinary insights and contemporary challenges (pp. 299–321). Springer.
Couldry, N., & Powell, A. (2014). Big data from the bottom up. Big Data & Society, 1. http://eprints.lse.ac.uk/57941/1/Couldry_Powell_Big-data-from-the-bottom-up_2014.pdf. Accessed 2 December 2019.
Davies, S. (2016). The data protection regulation: A triumph of pragmatism over principle? European Data Protection Law Review, 2(3), 290–296.
De Hert, P., & Papakonstantinou, V. (2016). The new general data protection regulation: Still a sound system for the protection of individuals? Computer Law & Security Review, 32(2), 179–194.
Determann, L. (2016). Comment, adequacy of data protection in the USA: Myths and facts. International Data Privacy Law, 6(3), 244–250.
Digital Europe. (2013, August 28). Comments on the risk-based approach. http://teknologiateollisuus.fi/sites/default/files/file_attachments/elinkeinopolitiikka_digitalisaatio_tietosuoja_digitaleurope_risk_based_approach.pdf. Accessed 6 October 2020.
Duckett, D., et al. (2015). Can policy be risk-based? The cultural theory of risk and the case of livestock disease containment. Sociologia Ruralis, 55(4), 379–399.
Ferretti, F. (2012). A European perspective on data processing consent through the re-conceptualization of European data protection’s looking glass after the Lisbon treaty: Taking rights seriously. European Review of Private Law, 20(2), 473–506.
Fuchs, C. (2008). Internet and society: Social theory in the information age. Routledge.
Gellert, R. (2015). Data protection: A risk regulation? Between the risk management of everything and the precautionary alternative. International Data Privacy Law, 5(1), 3–19.
Gibney, E. (2018). The scant science behind Cambridge Analytica’s controversial marketing techniques. Nature, 29 March.
Gonçalves, M. E. (2020). The risk-based approach under the new EU data protection regulation: A critical perspective. Journal of Risk Research, 23(2), 139–152.
Graber, C. B. (2017). Bottom-up constitutionalism: The case of net neutrality. Transnational Legal Theory, 7(4), 524–552.
Hasty, R., Nagel, T. W., & Subjally, M. (2013). Data protection law in the USA, A4ID advocates for international development. https://www.neighborhoodindicators.org/sites/default/files/course-materials/A4ID_DataProtectionLaw%20.pdf. Accessed 15 April 2021.
Horten, M. (2016). The closing of the net. Polity.
ICO. (2015). Conducting privacy impact assessments code of practice. Information Commissioner’s Office, February. https://ico.org.uk/media/about-the-ico/consultations/2052/draft-conducting-privacy-impact-assessments-code-of-practice.pdf. Accessed 10 December 2019.
ITU. (2014). GSR discussion paper big data: Opportunity or threat?. International Telecommunications Union. https://www.itu.int/en/ITU-D/Conferences/GSR/Documents/GSR2014/Discussion%20papers%20and%20presentations%20-%20GSR14/Session3_GSR14-DiscussionPaper-BigData.pdf. Accessed 2 December 2019.
Jasanoff, S. (2016). The ethics of invention: Technology and the human future. W. W. Norton & Company.
Koops, B. J. (2014). The trouble with European data protection law. International Data Privacy Law. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2505692
Koops, B.-J., & Leenes, R. (2014). Privacy regulation cannot be hardcoded. A critical comment on the ‘privacy by design’ provision in data-protection law. International Review of Law, Computers & Technology, 28(2), 159–171.
Kuner, C., Cate, F. H., Millard, C., Svantesson, D. J. B., & Lynskey, O. (2015). Editorial: The data protection credibility crisis. International Data Privacy Law, 5(3), 161–162.
Le Métayer, D., & Monteleone, S. (2009). Automated consent through privacy agents: Legal requirements and technical architecture. Computer Law & Security Review, 25(2), 136–144.
Lynskey, O. (2015). The foundations of EU data protection law. Oxford University Press.
Mantelero, A., & Vaciago, G. (2013). The ‘dark side’ of big data: Private and public interaction in social surveillance. How data collections by private entities affect governmental social control and how the EU reform on data protection responds. Computer Law Review International, 14(6), 161–169.
Mantelero, A., & Vaciago, G. (2015). Data protection in a big data society. Digital Investigation, 15. https://doi.org/10.1016/j.diin.2015.09.006
Marchant, G. E. (2011). The growing gap between emerging technologies and the law. In G. E. Marchant, B. R. Allenby, & J. R. Herkert (Eds.), The growing gap between emerging technologies and legal-ethical oversight (pp. 19–33). Springer.
Maxwell, W., & Bourreau, M. (2014). Technology neutrality in internet, telecoms and data protection regulation. Computer and Telecommunications Law Review. https://doi.org/10.2139/ssrn.2529680
McArdle, E. (2016, May 10). The new age of surveillance. Harvard Law Bulletin, 38. http://today.law.harvard.edu/feature/new-age-surveillance/. Accessed 21 January 2020.
Mittelstadt, B., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society. https://doi.org/10.1177/2053951716679679
Morozov, E. (2015). Le mirage numérique: Pour une politique des big data. Les Prairies Ordinaires.
Moses, L. B. (2007, April 11). Recurring dilemmas: The law’s race to keep up with technological change. UNSW Law Research Paper No. 2007–21. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=979861. Accessed 21 December 2020.
Narayanan, A., Huey, J., & Felten, E. W. (2016). A precautionary approach to big data privacy. In S. Gutwirth, R. Leenes, & P. De Hert (Eds.), Data protection on the move (pp. 357–385). Springer.
Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
Pasquale, F. (2016). The black box society: The secret algorithms that control money and information. Harvard University Press.
Quelle, C. (2015). The data protection impact assessment, or: How the General Data Protection Regulation may still come to foster ethically responsible data processing. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2695398. Accessed 21 January 2020.
Quintais, J. (2020). The new copyright in the digital single market directive: A critical look. European Intellectual Property Review, 2020(1). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3424770. Accessed 11 January 2020.
Reding, V. (2011, June 20). Assuring data protection in the age of the internet. SPEECH/11/452 at BBA (British Bankers’ Association) Data Protection and Privacy Conference. europa.eu/rapid/press-release_SPEECH-11-452_en.pdf.
Reidenberg, J. (2014). The data surveillance state in Europe and the United States. Wake Forest Law Review, 49(2), 583–608.
Rosen, J. (2012). The deciders: The future of privacy and free speech in the age of Facebook and Google. Fordham Law Review, 80(4), 1525–1538.
Sarr, M. (2012). Droit souple et commerce électronique. Jurisdoctoria. Les nouvelles technologies et le droit, 8. https://old.jurisdoctoria.net/pdf/numero8/aut8_SARR.pdf. Accessed 5 November 2020.
Tene, O. (2011). Privacy: The new generations. International Data Privacy Law, 1(1), 15–27.
Tene, O., & Polonetsky, J. (2012). Privacy in the age of Big Data: A time for big decisions. Stanford Law Review. https://www.stanfordlawreview.org/online/privacy-paradox-privacy-and-big-data/. Accessed 25 January 2021.
Tutt, A. (2017). An FDA for algorithms. Administrative Law Review, 69(1), 83–123.
Wacks, R. (1980). The protection of privacy. Sweet & Maxwell.
Waterman, K. K., & Bruening, P. J. (2014). Big data analytics: Risks and responsibilities. International Data Privacy Law, 4(2), 89–95.
Westin, A. F. (1967). Privacy and freedom. Atheneum.
Wiener, J. B. (2004). The regulation of technology, and the technology of regulation. Technology in Society, 26. https://scholarship.law.duke.edu/cgi/viewcontent.cgi?article=1960&context=faculty_scholarship. Accessed 20 December 2020.
Wiener, J. B., Rogers, M. D., Hammitt, J. K., & Sand, P. H. (Eds.). (2011). The reality of precaution: Comparing risk regulation in the United States and Europe. RFF Press.
Wilkinson, M. A., & Lokdam, H. (2018). Law and political economy. LSE Law, Society and Economy Working Papers 7/2018.
Wright, D., & De Hert, P. (2012). Introduction to privacy impact assessment. In D. Wright & P. De Hert (Eds.), Privacy impact assessment (pp. 3–32). Springer.
Zanfir, G. (2014). Forgetting about consent: Why the focus should be on ‘suitable safeguards’. In S. Gutwirth, R. Leenes, & P. De Hert (Eds.), Reloading data protection: Multidisciplinary insights and contemporary challenges (pp. 237–257). Springer.
Official Documents
Article 29 DPWP. (2011). Opinion 9/2011 on the revised industry proposal for a privacy and data protection impact assessment framework for RFID applications, Article 29 Data Protection Working Party, 11 February (wp 180). https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2011/wp180_en.pdf. Accessed 6 January 2020.
Article 29 DPWP. (2012). Opinion 01/2012 on the data protection reform proposals, Article 29 Data Protection Working Party, 23 March (wp 191). https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp191_en.pdf. Accessed 6 January 2020.
Article 29 DPWP. (2013a). Opinion 03/2013 on purpose limitation, Article 29 Data Protection Working Party, 2 April (wp 203). https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf. Accessed 20 May 2020.
Article 29 DPWP. (2013b). Advice paper on essential elements of a definition and a provision on profiling within the EU general data protection regulation, Article 29 Data Protection Working Party, 13 May. https://ec.europa.eu/justice/article-29/documentation/other-document/files/2013/20130513_advice-paper-on-profiling_en.pdf. Accessed 9 January 2020.
Article 29 DPWP. (2014a). Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, Article 29 Data Protection Working Party, 9 April (wp 217). https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf. Accessed 10 January 2020.
Article 29 DPWP. (2014b). Statement on the role of a risk-based approach in data protection legal frameworks, Article 29 Data Protection Working Party, 30 May (wp 218). https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf. Accessed 9 February 2020.
EDPS. (2013). Additional EDPS comments on the data protection reform package. European Data Protection Supervisor, 15 March. https://edps.europa.eu/sites/edp/files/publication/13-03-15_comments_dp_package_en.pdf. Accessed 9 February 2020.
EDPS. (2015). Opinion 7/2015, Meeting the challenges of Big data. A call for transparency, user control, data protection by design and accountability. European Data Protection Supervisor, 19 November. https://edps.europa.eu/sites/edp/files/publication/15-11-19_big_data_en.pdf. Accessed 9 February 2020.
EUFRA. (2010). Data protection in the European Union: The role of national data protection authorities. European Union Agency for Fundamental Rights. Luxembourg: Publications Office of the European Union.
European Commission. (2010). A comprehensive approach on personal data protection in the European Union, Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions, COM (2010) 609 final, Brussels, 4 November. https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2010:0609:FIN:EN:PDF. Accessed 12 December 2019.
European Commission. (2014). Towards a thriving data-driven economy, Communication from the Commission to the European Parliament, the Council, the European economic and social committee and the committee of the regions, Brussels, COM (2014) 442 final, 2 July. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52014DC0442&from=EN. Accessed 12 December 2019.
European Commission. (2015). Press release, agreement on Commission’s EU data protection reform will boost Digital Single Market, Brussels, 15 December. http://europa.eu/rapid/press-release_IP-15-6321_en.htm. Accessed 14 February 2020.
European Commission. (2016, January). The EU data protection reform and big data, Factsheet.
European Commission. (2018). Artificial intelligence for Europe, COM (2018) 237 final. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018DC0237&from=EN. Accessed 17 February 2020.
European Commission. (2020). A European strategy for data, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, COM (2020) 66 final, Brussels, 19 February. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52020DC0066&from=EN. Accessed 25 March 2020.
European Union. (2016). Statement of the Council’s reasons: Position (EU) No. 6/2016 of the Council at first reading with a view to the adoption of a Regulation of the European Parliament and of the Council on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC (General Data Protection Regulation) 2016/C 159/02. https://publications.europa.eu/en/publication-detail/-/publication/8263b3d6-10f6-11e6-ba9a-01aa75ed71a1. Accessed 20 December 2019.
© 2023 Springer Nature Switzerland AG
Cite this chapter
Gonçalves, M.E. (2023). Law-Technology Lag or Law as Technology in the Big Data Age. In: Jerónimo, H.M. (eds) Portuguese Philosophy of Technology. Philosophy of Engineering and Technology, vol 43. Springer, Cham. https://doi.org/10.1007/978-3-031-14630-5_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-14629-9
Online ISBN: 978-3-031-14630-5
eBook Packages: Religion and Philosophy (R0)