
The Future of Face Recognition Technology and Ethico-Legal Issues

Face Recognition Technology

Part of the book series: Law, Governance and Technology Series (LGTS, volume 41)


Abstract

This chapter summarises the impact of face recognition technology (FRT) on privacy and confidentiality, and looks to the future of FRT in terms of its technical developments and the main ethico-legal issues that may arise. The uses of FRT by commercial enterprises and by governments diverge, with opposing interests arising between lawmakers and the public, and between the public and commercial enterprises. The future of FRT should, and to some extent shall, also be shaped by its ethical acceptability and legal regulation. Ultimately, an individual’s identity is intimately tied to their face, and direct technological recognition of facial contours will remain a continuing issue of social concern and sensitivity.


Notes

  1. Lyon (2008), p. 8.
  2. Mohammed Atta.
  3. See Stikeman (2001). Cited by Gates (2011).
  4. See Eggen (2005).
  5. See Foresight Future Identities (2013), p. 10.
  6. See Van Zoonen et al. (2012).
  7. Gates (2011), p. 64.
  8. ibid.
  9. See Visionics Corporation.
  10. See Brey (2004), pp. 97–109.
  11. London Borough of Newham (2015).
  12. BBC News (2013).
  13. Parliamentary Office for Science and Technology (2002).
  14. Home Office (2013).
  15. ICO (2014).
  16. Hence the need for substantial password strength to ensure data security.
  17. Cognitec (2013).
  18. Cognitec (n.d.) FaceVacs video scan.
  19. See Trepp (2019).
  20. Arthur (2010).
  21. For example: Key Lemon ‘Oasis Face’.
  22. ibid.
  23. I am using ‘transparency’ to describe the loss of confidentiality and privacy.
  24. This is conjectural, as quantifiable risk data may be available but is not germane here. The general trend is towards face recognition, although passwords can be archived or saved to reduce the risks of hacking or forgetfulness.

  25. Other biometrics, such as fingerprints or voice recognition, have also been tried, mainly to log on to protected devices (mobile phones and tablet computers) without using a password.
  26. Apple; see Dormehl (2014).
  27. Chapter 2.
  28. See Soper (2017).
  29. See Weiming et al. (2004).
  30. See Paul et al. (2013), pp. 12–13.
  31. ibid.
  32. ADDPRIV (2011), pp. 8 and 69.
  33. Gates (2011), pp. 100–101.
  34. ibid p. 136.
  35. Tagg (1988).
  36. ibid pp. 62–64.
  37. ibid p. 64.
  38. See President Obama (2014).
  39. Cited by Gates (2011), p. 168 op cit.
  40. ibid p. 169.
  41. Emotional Intelligence Academy.
  42. Gates (2011), op cit p. 180.
  43. GAO (2013).
  44. Gates (2011), op cit p. 181.
  45. ibid; Ekman (2006).
  46. UC San Diego (2007).
  47. DHS Science and Technology Directorate (n.d.).
  48. See Collier (2014).
  49. See Anonymous (2015), ISCP (2015).

  50. For example: Apple ‘Privacy Policy’.
  51. CIGI-Ipsos (2019).
  52. Chesterman (2011), p. 244.
  53. Internet World Stats (2019).
  54. ISCP (2015).
  55. ibid paras 90–91, p. 33.
  56. ibid para 80, p. 32; and the later Investigatory Powers Act 2016 §136.
  57. ibid paras 92–94, p. 35.
  58. Section 7.6 Data Protection and Face Recognition.
  59. ISCP op cit: para 94, p. 35.
  60. ibid at AAA, p. 105.
  61. See Sect. 9.3 Liberty and State Power.
  62. ISCP op cit paras ii & v, p. 1.
  63. ibid at XX, p. 103.
  64. MRS Reports (2015).
  65. See Chap. 6.
  66. Dixon (2010).
  67. Chesterman (2011), pp. 248–250.
  68. ibid p. 258.
  69. MRS (2015), p. 8.
  70. ibid p. 12.
  71. GDPR Article 17.
  72. ibid Article 15.
  73. ibid Article 17(2).
  74. ibid Article 17(1).
  75. Data Protection Act 2018 §§43–45.

  76. As per the Data Protection Act 1998; see ICO Data Sharing Checklist.
  77. See University of California, Irvine (2011).
  78. Berle (2011), pp. 43–44.
  79. European Convention on Human Rights (and UK Human Rights Act 1998).
  80. Kindt (2013), p. 192 §347.
  81. Sciacca v. Italy (2006) 43 EHRR 400. Cited by Kindt (2013) op cit.
  82. See Hughes (2009), p. 163.
  83. Reklos and Davourlis v. Greece 27 BHRC 420.
  84. Kindt (2013) op cit p. 193 §347.
  85. Reklos and Davourlis v. Greece para 35 op cit.
  86. ibid para 42.
  87. Lupker and others v. The Netherlands. Cited by Vermeulen (2014).
  88. ibid § 5.
  89. Reklos and Davourlis v. Greece, para 37 op cit.
  90. See Chap. 7.
  91. Reklos and Davourlis v. Greece, para 40 op cit.
  92. Kindt (2013), p. 194 §349 op cit.
  93. Samuelson (1999).
  94. ibid p. 10.
  95. Velu (1970), quoted by Loukaidēs L.; cited by Vermeulen (2014), p. 19 op cit.
  96. Which followed the Universal Declaration of Human Rights 1948 and the Convention for the Protection of Human Rights and Fundamental Freedoms 1950.
  97. Chapter 7.
  98. Laurent (2013).
  99. France: Penal Code, Article 226-1.
  100. See Logeais and Schroeder (1998).
  101. Solove (2011).
  102. ibid p. 50.
  103. Tagg (1988) op cit.
  104. ISCP op cit para 277.
  105. Vis-à-vis Sect. 11.4 above.
  106. United States Crimes and Criminal Procedure, 18 USC §2721, Chapter 123.
  107. DVLA.
  108. Adams (2011). From a UK and EU perspective, how similar activity plays out in the new data-regulation landscape remains to be seen.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Berle, I. (2020). The Future of Face Recognition Technology and Ethico-Legal Issues. In: Face Recognition Technology. Law, Governance and Technology Series, vol 41. Springer, Cham. https://doi.org/10.1007/978-3-030-36887-6_11


  • DOI: https://doi.org/10.1007/978-3-030-36887-6_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36886-9

  • Online ISBN: 978-3-030-36887-6

  • eBook Packages: Law and Criminology (R0)
