diff --git a/www/report/report.css b/www/report/report.css
index d78c3e1..885f158 100644
--- a/www/report/report.css
+++ b/www/report/report.css
@@ -588,8 +588,13 @@ h1.Title{
   color: var(--color-bg-secondary);
   padding: 1rem;
   margin: 0 -1rem;
+  font-weight: bold;
 }
 .keypoints > p > strong{
   margin-left:2.5rem;
   text-transform: uppercase;
+}
+
+a.maplink{
+  cursor: pointer;
 }
\ No newline at end of file
diff --git a/www/report/report.html b/www/report/report.html
index f8c54d4..89898e0 100644
--- a/www/report/report.html
+++ b/www/report/report.html
@@ -51,8 +51,44 @@
       // start observing
       intersectionObserver.observe(caseEl);
     }
+
+    const linkEls = document.getElementsByClassName('maplink');
+    for (let linkEl of linkEls) {
+      linkEl.addEventListener('click', (ev) => {
+        const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+
+        if (toSelect === null) {
+          frameEl.contentWindow.mapGraph.deselectNode();
+          frameEl.contentWindow.mapGraph.resetZoom();
+        } else {
+          const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+          frameEl.contentWindow.mapGraph.selectNode(node);
+        }
+      });
+      linkEl.addEventListener('mouseover', (ev) => {
+        const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+        if (toSelect) {
+          const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+          frameEl.contentWindow.mapGraph.hoverNode(false, node);
+        }
+      });
+      linkEl.addEventListener('mouseout', (ev) => {
+        const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+        if (toSelect) {
+          const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+          frameEl.contentWindow.mapGraph.endHoverNode(node);
+        }
+      });
+    }
+  });
-  })
 });
 // frame.contentWindow;
  • The Dragonfly project (Hungary)
  • Recommendations
  • REFERENCES
  • ANNEX: CASES
  • BPI: Public Investment Bank (France)
  • BPOL
  • DITSS: Dutch Institute for Technology, Safety & Security
  • DPA
  • INPOL: Criminal Case Management System (Germany)
  • KAK
  • TAJ: Criminal case history database (France)
  • TASZ

    CHAPTER 9: Facial Recognition in Südkreuz Berlin, Hamburg G20 and Mannheim (Germany)

    The intrusiveness of the system, and its impact on fundamental rights, is best exemplified by its deployment in Xinjiang province. The provincial capital, Urumqi, is dotted with checkpoints and identification stations. Citizens must submit to facial recognition ID checks in supermarkets, hotels, train stations, highway stations and several other public spaces (Chin and Bürge 2017). The information collected through the cameras is centralised and matched against other biometric data, such as DNA samples and voice samples. This allows the government to attribute trustworthiness scores (trustworthy, average, untrustworthy) and thus generate a list of individuals who may become candidates for detention (Wang 2018).

    European countries’ deployments are far from the Chinese experience, but the companies involved in China’s pervasive digital surveillance network (such as Tencent, Dahua Technology, Hikvision, SenseTime, ByteDance and Huawei) are exporting their know-how to Europe in the form of “safe city” packages. Huawei is one of the most active in this regard. On the European continent, the city of Belgrade, for example, has deployed an extensive network of more than 1,000 cameras which collect up to 10 body and facial attributes (Stojkovski 2019). The cameras, deployed on poles, at major traffic crossings and in a large number of public spaces, allow the Belgrade police to monitor large parts of the city centre, collect biometric information and communicate it directly to police officers deployed in the field. Belgrade has the most advanced deployment of Huawei’s surveillance technologies on the European continent, but similar projects are being implemented by other corporations – including the European companies Thales, Engie Ineo and Idemia – in other European cities, and many “safe city” deployments are planned in EU countries such as France, Italy, Spain, Malta, and Germany (Hillman and McCalpin 2019). Furthermore, contrary to the idea that China is the sole exporter of Remote Biometric Identification technologies, EU companies have substantially developed their exports in this domain over recent years (Wagner 2021).

    The turning point in public debates on facial recognition in Europe was probably the Clearview AI controversy of 2019-2020. Clearview AI, a company founded by Hoan Ton-That and Richard Schwartz in the United States, maintained a relatively low profile until a New York Times article revealed in late 2019 that it was selling facial recognition technology to law enforcement. In February 2020, it was reported that Clearview AI’s client list had been stolen, and a few days later the details of the list were leaked (Mac, Haskins, and McDonald 2020). To the surprise of many in Europe, in addition to US government agencies and corporations, it appeared that the Metropolitan Police Service (London, UK), as well as law enforcement agencies from Belgium, Denmark, Finland, France, Ireland, Italy, Latvia, Lithuania, Malta, the Netherlands, Norway, Portugal, Serbia, Slovenia, Spain, Sweden, and Switzerland, were on the client list. The controversy grew as it emerged that Clearview AI had (semi-illegally) harvested a large number of images from social media platforms such as Facebook, YouTube and Twitter in order to constitute the datasets against which clients were invited to carry out searches (Mac, Haskins, and McDonald 2020).

    The news of the hacking strengthened an already strong push-back against the development of facial recognition technology by companies such as Clearview AI, and against its use by government agencies. In 2018, Massachusetts Institute of Technology (MIT) scholar and Algorithmic Justice League founder Joy Buolamwini, together with Timnit Gebru, had published the report Gender Shades (Buolamwini and Gebru 2018), in which they assessed racial bias in the face recognition datasets and algorithms used by companies such as IBM and Microsoft. Buolamwini and Gebru found that the algorithms generally performed worse on darker-skinned faces, and in particular on darker-skinned females, with error rates up to 34% higher than for lighter-skinned males (Najibi 2020). IBM and Microsoft responded by amending their systems, and a re-audit showed less bias. Not all companies responded equally: Amazon’s Rekognition system, which was included in the second study, continued to misclassify darker-skinned females at a 31% error rate. The same year, the ACLU conducted another key study on Amazon’s Rekognition, matching pictures of members of Congress against a dataset of law enforcement mugshots: 28 members of Congress, largely people of colour, were incorrectly matched (Snow 2018). Activists engaged lawmakers. In 2019, the proposed Algorithmic Accountability Act sought to empower the Federal Trade Commission to regulate private companies’ uses of facial recognition. In 2020, several companies, including IBM, Microsoft, and Amazon, announced moratoria on the sale of their facial recognition technologies to law enforcement. Several US cities, including Boston, Cambridge (Massachusetts), San Francisco, Berkeley, and Portland (Oregon), have also banned their police forces from using the technology.

    People tracking and counting

    This is perhaps the form of person tracking that stores the least information about an individual. An object detection algorithm estimates the presence and position of individuals in a camera image. These positions are stored or counted and used for further metrics. It is used to count passers-by in city centres, and for a one-and-a-half-meter social distancing monitor in Amsterdam2. See also the case study in this document on the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), which goes into more detail about the use of individuals’ recorded trajectories to label anomalous behaviour.
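
    The pattern described above can be sketched as follows. This is a hypothetical illustration, not taken from any deployed system: the detection format (`{x, y, confidence}`), the function names and the zone rule are all assumptions. It shows how detector output can be reduced to anonymous positions before anything is stored or counted.

```javascript
// Illustrative sketch: a people counter that keeps only anonymous
// positions, never images or identities. The detection format and
// function names are hypothetical.

// Reduce raw detector output to bare positions.
function toPositions(detections, minConfidence = 0.5) {
  return detections
    .filter(d => d.confidence >= minConfidence)
    .map(d => ({ x: d.x, y: d.y })); // drop everything else
}

// Count how many stored positions fall inside a zone, e.g. a
// city-centre gate area or a social-distancing region of interest.
function countInZone(positions, zone) {
  return positions.filter(p =>
    p.x >= zone.x1 && p.x <= zone.x2 &&
    p.y >= zone.y1 && p.y <= zone.y2
  ).length;
}

const detections = [
  { x: 10, y: 20, confidence: 0.9 },
  { x: 55, y: 60, confidence: 0.8 },
  { x: 90, y: 90, confidence: 0.3 }, // below threshold, ignored
];
const positions = toPositions(detections);
console.log(countInZone(positions, { x1: 0, y1: 0, x2: 50, y2: 50 })); // 1
```

    Because only coordinates survive the reduction step, the stored data cannot by itself re-identify a passer-by, which is the privacy rationale the paragraph above describes.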

    Emotion recognition

    Audio recognition

    From a technological perspective, neural networks process audio much as they process video: rather than an image, a spectrogram is used as input to the network. However, under the GDPR, recording conversations is illegal in the European Union without the informed consent of the participants. In order to adhere to these regulations, on some occasions only particular frequencies are recorded and processed. For example, in the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), only two frequencies are used to classify audio, making conversations indiscernible while still allowing shouting or the breaking of glass to be discerned3.
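
    A minimal sketch of such frequency-selective processing, assuming a standard Goertzel filter (which measures the energy of a single frequency in a block of samples). The two frequencies, the thresholds and the class labels below are illustrative placeholders, not the parameters of the Rotterdam system.

```javascript
// Goertzel algorithm: energy of one target frequency in a sample block.
function goertzelPower(samples, targetHz, sampleRate) {
  const w = 2 * Math.PI * targetHz / sampleRate;
  const coeff = 2 * Math.cos(w);
  let s0 = 0, s1 = 0, s2 = 0;
  for (const x of samples) {
    s0 = x + coeff * s1 - s2;
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2;
}

// Classify a block using only two frequency bands, so speech content
// is never reconstructed or stored. Frequencies/thresholds are made up.
function classifyBlock(samples, sampleRate) {
  const low = goertzelPower(samples, 1000, sampleRate);  // e.g. shouting range
  const high = goertzelPower(samples, 6000, sampleRate); // e.g. breaking glass
  if (high > 1e4) return 'glass';
  if (low > 1e4) return 'shout';
  return 'quiet';
}

// A pure 1 kHz tone only registers energy in the low band.
const rate = 16000;
const tone = Array.from({ length: 1024 }, (_, i) =>
  Math.sin(2 * Math.PI * 1000 * i / rate));
console.log(classifyBlock(tone, rate)); // 'shout'
```

    The design point mirrors the paragraph above: the classifier sees only per-band energies, so a conversation picked up by the microphone is indiscernible from the retained data.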

    Availability

    Facial recognition algorithms can be developed in-house, taken from an open-source repository, or purchased (IPVM Team 2021b, 14). Popular open-source facial recognition implementations include OpenCV, Face_pytorch, OpenFace and Insightface. Many of these software libraries are developed at universities or implement algorithms and neural network architectures presented in academic papers. They are free and allow for a great deal of customisation, but require substantial programming skills to integrate into a surveillance system. Moreover, when using such software, the algorithms run on one’s own hardware, which gives the developer more control but also requires more maintenance.
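
    Most of these libraries ultimately reduce a face image to an embedding vector and then compare vectors for similarity. The sketch below illustrates only that comparison step; the three-dimensional vectors and the 0.6 threshold are made-up placeholders (real systems typically use 128- or 512-dimensional embeddings and carefully tuned thresholds).

```javascript
// Cosine similarity between two face embeddings (hypothetical values).
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// 1:1 verification: do two embeddings belong to the same person?
function isMatch(emb1, emb2, threshold = 0.6) {
  return cosineSimilarity(emb1, emb2) >= threshold;
}

const probe = [0.9, 0.1, 0.4];
const enrolled = [0.85, 0.15, 0.38]; // similar direction: same person
const stranger = [-0.2, 0.9, -0.1];  // dissimilar direction
console.log(isMatch(probe, enrolled)); // true
console.log(isMatch(probe, stranger)); // false
```

    Whether the embedding model is open-source or proprietary, this thresholded comparison is the decision point where false positives and false negatives arise.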

    Proprietary facial recognition. There are three possible routes to the use of proprietary systems. There are “turnkey” systems sold by manufacturers such as Hikvision, Dahua, AnyVision or Briefcam, which integrate the software and hardware and as such can be directly deployed by the client. Algorithm developers such as Amazon AWS Rekognition (USA), NEC (Japan), NtechLab (Russia) and Paravision (USA) allow clients to implement their algorithms and customise them to their needs. Finally, there are “cloud” API systems, a subset of the former category, where the algorithm is hosted in a data centre and accessed remotely (IPVM Team 2021b, 16). The latter type of technology bears important legal ramifications, as the data may travel outside national or European jurisdictions. It should be noted that many proprietary products are based on algorithms and network architectures similar to their open-source counterparts (OpenCV, 2021). Contrary to open-source software, it is generally unclear which datasets of images have been used to train the proprietary algorithms.

    A broad range of deployments, which we consider in this first section, is not aimed at surveillance, but at authentication (see section 2.3 in this report), namely making sure that the person in front of the security camera is who they say they are.

    Live authentication

    In the two pilot projects using FRT powered by Cisco systems in high schools in Nice (see section 8.1) and Marseille (France)7, as in the case of the Anderstorp Upper Secondary School in Skellefteå (Sweden)8, the aim was to identify which students could have access to the premises. School-wide biometric databases were generated and populated with students’ portraits. Gates were fitted with cameras connected to facial recognition technology and allowed access only to recognised students. Another documented use is the Home Quarantine App (Hungary), in which telephone cameras are used by authorities to verify the identity of the persons logged into the app (see also section 10.1).

    In these deployments, people must submit themselves to the camera in order to be identified and gain access. While these identification techniques pose important threats to the privacy of the small groups of users concerned (in both high school cases, DPAs banned the use of FRTs), and run the risk of false positives (unauthorised people recognised as authorised) or false negatives (authorised people not recognised as such), the risk of biometric mass surveillance strictly speaking is low to non-existent because of the nature of the acquisition of images and other sensor-based data.
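
    The false positive / false negative trade-off mentioned above is conventionally quantified as a false acceptance rate (FAR) and a false rejection rate (FRR) at a given decision threshold. A minimal sketch, using made-up comparison scores rather than data from any of the systems discussed:

```javascript
// Given comparison scores labelled genuine (authorised person) or
// impostor (unauthorised person), count the errors a threshold makes.
// All scores below are invented illustration data.
function errorRates(genuineScores, impostorScores, threshold) {
  // False negative: an authorised person scores below the threshold.
  const falseNegatives = genuineScores.filter(s => s < threshold).length;
  // False positive: an unauthorised person scores at or above it.
  const falsePositives = impostorScores.filter(s => s >= threshold).length;
  return {
    frr: falseNegatives / genuineScores.length,  // false rejection rate
    far: falsePositives / impostorScores.length, // false acceptance rate
  };
}

const genuine = [0.9, 0.8, 0.55, 0.95];  // same-person comparisons
const impostor = [0.2, 0.65, 0.3, 0.1];  // different-person comparisons
console.log(errorRates(genuine, impostor, 0.6));
// { frr: 0.25, far: 0.25 }
```

    Raising the threshold trades false acceptances for false rejections, which is why the same algorithm can look either "secure" or "unreliable" depending on how it is tuned.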

    However, other forms of live authentication tie in with surveillance practices, in particular various forms of blacklisting. With blacklisting, the face of every passer-by is compared to a list of faces of individuals who have been denied access to the premises. In such an instance, people do not have to be identified by name: an image of their face suffices. This has been used in public places, for example in the Korte Putstraat in the Dutch city of 's-Hertogenbosch: during the carnival festivities of 2019, two people were denied access to the street after being singled out by the system (Gotink, 2019). It is unclear how many false positives were generated during this period. Other cases of blacklisting can be found, for example, in access control at various football stadiums in Europe (see also section 3.3). In many cases of blacklisting, individuals do not enrol voluntarily.
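
    Blacklist screening, as described above, can be sketched as follows. Everything here is a hypothetical illustration (the embeddings, the distance threshold and the list identifiers are invented); it shows only the logic of comparing each passer-by against a list without otherwise identifying them.

```javascript
// Euclidean distance between two face embeddings.
function euclidean(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += (a[i] - b[i]) ** 2;
  return Math.sqrt(sum);
}

// Return the blacklist entries that match, if any. The passer-by is
// never identified by name: only similarity to listed faces counts.
function screen(passerbyEmbedding, blacklist, maxDistance = 0.5) {
  return blacklist
    .filter(e => euclidean(passerbyEmbedding, e.embedding) <= maxDistance)
    .map(e => e.listId);
}

const blacklist = [
  { listId: 'B-001', embedding: [0.1, 0.7, 0.3] },
  { listId: 'B-002', embedding: [0.9, 0.2, 0.5] },
];
console.log(screen([0.12, 0.68, 0.33], blacklist)); // [ 'B-001' ]
console.log(screen([0.5, -0.5, 0.5], blacklist));   // []
```

    Note that even this "list-only" design scans every face in the crowd, which is why blacklisting sits closer to surveillance than the gate-access deployments described earlier in the section.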

    Forensic authentication

    Biometric systems for the purposes of authentication are also increasingly deployed for forensic applications among law-enforcement agencies in the European Union. The typical scenario for the use of such technologies is to match the photograph of a suspect (extracted, for example, from previous records or from CCTV footage) against an existing database of known individuals (e.g., a national biometric database, a driver’s license database) (TELEFI 2021). The development of these forensic authentication capabilities is particularly relevant to this study, because it entails making large databases searchable on the basis of biometric information.

    To date, 11 out of 27 member states of the European Union are using facial recognition against biometric databases for forensic purposes: Austria (EDE)9, Finland (KASTU)10, France (TAJ)11, Germany (INPOL)12, Greece (Mugshot Database)13, Hungary (Facial Image Registry)14, Italy (AFIS)15, Latvia (BDAS)16, Lithuania (HDR)17, Netherlands (CATCH)18 and Slovenia (Record of Photographed Persons)19 (TELEFI 2021).

    Seven additional countries are expected to acquire such capabilities in the near future: Croatia (ABIS)20, Czech Republic (CBIS)21, Portugal (AFIS), Romania (NBIS)22, Spain (ABIS), Sweden (National Mugshot Database), Cyprus (ISIS Faces)23, Estonia (ABIS) (TELEFI 2021).

    When it comes to international institutions, Interpol (2020) has a facial recognition system (IFRS)24, based on facial images received from more than 160 countries. Europol has two sub-units which use the facial recognition search tool and database known as FACE: the European Counter Terrorism Centre (ECTC) and the European Cybercrime Centre (EC3) (TELEFI 2021, 149-153; Europol 2020).

    Only 9 countries in the EU so far have rejected or do not plan to implement FRT for forensic purposes: Belgium (see CHAPTER 6), Bulgaria, Denmark, Ireland, Luxembourg, Malta, Poland, Portugal, Slovakia.

    Figure 1. EU countries’ use of FRT for forensic applications25

    When it comes to databases, some countries limit the searches to criminal databases (Austria, Germany, France, Italy, Greece, Slovenia, Lithuania, UK), while other countries open the searches to civil databases (Finland, Netherlands, Latvia, Hungary).

    This means that the person categories can vary substantially. In the case of criminal databases, they can range from suspects and convicts to asylum seekers, aliens, unidentified persons, immigrants, and visa applicants. When civil databases are also used, as in Hungary, the database contains a broad range of “individuals of known identity from various document/civil proceedings” (TELEFI 2021, appendix 3).

    Finally, the database sizes are of a different magnitude from the authentication databases mentioned in the previous section. The databases of school students in France and Sweden mentioned above contain a few hundred entries each; national databases can instead contain several million. Criminal databases such as Germany’s INPOL contain 6.2 million individuals, France’s TAJ 21 million and Italy’s AFIS 9 million. Civil databases, such as Hungary’s Facial Image Registry, contain 30 million templates (TELEFI 2021, appendix 3).

    Authentication has also been deployed as part of integrated “safe city” solutions, such as the NEC Technology Bio-IDiom system in Lisbon and London, deployed for forensic investigation purposes. For this specific product, authentication can occur via facial recognition as well as other biometric techniques such as ear acoustics, iris, voice, fingerprint, and finger vein recognition. We currently have no public information on the use of Bio-IDiom in Lisbon or London. On NEC’s website (2021), however, Bio-IDiom is advertised as a “multimodal” identification system that has been used, for example, by the Los Angeles County Sheriff’s Department (LASD) for criminal investigations. The system “combines multiple biometric technologies including fingerprint, palm print, face, and iris recognition” and works “based on the few clues left behind at crime scenes”. In Los Angeles, “this system is also connected to the databases of federal and state law enforcement agencies such as the California Department of Justice and FBI, making it the world’s largest-scale service-based biometrics system for criminal investigation”. We do not know whether this is the case in the Portuguese and UK deployments.

    Case study: INPOL (Germany)

    In order to give a concrete example of the forensic use of biometric technology, we can take the German case. Germany has been using automated facial recognition technologies to identify criminal activity since 2008, using a central criminal information system called INPOL (Informationssystem Polizei), maintained by the Bundeskriminalamt (BKA), the federal criminal police office. INPOL uses Oracle software and includes the following information: name, aliases, date and place of birth, nationality, fingerprints, mugshots, appearance, information about criminal histories such as prison sentences or violent behaviour, and DNA information. However, DNA information is not automatically recorded (TELEFI 2021).

    The INPOL database includes facial images of suspects, arrestees, missing persons, and convicted individuals. For the purpose of facial recognition, anatomical features of a person’s face or head, as seen on video surveillance or images, are matched against the data in INPOL. The facial recognition system compares templates and lists all matches ordered by degree of accordance. Dedicated BKA personnel visually analyse the system’s candidates and provide an assessment defining the probability of identifying a person; this assessment can be used in a court of law if necessary (Bundeskriminalamt, n.d.). Searches in the database are conducted using Cognitec FaceVACS software (TELEFI 2021).
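
    The 1:N search step described above (compare a probe template against every database template and return candidates ranked by degree of accordance for human assessment) can be sketched as follows. This is not the Cognitec implementation; the templates, scores and record identifiers are invented for illustration.

```javascript
// Cosine similarity between two biometric templates (illustrative).
function similarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / Math.sqrt(na * nb);
}

// Rank all database entries by similarity to the probe; a human
// examiner then assesses the top candidates, as in the workflow above.
function rankedCandidates(probe, database, topN = 3) {
  return database
    .map(rec => ({ id: rec.id, score: similarity(probe, rec.template) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topN);
}

const db = [
  { id: 'rec-17', template: [0.9, 0.1, 0.2] },
  { id: 'rec-42', template: [0.1, 0.9, 0.1] },
  { id: 'rec-99', template: [0.6, 0.3, 0.3] },
];
console.log(rankedCandidates([0.85, 0.15, 0.22], db).map(c => c.id));
// [ 'rec-17', 'rec-99', 'rec-42' ]
```

    The key design point is that the system outputs a ranked candidate list rather than a verdict; the probability assessment that may reach a courtroom is made by a human examiner.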

    As of March 2020, INPOL contained 5.8 million images of about 3.6 million individuals. All police stations in Germany have access to this database. The biometric data stored by the BKA can also be used by other ministries, for instance to identify asylum seekers, and the data is shared at the international level in the context of the Prüm cooperation (mostly fingerprints and DNA patterns). The BKA additionally saves DNA analysis data as part of INPOL, accessible to all police stations in Germany; that database contains 1.2 million data sets (Bundeskriminalamt, n.d.). Other recorded facial images, for instance from driver’s licenses or passports, are not included in the search, and the database is mainly used for police work (TELEFI 2021).

    A blurred boundary between authentication and surveillance

    A first range of deployments of “smart” systems, which we consider in this first section, corresponds to what can broadly be defined as “smart surveillance”, yet does not collect or process biometric information per se26. Smart systems can be used ex post, to assist CCTV camera operators in processing large amounts of recorded information, or can guide their attention when they have to monitor a large number of live video feeds simultaneously. Smart surveillance uses the following features:

    - Anomaly detection. In Toulouse (France), the City Council commissioned IBM to connect 30 video surveillance cameras to software able to “assist human decisions” by raising alerts when “abnormal events are detected” (Technopolice 2021). The request was justified by the “difficulties of processing the images generated daily by the 350 cameras and kept for 30 days (more than 10,000 images per second)”. The objective, according to the digital directorate, is “to optimise and structure the supervision of video surveillance operators by generating alerts through a system of intelligent analysis that facilitates the identification of anomalies detected, whether: movements of crowds, isolated luggage, crossing virtual barriers north of the Garonne, precipitous movement, research of shapes and colour”. All these detections are done in real time or delayed (Technopolice 2021). In other words, anomaly detection is a way to operationalise the numerical output of various computer-vision-based recognition systems. Similar systems are used in the smart video surveillance deployment in Valenciennes (France) or in the Urban Surveillance Centre (Marseille).

    - Object detection. In Amsterdam, around the Johan Cruijff ArenA (stadium), the city has been experimenting with a Digitale Perimeter (digital perimeter) surveillance system. In addition to the usual features of facial recognition and crowd monitoring, the system includes the possibility of automatically detecting specific objects such as weapons, fireworks or drones. Similar features are found in Inwebit’s Smart Security Platform (SSP) in Poland.

    - Feature search. In Marbella (Spain), Avigilon deployed a smart camera system aimed at providing “smart” functionalities without biometric data. Since regional law bans facial and biometric identification without consent, the software uses “appearance search”. “Appearance search” provides estimates for “unique facial traits, the colour of a person’s clothes, age, shape, gender and hair colour”. This information is not considered biometric. The individual’s features can be used to search for suspects fitting a particular profile. Similar technology has been deployed in Kortrijk (Belgium), which provides search parameters for people, vehicles and animals (Verbeke 2019).

    - Video summary. Some companies, such as Briefcam and their product Briefcam Review, offer a related product, which promises to shorten the analysis of long hours of CCTV footage, by identifying specific topics of interest (children, women, lighting changes) and making the footage searchable. The product combines face recognition, license plate recognition, and more mundane video analysis features such as the possibility to overlay selected scenes, thus highlighting recurrent points of activity in the image. Briefcam is deployed in several cities across Europe, including Vannes, Roubaix (in partnership with Eiffage) and Moirand in France.

    - Object detection and object tracking. As outlined in chapter 2, object detection is often the first step in the various digital detection applications for images. An ‘object’ here can mean anything the computer is conditioned to search for: a suitcase, a vehicle, but also a person; while some products further process the detected object to estimate particular features, such as the colour of a vehicle, the age of a person. However, on some occasions — often to address concerns over privacy — only the position of the object on the image is stored. This is for example the case with the test of the One-and-a-half-meter monitor in Amsterdam (Netherlands), Intemo’s people counting system in Nijmegen (Netherlands), the KICK project in Brugge, Kortrijk, Ieper, Roeselare and Oostende in Belgium or the Eco-counter tracking cameras pilot project in Lannion (France).

    - Movement recognition. Avigilon’s software deployed in Marbella (Spain) also detects unusual movement. “To avoid graffiti, we can calculate the time someone takes to pass a shop window,” explained Javier Martín, local chief of police in Marbella, to the Spanish newspaper El País. “If it takes them more than 10 seconds, the camera is activated to see if they are graffitiing. So far, it hasn’t been activated.” (Colomé 2019) Similar movement recognition technology is used in the ViSense deployment at the Olympic Park London (UK) and the security camera system in Mechelen-Willebroek (Belgium). It should be noted that movement recognition can be done in two ways: projects such as the Data-lab Burglary-free Neighbourhood in Rotterdam (Netherlands)27 are based only on tracking the trajectories of people through an image (see also ‘Object detection’), while cases such as the Living Lab Stratumseind28 in Eindhoven (Netherlands) also process the movements and gestures of individuals in order to estimate their behaviour.
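    The position-only and dwell-time approaches described in the bullets above can be illustrated with a short sketch. The code below is purely hypothetical and not drawn from any of the deployed products: it keeps only a per-object dwell counter (no imagery, no biometric template) and raises an alert when a tracked object stays in a watched zone longer than a threshold, in the spirit of the 10-second shop-window rule quoted for Marbella.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """State kept per tracked object: an id and a dwell counter, no imagery."""
    track_id: int
    frames_in_zone: int = 0

def update_tracks(tracks: dict[int, Track],
                  detections: list[tuple[int, bool]],
                  fps: int = 25, dwell_seconds: int = 10) -> list[int]:
    """Update dwell counters from (track_id, is_inside_zone) pairs produced
    by an upstream object detector, and return the ids of objects that have
    stayed in the zone longer than the dwell threshold."""
    alerts = []
    for track_id, in_zone in detections:
        t = tracks.setdefault(track_id, Track(track_id))
        # Counter increments while inside the zone and resets on exit.
        t.frames_in_zone = t.frames_in_zone + 1 if in_zone else 0
        if t.frames_in_zone > dwell_seconds * fps:
            alerts.append(track_id)
    return alerts
```

In such a design only the counter, never the person’s appearance, determines the alert, which is what allows these systems to claim they store no biometric data.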

    Audio recognition

    - In addition to image (video) based products, some deployments use audio recognition to complement the decision-making process, for example in the Serenecity (a branch of Verney-Carron) project in Saint-Etienne (France), the smart CCTV deployment in public transportation in Rouen (France) and the smart CCTV system in Strasbourg (France). The project piloted in Saint-Etienne, for example, worked by placing “audio capture devices” (the term microphone was avoided) in strategic parts of the city. Sounds qualified as suspicious by an anomaly detection algorithm would then alert operators in the Urban Supervision Centre, prompting further investigation via CCTV or deployment of the necessary services (healthcare or police, for example) (France 3 Auvergne-Rhône-Alpes 2019).

    Emotion recognition

    - Emotion recognition is a rare occurrence. We found evidence of its deployment only in a pilot project in Nice (see section 8.1) and in the Citybeacon project in Eindhoven, but even then, the project was never actually tested. The original idea proposed by the company Two-I was a “real-time emotional mapping” capable of highlighting “potentially problematic or even dangerous situations”. “A dynamic deployment of security guards in an area where tension and stress are felt is often a simple way to avoid any overflow,” argues Two-I, whose “Security” software would be able to decipher some 10,000 faces per second (Binacchi 2019).

    Gait recognition

    Integrated solutions

    Smart cities

    While some cities or companies decide to implement some of these functionalities within their existing or updated CCTV systems, several chose to centralise these “smart” functions in integrated systems often referred to as “safe city” solutions. These solutions do not necessarily process biometric information. This is the case, for example, for the deployments of TIM’s, Insula’s and Venis’ Safe City Platform in Venice (Italy), Huawei’s Safe City in Valenciennes (France), Dahua’s integrated solution in Brienon-sur-Armançon (France), Thalès’ Safe City in La Défense and Nice (France), Engie Inéo’s and SNEF’s integrated solution in Marseille (France), the Center of Urban Supervision in Roubaix (France), AI Mars (Madrid, in development) and NEC’s platform in Lisbon and London.

    The way “Smart/Safe City” solutions work is well exemplified by the “Control room” deployed in Venice, connected to an urban surveillance network. The system is composed of a central command and control room which aggregates cloud computing systems, together with smart cameras, artificial intelligence systems, antennas and hundreds of sensors distributed on a widespread network. The idea is to monitor what happens in the lagoon city in real time. The scope of the abilities of the centre is wide-ranging. It promises to:

    - manage events and incoming tourist flows, something particularly relevant to a city which aims to implement a visiting fee for tourists;

    - predict and manage weather events in advance, such as the shifting of tides and high water, by defining alternative routes for transit in the city;

    - indicate to the population in real time the routes to avoid traffic and better manage mobility for time optimisation;

    - improve the management of public safety, allowing city agents to intervene in a more timely manner;

    - control and manage water and road traffic, also for sanctioning purposes, through specific video-analysis systems;

    - control the status of parking lots;

    - monitor the environmental and territorial situation;

    - collect and process data and information that allow for the creation of forecasting models and the allocation of resources more efficiently and effectively;

    - bring to life a physical “Smart Control Room” where law enforcement officers train and learn how to read data as well. (LUMI 2020)

    Smartphone apps

    Integrated solutions can entail smartphone apps, used to connect citizens with the control and command centres. This is, for example, the case in Nice with the (failed) Reporty App project (see Chapter 5) and with the Dragonfly project in Hungary (see Chapter 10); such an app was also part of the original plan of Marseille’s Safe City project.

    Crowd management

    Integrated solutions generally comprise a set of crowd management features, as in the case of the systems in Valenciennes and Marseille (France), Mannheim (Germany), Venice (Italy), and Amsterdam, Eindhoven and Den Bosch with the Crowdwatch project (Netherlands). Such crowd management software generally does not recognise individuals, but rather estimates the number of people in (a part of) the video frame. Sudden movements of groups or changes in density are then flagged for the attention of the security operator (Nishiyama 2018).
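    A minimal sketch of such density-based flagging, under the assumption (not documented for these specific systems) that the frame is divided into a grid of cells with a per-cell head count; the grid size and the alert threshold below are hypothetical:

```python
def flag_density_changes(prev_counts: list[list[int]],
                         curr_counts: list[list[int]],
                         threshold: int = 20) -> list[tuple[int, int]]:
    """Compare per-cell head counts between two consecutive frames and
    return the (row, col) grid cells where the count changed by more than
    `threshold` people -- no individual is ever identified."""
    flagged = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_counts, curr_counts)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                flagged.append((r, c))
    return flagged
```

The operator is then shown only the flagged cells, which is consistent with the claim that such software estimates crowd density rather than recognising individuals.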

    Deployment of RBI in public spaces

    Here are the documented cases of RBI in public spaces we could find through our research:

    - Live Facial Recognition pilot project in Brussels International Airport / Zaventem (Belgium, see detailed case study, CHAPTER 6)

    - Live Facial Recognition in Budapest (Hungary, see detailed case study, CHAPTER 10)

    - Live Facial Recognition pilot project during the Carnival in Nice (France, see detailed case study, CHAPTER 8)

    - Live Facial Recognition Pilot Project Südkreuz Berlin (Germany, see detailed case study, CHAPTER 9)

    As most of these cases are extensively discussed in the following chapters, we do not comment further on them here.

    Taking particular issue with Article 4 and the possible exemptions to the regulation of AI “in order to safeguard public safety”, they urge the European Commission “to make sure that existing protections are upheld and a clear ban on biometric mass surveillance in public spaces is proposed. This is what a majority of citizens want” (Breyer et al. 2021).

    European Digital Rights (EDRi), an umbrella organisation of 44 digital rights NGOs in Europe, takes a radical stance on the issue. They argue that mass processing of biometric data in public spaces creates a serious risk of mass surveillance that infringes on fundamental rights, and they therefore call on the Commission to permanently stop all deployments that can lead to mass surveillance. In their report Ban Biometric Mass Surveillance (2020), they demand that the EDPB and national DPAs “publicly disclose all existing and planned activities and deployments that fall within this remit” (EDRi 2020, 5). Furthermore, they call for ceasing all planned legislation which establishes biometric processing, as well as the funding of all such projects, amounting to an “immediate and indefinite ban on biometric processing”.

    La Quadrature du Net (LQDN), one of EDRi’s founding members (created in 2008 to “promote and defend fundamental freedoms in the digital world”), similarly called for a ban on any present and future use of facial recognition for security and surveillance purposes. Together with a number of other French NGOs monitoring legislation impacting digital freedoms, as well as other collectives, companies, associations and trade unions, LQDN initiated a joint open letter in which they call on French authorities to ban any security and surveillance use of facial recognition due to its uniquely invasive and dehumanising nature. In their letter they point to the fact that in France there is a “multitude of systems already installed, outside of any real legal framework, without transparency or public discussion”, referring, among others, to the PARAFE system and the use of FRTs by civil and military police. As they put it:

    “Facial recognition is a uniquely invasive and dehumanising technology, which makes possible, sooner or later, constant surveillance of the public space. It creates a society in which we are all suspects. It turns our face into a tracking device, rather than a signifier of personality, eventually reducing it to a technical object. It enables invisible control. It establishes a permanent and inescapable identification regime. It eliminates anonymity. No argument can justify the deployment of such a technology.”
    (La Quadrature du Net et al. 2019)
    Belgium is, with Spain, one of the few countries in Europe that have not authorised the use of facial recognition technology, neither for criminal investigations nor for mass surveillance (Vazquez 2020). This does not mean that its position is unlikely to change in the very near future. Law enforcement is indeed strongly advocating its use, and the current legal obstacles are not likely to hold for very long (Bensalem 2018). The pilot experiment that took place in Zaventem / Brussels International Airport, although aborted, occurred within a national context in which biometric systems are increasingly used and deployed.

    Belgium will, for example, soon roll out the new biometric identity card “eID” at the national level, the Minister of the Interior Annelies Verlinden recently announced. The identification document, which relies on the constitution of a broad biometric database and is part of a broader European Union initiative, is developed in partnership with the security multinational Thales and was already trialled with 53,000 citizens (Prins 2021; Thales Group 2020).30

    Municipalities in different parts of the country are experimenting with Automated Number Plate Recognition (ANPR) technology. A smaller number have started deploying “smart CCTV” cameras, which fall just short of using facial recognition technology. The city of Kortrijk has, for example, deployed “body recognition” technology, which uses the walking style or clothing of individuals to track them across the city’s CCTV network. Facial recognition is possible with these systems, but has not yet been activated, pending legal authorisation to do so. In the city of Roeselare, “smart cameras” have been installed in one of the shopping streets. Deployed by telecom operator Citymesh, they could provide facial recognition services, but are currently used to count and estimate crowds, data which is shared with the police (van Brakel 2020). All the emerging initiatives of remote biometric identification are, however, pending a reversal of the decision to halt the experiment at Zaventem Brussels International Airport.

    The Zaventem pilot in the context of Face Recognition Technology in Belgium

    The use of facial recognition technology at the Brussels International Airport was announced on 10 July 2019 in the Flemish weekly Knack by General Commissioner of the Federal Police Marc De Mesmaeker (Lippens and Vandersmissen 2019). There is currently no publicly available information as to who provided the technical system. De Mesmaeker explained that an agreement had been found with the company managing the airport and the labour unions, and thus that the technology was already in use (Organe de Controle de l'Information Policière 2019, 3).

    As part of the justification for the deployment of FRT in Zaventem, De Mesmaeker made a comparison with ANPR-enabled cameras, arguing that “They have already helped to solve investigations quickly, (…). Citizens understand this and have learned to live with their presence, but privacy remains a right”. (7sur7 2019)

    The Belgian Supervisory Body for Police Information (COC)31, in its advisory document, explained that it had no prior knowledge of the deployment and learned about the existence of the facial recognition systems through the interview of De Mesmaeker in the Knack magazine (Organe de Controle de l'Information Policière 2019, 3). On 10 July 2019, the COC thus invited the General Commissioner to communicate all the details of the deployment of this technology in the Brussels International Airport. On 18 July 2019, COC received a summary of the system’s main components. On 9 August 2019, it subsequently visited the premises of the federal police deployment in Zaventem airport (Organe de Controle de l'Information Policière 2019, 3).

    We know some technical details about the system through the public information shared by the COC. In early 2017, Brussels Airport had acquired four cameras connected to facial recognition software for use by the airport police (Police Aéronautique, LPA) (Farge 2020, 15; Organe de Controle de l'Information Policière 2019, 3). The system works in two steps.

    When provided with video feeds from the four cameras, the software first creates snapshots, generating individual records of the faces that appear in the frame. In a second step, these snapshots are compared and potentially matched against previously established “blacklists” created by the police itself (the reference dataset is thus not external to this particular deployment) (Organe de Controle de l'Information Policière 2019, 3).
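    The two-step logic described by the COC can be sketched schematically. The embedding representation, the cosine comparison and the 0.7 threshold below are generic placeholders for how face matching is commonly done, not documented details of the Zaventem system:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def step1_snapshot(frames: list[dict]) -> list[dict]:
    """Step 1: turn camera frames into one record per detected face."""
    return [{"camera": f["camera"], "embedding": face}
            for f in frames for face in f["faces"]]

def step2_match(records: list[dict], blacklist: dict[str, list[float]],
                threshold: float = 0.7) -> list[tuple[str, str]]:
    """Step 2: compare each record against the police-maintained blacklist
    and report (camera, identity) pairs exceeding the similarity threshold."""
    matches = []
    for rec in records:
        for name, ref in blacklist.items():
            if cosine_similarity(rec["embedding"], ref) >= threshold:
                matches.append((rec["camera"], name))
    return matches
```

Note that step 1 produces a biometric record of every face in the feed regardless of whether step 2 ever finds a match, which is precisely the point the COC raised about the partially disconnected system.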

    The system did not, however, live up to its promise and generated a high number of false positives. Many features such as skin colour, glasses, moustaches and beards led to false matches. The system was thus partially disconnected in March 2017, and at the time of the COC’s visit it was no longer fully in use (Organe de Controle de l'Information Policière 2019, 3). Yet while the second step (matching video feeds against pre-established blacklists of faces) had been de-activated, the first function of creating a biometric record of the video feeds was still in place (Organe de Controle de l'Information Policière 2019, 3).

    Although several French cities such as Paris, Valenciennes and Marseille have launched pilot “safe city” projects involving biometric technologies (facial, voice and sound recognition), the city of Nice is perhaps the national leader in the experimentation with such technologies at the local level (Nice Premium 2017). The mayor of Nice, Christian Estrosi (Les Républicains party, right), a prominent figure on the national political scene, has made clear his intention to make Nice a “laboratory” of crime prevention (Barelli 2018). Since 2010, more than 1,962 surveillance cameras have been deployed throughout the city, making it the city with the highest CCTV coverage in France (27 cameras per square kilometre). Nice also has the highest number of municipal police officers per inhabitant in France: 414 agents for a population of 340,000 (in comparison, the neighbouring city of Marseille has 450 agents for 861,000 inhabitants).

    The various facets of the “Safe city” project in Nice

    Nice has experimented with various initiatives related to remote biometric identification – many of which fall into the category of biometric mass surveillance. In 2017, Christian Estrosi announced a partnership with the energy company Engie Ineo for the development of an Urban Surveillance Centre (Centre de Surveillance Urbain, CSU). Based on a touch-interface technology, it centralises a platform of real-time data such as traffic accidents, patrol locations, as well as video feeds from CCTV on the streets and in public transportation (Dudebout 2020, 1). The video feeds from the city tramways are connected to an emotion recognition algorithm to flag suspicious situations (Allix 2018).

    In June 2018, an additional step was taken with the signing of a partnership agreement with a consortium of companies headed by Thales, specialised in social network intelligence, geolocation, biometrics and crowd simulation33, for a “Safe City” project (Dudebout 2020, 2). Established for three years (2018-2021) with a budget of EUR 10.9 million, the project is financed by the city council, subsidised in part by BPI France34, and supported by the Committee for the Security Industrial Sector, an agency under the tutelage of the Prime Minister’s office35 (Allix 2018; BPI France 2018).

    The first facial recognition test of the Safe City project took place from 16 February to 2 March 2019, during the Nice Carnival. The experiment was a simulation, involving matching faces collected through CCTV footage of the crowd attending the carnival against a fictitious set of databases (lost individuals, wanted individuals, or individuals with restraining orders). The fictitious datasets were constituted by 50 volunteers, recruited mostly from the municipality, who provided their pictures or were freshly photographed for the test. The system used live facial recognition software provided by the company Anyvision. The live feeds were filmed during the carnival. Passers-by (approximately 1,000 people were involved) were informed of the ongoing test and asked to wear a bracelet if they consented to being filmed (Hassani 2019).

    A second experiment took the form of a software application (app) named “Reporty”, rolled out in January 2018. The app, developed by the Israeli American company Carbyne, allows citizens to be in direct audio and video connection and share geolocation information with the Urban Supervision Centre in order to report any incivility, offense, or crime that they might witness (Barelli 2018).

    The third project involving facial recognition was tested in the education context. In February 2019, a high school in Nice and a high school in Marseille were fitted with facial recognition technology at their gates in order to grant or bar access to the premises. The official motivation behind the deployment was to “assist the personnel of the high schools and to fight against identity theft” (Dudebout 2020, 3–4).

    Mobilisations and contestations


    The French digital rights organisation La Quadrature du Net was quick to highlight the problems raised by the deployment of these technologies in Nice. “The safe city is the proliferation of tools from the intelligence community, with a logic of massive surveillance, identification of weak signals and suspicious behaviour," commented Félix Tréguer, a Marseilles-based leader of the association La Quadrature du Net and member of the campaign Technopolice36. “We do not find it reassuring that the municipal police will become the intelligence service of the urban public space and its digital double" (Allix 2018).

    The Ligue des Droits de l’Homme emphasised similar points, highlighting the political dangers involved. As Henri Busquet of the Ligue des Droits de l’Homme in Nice put it: “improving emergency services and traffic is legitimate, but the generalisation of video surveillance worries us, and scrutinising social networks is not the role of a mayor. Without any safeguards, such a tool cannot demonstrate the necessary neutrality [...] It is potentially a tool for political destruction, which puts opponents and journalists at particular risk” (Allix 2018).

    In July 2019, the city of Nice asked the CNIL for advice on its first test experiment during the Carnival. The CNIL responded, however, that the municipality had not provided enough information for the DPA to assess it. The French DPA pointed out in particular the lack of “quantified elements on the effectiveness of the technical device or the concrete consequences of a possible bias (related to gender, skin colour ...) of the software” (Dudebout 2020, 3).

    The launch of the smartphone application “Reporty” was the catalyst for mobilisation in Nice, uniting opponents under the umbrella organisation “Collectif anti-Reporty”. The coalition was formed by local representatives from two left-wing parties (Parti Socialiste, Les Insoumis), Tous Citoyens, the union CGT and the anti-discrimination NGO MRAP. The coalition appealed to two institutions to block the use of the application: the Defender of Rights (Défenseur des Droits) and the French DPA (CNIL). It denounced “a risk of generalised denunciation and a serious breach of privacy”, calling to “put an end to the securitarian drift of Christian Estrosi” (Barelli 2018).

    On 15 March 2018, the CNIL stated that the application was too invasive and did not meet the criteria set out by the legislation. It failed the proportionality test; it fell outside the frame of the existing law on video-protection, because it integrated private citizens’ terminals (smartphones) with a security database managed by the police; it was excessively intrusive, collecting the images and voices of people in the public space; and it covered a field of offenses that was too broad (CNIL 2018).

    The school experiment further pushed the CNIL to take a position on the technological activism of Nice’s mayor. On 29 October 2019, it expressed serious concerns, arguing that the technology clashed with the principles of proportionality and data collection minimisation enshrined in the GDPR. It pointed out that other methods, less intrusive for the privacy of the students, could be used to achieve the technology’s stated goals, namely increasing the students’ security and traffic fluidity (Dudebout 2020, 4).

    In a landmark opinion published on 15 November 2019, the CNIL clarified its guidelines on facial recognition (CNIL 2019a). The French DPA expressed concerns over a blanket and indiscriminate use of these technologies, highlighting possible infringements of fundamental rights, because the technologies operate in the public space, where the freedoms of expression, assembly and protest are exercised. It did not, however, suggest that they should be banned in all circumstances – it suggested instead that their use could be justified, if properly regulated, on a case-by-case basis. Certain uses could be rejected a priori – such as those involving minors, whose data are strictly protected. The question of data retention is also central: the CNIL warned against excessive retention periods and excessive centralisation, suggesting instead citizens’ control over their own data. But as the president of the CNIL, Marie-Laure Denis, explained, facial recognition technology “can have legitimate uses, and there is no firm position of the CNIL’s board” (Untersinger 2019).


    In May 2020, Hungarian authorities rolled out two digital applications: the contact-tracing app VirusRadar (Kaszás 2020) and the Home Quarantine App (Házi Karantén Rendszer, abbreviated HKR). Both are centralised tracing apps, meaning that they send contact logs with pseudonymised personal data to a central (government) back-end server (Council of Europe 2020, 28). While VirusRadar uses only Bluetooth data and the proximity of other devices, the HKR processes biometric data when comparing facial images of its users.

    Those who, according to the COVID-19 regulations in Hungary, are confined to home quarantine are offered the option of using the app instead of being checked by the police. For those who return from abroad, the use of the app is compulsory. Even those who can choose are encouraged by the authorities to use the HKR app, as they will otherwise be subjected to frequent visits by police agents. Once a person downloads the app, its use becomes compulsory, and failure to use it or attempts to evade its tracking are considered an administrative offense. From a data protection law point of view, this is a clear case where the data subject’s consent (and in the case of biometric data, their explicit consent) cannot provide the lawful ground for the processing of data through the app (see section 4.2.2). Even if the processing can be based on another lawful ground such as public interest, the punitive nature of non-compliance may raise issues in terms of adhering to the necessity test, which requires a balancing act between the objective pursued and the data subject’s interests.

    The HKR app was developed by Asura Technologies and implemented by IdomSoft Ltd., the same company that provides the software and technical implementation for the nation-wide Dragonfly Project. The HKR application combines face recognition technology with location verification. The application sends notifications at random times, prompting the user to upload a facial image while it retrieves the location data of the mobile device. The user must respond within 15 minutes, and the location data must match the address registered for quarantine. In order for the Home Quarantine App to work, the user first needs to upload a facial image, which a police officer compares with the photo of the same individual stored in the central database. After this facial verification, the app creates a biometric template on the mobile phone of the user and the photo is deleted. Subsequent photos are only compared to this biometric template, so neither the photos nor the template leave the personal device. If there is suspicion about the identity or whereabouts of the user, a police officer visits the address to make sure that the person is adhering to the quarantine rules.
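The on-device verification loop described above can be sketched roughly as follows. This is a minimal illustration, not Asura Technologies’ actual implementation: the template representation, the similarity measure, the threshold value and all helper names are assumptions; only the 15-minute response window, the location match against the registered quarantine address, and the purely local template comparison come from the description above.

```python
import math

RESPONSE_DEADLINE_S = 15 * 60   # the report states a 15-minute response window
MATCH_THRESHOLD = 0.8           # assumed similarity cut-off; not documented

def cosine_similarity(a, b):
    """Similarity of two feature vectors (a stand-in for a real
    face-embedding comparison performed on the device)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def check_response(enrolled_template, photo_template,
                   photo_location, quarantine_location,
                   notified_at, responded_at):
    """On-device check after a random-time notification: the reply must
    arrive within the deadline, the device location must match the
    registered quarantine address, and the new photo's template must
    match the enrolled template. Nothing leaves the device."""
    if responded_at - notified_at > RESPONSE_DEADLINE_S:
        return False, "deadline missed"
    if photo_location != quarantine_location:
        return False, "location mismatch"
    if cosine_similarity(enrolled_template, photo_template) < MATCH_THRESHOLD:
        return False, "face mismatch"
    return True, "ok"
```

In this sketch, any failed check would be the trigger for the in-person police visit mentioned above; a real system would of course derive the templates with a face-embedding model rather than toy vectors.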

    Interestingly, the HKR app – just like the contact-tracing app VirusRadar, which was developed by Nextsense – has been “donated” to the Hungarian Government by Asura Technologies “free of charge”.


    Figure 5. Snapshots from the video Home Quarantine System Short Presentation by Asura Technologies38

  • Criminal case history database, managed by the French Ministry of Interior↩︎

  • Criminal case management system, managed by the German Federal Criminal Police Office (Bundeskriminalamt)↩︎

  • Managed by the Video and Image Laboratory of the Audiovisual Evidence of the Department of Photography and Modus Operandi of the Hellenic Police Forensic Science Division↩︎

  • The Facial Image registry is interrogated through a search engine developed by NEC, and accessible to the National Investigation Agency, the Criminal Courts, the National Protective Service, the Counter-Terrorism Centre, the Hungarian Prison Service, the Prosecution Service of Hungary, the Public Administration, the Special Service for National Security, the Intelligence Agencies, the Hungarian Police, the Hungarian Parliamentary Guard, Hungarian Ministry of Justice, Witness Protection Service, the National Directorate-General for Aliens Policing and Institution of the President of the Republic.↩︎

  • Automated Fingerprint Identification System. The system can be interrogated via software developed by the company Reco 3.26, a subsidiary of Parsec 3.26. Another software package is provided by the Japanese company NEC.↩︎

  • Biometric Data Processing System (criminal data array), supported by database software from RIX Technologies↩︎

  • Habitoscopic Data Register↩︎

  • Central Automatic TeChnology for Recognition of Persons, managed by the Centrum voor Biometrie, connected to the Dutch Judicial Information Service (Justid).↩︎

  • The database uses VeriLook and Face Trace software from the Lithuanian company Neurotechnology.↩︎

  • Automated Biometric Identification System, searchable by the IntellQ software from the company IntellByte, managed by the Ministry of the Interior (Croatia).↩︎

  • Central Biometric Information System↩︎

  • National Biometric Identification System↩︎

  • Managed by the Photographic and Graphic Laboratory of Criminalistic Services, using search software by the company Unidas↩︎