diff --git a/www/report/report.css b/www/report/report.css
index d78c3e1..885f158 100644
--- a/www/report/report.css
+++ b/www/report/report.css
@@ -588,8 +588,13 @@ h1.Title{
   color: var(--color-bg-secondary);
   padding: 1rem;
   margin: 0 -1rem;
+  font-weight: bold;
 }
 .keypoints > p > strong{
   margin-left:2.5rem;
   text-transform: uppercase;
+}
+
+a.maplink{
+  cursor: pointer;
 }
\ No newline at end of file
diff --git a/www/report/report.html b/www/report/report.html
index f8c54d4..89898e0 100644
--- a/www/report/report.html
+++ b/www/report/report.html
@@ -51,8 +51,44 @@
     // start observing
     intersectionObserver.observe(caseEl);
   }
+
+  const linkEls = document.getElementsByClassName('maplink');
+  for (let linkEl of linkEls) {
+    linkEl.addEventListener('click', (ev) => {
+      const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+      if (toSelect === null) {
+        frameEl.contentWindow.mapGraph.deselectNode();
+        frameEl.contentWindow.mapGraph.resetZoom();
+      } else {
+        const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+        frameEl.contentWindow.mapGraph.selectNode(node);
+      }
+    });
+    linkEl.addEventListener('mouseover', (ev) => {
+      const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+      if (toSelect) {
+        const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+        frameEl.contentWindow.mapGraph.hoverNode(false, node);
+      }
+    });
+    linkEl.addEventListener('mouseout', (ev) => {
+      const toSelect = typeof linkEl.dataset.title == 'undefined' || linkEl.dataset.title == 'none' ? null : frameEl.contentWindow.getIdForTitle(linkEl.dataset.title);
+      if (toSelect) {
+        const node = frameEl.contentWindow.mapGraph.graph.nodes.filter(n => n.id == toSelect)[0];
+        frameEl.contentWindow.mapGraph.endHoverNode(node);
+      }
+    });
+  }
+});
-})
 });
 // frame.contentWindow;
@@ -96,7 +132,7 @@
Several French cities have launched “safe city” projects involving biometric technologies; however, Nice is arguably the national leader. The city currently has the highest CCTV coverage of any city in France and more than double the number of police officers per capita of the neighbouring city of Marseille.
Through a series of public-private partnerships, the city began a number of initiatives using RBI technologies (including emotion and facial recognition). These technologies were deployed for both authentication and surveillance purposes, with some falling into the category of biometric mass surveillance.
One project, which used FRT at a high school in Nice and another in Marseille, was eventually declared unlawful. The court determined that the required consent could not be obtained due to the power imbalance between the targeted public (students) and the public authority (public educational establishment). This case highlights important issues about the deployment of biometric technologies in public spaces.
The use of biometric mass surveillance by the mayor of Nice, Christian Estrosi, has put him on a collision course with the French Data Protection Authority (CNIL) as well as with human rights and digital rights organisations (Ligue des Droits de l’Homme, La Quadrature du Net). His activities have raised both concern and criticism over the use of these technologies and their potential impact on the privacy of personal data.
CHAPTER 9: Facial Recognition in Südkreuz Berlin, Hamburg G20 and Mannheim (Germany)
@@ -660,8 +696,8 @@
The intrusiveness of the system, and its impact on fundamental rights, is best exemplified by its deployment in the Xinjiang province. The provincial capital, Urumqi, is dotted with checkpoints and identification stations. Citizens must submit to facial recognition ID checks in supermarkets, hotels, train stations, highway stations and several other public spaces (Chin and Bürge 2017). The information collected through the cameras is centralised and matched against other biometric data such as DNA samples and voice samples. This allows the government to attribute trustworthiness scores (trustworthy, average, untrustworthy) and thus generate a list of individuals who can become candidates for detention (Wang 2018).
-European countries’ deployments are far from the Chinese experience. But the companies involved in China’s pervasive digital surveillance network (such as Tencent, Dahua Technology, Hikvision, SenseTime, ByteDance and Huawei) are exporting their know-how to Europe, under the form of “safe city” packages. Huawei is one of the most active in this regard. On the European continent, the city of Belgrade has for example deployed an extensive communication network of more than 1.000 cameras which collect up to 10 body and facial attributes (Stojkovski 2019). The cameras, deployed on poles, major traffic crossings and a large number of public spaces allow the Belgrade police to monitor large parts of the city centre, collect biometric information and communicate it directly to police officers deployed in the field. Belgrade has the most advanced deployment of Huawei’s surveillance technologies on the European continent, but similar projects are being implemented by other corporations – including the European companies Thales, Engie Ineo or Idemia – in other European cities and many “Safe City” deployments are planned soon in EU countries such as France, Italy, Spain, Malta, and Germany (Hillman and McCalpin 2019). Furthermore, contrary to the idea China would be the sole exporter of Remote Biometric Identification technologies, EU companies have substantially developed their exports in this domain over the last years (Wagner 2021)
-The turning point of public debates on facial recognition in Europe was probably the Clearview AI controversy in 2019-2020. Clearview AI, a company founded by Hoan Ton-That and Richard Schwartz in the United States, maintained a relatively secret profile until a New York Times article revealed in late 2019 that it was selling facial recognition technology to law enforcement. In February 2020, it was reported that the client list of Clearview AI had been stolen, and a few days later the details of the list were leaked (Mac, Haskins, and McDonald 2020). To the surprise of many in Europe, in addition to US government agencies and corporations, it appeared that the Metropolitan Police Service (London, UK), as well as law enforcement from Belgian, Denmark, Finland, France, Ireland, Italy, Latvia, Lithuania, Malta, the Netherlands, Norway, Portugal, Serbia, Slovenia, Spain, Sweden, and Switzerland were on the client list. The controversy grew larger as it emerged that Clearview AI had (semi-illegally) harvested a large number of images from social media platforms such as Facebook, YouTube and Twitter in order to constitute the datasets against which clients were invited to carry out searches (Mac, Haskins, and McDonald 2020).
+European countries’ deployments are far from the Chinese experience. But the companies involved in China’s pervasive digital surveillance network (such as Tencent, Dahua Technology, Hikvision, SenseTime, ByteDance and Huawei) are exporting their know-how to Europe in the form of “safe city” packages. Huawei is one of the most active in this regard. On the European continent, the city of Belgrade has, for example, deployed an extensive network of more than 1,000 cameras which collect up to 10 body and facial attributes (Stojkovski 2019). The cameras, deployed on poles, major traffic crossings and a large number of public spaces, allow the Belgrade police to monitor large parts of the city centre, collect biometric information and communicate it directly to police officers deployed in the field. Belgrade has the most advanced deployment of Huawei’s surveillance technologies on the European continent, but similar projects are being implemented by other corporations – including the European companies Thales, Engie Ineo and Idemia – in other European cities, and many “safe city” deployments are planned in EU countries such as France, Italy, Spain, Malta, and Germany (Hillman and McCalpin 2019). Furthermore, contrary to the idea that China is the sole exporter of Remote Biometric Identification technologies, EU companies have substantially developed their exports in this domain over recent years (Wagner 2021).
+The turning point of public debates on facial recognition in Europe was probably the Clearview AI controversy in 2019-2020. Clearview AI, a company founded by Hoan Ton-That and Richard Schwartz in the United States, kept a relatively low profile until a New York Times article revealed in late 2019 that it was selling facial recognition technology to law enforcement. In February 2020, it was reported that Clearview AI’s client list had been stolen, and a few days later the details of the list were leaked (Mac, Haskins, and McDonald 2020). To the surprise of many in Europe, in addition to US government agencies and corporations, it appeared that the Metropolitan Police Service (London, UK), as well as law enforcement agencies from Belgium, Denmark, Finland, France, Ireland, Italy, Latvia, Lithuania, Malta, the Netherlands, Norway, Portugal, Serbia, Slovenia, Spain, Sweden, and Switzerland, were on the client list. The controversy grew larger as it emerged that Clearview AI had (semi-illegally) harvested a large number of images from social media platforms such as Facebook, YouTube and Twitter in order to constitute the datasets against which clients were invited to carry out searches (Mac, Haskins, and McDonald 2020).
The news of the hacking strengthened an already strong push-back against the development of facial recognition technology by companies such as Clearview AI, as well as its use by government agencies. In 2018, Massachusetts Institute of Technology (MIT) scholar and Algorithmic Justice League founder Joy Buolamwini, together with Timnit Gebru, had published the report Gender Shades (Buolamwini and Gebru 2018), in which they assessed the racial bias in the face recognition datasets and algorithms used by companies such as IBM and Microsoft. Buolamwini and Gebru found that the algorithms generally performed worse on darker-skinned faces, and in particular on darker-skinned females, with error rates up to 34% higher than for lighter-skinned males (Najibi 2020). IBM and Microsoft responded by amending their systems, and a re-audit showed less bias. Not all companies responded equally: Amazon’s Rekognition system, which was included in the second study, continued to show a 31% lower accuracy rate for darker-skinned females. The same year, the ACLU conducted another key study on Amazon’s Rekognition, matching pictures of members of Congress against a dataset of law-enforcement mugshots. Twenty-eight members of Congress, largely people of colour, were incorrectly matched (Snow 2018). Activists engaged lawmakers: the Algorithmic Accountability Act, introduced in 2019, would allow the Federal Trade Commission to regulate private companies’ uses of facial recognition. In 2020, several companies, including IBM, Microsoft, and Amazon, announced a moratorium on the development of their facial recognition technologies. Several US cities, including Boston, Cambridge (Massachusetts), San Francisco, Berkeley, and Portland (Oregon), have also banned their police forces from using the technology.
@@ -759,7 +795,7 @@
This is perhaps the form of person tracking with which the least information about an individual is stored. An object detection algorithm estimates the presence and position of individuals on a camera image. These positions are stored or counted and used for further metrics. It is used to count passers-by in city centres, and for a one-and-a-half-meter social distancing monitor in Amsterdam2. See also the case study in this document on the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), which goes into more detail about the use of the recorded trajectories of individuals to label anomalous behaviour.
+This is perhaps the form of person tracking in which the least information about an individual is stored. An object detection algorithm estimates the presence and position of individuals in a camera image. These positions are stored or counted and used for further metrics. It is used to count passers-by in city centres, and for a one-and-a-half-metre social distancing monitor in Amsterdam2. See also the case study in this document on the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), which goes into more detail about the use of the recorded trajectories of individuals to label anomalous behaviour.
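Position-only tracking of this kind can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the Amsterdam system: the detector itself is out of scope, so detections are assumed to arrive as ground-plane coordinates in metres, and the 1.5 m threshold mirrors the social-distancing example above.

```javascript
// Hypothetical data shape: each frame yields a list of detected person
// positions in metres, e.g. {x, y}. No identity is stored -- only positions.

// Count how many detected positions lie inside a counting zone.
function countInZone(positions, zone) {
  return positions.filter(p =>
    p.x >= zone.xMin && p.x <= zone.xMax &&
    p.y >= zone.yMin && p.y <= zone.yMax
  ).length;
}

// Flag every pair of people closer together than a minimum distance (1.5 m).
function distancingViolations(positions, minDist = 1.5) {
  const pairs = [];
  for (let i = 0; i < positions.length; i++) {
    for (let j = i + 1; j < positions.length; j++) {
      const dx = positions[i].x - positions[j].x;
      const dy = positions[i].y - positions[j].y;
      if (Math.hypot(dx, dy) < minDist) pairs.push([i, j]);
    }
  }
  return pairs;
}
```

Because only coordinates are kept per frame, nothing in these metrics links a detection in one frame to the same person in the next; trajectory labelling, as in Rotterdam, requires an additional association step.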
From a technological perspective, neural networks process audio relatively similarly to how video is processed: rather than feeding an image, a spectrogram is used as input for the network. However, under the GDPR, recording conversations, is illegal in the European Union without informed consent of the participants. In order to adhere to these regulations, on some occasions, only particular frequencies are recorded and processed. For example, in the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), only two frequencies are used to classify audio; making conversations indiscernible while being able to discern shouting or the breaking of glass3.
+From a technological perspective, neural networks process audio in much the same way as video: rather than an image, a spectrogram is used as input to the network. However, under the GDPR, recording conversations is illegal in the European Union without the informed consent of the participants. In order to adhere to these regulations, on some occasions only particular frequencies are recorded and processed. For example, in the Burglary-Free Neighbourhood in Rotterdam (CHAPTER 7), only two frequencies are used to classify audio, making conversations indiscernible while still allowing shouting or the breaking of glass to be detected3.
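A minimal sketch of such frequency-restricted classification uses the standard Goertzel algorithm, which measures signal energy at a single frequency without reconstructing the rest of the spectrum. The two band choices and the threshold below are illustrative assumptions, not the Rotterdam configuration.

```javascript
// Goertzel algorithm: energy of `samples` at one target frequency only.
// Speech content outside the monitored bands is never computed or stored.
function goertzelPower(samples, sampleRate, freq) {
  const n = samples.length;
  const k = Math.round(n * freq / sampleRate);   // nearest DFT bin
  const coeff = 2 * Math.cos((2 * Math.PI * k) / n);
  let s1 = 0, s2 = 0;
  for (const x of samples) {
    const s = x + coeff * s1 - s2;
    s2 = s1;
    s1 = s;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2;
}

// Classify a frame as an alert if either monitored band is energetic
// (band frequencies and threshold are made-up example values).
function isAlert(samples, sampleRate, bands = [1500, 3000], threshold = 1e4) {
  return bands.some(f => goertzelPower(samples, sampleRate, f) > threshold);
}
```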
Facial recognition algorithms can be developed in-house, taken from an open-source repository, or purchased (IPVM Team 2021b, 14). Popular open-source facial recognition implementations include OpenCV, Face_pytorch, OpenFace and Insightface. Many of these software libraries are developed at universities or implement algorithms and neural network architectures presented in academic papers. They are free, and allow for a great deal of customisation, but require substantial programming skills to be integrated into a surveillance system. Moreover, when using such software, the algorithms run on one’s own hardware, which provides the developer with more control, but also requires more maintenance.
-Proprietary facial recognition. There are three possible routes for the use of proprietary systems: There are “turnkey” systems sold by manufacturers such as Hikvision, Dahua, Anyvision or Briefcam. Those integrate the software and hardware, and as such can be directly deployed by the client. Algorithm developers such as Amazon AWS Rekognition (USA), NEC (Japan), NTechlab (Russia), Paravision (USA) allow to implement their algorithms and customise them to one’s needs, and finally there are “cloud” API systems, a sub-set of the former category, where the algorithm is hosted in a datacentre and is accessed remotely (IPVM Team 2021b, 16). The latter type of technology bears important legal ramifications, as the data may travel outside of national or European jurisdictions. It should be noted that many of the proprietary products are based on similar algorithms and network architectures as their open-source counterparts (OpenCV, 2021). Contrary to the open-source software, it is generally unclear which datasets of images have been used to train the proprietary algorithms.
+Proprietary facial recognition. There are three possible routes for the use of proprietary systems. There are “turnkey” systems sold by manufacturers such as Hikvision, Dahua, AnyVision or Briefcam; these integrate the software and hardware, and as such can be directly deployed by the client. Algorithm developers such as Amazon AWS Rekognition (USA), NEC (Japan), NTechlab (Russia) and Paravision (USA) allow clients to implement their algorithms and customise them to their needs. Finally, there are “cloud” API systems, a subset of the former category, where the algorithm is hosted in a datacentre and accessed remotely (IPVM Team 2021b, 16). The latter type of technology bears important legal ramifications, as the data may travel outside of national or European jurisdictions. It should be noted that many of the proprietary products are based on similar algorithms and network architectures as their open-source counterparts (OpenCV, 2021). Contrary to the open-source software, it is generally unclear which datasets of images have been used to train the proprietary algorithms.
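Whatever the route, open-source and proprietary systems alike share the same core matching step: a face is reduced to an embedding vector, and two faces are declared a match when their vector similarity exceeds a tuned threshold. A minimal sketch of that step (the vectors and the 0.6 threshold are illustrative assumptions, not any vendor’s values):

```javascript
// Cosine similarity between two face-embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Two faces "match" when similarity clears a tuned decision threshold.
function isMatch(embeddingA, embeddingB, threshold = 0.6) {
  return cosineSimilarity(embeddingA, embeddingB) >= threshold;
}
```

The training data determines where in embedding space faces land, which is why the opacity of proprietary training datasets matters: the same thresholding logic can behave very differently across demographic groups.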
A broad range of deployments, which we consider in this first section, is not aimed at surveillance, but at authentication (see section 2.3 in this report), namely making sure that the person in front of the security camera is who they say they are.
As in the cases of the use of Cisco systems powered FRT in two pilot projects in high schools of Nice (see section 8.1) and Marseille (France)7, or as in the case of the Anderstorp Upper Secondary School in Skelleftea (Sweden)8, the aim of these projects was to identify students who could have access to the premises. School-wide biometric databases were generated and populated with students’ portraits. Gates were fitted with cameras connected to facial recognition technology and allowed access only to recognised students. Another documented use has been for the Home Quarantine App (Hungary), in which telephone cameras are used by authorities to verify the identity of the persons logged into the app (see also section 10.1).
+As in the cases of the Cisco Systems-powered FRT used in two pilot projects in high schools in Nice (see section 8.1) and Marseille (France)7, or in the case of the Anderstorp Upper Secondary School in Skellefteå (Sweden)8, the aim of these projects was to identify students who were allowed access to the premises. School-wide biometric databases were generated and populated with students’ portraits. Gates were fitted with cameras connected to facial recognition technology and allowed access only to recognised students. Another documented use is the Home Quarantine App (Hungary), in which telephone cameras are used by authorities to verify the identity of the persons logged into the app (see also section 10.1).
In these deployments, people must present themselves to the camera in order to be identified and gain access. While these techniques of identification pose important threats to the privacy of the small groups of users concerned (in both high school cases, DPAs banned the use of FRT), and run the risk of false positives (unauthorised people recognised as authorised) or false negatives (authorised people not recognised as such), the risk of biometric mass surveillance strictly speaking is low to non-existent because of the nature of the acquisition of images and other sensor-based data.
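The false-positive and false-negative rates just mentioned are simple ratios over trial outcomes. A minimal sketch with made-up trial data (the record shape is an illustrative assumption):

```javascript
// Each trial pairs the ground truth ("authorised") with the system's
// decision ("admitted"); the rates are errors per relevant population.
function errorRates(trials) {
  let fp = 0, fn = 0, positives = 0, negatives = 0;
  for (const t of trials) {
    if (t.authorised) {
      positives++;
      if (!t.admitted) fn++;   // false negative: authorised, not recognised
    } else {
      negatives++;
      if (t.admitted) fp++;    // false positive: unauthorised, let through
    }
  }
  return {
    falsePositiveRate: negatives ? fp / negatives : 0,
    falseNegativeRate: positives ? fn / positives : 0,
  };
}
```

Note the denominators differ: a system can have a low false-positive rate and still be unusable if its false-negative rate locks out enrolled students daily.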
-However, other forms of live authentication tie in with surveillance practices, in particular various forms of blacklisting. With blacklisting the face of every passer-by is compared to a list of faces of individuals who have been rejected access to the premises. In such an instance, people do not have to be identified, as long as an image of their face is provided. This has been used in public places, for example in the case of the Korte Putstraat in the Dutch city of 's-Hertogenbosch: during the carnival festivities of 2019 two people were rejected access to the street after they were singled out by the system (Gotink, 2019). It is unclear how many false positives were generated during this period. Other cases of blacklisting can be found at, for example, access control at various football stadiums in Europe, see also section 3.3. In many cases of blacklisting, individuals do not enrol voluntarily.
+However, other forms of live authentication tie in with surveillance practices, in particular various forms of blacklisting. With blacklisting, the face of every passer-by is compared to a list of faces of individuals who have been denied access to the premises. In such an instance, people do not have to be identified by name; an image of their face suffices. This has been used in public places, for example in the Korte Putstraat in the Dutch city of 's-Hertogenbosch: during the carnival festivities of 2019, two people were denied access to the street after being singled out by the system (Gotink, 2019). It is unclear how many false positives were generated during this period. Other cases of blacklisting can be found in, for example, access control at various football stadiums in Europe (see also section 3.3). In many cases of blacklisting, individuals do not enrol voluntarily.
Biometric systems for the purposes of authentication are also increasingly deployed for forensic applications among law-enforcement agencies in the European Union. The typical scenario for the use of such technologies is to match the photograph of a suspect (extracted, for example, from previous records or from CCTV footage) against an existing database of known individuals (e.g., a national biometric database, a driver’s license database, etc.) (TELEFI, 2021). The development of these forensic authentication capabilities is particularly relevant to this study, because it entails making large databases searchable on the basis of biometric information.
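This forensic scenario is a 1:N search rather than a 1:1 check: a probe template is compared against every template in the database and the closest candidates are returned for human review. A minimal sketch, assuming templates are numeric vectors and using squared Euclidean distance (both illustrative choices, not any agency’s actual system):

```javascript
// Rank database records by distance to a probe template and return the
// top-k candidates for a human examiner to review.
function topCandidates(probe, database, k = 3) {
  const dist = (a, b) => a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0);
  return database
    .map(rec => ({ id: rec.id, score: dist(probe, rec.template) }))
    .sort((x, y) => x.score - y.score)   // smallest distance first
    .slice(0, k);
}
```

The cost of each query grows with N, which is why database size (a few hundred school portraits versus tens of millions of national records) changes both the engineering and the error characteristics of the search.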
-To date, 11 out of 27 member states of the European Union are using facial recognition against biometric databases for forensic purposes: Austria (EDE)9, Finland (KASTU)10, France (TAJ)11, Germany (INPOL)12, Greece (Mugshot Database)13, Hungary (Facial Image Registry)14, Italy (AFIS)15, Latvia (BDAS)16, Lithuania (HDR)17, Netherlands (CATCH)18 and Slovenia (Record of Photographed Persons)19 (TELEFI 2021).
-Seven additional countries are expected to acquire such capabilities in the near future: Croatia (ABIS)20, Czech Republic (CBIS)21, Portugal (AFIS) Romania (NBIS)22, Spain (ABIS), Sweden (National Mugshot Database), Cyprus (ISIS Faces)23, Estonia (ABIS) (TELEFI 2021).
-When it comes to international institutions, Interpol (2020) has a facial recognition system (IFRS)24, based on facial images received from more than 160 countries. Europol has two sub-units which use the facial recognition search tool and database known as FACE: the European Counter Terrorism Center (ECTC) and the European Cybercrime Center (ECC). (TELEFI, 2021 149-153) (Europol 2020)
+To date, 11 out of 27 member states of the European Union are using facial recognition against biometric databases for forensic purposes: Austria (EDE)9, Finland (KASTU)10, France (TAJ)11, Germany (INPOL)12, Greece (Mugshot Database)13, Hungary (Facial Image Registry)14, Italy (AFIS)15, Latvia (BDAS)16, Lithuania (HDR)17, Netherlands (CATCH)18 and Slovenia (Record of Photographed Persons)19 (TELEFI 2021).
+Seven additional countries are expected to acquire such capabilities in the near future: Croatia (ABIS)20, Czech Republic (CBIS)21, Portugal (AFIS), Romania (NBIS)22, Spain (ABIS), Sweden (National Mugshot Database), Cyprus (ISIS Faces)23 and Estonia (ABIS) (TELEFI 2021).
+When it comes to international institutions, Interpol (2020) operates a facial recognition system (IFRS)24, based on facial images received from more than 160 countries. Europol has two sub-units which use the facial recognition search tool and database known as FACE: the European Counter Terrorism Centre (ECTC) and the European Cybercrime Centre (EC3) (TELEFI 2021, 149-153; Europol 2020).
Only 9 countries in the EU so far have rejected or do not plan to implement FRT for forensic purposes: Belgium (see CHAPTER 6), Bulgaria, Denmark, Ireland, Luxembourg, Malta, Poland, Portugal, Slovakia.
Figure 1. EU countries’ use of FRT for forensic applications25
When it comes to databases, some countries limit the searches to criminal databases (Austria, Germany, France, Italy, Greece, Slovenia, Lithuania, UK), while other countries open the searches to civil databases (Finland, Netherlands, Latvia, Hungary).
This means that the person categories can vary substantially. In the case of criminal databases, they can range from suspects and convicts to asylum seekers, aliens, unidentified persons, immigrants and visa applicants. When civil databases are used as well, such as in Hungary, the database contains a broad range of “individuals of known identity from various document/civil proceedings” (TELEFI 2021, appendix 3).
-Finally, the database sizes, in comparison to the authentication databases mentioned in the previous section, are of a different magnitude. The databases of school students in France and Sweden, mentioned in the previous section contains a few hundred entries. National databases can contain instead several millions. Criminal databases such as Germany’s INPOL contains 6,2 million individuals, France’s TAJ 21 million individuals and Italy’s AFIS 9 million individuals. Civil databases, such as Hungary’s Facial Image Registry contain 30 million templates (TELEFI, 2021 appendix 3).
+Finally, the database sizes are of a different magnitude from those of the authentication databases mentioned in the previous section. The databases of school students in France and Sweden mentioned above contain a few hundred entries; national databases can instead contain several million. Germany’s criminal database INPOL contains 6.2 million individuals, France’s TAJ 21 million and Italy’s AFIS 9 million. Civil databases can be larger still: Hungary’s Facial Image Registry contains 30 million templates (TELEFI, 2021 appendix 3).
Authentication has also been deployed as part of integrated “safe city” solutions, such as the NEC Technology Bio-IDiom system in Lisbon and London, deployed for forensic investigation purposes. For this specific product, authentication can occur via facial recognition as well as other biometric techniques such as ear acoustics, iris, voice, fingerprint, and finger vein recognition. We currently have no public information on the use of Bio-IDiom in Lisbon or London. On NEC’s website (2021), however, Bio-IDiom is advertised as a “multimodal” identification system that has been used, for example, by the Los Angeles County Sheriff’s Department (LASD) for criminal investigations. The system “combines multiple biometric technologies including fingerprint, palm print, face, and iris recognition” and works “based on the few clues left behind at crime scenes”. In Los Angeles, “this system is also connected to the databases of federal and state law enforcement agencies such as the California Department of Justice and FBI, making it the world’s largest-scale service-based biometrics system for criminal investigation”. We do not know whether the same is true of the Portuguese and UK deployments.
In order to give a concrete example of the forensic use of biometric technology, we can take the German case. Germany has been using automated facial recognition technologies to identify criminal activity since 2008 using a central criminal information system called INPOL (Informationssystem Polizei), maintained by the Bundeskriminalamt (BKA), which is the federal criminal police office. INPOL uses Oracle Software and includes the following information: name, aliases, date and place of birth, nationality, fingerprints, mugshots, appearance, information about criminal histories such as prison sentences or violence of an individual, and DNA information. However, DNA information is not automatically recorded (TELEFI 2021).
-The INPOL database includes facial images of suspects, arrestees, missing persons, and convicted individuals. For the purpose of facial recognition, anatomical features of a person's face or head as seen on video surveillance or images are used as a material to match with data in INPOL. The facial recognition system compares templates and lists all the matches ordered by degree of accordance. The BKA has specific personnel visually analysing the system's choices and providing an assessment, defining the probability of identifying a person. This assessment can be used in a court of law if necessary (Bundeskriminalamt, n.d.). Searches in the database are conducted by using Cognitec Face VACS software (TELEFI 2021).
-As of March 2020, INPOL consists of 5,8 million images of about 3,6 million individuals. All police stations in Germany have access to this database. The BKA saves biometric data and can be used by other ministries as well, for instance, to identify asylum seekers. Furthermore, the data is shared in the context of the Prüm cooperation on an international level (mostly fingerprints and DNA patterns). Furthermore, the BKA saves DNA analysis data as part of INPOL, accessible for all police stations in Germany. That database contains 1,2 million data sets (Bundeskriminalamt, n.d.). Other recorded facial images, for instance, driver’s licenses or passports, are not included in the search, and the database is mainly used for police work (TELEFI 2021).
+In order to give a concrete example of the forensic use of biometric technology, we can take the German case. Germany has been using automated facial recognition technologies to investigate criminal activity since 2008, through a central criminal information system called INPOL (Informationssystem Polizei), maintained by the Bundeskriminalamt (BKA), the federal criminal police office. INPOL uses Oracle software and includes the following information: name, aliases, date and place of birth, nationality, fingerprints, mugshots, appearance, information about criminal histories such as prison sentences or violent behaviour, and DNA information. However, DNA information is not automatically recorded (TELEFI 2021).
+The INPOL database includes facial images of suspects, arrestees, missing persons, and convicted individuals. For the purpose of facial recognition, anatomical features of a person's face or head, as seen on video surveillance or images, are used as material to match against the data in INPOL. The facial recognition system compares templates and lists all the matches ordered by degree of similarity. The BKA has dedicated personnel who visually analyse the system's candidate matches and provide an assessment of the probability that a person has been identified. This assessment can be used in a court of law if necessary (Bundeskriminalamt, n.d.). Searches in the database are conducted using Cognitec's FaceVACS software (TELEFI 2021).
+As of March 2020, INPOL contained 5.8 million images of about 3.6 million individuals. All police stations in Germany have access to this database. The biometric data saved by the BKA can be used by other ministries as well, for instance to identify asylum seekers, and the data is shared internationally in the context of the Prüm cooperation (mostly fingerprints and DNA patterns). The BKA also saves DNA analysis data as part of INPOL, accessible to all police stations in Germany; that database contains 1.2 million data sets (Bundeskriminalamt, n.d.). Other recorded facial images, for instance from driver’s licenses or passports, are not included in the search, and the database is mainly used for police work (TELEFI 2021).
A first range of deployments of “smart” systems corresponds to what can broadly be defined as “smart surveillance”, yet does not collect or process biometric information per se26. Smart systems can be used ex post, to assist CCTV camera operators in processing large amounts of recorded information, or can guide their attention when they have to monitor a large number of live video feeds simultaneously. Smart surveillance uses the following features:
- Anomaly detection. In Toulouse (France), the City Council commissioned IBM to connect 30 video surveillance cameras to software able to “assist human decisions” by raising alerts when “abnormal events are detected” (Technopolice 2021). The request was justified by the “difficulties of processing the images generated daily by the 350 cameras and kept for 30 days (more than 10,000 images per second)”. The objective, according to the digital directorate, is “to optimise and structure the supervision of video surveillance operators by generating alerts through a system of intelligent analysis that facilitates the identification of anomalies detected, whether: movements of crowds, isolated luggage, crossing virtual barriers north of the Garonne, precipitous movement, research of shapes and colour”. All these detections are done in real time or deferred (Technopolice 2021). In other words, anomaly detection is a way to operationalise the numerical output of various computer vision-based recognition systems. Similar systems are used in the smart video surveillance deployment in Valenciennes (France) and in the Urban Surveillance Centre in Marseille.
- Object Detection. In Amsterdam, around the Johan Cruijff ArenA (stadium), the city has been experimenting with a Digitale Perimeter (digital perimeter) surveillance system. In addition to the usual features of facial recognition and crowd monitoring, the system includes the possibility of automatically detecting specific objects such as weapons, fireworks or drones. Similar features are found in Inwebit’s Smart Security Platform (SSP) in Poland.
- Feature search. In Marbella (Spain), Avigilon deployed a smart camera system aimed at providing “smart” functionalities without biometric data. Since regional law bans facial and biometric identification without consent, the software uses “appearance search”. “Appearance search” provides estimates for “unique facial traits, the colour of a person’s clothes, age, shape, gender and hair colour”. This information is not considered biometric. The individual’s features can be used to search for suspects fitting a particular profile. Similar technology has been deployed in Kortrijk (Belgium), which provides search parameters for people, vehicles and animals (Verbeke 2019).
- Video summary. Some companies offer related products, such as Briefcam with its Briefcam Review, which promise to shorten the analysis of long hours of CCTV footage by identifying specific topics of interest (children, women, lighting changes) and by making the footage searchable. The product combines face recognition, license plate recognition, and more mundane video analysis features such as the possibility to overlay selected scenes, thus highlighting recurrent points of activity in the image. Briefcam is deployed in several cities across Europe, including Vannes, Roubaix (in partnership with Eiffage) and Moirand in France.
- Object detection and object tracking. As outlined in chapter 2, object detection is often the first step in the various digital detection applications for images. An ‘object’ here can mean anything the computer is conditioned to search for: a suitcase, a vehicle, but also a person; while some products further process the detected object to estimate particular features, such as the colour of a vehicle or the age of a person. However, on some occasions — often to address concerns over privacy — only the position of the object on the image is stored. This is for example the case with the test of the One-and-a-half-meter monitor in Amsterdam (Netherlands), Intemo’s people counting system in Nijmegen (Netherlands), the KICK project in Brugge, Kortrijk, Ieper, Roeselare and Oostende in Belgium or the Eco-counter tracking cameras pilot project in Lannion (France).
- Movement recognition. Avigilon’s software that is deployed in Marbella (Spain) also detects unusual movement. “To avoid graffiti, we can calculate the time someone takes to pass a shop window,” explained Javier Martín, local chief of police in Marbella, to the Spanish newspaper El País. “If it takes them more than 10 seconds, the camera is activated to see if they are graffitiing. So far, it hasn’t been activated.” (Colomé 2019) Similar movement recognition technology is used in the ViSense deployment at the Olympic Park London (UK) and in the security camera system in Mechelen-Willebroek (Belgium). It should be noted that movement recognition can be done in two ways: projects such as the Data-lab Burglary-free Neighbourhood in Rotterdam (Netherlands)27 are based only on tracking the trajectories of people through an image (see also ‘Object detection’), while cases such as the Living Lab Stratumseind28 in Eindhoven (Netherlands) also process the movements and gestures of individuals in order to estimate their behaviour.
- In addition to image (video) based products, some deployments use audio recognition to complement the decision-making process, as for example in the Serenecity (a branch of Verney-Carron) project in Saint-Etienne (France), the smart CCTV deployment in public transportation in Rouen (France) and the smart CCTV system in Strasbourg (France). The project piloted in Saint-Etienne, for example, worked by placing “audio capture devices” (the term microphone was avoided) in strategic parts of the city. Sounds qualified as suspicious by an anomaly detection algorithm would then alert operators in the Urban Supervision Center, prompting further investigation via CCTV or the deployment of the necessary services (healthcare or police, for example) (France 3 Auvergne-Rhône-Alpes 2019).
- Emotion recognition is a rare occurrence. We found evidence of its deployment only in a pilot project in Nice (see section 8.1) and in the Citybeacon project in Eindhoven, but even then, the project was never actually tested. The original idea proposed by the company Two-I was a “real-time emotional mapping” capable of highlighting “potentially problematic or even dangerous situations”. “A dynamic deployment of security guards in an area where tension and stress are felt is often a simple way to avoid any overflow,” argues Two-I, whose “Security” software would be able to decipher some 10,000 faces per second (Binacchi 2019).
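Several of the features listed above reduce to simple rules evaluated over the positions of tracked objects rather than over biometric traits. As a purely illustrative sketch, with hypothetical track data and zone coordinates taken from no actual deployment, the 10-second shop-window rule described for Marbella could be expressed as:

```python
from dataclasses import dataclass

@dataclass
class TrackPoint:
    t: float  # timestamp in seconds
    x: float  # position in some ground-plane coordinate system
    y: float

def loitering_alert(track, zone, threshold_s=10.0):
    """Raise an alert when a tracked person stays inside a rectangular zone
    (x_min, y_min, x_max, y_max) for longer than threshold_s seconds."""
    x_min, y_min, x_max, y_max = zone
    entered_at = None
    for p in track:
        inside = x_min <= p.x <= x_max and y_min <= p.y <= y_max
        if inside:
            if entered_at is None:
                entered_at = p.t  # person just entered the zone
            elif p.t - entered_at > threshold_s:
                return True       # lingered past the threshold
        else:
            entered_at = None     # left the zone; reset the timer
    return False

zone = (0.0, 0.0, 5.0, 5.0)  # hypothetical shop-window area
loiterer = [TrackPoint(float(t), 2.0, 2.0) for t in range(13)]  # stays 12 s
passer_by = [TrackPoint(0.0, 2.0, 2.0), TrackPoint(1.0, 6.0, 2.0)]
print(loitering_alert(loiterer, zone), loitering_alert(passer_by, zone))
```

Note that only positions and timestamps are processed here, which is why vendors argue such features fall outside biometric processing.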
While some cities or companies decide to implement some of the functionalities with their existing or updated CCTV systems, several chose to centralise several of these “smart” functions in integrated systems often referred to as “safe city” solutions. These solutions do not necessarily process biometric information. This is the case for example for the deployments in TIM’s, Insula and Venis’ Safe City Platform in Venice (Italy), Huawei’s Safe City in Valenciennes (France), Dahua’s integrated solution in Brienon-sur-Armançon (France), Thalès’ Safe City in La Défense and Nice (France), Engie Inéo’s and SNEF’s integrated solution in Marseille (France), the Center of Urban Supervision in Roubaix (France), AI Mars (Madrid, in development) or NEC’s platform in Lisbon and London.
The way “Smart/Safe City” solutions work is well exemplified by the “Control room” deployed in Venice, connected to an urban surveillance network. The system is composed of a central command and control room which aggregates cloud computing systems, together with smart cameras, artificial intelligence systems, antennas and hundreds of sensors distributed on a widespread network. The idea is to monitor what happens in the lagoon city in real time. The scope of the centre’s abilities is wide-ranging. It promises to: manage events and incoming tourist flows, something particularly relevant to a city which aims to implement a visiting fee for tourists; predict and manage weather events in advance, such as the shifting of tides and high water, by defining alternative routes for transit in the city; indicate to the population in real time the routes to take to avoid traffic and better manage mobility for time optimisation; improve the management of public safety, allowing city agents to intervene in a more timely manner; control and manage water and road traffic, also for sanctioning purposes, through specific video-analysis systems; control the status of parking lots; monitor the environmental and territorial situation; collect and process data and information that allow for the creation of forecasting models and the more efficient and effective allocation of resources; and bring to life a physical “Smart Control Room” where law enforcement officers also train and learn how to read data (LUMI 2020).
Integrated solutions can entail smartphone apps used to connect citizens with the control and command centres. This is for example the case in Nice with the (failed) Reporty App project (see Chapter 5) and in the Dragonfly project (Hungary) (see Chapter 10); such an app was also part of the original plan of Marseille’s Safe City project.
Integrated solutions generally comprise a set of crowd management features, as in the case of the systems in Valenciennes and Marseille (France), Mannheim (Germany), Venice (Italy), and Amsterdam, Eindhoven and Den Bosch with the Crowdwatch project (Netherlands). Such crowd management software generally does not recognise individuals, but rather estimates the number of people in (a part of) the video frame. Sudden movements of groups or changes in density are then flagged for the attention of the security operator (Nishiyama 2018).
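The crowd management logic described here can be illustrated with a minimal sketch: given a sequence of per-frame crowd-count estimates (the numbers below are hypothetical, not data from any deployment), flag frames where the estimated density changes abruptly so an operator can take a look.

```python
def flag_density_changes(counts, rel_change=0.5):
    """Return indices of frames where the estimated crowd count jumps by more
    than rel_change (50% by default) relative to the previous frame."""
    flagged = []
    for i in range(1, len(counts)):
        prev = max(counts[i - 1], 1)  # avoid division by zero on empty frames
        if abs(counts[i] - counts[i - 1]) / prev > rel_change:
            flagged.append(i)
    return flagged

# A sudden influx between frames 3 and 4 is flagged; gradual drift is not
print(flag_density_changes([20, 21, 22, 23, 60, 61]))  # → [4]
```

No individual is identified at any point; the only input is an aggregate count per frame, which is what distinguishes this class of systems from biometric identification.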
Here are the documented cases of remote biometric identification (RBI) in public spaces we could find through our research:
- Live Facial Recognition pilot project in Brussels International Airport / Zaventem (Belgium, see detailed case study, CHAPTER 6)
- Live Facial Recognition in Budapest (Hungary, see detailed case study, CHAPTER 10)
- Live Facial Recognition pilot project during the Carnival in Nice (France, see detailed case study, CHAPTER 8)
- Live Facial Recognition Pilot Project Südkreuz Berlin (Germany, see detailed case study, CHAPTER 9)
- Live Facial Recognition during Carnival 2019 in 's-Hertogenbosch’s Lange Putstraat (the Netherlands)
- Live Facial Recognition during Carnival 2019 in 's-Hertogenbosch’s Korte Putstraat (the Netherlands)
As most of these cases are extensively discussed in the following chapters, we do not comment further on them here.
Taking particular issue with Article 4 and the possible exemptions to regulation of AI “in order to safeguard public safety”, they urge the European Commission “to make sure that existing protections are upheld and a clear ban on biometric mass surveillance in public spaces is proposed. This is what a majority of citizens want” (Breyer et al. 2021).
European Digital Rights (EDRi), an umbrella organisation of 44 digital rights NGOs in Europe, takes a radical stance on the issue. They argue that mass processing of biometric data in public spaces creates a serious risk of mass surveillance that infringes on fundamental rights, and they therefore call on the Commission to permanently stop all deployments that can lead to mass surveillance. In their report Ban Biometric Mass Surveillance (2020), they demand that the EDPB and national DPAs “publicly disclose all existing and planned activities and deployments that fall within this remit” (EDRi 2020, 5). Furthermore, they call for a halt to all planned legislation establishing biometric processing, as well as to the funding of all such projects, amounting to an “immediate and indefinite ban on biometric processing”.
La Quadrature du Net (LQDN), one of EDRi’s founding members (created in 2008 to “promote and defend fundamental freedoms in the digital world”), similarly called for a ban on any present and future use of facial recognition for security and surveillance purposes. Together with a number of other French NGOs monitoring legislation impacting digital freedoms, as well as other collectives, companies, associations and trade unions, LQDN initiated a joint open letter calling on French authorities to ban any security and surveillance use of facial recognition due to its uniquely invasive and dehumanising nature. In their letter they point to the fact that in France there is a “multitude of systems already installed, outside of any real legal framework, without transparency or public discussion”, referring, among others, to the PARAFE system and the use of FRTs by civil and military police. As they put it:
“Facial recognition is a uniquely invasive and dehumanising technology, which makes possible, sooner or later, constant surveillance of the public space. It creates a society in which we are all suspects. It turns our face into a tracking device, rather than a signifier of personality, eventually reducing it to a technical object. It enables invisible control. It establishes a permanent and inescapable identification regime. It eliminates anonymity. No argument can justify the deployment of such a technology.”
Belgium is one of two European countries that have not yet authorised the use of FRT; however, law enforcement is strongly advocating for its use, and the current legal obstacles to its implementation might not hold for very long given the political pressure.
In 2017, unbeknownst to the Belgian Supervisory Body for Police Information (COC), Brussels International Airport acquired 4 cameras connected to a facial recognition software for use by the airport police. Though the COC subsequently ruled that this use fell outside of the conditions for a lawful deployment, the legality of the airport experiment fell into a legal grey area because of the ways in which the technology was deployed.
One justification for the legality of the airport experiment from the General Commissioner of Federal Police was to compare the technological deployment to that of the legal use of other intelligent technologies such as Automated Number Plate Recognition (ANPR). Although this argument was rejected at the time, such a system could be re-instated if the grounds for interruption are no longer present in the law.
Some civil society actors in Belgium contest the legitimacy of remote biometric identification. However, current legislative activity seems to point in the direction of more acceptance for remote biometric surveillance.
Belgium is, with Spain, one of the few countries in Europe that has not authorised the use of facial recognition technology, neither for criminal investigations nor for mass surveillance (Vazquez 2020). This does not mean that it is unlikely to change its position in the very near future. Law enforcement is indeed strongly advocating its use, and the current legal obstacles are not likely to hold for very long (Bensalem 2018). The pilot experiment that took place in Zaventem / Brussels International Airport, although aborted, occurred within a national context in which biometric systems are increasingly used and deployed.
Belgium will, for example, soon roll out the new biometric identity card “eID” at the national level, Minister of the Interior Annelies Verlinden recently announced. The identification document, which will rely on the constitution of a broad biometric database and is part of a broader European Union initiative, was developed in partnership with the security multinational Thales and has already been trialled with 53,000 citizens (Prins 2021; Thales Group 2020).30
Municipalities in different parts of the country are experimenting with Automated Number Plate Recognition (ANPR) technology. A smaller number have started deploying “smart CCTV” cameras, which fall just short of using facial recognition technology. The city of Kortrijk has, for example, deployed “body recognition” technology, which uses the walking style or clothing of individuals to track them across the city’s CCTV network. Facial recognition is possible with these systems, but has not yet been activated, pending legal authorisation to do so. In the city of Roeselare, “smart cameras” have been installed in one of the shopping streets. Deployed by telecom operator Citymesh, they could provide facial recognition services, but are currently used to count and estimate crowds, data which is shared with the police (van Brakel 2020). All the emerging initiatives of remote biometric identification are, however, pending a reversal of the decision to halt the experiment at Zaventem Brussels International Airport.
The use of facial recognition technology at Brussels International Airport was announced on 10 July 2019 in the Flemish weekly Knack by General Commissioner of Federal Police Marc De Mesmaeker (Lippens and Vandersmissen 2019). There is currently no publicly available information as to who provided the technical system. De Mesmaeker explained that an agreement had been found with the company managing the airport and the labour unions, and thus that the technology was already in use (Organe de Controle de l'Information Policière 2019, 3).
As part of the justification for the deployment of FRT in Zaventem, De Mesmaeker made a comparison with ANPR-enabled cameras, arguing that “They have already helped to solve investigations quickly, (…). Citizens understand this and have learned to live with their presence, but privacy remains a right”. (7sur7 2019)
The Belgian Supervisory Body for Police Information (COC)31, in its advisory document, explained that it had no prior knowledge of the deployment and learned about the existence of the facial recognition systems through the interview of De Mesmaeker in the Knack magazine (Organe de Controle de l'Information Policière 2019, 3). On 10 July 2019, the COC thus invited the General Commissioner to communicate all the details of the deployment of this technology in the Brussels International Airport. On 18 July 2019, COC received a summary of the system’s main components. On 9 August 2019, it subsequently visited the premises of the federal police deployment in Zaventem airport (Organe de Controle de l'Information Policière 2019, 3).
We know some technical details about the system through the public information shared by the COC. In early 2017, Brussels airport had acquired 4 cameras connected to a facial recognition software for use by the airport police (Police Aéronautique, LPA) (Farge 2020, 15; Organe de Controle de l'Information Policière 2019, 3). The system works in two steps.
When provided with video feeds from the four cameras, the software first creates snapshots, generating individual records with the faces that appear in the frame. These snapshots on record are then in a second step compared and potentially matched to previously established “blacklists” created by the police itself (the reference dataset is thus not external to this particular deployment) (Organe de Controle de l'Information Policière 2019, 3).
The system did not, however, live up to its promise and generated a high number of false positives. Features such as skin colour, glasses, moustaches, and beards led to false matches. The system was thus partially disconnected in March 2017, and at the time of the COC’s visit it was no longer fully in use (Organe de Controle de l'Information Policière 2019, 3). Yet while the second step (matching video feeds against pre-established blacklists of faces) had been de-activated, the first function, creating a biometric record of the video feeds, was still in place (Organe de Controle de l'Information Policière 2019, 3).
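The two-step design described by the COC can be sketched generically: step one creates a record per detected face, step two compares each record against a police-compiled blacklist. This is an illustrative reconstruction, not the deployed software; the templates, names and threshold below are hypothetical, and the deliberately loose threshold shows how coarse shared features (skin colour, glasses, beards) can produce the kind of false positives the COC reported.

```python
import numpy as np

def snapshot_records(frames):
    """Step 1: create a biometric record (here: a raw feature vector) per face.
    A real system would run face detection and template extraction per frame."""
    return [f.astype(float) for f in frames]

def match_blacklist(records, blacklist, threshold=0.8):
    """Step 2: compare each record against the blacklist and report every
    pairing whose similarity exceeds the threshold."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    hits = []
    for i, rec in enumerate(records):
        for name, tpl in blacklist.items():
            if cos(rec, tpl) >= threshold:
                hits.append((i, name))
    return hits

blacklist = {"suspect-A": np.array([1.0, 1.0, 0.0])}  # hypothetical template
records = snapshot_records([
    np.array([1.0, 0.9, 0.1]),   # genuine match
    np.array([0.9, 1.0, 0.2]),   # different person with similar coarse features
])
# With a loose threshold, both records match: one true positive, one false positive
print(match_blacklist(records, blacklist, threshold=0.8))
```

The privacy concern the COC raised maps onto step one: even with matching switched off, `snapshot_records` keeps producing biometric records from the live feed.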
The legality of the Zaventem airport experiment fell into a legal grey area, but eventually the COC ruled that it fell outside of the conditions for a lawful deployment.
The right to privacy is enshrined in Article 22 of the Belgian Constitution, which reads as “everyone has the right to the respect of his private and family life, except in the cases and conditions determined by the law.” The ECHR and the case law of the ECtHR have had considerable influence over the interpretation of Article 22 of the Belgian Constitution (Lavrysen et al. 2017) and thus the right enshrined therein can be broadly construed to encompass the right to protection of personal data and to address risks associated with the use of new technologies (Kindt et al. 2008; De Hert 2017). Articles 7 and 8 of the Charter are also relevant where the legislator acts within the scope of EU law (Cour constitutionnelle, N° 2/2021, 14 January 2021).
Belgium adapted its data protection law to the GDPR by enacting the Act of 30 July 2018 on the Protection of Natural Persons with regard to the Processing of Personal data (the Data Protection Act). The same act implements the LED, as well.
In regard to processing of sensitive data for non-law enforcement purposes, the Act sets out certain processing activities which would be regarded as necessary for reasons of substantial public interest, which is one of the lawful grounds listed in Article 9 of the GDPR to process said data. Overall, the relevant public interest purposes relate to processing by human rights organisations in relation to their objective of defending and promoting human rights and fundamental freedoms and in relation to an offence in relation to missing persons or sexual exploitation (Article 8, §1, the Data Protection Act). A separate data processing purpose for personal data of sexual life of the data subject is introduced in relation to the statutory purpose of evaluating, supervising, and treating persons whose sexual behaviour may be qualified as a criminal offence (Article 8, §1, 3°, the Data Protection Act).
Key points
The Fieldlab Burglary Free Neighbourhood is a public-private collaboration with two aims: to detect suspicious behaviour and to influence the behaviour of the suspect. While the system of smart streetlamps does collect some image and sound-based data, it does not record any characteristics specific to the individual.
From a legal perspective, there is a question as to whether or not the data processed by the Burglary Free Neighbourhood programme qualifies as personal data and thus would fall within the scope of data protection legislation.
It is contested whether forms of digital monitoring and signalling are actually the most efficient methods for preventing break-ins. Despite the aims of the programme, to date the streetlights have only been used to capture data for the purposes of machine learning.
The infrastructure installed for the experiments can potentially be used for more invasive forms of monitoring. During the project, local police, for example, already voiced an interest in access to the cameras.
In March 2021, the Fieldlab trial ended. The data collected over the course of the project was not sufficient to enable the computer to distinguish suspicious trajectories. The infrastructure of cameras and microphones is currently disabled, yet remains in place.
In October 2019, the Carlo Collodihof, a courtyard in the Rotterdam neighbourhood of Lombardijen, was equipped with a new kind of streetlamp. The twelve new luminaires did not just illuminate the streets; they were fitted with cameras, microphones, speakers, and a computer connected to the internet. They are part of the so-called Fieldlab Burglary Free Neighbourhood: an experiment in the public space with technologies for computer sensing and data processing, aimed at preventing break-ins, robberies, and aggression; increasing the chances of catching perpetrators; and increasing a sense of safety for the inhabitants of the neighbourhood (Redactie Inbraakvrije Wijk 2019; Kokkeler et al. 2020b). The practical nature of a Fieldlab provides a way to examine concretely how the various technologies come together, and how they fit in with existing infrastructures and regulations.
The national programme Burglary Free Neighbourhood was initiated and funded by the Dutch Ministry of Justice and Security. It is led by DITSS (Dutch Institute for Technology, Safety & Security), a non-profit organisation that has been involved in earlier computer sensing projects in the Netherlands – for example in Stratumseind, Eindhoven (The Hague Security Delta 2021). Other parties involved include the municipality of Rotterdam, the police – both on a local and national level – the Public Prosecutor’s Office, and the insurance company Interpolis. Part of the research is carried out by the University of Twente, Avans Hogeschool, the Network Institute of the Vrije Universiteit Amsterdam, and the Max Planck Institute for Foreign and International Criminal Law (Freiburg, Germany).
Figure 2. Fieldlab in Rotterdam Lombardijen
From a technological perspective, the project has two aims: to detect suspicious behaviour, and in turn, to influence the behaviour of the suspect. As such, project manager Guido Delver, who agreed to be interviewed for this report, describes the project as being primarily a behavioural experiment (Delver 2021). The twelve luminaires are provided by Sustainder (their Anne series (Sustainder 2021)). The processing of the video and audio is done on the spot by a computer embedded in the luminaire, using software from the Eindhoven-based company ViNotion (ViNotion 2020). This software reads the video frames from the camera and estimates the presence and position of people – thereby mapping the coordinates of the video frame to coordinates in the space. It then determines the direction they are facing. Only these values – position and direction – and no other characteristics nor any images, are sent over the internet to a datacentre somewhere in the Netherlands, where the position data is stored for further processing (Delver 2021).
Currently, there is no immediate processing of the position data to classify behaviour as suspicious or not. The proposed pipeline consists of two stages: first, an unsupervised machine learning algorithm for anomaly (outlier) detection processes the gathered trajectories, in order to distinguish trajectories that statistically deviate from the norm. As an example, both children playing and burglars making a scouting round through the neighbourhood can potentially produce anomalous trajectories. Secondly, these anomalous trajectories are judged as suspicious or not by a computer model trained with human supervision. In the Fieldlab’s first data collection experiment 100.000 trajectories were collected, totalling 20.000.000 data points (Hamada 2020). It turned out, however, that this was still too few to draw any conclusions about the viability of the approach; the big data was still too small (Delver 2021).
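The first, unsupervised stage of such a pipeline can be illustrated in a few lines of code. The sketch below is purely illustrative and assumes scikit-learn's IsolationForest and hand-picked trajectory features (path length, net displacement, and their ratio); the Fieldlab's actual models, features, and data are not public.

```python
# Illustrative sketch of stage one of the pipeline described above: an
# unsupervised anomaly detector flags statistically deviant trajectories.
# Stage two (judging flagged trajectories with a supervised, human-labelled
# model) is only indicated in a comment. All names and features here are
# assumptions for the sake of the example, not the Fieldlab's implementation.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

def trajectory_features(traj):
    """Summarise a trajectory (an (n, 2) array of x/y positions) as a
    fixed-length feature vector: path length, net displacement, and the
    directness of travel."""
    steps = np.diff(traj, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()
    net_displacement = np.linalg.norm(traj[-1] - traj[0])
    # A scouting round covers a lot of distance but ends near where it
    # started, so this ratio is low; a direct crossing scores near 1.
    directness = net_displacement / max(path_length, 1e-9)
    return [path_length, net_displacement, directness]

# Synthetic data: 200 direct crossings plus 5 looping "scouting" rounds.
direct = [np.linspace([0, 0], [10, 10], 50) + rng.normal(0, 0.1, (50, 2))
          for _ in range(200)]
t = np.linspace(0, 4 * np.pi, 50)
loops = [np.c_[np.cos(t), np.sin(t)] * 3 + rng.normal(0, 0.1, (50, 2))
         for _ in range(5)]

X = np.array([trajectory_features(tr) for tr in direct + loops])
detector = IsolationForest(contamination=0.03, random_state=0).fit(X)
labels = detector.predict(X)  # -1 = anomalous, 1 = normal

# Stage two (not shown) would present only the anomalous trajectories
# to a supervised model trained with human labels.
print("anomalies flagged:", int((labels == -1).sum()))
```

Note that, as in the Fieldlab's example, both a playing child and a scouting burglar could produce the looping trajectories flagged here; the unsupervised stage only finds statistical outliers, which is precisely why a second, supervised judgement stage is proposed.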
Another input for detecting suspicious situations is the microphone with which some of the streetlamps are equipped. By recording two frequencies of sound, sounds can be categorised as coming from, for example, a conversation, shouting, a dog barking, or the breaking of glass. The two frequencies recorded provide too little information to distinguish the words in a conversation (Delver 2021).
Aside from experimenting with the automated detection of suspicious behaviour, the Fieldlab experiments with various ways of responding to the detected situations. Project manager Guido Delver notes that the aim is not per se to involve the police. Instead, the suspect should be deterred before any crime is committed (Delver 2021). Various strategies are laid out: the yet-to-be-autonomous system can voice warnings through the speakers embedded in the streetlamps. Or, in line with the work of DITSS in Eindhoven’s Stratumseind street, the light intensity or colour of the streetlamps can be changed (Intelligent Lighting Institute, n.d.). Both strategies are aimed at signalling to the subjects that their behaviour has been noticed, which generally suffices to make burglars break off their scouting. Another option under consideration is to send a signal to the residents living nearby.
The process of data gathering in the Burglary Free Neighbourhood is quite similar to technologies that are deployed for anonymous people counting. One such application has been developed by Numina and is deployed in the Dutch city of Nijmegen: individuals are traced through space and time, but not identified or categorised. This information is then used to provide statistics about the number of visitors in the city centre (Schouten and Bril 2019). Another Dutch deployment of technologically similar software is the One-and-a-half-meter monitor developed by the municipality of Amsterdam, which is based on the YOLOv5 object detection algorithm and trained on the COCO dataset. This data processing architecture can detect the presence of persons but is incapable of deducing any characteristics (Amsterdam-Amstelland safety region 2020). These implementations show that biometric data can be used to detect the presence of people while refraining from storing those characteristics.
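The privacy-preserving pattern shared by these deployments – recording that a person is present and where, while discarding everything that could identify them – can be sketched as follows. The class index convention (0 = "person") follows the COCO label set used by detectors such as YOLOv5, but the function name and detection format are illustrative assumptions, not taken from any deployed system.

```python
# Rough sketch of privacy-by-design people counting: from raw detector
# output, keep only the fact that a person is present and where, and
# discard the pixels and any appearance attributes. The detection tuple
# format and the helper name are illustrative assumptions.

PERSON_CLASS = 0  # index of "person" in the COCO label set

def anonymous_counts(detections):
    """Reduce raw detections [(class_id, confidence, bbox), ...] to an
    anonymous record: a count and bounding-box centres only."""
    people = [d for d in detections
              if d[0] == PERSON_CLASS and d[1] >= 0.5]
    centres = [((x1 + x2) / 2, (y1 + y2) / 2)
               for _, _, (x1, y1, x2, y2) in people]
    # Note what is *not* kept: no image crops, no colours, no sizes --
    # nothing that could link a detection to an individual.
    return {"count": len(people), "positions": centres}

# Example frame: two confident person detections, one car, one weak person.
frame = [(0, 0.91, (10, 20, 50, 120)),
         (0, 0.77, (100, 30, 140, 130)),
         (2, 0.88, (200, 40, 400, 160)),   # class 2 = "car" in COCO
         (0, 0.32, (300, 10, 330, 90))]    # below confidence threshold
print(anonymous_counts(frame))
# → {'count': 2, 'positions': [(30.0, 70.0), (120.0, 80.0)]}
```

The design choice illustrated here mirrors the Fieldlab's dilemma discussed below: adding even coarse appearance features to these records (clothing colour, estimated height) would make cross-camera tracking possible, but would also reintroduce exactly the identifying characteristics the architecture is meant to avoid storing.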
Figure 3. The one-and-a-half-meter monitor developed by the municipality of Amsterdam
The Fieldlab Burglary Free Neighbourhood programme shows how data can be used to conduct monitoring and nudging of individuals’ behaviours. From a legal point of view, the question is whether the data processed in the context of the programme qualifies as personal data and would thus fall within the scope of data protection legislation.
The Constitution for the Kingdom of the Netherlands provides for a general right to the protection of privacy in Article 10, according to which restrictions to that right must be laid down by law. The GDPR Implementation Act (Uitvoeringswet Algemene Verordening Gegevensbescherming, UAVG), together with the Police Data Act (Wet Politiegegevens) and the Judicial Data and Criminal Records Act (Wet Justitiële en Strafvorderlijke Gegevens), which implement the GDPR and the LED, provides the legal framework regarding privacy and data protection.
The definition of personal data as enshrined in the GDPR and the LED is directly applicable under the Dutch law. To qualify data as such, “any information” must relate to an identified or identifiable natural person. Based on the data that can be captured by the Fieldlab programme, two elements of this definition need further attention.
“Information relating to a natural person”. The former Article 29 Working Party (2007) substantiated this element by noting that information can relate to an individual based on its content (i.e., the information is about the individual), its purpose (i.e., the information is used or likely to be used to evaluate, treat in a certain way, or influence the status or behaviour of an individual), or its result (i.e., the information is likely to have an impact on a certain person’s rights and interests, taking into account all the circumstances surrounding the precise case). These three alternative notions for determining whether information relates to an individual were endorsed by the CJEU in its Nowak decision (C-434/16), where it dealt with the purpose (i.e., it evaluates the candidate’s competence) and the result (i.e., it is used to determine whether the candidate passes or fails, which can have an impact on the candidate’s rights) of the information in question in determining whether the written answers to an exam would qualify as personal data. In brief, in determining whether the data captured by the Fieldlab programme qualify as personal data, the context in which the data is used or captured is important. Information about the level of crowding or sound could “relate” to an individual if it is used to evaluate or influence the behaviour of a person (based on its purpose), or to affect a person’s rights (based on its result) (Galič and Gellert 2021).
Schuilenburg frames the interest of cities in technologies such as those used in the Burglary Free Neighbourhood as part of the well-marketed narrative of the “smart city” that is sold by technology companies: “no city wants to be dumb” (“Nieuwsuur” 2020b, 36m). To some extent, Guido Delver positions the project’s privacy-by-design methodology in contrast to many of these commercial products for surveillance. In his conversations with various municipalities he recognises, and shares, the interest in “smart” surveillance technologies. However, Delver attempts to minimise the data gathering in the Burglary Free Neighbourhood. This proves to be a constant negotiation; for example, the police have voiced an interest in access to the camera feeds in case suspicious behaviour is detected. However, access to the camera feeds has been deliberately kept outside the scope of the project (Delver 2021).
While the project currently only stores the position of passers-by, there are also technical reasons to capture more information. For example, the video cameras cannot cover the entire area; as no characteristics of individuals are stored, tracking people from one camera to the next is therefore problematic. This raises the question of whether biometric measurements such as a person’s estimated volume, height, or colour of clothing should be recorded, which would allow the computer to link the trace of one camera to another. This poses ethical and legal questions for the project: what are the legal ramifications of deducing and (temporarily) storing these characteristics, and for how long should they be stored (Delver 2021)? Even for projects that decide to consider privacy by design, it can be tempting to store and process biometric information. However, as mentioned above (see section 7.2.), the challenges in determining whether the Fieldlab or any other similar initiative processes personal data as defined in the GDPR raise questions about the extent to which these programmes fall within the scope of data protection legislation, irrespective of the fact that they may be designed to affect the personal autonomy of individuals (as opposed to an identified or identifiable individual) by influencing and nudging their behaviours.
Finally, commentators have pointed out the discrepancy between what is expected of the technology and what it is actually doing. For example, the Algemeen Dagblad (Krol 2019) writes that the “smart streetlights” are able to “recognise behaviour” and to “sound the alarm” if necessary, whereas up until now the streetlights have only been used to capture data for machine learning.
These observations raise the question of whether the communication about the technologies used suffices. When entering the neighbourhood, a sign signals to the visitor that the Fieldlab is operative; however, much of the information discussed above could not be found on the website mentioned on the sign – as is indicated by the breadth of references used. This situation is substantially different from the way in which, for example, the city of Amsterdam lays out its use of algorithms: one website presents the goals of the projects, the kinds of data processing taking place, the datasets on which the algorithms are trained, and in some cases the source code (Amsterdam Algoritmeregister, 2021). The Dutch government is currently drafting regulations for a national register of cameras and sensors deployed by municipalities (Nieuwsuur 2020b).
Several French cities have launched “safe city” projects involving biometric technologies; however, Nice is arguably the national leader. The city currently has the highest CCTV coverage of any city in France and has more than double the police agents per capita of the neighbouring city of Marseille.
Through a series of public-private partnerships the city began a number of initiatives using RBI technologies (including emotion and facial recognition). These technologies were deployed for both authentication and surveillance purposes with some falling into the category of biometric mass surveillance.
One project which used FRT at a high school in Nice and one in Marseille was eventually declared unlawful. The court determined that the required consent could not be obtained due to the power imbalance between the targeted public (students) and the public authority (public educational establishment). This case highlights important issues about the deployment of biometric technologies in public spaces.
The use of biometric mass surveillance by the mayor of Nice, Christian Estrosi, has put him on a collision course with the French Data Protection Authority (CNIL) as well as human rights and digital rights organisations (Ligue des Droits de l’Homme, La Quadrature du Net). His activities have raised both concern and criticism over the use of the technologies and their potential impact on the privacy of personal data.
Although several French cities such as Paris, Valenciennes and Marseille have launched pilot “safe city” projects involving biometric technologies (facial, voice, and sound recognition), the city of Nice is perhaps the national leader in the experimentation with such technologies at a local level (Nice Premium 2017). The mayor of Nice, Christian Estrosi (Les Républicains Party, right), a prominent figure on the national political scene, has made clear his intention to make Nice a “laboratory” of crime prevention (Barelli 2018). Since 2010, 1.962 surveillance cameras have been deployed throughout the city, making it the city with the highest CCTV coverage in France (27 cameras per square kilometre). Nice also has the most local police officers per inhabitant in France: 414 agents for a population of 340.000 (in comparison, the neighbouring city of Marseille has 450 agents for 861.000 inhabitants).
Nice has experimented with various initiatives related to remote biometric identification – many of which fall into the category of biometric mass surveillance. In 2017, Christian Estrosi announced a partnership with the energy company Engie Ineo for the development of an Urban Surveillance Centre (Centre de Surveillance Urbain, CSU). Based on a touch-interface technology, it centralises a platform of real-time data such as traffic accidents, patrol locations, and video feeds from CCTV cameras on the streets and in public transportation (Dudebout 2020, 1). The video feeds from the city tramways are connected to an emotion recognition algorithm to flag suspicious situations (Allix 2018).
In June 2018, an additional step was taken with the signing of a partnership agreement for a “Safe City” project with a consortium of companies headed by Thales, specialised in social network intelligence, geolocation, biometrics, and crowd simulation (Dudebout 2020, 2). Established for three years (2018-2021) with a budget of EUR 10,9 million, the project is financed by the city council, subsidised in part by BPI France, and supported by the Committee for the Security Industrial Sector, an agency under the tutelage of the Prime Minister’s office (Allix 2018; BPI France 2018).
The first facial recognition test of the Safe City project took place from 16 February to 2 March 2019, during the Nice Carnival. The experiment was a simulation, matching faces collected through CCTV footage of the crowd attending the carnival against a set of fictitious databases (lost individuals, wanted individuals, or individuals with restraining orders). The fictitious datasets were made up of 50 volunteers, recruited mostly from the municipality, who provided their pictures or were freshly photographed for the test. The system used live facial recognition software provided by the company Anyvision. The live feeds were filmed during the carnival. Passers-by (approximately 1.000 people were concerned) were informed of the ongoing test and asked to wear a bracelet if they consented to being filmed (Hassani 2019).
A second experiment took the form of a software application (app) named “Reporty”, rolled out in January 2018. The app, developed by the Israeli American company Carbyne, allows citizens to be in direct audio and video connection and share geolocation information with the Urban Supervision Centre in order to report any incivility, offense, or crime that they might witness (Barelli 2018).
The third project, involving facial recognition, was tested in the educational context. In February 2019, a high school in Nice and a high school in Marseille were fitted with facial recognition technology at their gates in order to grant or bar access to the premises. The official motivation behind the deployment was to “assist the personnel of the high schools and to fight against identity theft” (Dudebout 2020, 3–4).
The use of facial recognition systems in high schools in Nice and Marseille, which was declared unlawful by the Administrative Court of Marseille, raised important issues on the legality of deploying biometric technologies in public places.
There is no specific provision devoted to the right to privacy or data protection in the French Constitution of 1958, but constitutional safeguards for the interests protected under said rights exist. The French Constitutional Council (Conseil Constitutionnel) has recognised that the respect for privacy is protected by Article 2 of the 1789 Declaration of the Rights of Man and of the Citizen, which is incorporated in the French constitutionality bloc as binding constitutional rule (bloc de constitutionnalité) (French Constitutional Council, Decision N° 2004-492 DC of 2 March 2004). Accordingly, the collection, retention, use and sharing of personal data attracts protection under the right to privacy (French Constitutional Council, Decision n° 2012-652 DC of 22 March 2012). The limitations to that right must thus be justified on grounds of general interest and implemented in an adequate manner, proportionate to this objective (ibid).
France has updated the Act N°78-17 of 6 January 1978 on information technology, data files and civil liberties in various stages to incorporate the provisions of the GDPR, address the possible exemptions contained in the GDPR, and implement the LED.
The Act N°78-17 provides data subjects with rights regarding the processing of their personal data, with restrictions on the exercise of those rights subject to certain conditions (e.g., the restriction for protecting public security to the right to access the data processed for law enforcement purposes pursuant to Art 107 of Act N°78-17). An important data subject’s right in the context of biometric surveillance is the data subject’s right not to be subjected to solely automated decision-making, including profiling, except if it is carried out in light of circumstances laid out in Article 22 of the GDPR and for individual administrative decisions taken in compliance with French legislation (Article 47 of Act N°78-17). That said, for the latter circumstance, the automated data processing must not involve sensitive data (Article 47(2), Act N°78-17). Regarding the data processing operations relating to State security and defence (Article 120, Act N°78-17) and to the prevention, investigation, and prosecution of criminal offences (Article 95, Act N°78-17), the Act lays out an absolute prohibition against solely automated decision-making, according to which no decision producing legal effects or similarly significant effects can be based on said decision-making intended to predict or assess certain personal aspects of the person concerned. Particularly, with respect to data processing operations for law enforcement purposes, Article 95 of the Act prohibits any type of profiling that discriminates against natural persons based on sensitive data as laid out in Article 6.
In addition to the data protection legislation, the other legislation applicable to biometric surveillance is the Code of Criminal Procedure. Its Article R40-26 allows the national police and gendarmerie to retain in a criminal records database (Traitement des Antécédents Judiciaires, or TAJ) photographs of people suspected of having participated in criminal offences, as well as victims and persons being investigated for causes of death, serious injury or disappearance, to make it possible to use a facial recognition device. According to a 2018 report by Parliament, TAJ contains between 7 and 8 million facial images (Assemblée Nationale N°1335, 2018, 64, f.n. 2). La Quadrature du Net lodged legal complaints against the retention of facial images before the Conseil d'État, arguing that this practice does not comply with the strict necessity test required under Article 10 of the LED and Article 88 of Act N°78-17 (La Quadrature du Net, 2020).
The French digital rights organisation La Quadrature du Net was quick to highlight the problems raised by the deployment of these technologies in Nice. “The safe city is the proliferation of tools from the intelligence community, with a logic of massive surveillance, identification of weak signals and suspicious behaviour,” commented Félix Tréguer, a Marseille-based leader of La Quadrature du Net and member of the Technopolice campaign. “We do not find it reassuring that the municipal police will become the intelligence service of the urban public space and its digital double” (Allix 2018).
The Ligue des Droits de l’Homme emphasised similar points, highlighting the political dangers involved. As Henri Busquet of the Ligue des Droits de l’Homme in Nice put it: “improving emergency services and traffic is legitimate, but the generalisation of video surveillance worries us, and scrutinising social networks is not the role of a mayor. Without any safeguards, such a tool cannot demonstrate the necessary neutrality [...] It is potentially a tool for political destruction, which puts opponents and journalists at particular risk” (Allix 2018).
In July 2019, the city of Nice hoped the CNIL would provide advice related to its first test experiment during the Carnival. The CNIL responded, however, that the municipality had not provided enough information for the DPA to make an assessment. The French DPA pointed out in particular the lack of “quantified elements on the effectiveness of the technical device or the concrete consequences of a possible bias (related to gender, skin colour ...) of the software” (Dudebout 2020, 3).
The launch of the smartphone application “Reporty” was the catalyst for mobilisation in Nice, united under the umbrella organisation “Collectif anti-Reporty”. The coalition was formed by local representatives from two left-wing parties (Parti Socialiste, Les Insoumis), Tous Citoyens, the union CGT and the anti-discrimination NGO MRAP. The coalition appealed to two institutions to block the use of the application: the Defender of Rights (Défenseur des Droits) and the French DPA (CNIL). The coalition denounced “a risk of generalised denunciation and a serious breach of privacy”, calling to “put an end to the securitarian drift of Christian Estrosi” (Barelli 2018).
On 15 March 2018, the CNIL stated that the application was too invasive and did not meet the criteria set out by the legislation. It did not meet the proportionality test; it failed to fall within the frame of existing law on video-protection, due to the integration of private citizens' terminals (smartphones) with a security database managed by the police; it was excessively intrusive, due to the collection of images and voice of people in the public space; and finally, it covered a field of offences that was too broad (CNIL 2018).
The school experiment further pushed the CNIL to take a position on the technological activism of Nice’s mayor. On 29 October 2019, it expressed serious concerns over the experiment, arguing that the technology clashed with the principles of proportionality and data minimisation enshrined in the GDPR. It pointed out that other methods, less intrusive for the privacy of the students, could be used to achieve the technology’s stated goals, namely increasing the students’ security and the fluidity of traffic (Dudebout 2020, 4).
In a landmark opinion published on 15 November 2019, the CNIL clarified what it defined as guidelines related to facial recognition (CNIL 2019a). The French DPA expressed concerns over a blanket and indiscriminate use of the technologies, highlighting possible infringements of fundamental rights, because these technologies operate in the public space, where these freedoms (expression, assembly, protest) are exercised. It did not, however, suggest that they should be banned in all circumstances – it suggested instead that their use could be justified if properly regulated, on a case-by-case basis. Certain uses could be rejected a priori – such as in the case of minors, whose data are strictly protected. The question of data retention is also central: the CNIL warned against excessive retention periods and excessive centralisation, suggesting instead citizens’ control over their own data. But as the president of the CNIL, Marie-Laure Denis, explained, facial recognition technology “can have legitimate uses, and there is no firm position of the CNIL’s board” (Untersinger 2019).
In May 2020 the Hungarian authorities rolled out two digital applications, the contact-tracing app called VirusRadar (Kaszás 2020) and the Home Quarantine App (Házi Karantén Rendszer, abbreviated HKR). Both of these apps are centralised tracing apps, meaning that they send contact logs with pseudonymised personal data to a central (government) back-end server (Council of Europe 2020, 28). While VirusRadar only uses Bluetooth data and the proximity of other devices, the HKR processes biometric data when comparing facial images of its users.
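The centralised architecture described above can be illustrated with a minimal sketch. All names and data structures below are hypothetical, reconstructed from the report's description rather than taken from VirusRadar or HKR: the device logs pseudonymous identifiers observed over Bluetooth, and the whole log is uploaded to a government back-end, where the salt table needed for re-identification is held.

```python
import hashlib
import json
import time
from dataclasses import dataclass, field


def pseudonymise(real_id: str, salt: str) -> str:
    # A salted hash: only the server holding the salt can re-identify
    # users, which is what makes this architecture "centralised".
    return hashlib.sha256((salt + real_id).encode()).hexdigest()[:16]


@dataclass
class ContactLog:
    """Device-side log of pseudonymous Bluetooth contacts (hypothetical sketch)."""
    device_pseudonym: str
    contacts: list = field(default_factory=list)

    def record_contact(self, peer_pseudonym: str, rssi: int) -> None:
        # Store only the pseudonym and signal strength, not the real identity.
        self.contacts.append({"peer": peer_pseudonym, "rssi": rssi, "ts": time.time()})


def upload_payload(log: ContactLog) -> str:
    # In a centralised app this JSON payload is sent to the government back-end.
    return json.dumps({"device": log.device_pseudonym, "contacts": log.contacts})
```

The privacy-relevant design choice is that pseudonymisation is reversible by the server: a decentralised design would instead match rotating random identifiers on the device itself.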
Those who, according to the COVID-19 regulations in Hungary, are confined to home quarantine are offered the option to use the app instead of being checked by the police. For those who return from abroad, the use of the app is compulsory. But even those who can choose are encouraged by the authorities to make use of the HKR app; otherwise they are subjected to frequent visits by police agents. Once a person downloads the app, its use becomes compulsory, and failure to use it, or attempts to evade its tracking, is considered an administrative offence. From a data protection law point of view, this is a clear case where the data subject’s consent (and in the case of biometric data, their explicit consent) cannot provide the lawful ground for the processing of data through the app (see section 4.2.2). Even if the processing can be based on another lawful ground such as public interest, the punitive nature of non-compliance may raise issues in terms of adhering to the necessity test, which requires a balancing act between the objective pursued and the data subject’s interests.
The HKR app is developed by Asura Technologies and implemented by IdomSoft Ltd., the same company that provides the software and technical implementation for the nation-wide Dragonfly Project. The HKR application works with face recognition technology combined with location verification. The application sends notifications at random times prompting the user to upload a facial image while retrieving the location data of the mobile device. The user must respond within 15 minutes, and the location data must match the address registered for quarantine. In order for the Home Quarantine App to work, the user first needs to upload a facial image, which is compared by a police officer with the photo of the same individual stored in the central database. After this facial verification, the app creates a biometric template on the mobile phone of the user and the photo is deleted. Subsequent photos are only compared to this biometric template, so neither the photos nor the template leave the personal device. If there is suspicion about the identity or whereabouts of the user, a police officer visits the address to make sure that the person is adhering to the quarantine rules.
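The verification loop described above can be sketched in a few lines of Python. This is a hypothetical reconstruction from the report's description, not Asura's actual code; the template comparison and distance check are stand-in placeholders for the real biometric and GPS logic.

```python
RESPONSE_WINDOW_S = 15 * 60  # the user must respond within 15 minutes


def matches_template(photo: dict, template: dict) -> bool:
    # Placeholder for the on-device biometric comparison; per the report,
    # neither the photos nor the template ever leave the phone.
    return photo["template_id"] == template["template_id"]


def near_registered_address(location: tuple, registered: tuple,
                            tolerance_deg: float = 0.0005) -> bool:
    # Simplified coordinate check; a real app would use geodesic distance.
    return (abs(location[0] - registered[0]) < tolerance_deg
            and abs(location[1] - registered[1]) < tolerance_deg)


def quarantine_check(photo: dict, template: dict, location: tuple,
                     registered: tuple, elapsed_s: float) -> str:
    """One random-time spot check: selfie plus location, within 15 minutes."""
    if elapsed_s > RESPONSE_WINDOW_S:
        return "missed"              # no reply in time: flagged for follow-up
    if not matches_template(photo, template):
        return "identity_mismatch"   # face does not match the stored template
    if not near_registered_address(location, registered):
        return "location_mismatch"   # device is not at the quarantine address
    return "ok"
```

Any outcome other than `"ok"` would, per the report's description, trigger a police visit to the registered address.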
Interestingly, the HKR app — just like the contact tracing app VirusRadar, which was developed by Nextsense — has been “donated” to the Hungarian Government by Asura Technologies “free of charge”.
Figure 5. Snapshots from the video Home Quarantine System Short Presentation by Asura Technologies38
Criminal case history database, managed by the French Ministry of Interior↩︎
Criminal case management system, managed by the German Federal Criminal Police Office (Bundeskriminalamt)↩︎
Managed by the Video and Image Laboratory of the Audiovisual Evidence of the Department of Photography and Modus Operandi of the Hellenic Police Forensic Science Division↩︎
The Facial Image registry is interrogated through a search engine developed by NEC, and accessible to the National Investigation Agency, the Criminal Courts, the National Protective Service, the Counter-Terrorism Centre, the Hungarian Prison Service, the Prosecution Service of Hungary, the Public Administration, the Special Service for National Security, the Intelligence Agencies, the Hungarian Police, the Hungarian Parliamentary Guard, Hungarian Ministry of Justice, Witness Protection Service, the National Directorate-General for Aliens Policing and Institution of the President of the Republic.↩︎
Automated Fingerprint Identification System. The system can be interrogated via software developed by the company Reco 3.26, a subsidiary of Parsec 3.26. Another software package used is provided by the Japanese company NEC.↩︎
Biometric Data Processing System (criminal data array), supported by database software from RIX Technologies↩︎
Habitoscopic Data Register↩︎
Central Automatic TeChnology for Recognition of Persons, managed by the Centrum voor Biometrie, connected to the Dutch Judicial Information Service (Justid).↩︎
The database uses VeriLook and Face Trace software from the Lithuanian company Neurotechnology.↩︎
Automated Biometric Identification System, searchable by the IntellQ software from the company IntellByte, managed by the Ministry of the Interior (Croatia).↩︎
Central Biometric Information System↩︎
National Biometric Identification System↩︎
Managed by the Photographic and Graphic Laboratory of Criminalistic Services, using search software by the company Unidas↩︎