The “Sensing Project” uses cameras and sensors to collect data on vehicles driving in and around Roermond, a small city in the southeastern Netherlands. An algorithm then purportedly calculates the probability that a driver and passengers intend to pickpocket or shoplift, and directs police toward the people and places it deems “high risk.”

The police present the project as a neutral system guided by objective crime data. But Amnesty found that it is specifically designed to identify people of Eastern European origin, a form of automated ethnic profiling.

The project focuses on “mobile banditry,” a term used by Dutch law enforcement to describe property crimes such as pickpocketing and shoplifting. Police claim these crimes are predominantly committed by people from Eastern European countries, particularly those of Roma ethnicity, a historically marginalized group. Amnesty says law enforcement “explicitly excludes crimes committed by people with a Dutch nationality from the definition of ‘mobile banditry’.” The watchdog discovered that these biases are deliberately embedded in the predictive policing system.

The investigation also found that the system produces many false positives, that police have not demonstrated its effectiveness, and that no one in Roermond consented to the project.

“The Dutch authorities must call for a halt to the Sensing Project and similar experiments, which are in clear violation of the right to privacy, the right to data protection and the principles of legality and non-discrimination,” said Merel Koning, senior policy officer for technology and human rights at Amnesty.