A Russian startup is building an AI that uses facial recognition to determine ethnicity, prompting fears it could be used for automated racial profiling.
NtechLab lists ‘ethnicity detection’ as an upcoming feature of its platform, promising the ability to examine a face and determine the person’s ethnicity. A promotional image, since pulled following a backlash, showed people classified as ‘European’, ‘African’, and ‘Arabic’.
The company is full of award-winning talent and the accuracy of NtechLab’s system has even been independently verified and lauded by the U.S. Department of Commerce and the University of Washington. It powers Moscow’s ~170,000 surveillance cameras and the app FindFace, which identified random strangers with 70 percent accuracy back in 2017.
While detecting ethnicity is ripe for abuse and liable to cause offence, there are some genuine use cases. In healthcare, for example, some medicines are targeted at specific ethnic groups shown to have a higher propensity for a particular disorder.
As Wikipedia notes:
“The first example of this in the U.S. was when BiDil, a medication for congestive heart failure, was licensed specifically for use in American patients that self-identify as black. Previous studies had shown that African American patients with congestive heart failure generally respond less effectively to traditional treatments than white patients with similar conditions.”
Although NtechLab is not the first to promote an ethnicity detection feature, it is receiving far more scrutiny than its predecessors did, most likely due to increased awareness of the problem of racial profiling.
Many privacy advocates, such as the American Civil Liberties Union, are against facial recognition at any level. It’s widely used in nations such as China but in the more privacy-conscious West it remains fairly taboo. Adding the ability to detect people’s ethnic background feels even more Orwellian.
Back in April 2017, former ImageNet competition winner Matthew Zeiler launched an API called ‘Demographics’ as part of Clarifai which also identified ‘multicultural appearance’.
In a blog post, Zeiler admitted the feature could be used maliciously but argued: “it’s not our business to limit what developers create and we choose to believe that most people will use our Demographics for good – like bringing attention to female representation in tech, or bringing an end to human trafficking.”
Call me a cynic, but I don’t believe any of these solutions will be mostly used for good.
What are your thoughts on ethnicity-detecting facial recognition? Let us know in the comments.