Facial recognition software is virtually inescapable.
The technology has become a critical tool for government agencies and law enforcement. The Trump administration recently announced an expansion of the technology’s use in airports for international travelers and along the border using drones. Some local police departments already use facial recognition technology, and the feature is expected to be added to body cameras. Even consumer electronics companies are building facial recognition into everything from smartphones to video games.
Since you can’t change your facial structure quite as easily as a password, the startup D-ID — short for “de-identification” — offers a solution if your facial data is ever compromised.
The company promises a “firewall for your identity”: tools that scramble image data so that facial recognition software detects only a likeness of the original image rather than an exact match.
The company was founded in 2016 by three former Israeli military special forces and intelligence officers who had to be mindful of facial recognition software when photo sharing first became popular a decade ago, TechCrunch reported.
“We couldn’t share our photos and profiles over the web because of sensitive positions. Even after we finished our service, we couldn’t share our photos when we traveled in South America,” D-ID’s CEO Gil Perry told TechCrunch.
“Now everyone needs to be aware of it. Streets today are covered by cameras, we all carry smartphones. We are being photographed all the time. When you combine all the cameras and face recognition technology, privacy is actually gone.”
D-ID’s technology makes it so that the image facial recognition software reads and stores isn’t an exact replica of an individual’s face, just similar. That way, if the data is ever compromised, a person’s true face and identity is still private. According to the company’s website, “once an image or video has been secured with D-ID’s solution, protection is complete. The image’s new version becomes the permanent, irreversible one, while maintaining visual similarity to the human eye.”
Facial recognition software is controversial because of its imperfections and its tendency to be used on people without their knowledge. The technology is prone to false positives and, thanks to the limits of cameras and image processors, disproportionately implicates people of color because the software fails to accurately detect facial features on darker skin tones.
Those problems have broad implications as law enforcement agencies increasingly use the technology. About half of all Americans are included in law enforcement databases, even if they don’t have a criminal history. Facial recognition carries a 15 percent error rate.
There are also very few guidelines in place to protect individuals’ data, and the lack of protections has led to legal challenges. In Jacksonville, Florida, Willie Allen Lynch, who was arrested for drug dealing after undercover officers photographed him last year, challenged the sheriff’s department’s use of the technology in court. The case revealed that the Jacksonville sheriff’s department had relied on the technology when making arrests without disclosing it; in Lynch’s case, the department said it had manually picked his photo from a lineup.
Law enforcement agencies have used facial recognition technology to scan crowds of protesters and individuals without reasonable suspicion, much like license plate scanners. (These indiscriminate scanners were in the news earlier this month after Florida State Attorney Aramis Ayala was pulled over in Orlando when a police officer scanned her plates. When she asked the reason for the stop, the officer replied, “Oh, we run tags all the time.”)
This catchall approach to gathering data has recently extended to border security, as the Department of Homeland Security rolls out the tech in airports across the country. When citizens raised concerns about the program, the agency said those worried about privacy and facial scans should “refrain from traveling,” ZDNet reported.
Some locales have offered stiffer resistance to the technology, disbanding biometric programs over their overly broad use. In Vermont, the Attorney General deemed the state Department of Motor Vehicles’ use of facial recognition software illegal and suspended the program indefinitely.
“We concluded that the facial recognition program used by the Vermont Department of Motor Vehicles is not fully compliant with Vermont law and should remain suspended until the Vermont legislature provides specific authorization for DMV to use biometrics,” Attorney General T.J. Donovan told NPR’s Vermont affiliate.
The review found that the DMV database directly violated a state law that prohibits the agency’s use of biometric data. The Attorney General also found the database was disproportionately used to search for individuals of color.