Facial Recognition Technology Is Raising Serious Issues
A reader, Eton, asks, “Do you have any recommendations on facial recognition startups, or thoughts on them?”
I usually save questions like this for our Mailbag. But this is such a big and basic issue, I didn’t want to wait.
Facial recognition technology is spreading like wildfire. I first started to use it when my Google Photos app made it available a couple of years ago. It makes searching for people in your photo albums quick and easy. Apple uses facial recognition to enable users to unlock their smartphones. Banks employ it to verify transactions. And supermarkets use it to try to catch underage drinkers.
The technology isn’t perfect. Using a 3D mask of somebody else’s face can trick it. Hong Kong protesters have flashed laser beams at cameras to obscure their image.
But let’s assume facial recognition technology improves. Should we welcome widespread adoption? Is it something that needs to be regulated? Or even prohibited?
Frankly, the technology makes me very nervous. We don’t have to use our imagination to see its downside. China has built a network of facial recognition cameras to track and control its minority Muslim population (the Uighurs) in Xinjiang. It’s effectively a police state. The country also uses a dense infrastructure of cameras in its major cities to crack down on victimless crimes, such as traffic violations and jaywalking. Big Brother is watching.
But that’s China, you say. It could never happen here. Really?
Misidentifying a Threat
Imagine a wave of terrorism hitting the biggest cities in the United States. Arrests are few and far between. Feeling frustrated, the police in these cities rapidly install hundreds of facial recognition cameras on virtually every street corner. (San Francisco and Oakland are the biggest of a handful of cities that have banned city agencies, including the police, from using this technology.)
Now YOU are the one being watched. And this is a scenario where facial recognition technology’s faults become dangerous. The technology is most accurate at identifying white males. It errs more frequently when identifying minorities. So if it were being used to combat foreign terrorism, you’d better believe there’d be racial profiling going on.
This past August, facial recognition technology incorrectly matched 26 California lawmakers with images from an “arrest photo database” during a test, with more than half of the misidentified politicians being “lawmakers of color,” according to the American Civil Liberties Union.
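The danger isn't just that the technology errs; it's that even small error rates become overwhelming at city scale. Here's a minimal back-of-the-envelope sketch of that base-rate problem. All the numbers below are hypothetical, chosen only for illustration; they are not drawn from any real system's accuracy figures.

```python
# Illustrative base-rate arithmetic (all numbers are hypothetical):
# even a seemingly accurate face-matching system, applied to an
# entire city, flags mostly innocent people.

population = 1_000_000      # people scanned (assumption)
suspects = 100              # actual persons of interest (assumption)
true_positive_rate = 0.99   # fraction of real matches caught (assumption)
false_positive_rate = 0.01  # fraction of innocents wrongly flagged (assumption)

true_hits = suspects * true_positive_rate                   # ~99 people
false_hits = (population - suspects) * false_positive_rate  # ~9,999 people

# Of everyone flagged, what share are actually suspects?
precision = true_hits / (true_hits + false_hits)
print(f"Total flagged: {true_hits + false_hits:,.0f}")
print(f"Share of flags that are real suspects: {precision:.1%}")
```

With these made-up numbers, roughly 10,000 people get flagged and only about 1% of them are genuine suspects; the other 99% are innocent bystanders pulled into an investigation. And if the error rates are worse for minorities, as the ACLU test suggests, those wrongful flags won't be evenly distributed.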
If facial recognition tech were used to combat domestic terrorism, everybody would be under suspicion. Is this what Americans want?
According to a recent Pew Research Center poll of 4,000 Americans, a majority say yes. Fifty-nine percent said it was acceptable for police to use the technology to assess security threats in public.
I brought this issue up with my son, Nick (he’s a cop here in Maryland), last night over dinner at my favorite kebab restaurant. I asked him if he thought it was possible that the police could use the technology to improperly profile minorities. He said it’s a legit investigative tool that’s been very useful in solving crimes.
Then I asked Nick about Sen. Ed Markey's (D-Massachusetts) letter to Amazon asking about its doorbell camera, Ring, and its relationship with law enforcement agencies. Ring has partnerships with 400 U.S. police forces, granting them access to footage from homeowners' internet-connected video cameras.
“It’s voluntary. Users opt in. I don’t see any problem,” he said rather tersely.
I’d agree, if I knew that all cops were as honest and as good at their jobs as my son. I just don’t think that’s the case.
Proceed With Caution
But, right now, there's not enough pushback to slow down adoption of this technology. In a House Committee hearing in July, officials from the Department of Homeland Security (DHS) argued that facial recognition and biometric surveillance are safe, regulated, and essential for keeping airports and U.S. borders secure.
And by 2023, DHS estimates, facial recognition will scrutinize 97% of outbound airline passengers. That high estimate suggests DHS is assuming no new regulations will restrict the technology.
These are serious issues. Keeping the peace is a big deal. Protecting our national security is one of the fundamental obligations of our government. But protecting our privacy and civil liberties is also a high-ranking moral obligation that requires constant vigilance… especially in the face of competing agendas.
Let’s proceed with the utmost caution. We’ve witnessed heinous crimes perpetrated in the name of the “motherland” (Germany, Russia, Bosnia, etc.). Giving up elements of our personal freedom should be a last-resort measure. A camera on every street corner is not the America I want for my children or grandchildren.
Let’s do our best to ensure it doesn’t come to that.
About Andy Gordon
Andy has three decades of experience in the private and public sectors as an entrepreneur and advisor. The CIA, former Maryland Governor William Donald Schaefer, and Fortune 500 companies such as Lockheed Martin and Dow Chemical have all trusted his advice. Andy founded and ran an international trade and finance company based in Asia. Upon returning to the U.S., he joined a Florida investment advisory service that quickly gained a reputation for recommending companies with outstanding value and fundamentals. Andy has taught marketing and finance courses at local Maryland universities and has written a half-dozen books on global business, published by McGraw-Hill, Frost & Sullivan and others. He now regularly shares his worldly knowledge about investing in startups, cryptocurrency and cannabis with everyday investors in the free daily e-letter, Early Investing.