Apple Introduces Face ID and Exposes Another Discriminatory Blind Spot in “Cool” Technology

Cory Lancaster, MPP, Staff Writer, Brief Policy Perspectives

Could Face ID become a tool for authoritarianism? Photo by Aaron Hedquist, @aaronhedquistphotography

The people of America have whipped out their smartphones and called their government to the stand. Social issues that have plagued our citizens for centuries are on trial, and cell phone technology has provided a mass of incriminating evidence. Smartphones have drastically revised the strategy for social justice: they attach visuals to the externalities of policy absence, abuse, and reform, and they act as the primary medium for continuous content sharing. Cool software, intended solely for entertainment and lifestyle efficiency, has morphed into a crucial weapon against injustice. But smartphone technology, like its human creators, is flawed. Bias baked into software has produced serious harms, and discriminatory behaviors that were once only fears in the brick-and-mortar world have algorithmically evolved into covert inequity.

Smartphones, Serious Privacy Invasion

Smartphones are physical manifestations of our most private thoughts, interests, and desires; location trackers that completely relinquish our anonymity; evidence boards with years of biographical data; and gateways between our three-dimensional and internet realities. Dots are connected and narratives are framed before individuals have the opportunity to explain. This probably isn't worrisome for those who live quiet lives. But for the student activists, journalists, community organizers, hot-button-issue advocates, and minorities who generally receive the short end of the criminal justice stick, it is terrifying.

During inauguration weekend, a chaotic time of American uncertainty, officials seized protesters' smartphones, cracked their passwords, and used the call detail records, email logs, messages, messaging applications, search histories, and images and videos to charge them with conspiracy to riot. Black Lives Matter activists have been regularly monitored through cell phone GPS data that connects public social media posts to geographic tags. Stingrays, surveillance devices that mimic cell phone towers and send out signals that trick phones into transmitting their locations and identifying information, have been used to track and capture undocumented immigrants. Apps like Uber and Lyft can track location data even when consumers are not using them. Twitter and Instagram track and store data analytics to share with advertisers. Facebook even collects users' faceprints to fuel its photo-tagging feature. Google, which retains users' search histories and Gmail content, uses information gleaned from email, search results, map requests, and YouTube views to create and tailor advertising. If granted a warrant, law enforcement can gain access to all of this information.

New Technology, New Concerns

This year, both Apple and Samsung introduced new-generation smartphones with facial recognition unlocking: Apple's Face ID and Samsung's face recognition keep the device locked until the owner simply looks at it. With one glance, the user gains complete access. In the current political climate, this software warrants concern about both police and civilian misconduct. Facial recognition is not yet a strong security measure; in fact, Samsung's version was already tricked with a photograph. Moreover, facial recognition systems have a history of racial bias attributable to the lack of diversity in product development: algorithms trained mostly on white faces have higher error rates on black, Chinese, and Indian faces. No reports have been published on databases that account for transitioning members of the LGBTQIA community. Georgetown Law published a report showing that one in two adults in the U.S. has an image in a facial recognition network. Police departments can currently search these faces without regulation, using algorithms that have not been audited for accuracy. Law enforcement is also embracing machine learning for predictive policing, and some judges are using machine-generated risk scores to determine the length of prison sentences.

In a political climate where Islamophobia, racism, sexism, and gender discrimination are bigots' primary weapons of terror, it is impossible to ignore the repercussions Face ID could produce in the heightened tension of a wrongful arrest, a peaceful protest gone awry, or a chance encounter between two strangers with opposing beliefs and ethics. There is even the concern that a person with a bruised, bloodied, or swollen face may be unable to unlock his or her own phone to reach the contacts or livestreaming apps that have been the sole link between crime and justice. Here lies the double-edged sword of innovation. Apple and Samsung do allow users to opt out of biometric security, and they do disclose their data-use practices in their privacy policies. But this is not enough, because Face ID is a social-impact technology whose security concerns reach far beyond the scope of advertising.

New Innovation, No Regulation

Furthermore, there are no clear constitutional protections for biometric security measures. Adi Robertson, senior reporter for the technology news publication The Verge, writes that courts interpreting the Fifth Amendment, which protects individuals from having to incriminate themselves, have held that passcodes are "testimonial" evidence: a suspect cannot be compelled to disclose one, because doing so would mean answering a question based on the contents of one's thoughts rather than producing physical evidence. Security experts have warned, however, that fingerprints do not fall under this rule and that face scanning likely would not either. Standing there while a law enforcement officer holds a phone up to your face is not a "testimonial" act, because it does not require the suspect to reveal any information inside his or her mind. A Virginia judge, Steven Frucci, let police use a fingerprint to unlock a phone in 2014, and other courts granted similar requests in 2016 and 2017.

Although cool, face scanning is another invasive technology. It has the potential to discriminate against African Americans, Hispanics, and immigrants already battling police brutality, racial profiling, and travel bans, and its implications for transitioning members of the LGBTQIA community are also up for debate. This technology, in conjunction with those previously mentioned, is another unregulated puzzle piece in the infrastructure of authoritarianism. Technology is exploding, simultaneously expanding the negative impact of all of our human prejudices, blind spots, and flaws. Yet society stays focused on the cool factor, at least until the daily news uncovers another disenfranchised community and public servants scramble to regulate the privacy and security threats of innovation.
