The Death of Privacy

In April 2015, the death of Freddie Gray, a twenty-five-year-old black man, provoked enormous public outcry. Protests erupted across Gray’s hometown of Baltimore, with some expressing outrage at police brutality and others pleading for a systemic justice they had yet to witness. The aftermath offered a further glimpse of a bleak future, as the Baltimore police department used facial recognition technology to identify and arrest attendees of the protests. In a matter of days, authorities had permanently redrawn citizens’ understanding of privacy.

The Rise and Fall

For years, facial recognition technology has spread across the globe through its use in smartphones and security systems. However, a growing number of countries have embraced its broad surveillance capabilities as a method of reducing crime. By 2019, 64 countries had incorporated mass surveillance systems into their cities, placing cameras in public spaces to monitor residents and collect data.

The first step in facial recognition involves ‘detecting’ faces, a capability that has become commonplace. In the ‘analysis’ phase, a neural network examines an individual’s facial geometry and assembles a unique faceprint. The third step, recognition, raises the greatest concern, as it focuses on identifying the person behind the face: the system scans a watchlist or a colossal database of stored faceprints in search of a match.
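To make the three stages concrete, the sketch below mocks the pipeline in Python. Detection and faceprint extraction are represented by placeholder functions (a real system would use a trained face detector and an embedding network), and the recognition stage compares a faceprint against a small in-memory watchlist using cosine similarity. All names, the embedding size, and the similarity threshold are illustrative assumptions, not any vendor’s actual implementation.

```python
import numpy as np

EMBEDDING_DIM = 128      # typical faceprint length; illustrative assumption
MATCH_THRESHOLD = 0.6    # similarity cutoff; real systems tune this carefully


def detect_faces(image):
    """Stage 1 (detection): locate face regions in an image.
    Stubbed here; a real system would run a face detector."""
    return [image]  # pretend the whole image is a single face crop


def compute_faceprint(face_crop):
    """Stage 2 (analysis): map a face crop to a fixed-length embedding.
    Stubbed with seeded random numbers; a real system would use a neural network."""
    rng = np.random.default_rng(abs(hash(face_crop)) % (2**32))
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


def identify(faceprint, watchlist):
    """Stage 3 (recognition): compare the faceprint against a watchlist of
    enrolled embeddings and return the best match above the threshold."""
    best_name, best_score = None, -1.0
    for name, known in watchlist.items():
        score = float(np.dot(faceprint, known))  # cosine similarity (unit vectors)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        return best_name, best_score
    return None, best_score


if __name__ == "__main__":
    # Hypothetical watchlist of enrolled faceprints keyed by identity.
    watchlist = {name: compute_faceprint(name) for name in ["person_a", "person_b"]}
    for crop in detect_faces("person_a"):  # toy stand-in for a camera frame
        print(identify(compute_faceprint(crop), watchlist))
```

The design choice worth noting is the threshold: set it too low and the system produces false matches, set it too high and it misses genuine ones, which is why the error rates discussed below matter so much.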

The initial allure of the technology quickly gave way to an abyss of concerns and repercussions. The primary issue lies in its apparent discriminatory behaviour. Facial recognition algorithms are trained on faces drawn from many demographic groups, but groups underrepresented in the training data receive less accurate matches. The National Institute of Standards and Technology found that most facial recognition algorithms exhibit demographic differentials in accuracy across race and gender, and its tests produced higher rates of false positives for people of Asian and African-American descent than for Caucasians. In the hands of law enforcement, these errors can drastically deepen societal inequities, and the sale of such software to government agencies can be used to target vulnerable populations, such as refugees.
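To make the idea of a demographic differential concrete, the short sketch below computes a per-group false match rate from invented counts. The group names and numbers are illustrative assumptions only and are not drawn from the NIST study.

```python
# Illustrative only: the counts below are invented, not NIST measurements.
# A false match (false positive) occurs when the system wrongly declares
# two different people to be the same person.
impostor_trials = {
    # group: (false matches, impostor comparisons attempted)
    "group_a": (12, 10_000),
    "group_b": (120, 10_000),
}

for group, (false_matches, comparisons) in impostor_trials.items():
    fmr = false_matches / comparisons  # false match rate for this group
    print(f"{group}: false match rate = {fmr:.4f}")

# A tenfold gap in false match rates between groups is the kind of
# differential described above: the same threshold produces far more
# mistaken identifications for one group than for another.
```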

Classification based upon perceived sexuality introduces a further set of consequences. A Stanford professor determined that facial recognition can be trained to distinguish individuals who are heterosexual from those who are homosexual. The links between faces and psychology are often invisible to the human eye, but advances in artificial intelligence can render them legible. Countries could acquire this capability as a method of oppression; in Saudi Arabia and Afghanistan, for example, homosexuality is already punishable by the death penalty. Vigilance and regulation of facial recognition technology are therefore paramount.

The Privacy Dilemma

Privacy is arguably one of the most sought-after privileges. Common items such as blinds, fences, and tinted windows were all created to provide a degree of seclusion. Blanket surveillance, however, can violate a person’s right to privacy through the storage and analysis of public footage. Moreover, many organizations store facial data on local servers rather than in the cloud, making it more likely that the data will be exposed to security threats. This erosion of privacy can also alter the democratic landscape: citizens may shy away from expressing their political beliefs or supporting public campaigns for fear of being identified and reprimanded through the technology.

China’s Black Mirror

China has embraced artificial intelligence, and the country provides an apt example of a future without regulations on facial recognition technology. Approximately 200 million surveillance cameras are dispersed across its streets, train stations, and food markets. Yet this is only one component of a much greater system that tracks citizens’ internet use, hotel stays, and communications. In the city of Xiangyang, enormous screens display images of jaywalkers along with their personal information as a method of public humiliation. The strategy operates as a panopticon: the uncertainty of being watched is what drives residents to obey the rules. Ultimately, the government intends to assign everyone a social credit score, based upon their travel history, browsing habits, and criminal record, as a measure of their trustworthiness. The same system is used to track the whereabouts of Uighur Muslims and Tibetans. In other words, it is nearly impossible to hide. This technology has given the country a level of control that once appeared unachievable, and an automated global future no longer seems so far off.

Utopia versus Dystopia

Facial recognition technology has already begun revolutionizing the world. What initially seemed like a utopian fantasy has gradually transformed into a plausible reality. Without legislation, however, the technology will act as a catalyst for discrimination, privacy breaches, and chaos. A temporary ban on its use by public and private agencies in Canada would therefore appear appropriate until strict regulations are developed. Companies such as Google, IBM, and Microsoft have already called for bans on police use of facial recognition until governments release reasonable guidelines controlling it. Another suggestion would be to prohibit the storage of faceprints collected in places of public accommodation, sustaining both privacy and freedom. Ultimately, the decision on biometrics must be democratic and involve the public, who must decide the extent of surveillance they are willing to accept.

A New Beginning

It is evident that technology has transformed existence. From cars to cell phones, each tool attempts to make life easier and to knit the world into an ironic interconnectivity. Although facial recognition poses a plethora of “what-ifs”, it is only software by itself. The choice rests with society to decide how to use the technology responsibly. It is human behaviour and our thirst for the unknown that truly have the potential to modify our lives.
