Discussing Facial Recognition Technology: A Missed Opportunity
This blog post is a critique of the article "Public faces? A critical exploration of the diffusion of face recognition technologies in online social networks" by Norval and Prasopoulou (2017).
Norval and Prasopoulou (2017) discuss Facial Recognition Technology (FRT) and its implications for security and privacy among users of Social Network Sites (SNSs). The paper lays out a detailed conceptual framework that explicates how biometric technologies, like FRT, "diffuse" from offline to online use. Diffusion is the process by which technologies are transmitted to digital environments and, along the way, the best practices and ethics governing the technology's offline use are misapplied or eliminated in the social sphere. Overall, the conceptualization is sound and applicable in a global context.
Despite a sound framework, the article's in-depth analysis of Facebook's "tag suggestions" tool is lacking in some areas. Norval and Prasopoulou (2017) provide readers with fair (but not strong) examples of how FRT works offline (e.g., remote biometrics and government identity schemes). However, the authors' examples of how FRT is used on Facebook fall short. What is presented is: (1) quotes regarding the "tag suggestions" feature made by government officials; (2) legal arguments made by attorneys and/or think tanks; and (3) counter-arguments made by Facebook representatives. From these three themes, the article turns quickly to user outcomes and consequences.
Norval and Prasopoulou (2017) miss the opportunity to paint a multi-dimensional picture of the depth of diffusion. This could be accomplished by incorporating use cases written from the user's perspective. The omission matters because the article criticizes Facebook for using legalese to confuse or manipulate users into opting in, yet the same kind of terminology (through the article's legally, governmentally, and managerially centered examples) is leveraged to describe and define the problem at hand. A more effective approach would be to clearly articulate use cases such as:
New ways in which a user must manage their own identities
New opportunities for marketers to enhance collected and/or purchased user data
Figures (screenshots) of the user settings page to illustrate the oversimplification and/or ambiguity embedded in the user experience
User reactions (e.g., interviews or survey responses) to discovering photos of themselves uploaded and tagged by other individuals
FRT's effects on users with visual impairments (e.g., colorblindness)
FRT's effects on users whose phenotypic features are perceived as non-white
The points above are a short list of the many possible use cases that could be presented from a user's point of view in clear, succinct narrative form.
A related book I highly recommend is Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance by Kelly Gates (2011). The book, 294 pages in length, is written from a critical-cultural lens to exemplify how "new technologies are pursued as shortsighted solutions to complex social problems." This work is not centered on SNSs. However, it does offer a comprehensive background on the technology's brief history, uses, and governance and policy implications, making it a very good precursor for understanding how FRT evolved and, unbeknownst to us, infiltrated SNSs and ultimately our homes and handheld screens.
To learn more about how Facebook uses FRT, visit its Help Center article.