“The Technological Takeover of the Human Body” … and Jewelry to Fight Back

[Image: a woman wearing Nowak’s jewelry, a metallic rectangle over her nose and two circles under her eyes]

Debbie says:

My interest in privacy doesn’t often overlap with the general topics of Body Impolitic, so I wouldn’t ordinarily write here about Quentin Fottrell’s article at MarketWatch, “The technological takeover of the human body”:

Aram Sinnreich recently went grocery shopping at a Whole Foods Market in his hometown of Washington, D.C., and realized he had left his wallet at home. He had no cards and no cash, but he had no reason to worry — at least, not about paying for his food. “I used my iPhone to pay, and I unlocked it with my face,” he said.

That’s when it struck him: We are just one small step away from paying with our bodily features alone. With in-store facial-recognition machines, he wouldn’t even need his smartphone. Sinnreich, associate professor of communication studies at American University, said he got a glimpse of the future that day. …

Removing the last physical barrier — smartphones, watches, smart glasses and credit cards — between our bodies and corporate America is the final frontier in mobile payments. “The deeper the tie between the human body and the financial networks, the fewer intimate spaces will be left unconnected to those networks,” Sinnreich said.

I don’t want my “intimate spaces,” by which the capitalists mean both my kitchen cupboards and my junk, connected to any networks, whether those networks are selling me groceries or monitoring my political activities.

I’m afraid, however, as Fottrell discusses, that a critical mass of people may choose “convenience” over privacy, until their phone and their face are their wallet and credit cards. Eventually, refusing to opt in to that technology could make shopping more difficult, punishing the people who still carry a wallet.

I found Fottrell’s article while I was looking for background to put Ewa Nowak’s jewelry in context. Nowak has designed jewelry for the purpose of avoiding facial recognition software (pictured above). Ironically, the first article I found about this was so extraordinarily heavy with advertising and trackers that I refuse to link to it. Fortunately, several other outlets have also covered this story, including the one quoted below:

Polish designer Ewa Nowak tackled the issue of algorithms that use facial recognition. After all, while it might seem helpful when you’re trying to tag your friends in your birthday pictures on Facebook, such technology could pose serious threats to anyone’s privacy if it was used with malicious intent.

According to that coverage, Nowak was working with DeepFace, Facebook’s own proprietary facial recognition software. Facebook uses this program to identify people in photos posted to Facebook. That’s disturbing enough, but this technology has uses beyond tagging your friends or letting you buy groceries. According to the American Civil Liberties Union,

Facial recognition systems are built on computer programs that analyze images of human faces for the purpose of identifying them. Unlike many other biometric systems, facial recognition can be used for general surveillance in combination with public video cameras, and it can be used in a passive way that doesn’t require the knowledge, consent, or participation of the subject.

The biggest danger is that this technology will be used for general, suspicionless surveillance systems. State motor vehicles agencies possess high-quality photographs of most citizens that are a natural source for face recognition programs and could easily be combined with public surveillance or other cameras in the construction of a comprehensive system of identification and tracking.
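For the technically inclined: the “identifying” the ACLU describes usually boils down to a nearest-neighbor search over face “embeddings,” numerical fingerprints that software extracts from each photo. Here is a minimal sketch in Python; the random vectors, gallery size, and threshold are all invented stand-ins for a real embedding model, not anyone’s actual system:

```python
# Minimal sketch of the identification step in a face recognition system.
# Random vectors stand in for the embeddings a neural network would
# extract from photos; only the matching logic is illustrated here.
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A "gallery" of enrolled faces -- think driver's-license photos.
gallery = {f"person_{i}": rng.normal(size=128) for i in range(10_000)}

# A face captured by a public camera (here, a noisy copy of one
# enrolled face, so the demo has a true match to find).
probe = gallery["person_42"] + rng.normal(scale=0.1, size=128)

# Identification: nearest neighbor over the gallery, gated by a threshold.
THRESHOLD = 0.9  # tuned to trade false matches against missed matches
best_id, best_score = max(
    ((pid, cosine_similarity(probe, emb)) for pid, emb in gallery.items()),
    key=lambda pair: pair[1],
)
print(best_id if best_score >= THRESHOLD else "no match", round(best_score, 3))
```

Swap the random vectors for a real embedding model and the gallery for a motor vehicles database, and you have the skeleton of exactly the “comprehensive system of identification and tracking” the ACLU warns about, with no one ever asked for consent.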

Some governments are beginning to fight back. Just today, China’s Ministry of Education issued new guidance recommending limits on facial recognition and related applications in schools, including “smart uniforms” (with trackers, so the administration knows where each student is at any time).

Closer to home, my home and work cities have banned facial recognition technologies, and so has Somerville, Massachusetts. More cities will follow (probably very blue ones, judging from these early adopters). Again, who knows where the mass of people will wind up, and who knows how many police departments and government agencies will ignore the bans. In case you didn’t know, along with all of its other problems, this technology has deeply racist consequences.

July test results from the National Institute of Standards and Technology indicated that two of the latest algorithms from Idemia [a major French facial recognition company] were significantly more likely to mix up black women’s faces than those of white women, or black or white men.

The [National Institute of Standards and Technology] test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports. At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, it falsely matched black women’s faces about once in 1,000—10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems.
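To make those numbers concrete, here is a toy sketch of how a false match rate is measured: score a large batch of comparisons between photos of different people, then count how often the score crosses the operating threshold. The score distributions below are synthetic stand-ins (not Idemia’s output or NIST’s data), shifted just enough to reproduce a tenfold gap at one shared threshold:

```python
# Toy illustration of a NIST-style false match rate (FMR) measurement.
# Synthetic impostor scores stand in for a real algorithm's output; the
# second group's scores are shifted so one threshold yields a 10x gap.
import numpy as np

rng = np.random.default_rng(1)

def false_match_rate(impostor_scores, threshold):
    """Fraction of different-person comparisons wrongly scored as a match."""
    return float(np.mean(impostor_scores >= threshold))

n = 1_000_000  # comparisons of photos of two *different* people
group_a = rng.normal(loc=0.00, scale=1.0, size=n)  # e.g. white women
group_b = rng.normal(loc=0.63, scale=1.0, size=n)  # e.g. black women

threshold = 3.72  # set so group A lands near 1 in 10,000
for name, scores in [("group A", group_a), ("group B", group_b)]:
    fmr = false_match_rate(scores, threshold)
    print(f"{name}: about 1 false match in {1 / fmr:,.0f} comparisons")
```

The point is that “accuracy” is not a single number: a threshold that produces an acceptable one-in-10,000 false match rate for one group of people can quietly produce a ten-times-worse rate for another.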

If you think that this won’t be used to the detriment of dark-skinned people, you have some reading to do.

I don’t expect Ewa Nowak’s designs to last terribly long: what art can foil, tech can work around. Nonetheless, it’s both comforting and aesthetically pleasing to see artists and governments using their own tools to try to keep us safe from the rapacious tech industry: technology is only an unstoppable juggernaut if we don’t unite to stop it.