“The Technological Takeover of the Human Body” … and Jewelry to Fight Back

[Image: woman wearing Nowak's jewelry, with a metallic rectangle over her nose and two circles under her eyes]

Debbie says:

My interest in privacy doesn’t often overlap with the general topics of Body Impolitic, so I wouldn’t ordinarily write here about Quentin Fottrell’s article at MarketWatch, “The Technological Takeover of the Human Body”:

Aram Sinnreich recently went grocery shopping at a Whole Foods Market in his hometown of Washington, D.C., and realized he had left his wallet at home. He had no cards and no cash, but he had no reason to worry — at least, not about paying for his food. “I used my iPhone to pay, and I unlocked it with my face,” he said.

That’s when it struck him: We are just one small step away from paying with our bodily features alone. With in-store facial-recognition machines, he wouldn’t even need his smartphone. Sinnreich, associate professor of communication studies at American University, said he got a glimpse of the future that day. …

Removing the last physical barrier — smartphones, watches, smart glasses and credit cards — between our bodies and corporate America is the final frontier in mobile payments. “The deeper the tie between the human body and the financial networks, the fewer intimate spaces will be left unconnected to those networks,” Sinnreich said.

I don’t want my “intimate spaces,” by which the capitalists mean both my kitchen cupboards and my junk, connected to any networks, whether those networks are for shopping or for monitoring my political activities.

I’m afraid, however, as Fottrell discusses, that a critical mass of people may choose “convenience” over privacy, reaching the point where their phones and their faces are their wallets and credit cards. Eventually, not opting into that technology could make shopping more difficult and punish people who still carry a wallet.

I found Fottrell’s article while I was looking for background to put Ewa Nowak’s jewelry in context. Nowak has designed jewelry for the purpose of avoiding facial recognition software (pictured above). Ironically, the first article I found about this was so extraordinarily advertising- and tracker-heavy that I refuse to link to it. Fortunately, several other outlets have also covered the story; one of them writes:

Polish designer Ewa Nowak tackled the issue of algorithms that use facial recognition. After all, while it might seem helpful when you’re trying to tag your friends in your birthday pictures on Facebook, such technology could pose serious threats to anyone’s privacy if it was used with malicious intent.

According to that coverage, Nowak was working with DeepFace, Facebook’s own proprietary facial recognition software. Facebook uses this program to identify people in photos posted to the site. That’s disturbing enough, but this technology has uses beyond tagging your friends or letting you buy groceries. According to the American Civil Liberties Union,

Facial recognition systems are built on computer programs that analyze images of human faces for the purpose of identifying them. Unlike many other biometric systems, facial recognition can be used for general surveillance in combination with public video cameras, and it can be used in a passive way that doesn’t require the knowledge, consent, or participation of the subject.

The biggest danger is that this technology will be used for general, suspicionless surveillance systems. State motor vehicles agencies possess high-quality photographs of most citizens that are a natural source for face recognition programs and could easily be combined with public surveillance or other cameras in the construction of a comprehensive system of identification and tracking.

Some governments are beginning to fight back. Just today, the Chinese Ministry of Information issued new guidance recommending limits on the use of facial recognition and related applications in schools, including “smart uniforms” (with trackers, so the administration knows where each student is at any time).

Closer to home, my home and work cities have banned facial recognition technologies, and so has Somerville, Massachusetts. More cities will follow (probably very blue cities, judging by these early adopters). Again, who knows where the mass of people will wind up — and who knows how many police departments and government agencies will ignore the bans. In case you didn’t know, along with all of its other problems, this technology has deeply racist consequences.

July test results from the National Institute of Standards and Technology indicated that two of the latest algorithms from Idemia [a French company whose facial recognition software is widely used] were significantly more likely to mix up black women’s faces than those of white women, or black or white men.

The [National Institute of Standards and Technology] test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports. At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, it falsely matched black women’s faces about once in 1,000—10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems.

If you think that this won’t be used to the detriment of dark-skinned people, you have some reading to do.

I don’t expect Ewa Nowak’s designs to stay effective for terribly long: what art can foil, tech can work around. Nonetheless, it’s both comforting and aesthetically pleasing to see artists and governments using their own tools to try to keep us safe from the rapacious tech industry: technology is only an unstoppable juggernaut if we don’t unite to stop it.

 

Open Offices: Who Watches and Who Gets Watched?

Debbie says:

My day job is moving from our own office into a co-working space. If you don’t know what that is, one way to describe it is “Uber for offices.” Various small companies and individuals rent space from a company whose only purpose is to provide office space. Some of these are small and local, but we are moving into a building operated by WeWork, the giant in this field. It’s a difficult change for many of us. Like almost all co-working spaces, WeWork is open plan, with lots of glass and very little privacy.

A friend who works at a different WeWork location sent me this fascinating article by Katharine Schwab at fastcodesign, about the implications of open plan offices for women. Schwab is reporting on a tiny study conducted by researchers Alison Hirst and Christina Schwabenland. Their 50-person sample definitely doesn’t satisfy my urge for statistical reliability, but the conclusions are nonetheless plausible and worth considering:

While some female employees felt like the new office space promoted equality, others had the opposite reaction. The researchers found that many women became hyper-aware of being constantly watched and their appearance constantly evaluated; multiple women told them that “there isn’t anywhere that you don’t feel watched.” Of the men Hirst interviewed, there was no evidence they felt similarly or changed their actions as a result of the lack of privacy.

The architect (kept anonymous, and I have to wonder why) compared the experience of open plan work environments to a nudist beach:

“You know, first you’re a little bit worried that everyone’s looking at you, but then you think, hang on, everybody else is naked, no one’s looking at each other,” he told the researchers. “I think that’s what’ll happen, they’ll get on with it.”

The only problem is that sociological research of nudist beaches has shown that people do continue to watch each other–“men in particular, often in groups, look obsessively at women,” the researchers write. This kind of all-glass, no-privacy environment leads to a subtle kind of sexism, where women are always being watched and thus judged on their appearances, causing anxiety for many employees.

For me personally, this is not an issue. I’m old, fat, and generally don’t care if anyone is watching me. At the same time, reading the article did make me think differently about the younger women I work with, and the generally much younger women I have seen the two times I’ve been in the new space.

Even the men who watch women all the time don’t seem to be aware that women feel watched all the time. And it’s not just being ogled …

Not only were women’s physical appearances up for judging, the open office also meant there was no private space where workers could go if they were emotionally distressed or needed to conduct a private conversation. “If you’re upset about something, there’s nowhere to go,” one woman told the researchers. “Where can you go? All you can do is go to the ladies, so there’s nowhere that you can go and speak to somebody on a one-to-one basis where you can’t be observed.”

One of my first tasks in the new office will be to find the actual private spaces (there are always some). I tend to get involved in intimate one-on-one conversations, and to be someone people come to when they’re in tears, so I will need to know. And then I can let other women know, too.

The comments to the article — for once, you should read them — bring up other gendered issues, such as increased sexual harassment. My partner asked a question I haven’t yet been able to answer: who handles sexual harassment complaints across employers? In other words, we work in one glassed-in office, and if someone in our office is behaving badly toward someone else, we have a human resources department that has actually shown some emotional intelligence around these issues, at least some of the time. But what happens if someone in the next office, or on the next floor, or an off-and-on visitor, is behaving badly toward someone in our office? Where is her recourse?

Once again, we notice how money-saving, employer-friendly workplace changes seem to have a disproportionate effect on women. While Schwab unfortunately doesn’t discuss this, it is safe to assume that people of color (and even more so, women of color) face similar problems in open offices.

I would rather not become the person who crusades around these issues in our new space, but I have a sinking feeling that I’m going to have to.