Tag Archives: Facebook

If Real Teens Weren’t Being Harmed, Watching Facebook Squirm Would Be Fun

Four teenagers of different races and genders, walking together, arms around each other.

Debbie says:

The evidence is damning. Whistleblower Frances Haugen, who initially spoke anonymously with the Wall Street Journal and then gave a public interview on 60 Minutes, has raised questions about several key Facebook practices. Haugen, a former Facebook product manager who was tasked with “civic integrity issues,” has revealed Facebook’s own studies, which document that 32% of teen girls have experienced negative effects on their body image when they spend time on Instagram (which is owned by Facebook). In the wake of the current scandals, Facebook has at least temporarily withdrawn its plans for an “Instagram for Kids” site.

Social media is not just a site of body image harm. The platforms can provide routes for teens (and everyone else) with body image issues to find support and community. High-profile entertainers like Lizzo offer remarkably strong body-positive images and statements. Well-designed studies and good medical advice are available to anyone with a good search string.

Nonetheless, the majority of people just scratch the surface of their social media sites, and so the tyranny of the majority has a lot of power. However much fat-positive work is out there to be proud of, there’s no denying that it’s overwhelmed by mainstream definitions of beauty … and mainstream peer pressure tactics. If a social media user doesn’t go looking for positive body image support, they’re unlikely to trip over it, while they will trip over “the norm” just by logging on.

Facebook — and TikTok, and Snapchat, and all the others — have ways to combat this. They have the tools to know who the central disseminators of negative information are. They could shut off the loudest dangerous voices and the most pervasive destructive images with a week’s effort. And they probably wouldn’t lose any significant amount of money. Instead, they double down:

They dismiss the value of their own data because of its small sample size. A small sample size is a real limitation, but who designed the study in the first place?

They say that Instagram didn’t do harm to teenagers in “other areas,” such as loneliness. Okay, great. But that doesn’t make this problem better.

They say a majority of respondents didn’t have exacerbated body image issues. Sure, fine, but 32% is a lot of people to hurt just because they aren’t the majority. And since some of those 32% reported increased or new suicidal thoughts, maybe we should take them seriously.

In case you weren’t thinking about it, boys have body image issues too. People of color are certainly affected, often very negatively, by the linkage between whiteness and “beauty.” Teens facing gender issues suffer from narrow expectations. And body image issues are not confined to teens; they affect everyone from pre-teens to octogenarians. Teen girls are the canaries in the coal mine, the group that is (often) most dramatically affected and most seriously at risk. The policies that fail to protect teen girls are dangers to a vast range of people.

It’s well known that people are ruder and more threatening on social media because they don’t really believe they are interacting with real people. Watching Facebook treat their own data as if it didn’t matter makes me wonder if social media users are just learning from social media moguls; if Mark Zuckerberg and Nick Clegg and other Facebook higher-echelon folks don’t believe their customers are real, how will we learn to believe in each other?

NOTE: This post is drawn from several sources rather than just one or two. References are available on request; just ask in the comments.

======================

Follow Debbie on Twitter.

Follow Laurie’s new Pandemic Shadows photos on Instagram.

======================

“The Technological Takeover of the Human Body” … and Jewelry to Fight Back

Woman wearing Nowak’s jewelry: a metallic rectangle over her nose, and two circles under her eyes.

Debbie says:

My interest in privacy doesn’t often overlap with the general topics of Body Impolitic, so I wouldn’t ordinarily write here about Quentin Fottrell’s article at MarketWatch, ‘The neoliberal takeover of the human body’:

Aram Sinnreich recently went grocery shopping at a Whole Foods Market in his hometown of Washington, D.C., and realized he had left his wallet at home. He had no cards and no cash, but he had no reason to worry — at least, not about paying for his food. “I used my iPhone to pay, and I unlocked it with my face,” he said.

That’s when it struck him: We are just one small step away from paying with our bodily features alone. With in-store facial-recognition machines, he wouldn’t even need his smartphone. Sinnreich, associate professor of communication studies at American University, said he got a glimpse of the future that day. …

Removing the last physical barrier — smartphones, watches, smart glasses and credit cards — between our bodies and corporate America is the final frontier in mobile payments. “The deeper the tie between the human body and the financial networks, the fewer intimate spaces will be left unconnected to those networks,” Sinnreich said.

I don’t want my “intimate spaces,” by which the capitalists mean both my kitchen cupboards and my junk, connected to any networks, whether they are shopping networks or monitoring my political activities.

I’m afraid, however, as Fottrell discusses, that a critical mass of people may choose “convenience” over privacy, to the point where their phone and their face are their wallet and credit cards. Eventually, not opting into that technology could make shopping more difficult and punish people who still carry a wallet.

I found Fottrell’s article while I was looking for background to put Ewa Nowak’s jewelry in context. Nowak has designed jewelry for the purpose of evading facial recognition software (pictured above). Ironically, the first article I found about this was so extraordinarily advertising- and tracker-heavy that I refuse to link to it. Fortunately, several other outlets have also covered the story, including this one:

Polish designer Ewa Nowak tackled the issue of algorithms that use facial recognition. After all, while it might seem helpful when you’re trying to tag your friends in your birthday pictures on Facebook, such technology could pose serious threats to anyone’s privacy if it was used with malicious intent.

According to that article, Nowak was working with DeepFace, Facebook’s own proprietary facial recognition software. Facebook uses this program to identify people in photos posted to Facebook. That’s disturbing enough, but this technology has uses beyond tagging your friends or letting you buy groceries. According to the American Civil Liberties Union,

Facial recognition systems are built on computer programs that analyze images of human faces for the purpose of identifying them. Unlike many other biometric systems, facial recognition can be used for general surveillance in combination with public video cameras, and it can be used in a passive way that doesn’t require the knowledge, consent, or participation of the subject.

The biggest danger is that this technology will be used for general, suspicionless surveillance systems. State motor vehicles agencies possess high-quality photographs of most citizens that are a natural source for face recognition programs and could easily be combined with public surveillance or other cameras in the construction of a comprehensive system of identification and tracking.

Some governments are beginning to fight back. Just today, the Chinese Ministry of Information issued new guidance recommending limits on the use of facial recognition and related applications in schools, including “smart uniforms” (with trackers so the administration knows where each student is at any time).

Closer to home, my home and work cities have banned facial recognition technologies, and so has Somerville, Massachusetts. More cities will follow (probably very blue cities, judging from these early adopters). Again, who knows where the mass of people will wind up — and who knows how many police departments and government agencies will ignore the bans. In case you didn’t know, this technology, along with all of its other problems, has deeply racist consequences.

July test results from the National Institute of Standards and Technology indicated that two of Idemia’s latest algorithms [Idemia makes widely used French facial recognition software] were significantly more likely to mix up black women’s faces than those of white women, or black or white men.

The [National Institute of Standards and Technology] test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports. At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, it falsely matched black women’s faces about once in 1,000—10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems.
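To make the quoted numbers concrete, here is a minimal sketch in Python (with hypothetical data; this is not NIST’s methodology or code) of how a false match rate is measured at a fixed threshold, and why the figures above amount to a tenfold disparity:

    # Hypothetical sketch of a false match rate (FMR) calculation -- not NIST's
    # actual test code. A verification system scores the similarity of two
    # photos; "impostor" pairs (photos of two different people) that score
    # above the accept threshold are false matches.

    def false_match_rate(impostor_scores, threshold):
        """Fraction of different-person pairs the system wrongly accepts."""
        false_matches = sum(1 for score in impostor_scores if score >= threshold)
        return false_matches / len(impostor_scores)

    # The disparity described above, at one shared sensitivity setting:
    fmr_white_women = 1 / 10_000  # about one false match per 10,000 pairs
    fmr_black_women = 1 / 1_000   # about one false match per 1,000 pairs
    print(f"{fmr_black_women / fmr_white_women:.0f}x more frequent")  # -> 10x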

If you think that this won’t be used to the detriment of dark-skinned people, you have some reading to do.

I don’t expect Ewa Nowak’s designs to last terribly long: what art can foil, tech can work around. Nonetheless, it’s both comforting and aesthetically pleasing to see artists and governments using their own tools to try to keep us safe from the rapacious tech industry: technology is only an unstoppable juggernaut if we don’t unite to stop it.