Facebook: The Inside Story


This is a Facebook-sanctioned book about Facebook, so we could reasonably assume it leans toward Facebook's defence. Even so, the book paints Facebook in a rather grim light. What interested me most was the dark side of Facebook, which had not yet been widely discussed. Because Facebook fully cooperated with the author, he could see inside the company better than others could. The book is full of insider interviews and exclusive details of the company's culture and leadership.

The dangerous unintended consequences

Facebook's most cherished accomplishments were now unmasked as liabilities.

  • The huge user base once seen as a world-changing kumbaya was now alarming evidence of excessive power.

  • The ability to give voices to the unheard was identified as a means to bequeath an earsplitting sound system to hate groups.

  • The ability to organize political movements of liberation was now a deadly tool of oppressors.

  • The joyful metrics that spurred smile-inducing memes to entertain and uplift us were now fingered as an algorithmic boost to misinformation.

Virtually every problem Facebook confronted was a consequence of the unprecedented nature of its mission to connect the world, and of its reckless haste in doing so.

The founder's reckless disregard of personal privacy

Business Insider showed, via Zuckerberg's IMs, how he used Thefacebook users' private login data to read the emails of a writer and an editor at a Harvard student publication. If this is what he did then, what confidence is there that the company will not exploit current user information for its own benefit?

Sign in with Facebook

Facebook shared the information it had both about users (who intentionally signed up for apps using Facebook Connect) and about friends of those users (who had no idea their information was being passed on to apps they might never have heard of, let alone signed up for).

You would log in to these apps with Facebook, and suddenly they knew everything about you and your friends; some were doing very nefarious things with it. When Facebook eventually stopped the practice, it did so not to serve users but because it did not want to give away data to developers for nothing in return. Facebook, however, spun it as a move toward more user privacy.

Fraud 

When the old tools no longer pulled in enough user information, Facebook created a different honey trap for user data: Onavo Protect, which delivered what seemed like a bargain, a free "Virtual Private Network" (VPN) that provided more security than public Wi-Fi networks. It takes a certain amount of chutzpah to present people with a privacy tool whose purpose was to gain their data.

Onavo Protect, Apple's App Store concluded, was a surveillance tool marketing itself as a secure VPN, and harmful to users.

Democracy 

Even more disturbing was the idea that Facebook could use that power to manipulate people to get the results it wanted. Facebook cooperated with a social-science study by splitting part of its population into two groups: one that could see the "I Voted" button and a control group that could not. Then, using voter records, the authors compared turnout. Since turnout proved higher in the experimental group, it was possible that Facebook could affect an election.
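The study's logic is a plain two-group comparison: measure turnout in each group and ask whether the difference is larger than chance would explain. A minimal sketch with made-up counts (not the study's real numbers) using a standard two-proportion z-test:

```python
import math

# Hypothetical counts, NOT the study's data: turnout among users shown the
# "I Voted" button versus a control group that never saw it.
shown_voters, shown_total = 6_150, 10_000
control_voters, control_total = 6_000, 10_000

p1 = shown_voters / shown_total        # turnout with the button
p2 = control_voters / control_total    # turnout without it
lift = p1 - p2                         # absolute difference in turnout

# Two-proportion z-test: pool the groups to estimate the standard error
# of the difference under the null hypothesis of no effect.
p_pool = (shown_voters + control_voters) / (shown_total + control_total)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / shown_total + 1 / control_total))
z = lift / se  # |z| > 1.96 is significant at the 5% level

print(f"turnout lift: {lift:.3%}, z = {z:.2f}")
```

With these invented numbers a 1.5-point lift over 10,000 users per group already clears the conventional significance bar, which is why even small per-user nudges matter at Facebook's scale.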

As one Facebook policy person put it, "Could I guarantee that I could help you manipulate Facebook to win the election? The answer is no. But can you tap into people's fears, and people's worries, and people's concerns and people's bigotry, to activate and prime things? Absolutely."

Algorithm impacts

When you searched on Facebook for vaccination information, it was the anti-vaxxers, with bogus science and conspiracy theories, who dominated the results. Though they were only a tiny minority in a huge state, the fringe owned the discussion. Between asking Facebook to adjust its algorithm (which is geared toward maximising attention and engagement for its own benefit) at the expense of its bottom line for the good of mankind, and educating people in digital literacy and critical thinking (e.g. that Facebook is not the place to look for credible information), I'd be hard-pressed to say which is easier.

Facebook had built an engine to push propaganda.

DiResta, a researcher who tracked anti-vaccine groups, managed to get a meeting with a News Feed director, who conceded that some groups were problematic but that the company did not want to hamper free expression. "I wasn't asking for suppression," she says. "I was saying your recommendation engine was growing this community!"

Facebook impacts

"We've actually built an AI that's more powerful than the human mind and we hid it from all of society by calling it something else," says Harris, a former Google ethicist. "By calling it the Facebook News Feed, no one noticed that we'd actually built an AI that's completely run loose and out of control." Harris says that using the News Feed is like fighting an unbeatable computer chess player: it knows your weaknesses and beats you every time.

“It was turning from a place to explore to a place to exploit. The best minds of my generation are thinking about how to make people click ads,” Harris said. “That sucks.”

News in the age of Facebook

When one of BuzzFeed's social-media managers unearthed someone's Tumblr photo of a dress of ambiguous color—some people saw black and blue, others gold and white—she took all of five minutes to whip up a story she shared on Facebook. With the benefit of BuzzFeed's social-media acumen, the story cut through the social graph like Sherman's March. Twenty-eight million views in a day. Then BuzzFeed ran dozens of follow-up posts, flooding the zone as if it were The New York Times reporting on 9/11.

What Facebook knows about us

Kosinski used statistics to make predictions about personal traits from the Likes of about 60,000 volunteers, then compared the predictions to the subjects' actual traits as revealed by the myPersonality test. The results were so astounding that the authors had to check and recheck. "It took me a year from having the results to actually gaining confidence in them to publish them because I just couldn't believe it was possible," says Kosinski. They published in the Proceedings of the National Academy of Sciences (PNAS), a prestigious peer-reviewed journal, in April 2013. The paper's title, "Private Traits and Attributes Are Predictable from Digital Records of Human Behavior," only hinted at the creepiness of the discovery.

Kosinski and his co-authors were claiming that by studying someone's Likes, one could figure out people's secrets, from sexual orientation to mental health. "Individual traits and attributes can be predicted to a high degree of accuracy based on records of users' Likes," they wrote. Solely by analyzing Likes, they successfully determined whether someone was straight or gay 88 percent of the time. In nineteen out of twenty cases, they could figure out whether one was white or African American. And they were 85 percent correct in guessing one's political party.

Even by clicking on innocuous subjects, people were stripping themselves naked: for example, the best predictors of high intelligence include "Thunderstorms," "The Colbert Report," "Science," and "Curly Fries," whereas low intelligence was indicated by "Sephora," "I Love Being A Mom," "Harley Davidson," and "Lady Antebellum." Good predictors of male homosexuality included "No H8 Campaign," "Mac Cosmetics," and "Wicked The Musical," whereas strong predictors of male heterosexuality included "Wu-Tang Clan," "Shaq," and "Being Confused After Waking Up From Naps."
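The underlying technique is ordinary supervised learning: represent each user as a 0/1 vector over pages and fit a classifier from Likes to a trait. A minimal sketch on synthetic data (not Kosinski's dataset; the page names and probabilities below are invented for illustration), using logistic regression trained by plain gradient descent:

```python
import math
import random

random.seed(0)

# Hypothetical pages; the hidden binary trait correlates with which are Liked.
PAGES = ["Thunderstorms", "Curly Fries", "Sephora", "Harley Davidson"]

def make_user():
    """Synthetic user: trait=1 users tend to Like the first two pages."""
    trait = random.random() < 0.5
    likes = []
    for i in range(len(PAGES)):
        p = 0.8 if (trait and i < 2) or (not trait and i >= 2) else 0.2
        likes.append(1 if random.random() < p else 0)
    return likes, int(trait)

data = [make_user() for _ in range(2000)]
train, test = data[:1500], data[1500:]

# Logistic regression: w·x + b squashed through a sigmoid, full-batch GD.
w, b, lr = [0.0] * len(PAGES), 0.0, 0.1
for _ in range(200):
    gw, gb = [0.0] * len(PAGES), 0.0
    for likes, y in train:
        z = b + sum(wi * xi for wi, xi in zip(w, likes))
        err = 1 / (1 + math.exp(-z)) - y     # predicted prob minus label
        for i, xi in enumerate(likes):
            gw[i] += err * xi
        gb += err
    for i in range(len(w)):
        w[i] -= lr * gw[i] / len(train)
    b -= lr * gb / len(train)

def predict(likes):
    return b + sum(wi * xi for wi, xi in zip(w, likes)) > 0

accuracy = sum(predict(likes) == (y == 1) for likes, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.0%}")
```

With only four weakly informative pages the model already classifies the held-out users far better than chance; the real paper's accuracy figures came from the same idea scaled up to tens of thousands of pages and users.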

At the paper's conclusion they noted that the benefits of using Likes to broadcast preferences and improve products and services might be offset by the drawbacks of unintentional exposure of one's secrets. "Commercial companies, governmental institutions, or even one's Facebook friends could use software to infer attributes such as intelligence, sexual orientation, or political views that an individual may not have intended to share," they wrote. "One can imagine situations in which such predictions, even if incorrect, could pose a threat to an individual's well-being, freedom, or even life."

In subsequent months, Kosinski and Stillwell would improve their prediction methods and publish a paper that claimed that using Likes alone, a researcher could know someone better than the people who worked with, grew up with, or even married that person. “Computer models need 10, 70, 150, and 300 Likes, respectively, to outperform an average work colleague, cohabitant or friend, family member, and spouse,” they wrote.

Facebook's treatment of user data

What Facebook did not do for more than a year after learning about the Cambridge Analytica data abuse was get a formal affirmation that Cambridge had deleted the data. (Facebook's excuse: its outside law firm was negotiating.) Kogan did not turn in his affirmation until that June, and Cambridge did not do so at all during the entire election campaign.

In his own words

He personally avoided the service, and his children "aren't allowed to use that shit." - a former Facebook head of growth

Chankhrit Sathorn