Summary of Broken Code by Jeff Horwitz: Inside Facebook and the Fight to Expose Its Harmful Secrets - GP SUMMARY - E-Book

Description

DISCLAIMER

This book is not in any way meant to replace the original book; it serves as a comprehensive summary of the original.


Summary of Broken Code by Jeff Horwitz: Inside Facebook and the Fight to Expose Its Harmful Secrets

 

IN THIS SUMMARIZED BOOK, YOU WILL GET:

  • An astute chapter-by-chapter outline of the main contents.
  • A fast and simple understanding of the content analysis.
  • Exceptionally summarized content that you may have skipped in the original book.

 

Broken Code is a behind-the-scenes look at Facebook's strategic failures to address its role in the spread of disinformation, political fracturing, and even genocide. Filled with eye-popping statistics and anecdotes from insiders, the book explores Facebook's manipulation tactics and the distorted way we connect online. It documents the company's failure to control or even understand its own platforms, which led employees to discover deeper problems: peddling anger, facilitating human trafficking, enabling drug cartels and authoritarians, and distorting behavior in ways no one understood. Despite personal trauma and professional setbacks, those employees identified the root causes of Facebook's viral harms and drew up concrete plans to address them. The book's conclusion is that the problems spawned by social media cannot be resolved by strapping on a headset.

You can read this e-book in the Legimi apps or in any app that supports the following format:

EPUB

Publication year: 2023



BookRix GmbH & Co. KG, 81371 Munich

Title page

Summary of Broken Code

A Summary of Jeff Horwitz’s book
Inside Facebook and the Fight to Expose Its Harmful Secrets

GP SUMMARY

Summary of Broken Code by Jeff Horwitz: Inside Facebook and the Fight to Expose Its Harmful Secrets

By GP SUMMARY. © 2023, GP SUMMARY.

All rights reserved.

Author: GP SUMMARY

Contact: [email protected]

Cover, illustration: GP SUMMARY

Editing, proofreading: GP SUMMARY

Other collaborators: GP SUMMARY

NOTE TO READERS

This is an unofficial summary & analysis of Jeff Horwitz’s “Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets” designed to enrich your reading experience.

 

DISCLAIMER

The contents of this summary are not intended to replace the original book; they are meant as a supplement to enhance the reader's understanding. The contents may not be stored electronically, transferred, or kept in a database. The document may not be copied, scanned, faxed, or retained, in part or in full, without the approval of the publisher or creator.

Limit of Liability

This eBook is licensed for your personal enjoyment only. This eBook may not be resold or given away to other people. If you are reading this book and did not purchase it, or it was not purchased for your use only, then please purchase your own copy. You agree to accept all risks of using the information presented inside this book.

Copyright 2023. All rights reserved.

1

Arturo Bejar returned to Facebook's Menlo Park campus in 2019 after six years away, feeling that something had gotten stuck. He had noticed things that seemed off, making it seem like the company didn't care about what its users experienced. Bejar's tech career was charmed, and he spent over a decade as the "Chief Paranoid" in Yahoo's security division. Mark Zuckerberg hired him as a Facebook director of engineering in 2009.

Bejar's expertise was in security but he embraced the idea that safeguarding Facebook's users meant more than just keeping out criminals. Early in his tenure, Facebook's chief operating officer asked Bejar to get to the bottom of skyrocketing user reports of nudity. His team sampled the reports and found they were overwhelmingly false. Instead of telling users to cut it out, they gave users the option to report not liking a photo of themselves, describing how it made them feel, and then prompting them to share that sentiment privately with their friend. Nudity reports dropped by roughly half.

Bejar created a team called Protect and Care, a testing ground for efforts to head off bad online experiences, promote civil interactions, and help users at risk of suicide. The only reason Bejar left the company in 2015 was because he was in the middle of a divorce and wanted to spend more time with his kids.

Returning to investigate the experience of young users on Instagram, Bejar found that everyone at Facebook was as smart, friendly, and hardworking as before, even if no one believed social media was pure upside anymore. The company's headquarters remained one of the world's best working environments, and it was good to be back.

Bejar noticed that Facebook had revamped its reporting system six months prior, redesigning it with the specific goal of reducing the number of completed user reports. The redesign reflected an arrogance in the company's approach: users would report horrible things, only to realize that Facebook wasn't interested.

Bejar found that many Facebook employees had been asking similar questions about the company's handling of social media issues. This effort, known as integrity work, required not just engineers and data scientists but intelligence analysts, economists, and anthropologists. These tech workers faced not just external adversaries but also senior executives who believed Facebook usage was an absolute good.

Facebook's integrity staffers became the keepers of knowledge that the outside world didn't know existed and their bosses refused to believe. As scrutiny of social media increased, Facebook had accumulated an ever-expanding staff devoted to studying and addressing social media's problems.

The author, who was covering Facebook for the Wall Street Journal, wanted to investigate how Facebook was altering human interaction; his political accountability reporting had come to feel pointless, and covering Facebook felt like a capitulation, an acknowledgment that the old system of information sharing and consensus building was on its last legs. Even the basics of Facebook's operations, such as its News Feed algorithm and its "People You May Know" recommendations, were difficult to pin down.

The author became familiar with Facebook's mechanics and found that its automated enforcement systems were incapable of performing as billed, and that the company knew far more about the negative effects of social media usage than it let on. He tried to cultivate current employees as sources and obtained stray documents indicating that Facebook's powers and problems alike were greater than it acknowledged.

Amid the flood of information, Frances Haugen, a mid-level product manager on Facebook's Civic Integrity team, responded to the author's LinkedIn messages, stating that Facebook's platforms eroded faith in public health, favored authoritarian demagoguery, and treated users as exploitable resources. She thought she might have to play a role in making these flaws public, a role that would eventually produce tens of thousands of pages of confidential documents showing the depth and breadth of the harm being done to everyone from teenage girls to victims of Mexican cartels.

The author found that not every insider shared Haugen's exact diagnosis of what went wrong at Facebook or her prescription for fixing it, but their accounts agreed with the written assessments of scores of employees who never spoke publicly. In the internal documents gathered by Haugen, and hundreds more provided to the author after her departure, staffers documented the demons of Facebook's design and drew up plans to restrain them.

2

Facebook's senior Public Policy and Elections staff gathered in the conference room of their old Washington, DC, office to work out what Donald Trump's upset victory meant for the company. Elliot Schrage, Facebook's head of Public Policy and Communications, was convinced that Facebook would end up as 2016's scapegoat. The election had brought a new rage to American politics, with racist dog whistles and crude taunting of opponents becoming a regular feature of mainstream news coverage. Facebook had already faced criticism for censoring trending news stories with a right-wing bent, for the attacks on Muslim and Mexican immigrants launched on its platform, and for the fabricated stories that made up much of the platform's most popular news.

Katie Harbath, the head of Facebook's Elections team and a Republican, had spent the past five years trying to prove that Facebook would transform politics. She had caught the politics bug after volunteering for a Republican Senate campaign in college, joined the Republican National Committee in 2008, and worked for the National Republican Senatorial Committee during the 2010 midterms.

Harbath bought a lot of Facebook advertising as part of her job at the NRSC and regularly consulted with Adam Conner, who had founded Facebook's DC office in 2007. With another election around the corner, Conner decided it wasn't great having Republicans like Harbath discuss advertising strategy with a Democrat like himself, and in 2011 Harbath joined the company's DC office as one of its first employees.

When the 2012 election was over, Harbath's political team hadn't won—but her corporate one had. At a time when Facebook was looking to compete with Twitter by getting into news and politics, Obama's reelection campaign's prominent use of the platform had been good for Facebook's clout. Harbath became Facebook's global emissary to the political world, traveling more than half the year to meet with major political parties in India.

Facebook's mission was compelling, and Harbath's stock proceeds covered the purchase of a two-bedroom condo in Arlington, Virginia. The political work was so successful that Facebook's Partnerships team tried to subsume it, but Joel Kaplan, the head of Facebook's Public Policy team in Washington, kept it under Harbath. Facebook published research showing it could boost election turnout on a mass scale through messages directing users to state voter registration sign-ups and digital "I Voted" stickers. Harbath wanted Facebook to build dedicated political organizing tools and channels for elected officials to interact with constituents before the next presidential election, and Zuckerberg suggested building a team devoted to civic engagement work. Harbath and her team sponsored and broadcast every major political event.

However, by the spring of 2016, Harbath started to feel something was off in online politics, particularly in the Philippines, where the president-elect, Rodrigo Duterte, had a combative and sometimes underhanded brand of politics. Facebook received reports of mass fake accounts, bald-faced lies on campaign-controlled pages, and coordinated threats of violence against Duterte critics.

In June 2016, the UK's referendum to leave the European Union further reinforced Facebook's place in politics, but for Harbath, its role wasn't a feel-good kind. Both winning campaigns had relied heavily on Facebook to push vitriol and lies. The success of Trump's campaign in the US was even more uncomfortable, as he used Facebook and Twitter to short-circuit traditional campaign coverage. Harbath broached the topic with Adam Mosseri, then Facebook's head of News Feed, but the company chose to punt when it came to lies on its platform. Facebook had signed on as a sponsor of both the Democratic and Republican conventions and threw big parties at each.

Harbath handled the Republican convention and was horrified by the speeches from Trump's oddball celebrity acolytes and the chants of "Lock her up," referring to Trump's opponent, Hillary Clinton. Facebook offered campaigns a dedicated staffer to help target Facebook ads, address technical problems, and liaise with company leadership. For the Trump campaign, Harbath turned to James Barnes, a Republican friend who worked on the company's political ad sales team. Barnes relocated to the San Antonio offices of Giles-Parscale, the web marketing firm running Trump's digital campaign.

On October 7, 2016, leaked Access Hollywood footage showed Donald Trump boasting about his unsuccessful attempt to sleep with a married woman. Barnes left the Giles-Parscale office and never returned; further work with the campaign felt distasteful and pointless. He flew back to Washington, staying in only loose touch with the Trump people. Just days before the election, an article in Bloomberg Businessweek quoted Trump's digital team boasting that it was running voter suppression operations on Facebook; Barnes had no idea what they were talking about.

On election night, Barnes took the results especially hard, feeling incredibly guilty about his work. The next morning, Facebook's top lobbyist, Joel Kaplan, wanted a word with Barnes, telling him that Trump's election was neither his fault nor Facebook's. Most Facebook executives were telling themselves the same thing: the company's core self-conception was that, by building a platform and connecting people, Facebook was making the world a more knowledgeable and understanding place.

Facebook's largely liberal employee base had repeated this idea to themselves over the years, but now they weren't so much questioning whether Facebook had elected Trump as asking how his victory was compatible with Facebook's mission. Journalists were asking the same questions, and within 24 hours of the vote being called the company was getting walloped in election analysis pieces.

Mark Zuckerberg was angry at the implication that Facebook might have thrown the election. He believed the math was on Facebook's side: misinformation accounted for just a fraction of all news viewed on Facebook, and news itself was only a small part of what users saw. He declared the possibility that Facebook had swung the election "a crazy idea."