
Book review: ‘Broken Code’ by Jeff Horwitz


For all the public scrutiny heaped on tech companies in recent years, few people know how Facebook really works. Certainly not lawmakers and, sometimes, not even Facebook itself. The company that shapes the informational diet and worldviews of billions of people is a behemoth of growing complexity, a knotwork of automated systems and carefully constructed algorithms that exist behind a scrim of corporate secrecy. The average observer tends to glimpse Facebook piecemeal, finding a privacy scandal here, intrusive advertising there, perhaps some hate speech in the timeline, all of it forming an incomplete mental image of how the platform operates and why users see what they see.

“Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets,” the new book by Wall Street Journal reporter Jeff Horwitz, tries to give us the whole elephant. The book extends the Facebook Files, a prizewinning series of articles that Horwitz and his colleagues based on more than 20,000 screenshots of internal Facebook documents supplied by Frances Haugen, who worked as a product manager at Facebook before becoming disillusioned. “Broken Code” offers a comprehensive, briskly reported examination of the key systems governing the platform and their many failings. Combining Haugen’s trove of original sources with interviews with Facebook insiders, Horwitz sets out to demonstrate that Facebook is perhaps less deliberately malevolent and more casually destructive than previously thought: rending the social fabric, funneling its users into extremist groups, catalyzing political polarization, flooding the infoscape with disinformation, and providing tools that inadvertently facilitate human trafficking and other varieties of exploitation and fraud.

According to Horwitz, Facebook has long known what’s wrong with its platform — the company employs a boatload of researchers — but it would rather not know, so reports and memos sometimes get buried. As Horwitz chronicles, a reporting system might be redesigned simply to produce fewer user reports. A request to tweak an algorithm in a way that would reduce the spread of fake news was approved, but Meta CEO Mark Zuckerberg ordered that its impact be cut by 80 percent, lest it affect growth or anger power users. Zuckerberg then told the manager who proposed the change, “Don’t bring me something like this again.”

“Efforts to engineer growth had inadvertently rewarded political zealotry,” Horwitz writes. “And the company knew far more about the negative effects of social media usage than it let on.”

In Horwitz’s telling, Facebook’s leadership suffered from the egoism of the mission-driven corporation, believing that it could do little wrong and that the success of the platform was an inherent good. They were, after all, linking the globe. Their ambitions were utopian — and deeply lucrative. “They had never considered the possibility that it could do harm on the same scale,” Horwitz writes.

By the 2016 U.S. presidential election, Facebook, with its sophisticated tools for targeting people according to narrow demographic data and interests, had become a key tool for political campaigns. Facebook offered help to both Donald Trump’s and Hillary Clinton’s campaigns. Only Trump accepted, and so a Facebook staffer went to San Antonio to embed himself in the office from which Brad Parscale directed Trump’s 2016 digital efforts. Facebook’s targeting tools were also credited with aiding in the election of Rodrigo Duterte in the Philippines and Narendra Modi in India. The company was helping bring authoritarians into office and then becoming, in the eyes of some human rights groups, complicit in their acts of repression.

“The truth was that it had no idea what was happening on its platform in most countries,” Horwitz writes. Non-Western countries were consigned to the category “Rest of World.” Despite being a dominant communications and broadcast medium in dozens of countries, Facebook often had no native speakers, no policy experts and no real on-the-ground staff in these places. “Most of our integrity systems are less effective outside of the United States,” noted one employee.


When Facebook did expand internationally, it hired connected elites and tended to accede to demands from authoritarian governments. In Vietnam, government officials made Facebook more difficult to access after the company refused to remove the accounts of some activists. After seven weeks, Facebook capitulated, reasoning that it was essential to keep the platform available in Vietnam in any form. The same logic applied to Facebook’s efforts to expand into China, for which the company developed censorship tools that would satisfy the government’s strict expectations.

Horwitz’s book contains valuable accounts of internal company research and disputes between its various teams — civic integrity, growth, news feed, safety — many of them sincerely motivated to make the best out of a bad hand. There is a culture of internal critique at Facebook — rowdy online staff forums, all-hands meetings where employees can question Zuckerberg — but it has been tamped down in recent years as the company, subject to constant outside attacks and leaks, has strengthened internal security controls. By the time Horwitz met Haugen, who became his key source, “staffers with law enforcement backgrounds [had become] more common.”

In Horwitz’s account, Facebook is constantly working to compensate for its own inherent structural flaws. As a force for content distribution and for exciting people’s worst emotional appetites, Facebook is unparalleled. Its libidinal appeal is potentially endless. “It was a Ferrari, a machine designed for one thing: infinite scroll,” says the Facebook employee who was embedded in Trump’s campaign. Indeed, some of Facebook’s most problematic users — the ones who spread colossal amounts of racist content, for example — are those who post obsessively, thousands of times per day. Wary of outright censorship and bad press, the company instead finds ways to suppress the reach of these users’ content or channel their behaviors to less destructive ends. “If people had to be bigots, the company would prefer they be bigots on Facebook,” observes a member of the Integrity team.

More than once in this book, a maverick engineer or data scientist is brought in to overhaul a key company system, only to find that their ability to enact substantive change is limited. Growth remains the imperative. And Zuckerberg — a mostly cool, distant figure in Horwitz’s narrative — is sovereign. He welcomes disruption — “move fast and break things” being the company’s infamous longtime motto — but he has interests to protect.

Still, the problems are clear, and elemental to Facebook’s nature. The company knows that Instagram, with its endless process of social comparison, produces body image issues in, by its own estimates, a third of teenage girls. People feel worse after using Instagram. Facebook produces similar depressive effects. But what can the company do without blowing up what it has? In response to one proposed set of changes, an employee observed that “the Well Being team was suggesting that Instagram become less like Instagram.”

“Broken Code” is light on summary conclusions, and that’s for the best. Too many tech books offer 11 chapters of doom and gloom, diagnosing our dire predicament of mass surveillance and exploitative automated systems, and then follow all that up with one chapter of false solace where everything, in a brushstroke (or maybe a mouse click), is easily solved. Almost a decade ago, I wrote one such book. “Broken Code” is something better. It’s a smartly reported investigation into the messy internal machinations of one of the world’s most important and least understood companies. Horwitz emerges with the company’s dirty secrets but no pat conclusions. That’s left to the reader, who might decide that all of this has to go.

Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets

By Jeff Horwitz

Doubleday. 330 pp. $32.50
