
NY Times Op-Ed, Oct. 4, 2018
Russian Meddling Is a Symptom, Not the Disease
By Zeynep Tufekci

(Dr. Tufekci is a professor who studies the social effects of technology.)

Given the credible evidence of Russian meddling in the 2016 presidential election, it’s only natural that Americans are concerned about the possibility of further foreign interference, especially as the midterms draw closer.

But I worry that we’re focusing too much on the foreign part of the problem — in which social media accounts and pages controlled by overseas “troll factories” post false and divisive material — and not enough on how our own domestic political polarization feeds into the basic business model of companies like Facebook and YouTube.

It’s this interaction — both aspects of which are homegrown — that fosters the dissemination of false and divisive material, and this will persist as a major problem even in the absence of concerted foreign efforts.

Consider some telling exchanges from this year’s Senate hearings involving high-level executives from Facebook and Twitter. (Google, which owns YouTube, didn’t bother sending a comparable representative.) In April, Senator Kamala Harris, Democrat of California, pressed Facebook’s chief executive, Mark Zuckerberg, on how much money the company had made from ads placed by the Internet Research Agency, a Russian troll factory. Mr. Zuckerberg replied that it was about $100,000 — a negligible amount of money for the company.

Last month, Ms. Harris further grilled Sheryl Sandberg, Facebook’s chief operating officer, on this point, demanding to know how much inauthentic Russian content was on Facebook. Ms. Sandberg had her sound bite ready, saying that “any amount is too much,” but she ultimately offered an estimate of 0.004 percent, another negligible amount.

The exchange made for good viewing: a senator asking tough questions, chastised executives being forced to put exact numbers on the table. But the truth is that paid Russian content was almost certainly immaterial to Facebook’s revenue — and the 0.004 percent figure, though almost certainly rhetorical, does capture the relative insignificance of the paid Russian presence on Facebook.

Contrast this, however, with another question from Ms. Harris, in which she asked Ms. Sandberg how Facebook can “reconcile an incentive to create and increase your user engagement when the content that generates a lot of engagement is often inflammatory and hateful.” That astute question Ms. Sandberg completely sidestepped, which was no surprise: No statistic can paper over the fact that this is a real problem.

Facebook, Twitter and YouTube have business models that thrive on the outrageous, the incendiary and the eye-catching, because such content generates “engagement” and captures our attention, which the platforms then sell to advertisers, paired with extensive data on users that allow advertisers (and propagandists) to “microtarget” us at an individual level.

Traditional media outlets, of course, are frequently also cynical manipulators of sensationalistic content, but social media is better able to weaponize it. Algorithms can measure what content best “engages” each user and can target him or her individually in a way that the sleaziest editor of a broadcast medium could only dream of.

In the case of microtargeted advertisements, Silicon Valley’s basic business model is also more worrisome than the relatively small number of ads bought by Russian operatives. For example, less than two weeks before the 2016 presidential election, a senior official with the Trump campaign told Bloomberg Businessweek that it had “three major voter suppression operations underway.” These were aimed at young women, African-Americans and white idealistic liberals. One of the key tools of the operation, according to Brad Parscale, a Trump campaign manager, was Facebook “dark posts” or “nonpublic posts,” whose viewership was controlled so that “only the people we want to see it, see it.”

Mr. Parscale said that Facebook turned out to be so useful that the Trump campaign spent most of its budget of $94 million on the platform, and in response, Facebook helpfully embedded staff members with the Trump campaign (as it does with major advertisers) to help it spend its money more effectively.

Microtargeting, while lucrative for Silicon Valley, brings with it many other problems. Investigators at ProPublica found that Facebook was allowing advertisers to post job and housing ads that specifically excluded African-Americans, and even made it possible for people to select “Jew-haters” as a category to reach with paid messaging. The American Civil Liberties Union found that Facebook was allowing employers to show jobs only to men or to younger employees — excluding women and older workers.

In response to a barrage of criticism, Facebook has agreed to make all advertisements on its site available on a public database and limit some forms of microtargeting, but we still don’t know exactly what happened in 2016: What features of Facebook were used to help the Trump campaign find the voters who could be subject to this “voter suppression” effort? How was the platform used to test the messages that might be most effective at this?

This all leads to two other questions: Why does it take outsiders to discover such flagrant problems, and why does it take so much outside pressure to get the company to act? Again, follow the money: Silicon Valley is profitable partly because it employs so few people in comparison to its user base of billions of people. Most of its employees aren’t busy looking for such problems.

Wall Street is under no illusion about how things work for these companies. When Mr. Zuckerberg announced that Facebook would try to do better — even if it hurt profits — and when Twitter started purging bots, bringing down the number of “users” the company can report, their stock prices dropped because of worries about their long-term profitability.

It is understandable that legislators and the public are concerned about other countries meddling in our elections. But foreign meddling is to our politics what a fever is to tuberculosis: a mere symptom of a deeper problem. To heal, we need the correct diagnosis followed by action that treats the underlying disease. The closer our legislators look at our own domestic politics as well as Silicon Valley’s business model, the better the answers they will find.

Zeynep Tufekci (@zeynep) is an associate professor at the School of Information and Library Science at the University of North Carolina, the author of “Twitter and Tear Gas: The Power and Fragility of Networked Protest” and a contributing opinion writer.