Scott Brinton

Social media’s responsibility for venom and vitriol


Is Facebook responsible for genocide? It’s an unexpected question, and a frightening one, I know, but it’s one that must be asked. Hear me out.

In 2004, Facebook founder Mark Zuckerberg unleashed the social media site on the world, and its meteoric rise as one of the planet’s wealthiest, most powerful corporations — a juggernaut in the truest sense — has been breathtaking. It has a market value of just over $500 billion, and only a handful of companies are worth more, including Apple, Alphabet (Google), Amazon and Microsoft.

From Zuckerberg’s earliest days coding Facebook’s algorithms, he has been clear about his company’s mission to connect us all through a social network. That network has become so vast, so complex, that it boggles the imagination. There is little doubt in my mind that Zuckerberg intended his network for good. If people were connected socially, his reasoning went (and still goes), then they could better understand one another. If they better understood one another, they would be more likely to love (or at least not hate) one another.

The opposite happened in Myanmar, however. For five years the Burmese military spread lies through a Facebook campaign intended to incite violence against Muslims in the Southeast Asian nation of 53 million — violence that included rape, murder and forced migration. Facebook took down the accounts of senior military leaders last August, according to The New York Times. Myanmar’s disinformation campaign, intended to ensure that the military maintained power, had operated under Facebook’s radar because the military opened accounts under false names to avoid detection.

According to The Times story “A Genocide Incited by Facebook, With Posts from Myanmar’s Military,” by Paul Mozur (Oct. 15, 2018), military personnel “posed as fans of pop stars and national heroes as they flooded Facebook with their hatred. One said Islam was a global threat to Buddhism. Another shared a false story about the rape of a Buddhist woman by a Muslim man.”

Bloodshed ensued. Who’s responsible? Clearly, Myanmar. But does Facebook share responsibility? I wrestled with that question recently because, if it did, would I want to remain a Facebook user? Should I, from a moral standpoint, delete my account?

It’s a difficult decision. I love Facebook. It has connected me to friends and family members whom I had long lost touch with. It plugs me into my circle of interest and influence. For me, it does what Zuckerberg intended it to do.

Bad actors — despots, narco-gangsters, bigots — clearly understand the awesome power of Facebook, though, and exploit it for their personal gain. Witness Russia in the 2016 presidential election. Russian troll farms fed a steady stream of fake news into Facebook intended to further divide an already politically fractured nation, and many, many users joined in the frenzy, sharing often-ludicrous posts again and again.

I teach online journalism, among other courses, at Hofstra University. Should journalists feed their stories into Facebook’s News Feed, where anything goes? I ask my students. Are we not damaging our brand by posting our well-researched, well-written, well-vetted stories into the very same “space” where con artists stream their vile and venomous screeds? How do Facebook users distinguish between them and us? There are no easy answers.

Zuckerberg has been in damage-control mode of late. He recently posted a one-hour video conversation with Mathias Döpfner, CEO of Axel Springer, one of Europe’s largest media groups, which publishes 80 percent of its content online. It was supposed to be a relaxed chat that would put a friendly face on Zuckerberg, and it did, for the most part. But Döpfner questioned whether Zuckerberg intends to transform Facebook into a publisher, rather than a “neutral platform” on which anyone can publish. It’s an alarming scenario to imagine Facebook as a publisher — that is, a media outlet that employs journalists to report stories.

Facebook has 2.32 billion users. By comparison, The New York Times has 3 million digital subscribers and another million print subscribers. If Facebook were to become a publisher, it would dwarf any media outlet in the world.

Already, it is attempting to get into the media game. Since 2015, Facebook has entered into revenue-sharing agreements with media outlets, including The Times and The Washington Post, to publish their pieces on the site through a program called “Instant Articles.” The publisher does all the work; Facebook keeps part of the money. That might explain why some publishers, like The Post, have abandoned the project.

My Hofstra students say I should stay on Facebook — that we, as journalists, must continue to post our stories to spread truth as widely as possible to counteract the liars and the cheats. They’re right. We must.

So I return to my original question: Is Facebook partly responsible for genocide? I’m still deciding. One thing is clear: The company’s leaders were naïve to believe that this social network, which is so wonderful, could not also be used for evil.

Perhaps we’re all too naïve.

Scott Brinton is the Herald Community Newspapers’ executive editor and an adjunct professor at the Hofstra University Herbert School of Communication. Comments about this column? SBrinton@liherald.com.