How to Stand Up to a Dictator
In interviews, Maria Ressa explains the complex problems of social media with striking clarity, but she wavers when it comes to concrete steps users can take in the short term.
Hello! I hope your first week of December is off to a good start. In my case, I experienced the juggernaut that is Manila's Christmas-rush traffic the other day: starting at 3 p.m., I got stuck on the road for two hours on what should have been a benign 14.5-km drive. I had to remind myself how Manila felt like a ghost town in 2020, and how this year is the first time we will have some semblance of all-out normalcy since the pandemic ravaged the country, traffic and crowds included.
Nobel Peace Prize winner and Filipino-American journalist Maria Ressa was a guest on The Late Show with Stephen Colbert on November 29 to promote her book, How to Stand Up to a Dictator. Asked the titular question, she demurred at the idea that she could condense the answer within the limits of a late-night-show format, given that she had written an entire book about it. Still, it's worth watching how she explains complex issues with such eloquence and clarity in nine minutes.
At 6:55, Mr. Colbert asked how Mark Zuckerberg and Elon Musk can do the "right thing" (via regulation and editorial control) with their social media companies, as opposed to "the most profitable or expedient thing." Edited for brevity:
“Social media is mildly addictive… What keeps you scrolling? Lies (cites MIT study on how lies spread faster and further than facts, which I found here behind a paywall), fear, anger, hatred, us against them...
“More money comes in as you become more afraid, more hateful, and it impacts us on many levels: personally, psychologically, sociologically—we behave differently in groups—and then finally, that last part which we rarely talk about is emergent human behavior: what kind of species behavior are we encouraging with this information ecosystem? It is the worst of who we are.”
This resonated with why I started this Substack in the first place. Her answer also hits squarely at the economic model of social media platforms: generating revenue by using detailed behavioral information to target ads at individual users (Frank, 2021). But it's hard to pin down what was left unsaid in her indirect answer, and I wonder how far she goes in her book toward saying it explicitly. (By the way, preferring a hard copy, I looked for her book both online and in-store at Fully Booked, but it was unavailable.)
As for what we can do, Ms. Ressa (4:00) said, “Where is the line you draw, where on this side you’re good and on this side you’re evil? And that’s an individual battle for each of us—that’s where it begins.”
Unsatisfied, I searched for her other recent interviews. On CBS Mornings, Gayle King asked (6:59) how one can fight back against someone whose reality is different from yours. Ms. Ressa cited education in the long term and legislation in the medium term. "In the short term, it is just us. It is hand-to-hand combat."
Ms. Ressa does not detail what this entails, but given the context, it seems to imply countering disinformation on social media platforms as well.
In his whitepaper, "Disinformation and Its Effects on Social Capital Networks," investigative journalist Dave Troy wrote:
“… no amount of ‘truth’ poured onto a society that has been ravaged by disinformation and social dysfunction can restore it. Instead, we must recognize the social nature of the problem and begin to ‘re-grow’ social capital in purposeful and imaginative new ways.”
This is also supported by the MIT study Ms. Ressa cited, in which the authors found that humans, not bots, are primarily responsible for the spread of misleading information:
“Now behavioral interventions become even more important in our fight to stop the spread of false news,” Sinan Aral, one of the study’s co-authors, said in MIT’s press release. “Whereas if it were just bots, we would need a technological solution.”
To me, short-term solutions, specifically online ones, are limited beyond protecting our mental well-being and managing our own behavioral responses to and on social media (and this is what I will continue to explore here in my newsletter). The long- and medium-term solutions frame human behavior as the problem, and at first glance this may seem to exonerate social media companies. But Ms. Ressa is right to ask what emergent behavior in our species these technologies are encouraging when humans are best seen and heard at their worst.