Lawmakers grill tech CEOs on responsibility for online extremism, misinformation

Facebook CEO Mark Zuckerberg introduced his own proposal for Section 230 reform, including providing transparency on content moderation, during testimony before Congress Thursday. File Pool Photo by Hannah McKay/UPI

March 26 (UPI) — The CEOs of Facebook, Twitter and Google testified before Congress Thursday about the role of their platforms in contributing to extremism and misinformation online.

Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai appeared before a joint hearing of House Energy and Commerce subcommittees, where they faced questions about how their platforms contributed to the Jan. 6 riot at the Capitol and how the sites have been used to spread misinformation about the COVID-19 pandemic.

“The spread of disinformation and extremism has been growing online particularly on social media, with little to no guardrails in place to stop it,” Rep. Frank Pallone, D-N.J., said in opening remarks. “Unfortunately, this disinformation and extremism doesn’t just stay online, it has real-world, dangerous and even violent consequences and the time has come to hold online platforms accountable for their part.”

When asked if their platforms bore responsibility for the insurrection at the Capitol, Pichai said it was a “complex question” while Dorsey acknowledged Twitter did play a role, adding, however, that lawmakers must “take into account the broader media ecosystem.”

Zuckerberg also admitted some responsibility when asked whether rioters used Facebook to organize.

“Certainly, there was content on our services, from that perspective, I think there’s further work that we need to do,” he said.

Rep. Doris Matsui, D-Calif., asked Zuckerberg and Dorsey why their platforms hadn't banned hashtags such as "#chinesevirus" amid a surge in anti-Asian violence, including the killing of six Asian women at massage parlors during a shooting in Atlanta earlier this month.

The two CEOs said they were working to remove racist posts from their platforms but did not commit to banning the hashtags.

“If it’s combined with something that’s clearly hateful, we will take that down,” Zuckerberg said. “We need to be clear about when someone is saying something because they’re using it in a hateful way versus when they’re denouncing it.”

Zuckerberg also said Facebook would remove false posts about COVID-19 or vaccines for the virus that could lead to "imminent harm," such as someone getting sick.

“That’s the broad approach that we have … that sort of explains some of the differences between some of the different issues and how we approach them,” he said.

The CEOs also faced questions from lawmakers about Section 230, a decades-old law protecting tech companies from legal action related to the content users post on their platforms.

Rep. Jan Schakowsky, D-Ill., said she plans to introduce legislation to reform the law, placing a focus on consumer protection.

"Self-regulation has come to the end of its road," she said.

Zuckerberg proposed his own Section 230 changes, including requiring companies to publicly explain how they enforce rules on moderating content and holding them liable for illegal content, such as drug sales or child pornography, that takes place on their platforms.

Pichai said Google would “certainly welcome legislative proposals in that area” while Dorsey also said he was open to Zuckerberg’s proposal.

“We think the ideas around transparency are good,” Dorsey said. “It’s going to be very hard to determine what’s a large platform and what’s a small platform.”

Lawmakers and the CEOs also clashed throughout the hearing as the tech leaders were instructed to answer questions with brief “yes” or “no” responses.

During the hearing, Dorsey sent out a tweet with a question mark and a poll with the options “yes” and “no” in an apparent reference to the instructions.
