The chief executives of Google, Facebook and Twitter testified before the House of Representatives on Thursday about how disinformation spreads on their platforms, an issue the tech companies were scrutinized for during the presidential election and after the January 6 riot at the Capitol.
The House Energy and Commerce Committee hearing marks the first time Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey and Google’s Sundar Pichai have appeared before Congress during the Biden administration. President Biden has indicated he is likely to take a hard line on the tech industry. That position, coupled with Democratic control of Congress, has raised hopes among liberals that Washington will take steps over the next several years to curb Big Tech’s power and hold the companies accountable.
The hearing is also the first opportunity since the January 6 riot for lawmakers to question the three men about the role their companies played in the event. The attack made the issue of disinformation deeply personal to lawmakers, as those who participated in the riot have been linked to online conspiracy theories like QAnon.
Prior to the hearing, Democrats signaled in a memo that they planned to question the executives about the January 6 attack, efforts to undermine the 2020 election results, and misinformation related to the Covid-19 pandemic.
Republicans sent letters to the executives this month asking them about decisions to remove conservative figures and stories from their platforms, including an October New York Post article about President Biden’s son Hunter.
Legislators have debated whether the business models of social media platforms encourage the spread of hatred and disinformation by prioritizing content that engages users, often by highlighting eye-catching or divisive posts.
Some lawmakers will be pushing for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from liability for their users’ posts. Some legislative proposals would strip that protection in cases where the companies’ algorithms have amplified certain illegal content. Others believe the spread of disinformation could be checked by stricter antitrust enforcement, since the platforms are by far the dominant channels for online public communication.
“It is now painfully clear that neither the market nor public pressure will stop social media companies from escalating disinformation and extremism. So we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who chairs the committee.
The tech executives are expected to argue that they are doing their best to limit misinformation and redirect users to more reliable sources of information. They may also embrace the prospect of tighter regulation, seeking to shape increasingly likely legislative changes rather than oppose them outright.