National Press Club

Panels say social media platforms pose quandaries for regulation and monitoring

April 25, 2019 | By Lorna Aldrich | lorna2@verizon.net

Legal and media experts discussed the quandaries social media platforms pose for government regulators, for self-monitoring by the platforms themselves, for journalists and for third-party monitors during a symposium at the National Press Club Thursday.

The event, which included two panels, was co-sponsored by the Club’s Journalism Institute and the University of Missouri's schools of law and journalism.

Keynote speaker Brian Stelter, chief media correspondent for CNN Worldwide, identified the challenge, “The real world and the virtual world have merged.”

Opening the legal panel’s discussions, David Vladeck, professor of law at Georgetown University and formerly director of consumer protection at the Federal Trade Commission, described the challenges of regulating social media companies. The FTC has authority to protect consumers from unfair and deceptive practices, but defining these when the issue is the harvesting and use of consumer data is a problem, he said.

“A company can, by and large, disclose its way around FTC enforcement,” he said. He added that the legal staff at Facebook, which the FTC is currently investigating, far outnumbers the legal staff available to the FTC for the case.

Cecilia Kang, national correspondent for The New York Times, cited the FTC case against Facebook, which is based on violating promises on the use of consumers’ data, as a signal the government is beginning to regulate social media.

Andrea Matwyshyn, professor of law at Northeastern University, said, “Sometimes we need to look at traditional tools and evolve them,” referring to unfair competition standards. For example, she pointed to standards of truthful marketing, which are violated when a company claims its product is totally secure. Nothing, she said, is totally secure. “All code has flaws,” she said.

She urged companies to have a process-based feedback loop, responding to discoveries of problems in products and addressing them, adding that now, “All companies are technology companies.”

Jeff Kosseff, assistant professor of cybersecurity law at the U.S. Naval Academy, pointed out that monitoring content on Internet platforms is limited by Section 230 of the Communications Decency Act of 1996, which grants Internet service providers immunity from liability for material from third parties whose content is unknown to the provider.

Anupam Chander, professor of law at Georgetown University, mentioned a proposal by Mark Zuckerberg of Facebook to create an outside oversight body, which Zuckerberg has termed a supreme court, to find and remove objectionable material. With two billion users, Facebook faces a global challenge, but Chander sees no good alternatives.

Following the legal panel, the journalists’ panel continued the focus on monitoring content, describing efforts by media companies, social media platforms, and third-party organizations.

Manuel Garcia, senior director for standards and ethics for the USA TODAY NETWORK, emphasized policies and training, which include, “If you see something, flag it.” In cases of mass shootings, reporters are urged to step back and take time before reacting, he said.

Nancy Scola, senior technology reporter for POLITICO Pro, said self-monitoring has social media platforms “really struggling.” They use both people and algorithms. Algorithms can cause problems, as when an algorithm connected the burning of Notre Dame with the burning of the World Trade Center, thereby creating rumors of terrorism. Cultural differences complicate matters further, she said.

Angie Drobnic Holan, editor of PolitiFact, works with Facebook to fact-check posts that users flag as false; Facebook downgrades the posts that monitors label as false. After two years, evaluation is difficult, she said, but the number of flagged posts has declined.

James Warren, executive editor of NewsGuard, said his organization evaluates Internet sites, as opposed to individual posts. It applies such standards as having an ethics policy, transparency, revealing ownership, correction policies and labeling sponsored content, he said. NewsGuard has evaluated 2,200 sites and is working on expanding into Europe, he reported.

A remaining challenge for his organization is finding a way to make money from this product, he said.