On Tuesday, lawmakers in the US House of Representatives heard from three academics who argued that social media companies cannot be trusted to control themselves.
This may seem like an easy case to make, given Facebook’s serial involvement in controversies over the past few years, its long-standing allergy to outside scrutiny, and its unrepentant decision this week to pause development of “Instagram Kids” only after being pilloried by news reports that the company’s own research acknowledged Instagram’s mental toll on teenagers, especially teenage girls.
And so it was. The hearing’s title was its own spoiler: “The Disinformation Black Box: Researching Social Media Data.” Despite Facebook’s weekend PR pushback against its accusers – it took issue with the claim that Instagram is “toxic” to teens – the lawmakers involved have become convinced that social media companies are spreading disinformation and operating without adequate oversight.
Citing the damage caused by misinformation – the US Capitol insurrection in January, lies about the severity of COVID-19, and continued misinformation about vaccines – Bill Foster (D-IL), chairman of the US House of Representatives Science, Space, and Technology Committee’s Subcommittee on Investigations and Oversight, called social media manipulation a threat to public health and lamented social media companies’ refusal to provide the internal data necessary to respond.
“Unfortunately, it’s extremely difficult for researchers to have sufficient access to social media data,” Foster said.
“Companies make some information public, but a lot of it is through interfaces they control, which means researchers can only see what companies want them to see. And access can be cut off at any time.”
The chickens come home to roost
Foster may have had a recent denial of access in mind: Facebook’s decision in August to shut down the accounts of NYU researchers investigating the company’s advertising operations.
Laura Edelson, a doctoral student in computer science at New York University and one of the researchers who lost access to Facebook because of her work on the NYU Ad Observatory project, was among three witnesses who testified – via video statements and longer written remarks.
“Tobacco companies don’t decide who researches smoking, and the idea of social media companies deciding who studies them is perverse,” Edelson said.
“Lack of data is currently the most serious obstacle to the work of disinformation researchers,” she said, noting that Twitter is the only major social media company that allows researchers to access public data, but at a high cost.
Facebook bought a company called CrowdTangle in 2016, and a few researchers use it to access data, though it is primarily offered as a business analytics product. And other platforms, like YouTube and TikTok, she said, don’t offer any suitable tools.
Researchers from her own team, from Mozilla, and journalists, she said, have attempted to crowdsource data collection from social media. But some of the social media platforms, she said, have been hostile, and she pointed to Facebook’s cancellation of her research team’s accounts this summer and the company’s legal threats against the German group AlgorithmWatch.
“It’s time for Congress to act to ensure researchers and the public have access to the data we need to protect ourselves from disinformation online,” she said.
No incentive and no oversight
Alan Mislove, professor and acting dean of Khoury College of Computer Sciences at Northeastern University, came to a similar conclusion.
“Social media platforms currently do not have the appropriate incentives to allow research on their platforms, and have been observed to be actively hostile to important ethical research that is in the public interest,” he said.
“As the power and influence of these platforms reaches new heights, our ability as independent researchers to understand the impact they are having is diminished every day. So I and other researchers need help from Congress to gain sufficient access to data from social media platforms, to ensure that the benefits of these platforms do not come at too high a cost to society.”
Kevin Leicht, professor of sociology at the University of Illinois at Urbana-Champaign, sent a similar message. “The biggest gap we see in research is in the data and algorithms, or the black box, that social media companies use to determine what end users see,” he said. “And at some level, we need to access not only the data but the black box.”
Edelson, in her prepared remarks, urged Congress to pass a universal digital ad transparency law that would require digital ad platforms to make their ads available in a machine-readable format. And she said she intended to publish a draft proposal soon.
Mislove said proposed legislation, such as the Algorithmic Justice and Online Platform Transparency Act of 2021 and the Social Media Disclosure and Transparency of Advertisements (DATA) Act of 2021, would be helpful to researchers.
Mozilla, which helped vet NYU’s Ad Observatory software, also recently endorsed the Social Media DATA Act to make ad platforms more transparent.
“Transparency is the essential first step towards holding social media platforms accountable for damaging outcomes,” said Marshall Erwin, director of security at Mozilla, in a statement emailed to The Register.
“Without insight into what people are seeing, what ads are shown to them and why, what content is recommended to them and why, we cannot begin to understand how misinformation spreads.”
Facebook did not respond to a request for comment. ®