
Stop Disinformation and Defend Democracy — Scholars See New Approach

Huge data platforms such as Facebook and YouTube have a corrosive impact on US and European politics. Proposed laws would require them to open their data to researchers, which could provide insight into the hidden systems that enable and promote disinformation. Credit: Neil Freese / UC Berkeley

Facebook, YouTube, Twitter — in little more than a decade, these and other large data platforms have transformed society. But each is like a black box. Even as they are accused of damaging public health and eroding democracy, and even as they earn tens of billions of dollars each year, their inner workings remain barely visible.


But now, US and European lawmakers are taking steps that would give researchers new rights to access and analyze data from these powerful platforms. Brandie Nonnecke, a technology scholar and director of the CITRIS Policy Lab at the University of California, Berkeley, says this could be an important step toward understanding how the platforms help disseminate the false information and disinformation that undermine the country’s well-being.

In an analysis published today (February 11) in the journal Science, Nonnecke and co-author Camille Carlton detail policy measures that would force Facebook and other platforms to open up access to the vast amounts of data they collect from billions of users. The fate of the bills is uncertain, but Nonnecke said the stakes are historic.

“These platforms play an increasingly influential role in our social, economic and political interactions, yet they are rarely monitored,” Nonnecke said in an interview. “That’s a problem. In the United States, after the election, we saw how the platforms were used to manipulate public opinion and influence voting behavior.”

Are these bills necessary to protect democracy? “Yes, yes. Absolutely,” she added.

Nonnecke is an influential scholar of information and communication technology, artificial intelligence, Internet governance, and how all of these intersect with the public good. Her work has been published in leading academic journals and news media, and she has consulted with policy makers in the United States and around the world.

Carlton is the communications manager of the Center for Humane Technology.

Pursuing profit, ignoring risk

The proposed laws come at a time when criticism of the 21st-century media giants is escalating across the political spectrum.

Investigations and news reports have detailed how various national and international actors used social media to manipulate the 2016 and 2020 US presidential elections.

Then, in October last year, whistleblower Frances Haugen, a former Facebook data scientist, testified before Congress that corporate leaders knew their products were deepening political divisions and endangering children.

Such influence operations depend on advanced technology to manipulate readers and viewers. But according to Nonnecke, a central concern is that companies such as Facebook and YouTube (owned by Google) rely on “recommender systems” that gauge users’ interests and steer them toward provocative content that is not always reliable.

“Because people engage with that type of content, these systems prioritize the sensational,” she explained. “We rubberneck. We fixate on shocking content, and the platforms know it keeps us watching the screen.”
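The dynamic Nonnecke describes can be illustrated with a toy sketch. The real ranking systems at Facebook and YouTube are proprietary and vastly more complex; the item titles and engagement scores below are invented for illustration. The point is simply that a feed ordered purely by predicted engagement will surface sensational content above reliable reporting:

```python
def rank_by_engagement(items):
    """Order feed items by predicted engagement alone, ignoring reliability."""
    return sorted(items, key=lambda item: item["predicted_engagement"], reverse=True)

# Hypothetical feed: engagement scores and titles are made up for this sketch.
feed = [
    {"title": "Fact-checked election report", "predicted_engagement": 0.21, "reliable": True},
    {"title": "Shocking conspiracy claim",    "predicted_engagement": 0.87, "reliable": False},
    {"title": "Routine policy update",        "predicted_engagement": 0.10, "reliable": True},
]

for item in rank_by_engagement(feed):
    # The unreliable but "shocking" item rises to the top of the feed.
    print(f'{item["predicted_engagement"]:.2f}  {item["title"]}')
```

Real systems optimize many signals (watch time, clicks, shares), but as long as sensational content reliably draws more engagement, an objective like this one will keep promoting it.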

But how exactly do the companies run these processes?

“We have some ideas, but more research is needed,” Nonnecke said. “For example, as a researcher, it’s very difficult for me to see the virality of false information or disinformation campaigns.”

Some data on traffic and targeting is available through other legal means, and some researchers have probed platforms’ recommender systems with careful experimentation and analysis. But that is difficult work, she added.

“As researchers, we’ve been asking for more access to platform data for years. We know that to understand the impact these platforms have on our society, we need access to that data.”

Use the law to pry open a black box

Digital media giants aren’t blocking all data access. Nonnecke emphasized that some platforms have made ongoing efforts to build partnerships with independent researchers. But the companies sometimes release incomplete data, and much of the information remains off limits.

Various legislative efforts to regulate media platforms have been made, but so far with little success. The legislation now proposed in the West goes much further toward corporate transparency, requiring companies to work with researchers who want to investigate their internal workings.


In 2011, Facebook’s new data center in the high-desert town of Prineville, Oregon, covered 150,000 square feet. Since then, the complex has grown significantly to process the data generated by Facebook’s global community. By the end of 2023, Facebook will have 11 data centers there, covering over 4.5 million square feet and costing $2 billion. Credit: Tom Raftery / Wikimedia Commons

“We need accountability. Accountability comes from transparency, and transparency comes from giving researchers access to the data, in the spirit of establishing proper oversight and guidance by law.”

She said the proposed law could be “transformative.”

Europe: The European Commission has already approved a Code of Practice on Disinformation that supports researchers’ access to data. The Digital Services Act, which has passed the European Parliament and is now being considered by Member States, would require the largest online platforms to be active partners in the fight against disinformation.

They would need to assess the systemic social, economic and political risks arising from their systems and implement strategies to limit those risks. Their assessments and the relevant data would be open to audits by government agencies and researchers.

Specifically, the proposed law would require large platforms to provide access to data on the risks posed by their operations; on the behavior and accuracy of the algorithms that shape content recommendations; and on their processes for content moderation and complaint handling.

According to Nonnecke, the European bill appears to be on a path to passage.

United States: In the United States, Nonnecke and Carlton write, a bipartisan bill introduced in Congress in December, which would require large platforms to make data available for research and oversight, is the “most comprehensive” proposal so far.

Known as the Platform Accountability and Transparency Act, the bill sets out a formal process in which the National Science Foundation (NSF) would evaluate proposed studies that require platform data. Once a project is approved, the Federal Trade Commission (FTC) would work with the platform to manage the release of data to the NSF-approved researchers.

The FTC would also have the authority to require platforms to disclose data and other information that helps researchers, journalists and others assess how the platforms may be harming society.

Controlling “real, concrete, visible harm”

Both the European and US measures aim to protect the identities of individual users, and both give businesses mechanisms to protect some data as trade secrets.

However, Nonnecke flagged some possible shortcomings. Of particular concern is the potential for access to be restricted to research institutions with advanced infrastructure for managing data and meeting cybersecurity requirements. That could be a barrier to smaller or less wealthy institutions, she said, undermining the essential need for diversity among researchers.

Second, for the data to be of real value to scientific research, platforms must include metadata and other contextual information, such as how the data were cleaned, transformed and modified before being passed on to researchers. “This ensures that the data, and the research insights drawn from them, are of higher quality and accuracy,” says Nonnecke.

It is not yet known how Facebook, YouTube and other large platforms will react to these measures. They could welcome laws that provide a rational process and clear procedures, Nonnecke said.

Or they may resist.

“Overall, companies want to protect themselves and protect shareholder value,” she explained. “That’s why they don’t want the skeletons in the closet to be transparent to the world.”

Nonnecke sees some parallels with the efforts to regulate tobacco in the 1950s and 1960s.

When confronted with scientific evidence that smoking causes cancer, Big Tobacco initially “denied, denied, denied,” she said. “Finally, Congress said: ‘No, the research is clear. We are imposing these restrictions on you for the health and well-being of the people.’”

Still, she said, it’s more difficult to assess the impact of big data platforms across society. “How do you regulate a platform that is difficult to evaluate? What if you don’t understand how it works and how people interact with it?”

“Clearly, the effects aren’t the same as causing cancer. But there are real, concrete, visible harms that the platforms cause. Further research is needed to see them, to understand what is happening, and to learn how to minimize the harm.”




For more information:
Brandie Nonnecke et al, EU and US legislation seek to open up digital platform data, Science (2022). DOI: 10.1126/science.abl8537

Citation: Stop Disinformation and Defend Democracy — Scholars See New Approach (February 14, 2022), retrieved February 14, 2022 from https://phys.org/news/2022-02-thwarting-disinformation-defending-democracyscholar-approach.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


