Social media platforms like Facebook “have played a major role in exacerbating political polarization that can lead to such extremist violence,” according to a new report from researchers at New York University’s Stern Center for Business and Human Rights.
That may not seem like a shocking conclusion, but Facebook has long tried to downplay its role in fueling divisiveness. The company says that existing research shows that “social media is not a primary driver of harmful polarization.” But in their report, NYU’s researchers write that “research focused more narrowly on the years since 2016 suggests that widespread use of the major platforms has exacerbated partisan hatred.”
To make their case, the authors highlight numerous studies examining the links between polarization and social media. They also interviewed dozens of researchers, and at least one Facebook executive: Yann LeCun, Facebook’s top AI scientist.
While the report is careful to point out that social media is not the “original cause” of polarization, the authors say that Facebook and others have “intensified” it. They also note that Facebook’s own attempts to reduce divisiveness, such as de-emphasizing political content in News Feed, show the company is well aware of its role. “The introspection on polarization probably would be more productive if the company’s top executives were not publicly casting doubt on whether there is any connection between social media and political divisiveness,” the report says.
“Research shows that social media is not a primary driver of harmful polarization, but we want to help find solutions to address it,” a Facebook spokesperson said in a statement. “That is why we continually and proactively detect and remove content (like hate speech) that violates our Community Standards and we work to stop the spread of misinformation. We reduce the reach of content from Pages and Groups that repeatedly violate our policies, and connect people with trusted, credible sources for information about issues such as elections, the COVID-19 pandemic and climate change.”
The report also raises the issue that these problems are difficult to address “because the companies refuse to disclose how their platforms work.” Among the researchers’ recommendations is that Congress compel “Facebook and Google/YouTube, to share data on how algorithms rank, recommend, and remove content.” Platforms that release the data, and the independent researchers who study it, should be legally protected as part of that work, they write.
Additionally, Congress should “empower the Federal Trade Commission to draft and enforce an industry code of conduct,” and “provide research funding” for alternative business models for social media platforms. The researchers also propose several changes that Facebook and other platforms could implement immediately, including adjusting their internal algorithms to further de-emphasize polarizing content, and making those adjustments more transparent to the public. The platforms should also “double the number of human content moderators” and make them all full employees, in order to make moderation decisions more consistent.
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.