HACKINTOSH.ORG | Macintosh discussion forums

Macintosh News => Apple News => Topic started by: HCK on July 29, 2023, 04:05:02 pm



Title: New research shows how Meta's algorithms shaped users' 2020 election feeds
Post by: HCK on July 29, 2023, 04:05:02 pm
New research shows how Meta's algorithms shaped users' 2020 election feeds

Nearly three years ago, Meta announced it was partnering with more than a dozen independent researchers to study the impact Facebook and Instagram had on the 2020 election (https://about.fb.com/news/2020/08/research-impact-of-facebook-and-instagram-on-us-election/). Both Meta and the researchers promised the project, which would rely on troves of internal data, would deliver an independent look at issues like polarization and misinformation.

Now we have the first results of that research (https://about.fb.com/news/2023/07/research-social-media-impact-elections/) in the form of four peer-reviewed papers published in the journals Science and Nature. The studies offer an intriguing new look at how Facebook and Instagram’s algorithms affected what users saw in the run-up to the 2020 presidential election.

The papers are also a notable milestone for Meta. The company has at times had a strained relationship with independent researchers (https://www.engadget.com/facebook-interfering-research-tools-the-markup-212026086.html) and has been accused of “transparency theater” (https://www.engadget.com/facebook-data-transparency-researchers-170021370.html) in its efforts to make more data available to those wishing to understand what’s happening on its platform. In a statement, Meta’s policy chief Nick Clegg said the research suggests Facebook may not be as influential in shaping its users’ political beliefs as many believe. “The experimental studies add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization, or have meaningful effects on key political attitudes, beliefs or behaviors,” he wrote.

The researchers’ initial findings, however, appear to paint a more complex picture.

One study, in Nature (https://www.nature.com/articles/s41586-023-06297-w), looked at the effect of so-called “echo chambers,” in which users are exposed to a large amount of “like-minded” sources. While the researchers confirm that most users in the US see a majority of content from “like-minded friends, Pages and groups,” they note that not all of it is explicitly political or news-related. They also found that decreasing the amount of “like-minded” content reduced engagement, but didn’t measurably change users’ beliefs or attitudes.
While the authors note the results don’t account for the “cumulative effects” that years of social media use may have had on their subjects, they do suggest the effects of echo chambers are often mischaracterized.

Another study, published in Science (https://www.science.org/doi/10.1126/science.abp9364), looked at the effect of chronological feeds compared with algorithmically generated ones. That issue gained particular prominence in 2021, thanks to revelations from whistleblower Frances Haugen, who has advocated for a return to chronological feeds (https://www.engadget.com/what-facebook-whistleblower-frances-haugen-said-should-change-143051354.html). Unsurprisingly, the researchers concluded that Facebook and Instagram’s algorithmic feeds “strongly influenced users’ experiences.”

“The Chronological Feed dramatically reduced the amount of time users spent on the platform, reduced how much users engaged with content when they were on the platform, and altered the mix of content they were served,” the authors write. “Users saw more content from ideologically moderate friends and sources with mixed audiences; more political content; more content from untrustworthy sources; and less content classified as uncivil or containing slur words than they would have on the Algorithmic Feed.”

At the same time, the researchers say a chronological feed “did not cause detectable changes in downstream political attitudes, knowledge, or offline behavior.”

Likewise, another study in Science (https://www.science.org/doi/10.1126/science.add8424), on the effects of reshared content in the run-up to the 2020 election, found that removing reshared content “substantially decreases the amount of political news, including content from untrustworthy sources,” but didn’t “significantly affect political polarization or any measure of individual-level political attitudes.”
Finally, researchers analyzed the political news stories that appeared in users’ feeds in the context of whether those users were liberal or conservative (https://www.science.org/doi/10.1126/science.ade7138). They concluded that Facebook is “substantially segregated ideologically,” but that “ideological segregation manifests far more in content posted by Pages and Groups than in content posted by friends.” They also found that conservative users were far more likely to see content from “untrustworthy” sources, as well as articles rated false by the company’s third-party fact checkers.

The researchers said the results were a “manifestation of how Pages and Groups provide a very powerful curation and dissemination machine that is used especially effectively by sources with predominantly conservative audiences.”

While some of the findings look good for Meta, which has long argued that political content is only a small minority of what most users see, one of the most notable takeaways from the research is that there aren’t obvious solutions for addressing the polarization that does exist on social media. “The results of these experiments do not show that the platforms are not the problem, but they show that they are not the solution,” the University of Konstanz’s David Garcia, who was part of the research team, told Science (https://www.science.org/content/article/does-social-media-polarize-voters-unprecedented-experiments-facebook-users-reveal).

This article originally appeared on Engadget at https://www.engadget.com/new-research-shows-how-metas-algorithms-shaped-users-2020-election-feeds-213002211.html?src=rss

Source: New research shows how Meta's algorithms shaped users' 2020 election feeds (https://www.engadget.com/new-research-shows-how-metas-algorithms-shaped-users-2020-election-feeds-213002211.html?src=rss)