
Four searches rehabilitate Meta’s algorithms: they don’t affect our political beliefs

From time to time an association, a study, or a journalistic scoop pops up claiming that this or that social network is harmful to a certain segment of the population (usually young and very young people). This overlooks the fact that no social platform is inherently good or bad: all of them carry enormous possibilities and equally great risks, depending on how they are used.

And the use made of them depends in turn on how literate one is in social networks. For example, do parents really know all the available filters and restrictions? And have they really talked to their children, deciding together with them what constitutes a suitable exposure – qualitatively and quantitatively – to the Net and to social networks?

All this to say that the problem of social media, and of how much it can affect our lives, is complex; settling the matter by demonizing social media itself is simplistic (besides being cleverly self-absolving).

For example, how do social networks, or more specifically Meta’s algorithms, influence our political beliefs (if they do at all)?


The four studies

The studies on Meta’s social algorithms and political beliefs were published on Thursday 27 July in the authoritative journals Science and Nature.

Their titles include, respectively, How do social media feed algorithms affect attitudes and behavior in an election campaign? (in Science) and Like-minded sources on Facebook are prevalent but not polarizing (in Nature).

The four studies were conducted by a single team of 17 researchers, and Meta contributed to their creation. For anyone inclined to turn up their nose at the independence of the reports, it should be added that Zuckerberg’s company limited itself to providing the data and did not fund any of the studies.

The studies were conducted in 2020, in the run-up to the American elections, but they have been published only now, ahead of next year’s vote, to allay the fears of those who worry that Meta’s algorithms could determine the outcome of the elections.

Meta’s social feed algorithms

The study published in Science analyzed data from 208 million users over a three-month period.

As the authors write: “We investigated the effects of Facebook and Instagram feed algorithms during the 2020 US election. A sample of consenting users was assigned feeds in reverse chronological order rather than feeds based on the default algorithms.”
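To make the manipulation concrete, here is a minimal Python sketch of the difference between the two conditions. Everything in it is illustrative: the Post fields (timestamp, engagement_score) and the ranking rules are assumptions invented for the example, not Meta’s actual data model or feed-ranking system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    timestamp: datetime        # when the post was published
    engagement_score: float    # hypothetical predicted-engagement value

def algorithmic_feed(posts):
    # Default condition: rank by the (hypothetical) engagement score.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts):
    # Experimental condition: newest first, no engagement-based ranking.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("alice", datetime(2020, 10, 1, 9, 0), 0.9),
    Post("bob", datetime(2020, 10, 1, 12, 0), 0.2),
]

print([p.author for p in algorithmic_feed(posts)])    # ['alice', 'bob']
print([p.author for p in chronological_feed(posts)])  # ['bob', 'alice']
```

In the sketch, both functions order the same set of posts; only the ranking rule differs, which mirrors the contrast between the two groups in the experiment.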

The most striking result is that conservatives in particular tend to read news labeled as misinformation. However, the research finds, “the chronological feed did not significantly alter levels of polarization.”

The other two studies

Two more of the four studies show that Meta’s algorithms did not change users’ political beliefs.

The first of the two (published in Nature) is based on an experiment involving 23,377 Facebook and Instagram users: content from pages and users expressing ideas similar to those of the participants was reduced by a third. Even so, the participants’ political leanings did not change.
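As a rough illustration of what “reduced by a third” means in practice, here is a hypothetical Python sketch. The like_minded flag, the random down-sampling, and the function itself are invented for the example; the actual experiment was carried out inside Facebook’s and Instagram’s delivery systems, not as a simple filter like this.

```python
import random

def reduce_like_minded(feed, fraction=1/3, seed=0):
    # Drop roughly `fraction` of the posts flagged as like-minded;
    # the 'like_minded' flag is an illustrative assumption.
    rng = random.Random(seed)
    return [
        post for post in feed
        if not (post["like_minded"] and rng.random() < fraction)
    ]

feed = [{"id": i, "like_minded": i % 2 == 0} for i in range(12)]
reduced = reduce_like_minded(feed)
print(len(feed), "->", len(reduced))  # some like-minded posts dropped
```

The rest of the feed is untouched; only exposure to like-minded sources is dialed down, which is the single variable the study manipulated.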

The other study shows what happened when the content-sharing button was disabled. Removing this feature certainly reduced the spread of misinformation posts; however, it did not change the political views of those involved.

Meta algorithms and political opinions

The last of the four studies specifically investigates Meta’s algorithms behind suggested posts.

Here the hypothesis that posts echoing one’s own ideas create bubbles and prevent access to a plurality of information is debunked: the research shows no substantial differences between the posts suggested by the algorithm and those that appear in chronological order.

Comments

David Lazer, a professor at Northeastern University who worked on the four studies, said: “The algorithms only make it easier for users to do what they are already inclined to do.”

Nick Clegg, Meta’s president of Global Affairs, also commented on the studies: “There is no evidence that Meta’s key features alone cause dangerous polarization, or significantly impact political beliefs.”
