
Now proven! Google cannot influence the political preferences of users


Algorithms are not to blame, and the influence of “information bubbles” is exaggerated.

Many people believe that Internet algorithms, the sets of rules that determine what we see online, isolate us in so-called “information bubbles” by showing us only content that confirms our existing point of view.

Algorithms of all kinds pervade almost every aspect of our online existence and can even shape our view of the world around us. “They have some influence on how we consume information and therefore how we form our opinions,” says Katerina Ognyanova, a communications researcher at Rutgers University and co-author of the study.

It is difficult to measure exactly how much these sets of rules contribute to political polarization. According to Ognyanova, an algorithm may take into account who we are, where we are, what device we are searching from, and what language we use. But exactly how the algorithm works, we do not know for sure: it is a “black box”.

As it turns out, not all algorithms contribute to political polarization. A study published in the journal Nature on May 24 found that Google’s search engine does not serve up political news tailored to users’ preferences. Instead, politically polarized users tend to click through to like-minded news sites on their own.

For the study, the authors tracked the online behavior of Internet users during the 2018 and 2020 US election cycles using a custom web browser extension. The monitoring tool was designed to measure two specific indicators:

  1. “exposure” – which news sources Google’s search engine itself presents to users;
  2. “engagement” – which sources users then actually interact with after receiving the results (a minimal sketch follows this list).
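To make the two indicators concrete, here is a minimal Python sketch of how a logging extension’s output might be separated into the two measures. This is an illustration under assumptions, not the study’s actual tooling; the event fields, domain names, and function names are invented.

```python
from dataclasses import dataclass

@dataclass
class SearchEvent:
    """One logged event from a hypothetical monitoring extension."""
    user_id: str   # pseudonymous participant identifier
    domain: str    # news domain involved, e.g. "example-news.com"
    kind: str      # "exposure": shown by Google; "engagement": clicked by the user

def split_indicators(events):
    """Separate a participant's log into the study's two indicators."""
    exposure = [e for e in events if e.kind == "exposure"]
    engagement = [e for e in events if e.kind == "engagement"]
    return exposure, engagement

# Example: one user was shown two domains but clicked only one of them.
log = [
    SearchEvent("u1", "left-news.example", "exposure"),
    SearchEvent("u1", "right-news.example", "exposure"),
    SearchEvent("u1", "right-news.example", "engagement"),
]
shown, clicked = split_indicators(log)
print(len(shown), len(clicked))  # 2 1
```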

The researchers collected data from hundreds of users and analyzed it by participants’ age and self-reported political orientation. Participants’ political affiliation turned out to have very little relationship with what the search engine showed them, and a very noticeable relationship with what content they chose for themselves.
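As a rough illustration of that comparison (a sketch under assumptions, not the authors’ analysis code), one can assign each news domain a slant score from -1 (left-leaning) to +1 (right-leaning), average the slant of what each participant was shown versus what they clicked, and correlate both averages with self-reported political orientation. All domain names, scores, and field names below are hypothetical.

```python
from statistics import correlation, mean  # correlation requires Python 3.10+

# Hypothetical domain slant scores: -1 = left-leaning, +1 = right-leaning.
DOMAIN_SLANT = {
    "left-news.example": -0.8,
    "center-news.example": 0.0,
    "right-news.example": 0.7,
}

def mean_slant(domains):
    """Average slant of the domains we have a score for (0.0 if none)."""
    scores = [DOMAIN_SLANT[d] for d in domains if d in DOMAIN_SLANT]
    return mean(scores) if scores else 0.0

def exposure_vs_engagement(participants):
    """participants: dicts with 'ideology' (-1..1 self-report) and lists of
    domains they were 'exposed' to versus 'engaged' with."""
    ideology = [p["ideology"] for p in participants]
    exposure = [mean_slant(p["exposed"]) for p in participants]
    engagement = [mean_slant(p["engaged"]) for p in participants]
    # The study's headline finding corresponds to the first value being
    # near zero and the second being clearly positive.
    return correlation(ideology, exposure), correlation(ideology, engagement)

# Tiny made-up demo of the comparison.
demo = [
    {"ideology": -0.9, "exposed": ["left-news.example", "center-news.example"],
     "engaged": ["left-news.example"]},
    {"ideology": 0.8, "exposed": ["center-news.example", "right-news.example"],
     "engaged": ["right-news.example"]},
    {"ideology": 0.1, "exposed": ["center-news.example"],
     "engaged": ["center-news.example"]},
]
print(exposure_vs_engagement(demo))
```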

In other words, over the several months of the experiment, Google did not serve users news that clearly matched their political position. The search engine, then, does not put users in an “information bubble”; the choice always remains theirs.

The results are noteworthy because they appear to rebut accusations that Google has an “information bubble” problem. A few years ago, skeptics claimed that the company was using personal data to personalize search results, although the tech giant denied this, saying it saw no point in personalizing them.

Google’s algorithm is not flawless, however. The researchers noticed that unreliable or outright misleading news sources sometimes slip into the results, whether or not users have interacted with them before. Google representatives reviewed the study’s findings and said the company strives to keep its algorithm “relevant and reliable” and does not base search results on users’ race, religion, or political preferences.

It would be interesting to see the results of a similar study conducted on Russian search engines such as Yandex. Do their algorithms work just as impartially, or do they contribute to the formation of “information bubbles”?



Source: www.securitylab.ru
