This is how Google's feed responds to our choices

MeteoWeb

Engagement with unreliable news in Google's feed appears to depend mostly on users' past selections rather than on the algorithm itself or on the user's political ideology. This is the finding of a study published in the journal Nature, conducted by researchers at the Rutgers School of Communication and Information, Stanford University and Northeastern University. Despite the critical role algorithms play in choosing which news is suggested to users, little research has focused on how feeds are actually assembled.

The team, led by Catherine Ognyanova, compared exposure, i.e. the set of links shown in search results; follows, i.e. the results people choose to click; and engagement, i.e. the set of sites users visit while browsing. The researchers addressed a long-standing concern that digital algorithms learn from past preferences and surface content that reinforces users' attitudes and biases. Their results suggest that feed results look only slightly different depending on political ideology, and that the real differences emerge once people start choosing which web pages to visit.

The scholars note that this work highlights how Google's algorithms can sometimes surface polarizing and dangerous results, even though those results appear fairly uniformly across users with different political views. The research team gathered the data in two waves, pairing survey responses with empirical data from a browser extension designed to measure exposure to, and engagement with, specific online content during the 2018 and 2020 elections in the United States.

As part of the study, 1,021 respondents installed a browser extension for Chrome and Firefox. The program recorded the URLs of Google search results, search histories, and a variety of data about what users viewed. The survey portion, in turn, aimed to determine the political orientation of the participants. The results showed that political identity and ideology had little to do with exposure, that is, with the quality of news users were shown. On the contrary, a clear link emerged between political identification and engagement with polarizing content. "Search engines," Ognyanova comments, "can show people unreliable content, but our work confirms that users themselves, and the choices they make over time, can directly influence the type of links that are served in their feed."
