People, not Google’s algorithm, create their own partisan ‘bubbles’ online

From Thanksgiving dinner conversations to pop culture discourse, it’s easy to feel like people of different political persuasions occupy entirely separate worlds, especially online. People often blame algorithms, the invisible sets of rules that shape online landscapes from social media to search engines, for sealing us off in digital “filter bubbles” by feeding us content that reinforces our existing worldview.

Algorithms are always biased: Studies have shown that Facebook ads target particular racial and gender demographics. Dating apps select matches based on a user’s previous swiping history. And search engines prioritize links based on what they deem most relevant. But according to new research, not all algorithms generate political polarization.

A study published today in Nature found that Google’s search engine does not return disproportionately partisan results. Instead, politically polarized Google users tend to isolate themselves by clicking links to partisan news sites. These findings suggest that, at least when it comes to Google searches, it may be easier for people to escape echo chambers online than previously thought, but only if they choose to do so.

Algorithms permeate almost every aspect of our online existence and are capable of shaping the way we view the world around us. “They do have some impact on how we consume information and therefore how we form opinions,” says Katherine Ognyanova, a communications researcher at Rutgers University and co-author of the new research.

But sometimes it can be hard to quantify how much these programs drive political polarization. An algorithm could analyze “who you are, where you are, what kind of device you’re searching from, geography, language,” says Ognyanova. “But we don’t really know exactly how the algorithm works. It’s a black box.”

Most studies looking at algorithm-driven political polarization have focused on social media platforms like Twitter and Facebook rather than search engines. This is because, until recently, it has been easier for researchers to obtain usable data from social networking sites with their public-facing software interfaces. “For search engines, there is no such tool,” says Daniel Trielli, an incoming assistant professor of media and democracy at the University of Maryland, who was not involved in the study.

But Ognyanova and her co-authors found a way around this problem. Instead of relying on anonymized public data, they sent volunteers a browser extension that logged all their Google search results, and the links they followed from those pages, over the course of several months. The extension acted like a backyard camera trap photographing animals; in this case, it provided snapshots of everything that populated each participant’s online landscape.
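
To make the methodology concrete, here is a minimal sketch, in Python and purely for illustration, of the kind of per-search record such a logging extension might capture. The field names and structure below are assumptions, not the study’s actual instrument.

```python
# A hypothetical schema for what a search-logging browser extension
# might record per search. Field names and structure are invented
# for illustration; they are not the study's actual instrument.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SearchEvent:
    participant_id: str         # pseudonymous ID assigned to a volunteer
    query: str                  # the Google search the participant ran
    results_shown: list[str]    # URLs returned on the results page
    results_clicked: list[str]  # URLs the participant actually followed
    logged_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# One snapshot from the "camera trap": a single logged search.
event = SearchEvent(
    participant_id="p0042",
    query="midterm election news",
    results_shown=["https://news-site-a.example", "https://news-site-b.example"],
    results_clicked=["https://news-site-b.example"],
)
print(event)
```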

The researchers collected data from hundreds of Google users during the three months leading up to the 2018 US midterm elections and the nine months leading up to the 2020 US presidential election. They then analyzed what they had collected in relation to participants’ age and self-reported political orientation, ranked on a scale of one to seven, from strong Democrat to strong Republican. Yotam Shmargad, a computational social scientist at the University of Arizona who was not a member of the research team, calls the approach of merging real-world behavioral data about participants’ search activity with survey information about their political leanings “innovative.”
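
As a rough illustration of that merge, the toy pandas sketch below joins hypothetical logged links to hypothetical survey scores, then compares the average partisan lean of what the engine showed with what each participant clicked. Every column name, lean score, and participant here is invented; only the one-to-seven survey scale comes from the article.

```python
import pandas as pd

# Toy data: each row is one link either shown to or clicked by a participant.
# "lean" scores a news domain from -1 (left) to +1 (right); all values invented.
links = pd.DataFrame({
    "participant_id": ["p1", "p1", "p1", "p2", "p2", "p2"],
    "event":          ["shown", "shown", "clicked", "shown", "shown", "clicked"],
    "lean":           [-0.6, 0.4, -0.6, -0.3, 0.7, 0.7],
})

# Self-reported orientation on the article's seven-point scale:
# 1 = strong Democrat ... 7 = strong Republican.
survey = pd.DataFrame({
    "participant_id": ["p1", "p2"],
    "partisanship":   [1, 7],
})

# Merge behavior with survey responses, then compare the average lean of
# what the engine *showed* against what each participant *clicked*.
merged = links.merge(survey, on="participant_id")
summary = (
    merged.groupby(["participant_id", "partisanship", "event"])["lean"]
          .mean()
          .unstack("event")
)
print(summary)
# If "shown" looks similar across partisanship while "clicked" tracks it,
# the bubble comes from user choices rather than from the algorithm.
```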

Field data of this kind is also extremely valuable from a policymaking perspective, says Homa Hosseinmardi, a computational social scientist at the University of Pennsylvania who was also not involved in the research. To ensure that search engine giants like Google, which receives more than 8.5 billion queries every day, operate with people’s best interests in mind, it’s not enough to know how an algorithm works. “You need to see how people are using the algorithm,” says Hosseinmardi.

While many lawmakers are currently pushing for big tech companies to make their anonymized user data public, some researchers worry that this will incentivize platforms to release misleading, biased, or incomplete information. One notable example was when Meta hired a team of scientists to investigate the platform’s relationship to democracy and political polarization and then failed to provide half of the data it promised to share. “I think it makes a lot more sense to go directly to the user,” says Ronald Robertson, a network scientist at Stanford University and lead author of the new study.

Ultimately, the team found that a quick Google search did not serve users a selection of news stories based on their political leanings. “Google doesn’t do that much personalization in general,” says Robertson. “And if personalization is low, then maybe the algorithm isn’t really changing the page all that much.” Instead, strongly partisan users were more likely to click on partisan links that fit their preexisting worldview.

This does not mean that Google’s algorithm is flawless. The researchers noticed that unreliable or downright misleading news sources still appeared in the results, regardless of whether or not users interacted with them. “There are other contexts as well where Google has done some pretty problematic things,” Robertson says, including dramatically underrepresenting women of color in its image search results.

Google did not immediately respond to a request for comment on the new study.

Shmargad notes that the study data is not entirely free of bias when broken down at a more granular level. “It doesn’t seem like there’s a lot of algorithmic bias across party lines,” he says, “but there might be some algorithmic bias across age groups.”

Users aged 65 and older were exposed to more right-leaning links in their Google search results than other age groups, regardless of their political identity. But because the effect was small and the older age group made up only about one fifth of all participants, its impact on the overall study results washed out in the macro-analysis.
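
A back-of-the-envelope calculation, with invented numbers, shows how an effect confined to roughly a fifth of the sample gets diluted in the pooled estimate:

```python
# Hypothetical numbers: a small extra right-leaning exposure for the 65+
# group, none for everyone else. Only the "one fifth" share echoes the text.
share_older = 0.20   # roughly one fifth of participants were 65 or older
shift_older = 0.10   # invented extra right-leaning exposure for that group
shift_rest  = 0.00   # invented: no shift for the remaining participants

pooled_shift = share_older * shift_older + (1 - share_older) * shift_rest
print(round(pooled_shift, 2))  # 0.02 -- a fifth of the subgroup effect
```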

Still, the findings reflect a growing body of research that suggests that the role of algorithms in creating political bubbles may be overstated. “I’m not against blaming the platforms,” says Trielli. “But it’s a bit disconcerting to learn that it’s not just about making sure the platforms are well behaved. Our personal motivations to filter what we read to fit our political biases remain strong.”

“We also want to be divided,” adds Trielli.

The silver lining, says Ognyanova, is that “this study shows that it’s not that hard for people to escape their [ideological] bubble.” That may be so. But first, people have to want to get out.
