Helping Advertisers, but Hindering Flow of Information
The internet was conceived with the intent of connecting all the people of the world with one another and providing a publishing platform for all who had something to say. In theory, that goal has been accomplished. In reality, algorithms and advertisers are thwarting an egalitarian online environment.
Without a doubt, algorithms are helpful in narrowing down the firehose blast of information available online, so that when you do a Google search for a dry cleaner the results are relevant to your location. To deliver this kind of personalized information, Google monitors and tracks 57 signals that your device emits, such as browser type, browsing history, past searches, geographic location and more. So, in fact, there is no such thing as a standard Google search result. Every single search – even if the exact same phrase is used – displays unique results tailored to the user. This is not only the case with Google; it also happens with Yahoo News, Facebook, LinkedIn and other social media.
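To make the idea concrete, here is a minimal sketch of how signal-based personalization can work. This is a toy illustration, not Google's actual algorithm; the scoring weights, the `profile` fields and the `personalized_rank` function are all hypothetical.

```python
# Toy sketch of personalized ranking: combine a result's base
# relevance with boosts derived from user signals (location,
# browsing history). All names and weights are illustrative.

def personalized_rank(results, profile):
    """Sort results by base relevance plus signal-based boosts."""
    def score(result):
        s = result["relevance"]
        if result.get("city") == profile.get("location"):
            s += 0.5  # boost results near the user's location
        if result.get("topic") in profile.get("history", set()):
            s += 0.3  # boost topics the user has viewed before
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Dry Cleaner A", "relevance": 0.6, "city": "Austin", "topic": "laundry"},
    {"title": "Dry Cleaner B", "relevance": 0.8, "city": "Boston", "topic": "laundry"},
]
profile = {"location": "Austin", "history": {"laundry"}}

ranked = personalized_rank(results, profile)
# Two users typing the identical query get different orderings,
# because the boosts depend on each user's own signals.
```

The point of the sketch is the last comment: the query text is only one input, so identical searches from different profiles produce different result pages.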
Most consumers understand the bargain they’ve made with search engines and social media. The cost of the “free” service is that their online actions are tracked and compiled into a consumer profile for advertisers. Information personalization is a two-way street: not only can you find a dry cleaner near you – the local dry cleaner can find you and direct advertising to you. Anyone who has been bombarded with repetitive pop-up ads for a pair of shoes they viewed online is well aware that their behavior fuels what is marketed to them.
At first glance, all of this personalized information sounds like a win-win for everyone – that is, until you understand that narrowed content delivery is a double-edged sword. Algorithms also determine what news and information you view online, essentially delivering an online newspaper with many of the articles cut out.
The problem is that the choices algorithms make about what content you see are not transparent. The result is a “filter bubble,” a term coined by internet activist Eli Pariser. Pariser argues that filter bubbles separate users from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.
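The filter-bubble mechanism can be sketched in a few lines. This is a deliberately simplified illustration, not any platform's real feed algorithm; `curate_feed` and the topic labels are invented for the example.

```python
# Toy illustration of a "filter bubble": a feed that keeps only
# stories matching topics the user has engaged with before,
# silently dropping everything else.

def curate_feed(stories, engaged_topics):
    """Return only stories whose topic matches past engagement."""
    return [s for s in stories if s["topic"] in engaged_topics]

stories = [
    {"headline": "Tax cut praised", "topic": "politics-right"},
    {"headline": "Tax cut criticized", "topic": "politics-left"},
    {"headline": "Local team wins", "topic": "sports"},
]

feed = curate_feed(stories, {"politics-left", "sports"})
# The opposing political story never reaches this user's feed,
# and nothing in the feed signals that it was filtered out.
```

The crucial property is the invisibility of the filtering: the user sees a full-looking feed and has no indication that contrary viewpoints were removed.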
A new study from Pew Research found that 62 percent of people get their news from social media, with Facebook the most prevalent news source. If a majority of people are getting their news from social media, and that content is curated to show them only information that mirrors their beliefs, it perpetuates “confirmation bias” – the tendency of people to embrace information that supports their beliefs and reject information that contradicts them. The surprising result of the 2016 U.S. presidential election has been blamed in part on the “filter bubble” phenomenon.