The “filter bubble” is a concept developed by Eli Pariser to describe the negative side of personalized search. Websites decide which Internet resources a person is likely to want to read or see based on their search history, location, and past clicks, so users receive only the information that matches their previous behavior, while everything else is usually hidden from them. The most prominent examples are Google’s personalized search results and Facebook’s news feed. As a result, a person rarely encounters information that contradicts his or her opinions and loses the opportunity to become acquainted with other points of view.
Awareness of ways to circumvent personalized web pages is an important part of information literacy. An invisible algorithm can limit the flow of new information and narrow its diversity. For instance, Google generates search results based on a user’s previous queries and visited pages. To bypass this process and see results ranked by relevance rather than by personal history, a person can select “all results” in the search tools. Moreover, most search engines now allow users to adjust their results: in Google, registered users can simultaneously view personal and general results using the selection button to the right of the search results.
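To make the mechanism concrete, the following sketch is a hypothetical Python illustration only: the function name, data, and scoring rule are invented for the example and do not reflect how Google or Facebook actually rank content. It shows how a simple interest-based ranker can push anything outside a user’s click history out of view.

from collections import Counter

def personalized_ranking(results, click_history, top_n=5):
    """Rank candidate results by how closely their topics match a user's
    past clicks, returning only the best matches.

    results: list of (title, topics) tuples, where topics is a set of strings.
    click_history: list of topic strings the user has clicked on before.
    """
    # Count how often the user has engaged with each topic in the past.
    interest = Counter(click_history)

    def score(item):
        _, topics = item
        # A result scores higher the more it overlaps with past interests;
        # anything outside those interests scores zero and is pushed out.
        return sum(interest[t] for t in topics)

    ranked = sorted(results, key=score, reverse=True)
    return ranked[:top_n]

# Example: a user who has only ever clicked on sports and celebrity stories
# never sees the politics or science results, however relevant they may be.
history = ["sports", "sports", "celebrities", "sports"]
candidates = [
    ("Championship final recap", {"sports"}),
    ("Actor announces new film", {"celebrities"}),
    ("Election coverage", {"politics"}),
    ("Climate report released", {"science"}),
]
print(personalized_ranking(candidates, history, top_n=2))

Even this toy ranker reproduces the pattern described above: content the user has never engaged with is filtered out entirely, regardless of its relevance.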
The “filter bubble” effect can have negative consequences for the formation of public opinion. A potential drawback of filtering search queries is that it closes people off from new ideas, subjects, and relevant information. According to Pariser (2011), the “filter bubble” harms society as a whole by making people more vulnerable to propaganda and manipulation. It surrounds Internet users and creates the impression that their narrow interests are all that exist.
References
Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED. Web.