A few days ago, I was surprised to see news stories and products I had recently searched for appear in my Facebook and Instagram feeds. At first, I thought it convenient that the services could identify and recommend my interests. But a moment later I grew concerned: was the news I was seeing now different from what I would have seen before? That felt wrong. This phenomenon is called a 'filter bubble'.
A filter bubble is a state in which a user ends up receiving only a narrow, personalized slice of information as services tailor what they show. The concept was introduced by Eli Pariser, board president of the US online civic group MoveOn.org. He argued that internet companies build filter bubbles as a personalization strategy: by inferring an individual's interests and tastes from behavior, they can serve that person customized information. As experts put it, internet filters watch what you actually do and infer what you like. An early pioneer of this kind of filtering was Amazon, the world's largest online bookstore. Amazon implemented, at scale, the sales method of a local bookstore clerk who learns a regular customer's tastes and recommends books accordingly; it succeeded by filtering its catalog around individual taste. With the opening of the big data era, such customized information services became widespread.
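To make the mechanism concrete, here is a minimal sketch of how such filtering can work: a user's clicks are tallied into an 'interest profile', and candidate stories are ranked by how well they match it. The story titles, topic labels, and function names below are invented for illustration; this is not any real service's algorithm.

```python
from collections import Counter

def build_profile(clicked_items):
    """Tally the topics a user has clicked on; this is the 'interest profile'."""
    return Counter(topic for item in clicked_items for topic in item["topics"])

def rank(candidates, profile):
    """Score each candidate story by how strongly its topics match the profile,
    then sort best-first -- the basic move behind a customized feed."""
    def score(item):
        return sum(profile.get(topic, 0) for topic in item["topics"])
    return sorted(candidates, key=score, reverse=True)

# Hypothetical browsing history and candidate stories
history = [
    {"title": "New phone review", "topics": ["tech"]},
    {"title": "Chip shortage update", "topics": ["tech", "economy"]},
]
candidates = [
    {"title": "Local election results", "topics": ["politics"]},
    {"title": "AI model released", "topics": ["tech"]},
]

profile = build_profile(history)
feed = rank(candidates, profile)
print(feed[0]["title"])  # the tech story surfaces first
```

Run repeatedly, this loop feeds on itself: every click strengthens the profile, which narrows the next ranking, which invites more of the same clicks.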
The filter bubble also raises several concerns. Websites such as Yahoo News and the online edition of the New York Times now serve readers news tailored to their particular interests and needs. Pariser concedes that a personalized world, where like-minded people and ideas gather, can be a comfortable place. But he cautions that it traps users in their own biases, because filtering inevitably distorts what they see. He is also concerned that social networking services are used less as a way to communicate with different people and groups than as a way to gather people with similar thoughts and behaviors, so that users come to accept only certain kinds of information.
To address this problem, the recommendation algorithms themselves could warn users whose feeds have grown one-sided, or regularly surface news that runs counter to a user's established interests. For our part, we should remain ready to suspect that the information delivered to us under the name 'customized information' may be giving us a biased view.
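The second remedy, occasionally showing news outside a user's interests, can also be sketched. In the toy example below (invented stories; the `diversify` function, its `top_topics` parameter, and the every-third-slot spacing are all assumptions, not any platform's actual method), one feed slot in three is reserved for a story outside the user's usual topics:

```python
def diversify(ranked, top_topics, slot=3):
    """Rebuild a ranked feed so that every `slot`-th position holds a story
    from outside the user's usual topics -- a simple 'burst the bubble'
    injection, sketched for illustration only."""
    familiar = [i for i in ranked if set(i["topics"]) & top_topics]
    unfamiliar = [i for i in ranked if not set(i["topics"]) & top_topics]
    feed, u = [], 0
    for pos, item in enumerate(familiar, start=1):
        feed.append(item)
        # After every (slot - 1) familiar stories, slip in one unfamiliar story.
        if pos % (slot - 1) == 0 and u < len(unfamiliar):
            feed.append(unfamiliar[u])
            u += 1
    feed.extend(unfamiliar[u:])  # any leftovers go at the end
    return feed

ranked = [
    {"title": "Tech story A", "topics": ["tech"]},
    {"title": "Tech story B", "topics": ["tech"]},
    {"title": "Politics story", "topics": ["politics"]},
    {"title": "Tech story C", "topics": ["tech"]},
]
feed = diversify(ranked, top_topics={"tech"})
print([i["title"] for i in feed])  # the politics story lands in slot 3
```

The design point is deliberate friction: the feed stays mostly relevant, but it can no longer become perfectly self-reinforcing.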