These days, algorithms play an important role in our lives; they select the things you see and hear on the Internet. They can even shape the way you see the world. How does that work? Well, it is simple, or at least it sounds that way. Think of Facebook’s algorithm: it takes all your activity into account. When you like a post, share a photo, or even stick around a little longer to watch a video, you show that you like that content or find it important. The next time you log in, you will see the same sort of posts, because apparently these things are relevant to you. On the other hand, it also means there are things out there that you will not see…

How Facebook’s algorithm determines your News Feed

Facebook takes many factors into account when determining what you see in your News Feed, such as: (1) Who is posting? If you like and comment on many posts from one particular friend, you will mainly see posts from that friend in your News Feed. Updates from friends you barely interact with will not appear. (2) Who is commenting? When many of your friends comment on a post, Facebook’s algorithm figures you might like it as well, so it will appear in your News Feed. (3) What type of post is it? If you often watch videos or read news articles, then those are the types of posts that will show up a lot in your Feed.
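Facebook’s actual ranking model is not public, but the three factors above can be illustrated with a toy scoring function. Everything in the sketch below is an assumption: the weights, the `Post` structure and the field names are purely hypothetical, a minimal picture of how engagement signals might be combined into one relevance score.

```python
from dataclasses import dataclass

# Purely hypothetical weights -- not Facebook's real model.
AFFINITY_WEIGHT = 3.0   # how much you interact with the author
COMMENT_WEIGHT = 1.5    # how many of your friends commented on the post
TYPE_WEIGHT = 2.0       # how much you engage with this type of post

@dataclass
class Post:
    author: str
    post_type: str        # e.g. "video", "news", "photo"
    friend_comments: int  # comments left by your friends

def relevance_score(post, interactions_with, type_engagement):
    """Combine the three signals from the article into one score."""
    affinity = interactions_with.get(post.author, 0)
    type_pref = type_engagement.get(post.post_type, 0)
    return (AFFINITY_WEIGHT * affinity
            + COMMENT_WEIGHT * post.friend_comments
            + TYPE_WEIGHT * type_pref)

# A feed is then simply the candidate posts sorted by score.
posts = [
    Post("close_friend", "video", friend_comments=2),
    Post("distant_friend", "news", friend_comments=8),
]
my_interactions = {"close_friend": 25, "distant_friend": 1}
my_type_engagement = {"video": 10, "news": 3}

feed = sorted(posts,
              key=lambda p: relevance_score(p, my_interactions, my_type_engagement),
              reverse=True)
print([p.author for p in feed])  # the close friend's post ranks first
```

Sorting by a score like this captures both effects the article describes: posts that match your past behaviour rise to the top, and everything else quietly drops out of view.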

Bubble-shift

The information visible in your News Feed reflects who you are, what you like and what suits you. For instance, if you play field hockey, go to concerts and are interested in one specific political party, then you will most likely not have deep insights into other topics. But what are the effects? You might come to think that your worldview is quite complete, while in fact you only see the things within your own bubble*. And when things are expressed in a more extreme or more interesting way, we are more likely to click on those posts or links. For example, say you follow everything about field hockey. If one day you see a video about nude hockey, you might be more likely to watch it, simply because it seems more exciting.

When you do this, your bubble shifts towards the more extreme, which can mean you become less aware of what happens in the middle. The differences keep growing and people end up with exactly opposite opinions. Take, for instance, the US elections: at first Trump was, from the Democratic point of view, just an opponent. Later he became a sexist, and then the devil. The same goes for Clinton: from the Republican point of view she was first a competitor, but soon she was said to be running a child sex trafficking ring, and later that she had to die. In response to this fake news, a shooting took place in a pizzeria where Clinton was supposedly running this ring.
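This feedback loop, click on slightly more extreme content and get shown even more extreme content, can be illustrated with a toy simulation. Everything below is an assumption: a one-dimensional “extremeness” axis and a simple rule that the feed drifts towards whatever you clicked, which is of course far cruder than any real recommender.

```python
import random

random.seed(42)

def simulate_bubble(steps=20, click_bias=0.7):
    """Toy model: the feed centres on your current position on a 0..1
    'extremeness' axis, and each click pulls that centre towards the
    item you clicked."""
    position = 0.2  # start with fairly moderate content
    for _ in range(steps):
        # The feed offers items scattered around your current position.
        offered = [min(1.0, max(0.0, position + random.uniform(-0.1, 0.2)))
                   for _ in range(5)]
        # With some probability you click the most extreme item offered
        # (the "nude-hockey effect"), otherwise a random one.
        clicked = max(offered) if random.random() < click_bias else random.choice(offered)
        # The next feed shifts towards what you clicked.
        position = 0.8 * position + 0.2 * clicked
    return position

print(f"content extremeness after 20 sessions: {simulate_bubble():.2f}")
```

Because the most extreme item on offer gets clicked most often, the position only ever drifts in one direction; that one-way drift is the bubble-shift described above.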

Who is to blame?

But can we only blame such algorithms? According to MIT research on Twitter users during the US election, Twitter users tend to isolate themselves from users who think differently. Someone has a different opinion? A simple ‘unfollow’ will do. Other studies show that we are more likely to believe something when someone within our own bubble says it, and when people agree on a specific topic, they believe it even more strongly. So it seems that not only algorithms create such bubbles; we have a hand in it ourselves as well. But exactly how algorithms interfere with our worldview and shape our opinions is not yet known. The solution for now? Critical thinking!

Lastly, don’t forget that algorithms also improve your Internet experience.

*In this scenario, we only take Facebook into account; one can, of course, extend one’s worldview beyond this bubble by, for example, reading different newspapers or news sites.

Source: http://nos.nl/op3/artikel/2149923-zo-bepalen-algoritmes-jouw-wereldbeeld.html