Nicolas Hanisch


Online Phenomena: When program codes decide for us


According to a study by the Bertelsmann Stiftung on how algorithms are perceived in Europe, 48 percent of the population do not know what an algorithm is. Yet algorithms are our closest companions on the web. An algorithm is a programmed series of consecutive instructions. A navigation system, for example, calculates the shortest or fastest route based on an algorithm.
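The navigation example can be made concrete. The following sketch is a minimal version of Dijkstra's shortest-path algorithm, a classic route-finding method; the road network and place names are invented for illustration, not taken from any real navigation system:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: repeatedly expand the closest reachable
    place until the destination is found, returning the total distance."""
    queue = [(0, start)]  # priority queue of (distance so far, place)
    visited = set()
    while queue:
        dist, node = heapq.heappop(queue)
        if node == goal:
            return dist
        if node in visited:
            continue
        visited.add(node)
        for neighbour, km in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (dist + km, neighbour))
    return None  # no route exists

# Hypothetical road network: (neighbour, distance in km)
roads = {
    "Home":     [("Junction", 2), ("Motorway", 5)],
    "Junction": [("Office", 6)],
    "Motorway": [("Office", 1)],
}
print(shortest_route(roads, "Home", "Office"))  # → 6
```

The algorithm mechanically follows its instructions: it compares the route via the junction (8 km) with the route via the motorway (6 km) and returns the shorter one.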

On the Internet, algorithms pre-sort what we see. They are programmed to analyse our behaviour on the Internet and recognise our preferences. This applies not only to products we like, but also to posts that correspond to our views. In social media, too, content we don't like or are unlikely to like stays out: it does not appear in our newsfeed. In search engines such results are de-prioritised and only appear far down the list.
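This pre-sorting can be sketched in a few lines. The ranking below is a deliberately simplified illustration of the principle, not any platform's actual algorithm; the click history and posts are invented:

```python
def rank_feed(posts, click_history):
    """Sort posts so that topics the user has clicked most come first.
    Topics the user never engages with sink to the bottom - they are
    de-prioritised rather than removed outright."""
    def score(post):
        return click_history.get(post["topic"], 0)
    return sorted(posts, key=score, reverse=True)

# Hypothetical click history: the user mostly clicks on sport
clicks = {"sport": 12, "music": 3}
posts = [
    {"title": "Election analysis", "topic": "politics"},
    {"title": "Concert review",    "topic": "music"},
    {"title": "Match report",      "topic": "sport"},
]
for post in rank_feed(posts, clicks):
    print(post["title"])
```

The political post still exists, but because this user never clicks on politics it is ranked last, which is exactly how differing perspectives quietly disappear from view.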



At first glance, algorithms seem to be good tools for making the wealth of information on the Internet more manageable. However, especially with political or socially relevant topics, it is precisely the opposing perspectives that get filtered out - and those are needed for forming your own opinion.

Recommendation algorithms, for example, are programmed in such a way that the suggested content becomes more and more extreme. What starts on YouTube with a video on a current government decision escalates through further recommendations and can end with a video from a group that rejects our state and spreads anti-democratic propaganda.

What we are usually not aware of: Program codes work in the background. We don't notice them or the influence they have on us. We feed them with our likes and with what we click the most. And: they are programmed by people.

Algorithms: the basis for further tools on the Internet

Microtargeting: A popular instrument of product marketing is microtargeting. It uses data that algorithms have collected about us. These data points are combined to build profiles of individual users. The goal: provide Internet users with personalised advertising based on their activities and thus increase sales figures. However, microtargeting has long since been used not only for product advertising, but also as an election campaign tool - a practice that first came to the attention of the public through the scandal surrounding Cambridge Analytica and Facebook.
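The profiling step can be illustrated in miniature. This sketch only shows the principle of sorting collected data into target-group segments; the users, interests, and segment names are hypothetical:

```python
def build_segments(users):
    """Group user profiles into advertising segments by shared interest.
    An advertiser can then address each segment with a tailored message."""
    segments = {}
    for user in users:
        for interest in user["interests"]:
            segments.setdefault(interest, []).append(user["id"])
    return segments

# Hypothetical profiles assembled from collected behavioural data
users = [
    {"id": "u1", "interests": ["fitness", "travel"]},
    {"id": "u2", "interests": ["fitness"]},
    {"id": "u3", "interests": ["politics"]},
]
print(build_segments(users)["fitness"])  # → ['u1', 'u2']
```

In an election campaign, the same mechanism works with segments such as "undecided voters in region X" instead of "fitness" - the technique is identical, only the message changes.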

Dark Posts & Dark Ads: When advertisements are displayed in the newsfeed of social media and disappear after a short time, these are called dark ads. If the same happens with news, they are called dark posts. What both have in common is that they are tailored to specific target groups and are shown only to them. The basis for this is provided by algorithms. The initiators are usually anonymous. Dark ads and dark posts are often used for political opinion-making. They can contain fake news about political events or people. Due to their short, limited visibility, they are difficult to verify or trace.

Bots: Robots that write on the Internet are called bots. They are computer programs that perform repetitive tasks based on algorithms. Bots in social media are called social bots. They are created as realistic-looking accounts of a person, with a profile picture, posts and followers, and they also follow other users. On Twitter, for example, social bots can be set up to react to specific hashtags and then send out pre-programmed information. When social bots like, share or comment on posts, it gives outsiders the impression that many people are paying attention to the content. They are often used to manipulate and steer discourse in certain directions. Even if the exact number is unclear, it is assumed that roughly every third hate comment comes from a bot. It is not easy to identify bots, but there are conspicuous features that suggest one: How active is the profile? Is content only being forwarded? Are queries not answered at all, or only evasively?
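The three rule-of-thumb questions above can be turned into a simple checklist. The thresholds in this sketch are illustrative guesses, not research-backed values, and real bot detection is far more sophisticated:

```python
def bot_suspicion_score(profile):
    """Count how many of the three rule-of-thumb checks from the text
    a profile triggers: 0 = inconspicuous, 3 = very suspicious."""
    score = 0
    if profile["posts_per_day"] > 100:      # implausibly high activity
        score += 1
    if profile["share_of_reposts"] > 0.9:   # almost only forwards content
        score += 1
    if not profile["answers_questions"]:    # silent or evasive on queries
        score += 1
    return score

# Hypothetical account that trips all three checks
suspect = {"posts_per_day": 250, "share_of_reposts": 0.95,
           "answers_questions": False}
print(bot_suspicion_score(suspect))  # → 3
```

A high score does not prove an account is a bot - a very active human fan account can look similar - but it is a reason to treat the content with scepticism.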

We ourselves determine how the algorithm influences us

In the digital world, it is the responsibility of each individual to deal critically with the information collected there. Program codes work invisibly in the background. But we can determine how much space we give them. Because the algorithm can only work as well as we allow it to.

What each of us can do:

  • Regularly check your own browser and search engine settings. Change the search engine from time to time.
  • Delete cache and cookies regularly.
  • Check personal settings in the social networks - especially after updates. 
  • Focus on media diversity: Use different sources and pay attention to their credibility.

Those who engage with digital phenomena and understand how they work can make better use of the advantages of the digital world, especially when forming opinions. Our democracy needs - not only in times of crisis or before elections - openness, different perspectives and a constructive dialogue on equal footing. This is something we are all called upon to do!


No Hate Speech

Words must not become a weapon. Deutsche Telekom is fighting for a network without hate in which we treat one another respectfully.