Video-Interview with Katarina Barley, German Federal Minister of Justice and Consumer Protection.
Please complete the following sentences: My biggest hope, when it comes to artificial intelligence, is ...
Katarina Barley: ... that it makes people’s lives simpler, more practical, and better and that it opens new opportunities for us, some of which we cannot even imagine.
... and your biggest fear when it comes to artificial intelligence?
Barley: Fear is generally a bad advisor. It is not so much about fear; rather, when developing artificial intelligence, one must always ensure that people's rights are safeguarded, that it is first and foremost about us as users, and that priority is not given to the technology or the data or anything else. We are the central figures.
That leads me directly to my next question. Must users be protected from AI, or must AI be protected from consumer protection?
Barley: It is not a matter of "either-or," because these are not opposites: AI and consumer protection necessarily belong together. One has to think about both from the outset. Already during the development of AI, one has to consider which interests of users and consumers must be protected: data privacy, private life, things like that.
Is the self-commitment of companies enough, or is the political sphere, in your view, also called upon here?
Barley: Whenever data are involved, politicians have to be especially vigilant, because data have become so valuable. A great deal can be done with them, including manipulation, as we saw in the Cambridge Analytica scandal. That is to say, we need rules, clear ones, but ones we can continue to develop, because, let's face it, AI is continuing to develop as well.
Are the EU member states discussing the regulation of AI too intensely and for too long, instead of finally setting a common direction?
Barley: The EU member states have already taken a giant step with the General Data Protection Regulation. It is a model for handling data as a whole, one that has since been copied worldwide. It gives people a right to have their data deleted and much stronger control. Beyond that, it is clear that Europe, with its 28 member states, soon to be just 27, takes a bit of time, but we need that time: we need regulations for Europe as a whole, and we need principles rather than detailed rules.
You want to prevent discrimination by algorithms. How do you intend to achieve that?
Barley: Above all, algorithms have to be transparent, so that we know by which criteria we are actually being judged and into which categories we are being sorted. For example, which attributes determine whether or not I get the loan I applied for? As citizens, we have a right to know that. That is what matters most, because algorithms sometimes discriminate against women in their professional lives, or against people who come from a poorer part of town. That must not be allowed to happen.
Internet providers, or rather internet platform providers like Facebook, should be monitored more strictly and should also make their algorithms accessible to consumer protection organizations and authorities. Why?
Barley: Because the algorithms decide what happens to my data, and what often results is a personality profile. On that basis I start to get targeted advertising, but I will also be pigeon-holed: whether I am creditworthy, whether I get an apartment, things like that. That is why this is so very, very important. Facebook commands so much data, and can do so much with it, that we have to set boundaries, and we have to do it from the governmental side, ideally at the European level.
What do responsible IT users themselves need to know, and how much can they rely on regulation?
Barley: As an IT user, one should take care not to disclose any more data about oneself than necessary. That is my opinion, anyway, and fortunately, now that we have the General Data Protection Regulation, we will be asked: may this app access the contacts, the address book, the location? It is fine to say no if one finds that this is not really part of the service. But politicians must set the ground rules. We are doing that, though naturally we cannot always be completely up to date. Therefore we also need to look out for ourselves and, when in doubt, consult consumer protection organizations.
It's about us as people
AI and big data can use our data trails to build complex profiles of us. But what about consumer protection? German Federal Minister Katarina Barley provides answers.