Interview with Prof. Christiane Woopen, Executive Director of the Cologne Center for Ethics, Rights, Economics, and Social Sciences of Health (ceres), University of Cologne.
Today we would like to talk about big data in the healthcare sector. Please explain to us what you consider to be the real advantages of big data in healthcare applications.
Prof. Christiane Woopen: Big data makes it possible to combine many different kinds of data, such as data from different levels of the body's physical structures (genes, proteins, metabolic products, etc.), lifestyle data, data from social networks, data on nutritional behavior, data on sports programs, etc. And from all that data one can learn something about how diseases occur. One can do research with the data and – ideally, at some point – use big data applications to offer truly individualized and customized therapies and prevention regimes.
Those are the positive sides. What about the downsides? How would you describe them?
Prof. Christiane Woopen: Well, these things naturally entail a number of challenges. Primarily, the challenges arise in connection with the quality of data, as well as the quality of algorithms that identify interrelationships or that produce various hypotheses about why certain things happen in diseases and why a disease takes such and such a course. There are also challenges in the area of data privacy – problems related to privacy protection, and to patients' and users' ability to decide how their data are used. And then there are data-release issues – for example, in connection with health insurance companies, especially in connection with the structuring of insurance rates.
Structuring of insurance rates. Does this mean that insurance companies might monitor people's behavior and then set their insurance rates on that basis? That they would then assign even greater responsibility to individual patients, and to individual policyholders, and say to them, "you need to behave in ways that will keep you healthy"?
Prof. Christiane Woopen: Yes, internationally, companies are already trying to structure their insurance rates in keeping with behavior-based criteria. For example, wearables can now report your daily activity levels, such as step counts, to your insurance company, and then your company can give you points, based on those reports, in a bonus program. This certainly has impacts on how the solidarity principle is applied and how responsibility is assigned. It also has impacts in terms of who actually defines what we mean by "health." Why should step counts be important, but not something such as meditation for mental health, or some other criterion? In other words, the question of what data I release to whom, and for what evaluation purpose, certainly is of systemic significance.
You mentioned the issue of individual responsibility. Does this mean that every individual is responsible for what happens with his own data, for how he uses wearables, for how and to whom his data are released and for where his data might wind up at some point?
Prof. Christiane Woopen: Yes, each individual is responsible, to a great extent, for what happens with his data, but no individual is able to exercise such responsibility to the fullest degree. Data privacy advocates, data protection officers, etc., have been giving wearables very poor marks in the areas of data privacy and transparency. I think that ultimately we have to try to strike a balance between a) the trust that individuals place in state regulation, and in the ethical behavior of all participating actors – companies, makers of wearables, etc. – and b) each individual's responsibility and ability to determine how his data are used. That said, people need to be empowered to exercise such responsibility. This means they must be enabled to understand data-release clauses, and to know where their data are really going and what is being done with them. All of this is very complex, but it really ought to be simple.
Understandably, many people are concerned about the trends and developments that have been emerging. So one might ask whether we can slow things down, or even maintain the status quo. Is sticking to the status quo an option?
Prof. Christiane Woopen: No. Digitization is going to profoundly affect and change all areas of our lives – in fact, it's already doing that. And we can't slow it or stop it – nor should we even be trying to do so, in my opinion. We should be working to guide and structure it. In the healthcare sector – and especially in that sector – we shouldn't be trying to slow things down; we should be doing everything possible to move them forward. In Germany, by the way, it's really high time for us to be picking up the pace.
That certainly sounds clear enough. Could you give us some details about what "acceleration" should involve?
Prof. Christiane Woopen: I think we need a think tank that would design a new approach for our healthcare system. A user- and patient-centered approach that gives each person full control over his own data, and where each person can build his own healthcare network by picking and choosing from among the various healthcare sectors – which are all still too rigid, by the way – in order to receive quality care from a range of trustworthy healthcare professionals. Such quality care would also be data-based – to patients' own benefit. So we need to restructure our healthcare system around users and patients.
*Who should be in such a think tank?
Prof. Christiane Woopen: Representatives of all fields of relevance to such a restructuring. This would include patients' representatives, as well as people who are simply taking part in prevention programs – i.e. people who can represent the public and the society at large, in non-patient roles. We need to get away from interest groups who just want to uphold the status quo and their old interests. It is important for interests to be represented, but an effort to overcome our sectoral boundaries is definitely overdue.
*Additional question from the video interview