By Yvonne Hofstetter
In a survey conducted in the summer of 2015, more than half of all German employees contacted (56 percent) were unable to offer any definition of digitization. One third of respondents had never even heard of the term. Eighty-eight percent said they had no idea what the "Internet of Things" is, and a full 92 percent had no concept of what "big data" might be.
And yet the overall context for these terms is quickly explained. Digitization is turning our world into a mega-computer. Everything is being interconnected. All kinds of things are being measured, saved, analyzed and predicted, in order to be optimized and controlled – automatically controlled, wherever possible. And those "things," of course, include people. The technology giants refer to this process as "global consumer control."
By now it is clear that digitization also has a downside, and the new problems are being widely discussed. Computerized mass data analytics and algorithmic control of society are, supposedly, leading to a "black box society," a society of secret control. Private companies, the critics say, are now doing something that normally only totalitarian states do – keeping consumers under complete electronic surveillance – and this is illegitimate and violates basic rights. What is more, people in more and more occupations are supposedly going to be replaced by (intelligent) computers. Many people are thus wondering what is actually going on: is digitization a sweet-tasting medicine with very serious side effects? A look back at history can help us understand the present a little better.
The command to "subdue the earth" is an old theological motif and a primary theme of European development. It establishes a fundamental distinction between human beings and nature and calls on human beings to subjugate nature. In this view, nature is often rough and hostile, and humans are charged with mastering and controlling it, in order to improve their lives and make them more pleasant and comfortable. That process is exactly what we mean by the term "cultural achievement." The command to master nature can be used to justify all technological progress, from the discovery of fire to the invention of the wheel and to the widespread introduction of the universal technologies of the 19th and 20th centuries – including the steam engine, electricity, wireless transmission and computers. So the further that human history progresses, and the more culture we create, the further we distance ourselves from nature. And the culture of digitization is very "un"-natural, in this sense, as we can see in the terms we apply to it: "artificial intelligence," "virtual reality" and "synthetic biology," for example. Digitization is simply a continuation of the process in which every bit of progress moves people a bit further away from nature.
The question of whether we want digitization or not is thus irrelevant. Digital transformation is going to take place whether we like it or not, and it can be philosophically justified. Human beings create cultural accomplishments because they have the ability to reason and, in the case of digitization, to reason scientifically. This is why the digitization currently underway is not God-given. It has not fallen from heaven; we ourselves are shaping it. With the Internet of Things, we give IP addresses to such things as water heaters, umbrellas, showerheads, beds and reserved seats in high-speed trains, and then we interconnect them. If that enables everyday objects to monitor us and profile our behavior, we cannot call that capability a bolt from the blue, as unexpected and unwanted as an incurable cancer. No, we wanted everyday objects to have that capability. But why?
Some might find the clear answer to this question rather surprising: we need economic growth. If economic growth sputters, our standard of living can suffer; growth fuels our lifestyles. Economic growth means "earning money." Another term for this concept – which, admittedly, we have explained in very abbreviated fashion – is capitalism. In the 20th century, the Austro-Hungarian economist Karl Polanyi defined the three fictitious commodities of capitalism that every child now hears about in school: land (nature), labor and money. By the early part of this century, those commodities were considered essentially exhausted; none of them can reliably be used to make money any longer. A study recently carried out by the McKinsey Global Institute in 25 developed countries (the U.S. and Europe) found that from 2005 to 2014, disposable income stagnated or even declined in 65 to 70 percent of all households – in part as a result of digital progress. Those who work are now considered the new poor, since work earns less today than it did a decade ago. Nature, too, has been exploited to the maximum, and land is no longer a reliable source of income, neither as agricultural land nor as a supplier of raw materials. August 8 was Earth Overshoot Day for 2016 – the day of the year on which human demand for natural resources begins to outstrip nature's capacity to regenerate those resources within the same year. And in times of low and negative interest rates, one can no longer earn money even with money. Money in the 21st century is under investment pressure. It wants to grow and expand. The question is: where can that happen? Capitalism needs a new fictitious commodity – one so fresh and innovative that it can attract many investments. That commodity is information.
Joseph Schumpeter, an Austrian economist who was a contemporary of Karl Polanyi and who briefly served as his country's finance minister in 1919, called innovation the driving force of capitalism. And digitization has undoubtedly boosted innovation, although it is debatable whether the quality of its innovations even begins to compare with that of the universal technologies of earlier eras. Nonetheless, the innovation boost we are seeing is functioning the way we want it to. Financial investors have made the digital technology giants the most valuable companies in the world: Apple, Google and Microsoft rank first, second and third in this regard. They form an oligopoly that can be likened to those built by the economic magnates of an earlier industrial era: J.P. Morgan (electricity), Carnegie (steel) and Rockefeller (oil).
The economic expansion fueled by the new fictitious commodity, information – with which money can once again be earned in the 21st century – has no political origins. Like that earlier industrialization, it originates in the realm of speculation, where rationally acting economic players and their financial investors invest in order to grow and expand. This also explains why digitization is taking place without broad social discussion or a priori political control. Economic growth and expansion are not political tasks; they follow only economic and market laws. Economic actors press forward, and policymakers gaze in awe at the "new territory" they open up. In another installment in this series, Ranga Yogeshwar put his finger right on this problem: "The policymaking sector is too sluggish, and its structures are too slow… What we are currently seeing is that things are often being done…before we have really had a chance to discuss them." Policymakers have not even begun to follow the market into the space into which the economy has long since expanded – the digital, virtual, non-substantial space of data and information. But now the information economy is forcing us to address the digital with political means.
Digitization has created a new, uncontrolled market. Free markets in the libertarian sense, as leading German philosophers have commonly noted, tend to be inhumane and to offer people little other than precarious circumstances. If one gives credence to the aforementioned McKinsey report, this seems just as true today as it was in the unregulated beginnings of the Industrial Revolution, when child labor, 60-hour workweeks and unhealthy working conditions were the rule. What would you have answered if an industrialist of that day had trumpeted to you with full conviction: "Of course child labor is wonderful. After all, working children contribute to their families' incomes"? In the postwar years of the 20th century, capitalism met with great acceptance only because it could be socially controlled and because, under Ludwig Erhard, it became social and liberal. We Europeans are still smarting from the libertarian capitalism of Reaganomics and Thatcherism, from deregulated (financial) markets that gave us not only growth but also market crashes of unprecedented magnitude.
Only recently has the information economy begun to meet resistance from policymakers' ambitions to structure society and regulate markets. But the policy sector is in conflict with itself: should it leave the information economy entirely to its own devices and thereby accept the risk of major social collateral damage? Or should it seek to regulate the information economy, at the risk of braking the economic growth that is so desperately needed? The answer is that regulation and economic growth – or competitive advantage – are not mutually exclusive. In regulating the information economy, we are now at the point where we stood in the 1980s with regard to environmental protection: still at square one. At that time, business voices were shouting down the first green voices, claiming that environmental protection was bad for business. Today we know how wrong those voices were. High European environmental standards have become competitive advantages and have benefited both people and nature. We now face the task of humanizing the digital era in a similarly successful manner. The mission is to create an information economy that is both social and liberal – one that will give younger generations both beautiful technologies and good lives.