
Modern data centers – the heartbeat of digitalization


An article by François Fleutiaux, Director of T-Systems’ IT Division.


The rising flood of data doesn’t necessarily mean that we need more data centers. Why? Because modern data centers deliver far better performance than their predecessors – and they also take up far less space. What’s more, technologies such as edge and fog computing process data before it even arrives at a data center.

Our world is becoming increasingly digital and connected through streaming, cloud, big data, and IoT technology. According to an IDC study, we will be creating around 163 zettabytes of personal and work-related data in 2025. That’s a gigantic amount – the equivalent of 163 billion terabytes – and a ten-fold increase over the amount of data created two years ago. For comparison: an iMac’s hard drive holds one terabyte. Some 60 percent of the global data volume in 2025 will be generated by business enterprises. That is a remarkable shift, considering that recent figures attribute 70 percent of all generated data to private users, with Netflix, YouTube, Spotify, and the like certainly contributing to that number.
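To make that scale concrete, here is a quick unit check of the IDC figure – a back-of-the-envelope sketch in Python, not anything from the study itself; it relies only on the conversion 1 zettabyte = one billion terabytes:

    # Unit check for the IDC figure cited above: 1 zettabyte = 10**9 terabytes.
    zettabytes_2025 = 163
    terabytes_2025 = zettabytes_2025 * 10**9   # 163,000,000,000 TB
    # At one terabyte per iMac hard drive, that is 163 billion full drives.
    print(f"{terabytes_2025:,} TB")            # -> 163,000,000,000 TB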

The challenge for enterprises faced with such a huge amount of data isn’t just the sheer volume. The “Three Vs” – volume, velocity, and variety – all play an important role. In the future, data will be generated and required ever faster. Enterprises that can process customer data in real time to support lightning-fast analytics will generally be a step ahead of the competition. At the same time, internet-connected devices generate data in all kinds of formats during every business process. Enterprises hoping to leverage this data effectively will have to be agile and efficient in transforming it.

More powerful data centers are needed

Improving the performance of data centers will involve intelligent software (think: artificial intelligence), algorithms, and fast network connectivity. But the backbone of productive data usage is formed by the 8.6 million data centers in operation worldwide – and they, too, need further optimization in order to meet future challenges. Data centers whose IT infrastructures have evolved organically over time generally lack the necessary resource scalability, not to mention the capability to provide services quickly and to support innovative business models. Many are also burdened by high costs and poor environmental efficiency.

“By 2020, the heavy workload demands of next-generation applications and new IT architectures in critical business facilities will have forced 55% of enterprises to modernize their data center assets through updates to existing facilities and/or the deployment of new facilities,” says IDC. And by that time “companies pursuing digital transformation will be forced to migrate over 50% of the IT infrastructure in their data center and edge locations to a software-defined model.”

International footprint and high IT performance

The rising flood of data and the urgent need to modernize don’t necessarily mean that additional data centers are required. Today’s technology is already far more advanced than it was just a few years ago. Modern data centers deliver far better performance than their predecessors – and they also take up far less space. This benefits enterprise customers worldwide – and the environment.

And data processing is already becoming less centralized as a result of edge and fog computing. Edge computing processes data directly on IoT devices, whereas fog computing processes it in nearby network nodes. This speeds up response times and reduces data center workloads.
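To illustrate that division of labor, here is a minimal sketch – with hypothetical names, not T-Systems code – in which a device filters its own raw readings (edge) and a nearby gateway condenses them into a summary (fog), so that only a compact result ever travels to the data center:

    from statistics import mean

    # Edge step: the IoT device itself discards obviously invalid readings,
    # so bad data never leaves the sensor.
    def edge_filter(readings):
        return [r for r in readings if 0.0 <= r <= 100.0]

    # Fog step: a nearby network node (e.g. a gateway) condenses the cleaned
    # readings into a small summary before forwarding it upstream.
    def fog_aggregate(readings):
        return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

    raw = [21.4, 22.0, -999.0, 23.1]          # -999.0 is a sensor error code
    summary = fog_aggregate(edge_filter(raw))
    print(summary)                            # only this summary reaches the data center

Instead of every raw reading (millions of them, at scale), the data center receives a single three-field summary – which is exactly where the response-time and workload savings come from.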

T-Systems launched its modernization and consolidation program back in 2012. Over the past six years 13 ultra-modern data factories have been established. We have also brought our worldwide footprint into line with the demands of our global customers. Germany is a central data hub with three major facilities. We also operate European data centers in the United Kingdom, Spain, Austria, and Switzerland, and we’ve built modern twin-core data centers in the USA, Mexico, Brazil, South Africa, China, and Singapore.

Consolidated data centers are not only more cost-effective and environmentally friendly; at the end of the day they perform better – as a result of better scalability – and they are more innovative. Modern data centers are more compact in size, more secure and far easier to upgrade to the state of the art. Updates for selected data centers can also be rolled out much faster. Sometimes less is more!

So does consolidation mean that less computing power and storage capacity are available? Absolutely not! Today, higher IT performance can be achieved with a smaller server footprint than in the past: hardware has become smaller and more powerful, which reduces floor-space requirements, and data centers increasingly rely on standardized cloud technology. The best example of this is our flagship cloud data center in Biere, Germany.

The “House of Clouds” is expanding

Cloud data center Biere: Inside the Telekom cloud, servers in DT’s most modern data center work highly efficiently, shielded by top-level security.

The village of Biere near Magdeburg is a central hub in our global data center strategy. Over 50 hardware and software vendors from all over the world are “tenants” of the ultra-modern “House of Clouds” in Biere, because it gives enterprises exactly what they are looking for: exceptional security and reliability (99.999 percent uptime), stringent data protection compliant with European standards (GDPR), and best-in-class energy efficiency (LEED Gold standard).
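To put that uptime figure in perspective, “five nines” of availability leaves a downtime budget of only about five minutes per year – a quick back-of-the-envelope check (my calculation, not from the article):

    # Downtime budget implied by 99.999 percent ("five nines") availability.
    minutes_per_year = 365.25 * 24 * 60               # ~525,960 minutes in a year
    availability = 0.99999
    downtime = minutes_per_year * (1 - availability)
    print(f"Allowed downtime: {downtime:.2f} minutes per year")   # ~5.26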

In response to the high demand for dynamic cloud services from Biere – where our Open Telekom Cloud “resides” – we have ramped up the processing and storage capacity of this facility by 150 percent, from 7.2 to the current 18 megawatts of power input, over the past two years. The project had a nine-figure price tag and we’ve added three new data center modules to the original two. Bigger than ever before and with state-of-the-art technology, Biere II started operations in early September.

Integral part of the service portfolio

Our 13 data factories are an integral part of the T-Systems service portfolio. Without them, we could not effectively help our enterprise customers overcome the challenges of digital transformation with efficient system integration, dynamic cloud solutions, and innovative concepts for the Internet of Things.

It is therefore important for providers to keep investing in their data center infrastructure – for example in new flash memory – so they can meet growing market demand from customers who want to perform real-time analytics.

More data storage at the push of a button

Modern data centers don’t just need the best hardware; they also need the best software. A future cloud infrastructure will therefore have to offer more than a bare platform – for example, various managed-services modules in a multi-cloud environment. Ideally, customers should be able to select a cloud ecosystem from the available range – including VMware, Microsoft Azure, and OpenStack. The chosen ecosystem would then be provisioned either from the provider’s own data centers or from the customer’s own, with standardized security and service management. To round off the product, the customer would be able to choose the ideal cloud infrastructure via an IT service management tool – such as the ServiceNow portal – for simple or even automated selection (think: API).
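As a concrete illustration of that API-driven selection, here is a hedged sketch of what an automated service request might look like – the endpoint, payload fields, and token are hypothetical placeholders, not an actual T-Systems or ServiceNow API:

    import json
    import urllib.request

    # Hypothetical self-service order: expand a virtual machine's memory
    # through the provider's service-management API. All names are illustrative.
    payload = {
        "service": "memory-expansion",
        "vm_id": "vm-12345",
        "additional_gb": 64,
        "cloud": "OpenStack",          # could equally be VMware or Azure
    }
    request = urllib.request.Request(
        "https://example.com/api/v1/orders",        # placeholder URL
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer <token>"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(json.load(response))     # e.g. {"order_id": "...", "status": "provisioning"}

Once a request like this is accepted, the provider’s automation takes over – which is how new services can go live within hours rather than weeks.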

This kind of self-service model has several advantages, the main one being a more user-friendly and agile customer experience: customers get the cloud capacities they need at the touch of a button. New services – such as memory expansion for virtual machines – can be implemented within just a few hours, which also helps users cope with the rising flood of data. A high degree of automation improves quality because it eliminates individual sources of error. Ultimately, this kind of cloud infrastructure will also be more cost-effective to operate for companies of all sizes – from large corporations to SMEs.

Summary

Growth in the volume of data is accompanied by similar increases in data speed and diversity. Enterprises that can process this data as quickly and effectively as possible to create new business models will emerge as the winners. As a result, enterprise requirements for high-performance data center infrastructures are increasing.

Data centers are and will be the backbone of digitalization. T-Systems was an early adopter of a strategy to modernize and consolidate its data centers. Our customers will soon have access to services from 13 high-performance data factories to help them master the digital transformation.

With our state-of-the-art data center in Biere and its future-proof cloud infrastructure, we are putting the foundations for the future success of our enterprise customers in place today.


François Fleutiaux, member of the T-Systems Board of Management
