
Frank Leibiger


Big data on the edge: T-Systems analyzing massive data sets wherever they are produced

  • Solutions for construction, testing and logistics
  • Service passes load test in polar and desert regions


T-Systems can now provide a solution for analyzing data spread out all across the world. Ever growing numbers of machines and products are creating masses of information. Because of the vast amount of data involved, it is an expensive and time-consuming task to pull everything back into a fully fledged data center first. That is why T-Systems created the “Big Data & Global Edge Analytics” solution, designed to get to grips with massive data sets wherever they are produced.

The T-Systems solution has already passed its most stringent load test. During vehicle tests conducted by a big car manufacturer in the snows of Finland and the deserts of Dubai, the quantity of data that needs processing daily for each prototype vehicle is in the terabyte range. Teams of engineers from all over the world stand ready to run extensive and wide-ranging analyses on this data immediately after each test is completed. The “Big Data Signal Processing” software package records and saves the data with no loss of information, compressing it by a factor of up to 90. The data stays where it was created: in a mini data center on the test site, known as an edge device. Standardizing the binary data formats and converting them into a special big data format allows parallel computing processes, making data processing up to 40 times faster.

It is no longer necessary to send such data to a full-blown data center to make it available for analysis. Instead, the engineers send their analytical queries to edge clusters spread out all across the world. Experts speak of “code to data”: instead of sending data for calculation, the analysis requests are sent to the data. This saves bandwidth and transmission costs. Critical data is available faster. Data quality is better. This shortens development cycles and improves engineering processes.
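The “code to data” principle can be illustrated with a short Python sketch. All names below are hypothetical and chosen for illustration only; they are not part of the T-Systems product or its API. The idea is simply that the analysis function travels to the edge cluster, and only the small result travels back:

```python
# Illustrative sketch of the "code to data" pattern (hypothetical names,
# not the T-Systems API): the query runs where the data lives, and only
# the compact result is transmitted back to the engineer.

def edge_query(records, predicate, aggregate):
    """Run an analysis at the edge: filter locally, return only the result."""
    matching = [r for r in records if predicate(r)]
    return aggregate(matching)

# Simulated sensor readings held in an edge cluster (terabytes in practice).
edge_data = [
    {"signal": "engine_temp", "value": 92.5},
    {"signal": "engine_temp", "value": 101.3},
    {"signal": "brake_pressure", "value": 54.0},
]

# The engineer's query travels to the data; only the aggregate travels back.
result = edge_query(
    edge_data,
    predicate=lambda r: r["signal"] == "engine_temp",
    aggregate=lambda rs: max(r["value"] for r in rs),
)
print(result)  # a single number returned, instead of the full raw data set
```

The bandwidth saving comes from returning one aggregate value rather than shipping every raw record to a central data center first.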

The brand-new Deutsche Telekom solution is designed to be useful in all industries that work with large amounts of decentralized data – such as the construction, testing and logistics sectors. “Big Data & Global Edge Analytics” expands the scope for processing and using very large volumes of data. The software reads raw data quickly and reliably, indexing and compressing it very efficiently. As a result, users can ensure the tamper-proof storage of their data – that is to say, without losing any information – for up to ten years. The entire data set is available at all times from its cloud-based storage locations.

T-Systems offers the solution as a comprehensive service: from planning, through implementation, all the way to day-to-day operation. T-Systems also provides further information in a free workshop.

About Deutsche Telekom: Deutsche Telekom at a glance
About T-Systems: T-Systems company profile
