

Main tasks:

As "Senior Data Engineer (m/f/d)" in the Data Intelligence BU, you will be responsible for the (further) development of marketable industry-specific and cross-industry platforms and platform-based products, for the overall target architecture of the core platforms and platform-based products, and for the application operation of T-Systems platforms and products. You will actively shape the future of the Magenta Data Analytics / Data Economy portfolio.

This includes the following responsibilities:

  • Lead the data engineering team, act as a thought leader, and work with product management to drive appropriate product enhancements and develop roadmaps and architecture strategy
  • Design, build, and integrate data from various sources into data marts and warehouses; build large-scale data-processing pipelines
  • Use automated software frameworks for data acquisition, validation, blending, and modeling, ensuring data is accessible and cleaned for further analysis
  • Define the big data architecture (ingest, store, analyze, visualize, etc.) and specify technical requirements for product design; lead the solution design of the IoT architecture, drawing on a proven track record in building data workflows and data products
  • Optimize the performance of big data systems and research how Spark, Kafka, Hadoop (HDFS, Hive), Flink, Cassandra, and other distributed systems can work together at scale; come up with entirely new ideas to build world-class data integration systems one step ahead of the industry
  • Design efficient and robust ETL (Extract, Transform, Load) workflows on top of big datasets and create big data warehouses that can be used for reporting or analysis by data scientists


Background and required skills:

  • PhD or master's degree in Data Science, Physics, Mathematics, Statistics, or Computer Science with at least 3 years of professional experience in the field of data analytics and the data economy; ideally experience with the Telekom Data Intelligence Hub
  • Experience working in large agile environments (SAFe, Scrum, DevOps, Clean Code)
  • Advanced experience and competence in designing and managing big data environments (e.g., Hadoop), evaluating and configuring components (e.g., Spark, databases), and adapting and coding algorithms for customer solutions (at least 5 years). Hands-on experience with at least one technology from the big data ecosystem: Hadoop, MapReduce, YARN, Apache Spark, Kafka, Flink, Storm, NiFi (at least 5 years)
  • High-level experience with one of the following programming languages: Python, Java, R, Scala, JavaScript (at least 3 years). At least 3 years of experience in software engineering, including expertise in agile methods across all steps (design, architecture, coding, testing, deployment)
  • Knowledge of mathematics and statistical methods is required (at least 3 years). Experience in two of the following methods is desirable: clustering, classification, modeling, regression, text analysis, pattern recognition, neural networks. Understanding of machine learning principles and methods
  • Knowledge of privacy and security challenges and solutions in IoT and data analytics, e.g., privacy by design, anonymization, privacy-preserving storage (at least 3 years)
  • Experience in two of the following application areas is required: IoT analytics, time series analytics, social media analytics, location data analytics, security data analytics, customer data analytics, signal processing, anomaly detection (at least 3 years)
  • Advanced experience with cloud architectures, computing services (network, storage), and application operations and development (Java, Python, Jupyter) on large-scale data lake solutions. Experience in deploying and managing large-scale distributed systems (at least 5 years)
  • Forward thinking: the ability and interest to translate business requirements into technological visions and to reimagine data-processing technologies (at least 5 years)
  • Expert level in (functional) programming languages, e.g., Python, Java, R, Scala, JavaScript

Applications for job advertisements outside Deutsche Telekom AG may also be processed outside Deutsche Telekom AG in other European countries. This also applies to the processing of applications at subsidiaries of Deutsche Telekom AG. For our civil servant employees, please note the following: you are not obliged to enclose documents relevant to your personnel file, in particular civil servant appraisals, with your application. However, you are free to provide these documents voluntarily.

Severely disabled applicants will be given preferential consideration where qualifications are equal.

What we offer

  • Flexible working hours

    Room to balance professional and private challenges - with our flexible working-time models, we enable self-determined work that fits your life and current situation.

  • Training opportunities

    Lifelong learning is essential for us. Whether on site or digital, we offer a wide range of training opportunities - from seminars to part-time degree programs.

  • Company pension scheme

    Well positioned for retirement - we offer a company pension scheme and, depending on our employees' age and income, make regular contributions to a personal pension account.

  • Employee discounts

    Buy products and services at a discount - employees receive discounts on fixed-line, internet, mobile, TV, and smart home offerings. Friends and acquaintances also benefit from many offers.

About us

Your department

This position is filled in the Data Intelligence BU in the Digital Solutions portfolio unit of T-Systems International GmbH. The Data Economy chapter (Data Intelligence Hub) focuses on the development of the Telekom Data Intelligence Hub (DIH). We use the full range of data sources, from IoT to social media, to create sustainable innovation/value from the use of data and AI building blocks. Our solutions are both the memory and the brain of digitization. We use state-of-the-art open-source technologies and reference architectures from the IDS and GAIA-X environment. Always with the goal of bringing our customers' digitization strategies to success in the context of data spaces. As Germany's largest digital service provider, we shape the digital world together with our customers in a constantly growing market.


Hervé Serrette

Questions? I am happy to help! Please use our online job portal to apply.

