Department of Informatics

Data Engineering & Analytics

The Technical University of Munich (TUM) is an internationally recognized center of excellence for database engineering. Its reputation is reflected in collaborations with leading industrial partners as well as academic institutions. The strength of the TUM Data Engineering and Analytics group lies in its systems-centered work, which has produced many prototypes, including HyPer, widely considered the fastest main-memory database system. HyPer's commercialization license was acquired by Tableau, the world's market-leading provider of analytical data visualization; TUM retained the research license, which makes HyPer a particularly useful platform for validating innovative research in Big Data analytics techniques. The participating research groups (with an estimated 100 researchers) cover a broad interdisciplinary spectrum in the field of database systems engineering and command extensive research expertise grounded in domain-specific analysis. They furthermore carry out practice-oriented, application-validated research and development work for the upcoming Big Data era.

Scientists (with focus)

  • Prof. Stephan Günnemann
  • Prof. Alfons Kemper
  • Prof. Thomas Neumann
  • Prof. Thomas A. Runkler

Scientists (with involvement)

  • Prof. Martin Bichler
  • Prof. Hans-Joachim Bungartz
  • Prof. Daniel Cremers
  • Prof. Peter Gritzmann (Department of Mathematics)
  • Prof. Alois Knoll
  • Prof. Florian Matthes
  • Prof. Bjoern Menze
  • Prof. Burkhard Rost
  • Prof. Rüdiger Westermann

Exemplary Projects

HyPer: The "all-in-one" Database System

HyPer is a long-term collaborative project of the Chair for Database Systems (Profs. Kemper and Neumann). This newly developed in-memory database system integrates transaction processing (OLTP), complex analytical query processing (OLAP), and data exploration in one database engine operating on the same database state.
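
HyPer itself is not distributed as an open library, but the core idea can be illustrated with a minimal Python sketch that runs transactional updates (OLTP) and an analytical aggregate (OLAP) against one and the same in-memory database state. The sqlite3 module, schema, and queries below are purely illustrative stand-ins, not HyPer's actual engine or API.

    import sqlite3

    # One in-memory database state shared by transactional and analytical work.
    # sqlite3 is only a stand-in here; HyPer's engine and interface differ.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

    # OLTP-style workload: small, transaction-consistent inserts.
    with conn:
        conn.executemany(
            "INSERT INTO orders (customer, amount) VALUES (?, ?)",
            [("alice", 12.5), ("bob", 7.0), ("alice", 30.0)],
        )

    # OLAP-style workload: an analytical aggregate over the very same, current state.
    for customer, total in conn.execute(
        "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
    ):
        print(customer, total)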

DFG Emmy Noether Project: "Robust Data Mining of Large-Scale Attributed Graphs"

With the rapid growth of social media, sensor technologies, and life science applications, large-scale complex graphs have become a ubiquitous and highly informative data source. Examples include review and co-purchase networks, protein interaction networks, and social networks. The goal of this project is to develop and analyze robust data mining techniques for large-scale complex graphs. Since complex graphs in real-life applications are often corrupted, prone to outliers, and vulnerable to attacks, we will focus in particular on the methods' robustness properties. The resulting research will serve as a foundation for research and development in areas such as spam and fraud detection, advanced data cleansing, and recommender systems.
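
The project's actual methods are its research contribution; the following minimal Python sketch (assuming the networkx library and a small stand-in graph) only illustrates the kind of robustness question being asked: how much does the output of a simple graph mining task, here PageRank, change when the graph is corrupted with a few spurious edges?

    import random
    import networkx as nx

    random.seed(0)

    # Small stand-in graph; the real targets are large attributed graphs
    # such as social, co-purchase, or protein interaction networks.
    clean = nx.karate_club_graph()
    scores_clean = nx.pagerank(clean)

    # Simulate corruption or an attack: inject a handful of spurious edges.
    corrupted = clean.copy()
    nodes = list(corrupted.nodes())
    while corrupted.number_of_edges() < clean.number_of_edges() + 5:
        u, v = random.sample(nodes, 2)
        corrupted.add_edge(u, v)
    scores_corrupted = nx.pagerank(corrupted)

    # A robust mining method should keep this deviation small.
    deviation = sum(abs(scores_clean[n] - scores_corrupted[n]) for n in nodes)
    print(f"total PageRank deviation after perturbation: {deviation:.4f}")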

ERC Consolidator Grant CompDB: The Computational Database

The goal of this project is to enable users to perform near real-time analysis and exploration of complex and large databases by exploiting modern hardware. For this purpose I propose to develop a computational database system that integrates all data processing tasks, from transactional scripts to analytical SQL queries to exploratory workflows, in one system on the same, most current (i.e., transaction-consistent) database state. In this sense I want to turn the database system into a comprehensive data science platform.
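
A minimal Python sketch of the "computational" part of this idea, again with sqlite3 as a stand-in engine: a user-defined function is registered inside the database so that an exploratory computation runs next to the data, on the same state the transactional inserts produced. The schema and the log_value function are illustrative assumptions, not CompDB's interface.

    import math
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE measurements (sensor TEXT, value REAL)")

    # Transactional part: load data in a transaction-consistent way.
    with conn:
        conn.executemany(
            "INSERT INTO measurements VALUES (?, ?)",
            [("s1", 1.0), ("s1", 4.0), ("s2", 9.0)],
        )

    # Computational part: push a user-defined computation into the engine
    # instead of shipping the data out to an external script.
    conn.create_function("log_value", 1, lambda x: math.log(x))

    for sensor, avg_log in conn.execute(
        "SELECT sensor, AVG(log_value(value)) FROM measurements GROUP BY sensor"
    ):
        print(sensor, avg_log)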

DFG Priority Program: Scalable Data Management on Modern Hardware

As a result of the priority program, we expect the development and evaluation of architectures and abstractions for flexible, scalable data management techniques that are extensible towards new data models, including processing and access mechanisms for emerging applications, and that exploit the features of modern, heterogeneous hardware as well as system-level services.
