Fakultät für Informatik


The general goal of pattern recognition is to reproduce or mimic human perceptual capabilities in technical systems, making machines "see" or "hear". More generally, pattern recognition enables machines to "sense" their surroundings with a range of sensors, analyze the sensory data, and react intelligently and appropriately to certain events occurring in these surroundings. Relevant events are associated with recurring patterns in the sensory data streams. Thus, the task is to find, model (or "learn"), and classify those patterns, distinguishing relevant from irrelevant events.

Research in the Pattern Recognition Group aims both at advancing the principled pattern recognition methods behind Intelligent Systems and at developing application-oriented solutions for real-world problems. The term Intelligent Systems comprises a wide range of artifacts and devices augmented with advanced computational capabilities. Good examples are robotic assistants, smart homes, or decision support systems. All of these have in common that they react intelligently when interacting with humans and draw "intelligent" inferences in automated decision processes. In order to realize such seemingly intelligent behavior, advanced techniques of pattern recognition and machine learning are developed and applied.

The group closely collaborates with several research groups from academia and a number of industrial partners.

Currently, the main research topics addressed come from the fields of computer vision, acoustic signal processing, and document image analysis. In these areas, techniques for the natural and robust interaction between technical systems, such as smart spaces, and human users are developed. The solutions are primarily based on the application of advanced methods from the field of statistical pattern recognition. These all share the important property of being able to automatically learn computational models from examples, which is also a fundamental capability of human perceptual systems.
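To make the idea of learning computational models from examples concrete, the following is a minimal, illustrative sketch (not the group's actual methods): a nearest-centroid classifier that estimates one mean feature vector per class from labeled examples and then assigns new observations to the class with the closest centroid. All data, function names, and labels are invented for illustration.

```python
import math

def train_centroids(examples):
    """Learn one mean vector ("centroid") per class from labeled examples."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, features):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    def dist(centroid):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(centroid, features)))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Toy "sensory" data: 2-D feature vectors labeled relevant / irrelevant.
examples = [
    ([1.0, 1.2], "relevant"),   ([0.8, 1.0], "relevant"),
    ([4.0, 3.8], "irrelevant"), ([4.2, 4.1], "irrelevant"),
]
model = train_centroids(examples)
print(classify(model, [1.1, 0.9]))  # pattern near the "relevant" cluster
```

The same train-then-classify structure underlies far more sophisticated statistical models; here the "learning" step is simply averaging, which keeps the example self-contained.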