Academic Staff



Martin Oelsch, M.Sc.

Chair of Media Technology (Prof. Steinbach)

Postal address:
Arcisstr. 21
80333 München

  • Phone: +49 (89) 289 - 23509
  • Room: 0509.02.916
  • martin.oelsch(at)tum.de

Biography

Martin Oelsch studied computer science at the University of Applied Sciences Ingolstadt with a focus on safety and security. He received his bachelor's degree (B.Sc.) in 2014 and his master's degree (M.Sc.) in 2016. Since August 2016, he has been a Ph.D. candidate at the Chair of Media Technology at TUM.

Research Interests

Martin's research interests are digital image processing, invariant feature extraction, machine learning and keypoint ranking.

Invariant features are used in various applications such as content-based image retrieval, Simultaneous Localization and Mapping (SLAM), object detection, and homography estimation.

Feature extraction consists of two steps: feature detection and feature description. Feature detectors aim to find salient regions in an image that have a high probability of being detected again in the next frame. Feature descriptors then describe these regions by a feature vector in a rotation-, scale-, and illumination-invariant manner.

Martin Oelsch worked in the project VIPE, which dealt with the exploration of terrain in the Valles Marineris on Mars using visual and proprioceptive cues.
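
The following is a minimal sketch of this two-step pipeline, using OpenCV's ORB as an illustrative detector and descriptor (not necessarily the one used in his work); the image path and the number of features are placeholder assumptions:

    import cv2

    # Load a grayscale frame; "frame.png" is a placeholder path.
    img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

    # Detection and description in one call: ORB finds salient keypoints
    # and computes a binary descriptor (feature vector) for each of them.
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(img, None)

    # Each keypoint marks a salient region; each descriptor describes that
    # region in an (approximately) rotation-, scale-, and illumination-
    # invariant way, so it can be re-identified in the next frame.
    print(len(keypoints), descriptors.shape)

Matching such descriptors between frames is what enables applications like SLAM or homography estimation.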

Since January 2018, he has been working in the project "KI-Inspektionsdrohne". The goal of this project is the development of an autonomous drone for the inspection of airplanes in a hangar. A key challenge is localizing the drone in this GPS-denied environment; however, LiDAR, cameras, and IMUs can be exploited for position estimation. The inspection data is to be sent to the ground station and analyzed in real time by machine learning algorithms. Finally, an inspection report will present the results and give hints about the condition of the aircraft. We use a 3D model in a Gazebo simulation for the path planning of the drone, which allows us to generate optimal waypoints.
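
As a purely hypothetical illustration of waypoint generation for such an inspection flight, the sketch below places waypoints on a circle around an aircraft model; the circular pattern, radius, height, and number of waypoints are illustrative assumptions, not parameters of the actual project:

    import math

    def circular_waypoints(center_x, center_y, radius, height, count):
        """Return (x, y, z, yaw) tuples on a circle, with the drone facing the center."""
        waypoints = []
        for i in range(count):
            angle = 2.0 * math.pi * i / count
            x = center_x + radius * math.cos(angle)
            y = center_y + radius * math.sin(angle)
            yaw = angle + math.pi  # face towards the aircraft at the center
            waypoints.append((x, y, height, yaw))
        return waypoints

    # Example: eight waypoints at 15 m radius and 5 m altitude (illustrative values).
    for wp in circular_waypoints(0.0, 0.0, radius=15.0, height=5.0, count=8):
        print(wp)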

 
