Mojtaba Karimi, M.Sc.
Chair of Media Technology (Prof. Steinbach)
- Phone: +49 (89) 289 - 23509
- Room: 0509.02.916
Mojtaba Karimi (Leox) studied Computer Engineering and Information Technology and received his Bachelor of Science (B.Sc.) degree in 2012. In 2014, he graduated with a Master of Science (M.Sc.) degree in Robotics Engineering from Amirkabir University of Technology (Tehran Polytechnic), Tehran, Iran. After graduation, Leox worked as a robotics engineer in industry and, starting in 2016, as a research assistant at the Center for Advanced Systems & Technologies (CAST) at Tehran University. In January 2017, he joined the Chair of Media Technology (LMT) at the Technical University of Munich (Germany) as a member of the research staff. He received the STARTUP WORLD award for cutting-edge products at AUTOMATICA 2016 for designing the "Olive Smart Suitcase". His research interests comprise sensor fusion and communication, robot localization, and telepresence systems, as well as the broader field of mobile robotics and motion control systems.
2016 - 2017: Telepresence and Teleoperation Systems (MAVI):
One of the goals of telepresence systems is to enable users to perform daily tasks remotely. A key requirement for this is a robust and reliable mobile robotic platform. Ideally, such a platform should support 360° stereoscopic vision and semi-autonomous telemanipulation. In this research, we designed a telepresence mobile robot platform called MAVI. MAVI is a low-cost, robust, yet extendable platform for research and educational purposes, especially for machine vision and human interaction in telepresence setups. The MAVI platform offers a balance between modularity, capabilities, accessibility, cost, and an open-source software framework. With a range of sensors such as an Inertial Measurement Unit (IMU), a 360° laser rangefinder, and force sensors, along with smart actuation in the form of omnidirectional holonomic locomotion, a high-load cylindrical manipulator, and an actuated stereoscopic Pan-Tilt-Roll Unit (PTRU), MAVI can not only provide basic feedback about its surroundings but also interact with the remote environment in multiple ways. The software architecture of MAVI is based on the Robot Operating System (ROS), which allows for the easy integration of state-of-the-art software packages. In this research, we used state-of-the-art methods for low-delay sensory data transmission over VBR networks and remote visual-inertial localization, as well as virtual force feedback for remote mobile manipulation.
 M. Karimi, T. Aykut, and E. Steinbach, "MAVI: A research platform for telepresence and teleoperation," arXiv preprint arXiv:1805.09447, 2018.
 T. Aykut, M. Karimi, C. Burgmair, A. Finkenzeller, C. Bachhuber, and E. Steinbach, "Delay Compensation for a Telepresence System with 3D 360° Vision Based on Deep Head Motion Prediction and Dynamic FoV Adaptation," IEEE Robotics and Automation Letters (with IROS presentation option), July 2018.
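The virtual force feedback mentioned above can be illustrated with a minimal sketch. This is a simplified repulsive-potential model built only from 360° laser rangefinder readings, not the actual MAVI implementation; the function name, gain, and cutoff distance are illustrative assumptions.

```python
import math

def virtual_force(ranges, max_range=2.0, gain=1.0):
    """Compute a 2D repulsive force from 360-degree laser readings.

    ranges: distances in meters, evenly spaced over 360 degrees,
    with index 0 pointing straight ahead. Obstacles closer than
    max_range push the robot away; the magnitude follows the
    classic repulsive-potential gradient (1/d - 1/max_range).
    """
    n = len(ranges)
    fx, fy = 0.0, 0.0
    for i, d in enumerate(ranges):
        if 0.0 < d < max_range:
            angle = 2.0 * math.pi * i / n            # beam direction
            magnitude = gain * (1.0 / d - 1.0 / max_range)
            fx -= magnitude * math.cos(angle)        # push away from obstacle
            fy -= magnitude * math.sin(angle)
    return fx, fy

# A single obstacle 0.5 m straight ahead yields a force pushing backwards.
fx, fy = virtual_force([0.5] + [float('inf')] * 359)
```

In a teleoperation loop, such a force vector would be rendered on the operator's input device so that nearby obstacles are felt before they are seen.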
2018 - Current: Visual-inertial localization and low-delay sensor fusion & communication
Since January 2018, Leox has been working on the project "KI-Inspektionsdrohne". The goal of this project is the design and development of an autonomous drone for airplane inspection inside a hangar. One of the most challenging parts of this project is the precise indoor localization of the inspection drone and the exact positioning of a large object (in this context, an airplane) relative to the inspection camera. Although LiDAR, stereo cameras, and IMUs can be utilized for position estimation, both the onboard processing power and the available battery power are limited. We therefore investigate a semi-onboard localization system: a local, short-term positioning component runs on the drone's processing unit, while a more precise global localization and drift-correction component runs on the remote station. We use a 5-GHz wireless network for low-delay drone-to-station communication.
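The split between fast onboard positioning and slower, drift-free remote correction can be sketched as follows. This is a deliberately simplified illustration (planar dead reckoning plus a fixed-weight blend) standing in for the project's actual visual-inertial pipeline; the class and parameter names are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float = 0.0
    y: float = 0.0
    yaw: float = 0.0

class SemiOnboardLocalizer:
    """Sketch of the semi-onboard idea: the drone integrates fast,
    drift-prone odometry locally, while the remote station occasionally
    sends a more precise global pose that corrects accumulated drift."""

    def __init__(self, correction_weight=0.8):
        self.pose = Pose2D()
        self.w = correction_weight  # trust placed in remote estimates

    def predict(self, v, omega, dt):
        """Onboard step: integrate body-frame velocity (drift accumulates)."""
        self.pose.x += v * math.cos(self.pose.yaw) * dt
        self.pose.y += v * math.sin(self.pose.yaw) * dt
        self.pose.yaw += omega * dt

    def correct(self, global_pose):
        """Remote step: blend in the ground station's drift-free estimate."""
        self.pose.x += self.w * (global_pose.x - self.pose.x)
        self.pose.y += self.w * (global_pose.y - self.pose.y)
        self.pose.yaw += self.w * (global_pose.yaw - self.pose.yaw)

loc = SemiOnboardLocalizer()
for _ in range(100):
    loc.predict(v=1.0, omega=0.0, dt=0.01)   # fly 1 m straight ahead onboard
loc.correct(Pose2D(x=1.05, y=0.02, yaw=0.0))  # delayed fix from the station
```

The key property is that the onboard loop never blocks on the network: predictions run at sensor rate, and remote corrections are folded in whenever they arrive.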
Additionally, the inspection data is sent to the ground station and analyzed in real time by machine learning algorithms, and feedback is sent back to the drone to trigger a more detailed inspection where needed. Finally, an inspection report presents the results and provides statistics about the condition of the aircraft.