Bachelor and Master theses

The following list contains topics for both Master's and Bachelor's theses. Please get in touch with the listed contact person to find out more (last update: May 2020).

Master Theses

Topic: Error-related potentials in human-robot collaborative assembly tasks

Supervisors: Stefan Ehrlich and Dr. Emmanuel Dean-Leon

The prospective candidate should have a strong interest in combining neuroscience methods with robotics and, ideally, prior experience with and enthusiasm for working with real robotic systems.

Abstract: Several works have used error-related potentials (ErrPs) as reward signals for training robots with reinforcement learning [1]. In these settings, the human acts as a teacher who forces the robot to submit to the human's preferred strategy. In some domains, however, there is no predefined dominant role and no explicit optimal strategy. Instead, humans and robots are expected to adapt to each other by reflecting on the behavior of their counterpart [2][3], so that the teamwork can exploit both human knowledge and the robot's capabilities and increase the human's willingness to work with robots.

The project aims to realize co-adaptation between robots and human workers in collaborative tasks. Co-adaptation using ErrPs as a feedback signal has already been demonstrated in a simplified human-robot interaction scenario [2]. This work will be extended to a practical assembly task in which human workers and robots work together to complete a task but may initially have different working styles and different ideas about the best strategy. During the collaboration they learn more about each other, adjust their actions to improve team performance, and finally reach a consensus.
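
For illustration only, the following Python sketch (not part of the thesis assignment) shows the basic idea of using a detected ErrP as a negative reward in a simple Q-learning loop, in the spirit of [1] and [2]. The action set, the simulated ErrP detector, and all parameter values are assumptions made up for this sketch.

import random
from collections import defaultdict

# Illustrative sketch only: a tabular Q-learning agent whose reward is derived
# from whether an error-related potential (ErrP) is detected in the human's EEG
# after each robot action, as in ErrP-based robot reinforcement learning [1][2].
# The action set, the simulated "detector", and all parameters are placeholders.

ACTIONS = ["hand_over_part_A", "hand_over_part_B", "wait"]   # hypothetical robot actions
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2                        # standard Q-learning parameters
Q = defaultdict(float)                                       # Q[(state, action)] -> value


def simulated_errp_detected(action, human_preference):
    """Stand-in for a trained ErrP classifier: an error is (noisily) 'detected'
    whenever the robot's action deviates from the human's preferred action."""
    error_made = action != human_preference
    return error_made if random.random() < 0.8 else not error_made   # 80% detection accuracy


def choose_action(state):
    if random.random() < EPSILON:                                    # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])                 # exploit


def q_update(state, action, reward, next_state):
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])


if __name__ == "__main__":
    state = "assembly_step_1"                                        # toy single-state task
    for trial in range(200):
        action = choose_action(state)
        errp = simulated_errp_detected(action, human_preference="hand_over_part_A")
        reward = -1.0 if errp else +1.0                              # ErrP present -> negative reward
        q_update(state, action, reward, next_state=state)
    print(max(ACTIONS, key=lambda a: Q[(state, a)]))                 # learned preferred action

In the actual thesis, the simulated detector would be replaced by a real ErrP classifier operating on the worker's EEG, and both sides of the team would adapt rather than only the robot.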

References:

[1] Iturrate, I., Montesano, L., & Minguez, J. (2010). Robot reinforcement learning using EEG-based reward signals. In 2010 IEEE International Conference on Robotics and Automation (pp. 4822-4829). IEEE.

[2] Ehrlich, S. K., & Cheng, G. (2018). Human-agent co-adaptation using error-related potentials. Journal of Neural Engineering, 15(6), 066014.

[3] Nikolaidis, S., Zhu, Y. X., Hsu, D., et al. (2017). Human-robot mutual adaptation in shared autonomy. In 2017 12th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 294-302). IEEE.

------------

Topic: A Comparison of Signal Characteristics and Classification Methods for Image Recognition and EEG Event-Related-Potentials to Contribute towards a More Generalized Representation of EEG Signals (already taken)

Supervisor: Stefan Ehrlich

Abstract: In recent years, researchers have successfully applied machine learning (ML) and neural networks (NNs) to key EEG paradigms such as the event-related P300 [1] and motor imagery [2,3]. However, most of these applications suffer from the EEG's non-stationarity and poor signal-to-noise ratio and are therefore tailored to specific tasks and subjects. To address this problem, some researchers have developed more generalized networks such as the convolutional EEGNet architecture, which yields results comparable to reference algorithms across subjects and tasks [3]. Nevertheless, there is still a lack of generic, pre-trained algorithms that transfer between scenarios. In contrast, transfer learning across datasets is common practice in image recognition [4]. This raises the question of why NNs and other ML algorithms produce excellent results for some data types but unsatisfactory results for others, even though these data types seem to share certain key characteristics (e.g. spatiotemporal features). To answer this question, we aim to compare the signal characteristics and classification algorithms of image and EEG data as examples of best- and worst-case data scenarios.

The main objective of this project is to understand why machine learning algorithms (in particular convolutional NNs) produce excellent results for image data but unsatisfactory outcomes for EEG data (in particular event-related EEG). We want to identify the data's crucial signal characteristics and analyze how they are, or could be, encoded in classification algorithms. Further, we want to know whether knowledge gained from image data and its classification can be transferred to EEG data and its classification algorithms. In this way, we aim to derive general guidelines for designing ML algorithms on the basis of data characteristics and to improve the architectures of algorithms applied to EEG data.
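
As a purely illustrative reference point for such a comparison, the sketch below defines a small EEGNet-style convolutional network in PyTorch, i.e. a temporal convolution followed by a depthwise spatial convolution over EEG channels, as popularized by [1] and [3]. The layer sizes, the assumed input shape (64 channels, 256 samples per epoch, 2 classes), and all hyperparameters are assumptions for the sketch, not the architecture to be used in the thesis.

import torch
import torch.nn as nn

# Simplified EEGNet-style sketch (assumed shapes: 64 EEG channels, 256 samples
# per epoch, 2 classes); an illustration of the temporal-then-spatial
# convolution idea from [1][3], not the exact published architecture.

class CompactEEGNet(nn.Module):
    def __init__(self, n_channels=64, n_samples=256, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            # temporal convolution: learns frequency-selective filters
            nn.Conv2d(1, 8, kernel_size=(1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(8),
            # depthwise spatial convolution: learns per-filter channel weightings
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8, bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.5),
        )
        # infer the flattened feature size with a dummy forward pass
        with torch.no_grad():
            n_features = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_features, n_classes)

    def forward(self, x):                      # x: (batch, 1, channels, samples)
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))


if __name__ == "__main__":
    model = CompactEEGNet()
    dummy_epochs = torch.randn(4, 1, 64, 256)  # 4 fake EEG epochs
    print(model(dummy_epochs).shape)           # -> torch.Size([4, 2])

A comparable image-classification CNN differs mainly in its input geometry (2D spatial grids instead of channels x time) and in the availability of large pre-trained backbones, which is one of the contrasts this project will examine.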

References:

[1] Lawhern, V. J., Solon, A. J., Waytowich, N. R., Gordon, S. M., Hung, C. P., & Lance, B. J. (2018). EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces. Journal of neural engineering, 15(5), 056013.

[2] Tayeb, Z., Fedjaev, J., Ghaboosi, N., Richter, C., Everding, L., Qu, X., ... & Conradt, J. (2019). Validating deep neural networks for online decoding of motor imagery movements from EEG signals. Sensors, 19(1), 210.

[3] Schirrmeister, R. T., Springenberg, J. T., Fiederer, L. D. J., et al. (2017). Deep learning with convolutional neural networks for EEG decoding and visualization. Human Brain Mapping, 38(11), 5391-5420. doi:10.1002/hbm.23730

[4] Shin, H. C., Roth, H. R., Gao, M., Lu, L., Xu, Z., Nogues, I., ... & Summers, R. M. (2016). Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. IEEE transactions on medical imaging, 35(5), 1285-1298.

------------

Topic: Multi-limb low-impact locomotion for humanoid robots (already taken)

Supervisor: Julio Rogelio Guadarrama Olvera

------------

Topic: Whole body control for competitive mobile service robots (already taken)

Supervisor: Julio Rogelio Guadarrama Olvera

Bachelor Theses

Topic: Encoding different temperature levels using the ICS Robot Skin (already taken)

Advisors: Zied Tayeb / Dr. Emmanuel Carlos Dean Leon

Abstract: Skin is a very important sensor for human beings. Up to 5 million discrete receptors of different modalities (e.g. temperature, force, and vibration) are distributed close to the surface of our bodies. The skin helps us to learn more about our environment and how we can safely interact with it. The aim of this project is to distinguish and encode different measured temperature ranges using the ICS Robot Skin. These encoded levels can thereafter be translated into stimulation of an amputee's arm and/or used in the context of real-time human-robot interaction.
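
To make the encoding step concrete, here is a small, hedged Python sketch (not the actual ICS Robot Skin interface) that bins raw temperature readings into discrete levels and maps each level to a normalized stimulation intensity. The level boundaries, names, and function signatures are illustrative assumptions.

import bisect

# Hypothetical temperature-level encoder (illustrative only; the real ICS
# Robot Skin interface and its calibration are not shown here).

# Assumed level boundaries in degrees Celsius: cold / neutral / warm / hot.
LEVEL_BOUNDARIES_C = [15.0, 25.0, 35.0]
LEVEL_NAMES = ["cold", "neutral", "warm", "hot"]


def encode_temperature(temp_c: float) -> int:
    """Map a raw temperature reading to a discrete level index (0..3)."""
    return bisect.bisect_right(LEVEL_BOUNDARIES_C, temp_c)


def level_to_stimulation(level: int, n_levels: int = 4) -> float:
    """Map a discrete level to a normalized stimulation intensity in [0, 1]."""
    return level / (n_levels - 1)


if __name__ == "__main__":
    for reading in [10.2, 22.7, 31.0, 48.5]:   # fake skin-cell readings in degrees Celsius
        level = encode_temperature(reading)
        print(reading, LEVEL_NAMES[level], level_to_stimulation(level))

In the thesis, the boundaries and the number of levels would be chosen from the skin cells' actual measurement range and from what can be discriminated reliably on the stimulation side.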

References:

  • Cheng, G., Dean-Leon, E., Bergner, F., Guadarrama Olvera, J. R., Leboutet, Q., & Mittendorfer, P. (2019). A comprehensive realization of robot skin: Sensors, sensing, control, and applications. Proceedings of the IEEE, 107(10).
  • Tayeb, Z., Waniek, N., Fedjaev, J., Ghaboosi, N., Rychly, L., Widderich, C., Richter, C., Braun, J., Saveriano, M., Cheng, G., & Conradt, J. (2018). Gumpy: A Python toolbox suitable for hybrid brain-computer interfaces. Journal of Neural Engineering, 15(6).