Haptic Communication

A major focus of our research activities is in the area of haptic communications. True immersion into a distant environment requires the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audio-visual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to revolutionize human-human and human-machine interaction.

Recent projects and activities in this area address, for example:

Haptic codecs

The main goals are to develop efficient haptic communication approaches over traditional and emerging communication networks, such as the Tactile Internet and 5G networks. To this end, we study network characteristics, psychophysical models of human haptic perception, and the perceptual coding of haptic signals. A major innovation happens at the intersection of these traditionally independent areas.
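A widely used starting point for perceptual coding of haptic signals is deadband data reduction based on Weber's law of just-noticeable differences: a sample is transmitted only if it deviates perceptibly from the last transmitted value. The Python sketch below illustrates this idea; the function name and the deadband parameter k are placeholders of ours, not the codecs developed in this project.

```python
def deadband_encode(samples, k=0.1):
    """Perceptual deadband coding (illustrative sketch).

    A sample is transmitted only when it differs from the last
    transmitted value by more than k * |last value| (Weber fraction k).
    Returns a list of (index, value) pairs that would be sent.
    """
    transmitted = []
    last = None
    for i, x in enumerate(samples):
        # Always send the first sample; afterwards apply the deadband test.
        if last is None or abs(x - last) > k * abs(last):
            transmitted.append((i, x))
            last = x
    return transmitted
```

For a slowly varying force signal, most samples fall inside the deadband and are dropped, which is what makes this scheme attractive for high-rate kinesthetic data.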


As a typical application of haptic communications, bilateral teleoperation with haptic feedback is our major study topic. The aim is to develop efficient haptic (kinesthetic) communication approaches for time-delayed teleoperation while guaranteeing stability of the teleoperation system. A major expertise in our group is to jointly optimize the haptic codecs, control schemes, and network resources to achieve the best possible teleoperation quality.
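One standard ingredient for stability under delay in the teleoperation literature is the wave-variable (scattering) transformation, which encodes velocity and force into wave signals that remain passive under arbitrary constant communication delay. The sketch below is the textbook transform for illustration, not the specific control scheme developed in our group; b denotes the wave impedance.

```python
import math

def to_wave(velocity, force, b=1.0):
    """Encode velocity/force into forward (u) and backward (v) waves."""
    u = (b * velocity + force) / math.sqrt(2 * b)
    v = (b * velocity - force) / math.sqrt(2 * b)
    return u, v

def from_wave(u, v, b=1.0):
    """Recover velocity and force from the wave variables."""
    velocity = (u + v) / math.sqrt(2 * b)
    force = b * (u - v) / math.sqrt(2 * b)
    return velocity, force
```

Transmitting u and v instead of raw velocity and force is what makes the channel appear as a passive two-port regardless of the delay.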

Grasp planning for deformable objects based on point cloud data and tactile sensors

Intelligent robots should be able to operate in unstructured environments with various objects. Manipulating deformable objects is especially challenging, since the contact changes during the grasp. It is further important not to damage the object and to avoid undesired effects, such as the content of an open container spilling because of the deformation. This project focuses on grasp planning and manipulation of deformable objects. The optimal grasping posture is first determined for known objects based on deformation and contact analyses. Unknown objects are then recognized based on point cloud data and tactile sensors, such that they can be grasped with the optimal postures, followed by slip detection and grasp adaptation. The algorithms can be applied in a household environment or in a teleoperation scenario. In the latter case, the operator specifies the object to be manipulated and the teleoperator grasps it autonomously, since direct manual control of the grasp is difficult under network delay. Afterwards, the object can be safely manipulated by the operator.
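Slip detection from tactile sensors is often based on the high-frequency variability of the contact signal: incipient slip shows up as small, rapid fluctuations before the object actually moves. The following minimal sliding-window sketch illustrates this idea; the window size and threshold are arbitrary placeholders, not calibrated values from our system.

```python
def detect_slip(readings, window=5, threshold=0.05):
    """Flag slip when the short-term standard deviation of tactile
    sensor readings within a sliding window exceeds a threshold.

    Illustrative only: real detectors use filtered, multi-taxel data.
    """
    flags = []
    for i in range(len(readings)):
        lo = max(0, i - window + 1)
        win = readings[lo:i + 1]
        mean = sum(win) / len(win)
        var = sum((x - mean) ** 2 for x in win) / len(win)
        flags.append(var ** 0.5 > threshold)
    return flags
```

In a grasp-adaptation loop, a raised flag would trigger an increase in grip force before the object is lost.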

Surface classification and parametrization of materials

Visual and auditory information are predominant in modern multimedia systems. The acquisition, storage, transmission and display of these modalities have reached a quality level typically referred to as high definition (HD) and beyond. Technical solutions addressing the sense of touch (also referred to as haptics), in contrast, have not yet reached the same level of sophistication. In the context of haptic interaction, kinesthetic and tactile interactions are typically considered separately, as different perceptual mechanisms are involved. While the kinesthetic modality has been studied extensively in the context of teleoperation systems, the analysis, processing and reproduction of tactile touch impressions have received comparatively little attention so far. This is surprising given that we as humans rely heavily on the tactile modality to interact with our environment. In a Virtual Reality application, for example, a typical intention of a user is to interact physically with the objects in the virtual scene and to experience their material and surface properties.

Many challenges have to be overcome before tactile solutions reach the same level of sophistication as corresponding HD video or audio solutions. With recent advances in Virtual Reality (VR), Augmented Reality (AR) and Telepresence, however, the topic is rapidly gaining in relevance and is becoming an enabling technology for novel fields of application, such as e-commerce with tactile feedback (T-Commerce) or touch-augmented VR systems (T-VR).

We are researching an important prerequisite for this objective: the automated acquisition and description of haptic surface properties. Like a camera for capturing images under various viewing conditions, the Texplorer is conceptually designed to obtain the haptic properties of objects. Features describing the major perceptually relevant dimensions are defined to form a feature vector representation of an object. Beyond the classification of materials, further sensory data can be used to represent the materials in a virtual environment, which potentially enables the future display of materials in virtual shopping malls or online stores.
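As a much-simplified illustration of such a feature vector, the sketch below computes two crude descriptors from an acceleration trace recorded while a tool strokes a surface: the RMS energy and the zero-crossing rate as a rough proxy for surface roughness. The descriptors actually used with a device like the Texplorer are considerably richer; the function name and the sampling step dt here are our own assumptions.

```python
def haptic_features(accel, dt=0.001):
    """Toy haptic feature vector from an acceleration trace:
    [RMS energy, zero-crossing rate in Hz]. Illustrative only."""
    n = len(accel)
    # RMS energy of the trace (overall vibration intensity).
    rms = (sum(a * a for a in accel) / n) ** 0.5
    # Zero crossings per second: a crude spectral/roughness proxy.
    zero_crossings = sum(1 for i in range(1, n) if accel[i - 1] * accel[i] < 0)
    zcr = zero_crossings / (n * dt)
    return [rms, zcr]
```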

We see two major future applications for the outcome of our research. First, we identify the need for a low-cost system capable of identifying materials, similar to a content-based image retrieval system for visual content or an audio retrieval engine for audio content identification. Second, the recorded data and computed features can form a model of the surface properties of an object. This is particularly interesting for upcoming virtual environments that will provide touch experiences alongside visual and audible content.
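The retrieval idea behind the first application can be sketched as nearest-neighbor search over feature vectors such as those described above. The material labels and feature values below are invented purely for illustration.

```python
import math

def nearest_material(query, database):
    """Toy content-based retrieval over haptic feature vectors:
    return the database label whose feature vector lies closest
    (Euclidean distance) to the query vector."""
    best, best_dist = None, float("inf")
    for label, vec in database.items():
        d = math.dist(query, vec)
        if d < best_dist:
            best, best_dist = label, d
    return best
```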

You can find the LMT Haptic Texture Database here.