
Haptics-Enhanced Teleoperation Robotics

Teleoperation robotics combines the precision and strength of machines with the decision-making and adaptability of humans. By enabling operators to control robots from a distance while receiving visual and haptic (touch-based) feedback, these systems allow complex tasks to be performed in environments that are unsafe, inaccessible, or ergonomically challenging for humans. Haptic feedback plays a key role in this interaction, allowing operators to “feel” what the robot is touching or manipulating. This tactile awareness improves safety, accuracy, and control in delicate or force-sensitive tasks such as assembly, grinding, or handling fragile objects.

Methodology / System Overview

At our institute, we focus on developing immersive and responsive teleoperation platforms that integrate robotics, virtual/augmented reality, and haptic interfaces. These systems aim to:

  • Enhance operator awareness through VR/AR visualization.
  • Provide intuitive and precise control of robotic manipulators.
  • Enable safe execution of tasks in hazardous or high-demand environments.
  • Support skill transfer, where human demonstrations can be recorded and replayed autonomously by robots.

Our projects demonstrate how teleoperation and haptic robotics can expand the capabilities of human–robot collaboration, from industrial skill transfer to coordinated dual-arm manipulation in challenging environments.

The teleoperation frameworks use the UR10e collaborative robotic manipulator as the core platform, controlled via the Robot Operating System 2 (ROS 2) for modularity and low-latency communication. Human input is captured through motion trackers (HTC Vive / Vive Ultimate Tracker) and custom handheld devices, which translate operator movements into real-time six-degree-of-freedom velocity commands. These devices are equipped with buttons for interaction and vibration motors for force feedback, allowing operators to feel contact dynamics while executing precise tasks.
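As a concrete illustration of this mapping, the sketch below shows how tracker poses could be turned into Cartesian velocity commands in a ROS 2 node. It is a minimal sketch under stated assumptions, not the institute's implementation: the topic names and the scaling gain are illustrative, and angular velocity (quaternion differencing) is omitted for brevity.

    # Minimal sketch: tracker pose -> Cartesian velocity command (ROS 2).
    # Topic names and the gain are illustrative assumptions.
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import PoseStamped, TwistStamped

    class TrackerTeleop(Node):
        def __init__(self):
            super().__init__('tracker_teleop')
            self.gain = 1.0   # motion-scaling gain (assumed)
            self.prev = None  # last tracker pose and its timestamp
            self.sub = self.create_subscription(
                PoseStamped, '/vive/tracker_pose', self.on_pose, 10)
            self.pub = self.create_publisher(
                TwistStamped, '/ur10e/twist_cmd', 10)

        def on_pose(self, msg: PoseStamped):
            t = msg.header.stamp.sec + msg.header.stamp.nanosec * 1e-9
            if self.prev is not None:
                p0, t0 = self.prev
                dt = max(t - t0, 1e-4)  # guard against zero dt
                cmd = TwistStamped()
                cmd.header = msg.header
                # Finite-difference linear velocity, scaled by the gain;
                # angular velocity is omitted in this sketch.
                cmd.twist.linear.x = self.gain * (msg.pose.position.x - p0.position.x) / dt
                cmd.twist.linear.y = self.gain * (msg.pose.position.y - p0.position.y) / dt
                cmd.twist.linear.z = self.gain * (msg.pose.position.z - p0.position.z) / dt
                self.pub.publish(cmd)
            self.prev = (msg.pose, t)

    def main():
        rclpy.init()
        rclpy.spin(TrackerTeleop())
        rclpy.shutdown()

    if __name__ == '__main__':
        main()

In practice the raw finite-difference velocity would be low-pass filtered to suppress tracker jitter before being sent to the robot.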

Visual immersion is achieved through a Meta Quest 3 VR/AR headset, which provides a digital twin of the robot workspace, real-time video streaming, and force visualization overlays. On the control side, methods such as admittance control for compliant grasping, hybrid force-position strategies, and slippage detection layers ensure stability, safety, and adaptive response in dual-arm manipulation and industrial grinding tasks. A finite state machine governs both teleoperation and autonomous modes, enabling operators to record demonstrations for later playback, thus supporting programming-by-demonstration workflows.
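The admittance behavior can be summarized by the virtual dynamics M·v̇ + D·v = F_ext: the measured contact force drives a commanded velocity through a virtual mass and damper, so the arm yields compliantly under load. The sketch below illustrates this update law only; the mass and damping values are illustrative assumptions, and the full controller described above additionally includes the hybrid force-position and slippage layers.

    # Minimal sketch of a Cartesian admittance update, M*dv/dt + D*v = F_ext.
    # Virtual mass and damping values are illustrative assumptions.
    import numpy as np

    class Admittance:
        def __init__(self, mass: float = 2.0, damping: float = 25.0):
            self.m = mass          # virtual mass [kg]
            self.d = damping       # virtual damping [N*s/m]
            self.v = np.zeros(3)   # commanded Cartesian velocity [m/s]

        def step(self, f_ext: np.ndarray, dt: float) -> np.ndarray:
            """Integrate the virtual dynamics one step; return the velocity command."""
            acc = (f_ext - self.d * self.v) / self.m
            self.v = self.v + acc * dt
            return self.v

With no external force the commanded velocity decays to rest, while a sustained push settles at a steady velocity of F/D; that steady "give" is the compliance the operator feels during grasping.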

Hardware and software integration is managed through ROS nodes, Unity-based visualization, and microcontroller-driven operator tools, ensuring synchronized interaction between human input and robotic output. This combined methodology demonstrates how teleoperation systems can extend human skills into hazardous or demanding environments, achieving robust dual-arm coordination, precise force-sensitive operations, and effective human–robot skill transfer.
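On the operator-tool side, the vibration motors mentioned earlier need a mapping from measured contact force to vibration intensity. The following is an illustrative sketch of such a mapping, written as Python pseudocode for a firmware routine; the force range, dead band, and 8-bit PWM resolution are assumptions, not the actual device firmware.

    # Illustrative sketch: contact force [N] -> vibration-motor PWM duty cycle.
    # Force range, dead band, and 8-bit resolution are assumptions.
    def force_to_pwm(force_n: float, f_min: float = 0.5, f_max: float = 20.0) -> int:
        """Map a measured contact force to an 8-bit PWM duty cycle (0-255)."""
        if force_n <= f_min:
            return 0  # dead band: sensor noise produces no vibration
        ratio = min((force_n - f_min) / (f_max - f_min), 1.0)
        return int(round(255 * ratio))  # saturate at full intensity

The dead band keeps sensor noise from producing phantom vibration, while saturation prevents the motor command from exceeding its rated duty cycle.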

Achievements and System Specifications

The developed teleoperation platforms demonstrate how immersive interfaces and haptic feedback can make robotic systems more precise, responsive, and intuitive. In the dual-arm project, two UR10e robots achieved synchronized coordination with latencies under 15 ms, while a custom slippage sensor detected instability and triggered grip correction within 8.8 ms, ensuring secure object handling. Shared admittance control allowed for compliant yet accurate manipulation, and the AR interface on Meta Quest 3 provided real-time visual cues for task states and slippage, significantly improving operator awareness.
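The grip-correction logic can be pictured as a fast threshold loop: whenever the slippage signal exceeds a bound, the commanded grip force is stepped up until the object is secure again. The sketch below is illustrative only; the threshold, increment, and force cap are assumptions, not the published sensor parameters.

    # Illustrative sketch of threshold-based grip correction on slippage.
    # Threshold, increment, and force cap are assumptions.
    def correct_grip(grip_force: float,
                     slip_signal: float,
                     threshold: float = 0.05,
                     increment: float = 2.0,
                     max_force: float = 40.0) -> float:
        """Step up the commanded grip force [N] when the slip signal exceeds a bound."""
        if abs(slip_signal) > threshold:
            grip_force = min(grip_force + increment, max_force)
        return grip_force

Run at the controller rate, a loop of this form reacts within a few control cycles, consistent with the millisecond-scale correction reported above.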

The skill-transfer study, focused on simulated industrial polishing/grinding tasks, validated an end-to-end latency below 15 ms and positional errors under 5 mm, with trajectory repeatability yielding correlation coefficients above 0.8 across trials. A user study with 17 participants confirmed that workspace scaling (2×–6×) enhanced precision and force regulation: 94% rated trajectory accuracy as good or very good, and 88% endorsed the training mode as useful. The system supported both teleoperation and autonomous playback, enabling programming by demonstration for industrial applications.
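Workspace scaling in this sense simply divides the operator's hand displacement by a scale factor before it is sent to the robot, so large, comfortable hand motions map to fine tool motions. A minimal sketch follows, using the 2×–6× range from the study; the default factor is an illustrative assumption.

    # Minimal sketch of workspace scaling: operator displacement [m] is divided
    # by a factor in the 2x-6x range evaluated in the user study.
    def scale_displacement(dx: float, dy: float, dz: float,
                           scale: float = 4.0) -> tuple[float, float, float]:
        """Map an operator hand displacement to a scaled-down robot displacement."""
        if not 2.0 <= scale <= 6.0:
            raise ValueError("scale factor outside the evaluated 2x-6x range")
        return (dx / scale, dy / scale, dz / scale)

At a factor of 4, for example, a 10 cm hand motion commands a 2.5 cm tool motion, which is what makes fine positioning and force regulation easier for the operator.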

Together, these systems showcase robust specifications:

  • Robotic Platform: UR10e collaborative manipulators (single- and dual-arm setups).
  • Control Framework: ROS 2 integration for modularity and low-latency communication.
  • Interfaces: Meta Quest 3 VR/AR for immersive visualization, Vive trackers for precise motion mapping.
  • Haptics: Vibrotactile feedback and force-aware visual overlays for real-time force perception.
  • Performance: End-to-end latency consistently below 15 ms (beneath typical human perceptual thresholds), high repeatability, and strong usability ratings.

These achievements highlight the potential of haptics-enhanced teleoperation for safe object handling, industrial skill transfer, and remote operation in hazardous environments, bridging human intuition with robotic reliability.

Related Publications

• Sadeghi, M., Khalil, I., Orlandatou, K., Kern, T.A. (2025). Haptic-Enhanced Shared Control for Collaborative Dual-Robot Telemanipulation in Large-Scale Object Handling. IEEE Telepresence Conference, Leiden, Netherlands, 8–10 September.

Join Us for Your Master’s, Bachelor’s, or Project Thesis!

Interested in robotics, VR/AR, and haptics? Work with us on cutting-edge teleoperation projects using UR10e robots, immersive interfaces, and real-time feedback. Gain hands-on experience and help shape the future of human–robot collaboration.

Contact Person: Dr.-Ing. Mohammad Sadeghi