Teleoperation robotics combines the precision and strength of machines with the decision-making and adaptability of humans. By enabling operators to control robots from a distance, while receiving visual and haptic (touch-based) feedback, these systems allow complex tasks to be performed in environments that are unsafe, inaccessible, or ergonomically challenging for humans. Haptic feedback plays a key role in this interaction, allowing operators to “feel” what the robot is touching or manipulating. This tactile awareness improves safety, accuracy, and control in delicate or force-sensitive tasks such as assembly, grinding, or handling fragile objects.
Our projects demonstrate how teleoperation and haptic robotics can expand the capabilities of human–robot collaboration, from industrial skill transfer to coordinated dual-arm manipulation in challenging environments.
The teleoperation frameworks use the UR10e collaborative robotic manipulator as the core platform, controlled via the Robot Operating System 2 (ROS 2) for modularity and low-latency communication. Human input is captured through motion trackers (HTC Vive / Vive Ultimate Tracker) and custom handheld devices, which translate operator movements into real-time six-degree-of-freedom velocity commands. These devices are equipped with buttons for interaction and vibration motors for force feedback, allowing operators to feel contact dynamics while executing precise tasks.
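The tracker-to-command mapping described above can be sketched as a simple finite-difference of the tracked pose into a velocity (twist) command. This is an illustrative reconstruction, not the project's actual controller: the function name `tracker_to_twist`, the scaling factor, and the dead-band value are all assumptions.

```python
def tracker_to_twist(pose_prev, pose_now, dt, scale=1.0, dead_band=1e-3):
    """Finite-difference a tracked 6-DoF pose (x, y, z, roll, pitch, yaw)
    into a velocity command. A small dead-band suppresses hand tremor
    near zero motion; `scale` maps operator motion to robot motion."""
    twist = []
    for prev, now in zip(pose_prev, pose_now):
        v = scale * (now - prev) / dt
        twist.append(0.0 if abs(v) < dead_band else v)
    return twist

# Example: the operator moves 2 cm along x within one 10 ms control cycle.
cmd = tracker_to_twist([0.0] * 6, [0.02, 0, 0, 0, 0, 0], dt=0.01)
# cmd[0] is then roughly 2.0 m/s before any downstream limiting.
```

In a ROS 2 deployment, the resulting list would typically be published as a `geometry_msgs/Twist` message at the control rate; the dead-band keeps the robot stationary while the operator's hand is nominally at rest.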
Visual immersion is achieved through a Meta Quest 3 VR/AR headset, which provides a digital twin of the robot workspace, real-time video streaming, and force visualization overlays. On the control side, methods such as admittance control for compliant grasping, hybrid force-position strategies, and slippage detection layers ensure stability, safety, and adaptive response in dual-arm manipulation and industrial grinding tasks. A finite state machine governs both teleoperation and autonomous modes, enabling operators to record demonstrations for later playback, thus supporting programming-by-demonstration workflows.
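The admittance control mentioned above renders the robot compliant by letting measured external force drive its velocity. A minimal one-axis sketch of the discrete update for M·dv/dt + D·v = f_ext follows; the mass and damping values are illustrative assumptions, not the tuned parameters of the actual system.

```python
def admittance_step(v, f_ext, dt, mass=2.0, damping=20.0):
    """One discrete integration step of the admittance law
    M * dv/dt + D * v = f_ext: an external force produces a
    compliant velocity response instead of a rigid position hold."""
    accel = (f_ext - damping * v) / mass
    return v + accel * dt

# A constant 10 N contact force drives the velocity toward the
# steady-state value f_ext / damping = 0.5 m/s.
v = 0.0
for _ in range(1000):          # 1 s of simulated contact at 1 kHz
    v = admittance_step(v, f_ext=10.0, dt=0.001)
```

The virtual mass sets how quickly the robot yields to contact, while the damping term bounds the steady-state velocity, which is why admittance control suits compliant grasping and surface-contact tasks such as grinding.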
Hardware and software integration is managed through ROS nodes, Unity-based visualization, and microcontroller-driven operator tools, ensuring synchronized interaction between human input and robotic output. This combined methodology demonstrates how teleoperation systems can extend human skills into hazardous or demanding environments, achieving robust dual-arm coordination, precise force-sensitive operations, and effective human–robot skill transfer.
The developed teleoperation platforms demonstrate how immersive interfaces and haptic feedback can make robotic systems more precise, responsive, and intuitive. In the dual-arm project, two UR10e robots achieved synchronized coordination with latencies under 15 ms, while a custom slippage sensor detected instability and triggered grip correction within 8.8 ms, ensuring secure object handling. Shared admittance control allowed for compliant yet accurate manipulation, and the AR interface on Meta Quest 3 provided real-time visual cues for task states and slippage, significantly improving operator awareness.
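The slippage-correction behavior described above can be sketched as a threshold rule: when the slip sensor reading exceeds a limit, the grip force is tightened by a fixed increment up to a safe maximum. The function name `grip_correction` and all numeric values here are illustrative assumptions.

```python
def grip_correction(slip_signal, grip_force, threshold=0.5,
                    step=2.0, max_force=40.0):
    """If the slip sensor reading exceeds `threshold`, tighten the grip
    by `step` newtons, saturating at `max_force` so the gripper never
    crushes the object."""
    if slip_signal > threshold:
        return min(grip_force + step, max_force)
    return grip_force

# A strong slip reading raises a 10 N grip to 12 N; a quiet sensor
# leaves the grip unchanged.
tightened = grip_correction(0.8, 10.0)
unchanged = grip_correction(0.1, 10.0)
```

Running such a rule inside the high-rate control loop is what makes the millisecond-scale correction latency reported above achievable: the reaction is a single comparison and increment, not a full replanning step.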
The skill-transfer study, focused on simulated industrial polishing/grinding tasks, validated an end-to-end latency below 15 ms and positional errors under 5 mm, with trajectory repeatability yielding correlation coefficients above 0.8 across trials. A user study with 17 participants confirmed that workspace scaling (2×–6×) enhanced precision and force regulation, with 94% rating trajectory accuracy as good or very good and 88% endorsing the training mode as useful. The system supported both teleoperation and autonomous playback, enabling programming by demonstration for industrial applications.
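The repeatability figure above is reported as a correlation coefficient between repeated trajectories. A minimal sketch of that metric, assuming the Pearson correlation over per-axis position samples (the exact metric definition used in the study is not spelled out here):

```python
import math

def pearson_corr(a, b):
    """Pearson correlation between two equal-length trajectory traces;
    values near 1.0 indicate that repeated executions follow the same
    path shape."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    std_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    std_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (std_a * std_b)

# Two linearly related traces correlate perfectly; a repeatability
# criterion like "above 0.8" would compare recorded trial pairs.
r = pearson_corr([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 4.0, 6.0])
```

Comparing each playback trial against the recorded demonstration with such a coefficient gives a scale-invariant measure of how faithfully the programmed-by-demonstration motion is reproduced.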
Together, these achievements highlight the potential of haptics-enhanced teleoperation for safe object handling, industrial skill transfer, and remote operation in hazardous environments, bridging human intuition with robotic reliability.
• Sadeghi, M., Khalil, I., Orlandatou, K., & Kern, T. A. (2025). Haptic-Enhanced Shared Control for Collaborative Dual-Robot Telemanipulation in Large-Scale Object Handling. IEEE Telepresence Conference, Leiden, Netherlands, 8–10 September.