Exploring Immersive Social-Physical Interaction with Virtual Characters through Coordinated Robotic Encountered-Type Contact


This work presents novel robot-mediated immersive experiences enabled by an encountered-type haptic display (ETHD) that introduces direct physical contact into virtual environments. We focus on social-physical interactions, a class of interaction associated with meaningful human outcomes in prior human-robot interaction (HRI) research. We explore this interaction paradigm in immersive virtual environments through an object handover, a fist bump, and a high five with a virtual character. Extending this HRI paradigm into immersive environments enables the study of how physically grounded robotic contact and virtual augmentation jointly shape these novel social-physical interaction experiences. To support this investigation, we introduce ETHOS (Encountered-Type Haptics for On-demand Social interaction), an experimental platform integrating a torque-controlled manipulator and interchangeable props with a headset-mediated virtual experience. ETHOS enables co-located physical interaction through marker-based physical-virtual registration while concealing the robot behind the virtual environment, decoupling contact from visible robot embodiment. We completed both a technical characterization, through spatial alignment and interaction latency tests, and an experiential evaluation, through a 55-participant user study. Overall, the findings demonstrate the feasibility and experiential value of robot-mediated social-physical interaction in VR and motivate further development of dynamic encountered-type approaches for immersive HRI.


💡 Research Summary

The paper introduces ETHOS, an experimental platform that merges a torque‑controlled 7‑DoF LBR iiwa robotic arm with interchangeable physical props and a Meta Quest 3‑based virtual reality (VR) environment to enable “social‑physical” interactions with virtual characters. Three interaction types—object handover, fist bump, and high‑five—are implemented, each using a dedicated prop (a 3‑D‑printed baton, a silicone fist, and a silicone open hand). ETHOS employs a ChArUco marker system to establish a six‑degree‑of‑freedom physical‑virtual registration, and UDP communication to synchronize the robot’s end‑effector pose with the user’s hand tracking data from the headset.
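The core of such a physical-virtual registration is a single rigid transform that maps robot-base coordinates into the headset frame, obtained by observing the same marker (here, a ChArUco board) from both sides. The sketch below illustrates this transform composition with plain homogeneous matrices; the function names and the pure-translation example poses are illustrative assumptions, not the paper's actual calibration code.

```python
import numpy as np

def translation(t) -> np.ndarray:
    """4x4 homogeneous transform that is a pure translation by t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def registration_transform(T_marker_vr: np.ndarray,
                           T_marker_robot: np.ndarray) -> np.ndarray:
    """Transform mapping robot-base coordinates into the headset (VR)
    frame, given the shared marker's pose measured in each frame."""
    return T_marker_vr @ np.linalg.inv(T_marker_robot)

def to_vr_frame(T_robot_to_vr: np.ndarray, p_robot: np.ndarray) -> np.ndarray:
    """Map a 3-D point (e.g. the end-effector position) into the VR frame."""
    p_h = np.append(p_robot, 1.0)  # homogeneous coordinates
    return (T_robot_to_vr @ p_h)[:3]

# Hypothetical example: marker 1 m along x in the robot frame,
# and 2 m along y in the headset frame.
T = registration_transform(translation([0.0, 2.0, 0.0]),
                           translation([1.0, 0.0, 0.0]))
print(to_vr_frame(T, np.array([1.0, 0.0, 0.0])))  # marker origin in VR frame
```

Once `T_robot_to_vr` is estimated, every streamed end-effector pose can be mapped into headset coordinates before the virtual character's hand is rendered over it.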

Two control strategies are explored: static physicality (SP), where the prop remains stationary and provides only contact, and dynamic physicality (DP), where the robot actively moves to generate impact forces that approximate natural touch. Technical evaluation shows an average spatial alignment error of 5.09 ± 0.94 mm and an interaction latency of 28.58 ± 31.21 ms, both below typical perceptual thresholds for VR haptics.
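Mean ± standard deviation latency figures like these are typically gathered by timestamping messages on the pose-streaming channel. The paper's measurement pipeline is not reproduced here; the following is a minimal loopback sketch, assuming a UDP link like the one ETHOS uses between headset and robot controller, that measures round-trip times on a single machine.

```python
import socket
import statistics
import time

def udp_roundtrip_latencies(n: int = 100) -> list:
    """Measure n loopback UDP round-trip times in milliseconds.
    Stands in for timestamping the headset <-> controller pose stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 0))        # OS-assigned port
    sock.settimeout(1.0)
    addr = sock.getsockname()
    samples = []
    for i in range(n):
        payload = f"pose:{i}".encode()
        t0 = time.perf_counter()
        sock.sendto(payload, addr)     # echo the packet to ourselves
        data, _ = sock.recvfrom(1024)
        t1 = time.perf_counter()
        assert data == payload
        samples.append((t1 - t0) * 1000.0)
    sock.close()
    return samples

lat = udp_roundtrip_latencies()
print(f"latency: {statistics.mean(lat):.2f} ± {statistics.stdev(lat):.2f} ms")
```

A real characterization would timestamp across the full tracking-to-motion chain rather than a loopback socket, which is why reported latencies (and their variance) are much larger than raw UDP round-trips.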

A user study with 55 participants examined experiential outcomes across three conditions: no physicality (NP), SP, and DP. Participants experienced each of the three interaction scenarios under one of the conditions. Results indicate that both SP and DP significantly improve presence, realism, and perceived social connection compared with the purely virtual baseline. However, differences between SP and DP were not statistically significant, suggesting that the current implementation of dynamic impact does not yet produce a perceptibly richer tactile experience.
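The summary does not specify which statistical tests the authors ran, but a standard analysis for a between-subjects design with three conditions (NP, SP, DP) is a one-way ANOVA on each rating scale. As an illustrative sketch only, the F statistic can be computed directly:

```python
import numpy as np

def one_way_anova_f(*groups) -> float:
    """F statistic for a one-way between-subjects design,
    e.g. presence ratings under the NP, SP, and DP conditions."""
    data = [np.asarray(g, dtype=float) for g in groups]
    grand = np.concatenate(data).mean()
    k = len(data)                      # number of conditions
    n = sum(len(g) for g in data)      # total participants
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in data)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in data)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical rating data (not the study's results):
print(one_way_anova_f([1, 2, 3], [2, 3, 4], [3, 4, 5]))
```

A large F relative to the critical value for (k−1, n−k) degrees of freedom indicates that condition means differ; the reported null result between SP and DP corresponds to a pairwise follow-up test failing to reach significance.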

The authors position their work within the broader HRI literature, noting that social‑physical interactions—where physical contact and social meaning are inseparable—have been shown to boost motivation, learning, and engagement in assistive and instructional contexts. By concealing the robot behind the virtual environment, ETHOS isolates the tactile contribution while allowing the virtual avatar to convey social cues, thereby avoiding the “robot visibility” effect that can distract users.

Limitations include reliance on marker‑based registration (which introduces small positional drift), limited force‑control bandwidth of the robot, and the modest differentiation between SP and DP. Future directions propose higher‑speed vision‑based tracking, machine‑learning‑driven calibration, richer impact‑force profiles, and extending the platform to collaborative, educational, or therapeutic scenarios.

In summary, ETHOS demonstrates that robot‑mediated encountered‑type haptics can feasibly and beneficially augment immersive VR experiences, providing tangible physical feedback that enhances users’ sense of presence and social connection with virtual characters. This work opens a pathway for more sophisticated, dynamically rendered social‑physical interactions in next‑generation mixed‑reality systems.

