Five Full Papers & One Workshop Paper at CHI 2026

The UMTL has published five full papers and one workshop paper at the ACM CHI 2026 conference in Barcelona, Spain!


Xinrong Wang presented her full paper on user-reconfigured haptics, which explores how users can reconfigure passive haptic objects to create more dynamic yet lightweight tactile experiences in virtual reality.

André Zenner contributed to four full papers, covering topics such as privacy and safety challenges in on-body interaction, novel vibrotactile techniques for material perception in VR, how mobile interactions influence grasp and contact behavior beyond touchscreens, and perception thresholds for real and virtual inclinations during cycling in VR.

André Zenner, Moiz Sakha, and Florian Daiber co-authored a paper, presented at the AI for Haptics workshop, exploring how AI can turn everyday objects into effective haptic props for immersive VR experiences.

Nina Knieriemen contributed a position statement at the workshop “Towards Proactive Approaches to Combating Toxicity, Harassment, and Abuse in Online Social Spaces: A Collaborative Theory-Building Workshop”.

 

 

User-reconfigured Haptics: Combining User-Reconfiguration and Visual Manipulations to Enhance Dynamic Passive Haptic Experiences for VR

Xinrong Wang, Yu Jiang, Martin Schmitz, Jürgen Steimle, Antonio Krüger, Donald Degraen

https://doi.org/10.1145/3772318.3793333

This paper explores how to make passive VR haptics more dynamic while still keeping it lightweight. We combine modular passive hardware, user-driven reconfiguration, and virtual remapping to enable richer and more flexible haptic experiences in VR.

Abstract: Virtual Reality (VR) depends on haptic feedback to create immersive experiences. Traditional passive proxies align physical props with their virtual counterparts but remain limited in scalability and expressiveness, or require bulky actuators to support reconfiguration. We introduce User-reconfigured Haptics, an approach that utilizes implicit user actions to reconfigure haptic interfaces to extend the gamut of VR haptic experiences. Modular 3D-printed cells are assembled into dynamic interfaces that express diverse haptic properties such as softness and weight. By masking physical reconfigurations with visual (re)mapping, user actions unnoticeably change haptic properties, resulting in user-driven, dynamic haptic experiences. User studies show that our design can provide distinguishable haptic experiences and is perceived as realistic and enjoyable in a VR task. We further showcase four applications: a fishing rod that changes weight and flexibility, a dynamic desktop of pressable buttons, a glove with adjustable squeezing, and a crossbow with variable pulling resistance.

Xinrong Wang presenting her new full paper on user-reconfigured haptics at CHI 2026.

 

Privacy & Safety Challenges of On-Body Interaction Techniques

Daniel Gerhardt, Divyanshu Bhardwaj, Ashwin Ram, André Zenner, Jürgen Steimle, and Katharina Krombholz

https://doi.org/10.1145/3772318.3790403

As computing moves onto the human body, interaction techniques raise new and deeply intertwined privacy and safety concerns. This paper presents insights from expert interviews exploring risks such as sensitive data over-collection, harmful inferences, bystander impacts, and threats to bodily autonomy. The study shows that on-body interactions can enable not only privacy violations, but also physical and psychological harm if misused or poorly designed. Based on these findings, the paper derives eight actionable design guidelines for safer and more privacy-aware on-body systems. The work provides a foundation for building trustworthy wearable and on-body technologies.

Connected Material Experiences using Bimanual Vibrotactile Crosstalk in Virtual Reality

Nihar Sabnis, André Zenner, Erik Peralta Løvaas, Marco Weiss, Andrea Bianchi, and Paul Strohmeier

https://doi.org/10.1145/3772318.3790767

Many materials, such as elastic bands or twisted wires, are naturally explored with both hands—but VR systems struggle to recreate this experience without complex hardware. This paper introduces a vibrotactile technique that makes two unconnected VR controllers feel as if they are manipulating a single shared object. By carefully “coupling” vibrations between the hands, the system can convey material properties such as elasticity, flexibility, and torsion. Two user studies show that this approach increases the sense of connectedness and supports rich material perception. The work enables expressive, two-handed haptic interactions using standard consumer VR hardware.

Understanding How Mobile Interactions Shape Grasp and Contact Patterns Beyond the Touchscreen

Carolin Stellmacher, Leon Tristan Dratzidis, André Zenner, Iddo Yehoshua Wald, Johannes Schöning, Yvonne Rogers, Donald Degraen, and Mark Colley

https://doi.org/10.1145/3772318.3790565

Smartphone interaction is shaped not only by touchscreens, but also by how we physically hold devices. This paper presents a detailed study of how fingers and palms grasp and contact a smartphone’s back and edges across nine everyday tasks. Using thermal imaging, the work reveals task-specific contact patterns and support roles of individual fingers that are usually hidden from view. The results expose how little of the phone is actually in contact with the hand and where physical strain may occur. These insights open up new opportunities for ergonomic device design, back-of-device interaction, and spatial haptic feedback beyond the touchscreen.

Determining Perception Thresholds for Real and Virtual Inclinations While Cycling in Virtual Reality

Jonas Keppel, Marvin Prochazka, Stefan Lewin, Markus Stroehnisch, Marvin Strauss, André Zenner, Donald Degraen, Andrii Matviienko, and Stefan Schneegass

https://doi.org/10.1145/3772318.3791538

This paper investigates how much real and virtual slopes can differ in VR cycling before users notice the mismatch. In a controlled user study, cyclists rode a tilting indoor bike while visual inclinations in VR were independently manipulated. The results reveal clear perception thresholds: users tolerate surprisingly large visual exaggerations of uphill and downhill slopes without noticing incongruence. These findings show how controlled sensory mismatches can expand the design space of VR cycling experiences. The work informs the design of more engaging and flexible VR applications for training, exergames, and rehabilitation.

 

Workshop Paper

How AI Enables Haptic Virtual Reality in Everyday Environments

André Zenner, Muhammad Moiz Sakha, Sukran Karaosmanoglu, Florian Daiber, and Frank Steinicke

Proceedings of the AI for Haptics and Haptics for AI: Challenges and Opportunities Workshop at the 2026 CHI Conference on Human Factors in Computing Systems (CHI ’26), April 13–17, 2026, Barcelona, Spain. 8 pages.


Providing convincing haptic feedback in VR usually requires specialized hardware that is rarely available at home or work. This position paper argues that artificial intelligence can turn everyday objects—such as broomsticks, bottles, or tools—into effective haptic props for immersive VR experiences. The authors outline how AI-driven scene understanding can identify safe objects, assess their perceptual and functional similarity to virtual objects, and dynamically adapt interaction techniques to compensate for mismatches. By combining computer vision, reasoning, and perceptual illusions, the approach enables rich tactile experiences without additional hardware. The paper lays the conceptual groundwork for scalable, affordable haptic VR in everyday environments.

André Zenner presenting a workshop paper at the AI for Haptics Workshop at CHI 2026 on how AI can enable haptic VR experiences in everyday environments.


Contact Person

Moiz Sakha