Tangible interactions involve multiple sensory cues, enabling the accurate perception of object properties, such as size. Research has shown, however, that if we decouple these cues (for example, by altering the visual cue), the resulting discrepancies present new opportunities for interaction. Perception over time, though, relies not only on momentary sensory cues but also on a priori beliefs about the object, implying a continuing update cycle. This cycle is poorly understood, and its impact on interaction remains unknown. We study (N=80) visuo-haptic perception of size over time and (a) reveal how perception drifts, (b) examine the effects of visual priming and dead-reckoning, and (c) present a model of visuo-haptic perception as a cyclical, self-adjusting system. Our work has a direct impact on illusory perception in VR, but also sheds light on how our visual and haptic systems cooperate and diverge.
The study recruited 80 participants divided into four groups, each engaging in one-hour sessions in virtual reality. Conditions varied across two haptic device types — a fixed-size passive device and a custom-built active shape-changing device that dynamically altered its size between 6 cm and 8 cm — and three visual priming strategies: no priming, correct visual priming (revealing the true device), and misleading priming (revealing a device of a different size).
Participants completed repeated forced-choice tasks to estimate perceived object size at multiple time points throughout the session. Between estimation tasks, acclimation games simulated prolonged, natural exposure and practice with the device. This design allowed the researchers to track how perception evolved over time under sustained illusory conditions.
By controlling when and what participants saw of the physical proxy device, the study isolated the contributions of visual priming and prior knowledge to perceptual updating. The resulting dataset was used to derive and validate a first-order control system model of visuo-haptic sensory integration, capturing the cyclical, self-adjusting nature of perception over extended interaction.
Participants consistently overestimated the size of passive haptic objects over time, with drift following a curve toward a local asymptote. When an active shape-changing device shrank from 8 cm to 6 cm, perception drifted in the opposite direction, toward underestimation. Visual priming with the true device produced more accurate initial estimates and substantially reduced drift across the session. Misleading priming skewed perception toward the false size shown. Intermittent correct priming functioned as dead-reckoning, partially resetting perception at each reveal. These patterns were captured by a first-order control system model in which perceptive discrimination amplifies the sensory signal and feedback is weighted by confidence. The illusion threshold for grasping (approximately 17% JND for 6 cm objects reported in prior work) was shown to erode meaningfully over a one-hour session, with implications for the deployment of visuo-haptic illusions in prolonged VR interactions.
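The self-adjusting cycle described above can be sketched as a discrete first-order update: perception relaxes exponentially toward a drift asymptote, and a correct visual reveal resets it to the true size (the dead-reckoning effect). This is a minimal illustrative sketch, not the paper's fitted model; the function name, the gain, and the asymptote values are assumptions, and the paper's confidence-weighted feedback is folded into the single gain term.

```python
# Hypothetical first-order drift model of visuo-haptic size perception.
# All parameter values are illustrative, not the paper's fitted coefficients.

def simulate_drift(true_size_cm, asymptote_cm, gain, steps, reveals=()):
    """Discrete first-order update: perception relaxes toward a drift
    asymptote; a 'reveal' (correct visual priming) resets perception to
    the true size, acting as dead-reckoning."""
    perceived = true_size_cm
    trace = [perceived]
    for t in range(steps):
        if t in reveals:
            perceived = true_size_cm          # priming reset
        perceived += gain * (asymptote_cm - perceived)  # first-order drift
        trace.append(perceived)
    return trace

# Without priming, perception drifts toward overestimation of a 6 cm object:
no_prime = simulate_drift(6.0, 7.0, 0.2, 20)
# Intermittent correct reveals partially reset the drift each time:
primed = simulate_drift(6.0, 7.0, 0.2, 20, reveals={5, 10, 15})
```

Under this sketch, the unprimed trace approaches the asymptote monotonically, while the intermittently primed trace ends closer to the true size, mirroring the reported pattern of drift and partial resets.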
@inproceedings{zhang2026visuohaptic,
author = {Jian Zhang and Wafa Johal and Jarrod Knibbe},
title = {Modelling Visuo-Haptic Perception Change in Size Estimation Tasks},
booktitle = {Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems},
series = {CHI '26},
year = {2026},
numpages = {18},
publisher = {ACM},
address = {New York, NY, USA},
doi = {10.1145/3772318.3791140},
isbn = {979-8-4007-2278-3},
location = {Barcelona, Spain},
url = {https://chri-lab.github.io/papers/2026-visuo-haptic-perception-size/}
}
Jian Zhang, Wafa Johal, Jarrod Knibbe (2026). Modelling Visuo-Haptic Perception Change in Size Estimation Tasks. ACM CHI Conference on Human Factors in Computing Systems (CHI '26).