
Illuminating VR Wayfinding: Color, Luminance, and Light in Eye-Tracking Teleportations

This project investigates how color, luminance, and light intensity affect VR wayfinding through eye-tracking teleportation. Its experiments aim to inform interaction design, align VR navigation with real-world architectural wayfinding strategies, and improve gaming, simulation, robotics, and educational VR applications, with the goal of better user experiences in virtual environments.


Virtual reality (VR) is a dynamic medium in which movement through architectural settings is key to immersion. Traditional navigation tools such as handheld controllers, however, often fall short of delivering seamless interaction. Eye-tracking teleportation is an alternative technique that lets users traverse VR spaces by gaze alone. Its effectiveness hinges on providing clear, reliable visual cues for a comfortable and efficient experience. Among these cues, color and luminance play pivotal roles in shaping depth perception and supporting intuitive navigation.


Objectives:
This study examines the intersection of color, luminance, and light intensity in VR wayfinding and eye-tracking teleportation, an area that remains largely unexplored. The objectives are twofold:
1. To meticulously examine the influence of color and luminance as depth distortion cues on the perception of proximity and distance during eye-tracking teleportations.
2. To unravel the intricate impact of varying light intensity levels on depth and distance perception, particularly among destinations with diverse color and luminance characteristics.


Methodology:
To address these objectives, a forced-choice pairwise comparison experiment was conducted. Sixty participants assessed 12 color conditions at four light intensity levels (0.25, 0.50, 0.75, and 1.00). The paired comparison method, valued for its sensitivity to differences between stimuli, presented participants with two objects and asked them to choose the one that showed the target effect more prominently. The stimuli were simple cubic buildings placed in a flat white virtual environment (#FFFFFF). With 12 conditions, every unordered pair was compared, so participants evaluated 66 pairs of stimuli per light intensity level.
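As a rough illustration of the trial structure described above, the following sketch enumerates the comparison pairs. The condition labels are placeholders, not the study's actual color conditions; only the counts (12 conditions, 4 intensity levels) come from the text.

```python
from itertools import combinations

# Placeholder labels for the 12 color conditions (the real conditions are not listed here)
color_conditions = [f"C{i:02d}" for i in range(1, 13)]
light_intensities = [0.25, 0.50, 0.75, 1.00]

# Every unordered pair of the 12 conditions: C(12, 2) = 66 comparisons
pairs = list(combinations(color_conditions, 2))
assert len(pairs) == 66

# Each participant judges every pair at every intensity level
trials = [(a, b, level) for level in light_intensities for (a, b) in pairs]
print(len(trials))  # 66 pairs x 4 levels = 264 trials per participant
```

This enumeration shows why 66 is not an arbitrary number: it is the count of unordered pairs drawn from 12 conditions, repeated once per intensity level.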


Expected Contributions:
This study's potential contributions are far-reaching and deeply impactful:
1. Informed Interaction Design: The study enriches our understanding of eye-tracking teleportation, providing designers with insights into how color and luminance impact depth perception. This knowledge empowers the creation of intuitive, user-friendly VR interfaces.


2. Architectural Harmonization: Architects and VR designers gain a toolkit to align virtual navigation principles with established architectural wayfinding strategies. The findings guide the integration of VR experiences into real-world spaces.


3. Gaming and Simulation Advancements: Game designers and simulation developers stand to benefit from enhanced VR navigation insights. The research facilitates more realistic, immersive experiences within game worlds and simulations.


4. Robotic Navigation Enhancement: In the realm of robotics, where virtual environments are pivotal for testing and training, the study’s results can shape navigation strategies and interaction design for robots.


5. Educational VR Environments: Learning in VR environments benefits from clearer navigation cues. The findings inform educators and e-learning developers, fostering more effective and engaging educational VR spaces.

By clarifying how color, luminance, and light interact in VR wayfinding and eye-tracking teleportation, this research points the way toward richer, more comfortable user experiences in virtual environments.
