XR Prototypes
As an XR researcher, I explore the possibilities of immersive technologies by building XR applications. My primary focus is crafting novel 3D interaction techniques that elevate the user experience in immersive environments. Below is a selection of prototypes showcasing some of these explorations. These projects reflect my skills as a Unity developer, my innovative solutions for interaction design, and my ability to communicate complex concepts visually. I hope they provide a glimpse into the ongoing evolution of my work in shaping the future of XR interaction.
Clean the Ocean (Winner of the 3DUI Contest 2022)
We adapted two classic interaction techniques, Go-Go and World in Miniature (WiM), into an engaging minigame in which the user collects trash from the ocean. To improve precision and address occlusion issues in the traditional Go-Go technique, we propose ReX Go-Go. We also propose an adaptation of WiM, referred to as Rabbit-Out-of-the-Hat, which allows exocentric interaction for easier object retrieval.
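For context, the baseline Go-Go mapping keeps the virtual hand 1:1 with the real hand until it passes a distance threshold from the torso, then extends it quadratically so distant objects become reachable. Below is a minimal Unity C# sketch of that classic mapping only, not our ReX Go-Go variant; the torso, realHand, and virtualHand references and the threshold/gain values are hypothetical placeholders.

```csharp
using UnityEngine;

// Minimal sketch of the classic Go-Go arm-extension mapping.
// Within `threshold` metres of the torso the virtual hand follows the real
// hand 1:1; beyond it, the offset grows quadratically so distant objects
// become reachable. ReX Go-Go builds on this idea but is not shown here.
public class GoGoHand : MonoBehaviour
{
    public Transform torso;        // reference point on the user's body (hypothetical)
    public Transform realHand;     // tracked controller or hand
    public Transform virtualHand;  // rendered hand repositioned by this script

    public float threshold = 0.4f; // distance (m) at which the non-linear gain kicks in
    public float gain = 10f;       // quadratic gain coefficient k

    void Update()
    {
        Vector3 offset = realHand.position - torso.position;
        float realDist = offset.magnitude;

        // r_v = r_r                     if r_r <  D
        // r_v = r_r + k * (r_r - D)^2   if r_r >= D
        float virtualDist = realDist < threshold
            ? realDist
            : realDist + gain * (realDist - threshold) * (realDist - threshold);

        virtualHand.position = torso.position + offset.normalized * virtualDist;
        virtualHand.rotation = realHand.rotation;
    }
}
```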
Fantastic Voyage (Winner of the 3DUI Contest 2021)
We use interactive storytelling in immersive VR to promote science education for the general public on the topic of COVID-19 vaccination. The educational VR experience we developed combines sci-fi storytelling, adventure, and VR gameplay to illustrate how COVID-19 vaccines work. After playing the experience, users will understand how the human immune system reacts to a COVID-19 vaccine so that it is prepared for a future infection by the real virus.
Authentication in the Metaverse
We developed an authentication method that uses a virtual environment's individual assets as security tokens. To improve the token selection process, we introduce the HOG interaction technique. HOG combines two classic interaction techniques, Hook and Go-Go, improving approximate object targeting and further obfuscating the user's password-token selections. We created an engaging mystery-solving mini-game to demonstrate our authentication method and interaction technique.
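To illustrate the asset-as-token idea in isolation (independent of the HOG selection technique itself), here is a rough C# sketch that hashes the ordered sequence of selected asset IDs and checks it against a stored hash, so the plaintext sequence never has to be persisted. The class and method names are hypothetical and the code is an illustration, not a hardened implementation.

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

// Sketch: treat an ordered sequence of scene-asset IDs as a password.
// Only a salted hash of the enrolled sequence is stored; at login the
// user's selections are hashed the same way and compared.
public class AssetTokenAuthenticator
{
    private readonly string salt;
    private string storedHash;

    public AssetTokenAuthenticator(string salt)
    {
        this.salt = salt;
    }

    // Called once when the user enrols their asset-token password.
    public void Enroll(IEnumerable<string> selectedAssetIds)
    {
        storedHash = Hash(selectedAssetIds);
    }

    // Called on a login attempt; true if the selection sequence matches enrolment.
    public bool Verify(IEnumerable<string> selectedAssetIds)
    {
        return storedHash != null && storedHash == Hash(selectedAssetIds);
    }

    private string Hash(IEnumerable<string> assetIds)
    {
        string joined = salt + "|" + string.Join("|", assetIds);
        using (SHA256 sha = SHA256.Create())
        {
            byte[] digest = sha.ComputeHash(Encoding.UTF8.GetBytes(joined));
            return Convert.ToBase64String(digest);
        }
    }
}
```

Usage is two calls: Enroll once with the user's chosen asset sequence, then Verify with the sequence selected on each login attempt.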
Gesture-Based Math Operations in VR
We designed embodied gestures for arithmetic operations in virtual reality. The application offers an intuitive, gamified experience that teaches these interaction methods in an engaging setting.
Recommendation Agent Driven by Gaze in the Immersive Space
Designed for intelligence analysts at the DoD. The application feeds the user's gaze data to an AI agent running in the background, which generates a list of personalized recommendations for the user. The application also demonstrates several visualization techniques for conveying the agent's output to the user.
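As a simplified stand-in for the actual agent, the C# sketch below illustrates the basic flow: accumulate gaze dwell time per viewed document, build a dwell-weighted interest vector, and rank unread documents by cosine similarity to it. The class, its methods, and the topic vectors are hypothetical placeholders.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch of a gaze-driven recommender: dwell time on viewed documents
// weights an interest vector, and unread documents are ranked by cosine
// similarity to that vector. A simplified stand-in for the real AI agent.
public class GazeRecommender
{
    private readonly Dictionary<string, float> dwellSeconds = new Dictionary<string, float>(); // docId -> gaze time
    private readonly Dictionary<string, float[]> topicVectors;                                  // docId -> topic embedding

    public GazeRecommender(Dictionary<string, float[]> topicVectors)
    {
        this.topicVectors = topicVectors;
    }

    // Call per gaze sample with the document currently being looked at.
    public void AddGazeSample(string docId, float deltaTime)
    {
        float t;
        dwellSeconds.TryGetValue(docId, out t);
        dwellSeconds[docId] = t + deltaTime;
    }

    // Rank documents the user has not looked at yet.
    public List<string> Recommend(int count)
    {
        int dim = topicVectors.Values.First().Length;
        var interest = new float[dim];

        // Dwell-weighted sum of the topic vectors of viewed documents.
        foreach (var kv in dwellSeconds)
        {
            float[] vec;
            if (!topicVectors.TryGetValue(kv.Key, out vec)) continue;
            for (int i = 0; i < dim; i++)
                interest[i] += kv.Value * vec[i];
        }

        return topicVectors.Keys
            .Where(id => !dwellSeconds.ContainsKey(id))
            .OrderByDescending(id => Cosine(interest, topicVectors[id]))
            .Take(count)
            .ToList();
    }

    private static float Cosine(float[] a, float[] b)
    {
        float dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return (na == 0 || nb == 0) ? 0 : dot / (float)(Math.Sqrt(na) * Math.Sqrt(nb));
    }
}
```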
Hybrid Meeting in Immersive Space
We present CoLT, a platform where people can join via AR, VR, or traditional PCs to collaborate on a shared set of documents. Our platform allows users to interact with and share immersive elements in real time. The platform is designed for researchers working on a survey paper, so it can connect to individual reference-management accounts (e.g., Mendeley, Zotero) and import their documents into the platform. We showcase the platform through a simulated use case in which a group of geographically distributed researchers work together.
Collaborative Inspection of 3D Models
Designed for LLNL scientists. This prototype demonstrates multiple people working together on a shared, annotated 3D model. Users can join the networked, persistent platform and contribute both synchronously and asynchronously.
Your Eyes Know What You're Thinking
Designed for DoD intelligence analysts. This prototype captures a user's eye-tracking data while they complete a fact-finding mission across a large set of interconnected documents. We found strong evidence that, in an otherwise unstructured environment, a user's gaze data faithfully reveals how important a document is to them.
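The capture loop itself is simple: raycast the headset's eye-gaze ray against the document panels each frame and accumulate per-document fixation time for later analysis. The Unity C# sketch below assumes a gazeOrigin transform driven by the device's eye tracker and a Document component on each panel; both names are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical marker component attached to each document panel in the scene.
public class Document : MonoBehaviour
{
    public string docId;
}

// Sketch of gaze capture: raycast the eye-gaze ray each frame and log how
// long each document is fixated. The per-document totals are the raw
// material for analyses like the one described above.
public class GazeLogger : MonoBehaviour
{
    public Transform gazeOrigin;   // transform driven by the headset's eye tracker (hypothetical)
    public float maxDistance = 20f;

    public readonly Dictionary<string, float> fixationSeconds = new Dictionary<string, float>();

    void Update()
    {
        Ray gazeRay = new Ray(gazeOrigin.position, gazeOrigin.forward);

        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxDistance))
        {
            Document doc = hit.collider.GetComponent<Document>();
            if (doc != null)
            {
                float t;
                fixationSeconds.TryGetValue(doc.docId, out t);
                fixationSeconds[doc.docId] = t + Time.deltaTime;
            }
        }
    }
}
```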
Real-Time Clustering in 3D Space
A demonstration of clustering documents in the immersive space at different levels of automation. The video also showcases the interaction techniques for these clusters.
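For reference, the simplest form of this is a plain k-means over document positions in 3D space; the prototype layers its different levels of automation and its interaction techniques on top of something like the C# sketch below, in which all names are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal k-means over document positions in 3D space. Returns, for each
// point, the index of the cluster it was assigned to. Semi-automatic modes
// would seed or constrain these clusters rather than run fully automatically.
public static class DocumentClustering
{
    public static int[] KMeans(List<Vector3> points, int k, int iterations = 10)
    {
        // Simple deterministic seeding; assumes points.Count >= k.
        var centroids = new Vector3[k];
        for (int i = 0; i < k; i++)
            centroids[i] = points[i * points.Count / k];

        var assignment = new int[points.Count];

        for (int iter = 0; iter < iterations; iter++)
        {
            // Assignment step: nearest centroid for each point.
            for (int p = 0; p < points.Count; p++)
            {
                float best = float.MaxValue;
                for (int c = 0; c < k; c++)
                {
                    float d = (points[p] - centroids[c]).sqrMagnitude;
                    if (d < best) { best = d; assignment[p] = c; }
                }
            }

            // Update step: recompute each centroid as the mean of its points.
            var sums = new Vector3[k];
            var counts = new int[k];
            for (int p = 0; p < points.Count; p++)
            {
                sums[assignment[p]] += points[p];
                counts[assignment[p]]++;
            }
            for (int c = 0; c < k; c++)
                if (counts[c] > 0) centroids[c] = sums[c] / counts[c];
        }
        return assignment;
    }
}
```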