Browsing by Author "Lee, Gun A."
Now showing 1 - 3 of 3
Item
A Gaze-depth Estimation Technique with an Implicit and Continuous Data Acquisition for OST-HMDs (The Eurographics Association, 2017) Lee, Youngho; Piumsomboon, Thammathip; Ens, Barrett; Lee, Gun A.; Dey, Arindam; Billinghurst, Mark; edited by Tony Huang and Arindam Dey
The rapid development of machine learning algorithms can be leveraged for potential software solutions in many domains, including techniques for depth estimation of human eye gaze. In this paper, we propose an implicit and continuous data acquisition method for 3D gaze depth estimation for an optical see-through head-mounted display (OST-HMD) equipped with an eye tracker. Our method constantly monitors and generates user gaze data for training our machine learning algorithm. The gaze data acquired through the eye tracker include the inter-pupillary distance (IPD) and the gaze distance to the real and virtual target for each eye.

Item
Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze (The Eurographics Association, 2017) Lee, Gun A.; Kim, Seungwon; Lee, Youngho; Dey, Arindam; Piumsomboon, Thammathip; Norman, Mitchell; Billinghurst, Mark; edited by Robert W. Lindeman, Gerd Bruder, and Daisuke Iwai
To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and a remote helper in an AVC system affects collaboration and communication.
Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study comparing four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other's gaze significantly improved collaboration and communication.

Item
WeightSync: Proprioceptive and Haptic Stimulation for Virtual Physical Perception (The Eurographics Association, 2020) Teo, Theophilus; Nakamura, Fumihiko; Sugimoto, Maki; Verhulst, Adrien; Lee, Gun A.; Billinghurst, Mark; Adcock, Matt; edited by Ferran Argelaguet, Ryan McMahan, and Maki Sugimoto
In virtual environments, we can experience augmented embodiment through various virtual avatars. In physical environments, we can likewise extend the embodiment experience by attaching Supernumerary Robotic Limbs (SRLs) to a person's body. It is also important to provide feedback to the operator who controls the avatar (virtual) or the SRLs (physical). In this work, we use a servo motor and Galvanic Vestibular Stimulation to provide feedback from a virtual interaction that simulates remotely controlling SRLs. Our technique transforms information about the virtual objects into haptic and proprioceptive feedback, providing different sensations to the operator.