Browsing by Author "Billinghurst, Mark"
Now showing 1 - 4 of 4
Item An AR Network Cabling Tutoring System for Wiring a Rack (The Eurographics Association, 2017)
Herbert, B. M.; Weerasinghe, A.; Ens, Barrett; Billinghurst, Mark; Wigley, G. Edited by Tony Huang and Arindam Dey.
We present a network cabling tutoring system that guides learners through cabling a network topology by overlaying virtual icons and arrows on the ports. The system determines the network state by parsing switch output and does not depend on network protocols being functional (an illustrative parsing sketch follows this listing). A server provides a web-based user interface and communicates with an external intelligent tutoring system called the Generalized Intelligent Framework for Tutoring. Learners view the AR annotations on a tablet, though support for the HoloLens HMD will be added soon.

Item Comparative Evaluation of Sensor Devices for Micro-Gestures (The Eurographics Association, 2017)
Simmons, H.; Devi, R.; Ens, Barrett; Billinghurst, Mark. Edited by Tony Huang and Arindam Dey.
This paper presents a comparative evaluation of two hand gesture recognition sensors and their ability to detect small, sub-millimeter movements. We explore the capabilities of these devices by testing whether users can reliably use the sensors to select a simple user interface element in 1D space using three distinct gestures: a small movement of the thumb and forefinger representing a slider, a slightly larger movement of a finger up and down, and a large gesture of moving the whole hand up and down. Results of our preliminary study reveal that the palm provides the fastest and most reliable input. While not conclusive, data from our initial study indicates that the Leap sensor produces lower error, difficulty, and fatigue than the Soli sensor with our test gesture set.

Item Comparative Evaluation of Sensor Devices for Micro-Gestures (The Eurographics Association, 2017)
Simmons, H.; Devi, R.; Ens, Barrett; Billinghurst, Mark. Edited by Tony Huang and Arindam Dey.
This paper presents a comparative evaluation of two gesture recognition sensors and their ability to detect small movements known as micro-gestures. In this work we explore the capabilities of these devices by testing whether users can reliably use the sensors to select a target using a simple 1D user interface element. We implemented three distinct gestures: a large gesture of moving the whole hand up and down; a smaller gesture of moving a finger up and down; and a small movement of the thumb against the forefinger to represent a virtual slider. Demo participants will be able to experience these three gestures with two sensing devices, a Leap Motion and a Google Soli.

Item WeightSync: Proprioceptive and Haptic Stimulation for Virtual Physical Perception (The Eurographics Association, 2020)
Teo, Theophilus; Nakamura, Fumihiko; Sugimoto, Maki; Verhulst, Adrien; Lee, Gun A.; Billinghurst, Mark; Adcock, Matt. Edited by Ferran Argelaguet, Ryan McMahan, and Maki Sugimoto.
In virtual environments, we can take on an augmented embodiment with various virtual avatars. In physical environments, we can likewise extend the embodiment experience by attaching Supernumerary Robotic Limbs (SRLs) to the body of a person. It is also important to consider the feedback to the operator who controls the avatar (virtual) and the SRLs (physical). In this work, we use a servo motor and Galvanic Vestibular Stimulation to provide feedback from a virtual interaction that simulates remotely controlling SRLs. Our technique transforms information about the virtual objects into haptic and proprioceptive feedback that provides different sensations to the operator (an illustrative feedback-mapping sketch follows this listing).
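As a companion to the AR cabling tutoring item above, here is a minimal sketch of how switch output might be parsed to infer port state without relying on network protocols. The Cisco-style "show interfaces status" format, the regular expression, and the helper functions are assumptions for illustration; the paper does not describe its actual parsing code.

```python
import re

# Hypothetical parser for switch CLI output, in the spirit of the paper's
# approach of inferring network state from switch output rather than from
# network protocols. The exact CLI format is an assumption (a Cisco-style
# "show interfaces status" layout is used here for illustration).
STATUS_LINE = re.compile(r"^(?P<port>\S+)\s+.*\b(?P<state>connected|notconnect)\b")

def parse_port_states(cli_output: str) -> dict[str, bool]:
    """Map each port name to True (cable detected) or False (port empty)."""
    states = {}
    for line in cli_output.splitlines():
        match = STATUS_LINE.match(line.strip())
        if match:
            states[match.group("port")] = match.group("state") == "connected"
    return states

def next_port_to_cable(states: dict[str, bool], target_topology: list[str]) -> str | None:
    """Return the first port in the target topology that is still uncabled,
    i.e. the port an AR overlay would highlight next."""
    for port in target_topology:
        if not states.get(port, False):
            return port
    return None
```

In such a design, the tutoring server could poll the switch periodically, run the parser, and push the next uncabled port to the AR client for highlighting.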
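For the WeightSync item, the sketch below illustrates one way a virtual object's mass could be translated into a servo angle and a Galvanic Vestibular Stimulation level. The linear mapping, ranges, and constants are assumptions for illustration only, not the authors' calibration or hardware interface.

```python
# Illustrative mapping from a virtual object's mass to (a) a servo angle that
# loads the operator's arm and (b) a GVS current level. All constants below
# are assumed values for the sketch, not measurements from the paper.
SERVO_MIN_DEG, SERVO_MAX_DEG = 0.0, 90.0   # assumed servo travel
GVS_MAX_MA = 1.5                           # assumed stimulation ceiling (mA)
MASS_MAX_KG = 5.0                          # assumed heaviest virtual object

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def feedback_for_mass(mass_kg: float) -> tuple[float, float]:
    """Return (servo_angle_deg, gvs_current_ma) for a virtual object's mass."""
    ratio = clamp(mass_kg / MASS_MAX_KG, 0.0, 1.0)
    servo_angle = SERVO_MIN_DEG + ratio * (SERVO_MAX_DEG - SERVO_MIN_DEG)
    gvs_current = ratio * GVS_MAX_MA
    return servo_angle, gvs_current
```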