1. EP3455701A1
Publication date: 2019-03-20
Application number: 17725484.4
Filing date: 2017-05-15
Assignee: Google LLC
2. US10275023B2
Publication date: 2019-04-30
Application number: 15/386,594
Filing date: 2016-12-21
Assignee: Google LLC
Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
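As a rough illustration of the gaze-to-text idea in this abstract, the sketch below maps a sequence of gaze points, assumed to be already projected onto the virtual keyboard plane, to the keys they pass over; the key layout, coordinates, and sample values are invented for the example and are not taken from the patent.

```python
# Minimal sketch: mapping projected gaze points to keys on a virtual keyboard.
# The layout, key size, and gaze samples below are illustrative assumptions only.

KEY_SIZE = 1.0  # width/height of one key in keyboard-plane units (assumed)

# A tiny 3-row layout; a real system would use a full keyboard model.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x: float, y: float):
    """Return the key under a gaze point in keyboard-plane coordinates, or None."""
    row = int(y // KEY_SIZE)
    col = int(x // KEY_SIZE)
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None

def swipe_to_keys(gaze_points):
    """Collapse a gaze-directed swipe into the ordered list of keys it crossed."""
    path = []
    for x, y in gaze_points:
        k = key_at(x, y)
        if k is not None and (not path or path[-1] != k):
            path.append(k)
    return path

# Example: a swipe drifting across the top row, then down one row.
print(swipe_to_keys([(0.5, 0.5), (1.6, 0.4), (2.5, 0.6), (2.6, 1.5)]))
```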
3. US2019139323A1
Publication date: 2019-05-09
Application number: 16/240,176
Filing date: 2019-01-04
Assignee: Google LLC
Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and, responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
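The two-step alignment flow described here can be sketched as a simple prompt-and-capture sequence; the controller interface below (wait_for_button, get_orientation) is hypothetical and stands in for whatever API the actual system uses.

```python
# Minimal sketch of the two-step alignment flow described in the abstract.
# The controller API and the orientation values are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class AlignmentData:
    yaw: float
    pitch: float
    roll: float

class FakeController:
    """Stand-in for a handheld VR controller; values are made up."""
    def wait_for_button(self) -> None:
        pass  # a real implementation would block on a button event
    def get_orientation(self):
        return (12.5, -3.0, 0.4)  # yaw, pitch, roll in degrees (assumed)

def align(controller: FakeController) -> AlignmentData:
    controller.wait_for_button()                     # first input
    print("Point the controller at the designated target, then press again.")
    controller.wait_for_button()                     # second input
    yaw, pitch, roll = controller.get_orientation()  # capture current pose
    return AlignmentData(yaw, pitch, roll)           # stored alignment data

print(align(FakeController()))
```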
4. US10198874B2
Publication date: 2019-02-05
Application number: 15/594,186
Filing date: 2017-05-12
Assignee: Google LLC
Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and, responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
5. EP3403165B1
Publication date: 2019-04-10
Application number: 17724975.2
Filing date: 2017-05-04
Assignee: Google LLC
6. US2019155439A1
Publication date: 2019-05-23
Application number: 16/111,702
Filing date: 2018-08-24
Assignee: Google LLC
Abstract: A method that includes employing several sensors associated with a handheld controller, where each sensor is one of a hover, touch, or force/pressure sensor, and generating, by one or more of the sensors, sensor data associated with the position of a user's hand and fingers in relation to the handheld controller. The method continues with combining the sensor data from the several sensors to form aggregate sensor data, sending the aggregate sensor data to a processor, and generating an estimated position of the user's hand and fingers based on the aggregate sensor data.
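A minimal sketch of the aggregation step described in this abstract, assuming each sensor reports a (finger, kind, value) reading; the sensor names, thresholds, and classification labels are assumptions for illustration only.

```python
# Illustrative sketch of aggregating hover/touch/force readings from a controller
# into a single structure and deriving a crude per-finger contact estimate.

def aggregate(readings):
    """Merge per-sensor readings into one dict keyed by finger."""
    agg = {}
    for r in readings:
        finger = agg.setdefault(r["finger"], {})
        finger[r["kind"]] = max(finger.get(r["kind"], 0.0), r["value"])
    return agg

def estimate_positions(agg):
    """Classify each finger as 'gripping', 'touching', or 'hovering' (assumed labels)."""
    estimate = {}
    for finger, values in agg.items():
        if values.get("force", 0.0) > 0.5:
            estimate[finger] = "gripping"
        elif values.get("touch", 0.0) > 0.5:
            estimate[finger] = "touching"
        else:
            estimate[finger] = "hovering"
    return estimate

readings = [
    {"finger": "index", "kind": "touch", "value": 0.9},
    {"finger": "index", "kind": "force", "value": 0.1},
    {"finger": "middle", "kind": "hover", "value": 0.7},
]
print(estimate_positions(aggregate(readings)))
```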
7. EP3400502A1
Publication date: 2018-11-14
Application number: 16829453.6
Filing date: 2016-12-16
Assignee: Google LLC
8. EP3403165A1
Publication date: 2018-11-21
Application number: 17724975.2
Filing date: 2017-05-04
Assignee: Google LLC
9. US2017139567A1
Publication date: 2017-05-18
Application number: 13/935,125
Filing date: 2013-07-03
Inventor: Li, Chun Yat Frank  
Assignee: Google Inc.
Abstract: Embodiments described herein may help to provide a lock-screen for a computing device. An example method involves: (a) displaying two or more rows of characters and an input region that is moveable over the rows of characters, (b) based on head-movement data, determining movement of the input region with respect to the rows of characters, (c) determining an input sequence, where the sequence includes one character from each of the rows of characters that is selected based at least in part on the one or more movements of the input region with respect to the rows of characters, (d) determining whether or not the input sequence matches a predetermined unlock sequence, and (e) if the input sequence matches the predetermined unlock sequence, then unlocking the computing device.
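The unlock check itself reduces to comparing the characters selected in each row against a stored sequence; the toy sketch below assumes the head-tracking layer has already resolved the input region to a column index per row, and the rows and unlock sequence are invented.

```python
# Toy sketch of the lock-screen check described above: one character is chosen
# from each row by the head-controlled input region, and the resulting sequence
# is compared to a stored unlock sequence.

ROWS = ["ABCD", "EFGH", "IJKL"]
UNLOCK_SEQUENCE = "AFK"  # assumed, one character per row

def selected_characters(region_columns):
    """Pick the character under the input region in each row."""
    return "".join(row[col] for row, col in zip(ROWS, region_columns))

def try_unlock(region_columns) -> bool:
    return selected_characters(region_columns) == UNLOCK_SEQUENCE

print(try_unlock([0, 1, 2]))  # A, F, K -> True
print(try_unlock([1, 1, 2]))  # B, F, K -> False
```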
10. US2017329419A1
Publication date: 2017-11-16
Application number: 15/379,770
Filing date: 2016-12-15
Assignee: Google Inc.
Abstract: In a system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment, a virtual display of virtual items and/or features may be adjusted in response to movement of the gyromouse combined with touch inputs, or touch-and-drag inputs, received on a touch surface of the gyromouse. Use of the gyromouse in the AR/VR environment may allow touch screen capabilities to be accurately projected into a three-dimensional virtual space, providing a controller having improved functionality and utility in the AR/VR environment, and enhancing the user's experience.
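One way to picture the combination of gyromouse orientation and touch-and-drag input is as a pointing direction that places a virtual item, nudged by the drag offset; the projection model, scale factors, and sample values below are assumptions, not details from the patent.

```python
# Rough sketch of combining gyromouse orientation with a touch-and-drag input to
# position a virtual item. All numbers and the yaw/pitch model are assumed.

import math

def place_item(yaw_deg, pitch_deg, drag, distance=2.0, drag_scale=0.01):
    """Project the pointing direction to a point at `distance`, then offset by the drag."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Direction the gyromouse is pointing (simple yaw/pitch model).
    x = distance * math.cos(pitch) * math.sin(yaw)
    y = distance * math.sin(pitch)
    z = -distance * math.cos(pitch) * math.cos(yaw)
    # Touch-and-drag nudges the item within the plane facing the user.
    dx, dy = drag
    return (x + dx * drag_scale, y + dy * drag_scale, z)

# Pointing slightly right and up, then dragging 30 px right and 10 px down.
print(place_item(15.0, 10.0, (30.0, -10.0)))
```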
11. EP3072010A4
Publication date: 2017-07-12
Application number: 14864260
Filing date: 2014-10-30
Assignee: Google Inc.
12. US9804682B2
Publication date: 2017-10-31
Application number: 15/012,426
Filing date: 2016-02-01
Assignee: Google Inc.
Abstract: Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to a second input action(s) on the second input interface, and in response, cause the camera to perform a second operation.
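The control-system behaviour described in item (d) can be sketched as a small dispatcher that routes events from the two input interfaces to different camera operations; the event names and camera methods below are hypothetical.

```python
# Simplified sketch of the control flow in this abstract: input events from two
# interfaces on an HMD housing are dispatched to different camera operations.

class Camera:
    def zoom(self, amount: float) -> str:
        return f"zoom by {amount}"
    def capture(self) -> str:
        return "capture image"

class ControlSystem:
    def __init__(self, camera: Camera):
        self.camera = camera

    def handle(self, interface: str, action: str, value: float = 0.0) -> str:
        # First interface (e.g. superior surface): continuous gestures -> zoom.
        if interface == "first" and action == "swipe":
            return self.camera.zoom(value)
        # Second interface (opposite surface): discrete taps -> capture.
        if interface == "second" and action == "tap":
            return self.camera.capture()
        return "ignored"

cs = ControlSystem(Camera())
print(cs.handle("first", "swipe", 0.25))
print(cs.handle("second", "tap"))
```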
13. WO2017196395A1
Publication date: 2017-11-16
Application number: 2016067119
Filing date: 2016-12-16
Assignee: Google LLC
Abstract: A system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment
14. US2017330387A1
Publication date: 2017-11-16
Application number: 15/594,186
Filing date: 2017-05-12
Assignee: Google Inc.
Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and, responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
15. WO2017192177A1
Publication date: 2017-11-09
Application number: 2016068257
Filing date: 2016-12-22
Assignee: Google LLC
Abstract: In a virtual reality system
16. US2017322623A1
Publication date: 2017-11-09
Application number: 15/386,594
Filing date: 2016-12-21
Assignee: Google Inc.
Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.