1. US10198874B2
Publication date: 2019-02-05
Application number: 15/594,186
Filing date: 2017-05-12
Assignee: Google LLC
Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system, responsive to detecting the first input, instructing a user to orient a handheld controller in a designated direction, detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
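The alignment method in this abstract is a simple two-step flow: a first input prompts the user to point the controller in a designated direction, and a second input captures the controller's orientation as alignment data. A minimal sketch of that flow, assuming hypothetical names (ControllerAligner, yaw_offset_deg) since the abstract specifies no API:

```python
# Hedged sketch of the two-step controller alignment flow described in the abstract.
# Class, method, and field names are illustrative assumptions, not the patented API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlignmentData:
    """Orientation offset captured while the controller points in the designated direction."""
    yaw_offset_deg: float

class ControllerAligner:
    def __init__(self, designated_yaw_deg: float = 0.0):
        self.designated_yaw_deg = designated_yaw_deg
        self.awaiting_confirmation = False
        self.alignment: Optional[AlignmentData] = None

    def on_button_press(self, current_yaw_deg: float) -> str:
        # First input: instruct the user to orient the controller in the designated direction.
        if not self.awaiting_confirmation:
            self.awaiting_confirmation = True
            return "Point the controller straight ahead, then press the button again."
        # Second input: store alignment data representing the controller's current orientation.
        self.alignment = AlignmentData(yaw_offset_deg=self.designated_yaw_deg - current_yaw_deg)
        self.awaiting_confirmation = False
        return f"Alignment stored (yaw offset {self.alignment.yaw_offset_deg:+.1f} deg)."

if __name__ == "__main__":
    aligner = ControllerAligner()
    print(aligner.on_button_press(current_yaw_deg=37.0))   # first input -> instruction
    print(aligner.on_button_press(current_yaw_deg=35.5))   # second input -> stores offset
```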
2. EP3403165B1
Publication date: 2019-04-10
Application number: 17724975.2
Filing date: 2017-05-04
Assignee: Google LLC
3. US20190139323A1
Publication date: 2019-05-09
Application number: 16/240,176
Filing date: 2019-01-04
Assignee: Google LLC
Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system, responsive to detecting the first input, instructing a user to orient a handheld controller in a designated direction, detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
4. US10275023B2
Publication date: 2019-04-30
Application number: 15/386,594
Filing date: 2016-12-21
Assignee: GOOGLE LLC
Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
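The gaze-directed keyboard input described above amounts to mapping a tracked gaze point onto keys and collapsing a swipe into text. A rough sketch under assumed details (a 3-row QWERTY layout and normalized gaze coordinates), not the patented implementation:

```python
# Hedged sketch: translating a gaze swipe over a virtual keyboard into text input.
# The key layout, coordinate normalization, and de-duplication rule are assumptions.
KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def gaze_to_key(gaze_x: float, gaze_y: float) -> str | None:
    """Map a normalized gaze point (0..1 over the keyboard) to a key, or None if off-keyboard."""
    if not (0.0 <= gaze_x < 1.0 and 0.0 <= gaze_y < 1.0):
        return None
    row = KEY_ROWS[int(gaze_y * len(KEY_ROWS))]
    return row[int(gaze_x * len(row))]

def swipe_to_text(gaze_path: list[tuple[float, float]]) -> str:
    """Collapse a gaze swipe (sequence of samples) into text, dropping repeats and misses."""
    text = []
    for x, y in gaze_path:
        key = gaze_to_key(x, y)
        if key is not None and (not text or text[-1] != key):
            text.append(key)
    return "".join(text)

if __name__ == "__main__":
    # A short simulated swipe drifting across the top row of keys.
    path = [(0.05, 0.1), (0.06, 0.1), (0.25, 0.1), (0.45, 0.15)]
    print(swipe_to_text(path))  # prints "qet"
```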
5. EP3455701A1
Publication date: 2019-03-20
Application number: 17725484.4
Filing date: 2017-05-15
Assignee: Google LLC
6. US20190155439A1
Publication date: 2019-05-23
Application number: 16/111,702
Filing date: 2018-08-24
Assignee: Google LLC
Abstract: A method that includes employing several sensors associated with a handheld controller, where each of the sensors is made of one of a hover, touch, and force/pressure sensor, and generating, by one or more of the sensors, sensor data associated with the position of a user's hand and finger in relation to the handheld controller. The method continues with combining the sensor data from several sensors to form aggregate sensor data, sending the aggregate sensor data to a processor, and generating an estimated position of the user's hand and fingers based on the aggregate sensor data.
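The aggregation step in this abstract can be illustrated with a small sketch. The sensor kinds (hover, touch, force) match the abstract, but the normalization weights and the 0..1 "proximity" scale are invented for illustration:

```python
# Hedged sketch: combining hover/touch/force readings into one finger-proximity estimate.
# Sensor names, weights, and the proximity scale are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    kind: str      # "hover", "touch", or "force"
    value: float   # hover distance (mm), touch contact (0/1), or force (N)

def to_proximity(reading: SensorReading) -> float:
    """Normalize heterogeneous readings onto a common 0 (far) .. 1 (pressed) scale."""
    if reading.kind == "hover":
        return max(0.0, 1.0 - reading.value / 30.0) * 0.5    # hovering tops out at "near"
    if reading.kind == "touch":
        return 0.75 if reading.value else 0.0
    if reading.kind == "force":
        return min(1.0, 0.75 + reading.value / 10.0 * 0.25)  # force pushes toward "pressed"
    raise ValueError(f"unknown sensor kind: {reading.kind}")

def estimate_finger_state(readings: list[SensorReading]) -> float:
    """Combine all sensors covering one finger into a single aggregate proximity value."""
    proximities = [to_proximity(r) for r in readings]
    return max(proximities) if proximities else 0.0

if __name__ == "__main__":
    index_finger = [
        SensorReading("hover_1", "hover", 12.0),
        SensorReading("touch_1", "touch", 1.0),
        SensorReading("force_1", "force", 2.5),
    ]
    print(f"index finger proximity: {estimate_finger_state(index_finger):.2f}")
```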
7. EP3400502A1
Publication date: 2018-11-14
Application number: 16829453.6
Filing date: 2016-12-16
Assignee: Google LLC
8. EP3403165A1
Publication date: 2018-11-21
Application number: 17724975.2
Filing date: 2017-05-04
Assignee: Google LLC
9. US20170139567A1
Publication date: 2017-05-18
Application number: 13/935,125
Filing date: 2013-07-03
Inventor: Li, Chun Yat Frank  
Assignee: Google Inc.
Abstract: Embodiments described herein may help to provide a lock-screen for a computing device. An example method involves: (a) displaying two or more rows of characters and an input region that is moveable over the rows of characters, (b) based on head-movement data, determining movement of the input region with respect to the rows of characters, (c) determining an input sequence, where the sequence includes one character from each of the rows of characters that is selected based at least in part on the one or more movements of the input region with respect to the rows of characters, (d) determining whether or not the input sequence matches a predetermined unlock sequence, and (e) if the input sequence matches the predetermined unlock sequence, then unlocking the computing device.
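The lock-screen flow above selects one character per row as a head-driven input region crosses the rows, then compares the result against a stored unlock sequence. A speculative sketch; the digit rows, the yaw-to-column mapping, and the unlock code are all assumptions:

```python
# Hedged sketch of the head-movement lock-screen flow described in the abstract.
# Row contents, the yaw range, and the stored unlock sequence are illustrative.
ROWS = ["0123456789", "0123456789", "0123456789"]
UNLOCK_SEQUENCE = "427"  # hypothetical stored unlock code

def column_from_yaw(yaw_deg: float, n_columns: int) -> int:
    """Map head yaw in [-30, +30] degrees onto a column index for the current row."""
    clamped = max(-30.0, min(30.0, yaw_deg))
    return min(n_columns - 1, int((clamped + 30.0) / 60.0 * n_columns))

def read_input_sequence(yaw_per_row: list[float]) -> str:
    """Pick one character from each row based on the head yaw recorded while on that row."""
    return "".join(
        row[column_from_yaw(yaw, len(row))] for row, yaw in zip(ROWS, yaw_per_row)
    )

def try_unlock(yaw_per_row: list[float]) -> bool:
    """Unlock only if the head-selected sequence matches the predetermined unlock sequence."""
    return read_input_sequence(yaw_per_row) == UNLOCK_SEQUENCE

if __name__ == "__main__":
    # Head yaw sampled as the input region crosses each of the three rows.
    print(try_unlock([-6.0, -18.0, 12.0]))  # True: selects "4", "2", "7"
```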
10. WO2017196395A1
Publication date: 2017-11-16
Application number: 2016067119
Filing date: 2016-12-16
Assignee: GOOGLE LLC
Abstract: A system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment
11. US9804682B2
Publication date: 2017-10-31
Application number: 15/012,426
Filing date: 2016-02-01
Assignee: Google Inc.
Abstract: Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to a second input action(s) on the second input interface, and in response, cause the camera to perform a second operation.
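A hedged sketch of the dual-interface dispatch this abstract describes: input from two opposing surfaces of the HMD housing is routed to different camera operations. The specific gestures, the zoom and shutter actions, and the Camera stub are illustrative, not taken from the patent:

```python
# Hedged sketch: routing input from two opposing touch interfaces to camera operations.
# Event names and camera behavior are assumptions made for illustration only.
class Camera:
    def __init__(self):
        self.zoom = 1.0
        self.captured = 0

    def zoom_by(self, factor: float):
        self.zoom = max(1.0, self.zoom * factor)

    def capture(self):
        self.captured += 1

def handle_input(camera: Camera, interface: str, action: str):
    """Dispatch an input action based on which surface of the housing produced it."""
    if interface == "top":          # first input interface (e.g. superior surface)
        if action == "swipe_forward":
            camera.zoom_by(1.25)
        elif action == "swipe_back":
            camera.zoom_by(0.8)
    elif interface == "bottom":     # second input interface on the opposite surface
        if action == "tap":
            camera.capture()

if __name__ == "__main__":
    cam = Camera()
    handle_input(cam, "top", "swipe_forward")
    handle_input(cam, "bottom", "tap")
    print(cam.zoom, cam.captured)   # 1.25 1
```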
12. EP3072010A4
Publication date: 2017-07-12
Application number: 14864260
Filing date: 2014-10-30
Assignee: Google Inc.
13. US20170329419A1
Publication date: 2017-11-16
Application number: 15/379,770
Filing date: 2016-12-15
Assignee: Google Inc.
Abstract: A system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment, a virtual display of virtual items and/or features may be adjusted in response to movement of the gyromouse combined with touch inputs, or touch and drag inputs, received on a touch surface of the gyromouse. Use of the gyromouse in the AR/VR environment may allow touch screen capabilities to be accurately projected into a three dimensional virtual space, providing a controller having improved functionality and utility in the AR/VR environment, and enhancing the user's experience.
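The gyromouse-plus-touch interaction described here combines where the controller points with what the touch surface reports. A speculative sketch with invented panel geometry and drag-to-scroll gain:

```python
# Hedged sketch: a touch-surface drag scrolls whichever virtual panel the gyromouse
# points at. Panel geometry, the scroll gain, and the event shape are assumptions.
from dataclasses import dataclass

@dataclass
class Panel:
    yaw_deg: float          # direction of the panel center relative to the user
    half_width_deg: float   # angular half-width of the panel
    scroll_px: float = 0.0

def panel_under_pointer(panels: list[Panel], controller_yaw_deg: float) -> Panel | None:
    """Return the panel the gyromouse ray currently points at, if any."""
    for panel in panels:
        if abs(controller_yaw_deg - panel.yaw_deg) <= panel.half_width_deg:
            return panel
    return None

def on_touch_drag(panels: list[Panel], controller_yaw_deg: float, drag_dy: float):
    """Apply a touch-surface drag as scrolling on whichever panel the controller points at."""
    target = panel_under_pointer(panels, controller_yaw_deg)
    if target is not None:
        target.scroll_px += drag_dy * 2.0   # assumed drag-to-scroll gain

if __name__ == "__main__":
    panels = [Panel(yaw_deg=0.0, half_width_deg=20.0), Panel(yaw_deg=60.0, half_width_deg=20.0)]
    on_touch_drag(panels, controller_yaw_deg=5.0, drag_dy=40.0)   # scrolls the front panel
    print([p.scroll_px for p in panels])   # [80.0, 0.0]
```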
14. US20170322623A1
Publication date: 2017-11-09
Application number: 15/386,594
Filing date: 2016-12-21
Assignee: Google Inc.
Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
15. WO2017197381A1
Publication date: 2017-11-16
Application number: 2017032635
Filing date: 2017-05-15
Assignee: GOOGLE LLC
Abstract: Systems
16. WO2017200755A1
Publication date: 2017-11-23
Application number: 2017031036
Filing date: 2017-05-04
Assignee: GOOGLE LLC
Abstract: A method for aligning an image on a mobile device disposed within a head-mounted display (HMD) housing includes: detecting a request to align an image on a touchscreen of a mobile device; detecting
17. US20170336915A1
Publication date: 2017-11-23
Application number: 15/585,905
Filing date: 2017-05-03