http://rdf.ncbi.nlm.nih.gov/pubchem/patent/AU-2019240661-B2

Outgoing Links

Predicate Object
assignee http://rdf.ncbi.nlm.nih.gov/pubchem/patentassignee/MD5_80c7fd5d310c64914d28ef706b87d5e8
classificationCPCAdditional http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G09G2370-20
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T2219-024
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B2027-0127
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G09G2370-02
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B2027-0138
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B2027-0178
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B2027-014
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T2207-30201
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B2027-0187
classificationCPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06F3-013
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G09G5-006
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B27-0093
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T7-73
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T1-20
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06F3-016
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06F3-017
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06V10-28
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/H04L67-10
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B27-017
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G02B27-0172
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/H04L67-131
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06V10-40
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T19-006
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/A61B3-00
classificationIPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G02B27-01
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/A61B3-10
filingDate 2019-10-03-04:00^^<http://www.w3.org/2001/XMLSchema#date>
grantDate 2020-04-30-04:00^^<http://www.w3.org/2001/XMLSchema#date>
inventor http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_598bb182aaf8476d1cc778da252ca14a
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_fb404cead745879404308e233a1553c7
publicationDate 2020-04-30-04:00^^<http://www.w3.org/2001/XMLSchema#date>
publicationNumber AU-2019240661-B2
titleOfInvention System and method for augmented and virtual reality
abstract Disclosed is a method, comprising: sensing a physical object at a first location using at least one or more outward facing cameras of a head-mounted user display device and recognizing a type of the physical object sensed at the first location by the head-mounted user display device by: capturing one or more field-of-view images, extracting one or more sets of points from the one or more field-of-view images, extracting one or more fiducials for at least one physical object in the one or more field-of-view images based on at least some of the one or more sets of points, processing at least some of the one or more fiducials for the at least one physical object to identify the type of the physical object sensed at the first location, wherein processing at least some of the one or more fiducials comprises comparing the one or more fiducials to sets of previously stored fiducials; associating a virtual object with the sensed physical object based on the type of the physical object as a result of both recognizing the type of the sensed physical object at the first location by the head-mounted user display device using the at least one or more outward facing cameras and identifying a predetermined relationship between the virtual object and the type of the sensed physical object recognized at the first location by the head-mounted user display device; receiving virtual world data representing a virtual world, the virtual world data including at least data corresponding to manipulation of the virtual object in the virtual world by a first user at the first location; transmitting at least the virtual world data corresponding to manipulation of the virtual object by the first user at the first location to a head-mounted user display device, wherein the head-mounted user display device renders a display image associated with at least a portion of the virtual world data including at least the virtual object to the first user based on at least an estimated depth of focus of a first user's eyes; creating additional virtual world data originating from the manipulation of the virtual object by the first user at the first location; and transmitting the additional virtual world data to a second user at a second location different from the first location for presentation to the second user, such that the second user experiences the additional virtual world data from the second location.
priorityDate 2013-03-11-04:00^^<http://www.w3.org/2001/XMLSchema#date>
type http://data.epo.org/linked-data/def/patent/Publication

Incoming Links

Predicate Subject
isDiscussedBy http://rdf.ncbi.nlm.nih.gov/pubchem/substance/SID448910669
http://rdf.ncbi.nlm.nih.gov/pubchem/compound/CID181520

Total number of triples: 39.
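The reported total can be checked against the two tables above. The following is a minimal sketch in Python; the per-predicate multiplicities are transcribed by hand from the listings (no live access to the PubChem RDF service is assumed):

```python
# Per-predicate link counts transcribed from the Outgoing Links table above.
outgoing = {
    "assignee": 1,
    "classificationCPCAdditional": 9,
    "classificationCPCInventive": 15,
    "classificationIPCInventive": 2,
    "filingDate": 1,
    "grantDate": 1,
    "inventor": 2,
    "publicationDate": 1,
    "publicationNumber": 1,
    "titleOfInvention": 1,
    "abstract": 1,
    "priorityDate": 1,
    "type": 1,
}
# Per-predicate counts from the Incoming Links table above.
incoming = {"isDiscussedBy": 2}

total = sum(outgoing.values()) + sum(incoming.values())
print(total)  # 39, matching the total reported at the bottom of the page
```

Summing gives 37 outgoing triples plus 2 incoming, agreeing with the stated total of 39.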