http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-114049542-A

Outgoing Links

Predicate Object
assignee http://rdf.ncbi.nlm.nih.gov/pubchem/patentassignee/MD5_8c7c582c58ebb902c3111171ec32c641
classificationCPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G01C3-00
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G01S17-931
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G01S17-89
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06F18-253
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T7-11
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T7-30
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T7-136
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G01S17-86
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06T5-002
classificationIPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G01S17-931
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06T7-136
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06V10-80
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G01C3-00
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06T5-00
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G01S17-89
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06T7-11
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G01S17-86
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06T7-30
filingDate 2021-10-27-04:00^^<http://www.w3.org/2001/XMLSchema#date>
inventor http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_3cdf5dec13fba46d376f135dc510c362
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_eb3cb12cf06aff2585d13b98b393a940
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_fcbb8a6d58a091f3dc59a4a67942726c
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_9c298e5730f93c55c3e8e4a8ece73cd2
publicationDate 2022-02-15-04:00^^<http://www.w3.org/2001/XMLSchema#date>
publicationNumber CN-114049542-A
titleOfInvention A multi-sensor fusion localization method in dynamic scenes
abstract The invention relates to a multi-sensor fusion positioning method for dynamic scenes. The method includes: S1, acquiring a dynamic event point cloud collected by a dynamic vision sensor (DVS) camera and environmental point cloud data collected by a laser radar (LiDAR); S2, processing the dynamic event point cloud and filtering out noise events to obtain a dynamic object image; S3, identifying the dynamic objects in the dynamic object image and framing the dynamic object regions; S4, mapping the dynamic object regions onto the environmental point cloud data and removing the dynamic object points to obtain a static environment point cloud; S5, registering the static environment point cloud against map features to obtain positioning information. Compared with the prior art, the invention greatly improves positioning accuracy and robustness.
priorityDate 2021-10-27-04:00^^<http://www.w3.org/2001/XMLSchema#date>
type http://data.epo.org/linked-data/def/patent/Publication
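The abstract above outlines a five-step pipeline (S1-S5). As a rough illustration only, here is a minimal Python sketch of steps S2-S5 on synthetic data. The density-based noise filter, the single axis-aligned bounding box, and the one-shot Kabsch alignment are all simplifying assumptions for the sketch, not the patent's actual algorithm, and every function name here is hypothetical.

```python
# Illustrative sketch of the abstract's S2-S5 steps on synthetic data.
# The concrete choices below (density filter, one bounding box,
# single-step Kabsch alignment) are assumptions, not the patent's method.
import numpy as np
from scipy.spatial import cKDTree

def filter_noise_events(events, radius=0.5, min_neighbors=3):
    """S2: keep events with enough spatio-temporal neighbors;
    isolated events are treated as sensor noise."""
    counts = cKDTree(events).query_ball_point(events, r=radius,
                                              return_length=True)
    return events[counts > min_neighbors]

def frame_dynamic_region(events):
    """S3: frame the dynamic object as an axis-aligned (x, y) box."""
    return events[:, :2].min(axis=0), events[:, :2].max(axis=0)

def remove_dynamic_points(cloud, box_min, box_max):
    """S4: drop LiDAR points whose (x, y) projection falls inside the
    box, leaving a static environment point cloud."""
    inside = np.all((cloud[:, :2] >= box_min) & (cloud[:, :2] <= box_max),
                    axis=1)
    return cloud[~inside]

def register(static_cloud, map_cloud):
    """S5: one rigid Kabsch alignment against nearest map neighbors
    (a real system would iterate this step, i.e. run ICP)."""
    nn = map_cloud[cKDTree(map_cloud).query(static_cloud)[1]]
    sc, mc = static_cloud.mean(axis=0), nn.mean(axis=0)
    U, _, Vt = np.linalg.svd((static_cloud - sc).T @ (nn - mc))
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mc - R @ sc
    return R, t  # rigid pose of the scan relative to the map

rng = np.random.default_rng(0)
map_cloud = rng.uniform(-10.0, 10.0, (500, 3))        # prior map features
lidar = map_cloud + rng.normal(0.0, 0.02, (500, 3))   # S1: LiDAR scan
mover = rng.uniform(2.0, 3.0, (80, 3))                # a dynamic object
events = np.vstack([mover, rng.uniform(-10, 10, (20, 3))])  # S1: DVS events

events = filter_noise_events(events)
box_min, box_max = frame_dynamic_region(events)
static = remove_dynamic_points(np.vstack([lidar, mover]), box_min, box_max)
R, t = register(static, map_cloud)
print("rotation:\n", R, "\ntranslation:", t)
```

With the synthetic inputs above, the recovered rotation is near identity and the translation near zero, since the "scan" is just the map plus noise; the point of the sketch is only to show how removing the dynamic object's points before registration keeps a moving obstacle from corrupting the pose estimate.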

Incoming Links

Predicate Subject
isDiscussedBy http://rdf.ncbi.nlm.nih.gov/pubchem/compound/CID60825
http://rdf.ncbi.nlm.nih.gov/pubchem/substance/SID419582621
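The links listed above can also be retrieved programmatically. Below is a minimal sketch using rdflib, assuming this PubChemRDF URI serves a machine-readable RDF description via content negotiation and that the served description includes the incoming isDiscussedBy triples shown here; if it does not, those would have to come from a SPARQL endpoint instead.

```python
# Minimal sketch: fetch this patent's RDF description and list its
# outgoing and incoming links. Assumes the server returns an RDF
# serialization that rdflib can parse via content negotiation.
from rdflib import Graph, URIRef

PATENT = URIRef("http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-114049542-A")

g = Graph()
g.parse(str(PATENT))  # rdflib picks a parser from the response type

# Outgoing links: triples with the patent as subject.
for pred, obj in g.predicate_objects(subject=PATENT):
    print("OUT", pred, obj)

# Incoming links: triples with the patent as object.
for subj, pred in g.subject_predicates(object=PATENT):
    print("IN", subj, pred)

print("total triples:", len(g))
```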

Total number of triples: 32.