http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-112597967-A

Outgoing Links

Predicate Object
assignee http://rdf.ncbi.nlm.nih.gov/pubchem/patentassignee/MD5_ec3a5f434b2f386c8e7bec3fa7a6b787
classificationCPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06V40-168
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06N3-08
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06N3-045
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06F18-2415
http://rdf.ncbi.nlm.nih.gov/pubchem/patentcpc/G06V40-174
classificationIPCInventive http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06K9-00
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06N3-08
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06N3-04
http://rdf.ncbi.nlm.nih.gov/pubchem/patentipc/G06K9-62
filingDate 2021-01-05-04:00^^<http://www.w3.org/2001/XMLSchema#date>
inventor http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_e182e99ccc4bdad3c321021d0ac6f020
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_37026a8087765ac4d6dfeec9a5df3597
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_a43ee4967398e7bcbab6e6c4aa166047
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_47c112666ef755db72fe24d9761bacef
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_fde3be3741e5a194b4afabf8a642f879
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_a31f878db5a4647933f4a211fa8ed1ca
http://rdf.ncbi.nlm.nih.gov/pubchem/patentinventor/MD5_77eddd3315550d023f274cd1d2d1480d
publicationDate 2021-04-02-04:00^^<http://www.w3.org/2001/XMLSchema#date>
publicationNumber CN-112597967-A
titleOfInvention Emotion recognition method and device based on an immersive virtual environment and multimodal physiological signals
abstract The present invention provides an emotion recognition method and device based on an immersive virtual environment and multi-modal physiological signals. The method includes: step 1, conducting an emotion induction experiment in an experimental environment; step 2, seating a trainee on the smart mobile device in the experimental environment and having the trainee wear a virtual reality device to view pictures selected from the International Affective Picture System, collecting the trainee's physiological signals, and using the SAM (Self-Assessment Manikin) to quantitatively score the pictures; step 3, mapping the quantitative scores onto the valence-arousal two-dimensional emotion model to determine the emotion type corresponding to each score; step 4, inputting the trainee's physiological signals, with the emotion types as labels, into a convolutional neural network recognition model and training it to obtain an emotion model; step 5, acquiring a tester's physiological signals and importing them into the emotion model to identify the tester's emotion category.
isCitedBy http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-114640699-A
http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-114403877-A
http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-113470787-A
http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-115376695-A
priorityDate 2021-01-05-04:00^^<http://www.w3.org/2001/XMLSchema#date>
type http://data.epo.org/linked-data/def/patent/Publication
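Step 3 of the abstract maps the SAM quantitative scores onto the valence-arousal two-dimensional emotion model. A minimal Python sketch of that quadrant mapping follows; the midpoint threshold and emotion labels are illustrative assumptions (SAM ratings are commonly collected on a 1-9 scale with 5 as the neutral midpoint), not values taken from the patent.

def emotion_from_sam(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Map a SAM (valence, arousal) score pair to a quadrant of the
    valence-arousal emotion model. Labels are illustrative assumptions."""
    if valence >= midpoint and arousal >= midpoint:
        return "happy/excited"   # high valence, high arousal
    if valence >= midpoint:
        return "calm/content"    # high valence, low arousal
    if arousal >= midpoint:
        return "angry/afraid"    # low valence, high arousal
    return "sad/bored"           # low valence, low arousal

print(emotion_from_sam(7.2, 6.5))  # -> happy/excited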

Incoming Links

Predicate Subject
isDiscussedBy http://rdf.ncbi.nlm.nih.gov/pubchem/compound/CID977
http://rdf.ncbi.nlm.nih.gov/pubchem/substance/SID419523291

Total number of triples: 30.
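The outgoing and incoming links above are RDF triples with this patent URI as subject or object, respectively. Below is a minimal rdflib sketch for listing them, assuming the PubChem RDF service returns an RDF serialization for this URI via content negotiation (otherwise, parse a downloaded PubChem RDF dump instead).

from rdflib import Graph, URIRef

patent = URIRef("http://rdf.ncbi.nlm.nih.gov/pubchem/patent/CN-112597967-A")
g = Graph()
g.parse(str(patent))  # fetch and parse the resource description

# Outgoing links: triples with the patent as subject.
for p, o in g.predicate_objects(subject=patent):
    print("OUT", p, o)

# Incoming links (e.g., isDiscussedBy): triples with the patent as object.
# A single-resource fetch may include only subject-centric triples, so
# incoming links generally require the full dataset or a SPARQL endpoint.
for s, p in g.subject_predicates(object=patent):
    print("IN", s, p)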