This HTML5 document contains 30 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
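
For example, any Microdata-aware extractor can recover these statements from the page. The snippet below is a minimal Python sketch using the extruct and requests libraries (both assumptions, not stated by this record); the URL is the n14 IRI listed in the prefix table below.

    # Minimal sketch: pull the embedded Microdata items out of this page.
    # Assumes `pip install extruct requests`; the URL is the n14 IRI listed below.
    import extruct
    import requests

    url = "https://kar.kent.ac.uk/72659/"
    html = requests.get(url).text

    # Extract only the Microdata syntax; extruct returns a dict keyed by syntax.
    data = extruct.extract(html, base_url=url, syntaxes=["microdata"])
    for item in data["microdata"]:
        print(item.get("type"), sorted(item.get("properties", {})))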

Namespace Prefixes

Prefix    IRI
dcterms   http://purl.org/dc/terms/
n2        https://kar.kent.ac.uk/id/eprint/
n14       https://kar.kent.ac.uk/72659/
wdrs      http://www.w3.org/2007/05/powder-s#
dc        http://purl.org/dc/elements/1.1/
n7        http://purl.org/ontology/bibo/status/
rdfs      http://www.w3.org/2000/01/rdf-schema#
n17       doi:10.1109/
n12       https://demo.openlinksw.com/about/id/entity/https/raw.githubusercontent.com/annajordanous/CO644Files/main/
n10       http://eprints.org/ontology/
n18       https://kar.kent.ac.uk/id/event/
bibo      http://purl.org/ontology/bibo/
n15       https://kar.kent.ac.uk/id/org/
rdf       http://www.w3.org/1999/02/22-rdf-syntax-ns#
n6        https://kar.kent.ac.uk/id/eprint/72659#
owl       http://www.w3.org/2002/07/owl#
n9        https://kar.kent.ac.uk/id/document/
n19       https://kar.kent.ac.uk/id/
xsdh      http://www.w3.org/2001/XMLSchema#
n20       https://demo.openlinksw.com/about/id/entity/https/www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/
n4        https://kar.kent.ac.uk/id/person/
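
A prefixed name expands by concatenating the prefix IRI with the local part, so n2:72659 denotes https://kar.kent.ac.uk/id/eprint/72659. The short Python sketch below reproduces that expansion with rdflib's Namespace helper; only a few of the prefixes above are shown, and the variable names are illustrative.

    # Illustrative expansion of a few prefixed names used in the statements below.
    from rdflib import Namespace

    N2   = Namespace("https://kar.kent.ac.uk/id/eprint/")   # n2
    BIBO = Namespace("http://purl.org/ontology/bibo/")      # bibo
    DCT  = Namespace("http://purl.org/dc/terms/")           # dcterms

    print(N2["72659"])           # https://kar.kent.ac.uk/id/eprint/72659
    print(BIBO.AcademicArticle)  # http://purl.org/ontology/bibo/AcademicArticle
    print(DCT.title)             # http://purl.org/dc/terms/title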

Statements

Subject Item
n2:72659
rdf:type
bibo:Article, bibo:AcademicArticle, n10:ConferenceItemEPrint, n10:EPrint
rdfs:seeAlso
n14:
owl:sameAs
n17:ICASSP.2019.8682986
n10:hasAccepted
n9:3169627
n10:hasDocument
n9:3169766, n9:3169767, n9:3169768, n9:3169769, n9:3169627, n9:3169659
dc:hasVersion
n9:3169627
dcterms:title
Forked Recurrent Neural Network for Hand Gesture Classification Using Inertial Measurement Data
wdrs:describedby
n12:export_kar_RDFN3.n3, n20:export_kar_RDFN3.n3
dcterms:date
2019-04-17
dcterms:creator
n4:ext-maass@isip.uni-luebeck.de, n4:ext-koch@isip.uni-luebeck.de, n4:ext-fd54924686319c9763f2e79c9d236338, n4:ext-mertins@isip.uni-luebeck.de, n4:ext-h.phan@kent.ac.uk
bibo:status
n7:peerReviewed, n7:published
dcterms:publisher
n15:ext-af0a9a5baed87c407844a3f5db44597c
bibo:abstract
For many applications of hand gesture recognition, a delay-free, affordable, and mobile system relying on body signals is mandatory. Therefore, we propose an approach for hand gesture classification given signals of inertial measurement units (IMUs) that works with extremely short windows to avoid delays. With a simple recurrent neural network, the suitability of the sensor modalities of an IMU (accelerometer, gyroscope, magnetometer) is evaluated by providing data of only one modality. For the multi-modal data, a second network with mid-level fusion is proposed. Its forked architecture allows us to process data of each modality individually before carrying out a joint analysis for classification. Experiments on three databases reveal that, even when relying on a single modality, our proposed system significantly outperforms state-of-the-art systems. With the forked network, classification accuracy can be further improved by over 10% absolute compared to the best reported system, while causing only a fraction of the delay.
dcterms:isPartOf
n19:repository
bibo:authorList
n6:authors
bibo:presentedAt
n18:ext-5ff3da8a1c281cedb6c9b5128f72c918
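
Once loaded into an RDF store, these statements can be queried like any other graph. The sketch below uses rdflib and a local copy of the N3 export referenced by wdrs:describedby above; the filename is an assumption, and only a few of the properties listed above are selected.

    # Minimal sketch: query this record's statements with rdflib and SPARQL.
    # Assumes a local copy of the N3 export referenced via wdrs:describedby,
    # saved here as "export_kar_RDFN3.n3" (the filename is an assumption).
    from rdflib import Graph

    g = Graph()
    g.parse("export_kar_RDFN3.n3", format="n3")

    query = """
        PREFIX dcterms: <http://purl.org/dc/terms/>
        PREFIX bibo:    <http://purl.org/ontology/bibo/>

        SELECT ?title ?date ?status WHERE {
          <https://kar.kent.ac.uk/id/eprint/72659> dcterms:title ?title ;
                                                   dcterms:date  ?date ;
                                                   bibo:status   ?status .
        }
    """
    for row in g.query(query):
        print(row.title, row.date, row.status)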