Please use this identifier to cite or link to this item:
http://hdl.handle.net/10609/92454
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Igual, Laura | - |
dc.contributor.author | Lapedriza, Agata | - |
dc.contributor.author | Borràs, Ricard | - |
dc.contributor.other | Universitat de Barcelona (UB) | - |
dc.contributor.other | Universitat Autònoma de Barcelona (UAB) | - |
dc.contributor.other | Universitat Oberta de Catalunya (UOC) | - |
dc.date.accessioned | 2019-03-22T09:56:41Z | - |
dc.date.available | 2019-03-22T09:56:41Z | - |
dc.date.issued | 2013-01-02 | - |
dc.identifier.citation | Igual, L., Lapedriza, A. & Borràs, R. (2013). Robust gait-based gender classification using depth cameras. EURASIP Journal on Image and Video Processing, 2013(1). doi: 10.1186/1687-5281-2013-1 | - |
dc.identifier.issn | 1687-5176 | - |
dc.identifier.uri | http://hdl.handle.net/10609/92454 | - |
dc.description.abstract | This article presents a new approach for gait-based gender recognition using depth cameras that can run in real time. The main contribution of this study is a new fast feature extraction strategy that uses the 3D point cloud obtained from the frames in a gait cycle. For each frame, these points are aligned according to their centroid and grouped. After that, they are projected onto their PCA plane, obtaining a representation of the cycle particularly robust against view changes. Then, final discriminative features are computed by first making a histogram of the projected points and then applying linear discriminant analysis. To test the method we have used the DGait database, which is currently the only publicly available database for gait analysis that includes depth information. We have performed experiments on manually labeled cycles and over whole video sequences, and the results show that our method improves accuracy significantly compared with state-of-the-art systems that do not use depth information. Furthermore, our approach is insensitive to illumination changes, given that it discards the RGB information. This makes the method especially suitable for real applications, as illustrated in the last part of the experiments section. | en |
dc.language.iso | eng | - |
dc.publisher | EURASIP Journal on Image and Video Processing | - |
dc.relation.ispartof | EURASIP Journal on Image and Video Processing, 2013, 2013(1) | - |
dc.relation.uri | https://jivp-eurasipjournals.springeropen.com/articles/10.1186/1687-5281-2013-1 | - |
dc.rights | CC BY | - |
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/es/ | - |
dc.subject | linear discriminant analysis | en |
dc.subject | gait feature | en |
dc.subject | gait recognition | en |
dc.subject | depth camera | en |
dc.subject | análisis discriminante lineal | es |
dc.subject | característica de la marcha | es |
dc.subject | reconocimiento de la marcha | es |
dc.subject | cámara de profundidad | es |
dc.subject | anàlisi lineal discriminant | ca |
dc.subject | característica de la marxa | ca |
dc.subject | reconeixement de la marxa | ca |
dc.subject | càmera de profunditat | ca |
dc.subject.lcsh | Optical pattern recognition | en |
dc.title | Robust gait-based gender classification using depth cameras | - |
dc.type | info:eu-repo/semantics/article | - |
dc.subject.lemac | Reconeixement òptic de formes | ca |
dc.subject.lcshes | Reconocimiento óptico de formas | es |
dc.rights.accessRights | info:eu-repo/semantics/openAccess | - |
dc.identifier.doi | 10.1186/1687-5281-2013-1 | - |
dc.gir.id | AR/0000004290 | - |
dc.type.version | info:eu-repo/semantics/publishedVersion | - |
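The feature extraction pipeline described in the abstract (centroid alignment of each frame's 3D point cloud, projection onto the cycle's PCA plane, and a histogram of the projected points) can be sketched as follows. This is a minimal, hypothetical illustration of those three steps, not the authors' implementation; the function name, bin count, and the final LDA classification step (omitted here) are assumptions.

```python
import numpy as np

def gait_cycle_feature(frames, bins=16):
    """Hypothetical sketch of the pipeline in the abstract:
    1. centre each frame's 3D point cloud on its centroid,
    2. pool the cycle's points and project them onto their first
       two principal components (the "PCA plane"),
    3. histogram the projected points into a fixed-length feature.
    The paper's final LDA step would be trained on these features."""
    centred = [p - p.mean(axis=0) for p in frames]  # centroid alignment
    cloud = np.vstack(centred)                      # pool all cycle points
    # PCA via eigen-decomposition of the 3x3 covariance matrix
    cov = np.cov(cloud, rowvar=False)
    _, eigvecs = np.linalg.eigh(cov)                # eigenvalues ascending
    plane = eigvecs[:, -2:]                         # top-2 components
    proj = cloud @ plane                            # project onto PCA plane
    hist, _, _ = np.histogram2d(proj[:, 0], proj[:, 1], bins=bins)
    return (hist / hist.sum()).ravel()              # normalised feature vector

# Random point clouds standing in for depth-camera frames of one gait cycle
rng = np.random.default_rng(0)
frames = [rng.normal(size=(500, 3)) for _ in range(10)]
feat = gait_cycle_feature(frames)
print(feat.shape)  # (256,)
```

Projecting the pooled, centred points onto the PCA plane is what gives the representation its robustness to view changes: the dominant plane of the walking motion is recovered regardless of camera orientation.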
Appears in collections: | Articles científics |
Files in this item:
File | Description | Size | Format |
---|---|---|---|
robust.pdf | | 4.1 MB | Adobe PDF |
This item is licensed under a Creative Commons License.