Please use this identifier to cite or link to this item: http://hdl.handle.net/10609/137009
Title: End-to-end global to local convolutional neural network learning for hand pose recovery in depth data
Authors: Madadi, Meysam
Escalera, Sergio  
Baró, Xavier  
González, Jordi
Others: Universitat Oberta de Catalunya (UOC)
Citation: Madadi, M., et al.: End-to-end global to local convolutional neural network learning for hand pose recovery in depth data. IET Comput. Vis. 1-17 (2021). https://doi.org/10.1049/cvi2.12064
Abstract: Despite recent advances in 3-D pose estimation of human hands, thanks to the advent of convolutional neural networks (CNNs) and depth cameras, this task is still far from being solved in uncontrolled setups. This is mainly due to the highly non-linear dynamics of fingers and self-occlusions, which make hand model training a challenging task. In this study, a novel hierarchical tree-like structured CNN is exploited, in which branches are trained to become specialised in predefined subsets of hand joints called local poses. Further, local pose features, extracted from hierarchical CNN branches, are fused to learn higher-order dependencies among joints in the final pose by end-to-end training. The loss function is also defined to incorporate appearance and physical constraints on feasible hand motions and deformations. Finally, a non-rigid data augmentation approach is introduced to increase the amount of training depth data. Experimental results suggest that feeding a tree-shaped CNN, specialised in local poses, into a fusion network for modelling joints' correlations and dependencies helps to increase the precision of final estimations, showing competitive results on the NYU, MSRA, Hands17 and SyntheticHand datasets.
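The abstract outlines a global-to-local design: a shared trunk, per-subset "local pose" branches, and a fusion network trained end-to-end over the branch features. Below is a minimal PyTorch sketch of that idea; the layer sizes, the six-way joint split (five fingers plus palm), and all module names are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class LocalBranch(nn.Module):
    """Predicts one predefined subset of hand joints (a 'local pose')."""
    def __init__(self, in_ch, n_joints):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, n_joints * 3)  # 3-D joint coordinates

    def forward(self, x):
        f = self.features(x)
        return f, self.head(f)

class GlobalToLocalNet(nn.Module):
    # joint_subsets is an assumed split: 4 joints per finger + 1 palm joint
    def __init__(self, joint_subsets=(4, 4, 4, 4, 4, 1)):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.branches = nn.ModuleList(LocalBranch(32, n) for n in joint_subsets)
        total_joints = sum(joint_subsets)
        # Fusion network: learns higher-order dependencies among joints
        # from the concatenated local-pose features.
        self.fusion = nn.Sequential(
            nn.Linear(64 * len(joint_subsets), 256), nn.ReLU(),
            nn.Linear(256, total_joints * 3),
        )

    def forward(self, depth):  # depth: (B, 1, H, W) depth image
        x = self.trunk(depth)
        feats, local_poses = zip(*(b(x) for b in self.branches))
        global_pose = self.fusion(torch.cat(feats, dim=1))
        return local_poses, global_pose  # both can be supervised end-to-end

# Usage: local poses receive intermediate supervision; the fused
# global pose is the final estimate.
net = GlobalToLocalNet()
local, final = net(torch.randn(2, 1, 96, 96))
print(final.shape)  # torch.Size([2, 63]) -> 21 joints x 3 coordinates
```

The appearance and physical constraints mentioned in the abstract would enter as additional loss terms on `local_poses` and `global_pose`; they are omitted here for brevity.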
Keywords: computer vision
data acquisition
human computer interaction
learning (artificial intelligence)
pose estimation
DOI: https://doi.org/10.1049/cvi2.12064
Document type: info:eu-repo/semantics/article
Publication date: 12-Aug-2021
Publication license: http://creativecommons.org/licenses/by-nc/3.0/es/
Appears in collections: Articles científics
Articles