**Solid and Effective Upper Limb Segmentation in Egocentric Vision**

Monica Gruosso, Nicola Capece, Ugo Erra
Department of Mathematics, Computer Science and Economics, University of Basilicata, 85100 Potenza, Italy
monica.gruosso@unibas.it, nicola.capece@unibas.it, ugo.erra@unibas.it

![](teaser.png width="100%")
_Real-life photos captured in egocentric vision (top row) and segmentation overlay images (bottom row), obtained by overlaying the prediction of our best model, based on DeepLabv3+ with an Xception-65 network backbone, on the input. Accurate segmentation is achieved for both bare and clothed upper limbs under different lighting conditions, skin tones, occlusions, hand poses, user/camera movements, and indoor and outdoor scenarios._

Abstract
===============================================================================
Upper limb segmentation in egocentric vision is a challenging and nearly unexplored task that extends the well-known hand localization problem. It is crucial, for example, for a realistic representation of users' limbs in immersive and interactive environments, such as VR/MR applications designed for web browsers, which offer a general-purpose solution suitable for any device. Existing hand and arm segmentation approaches require a large amount of well-annotated data; consequently, different annotation techniques have been designed and several datasets created. Such datasets are often limited to synthetic and semi-synthetic data that do not include the whole limb and differ significantly from real data, leading to poor performance in many realistic cases. To overcome the limitations of existing approaches and the challenges inherent in both egocentric vision and segmentation, we trained several segmentation networks based on the state-of-the-art [`DeepLabv3+`](https://arxiv.org/pdf/1802.02611.pdf) model and collected a large-scale, comprehensive dataset. It consists of $46$ thousand real-life, well-labeled RGB images with a great variety of skin colors, occlusions, and lighting conditions, and contains both bare and clothed arms. In particular, we carefully selected the best data from existing datasets and added our EgoCam dataset, which includes new images with accurate labels. Finally, we extensively evaluated the trained networks in unconstrained real-world environments to find the best model configuration for this task, achieving promising results in diverse scenarios. The code, the collected egocentric upper limb segmentation dataset, and a video demo of our work will be available on the project page.
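To illustrate the pipeline behind the teaser images, the snippet below loads an RGB frame, runs a semantic segmentation network, and blends the predicted mask over the input to produce an overlay. This is a minimal sketch, not our released code: torchvision's DeepLabv3 (ResNet-101 backbone, requires torchvision >= 0.13) stands in for our trained DeepLabv3+/Xception-65 model, the VOC "person" class is used as a rough proxy for the upper limb label, and the file names are placeholders.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Python
import numpy as np
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet101

# Stand-in network: torchvision's DeepLabv3 with a ResNet-101 backbone,
# pretrained on COCO with VOC labels (NOT the paper's DeepLabv3+/Xception-65).
model = deeplabv3_resnet101(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

frame = Image.open("egocentric_frame.jpg").convert("RGB")  # hypothetical file name
with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))["out"]  # shape: 1 x 21 x H x W
mask = logits.argmax(dim=1).squeeze(0).numpy() == 15       # 15 = VOC "person" class

# Blend a flat color over the predicted region to obtain the overlay image.
overlay = np.array(frame, dtype=np.float32)
overlay[mask] = 0.5 * overlay[mask] + 0.5 * np.array([255.0, 0.0, 0.0])
Image.fromarray(overlay.astype(np.uint8)).save("segmentation_overlay.png")
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~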
Video
===============================================================================
![A video](EgoUpperLimbSeg_demo.mp4)

BibTeX
===============================================================================
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@INPROCEEDINGS{10.1145/3485444.3495179,
  title     = "Solid and Effective Upper Limb Segmentation in Egocentric Vision",
  author    = "Gruosso, Monica and Capece, Nicola and Erra, Ugo",
  booktitle = "The 26th International Conference on 3D Web Technology",
  year      = "2021",
  publisher = "Association for Computing Machinery",
  doi       = "10.1145/3485444.3495179",
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Resources
===============================================================================

| Download | Description |
|:--------:|:-----------------------------------------------------------------------------------------------:|
|          | Code |
|  email   | Our upper limb segmentation dataset |
|          | Official publication: 2021 ACM 26th International Conference on 3D Web Technology (Web3D 2021) |

For more information about the original EDSH and TEgO data, please visit the following pages:

- EDSH web page: http://www.cs.cmu.edu/~kkitani/datasets/
- TEgO web page: https://iamlabumd.github.io/tego/

Acknowledgments
===============================================================================
The authors thank NVIDIA's Academic Research Team for providing the Titan Xp cards under the Hardware Donation Program.