
An easy Hand Gesture Recognition System for XR-based collaborative purposes

Capece N.; Manfredi G.; MacEllaro V.
2022-01-01

Abstract

In recent years, collaborative tools have proliferated, partly to compensate for the physical distance imposed by the pandemic and partly to allow activities (work, entertainment, leisure, etc.) to be carried out without regard to geographical distance. In this scenario, it became necessary to go beyond classic remote meeting tools (e.g., video, audio, chat), which convey a reduced sense of presence. Extended Reality (XR) is an innovative, Computer Graphics (CG) based technology particularly suited to this purpose. Indeed, XR encompasses Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions that can transform how people interact by increasing the sense of presence. This last aspect depends not only on the visualization of virtual/real objects and scenes but also on the interaction with them. In this regard, human-computer interaction (HCI) techniques represent a possible solution. However, these techniques rely on specific devices, such as Head-Mounted Displays (HMDs), smart glasses, depth and tracking cameras, etc., whose cost limits access. For this reason, we propose a Hand Gesture Recognition (HGR) system that can be used in XR applications with a simple RGB camera. Ours is a deep-learning system based on MediaPipe, the state of the art (SOTA) for hand tracking from plain RGB images [1], [2].
Year: 2022
ISBN: 978-1-6654-8574-6
Files in this record:
File: 1570798751 final.pdf (authorized users only)
Type: Publisher's PDF
License: Publisher's version
Size: 6.59 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11563/166294
Note: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 4
  • Web of Science (ISI): 2