Developing a Sign Language Translator Using XR Hand Tracking
Hi,
We’re developing a solution for people who use sign language: essentially a translator for those who don’t know it. Communicating in sign language requires the hands, but we couldn’t find a hand-tracking option in the project (it seems only controller interactions are active).
We can currently do this with Unity’s XR Hands package, so support for that library (or any alternative you can provide) would be great. Our product needs the x, y, and z coordinates of the hand’s key points.
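For context, this is roughly how we read the joint coordinates with XR Hands today (a minimal sketch; it assumes an XR provider with hand tracking is already running, and the class name HandJointLogger is just illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: polls the right hand each frame and logs the
// index-fingertip position.
public class HandJointLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Grab the running XRHandSubsystem once it is available.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Each joint pose carries the x/y/z position (and rotation)
        // in session space; loop over XRHandJointID for all joints.
        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Index tip: {pose.position.x:F3}, {pose.position.y:F3}, {pose.position.z:F3}");
    }
}
```

The same pattern works for every key point we need: iterate the XRHandJointID values and collect each joint’s pose.position.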
We are currently evaluating when we can build this into the system. There’s no fixed date yet, but we definitely want to support hands.
Hand tracking is now implemented in Release 0.2.2, which is available for download. The setup is as follows:
In the hierarchy, you’ll find
Hope this helps!
Bob