Developing a Sign Language Translator Using XR Hand Tracking


  • March 12, 2025 at 10:34 am #1041

    Hi,

    We’re developing a solution for people who use sign language — essentially a translator for those who don’t know it. To communicate in sign language, a user needs to use their hands, but we couldn’t find this option in the project (it seems that only controller interactions are active).
    Currently we can do this via Unity’s XR Hands package, so it would be nice to use a similar library or any alternative you can provide. We need the x, y, and z data of the hand’s key points for our product.
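    For reference, this is roughly what we do today with the XR Hands package. A minimal sketch, assuming an `XRHandSubsystem` is running and a hand-tracking provider is active; the class and joint names come from `com.unity.xr.hands`, and `HandJointLogger` is just our example name:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Sketch: read per-joint x/y/z positions from Unity's XR Hands package.
    public class HandJointLogger : MonoBehaviour
    {
        XRHandSubsystem m_Subsystem;

        void Update()
        {
            // Lazily find a running hand subsystem (provided by the XR plugin).
            if (m_Subsystem == null || !m_Subsystem.running)
            {
                var subsystems = new List<XRHandSubsystem>();
                SubsystemManager.GetSubsystems(subsystems);
                if (subsystems.Count > 0)
                    m_Subsystem = subsystems[0];
                return;
            }

            var hand = m_Subsystem.rightHand;
            if (!hand.isTracked)
                return;

            // Each joint pose carries the position (and rotation) of a
            // key point, here the index fingertip, in XR origin space.
            var joint = hand.GetJoint(XRHandJointID.IndexTip);
            if (joint.TryGetPose(out Pose pose))
                Debug.Log($"Index tip: {pose.position.x}, {pose.position.y}, {pose.position.z}");
        }
    }
    ```

    Looping over all values of `XRHandJointID` in the same way gives the full set of key points we feed into the translator.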

    • This topic was modified 1 month ago by Sametk13.
    March 14, 2025 at 4:44 pm #1321

    We are currently evaluating when we can build this into the system. No fixed date but we definitely want to support hands.

    April 10, 2025 at 4:49 pm #1448

    Hi, is there an upcoming update when hand tracking will be implemented?

    April 11, 2025 at 11:51 am #1455

    Hand tracking is now implemented in Release 0.2.2. You can download it now. The setup is as follows:

    In the hierarchy, you’ll find

    Hope this helps!

    Bob
