Hi XR Developers! In this video we look at how to use Meta’s Voice SDK to define and understand intents. We then use these intents to create a wake word, like the ones you know from assistants such as Google Assistant and Siri, and build the logic to capture the transcript of what we say after the system hears the wake word!
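For reference, here is a minimal sketch of the Voice Manager idea covered in the video. It assumes an App Voice Experience in the scene that is linked to your Wit.ai configuration, and a Wake Word Response Matcher whose UnityEvent calls WakeWordDetected() when the wake word intent fires. The field names and the UI text target are illustrative, not the exact script from the video:

using Oculus.Voice;
using TMPro;
using UnityEngine;

// Minimal sketch: listen continuously, and only capture transcripts
// after the wake word intent has been matched.
public class VoiceManager : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience appVoiceExperience;
    [SerializeField] private TextMeshProUGUI transcriptionText; // hypothetical UI target

    private bool _wakeWordHeard;

    private void Awake()
    {
        // Transcription and lifecycle events exposed by the Voice SDK.
        appVoiceExperience.VoiceEvents.OnPartialTranscription.AddListener(OnPartialTranscription);
        appVoiceExperience.VoiceEvents.OnFullTranscription.AddListener(OnFullTranscription);
        appVoiceExperience.VoiceEvents.OnRequestCompleted.AddListener(ReactivateVoice);

        // Start listening for the wake word.
        appVoiceExperience.Activate();
    }

    // Wired up in the Inspector to the Wake Word Response Matcher's event.
    public void WakeWordDetected()
    {
        _wakeWordHeard = true;
    }

    private void OnPartialTranscription(string transcription)
    {
        // Ignore speech until the wake word has been heard.
        if (!_wakeWordHeard) return;
        transcriptionText.text = transcription;
    }

    private void OnFullTranscription(string transcription)
    {
        if (!_wakeWordHeard) return;
        _wakeWordHeard = false; // require the wake word again for the next command
        transcriptionText.text = transcription;
    }

    // Reactivate after each request so the mic keeps listening for the wake word.
    private void ReactivateVoice() => appVoiceExperience.Activate();

    private void OnDestroy()
    {
        appVoiceExperience.VoiceEvents.OnPartialTranscription.RemoveListener(OnPartialTranscription);
        appVoiceExperience.VoiceEvents.OnFullTranscription.RemoveListener(OnFullTranscription);
        appVoiceExperience.VoiceEvents.OnRequestCompleted.RemoveListener(ReactivateVoice);
    }
}

Reactivating on request completion is what makes the wake word feel "always on"; the video's tip at 07:01 about increasing the keep alive time on the App Voice Experience serves the same goal of keeping the mic open between commands.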
Resources:
🔗 Meta XR Core SDK: https://assetstore.unity.com/packages...
🔗 Meta - Voice SDK - Immersive Voice Commands: https://assetstore.unity.com/packages...
🔗 Meta XR All-in-One SDK: https://assetstore.unity.com/packages...
🔗 Meta Voice SDK Documentation: https://developer.oculus.com/document...
Other Videos:
📹 Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK: • Mixed Reality Utility Kit: Build spatially...
📹 Meta’s Building Blocks | Develop Mixed Reality Apps lightning fast: • Meta’s Building Blocks | Develop Mixed Rea...
📹 How to use Lights & Shadows in Passthrough Mode: • How to use Lights & Shadows in Passthrough...
📹 Get Started with Meta Quest Development in Unity: • Get Started with Meta Quest Development in...
📹 Meta Quest Scene API: • Meta Quest Scene API | Scene Understanding...
📹 Meta XR Simulator: • Meta XR Simulator | Synthetic Environments...
Join the Discord!
👾 / discord
Support the Channel:
🫂 Patreon: / blackwhalestudio
🐦 Twitter: / xrdevrob
🤝 Join this channel to get access to perks:
/ @xrdevrob
Need professional XR development for your project?
👨‍💻 Get in touch with us: https://bento.me/blackwhale
00:00 Intro
00:40 Wit.ai app and Wake Word intent setup
02:13 Set up Wit.ai configuration file
02:50 Adding App Voice Experience
03:39 Adding Wake Word Response Matcher
04:14 Voice Manager script
06:27 Scene setup for Wake Word detection
07:01 Tip: Increase keep-alive time on App Voice Experience
07:11 Testing the scene
07:32 Outro