Hi XR Developers! In this video I will teach you how to use 3D touch input on visionOS devices. I prepared a bunch of scripts that let us manipulate game objects in different ways! There is also skeletal hand tracking, which is available to us through Unity's XR Hands package; we will look into that in a future video!
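Here is a minimal sketch of what reading spatial pointer data directly in code can look like, based on the PolySpatial input documentation linked below. Treat it as an illustration rather than the exact script from the video: the type and member names (EnhancedSpatialPointerSupport.GetPointerState, SpatialPointerState, interactionPosition, targetObject) follow the PolySpatial samples and may differ slightly between package versions.

```csharp
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class SpatialPointerExample : MonoBehaviour
{
    void OnEnable()
    {
        // Enhanced touch support must be enabled to receive spatial pointer touches.
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // Convert the touch into visionOS spatial pointer data (gaze + pinch).
            SpatialPointerState pointer = EnhancedSpatialPointerSupport.GetPointerState(touch);

            if (touch.phase == TouchPhase.Began)
            {
                // The object whose collider was targeted when the interaction started.
                Debug.Log($"Touched object: {pointer.targetObject}");
            }
            else if (touch.phase == TouchPhase.Moved && pointer.targetObject != null)
            {
                // Simple manipulation: move the targeted object to the current pinch position.
                pointer.targetObject.transform.position = pointer.interactionPosition;
            }
        }
    }
}
```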
Resources:
🔗 Unity PolySpatial 1.1 Documentation: https://docs.unity3d.com/Packages/com...
🔗 Play to Device: https://docs.unity3d.com/Packages/com...
🔗 Input Documentation: https://docs.unity3d.com/Packages/com...
Other Videos:
📹 Create and Manage SwiftUI windows with Unity PolySpatial for Apple visionOS: • Create and Manage SwiftUI windows wit...
📹 Get Started with Meta Quest Development in Unity: • Get Started with Meta Quest Developme...
📹 Develop for Apple Vision Pro with Unity’s PolySpatial: • Develop for Apple Vision Pro with Uni...
📹 How to become an Apple XR Developer with Unity for visionOS: • Apple Vision Pro - How to become an A...
Join the Discord!
👾 / discord
Support the Channel:
🫂 Patreon: / blackwhalestudio
🐦 Twitter: / xrdevrob
🤝 Join this channel to get access to perks:
/ @blackwhalestudio
Need professional XR & Multiplayer development for your project?
👨💻 Get in touch with us: https://bento.me/blackwhale
Chapters:
0:00 Intro
0:43 Play to Device on Apple Vision Pro
2:58 Connect Apple Vision Pro to Xcode
4:40 SpatialPointerDevice Input device (Unity Input System)
5:45 Read Spatial Pointer data directly in code
10:36 Outro