
Mapping Inputs to Actions in Unity VR! 

Fist Full of Shrimp · 8K subscribers
3.3K views

Learn how to effortlessly map inputs using Unity's Input System! I keep things simple for this tutorial, and you'll learn how to toggle a menu on and off by mapping the menu button to the toggle action. As simple as it is, learning how to map actions and inputs is vital for any VR/AR developer.
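(A minimal sketch of the pattern this tutorial covers: toggling a menu GameObject when the bound action fires. It assumes an InputActionReference assigned in the Inspector; the class and field names are illustrative, not taken from the video.)

using UnityEngine;
using UnityEngine.InputSystem;

// Toggles a menu GameObject whenever the bound input action fires.
// "menuAction" and "menuRoot" are illustrative names.
public class MenuToggle : MonoBehaviour
{
    [SerializeField] private InputActionReference menuAction; // e.g. bound to the controller's menu button
    [SerializeField] private GameObject menuRoot;             // the menu to show/hide

    private void OnEnable()
    {
        menuAction.action.performed += OnMenuPressed;
        menuAction.action.Enable();
    }

    private void OnDisable()
    {
        menuAction.action.performed -= OnMenuPressed;
    }

    private void OnMenuPressed(InputAction.CallbackContext ctx)
    {
        // Flip the menu's active state on every press of the mapped button.
        menuRoot.SetActive(!menuRoot.activeSelf);
    }
}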
🍤 Support the Channel: Dive deeper into XR development with exclusive content and perks by checking out my Patreon!
/ fistfullofshrimp
🎓 What You'll Learn:
-Unity Input System Basics: Get up to speed with the fundamentals of the Unity Input System.
-Mapping XR Toolkit Inputs: Learn how to map your inputs using the XR Interaction Toolkit sample inputs for efficient development.
-Toggling Menus: Discover how to use the menu button to seamlessly toggle menus on and off, a crucial skill for any XR developer.
-Enhancing User Experience: Tips and tricks on improving interaction design and user experience in your XR projects.
👉 Don't forget to like, share, and subscribe for more Unity XR tutorials and insights!
🔗 Relevant Links:
VR Template: github.com/Fis...
#Unity #XRDevelopment #UnityInputSystem #VRDev #ARDev #XRToolkit #GameDev #IndieDev #XRInteraction #UnityTutorial #Coding #IndieGameDev

Published: 18 Aug 2024

Comments: 14
@DrunkAncestor · 4 months ago
This is a lifesaver, thank you. I'm still confused about getting values (like the trigger, grip button, etc.) out of the input action maps and into scripts. That would be a great video to clarify.
@FistFullofShrimp · 4 months ago
Thanks for watching and the suggestion! 🍤
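(For readers with the same question: a minimal sketch of one way to read analog values such as trigger and grip from input actions in a script. It assumes InputActionReference fields assigned in the Inspector; the XRI action names in the comments are examples, not confirmed from the video.)

using UnityEngine;
using UnityEngine.InputSystem;

// Reads analog values out of input actions each frame.
public class ControllerValueReader : MonoBehaviour
{
    [SerializeField] private InputActionReference triggerAction; // e.g. "XRI RightHand Interaction/Activate Value"
    [SerializeField] private InputActionReference gripAction;    // e.g. "XRI RightHand Interaction/Select Value"

    private void OnEnable()
    {
        triggerAction.action.Enable();
        gripAction.action.Enable();
    }

    private void Update()
    {
        // Value-type actions bound to analog controls return a float in [0, 1].
        float trigger = triggerAction.action.ReadValue<float>();
        float grip = gripAction.action.ReadValue<float>();
        Debug.Log($"Trigger: {trigger:F2}  Grip: {grip:F2}");
    }
}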
@user-ms3cs3yg7t · 5 months ago
This demo arrived at the perfect time, thank you for this!
@FistFullofShrimp · 5 months ago
You're super welcome!
@ladyupperton23 · 5 months ago
Thank you for another amazing video 🎉❤🎉
@FistFullofShrimp · 5 months ago
What a wonderful shrimp lady you are!!!
@Lemon-dh4fz · 5 months ago
Thanks for the video. 4:11 What will happen if I don't check the Generic XR Controller?
@FistFullofShrimp · 5 months ago
This is a great question! The Generic XR Controller setting, from my understanding, allows the Input System to use a control scheme with broad compatibility across a variety of devices. Essentially, it's a way for the Input System to abstract away the hardware specifics and use a generalized input model. Will your inputs work if you don't use this setting? Maybe. Will they work on a variety of devices without it? Probably not.
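(To illustrate the abstraction described above: a hedged sketch contrasting a generic binding path with a vendor-specific one, built in code rather than in the editor. The paths use standard Input System layout syntax; the class name is illustrative.)

using UnityEngine;
using UnityEngine.InputSystem;

// One binding path on the generic XRController layout can resolve on any
// controller that exposes the named control, instead of one path per vendor.
public class GenericBindingExample : MonoBehaviour
{
    private InputAction menuAction;

    private void OnEnable()
    {
        menuAction = new InputAction("ToggleMenu", InputActionType.Button);

        // Generic: matches any device derived from the XRController layout
        // that exposes a control named "menuButton".
        menuAction.AddBinding("<XRController>{LeftHand}/menuButton");

        // Vendor-specific alternative: only matches one controller layout.
        // menuAction.AddBinding("<OculusTouchController>{LeftHand}/menuButton");

        menuAction.performed += _ => Debug.Log("Menu pressed");
        menuAction.Enable();
    }

    private void OnDisable() => menuAction.Disable();
}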
@GwynPerry · 5 months ago
The UI issue when selecting the binding control is due to a misspelling in the UI theme that was introduced in recent versions of Unity. You can see the list is rendered too high up, covering the input field and Listen button.
@FistFullofShrimp · 5 months ago
I was honestly not expecting someone to answer this question that I was thinking of when making this video. It was driving me nuts!!! THANK YOU!!! 🍤🍤🍤
@rodneywheeler7764 · 5 months ago
THANK YOU!!!
@ohokcool3119 · 5 months ago
Hey, thanks for the video! I'm currently working on a research project about gaining insights into users' usage of different menu types within VR, and I'm stuck choosing between Unity's XR Interaction Toolkit and Meta's XR All-in-One SDK. In your opinion, with your experience with both toolkits, and considering I'm going to need data such as time taken to select an item, checking certain inputs, etc., which toolkit would you suggest? I'm developing for an Oculus, so I can go with either, but if you have the time, which of the two would you use? Do both toolkits support a decent amount of features for observing user input? Thanks :)
@FistFullofShrimp · 5 months ago
This is a bit of a toughie because I think both could work just fine. I do find Meta's to be a bit messy and confusing most of the time, but it does seem to offer more in-depth specifics for Oculus devices. The XR Interaction Toolkit is way easier to get projects moving along with. Both could easily handle the things you've listed that you're going to collect. I'd personally go with the XR Toolkit because I'm more experienced with it and I enjoy developing with it, but if I wanted every little bit of data specific to Meta devices only, then I'd consider going with Meta's toolkit. Hope that helps!
@ohokcool3119 · 5 months ago
@FistFullofShrimp Thanks for the insightful response! Yeah, that makes sense. Just in terms of getting things moving along, I think I'll stick with the XR Interaction Toolkit and then see down the line if I need a wider variety of specifics. There are also probably 3rd-party libraries that may help, like cognitive3D.
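(As a sketch of the kind of measurement discussed in this thread: timing how long a user takes to select an interactable after first hovering it, using the XR Interaction Toolkit's interactable events. It assumes XRI 2.x-style events; all names here are illustrative.)

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach alongside an XRBaseInteractable to log hover-to-select latency.
public class SelectionTimer : MonoBehaviour
{
    private XRBaseInteractable interactable;
    private float hoverStartTime;

    private void Awake()
    {
        interactable = GetComponent<XRBaseInteractable>();
        interactable.firstHoverEntered.AddListener(OnFirstHoverEntered);
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    private void OnFirstHoverEntered(HoverEnterEventArgs args)
    {
        // Start the clock when the user first points at this object.
        hoverStartTime = Time.time;
    }

    private void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Log how long the user took to commit to the selection.
        Debug.Log($"{name}: selected {Time.time - hoverStartTime:F2}s after first hover");
    }
}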