There is also a Building Blocks script on each Building Blocks object that lets you find all the other Building Blocks elements, and if you want to remove one, you can just delete it and the change is reflected in the Building Blocks scripts and the Building Blocks window automatically. Very clean!
At 1:42 we see "Something went wrong. Please try again later". Overlooking such messages does not inspire confidence. However, having watched more of your tutorial, things still worked! Thanks for a very informative show.
If you download the old Oculus Integration instead of Meta XR, there are multiple example scenes, one of which has objects that can be grabbed with hands or controllers, plus the setup for them.
I read that it uses a different plugin/provider, so you can't use XR Hands and the Oculus Interaction SDK at the same time because they are incompatible (I was looking for a way to use XR Hands and Passthrough from the Oculus SDK).
Is anyone else seeing the right controller in both the left and right hand with the default configuration? I don't know why, but my left controller always shows the Meta Quest Touch Plus Right object.
Thank you Valem. How can we connect the Meta Quest 3 to Unity for real-time testing like you did here in the tutorial? I'm using a Mac, so I can't use Meta Link. Please help me.
Why can't things ever just work... I'm getting the error "Error building Player: 2 errors" with no other explanation. Wtf is this? Anyone know how to solve it? Tried switching Unity Editor versions, no difference :(
Honestly, not just for beginners, but this is really helpful for rapid prototyping projects! This came at the perfect time for me actually as I was getting passthrough working manually with the new SDK just last month and need to make a few different projects to test some things.
When I tried to use the controller building block, Unity crashed. Also, could you make a tutorial on how to combine the building blocks with other VR/MR features?
Hi, I have a question. Why doesn't hand tracking work in the Unity editor? I have already enabled hand tracking on my Oculus Quest 2 and it works perfectly fine on the device. Can someone please help me?
Check whether, in your Oculus desktop app under Settings > Beta, you have the real-time features for developers activated. Also check that on the Camera Rig, under Quest Features > General, Hand Tracking Support is set to Controllers and Hands.
I hope they make something similar in Godot. They need more projects like this to get devs quickly into actually building the game instead of reinventing the wheel every time. I would love Valve releasing the Alyx SDK in a usable form.
I still find it confusing, though, that Unity builds its own VR structures while Meta does the same. It would be really great if Unity could remain the universal base, for example through OpenXR, and Meta could then deliver its feature set on top of that. But they build entirely in parallel, each with their own interaction and grabbable systems. I always wonder whether the Unity Meta OpenXR package lets you use these new building blocks on other platforms such as the Pico.
It's normal; Meta's headsets are not the only ones in the world. There are actually dozens, if not more, headsets that most people have never even heard of. I happened to see a list of maybe all of them on a site. Unity is more universal and works both for Meta headsets and, for example, phones.
I am currently one of your Patreon subscribers. I am about to cancel my subscription because I have been trying to reach out to you many times without any success. As you know, there have been many changes between Oculus XR and OpenXR. I wanted to follow your how-to-draw-in-VR tutorial but had problems running it because it is outdated. You set up a community, but you are not responding to its members while reaping the benefits. Not cool.
Thank you very much! Does Passthrough still only show up when standalone in the headset, and not in Oculus Link mode?

Also, I've been trying to learn how to set up an XR scene synced with the shapes of my whole house, and it seems like there IS more support for stuff like this now with the remembered location mapping (is this relevant to the Room Model Building Block? Another term I saw was "Scene Understanding"?), but it keeps reverting to the constrained single-level-plane small boundary with an unnecessary guardian system, not what I want. More about how to do this would be great, if it's not somewhere already. Cheers!

Also, these synthetic hands float stuck in space at their last known location if the headset loses sight of them, pretty odd. Is this a bug, or is there a way to make them not do that, if anyone knows? :D
With the Quest 3 and that SDK are we able to use the Quest 3 room scan feature and create a mesh collider from that rather than manually placing all the boundaries using the furniture function?
All controller models are imported as 3D models; you can search in the Hierarchy for something like "quest 2" and you will see them. Maybe you can deactivate the others and keep only the Quest 3 controllers.
Anyone else having issues when adding hand tracking, where the hand materials are pink? It seems to be an issue with the OculusSampleAlphaHandOutline shader that's used by DefaultHandMaterial, SkeletonBoneMaterial, SystemGestureHandMaterial, and SystemGestureSkeletonBoneMaterial, located under BuildingBlocks\BlockData\HandTracking\Materials.
Yes, the problem is that you created your project with the URP template. Try creating a new Unity project with the default 3D template (not the 3D URP template), or go to Project Settings > Graphics and make sure Scriptable Render Pipeline Settings is None; do the same for the Render Pipeline Asset under Quality. If there is something there instead of None, it means you are using URP as the render pipeline, and that could be the issue.
I had a quick question: whenever I build my project and run it in VR, the camera just offsets itself by some amount in the Y direction! I don't know why, or how to control that.
I think it is more like a script that compares the velocity of the controllers: if the velocity is greater than some value x, you move your Camera Rig along the x/y/z coordinates, or apply a force in impulse mode to your player's Rigidbody.
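A rough Unity C# sketch of that idea (the field names, the `speedThreshold` value, and the use of frame-to-frame position deltas are my own assumptions, not from the video):

```csharp
using UnityEngine;

// Hypothetical arm-swing locomotion: move the rig when controller
// swing speed exceeds a threshold.
public class ArmSwingLocomotion : MonoBehaviour
{
    public Transform leftController;    // assign in the Inspector
    public Transform rightController;   // assign in the Inspector
    public Transform cameraRig;         // the Camera Rig / XR Origin root
    public float speedThreshold = 0.8f; // m/s before movement kicks in (guess)
    public float moveSpeed = 2f;        // rig movement speed in m/s

    Vector3 _prevLeft, _prevRight;

    void Start()
    {
        _prevLeft = leftController.position;
        _prevRight = rightController.position;
    }

    void Update()
    {
        // Estimate controller speeds from frame-to-frame position deltas.
        float leftSpeed = (leftController.position - _prevLeft).magnitude / Time.deltaTime;
        float rightSpeed = (rightController.position - _prevRight).magnitude / Time.deltaTime;
        _prevLeft = leftController.position;
        _prevRight = rightController.position;

        if ((leftSpeed + rightSpeed) * 0.5f > speedThreshold)
        {
            // Move along the headset's horizontal forward direction.
            Vector3 forward = Camera.main.transform.forward;
            forward.y = 0f;
            cameraRig.position += forward.normalized * moveSpeed * Time.deltaTime;
        }
    }
}
```

Instead of translating the rig directly, you could call `rigidbody.AddForce(forward, ForceMode.Impulse)` on a player Rigidbody, as the comment above suggests.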
Hi. I could not build the APK; I received several errors about the SDK. I already reinstalled it with Android Studio, but the problem is not solved. Thanks
What errors? You could tick "Development Build" when making the APK with the "Build and Run" option, making sure the selected export platform is Android and that the device shown under "Run Device" is your Quest. It could also be that when you connected the Quest to the PC you didn't allow USB debugging, or that your Quest doesn't have enough storage. Or it could be that you followed the order of the video and added the synthetic hands at the end, which causes a problem because a reference is missing, and you have to drag a synthetic-hands component into the empty slot (you can find it by clicking the error, which takes you to the GameObject where it occurs; it's fairly intuitive to spot where something from the synthetic hands seems needed but isn't assigned).
I assume you'd replace the Cube's mesh and colliders with the ones for your custom objects. You'll have to adjust the Rigidbody settings to better reflect the mass (effectively the weight) of the object you're swapping in. Or you can copy the unique scripts from the cubes over to your objects.