Biggs
- Spatial / AI / Gaming -
Comments
@fanyumeng604
@fanyumeng604 19 days ago
Hello, thank you for sharing such a helpful video. Unfortunately, I encountered issues when following the steps in your video: when I ran the GenerateProjectFile.command, it showed 149 errors, and each error message stated "… is a binary file, not a text file …" I would greatly appreciate your assistance, and I look forward to your reply.
@SpatialBiggs
@SpatialBiggs 18 days ago
@@fanyumeng604 You are likely using the wrong Xcode version, or there is some mismatch. I saw that issue when I tested the latest Xcode. If you use the exact versions from the video it should work; only after that initial success would I troubleshoot updating to the latest UE5 and Xcode. Good luck!
@fanyumeng604
@fanyumeng604 18 days ago
@@SpatialBiggs Wow, I'll try it ❤️
@fanyumeng604
@fanyumeng604 17 days ago
@@SpatialBiggs There are still 149 warnings 😢😢😢 I originally ran GenerateProjectFiles.command on the Mac's built-in drive and it succeeded, but I didn't have enough disk space. Then I saw the video and thought you might have installed everything on an external drive, so I downloaded the code and ran Setup.command again on the external drive. After that, GenerateProjectFiles.command would no longer run. I am new to Apple computers and don't understand them at all. What should I do? I apologize for my continued questions.
@fanyumeng604
@fanyumeng604 14 days ago
help
@fanyumeng604
@fanyumeng604 24 days ago
Why can't I download UE 5.4 from GitHub? 😢😢😢
@SpatialBiggs
@SpatialBiggs 24 days ago
@@fanyumeng604 Make sure you have done this 🫡 www.epicgames.com/help/en-US/c-Category_EpicAccount/c-ConnectedAccounts/how-do-i-link-my-unreal-engine-account-with-my-github-account-a000084938
@fanyumeng604
@fanyumeng604 19 days ago
@@SpatialBiggs ❤️❤️❤️
@benjohnson7610
@benjohnson7610 25 days ago
The tutorial works pretty well but I wanted to ask a quick question after seeing a few of your shorts. Do you have any info you can share on getting hand tracking working? I've been trying to use the UMotionController components but so far I have had no luck. I can get the motion controller components from the VRPawn using the standard methods but all data is marked as invalid. Unfortunately I've not really had much luck debugging this either since the AVP seems to crash when you use the Print String function.
@SpatialBiggs
@SpatialBiggs 25 days ago
@@benjohnson7610 OpenXR is the pipeline, but there is no pre-built project, hand mesh, or interaction tech yet. The base tracking works as long as your plist permissions are in.
@benjohnson7610
@benjohnson7610 25 days ago
@@SpatialBiggs hmm, I must be doing something else wrong. I have the same 3 plist keys you showed in the tutorial. My plan was to make my own gesture system using the thumb and index tip positions from UMotionController but for whatever reason everything is marked as invalid. For now though I was just trying to see if I could get cube meshes to follow my hands. Although I did notice only the hand tracking permission popup appears when I start the project. I hate to be a bother but can you think of any other setting I could be missing or do you have a sample project I could compare with?
@SpatialBiggs
@SpatialBiggs 25 days ago
@@benjohnson7610 I can't share my current project since it's in flight for hybrid AVP/Quest development. All I do, though, is call "Get Motion Controller Data", then "Break" the result and check whether it is valid. I then grab the positioning data off the "Break" and use it for all the hand-positioning tech I built on top. Hope that helps!
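(For reference, a minimal C++ sketch of that same Blueprint flow, assuming UE 5.4 with the OpenXR plugin enabled and the plist permissions in place; the function name is hypothetical. "Get Motion Controller Data" maps to UHeadMountedDisplayFunctionLibrary::GetMotionControllerData, and "Break" corresponds to reading the struct fields.)

```cpp
// Sketch only: LogRightHandGrip is a placeholder, not from the video.
#include "CoreMinimal.h"
#include "HeadMountedDisplayFunctionLibrary.h"

// WorldContext can be any actor or component in the running world (e.g. the VR pawn).
static void LogRightHandGrip(UObject* WorldContext)
{
    FXRMotionControllerData HandData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(WorldContext, EControllerHand::Right, HandData);

    // Equivalent of the "Break" node plus the validity check: the struct stays invalid
    // until the hand-tracking permission is granted and tracking actually acquires the hand.
    if (!HandData.bValid)
    {
        return;
    }

    // World-space grip position; logged with UE_LOG since the thread above reports
    // that Print String can crash on the AVP.
    UE_LOG(LogTemp, Log, TEXT("Right grip: %s"), *HandData.GripPosition.ToString());
}
```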
@benjohnson7610
@benjohnson7610 25 days ago
@@SpatialBiggs No worries. Sounds like we took the same approach. I'll have to dig around more and see what I missed. Maybe Left and Right are not the actual motion controllers for the hands, or something is blocking the permission.
@ysaito608
@ysaito608 2 months ago
I saw your article and was able to launch the VR Template on the AVP. Thank you very much. I'm trying to use OpenXR hand tracking to move my hands like this, but it's not working. Would it be difficult for you to release the project?
@SpatialBiggs
@SpatialBiggs 2 months ago
@@ysaito608 That is great to hear! And yes, you will have to build the visual representation and interactions yourself. OpenXR is the pipeline but no core assets are provided by Epic at this time for AVP. Hopefully that changes and an example project comes at some point.
@antdx316
@antdx316 2 months ago
Did they make it easier in 5.4.2?
@SpatialBiggs
@SpatialBiggs 2 months ago
Same process so far, but there has been some recent activity for visionOS in the Main branch on GitHub, so some progress looks to be happening.
@antdx316
@antdx316 2 months ago
@@SpatialBiggs How easy is it to load up the scene on the Meta Quest? Can eye tracking be setup quickly with a plugin? With the method you have, is that how to get any scene loaded into the AVP? What will I need for hand interactions? Does gaze tracking work?
@SpatialBiggs
@SpatialBiggs 2 months ago
@@antdx316 Quest is relatively easy to build to, but you need the Meta XR plugin. Any scene, yes, but it is complex and delicate; definitely use the VR Template to prove it works. Hand tracking is OpenXR and currently there is no out-of-the-box tech; you have to build it. Eye tracking doesn't work in immersive mode on any engine yet, Unity or Unreal, and Apple has a privacy layer, meaning the only data point you can extract is on-tap. Foveated rendering etc. hopefully comes in Metal, as that was announced at WWDC, but I suspect it isn't an easy implementation and will take time. Generally Quest is much easier and has a mountain of ready-to-go tech with example projects; AVP is bare bones. Hope that helps!
@antdx316
@antdx316 2 months ago
@@SpatialBiggs I can't see the Meta XR plugin in the list?
@SpatialBiggs
@SpatialBiggs 2 months ago
@@antdx316 It is external: www.unrealengine.com/marketplace/en-US/product/8313d8d7e7cf4e03a33e79eb757bccba
@SkinnyNix19
@SkinnyNix19 2 months ago
Hi, I attempted to follow this guide using Unreal 5.4 and Xcode 16 Beta 2 (MacOS 15), and once I get to the part where you open the Mac workspace it doesn't work. The scheme in Xcode is blank and has no target so I can't select "Unreal Editor" or build anything. Any suggestions?
@SpatialBiggs
@SpatialBiggs 2 months ago
Yes, Xcode 16 did not work for me last I checked; I had the exact problem you ran into and rolled back my environment after my tests failed. I think you need to stick to Xcode 15 until more updates come (there has been recent activity in the Main branch on GitHub). Also be warned that visionOS 2.0 seems to break builds. Let me know how it goes, good luck!
@SkinnyNix19
@SkinnyNix19 2 months ago
@@SpatialBiggs thanks for the reply! Will try this and get back to you!
@SpatialBiggs
@SpatialBiggs 2 months ago
@@SkinnyNix19 Not a problem, good luck building 🫡
@贾佳菊
@贾佳菊 3 months ago
Hi, my Vision Pro is on visionOS 1.2, and I have tested with UE 5.4.2 and UE 5.4.1, but I can't launch the game. When I debugged it with Xcode, the crash message was: "malloc: *** error for object 0x30348add0: pointer being freed was not allocated / malloc: *** set a breakpoint in malloc_error_break to debug". I can't understand what happened. Do you have any ideas? Thanks!
@SpatialBiggs
@SpatialBiggs 3 months ago
Not sure, but I am currently on visionOS 1.2, as 2.0 breaks the build at this time. My Unreal Engine version is 5.4.2. When I tested with Xcode 16 I couldn't get builds to work, so I would double-check that you are using an older Xcode. If you see the Swift window and then the crash, you are mostly there; if you aren't getting that far, something went wrong in the process. I have gone through the steps as recently as last week, so it should still work if you use the older Xcodes onto visionOS 1.2; updating to 2.0 at the moment seems to break it. Also keep in mind to use the VR Template as your base for initial testing. I can't think of why you would get a memory management error unless there is something custom going on or something got corrupted along the way. My gut instinct would be to fresh-start the process, with a cleaned environment if possible, and go through it step by step. Otherwise it sounds like a memory-management chase, but I am not certain what could cause it; I haven't seen this specific error in my pipeline. Hope that still helps some. Good luck!
@GNARBOSS
@GNARBOSS 2 months ago
I'm getting the same message in Xcode, but my headset is already on visionOS 2.0.
@SpatialBiggs
@SpatialBiggs 2 months ago
@@GNARBOSS Last I heard, visionOS 2.0 breaks builds, but it is possible that Main (5.5) works with the latest and I am unaware.
@GNARBOSS
@GNARBOSS 2 months ago
@@SpatialBiggs I was not even aware there was a 5.5! Maybe that's something I can try
@SpatialBiggs
@SpatialBiggs 2 months ago
@@GNARBOSS It can be broken depending on the revision, but if you want to try, grab "Main" instead of "Release" off GitHub; it could be a path. I have been building out tech and haven't had time to test the latest. If you do try, it would be great to know. Good luck!
@linyuzqlby2590
@linyuzqlby2590 3 months ago
!!!
@DesignsbyElement
@DesignsbyElement 3 months ago
When I attempt to launch my app, I just get the grey Swift window; it never takes me to the Unreal VR Template stage. I've followed this tutorial over 10 times from start to end and haven't been able to get it to work for me. I've spent more time troubleshooting than creating; why can't this stuff be plug-and-create?
@SpatialBiggs
@SpatialBiggs 3 months ago
You are likely crashing or your app isn't valid. The best way to find out is to build with Xcode as shown at the end of the video and read the output you get. If you are getting the Swift window, the hard part is already done and something in your app is crashing it. Double-check, and if you haven't already, start with a clean VR Template. Good luck!
@DesignsbyElement
@DesignsbyElement 3 months ago
Is there a method to packaging your unreal project for testing on the Vision Pro without having an Apple developer account? I have one but it’s pending and I want to start packaging for testing purposes.
@SpatialBiggs
@SpatialBiggs 3 months ago
You won't be able to sign the app and fully make a package, but I use an Intel Mac to test all the time: I basically assume that if it gets all the way through the build to that complaint, I have removed all other errors. As far as I know, though, you need to sign to fully package. I even tried on remote Wi-Fi, off the internet, while traveling with my Mini, and signing only works when online. Hope that helps.
@DesignsbyElement
@DesignsbyElement 3 months ago
@@SpatialBiggs I am at the point where I'm trying to package my project so it can be tested, but Unreal needs the Xcode ID info, etc.
@SpatialBiggs
@SpatialBiggs 3 months ago
@@DesignsbyElement Yep, I think you need the $100 dev account. It's just a credit card purchase if you are not a company, I believe? Entities have an approval process, though. There might be another way without a dev account, but I haven't tried in a long time.
@DesignsbyElement
@DesignsbyElement 3 months ago
I am getting errors... is the process the same as in your other walkthrough?
@SpatialBiggs
@SpatialBiggs 3 months ago
Pretty much. I did fail after testing the Xcode 16 beta on Sequoia; I had to roll back, but have been fine on Xcode 15.3 so far. I believe my 15.4 test worked as well, but I went with a full restore. Not sure if that helps, but let me know if you get more info 🫡
@DesignsbyElement
@DesignsbyElement 3 months ago
Same with me. I upgraded to Sequoia and keep getting errors when trying to open the project file. I'll downgrade the OS today and see if that changes the results.
@marinaalbertovna2457
@marinaalbertovna2457 3 months ago
@Bandit-10-4
@Bandit-10-4 3 months ago
I am from Australia and my wife is in Scotland at the moment. The coast is a bit different there too.
@SpatialBiggs
@SpatialBiggs 3 months ago
Scotland coasts were also a life changing sight. Amazing place!
@AnonymousStokes
@AnonymousStokes 3 months ago
Beautiful... but I wouldn't want to catch a flat tire out there.
@SpatialBiggs
@SpatialBiggs 3 months ago
I've broken down on top of the fjords before and they are very good at getting you safely off. Very well done roads for so wild a landscape. Still agree!
@devongratrix4921
@devongratrix4921 3 months ago
Is Iceland haunted like this music? It looks desolate, except for all the ghosts, I suppose.
@SpatialBiggs
@SpatialBiggs 3 months ago
It ranges from gorgeous like Lord of The Rings to spooky like a black metal album cover.
@devongratrix4921
@devongratrix4921 3 months ago
@@SpatialBiggs Haunted Golem is now my new Metal band name. Album Title, Icelandic Ghosts
@davids1inwestholl45
@davids1inwestholl45 3 months ago
No lava today? It's on the news tonight.
@SpatialBiggs
@SpatialBiggs 3 months ago
We're in the eastern part, which is a little ways away. Always something to watch for, though.
@slothsarecool
@slothsarecool 4 months ago
I swear 5.3 or 5.2 completely broke GPU Lightmass. Maybe they fixed it, but I hope they don't abandon baked lighting; it'll be essential for AVP.
@SpatialBiggs
@SpatialBiggs 4 months ago
I bake using a PC with traditional Lightmass and get great results on 5.4.1. Agreed, baked lighting is pretty much needed on standalone, and I hope it is maintained.
@Hobnockers
@Hobnockers 4 months ago
Doesn't the Apple Vision Pro come with a default app that allows you to load or import .fbx models and PBR materials?
@SpatialBiggs
@SpatialBiggs 4 months ago
This is specifically for getting Unreal Engine 5.4 game development up and running with hand tracking.
@Hobnockers
@Hobnockers 4 months ago
@@SpatialBiggs Got it, which leads to the next question. Since the Apple Vision Pro is like its own PC and GPU, almost like a PC workstation, can you compile and deploy UE 5.4 projects to the Apple Vision Pro like you would for a desktop PC? Or is it the same as with the Meta Quest, where you have to deploy for mobile devices? Meaning deferred rendering vs. forward shading. If the Apple Vision Pro could run UE 5.4 VR applications with deferred rendering, like a PC, that would be amazing. If not, I guess I don't understand why the Apple Vision Pro would beat the Meta Quest?
@SpatialBiggs
@SpatialBiggs 4 months ago
@@Hobnockers As far as the operating system goes, it is most comparable to an iPad: iPad apps run, and you can leverage the spatial capability to make VR experiences. The closest you can get to your ask right now is a MacBook in your lap with the AVP streaming the desktop (which is pretty amazing). Think of it as pushing an iPad app; it is really a very similar pipeline and end platform, and not as open and free as macOS, FWIW. Hope that helps!
@Hobnockers
@Hobnockers 4 months ago
@@SpatialBiggs I see. So you would use the computing from your MacBook and stream it to the AVP. But I was hoping the AVP itself is powerful enough, as powerful as a MacBook; basically, that the AVP doesn't need any other computing to run VR applications with deferred rendering. I could also use any other powerful Windows laptop to run my VR application and stream it to the Meta Quest. But I was hoping that when buying the AVP, it would be strong enough to handle deferred rendering on its own. If that's not the case, and you need a laptop or PC anyway for deferred rendering and computing, well... I guess the Meta Quest just saves you $3k. If that kind of makes sense?
@SpatialBiggs
@SpatialBiggs 4 months ago
@@Hobnockers Sort of. If you are asking whether deferred rendering works, I believe it has similar M2 support, but you still need to make builds: UE5 will not tether and does not play-in-editor like with Meta XR HMDs on Windows. The chip is as powerful as other M2s, but keep in mind this is still mobile, low-watt rendering on the AVP's extremely high render target, much higher than an M2 MacBook's.
@羽馨乔
@羽馨乔 5 months ago
Hi there, the Vision Pro told me "This app cannot be installed because its integrity could not be verified" when I installed the app. How can I solve this problem?
@SpatialBiggs
@SpatialBiggs 5 months ago
I am not positive, but it sounds like you are not being authenticated when signing the app. Maybe check your firewall and developer account information, as well as checking that the entitlements all work on the Swift Apple test app. That is an important first check as an Apple developer to ensure your dev account works at all. Good luck!
@AyushAnbhore
@AyushAnbhore 3 months ago
It is simple: go to Settings and enable the option to verify or trust the developer.
@Grace-yu9tg
@Grace-yu9tg 5 months ago
Hello there, does this also work with just the Xcode VisionOS simulator?
@SpatialBiggs
@SpatialBiggs 5 months ago
I have not used the simulator, but there is a checkbox in Unreal's Project Settings under iOS that enables that capability, which likely gets you there. Good luck!
@janklingner2694
@janklingner2694 5 months ago
Do you have some comparison for performance? Is the mobile M2 of the Apple Vision Pro as strong as a laptop M1 (Pro), etc.? How many frames do you get out of this sample project?
@SpatialBiggs
@SpatialBiggs 5 months ago
It has a high render target and is in VR, so comparing to desktop or laptop performance is not accurate without XR context. However, it uses an M2 chip, and I get to 90 fps in the template scene with shadows off, then even higher with further mobile shader optimizations. It is very powerful but still has similar constraints to all the mobile standalone HMDs, and you do see a difference between Metal and Vulkan. As chips get more powerful, base hardware requirements also go up, with increased render targets and the usual draw-call bottlenecks. If your question is whether it is better than the Quest: marginally, given those caveats, but it gives you more room to squeeze perf. It is the best I've used so far, but as usual you still need to deeply understand optimization on mobile devices to thrive. This does not mean there is a direct laptop form-factor comparison; it is not rendering a flat screen. Good luck!
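(Purely illustrative: the exact settings behind those numbers aren't stated above, so the console variables and values below are assumptions about typical standalone-HMD tuning, applied from C++ at startup.)

```cpp
// Illustrative sketch only: these specific cvars/values are assumptions, not the video's settings.
#include "CoreMinimal.h"
#include "Kismet/KismetSystemLibrary.h"

// WorldContext can be any actor or component in the running world (e.g. the game mode).
static void ApplyStandaloneHMDScalability(UObject* WorldContext)
{
    // Turn dynamic shadows off, the change credited above with reaching 90 fps.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("r.ShadowQuality 0"));

    // Trade a little resolution for frame rate on the AVP's very high render target.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("vr.PixelDensity 0.9"));
}
```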
@matthewschroeder7111
@matthewschroeder7111 5 months ago
Hi there, I wanted to ask you a question about this: would you be able to do a mixed reality game under Unreal Engine like this? Also, would custom gestures work? I'm debating developing for the AVP, but I can't seem to find these answers anywhere, so I figured I'd try here.
@SpatialBiggs
@SpatialBiggs 5 months ago
[TL;DR at end] As it stands, the setup is a bit simple and specific; Unity is where you'll want to go, at least for now.

To explain: Unity has support for both Bounded and Unbounded states. Bounded puts it in a window of a size you control; Unbounded sets you into roomscale, ARKit-style functionality. Both work out of the box, and you can walk around in mixed reality doing AR stuff. I worked in there right after getting the AVP and it is great, ready to ship.

Unreal right now is bare bones, and buggy. You get in as if in VR, and it uses Immersed mode. Because of this mode, even with a hacked skybox shader, I cannot walk around my room like ARKit apps, and you need to use the approach iBrews took, which essentially breaks your skybox. There might be another path, but the point is that passthrough is clearly a WIP. The support as it stands today can be thought of as stationary VR: 6DoF but no roomscale.

I am looking into Unreal for very specific reasons, but as of now it is extremely early support; Unity is ready to go. Considering my past experience with ARKit, I suspect it will take at least some months for Unreal to catch up. Unity was an Apple partner and had early access, as well as a more forgiving architecture, so it is the arguable place to go, and definitely the way to go if you lean "app".

I simply have Unreal-specific long-term desires and have been tinkering, doing the exact evaluation you are: the desire to be cinematic, high end, with physics, multiplayer, etc. My skillset is very expansive in Unreal and I can make things quality. I also find both the AVP and Quest 3 passthrough to be wanting... My thoughts lean more toward Unreal because of my background, but it would have to aim "VR" until the support improves with proper ARKit-style functionality. As of now I can't depend on mixed reality until I see it implemented, or at least get confirmation it is coming. If these devices had better passthrough, to where I felt we could really build into the world, I might go Unity to prioritize mixed reality, but I am less convinced of this generation after obtaining the hardware. They are OK, but far from the "wow" moment VR hit quite a while ago in PCVR.

Honestly, though, I am still in deep testing, weighing the performance and capabilities between the two in many ways, and I am watching GitHub like a hawk as more comes in. Hope that clears up where things are at for now. I'll likely update as my work comes together in the months ahead.

TL;DR: Go Unity until we see more integration and can confirm ARKit-style functionality works. Good luck!
@SteveTalkowski
@SteveTalkowski 5 months ago
Great step-by-step video, Brandon!
@SpatialBiggs
@SpatialBiggs 5 months ago
Thanks Steve, great work to you as well. Keep pushing!
@cjadams7434
@cjadams7434 5 months ago
@@SpatialBiggs We need a Discord... X just sucks for keeping track of stuff!
@cjadams7434
@cjadams7434 5 months ago
Awesome! - Thank you for your hard work on this
@SpatialBiggs
@SpatialBiggs 5 months ago
No problem at all. Hopefully this helps others avoid the blocks I was hitting for weeks. Good luck!
@cjadams7434
@cjadams7434 5 months ago
@@SpatialBiggs I think this might be a better way for me to go to make projects. I took one look at the Swift type of stuff and my eyes glazed over in the first minute and a half. I can look at code and kind of see what it does, but I'll be damned if I'd be able to actually come up with it in the first place.
@SpatialBiggs
@SpatialBiggs 5 months ago
@@cjadams7434 Same, it is very early, but my skillset all revolves around game engines. Swift is great for apps, though.