Can someone build an app that generates a virtual Paul Graham or Michael Seibel that just hangs around my place all day, gives advice, and randomly cracks jokes?
Actually, such an idea is more than possible. We've seen how Character.AI, Poe, and many other platforms can make an LLM embody someone's personality, whether a real person or an imaginary one. In light of that, we only need an API, then text-to-speech software running on the backend, and lastly a virtual body for that person, and BOOM, you have your best imaginary friend. Of course you'll run into more barriers, but these are just flash thoughts.
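The three-stage pipeline described here (LLM persona, then backend text-to-speech, then an animated body) can be sketched roughly as below. This is a minimal sketch under stated assumptions: `generate_reply`, `synthesize_speech`, and `drive_avatar` are hypothetical stand-ins, not real APIs; a real build would swap in an actual chat-completion API, a TTS engine, and a 3D avatar runtime.

```python
# Sketch of the "virtual mentor" pipeline: persona LLM -> TTS -> avatar.
# All three stage functions are placeholders (assumptions, not real APIs).

PERSONA_PROMPT = (
    "You are a virtual startup mentor. Give blunt, practical advice "
    "and occasionally crack a dry joke."
)

def generate_reply(user_message: str) -> str:
    """Stage 1 (stand-in): an LLM call using PERSONA_PROMPT as the system prompt."""
    # A real implementation would send PERSONA_PROMPT + user_message
    # to a chat-completion API and return the model's text.
    return f"[mentor persona] Re: {user_message!r} -- talk to your users."

def synthesize_speech(text: str) -> bytes:
    """Stage 2 (stand-in): text-to-speech running on the backend."""
    return text.encode("utf-8")  # placeholder for real audio bytes

def drive_avatar(audio: bytes) -> str:
    """Stage 3 (stand-in): lip-sync the audio to a 3D body in AR/VR."""
    return f"avatar speaking {len(audio)} bytes of audio"

def virtual_mentor(user_message: str) -> str:
    """Wire the three stages together for one conversational turn."""
    reply = generate_reply(user_message)
    audio = synthesize_speech(reply)
    return drive_avatar(audio)

print(virtual_mentor("Should I pivot?"))
```

The point of the sketch is that the hard parts (persona fidelity, low-latency speech, a convincing embodied avatar) live inside those three stubs; the glue between them is trivial.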
I believe meditation and mental health apps are great use cases for AR. We have released mentalOS, an AI-powered spatial meditation app for Apple Vision Pro.
A story driven VR game called The Simulation™. (The ™ is part of the title.) It’s about a brutal war between Muffins and Donuts. You (the player) are an expendable laboratory test subject used to test dangerous simulations. And you’ll be thrust into the most advanced simulation of reality ever created; a simulation that may finally be the one to turn the tides of war.
I want to bring my dreams into reality, but to make money: automotive AR/VR tools! My father is a mechanic, and if I can make his job easier, that's a win for me.
The Meta Quest 3 has impressed me far more than I expected with its capabilities, both for workflow and for development (it is SO much more developer-friendly than the Vision Pro). In my opinion, people are writing it off too quickly as "just a gaming device."
I recently launched a startup, and one of my co-founders and I each got the Vision Pro... and ever since it has dominated our thought cycles. It is such an exciting step forward.
@@ilya3341 All of which had no App Store, or an unsuccessful one. Meta has a solid app ecosystem already developed and going strong. They are years ahead of Apple in terms of VR and AR apps.
Chapters (Powered by ChapterMe)
00:00 - Coming Up
00:37 - Intro
01:26 - Diana's AR/VR startup
02:12 - Getting acquired by Niantic
02:50 - Why AR? What's going on with Apple's Vision Pro?
03:48 - Challenges with VR headsets: Microsoft HoloLens
05:14 - The eye's ability to focus
05:42 - How is it different from the optical approach?
06:19 - Apple Vision Pro's video feed reduces the technical challenge
07:41 - Hard, interesting stuff: hardware vs. software
09:24 - The connection between VR/AR and self-driving cars
11:07 - Use cases: focus on productivity
12:48 - iPhone Human Interface Guidelines
13:25 - Eye tracking: something founders can focus on
15:26 - What's the difference between the Meta SDK and the Apple Vision Pro SDK?
16:52 - The iPhone's impact on YC companies
17:33 - Is the Vision Pro an iPhone moment?
18:35 - Why did it take 5 years for good iPhone companies to come out?
19:45 - VR will be different from mobile
20:19 - Group office hours: pre-mortem exercise
21:31 - Advice for founders and why they should build VR apps
23:05 - The path of consumer social networks: Facebook
24:12 - Should founders build on VR/AR tech?
27:09 - One of the things YC will look for in VR applications
27:36 - Outro
I love Diana's point on the current 2D limitations (and dare I say lack of 3D creativity) of the UI/UX. I wish AVP apps would dare to go more 3D... and maybe they will; maybe Apple is trying to ease us into a 3D UI. But I think there's a huge opportunity here to improve UI/UX using the AVP, almost like Force Touch or shake-to-undo on the iPhone. Imagine if interfaces had layers that you could visualize, almost like a Photoshop or Figma file.
25:11 CAD software: you could have a floating 2D screen where you do traditional modeling with a paired Bluetooth mouse and keyboard, and next to that, the actual 3D version of the object you're modeling. The 3D part or assembly could be scaled or kept 1:1 so you can virtually test-fit it against real-world parts.
One major hurdle for developers getting on the visionOS bandwagon is that they have to be heavy iOS developers (RealityKit, ARKit, SwiftUI, etc.). It's great for people like me who are already iOS developers, but non-iOS developers might be at a disadvantage. I've seen some links where Apple provides support for Unity developers to port their applications to visionOS, but once ported, they still have to develop the apps using the iOS tech stack. Also, with the headset's price tag being so high, we'll have to see how fast the consumer base expands.
It would be amazing if two or more Vision Pro headsets could share an augmented environment. Like both people in the room can see the same floating screens or objects.
I have an idea for an app: when you walk around with the Vision Pro, it automatically detects billboards/ads and overlays them with random photos/memories from your camera roll 👀
Yep. They imply this is the platform of the future, but they're not willing to wear these devices for even half an hour because of how awkward/uncomfortable they are.
As soon as they said "can we take the headsets off so we can talk," that told me it's too early to get in on this tech. It's too much in the way, but once somebody figures out how to make it smaller and less clunky, it will be huge.
A huge question I have: is this why we saw transparent TVs at CES this year? Is the next step for these devices having AR experiences on transparent displays? Or will the technology never go that direction and always be passthrough?
Apple's approach of having developers use Swift to write code for a VR/AR environment is going to be the determining factor in whether these devices get developers to make apps for them. Many developers already know Swift and SwiftUI, which means the learning curve is not as steep as learning Unity or a whole new 3D game engine. Meta and Microsoft will have to respond with something akin to Jetpack Compose or SwiftUI-with-RealityKit to compete in the long run.
Swift and SwiftUI are only a small component of the visionOS tech stack; the main components developers need to master are ARKit and RealityKit. The majority of iOS developers don't work with ARKit on a daily basis, so there will be a learning curve for them too.
@@samTheTechieGuy Indeed, a learning curve nonetheless, but RealityKit, the new library for visionOS, feels like an extension of SwiftUI's declarative design. I don't know Unity, but just the thought of having to learn a new platform AND language to develop an app stops me in my tracks as a developer.
It's the first time I've heard that eye tracking tires the user more. I haven't used eye tracking much, since it's not very useful on my Quest Pro, but I'm really interested in trying it out to feel it for myself.
Basically, anything you can imagine a magic holographic display showing, this should be able to do. Any flat screen with data should augment into a 3D view if you look at it and pull it closer. Take the trading example: if I look at a stock screen and pull it closer, it should expand into a 3D chart of volume, price, and change over time. Have events as floating points that you can expand. Have the event play in 360 degrees around the person if possible. Stuff like that.
Developing VR 3D content requires significant investment, such as acquiring Blackmagic's purpose-built 3D video capture device due out in late 2024, so specific applications will be built around niche markets.
It is true that opening a PDF on a Meta headset is difficult. But there's a $99 pre-packaged PDF viewer on the Unity Asset Store. That's something the Apple Vision Pro doesn't have: an asset store with pre-packaged content. There are over 11,000 products on the Asset Store, and a ton of that is UI and utility stuff.
Best comment I've read all day, lol. I was laughing with my daughter the other day about how happy I am that the Vision Pro got so many people to look into the Quest market. They just announced Job Simulator coming to the Vision, which is great, because it'll make all those people realize just how fun gaming is in VR, and they'll quickly see the Vision's limitations and jump to a Quest so they can actually play games and get more use than movies and emails. The Vision Pro for bragging rights and the Quest 3 for everything else, lol.
@@Mr_Louyall You always forget that it's been out for 3 weeks. Relax, apps and games will come. You guys think Apple is dumb and doesn't collab with game publishers? Give it a year and you'll see how far they'll be. Yes, at the moment the Quest has more apps, but that's not forever, and as you said, the Apple Vision got people buying Quests, not the Quest itself. People want the AVP; when more apps and a less expensive Vision Air come out, in the end Apple will win.
2:40 It's hilarious how the guy without the Vision Pro, wearing the Meta headset, was like "can we take these off so we can have a conversation," while everyone with the Apple product was super comfortable, hahaha.
It's not just the human eye, but the entire visual system, including the optic nerve and its integration with the brain, that creates our sense of place and reality. Technologies that directly interface with the optic nerve, bypassing the natural visual pathway, have the potential to disrupt this fundamental sense of self and reality. Such a profound shift could lead to existential alienation if users are not adequately prepared and supported. Therefore, the integration of these technologies should be accompanied by robust social support networks and economic systems to mitigate the potential psychological impact.
I could see a hyper-realistic model that appears right in your room to try on clothes before you buy them. You could change the body type/characteristics of the model. Like something similar to the F1 car, but a representation of a person. Something so detailed you can see the stitching on the fabric.

Continuing that line of thinking, I bet there would be a lot of companies interested in showing off their products in a virtual showcase with that level of detail.
Wouldn't it be better to be able to try the clothes on your own body, so on yourself? AR SDKs for virtual try-on already exist; I'm not sure whether the AVP could be a game changer in this existing market.
@@shumbabala6474 It would, but accurately modeling your entire body seems like a more difficult task, so I was just thinking of a generic model for the time being. I mean, even the FaceTime head representations it makes don't look like us yet. But yeah, eventually. Edit: they might exist already, but not with the resolution this thing can provide, and they don't have anywhere close to the same level of detail as the F1 model, for example. And there's the fact that you can enlarge it, shrink it, move it around, etc. The ability to view and manipulate objects in insanely high resolution at scale seems like a game changer to me. Those other devices also don't have good enough passthrough to make it look like it's really right in your living room. You want customers to be able to see themselves owning the product. I'll give you a real-life example: if I were looking to buy a fishing rod, showing me a poorly rendered 3D model on another headset probably wouldn't get me interested in buying it. Show me a selection of rods that look as real as some of these Vision Pro demos, and I could absolutely see myself deciding I want one. It would be nice to really examine every feature so I can get an accurate idea of the weight, length, spooling system, and handle curvature, and actually be able to remove each piece to see how they're built, etc.
@@Brian-oz8io I agree with your take; the AVP can definitely open up new opportunities. I'll keep an eye out for them! Hopefully, with time, Apple makes these devices more affordable for the general public and thus expands the potential customer base even further.
Why does everyone act like the Vision Pro is so different from the Quests? "Meta focuses just on games." Y'all don't remember that they made Workrooms for virtual work, and partnered with Microsoft to get MS Office? The whole selling point of the Quest Pro was being able to work in it. The Vision Pro is also WAY more expensive than any Quest, so it's kind of strange to compare them like they're in equivalent segments.
The technology approaches are interesting. The bigger issue is the human factor: nobody except the most enthusiastic adopters wants stuff strapped to their face. The biggest use case I've seen for the Apple Vision Pro is shooting first-person videos.
These things remind me of smartphones before the iPhone, just big, clunky, and stupid looking. Steve Jobs was reluctant to make a smartphone for a long time because he didn't want Apple's image to be tarnished by that.
One reason eye tracking is challenging is that tracking through prescription glasses introduces errors in pupil location. It looks like Apple is using Zeiss inserts to get around this, I'd guess with the eye tracking camera/sensor between the pupil and the insert/lens? "You cannot wear Apple Vision Pro while wearing eyeglasses. If you require prescription eyeglasses, including for astigmatism, you may be able to order prescription ZEISS Optical Inserts that can be used with your Vision Pro."
In the future, you will just have a device-level setting that lets you take off your prescription glasses and have the device adjust the display based on the prescription you've entered. Simple. Like how dark mode on your phone currently makes all apps go dark.
I must say seeing you guys talk to one another with VR headsets is quite distracting. Feels weird watching you talk to one another with no EYE CONTACT...
Imagine a poor child from the slums of Brazil, who lacks the resources to attend school, being able to put on a VR headset (as affordable as a smartphone) and find themselves in a virtual classroom alongside their friends, attending the country's top schools. This scenario highlights a crucial element missing in today's online education: a sense of community and tangible presence. While you can already study online, the experience often feels isolated, making it easy for students to lose focus or even sleep through online classes. Contrast this with the immersive potential of VR/AR, where a student could solve math problems by writing on a digital whiteboard, share a floating TikTok screen with a friend, or disrupt class by throwing digital paper airplanes at the teacher's avatar in a virtual classroom. That'd be crazy.
Setting aside the contradiction of a more focused environment that includes other people and TikTok windows, this seems relatively doable now. From a software perspective, we already see a lot of this working: you can get together in a digital classroom in VRChat today, and it's okay. Immersed also seems good, but I haven't tried it. Even so, it hasn't seemed to catch on that much yet.
VR goggles aren't the future; they're a niche for gaming. VR *glasses* are much more likely to achieve mass-market adoption. I'd be surprised if the Vision Pro is a long-term success as-is.
Yes, it is that moment where others have to catch up. The iPhone gave everyone instant communication, a security camera, music, and apps. This is making everyone the director of their own movie, with constant security, so the cops can't lie anymore because everything is being recorded.