Love this, really hope you're just busy experimenting with Vision Pro and we see more from you again soon, Brian. Would love to see these concepts built on: being able to drag, rotate, and resize the objects, and to pick from a range of shapes or objects and load them into the same scene. Also keen to see whether real-world occlusion works, where if the cube goes under the table you can't see it until you look under the table. All things I can research and hopefully build onto this great starter in the meantime. Thanks again and can't wait to see more!
17:48 - you say "once we reached this next line of code, we know the fingerTip is tracked", but all you actually know at that point is that it's non-nil. Should it actually be tracked, or is non-nil enough?
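In case it helps anyone, here's a minimal sketch of how you could check both conditions, assuming the ARKitSession/HandTrackingProvider setup from the video (the `fingerTip` naming follows the video; the rest is my assumption):

```swift
import ARKit

// Sketch: consume hand anchor updates and use a joint only when it is tracked.
func monitorHands(_ handTracking: HandTrackingProvider) async {
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        // A non-nil anchor only means ARKit has created it at some point;
        // `isTracked` tells you whether the hand is currently being tracked.
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
        let fingerTip = skeleton.joint(.indexFingerTip)
        // Individual joints can also be untracked (e.g. when occluded).
        guard fingerTip.isTracked else { continue }
        // ... safe to position the blue dot entity here
    }
}
```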
I created a ModelEntity from a USDZ file and placed it on the wrist. The ModelEntity is so large that it can't be scrolled, scaled, or moved. How can I set the size or frame of the ModelEntity?
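In case it helps: you can scale the entity rather than the file. A minimal sketch, assuming RealityKit; `fit(_:toSize:)` is a hypothetical helper, not a built-in API:

```swift
import RealityKit

// Hypothetical helper: scale an entity so its largest dimension is `targetSize` meters.
func fit(_ entity: ModelEntity, toSize targetSize: Float) {
    let bounds = entity.visualBounds(relativeTo: nil)
    let maxExtent = max(bounds.extents.x, bounds.extents.y, bounds.extents.z)
    guard maxExtent > 0 else { return }
    entity.scale *= SIMD3<Float>(repeating: targetSize / maxExtent)
}

// Usage: shrink the wrist model to roughly 10 cm.
// fit(wristEntity, toSize: 0.1)
```

USDZ assets are often authored at a different scale than your scene expects, which would explain the huge model.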
This is exactly what I needed. I have been playing around with Spline recently and wondered how the animations/interactivity worked in an app. Thanks so much.
Hello, I built this on my Vision Pro. I can click 'Start Tracking', but there are no blue dots on my hands. I'm wondering if I missed something in the Vision Pro setup itself, or anything else? I used the file from you. 🥺
Thanks for this tut. This explanation only works with versions 0.x.x, not with 1.x.x. Can you explain how to import newer versions, for example 1.9.0?
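In case it's useful while you wait: with Swift Package Manager you just point the dependency rule at the newer major version. A minimal Package.swift sketch; the repository URL and product name are my assumptions, so substitute the ones from the video:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v16)],
    dependencies: [
        // "from:" resolves 1.9.0 up to (but not including) the next major version.
        .package(url: "https://github.com/splinetool/spline-ios", from: "1.9.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "SplineRuntime", package: "spline-ios")]
        )
    ]
)
```

In Xcode the equivalent is File > Add Package Dependencies… with the rule "Up to Next Major Version" starting at 1.9.0.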
I come back to this course so often. I don't know why, but I *still* find it inspiring. So it is time to say thank you! I am also one of your paying patrons, because of *this* video.
You can use it in commercial products as long as you include the licence text Apple provided: "The Apple Software is provided by Apple on an "AS IS" basis. APPLE MAKES NO WARRANTIES, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE, REGARDING THE APPLE SOFTWARE OR ITS USE AND OPERATION ALONE OR IN COMBINATION WITH YOUR PRODUCTS. IN NO EVENT SHALL APPLE BE LIABLE FOR ANY SPECIAL, INDIRECT, INCIDENTAL OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) ARISING IN ANY WAY OUT OF THE USE, REPRODUCTION, MODIFICATION AND/OR DISTRIBUTION OF THE APPLE SOFTWARE, HOWEVER CAUSED AND WHETHER UNDER THEORY OF CONTRACT, TORT (INCLUDING NEGLIGENCE), STRICT LIABILITY OR OTHERWISE, EVEN IF APPLE HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."
Great tutorial, Brian! I saw that the library lets us add spacing between each data value in the chart. Do you know if there is a way to set a corner radius on each chart section?
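(If anyone else lands here: Swift Charts has a `.cornerRadius(_:)` modifier you can apply per mark. A minimal sketch with made-up data, assuming a donut chart built from SectorMark, where `angularInset` is the spacing between sections mentioned above; requires iOS 17+:)

```swift
import SwiftUI
import Charts

struct DonutChart: View {
    // Made-up sample data for illustration.
    let data: [(category: String, value: Double)] = [
        ("A", 30), ("B", 55), ("C", 15)
    ]

    var body: some View {
        Chart(data, id: \.category) { item in
            SectorMark(
                angle: .value("Value", item.value),
                innerRadius: .ratio(0.6),
                angularInset: 2          // gap between sections
            )
            .cornerRadius(6)             // rounds each section's corners
            .foregroundStyle(by: .value("Category", item.category))
        }
    }
}
```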
Hey Brian! I loved the tutorial and your great in-depth explanation. I'm wondering if we react-native folks can also use Spline for mobile applications. I'm assuming no, but would love to know. PS: subbed :)
Thanks for the tutorial, Brian! I especially liked how you change the scene based on the scroll position. I'm also wondering if you have explored other creative ways to manipulate the scene/Spline object with code.
Especially with Apple Vision Pro in mind, I am eager to revisit what Spline can do in the near future. Events and actions are currently being worked on: docs.spline.design/native-3d-embeds-for-ios
Thanks for the video, Brian! I'm trying to embed locally; I changed nothing in the provided code and added the package, but my simulator is crashing with the error "Fatal error: Unexpectedly found nil while unwrapping an Optional value". I've imported the scene into my root folder as you show here, but it seems Xcode can't find it. What am I doing wrong?
Did you also add your .splineswift files to your project with "Copy items if needed" checked, and add the files to your target? "Fatal error: Unexpectedly found nil while unwrapping an Optional value" could indicate, in your case, that a file could not be found, which is why you get a nil value.
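Here's a defensive-loading sketch that surfaces the missing-file case instead of crashing; the file name `scene.splineswift` is a placeholder, and I'm assuming SplineRuntime's `SplineView(sceneFileURL:)` initializer from its docs:

```swift
import SwiftUI
import SplineRuntime

struct LocalSceneView: View {
    var body: some View {
        // Bundle.main.url returns nil instead of crashing when the file
        // isn't in the bundle, which usually means it wasn't copied into
        // the project or isn't part of the app target.
        if let url = Bundle.main.url(forResource: "scene", withExtension: "splineswift") {
            try? SplineView(sceneFileURL: url).ignoresSafeArea()
        } else {
            Text("scene.splineswift not found — check target membership")
        }
    }
}
```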
@BrianAdventumm OK, fetching from the cloud is working, but fetching locally somehow isn't. I think I did everything right, double-checked your points, and tried a few fresh projects, but no luck.
Hey my man, I really liked your tutorial and took a look at your Patreon and previous video output. Are you going to continue making videos? I've never signed up to anyone's Patreon, and I still haven't signed up to yours, but are you planning on making more vids in the future on a semi-regular basis? I know there's a TON that goes into making a vid, but I really like the way you explained this one, and I've watched some other ones from 3+ years ago.
Thanks a lot! My plan is to bring new content regularly this year, hopefully every month or every month and a half. The focus is going to shift a bit, away from pure Apple development content toward topics that I feel aren't explained well enough elsewhere. In the next weeks there will be a video or two about LLM embeddings: what embeddings are, what kind of math is used with them, and how to create embeddings from your own data so you can ask e.g. GPT-4 questions about it.
The video is so good, I really liked it, thank you! But when I tried the same code, everything was working, except the ML model I created from my images isn't yielding the results I expected. Any tips on how we can create a good ML model?
What doesn't make sense to me is: why do we have to take photos of hands to use hand tracking? Wouldn't that be easy for Apple to do, like having a basic hand-tracking model in Vision? It would be insane if they didn't have that.
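For what it's worth, Vision does ship a generic hand-pose detector (`VNDetectHumanHandPoseRequest`, iOS 14+); the photos in the video are presumably for training a custom classifier of specific poses on top of it. A minimal sketch, assuming you already have a `cgImage` to analyze:

```swift
import Vision
import CoreGraphics

// Detect hand joints in a still image using Vision's built-in model.
func detectIndexTip(in cgImage: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return }
    let indexTip = try hand.recognizedPoint(.indexTip)
    if indexTip.confidence > 0.5 {
        // Location is in normalized image coordinates (origin at bottom-left).
        print("Index tip at \(indexTip.location)")
    }
}
```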