I really love this format where you walk us through the whole process of you actually struggling and powering through. Your enthusiasm and positive attitude is also very contagious! I could watch your videos all day just for that reason. And PS to anyone reading this: the book is frikn awesome. I've done some courses on ML and I can tell you it's absolutely brilliant, both for newcomers and intermediates.
Thanks! I enjoyed making this style of video and it's definitely one of my personal favorites in a while. Glad to hear you also enjoyed the book, thank you for sharing your thoughts!
I have been waiting for something like this since I fiddled around with simulating robots in Unity a few years back. Make. A. Series. Seriously. I love this.
I was always fascinated by training a project in a simulation and exporting the results to the real world, with its brutal, unforgiving physics. This seems like a good choice. "I feel the future."
This is how they do control algorithms for rockets and aircraft! Build a simulation, then go fly. A lot of times they're even using a Kalman filter against the simulation results and real life!
This is awesome, thank you for this! I have been working on a project where I've known that eventually I will need a simulation for some of the machine learning, but I had no idea where to start. This is a great start!
A few of the problems you ran into screamed "smart developers forgetting that the end user doesn't have the same context as them, and throwing friendly, descriptive names out of the window"...
Sweet! You're awesome! I had been fiddling with Isaac Gym for a while and was a bit frustrated navigating through it, but now I have newfound confidence! Any thoughts on getting interactive input into this Isaac Sim simulation? Maybe connecting to Unreal or Unity? Do you have any first feelings on how to proceed?
Thanks, great video. What is the workstation that you are using for Isaac Sim and Gym (graphics card)? Also I guess you're running Ubuntu 20.04? Thanks!
I think that ROS would be ideal for communicating between your controller script and the script running the sim. You would just have a topic that you listen to in the sim and publish to from the controller. BTW, awesome work.
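The topic idea above can be sketched in plain Python. This is a toy illustration of the publish/subscribe pattern that ROS topics implement, not the real rospy API — the topic name and message contents are made up for illustration.

```python
# Toy pub/sub bus illustrating the ROS "topic" concept: the sim side
# subscribes to a topic, the controller side publishes commands to it.

class TopicBus:
    def __init__(self):
        self._subs = {}  # topic name -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        # Register a callback to be invoked for every message on `topic`.
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        # Deliver `msg` to every subscriber of `topic`.
        for cb in self._subs.get(topic, []):
            cb(msg)

bus = TopicBus()
received = []

# "Sim side": listen for joint commands (hypothetical topic name).
bus.subscribe("/joint_commands", received.append)

# "Controller side": publish a command message.
bus.publish("/joint_commands", {"hip": -0.1, "knee": 0.3})
```

In real ROS you would get the same decoupling via `rospy.Publisher` and `rospy.Subscriber`, with the ROS master routing messages between the two processes.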
@@sentdex I am currently enrolled in an excellent course by theconstructsim. It is the "ROS in 5 Days" one, and honestly ROS is super easy, even for me as a not-so-experienced Python dev. What you did in your video is basically what ROS topics are xD. Also, I find the official ROS docs not so easy to use :/ with all the different versions and all xD
Oh gosh, you started with the most difficult simulator. When I used Isaac Sim... I felt your pain. The documentation is horrible. There are other sims that are easier to work with but definitely more limited (ROS Gazebo, PyBullet, V-REP).
Your videos always feel so grounded. Most other videos on such topics make me feel so insignificant. Also, I have a friend, Zeke, from Florida, and you two are soo uncannily similar!!!!
Hi sentdex! I'm dealing with a similar problem in RL with Isaac Sim, where I'm trying to teach a mobile robot to park. I have a lot of problems with the simulation program, because I don't really know what the limits of the joints of my robot are, so I can't set my observation and action space correctly. Did you figure out how to set the limits of each joint in the simulation, so that I can use them to set the obs and action space? I would appreciate it if we could chat in a bit more detail about dealing with the Isaac Sim simulator and its dynamic library. Kind rewards!
I went to follow along with this before discovering that omniverse only runs if you have an RTX card. Any way around that? In the future, can you do things like this with technologies everyone can access?
Consider a situation where I have var_name with 600 unique values, plus bins, WoE values, and event rate values. Now I have to create a dashboard with all the above features as input, such that when I select a particular variable name from one dropdown and either WoE or event rate from the other dropdown, I can visualize a bar plot of the WoE or event rate values against the bins of that selected var_name value. Can someone please help me with that code?
Wow this is an awesome video! Really cool to see your thought process and how you figure out all these new systems. Excited for the next part in the series
I really appreciate your work. I learned machine learning 2 years ago with your videos and others. I always liked how you present your work and that you have no problem admitting when you can't do something 100% yet. You have a clear voice, and I never had any issue understanding what you say. I am from Germany. By the way, Panda is the name of the robot from FRANKA EMIKA 😉 With this video I sadly had some issues rebuilding it. I really want to accomplish this and build my own solution for the reinforcement learning for the robot. What I don't understand is: do I need the simplified script you created at 7:04? How do I create a scene as a .usd like Bittle.usd? How should I structure the file locations for everything I need for manual control? I hope so much you find the time to answer 😊
7:14 was gym, so nope. USD is what the Omniverse app saves, so I just manually build and adjust in Isaac Sim, then save. That makes a USD, stored by default in the Isaac directory, which can then be loaded. I'd be happy to answer more and share the files; I can probably slap them into GitHub or something. Feel free to email me harrison@pythonprogramming.net
Great video. However, after putting together Bittle (as mentioned in its manual), small things like putting on the rubber "socks" and reversing the direction of the battery to change the centre of gravity will significantly impact Bittle's gait, which I highly doubt can be simulated in a virtual environment (without very detailed modelling of, say, the dimensions and shape of the battery). Bittle's gait is basically a simple half sine curve, which "supposedly" should allow it to walk, but in reality I find it wobbles left and right each time Bittle raises a leg, which again I doubt can be modelled virtually without a great level of detail. For quadruped-type robots, shouldn't we be heading down the ROS route instead? Personally I have not used Omniverse before, so I could be totally wrong.
I presume since it's Nvidia, it's supposed to integrate easily with their machine learning tools. I'm not sure how well something like Gazebo would work for training an AI; it might be just as easy, I don't know.
It seems Omniverse is analogous to Gazebo as opposed to ROS. Gazebo is a simulation environment that can be directly integrated with ROS to set up control of the robot in the simulation.
This is the sort of thing I want to see. Train a robot in a simulator for a million virtual years and then bring the boy out into the real world. Chef's kiss.
🤫 I'm thinking: calibrate what the simulator can do/"is allowed to do" based on what the robot dog can do (possibly for real): extent of movement, acceleration, refresh rate, etc. Simulation runs only seem to make sense within the limits of the robot dog, not the simulator. Love this exploratory format!
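One minimal way to enforce that idea is to clamp every action to the real robot's joint limits before it reaches the simulator or the hardware. A small sketch, assuming a dict-based action format; the joint names and limit values here are hypothetical, not from the video or Bittle's specs.

```python
# Hypothetical joint limits in radians: (minimum, maximum) per joint.
JOINT_LIMITS = {
    "hip":  (-1.0, 1.0),
    "knee": (-0.6, 1.2),
}

def clamp_action(action):
    """Return a copy of `action` with each joint value held inside its limits."""
    return {
        joint: min(max(value, JOINT_LIMITS[joint][0]), JOINT_LIMITS[joint][1])
        for joint, value in action.items()
    }

# A policy that outputs an out-of-range command gets pulled back into range.
safe = clamp_action({"hip": 2.5, "knee": -0.9})
```

The same limits can then also define the bounds of the RL action space, so the policy never learns motions the physical dog cannot perform.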
Oh my god! How do you learn this stuff? I think simulations could save a lot of time for robotics developers, so they don't have to deal with the electronics and other complicated groundwork early in development.
I've always wanted to play around with gait algorithms, but felt that unless I have a simulation to test on first, it's too much money to spend on a test robot.
How about using ROS with Omniverse? As usual, great content. I look forward to watching this series. Thank u ✌🏻 PS: 2nd try to post a comment. Apparently YouTube rejected the previous one coz I added the link to the documentation for the ROS bridge.
Not sure where your original went, I checked spam too. Yeah, there are some definite features in sim that I've seen that look insanely fancy. Unfortunately I am a total ROS noob. I know what it is and that's about it. I keep controlling my robots serially with pyserial :P If you know of a great intro to ROS that ramps up to something like this, let me know. It's a subject I've been wanting to dive into eventually. Also, maybe try replying with the docs again.
@@sentdex do you recommend Ubuntu as a daily driver or would you say Windows or Mac would be someone’s best bet. I am a PhD student in Robotics Engineering and considering the switch but don’t know if it is a good idea or not and looking for additional opinions
Yes, great series Harrison! Keep demoing different types of robot dogs... (eventually working your way up to... SPOT!) (or do a collab with Michael Reeves lmao)
This is EXACTLY what I would have needed back in early 2021. I did my bachelor thesis on simulating and modeling a robotic manipulator and generating a path using genetic algorithms. I dug a little bit into physics engines (or something like ROS) but ended up using simple 3D plots, which took a shit load of time to calculate and animate... Really looking forward to your series!
if fullVideoWatched == True:
    print("This is probably the best YouTube video ending that I ever saw")
    print("Yeah I know the == True check is useless, it's just for readability")
    thumb_up = 1
Fantastic video - I've previously only had experience animating rigged characters in Maya or Blender, but have really wanted to get into robotics. Definitely excited to get started!
How do you deal with the overwhelming anxiety when you start out with something completely new? I am working on a CARLA project using ROS, and I feel anxious about how much I don't know yet.
Start by using "the code you wish you had" from the highest level. Then break it down and implement it. It will help you figure out what you'll need in order to achieve the goal. In other words, start by moving away from the stress of having to figure out thousands of details. Act as if someone else will do it later. First focus on what you're trying to do and let the _how_ emerge. This will also help you avoid coding useless low-level things that look important at first but then prove unneeded for the current focus. HTH.
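The advice above can be sketched in a few lines: write the top-level flow you wish existed, stub every detail, and fill the stubs in later. All the function and robot names here are hypothetical placeholders, not anything from the video.

```python
# Top-down "code you wish you had": main() reads like the plan, and the
# helpers are stubs that can be implemented (or replaced) one at a time.

def load_robot(name):
    # Stub: pretend we loaded a robot description from somewhere.
    return {"name": name, "joints": ["hip", "knee"]}

def train(robot, steps):
    # Stub: pretend training ran and return a summary of what happened.
    return {"robot": robot["name"], "steps": steps, "done": True}

def main():
    robot = load_robot("bittle")   # hypothetical robot name
    return train(robot, steps=10)

report = main()
```

Because `main()` was written first, each stub's required inputs and outputs were discovered from actual use, rather than guessed up front.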