I signed up for a similar course at my uni this semester, but the prof presented this interesting content in a terrible way. Prof. Stachniss, you DEFINITELY saved my life!!!
Best explanation of the KF & EKF ever! Now I finally understand their principles and their differences. I read the Probabilistic Robotics book before; although it is a good reference for the details, it is hard for beginners to grasp the concepts from it. Combined with Prof. Stachniss's tutorial, I now understand both the big picture and the details. Thank you very much for sharing this great knowledge.
I enjoyed your EKF and UKF videos from 2013 and appreciated this video as it taught the EKF in a slightly different way. I would be interested in seeing a UKF version like this video. It may be too niche, but I’d also be very curious about your thoughts on the square root form of the UKF.
Wonderful lecture, but I have a question regarding the part where we assumed Q_t is very small, hence the posterior state estimate is the observation. How would we know that the observation is perfect? Does it mean that we take several measurements and calculate the covariance of those measurements, or is it generally assumed to be constant? It would make sense that environmental changes affect Q. So how is it done in practice?
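One common approach in practice is exactly what the comment suggests: log repeated readings from a (roughly) stationary sensor and take their sample covariance as an estimate of the measurement noise. A minimal sketch with simulated data standing in for real sensor logs (all values here are made up for illustration):

```python
import numpy as np

# Hypothetical example: estimate the measurement noise covariance Q
# from repeated readings of a stationary 2D position sensor.
rng = np.random.default_rng(0)
true_pos = np.array([2.0, 3.0])

# Simulated sensor: true position plus Gaussian noise (stand-in for real logs)
true_Q = np.array([[0.04, 0.01],
                   [0.01, 0.09]])
readings = rng.multivariate_normal(true_pos, true_Q, size=5000)

# The sample covariance of the readings is an estimate of Q
Q_hat = np.cov(readings, rowvar=False)
print(Q_hat)
```

If the noise drifts with the environment, people typically re-estimate Q periodically or use adaptive-filtering variants rather than keeping it fixed.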
@@priyanshugarg6175 I would say both. One application for localisation is EKF-SLAM, where you predict the next landmark positions and compare them with the actual observations, effectively optimising for the robot's position in the created map. As for sensor fusion, I have not applied it or read much about it, but the Kalman filter is quite famous in that field too; one way of deriving the Kalman filter equations is to find the optimal mix of two readings with different precisions.
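That "optimal mix of two readings" is just inverse-variance weighting; a minimal sketch of the idea (generic, not code from the lecture):

```python
# Fusing two scalar readings with different precisions: the minimum-variance
# combination weights each reading by the inverse of its variance.
def fuse(z1, var1, z2, var2):
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    fused = w1 * z1 + (1.0 - w1) * z2
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return fused, fused_var

z, v = fuse(10.0, 1.0, 12.0, 4.0)
# The fused estimate lies closer to the more precise reading (10.0),
# and the fused variance is smaller than either input variance.
print(z, v)
```

The Kalman correction step is this same weighting, applied between the predicted state and the measurement.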
Two best profs ----> 1) Michel Van Biezen 2) Cyrill Stachniss. Folks, add your favourite prof in the comments so that everyone gets the privilege of knowing them. Lots of love.
Thank you for the video. I am currently using the book "Probabilistic Forecasting and Bayesian Data Assimilation" by Sebastian Reich for studying the derivations. Can you please suggest any other book for understanding the deeper mathematics and derivations related to Bayesian inference and data assimilation?
Hi, I'm working on implementing an extended Kalman filter, but I have problems. I'm trying an extended Kalman filter in 3 dimensions (x, y, z positions), and the visualisation is simple, just using matplotlib. Does anybody know how I can do this? Any resource or sample?
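Not an official solution, but here is a minimal sketch of one EKF step for a 3D position state, with a hypothetical constant-position motion model and a non-linear range measurement ||x|| (swap in your own g, h, and Jacobians):

```python
import numpy as np

# Minimal EKF step for a 3D state mu = (x, y, z).
# Motion model g(mu) = mu (identity, Jacobian G = I), process noise R.
# Measurement h(mu) = ||mu|| (range to the origin), noise Q.
def ekf_step(mu, Sigma, z, R, Q):
    # Prediction
    G = np.eye(3)
    mu_bar = mu
    Sigma_bar = G @ Sigma @ G.T + R
    # Correction: Jacobian of h at mu_bar is mu_bar^T / ||mu_bar||
    h = np.linalg.norm(mu_bar)
    H = (mu_bar / h).reshape(1, 3)
    S = H @ Sigma_bar @ H.T + Q
    K = Sigma_bar @ H.T @ np.linalg.inv(S)
    mu_new = mu_bar + (K * (z - h)).ravel()
    Sigma_new = (np.eye(3) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new

mu = np.array([1.0, 1.0, 1.0])
Sigma = np.eye(3)
R = 0.01 * np.eye(3)   # process noise (made-up value)
Q = np.array([[0.1]])  # range measurement noise (made-up value)
mu, Sigma = ekf_step(mu, Sigma, 5.0, R, Q)
print(mu)  # estimate moves outward, toward the measured range of 5
```

For visualisation, collect mu over time in a list and plot each coordinate against time with matplotlib's plot(), or use a 3D scatter via mpl_toolkits.mplot3d.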
Now I understood, after 45:00, that if we take an angle into account in the system then the EKF is a good option, but if we are predicting, for example, the state of a vehicle with a linear model, then we can use the plain Kalman filter? Correct me if I am wrong. Thank you, Prof.
At 46:56 you show the mapping of the Gaussian distribution through a linear function. I would just like to point out that the new function shown on the left is the mirror of the original: for example, the part of the original Gaussian that is to the right of the mean ends up below the mean, meaning the y-axis of the new distribution should run from high (5) at the bottom to low at the top. The same applies to the next slide with the non-linear function. If the gradient of the linear function is positive, this mirroring does not occur, but of course with a non-linear function the gradient can be both negative and positive.
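The mirroring is easy to check numerically: pushing Gaussian samples through a linear function with negative slope flips which side of the mean each sample lands on (a quick sketch with made-up numbers, not the values from the slide):

```python
import numpy as np

# Samples from a Gaussian, mapped through y = -3x + 1 (negative gradient).
rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=0.5, size=100_000)
y = -3.0 * x + 1.0
y_mean = y.mean()

# Samples above the input mean should end up below the output mean.
above_in = x > 2.0
below_out = y < y_mean
agreement = np.mean(above_in == below_out)
print(agreement)  # close to 1.0
```

With a positive slope the same check would show the sides preserved instead of flipped.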
56:52 What would I do in the case of non-smooth non-linearities, e.g. because of physical limits on state variables in my system dynamics? Just approximate them with a smooth function?
13:15: Why is it u_t and not u_{t-1} in the discrete state space model? Wouldn't you have to take into account the control command at the previous time step, not the current one? Thanks for the great videos and explanations!
Thank you, professor, it was a really amazing explanation, with deep concepts that I can use for my problems. I want to ask one question: for example, if we have a function with high non-linearity, is it possible to localize a mobile robot using the EKF by increasing the number of sensors? And is there any book you would recommend that can guide the implementation of such systems in MATLAB?
Not 100% sure, but I think if more of the same sensor (for example two magnetometers, one at the front and one at the back) can reduce the uncertainty of the observation (in this case the heading angle), then the new belief will depend more on the observation, so the error introduced by the prediction is reduced, since it contributes less to the final update. Therefore the overall estimate should be better than when using an observation from a single sensor of that kind.
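The weighting effect described here can be seen directly in a scalar Kalman correction step: the smaller the observation variance r (e.g. thanks to redundant sensors), the larger the gain and the more the posterior relies on the measurement. A generic sketch with made-up numbers:

```python
# Scalar Kalman correction step.
def correct(mu_pred, sigma2_pred, z, r):
    K = sigma2_pred / (sigma2_pred + r)  # Kalman gain
    mu = mu_pred + K * (z - mu_pred)
    sigma2 = (1.0 - K) * sigma2_pred
    return mu, sigma2

# Predicted heading 0.0 rad, measured heading 0.2 rad.
mu_single, _ = correct(0.0, 0.1, 0.2, 0.05)   # one magnetometer
mu_fused, _ = correct(0.0, 0.1, 0.2, 0.025)   # two fused -> half the variance
print(mu_single, mu_fused)  # the fused case ends up closer to the measurement
```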
Thank you for these videos, they are really helpful. You mentioned that Python will be used in the homework, but I cannot find the homework assignments anywhere. Is there any way I can reach the homework assignments?
I have now watched the complete part on the linear KF. What I don't understand is how the matrices A, B and C are determined/calculated in the first place. Could someone help me out? :)
A and B describe how your robot/vehicle moves, and C specifies how your sensors work. Thus, A, B, and C are robot-specific and need to be defined by the user.
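As a hypothetical illustration of how A, B, and C fall out of a concrete model (not from the lecture): take a 1D robot with state [position, velocity], an acceleration command u, and a sensor that measures position only:

```python
import numpy as np

dt = 0.1  # time step (made-up value)

A = np.array([[1.0, dt],     # position += velocity * dt
              [0.0, 1.0]])   # velocity carries over
B = np.array([[0.5 * dt**2], # effect of the acceleration command on position
              [dt]])         # ... and on velocity
C = np.array([[1.0, 0.0]])   # the sensor observes position, not velocity

# One prediction step of the linear model: x_t = A x_{t-1} + B u_t
x = np.array([[0.0], [1.0]])  # at position 0, moving at 1 m/s
u = np.array([[2.0]])         # accelerate at 2 m/s^2
x_pred = A @ x + B @ u
z_expected = C @ x_pred       # what the sensor should then read
print(x_pred.ravel(), z_expected.ravel())
```

So A and B come from discretising the motion equations, and C from writing down what the sensor physically measures as a function of the state.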
@@CyrillStachniss Can A change at each time step? As you stated in the example, A can encode information such as wind speed in the case of a UAV; I assume the predictive model can update it based on sensor information? Thanks for the lecture, professor!!
“Gaussian” in the sense explained would be understood as “well behaved”, meaning that if your “vehicle” is in the middle of a storm, non-linearities come in and the controls may not work as intended.