Lie theory for the roboticist 

Institut de Robòtica i Informàtica Industrial, CSIC-UPC

Robotics & AI Summer School 2022
Lie theory for the roboticist
Joan Solà
www.iri.upc.ed...

Published: 18 Sep 2024

Comments: 27
@zhaoxingdeng5264 2 months ago
Very clear and useful. Thank you!
@RoboCodeHub 4 months ago
Thank you, Sir!
@fangbai8238 3 months ago
Thank you so much, Joan!
@yen-linchen7398 1 year ago
Thank you so much. It's really helpful and clear.
@puguhwahyuprasetyo3927 10 months ago
This video is amazing. Thank you Professor Sola
@mMaximus56789 2 years ago
I loved this course, probably the best introduction to Lie groups on this platform! Is there, by any chance, a possibility of a course such as this but on Riemannian manifolds?
@chineduecheruo8872 1 year ago
Thank you, Joan Solà!
@5ty717 10 months ago
Excellent
@mohammedtalha4649 1 year ago
Thanks a lot for this, man! Loved it.
@franciscodominguezmateos5528 6 months ago
Hi Joan, do you know about Geometric Algebra?
@joansola02 6 months ago
No, I never approached this topic...
@ninepoints5932 6 months ago
Objects in GA also have a natural Lie group and algebra structure related by the exp and log math shown here. Thanks for the presentation!
@PengfeiGuo-yn7hu 7 months ago
Thank you for sharing this great video, it's very helpful to me. And could I get the slides file?
@longfeihan2100 1 year ago
Very nice and comprehensive video! Thanks a lot! I'm wondering whether the link to the video in the last slide will be maintained. Currently it is not available.
@joansola02 10 months ago
All videos can be found by searching for "Lie theory for the roboticist" on YT. There are starting to be a few of them! They are all roughly the same, but not equal!
@Aleksandr_Kashirin 2 years ago
Very nice lecture! Could you please make these slides available for other viewers? Also, I have a question: Could you please emphasize the key differences between EKF and IEKF that you showed on the slides? Why do we want to use Lie Algebra in localization tasks, especially in EKF? Thank you!
@joansola02 2 years ago
What do you mean by IEKF? Invariant? Iterative? Information? Indirect? They are all possible choices. In the course, however, I don't remember referring to any of them. I suppose then that you refer to the ESKF, or error-state KF.

All ESKFs work with a nominal state and an error state. All Lie-based KFs are indeed ESKFs, because the error is defined in the tangent space. For example, let the state be a quaternion q ∈ S3 ⊂ R4. The tangent space is isomorphic to R3. Now, given a computed Kalman gain K, the updates on the state q for the EKF and the ESKF are:

EKF:  q_new = q + K * ( y - h(q) )                                        --- here dq = K * ( y - h(q) ) ∈ R4
ESKF: q_new = q * Exp( K' * ( y - h(q) ) ) = q (+) ( K' * ( y - h(q) ) )  --- here dq = K' * ( y - h(q) ) ∈ R3

So the updates are indeed quite different, but the shortcut (+) makes them look the same. Remark that K is for the EKF and K' is for the ESKF; they are not equal.
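
To make the two updates above concrete, here is a minimal numpy sketch of them, assuming a w-first (Hamilton) quaternion convention; the helper names quat_mul and quat_exp, the gains K4 and K3, and the innovation are made-up placeholders rather than values from the lecture:

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def quat_exp(theta):
    """Exp: rotation vector in R^3 -> unit quaternion [w, x, y, z]."""
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.array([1.0, *(0.5 * theta)])   # first-order approximation
    axis = theta / angle
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

q = np.array([1.0, 0.0, 0.0, 0.0])    # nominal quaternion state
innovation = np.array([0.01, -0.02])  # y - h(q), a placeholder 2-D measurement error
K4 = np.zeros((4, 2))                 # EKF gain: maps the innovation into R^4
K3 = np.zeros((3, 2))                 # ESKF gain: maps the innovation into the tangent R^3

# EKF: additive update in R^4; the result leaves S^3, so it must be renormalized.
q_ekf = q + K4 @ innovation
q_ekf /= np.linalg.norm(q_ekf)

# ESKF: the correction lives in the tangent space R^3 and is retracted with Exp,
# i.e. q (+) (K3 @ innovation) = q * Exp(K3 @ innovation).
q_eskf = quat_mul(q, quat_exp(K3 @ innovation))
```

The only structural difference is where the correction lives: in R4 (forcing a renormalization) for the EKF, and in the tangent space R3, retracted through Exp, for the ESKF.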
@urewiofdjsklcmx 1 year ago
Will it make a big difference in practice if I apply the (+) operator only for the "group variables" and keep the regular + for the remaining states (for instance also if I want to include sensor biases)?
@joansola02 1 year ago
@@urewiofdjsklcmx All variables that can be described as pertaining to R^n can be treated normally with a '+' sign. In fact, the R^n spaces are also Lie groups under addition, and the (+) operator in R^n boils down to the '+' operation. Even more, since R^n under addition is a commutative group, then left-(+) and right-(+) are both the same and equal to regular '+'.
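
As a small illustration of this point, here is a sketch (numpy/scipy, with made-up placeholder numbers) of a mixed update in which the SO(3) block is corrected with Exp and (+), while a bias block in R^3 is corrected with the ordinary '+':

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Mixed state: an SO(3) attitude R plus a gyro bias b living in R^3.
# All numerical values here are illustrative placeholders.
R = Rotation.from_rotvec([0.1, 0.2, -0.1]).as_matrix()   # group block
b = np.array([0.01, -0.02, 0.005])                       # R^3 block

# Tangent-space corrections coming out of some filter update (placeholders).
dtheta = np.array([1e-3, -2e-3, 5e-4])   # correction for the SO(3) block
db     = np.array([1e-4,  2e-4, -1e-4])  # correction for the bias block

# SO(3) block: R (+) dtheta = R * Exp(dtheta).
R_new = R @ Rotation.from_rotvec(dtheta).as_matrix()

# R^n block: R^n under addition is a commutative Lie group whose Exp is the
# identity map, so the (+) operator reduces to the ordinary '+'.
b_new = b + db
```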
@urewiofdjsklcmx 1 year ago
@@joansola02 Hmm, but this will decouple the error states, right? If I understood the invariant EKF (IEKF) by Barrau and Bonnabel correctly, they stay in the SE_n(3) group to define the error. Apparently this is more accurate, but I guess also quite complicated if you need to consider biases and other states...
@joansola02 1 year ago
@@urewiofdjsklcmx Yes, in this manner the errors are decoupled. The question is how much, and the answer is: not much. But again, "not much" might be too much depending on the application, objectives, and particular numeric values of the involved variables. The advantage of decoupling is that you have all the algebra you need for each one of the blocks. If you want a completely coupled state, then sometimes you will not have all the closed forms you need (the exponential map, the adjoint matrix and the right Jacobian being the 3 key elements for which you would like to have closed forms -- all the other forms can be deduced from these three).
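
For SO(3), those three closed forms are all short. The sketch below (plain numpy, with hypothetical function names skew, so3_exp, so3_right_jacobian, so3_adjoint) writes them out, using the Rodrigues formula for Exp, the standard closed-form right Jacobian, and the fact that the adjoint of a rotation is the rotation matrix itself:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]_x such that [v]_x @ w = v x w."""
    return np.array([[0.0, -v[2],  v[1]],
                     [v[2],  0.0, -v[0]],
                     [-v[1], v[0],  0.0]])

def so3_exp(theta):
    """Exponential map R^3 -> SO(3) (Rodrigues formula)."""
    a = np.linalg.norm(theta)
    W = skew(theta)
    if a < 1e-9:
        return np.eye(3) + W  # first-order approximation near the identity
    return np.eye(3) + (np.sin(a) / a) * W + ((1.0 - np.cos(a)) / a**2) * (W @ W)

def so3_right_jacobian(theta):
    """Closed-form right Jacobian J_r of SO(3)."""
    a = np.linalg.norm(theta)
    W = skew(theta)
    if a < 1e-9:
        return np.eye(3) - 0.5 * W  # first-order approximation near the identity
    return (np.eye(3)
            - ((1.0 - np.cos(a)) / a**2) * W
            + ((a - np.sin(a)) / a**3) * (W @ W))

def so3_adjoint(R):
    """Adjoint of an SO(3) element: simply the rotation matrix itself."""
    return R

theta = np.array([0.1, -0.2, 0.3])  # an arbitrary rotation vector
R  = so3_exp(theta)
Jr = so3_right_jacobian(theta)
Ad = so3_adjoint(R)
```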
@iamyouu 10 months ago
May I get a link to the slides? Thank you
@urewiofdjsklcmx 1 year ago
Let's say I have two processes that both estimate the same group element (e.g. an element of SO(3)), and for both I have an associated covariance. Then the covariances are defined in different tangent spaces (at the individual estimates), right? So in order to combine them, I somehow have to transform them with the Jacobians such that they are mapped to the same tangent space before I can combine them? The hypothetical application that I have in mind is two Kalman filters that estimate the same system. Before I watched the video I would have naively fused the two covariance matrices directly, which is apparently not the correct way...
@joansola02 1 year ago
It is unclear how you "combine" both estimates. If you provide the formulas for such a combination in the case of vector spaces, I can then hopefully guide you through the process of doing the equivalent thing in Lie groups.
@urewiofdjsklcmx 3 months ago
@@joansola02 OK, a bit late, my response: in vector spaces I can simply add the information matrices (inverse covariance matrices) of state estimates x_1 and x_2, like so: I_fused = I_1 + I_2. But if x_1 and x_2 belong to a Lie group, my understanding is now that I_1 and I_2 are defined in the tangent spaces at x_1 and x_2. So I guess I cannot just add them up like in a vector space? I probably need to first transform both I_1 and I_2 to some common tangent space and then add them afterwards?
@joansola02 2 months ago
@@urewiofdjsklcmx You are right. If the covariances or the info matrices are defined locally, then you have to combine them in the same reference space. You can use for this the adjoint operator, which transforms covariance matrices from one tangent space to another.

Let X and Y be two elements of the group, and let E be the identity. Let Ad_X be the adjoint at X, and Ad_Y that at Y. Now, Ad_XY = Ad_X.inv * Ad_Y transforms vectors from the tangent at Y to the tangent at X. You transform covariances as Q_X = Ad_XY * Q_Y * Ad_XY.transpose. You can easily sort out the equivalent conversion for info matrices.

You can also express all info matrices at the identity E. To do so, you do e.g. Q_E = Ad_X * Q_X * Ad_X.tr. You can then directly add I_X + I_Y = Q_X.inv + Q_Y.inv.

The conversions for the info matrices are easy: I_X = (Q_X).inv = (Ad_XY * Q_Y * Ad_XY.tr).inv = Ad_XY.inv.tr * I_Y * Ad_XY.inv
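
For the SO(3) case, where Ad_R is simply R, this recipe can be sketched in a few lines of numpy/scipy; the rotations and covariances below are made-up placeholders, and the covariances are assumed to be expressed in each element's local tangent space:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Two estimates of the same SO(3) element, each with a covariance defined in its
# own local tangent space (all numbers are illustrative placeholders).
R_X = Rotation.from_rotvec([0.10, 0.00, 0.00]).as_matrix()
R_Y = Rotation.from_rotvec([0.12, -0.01, 0.02]).as_matrix()
Q_X = np.diag([1e-3, 2e-3, 1e-3])   # covariance in the tangent space at X
Q_Y = np.diag([2e-3, 1e-3, 3e-3])   # covariance in the tangent space at Y

# For SO(3), the adjoint of an element R is the rotation matrix itself: Ad_R = R.
Ad_X, Ad_Y = R_X, R_Y
Ad_XY = Ad_X.T @ Ad_Y               # Ad_X^-1 * Ad_Y: tangent at Y -> tangent at X

# Express Y's covariance in the tangent space at X, then add information matrices.
Q_Y_at_X = Ad_XY @ Q_Y @ Ad_XY.T
I_fused = np.linalg.inv(Q_X) + np.linalg.inv(Q_Y_at_X)
Q_fused = np.linalg.inv(I_fused)

# Equivalent conversion done directly on the information matrix:
I_Y_at_X = np.linalg.inv(Ad_XY).T @ np.linalg.inv(Q_Y) @ np.linalg.inv(Ad_XY)

# Alternatively, express everything in the tangent space at the identity E:
Q_X_at_E = Ad_X @ Q_X @ Ad_X.T
Q_Y_at_E = Ad_Y @ Q_Y @ Ad_Y.T
I_fused_at_E = np.linalg.inv(Q_X_at_E) + np.linalg.inv(Q_Y_at_E)
```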