
The Covariant Derivative (and Christoffel Terms) 

Eddie Boyes
3.2K subscribers · 9K views

In this video (GR - 07), the idea of the “Covariant Derivative” is introduced. As a means of trying to ‘justify’ it, a one-dimensional ‘proof’ (of sorts) is offered and worked through. This is then used to try to make the two-dimensional version (and hence the multi-dimensional version) of the Covariant Derivative, with its associated Christoffel Symbols, acceptable.
Inevitably, the final versions of these equations are in “Einstein Summation” form, and so, for the benefit of those who are coming to this for the first time, the video ends with examples of how to expand such equations and ‘see into them’ exactly what they mean.
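For reference, the covariant derivative of a contravariant vector field, written in the Einstein summation form the video works towards, is conventionally (the index letters here are illustrative and may differ from those used on screen):

\[
\nabla_m V^n \;=\; \partial_m V^n \;+\; \Gamma^{n}_{\;mr}\, V^{r},
\]

where the repeated index r is summed over, \partial_m V^n is the ordinary partial derivative, and the Christoffel-symbol term accounts for the change of the basis vectors from point to point.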
This video is part of a series of videos on General Relativity (GR-01 to GR-20), which has been created to help someone who knows a little bit about “Newtonian Gravity” and “Special Relativity” to appreciate both the need for “General Relativity”, and for the way in which the ‘modelling’ of General Relativity helps to satisfy that need - in the physics sense.
The production of these videos has been very much a ‘one man band’ from start to finish (‘blank paper’ to ‘final videos’), and so there are bound to be a number of errors which have slipped through. It has not been possible, for example, to have them “proof-watched” by a second person. In that sense, I would be glad of any comments for corrections … though it may be some time before I get around to making any changes.
By ‘corrections and changes’ I clearly do not mean changes of approach. The approach is fixed - though some mistakes in formulae may have been missed in my reviewing of the final videos, or indeed some ‘approximate explanations’ may have been made which were not given sufficient ‘qualification’. Such changes (in formulae, equations and ‘qualifying statements’) could be made at some later date if they were felt to be necessary.
This video (and channel) is NOT monetised

Published: 17 Aug 2024

Comments: 21
@arezaajouneghani3082 · 7 months ago
The most comprehensive and lucid lecture on Christoffel symbols that I've encountered on the internet: extremely didactic and profoundly stimulating for delving deeper into this subject. Undoubtedly, a superb teacher!
@jimgolab536 · 9 months ago
I very much like your approach of starting in 1D and discussing all the pieces, and only then adding one more dimension.
@skbhatta1 · 9 months ago
The best exposition of the covariant derivative that I have seen so far. Looking forward to seeing the rest. Thanks a lot.
@christianfunintuscany1147 · 6 months ago
Thanks again for this precious lecture
@duycuongnguyen227 · 4 months ago
Excellent explanations!!!
@eustacenjeru7225 · 9 months ago
The lecture has improved my understanding of the covariant derivative.
@BLEKADO · 9 months ago
MARAVILLOSO, MARVELOUS, MERVELLEUSE, MIRINDA.
@christianfunintuscany1147 · 6 months ago
The interpretation of the metric tensor entries was impressive! Thank you!
@eddieboyes · 1 year ago
The original video GR-07 was ‘published’ in January 2023 and by July 2023 had received 969 views and 26 likes. It had also received the comments below. It was then replaced by the current GR-07 (July 2023). Hopefully, some of the minor issues mentioned in the comments below have been addressed in this newer version.

@lowersaxon, July 2023: Brilliant. Didactically the best exposition possible for all beginners, imho. Isn't that Christoffel symbol a gamma and not a lambda?

@darkangel105100, June 2023: Can it also be expressed in matrix form?

@aashraykannan5027, June 2023: Eddie - really appreciated this video; it helped me work through some GR roadblocks.

@aashraykannan5027, May 2023: Eddie, I think there is an error in the Einstein summation of the covariant derivative of covariant vectors, the one with the minus symbol, at about 43:48. The upper index of the Christoffel symbol should not be the "r" but the "n"; the "r" should appear in the lower indices. Nevertheless, I am very grateful for your videos; they help a lot!

@r2k314, April 2023: Thank you very much. This really helps a novice get oriented. This is also the first time I've seen the idea behind the metric formula for the Christoffel symbols. Again, thank you for your time and efforts.
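For readers checking the index pattern referred to in @aashraykannan5027's comment: in the standard form of the covariant derivative of a covariant vector, the summed (dummy) index appears once as the upper index of the Christoffel symbol and once as the lower index of the contracted vector, while both free indices sit in the lower slots. Using the letters of that comment (n as the dummy index, m and r free), this reads

\[
\nabla_m V_r \;=\; \partial_m V_r \;-\; \Gamma^{n}_{\;mr}\, V_n ,
\]

though the choice of letters is arbitrary; only the upper/lower pattern matters.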
@l-erusal · 7 months ago
Thank you for a brilliant lecture. Just one small correction (probably): I noticed that you called g11 a "scalar", but a scalar is the same in all coordinate systems, and g11 will be different.
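To expand on this point: an individual component such as g_{11} cannot be a scalar, because the metric components carry two lower indices and therefore transform with two Jacobian factors under a change of coordinates (the standard rank-2 covariant transformation law; primes denote the new coordinates):

\[
g'_{\mu\nu} \;=\; \frac{\partial x^{\alpha}}{\partial x'^{\mu}}\,\frac{\partial x^{\beta}}{\partial x'^{\nu}}\; g_{\alpha\beta},
\]

so g'_{11} generally differs from g_{11}, whereas a genuine scalar such as ds^2 = g_{\mu\nu}\,dx^{\mu} dx^{\nu} has the same value in every coordinate system.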
@palfers1 · 7 months ago
Really good. I was however disappointed that you chickened out of deriving the full expression for the Christoffels.
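For readers who want it written out: the full expression for the Christoffel symbols in terms of the metric (the standard metric-compatible, Levi-Civita connection, which is presumably what the video is building towards) is

\[
\Gamma^{\lambda}_{\;\mu\nu} \;=\; \tfrac{1}{2}\, g^{\lambda\sigma}\left(\partial_{\mu} g_{\nu\sigma} \;+\; \partial_{\nu} g_{\mu\sigma} \;-\; \partial_{\sigma} g_{\mu\nu}\right),
\]

where g^{\lambda\sigma} is the inverse metric and the repeated index σ is summed over.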
@thevegg3275 · 4 months ago
A fundamental question about the V^r in the red circle at 5:57 / 6:33. Aside from the fact that r is a dummy index, why not also call it V^n, making it super clear that you have V^n on both sides of the equation? And sure, it would then stand for both the measured component and the dummy index. But why would that be any more complex, when you would have fewer variables?
@yancymuu4977 · 1 year ago
Thanks for the really good videos. I am still struggling with covariant and contravariant vectors. Some questions: if we generate the covariant components of a vector, does that make it a covariant vector? Can the gradient vector be expressed by contravariant components? If I draw out the covariant components of some vector, it would seem like increasing the size of the basis vectors would decrease the size of the vector components, just as with a contravariant vector. How have I got this wrong? Thanks again for the great content.
@thevegg3275 · 1 year ago
"If I draw out the covariant components of some vector, it would seem like increasing the size of the basis vectors would decrease the size of the vector components just as with a contravariant vector." I had the same question. Contravariant: if the basis vectors go up, the components go down. Covariant: if the basis vectors go up, the components go up. It is either because the basis vectors on the covariant axes are 1, which would lead to contravariance, or both! Someone please correct me if I'm wrong.
@gso.astrowe · 8 months ago
@thevegg3275 As I understand it, the covariant component is essentially telling you how to modify the length of the actual vector, e.g. given a unit vector, 5 × 1 would essentially be a vector five times the length of the unit vector. Thus, this grows or shrinks in the same way that you modify the vector. The contravariant component is telling you how the underlying space is modified; think of it as "number of lines pierced". This grows or shrinks opposite to the vector. So, again, given a unit vector modified by a contravariant component of 5, you are essentially saying that the single vector passes through 5 unit measurements, e.g. lines. In both cases, the vector is still the same size; we have just changed how we define the coordinate system. If we change the units for measuring the vector, it is covariant; if we change the units of the field, it is contravariant. Someone feel free to correct me if I am wrong.
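One compact way to see the two scaling behaviours being discussed in this thread, using the standard definitions (with \vec{e}_i the basis vectors of some coordinate system):

\[
\vec{V} \;=\; V^{i}\,\vec{e}_i \quad\text{(contravariant components)},
\qquad
V_i \;=\; \vec{V}\cdot\vec{e}_i \quad\text{(covariant components)}.
\]

If every basis vector is doubled, \vec{e}_i \to 2\vec{e}_i, then keeping the vector \vec{V} itself fixed forces V^i \to V^i/2 (the components vary contrary to the basis), while V_i = \vec{V}\cdot\vec{e}_i \to 2V_i (the components vary with the basis); that opposite scaling is what the names 'contravariant' and 'covariant' are meant to capture.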
@thevegg3275 · 1 year ago
Love this! At 6:22, in the second term on the rhs, why isn't V^r named V^n so that it matches the ∂V^n in the first term? They are the same coordinates; it's just that the first term is the partial derivative and the second term is not. Thanks!
@hershyfishman2929 · 6 months ago
The r's in each factor of that term (upper index on V, lower index on Γ) are dummy indices. They disappear after being summed over, and what is left is n upper, m lower, just as on the lhs.
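A short worked expansion may make the dummy index concrete. In two dimensions, summing r over 1 and 2 (with the same index letters as in the discussion above):

\[
\nabla_m V^n \;=\; \partial_m V^n + \Gamma^{n}_{\;mr} V^{r}
\;=\; \partial_m V^n + \Gamma^{n}_{\;m1} V^{1} + \Gamma^{n}_{\;m2} V^{2},
\]

so no r survives on the right-hand side; only the free indices n (upper) and m (lower) remain, matching the left-hand side.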
@thevegg3275 · 5 months ago
@hershyfishman2929 Thank you! Is there any connection between the covariant vector (formed by parallel projection on a graph, versus the contravariant vector formed by perpendicular projection) and a tensor's covariant indices? I know the maths of finding the covariant components of a vector but have no clue how it relates to tensors.
@hershyfishman2929 · 5 months ago
@thevegg3275 Yes, a rank-1 tensor is a vector, and the position of its indices (upper vs. lower) indicates whether it is co- or contravariant. The same idea extends to higher-rank tensors. To be clear, a vector is an object with a magnitude and direction, independent of coordinates, but every vector can be represented by either covariant or contravariant components (different sets of numbers), each with the appropriate type of basis vectors. I believe, though I could be wrong, that this aspect does not extend to tensors of higher rank, which are more complex objects and are not invariant. [Update: in Sean Carroll's textbook on GR, p. 26, he says that "tensors generally have a 'natural' definition independent of the metric".]
@thevegg3275 · 5 months ago
Ah. So maybe there is no connection between a rank-1 vector and a rank-1 tensor?
@hershyfishman2929 · 5 months ago
@thevegg3275 No, a vector is a rank-1 tensor.
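For completeness, the transformation laws that make a vector a rank-1 tensor in either description (one Jacobian factor per index; primes denote the new coordinates):

\[
V'^{\mu} \;=\; \frac{\partial x'^{\mu}}{\partial x^{\nu}}\, V^{\nu}
\quad\text{(contravariant components)},
\qquad
V'_{\mu} \;=\; \frac{\partial x^{\nu}}{\partial x'^{\mu}}\, V_{\nu}
\quad\text{(covariant components)},
\]

and a higher-rank tensor simply carries one such factor for each of its indices.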