
Contravariant & Covariant Components of Vectors - An Introduction to the Metric Tensor 

Eddie Boyes
3.1K subscribers
13K views

In this video (GR - 04), we take the idea of one-dimensional Contravariant and Covariant vectors, and move on to thinking about TWO-dimensional space, and the vectors in that space having two types of components - again called ‘Contravariant’ and ‘Covariant’. This leads on to a simple introduction to the ‘Metric Tensor’. Along the way, the Einstein Summation Convention is introduced, which will be used from now on to reduce long equations to a much simpler-looking form.
This video is part of a series of videos on General Relativity (GR-01 to GR-20), which has been created to help someone who knows a little bit about “Newtonian Gravity” and “Special Relativity” to appreciate both the need for “General Relativity”, and for the way in which the ‘modelling’ of General Relativity helps to satisfy that need - in the physics sense.
The production of these videos has been very much a ‘one man band’ from start to finish (‘blank paper’ to ‘final videos’), and so there are bound to be a number of errors which have slipped through. It has not been possible, for example, to have them “proof-watched” by a second person. In that sense, I would be glad of any comments or corrections … though it may be some time before I get around to making any changes.
By ‘corrections and changes’ I clearly do not mean changes of approach. The approach is fixed - though some mistakes in formulae may have been missed in my reviewing of the final videos, or indeed some ‘approximate explanations’ may have been made which were not given sufficient ‘qualification’. Such changes (in formulae, equations and ‘qualifying statements’) could be made at some later date if they were felt to be necessary.
56:51 Correction - The column vector on the left hand side should read (downwards) V1, V2, V3
This video (and channel) is NOT monetised
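The index-lowering the video builds up to can be sketched numerically. Below is a minimal Python sketch (made-up 2D oblique basis, hypothetical numbers not taken from the video): the metric tensor g_mn = e_m · e_n converts contravariant components V^n into covariant components via V_m = g_mn V^n, and the result agrees with the projection V · e_m.

```python
import math

# Oblique 2D basis: e1 along x, e2 at 60 degrees to it
# (hypothetical numbers, not from the video).
e = [(1.0, 0.0), (0.5, math.sqrt(3) / 2)]

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

# Metric tensor: g_mn = e_m . e_n
g = [[dot(e[m], e[n]) for n in range(2)] for m in range(2)]

# Contravariant components V^n (expansion coefficients in the basis)
V_contra = [2.0, 1.0]

# Lower the index: V_m = g_mn V^n (Einstein summation written out as a sum)
V_co = [sum(g[m][n] * V_contra[n] for n in range(2)) for m in range(2)]

# Cross-check: covariant components are projections, V_m = V . e_m
V = [sum(V_contra[n] * e[n][i] for n in range(2)) for i in range(2)]
V_co_check = [dot(V, e[m]) for m in range(2)]
```

With this basis, g = [[1, 0.5], [0.5, 1]], and both routes give the same covariant components (2.5, 2.0) even though the contravariant components are (2, 1) - the two kinds of components only coincide for an orthonormal basis.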

Published: 12 Jan 2023

Comments: 33
@math2cool
@math2cool 9 months ago
What a gifted instructor. If you truly want to understand GR you must make this series a stop along your way. Thank you Eddie.
@LibertyAzad
@LibertyAzad 11 months ago
This is THE place to start studying GR. Then you can hit the books and other video lectures more profitably.
@jonathanlister5644
@jonathanlister5644 25 days ago
Just beyond belief! Wonderfully expressed. I love how you subtly hammer home the points to guide people away from the restrictions of geometry learned from a blackboard! “Be very careful what you put into that mind, for you will never get it out!”
@astronomy-channel
@astronomy-channel 11 months ago
A superb series of videos which progress in slow logical steps. Impressive!
@jesuscuadrado2291
@jesuscuadrado2291 6 months ago
The video is incredible, pure gold! The exposition is masterful! It is the best EVER introduction to contravariant and covariant components and the metric tensor (I have used many books and resources). I was reading the “Covariant Physics” (Moataz) book and I started to get lost in chapter 1.3, an easy chapter where the contravariant and covariant components are introduced along with the metric tensor in cartesian coordinates (the metric tensor is never mentioned by name; it appears as the Kronecker delta, used like a kind of magic term to convert between contravariant and covariant components). Only after watching the video have I truly understood the chapter. I am excited about these GR videos, and I have decided to study them carefully one by one. I have already seen the first one. I am a graduate in Physics who, after a long time, went back to studying certain topics to fill many gaps that were left during my degree. Thanks Professor Edward D Boyes for this precious resource.
@christianfunintuscany1147
@christianfunintuscany1147 6 months ago
I finally got the geometric interpretation of covariant and contravariant components … thank you!!!!
@r2k314
@r2k314 1 year ago
Your series is wonderful. If I had studied it when I first started, I would be much, much further along. Thank you. I wish I could figure out how to put this at the top of the intro-to-GR recommendations. You have no peer!
@ImranMoezKhan
@ImranMoezKhan 4 months ago
Small typographical error, I suspect: for the matrix equation at around 50:46, the right-hand side is the product of a 1x2 and a 2x2, which would produce a 1x2 row vector, but the left-hand side is a 2x1 column vector. I suppose the matrix of dot products should come first, followed by the contravariant components as a column vector.
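The shape fix this comment suggests can be checked with a tiny sketch (hypothetical numbers, not the matrices from the video): a 2x2 metric matrix times a 2x1 column gives a 2x1 column, matching the left-hand side, whereas a 1x2 row times the 2x2 gives a 1x2 row.

```python
# Minimal matrix-shape check (hypothetical numbers, not from the video).
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner, "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

g = [[1.0, 0.5],
     [0.5, 1.0]]          # 2x2 matrix of basis-vector dot products
v_col = [[2.0], [1.0]]    # contravariant components as a 2x1 column

# (2x2) @ (2x1) -> 2x1 column, matching a column vector on the LHS
lowered = matmul(g, v_col)

# The other order, (1x2 row) @ (2x2), instead gives a 1x2 row
v_row = [[2.0, 1.0]]
row_result = matmul(v_row, g)
```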
@manishankaryadav7307
@manishankaryadav7307 7 months ago
Hi Eddie, thank you for the videos. Everything (the content, the flow, the math, the questions, the voice, the presentation etc.) is fabulous. At time stamp 57:21, the third entry's subscript on the LHS needs to be 3. 🙂 Thank you once again, Mani
@maaspira
@maaspira 3 months ago
Thank you very much!
@jameskinnally4173
@jameskinnally4173 1 year ago
This is an excellent series. Thank you very much!
@victoriarisko
@victoriarisko 3 months ago
Beautiful instruction! Most enjoyable to learn quite sophisticated topics
@jimmyraconteur2522
@jimmyraconteur2522 1 month ago
fantastic professor
@messapatingy
@messapatingy 1 month ago
Remember Co-Low-Pro. Co: Covariant components. Low: Use lower indices. Pro: Represent projections onto coordinate axes.
@Rauf_Akbaba
@Rauf_Akbaba 9 months ago
Excellent, thank you
@Mouse-qm8wn
@Mouse-qm8wn 4 months ago
Eddie, you made my day😊🎉! What a great video. I am looking forward to seeing the whole series. I have a question: do you have a reference to a good GR book which contains problems and solutions for practice?
@eddieboyes
@eddieboyes 7 months ago
Thanks Mani - another one I hadn't spotted! I'll probably put a “correction” into the Video Description rather than upload a new corrected version (as that would re-start the counts etc. from scratch). Putting a “correction” into the Video Description ought to put a correction notice at the relevant time point (according to YouTube) … but I can't seem to get that to work at the moment. Thanks again anyway. Eddie
@christosgeorgiadis7462
@christosgeorgiadis7462 8 months ago
This is a great exposition of the subject, thank you! Believe me, I have tried a lot of others ...
@miguelaphan58
@miguelaphan58 4 months ago
A master explanation!!!
@hasnounimohamed4710
@hasnounimohamed4710 5 months ago
You make it very easy to understand, thank you.
@forheuristiclifeksh7836
@forheuristiclifeksh7836 3 months ago
1:00
@djordjekojicic
@djordjekojicic 11 months ago
Beautiful explanations and examples. I've seen a lot of GR videos, but this series is one of the best. I have one question though: why are the covariant vector components with lower indices presented on the same axes, when they belong to the dual vector basis, which is different in such a way that each dual basis vector is perpendicular to the original basis vectors with different indices?
@jesuscuadrado2291
@jesuscuadrado2291 6 months ago
You are right, it is probably a simplification to not introduce dual basis vectors
@jesuscuadrado2291
@jesuscuadrado2291 6 months ago
In any case, you can also have the covariant components with respect to the contravariant basis: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-nNMY02udkHw.htmlsi=l0YONDtcKOnksqLt&t=468
@sagsolyukariasagi
@sagsolyukariasagi 2 months ago
Should the basis vectors always come from the covariant bases?
@kevincleary627
@kevincleary627 1 year ago
Great videos. Is there a playlist so that I can watch them sequentially? Thanks!
@r2k314
@r2k314 1 year ago
Thank You!
@RBRB-hb4mu
@RBRB-hb4mu 3 months ago
Black background with light letters please, it makes it easier to learn.
@2pizen
@2pizen 3 months ago
Second that!
@user-gx8xs4ib4u
@user-gx8xs4ib4u 8 months ago
1:06:52 Isn’t it a bit early to name g a tensor? We know :) that not every object with indices is a tensor, right?
@jameshopkins3541
@jameshopkins3541 11 months ago
You haven't explained it!!!!! So NO LIKE
@Altalex988
@Altalex988 6 months ago
Thanks for the series! One question: does the proof at 32:25 hold only if the basis vectors are unit vectors, so that otherwise the general formula is V_m = g_mn V^n? Thanks, bye
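The general formula in this question can be checked numerically: with a non-unit basis the covariant and contravariant components differ, yet V_m = g_mn V^n (with g_mn = e_m · e_n) still reproduces the projections V · e_m. A minimal sketch with hypothetical numbers, not taken from the video:

```python
# Non-unit, orthogonal basis (hypothetical numbers, not from the video):
# e1 has length 2, e2 has length 1.
e = [(2.0, 0.0), (0.0, 1.0)]

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

# Metric tensor g_mn = e_m . e_n; here it is [[4, 0], [0, 1]]
g = [[dot(e[m], e[n]) for n in range(2)] for m in range(2)]

V_contra = [1.0, 1.0]   # so V = 1*e1 + 1*e2 = (2, 1)
V = [sum(V_contra[n] * e[n][i] for n in range(2)) for i in range(2)]

# Lower the index with the metric: V_m = g_mn V^n
V_co = [sum(g[m][n] * V_contra[n] for n in range(2)) for m in range(2)]

# Covariant components as dot products with the (non-unit) basis vectors
V_proj = [dot(V, e[m]) for m in range(2)]
```

Here V_co and V_proj both come out as (4, 1), while the contravariant components are (1, 1): the metric, not a unit-vector assumption, is what links the two kinds of components.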
@JoseAntonio-ml8yg
@JoseAntonio-ml8yg 2 months ago
Thank you very much!