
Basis vectors and the metric tensor 

Tensor Calculus - Robert Davie
10K subscribers
44K views

This video goes through the process of deriving the basis vectors in an arbitrary coordinate system and then looks at how these basis vectors are related to the metric for the space.
Correction: At 12:15 the formula for the angle between two vectors should NOT have a square root in the numerator. The correct form is:
Cos(𝜃) = g_ij u^i v^j / [Sqrt(g_ij u^i u^j) Sqrt(g_ij v^i v^j)]
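As a quick numerical check of the corrected formula, here is a minimal Python sketch; the metric (the polar-coordinate metric diag(1, r^2) evaluated at r = 2) and the vector components are illustrative assumptions, not values from the video:

import numpy as np

g = np.array([[1.0, 0.0],
              [0.0, 4.0]])                 # example metric g_ij (polar coordinates at r = 2)
u = np.array([1.0, 0.5])                   # contravariant components u^i
v = np.array([0.3, 1.0])                   # contravariant components v^j

inner_uv = np.einsum('ij,i,j->', g, u, v)          # g_ij u^i v^j (no square root here)
norm_u = np.sqrt(np.einsum('ij,i,j->', g, u, u))   # Sqrt(g_ij u^i u^j)
norm_v = np.sqrt(np.einsum('ij,i,j->', g, v, v))   # Sqrt(g_ij v^i v^j)

cos_theta = inner_uv / (norm_u * norm_v)
print(cos_theta)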

Published: 17 Aug 2024

Comments: 54
@scottydscottd 6 years ago
Hell yesss. Clearest and most logical exposition on YouTube. Reasonable definitions, etc. This is a gold mine. Thank you!
@TensorCalculusRobertDavie 6 years ago
Thanks for your comment. Much appreciated!
@abcdef-ys1sb 6 years ago
I was looking for this kind of explanation for a long time
@TensorCalculusRobertDavie 6 years ago
Thank you for your comment.
@logansimon6653 4 years ago
Honestly, a very competent run through. Thanks!
@TensorCalculusRobertDavie 4 years ago
Hello Logan and thank you for your comment. Much appreciated!
@logansimon6653 4 years ago
@@TensorCalculusRobertDavie Hi, you are very welcome. I browsed through some of your other titles just now, and I am excited to see a rich source of mathematics of my most favorite type. Would you mind if I cite you as a source in the text I am writing on general relativity (with a rigour in tensor calculus and differential/Riemannian geometry) -- especially for any instances that I am inspired to add to my work because of your content? If you would like, this is a hyperlink to my document. drive.google.com/open?id=1-MU7daeZ0Q8TefNOwzImcGD2uZIhktvZl3FO13R5UkQ Thank you for the content!
@TensorCalculusRobertDavie 4 years ago
You are welcome to cite my material and good luck with your efforts.
@marinajacobo3550 5 years ago
Thank you Robert! I really enjoyed this video.
@TensorCalculusRobertDavie 5 years ago
Hello Marina and thank you for your comment. Much appreciated.
@theboombody 2 years ago
I like the ad placements on these videos. "Are you struggling with calculus?" If you're watching a video on curvature and differential geometry, then no, you're not struggling with calculus. You're struggling with something far beyond.
@TensorCalculusRobertDavie 2 years ago
Yes, a bit ironic. I hope there aren't too many ads?
@theboombody 2 years ago
@@TensorCalculusRobertDavie No, it's not too bad. That's the price of posting stuff on youtube. They can put ads in your stuff and there's nothing you can do about it except not post videos. But I think it's a small price to pay for the freedom of being able to post mathematical content. I'm pretty grateful for youtube both as a viewer and as a poster.
@marinajacobo3550 5 years ago
Thank you! I really enjoyed this explanation :)
@g3452sgp 6 years ago
The images at 1:59 and at 3:20 are good. They are well organized and help us to get the whole picture of the underlying concept. Excellent! Thanks a lot.
@TensorCalculusRobertDavie 6 years ago
Thank you again!
@dansaunders6957 4 years ago
What happens to the position vector when working with a manifold? How does one typically define a basis without a position vector?
@TensorCalculusRobertDavie 3 years ago
Please have a look at the first few minutes of this video.
@user-ox9fg8wd9j 4 years ago
5:12 thank you so much
@TensorCalculusRobertDavie 4 years ago
You're welcome!
@davidprice1875 7 years ago
Very clear and precise summary.
@TensorCalculusRobertDavie 7 years ago
Thank you David.
@TensorCalculusRobertDavie 7 years ago
Hello Sjaak, the content covered here does assume some prior knowledge of vector calculus. The main point of the video is the two forms of basis vectors that can be formed, so could I suggest that a good starting point would be to focus on the meaning of the diagrams before moving on to the notation and what it is trying to express. Hope that helps?
@user-si1zn3ir7x 3 years ago
Is linear algebra needed (I mean in a rigorous way, starting from defining vector spaces and dual spaces and so on...) to fully understand tensors and General Relativity? Some textbooks were pretty hard to read since they start from a very abstract point of view, not even mentioning differentials or chain rules from calculus. I really enjoyed the video, by the way; I really appreciate it. Thank you!
@TensorCalculusRobertDavie 3 years ago
Hello and thank you for your comment. The answer is no, because this video provides you with a basic introduction to basis vectors and one-forms (the objects with raised indices). However, the more you learn the better, so do continue to study linear algebra if you can. Thank you for the feedback and good luck with your studies.
@user-si1zn3ir7x 3 years ago
@@TensorCalculusRobertDavie Thank you!
@benedekjotu266 5 years ago
Excellent presentation. In general, what is the punch line for working with both covariant and contravariant coordinates? They both represent the same objects, and the metric tensor is usually at hand anyway. At first it seems an unnecessary complication on the way to general relativity. How come they didn't just go with one or the other and leave the other as a fun-fact side note? Thanks
@TensorCalculusRobertDavie 5 years ago
Hello Benedek and thank you for your question. Wikipedia discusses this issue in the quote below and further in the link below that. "The vector is called covariant or contravariant depending on how the transformation of the vector's components is related to the transformation of coordinates. Contravariant vectors are "regular vectors" with units of distance (such as a displacement) or distance times some other unit (such as velocity or acceleration). For example, in changing units from meters to millimeters, a displacement of 1 m becomes 1000 mm. Covariant vectors, on the other hand, have units of one-over-distance (typically such as gradient). For example, in changing again from meters to millimeters, a gradient of 1 K/m becomes 0.001 K/mm." www.wikiwand.com/en/Covariance_and_contravariance_of_vectors
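A minimal sketch of the behaviour described in the quote, assuming a one-dimensional change of units from metres to millimetres (the numbers are only illustrative):

scale = 1000.0                                 # dx'/dx for metres -> millimetres

displacement_m = 1.0                           # contravariant component: a 1 m displacement
displacement_mm = scale * displacement_m       # becomes 1000 mm (scales with dx'/dx)

gradient_K_per_m = 1.0                         # covariant component: a 1 K/m temperature gradient
gradient_K_per_mm = gradient_K_per_m / scale   # becomes 0.001 K/mm (scales with dx/dx')

print(displacement_mm, gradient_K_per_mm)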
@dsaun777 5 years ago
@@TensorCalculusRobertDavie So it doesn't matter whether you use contravariant or covariant components; you just use whichever is most convenient for the transformation at hand?
@hariacharya5533 6 years ago
Good presentation. You explain nicely.
@TensorCalculusRobertDavie 6 years ago
Thank you.
@nicolecui3214 4 years ago
Hi, thanks for the video, but why is every vector written with covariant components on a contravariant basis, and vice versa? Intuitively, I thought the components and the basis would be of the same kind?
@TensorCalculusRobertDavie 4 years ago
The two bases are distinct, hence the upper and lower indices, and they behave in different ways, unlike in Euclidean space where they really are just the same thing, hence there is no reason to raise or lower indices. Sorry for the short answer. Have a look at this article: en.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors and this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-CliW7kSxxWU.html In General Relativity we use a metric to raise and lower these indices that is not the same as the Euclidean metric.
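A minimal sketch of the raising and lowering described in this reply, assuming the polar-coordinate metric diag(1, r^2) at r = 3 and an arbitrary example vector:

import numpy as np

r = 3.0
g = np.array([[1.0, 0.0],
              [0.0, r**2]])           # g_ij, not the identity, so the two sets of components differ
g_inv = np.linalg.inv(g)              # g^ij

v_up = np.array([2.0, 0.5])           # contravariant components v^i
v_down = g @ v_up                     # lower the index: v_i = g_ij v^j  -> [2.0, 4.5]
v_up_again = g_inv @ v_down           # raise it back:   v^i = g^ij v_j  -> [2.0, 0.5]

print(v_down, v_up_again)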
@nicolecui3214 4 years ago
@@TensorCalculusRobertDavie Thank you for the reply, will take a look! :)
@parthvarasani495 23 days ago
12:24, u · v = g_ij u^i v^j, not the square root of it, I think (in the numerator).
@TensorCalculusRobertDavie 22 days ago
You are right. Thank you for spotting that.
@parthvarasani495 22 days ago
@@TensorCalculusRobertDavie Thank you for all your efforts, highly appreciated 👍👏👏
@vicentematricardi3596 6 years ago
Your videos are very good!!!!!
@TensorCalculusRobertDavie 6 years ago
Vicente Matricardi Many thanks.
@vicentematricardi3596 6 years ago
Thank you for producing and sharing such good-quality information. I am writing to you in Spanish because I like you to know that many people are interested in these topics. Best regards!!!!
@TensorCalculusRobertDavie 6 years ago
Vicente Matricardi Thanks Vicente.
@vicentematricardi3596 6 years ago
Thanks, Robert Davie
@garytzehaylau9432 4 years ago
Excuse me, what is ∇_i u at 12:53? Actually, this notation is not clear: why is g^ij ∇_j u e_j = n? Could you explain it to me? Thanks for your great videos; I would recommend them to other people.
@TensorCalculusRobertDavie 4 years ago
Hello Gary and thank you for your question. The inverse metric is the g^ij part, and nabla u is the derivative whose components give the rate of increase of the scalar u in each of the directions j. The inverse metric raises the j index on the result of nabla u so that we obey the Einstein summation convention and don't end up with two j's down below. We CANNOT have (nabla u)j e_j, but we can and must have (nabla u)^j e_j. Hope that helps?
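A minimal sketch of this index-raising step, assuming polar coordinates with g = diag(1, r^2) and the example scalar u = r^2 sin(theta) (both are illustrative choices, not taken from the video):

import numpy as np

r, theta = 2.0, 0.7

# covariant components of the gradient: the partial derivatives (nabla u)_j = du/dx^j
du_dr = 2.0 * r * np.sin(theta)       # du/dr      for u = r^2 sin(theta)
du_dtheta = r**2 * np.cos(theta)      # du/dtheta
grad_down = np.array([du_dr, du_dtheta])

g_inv = np.diag([1.0, 1.0 / r**2])    # inverse metric g^ij for polar coordinates
grad_up = g_inv @ grad_down           # contravariant components (nabla u)^i = g^ij (nabla u)_j

print(grad_up)                        # these are the components that multiply the basis vectors e_i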
@abhishekrai1204 4 years ago
Thanks sir
@TensorCalculusRobertDavie 4 years ago
You're welcome.
@rontoolsie 7 years ago
At 11:45, line 3 should end up as u(covariant)V(contravariant). Otherwise this is an excellent presentation.
@TensorCalculusRobertDavie 7 years ago
Hello Ron, thank you for your comment, and you are correct; however, in this case we have u(covariant)v(contravariant) = u(contravariant)v(covariant), which was the point I was trying to show across lines 3 and 4. The point here is that there are four different-looking ways to get the same result. At the time I ummed and ahhed about whether I should write it in the form you have pointed out, but my goal took precedence in the end.
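A minimal numeric check that the four ways of writing the inner product agree, assuming an arbitrary example metric diag(1, 4) and example components:

import numpy as np

g = np.diag([1.0, 4.0])               # example metric g_ij
g_inv = np.linalg.inv(g)              # inverse metric g^ij
u_up = np.array([1.0, 2.0])           # u^i
v_up = np.array([3.0, -1.0])          # v^i
u_down = g @ u_up                     # u_i = g_ij u^j
v_down = g @ v_up                     # v_i = g_ij v^j

print(np.einsum('ij,i,j->', g, u_up, v_up))            # g_ij u^i v^j
print(u_down @ v_up)                                   # u_i v^i
print(u_up @ v_down)                                   # u^i v_i
print(np.einsum('ij,i,j->', g_inv, u_down, v_down))    # g^ij u_i v_j  -- all four print -5.0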
@zoltankurti 5 years ago
At the beginning of the video, you have to assume that the coordinate transformation and its inverse are also differentiable.
@TensorCalculusRobertDavie 5 years ago
Thank you Zoltan, that is a good point about differentiability, I should have mentioned it at the beginning.
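For reference, a short statement of the assumption being discussed, written as a LaTeX sketch (the notation r for the position vector and the primed coordinates are assumptions for illustration, not taken directly from the video):

% The coordinate change x^i = x^i(x'^1, ..., x'^n) and its inverse are assumed
% differentiable with a non-singular Jacobian, so the transformed basis vectors exist:
\[
  \mathbf{e}'_j
  = \frac{\partial \mathbf{r}}{\partial x'^{\,j}}
  = \frac{\partial x^{i}}{\partial x'^{\,j}}\,\mathbf{e}_i,
  \qquad
  \det\!\left(\frac{\partial x^{i}}{\partial x'^{\,j}}\right) \neq 0 .
\]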
@anthonysegers01 6 years ago
GREAT JOB!!!
@TensorCalculusRobertDavie 6 years ago
Thank you Anthony.
@rontoolsie 7 years ago
Correction... line 4.
@TensorCalculusRobertDavie 4 months ago
Which slide?