I am continually amazed by the quality and value of your videos as well as your papers and notes. They are not only very helpful but also enjoyable to read and watch! Thanks!
This is one of the best videos on the Laplacian on the entire internet. If you made a Laplacian/Fourier companion video, that would complete the cycle for me personally!!
Incredible content, it's rare to see such clear maths lectures online! I thought I had a good grasp of the Laplacian before, but this video blew my mind anyway! Thanks a lot and keep up the good work! :D
You'll find a link to a conference lecture on the course page. It's not missing, just not part of the playlist. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-4YHmaoQoT9s.html
According to the 2020 course site, the L19 content is covered in a video by Prof. Crane titled "Conformal geometry processing" posted under the account name "CG Group Telecom ParisTech". I won't risk posting a direct link, because RU-vid algorithms have recently been obsessed with removing comments containing any link. Searching with either group of keywords might be insufficient, but including both works for me.
Thank you for these amazing lectures, Dr. Crane! I was wondering: at the end of lecture 18 you mention the discrete Laplace operator, but I was not able to find Lecture 19 covering it. The link for Lecture 19 on your website points to a lecture on conformal mapping.
Very informative :) A simple practical application of the Graph Laplacian is to regularize mesh deformation, something I talk about in my own DGP playlist.
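The regularization idea mentioned above can be sketched as a uniform graph-Laplacian smoothing step: each vertex moves a fraction of the way toward the average of its neighbors. This is a minimal numpy sketch under that assumption; the example graph, step size `lam`, and iteration count are illustrative, not taken from the commenter's playlist:

```python
import numpy as np

def graph_laplacian(edges, n):
    # "Umbrella" sign convention: (L x)_i = sum over neighbors j of (x_j - x_i)
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, j] += 1.0
        L[j, i] += 1.0
        L[i, i] -= 1.0
        L[j, j] -= 1.0
    return L

def smooth(points, edges, lam=0.5, iters=10):
    """Pull each vertex toward the mean of its graph neighbors."""
    L = graph_laplacian(edges, len(points))
    deg = np.maximum(-np.diag(L), 1.0)      # vertex degrees (avoid divide-by-zero)
    for _ in range(iters):
        # Normalizing by degree makes the step a blend with the neighbor average
        points = points + lam * (L @ points) / deg[:, None]
    return points

# A noisy polyline: smoothing damps the zig-zag in the y-coordinates
pts = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, -1.0], [3.0, 1.0], [4.0, 0.0]])
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
out = smooth(pts, edges)
```

Used as a regularizer in mesh deformation, the same `L @ points` term is typically added as a penalty in a least-squares system rather than applied iteratively as here.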
Great animation around minute 12 about the deviation from the average. One nitpick: as epsilon -> 0 the deviation itself becomes smaller and smaller, so you have to divide by epsilon^2 to make it converge to something in that limit. This is made very clear on the next slide, the one with the average over a ball.
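The epsilon^2 scaling above is easy to check numerically. Below is a minimal numpy sketch, assuming the sphere-mean version of the formula in the plane, where the constant works out to 4/eps^2; the test function and evaluation point are arbitrary:

```python
import numpy as np

def sphere_average(f, x0, y0, eps, n=1000):
    """Mean of f over the circle of radius eps centered at (x0, y0)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return np.mean(f(x0 + eps * np.cos(t), y0 + eps * np.sin(t)))

def laplacian_estimate(f, x0, y0, eps):
    # In R^2, Laplacian(f) = lim (4 / eps^2) * (sphere average - f)
    return 4.0 / eps**2 * (sphere_average(f, x0, y0, eps) - f(x0, y0))

f = lambda x, y: x**2 + y**2   # true Laplacian is 4 everywhere
approx = laplacian_estimate(f, 1.3, -0.7, eps=0.01)
# approx ≈ 4, even though the raw deviation (average - f) is only ~eps^2
```

Without the 4/eps^2 factor, the deviation itself is about eps^2 = 1e-4 here, illustrating why the animation's quantity alone vanishes as eps -> 0.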
Amazing explanations. I really appreciate this so much. One thing I was wondering about is how you made all these visuals. For instance at 55:10 did you use Matlab to compute this, or do you use multiple different software? How do you make it look so nice?
Love these lectures and wish I had found them years ago! One minor "aargh!": what happened to Lecture 19? I came here for the discrete Laplacian, but it's the only missing lecture. So sad!
Awesome lecture. Is the Discrete Laplacian lecture unfortunately lost? The notes and suggested readings are great, but I would have liked to cement my understanding.
Here the inverse is already written explicitly, so upper indices would imply yet another inverse. In other words, g^{ij} = (g^{-1})_{ij}, and likewise, (g^{-1})^{ij} would be equal to ((g^{-1})^{-1})_{ij} = g_{ij}. Of course, the reason for explicitly writing g^{-1} here is that the upper/lower index convention can get confusing if you're not used to it! (You notice I also don't use the Einstein summation convention here. ;-))
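The index bookkeeping above can be sanity-checked numerically. A minimal numpy sketch, where the particular symmetric positive-definite matrix is just a stand-in for a metric:

```python
import numpy as np

# A sample metric g (any symmetric positive-definite matrix works)
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])
g_inv = np.linalg.inv(g)

# Upper indices g^{ij} are simply the entries of g^{-1}, so writing
# (g^{-1})^{ij} would mean entries of (g^{-1})^{-1}, i.e. g itself.
g_inv_upper = np.linalg.inv(g_inv)   # equals g again

# Raising an index without Einstein notation: v^i = sum_j g^{ij} v_j
v_lower = np.array([1.0, -2.0])
v_upper = g_inv @ v_lower
```

Lowering the raised index with g recovers the original components, which is exactly the consistency the upper/lower convention encodes.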
@@lucagagliano5118 Yep! You can find slides at the link in the video description. There are also some written course notes that go along with the slides. So the only thing that's missing is vocal narration.
@@keenancrane But your narration is still critical. Your explanations are really lucid and simple. I sincerely hope you can record and add Lecs 1-13 sometime soon after the semester. There are very few courses on this topic. Thanks a ton for making these available, though!!