
SVD: Image Compression [Matlab] 

Steve Brunton
358K subscribers
80K views

This video describes how to use the singular value decomposition (SVD) for image compression in Matlab.
Book Website: databookuw.com
Book PDF: databookuw.com/...
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com...
Brunton Website: eigensteve.com
This video was produced at the University of Washington
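For reference, here is a minimal sketch of the rank-r compression workflow the video describes. It assumes Matlab with the Image Processing Toolbox, an image file named 'dog.jpg' on the path, and an illustrative rank r = 20; the file name and rank are assumptions, not necessarily the values used on screen.

A = imread('dog.jpg');            % load the image
X = double(rgb2gray(A));          % grayscale image as a numeric matrix
[U,S,V] = svd(X,'econ');          % economy-size SVD

r = 20;                                      % truncation rank (illustrative)
Xapprox = U(:,1:r)*S(1:r,1:r)*V(:,1:r)';     % best rank-r approximation
imagesc(Xapprox), axis off, colormap gray
title(['r = ',num2str(r)])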

Published: 21 Aug 2024

Comments: 85
@AlessandroBottoni 3 years ago
Absolutely amazing! This video series should be considered a "national treasure" and kept safe at the Library of Congress. Congratulations, Prof. Brunton, and many thanks for your valuable work.
@Eigensteve 2 years ago
Thank you so much!!
@ariffaridi6699 2 years ago
Very, very great lecture series. In my opinion, what humans/students need most (water, air, food, and lecture series like this) should be available to all.
@matheusparanahiba1057 1 year ago
I'm learning Recommender Systems, so I came across the concept of SVD and searched for a video to better understand it. I couldn't be happier to find this amazing series, which has been most helpful and very enriching. Thank you so much, prof. Regards from Brazil :)
@rudypieplenbosch6752 1 year ago
Wow, not only does he provide us with brilliant lessons, he even shares his book in PDF form.
@AnimationsJungle 3 years ago
Sir, you are simply the best. I have learned a lot from your lectures. You are becoming my role model in my PhD program. Lots of love from Kashmir.
@Eigensteve 3 years ago
Thank you so much for the very kind words!!
@SandeepSingh-yx2si 4 years ago
Amazing, Steve. You have really helped me understand SVD applications in data science. Thanks a lot. I really wish you could make your videos downloadable.
@raviprakash1278 4 years ago
Excellent lectures. I was having trouble understanding SVD. This lecture helped me a lot. Thank you very much for uploading.
@Eigensteve 4 years ago
Glad it was helpful!
@rasher939 2 years ago
Excellent lecture series!!! This is really inspiring and probably the best lecture series ever. It totally transforms the way we look at the SVD and its applications in real life. Thank you for your efforts and passion in creating such lovely teaching materials, including the other lecture series on machine learning, control systems, and data-driven dynamical models.
@BoZhaoengineering 4 years ago
Thank you for your videos and PDF book. Data science is now everywhere. Your videos and the accompanying book are my go-to resources whenever I need a particular mathematical technique. I work in the wind power sector as a structural/mechanical engineer (and of course a math lover). Load simulation, various kinds of vibration, and aerodynamic topics such as turbulence are what I work on daily. Cheers,
@hongwang6778 4 years ago
Dr. Brunton, thanks very much for your excellent lectures!
@franciscogaray2129 3 years ago
Your way of teaching is simply spectacular... you are a great teacher. Greetings from Perú, South America.
@cxxocm 2 years ago
I was trying to understand PCA and found this amazing series through Google. Thanks, Dr. Brunton. Not only are the contents and explanations stunning, but the technology used in the lectures is also fabulous. My only complaint is that sometimes I couldn't focus because I kept wondering how Dr. Brunton could write in reverse. What fancy technology was he using? :)
@jacobanderson5693 4 years ago
Thanks for posting these. Definitely buying your book!
@syoudipta 1 year ago
With just 5 modes, you can get a "Ruff" estimate!
@woodworkingaspirations1720 9 months ago
Always a pleasure to watch
@Eigensteve 9 months ago
Thanks!
@HavaN5rus 1 year ago
11:22 What would also be great is to add a Frobenius-norm error graph here and show that it decreases. I also have a question about the hidden watermarks you talked about: if I add a big enough watermark, even to the parts associated with the last singular value, wouldn't it change the whole SVD basis? Btw, thank you, your lectures are God's blessing on mankind. 👍
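A minimal sketch of the suggested error plot, assuming the grayscale image X and its economy SVD [U,S,V] = svd(X,'econ') are already in the workspace; the list of ranks is an arbitrary illustrative choice.

ranks = [5 20 100 200 400];                      % illustrative truncation ranks
relerr = zeros(size(ranks));
for k = 1:length(ranks)
    r = ranks(k);
    Xapprox = U(:,1:r)*S(1:r,1:r)*V(:,1:r)';             % rank-r reconstruction
    relerr(k) = norm(X - Xapprox,'fro')/norm(X,'fro');   % relative Frobenius-norm error
end
semilogy(ranks,relerr,'-o')
xlabel('rank r'), ylabel('relative Frobenius-norm error')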
@kasturibarkataki4154 3 years ago
Really really grateful to you for helping me learn this!
@SoumilSahu 1 year ago
Just to make sure I've understood this correctly, since you're performing the SVD for a single image, you're essentially seeing how well the "pixel columns" of the same image are correlated to each other, correct? P.S. the idea of digital watermarking seems so simple yet so cool, this is amazing stuff!
@_J_A_G_ 1 year ago
He responded to another comment on this. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-H7qMMudo3e8.html&lc=UgybqQYO8S_PsQOrkvV4AaABAg.96mgj1_NYPW98T05U7uNgu
@qamarkilani551 4 years ago
I cannot wait for the next lecture. Very informative.
@woodworkingaspirations1720 9 months ago
Amazing talk
@blackspitit 3 years ago
Thanks for these amazing lectures!
@Eigensteve 3 years ago
You're very welcome!
@andrezabona3518 3 years ago
Professor Steve, why do people use the FFT or wavelets instead of the SVD? For which applications is the SVD approach better than the other two?
@noahbarrow7979 2 years ago
Steve (and co.), I am a huge fan. You've deepened my appreciation of linear algebra, data science, fluid mechanics, and Matlab itself. Thank you! I recently purchased your book and I haul it around with me to school like it's one of the Dead Sea Scrolls. I'm trying to better understand this idea of cumulative energy... can it be thought of as the 'effective power' of the rank of our sigma matrix? In this video, with the image of your dog, it appears that we approach the energy of rank 1 as we include more information, right? Am I understanding these nuances correctly? Thanks again for all of these videos. The clarity, passion, and enthusiasm you have for these subjects is inspiring!
@Eigensteve 2 years ago
Thanks for the kind words and great question. We actually just finished up a 2nd edition, and realized that the discussion of cumulative energy needed to be cleared up. So your confusion is probably because it was a bit confusing... Technically, the cumulative energy would be computed by adding up the sum of the *squares* of the singular values, although most of the time we just add up the sum of singular values. Not a huge difference, but important to make units match up. And in that case, the spectrum does have a similar interpretation as *power* in the power spectral density with the Fourier transform. And yes indeed, this should all approach a normalized sum of 1 when we have all of the modes included. I will do my best to start posting errata soon to clarify some of these points. Usually would be posted on databookuw.com (and you can find the pdf at databookuw.com/databook.pdf ... not updated yet, but soon)
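A small sketch contrasting the two conventions described in the reply above, assuming the singular values are available as diag(S) from [U,S,V] = svd(X,'econ'); both curves approach 1 once all modes are included.

sig = diag(S);
cum_sv     = cumsum(sig)    / sum(sig);     % shortcut: cumulative sum of singular values
cum_energy = cumsum(sig.^2) / sum(sig.^2);  % cumulative energy: sum of squared singular values
plot(cum_sv), hold on, plot(cum_energy), hold off
legend('cumulative \sigma_k','cumulative \sigma_k^2 (energy)')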
@noahbarrow7979 2 years ago
@@Eigensteve Wow, thank you for taking the time to answer my question! I must admit, I think there is a range of topics I need to delve into further to support my understanding of some of these concepts, but your response gives me a great point of departure for developing my intuition about this type of analysis. I guess I'm also asking because, as I learn more and watch more of these videos, I am trying to file away the "magical" bits of knowledge as well as the immediately "practical" bits (e.g., "always graph the semilogy values of ..."), not that the practical is any less magical...
@Martin-iw1ll 11 months ago
Yes, good to know you are a fan of Terry Pratchett as well.
@tingyangmeng2832 3 years ago
Super cool! Thank you so much Prof. Brunton.
@wentaowu3070 7 months ago
Great lectures
@hoschi49 2 years ago
Wonderful way of presentation!
@saurabhkale4495 4 years ago
Best explanation!!!! Amazing...
@bibekdhungana2182 3 years ago
Thank you so much for an amazing presentation!!
@yingqinwu9889 3 years ago
Thank you for your amazing contribution!
@alfonshomac 4 years ago
my highest of fives for you
@ozzyfromspace 4 years ago
Thanks for that point about hiding data in low modes of the SVD. Good to know in case I ever wanna send “in your face” encryption or something :)
@mrjawad6826 3 years ago
Thanks a lot for all this @Steve_Brunton
@Eigensteve 3 years ago
You are very welcome
@liorcohen4212 3 years ago
Thank you for this great video. One remark on Matlab syntax: X' is the complex conjugate transpose, not the plain transpose. The syntax for the (non-conjugate) transpose is X.'
@_J_A_G_ 1 year ago
Interesting, I don't think I've ever seen the correct one then. Was this always the case? Anyway, in this case we know it's real numbers only, so still correct. In the words of the documentation: "When no complex elements are present, A' produces the same result as A.'."
@liorcohen4212 1 year ago
@@_J_A_G_ Yes, it has always been the case, and for real numbers it doesn't matter.
@_J_A_G_ 1 year ago
@@liorcohen4212 Addendum for future readers: apparently the complex conjugate transpose was the right thing to do anyway, so the code is also correct for complex data and the lecture was simply glossing over the distinction. This came up in a later lecture. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-46Hpy4FiGls.html
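A quick illustration of the distinction discussed in this thread; the matrix values are arbitrary.

A = [1+2i, 3; 4i, 5];
A'     % ctranspose: transpose and complex-conjugate -> [1-2i, -4i; 3, 5]
A.'    % plain transpose, no conjugation             -> [1+2i,  4i; 3, 5]
% For a real-valued matrix (such as a grayscale image), the two are identical.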
@abhimansakilam6009 2 years ago
Superb explanation.
@fzigunov 4 years ago
Hi, Dr. Brunton, I got a question from a student on the POD mode energy that made me look again at this lecture. In your book and in your lecture, you suggest that the total energy is given by sum(diag(S)) and that the energy of each eigenmode is given by the eigenvalues diag(S). I feel like it should be given by diag(S²), though. In my understanding, energy should reflect the variance of the snapshots, which is contained in the terms of the correlation matrix X X*. The diagonal terms of X X* are directly the variance of each data series. X X*, however, as you show in equation 1.7a, is U S² U*. Since U are unitary matrices, the energy contribution must come from S². This is in conflict with the code you present, where energy is given in terms of S. Am I mistaken in my understanding of POD? I know plotting diag(S) gives you a good proxy for mode energy, but in many applications (like acoustics, for example) the correct energy metric is crucial. So I really would like to get this right!
@fzigunov 4 years ago
I'd like to add that if you attempt to do this on a random noise matrix, for example, you do get the correct metric when squaring S. Try this in Matlab:

clear; clc; close all;
X = 2*randn(10000,100);  % Generates random noise with variance = 4. Rectangular matrix assures we'll not mess up the rows/columns. Variance 4 assures squaring changes the outcome w.r.t. not squaring
[U,S,V] = svd(X,'econ'); % Regular SVD
X_energy_indiv = diag(X*X');          % x1.*x1 is the energy of the first pixel, for example. For 10000x100, we get 10000 entries averaging 400 each. Dividing by the number of snapshots (400/100 = 4) we get the variance
X_energy_total = sum(X_energy_indiv); % sum of the energies of each snapshot gives the total energy. Should give about 10000*400 = 4e6
S_energy_nonsquared = sum(diag(S));   % should give about 2e4
S_energy_squared = sum(diag(S).^2);   % should give exactly X_energy_total (4e6) (I think this is correct)
@arminth4117 4 years ago
@@fzigunov Wondering the same, but the good news is that plotting on a log scale shouldn't really change the shape of the curves, just the units, up to a constant factor, at least I hope so! The cumulative plot might not have the right shape, though, and we depend on the percentage thresholds, so I am a bit confused about that one.
@fzigunov 4 years ago
@@arminth4117 The problem is that it is very common for people to quote something like "modes 1 to M contain X% of the total energy" or "mode M contains X% of the energy in the flow". Therefore, the energy metric matters quite a bit. As for the plot shape, I think it is quite a secondary feature when analyzing POD results. The whole point of POD is to provide a more understandable description of what a complex system is fundamentally doing, so the shape only matters to give you a sense that the higher-order modes can indeed be discarded (or not).
@AFA-qg6hk 9 months ago
Thank you
@hindumuppala 7 months ago
Thank you, prof.
@danielniels22 3 years ago
Very cool, sir. You're an expert in math, but how do you transition so easily between two different languages, MATLAB and Python? It's mostly a matter of memorizing syntax, right?
@matthewjames7513 3 years ago
Can SVD also be used to extract an approximate analytical equation from a bunch of x, y, z data? For example, x = age, y = number of hours walked per day, z = weight of a person. Say the equation we would want to extract from the data is z = x^2 - 4*x/y?
@NotTzeentch 4 years ago
Link to the website: databookuw.com/
@SLguitarGuy 3 years ago
Thank you very much
@tvstation8102 1 year ago
I ran through all of this in Matlab and am a little confused about one thing. In your on-screen examples in other videos, you refer to each column of X being an image of a different face... but in this example the entire matrix X appears to be one image (the dog). Is it just a different example, or am I misinterpreting? Thanks!
@diegoguisasola3858 3 years ago
Dr. Brunton, thank you for these videos. I'll finish all the lectures before diving into the book. One question, though: in previous videos you mentioned that the U matrix was composed of several columns carrying the information of several images. Here, you applied the SVD to a single image, and I don't understand how the SVD can be applied to a single image, which would then be approximated by a U matrix with a single column. I would be really thankful if you could explain this to me. Thanks in advance!
@_J_A_G_ 1 year ago
He responded to another comment on this. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-H7qMMudo3e8.html&lc=UgybqQYO8S_PsQOrkvV4AaABAg.96mgj1_NYPW98T05U7uNgu
@diegoguisasola3858 1 year ago
@@_J_A_G_ Thank you, mate!
@timetheoncelee2961 2 years ago
Hi Prof. Steve, may I ask a question, please: the compressed picture "Xapprox" has the same dimensions as the original picture X, so why did you say that the compression saves storage?
@_J_A_G_ 1 year ago
See discussion in other comment: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-QQ8vxj-9OfQ.html&lc=UgySxL8I3zJqiqPtUlt4AaABAg.9Owk8AGSt769ijFmwq9Vli
@panthakhin1659 4 years ago
Thank You
@panthakhin1659 4 years ago
A disadvantage of SVD-based image compression is that the decompressed image can have poor quality.
@convex9345 3 years ago
While using the command 'imagesc', I am getting a color image, not the gray one.
@jenkinsj9224 3 years ago
In the MATLAB implementation, the dog looks grainy for low ranks. But when I check the memory size of the grainy dog using the 'whos' command, surprisingly the grainy dog occupies as much memory in bytes as the original HD dog. We are doing all this hard work to reduce the memory size, aren't we? Can you explain this, prof?
@_J_A_G_ 1 year ago
Both the original and the reconstructed image are 2000x1500 pixels. This is (nx*ny) in both cases, so that is not where to look to save storage. The compression idea is to look at the right side of the equation and instead store the U, S, V matrices. The "recipient" would reconstruct Xapprox from those after loading the data. When discarding parts of those matrices (keeping only r columns of U and r columns of V), you get a lossy compression. The smaller r, the less data to store, but also a less accurate reconstruction. The key insight is that the vectors are already ordered by importance, so it's easy to include only as many as you need. The "title" calculation has r*(nx+ny) for the stored data size. I think (as he indicated verbally) it should be r*(nx+1+ny) to also include the S matrix (S is diagonal, so the r-by-r matrix is zeros everywhere but the diagonal, so I agree that it's a very small correction). If r equals the full rank from the economy SVD, storing U, S, V directly is no win, but as shown in the video, r can be lowered quite a lot without visible degradation, or even more if that is acceptable. PS. IMHO, this video is mainly part of the intro to the SVD, explaining the concept of rank reduction. It's not a literal "how to compress images" tutorial.
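A quick back-of-the-envelope check of the storage count described above, using the 2000x1500 image size from the discussion and an illustrative rank r = 100.

nx = 2000; ny = 1500; r = 100;
full_storage      = nx*ny;             % store every pixel: 3,000,000 numbers
truncated_storage = r*(nx + 1 + ny);   % store U(:,1:r), the r singular values, and V(:,1:r): 350,100 numbers
compression_ratio = full_storage/truncated_storage   % roughly 8.6x fewer numbers stored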
@jonweeee 2 years ago
Hi Steve, are you able to share your setup for recording this video? I would like to do something similar for my lecturers. Thanks!
@sollinw 3 years ago
nice
@zhenzhoutoh7345 3 years ago
Where can I get the complete code?
@Eigensteve 3 years ago
Check out the links at databookuw.com
@kahnzo 2 years ago
How did you get so good at writing backwards? :)
@simong1666 2 months ago
I'm pretty sure they just mirror the video in post-production
@jorgeruiz2121 4 years ago
Amazing... do you have an email or Twitter?
@Eigensteve 4 years ago
On Twitter: @eigensteve
@1PercentPure 9 months ago
holy shit dude
@joshmyer9 4 years ago
7:19 boopin' the low rank snoot
@zelexi 4 years ago
Awwww... I thought we were *actually* going to compute the SVD, not just call "svd". Seems like a cop-out, Mr. ;)
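For the curious, here is a conceptual sketch of one textbook way to "actually" compute a thin SVD of a real, full-column-rank matrix via the eigendecomposition of X'*X. This is for illustration only; the built-in svd uses more robust numerical algorithms, and the example matrix is arbitrary.

X = randn(200,50);                  % arbitrary example data (full column rank with probability 1)
[V,D] = eig(X'*X);                  % eigenvectors/eigenvalues of the (unscaled) correlation matrix
[d,idx] = sort(diag(D),'descend');  % order modes by decreasing eigenvalue
V = V(:,idx);
S = diag(sqrt(d));                  % singular values are square roots of the eigenvalues
U = X*V/S;                          % recover left singular vectors: U = X*V*inv(S)
norm(X - U*S*V','fro')              % should be near machine precision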