
KL transform (Explanation with a sum)

Pratik Jain
5K subscribers
20K views

Published: 21 Aug 2024

Comments: 38
@iyshwaryakannan6677 · 3 years ago
I have searched all the channels to find at least one well-explained KL transform, and finally I got one. Thank you so much, bro! It would be a great help if you could please post the reconstruction of the input.
@pratikian · 3 years ago
Thank you for the feedback. It will be difficult for me to do a video on the reconstruction in the near future, but I will try. Meanwhile, you can look at my video on PCA, which is the basis of this KL transform. If you understand PCA, you will easily understand the KL transform and its reconstruction. Link for the visualization: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-tfbJA9FM3Uw.html Link for the complete maths behind PCA: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-hJbvC6HkZLw.html
@vikaskanojiya1553 · 5 years ago
Hey brother, just taking a moment to appreciate you. Thank you so much for everything ♥️. You helped us a lot. One day you will become something big in your life. Lots of ♥️ from Mumbai.
@pratikian · 5 years ago
Thank you so very much, brother, for these kind and precious words 😊🙏
@NithishKumar-ng7dp · 8 months ago
Nice Explanation Anna
@Saravananmicrosoft · 4 years ago
Well explained, very good effort.
@juniorchancel943 · 4 years ago
Very well explained, a lot of thanks.
@pratikian · 4 years ago
Happy to help 😊
@Pushpraj_Joshi · 4 years ago
Thank you... You explained very well 😁
@anarkaliprabhakar6640 · 1 year ago
At the end, do we have to take the transpose of the eigenvectors to write A?
@pratikian · 1 year ago
So, to be clear, you put the eigenvectors in the columns of A.
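The thread keeps coming back to whether the eigenvectors go into the rows or the columns of A. A minimal numeric sketch of the whole KL transform, assuming a hypothetical 2×4 data matrix X (each column is one sample vector) and the common textbook convention in which the rows of A are the eigenvectors of the covariance matrix, so that Y = A(X − m) and X = AᵀY + m:

```python
import numpy as np

# Hypothetical 2x4 data: each column of X is one sample vector.
X = np.array([[1., 2., 3., 4.],
              [5., 6., 7., 8.]])

m = X.mean(axis=1, keepdims=True)        # 2x1 mean vector
C = (X - m) @ (X - m).T / X.shape[1]     # 2x2 covariance matrix

vals, vecs = np.linalg.eigh(C)           # eigh returns eigenvectors as COLUMNS
order = np.argsort(vals)[::-1]           # sort by decreasing eigenvalue
A = vecs[:, order].T                     # rows of A = eigenvectors (textbook convention)

Y = A @ (X - m)                          # KL-transformed data
X_back = A.T @ Y + m                     # exact reconstruction, since A is orthogonal
```

Either convention works as long as the reconstruction uses the matching transpose, which is why both descriptions appear in this thread.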
@lachulachus6623 · 4 years ago
The transformation matrix you wrote is wrong, because we should take the transpose of each eigenvector there.
@sonalighoshal3212 · 5 years ago
Well explained.
@pratikian · 5 years ago
Thank you so much 😊
@c.danielpremkumar8495 · 5 years ago
From the characteristic equation det(λI − A) = 0, the eigenvalues λ1 and λ2 work out to be 0.25 and 0.00, whereas you have considered the matrix equation as containing only one λ and then split the value of λ using b² − 4ac...
@c.danielpremkumar8495 · 5 years ago
The eigenvalues work out to 0.35 and 0.
@c.danielpremkumar8495 · 5 years ago
Sorry for my earlier question(s). I got it all messed up. Your video shows the right calculations.
@pratikian · 5 years ago
I hope all your doubts are cleared; if any remain, you can email me your queries 😊
@keskarshivani · 5 years ago
In the quadratic equation that is formed at 21:18, shouldn't the sign of 0.0625 be negative?
@pratikian · 5 years ago
It is right, please check your calculation. It is basically 0.5×0.25 − 0.25×0.25 = 0.125 − 0.0625 = 0.0625.
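As a quick check of the arithmetic in this reply: assuming the 2×2 covariance matrix is [[0.5, 0.25], [0.25, 0.25]] (reconstructed here from the products 0.5×0.25 and 0.25×0.25 quoted above; the actual matrix is given in the video), the constant term of the characteristic polynomial λ² − (trace)λ + det does come out as +0.0625:

```python
import numpy as np

# Covariance matrix reconstructed from the products quoted in the reply;
# treating it as the matrix from the video is an assumption.
C = np.array([[0.5, 0.25],
              [0.25, 0.25]])

det = C[0, 0] * C[1, 1] - C[0, 1] * C[1, 0]   # 0.125 - 0.0625 = 0.0625
trace = C[0, 0] + C[1, 1]                      # 0.75

# The eigenvalues are the roots of lambda^2 - trace*lambda + det = 0,
# so the determinant enters with a positive sign, as the reply says.
eigvals = np.roots([1.0, -trace, det])
```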
@abhiparab9294 · 5 years ago
Is there a video for the kernel matrix sum, and for finding the basis images?
@pratikian · 5 years ago
I have not made one on that yet; I will try to make it.
@AvinyaSolutions · 3 years ago
Bro, the matrix multiplication process is wrong.
@pratikian · 3 years ago
Can you please point out where I have made the mistake in the video? Thanks.
@AvinyaSolutions · 3 years ago
@pratikian Bro, it is not matrix multiplication, it is array multiplication; there is a vast difference between array (element-wise) multiplication and matrix multiplication.
@pratikian · 3 years ago
No no, it is matrix multiplication.
@AvinyaSolutions · 3 years ago
@pratikian (m1×n1)×(m2×n2) = n1×m2 matrix. Verify: initially you multiply [(2×1)×(1×2)] = 1×1 matrix, but you show a 2×2 matrix; how is that possible?
@pratikian · 3 years ago
@AvinyaSolutions So, a correction in the matrix multiplication sizes that you mentioned: [M×N] × [N×M] = [M×M], so [2×1] × [1×2] = [2×2].
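The size rule in this reply can be checked directly: a 2×1 column times a 1×2 row is an outer product and gives a 2×2 matrix, while the reverse order gives the 1×1 inner product. A sketch with made-up numbers:

```python
import numpy as np

col = np.array([[1.],
                [2.]])        # shape (2, 1)
row = np.array([[3., 4.]])    # shape (1, 2)

outer = col @ row             # (2x1) @ (1x2) -> (2x2) outer product
inner = row @ col             # (1x2) @ (2x1) -> (1x1) inner product

print(outer.shape, inner.shape)   # (2, 2) (1, 1)
```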
@parnikanaik7272 · 5 years ago
How to solve after the last step?
@pratikian · 5 years ago
After finding the transformation function, you can find the transformed output Y by the formula that is given.
@niravmehta5559 · 5 years ago
How do you multiply the vector with the matrix 'A'?
@pratikian · 5 years ago
@niravmehta5559 X is a 2×4 matrix and A is 2×2, so you can apply normal matrix multiplication.
@bharatagarwal33 · 4 years ago
@pratikian The mean matrix u is a 2×1 matrix, right? In order to subtract it from X, will we assume the other elements of u to be zero?
@pratikian · 4 years ago
@bharatagarwal33 No, it's like subtracting each column of the X matrix by the mean, e.g.

[ 1 2 3 4 ]   [ 1 ]   [ 0 1 2 3 ]
[ 5 6 7 8 ] − [ 2 ] = [ 3 4 5 6 ]
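The worked example in this reply is exactly NumPy's broadcasting: subtracting a 2×1 vector from a 2×4 matrix repeats the vector across every column, with no zero-padding needed. A sketch with the same numbers:

```python
import numpy as np

X = np.array([[1., 2., 3., 4.],
              [5., 6., 7., 8.]])
u = np.array([[1.],
              [2.]])           # 2x1 vector from the reply's example

D = X - u                      # u is broadcast across all 4 columns of X
print(D)
# [[0. 1. 2. 3.]
#  [3. 4. 5. 6.]]
```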
@SuperRohitthegreat · 5 years ago
You from MU?
@pratikian · 5 years ago
+Rohit Pawar yes