Thanks Mostapha. The video was quite helpful. I have a short question. When I use MATLAB's built-in command [vec, scores] = pca(X), the values of scores are different from the values of the variable z in your code. Aren't they supposed to be the same? Isn't scores in the MATLAB function the projection of the data onto each component? I would appreciate it if you could respond to my question. Cheers.
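For reference, a likely cause of the mismatch: MATLAB's pca() centers the data (subtracts the column means) before projecting, and eigenvector signs are arbitrary, so columns can also be flipped. A minimal sketch in Python (illustrative only; the direction v and the data are made up, not from the video) showing that an uncentered projection differs from a centered one by a constant offset per component:

```python
# Hypothetical illustration: why pca() "scores" can differ from a
# manual projection z = X*v. Two common causes:
#   1. pca() centers X (subtracts column means) before projecting.
#   2. Eigenvector signs are arbitrary, so columns may be flipped.

X = [[2.0, 0.0], [0.0, 1.0], [3.0, 2.0], [1.0, 3.0]]  # made-up 2-D data
n = len(X)
mean = [sum(row[j] for row in X) / n for j in range(2)]

# Some unit-length direction v (stands in for a principal component).
v = [0.6, 0.8]

proj_raw      = [x[0]*v[0] + x[1]*v[1] for x in X]                    # z = X*v
proj_centered = [(x[0]-mean[0])*v[0] + (x[1]-mean[1])*v[1] for x in X]

# The two projections differ by the constant mean.v for every sample,
# so uncentered scores are shifted copies of pca()'s centered scores.
offset = mean[0]*v[0] + mean[1]*v[1]
diffs = [a - b for a, b in zip(proj_raw, proj_centered)]
print(all(abs(d - offset) < 1e-12 for d in diffs))  # True
```

So if z in the code was computed from uncentered X, it will not match scores even though both are projections onto the same components.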
Great video! What I miss is you comparing the clusters at the end of the video (the pca() function) and finding which component separates the two clusters, so that you can track it over time to identify the exact moment the clusters switch.
Hello Mostapha, thanks for the video. Please, why do we have three colours in the figure even though m = 2? Which two of them are the principal components? Thank you in anticipation of your reply.
Excellent tutorial! If we write pca = PCA(0.95) in Python, then 95% of the variance is retained. How can we do the same thing in MATLAB? I don't want to specify the number of components; I want to fix the retained variance instead.
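For reference, MATLAB's pca returns an explained output (percent variance per component), so one common approach is to pick the smallest k whose cumulative explained variance reaches 95%, e.g. find(cumsum(explained) >= 95, 1). The same selection rule, sketched in plain Python with made-up percentages:

```python
# Sketch of the "keep 95% of the variance" rule. The `explained`
# values are made up; in MATLAB they would come from
# [~,~,~,~,explained] = pca(X), as percentages summing to 100.
explained = [72.9, 22.8, 3.6, 0.7]  # percent variance per component

cum = 0.0
k = 0
for pct in explained:
    cum += pct
    k += 1
    if cum >= 95.0:  # stop at the first k reaching the threshold
        break
print(k)  # 2
```

You then keep only the first k columns of the coefficient and score matrices.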
57:30 Sorry, but can you tell me why we look for the z matrix and then deduce the PCA y matrix from it? I read the theory and only understood the steps of finding the covariance matrix and using the eig command; the theory just says: 'find the image of the matrix A^T. X^ of vector X^....' I don't understand the parts after that. I'm not good at English and have to use Google Translate, so it's really hard for me. I hope you can answer soon; this PCA is homework my team needs to earn points for the year :((
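For anyone else stuck on that step, the usual pipeline is: center the data, form the covariance matrix, eigendecompose it (the eig step), and then z is just the centered data projected onto the eigenvectors. A minimal pure-Python sketch on a tiny made-up 2-D dataset (using the closed-form eigendecomposition of a 2x2 symmetric matrix, which is an illustrative shortcut, not the video's exact code):

```python
import math

# Minimal PCA pipeline on a tiny 2-D dataset (illustrative, made-up data):
# center X, form covariance C, eigendecompose C, project onto the top
# eigenvector to get the first column of the score matrix z.
X = [[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0]]
n = len(X)
mx = sum(r[0] for r in X) / n
my = sum(r[1] for r in X) / n
Xc = [[r[0] - mx, r[1] - my] for r in X]          # centered data

# 2x2 sample covariance matrix [[a, b], [b, c]]
a = sum(r[0]*r[0] for r in Xc) / (n - 1)
b = sum(r[0]*r[1] for r in Xc) / (n - 1)
c = sum(r[1]*r[1] for r in Xc) / (n - 1)

# Closed-form largest eigenvalue and its eigenvector for a 2x2
# symmetric matrix (this is what eig computes in general).
lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b * b)
v = [b, lam - a]
norm = math.hypot(v[0], v[1])
v = [v[0] / norm, v[1] / norm]                     # unit eigenvector

# First principal-component scores: project centered data onto v.
z1 = [r[0] * v[0] + r[1] * v[1] for r in Xc]
print(abs(sum(z1)) < 1e-9)  # True: scores of centered data sum to zero
```

The reduced representation y is then just the first m columns of z, i.e. the projections onto the top m eigenvectors.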
Hey, this was an amazing video with really clear explanations. However, around 32:14 you mix up the terms eigenvalues and eigenvectors. Please correct me if I'm wrong! :)
Two video requests: 1. use of PCA to analyse/reduce a dataset for regression (numerical variables only) instead of iris/digits (classification); 2. biplot-based interpretation/analysis of PCA/SVD.