
Mall Customer Segmentation using k-means Clustering | Machine Learning | MATLAB 

Knowledge Amplifier
Subscribers: 29K
Views: 13K

Science

Published: 24 Oct 2024

Comments: 23
@Tystile
@Tystile 2 years ago
It's been a long day! (sigh) Thanks for the info, sir! Been looking for this all day and finally got the answer 😁
@KnowledgeAmplifier1
@KnowledgeAmplifier1 2 years ago
Glad to know you got the answer to your question, Tystile Nobela! Happy Learning :-)
@manishabarsale1197
@manishabarsale1197 2 years ago
Thank you for such an informative video. How can I implement k-means for clustering in a wireless sensor network, in order to plot parameters such as residual energy, live nodes, and packets transmitted (i.e., throughput)?
@rawdahfarzeen7069
@rawdahfarzeen7069 4 years ago
Hi, could you please give me your insights on how to apply the clustering code when I have annual data (time, vehicle 1 charging, and vehicle 2 charging)?
@shubhamsahu240
@shubhamsahu240 1 year ago
Nice 👍
@KnowledgeAmplifier1
@KnowledgeAmplifier1 1 year ago
Thank you Shubham Sahu! Happy Learning
@ranganagunasekara4380
@ranganagunasekara4380 3 years ago
Thank you so much for the video. Can you give me any tips? I need to cluster a huge smart-meter data set, so I have a lot of columns (or variables). Please help.
@IllSetYouFree
@IllSetYouFree 4 years ago
Hi! Thanks for the very informative video. How would you go about clustering two variables of different types, one numerical and the other a string? In this case, SpendingScore & Gender.
@KnowledgeAmplifier1
@KnowledgeAmplifier1 4 years ago
Convert gender from categorical data to numerical (Male=1 & Female=0, or Male=0 & Female=1, whichever you prefer) and then apply kmeans. Happy Coding :-)
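A minimal MATLAB sketch of that suggestion (the file name Mall_Customers.csv, the column names Gender and SpendingScore, and k = 5 are assumptions for illustration, not taken from the video):

% Encode Gender as 0/1 and cluster it together with SpendingScore.
T = readtable('Mall_Customers.csv');             % hypothetical file name
genderNum = double(strcmpi(T.Gender, 'Male'));   % Male = 1, Female = 0
X = [genderNum, T.SpendingScore];                % numeric feature matrix
idx = kmeans(zscore(X), 5);                      % k = 5 chosen only for illustration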
@IllSetYouFree
@IllSetYouFree 4 years ago
@@KnowledgeAmplifier1 Thank you for the quick reply! I have one more question - what should I do if I have 3 different string variables? Is it ok to assign them to 3 numerical categories (0,1,2 or 1,2,3)? I read here (datascience.stackexchange.com/questions/22/k-means-clustering-for-mixed-numeric-and-categorical-data) that this is the wrong approach because of the different distances between the points: “Categorical data is a problem for most algorithms in machine learning. Suppose, for example, you have some categorical variable called "color" that could take on the values red, blue, or yellow. If we simply encode these numerically as 1,2, and 3 respectively, our algorithm will think that red (1) is actually closer to blue (2) than it is to yellow (3). We need to use a representation that lets the computer understand that these things are all actually equally different. One simple way is to use what's called a one-hot representation, and it's exactly what you thought you should do. Rather than having one variable like "color" that can take on three values, we separate it into three variables. These would be "color-red," "color-blue," and "color-yellow," which all can only take on the value 1 or 0." Sorry for the trouble and thank you very much for your insight!
@KnowledgeAmplifier1
@KnowledgeAmplifier1 4 years ago
@@IllSetYouFree Yes, we use one-hot encoding. Try to understand the difference: there are two scenarios for categorical variables. One is when the categories are comparable, like first, second, and third rank; in that case you can assign First=1, Second=2, Third=3. But suppose you have categorical data like country, where the values are India, Nepal, Bhutan, France, etc. Then you cannot assign France=1, India=2, and so on, because they are not comparable, so in that case we go with one-hot encoding. I have already uploaded videos on how to handle these two scenarios: Dealing with categorical features in machine learning | MATLAB (One Hot Encoding): ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-wCebbrfInRI.html and Categorical Data Handling | Part 2 | Machine Learning | MATLAB: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-_v4UT5qsibk.html Hope these videos clear up any confusion about handling categorical data. Happy Coding :-)
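A hedged MATLAB sketch of the one-hot idea for a non-ordinal variable (the country values are just the examples from this thread):

% One indicator column of 0/1 per category, instead of a single 1/2/3 code.
country = categorical({'India'; 'Nepal'; 'Bhutan'; 'France'; 'India'});
oneHot = dummyvar(country);   % 5x4 matrix of 0/1 indicator columns
% Concatenate oneHot with the numeric features before calling kmeans.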
@hugelevin
@hugelevin 4 years ago
Why did you normalize (or was it denormalize?) the feature values?
@KnowledgeAmplifier1
@KnowledgeAmplifier1 4 years ago
Whenever you are dealing with features or parameters whose ranges of values differ from each other (which is the case in most data sets), you have to normalize the data so that the difference in ranges does not skew the outcome of distance-based algorithms.
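For example, a short sketch of that normalization step (the feature values below are made up for illustration):

X = [19 15; 21 81; 35 120; 46 28; 52 79];   % e.g. [Age, AnnualIncome] rows (made-up values)
Xnorm = zscore(X);                          % each column rescaled to zero mean, unit variance
% Alternative min-max scaling: Xnorm = normalize(X, 'range');  % maps each column to [0, 1]
idx = kmeans(Xnorm, 2);                     % distances now weight both features comparably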
@swathiartsrangolies
@swathiartsrangolies 3 years ago
Excellent video
@KnowledgeAmplifier1
@KnowledgeAmplifier1 3 years ago
Glad you liked it! Thank You. Happy Learning :-)
@muhammadhaziq5542
@muhammadhaziq5542 2 years ago
Why does the for loop run K from 1 to 20?
@KnowledgeAmplifier1
@KnowledgeAmplifier1 2 years ago
Hello Muhammad Haziq, the for loop is used to vary the value of k; it is done to get 20 data points for plotting the elbow curve.
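In case it helps other readers, a hedged sketch of what such a loop typically looks like in MATLAB (the placeholder data X and the 5 replicates are assumptions, not copied from the video):

X = zscore(rand(200, 2));                    % placeholder normalized feature matrix
wcss = zeros(20, 1);                         % within-cluster sum of squares for each k
for k = 1:20
    [~, ~, sumd] = kmeans(X, k, 'Replicates', 5);
    wcss(k) = sum(sumd);                     % total squared point-to-centroid distance
end
plot(1:20, wcss, '-o');
xlabel('k'); ylabel('Within-cluster sum of squares'); title('Elbow curve');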
@kamalhossain8466
@kamalhossain8466 3 years ago
Good video, but please let me know whether the steps involved are different for three variables.
@KnowledgeAmplifier1
@KnowledgeAmplifier1 3 years ago
Apart from the visualization part, it's the same.
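A small sketch of the three-feature case under the same assumptions as above (hypothetical file and column names; essentially only the plotting call changes):

T = readtable('Mall_Customers.csv');                     % hypothetical file name
X = zscore([T.Age, T.AnnualIncome, T.SpendingScore]);    % three normalized features
idx = kmeans(X, 5);                                      % k = 5 for illustration
scatter3(X(:,1), X(:,2), X(:,3), 25, idx, 'filled');     % color points by cluster
xlabel('Age'); ylabel('Annual income'); zlabel('Spending score');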
@saswatachakraborty4759
@saswatachakraborty4759 4 years ago
You are a god, sir!
@KnowledgeAmplifier1
@KnowledgeAmplifier1 4 years ago
Thank you, Saswata. Happy Coding :-)
@rajashreec5233
@rajashreec5233 3 years ago
Sir, can you please mail me the complete details?
@KnowledgeAmplifier1
@KnowledgeAmplifier1 3 years ago
The code & dataset link is given in the description box, Rajashree C. Happy Learning :-)