
Feature selection in machine learning | Full course 

Data Science with Marco
4.2K subscribers
27K views

Published: 2 Oct 2024

Comments: 49
A year ago
I am currently reading your book and it's amazing
@jmagames2766
@jmagames2766 11 months ago
What is the name of the book, please?
@ax5344
@ax5344 3 months ago
I like the logic of this video. You showed the baseline, then three additional methods, then compared them at the end. Thanks a lot for sharing the technique. The feature/target matrix is also very helpful. My question is about the principle or concept behind the filter method, RFE, and Boruta. Is it possible to do a video on them?
@paramvirsaini2806
@paramvirsaini2806 11 months ago
Great explanation. Easy hands-on as well!!
@datasciencewithmarco
@datasciencewithmarco 11 months ago
Thank you!
@TheSerbes
@TheSerbes 2 months ago
I want to build an LSTM time series model; what should I do for this? I think the situation is different for time series. Would I be wrong to use what you did? There is both trend and seasonality in the series.
@oluwasegunodunlami7360
@oluwasegunodunlami7360 10 months ago
Wow, this video is really helpful; a lot of interesting methods were shown. Thanks a lot. I'd like to ask you to make a future video covering how you perform feature engineering and model fine-tuning. 1:49
@mauroSfigueira
@mauroSfigueira 4 months ago
Hugely informative and educational content. Many feature engineering videos are not that instructive.
@samuelliaw951
@samuelliaw951 10 months ago
Really great content! Learnt a lot. Thanks for your hard work!
@mohammadhegazy1285
@mohammadhegazy1285 13 days ago
Thank you very much for the video
@edzme
@edzme A month ago
This is great! About how long did Boruta take for your dataset? Like, if I have 400 features and 1 million rows, would that be impossible?
@datasciencewithmarco
@datasciencewithmarco A month ago
@edzme Possible, but it will take a while, for sure.
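For readers wondering what such a run might look like in practice, here is a minimal sketch of Boruta on a large dataset, assuming the `boruta` package (BorutaPy) with a RandomForest base estimator; the synthetic data, the row subsample, and the hyperparameters are purely illustrative, not the video's code.

```python
# Hedged sketch: BorutaPy on a large dataset, with row subsampling and a
# shallow forest to keep runtime manageable. All sizes/parameters are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from boruta import BorutaPy

# Synthetic stand-in for a wide/tall dataset
X, y = make_classification(n_samples=100_000, n_features=50, random_state=42)

# Subsample rows so each Boruta iteration stays affordable
rng = np.random.default_rng(42)
idx = rng.choice(len(X), size=20_000, replace=False)

rf = RandomForestClassifier(n_jobs=-1, max_depth=5, random_state=42)
boruta = BorutaPy(rf, n_estimators="auto", random_state=42, verbose=0)
boruta.fit(X[idx], y[idx])  # BorutaPy expects numpy arrays

print("Selected feature indices:", np.where(boruta.support_)[0])
```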
@berlinbrown03
@berlinbrown03 A month ago
Thanks, good review
@dorukucar7105
@dorukucar7105 10 months ago
Pretty helpful!
@cagataydemirbas7259
@cagataydemirbas7259 A year ago
Hi, when I use RandomForest, DecisionTree, and XGBoost with RFE, even though all of them are tree-based models, they return completely different orders. My dataset has 13 columns: on XGBoost one feature's importance rank is 1, the same feature's rank on DecisionTree is 10, and on RandomForest it is 7. How can I trust which feature is better than the others in general? If a feature is more predictive than the others, shouldn't it have the same rank across all tree-based models? I am so confused about this. It's also the same with SequentialFeatureSelection.
@datasciencewithmarco
@datasciencewithmarco A year ago
That's normal! Even though they are tree-based, they are not the same algorithm, so the ranking will change. To decide which feature set is best, you simply have to predict on a test set and measure the performance to make a decision.
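A minimal sketch of the approach suggested in this reply: run RFE with different tree-based estimators and compare the resulting feature sets by test-set accuracy. The wine dataset, the two estimators, and the number of features kept are illustrative assumptions, not the video's code.

```python
# Hedged sketch: compare RFE rankings from different estimators by evaluating
# each selected feature subset on a held-out test set.
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

estimators = {
    "random_forest": RandomForestClassifier(random_state=42),
    "decision_tree": DecisionTreeClassifier(random_state=42),
}

for name, est in estimators.items():
    rfe = RFE(est, n_features_to_select=5).fit(X_train, y_train)
    acc = accuracy_score(y_test, rfe.predict(X_test))  # rankings differ; accuracy decides
    print(name, "ranking:", rfe.ranking_, "test accuracy:", round(acc, 3))
```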
@mrthwibble
@mrthwibble 8 months ago
Excellent video, however I'm preoccupied trying to figure out if having wine as a gas would make dinner parties better or worse. 🤔
@eladiomendez8226
@eladiomendez8226 7 months ago
Awesome video
@datasciencewithmarco
@datasciencewithmarco 7 months ago
Thanks!
@nikhildoye9671
@nikhildoye9671 6 months ago
I thought feature selection is done before model training. Am I wrong?
@keerthana7353
@keerthana7353 6 months ago
Yes, correct.
@chiragsaraogi363
@chiragsaraogi363 11 months ago
This is an incredibly helpful video. One thing I noticed is that all the features are numerical. How do we approach feature selection with a mix of numerical and categorical features? Also, when we have categorical features, do we convert them to numerical features first, or do feature selection first? A video on this would be really helpful. Thank you.
@haleematajoke4794
@haleematajoke4794 10 months ago
You will need to convert the categorical features into numerical format, either with label encoding, which automatically converts them to numerical values, or with a custom mapping, where you can manually assign your preferred values to the features. I hope it helps.
@haleematajoke4794
@haleematajoke4794 10 months ago
You will have to do the conversion before feature selection, because machine learning models only learn from numerical data.
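A small sketch of the conversion described in these replies, done before running a feature-selection step; the DataFrame, column names, and values below are hypothetical examples, not the video's data.

```python
# Hedged sketch: encode a categorical column, then run a simple filter-style
# selector. Column names and values are made up for illustration.
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif

df = pd.DataFrame({
    "color": ["red", "white", "red", "white"],  # categorical feature
    "acidity": [3.1, 2.9, 3.3, 3.0],            # numerical feature
    "quality": [1, 0, 0, 1],                    # target
})

# Custom mapping (label encoding via sklearn's LabelEncoder or pandas category
# codes would work just as well)
df["color"] = df["color"].map({"red": 0, "white": 1})

X, y = df.drop(columns="quality"), df["quality"]
selector = SelectKBest(f_classif, k=1).fit(X, y)
print("Kept:", list(X.columns[selector.get_support()]))
```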
@mandefrolegesse5748
@mandefrolegesse5748 3 months ago
Very interesting and clear explanation. I was looking for this kind of tutorial. Subscribed 👍
@nabeel_kaleel
@nabeel_kaleel 3 months ago
subscribed
@alfathterry7215
@alfathterry7215 9 months ago
In the variance threshold technique, if we use StandardScaler instead of MinMaxScaler, the variance would be the same for all variables... Does that mean we can eliminate this step and just use StandardScaler?
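A quick check of the point raised here: StandardScaler forces every feature to (roughly) unit variance, so a variance threshold can no longer tell features apart, whereas MinMaxScaler preserves their relative spread. The wine dataset below is only an illustrative choice.

```python
# Hedged sketch: compare feature variances after the two scalers.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X, _ = load_wine(return_X_y=True)

print(np.var(MinMaxScaler().fit_transform(X), axis=0).round(3))   # differs per feature
print(np.var(StandardScaler().fit_transform(X), axis=0).round(3)) # ~1.0 everywhere
```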
@pedro_tonom
@pedro_tonom 2 months ago
Amazing video and excellent teaching. Congrats on the great quality; it helped me a lot!
@NulliusInVerba8
@NulliusInVerba8 2 months ago
This is extremely helpful and informative. Thanks a LOT!
@abhinavreddy6451
@abhinavreddy6451 5 months ago
Please do more data science-related content; it was very helpful. I searched everywhere for feature selection videos and finally landed on this one, and it was all I needed. The content is awesome and so is the explanation!
@babakheydari9689
@babakheydari9689 5 months ago
It was great! Thanks for sharing your knowledge. Hope to see more of you.
@tanyaalexander1460
@tanyaalexander1460 5 months ago
I am a noob to data science and feature selection. Yours is the most succinct and clear lesson I have found... Thank you!
@maythamsaeed533
@maythamsaeed533 10 months ago
Very helpful video and an easy way of explaining the content. Thanks a lot.
@claumynbega1670
@claumynbega1670 10 months ago
Thanks for this valuable work. It helps me learn the subject.
@lecturesfromleeds614
@lecturesfromleeds614 3 months ago
Marco's the man!
@shwetabhat9981
@shwetabhat9981 A year ago
Woah, much awaited 🎉. Thanks for all the efforts put in, sir. Looking forward to more such amazing content 🙂
@roba_trend
@roba_trend A year ago
I tried searching your GitHub but couldn't find the data. Where is the data you work on?
@datasciencewithmarco
@datasciencewithmarco A year ago
The dataset comes from the scikit-learn library! We are not reading a CSV file. As long as you have scikit-learn installed, you can get the same dataset! That's what we do in cell 3 of the notebook, and it's also on GitHub!
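For context, loading a built-in scikit-learn dataset typically looks like the sketch below; the wine dataset is only an assumed example here, not necessarily the one used in the video.

```python
# Hedged sketch: load a built-in dataset directly from scikit-learn (no CSV).
from sklearn.datasets import load_wine

data = load_wine(as_frame=True)
X, y = data.data, data.target  # features as a DataFrame, target as a Series
print(X.shape, list(X.columns[:5]))
```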
@michaelmecham6145
@michaelmecham6145 7 months ago
Sensational video, thank you so much!
@imadsaddik
@imadsaddik 7 months ago
Thank you for sharing
@pooraniayswariya997
@pooraniayswariya997 A year ago
Can you teach how to do MRMR feature selection in ML?
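Since this question is not answered in the thread, here is a minimal greedy mRMR (minimum Redundancy Maximum Relevance) sketch of one common variant: relevance scored with the ANOVA F-statistic and redundancy with mean absolute correlation to already-selected features. This is not the video's method, and the dataset and number of features kept are illustrative.

```python
# Hedged sketch: greedy mRMR with F-score relevance and correlation redundancy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import f_classif

X, y = load_breast_cancer(return_X_y=True)
n_select = 5

relevance = f_classif(X, y)[0]                    # relevance of each feature to the target
corr = np.abs(np.corrcoef(X, rowvar=False))       # feature-feature redundancy

selected, remaining = [], list(range(X.shape[1]))
for _ in range(n_select):
    if not selected:
        best = remaining[int(np.argmax(relevance[remaining]))]
    else:
        # mRMR score: relevance minus mean redundancy with already-selected features
        scores = [relevance[j] - corr[j, selected].mean() for j in remaining]
        best = remaining[int(np.argmax(scores))]
    selected.append(best)
    remaining.remove(best)

print("Selected feature indices:", selected)
```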
@tongji1984
@tongji1984 7 months ago
Dear Marco, thank you. 😀
@Loicmartins
@Loicmartins 10 months ago
Thank you very much for your work!
@roba_trend
@roba_trend A year ago
Interesting content, love it 🥰
@therevolution8611
@therevolution8611 A year ago
Can you explain how to perform feature selection for a multilabel problem?
@BehindClosedDoorsBCD
@BehindClosedDoorsBCD 8 months ago
You can convert the labels to numerical features by replacing them with numbers. If you have 3 labels in a feature, you could represent them with 0, 1, 2. There are different methods to use; a simple one is .replace({}).
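A tiny example of the .replace({}) approach mentioned in this reply; the column name and label values are hypothetical.

```python
# Hedged sketch: map string labels to integers with pandas .replace({}).
import pandas as pd

df = pd.DataFrame({"size": ["small", "large", "medium", "small"]})
df["size"] = df["size"].replace({"small": 0, "medium": 1, "large": 2})
print(df)
```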
@scott6571
@scott6571 8 months ago
Thank you! It's helpful!
@datasciencewithmarco
@datasciencewithmarco 8 months ago
Glad it helped!