
Data Cleaning in MySQL | Full Project 

Alex The Analyst
Subscribe · 869K subscribers
149K views

Full MySQL Course: www.analystbui...
In this lesson we are going to be building a data cleaning project in MySQL!
Download Dataset: github.com/Ale...
GitHub Code: github.com/Ale...
____________________________________________
SUBSCRIBE!
Do you want to become a Data Analyst? That's what this channel is all about! My goal is to help you learn everything you need in order to start your career or even switch your career into Data Analytics. Be sure to subscribe to not miss out on any content!
____________________________________________
RESOURCES:
Coursera Courses:
📖Google Data Analyst Certification: coursera.pxf.i...
📖Data Analysis with Python - coursera.pxf.i...
📖IBM Data Analysis Specialization - coursera.pxf.i...
📖Tableau Data Visualization - coursera.pxf.i...
Udemy Courses:
📖Python for Data Science - bit.ly/3Z4A5K6
📖Statistics for Data Science - bit.ly/37jqDbq
📖SQL for Data Analysts (SSMS) - bit.ly/3fkqEij
📖Tableau A-Z - bit.ly/385lYvN
Please note I may earn a small commission for any purchase through these links - Thanks for supporting the channel!
____________________________________________
BECOME A MEMBER -
Want to support the channel? Consider becoming a member! I do monthly livestreams and you get some awesome emojis to use in chat and comments!
/ @alextheanalyst
____________________________________________
Websites:
💻Website: AlexTheAnalyst.com
💾GitHub: github.com/Ale...
📱Instagram: @Alex_The_Analyst
____________________________________________
All opinions or statements in this video are my own and do not reflect the opinion of the company I work for or have ever worked for.

Published: 23 Sep 2024

Comments: 522
@anyamunkh1787 · 5 months ago
Watched all the ads without even skipping, that's how much I am grateful for your work and time you put into this.
@projectamaze9677 · 3 months ago
Yes, it's true.. I'm grateful for this course, and that too for free
@ozirikennedy2838 · a month ago
will start doing this too
@chandrabose867 · a month ago
Ma'am, I'm from India. Did you get a job? I'm currently looking for a job; if you got one, please tell me how. It means a lot to me...
@user-sp8sw7vt5k · 5 months ago
Timestamps:
Removing Duplicates: 8:00
Standardizing Data: 17:32
Null/Blank Values: 33:30
Remove Unnecessary Columns/Rows: 46:12
Great video!
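The duplicate-removal step from that first timestamp can be sketched outside MySQL, too. Below is a minimal stand-in using Python's sqlite3 with the same ROW_NUMBER() idea; the table, columns, and toy rows are illustrative assumptions, not the actual layoffs dataset:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE layoffs_staging (company TEXT, location TEXT, total_laid_off INTEGER)")
cur.executemany("INSERT INTO layoffs_staging VALUES (?, ?, ?)",
                [("Casper", "NYC", 78), ("Casper", "NYC", 78), ("Oda", "Oslo", 70)])

# Number identical rows; any row with row_num > 1 is an extra copy.
# (MySQL can't DELETE straight out of a CTE, which is why the video
# builds a second staging table; SQLite's hidden rowid lets us delete directly.)
cur.execute("""
    DELETE FROM layoffs_staging WHERE rowid IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (
                       PARTITION BY company, location, total_laid_off) AS row_num
            FROM layoffs_staging)
        WHERE row_num > 1)
""")

remaining = cur.execute("SELECT COUNT(*) FROM layoffs_staging").fetchone()[0]
```

After the delete, one Casper row and the Oda row survive. The PARTITION BY list is what defines "duplicate", so in the real project it should cover every column.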
@obeliskphaeton · 4 months ago
I converted the 'date' column from text to Date format after importing. And when I ran the duplicate_cte, I only got 5 rows in output. Note: I used date instead of 'date' in the partition by section.
@harshitthakur8300 · 4 months ago
Life saver I was just searching for this
@usmanabid5452 · 3 months ago
nice video
@tejjm4263 · 4 months ago
Thanks for the kind words! I made it to the end and learned a lot while working on the project simultaneously.
@HandyCliffy · a month ago
Timestamp 18:59: after unchecking Safe Updates mode, you don't have to restart MySQL; go to Query at the top left of your window, select Reconnect to Server, and then run your query.
@joe2271 · 12 days ago
I love the way you teach. It is so important to see the errors when learning something new.
@shamisomaisiri9117 · a month ago
3.15am South African time, I managed to finish the project. It was tough had to start over, missed some stuff but I did it. I just completed my Meta Data Analyst Professional Course and decided to do this Boot camp. As someone coming from the Beauty Industry, I was so lost. Thank you @AlextheAnalyst for helping people like me who want to learn and excel but cannot afford tuition for University or most Institutions.
@awuluemmanuel2529 · 14 days ago
how did you download the datasets from his github??
@mahamejaz6857 · 9 days ago
I have started my journey of learning data analytics... and with the help of your content I understand so many things quickly and easily... Hopefully I will get a job at the end of this course... Thanks a lot Alex... From Pak
@farahkhan1128 · 8 days ago
Hey, me too. btw, would you like to share the name of city you are from?
@utkarshrana39 · 5 months ago
Hey Alex! I'm from India. I have been following you for months but really couldn't make any project. But from my first encounter with your content, I knew I was going to walk in your footsteps. I loved it, and I had been looking for data like this for the last 2 weeks; I tried most of the US and Indian data bureaus and what not. Yesterday I decided to make this project, AND WOW, it was the data I was looking for. Thank you so much. This is my second project ever in SQL. I loved it from beginning to end, and I had so much fun doing this project, literally. I was laughing at the funny parts, and was even overwhelmed at the end at the industry-populating part. I cheered yayy. I made a mistake too: I forgot to run the query deleting the duplicates, and had to rerun from the start to find out where I went wrong. I love your energy and how you take things so calmly and carry the learner with you the whole way. I have probably written too much now. I am so excited to follow this project through to the visualization part. And here's one more thing I tried that I want to show you: in the populating part, we can do it without first setting the blanks to NULL. This is the query I tried:
UPDATE layoffs_staging2 t1
JOIN layoffs_staging2 t2
  ON t1.company = t2.company
 AND t1.location = t2.location
SET t1.industry = t2.industry
WHERE (t1.industry IS NULL OR t1.industry = '')
  AND (t2.industry IS NOT NULL AND t2.industry != '');
@alphaghost4330 · 4 months ago
hey i'm working on the latest layoff dataset, matched some unknowns in the stage column by doing a self join on company column, should i update those unknown values?
@avilagustin · a month ago
Hi Alex, I'm from Argentina and I'm watching the whole bootcamp with subtitles. You are a wonderful person when it comes to explaining things; it feels like a friend is explaining the topics to me. Thank you so much for everything you've done.
@ibrahimdenisfofanah6420 · 5 months ago
Patiently waiting for the exploratory aspect of the clean data. Thanks very much
@10cutie207 · 5 months ago
Alex! This is why I subscribed, thank you so much for doing this in MySQL!!
@cjbrown3396 · 4 months ago
watch till the end it's awesome Alex ! thanks so much
@FlintPaul · 4 months ago
Thanks Alex. Great video. The best tip so far was from the data cleaning vid: I didn't realize that I could check the results before executing changes on the database. Like SELECT start_time, LENGTH(start_time), SUBSTRING(start_time, 1, 19) to check the truncation of that string prior to the real deal.
@rosaadly · a month ago
I don't know why, but I'm so thankful for you. I really hope to see people as great as you in my daily life. Thank you, Alex; I hope you get back as many positive things as the number of people who benefit from your videos.
@zakhelembhele7046 · 4 months ago
Alex, You're so natural. The Best yet!
@NduezeIfeanyiDavid · 26 days ago
Timestamp 24:27: This is how I resolved all those, just the way you taught us. It is usually caused by a difference in character type.
SELECT location FROM layoffs_staging2;
SELECT DISTINCT location, country FROM layoffs_staging2;
SELECT * FROM layoffs_staging2 WHERE location LIKE 'Mal%';
UPDATE layoffs_staging2 SET location = 'Malmo' WHERE location LIKE 'Mal%';
SELECT * FROM layoffs_staging2 WHERE location LIKE 'Flor%';
UPDATE layoffs_staging2 SET location = 'Florianopolis' WHERE location LIKE 'Flor%';
SELECT * FROM layoffs_staging2 WHERE location LIKE '%sseldorf';
UPDATE layoffs_staging2 SET location = 'Dusseldorf' WHERE location LIKE '%sseldorf';
Hope this is okay?
@AnnNguyenHo · 5 months ago
Amazing Alex, this is exactly what I'm looking for my project too. Thank you so much
@newenglandnomad9405 · 5 months ago
Outstanding video. I did follow most of it, the rest I'll rewind and study. Definitely going to be doing the code along myself and posting to my portfolio. Thanks for the very detailed walk through. I am trying to get better so I can try this as a side hustle while looking for a data job. I have a comfy IS help desk job, I'm just bored to death of it and not learning anything new.
@oyeyemiakinsanmi4713 · 2 months ago
I must commend you Alex, had a little bit of trouble populating the null values following the instructions in this video, had to go and check the code on your Github which was super helpful and straightforward. Thanks for your lecture and am sure going to complete this boot camp playlist❤
@stephenwagude9330 · 24 days ago
Finally completed this after a second trial and now I have my first project on MySQL
@alamsgodwin6179 · 5 months ago
Thanks Alex, Can't wait to Start this project
@peaceandlove8862 · 5 months ago
Alex's videos are always so real, authentic, and so relevant!
@ahmadmarzodandy6054 · 5 months ago
Thanks for this video, Alex! Really need it
@DasaHerrera · a month ago
I shouldn't be having this much fun doing homework. Thank you Alex!
@saheel7 · 19 days ago
Dear Alex, I wanted to take a moment to express my heartfelt thanks for the incredibly useful lesson you provided on data cleaning. Your generosity in sharing your knowledge for free has been invaluable, and the techniques you've taught me will undoubtedly help me handle data more effectively in my work. I truly appreciate your time, effort, and willingness to help others grow in this essential skill. Thank you once again for your guidance and support! Warm regards, Saheel Mowlana
@AnalystInMaking · 5 months ago
Has anybody told you that you are not just good but you are AWESOME !👑
@mggaming106 · 24 days ago
absolutely enjoyed the tutorial and learnt loads of stuff!! When you said at the end that not everyone is able to make till the end, I felt really good that I could follow along and made it till the end. :) thank you very much Alex for putting out this tutorial for all of us newbies. Really looking forward to begin Exploratory data analysis tutorial!! Cheers..!
@hopemsoffe702 · 25 days ago
Made it to the end!! Thank you Alex. Much love from Tanzania:)
@irshadmansour7408 · 25 days ago
twinsies
@thamizarasan1913 · 5 months ago
Thanks for doing a project in SQL. I waited a long time for this.
@muneebbolo · 4 months ago
Thanks for sharing this helpful content, Alex! We need more of this.
@ozirikennedy2838 · a month ago
Thank you Alex for the kind words at the end of this video. It made me feel good 😊
@piromaniaco3579 · 4 months ago
Just finished doing this one, really fun and practical. Now heading to the EDA part. A question: I'm not sure how to post these projects in a portfolio. I normally publish projects on my GitHub page when it's web or app development, but I've never done SQL projects before. How should they be published to make them properly visible for recruiters, for example? Thanks Alex for all the value you share
@sujayy6851 · 4 months ago
Thanks a lot for simplifying MYSQL Alex!
@ibrahimolasunkanmi7576 · 3 months ago
Alex The Analyst, You are a Blessing to this Generation...
@tobijarrett3951 · 12 days ago
Alex congratulating me for making it to the end like I have a choice. I have made it to the end like 3 times now 😂. Great work man. I bless the day I discovered your channel.
@gauravtanwar8886 · 5 months ago
exactly what i was looking for! thanks a lot 🙌🏻
@Ladyhadassah · 5 months ago
Great work, Alex. we love you
@eritrean_forever · 4 months ago
...and another lesson taken... 47:35 "I can't trust that data, I really can't!" We should get to this level before confidently deleting 'useless' rows! As always, Alex you're the best! Thank you very much for all your contribution!
@huonggiang8277 · 3 months ago
I have a question. I did the exact same thing, but nothing shows up when I do the 'removing duplicates' part. I went through all the steps again and just realized that my table has only 564 records. I don't know why. Can you explain how to fix it?
@ericawelch4218 · a month ago
I'm having this issue, did you figure it out?
@huonggiang8277 · a month ago
@@ericawelch4218 unfortunately, I have not found the solution. but just keep doing it and ignore those 'differences'.
@swapnilhatwar1310 · a month ago
@@ericawelch4218 have you found the solution?
@Maakikirkiri · 3 months ago
While you were updating the date format there were null values, and you could still go ahead and run the update query. But when I tried to do the same, it wouldn't let me and throws an error: `Error Code: 1411. Incorrect datetime value: 'NULL' for function str_to_date`
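The quoted 'NULL' in that error message is the clue: the import left the literal text 'NULL' in the column rather than a real SQL NULL, which STR_TO_DATE rejects. One possible workaround, sketched here with Python's sqlite3 since SQLite has no STR_TO_DATE (table, column, and date format are assumptions), is to null out the text first and then convert only the non-null values:

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE layoffs_staging2 (company TEXT, date TEXT)")
cur.executemany("INSERT INTO layoffs_staging2 VALUES (?, ?)",
                [("Casper", "12/16/2022"), ("Oda", "NULL")])

# Step 1: the import left the literal text 'NULL' behind; make it a real NULL.
cur.execute("UPDATE layoffs_staging2 SET date = NULL WHERE date = 'NULL'")

# Step 2: convert the remaining MM/DD/YYYY strings, skipping real NULLs,
# which has the same effect as adding WHERE `date` IS NOT NULL to the
# MySQL STR_TO_DATE update.
for rowid, text in cur.execute(
        "SELECT rowid, date FROM layoffs_staging2 WHERE date IS NOT NULL").fetchall():
    iso = datetime.strptime(text, "%m/%d/%Y").date().isoformat()
    cur.execute("UPDATE layoffs_staging2 SET date = ? WHERE rowid = ?", (iso, rowid))

rows = cur.execute("SELECT company, date FROM layoffs_staging2 ORDER BY company").fetchall()
```

The equivalent MySQL fix would be the two UPDATEs in SQL: one setting `date` to NULL where it equals the text 'NULL', then the STR_TO_DATE update with an IS NOT NULL filter.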
@arnavchopra7708 · 2 months ago
Hello, were you able to fix this? I'm facing the same error... please help
@okodedefaith6582 · 2 months ago
@@arnavchopra7708 hello, I am facing the same problem. how did you fix this issue
@Duncan-Muthami · 2 months ago
@@okodedefaith6582 confirm whether you are putting % before the Y
@chandrabose867 · a month ago
same bruh
@chandrabose867 · a month ago
@@arnavchopra7708 hey, it actually changed; check the correct table
@ShortClipsPodcasts · 5 months ago
I'm having a problem importing the data: the original data has 2000+ rows, but when I import it, it only has 564. Does anyone know how to fix this issue?
@Pato-rt1vh · 5 months ago
Same, if anyone knows a video I can watch to fix it just let me know. 👍🏽
@SomeStatus · 5 months ago
@@Pato-rt1vh convert that .csv into a json!
@piromaniaco3579 · 4 months ago
I am facing the same issue. I just came to the comment section to see if anyone can bring some light to it. I really want to practice and do the project, it's frustrating to be stuck right before starting.
@ichigokurosaki6470 · 4 months ago
I’m having the same issue
@rahulganeshregalla1165 · 3 months ago
I faced the same issue. I don't have a solution, but a suggestion: just follow along with the video using whatever data you could import. It won't be perfect, but try to get what we're trying to do here. Then practice data cleaning on some raw data, which you can find on Kaggle.
@leosch80 · 5 months ago
Excellent Alex!!! You read my mind, man! This is just what I needed to put in my portfolio. THANK YOU
@hazoom · 5 months ago
I appreciate your work Alex, well done
@jaden13art50 · 3 months ago
Hey Alex, I was wondering if you could help me troubleshoot what I'm doing incorrectly. All my row numbers are different when I run the query at the 9:31 mark of the video.
@offodilesomtochukwu6268 · 2 months ago
Hiii, I'm going through the same challenge at this point. Were you able to fix your issue? If so, please walk me through it.
@austinohlrich9370 · a month ago
@@offodilesomtochukwu6268 same, im getting a HUGE list, not just 6 or 7 like in the video...
@austinohlrich9370 · a month ago
guess it doesnt matter since we're deleting them all anyways...
@mikellerfranco · 27 days ago
Probably a little late for you guys, but I was having the same issue. I noticed that everything was partitioned correctly except the date; the issue was the quotation mark. Make sure you are using backticks, like this: `date`
@ula7110 · 10 days ago
Hey, I've had the same issue and I had to click on the row_num column so it would show in ascending order
@michaelp9061 · 5 months ago
Incredible tutorial Alex. Thank you!
@ppiercejr · 4 months ago
I understand the ease of which using the upload tool makes it to create a table and import the data all at once, but I find that it makes it cumbersome in this case since there is no unique column. Is there a reason that you wouldn't create the schema for the table and create an auto incrementing id column that is the primary key to assign a unique id to every row, then use the row_number function to search for duplicate rows using all the columns except the id column. This would save you from having to create a duplicate table to store row_num as you could just use the id column to delete the duplicate records. This also seems like it would make your database easier to deal with since it would have a proper primary key. Sure, it is a meaningless primary key, but it would certainly make updating the data easier and faster in many cases.
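The surrogate-key approach this comment describes can be sketched end to end. Here is a minimal stand-in using Python's sqlite3 (AUTOINCREMENT id, illustrative schema and toy rows, not the real dataset): keep the lowest id in each group of identical rows and delete the rest, with no second staging table and no ROW_NUMBER() needed.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# An explicit surrogate key, as the comment suggests (schema is illustrative).
cur.execute("""CREATE TABLE layoffs (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    company TEXT, location TEXT, total_laid_off INTEGER)""")
cur.executemany(
    "INSERT INTO layoffs (company, location, total_laid_off) VALUES (?, ?, ?)",
    [("Casper", "NYC", 78), ("Casper", "NYC", 78), ("Oda", "Oslo", 70)])

# Keep the lowest id of each group of identical rows; delete the rest.
cur.execute("""
    DELETE FROM layoffs WHERE id NOT IN (
        SELECT MIN(id) FROM layoffs
        GROUP BY company, location, total_laid_off)
""")

kept = cur.execute("SELECT id, company FROM layoffs ORDER BY id").fetchall()
```

In MySQL the same pattern works with an AUTO_INCREMENT PRIMARY KEY column, and, as the comment notes, the table then has a proper (if meaningless) primary key for later updates.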
@MrWonderninja · 4 months ago
Learned a lot following along through this, excited to follow the EDA next!
@uniongrob8194 · 20 days ago
Thank you, Data guy, brilliant work)
@edydossantos · a month ago
I've done something wrong, as I keep going back and forth and my data doesn't have any duplicates. Jeez! Any idea what is wrong?
@SafiaSingla · 5 months ago
This was an amazing tutorial!! Thank you
@mohammadalom4854 · 4 months ago
Hello, great video. However, I'm having trouble transferring all the data from the file to MySQL. Do you have any suggestions?
@nicolekanigina · 4 months ago
Bro, same here. I only get around 500 imported.
@mohammadalom4854 · 4 months ago
@@nicolekanigina Did u end up finding a solution?
@_m3atball · 28 days ago
also having this problem
@cgadison · 5 months ago
This was very insightful, thank you so much for this Alex.
@womanonamission8677 · 3 months ago
Took me all day, but yayyy, I'm done with my first SQL project!!
@rakeshbasak6842 · 4 months ago
Awesome work Alex! And thanks for providing this kind of content
@krystalbrantley4996 · 5 months ago
Thank you so much for sharing your expertise. I learned and I laughed (your comments 😄) throughout the tutorial. You're awesome!
@AlexTheAnalyst · 5 months ago
Haha glad to hear it! Learning should be fun :D
@nadeemjan1000 · a month ago
Great, Alex, thank you so much. I have learned a lot, from the basics through cleaning data. Thank you once again.
@sharonhay5884 · 17 days ago
Thank you, Alex. Now I can see how SQL works in a real project. Although I had finished all of your SQL videos, I had no idea how to use it in a real project. But thanks to this video I feel like I can move forward. Again, thank you. You are the best teacher for a self-learner like me.
@ratnakshtyagi3564 · 4 months ago
thanks alex for this data cleaning practice
@srp4024 · a month ago
In the data where there are null values for industry, those rows are repeated, but they did not show up in the CTE (row_num) query for row_num > 1. E.g., Bally's Interactive has three exactly identical rows with null industry and total_laid_off. How do we remove these duplicate rows?
@thedevreda2839 · 12 days ago
Did you find a solution for that?
@alice60372 · 5 months ago
Alex, you are the best! Thank you so very much... Please do more videos on data cleaning.
@duurduranto · 4 months ago
Absolutely phenomenal work. Thank you very very much for this. Cleared some of the concepts and at the same time created a project. Absolutely love it.
@nickc.440 · a month ago
Trying to do this tutorial, but whenever I use the import wizard the CSV is not fully read and only 564 lines are registered. Did anyone else have this issue?
@thatgirlwangechi · 2 months ago
Thank you for your help. I appreciate your videos a lot
@muhammadomer1654 · 14 days ago
Thanks Alex for such an informative tutorial on cleansing. But how would we handle some of the remaining null values in the total_laid_off, percentage_laid_off and funds_raised_millions columns?
@Maakikirkiri · 3 months ago
Thanks @alex, I converted the CSV to JSON and it worked
@bharathlingampalli4708 · 4 months ago
while importing the dataset, it is automatically removing many rows. It has only imported 534 rows. Can anyone help me with this?
@미화-d9o · a month ago
same issue
@derrickmedina2796 · 4 months ago
Great work! Love how it was all broken down
@JesusA.HidalgoGamez-go2ib · a month ago
About the duplicates: I used a subquery to reference the table in the 'WITH' statement, that way I get to delete duplicates without creating another table. Just wanna know if it's a valid procedure ☺. Love this course, learning a lot.❤
@karthickraja36 · a month ago
Can you help me with the query?
@irehlove · 2 months ago
Thank you so much for these lessons. I just had one question: when importing my .csv file into MySQL, I was not able to get all of the data, only around 500 rows. Could I get some help on how to load in all of the data? I am running all of this on a MacBook.
@camfaux · 2 months ago
I had the same issue, and from reading other comments this seems to be an issue for anyone using MySQL on Mac. I found the best workaround was to convert this to a JSON. Just search online for a CSV to JSON converter and use the JSON file when first importing. However, this seemed to cause another issue where it replaced NULL values with the text "NULL", and that gave me errors later when converting the dates. So, to solve this, the first thing you will want to do after creating the 'layoffs_staging2' table is convert all those "NULL" text values to actual nulls using the following:
UPDATE layoffs_staging2
SET `company` = CASE WHEN `company` = 'NULL' THEN NULL ELSE `company` END,
    `location` = CASE WHEN `location` = 'NULL' THEN NULL ELSE `location` END,
    `industry` = CASE WHEN `industry` = 'NULL' THEN NULL ELSE `industry` END,
    `total_laid_off` = CASE WHEN `total_laid_off` = 'NULL' THEN NULL ELSE `total_laid_off` END,
    `percentage_laid_off` = CASE WHEN `percentage_laid_off` = 'NULL' THEN NULL ELSE `percentage_laid_off` END,
    `date` = CASE WHEN `date` = 'NULL' THEN NULL ELSE `date` END,
    `stage` = CASE WHEN `stage` = 'NULL' THEN NULL ELSE `stage` END,
    `country` = CASE WHEN `country` = 'NULL' THEN NULL ELSE `country` END,
    `funds_raised_millions` = CASE WHEN `funds_raised_millions` = 'NULL' THEN NULL ELSE `funds_raised_millions` END;
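A shorter equivalent of that CASE WHEN pattern is NULLIF, which returns NULL when its two arguments are equal and the first argument otherwise; it exists in both MySQL and SQLite. A minimal sketch using Python's sqlite3 (two assumed columns and toy rows, not the real table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE layoffs_staging2 (industry TEXT, stage TEXT)")
cur.executemany("INSERT INTO layoffs_staging2 VALUES (?, ?)",
                [("Retail", "NULL"), ("NULL", "Series B")])

# NULLIF(x, 'NULL') returns NULL when x equals the text 'NULL', else x,
# so each column needs one short expression instead of a CASE WHEN block.
cur.execute("""
    UPDATE layoffs_staging2
    SET industry = NULLIF(industry, 'NULL'),
        stage    = NULLIF(stage, 'NULL')
""")

rows = cur.execute(
    "SELECT industry, stage FROM layoffs_staging2 ORDER BY rowid").fetchall()
```

The behavior is identical to the CASE WHEN version; NULLIF just reads more compactly when many columns need the same treatment.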
@SHAMIANSARI-o2f · 5 months ago
Truly a blessing for us.
@TheSupersayan6 · 5 months ago
Can you make a tutorial on how to connect this MySQL database to Power BI and build a dashboard for it?
@hasanrazakhan7154 · 5 months ago
Thanks Alex for this amazing video
@rokibchi · 5 months ago
Looking forward to the next project
@AkashKumar-jo7ec · 3 months ago
@Alex The Analyst, thank you so much for sharing genuine content. So far I have learned a lot from your SQL tutorials (there is one issue with fixing the text type to date). I hope when you find this message you can help me out.
@zaidahmad.773 · 5 months ago
Hey Alex, thanks for the video. Please cover data cleaning in Stata or R as well.
@skismshehe5815 · 3 months ago
Thank you so much for this, I enjoyed every step of this
@nickelikem · 4 months ago
I find your method of removing duplicates too complicated. When inserting into the new table, I removed duplicates by running 'SELECT DISTINCT * ....' . Is there any downside to my method? Are there cases where it wouldn't work?
@andiralee408 · 3 months ago
Oh oh me too! I used Union of the same table to get rid of duplicates, but SELECT DISTINCT * is just so much quicker and shorter! Thanks for sharing :) Great job us! PS: Alex if you're seeing nickelikem's comment please also see if there might be any downside to my method in the long run. Thanksss
@baominh4380 · 5 days ago
As Alex was checking for duplicates, the company "Casper" has different layoff waves, 3 in total; one is unique and two are duplicates of each other; however, you would want to keep one of the two duplicates due to itself; if you go SELECT DISTINCT, it would result in one unique and leave out the other input. That's my take on the SELECT DISTINCT
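On the DISTINCT question in this thread: when the duplicate key is every column, SELECT DISTINCT * and the ROW_NUMBER() filter keep exactly the same rows, so the Casper case (one unique wave plus an exact duplicate pair) comes out the same either way; the two only diverge if duplicates are defined on a subset of columns. A small sketch with toy data (sqlite3 stand-in, not the real dataset):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (company TEXT, laid_off INTEGER)")
# Casper-style data: one unique wave plus one exact duplicate pair.
cur.executemany("INSERT INTO t VALUES (?, ?)",
                [("Casper", 78), ("Casper", 40), ("Casper", 40)])

distinct_rows = cur.execute(
    "SELECT DISTINCT * FROM t ORDER BY laid_off").fetchall()

# ROW_NUMBER() partitioned on every column keeps one copy per group too.
rownum_rows = cur.execute("""
    SELECT company, laid_off FROM (
        SELECT company, laid_off,
               ROW_NUMBER() OVER (PARTITION BY company, laid_off) AS rn
        FROM t)
    WHERE rn = 1 ORDER BY laid_off
""").fetchall()
```

The practical downside of SELECT DISTINCT * is that you cannot choose which copy survives or dedupe on a partial key, which is where the ROW_NUMBER() approach earns its extra complexity.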
@justinabosco2763 · 14 days ago
Thanks for the great lesson, Alex. Because we fixed some industries, I wanted to check whether the row numbers are still distinct. I tried this syntax, is it correct?
WITH cte AS (
  SELECT row_num AS row1,
         ROW_NUMBER() OVER (PARTITION BY company, location, industry, total_laid_off, percentage_laid_off, `date`, country, funds_raised_millions) AS row2
  FROM layoffs_staging_2
)
SELECT * FROM cte WHERE row1 != row2;
@georgek398 · 4 months ago
Why does the data appear to be pre-cleaned? I'm not seeing the duplicates, and I'm not seeing the different crypto industries...
@AlexTheAnalyst · 4 months ago
Are you getting the data from the github link?
@georgek398 · 4 months ago
@@AlexTheAnalyst thanks for your response. I am getting it from GitHub. With the help of the comments I believe I have figured it out: I had to convert the data to JSON and raise the 'Limit to 1000 rows' dropdown to something higher than the length of this data. Otherwise I was about to give up, so perhaps a description update would help other viewers. Now I just have to change all the 'NULL' strings in the JSON data into actual NULL values. Thanks again
@ichigokurosaki6470 · 4 months ago
@@georgek398 Have you figured out how to change the null strings into null values? I'm stuck on that at the moment
@emansabra-n7g · 2 months ago
Thanks a lot, Alex. You're doing an amazing job. 🥰💜
@yustone · 5 months ago
Thanks, I really like this project
@harshitthakur8300 · 4 months ago
Great video, and easy to learn from these kinds of videos.
@sseemm · a month ago
Thank you Alex. you're the best
@user-wm6fq6zx7v · 2 months ago
Can someone please tell me how to add this to my portfolio/resume? (I'm a fresher)
@irfankhan-qj4mu · 4 months ago
Sir, if you write table_name.date (column name), it works better, because 'date' on its own does not work in my Workbench. You are super cool, Sir; thanks and respect from Pakistan
@dennisbunarta1190 · 4 months ago
I love this channel.. God bless you, Alex
@vivianoffia315 · 2 months ago
@AlexTheAnalyst I'm only able to import 536 out of the 2000+ rows using the import wizard. Please, how do I resolve this? PS: I'm using a Mac
@akanshamishra219 · 2 months ago
Is it solved now? I'm facing the same issue
@niky2152 · 2 months ago
Great video! Those mistakes helped me the most!
@kahinaabbas9792 · 5 months ago
just on time 🤩
@justinkings635 · 28 days ago
Regarding the "Removing Duplicates" part: I tried to check whether any duplicates were left by adding WHERE company = 'Casper', and duplicates still showed up; but when I filtered the rows using the results grid, it didn't show any duplicates. Does that confirm that the duplicates are deleted? Thanks in advance!
@DataCapsulas · a month ago
Thanks for your explanation!
@limuwaa4623 · a month ago
Hello guys, when I import the table into SQL I get a table with about 500 rows, so a lot are actually missing. I'm only facing this problem on my Mac, not on my Windows PC. Does anyone know what the problem could be?
@nickc.440 · a month ago
i have no idea.
@nickc.440 · a month ago
I had to convert the CSV into a JSON file on a third-party website. I'm sure there's a better solution that keeps it as CSV, but that worked best for me. Still couldn't figure out the root problem, though; maybe it has something to do with how null values are read
@limuwaa4623 · a month ago
@@nickc.440 okay thanks, I'm gonna try that
@Arslan-nm3rz · 5 months ago
Thank you very much Alex!!!
@allankamau6299 · 2 months ago
I made it to here. #moving forward
@elle8388 · 2 months ago
😀 Are you going for the entire bootcamp?
@allankamau6299 · 2 months ago
@@elle8388 yes
@allankamau6299 · 2 months ago
@@elle8388 Yes I am
@allankamau6299 · 2 months ago
@@elle8388 yes I am
@allankamau6299 · 2 months ago
@@elle8388 yes
@kujiama7946 · a month ago
Tried doing it blind, just doing the 4 steps you mentioned for data cleaning; I got at least 70% right (I missed stuff, especially the 'United States.' one)
@juliewali1681 · a month ago
16:50: if the DELETE command doesn't work even after you untick the safe updates option in Preferences, you can add this to the code: SET SQL_SAFE_UPDATES = 0;
@bayonette1 · 4 months ago
It was a very useful project, thank you very much 🙌
@rhythmdas2689 · 3 months ago
Hey there! I am trying to transfer the data to SQL, but it is showing 564 records instead of 2361. Can someone please help?
@ARAOYEOMOTOLA · 2 months ago
Did you uncheck the anomaly-removing option while importing your data?
@sparkle_ko · 3 months ago
Thank you so much, Alex. 😊
@ryanjames3235 · 5 months ago
I'm trying to do this project following the steps, but I'm having difficulty creating a new table from the existing table: the LIKE clause is not working, and neither is the AS ... SELECT FROM statement.
@caioguazzelli4736 · a month ago
Same for me. Were you able to solve this?