I am amazed how clearly you explain these things and how much work you put into these free videos. Cannot thank you enough for this quality content! Please do not stop; the work you do and the talent you have are incredible!
Hi Darshil, I am currently working as a data engineer at an MNC and I find your projects amazing. They give us hands-on practice with a clear objective, and material for my portfolio.
This was amazing, Darshil. As a cloud enthusiast (most of my experience is on-premise), it was a perfect end-to-end data project. Please create more such videos; it was a great learning experience and will add value to everyone's resume.
I am glad that you liked it. Just a heads-up: these projects are for learning and to show how things work in the real world. I do not recommend adding this project to your resume; take the learning, apply it to a different problem, and then add that instead, which will hold more value. Everyone is going to add this project after watching my video, so try to stand out.
Hi Darshil! I somehow stumbled on your video, and I'm literally amazed by the way you explain things. You really showed me the use cases of the services. Please bring more such end-to-end projects, as we don't really have such things on YouTube.
Thank you so much, Darshil, for these videos you made; you are my mentor so far on this data engineering journey. God bless you, bro, keep the good work going. As a DevOps engineer and AWS SAA certified, I know the energy it takes to build even a single project. Thank you once more.
Hi Darshil, your content is really amazing and easy to understand. I am gaining great knowledge by watching your videos. But I am stuck on a problem and I can't find a proper solution anywhere online. While creating the crawler I attached the proper policy and everything as told, but in the end it throws this error: 'One crawler failed to create. The following crawler failed to create: "glue-enigma-jhud". Here is the most recent error message: Account ..................... is denied access.' Please help me out with this issue. Thank you!
Any laptop is good enough as long as it has a basic config: 8-16GB RAM, 500GB SSD, and a recent processor. Most of the time you will be using a cloud platform to do your work, so your laptop config does not matter that much. www.linkedin.com/posts/darshil-parmar_datascience-dataengineering-cloudcomputing-activity-6949973068303720448-OgF2?
Are you doing paid lessons for individuals or groups to learn data engineering? I'm lost and don't know how to start. I have tried searching for online courses but couldn't find one that teaches what I want to learn.
Hi Darshil, I have been getting an error while executing a query on Athena: "The S3 location provided to save your query results is invalid. Please check your S3 location is correct and is in the same region and try again. If you continue to see the issue, contact customer support for further assistance." I have checked the region specified for all my buckets, but the error still persists. Can you help?
Hey Darshil, I'm trying to generate DDL from Athena database tables but getting the error below: ErrorCategory:USER_ERROR, ErrorCode:UNKNOWN_ERROR. This query ran against the "covid-19_db" database, unless qualified by the query. Can you please help to resolve this?
Hey @Kenneth Law and @Renz Carillo, we have a Discord channel where you can ask your doubts and share issues. It's difficult to debug things in YouTube comments, so join the Discord and ask your doubts there.
@Darshil Parmar, is it possible to become a data engineer by following your videos? I don't have enough money to spend on a data engineering course from outside; please advise, bro.
Yes, you can learn most things for free on my channel. I have also added a data engineering roadmap on YouTube: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-5CA8LenDMig.html
@DarshilParmar Hi Darshil, I'm using a Windows PC and I'm not able to preview data in the table. Can you explain why? The crawler works fine and I'm able to create 10 tables, but there is no data in the tables. I have checked my CSV files; they contain Excel files.
You will still have to create a table in Athena to query data from S3. That means you'll have to define the schema, headers, and other properties too. It's manual work, but there are different ways to solve one problem; I just used a Glue crawler to showcase one way.
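(Editor's note: a minimal sketch of the manual alternative described above, i.e. hand-writing the Athena DDL instead of running a Glue crawler. The table name, column names, types, and S3 path here are hypothetical placeholders, not from the video; substitute your own.)

```python
# Build a CREATE EXTERNAL TABLE statement for a CSV dataset in S3.
# Columns/types and the S3 location below are illustrative assumptions.
columns = {
    "fips": "STRING",
    "report_date": "DATE",
    "cases": "INT",
    "deaths": "INT",
}

column_defs = ",\n".join(f"  {name} {dtype}" for name, dtype in columns.items())

ddl = (
    "CREATE EXTERNAL TABLE IF NOT EXISTS covid_cases (\n"
    + column_defs
    + "\n)\n"
    "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
    "LOCATION 's3://your-bucket/your-folder/'\n"        # hypothetical path
    "TBLPROPERTIES ('skip.header.line.count'='1');"     # skip the CSV header row
)

print(ddl)
```

You would paste the generated statement into the Athena query editor (or submit it via the API); the `skip.header.line.count` property tells Athena not to read the CSV header row as data.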
If the files have the same structure, it will make one single table for all the data; if the files have different structures, it will give you an error while querying.
You are only using tools, so why is this job high-paying? I think the demand exists only because there are few people who can use these tools; once enough people learn them, the role will no longer be in demand or high-paying.
So what's your plan, bro? You can't learn tools used by few people because the demand won't be there in the future, and there's no point in learning tools that everyone knows because of the lower pay.
@@DarshilParmar Bro, the point is that, as you said in one of your videos, it is one of the highest-demand, highest-paying jobs in the market, but later it might not be. There is no issue in learning the tools everyone knows, but the parallel question is whether it is a good career for the future. I have started following the roadmap, but recently I watched your video on how DE projects are made, and that's why this question arose in my mind; I only want your suggestion and guidance. Thanks.
@@pradhyumansinghmandloi8240 I thought you were making a statement. But this goes for every career: if you just think and never take action, you will stay behind. While you are still thinking about tools and their future, other people have already learned them, got the job, and started their careers.
@@DarshilParmar Yes, you are right. I have already started my career as a software engineer, but after watching your video I am looking to switch into data engineering; that's why I have some doubts regarding this.
How I downloaded the data from s3://covid19-lake with the AWS CLI: aws s3 sync s3://covid19-lake// / I ran 5 commands like this to upload the 5 folders.