This was such a great video! The way you explained the complex process made it really easy to understand, even for beginners. I especially appreciated the real-world examples and step-by-step demos. Looking forward to more content like this. Thanks for helping me get started!
Darshil, you are an amazing human being. Giving such informative tutorials free of cost is quite rare these days, so I really appreciate your efforts. Thank you so much for teaching with so much clarity and simplicity. You're the best!!❤
Thank you, Darshil, for providing such end-to-end data engineering projects for us viewers. The effort you've been putting in is great. Looking forward to more and more projects... thank you once again!
I've checked a bunch of channels and video series, and it seems you know what you're doing and love what you're doing. Great, great work! Subscribed. Keep it up!
It was a great tutorial. I followed each and every step, which I feel helped me build the portfolio to apply for data engineering roles. Thank you so much. From now on you are my data guru. :)
I'm so glad I found this video before I started my Azure Data project. It saved me so much time and frustration. And I'm definitely subscribing to your channel and looking forward to more Data related tutorials.
@GATE_Education Yes, you can; just pass the parameter at run time. Have a text file with the names of all the files, then read them one by one and pass each as a parameter.
Hello, Darshil. Today, at 4:50 PM, I finished the first phase of my Data Engineering adventure, and the sense of success is beyond description. The joy I feel is great, and I wanted to share it with you because you inspired me to begin this adventure. You will always hold a special place in my prayers. Thank you so much, Darshil, for contributing to this accomplishment. With deepest thanks.
Hey Darshil, I wanted to thank you for the informative and useful session. Your content is great and very helpful. I appreciate that you are one of the few YouTube channels that covers end-to-end data engineering projects.
Thank you so much, brother. Your motivation helped me get my DP-203. Now I'm doing a guided project from Udemy. Once that is done, I will surely do this ❤
Hi, I started to learn Data Engineering. There's not much info on YouTube or the internet at all, but this video by far has answered a lot of questions I'd had for a long time. Thank you so much!
Great content, Darshil. After completing a data engineering internship, I am looking for real-time open-source projects, and this is one of the best projects I have found; it covers all the major tools/technologies that I learned in the internship. Thank you.
Hey Darshil, Your video was a game-changer for me! I've always been curious about the inner workings of the industry, and your content came at just the right time. Your teaching method is incredibly effective, breaking down complex concepts into easy-to-understand chunks. You've given me the direction I needed to kickstart my learning journey in this non-tech realm. Now, I not only understand how it all works but also know where to focus my efforts. Your comprehensive approach for beginners is exactly what I was searching for. Expect me to be a regular viewer, soaking in all the valuable insights you share. Thanks a ton for being the guiding light I needed!
Hi Darshil, great demos and tutorials; I really appreciate your hard work. Please keep this going, you are helping many. I had a quick question: how do you do incremental loads in this process, and how do you handle SCD Type 2, etc.? I'm talking about the projects used to build the data lake/delta lake, and the AWS projects for a data warehouse on Redshift, etc. Could you please shed some light on both processes?
Amazing, Darshil. As you said in the video, many people charge thousands for this, but you still explained it very simply and free of cost for the viewers. Appreciated, brother... thank you!
Well done, Darshil, very intuitive video; appreciate the work. Can you do a video on how to schedule a Python script on Azure? The script could be extracting data from an API or something and then loading it into Blob Storage. Just giving ideas.
Hey Darshil, I have completed the first part, following along. A big thanks to you for putting in so much effort. The idea of explaining the services alongside the ongoing project is awesome. Keep doing such great work.
This is my very first comment on YouTube; normally I just watch. But ever since I got the idea of transitioning into a data engineer, I've been watching a lot of videos, and I am 100 percent sure this is the best! Thanks, Darshil!
Hi Darshil, This video helped me a lot in kickstarting my understanding of Azure Databricks. I am glad to have come across your channel. Thanks a ton for this and I hope your channel grows beyond expectations :)
A million thanks, Darshil, for sharing your knowledge. I just resumed from my maternity leave, and your videos give me so much confidence to upgrade my skills. Many more successes to you!
Nice one! However, we could have used a ForEach loop and passed the file names from an array variable that contains the names of the files. The Relative URL field could then have been used to refer to each item in the array.
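A minimal sketch of the pattern this comment describes, in plain Python rather than Data Factory JSON: the file names live in one array and a single parameterized copy step runs per item, instead of one hard-coded Copy activity per file. `BASE_URL`, the file names, and `copy_file` are hypothetical stand-ins, not taken from the video.

```python
# Placeholder source; in Data Factory this would be the HTTP linked
# service's base URL, with the Relative URL supplied per item.
BASE_URL = "https://raw.githubusercontent.com/owner/repo/main/"

# The "Array variable" holding all file names.
file_names = ["circuits.csv", "races.csv", "results.csv"]

def copy_file(relative_url: str) -> str:
    """Stand-in for a parameterized Copy activity: the relative URL
    is supplied at run time instead of being hard-coded per activity."""
    return BASE_URL + relative_url

# The ForEach loop: one activity definition, many runs.
copied = [copy_file(name) for name in file_names]
```

The point is that adding a new file then means appending one string to the array, not cloning and editing another Copy activity.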
Nice explanation, bro. Experts generally cut out the thinking part and just show things in a mechanical way, but you telling us your thought process while doing things makes learning very interesting and avoids distraction, as we don't need to search for all the small details anywhere else. Thanks, man!
Darshil, I have one basic question. It might sound stupid, but I just want to know: both Data Factory and Data Lake seem usable for storing data, and I am unable to figure out why we need a Data Lake when we can ingest the data with Data Factory. I googled the difference between the two, but I'm still confused about the core difference between these two services when both seem to store data and perform the desired functions. Any help would be appreciated. Thanks!
I just want to say a million thanks. I just finished my Azure Synapse Analytics certification, and this video gave me so much confidence for my interview.
Hey, hi. I am thinking of starting to learn Azure Synapse. Could you please guide me on how to start, and do you have any suggestions for resources? Thanks in advance 😊
Thank you for putting this together, from PR. Clear explanations and to the point. I like how you made mistakes and solved them on screen, along with the explanations of the services. I'll follow along and wait patiently (while honking my horn in the parking lot :-)) for more content. Quick question: while going through it, the col function was not there on Databricks. Is it common to import it through commands as you go, or should I import libraries at the beginning, like I've seen in regular Python scripts?
Thank you. It is a very good project for connecting different Azure products and learning more about data engineering tasks, even if at a smaller scale. I will definitely try to implement something similar in the future, but using APIs, and from there I can do the transformations either in Azure DF (like mapping data types) or Azure Databricks.
This video is just awesome for learning and understanding so many things. Thank you so much for creating such helpful content and making things simple and clear. You are doing great work; keep going. You just got a new subscriber :)
One suggestion I would like to give: it is not recommended to copy each file every time with a separate Copy activity. You can simply use a parameter that is passed at run time; it will read the parameters one by one and then perform the copy activity, just like a Copy activity inside a ForEach loop.
A mount point is not suggested, since no security is possible there, i.e., anyone with access to the Databricks workspace can access those mount points. Use the abfss path as the alternative.
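As a rough illustration of the abfss alternative this comment recommends: the URI embeds the container and storage account directly, so access is governed by the caller's credentials rather than by a workspace-wide mount. The account, container, and folder names below are made up.

```python
# Hypothetical names; substitute your own ADLS Gen2 details.
storage_account = "mydatalake"   # storage account name
container = "raw"                # container (filesystem) name
folder = "circuits"              # folder inside the container

# ADLS Gen2 abfss URI: abfss://<container>@<account>.dfs.core.windows.net/<path>
abfss_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/{folder}"

# In a Databricks notebook you would then read it directly, e.g.:
#   spark.read.csv(abfss_path, header=True)
# (requires credentials configured in the Spark session; not shown here).
```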
Thank you. Please keep creating similar projects. Something I'm struggling to find is an end-to-end machine learning project with a front-end web application to interact with the ML model.
Hi Darshil, I faced an issue when I tried to build the pipeline: I could not find the HTTP option in Data Factory to create a source in the pipeline. Please help me with this issue. Thanks!
Hi sir, I personally thank you very much for the AWS Data Engineering Project series; we are learning a lot from it. We are all really grateful for this level of generosity. Our humble request is that you create an Azure Data Engineering project using SQL and SSMS, as there is currently only one Azure Data Engineering project. We want to learn more, since there is no Azure project using Bronze, Silver, and Gold layer transformations in PySpark on Databricks and SSMS respectively. I hope our request will soon be accepted. Thank you so much for your valuable guidance and support.
Thank you, Darshil, for showing the different Azure services. What is the benefit of using different services, and why shouldn't we just use Azure Synapse to implement all the steps mentioned?
Thank you for the video. Do we need to publish the first pipelines that we've created at the 1 hour mark? If I close the tab with the pipelines, all the info is discarded.
Hi Darshil, thanks for an awesome project demonstration on Azure cloud. A few things missing from all YT project videos are incremental loads (how the pipeline would handle them) and, very specific to Azure, how collaboration and CI/CD work with the drag-and-drop tooling. Could you please cover some of these concepts with practical examples?
Thank you so much for the great effort. You have an unparalleled teaching style. One question: when there is an option to upload a file to Data Lake Gen2 directly, why are we writing a pipeline to fetch it from an HTTP page?
I am using a trial account created with an edu email, and it's not allowing me to create the app registration. I tried to change the user settings, but I don't have permission to change that. I believe the issue is with Azure Active Directory. Has anyone faced this issue? Please help me out.
Thanks, Darshil, I completed part 1. I faced challenges because the account I was using is a student account and doesn't have access to Azure Active Directory to create an app, but I was able to create it using the CLI and completed everything without an error.