Thanks for your note. If you want to manage Snowflake more programmatically, you can watch my paid content; many folks don't know the power of Snowpark. These 2 videos (one for JSON and one for CSV) will help broaden your knowledge, and they are available at a discounted price for a limited time. The code can automatically create DDL and DML, run the COPY command, and make all SQL statements available for CI/CD. 1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50 2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35
Great video. It would have been wonderful if it also covered: 1. how to do end-to-end CI/CD, and 2. how to set up a pipeline dependency between a data ingestion tool and a Snowflake task (assuming we can bundle all the loading steps you covered in this video into a Snowflake task). Apologies if you have already covered these elsewhere; if so, please point me to them. Many thanks. 1:21:30
Thanks for everything. You helped a lot ❤! May I ask if you can make videos on exception handling and error logging? E.g. one of the CSVs has an additional column. Another example: the Wi-Fi connection fails while loading data into the internal stage; how do you resume the job? Thanks bro! :)
Yes, you can do it. In its current version, Snowpark is essentially a SQL generator. You may want to watch these videos on what it is and what it is not: 1. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE--awSPRW9AOY.html (What is Snowpark) 2. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-7tToBddZ_is.html (What is NOT Snowpark)
Hi, I was trying to list the CSV and other files on the internal stage. I even get the directory output, but I'm not able to see the data as a result! Please help; I have been trying to fix this for hours but have no clue. Thanks.
Not sure which step you are talking about. If you can give me a timestamp, that would be helpful, or you can share a screenshot with my Instagram account (instagram.com/learn_dataengineering/).
Thank you for sharing such good content. I must say you are a rockstar in the Snowflake world. I have a question: since a lot of DataFrames are created in the Snowpark Python scripts and the code runs from a local machine, does it consume local system storage/compute, or is everything pushed to Snowflake's storage/compute? Thank you in advance!
Thanks for your note. When you perform an operation on a Snowpark DataFrame, it uses Snowflake's compute: the DataFrame operations are translated into SQL and executed in the warehouse. Only when you pull data down to your local machine (e.g. with `collect()` or `to_pandas()`) does it use your local compute and memory.
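The lazy, push-down model described above can be sketched with a toy class. This is pure Python with no Snowflake connection; `LazyFrame` and its methods are hypothetical stand-ins to illustrate the idea, not the real Snowpark API. Transformations only record work; the single SQL statement is produced (and, in real Snowpark, executed by the warehouse) at the final action.

```python
# Toy illustration of Snowpark's lazy, push-down execution model.
# A Snowpark DataFrame records transformations and only ships SQL to the
# warehouse when an action (collect, count, to_pandas) is called.
# Nothing here "runs" until .to_sql() is asked for.

class LazyFrame:
    def __init__(self, table, ops=None):
        self.table = table
        self.ops = ops or []

    def select(self, *cols):
        # Transformation: recorded, not executed (no compute used yet).
        return LazyFrame(self.table, self.ops + [f"SELECT {', '.join(cols)}"])

    def filter(self, cond):
        # Transformation: also just recorded.
        return LazyFrame(self.table, self.ops + [f"WHERE {cond}"])

    def to_sql(self):
        # "Action": only now is a single SQL statement assembled; in real
        # Snowpark this is the point where the warehouse does the work.
        select = next((op for op in self.ops if op.startswith("SELECT")), "SELECT *")
        where = " ".join(op for op in self.ops if op.startswith("WHERE"))
        return f"{select} FROM {self.table} {where}".strip()

df = LazyFrame("ORDERS").select("O_ORDERKEY").filter("O_TOTALPRICE > 100")
print(df.to_sql())  # SELECT O_ORDERKEY FROM ORDERS WHERE O_TOTALPRICE > 100
```

The design point: chaining `select`/`filter` is cheap on the client; the heavy lifting happens server-side once, at the action.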
Thank you for your quick response. If I would like to push everything to Snowflake storage and compute, how should I do it? How should I register the Snowpark Python programs in a Snowflake database and run/debug them (instead of the stored procedure route)? Is it really possible? Maybe a separate video would help @DataEngineering
Watch ch-08 of this Snowpark playlist and you will understand how to deploy it (playlist link: ru-vid.com/group/PLba2xJ7yxHB4yPg3pUrobdzeMxk4mP24S).
@DataEngineering Thank you. I already watched it. Does that mean we should test it locally first and then deploy to the SF sandbox? I am looking for options to develop, test, debug, and deploy directly in the SF sandbox itself. Is that possible? Any insight?
The complete data set is too big; the description has a link with limited data. ----- And yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available); many folks don't know the power of Snowpark. These 2 videos (one for JSON and one for CSV) will help broaden your knowledge and are available on Udemy. The code can automatically create DDL and DML and run the COPY command. 1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/ 2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/
Quick question: I am on Part 4. I downloaded the data to my computer, but for some reason it does not show up when I try to run it through Snowflake. Is there a reason for that?
Hi bro, while going through the course I found that not all the data is provided in the GitLab link, including exchange_rates.csv at 50:00. The exchange rate column is null for all rows after moving the file into the curated stage. Could you update the link with all the files mentioned in the course? Thanks.
When you connect any BI tool like Power BI, it needs these relationships to build the model for slice and dice. And if you have to draw an ER diagram to understand the schema, those relationships are important there as well.
Can you please tell me how to update a row in snowflake_sample_data.tpch_sf100.orders? I am getting the error: "Object 'ORDERS' does not exist or not authorized."
Then you have to write a program for it. Snowpark can do it, or you can write plain Python, unless Snowsight comes up with some kind of UI for that. And yes, if you want to manage Snowflake more programmatically, you can watch my paid content; many folks don't know the power of Snowpark. These 2 videos (one for JSON and one for CSV) will help broaden your knowledge, and they are available at a discounted price for a limited time. The code can automatically create DDL and DML, run the COPY command, and make all SQL statements available for CI/CD. 1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50 2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35
Hi, I keep getting this error: File "C:\Users\anbest\OneDrive - Capgemini\Documents\Git\Snowpark_project\LoadData.py", line 57, in main: put_result(file_element," => ",put_result[0].status) TypeError: 'list' object is not callable. I tested the traverse function on its own, and it is picking up my file names, locations, etc., so it seems to be put_result causing the issue.
The result is not what that line of the program expects: `put_result` is a list, and the line calls it as if it were a function. Check `type(put_result)`; if it is a list, index into it (or print it) instead of calling it.
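A minimal sketch of the bug and the fix. This assumes `put_result` is the list of result rows returned by an upload call such as `session.file.put`; the `PutResult` namedtuple below is a stand-in for illustration, not the real Snowpark class.

```python
from collections import namedtuple

# Stand-in for the result rows an upload call returns (assumption: in the
# real script, put_result is such a list of row objects with a .status field).
PutResult = namedtuple("PutResult", ["source", "status"])
put_result = [PutResult(source="orders.csv", status="UPLOADED")]
file_element = "data/orders.csv"

# Buggy line from the traceback:
#   put_result(file_element, " => ", put_result[0].status)
# -> TypeError: 'list' object is not callable, because put_result is a list
#    and lists cannot be called like functions. It was almost certainly
#    meant to be print(...):
print(file_element, "=>", put_result[0].status)

# A quick sanity check before indexing also guards against surprises:
assert isinstance(put_result, list) and put_result, "expected a non-empty list"
```

In short: rename nothing, just change the leading `put_result(` on that line to `print(`, and the TypeError goes away.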
These contents are available on Udemy (one for JSON and one for CSV); the code can automatically create DDL and DML and run the COPY command. 1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/ 2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/
It is in the description. And yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available); many folks don't know the power of Snowpark. These 2 videos (one for JSON and one for CSV) will help broaden your knowledge, and they are available at a discounted price for a limited time. The code can automatically create DDL and DML and run the COPY command. 1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=DIWALI50 2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=DIPAWALI35