
End to End Project on Snowflake & AWS | Master AWS & Snowflake Integration Today! 

KSR Datavizon
110K subscribers
94K views

Published: 27 Oct 2024

Comments: 112
@ra2f.
@ra2f. 8 days ago
Amazing demo, very clear and well-detailed, step-by-step - rare quality content! Thank you!
@shatrughanbtech-ek4mp
@shatrughanbtech-ek4mp 1 year ago
Nice presentation, exactly what I expected. I have seen many videos on how to load data from S3 to Snowflake❄️ but not a single video on how to integrate AWS with Snowflake. 🙏🏼🙏🏼👍 Thank you so much.
@KSRDatavizon
@KSRDatavizon 1 year ago
Thank You so much
@jeydrn276
@jeydrn276 1 year ago
absolutely amazing presentation, this is exactly what I was looking for while I wait to get access to our account. Thanks very much!
@KSRDatavizon
@KSRDatavizon 1 year ago
Glad it was helpful!
@srinivaskotagiri1113
@srinivaskotagiri1113 1 year ago
Can I get this from the start of Snowflake? I have DBA experience.
@shavy223
@shavy223 1 year ago
Thank you for clear explanation with step by step process. I would love to learn more from you.
@KSRDatavizon
@KSRDatavizon 1 year ago
Thanks a lot
@anishyapaul5189
@anishyapaul5189 7 months ago
Good Demo and explanation. Thank you
@KSRDatavizon
@KSRDatavizon 7 months ago
Glad it was helpful! Please subscribe to our channel for regular updates.
@vishalcrazy5121
@vishalcrazy5121 2 months ago
Very very very helpful to get the real exposure
@KSRDatavizon
@KSRDatavizon 2 months ago
Thanks for your kind words! We're glad our content is helpful. If you found it valuable, please consider subscribing to our channel for more similar content. It helps us keep creating videos that you'll enjoy. 😊
@ramakambhampati5094
@ramakambhampati5094 1 year ago
fantastic demo. Deserve 5 stars. "Snowflake Spider Man"
@KSRDatavizon
@KSRDatavizon 1 year ago
Thank you so much! Please subscribe to our channel; it motivates us a lot.
@reachinganwesha123
@reachinganwesha123 11 months ago
Excellent.. sooo helpful
@KSRDatavizon
@KSRDatavizon 11 months ago
Glad it was helpful!
@ravishankarrallabhandi531
@ravishankarrallabhandi531 1 year ago
Thanks for explaining it with an end-to-end project. I have a question... Once we create an integration on an S3 bucket and create a stage on top of it, how do we identify and load only the delta files into a Snowflake table? For example: on Day 1 we have 10 files, and on Day 2 we have 10 new files. How can we manage to load only the new 10 files into the Snowflake table? Thank you.
@KSRDatavizon
@KSRDatavizon 8 days ago
We can maintain metadata (a meta table) that captures the timestamp of each data load; this is helpful for tracking purposes.
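A minimal sketch of that idea in Snowflake SQL (table, stage, and column names are hypothetical; note that COPY INTO already skips files it loaded in the last 64 days, so the meta table mainly serves auditing):

  -- Audit table recording which staged files were loaded, and when.
  CREATE TABLE IF NOT EXISTS load_audit (
      file_name   STRING,
      rows_loaded NUMBER,
      loaded_at   TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
  );

  -- A plain COPY only picks up files not already loaded (the Day-2 delta).
  COPY INTO customer_info FROM @project_aws_stage FILE_FORMAT = (TYPE = CSV);

  -- Capture the COPY result into the audit table.
  INSERT INTO load_audit (file_name, rows_loaded)
  SELECT "file", "rows_loaded" FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));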
@balajikomma541
@balajikomma541 1 year ago
Kindly make a project video on "Snowflake Integration with Azure"
@KSRDatavizon
@KSRDatavizon 1 year ago
Sure, will do shortly. Please subscribe to our channel; it motivates us.
@asmitjais3776
@asmitjais3776 1 year ago
Very helpful.. thanks..
@KSRDatavizon
@KSRDatavizon 1 year ago
Most welcome
@VenkatAnnamraju
@VenkatAnnamraju 1 year ago
Very detailed explanation, very good
@KSRDatavizon
@KSRDatavizon 1 year ago
Glad it was helpful!
@dilipkumarreddyjanumpally6231
Thanks for your video, it's helpful for doing the integration end to end.
@KSRDatavizon
@KSRDatavizon 1 year ago
You are welcome! Please subscribe to our channel for regular updates.
@JimnyGirl
@JimnyGirl 10 months ago
very nice
@KSRDatavizon
@KSRDatavizon 10 months ago
Thanks
@SRaavi
@SRaavi 9 months ago
Superb explanation. Can you please provide the JSON script used for policy creation?
@KSRDatavizon
@KSRDatavizon 1 month ago
Thank you for your comment; let me check and get the JSON files. Please subscribe to our channel for regular updates; it motivates us a lot 🙏🏼
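In the meantime, a hedged sketch of the kind of S3 access policy Snowflake's integration documentation describes (bucket name and prefix are hypothetical, and this is not necessarily the exact policy from the video):

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:GetObjectVersion"],
        "Resource": "arn:aws:s3:::my-project-bucket/data/*"
      },
      {
        "Effect": "Allow",
        "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
        "Resource": "arn:aws:s3:::my-project-bucket"
      }
    ]
  }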
@rk-ej9ep
@rk-ej9ep 1 year ago
This is awesome bro..
@KSRDatavizon
@KSRDatavizon 1 year ago
Thank you! Please subscribe to our channel for regular updates. All the best!
@amrendrakumarsingh5911
@amrendrakumarsingh5911 7 months ago
Well explained
@KSRDatavizon
@KSRDatavizon 7 months ago
Thank you so much! Please subscribe to our channel; it motivates us a lot.
@nadianizam6101
@nadianizam6101 6 months ago
Well explained video. Please make more videos on Snowflake.
@KSRDatavizon
@KSRDatavizon 6 months ago
Thank you for your feedback! I'm glad you found the video helpful. Please subscribe to our channel for regular updates; it motivates us a lot.
@mounikamallikanti9522
@mounikamallikanti9522 1 year ago
Good morning sir, very good explanation. I am Mounika. I learnt the Snowflake software at a private institute, but I have a career gap. Now I want to improve my skills, and for that I would like to work with real-time workshops on Snowflake. So sir, are there any freelance or real-time workshops on Snowflake?
@KSRDatavizon
@KSRDatavizon 2 months ago
Good morning Mounika! I'm glad the explanation was helpful. Here is some information to help you improve your Snowflake skills and find real-time workshops.

1. Freelance projects:
Freelance platforms: look for Snowflake projects on platforms like Upwork, Freelancer.com, and Arc.dev. These connect freelancers with businesses needing help on various projects, including Snowflake development.
Direct connections: network with other data professionals or companies using Snowflake; they might have freelance opportunities for someone with your skills.

2. Real-time workshops:
Snowflake training: while Snowflake doesn't offer public real-time workshops, it does provide a comprehensive training platform with on-demand courses and labs at learn.snowflake.com/en/courses.
Third-party platforms: some companies offer real-time or live online workshops on Snowflake; explore Udemy, Coursera, or Pluralsight.
Meetups and events: stay updated on local data meetups or conferences that might feature live Snowflake sessions. Useful resources: Meetup.com (www.meetup.com/), Eventbrite (www.eventbrite.com/), Data Science Central (www.datasciencecentral.com/).

Tips for finding opportunities:
Focus on your skills: highlight your strengths in Snowflake, including specific tools, tasks, or functionalities you're proficient in.
Build a portfolio: consider creating a portfolio showcasing your Snowflake projects, even personal ones.
Network actively: connect with data professionals on LinkedIn or online forums to stay updated on opportunities.

Additional resources:
Snowflake documentation: docs.snowflake.com/
Certification: consider pursuing a SnowPro certification to validate your skills.

By combining freelance work, real-time workshops, and self-directed learning, you can effectively improve your Snowflake expertise and bridge your career gap.
@lavanyap3021
@lavanyap3021 1 year ago
Excellent explanation, thank you.
@KSRDatavizon
@KSRDatavizon 1 year ago
Glad it was helpful! Please subscribe to our channel; it motivates us.
@OnlineGuru001
@OnlineGuru001 11 months ago
Do I need any prior Snowflake/AWS knowledge to start this?
@KSRDatavizon
@KSRDatavizon 10 months ago
No need, we will teach it from scratch.
@samyakG-dr4vu
@samyakG-dr4vu 1 year ago
Very informative video.
@KSRDatavizon
@KSRDatavizon 1 year ago
Glad it was helpful!
@juliestudy1475
@juliestudy1475 9 months ago
Excellent! Just curious why you label your chart 1, 2 & 3 starting from Snowflake, and yet proceed through your steps the other way round, i.e. 3, 2 & 1?
@KSRDatavizon
@KSRDatavizon 28 days ago
Great observation! This reversal of numbering is common in certain technical or architectural diagrams to reflect the "flow" of steps in the process, but it can indeed create some confusion. Here's why it often happens:
Diagram sequence vs. execution flow: diagrams may label components in a logical hierarchy (e.g., the final data warehouse, like Snowflake, may be labeled step 1 because it's the final destination of your data). The steps, however, are executed in reverse, starting from the raw data source (which might be labeled step 3, since it's the initial point of data ingestion).
Final destination first: often the final stage (e.g., where data is loaded, such as Snowflake) is labeled "1" to emphasize the end goal of the process. This is common in project plans or workflow charts where the target is highlighted first. The flow, however, starts at the origin (S3 or a data lake), so the actual process runs 3 → 2 → 1.
Hierarchical presentation: the final step is presented as the most important, so it may be visually or logically numbered first. This helps convey that all preceding steps aim toward this final goal.
@SwatiXtia
@SwatiXtia 4 months ago
Hi, excellent project; however, it would be nice if you provided the policy script.
@KSRDatavizon
@KSRDatavizon 2 months ago
Sure, will make one for future sessions.
@mohammedvahid5099
@mohammedvahid5099 1 year ago
Please make more projects and interview-session programs; we need them to practice..
@KSRDatavizon
@KSRDatavizon 1 year ago
Sure.. we will upload them ASAP.
@aniruddhyadav3698
@aniruddhyadav3698 9 months ago
Can we get a job with SQL & Snowflake certification? Please reply, sir.
@KSRDatavizon
@KSRDatavizon 8 months ago
Adding additional tools like cloud, dbt, and Python would help you in getting a job.
@shashankm2859
@shashankm2859 1 year ago
What is the difference between Integration object and pipe? When to use them?
@sairevanthmuppana
@sairevanthmuppana 1 year ago
A Snowflake integration object is used to establish the relationship between Snowflake and external cloud storage. If you want to load data files from external storage, first of all we need to create an integration object, and in that integration object we define the location of the files and the role (which is used for accessing the files). Snowpipe, by contrast, is used for continuous data loading: sometimes in real time we get data every hour, or in micro-batches within a certain window. Once the data is available in the storage location, the cloud vendor sends a notification to Snowflake, and once Snowflake receives the notification it loads the data.
@KSRDatavizon
@KSRDatavizon 2 months ago
Storage integrations vs. pipes in Snowflake: a storage integration is a one-time security object that stores the trust relationship between Snowflake and external cloud storage (the IAM role ARN, the external ID, and the allowed storage locations). Stages reference it, so you never embed AWS credentials in a stage definition.
A pipe (Snowpipe) sits on top of a stage and wraps a COPY INTO statement. It is built for continuous, event-driven loading: when the cloud provider notifies Snowflake that a new file has landed, the pipe runs its COPY statement automatically, loading data in micro-batches.
When to use which: create a storage integration (plus a stage) whenever Snowflake needs to read external storage at all; add a pipe on top when files arrive continuously and you want them loaded without running COPY by hand. For ad-hoc or scheduled bulk loads, a stage plus a manual or task-driven COPY is enough.
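A hedged sketch of the two objects side by side (all names, the bucket, and the role ARN are hypothetical):

  -- One-time trust setup between Snowflake and S3.
  CREATE STORAGE INTEGRATION s3_int
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
    STORAGE_ALLOWED_LOCATIONS = ('s3://my-project-bucket/data/');

  -- A stage that reads the bucket through that integration.
  CREATE STAGE project_aws_stage
    URL = 's3://my-project-bucket/data/'
    STORAGE_INTEGRATION = s3_int;

  -- A pipe that automatically COPYs each new file dropped on the stage.
  CREATE PIPE customer_pipe AUTO_INGEST = TRUE AS
    COPY INTO customer_info
    FROM @project_aws_stage
    FILE_FORMAT = (TYPE = CSV);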
@mukeshr7401
@mukeshr7401 1 year ago
First of all, thank you for the clear explanation... If this data were Avro, how would we load it? Can you at least explain the Avro file format? Please.
@KSRDatavizon
@KSRDatavizon 7 months ago
Thank you for your compliment; sure, we will make better sessions in future with all these updates.
@pvijayabhaskar1520
@pvijayabhaskar1520 1 year ago
Sir, my Snowflake account is already created on AWS, and I want access to the data files stored in an Amazon S3 bucket from Snowflake tables.
@KSRDatavizon
@KSRDatavizon 7 months ago
That's great to hear! Integrating Amazon S3 with Snowflake can really streamline your data storage and processing.
@SidharthanPV
@SidharthanPV 1 year ago
watch it in 1.25x speed
@KSRDatavizon
@KSRDatavizon 1 year ago
thank you
@yayi3434
@yayi3434 6 months ago
Good One! What Notepad are you using here?
@KSRDatavizon
@KSRDatavizon 2 months ago
Let me check. Thank you! Please subscribe to our channel for regular updates.
@ITKaksha
@ITKaksha 1 year ago
Is the pipeline concept different from what we saw in this video?
@KSRDatavizon
@KSRDatavizon 8 days ago
It's the same.
@ChandraS-c9h
@ChandraS-c9h 1 year ago
Can you please explain the control framework for a Snowflake project?
@KSRDatavizon
@KSRDatavizon 1 year ago
Sure, we will make a separate video for the same; here is a short explanation. The "control framework" in the context of a Snowflake project typically refers to a set of guidelines, best practices, and processes that help manage and maintain the data architecture, data pipelines, and overall data management within a Snowflake data warehouse environment. It ensures consistency, reliability, and scalability in your data operations. Key elements:
Governance: define data standards, ownership, and access rules.
Environments: separate environments (dev, test, prod) with proper security.
Data model: design logical/physical models for data organization.
ETL/ELT: implement structured data extraction, loading, and transformation.
Version control: manage code changes and deployments.
Testing: ensure data quality with automated validation.
Monitoring: watch query performance and resource utilization.
Backup/recovery: plan for data protection and disaster scenarios.
Documentation: maintain architecture and process documentation.
Improvement: continuously enhance based on feedback and lessons learned.
@shahidawan4072
@shahidawan4072 1 year ago
Hi, is any new training course for Snowflake available?
@KSRDatavizon
@KSRDatavizon 11 months ago
Yes, we are starting a new batch from 23rd Nov; on the 18th we have a master class. Please enrol from our website Datavizon.com
@mohammadarsalanadil1057
@mohammadarsalanadil1057 1 year ago
Where can we get the sample files that we store in AWS? Can you please help?
@KSRDatavizon
@KSRDatavizon 8 days ago
drive.google.com/drive/folders/1W3alj8Q2nD1V9huvurFceV4TNVu4HLlc
@Larry21924
@Larry21924 8 months ago
This content is absolutely astonishing. I recently read a similar book, and it was an absolute marvel. "Mastering AWS: A Software Engineers Guide" by Nathan Vale
@ranjith335
@ranjith335 1 year ago
While creating the external storage, I'm facing the error below: Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)]
@pkjangra1
@pkjangra1 1 year ago
I got the same error. I have ensured that the policy is correct; still the same error. Did you find any solution?
@augustineopokujuniorantwi881
@augustineopokujuniorantwi881 6 months ago
try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; Change the stage name to match yours
@meghanb-pc6rk
@meghanb-pc6rk 9 months ago
Hi, when we run it for all the files, in what order will they load? How can I set my files to be loaded based on timestamp? For example, it's a full load and I did not run my process today, so the next day there will be two files in my S3 location. I want to load only the one file that has the latest timestamp. How can we achieve that? Thanks for your answer in advance.
@KSRDatavizon
@KSRDatavizon 28 days ago
To load only the file with the latest timestamp from your S3 bucket, you can follow these steps using an AWS SDK such as Boto3 in Python, or a similar mechanism on another platform:
1. List all the files in the S3 bucket location: use list_objects_v2() to list everything under the specified S3 prefix.
2. Sort the files by timestamp: extract the timestamps (typically from the file names or the object metadata) and sort accordingly.
3. Pick the file with the latest timestamp.
4. Load that file: perform the desired operation with the most recent file only.
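If you would rather keep that logic inside Snowflake, a hedged sketch using a stage directory table (stage, table, and file names are hypothetical):

  -- Enable and refresh the stage's directory table.
  ALTER STAGE project_aws_stage SET DIRECTORY = (ENABLE = TRUE);
  ALTER STAGE project_aws_stage REFRESH;

  -- Find the most recently modified file on the stage.
  SELECT relative_path
  FROM DIRECTORY(@project_aws_stage)
  ORDER BY last_modified DESC
  LIMIT 1;

  -- Load only that file (substitute the path returned above).
  COPY INTO customer_info
  FROM @project_aws_stage
  FILES = ('customer_info_day2.csv')
  FILE_FORMAT = (TYPE = CSV);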
@cansener9802
@cansener9802 6 months ago
Why not add the code that you used in the video to the description....?!
@KSRDatavizon
@KSRDatavizon 1 month ago
Will check and add the code to the description.
@dev4128
@dev4128 4 months ago
Thank you so much
@KSRDatavizon
@KSRDatavizon 2 months ago
You're most welcome! Please subscribe to our channel for regular updates.
@vedavathit4614
@vedavathit4614 1 year ago
How do I change the date format from yyyy-mm-dd to dd-mm-yyyy in Snowflake with the DATE datatype?
@KSRDatavizon
@KSRDatavizon 1 year ago
Use the TO_CHAR() function.
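For example (table and column names are hypothetical; a DATE value has no intrinsic display format, so you either convert at query time or change the session default):

  -- Render a DATE column as dd-mm-yyyy text.
  SELECT TO_CHAR(order_date, 'DD-MM-YYYY') AS order_date_ddmmyyyy
  FROM customer_info;

  -- Or change the default DATE output format for the whole session.
  ALTER SESSION SET DATE_OUTPUT_FORMAT = 'DD-MM-YYYY';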
@sudhac5841
@sudhac5841 1 year ago
What happens if a stream gets dropped?
@havishataneja6371
@havishataneja6371 1 year ago
Hello sir, I have created the stage integration the same way, but I am getting the error to check the AWS role: "Snowflake Error assuming AWS_ROLE. Please verify the role and external Id are configured correctly in your AWS policy." even though everything is verified. While running LIST @stage I am getting this error.
@KSRDatavizon
@KSRDatavizon 25 days ago
Hi! The error you're seeing typically occurs if there's a mismatch between the AWS role configuration and Snowflake's integration setup. Make sure:
The AWS role has the correct trust relationship with Snowflake.
The external ID is correctly set in both Snowflake and AWS.
Your AWS policy allows the necessary S3 bucket actions.
Double-check the trust policy in AWS IAM and ensure the role ARN is correct. If everything seems fine and the error persists, try re-creating the integration. Let me know how it goes!
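A quick way to cross-check those values from the Snowflake side (the integration name is hypothetical):

  -- Shows STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID; these must
  -- appear in the IAM role's trust policy as the Principal and the
  -- sts:ExternalId condition, respectively.
  DESC INTEGRATION s3_int;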
@bhargavnagineni9322
@bhargavnagineni9322 1 year ago
Do you provide any training?
@KSRDatavizon
@KSRDatavizon 1 year ago
Please reach us on 9916961234 for more details.
@veer5827
@veer5827 1 year ago
I explained the same project in an interview but my profile got rejected..
@KSRDatavizon
@KSRDatavizon 1 year ago
Presentation skills also matter.. Usually we don't get rejected for the project.. We need to be confident in explaining it.
@GeekCoders
@GeekCoders 9 months ago
This is not a project, this is just one task.
@jawwadkhan7699
@jawwadkhan7699 10 months ago
Can you please provide the text file of the policy you created under the IAM role?
@KSRDatavizon
@KSRDatavizon 29 days ago
Sure, will check and update it in the description.
@rakshithdyavarishetty9657
@rakshithdyavarishetty9657 1 year ago
Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)] How do I resolve this error? I have made the data public in AWS S3 and can view the data in AWS, but in Snowflake it is giving me this error.
@augustineopokujuniorantwi881
@augustineopokujuniorantwi881 6 months ago
try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; Change the stage name to match yours
@KSRDatavizon
@KSRDatavizon 8 days ago
Please verify the access.. and provide proper access.
@gangabhavanipolimetle3090
@gangabhavanipolimetle3090 1 year ago
How can I join your training course?
@KSRDatavizon
@KSRDatavizon 1 year ago
Please call +91-9926961234/8527506810.
@neethisurya9755
@neethisurya9755 1 year ago
Is this an ETL testing project? Or something else?
@KSRDatavizon
@KSRDatavizon 1 year ago
It's a Snowflake one.
@harinimmagadda14
@harinimmagadda14 1 year ago
What are the frameworks in Snowflake?
@KSRDatavizon
@KSRDatavizon 1 year ago
Key components in Snowflake:
Virtual warehouses: compute resources that execute queries.
Database: organizes data.
Schema: contains tables and objects.
Table: stores data (structured, semi-structured, external).
Stage: loads data from external sources.
Materialized views: precomputed data summaries.
Streams: capture and replicate changes.
Tasks: automate operations.
Security: role-based access, encryption.
Data sharing: secure data sharing between accounts.
Data Marketplace: access to third-party data and services.
@ramadevigarnepudi-xh8iu
@ramadevigarnepudi-xh8iu 1 year ago
How much is this Snowflake project?
@KSRDatavizon
@KSRDatavizon 1 year ago
Did you mean the course price, sir?
@PriyaGupta-qj1zb
@PriyaGupta-qj1zb 8 months ago
Can you please provide the customer_info CSV file!!!!!!
@KSRDatavizon
@KSRDatavizon 1 month ago
Sure, will add it shortly.
@baludhary6823
@baludhary6823 1 year ago
I am getting an access denied 403 error code.
@KSRDatavizon
@KSRDatavizon 1 year ago
Can you try recreating it, please?
@augustineopokujuniorantwi881
@augustineopokujuniorantwi881 6 months ago
try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; Change the stage name to match yours