
Snowflake - Loading data from Azure 

Janardhan Reddy Bandi

I can be reached at jana.snowflake2@gmail.com
-----------------------------
Steps:
-------
1. You should have a Snowflake trial account
2. You should have an Azure trial account
3. Create a storage account and containers in Azure
4. Upload the source files to these containers
5. Create a storage integration between Snowflake and Azure
6. Create stage objects using the storage integration object
7. Use COPY commands to extract the data from the files and load it into Snowflake tables.
// Creating Azure free trial account
• How to Create a Free A...
-----------------------------------
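A quick note, based on default Snowflake privileges (not part of the original commands): a storage integration can only be created by ACCOUNTADMIN or a role that has been granted the CREATE INTEGRATION privilege.
-- Assumption: switch to a role allowed to create integrations
USE ROLE ACCOUNTADMIN;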
-- Create a storage integration object
CREATE STORAGE INTEGRATION snow_azure_int
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = AZURE
ENABLED = TRUE
AZURE_TENANT_ID = 'Azure-Tenant_ID'
STORAGE_ALLOWED_LOCATIONS = ('azure://snowazureintg22.blob.core.windows.net/customerdatafiles', 'azure://snowazureintg22.blob.core.windows.net/snowazurefiles');
-- Describe integration object
DESC STORAGE INTEGRATION snow_azure_int;
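The DESC output is what links Snowflake to your tenant: it includes AZURE_CONSENT_URL (open it to consent to the Snowflake service principal in your Azure tenant) and AZURE_MULTI_TENANT_APP_NAME (the principal you then grant a role such as Storage Blob Data Reader on the storage account). A small sketch to pull just those two values, assuming it runs right after the DESC above:
-- Extract the consent URL and app name from the DESC result
SELECT "property", "property_value"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "property" IN ('AZURE_CONSENT_URL', 'AZURE_MULTI_TENANT_APP_NAME');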
-----------------------------------
// Create database and schema
CREATE DATABASE IF NOT EXISTS MYDB;
CREATE SCHEMA IF NOT EXISTS MYDB.file_formats;
CREATE SCHEMA IF NOT EXISTS MYDB.external_stages;
// Create file format object
CREATE OR REPLACE FILE FORMAT mydb.file_formats.csv_fileformat
TYPE = CSV
FIELD_DELIMITER = '|'
SKIP_HEADER = 1
EMPTY_FIELD_AS_NULL = TRUE;
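If the source CSVs happen to use quoted values, literal NULL strings, or a specific date format for the DOB column, the same file format can carry those options too. A sketch with assumed settings, to be adjusted to the actual files:
CREATE OR REPLACE FILE FORMAT mydb.file_formats.csv_fileformat
TYPE = CSV
FIELD_DELIMITER = '|'
SKIP_HEADER = 1
EMPTY_FIELD_AS_NULL = TRUE
FIELD_OPTIONALLY_ENCLOSED_BY = '"'  -- only if values are quoted
NULL_IF = ('NULL', 'null')          -- strings to treat as NULL
DATE_FORMAT = 'YYYY-MM-DD';         -- match the DOB format in the files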
// Create stage object with integration object & file format object
CREATE OR REPLACE STAGE mydb.external_stages.stg_azure_cont
URL = 'azure://snowazureintg22.blob.core.windows.net/snowazurefiles'
STORAGE_INTEGRATION = snow_azure_int
FILE_FORMAT = mydb.file_formats.csv_fileformat;
// Listing files under your Azure containers
LIST @mydb.external_stages.stg_azure_cont;
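As an optional sanity check before loading, the staged files can be queried in place using the same file format ($1, $2, ... are the delimited columns):
// Peek at the raw columns of the staged customer files
SELECT $1, $2, $3, $4, $5, $6
FROM @mydb.external_stages.stg_azure_cont
(FILE_FORMAT => 'mydb.file_formats.csv_fileformat', PATTERN => '.*customer.*')
LIMIT 10;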
// Create a table first
CREATE OR REPLACE TABLE mydb.public.customer_data
(
customerid NUMBER,
custname STRING,
email STRING,
city STRING,
state STRING,
DOB DATE
);
// Use Copy command to load the files
COPY INTO mydb.public.customer_data
FROM @mydb.external_stages.stg_azure_cont
PATTERN = '.*customer.*';
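Two optional variants of the same COPY for messier files: VALIDATION_MODE dry-runs the load and only reports errors, while ON_ERROR = CONTINUE skips bad rows instead of failing the whole statement.
// Dry run: report errors, load nothing
COPY INTO mydb.public.customer_data
FROM @mydb.external_stages.stg_azure_cont
PATTERN = '.*customer.*'
VALIDATION_MODE = RETURN_ERRORS;
// Tolerant load: skip rows that fail parsing
COPY INTO mydb.public.customer_data
FROM @mydb.external_stages.stg_azure_cont
PATTERN = '.*customer.*'
ON_ERROR = CONTINUE;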
// Validate the data
SELECT * FROM mydb.public.customer_data;
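Besides the SELECT, the load itself can be checked via COPY_HISTORY (a sketch; the 24-hour window is an arbitrary choice):
// Which files were loaded into the table recently, and with what status
SELECT file_name, status, row_count, row_parsed, first_error_message
FROM TABLE(mydb.information_schema.COPY_HISTORY(
TABLE_NAME => 'MYDB.PUBLIC.CUSTOMER_DATA',
START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));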
Steps to Load data from Azure
------------------------------
Step 1: Create storage integration between Snowflake and Azure:
docs.snowflake...
Step 2: Create External Stage objects:
docs.snowflake...
Step 3: Copy command to load the data from Azure containers to Snowflake tables:
docs.snowflake...

Published: 13 Oct 2024

Comments: 32
@relookyouxrb4338 12 days ago
SUPER well explained! Just what I needed to know! MANY THANKS!
@ugandar123 8 months ago
This is exactly what I wanted to understand. Thanks a lot.
@maggie2509a 8 months ago
Crisp and clear
@srikanthu5549 9 months ago
Hi sir, could you please do one video about Snowflake's future, the current market, and the prerequisites or things to learn for someone who wants to move into the Snowflake domain?
@HARIKIRANTSV 9 months ago
Hello Janardhan, hope you are doing well. Could you please make a video on notification alerts? I need to be notified through email when a table is dropped or truncated in Snowflake, and also when the warehouse size is altered. Please kindly make a video on this ASAP, thank you.
@kamasanidamu4650 8 months ago
Hi Janardhan garu, it was a very useful video. I have a requirement to extract AWS Snowflake data and ingest it into Azure storage using ADF. Can I execute the same commands and assign the role, will that work? Currently I am seeing an issue.
@mrjana520 8 months ago
What is this, boss? I don't understand: AWS Snowflake data ingested into Azure storage using ADF?
@kamasanidamu4650 8 months ago
We have an AWS Snowflake instance. My requirement is to connect to AWS Snowflake using Azure ADF and load the data into Azure storage.
@kamasanidamu4650 8 months ago
I am using the copy activity in ADF. The error is "snowflake fails to access remote file". Can you please suggest on the same?
@mrjana520 8 months ago
It seems like a credentials or blob storage permission issue. Read these pages: community.snowflake.com/s/article/Permissions-error-during-COPY-INTO-from-Azure-Storage-Location and learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-snowflake
@sunnyanju0609 9 months ago
Hi sir, do you provide online or offline classes?
@mrjana520 9 months ago
No
@shreyaroraa2234 9 months ago
What's the best way to do a daily incremental load from the container: Azure ADF, a Snowflake task, or something else?
@mrjana520 9 months ago
When there is built-in functionality, why should we go for other tools with extra cost? Moreover, it depends on the technology stack your company is using.
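A minimal sketch of that built-in route, assuming a warehouse named my_wh and a daily 02:00 UTC schedule (COPY skips files it has already loaded, so each run only picks up new files):
CREATE OR REPLACE TASK mydb.public.daily_customer_load
WAREHOUSE = my_wh
SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
COPY INTO mydb.public.customer_data
FROM @mydb.external_stages.stg_azure_cont
PATTERN = '.*customer.*';
ALTER TASK mydb.public.daily_customer_load RESUME;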
@ramramaraju2221 8 months ago
Hi Jana, could you please provide a video on error notifications when Snowpipe skips a file due to errors?
@mrjana520 8 months ago
will try
@durgasreelakshmi 9 months ago
Pls upload video on iceberg tables and dynamic tables
@sunnyanju0609 9 months ago
Hi sir, I'm already working on monitoring batch jobs and want to switch my career to the Snowflake side. Are you providing any online classes, or is the Udemy course enough to get a job in Snowflake?
@mrjana520 9 months ago
That Udemy course covers all topics needed for interviews
@sunnyanju0609 9 months ago
@mrjana520 thanks for your reply. I'll take that course.
@arunashah6555 7 months ago
Hello Jana, how can I upload files directly from GitHub to an Azure container? I tried using the shell, but each time I get a 403 error, "This request is not authorized to perform this operation", and I'm not able to load the files. I'm able to load them from my local system, but how do I do it from GitHub?
@mrjana520 7 months ago
I never did it; just try this: www.c-sharpcorner.com/article/uploading-files-from-a-git-repository-to-azure-storage-using-azure-cli/
@mohammedvahid5099 9 months ago
Thank you sir❤
@mram1690 9 months ago
Sir, I took the Udemy Snowflake course. Is your material added in the Udemy list?
@mrjana520 9 months ago
yes
@Shalini_Ekta 9 months ago
Hi sir, are you planning any PL/SQL course?
@mrjana520 9 months ago
not now
@ektagoyal27march 9 months ago
Please share data files also
@mrjana520 9 months ago
How can I share them on YouTube? You can get them from my Udemy course. My Snowflake Udemy Course: www.udemy.com/course/snowflake-complete-course-for-clearing-interviews/?couponCode=A853DD251C99330498C0 Use the coupon below to get it for 499 rupees. Coupon: A853DD251C99330498C0
@karokaro111 9 months ago
What jobs can we look for after learning Snowflake and SQL? I mean, Snowflake developer, ETL developer, etc.?
@ektagoyal27march 9 months ago
There are jobs for Snowflake Developer
@mrjana520 9 months ago
Correct, snowflake developer, data engineer etc.