
18. Databricks & Pyspark: Ingest Data from Azure SQL Database 

Raja's Data Engineering
24K subscribers
32K views

Published: 21 Aug 2024

Comments: 63
@venkataram6460 1 year ago
Simple but effective process and explanation as well. Great job
@rajasdataengineering7585 1 year ago
Glad you liked it!
@omprakashreddy4230 2 years ago
Simply Awesome !! We are learning a lot from your videos.
@rajasdataengineering7585 2 years ago
Thanks Omprakash
@avinashpasupuleti8585 2 years ago
Simply super, bro. Awaiting more videos from you on DB-related activities in Databricks
@rajasdataengineering7585 2 years ago
Hi Avinash, thank you. Sure, will post more videos on DB-related activities
@sravankumar1767 2 years ago
Nice explanation bro 👍
@sushilkushwaha260 2 years ago
Awesome, Very nice explanation...
@rajasdataengineering7585 2 years ago
Thank you
@dataisfun4964 1 year ago
Thanks, worked perfectly.
@rajasdataengineering7585 1 year ago
Great
@veerag9426 2 years ago
Super nice video
@rajasdataengineering7585 2 years ago
Thank you
@prathapganesh7021 9 months ago
I really appreciate this video, thank you🙏
@rajasdataengineering7585 9 months ago
Thanks Prathap! Glad you found it useful
@prathapganesh7021 9 months ago
Can I join for a paid project? Or else how can I contact you?
@melvin9993 2 years ago
Simple and effective
@suresh.suthar.24 1 year ago
great video
@rajasdataengineering7585 1 year ago
Glad you enjoyed it
@Khm3rick 2 years ago
Great video! Just one question: I saw you defined the jdbcDriver... but I didn't see it used afterwards in jdbcUrl? What is it for?
@ShivaKumar-dj8bj 11 months ago
Yes, I have also observed it. Later, in other videos, I found out that we have to add it as an option when reading data into a DataFrame, like .option("driver", jdbcDriver)
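For readers hitting the same question, here is a minimal sketch of how the driver can be wired in when reading from Azure SQL over JDBC. Every value below (server, database, table, credentials) is an illustrative placeholder rather than the exact setup from the video; spark and display are the Databricks notebook built-ins.

```python
# Minimal sketch: reading an Azure SQL table over JDBC from a Databricks notebook.
# All connection values below are illustrative placeholders.
jdbcHostname = "myserver.database.windows.net"   # hypothetical server
jdbcPort     = 1433
jdbcDatabase = "mydb"                            # hypothetical database
jdbcDriver   = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};databaseName={jdbcDatabase}"

df = (spark.read
          .format("jdbc")
          .option("url", jdbcUrl)
          .option("dbtable", "SalesLT.Product")   # hypothetical table name
          .option("user", "sqladmin")             # placeholder credentials
          .option("password", "********")
          .option("driver", jdbcDriver)           # this is where jdbcDriver is actually used
          .load())

display(df)
```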
@vaddenata6735 9 months ago
Thank you so much ❤Sir....
@rajasdataengineering7585 9 months ago
Most welcome! Hope you find it useful
@vaddenata6735 9 months ago
@@rajasdataengineering7585 Yes 💯
@bhumikalalchandani321 8 months ago
Sir, getting "no suitable driver" on running the df @@rajasdataengineering7585
@VirajithaPanguluri 1 year ago
Can we mention the type of authentication while connecting? What if we have only an Azure Active Directory password? How to mention that?
@betterahyosi 1 year ago
You didn't use the jdbcDriver. What is the purpose of having jdbcDriver?
@rajasdataengineering7585 1 year ago
Why do you say I didn't use the JDBC driver? Look at 7:30 in the video
@betterahyosi 1 year ago
@@rajasdataengineering7585 I meant that you didn't pass the jdbcDriver value into the JDBC URL
@rajasekharmedia8987 6 months ago
I want to run a query against SQL and load the result into one variable. Can we do that? Like select max(id) from a SQL table. I am using this id for comparison in the next steps.
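One possible approach, sketched here as an assumption rather than something shown in the video, is to push the aggregate down to Azure SQL with the JDBC query option and collect the single value into a Python variable; the table and column names below are hypothetical.

```python
# Sketch (assumption, not from the video): push a scalar query down to Azure SQL
# via the JDBC "query" option and read the result into a Python variable.
max_id_df = (spark.read
                 .format("jdbc")
                 .option("url", jdbcUrl)   # reuse the connection URL defined earlier
                 .option("query", "SELECT MAX(id) AS max_id FROM dbo.my_table")  # hypothetical table/column
                 .option("user", "sqladmin")       # placeholder credentials
                 .option("password", "********")
                 .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
                 .load())

max_id = max_id_df.collect()[0]["max_id"]   # scalar value usable in later comparison steps
```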
@kylebrogan6416 1 year ago
What about just being able to access those same tables via an existing Databricks catalog in the Hive metastore structure, for example? Is there a way to do that?
@arnabsontu6578 2 years ago
Sir, can we also create, view, alter, and run stored procedures from Databricks?
@rajasdataengineering7585 2 years ago
Hi Arnab, stored procedures can't be created in Databricks. Views can be created and can be altered as well
@161vinumail.comvinu6 1 year ago
Awesome explanation
@rajasdataengineering7585 1 year ago
Thanks
@shashikantchaturvedi1559 1 year ago
Hi Raja, I am following this tutorial step by step but I got an error while running the 2nd cell, getting the product table. The error is "java.sql.SQLException: No suitable driver". Can you please help in this case?
@shashikantchaturvedi1559 1 year ago
Now I got it, something was wrong in preparing the connection. I can connect and get the data from Azure SQL Server. Thanks Raja.
@rajasdataengineering7585 1 year ago
Glad to hear you fixed the issue 👍🏻
@a2zhi976 1 year ago
Like in Unix, can I save all these details in one file and call it at the beginning of the scripts?
@rajasdataengineering7585 1 year ago
Yes, we can use a YAML or JSON configuration file to save the details, and at run time Spark can read the configuration file and process accordingly
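A minimal sketch of that idea, assuming a hypothetical JSON file at /dbfs/FileStore/config/azure_sql_conn.json holding the connection details (path and key names are illustrative):

```python
import json

# Hypothetical config path and keys; store the connection details once and reuse them.
with open("/dbfs/FileStore/config/azure_sql_conn.json") as f:
    conf = json.load(f)

jdbcUrl = (f"jdbc:sqlserver://{conf['hostname']}:{conf['port']};"
           f"databaseName={conf['database']}")

df = (spark.read
          .format("jdbc")
          .option("url", jdbcUrl)
          .option("dbtable", conf["table"])
          .option("user", conf["user"])
          .option("password", conf["password"])
          .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
          .load())
```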
@alwalravi 1 year ago
How to write this product table data into blob storage in parquet format in a Databricks notebook? Please help
@rajasdataengineering7585 1 year ago
We can use the Databricks writer: df.write.format("parquet").save(location)
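Expanded slightly, a sketch of that write, assuming the blob container is already mounted at a hypothetical mount point /mnt/blobstore/products/:

```python
# Sketch: write the product DataFrame to blob storage as parquet.
# The mount point is hypothetical; the container must already be mounted
# (or use a wasbs:// / abfss:// path with storage credentials configured).
(df.write
   .format("parquet")
   .mode("overwrite")
   .save("/mnt/blobstore/products/"))
```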
@user-ye4hy7yw4q 9 months ago
Hello sir, while importing data how do we come to know which is modified and which is the latest data? I mean, how do we handle any updated data? Please reply
@AjithKumar-cj7hh 10 months ago
What if the data is huge, like 100 GB? Is it still recommended?
@rajasdataengineering7585 10 months ago
A JDBC connection has performance issues while handling huge amounts of data, but there are options to improve the performance which can be applied depending on the use case
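One such option is a partitioned JDBC read, so Spark pulls the table over several parallel connections instead of one. A sketch with a hypothetical numeric column and illustrative bounds and partition counts:

```python
# Sketch of a parallel (partitioned) JDBC read for a large table.
# partitionColumn must be numeric or date/timestamp; all values below are illustrative.
df = (spark.read
          .format("jdbc")
          .option("url", jdbcUrl)
          .option("dbtable", "SalesLT.Product")     # hypothetical table
          .option("user", "sqladmin")               # placeholder credentials
          .option("password", "********")
          .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
          .option("partitionColumn", "ProductID")   # column Spark splits the read on
          .option("lowerBound", 1)
          .option("upperBound", 1000000)
          .option("numPartitions", 16)              # number of parallel connections
          .option("fetchsize", 10000)               # rows fetched per round trip
          .load())
```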
@SantoshKumar-yr2md 5 months ago
How to get multiple tables from Azure SQL into a Databricks notebook?
@govardhanbola1195 2 years ago
Hard-coding the password in the code is not recommended. Can we get the password from Azure Key Vault? Can you please let us know the steps for that?
@rajasdataengineering7585 2 years ago
We need to integrate Azure Key Vault with Databricks by creating a secret scope
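Once a Key Vault-backed secret scope exists, the password can be fetched at run time with dbutils.secrets instead of being hard-coded. A sketch with hypothetical scope, secret, and connection names:

```python
# Hypothetical scope and secret names; the Key Vault-backed secret scope must be
# created first (Databricks UI at <workspace-url>#secrets/createScope, or the Databricks CLI).
sql_password = dbutils.secrets.get(scope="kv-scope", key="sql-admin-password")

df = (spark.read
          .format("jdbc")
          .option("url", jdbcUrl)
          .option("dbtable", "SalesLT.Product")   # hypothetical table
          .option("user", "sqladmin")             # placeholder user
          .option("password", sql_password)       # secret value is redacted in notebook output
          .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
          .load())
```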
@shivangishingatwar1356 2 years ago
Could you help me establish the connection string using Azure Active Directory authentication mode?
@MrTejasreddy 1 year ago
Simple, superb. Can you make a video on how to create an account in Databricks Community Edition for free?
@Jayalakshmi-r9t 1 month ago
How to get that IP address? I did not find it while logging in. Please can you say?
@rajasdataengineering7585 1 month ago
You can get it from the command prompt using the ipconfig command
@Jayalakshmi-r9t 1 month ago
How to get that IP address? For me it was not visible
@apoorvsrivastava7121 2 years ago
Sir, how can we connect using secrets from Key Vault?
@rajasdataengineering7585 2 years ago
We need to create scoped credentials in Databricks first to set up the integration between Key Vault and Databricks
@apoorvsrivastava7121 2 years ago
@@rajasdataengineering7585 Thank you, will check and do 💪
@arnabsontu6578 2 years ago
Sir, is there any way to hide the password from being exposed in the code?
@rajasdataengineering7585 2 years ago
Yes Arnab, we can use Azure Key Vault
@akashsharma4769 1 year ago
We can also use a Databricks secret scope
@rajasdataengineering7585 1 year ago
Yes, we can use a Databricks secret scope