
Microsoft Fabric: What are the options to load local files in Lakehouse 

Learn Microsoft Fabric, Power BI, SQL - Amit Chandak
28K subscribers
6K views

Published: 20 Oct 2024

Comments: 13
@trone_tip · a year ago
Thanks for this 😊 Please also add how to connect SSMS.
@AmitChandak · a year ago
I have discussed how to connect with SSMS in this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Mn1GB8rSeoQ.html. Take the SQL endpoint URL, choose the Database Engine connection type, enter the URL, and log in with your user ID and password with MFA.
@DanielWeikert · 4 months ago
Why is the destination type always string and cannot be changed? Thx
@AmitChandak · 4 months ago
When we first got the UI, there was an option on the left to change the data type. That has since changed, and now you can change it on the destination side.
@matiasbarrera6959 · a year ago
Thanks for the video. Can I upload files using Python in a Fabric notebook?
@AmitChandak · a year ago
If the file is on some cloud storage and you can access it using a URL and credentials, you can. As of now, I doubt the on-premises gateway is supported for that.
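For a cloud-hosted file reachable over plain HTTPS, a notebook upload could look roughly like the sketch below. This is a minimal illustration, not something shown in the video: it assumes a default lakehouse is attached to the notebook (so its Files area is mounted at /lakehouse/default/), and the URL and file name are placeholders.

```python
# Minimal sketch: pull a file from an HTTPS URL into the Files area of the
# default lakehouse attached to a Fabric notebook. URL/file name are placeholders.
import requests

url = "https://example.com/data/sales.csv"      # hypothetical source URL
target = "/lakehouse/default/Files/sales.csv"   # mounted Files area of the lakehouse

resp = requests.get(url, timeout=60)
resp.raise_for_status()

with open(target, "wb") as f:
    f.write(resp.content)

print(f"Wrote {len(resp.content)} bytes to {target}")
```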
@martinbubenheimer6289 · a year ago
Hi Amit, thank you for this introduction. For the approaches shown, I need a web browser and have to do a manual upload each time I have new data. If I want to automate an update schedule for local files with Fabric, do I need a Power BI Gateway as with Power BI, or an Azure Integration Runtime as with Synapse?
@AmitChandak · a year ago
As of now, the on-premises gateway does not support that, but we can expect it soon.
@AmitChandak · a year ago
For Azure sources, you should be able to schedule pipelines and dataflows.
@bloom6874 · 4 months ago
Great explanation, sir. Is it possible to download the CSV files from a Fabric lakehouse?
@AmitChandak · 4 months ago
Thanks. For uploading, we have the option in the Lakehouse itself. For downloading, we can use OneLake File Explorer; once installed, you will have access to the files, similar to OneDrive. Another method is to use Python code: PySpark in notebooks can help, or local Python code, which I have discussed in later videos of the series, can also download files. I also checked that you can create a pipeline to move data from the Lakehouse to a local folder; you need an on-premises gateway for that, and the data will be downloaded on the on-premises gateway machine.
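As one possible illustration of the local Python route: OneLake exposes ADLS Gen2-compatible APIs, so the azure-storage-file-datalake package can pull a file from a lakehouse down to the local machine. This is a minimal sketch and not the exact script from the video; the workspace, lakehouse, and file names are placeholder assumptions.

```python
# Minimal sketch: download a CSV from a Fabric lakehouse to the local machine
# via OneLake's ADLS Gen2 endpoint. Workspace/lakehouse/file names are placeholders.
from azure.identity import InteractiveBrowserCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=InteractiveBrowserCredential(),
)

fs = service.get_file_system_client("MyWorkspace")                   # workspace name
file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales.csv")

with open("sales.csv", "wb") as f:
    f.write(file_client.download_file().readall())
```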
@nies_diy986 · a year ago
@amit, great tutorial. How can we transform multiple CSVs in a single dataflow and then sink them all into a lakehouse or warehouse using the same dataflow? Is that even possible? I tried it, but when I select the destination it only takes one CSV to sink, so I am stuck. Does that mean I have to create multiple dataflows to handle multiple CSV transformations? I know that in pipelines we can handle this via the ForEach activity, but I am not able to sort it out for transforming multiple CSVs with dataflows. The scenario: I have Emp.csv and Dept.csv provided by a client, and I want to transform them (changing the ID column from string to whole number) and then sink them to a lakehouse. The problem is that I can ingest the files automatically using a pipeline ForEach, but cannot do the same for the transformations.
@AmitChandak · a year ago
You can add multiple files in the same dataflow and set a destination for each. When you start a dataflow from the Lakehouse explorer, it will automatically set the destination for all tables. In the above case, you should be able to add all the files to one dataflow, transform them, and load them to the lakehouse or warehouse.
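If a notebook is an acceptable alternative to a dataflow for this scenario, a PySpark loop over the files could handle the same Emp.csv/Dept.csv transformation. This is an assumption-laden sketch, not the approach demonstrated in the video: it assumes a Fabric notebook with a default lakehouse attached, that the CSVs sit in its Files area, and that each file has a string ID column.

```python
# Minimal sketch: read each CSV from the attached lakehouse, cast ID to a whole
# number, and write each as its own Delta table. Paths/column names are assumptions.
from pyspark.sql.functions import col

files = {
    "emp": "Files/Emp.csv",
    "dept": "Files/Dept.csv",
}

for table_name, path in files.items():
    df = spark.read.option("header", True).csv(path)
    df = df.withColumn("ID", col("ID").cast("int"))          # string -> whole number
    df.write.mode("overwrite").format("delta").saveAsTable(table_name)
```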