
8. Delta Optimization Techniques in Databricks

CloudFitness · 19K subscribers
17K views
Published: 1 Oct 2024

Comments: 18
@186roy 2 years ago
A small correction: compacting (OPTIMIZE) is idempotent; Z-ordering is NOT idempotent.
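For reference, a minimal PySpark sketch of the two operations being compared; the table name "events" is hypothetical, and `spark` is already defined in a Databricks notebook.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # pre-created as `spark` on Databricks

# Compaction: merges small files into larger ones. Re-running it on an unchanged
# table has nothing left to compact, which is why it is described as idempotent.
spark.sql("OPTIMIZE events")

# Z-ordering: rewrites files so rows are co-located by the given column(s);
# re-running it re-clusters the data rather than being a no-op.
spark.sql("OPTIMIZE events ZORDER BY (event_id)")
```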
@olegkuzmin594 3 years ago
Hello Bhawna. Regarding "partitions should be at least 1GB", it is not always that straightforward. If your use case is read-heavy, then large partitions make sense; for write-heavy use cases, smaller partitions work much better. Here is a reference video on this: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-o2k9PICWdx0.html
@cloudfitness 3 years ago
Yes, I agree!
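To illustrate the trade-off discussed in this thread: partition granularity is fixed at write time by the column(s) passed to partitionBy, so the read-heavy vs write-heavy choice comes down to how coarse that column is. A hedged sketch, with hypothetical paths and column names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical source Delta table
df = spark.read.format("delta").load("/mnt/raw/events")

(df.write
   .format("delta")
   .partitionBy("event_date")   # coarser column -> fewer, larger partitions (favours reads)
   .mode("overwrite")
   .save("/mnt/delta/events_partitioned"))
```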
@vipinkumarjha5587 3 years ago
Hi Bhavana, thanks for the important video. Can you please create one video on how to read streaming data incrementally from a Delta Lake table?
@cloudfitness 3 years ago
Give me some time, I will.
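In the meantime, a minimal sketch (not from the video) of reading a Delta table incrementally with Structured Streaming; the table paths and checkpoint location are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read new files from the Delta table incrementally as they arrive
incremental = (spark.readStream
               .format("delta")
               .option("maxFilesPerTrigger", 100)   # throttle how much each micro-batch picks up
               .load("/mnt/delta/bronze_events"))

# Write the stream to a downstream Delta table, tracking progress in a checkpoint
query = (incremental.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/bronze_to_silver")
         .start("/mnt/delta/silver_events"))
```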
@TheDataArchitect 5 months ago
What about using partitioning and optimization with Z-ordering together, where Z-order uses multiple columns?
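One hedged way to combine the two, assuming a table partitioned by a date column (all names hypothetical): partition pruning handles the date filter, and Z-ordering clusters the data within partitions on the other frequently filtered columns.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    OPTIMIZE sales
    WHERE sale_date >= '2024-01-01'         -- optionally restrict to recent partitions
    ZORDER BY (customer_id, product_id)     -- Z-order columns must not be partition columns
""")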
@pratiksharma8548 1 year ago
Hi, I just want to know how many files are scanned by the query below: SELECT Id, name FROM table WHERE Id = 1000
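The count is not fixed by the query text alone: it depends on Delta data skipping against each file's min/max statistics for Id (and on whether the table is Z-ordered by it). A hedged sketch of how one might inspect it, with "my_table" as a hypothetical name:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Total number of files in the table, for comparison
spark.sql("DESCRIBE DETAIL my_table").select("numFiles").show()

# The physical plan shows the Delta scan and the pushed-down filter on Id;
# the actual files-read count is visible in the Spark UI / query profile.
spark.sql("SELECT Id, name FROM my_table WHERE Id = 1000").explain(True)
```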
@sreeragnambiar4579 2 years ago
How do I delete partition folders/directories (which contain parquet files)? I could remove the reference to the particular date partition from the Delta log, but the original date partition folders are not getting deleted. Tried VACUUM as well.
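A hedged sketch of the usual sequence, with hypothetical names: DELETE only records the removal in the transaction log, and VACUUM physically deletes the underlying parquet files only once they fall outside the retention window (7 days by default), so recently removed files will still be on storage.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Logically remove the partition's rows (writes a new transaction log entry)
spark.sql("DELETE FROM events WHERE event_date = '2024-01-01'")

# Physically remove data files that are no longer referenced and are older
# than the retention window; newer files are kept for time travel.
spark.sql("VACUUM events RETAIN 168 HOURS")
```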
@ManishSharma-wy2py 11 months ago
I love to see your videos and listen to your voice.
@akash4517 1 year ago
Very informative video, thank you.
@nagamanickam6604 5 months ago
Thank you
@SpiritOfIndiaaa 1 year ago
Thanks Bhawna. I have a use case: I have two S3 "delta" files; I need to take the first file and delete those records from the second file, i.e. without changing the file path. Is this possible, and if so, how can it be done?
@selvavinayaganmuthukumaran1332 4 months ago
@SpiritOfIndiaaa When dealing with Delta files in an S3 bucket, it's important to note that directly modifying the contents of a file in place (i.e., without changing the file path) is not possible. However, there are some alternative approaches:
1. Local modification and upload: download the second Delta file locally, apply the necessary changes (deleting the records), and upload the modified file back to the same S3 location, overwriting the original. This keeps the file path unchanged.
2. Upsert using Delta Lake (Databricks): if you have access to Databricks or a similar platform, you can use Delta Lake's MERGE operation to upsert data from one Delta table into another. This lets you insert, update, or delete records in a target Delta table based on the contents of a source table or DataFrame.
3. Without Databricks: if you're not using Databricks, modifying Delta files directly in S3 without changing the file path is challenging; you would need to follow the first approach (local modification) and upload the modified file back to S3.
Remember that directly modifying files in place (especially in distributed storage systems like S3) can be complex due to transactional guarantees and the distributed nature of the data. Always ensure data consistency and back up your files before making any changes. 😊
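A minimal sketch of the MERGE route described above, with hypothetical paths and join key: delete from the second (target) Delta table every record whose key also appears in the first (source) table, leaving the target's path unchanged.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Target: the second Delta table, modified in place via the transaction log
target = DeltaTable.forPath(spark, "s3://my-bucket/path/second_delta_table")

# Source: the first Delta table, whose keys identify the rows to remove
source = spark.read.format("delta").load("s3://my-bucket/path/first_delta_table")

(target.alias("t")
       .merge(source.alias("s"), "t.id = s.id")   # match on the record key
       .whenMatchedDelete()                       # delete matched rows from the target
       .execute())
```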
@SpiritOfIndiaaa 4 months ago
@selvavinayaganmuthukumaran1332 Thanks a lot for your detailed explanation!
@ankbala 2 years ago
Thanks very much for your efforts! Very useful!
@CoopmanGreg 1 year ago
Great video!
@tanushreenagar3116 1 year ago
Nice ❤️
@AyushSrivastava-gh7tb 1 year ago
I haven't seen a better Data Engineering channel than this one!! 🙇‍♀