Here, for the mount point, is it secure to give the access key directly? Instead of giving the access key directly, is it possible to create a secret in Key Vault and use that?
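Yes — a common pattern is to store the account key in Key Vault and read it through a Key Vault-backed secret scope, so the key never appears in the notebook. A minimal sketch, assuming a secret scope "kv-scope", a secret "adls-key", and storage account/container "mystorageacct"/"mycontainer" (all hypothetical names):

```python
# Databricks notebook sketch: mount storage without hardcoding the account key.
# "kv-scope", "adls-key", "mystorageacct", "mycontainer" are placeholder names.
storage_account = "mystorageacct"
container = "mycontainer"

# Fetch the key from the Key Vault-backed secret scope at runtime
account_key = dbutils.secrets.get(scope="kv-scope", key="adls-key")

dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point=f"/mnt/{container}",
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net": account_key
    },
)
```

With this, the key lives only in Key Vault; rotating it there requires no notebook change, only a remount.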
Hi, is there any other way or option to access files from ADLS inside Databricks without mounting the storage account? Could you please share your inputs on that? Thank you!
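Yes — you can skip mounting entirely and read the abfss path directly after setting the account key (or a SAS token / service principal) in the Spark session config. A sketch, with hypothetical names ("mystorageacct", "mycontainer", "kv-scope"/"adls-key"):

```python
# Databricks sketch: read from ADLS Gen2 directly over abfss, no mount needed.
# All account, container, and secret names below are placeholders.
storage_account = "mystorageacct"

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="kv-scope", key="adls-key"),
)

df = spark.read.csv(
    f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/raw/sales.csv",
    header=True,
)
```

Direct abfss access is session-scoped, whereas a mount is visible to every cluster in the workspace, so direct access is often preferred when different jobs need different credentials.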
In our project we are using abfss paths as well as wasbs. Most projects I have seen use abfss. What is the difference between abfss and wasbs? Could you please explain 🙏
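For reference, the two schemes use different drivers and hit different endpoints — wasbs is the older Blob driver against the Blob endpoint, abfss is the ABFS driver built for ADLS Gen2 against the DFS endpoint (account/container names below are hypothetical):

```
# wasbs: legacy Windows Azure Storage Blob driver, Blob endpoint
wasbs://mycontainer@mystorageacct.blob.core.windows.net/path/file.csv

# abfss: Azure Blob File System driver for ADLS Gen2, DFS endpoint
abfss://mycontainer@mystorageacct.dfs.core.windows.net/path/file.csv
```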
Account Key - if you provide access using it, the user gets complete access to the account, like view/modify. SAS - it's useful if you want to share access to resources for a specific period of time and with only specific permissions, like view, create, modify, or all of them.
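In Databricks the two styles show up as different Spark configs — a sketch, with "mystorageacct" and the placeholder tokens being hypothetical:

```python
# Databricks sketch of the two auth styles (all names are placeholders).
acct = "mystorageacct"

# 1) Account key: full view/modify access to the whole storage account
spark.conf.set(f"fs.azure.account.key.{acct}.dfs.core.windows.net", "<account-key>")

# 2) SAS token: scoped permissions plus an expiry time baked into the token
spark.conf.set(f"fs.azure.account.auth.type.{acct}.dfs.core.windows.net", "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{acct}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
)
spark.conf.set(f"fs.azure.sas.fixed.token.{acct}.dfs.core.windows.net", "<sas-token>")
```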
Hi bro, today I attended a TCS interview. They asked me about real-time scenarios in ADF. 1. How to create a reusable pipeline for collecting only the required columns from n number of files, from ADLS to SQL. For example, I have 10 files and every file has 20 columns, but I want only 15 columns, and I need to do this activity repeatedly, so the pipeline should be reusable. 2. In ADLS I have different CSV files, e.g. for the India, Australia, England, and South Africa cricket teams. I want only the Indian cricket team related files — how do I create a pipeline for that? Please create a real-time scenario video for this. Thanks in advance!
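For scenario 1, the core step is a parameterized column projection — in ADF that would be the copy activity's column mapping driven by a pipeline parameter. A plain-Python sketch of just that projection step (file content and column names are made up for illustration):

```python
import csv
import io

def project_columns(csv_text, wanted):
    """Keep only the wanted columns from a CSV, in the given order."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [{col: row[col] for col in wanted} for row in rows]

# A file with 4 columns, but we only want 2 of them
sample = "id,name,city,salary\n1,Asha,Pune,100\n2,Ravi,Delhi,200\n"
print(project_columns(sample, ["id", "name"]))
# → [{'id': '1', 'name': 'Asha'}, {'id': '2', 'name': 'Ravi'}]
```

Reusability comes from passing the `wanted` list in from outside — in ADF, the same idea is a pipeline parameter holding the column list, applied to every file a ForEach loop hands to the copy activity.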
2.) In ADLS you have different CSV files and you want only the India team related files? Answer: take a Get Metadata activity to fetch all the file names, then a Filter activity with a condition like "name starts with india", and then a Copy activity — that way you copy only the Indian team files.
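The Filter step above might look roughly like this in the pipeline JSON — a sketch, where the activity name "GetFileList" and the "india" prefix are assumptions for this example:

```json
{
  "name": "FilterIndiaFiles",
  "type": "Filter",
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileList').output.childItems",
      "type": "Expression"
    },
    "condition": {
      "value": "@startswith(item().name, 'india')",
      "type": "Expression"
    }
  }
}
```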
2nd method: first create a CSV dataset and linked service on ADLS Gen2, then a pipeline with a Copy activity. In the source, give a wildcard file path like *.csv, and open the dataset to point it at the India folder only. Finally, for the sink, create a dataset and linked service on the destination and set the sink path. Debug the pipeline and you get only the India files in the output.
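The wildcard approach can also match on the file name itself, so no Filter activity is needed — a sketch of the copy activity's source settings, where the folder name "cricket" and the pattern "india*.csv" are assumptions for this example:

```json
{
  "type": "DelimitedTextSource",
  "storeSettings": {
    "type": "AzureBlobFSReadSettings",
    "recursive": true,
    "wildcardFolderPath": "cricket",
    "wildcardFileName": "india*.csv"
  }
}
```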