
Power Query - Avoid "Helper Queries" (+10 Cool Tricks) 

BCTI
7K subscribers
43K views

Learn how to avoid creating "Helper Queries" when extracting data from binary files.
Plus, see 10 incredibly useful Power Query & "M" code tricks to save valuable time and perform unthinkable data transformation acts.
"M" code examples (File download link:)
www.bcti.com//...
00:27 When & why Helper Queries are created
01:29 Video Objectives
02:11 Avoiding Helper Queries when Working with CSV files
04:18 Filtering by File Extension (i.e., file type)
05:03 Filtering by Path (i.e., file location)
05:44 Removing unnecessary Meta Data
05:54 Utilizing "Helper Queries"
07:03 Removing unwanted rows using "good" errors
07:36 Avoiding "Helper Queries" when working with text files
07:50 Extract Binary content using a function
09:06 Filling in the missing steps
10:26 Avoiding "Helper Queries" when working with Binary files
12:57 Filtering nested tables before data extraction
15:07 Sorting columns during column extraction
15:54 Renaming columns during column extraction
16:37 Removing total rows using "good" errors
17:33 Find / Replace values
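For readers who want the gist in code, here is a minimal sketch of the no-helper-queries pattern the video builds for CSV files. The folder path is hypothetical and the four generic column names are an assumption to adjust for your own data:

let
    // Hypothetical folder path - point this at the folder holding the CSV files
    Source = Folder.Files("C:\Data\CSV Files"),
    // Keep only the binary file contents; drop the folder metadata columns
    Keep_Content = Table.SelectColumns(Source, {"Content"}),
    // Decode each binary into a table - no helper queries or sample-file function needed
    Add_Tables = Table.AddColumn(Keep_Content, "CSV_Data", each Csv.Document([Content])),
    Remove_Binaries = Table.RemoveColumns(Add_Tables, {"Content"}),
    // Assumes four columns per file; edit this list to match the real data
    Expanded = Table.ExpandTableColumn(Remove_Binaries, "CSV_Data", {"Column1", "Column2", "Column3", "Column4"}),
    Promoted = Table.PromoteHeaders(Expanded, [PromoteAllScalars = true])
in
    Promoted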

Science

Published: 24 Sep 2024

Comments: 154
@robbe58 · 7 months ago
What a way to avoid all those extra query steps when using files from a folder. NOT having to see all those steps Power Query adds is fantastic. You are a very skilled and intelligent guy. So keep digging into the Excel world to explain all those useful methods/tips. Thank you very much for sharing them.
@ExcelWithChris · 7 months ago
Brilliant! Been using for a few years and always struggled with those helper files. Sometimes ending with 20 or even more. And it creates major issues if you just copy a query and maybe change the source only. It still works but I am always scared something goes wrong. Thanks so much. More of these please.
@bcti-bcti · 7 months ago
Thanks for taking the time to watch. Glad it helped. 👍👍
@phandongexcel · 6 months ago
I like this video ❤❤❤❤
@manojkrishna5391 · 5 months ago
Fantastic
@mightydrew1970 · 7 months ago
Great stuff, I'll use that next time I touch a folder full of files. The only thing I'll do differently is to filter out "grand total" and surplus header rows first. Filtering just based on errors may (silently) remove and hide bad entries in the CSV.
@BrvKor · 4 months ago
Simple, clear, focused --> understood and implemented in my work. Excellent presentation; the clear use of words and examples is among the best I have seen. Thank you for the tutorial.
@bcti-bcti · 4 months ago
Thank you so much for your VERY kind and supportive words. Thanks for watching!!!
@Adam_K_W · 6 months ago
Holy Cow!! This was an incredible mind blowing bit of knowledge! I for one have always HATED the helper queries because when you swap out entire sets of files in a folder, the helper queries are still looking for the filename that they first referenced and when that file is gone, they break. I have some pretty heavy queries that have too many steps. I know I can clean those up much better now. Thank you.
@bcti-bcti · 6 months ago
So glad it will help. I prefer this method (in most cases.) Thanks for watching!!
@Adam_K_W · 6 months ago
@@bcti-bcti At work, I was asked to automate the updating and consolidation of spec files when part numbers change. Prior process was to assign someone to manually go through hundreds of files, with tens of thousands of rows and MANUALLY copy/paste/Q.C. the data. Whenever specs were updated (several times per year), this error prone process would take a full time user about a month. Using PQ, I got that down to about 7 minutes... Now that I've seen this, I can make that and other processes even better!
@bcti-bcti · 6 months ago
@@Adam_K_W FANTASTIC!!!! It’s always great to hear a PQ success story.
@FranzL608 · 3 months ago
I hated these helper files from the beginning. So far I tried to avoid them by creating a function like:
let
    GetFiles = (Path, Name) =>
        let
            Source = Excel.Workbook(File.Contents(Path & Name), null, true),
            Sheet1_Sheet = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
            #"Promoted Headers" = …,
            #xxx = …
        in
            #"xxx"
in
    GetFiles
and applied this function to the files in a folder. Your solution seems smarter as it even avoids having the function code. Brilliant. I will certainly give it a try. Thanks for sharing your knowledge.
@bcti-bcti · 3 months ago
I'm not a fan either. Thanks for watching.
@paser2 · 4 months ago
Just what I was looking for - getting rid of helper queries and hard coded steps. I love the fool proofing tips. Thank you.
@bcti-bcti · 4 months ago
You're very welcome. 👍👍
@serdip · 5 months ago
Great video! Thanks for sharing this very practical information. I have performed multi-file import operations similar to what was demonstrated in the lecture. However, I just retain the [Name] column from the initial metadata table and don't require the additional steps outlined in the presentation. I do remove the file extensions, of course. It propagates through the subsequent steps, as far as I can tell. I have been using and programming Microsoft Excel for 27 years but only started learning about Power Query some six months ago. It's a game changer! It's so cool that PQ can replace many of the VBA routines I have developed over the years to clean and transform data, all with just a few clicks. Throw in the added power of customizing the generated M Code script - I have created dozens of general-purpose PQ custom functions plus my own Excel VBA Add-in to help load multiple tables and ranges in one go - and my data cleansing capabilities have now reached the next level. I will *never* interact with Excel the same way again! Thank you kindly.
@bcti-bcti · 5 months ago
I, like you, retain the [Name] column from the metadata. This was to demonstrate how to achieve the goal if you didn't take that approach. I used to rely very heavily on VBA for my cleanup and automation solutions. Now it only gets used for situations where Power Query can't do the job. My need for VBA has declined about 90% since Power Query came into being. Thanks for watching.
@SaladSharkGaming · 3 months ago
I've been wanting to get around these helper queries for ages. Your guide is incredibly straightforward, and I can't wait to apply this new methodology in my future queries!
@bcti-bcti · 3 months ago
Thank you so much for saying nice things. Thanks for watching!!!
@tinhoyhu · 7 months ago
I like this format of going through long examples with a lot of embedded tricks. That Detect Data Types feature will save me a lot of time.
@bcti-bcti · 7 months ago
So glad you're enjoying them. I know it's not a format for everyone. Thanks for watching.
@RichardJones73 · 7 months ago
Superb explanation, and now I can get rid of my ugly, confusing helper queries. They always used to drive me mad, and now I can control the data much more easily. EDIT: I would add, though, that when I do the Folder.Contents thing I like to have the steps expanded out in the Applied Steps window so that I can easily see the path/folders I've drilled down into. To do this, you add a line in the Advanced Editor to set a variable, e.g. file = "expand", and put a comma on the end; then you will find that the steps following it are expanded out when you drill down into folders.
@bcti-bcti · 7 months ago
So glad to hear it helps. Thank you so much for watching and commenting. It really helps.
@iankr · 7 months ago
Great video, and very clearly explained, as always. I also use that technique to avoid those confusing Helper Queries!
@mostafahwafy · 1 month ago
Amazing for anyone dealing with multiple monthly file types 🤩
@bcti-bcti · 28 days ago
Thanks for taking the time to watch. 👍🏼
@mostafahwafy · 28 days ago
@@bcti-bcti no problem... I enjoyed it🤗
@DanKnight · 7 months ago
Wonderful! Excellent tips, except now it's increased my workload, as I'm going to have to go back and rework some queries to avoid the "helpers" and streamline multiple steps. Well done, well explained and illustrated.
@bcti-bcti · 7 months ago
Yeah, but it's a "good" kind of work. Very satisfying. Thanks for watching.
@AlvaroFreireAF1 · 7 months ago
Great job!! However, in my case it was not so simple since my CSV files have a more complex encoding. When I first tried to use the Csv.Document command, it threw an error. To work around the problem, I first expanded using the normal process and copied the formula from the Transform Sample File located in the helper queries folder. After that, I just replaced the expression "Parameter1" with "Content".
This was the original formula:
Csv.Document(Parameter1,[Delimiter=";", Columns=17, Encoding=1252, QuoteStyle=QuoteStyle.None])
This is the new one:
Csv.Document([Content],[Delimiter=";", Columns=17, Encoding=1252, QuoteStyle=QuoteStyle.None])
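For context, the fix above slots straight into the custom-column step used elsewhere in this thread; a hedged sketch, where the step name Keep_Only_Content is assumed from the earlier examples:

Extract_Tables = Table.AddColumn(Keep_Only_Content, "CSV_Data",
    each Csv.Document([Content], [Delimiter = ";", Columns = 17, Encoding = 1252, QuoteStyle = QuoteStyle.None]))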
@bcti-bcti · 7 months ago
Great idea! As this is not a 100% perfect solution, there are always going to be scenarios that require creativity. Thanks for watching!!!
@ladytigereye6145 · 6 months ago
I have the exact same problem and your solution works for me. I can't say how much I want to thank you, saving me so much time and nerves! 😄🤗
@AlvaroFreireAF1 · 6 months ago
@@ladytigereye6145 🤗
@bcti-bcti · 6 months ago
@@ladytigereye6145 so glad it helped. Thanks for taking the time to watch.
@ladytigereye6145 · 6 months ago
@@bcti-bcti The crazy thing about those "helper queries" is: If you use them, it prevents you from building a Power BI template app. Power BI template apps don't allow parameters of the type "any" or "binary". So without eliminating those helper queries from the data model one is not able to use an app to publish Power BI reports built from that whole data model. That's a very annoying relation I had to uncover. 😅 Thanks for helping me solve this issue. My app is running now. 😁
@anilb5836 · 6 months ago
Hats off, sir. I have watched so many YouTube videos and never got an explanation like yours. Appreciate you sharing such useful information.
@bcti-bcti · 6 months ago
That is exceedingly nice of you to say. Thanks for taking the time to watch.
@alman34 · 6 months ago
This has been super helpful. I didn't know about the combined function. This is saving me a bunch of time. Thank you!
@bcti-bcti · 6 months ago
Glad we could be of service. Thanks for your viewership!
@Daithi354 · 6 months ago
Honestly, this was very well explained, at a good pace, with lots of cool tips and tricks. Keep these vids coming, subscribed!
@bcti-bcti · 6 months ago
Thank you so much. We appreciate your viewership.
@HVACALC · 7 months ago
Excellent - and a great introduction for anyone looking to learn how to 'program' with power query.
@bcti-bcti · 7 months ago
Thank you so much for taking the time to watch and comment.
@thelastfry23 · 7 months ago
Great video. Not sure how you keep putting out videos with a better methodology for projects I've just finished, but keep them coming!
@bcti-bcti · 7 months ago
So what's the NEXT project I can assist with? 🤣 Thank you for taking the time to watch.
@aliasghar09 · 1 month ago
Great Stuff...Very Informative...and easy to understand. Great work guys!!!🖐
@bcti-bcti · 1 month ago
Much appreciated!
@Mephistopheles019 · 5 months ago
A gentleman and a scholar. Thank you so much, dude. Now I get to spend the day optimizing/rebuilding all my tools. Question for you: does this speed up the performance due to not having all that overhead?
@bcti-bcti · 5 months ago
I haven't run any scientific studies, but I can't help but think that things would run a bit quicker without the extra overhead. I'm sure someone could demonstrate a scenario where having the helper queries improves performance, so I guess it's just a matter of what you're willing to tolerate. Thanks for watching!
@djl8710 · 7 months ago
Cool, I hate all the helper junk, way too much clutter. Nicely done!
@bcti-bcti · 7 months ago
Thanks. I couldn't agree more.
@qwerty1945yt · 7 months ago
this is AWESOME! and, learned fundamentals in the process! thanks!!
@bcti-bcti · 7 months ago
You are very much welcome. Thanks for watching.
@HanoCZ · 6 months ago
Simply brilliant. I learned a trick or two, even though I considered myself skilled in PQ.
@bcti-bcti · 6 months ago
I like to think I know more than most about PQ, but I still have a LOT to learn. It’s always a great feeling to learn something new. Thanks for watching.
@jessicajolly7828 · 12 days ago
I can wholeheartedly second this comment. I have reasonably good PQE skills, but you, sir, are next level!!!
@AJ-fd3yj · 2 months ago
I sincerely thank you for this awesome, well-done, informative, mind-blowing tutorial.
@bcti-bcti · 2 months ago
Thank you so much!!! We appreciate you taking the time to watch and comment.
@AJ-fd3yj · 2 months ago
@@bcti-bcti you ought to have an idea how much I and other users here appreciate the time you took to create content that is not only fun to watch - and learn from - but adds so much value to our lives.
@bcti-bcti · 2 months ago
@@AJ-fd3yj thank you for the encouraging words. It makes me want to try harder with each video. Thank you.
@teekayjunior2517 · 6 months ago
I love your channel and hope it thrives! A specific question from this video: I have a file to which I've applied this technique. It works, but the refresh is exceedingly slow, and it's a very small file at this point. The only thing I can think of that would take a long time is filtering through the SharePoint directory to get to the target folder. The root folder is parent to a lot of subfolders and sub-subfolders and so on. In any case, anything you can do to help me speed this up would be helpful. We're talking 10 minutes or more every time a change is made.
@bcti-bcti · 6 months ago
Thank you for the nice encouragement. We hope it thrives, too. If you want to post here, or email me your full query (copied from the Advanced Editor), I can take a look and see if there's anything that may come to mind when it comes to optimizing the performance. training@bcti.com
@patrickharilantoraherinjat2994 · 7 months ago
Before expanding table you can promote headers for each table at once!
@bcti-bcti · 7 months ago
Very true. Thanks for the input. And thanks for watching.
@AmirMosh-b4w · 9 days ago
Really nice tutorial! I see I'm not the only one who doesn't feel comfortable with those "helper" queries... I would like to know if this video is part of an online course that I can take?
@bcti-bcti · 9 days ago
This isn't part of an online course; it's just me making videos that I find interesting. The company I work for does perform training for other companies, but they're typically local (i.e., within 150 km), though we do have some students who take the class over the Internet. The classes are live and not recorded, and it's the same experience for students in the classroom versus remote. Time zones are also an issue with live, distance-learning sessions. We're flattered that you would wish to take a full course. Where are you located?
@AmirMosh-b4w · 6 days ago
I am located outside the U.S. Which of the trainings is related to Power Query?
@MaureenPesch · 7 months ago
This was excellent!. Thank you! Love your channel!
@bcti-bcti · 7 months ago
Thank you for saying so. And thanks for watching.
@kirux882 · 2 months ago
This was fantastic! Thank you for a great video!
@bcti-bcti · 2 months ago
Thanks for the compliment. We appreciate you watching.
@flmiami1 · 5 months ago
Great and clean method! Thanks a lot.
@bcti-bcti · 5 months ago
You're welcome. Thanks for watching.
@7213261 · 6 months ago
Thanks, cool trick to avoid subfolders!
@bcti-bcti · 6 months ago
Yep. Sure is. Thanks for watching.
@jamihoneycutt3025 · 3 months ago
Great view and excellent training! What do you do when your data is not in table form? I am using a date created column from the file selection to separate the data that is on Sheet1 of the individual files. When I click to expand the Data column containing the tables, I get "Column 1", "Column 2", etc. When I promote headers, I lose the first date of the date created because it is then promoted to a header as well. How do you correct for this?
@bcti-bcti · 3 months ago
Without seeing your data, I would think that you would just have to rename the date column to something more appropriate. Are you losing the entire record when you promote the header row, or just a single heading for a single column?
@michaelt312 · 7 months ago
In your Expand_CSV step, couldn't you change the header names there? This would avoid the need to promote the first row to name the columns?
@bcti-bcti · 7 months ago
Can you provide a sample of M-code to demonstrate your combined strategy? Thanks.
@michaelt312 · 7 months ago
8:55 I'm on my phone so just typing out this line as it is what I'm referencing. I'll type out the full M when I get to my computer later today. Please forgive any typos. I'm currently on a crowded L heading into downtown Chicago. I think you will be able to understand from this. =Table.ExpandTableColumn{#"Removed Columns", "CSV_Data", ("Column1", "Column2", "Column3", "Column4"), ("TranDate", "Account", "Dept", "Sum of Amount")
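For anyone copying this, a cleaned-up version of that line with M's list braces and balanced parentheses; the step name #"Removed Columns" is taken from the comment and may differ in your query:

Expand_CSV = Table.ExpandTableColumn(#"Removed Columns", "CSV_Data",
    {"Column1", "Column2", "Column3", "Column4"},
    {"TranDate", "Account", "Dept", "Sum of Amount"})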
@bcti-bcti · 7 months ago
I thought you were alluding to that, but I wasn't sure you were doing something else. Yes, that would be a great step-saver. I should have done that in the video. Thanks for the input, and thanks for watching. @@michaelt312
@michaelt312 · 7 months ago
@@bcti-bcti , your channel is awesome. I tripped on this process when I was trying to clean some code. It is now my go-to in every build.
@srider33 · 5 months ago
Good video. While I agree that those helper queries add clutter, do you have any detail on performance for larger datasets with this query?
@bcti-bcti · 5 months ago
Thanks. I haven’t done any large scale performance tests, so I’m uncertain as to the benefit/detriment of either approach. Thanks for watching.
@allisterbaken5396 · 2 months ago
Looks Great - how would it be different if I had workbook files (xlsx, xlsb etc) in the folder ?
@bcti-bcti · 2 months ago
If you look at 12:08 in the video, I show how to use the strategy on Excel files using the Excel.Workbook() function. Thanks for watching.
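For Excel workbooks the custom-column step swaps Csv.Document for Excel.Workbook; a minimal sketch, with the step and column names assumed from the CSV example earlier in this thread:

Extract_Workbooks = Table.AddColumn(Keep_Only_Content, "XL_Data", each Excel.Workbook([Content], true))
// Each nested value is then a table of the workbook's sheets/tables/named ranges
// ([Name], [Item], [Kind], [Data]) that can be filtered before expanding [Data].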
@reggydavis · 2 months ago
What are the gains in speed? I'm struggling to see the major benefit here. Can't you just ignore the helper queries? It's so much easier to click the button than to go through these steps.
@bcti-bcti · 2 months ago
@@reggydavis You could, but it’s my experience that many users don’t like all the additional overhead and would rather take matters into their own hands. Thanks for watching.
@yousrymaarouf2931 · 7 months ago
Excellent
@zzota · 6 months ago
Amazing. Thank you for this, very helpful.
@bcti-bcti · 6 months ago
You are most welcome. Thanks for watching!!!
@teekayjunior2517 · 3 months ago
Another question! How is this technique applied to SharePoint folders? While the local folder reference is helpful, more often than not these files are collaborated on, thus rendering the local reference ineffective (unless I'm missing something).
@sanchowitfurrows1561 · 22 days ago
Replacing .Files with .Contents works for a SharePoint folder as well, with one small condition: you'd better be a site admin.
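A hedged sketch of what that looks like for a SharePoint document library; the site URL is hypothetical, and note that SharePoint.Files returns a flat list of all files in the site while SharePoint.Contents returns a folder-style navigation table:

Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/YourSite", [ApiVersion = 15])
// or
Source = SharePoint.Contents("https://yourtenant.sharepoint.com/sites/YourSite", [ApiVersion = 15])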
@Gef1ManArmy · 7 months ago
Can I ask - why do you start by connecting to the parent folder and then go through the steps of filtering just to the sub folder? Is it in case people have their files across multiple folders? If not, wouldn't it easier to connect straight to the sub folder? Thanks for your informative videos.
@bcti-bcti · 7 months ago
Yes, exactly. I would just start at the sub-folder level, but this gives the operator the ability to "catch" more files that may be spread across multiple folders. It also allows the video to demonstrate "path filtering", something some users may have never thought to do. Thanks for watching.
@asher0987 · 3 months ago
That's a wonderful solution. What if the data is in .xml files?
@bcti-bcti · 3 months ago
Once you have the folder data reduced to just a [Content] column, you could create a custom column using the following formula: =Xml.Tables([Content])
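As a sketch of how that fits the same pattern (the step name Keep_Only_Content is an assumption carried over from the CSV example):

Extract_XML = Table.AddColumn(Keep_Only_Content, "XML_Data", each Xml.Tables([Content]))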
@gabrielgordon · 7 months ago
Thank you! this is so valuable!
@bcti-bcti · 7 months ago
Glad it helps. Thanks for watching.
@McIlravyInc · 7 months ago
I like the cleaner combining of files, but I often need the name of the source file for each row. How would you approach that? Thanks.
@bcti-bcti · 7 months ago
You could keep the [Name] and [Content] columns, that way you could use the [Name] as a transaction-level "stamp" so you always know where each transaction came from. See the below M code for an example (this is done with the CSV files, but can easily be done the same way with any files.)
==============================
let
    // Connect to the parent folder holding all files and subfolders
    Source = Folder.Files("C:\Camtasia Projects\Power Query - Avoid Scary Queries"),
    // Standardize the casing of the file extensions (using UPPERCASE)
    Uppercase_File_Extensions = Table.TransformColumns(Source,{{"Extension", Text.Upper, type text}}),
    // Filter to keep only files in the "CSV Files" folder and subfolders
    Filter_By_Path = Table.SelectRows(Uppercase_File_Extensions, each Text.Contains([Folder Path], "\CSV Files\")),
    // Filter to keep only CSV files
    Filter_By_CSV_Files = Table.SelectRows(Filter_By_Path, each Text.Contains([Extension], ".CSV")),
    // Remove all columns of folder metadata EXCEPT the [Name] and [Content] columns
    Keep_Only_Content = Table.SelectColumns(Filter_By_CSV_Files,{"Name", "Content"}),
    // Standardize the casing of the Filenames (UPPERCASE)
    #"Uppercased Text" = Table.TransformColumns(Keep_Only_Content,{{"Name", Text.Upper, type text}}),
    // Extract the tables from the Binary files
    Extract_Tables = Table.AddColumn(#"Uppercased Text", "CSV_Data", each Csv.Document([Content])),
    // Delete the [Content] column (column of Binaries)
    Delete_Binaries = Table.RemoveColumns(Extract_Tables,{"Content"}),
    // Expand the table columns of data from the [CSV_Data] column
    Expand_Tables = Table.ExpandTableColumn(Delete_Binaries, "CSV_Data", {"Column1", "Column2", "Column3", "Column4"}, {"Column1", "Column2", "Column3", "Column4"}),
    // Promote the first row of data to a Header Row position
    Promoted_Headers = Table.PromoteHeaders(Expand_Tables, [PromoteAllScalars=true]),
    // Set the Data Types for the columns
    Set_Data_Types = Table.TransformColumnTypes(Promoted_Headers,{{"TranDate", type date}, {"Account", Int64.Type}, {"Dept", Int64.Type}, {"Sum of Amount", Currency.Type}}),
    // Remove any redundant headers and total rows (rows with text are rendered as errors by the previous step)
    Remove_Error_Rows = Table.RemoveRowsWithErrors(Set_Data_Types, {"TranDate", "Account", "Dept", "Sum of Amount"})
in
    Remove_Error_Rows
@Mephistopheles019 · 5 months ago
Follow-up question for you: what if the file type is .txt?
@bcti-bcti · 5 months ago
All steps should be the same for .TXT files as they use the same connector for decoding. The only issue may be when you are using a TAB as the file's delimiter instead of a COMMA. If this is the case, you will have to modify the Csv.Document function as follows: Csv.Document([Content], [Delimiter=" "]) (There is a TAB character between the double-quotes.) NOTE: The "Custom Column" dialog box will not allow you to type a TAB character in the formula; it interprets TAB as an instruction to move to the next field in the dialog box. For this reason, you may have to type the above command in something like Notepad, then copy/paste it into the "Custom Column" dialog box. Hope this helps. Thanks for watching.
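If typing a literal TAB proves awkward, M text literals also accept the #(tab) escape sequence, which may be easier to paste into the Custom Column dialog; a sketch:

Csv.Document([Content], [Delimiter = "#(tab)"])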
@Isambardify · 6 months ago
Will the model load everything in the parent folder before filtering down or is it intelligent enough to only request the data it needs?
@bcti-bcti · 6 months ago
I can't be entirely certain as to your question. Can you rephrase it with a procedural example? Thanks for watching!
@maurobsn1873 · 2 months ago
Hello, great video! I have a question: how can I extract a certain table if there is more than one in an Excel workbook? Should I use [Kind] = "Table" & [Name] = "TableName"?
@bcti-bcti · 2 months ago
You could do something like the following:
let
    Source = Excel.Workbook(File.Contents("C:\Excel Data.xlsx"), null, true),
    // Filter the Excel file's metadata by the name of the desired table (ex: "MyData").
    #"Filtered Rows" = Table.SelectRows(Source, each ([Item] = "MyData")),
    // Extract the data from the "MyData" table
    MyData_Table = #"Filtered Rows"{[Item="MyData",Kind="Table"]}[Data]
in
    MyData_Table
@maurobsn1873 · 2 months ago
@@bcti-bcti Thank you, I will try it... I was actually thinking about something like that; I just haven't had the time to try it yet! Thank you!!!
@kebincui · 7 months ago
Fabulous video❤
@jerry.david1 · 6 months ago
What a great video!!
@LilLinh · 3 months ago
Great tips
@bcti-bcti · 3 months ago
Thanks. We appreciate you taking the time to watch.
@ARSEABOUTFACE · 6 months ago
one of the better instructional videos. Very good pace and excellent explanations. Well done.
@bcti-bcti · 6 months ago
Thank you for saying so. We appreciate you taking the time to watch.
@bimanroy8865 · 7 months ago
If there is no error, how can the header rows be removed for all the input files except one? In this case, there was one field which was numeric, but there may be cases where all the columns are of the same type.
@bcti-bcti · 7 months ago
Then a different strategy will need to be employed. Every scenario is different and requires its own special adjustments. Thanks for watching.
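One such alternative strategy (not from the video, and with column names assumed from the earlier CSV example) is to filter out any row whose first column still holds the header text before setting data types:

Remove_Header_Rows = Table.SelectRows(Expand_Tables, each [Column1] <> "TranDate")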
@Hello-bn2yc · 3 months ago
thank you very much
@bcti-bcti · 3 months ago
You are quite welcome!
@ExcelWithChris · 4 months ago
At 13:25 I want to extract a specific named range in the file, but it is not showing in the list at the bottom. How can I do that? What I am doing is getting a file from a SharePoint folder, then going through all the steps to get a named range out of it, resulting in hundreds of helper queries. I want to do what you are doing here: get the named range out of the file without the helper queries. The list does not show named ranges in the file???
@bcti-bcti · 4 months ago
I'm tied up for a couple days, but let me look into this and see what I can figure out.
@ExcelWithChris · 4 months ago
@@bcti-bcti Thanks a million. Would help a lot. Currently I import 15 files and it creates a lot of Helper Queries.
@bcti-bcti · 4 months ago
@@ExcelWithChris I can assist with this, but it would be easier if I were to call you. We could share screens and I could walk you thru the process. That way, if there's a glitch, I can see what is happening and we can figure it out. You can email me at "training@bcti.com". cheers.
@sanchowitfurrows1561 · 22 days ago
@@ExcelWithChris do you mean 15 files, each with a different structure?
@SamehRSameh · 7 months ago
Marvelous 🎉🎉
@bcti-bcti · 7 months ago
Thanks for watching!!!
@Hemo2YoYo · 1 month ago
12:59 Inside [Content] another table comes up: the first column is "record" and the second column is "data" (the table is in there). How can I do that when, as in my case, there is another table inside [Content] and the relevant column is dynamic, not fixed?
@bcti-bcti · 1 month ago
I hate to sound dense, but I'm unclear as to your question. Can you rephrase it, or provide an example? Thanks.
@Hemo2YoYo · 1 month ago
@@bcti-bcti My bad, sorry — check at 13:02 where you extract the content tables into a new column. In my case, when I try to extract the content I get a table, but it doesn't belong to my data sheet: it's a table with 2 columns, where the first is labeled "name" (and sometimes "record") and the second column is "data", and in its first row is the table that relates to my sheet. How can I write an expression to get the table that is inside [Content]? In my case it usually comes in the second column, first row. I hope that's clear; if not I can share a screenshot to your email.
@bcti-bcti · 1 month ago
@@Hemo2YoYo yes, send a screenshot to training@bcti.com and I’ll give a look. Thanks.
@Hemo2YoYo · 1 month ago
@@bcti-bcti i shared the video to your email.
@bimanroy8865 · 7 months ago
Do your input Excel files have tables named after regions?
@bcti-bcti · 7 months ago
Yes; each Excel file contains a table that was named according to the state the data relates to.
@Hemo2YoYo · 1 month ago
What if my files are XLS or XLSX at 8:06? What will the expression be?
@bcti-bcti · 1 month ago
If you look at 12:08 in the video, I show how to use the strategy on Excel files using the Excel.Workbook() function. Thanks for watching.
@Hemo2YoYo · 1 month ago
@@bcti-bcti thanks so much
@YPartbee · 6 months ago
Nice 😃👍
@bcti-bcti · 6 months ago
Thanks for taking the time to watch.
@anz672 · 6 months ago
Powerful
@bcti-bcti · 6 months ago
Thanks for watching.
@sharmarudra · 4 months ago
How to retain the file names?
@bcti-bcti · 4 months ago
Can you provide additional information regarding your question? I am uncertain as to what you are asking. Thank you.
@sanchowitfurrows1561 · 22 days ago
At 5:52 you select the Content column first, just as in the video, then hold the Ctrl key and click the Name column to add it to the selection. Then follow the video and add the "Remove Other Columns" step.
@janesheng7113 · 1 month ago
Awesome video!
@bcti-bcti · 1 month ago
Thank you!!! That means a LOT to me.
@alexcross5916 · 2 months ago
This was unbelievably helpful! Thanks! I am however, now facing an issue...we're bringing in Ledger Accounts and due to the length of the numbers/codes used for these, it looks like PQ is literally making numbers up? Code 123.1234.123 in the Excel has been brought in as 123.1234.122999998 - has anyone else had this happen, and how do you counteract it? We can't just round as due to the .s they're coming in as text.
@bcti-bcti · 2 months ago
I am unable to replicate your issue with the example provided. Can you provide additional information? A copy of the M code would help.
@alexcross5916 · 2 months ago
@@bcti-bcti I'm honestly not sure what to share, the steps are identical to what you've done above, but this also happens when using the Combine button, so it's not related to the method. I don't think I can share actual account codes, but to show their length (which might be the issue?) they are 1234567.1234 (with various numbers) and some have subledgers which show as 1234567.12345.123 (weirdly these show correctly in PQ). The codes with only one . in them seem to be rounded away from their actual values as shown in the original excel files (which appear to be stored as text as well), and become 1234567.1234000002 (five 0s) or 1234567.1233999998 (five 9s). The data first shows after the expanded columns step to show the actual table data so there are no transforms/promoted headers changing anything, the data seems to be being read this way? Really appreciate your help/insight on this, I can't really find anything about it online but it's very frustrating to have your ledger and subledger account codes being changed upon import!
@alexcross5916 · 2 months ago
​@@bcti-bcti Seems my reply disappeared, oddly. In terms of the code, it's literally exactly what you're doing here to the letter, so I think the issue lies with the data perhaps. A text field is brought in from excel which contains the ledger codes in the format 1234567.1234 or 1234567.1234.123 (ledger and subledger). Subledgers seem to come in fine and remain correct, however the ledgers with only one . get weirdly rounded to either 1234567.1234000002 (with five 0s) or 1234567.123999998 (five 9s). In the Excel files which are being brought in the ledger just fixed at .1234, there's no numbers after that, no rounding etc. Hope that makes some more sense? In terms of clarity of the situation, the issue makes no sense! Thanks