
How to use Power Automate and the SharePoint Batch API to create thousands of list items quickly 

Paul Murana
21K views

This video demonstrates a method to use #PowerAutomate to create SharePoint list items quickly using the SharePoint $batch API. This is really useful not only to improve #microsoftflow performance but also to reduce the number of API actions consumed by your flow.
Full Details on my blog: www.tachytelic...
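For readers who want a feel for what the flow ultimately sends, here is a minimal sketch of a $batch request of the kind this method builds. The site URL, list name, boundary strings and columns below are placeholders for illustration only - the exact template is in the blog post linked above:

POST https://contoso.sharepoint.com/sites/MySite/_api/$batch
Content-Type: multipart/mixed; boundary=batch_abc123

--batch_abc123
Content-Type: multipart/mixed; boundary=changeset_def456

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

POST https://contoso.sharepoint.com/sites/MySite/_api/web/lists/getbytitle('Invoices')/items HTTP/1.1
Content-Type: application/json;odata=verbose

{"__metadata":{"type":"SP.Data.InvoicesListItem"},"Title":"Row 1","InvoiceAmount":100}

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

POST https://contoso.sharepoint.com/sites/MySite/_api/web/lists/getbytitle('Invoices')/items HTTP/1.1
Content-Type: application/json;odata=verbose

{"__metadata":{"type":"SP.Data.InvoicesListItem"},"Title":"Row 2","InvoiceAmount":200}

--changeset_def456--
--batch_abc123--

Each part inside the changeset is one item to create; the flow described in the video generates up to 1,000 of these per request.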

Published: 14 Oct 2024

Comments: 136
@matthewsoutdooradventures7540
Just an amazing solution. Three important mods/things to look out for to make it work. 1. When adding the column 'Invoice Amount' to the SP list, depending on how it is created there won't be a space in the internal name. It works with the internal name, not the display name. 2. The default limit on 'List rows present in a table' is 256 rows but can be changed in the settings under 'Pagination'. 3. When using "batchSize": 1000 it seems to miss about ~20 records per batch. Setting "batchSize": 900 works perfectly. Thank you for sharing. You should modify your profile to include your Twitter account.
@PaulieM 1 year ago
Thanks Matthew - I have added my Twitter link!
@RowDogSA 9 months ago
Thank you! I am writing 37000 records to SharePoint in 30 minutes! It was taking 4 hours previously.
@PaulieM 9 months ago
Fantastic! Well done! It is a powerful solution
@adelawilliams2260 3 years ago
You make some fantastic videos. I am new to Power Apps and Power Automate and can’t tell you how much I appreciate your help!
@PaulieM 3 years ago
Thank you Adela, I really appreciate the kind words. Getting feedback like this makes creating the videos so much more worthwhile.
@juanpierrejesusgonzales933 3 years ago
Hi Paulie, all the work you do seems incredible to me; you literally see things where no one else does. I have a question: how would you adapt the flow to update hundreds of records instead of creating them?
@tylerkolota 2 years ago
Hi Juan, I've looked around and haven't found a bulk-update SharePoint set-up yet. However, I recently implemented a work-around that keeps over 100k records up to date. I first use SharePoint Get items with 100,000 pagination to get all the rows in the list as they currently are. I get the many items to update from an external site, and I use a Filter array to get just the ones I need to update in SharePoint. I use Paulie's batch create method to create new rows for all the record updates (yes, there are now duplicate rows with the updated data). Then, if the batch create is successful, I use a Select action to generate a unique key from a single column or a combination of columns from the Filter array action. Let's say the unique key is just email addresses. Then I use another Filter array with the SharePoint Get items value in the From and the filter set to: string(Select) contains item()?['Email']. That filters the SharePoint Get items down to just the old versions of the records just updated/duplicated in the batch create. Now I can use the results of that second Filter array in Paulie's batch delete set-up. By creating new versions of the records and deleting the old versions, it replicates an update.
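For anyone wanting to copy this, a minimal sketch of that second Filter array in advanced mode, assuming the actions are named 'Get_items' and 'Select' and the key column is 'Email' (all names are illustrative):

From: @body('Get_items')?['value']
Condition (advanced mode): @contains(string(body('Select')), item()?['Email'])

Anything that passes the filter is an old copy of a row that has just been re-created, so it can be fed straight into the batch delete.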
@beeboitan5422 1 year ago
Big help for my work scenario! I adjusted my batch size to 800 as I was getting timeout errors with 900 and 1000. Thanks so much for your video!
@PaulieM 1 year ago
Glad it helped!
@robinroy3300 2 years ago
Hi Paulie, great video. Can you suggest how we can use this method to update person or group columns as well? Thanks.
@manjunathpatil6785 1 year ago
Hi, I have the same issue. Did you find any solution to create person or group columns? Please guide me; expecting your response, please!
@robinroy3300 1 year ago
@@manjunathpatil6785 For updating a person or group column you will need the ID of the user (this is specific to the site collection). The email ID is not enough.
@DamoBird365 3 years ago
Super efficient! I fancy giving that a go.
@mdhidayat5706 4 months ago
Hi Paul, thanks for this extremely useful video! Would the approach be similar if I were to copy from one SharePoint list to another list?
@saullazzarini4507 2 months ago
Hi Paul, amazing video! Can this method be used to copy entire SharePoint libraries from one site collection to another? Thank you!
@PaulieM 2 months ago
Yes, but it would be more complex as you'd have to consider the file sizes of the documents in the library - personally I'd be inclined to use a PowerShell script.
@xeyrruken 2 years ago
Hey Paulie, thanks a lot for the videos and information. I saw below that you responded about the batch update. Any luck on that one?
@nikhilvalaboju1492 1 month ago
Hey Paul, thanks for this video, it helps so much. But there's a small question: my batch operation is missing 1500 records every single time for a 30k item upload to a SharePoint list. Is there a way to identify the failed or missed batches and retry them in this same flow?
@PaulieM 1 month ago
Yes, you can examine the output from the SharePoint batch operation, but it's not straightforward to do. I've got an enhanced method coming.
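Until that enhanced method arrives, one rough way to spot failures (a sketch, not from the video - the HTTP action name 'Send_Batch' is an assumption) is to split the batch response into its parts and keep only the ones that report a 4xx or 5xx status, e.g. with a Filter array:

From: @split(string(body('Send_Batch')), '--batchresponse')
Condition (advanced mode): @or(contains(item(), 'HTTP/1.1 4'), contains(item(), 'HTTP/1.1 5'))

The length of the filtered output tells you how many requests in the batch failed, and each surviving part contains the matching error message.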
@davidboorman5435 1 year ago
Hi Paulie, love your vids and appreciate the instruction. I want to do almost exactly what you are demonstrating, only I have a pipe-delimited text file. I understand how to split and modify the values, but putting those values back into an array and generating the batches is a struggle for me. Every instructional guide I watch/read always uses Excel. Can you provide an example of a text file with delimited rows and using the SharePoint batch API? Much appreciated!!
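Not covered in the video, but one possible sketch for a pipe-delimited file: read the file, split it into lines, then use a Select to turn each line into an object before it goes into the GenerateSPData/batch steps. Action and column names here are assumptions, and depending on the connector you may need string(...) or base64ToString(...) around the file content:

Lines (Compose): @skip(split(string(body('Get_file_content')), decodeUriComponent('%0D%0A')), 1)
Select:
  From: @outputs('Lines')
  Map:
    Title: @trim(split(item(), '|')[0])
    Invoice Amount: @trim(split(item(), '|')[1])

The skip(..., 1) drops the header row; from there the rest of the flow is the same as the Excel version.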
@vinaydesai6917 10 months ago
Hi Paulie, thank you for creating such an insightful video, but is there a way of doing the same but with Dataverse as the data source? If so, would you plan on making a video that covers that topic in the future?
@PaulieM 10 months ago
Yes, I’ve done it (to be honest I thought I had done a video already, but seems not). It’s very similar and easy to do. I will try to do one soon.
@bobleach 2 years ago
Hi Paulie, Thanks for this, your videos are very helpful! I've managed to get this working, but could really do with the bulk Update SharePoint items - Are you still planning on creating a video for this? Keep up the great work! Thanks Bob
@dankirchner4460 1 year ago
This is fantastic, sir. Thank you for sharing. In the spirit of learning this I am building my flow step by step with your video. My only question is in the batch template step (5:48 into the video) you have what appear to be two Compose steps... the site address and the list name. How did you do that? I've tried to figure it out myself but have not been able to. Any help would be greatly appreciated!
@PaulieM 1 year ago
Those are just the outputs of the settings Compose action. So the expression would be something like outputs('settings')['siteAddress'] or outputs('settings')['listName'].
@dankirchner4460 1 year ago
@@PaulieM Thanks! I'll give it another look shortly. Appreciate the reply!
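In other words, the 'settings' action is just a Compose holding a small JSON object that the rest of the flow references - a minimal sketch (the values are placeholders):

settings (Compose):
{
  "siteAddress": "https://contoso.sharepoint.com/sites/MySite/",
  "listName": "My List",
  "batchSize": 1000
}

Referenced later with expressions such as outputs('settings')['siteAddress'] and outputs('settings')['listName'], which is what appears as those two "compose steps" in the batch template.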
@ircheema 1 year ago
Hi Paulie, thanks for a great tutorial. I have an additional question regarding the last step, where the response from the batch request is converted to a base64 string. Is there any way we can convert this response to JSON? (In the case of a GET request.)
@GuilhermeAlves-zq2gc 1 year ago
Hi Paulie! I did exactly as you taught in the video. The flow is running without errors, however it doesn't copy all the data to the SharePoint list. My Excel list has approximately 10000 items, and this flow copies just 1024. Can you tell me what could possibly be going on? Thank you!
@deborahjoyous 3 months ago
I have used your batch delete and it works perfectly! I have a flow that executes the batch delete, then moves on to the batch upload as described in your video and blog. It fails at the GenerateSPData, with the Error Details: "The action 'GenerateSPData' was executed for '1665' iterations resulting in an aggregated result size that exceeded the maximum value '209715200' bytes allowed. Please reduce the number of iterations to ensure that the aggregated results size is less than '209715200' bytes." I have 1,600 records in my Excel file that I am uploading. Where do I adjust this?
@zerokhan858 2 months ago
thanks paul! extremely useful and helpful
@saharhussein7043 2 years ago
My flow runs without errors but does not update my SharePoint list. I amended the Settings and GenerateSPData sections; was there somewhere else that I needed to modify?
@PritishTogare 6 months ago
Hi Paul. Great content, although a bit difficult for beginners to follow. I followed your steps to a T, but out of the 15000 records from the mock-up Excel data, my flow only loaded 1024 items for some reason. Any suggestions?
@PritishTogare 6 months ago
Update: I increased the threshold limit to 5000 and it loaded 5120 items, but it still didn't go through the entire Excel file of 15000 records.
@andrepanm 2 years ago
Hi Paulie, I'm from Brazil. Firstly, thank you very much for your channel and blog; it really helped me a lot with the problem of removing a lot of items through your API - I copied it and it ran very well. Now I'm trying to apply this API of yours to copy items from Excel, but it runs in 10 seconds and nothing is done. Do you have any idea what it could be?
@andrepanm 2 years ago
I was able to solve it... the problem was a column name. But my Excel has 2188 items, and the API copied only 1032 items (API without threads) and only 1023 items (API with threads)... I don't know what happened.
@juansepowerplatform 3 years ago
Thank you for sharing so much knowledge
@CMo_w 6 months ago
What is the best way to compile non-Excel file records to prep the value(s) for batch upload? i.e. using a variables array that is JSON-parsed.
@jeevithavaranasi 1 month ago
I want to do the same thing, but instead of extracting rows from Excel I want to do it from a Power BI dataset. I have included the "Run a query against a dataset" and "Parse JSON" actions and passed the body of the JSON output into the GenerateSPData step, but it's not working. Can you please let me know how to do it with a Power BI dataset?
@ajssoarestube 3 years ago
Tks Mr Paul!
@manjunathpatil6785 1 year ago
@Paulie M This solution is working great!! Thank you so much!! But I have one problem here: I am not able to create person or group columns from Excel using this batch method. Can you please guide me on this? If there is any documentation on this, please provide it. Appreciate your response!
@manjunathpatil6785 1 year ago
Any response would be appreciated thank you 🙏
@sanil-edwin 3 years ago
I have a rather wide table with 70 columns and a few thousand rows of data. I keep getting this error - The action 'GenerateSPData' was executed for '4096' iterations resulting in an aggregated result size that exceeded the maximum value '209715200' bytes allowed. Please reduce the number of iterations to ensure that the aggregated results size is less than '209715200' bytes. Any idea why this might be? In the above case there were 4096 records I tried to put through. Is there a requirement for some premium license to use the batch API?
@michaelmunro3037 2 years ago
I'm having this issue as well. My limit is 1048576 bytes, so 1 mb. I need to know how to up this limit.
@rafabrener2000 2 years ago
I am also having this issue. Can you please give us a tip on how to solve it, Paulie?
@ubezmusic 2 years ago
Pretty sure this is Flow's internal size limit for variables. I've run into something similar when storing base64 images in strings; the only way around it I found was smaller batches.
@cipriandia8409 2 years ago
Hi Paulie, amazing video!!! Thank you so much for providing the code!!!! I've managed to test it successfully on lists that do not have People columns, however it fails on lists that have People columns. The error is "A 'PrimitiveValue' node with non-null value was found when trying to read the value of a navigation property;" which I suspect is because the Person column is not defined properly under the GenerateSPData step. Do you know what the correct format would be for a People column? The Excel column for the People column is an email address. I just assumed SharePoint would read the email and create the person.
@reggieoverton121 1 year ago
I ran into the same issue. I wasn't able to get it to work by setting, for example, the Approver field to the person I wanted. However, I did find out that all people fields have a hidden field that you can use. You just have to add "Id" to the end of the field name. In my example I would use ApproverId and set that field to the user's SharePoint user ID. The ID will be an integer.
@Alm3n 1 year ago
@@reggieoverton121 You Sir, are a hero! Been trying for hours to get this going and this fixed it. Thank you!
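To make that tip concrete, a hedged sketch of the per-row JSON that GenerateSPData would emit for a person column named 'Approver' (the list type name, column and value are illustrative; the number must be the user's ID in that site collection, which you can look up via /_api/web/siteusers or /_api/web/ensureuser):

{
  "__metadata": { "type": "SP.Data.MyListListItem" },
  "Title": "Row 1",
  "ApproverId": 42
}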
@saqnevermind 1 year ago
Hi Paulie, what will happen if I increase the batch size? Will it be faster?
@mchandramohan6833 1 month ago
Implemented the same and the flow succeeded, but items were not created. Any idea what it could be?
@divyasree1792 7 months ago
Hi Paul, not all the records are getting created; the response has the error "operation timed out". Even if I reduce the batch size to 500 the same issue occurs. Are there any other settings that have to be changed?
@Monchou2 3 years ago
Hi from Spain, Paulie! First of all, a million thanks for this tutorial. I've been stuck for a long time with this problem and now I have a way forward. Here is my question: what if my Excel file has 95,000 rows? The "List rows present in a table" threshold only allows 5,000 rows per flow execution. Do I have to split my Excel file into 5,000-row Excel files? Thanks and sorry for my English. Regards ;)
@PaulieM 3 years ago
If you go into the settings of your Get Rows action you should be able to switch "Pagination" on, which will give you more than 5,000 rows.
@Monchou2 3 years ago
@@PaulieM Thanks Paul. If you don't mind, I have another question: I tested your flow to upload 10,000 rows and 9,996 rows were uploaded. There are 4 remaining rows that have not been uploaded and I don't know why. How can I find out? Thanks again.
@SprayPaint001 1 year ago
Hi Paulie, thanks for the video. When I execute the flow it works, but it only creates 5120 items (from 7k). What could be the problem? Thanks.
@caraziegel7652 1 year ago
FYI, we used the copied code from your blog - there is still a comment on one of the steps about deleting.
@laketomsloan 3 years ago
Thank you so much! This is incredible
@joanasousa6233 4 months ago
thank you so much that helped me a lot
@michaellussem1777 9 months ago
I'll trust the automatic translation function here: a really very promising approach. The only thing that dampens the joy is that after exactly 1,024 created items the flow finishes without any error. This behaviour has already been reported in the comments on your blog, unfortunately without a solution. And "no", it is not the "Pagination" setting; that value is unchanged at 1000. The Excel list (data source) contains 19,000 items. I would be very happy to get a hint towards a solution here.
@PaulieM 9 months ago
The error messages are contained in the response from SharePoint. You can extract them from there (there is an error for each request in the batch).
@shaylen.naidoo 3 years ago
Hi Paulie, I'm struggling to insert a value into a people picker field. I only have the person's display name and email address. Is there a way I can achieve this?
@ajambo2010 3 years ago
Hi Paulie, with the right IF statement in the Apply to each, could the number of threads be increased, to say 10, therefore making it even faster? Or is there some sort of limit to the process?
@PaulieM 3 years ago
You could break it down further (I have done this in my get more than 5,000 items video). I split it two ways just for simplicity - but it could be done for sure
@PaulieM 3 years ago
I'm going to see if I can make a simple 8 threaded version. I think that would be a nice balance between performance and API actions.
@PaulieM 3 years ago
I just tried it with 8 threads instead of 2; it was marginally faster - 43 seconds to create 1,000 rows instead of 58. I think this is because SharePoint queues batch requests.
@PaulieM 3 years ago
I've got it down to 35 seconds now to create 1,000 records. So that is a good improvement.
@ajambo2010 3 years ago
@@PaulieM Hi, I am now increasing the number of items being uploaded to 11,000; it worked fine with 4,000. But the Get items from Excel keeps stopping at 5120 items. I'm guessing I now need to incorporate your 'get more than 5,000 items' video solution into this - is that correct?
@electricalinspectorate6199 9 months ago
Sir, at 5:48, how do we get that template of the batch API into that Compose?
@TrailersYT 2 years ago
Hi Paulie, is there any possible way to have a column with the number of the item? So if I have 5000 items, for example, every row has a column 'RowID' with the value of the row number: item 1 RowID = 1, item 2 RowID = 2, ... item 5000 RowID = 5000. The following option is also possible: items 1 to 1000 have a RowID of 1, items 1001 to 2000 have a RowID of 2, items 2001 to 3000 have a RowID of 3. I have tried some things myself, but I can't figure it out. Would really appreciate your help!
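One possible sketch (not something shown in the video): drive a Select from a range() of indexes rather than from the rows themselves, so each output row can carry its own number. Action and column names are assumptions:

Select:
  From: @range(0, length(body('List_rows_present_in_a_table')?['value']))
  Map:
    RowID: @add(item(), 1)
    Title: @body('List_rows_present_in_a_table')?['value'][item()]?['Title']

For the second variant (a RowID per block of 1,000), @add(div(item(), 1000), 1) would give 1 for items 0-999, 2 for items 1000-1999, and so on.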
@ivoferrari9669 1 year ago
Hello! I'm having this error when sending the batch data to SharePoint: value: "The model could not resolve a type named 'SP.Data.Proyectos_x0020_Asignaciones.ListItem'. When a model is available, each type name must resolve to a valid type." What could the reason be? Thanks!
@bertoso 3 years ago
Is it possible to create a flow extracting data from a Power BI table to SharePoint with more than 1000 rows of the table? I can do it, but the flow only exports up to 1000 rows from the Power BI table to SharePoint and I didn't find anything to help me with that.
@TrailersYT 2 years ago
Hey Paulie, I still can't figure it out; are you able to help me?
@mariolopez-od7pr 2 years ago
Hi Paulie, newbie here... sometimes I get a 400 error in the SendBatch action, but the flow is successful. Is there a way for me to create an action to filter those errors, so that at least I know some items weren't created in the SP list, or at least identify that there is an error in the HTTP request? Thank you!
@GSMonkey 1 year ago
Hi, I am quite new to this, so I should start with something more simple :) But the "Create SharePoint Items" step, where do I find it? I tried searching for this action, but I can't find it. I am thinking maybe you renamed it, but what action should I be using then?
@reggieoverton121 1 year ago
That is actually a step called Scope. All you need to do is copy the scope code, add a new step, go to 'My clipboard', and Ctrl+V (paste) what you have copied. It will not show up automatically, so you have to use Ctrl+V to do it.
@Y0urN3m3sis 1 year ago
I'm utterly amazed by how speedy it is. A few seconds, and I have 1000 records in a SharePoint list. I encountered multiple errors and managed to resolve them all. Now I'm trying to run a load test by uploading 600,000 records to see how it fares. But to my dismay, the flow succeeds and only uploads 1024 records. What could possibly be the reason?
@PaulieM 1 year ago
Have you tried using a smaller batch size? If you have a lot of columns you could be going over the maximum request size. Try dropping the batch size down to 500.
@Y0urN3m3sis 1 year ago
I'm trying your threaded approach now on a different list. I reduced the batch size to 250 while keeping the pagination limit at 1000; I feel like I'm so close to cracking this, but it still ends up with 1024 rows. Any idea what I'm doing wrong? I got the error "Request to Azure Resource Manager failed with error: '{"error":{"code":"ActionRepetitionNotFound","message":"The specified repetition '000005-000000' of workflow action 'Results' is not found."}}'."
@reggieoverton121 1 year ago
Has anyone figured out how to populate multi-select fields in a SharePoint list using this method? I have tried just about everything and can't get it to work. I get this error: Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"A node of type 'StartArray' was read from the JSON reader when trying to read a value of a property; however, a 'PrimitiveValue' or 'StartObject' node was expected."
@ajambo2010 3 years ago
Hi Paulie, how would you recommend handling blank/empty/null fields in Excel (particularly date and number fields)? If there is a null in any field then the entire row is not added to SharePoint. The only solution I have come across is to have a Condition and, if the field is blank, a separate batch POST for that row/item where that field is not included in the batch metadata. I would end up having many conditions to cater for all the columns that could have a blank. Is there a better way?
@ajambo2010 3 years ago
Hi Paulie, I think I have answered my own question. I used an "if" expression along with 'null' in the expression and it worked. I'm sure I tried it before and it didn't work, but there you go. Maybe it was the use of the Select that helped.
@PaulieM 3 years ago
@@ajambo2010 well done!
@arjaybernardo9501 2 years ago
@@ajambo2010 Hi, would you mind sharing in which part you used the if-null formula? Is it in the 'GenerateSPData' part, and do you use it for all columns?
@ajambo2010 2 years ago
@@arjaybernardo9501 Sorry for the belated reply; I had to find where I did that. I put the IF statement in the right-hand column of the GenerateSPData action. In GenerateSPData the left column is the column name/title, e.g. "Material", and the right-hand column is the value for that column, e.g. "if(empty(item()?['Material']), null, item()?['Material'])". I did this for each row in the GenerateSPData action.
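For anyone else hitting empty cells, a minimal sketch of that GenerateSPData mapping with ajambo's expression (the column names are just examples):

"Material": @if(empty(item()?['Material']), null, item()?['Material'])
"Due Date": @if(empty(item()?['Due Date']), null, item()?['Due Date'])

According to the thread above, that was enough to stop rows with blank date or number cells from being rejected.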
@brukutoo 2 years ago
I copied your flow and pasted it as is. I ran into an error in the results ("The template language expression cannot be evaluated because property content cannot be selected. Property selection is not supported on values string"). How can I fix it? Thanks a lot.
@irisbalahay9260 2 years ago
Same concern. I noticed that the batch template from the video is different. I cannot replicate it because he did not show the syntax of the expressions.
@LionMs-w7n 7 months ago
Hi Paul, I followed your instructions and found an error: Microsoft.SharePoint.Client.InvalidClientQueryException - "A type named 'SP.Data.TestBatchUploadListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type." Could you please help me find the issue? Thanks!!
@Asenay999 5 months ago
Hi. I got the same error, did you find the solution?
@stefanlunneborg3249 2 years ago
A note: I tested this on a list named 'test' and it didn't work. I had to write "listName": "Test" in the settings. It seems like the list name must start with a capital letter even though the actual list name is 'test'.
@ajambo2010 3 years ago
Hi Paulie, how would you apply this to updating an item, i.e. PATCH? I'm stuck on how to do the REPLACE to update the batch with the ID. Thanks.
@PaulieM 3 years ago
I intend to do a video/blog post on performing updates, I just haven’t had time to produce it yet. Coming soon!
@reggieoverton121 1 year ago
@@PaulieM have you done an updating video yet?
@rachitaarora1468 2 years ago
Hi Paulie, I am trying to implement the same for 50,000 plus records, but this flow only works for 1000 rows. Is there any workaround for that?
@PaulieM 2 years ago
Hi Rachita, the batch operation only accepts 1,000 records at a time, but you could process 50,000 records by looping around 50 times. Have you tried that?
@GuilhermeAlves-zq2gc 1 year ago
Hello Rachita, I have this same problem here (10K items)! Did you manage to make the loop run more times?
@JustinBouchardw 5 months ago
@@GuilhermeAlves-zq2gc I just ran into this, but the default 'Do until' loop count is 60. I bumped mine up to 1000 and it should handle my 90K list.
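Pulling the replies together, a hedged outline of that loop (all action and variable names are assumptions): slice the Select output with skip()/take() inside a Do until, and raise the loop's iteration limit (default 60) under its 'Change limits' settings.

Initialize variable skipCount (Integer) = 0
Do until: @greaterOrEquals(variables('skipCount'), length(body('GenerateSPData')))
  thisBatch (Compose): @take(skip(body('GenerateSPData'), variables('skipCount')), 1000)
  ...build and send the $batch request from outputs('thisBatch')...
  Increment variable skipCount by 1000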
@jonneybravo 3 years ago
cheers 👍
@HARMONICACOVER 1 year ago
Need help with sending data to a person-type column using the batch method... can anybody help me?
@adelawilliams2260 3 years ago
Do you by any chance have a workaround for when you need to take a CSV file, parse it, then use the result to create list items? I have been trying for a week to figure this out. I have successfully parsed the field and can get SharePoint to populate, but when I map to SharePoint, if a date field is empty, it will not complete the flow, citing an empty string. I have tried to use an if expression in the map to determine if it is null but I can't get it to work. I want to figure this part out and then do a bulk upload.
@PaulieM 3 years ago
I do - is it a required field or not?
@adelawilliams2260 3 years ago
@@PaulieM It is not a required field. It is a Closed_Dt field, so it will often be null.
@PaulieM 3 years ago
@@adelawilliams2260 I do know a workaround but it's tricky to explain - do you want to do a quick sharing session? If so, can I record it and put it on YouTube? 😆
@adelawilliams2260 3 years ago
@@PaulieM Would it be OK if I sent you screenshots of all my steps? I cannot share my work laptop screen because it is on a VPN. I do not have a personal Power Platform subscription I can use on my personal laptop.
@PaulieM 3 years ago
@Adela Williams Basically this is what I think you should do: in the Select action that generates the data, wrap the date field name and field value in an if expression, so that if the date is empty, the field won't even be included in the JSON. Do you know what I mean? If you want to screenshare, I can share *my* screen with you to show you what I mean.
@JesusJimenezALJP 2 years ago
Hello Paul, I am struggling a bit. It runs fine but it doesn't upload anything. I get an error in the batch data; it alternates between exceeding minimum bytes and an error in the body. Looking forward to your feedback T_T
@saharhussein7043 2 years ago
Have you been able to fix this? If so, how did you do it?
@yilu3219 2 years ago
Can you check my flow for me? I copied the same template and am using the same input and output you are using.
@HappyLife-um2tn 1 year ago
Why the -1 on the loop control?
@urveshghelani1547 2 years ago
I get the error "Index was outside the bounds of the array."
@MaksimU_W 1 year ago
Hi Paulie, thank you for the instructions. Somehow I'm struggling to adjust your flow for a Power BI export; the values after "GenerateSPData" are just null.
{
  "inputs": {
    "from": "@body('Run_a_query_against_a_dataset')?['firstTableRows']",
    "select": {
      "__metadata": "@json(concat('{\"type\":\"SP.Data.', replace(outputs('settings_2')['listName'], ' ', '_x0020_'), 'ListItem\"}'))",
      "Title": "@item()?['Title']",
      "Report": "@item()?['Date - Kopie']",
      "OData__x0032__x002e__x0020_Region": "@item()?['Region.Region Level 01 - Kopie']",
      "Plant": "@item()?['Plant']",
      "SupplierID": "@item()?['Supplier']",
      "SupplierName": "@item()?['Vendor.Vendor Level 01 - Kopie']",
      "OData__x0032__x002e_1MPPM_x002d_8D": "@item()?['1M PPM (8D) - EXPORT']",
      "OData__x0031_MPPMno8D": "@item()?['1M PPM (no 8D) - EXPORT']",
      "OData__x0032__x002e_RejectedItems": "@item()?['1M Rejected Items']",
      "OData__x0032__x002e_1MReceivedItems": "@item()?['1M Received Items']",
    }
  },
  "metadata": {
    "operationMetadataId": "db350097-7581-4c11-88c4-e75bd431af09"
  }
}
@jreyesvega 2 years ago
I tried but I cannot make it work. I am new to this.
@yilu3219 2 years ago
It does not work: the run is successful, but there is no output in the SharePoint list.
@wolenbergsilva 2 years ago
Hi Paulie! Really very nice video!! It helped me a lot here with Automate performance, but I'm stuck on one step. In the _metadata column I did not change anything, like you said, but after the flow runs it shows me this error:
"body": {
  "error": {
    "code": "-1, Microsoft.SharePoint.Client.InvalidClientQueryException",
    "message": {
      "lang": "en-US",
      "value": "A type named 'SP.Data.EnvioLembrete_AgrupadoListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type."
    }
  }
},
"statusCode": 400
}
What did I do wrong? lol... I don't understand the "SP.Data.Mylist" part. If you can help me with this, I'll be really grateful.
@Johanvanderschelling 2 years ago
I have the same problem, Wolenberg. Did you manage to solve this? And if so, how did you do that? Thanks.
@wolenbergsilva 2 years ago
@@Johanvanderschelling I resolved it by just changing the name of my list!! Some characters like an underscore or a blank space force the metadata to rename the list with valid characters, so I changed the name of the list to one without spaces or "_" and the batch runs OK. Hope that helps you!
@Johanvanderschelling 2 years ago
@@wolenbergsilva it worked, thanks!
@WesAspinall 1 year ago
@@wolenbergsilva This works, thank you!
@kulski666 2 years ago
Great work Paulie - your material is really helping a lot. I nearly got the insert via batch running, but I stumble over errors generated by the POST, which show up in the results. I double-checked the settings you asked us to change, but I can't find an issue. This is the error message I get for each attempted insert:
--batchresponse_f76345de-0f6a-47a9-a440-4919b061e52b
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{\"error\":{\"code\":\"-1, Microsoft.SharePoint.Client.InvalidClientQueryException\",\"message\":{\"lang\":\"en-US\",\"value\":\"A type named 'SP.Data.Disaster_Export_Service_DetailsListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type.\"}}}
What could cause this? I suspect a fault on my part, but I can't find it.
@kulski666 2 years ago
I found the issue while reading through the comments on your blog - it was the underscores in my list names.
@YonPengus 8 months ago
@@kulski666 I had the same issue. Fixed by changing to a list without any special characters or spaces in the name.
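If renaming the list is not an option, another hedged way around this error is to ask SharePoint for the exact type name instead of guessing it, and use that value in the batch template - for example (site and list name are placeholders):

GET https://contoso.sharepoint.com/sites/MySite/_api/web/lists/getbytitle('My List')?$select=ListItemEntityTypeFullName
Accept: application/json;odata=nometadata

{ "ListItemEntityTypeFullName": "SP.Data.My_x0020_ListListItem" }

Whatever string comes back is what the __metadata type in each batch item needs to be.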
@ajambo2010 3 years ago
Duplicate entries: a heads-up for anyone manually recreating the threaded version from the non-threaded version that, in addition to the if statement for the Apply to each, the 'join' expression in the batchData action needs to be updated from "join(body('Select'), decodeUriComponent('%0A'))" to "join(item(), decodeUriComponent('%0A'))".
@PaulieM 3 years ago
Thanks for that, I thought I made a mistake in the original version, but see that it is fine. Thanks for the info.