I'm a Technical Director for Microsoft and I'm looking to share videos that help people better understand how to use Microsoft services and technologies.
Note: The views and expressions on my videos do not represent those of my employer and are strictly my own.
Hi Steve, thanks for sharing the useful information. I have a question: under an Apply to each action I have an Export to File for Paginated Reports action, which is taking quite a long time; I pass a few parameter values from Dataverse and then call a Send an email action. Could you suggest how to fix the flow's performance issue?
Hi! Can you tell me, for the network interface card (NIC) attached to the Power BI private endpoint, how many IPs are assigned to it? We have created the private link for Power BI/Fabric, and this NIC is somehow consuming 18 IPs from the VNet. Can you tell me why? Thanks so much for the help!
Hello Steve, I am attempting to use Excel to create/update records in a SharePoint list. I currently have my flow functioning the way I want; however, I am stuck at the 5,000 limit, and my data will be upwards of 200 times that. I know this will run slowly, but it will work for what I need. How do I implement one of these methods into my flow? I understand that I will have to turn my flow into a solution; I'm just not sure what that will look like for my needs. Currently my flow is: Scheduled flow -> Get files (properties only) -> Apply to each -> Update file properties -> List rows present in a table -> Apply to each -> Get items -> Condition -> Delay -> Update file properties. The video I watched to build this is Reza Dorrani's "Add & Update Excel Data to SharePoint List using Power Automate | Excel Import using flow". My flow will be almost identical to his; the only difference is that I did not need all the extra steps to select choices, since my data is very clean, single-value text.
Just stumbled across this and subscribed to your channel. Great video on the topic; I have been searching for how to handle large datasets in Automate, and this will do it for me! Thanks!!!
Thanks Steve for this video. One quick question: is it possible to read table row content that continues from page 1 to page 2? My use case is below. I need to extract information in tabular format from the order confirmation PDFs we receive. Each PDF has multiple items, and each item has a code, description, vendor, and delivery date, so the table has four columns (Code, Description, Vendor, Delivery Date), with each row representing an item. The problem arises when some details for an item sit at the bottom of one page and the remaining details are on the next page, e.g. a Description that starts at the bottom of page 1 and continues on page 2, so I am unable to tag those rows that span the page break. For example, if this is the PDF:

-----some text-----
code: 101  description: this is first item  Vendor: XYZ1  delivery date: 12.01.2024
code: 102  description: this is second item  Vendor: XYZ2  delivery date: 13.01.2024
code: 103  description: this is third item
-------page 1 ends here---------
-------page 2 begins here-------
description (continuing from page 1): this is third item continuing  Vendor: XYZ3  delivery date: 14.01.2024
code: 104  description: this is fourth item  Vendor: XYZ4  delivery date: 15.01.2024
code: 105  description: this is fifth item  Vendor: XYZ5  delivery date: 16.01.2024
---------some text here---------
-----page 2 ends / pdf ends-----

The document cannot be tagged correctly using a custom model when page 1 content (a Description) continues on page 2.
For the above document, the tagged tables look like this:

Page 1:
Code | Description | Vendor | Delivery Date
101 | this is first item | XYZ1 | 11.01.2024
102 | this is second item | XYZ2 | 12.01.2024
103 | this is third item | XYZ3 | 13.01.2024

Page 2:
Code | Description | Vendor | Delivery Date
(some text continuing from page 1)
104 | this is fourth item | XYZ4 | 14.01.2024
105 | this is fifth item | XYZ5 | 15.01.2024
Thanks for the story. I'm a freshman as of now, and I've been struggling incredibly badly with speaking in front of my classes. It's to the point where I've almost passed out in class due to anxiety.
Concurrency actually defaults to 20, not 1, even with the control off. To run truly sequentially (one by one), you need to turn concurrency control ON and slide the degree-of-parallelism slider down to 1.
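The difference between the default degree and a degree of 1 can be sketched outside Power Automate as a concurrency-limited loop; everything here (the function names, the item values, the degree numbers) is illustrative, not part of the product:

```python
import asyncio

async def process(item: int, sem: asyncio.Semaphore, order: list) -> None:
    # At most the semaphore's capacity of these run at the same time.
    async with sem:
        order.append(item)
        await asyncio.sleep(0)  # stand-in for the real per-item work

async def run(items, degree: int) -> list:
    order = []
    # degree=20 mimics the default; degree=1 is truly one-by-one
    sem = asyncio.Semaphore(degree)
    await asyncio.gather(*(process(i, sem, order) for i in items))
    return order

# With degree=1, items complete strictly in sequence.
sequential = asyncio.run(run(range(5), degree=1))
```

With a higher degree, all items still finish, but the completion order is no longer guaranteed, which is the same trade-off the concurrency slider controls.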
At timestamp 10:05 you mention you will show how you are parsing out to get the skip token value, but you don’t show it. For the action ‘update skip token variable’, what expression are you using for the value field? I’ve tried a few things, but I don’t get correct results.
This is exactly the solution I need, as I looked all over YouTube and couldn't find the right video that highlights all the components you included in this one. Thanks for doing that. My only thing is that I am trying to get the data from SharePoint (over 100k rows), not Dataverse, and then pipe it to Cosmos. I get that the concept would be the same, just a different data source. Thanks much.
Around 3:50, he skips over the part of the flow called 'Remove_the_Header_Info_from_the_Power_Apps_Image_Value'. On my end, it's failing with the error "The 'start index' for function 'substring' must be equal to or greater than zero and must be less than '0' which is the length of the string." Since all I did was update the API key and endpoint within the flow and update the Power App, what's causing this?
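For context on what that step likely does (this is an assumption based on the step name, not taken from the video): a Power Apps image value is a data URI such as "data:image/png;base64,&lt;payload&gt;", and the step strips the header before the payload is decoded. If the incoming string is empty, any positive start index is past the end of a zero-length string, which matches the quoted error. A Python sketch of that stripping logic:

```python
def strip_data_uri_header(value: str) -> str:
    # Power Apps image values look like "data:image/png;base64,<payload>";
    # keep only the payload after the first comma.
    comma = value.find(",")
    if comma == -1:
        # An empty or header-less value is the failure mode described above.
        raise ValueError("no data-URI header found; is the image value empty?")
    return value[comma + 1:]

payload = strip_data_uri_header("data:image/png;base64,iVBORw0KGgo=")
```

If that is indeed what the flow step computes, the error suggests the image value arriving from the app is empty rather than a real data URI.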
So with this linking process, does it matter where the Access database is stored after migration? If it is stored on a local computer and that computer is shut down, what happens?
Thanks for sharing this with us. I was thinking that the limit was the number of rows per page, but now I completely understand this feature. Thanks again!
Thank you so much for this video. A question: once I add data in the app, it doesn't appear directly in MS Access; I need to close Access and reopen it to see the new information. Is there a way around this?
What about the reports? Once I have the data synchronized with Dataverse, I can build an application in Power Apps, but is it possible to "migrate" the Access reports? In my case I have about 180 reports in Access. How could you migrate the reports?
Hi, Steve. It's a great video. A question here: how about the opposite case? That is, if the source file is encrypted by AIP, is it possible to import it into Power BI for visualization? Thanks in advance.
Hi Steve, this is very nice and exactly what I need to do. One question, though: how do I trigger this function (HttpTrigger) from an external application? We are trying to trigger it using Autosys; is there any way to do that?
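For background on the question above: an HTTP-triggered Azure Function is just an HTTPS endpoint, so any scheduler that can issue an HTTP request (Autosys included, e.g. via a command job running a small script) can trigger it. A minimal sketch; the app name, function name, and key below are placeholders, not real values:

```python
import urllib.request

def function_url(app: str, func: str, key: str) -> str:
    # HTTP-triggered Azure Functions are reachable at
    # https://<app>.azurewebsites.net/api/<function>?code=<function key>
    return f"https://{app}.azurewebsites.net/api/{func}?code={key}"

def trigger(app: str, func: str, key: str) -> int:
    # POST with an empty body; returns the HTTP status code.
    req = urllib.request.Request(function_url(app, func, key),
                                 data=b"", method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call (placeholder names -- only works against a real function app):
# trigger("myapp", "MyHttpTrigger", "<function-key>")
```

From Autosys this would typically be wrapped in a command job that runs the script (or an equivalent curl call) on the job's schedule.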
My company has given us some access to Dataverse, but it is not fully functional, which is a huge frustration: they want to move all our files to the cloud but have left us a bit high and dry with our Access databases. They insist on using SharePoint lists instead.
How do I obtain the licenses? I've been stuck and lost trying to get the correct licenses to do this. Please help; I'm so lost and confused by the process.
I need help deciding: do you recommend that I fix my Canon EOS M50, or should I buy an Elgato webcam? I heard there are webcams, like Logitech's, that can remove the background without using a chroma key, but I want a very clear image because I want to use it to stream gaming 🙏
Hi Steve. Great stuff. I wanted to know if you could help with how this would work with SharePoint. I have a flow that creates/updates SharePoint items from an Excel file stored on SharePoint. The data will of course go beyond 5,000 rows in Excel, so evidently at some point I have to worry about paging. Could you please create a video, or otherwise help, so that creating/updating SharePoint list items for newly added and existing rows (no matter the number of rows) is possible in Automate? P.S. I'm an O365 user.
Steve, thank you for sharing this walkthrough. The current limitations (which haven't changed since GA, as far as I can tell) are pretty severe and basically make this a non-starter for most enterprise use cases: no access to other datasets/dataflows in the Service, no email subscriptions, no telemetry, no file exports. I have to tell my client today that we need to abandon our plan to use this feature. Are there any plans to fix this, and if so, what is the timeline?
Hi Steve, thanks for your video! While your flow works on my data, there's still something I'm not totally clear about: when discussing the "Do until" loop at 10m02s, you said "as long as the next URL property is not null, we loop." That makes sense for a "do while" loop, but not in this case, since it's a "Do until": I would expect to run the loop UNTIL the next-link variable becomes null, meaning the NextLink property no longer exists. Why does it work in that case?
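The point this comment raises can be illustrated outside Power Automate: a "Do until" runs its body first and exits once its condition becomes true, so a condition of "NextLink is null" behaves exactly like the informal description "loop while NextLink is not null." A minimal sketch, assuming a hypothetical paged source (fetch_page, the page names, and the row values are all made up for illustration):

```python
def fetch_page(next_link):
    # Hypothetical paged source: three pages, then no nextLink.
    pages = {
        None: ([1, 2], "page2"),
        "page2": ([3, 4], "page3"),
        "page3": ([5], None),
    }
    return pages[next_link]

rows, next_link = [], None
while True:                  # "Do until": the body runs at least once...
    page, next_link = fetch_page(next_link)
    rows.extend(page)
    if next_link is None:    # ...and exits when the condition
        break                #    (NextLink is null) becomes TRUE
# rows == [1, 2, 3, 4, 5]
```

Both phrasings describe the same loop; "Do until X" and "do while not X" are complements, which is why the video's wording still produces correct behavior.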