Kent Weare is a technology enthusiast who is focused on Integration and Automation using Microsoft Technologies like Power Automate, Azure Logic Apps, Power Virtual Agents and the rest of the Power Platform. On this channel you will find practical, real-world information that you can use to drive greater productivity.
Really nice video. This service has a lot of potential and can help many people with speech issues or those looking to enhance their content delivery. It would be great if users could utilize their own voice as a speaker. I’ve been testing this with my own voice in another service, and it would be awesome to have this feature available in Azure as well.
That is a great idea. I am sure there will be more options for speakers, voices, languages etc. I have no idea if there is a similar Microsoft equivalent of this but I sure hope so!
RAG mispronunciation: I think it's because of limited training data on acronyms and the general lack of phonetic guidance in most LLMs. I don't know what the model's cutoff date is, but it could also be that RAG didn't exist (in the context of Logic Apps) back then, and it's semantically matching it with R as a language because it thinks it's a spelling mistake or something. Definitely a hallucination there.
I think the intended use case is more around BCDR. However, Service Bus has recently enabled a more robust solution for replicating messages across regions, so I would opt for that over this approach. At the time this was created, there wasn't an equivalent capability.
Hi Kent, my experience is that most people who leverage no-code/low-code tools like Logic Apps barely have workforces that deal with Arc-enabled K8s clusters running on the edge. Those who deal with edge computing and have the resources to configure and deploy a K8s cluster prefer much more advanced open-source workflow orchestration engines that are designed for edge and microservices workloads. They cannot even spell Logic Apps :) How big is the market Microsoft is targeting with this technology? I am really hoping one day we can run most Azure services, including Logic Apps, on Azure Arc bare metal without having to deploy any additional tech stack such as K8s. I know I am dreaming :)
Hi Reza - fair question. The market opportunity is there. I can't share more specifics publicly, but there is a reason why we are making the engineering investments to make this happen. I don't expect this to be as big as Azure (managed) Logic Apps, but big enough to prioritize and address the customer need.
[2024-09-13T18:59:02.912Z] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.). For detailed output, run func with --verbose flag.
[2024-09-13T18:59:08.640Z] Host lock lease acquired by instance ID '00000000000000000000000055A5DDE5'.
Why am I getting this error while running my code?
How should we approach this from a CI/CD perspective? I think most people are working in the portal and extracting, so there it's no change, but what if you are local-first?
In the future I would envision this showing up in VS Code as well. We do have an updated export to VS Code shipping imminently so I would imagine that will help in getting a template into a local project
Thanks for the detailed video Kent. I am facing an issue at the end while making the call to the Logic App. Though I am passing the token in the request to the Logic App, I am getting the following "401 Unauthorized" error: "code": "DirectApiInvalidAuthorizationScheme", "message": "The provided authorization token is not valid. The request should have a valid authorization header with 'Bearer' scheme." Is there any other setup to be done?
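For context, this is roughly how I'm building the request (a minimal Python sketch; the URL and token are placeholders, not my real values). The error message suggests the Authorization header must use the "Bearer" scheme exactly:

```python
import urllib.request

# Placeholder values: swap in your Logic App's HTTP trigger URL and a token
# issued for the Logic App's Easy Auth app registration.
logic_app_url = (
    "https://prod-00.westus.logic.azure.com/workflows/"
    "<workflow-id>/triggers/manual/paths/invoke?api-version=2016-10-01"
)
access_token = "<access-token>"

# DirectApiInvalidAuthorizationScheme typically means this header is not
# exactly "Bearer <token>" (watch for "bearer", extra quotes, or a missing space).
req = urllib.request.Request(
    logic_app_url,
    data=b'{"input": "hello"}',
    headers={
        "Authorization": "Bearer " + access_token,
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.get_header("Authorization").split(" ", 1)[0])  # prints: Bearer
# urllib.request.urlopen(req)  # uncomment to actually send the request
```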
Hi Kent, I have a Windows UI application automation flow in Power Automate for desktop. I need to convert it to run unattended from a Power Automate cloud flow using the "when an email arrives" trigger, but I'm getting a press-and-populate exception. How can I do that? Kindly reply.
Hi Kent, am I right in thinking there is a throughput limit on this connection? Will this handle large volumes (hundreds or thousands) of files arriving all at the same time?
Hi Kent - have you decided not to record the 4th episode discussing Logic Apps as Semantic Kernel plugins? That was supposed to be the chapter that seals this topic, and since I haven't seen it, I wanted to check in. Thanks
It is still coming... I was waiting on Easy Auth to be available in the portal, which makes the experience much simpler. That has now been deployed. I just need to make the video.
@@moisenach1 I may not get to it for a couple weeks as I have some other commitments. But in the meantime, Parth just posted this blog post: techcommunity.microsoft.com/t5/azure-integration-services-blog/integrate-logic-app-workflows-as-plugins-with-semantic-kernel/ba-p/4210854
Kent, can we have multiple Functions within a project and call these from Logic Apps? I could not find a way to create a new Function and it looks like the metadata created only has one entry point. Any examples out there on how to create multiple Functions to call from Logic Apps? Any help is appreciated.
I love these videos you are doing on Gen AI, Kent. One suggestion I have for you is the use of the phrase "publishing to the LLM". That phrase gives the impression that the data you present to the LLM is absorbed into the LLM, which we all know is not true. The data is just kind of sitting to the side in a vectorized format in a vector DB (in your case, Azure AI Search). There is enough misconception already about LLMs being trained on people's data :)

Also, I loved how you opened your series with some use cases of Gen AI in integration. Keep going on that ground and don't limit your series to just Logic Apps. Given your background in BizTalk, I'd love to learn from you some use cases in schema mapping, sample data generation, and that brilliant idea you briefly referred to in one of the videos about plugging Gen AI into integration pipelines to understand what's going on (instead of a ton of complex ETL). LLMs are great transformers. Let's showcase some transformation use cases rather than just always some Q&A. Thanks again for teaching us a lot of new things, Kent.
Thanks Reza - I was using the publish term as a way for people new to Gen AI, but familiar with pub-sub architectures, to grasp the concept. But fair feedback; I will avoid that term moving forward. I feel we are just scratching the surface with Integration and Gen AI, so thanks for validating the interest. I plan on doing much more in this area.
My current perspective on AI is still evolving, but I primarily view AI as a powerful automation tool. AI, in essence, requires human interaction to create meaningful outputs. As someone deeply passionate about metalworking and woodworking, I see parallels between AI's impact and the changes automation has brought to these trades. Automation has democratized access, enabling more individuals to participate in these crafts. However, it has also made it challenging to sell custom furniture, as mass-produced options from stores like IKEA are available at a fraction of the cost. Looking ahead, will we witness an increase in the value of original "real human" content while AI generates more generic content and addresses common problems? Could this shift empower true artists and innovators to achieve things they never thought possible?
Some great perspectives here. I totally agree with the democratization comment. Much like low(er) code tools have enabled more people to self-serve their needs, AI will take that a step further in many areas. For example, I have zero artistic skills but can use Gen AI to build something that would never otherwise be possible. So this has a bi-modal impact: it could affect people who specialize in those skills, but in return it also enables me to do something I never could (reasonably) do before. Kara Swisher has a great comment about AI: "How many people still churn their own butter?"
Hi Kent - perhaps I am jumping ahead too fast, but leveraging Logic Apps as plugins within Semantic Kernel is of great interest to me. I still can't tell whether I should be following the approach that uses the metadata/Swagger definition of the Logic App with Easy Auth enabled, or simply import a custom plugin with a semantic function that handles the HTTP requests. The differences between these two options are unclear. It would be nice to see a detailed overview and a demo of each, plus a summary of some of the pros and cons. I tried to follow the Microsoft documentation that involves enabling Easy Auth, and even though I succeeded, the documentation lacks many of the required steps. Thanks
@@KentWeare - thank you for responding so quickly! I'd love to get your take on the differences between the two approaches and how you see function calling / plugin invocation when Logic Apps are orchestrated as well. Looking out for that.
Doesn't work for me. We are on Standard and use Node.js 18, but this feature is not working. If I copy an action and hit the + sign, there is just Add an action / Add a parallel branch. UPDATE: you need to allow the Azure portal access to your copied content (the clipboard) in your browser.
This is true. Business Process Tracking has been decoupled (but a link can be added after). By decoupling these capabilities, you can now create an application without an ADX instance.
If anyone is watching this wondering why they can't get the private/VNet routing to work: don't use API connections, as they are executed outside of the runtime and won't benefit from the routing; use service provider connections instead.
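For anyone comparing the two in a Standard app: a service provider connection is defined in the app's connections.json and runs in-process with the runtime. A rough sketch of what one looks like (the names and the connection string app setting are illustrative, not from a real project):

```json
{
  "serviceProviderConnections": {
    "serviceBus": {
      "displayName": "sb-connection",
      "serviceProvider": {
        "id": "/serviceProviders/serviceBus"
      },
      "parameterValues": {
        "connectionString": "@appsetting('serviceBus_connectionString')"
      }
    }
  }
}
```

API connections, by contrast, live under managedApiConnections in the same file and are invoked through the shared connector infrastructure outside the runtime, which is why they don't pick up the VNet routing.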
Hey Kent, nice demo, thanks. Is it possible to use BPT with an event flow in high-frequency scenarios? I mean, let's assume we are getting lots of events from an external system, processing/enriching them across various Azure services, and publishing them to a bunch of downstream systems via Service Bus. In order to verify whether an event has been published by all downstream systems, does it make sense to use BPT? Or is it more suitable for less frequently running processes?
Throughput should not be a problem. We have a job that runs that is similar to our action completion job (for run history). Instead of writing to Azure Storage, we write to Azure Data Explorer, which is capable of high throughput as well. We have designed this solution with performance in mind, and it shouldn't impact the performance of processing Logic Apps transactions.
Hi Kent, a question: I receive some emails weekly in which images are pasted into the body of the email. How could OCR be applied in this case? I need to extract some values from those images. Could you offer some guidance? Greetings and thank you.
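To make the question concrete, here's a minimal Python sketch (stdlib only, using a stand-in message rather than a real mailbox) of the first step I'm imagining: pulling the pasted images out of the email before handing each one to an OCR service such as Azure AI Vision:

```python
from email.message import EmailMessage

# Build a stand-in for one of the weekly emails with an image in the body.
msg = EmailMessage()
msg["Subject"] = "Weekly figures"
msg.set_content("The values are in the pasted image below.")
msg.add_attachment(
    b"\x89PNG\r\n\x1a\n...",  # placeholder bytes; a real message carries the PNG data
    maintype="image",
    subtype="png",
    filename="pasted-image.png",
)

# Walk the MIME tree and collect every image part; each blob of bytes is what
# you would then send to the OCR endpoint to extract the values.
images = [
    part.get_payload(decode=True)
    for part in msg.walk()
    if part.get_content_maintype() == "image"
]
print(len(images))  # prints: 1
```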
There is..you need to have something copied first. Go to 1:54 of this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE--9DVlAUrAjk.htmlsi=5NRstpdFtcrlGNOq&t=114
@@KentWeare Hi, thank you for the reply. The case is that I see the "Paste action" in the VS Code version, but not in the portal. I do "Copy action" (right-click on an action block and click "Copy action"), then if I go to the False branch and click the "+" sign, I see only "Add an action", which opens the "add an action" panel. If I click the "+" sign between other blocks, it has two options, "Add an action" and "Add a parallel branch", but no "Paste" option.
@@KentWeare I found an answer; the problem was in the permissions. Once I cleared my browser cache and got back to the Logic App designer, the website asked for permission to access the clipboard. After I granted it, everything worked.
@@KentWeare We would like to use Active Directory OAuth as the authentication type. Even if I use a Send HTTP action to generate a token, how can I pass that token to the next Send HTTP action, as we don't see any authentication field for the HTTP action in Copilot Studio?
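To illustrate what I mean by the first action: I'd expect it to post a client-credentials request to the Entra ID token endpoint, then take access_token from the JSON response and place it in the next action's Authorization header. A rough Python sketch of that request body (all values are placeholders, not real identifiers):

```python
import urllib.parse

# Placeholder values for the app registration; not real identifiers.
tenant_id = "<tenant-id>"
client_id = "<client-id>"
client_secret = "<client-secret>"

token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Form-encoded body of the client-credentials token request. The scope is the
# target API's application ID URI (a placeholder here) plus "/.default".
body = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "api://<app-id-uri>/.default",
})

print("grant_type=client_credentials" in body)  # prints: True
# POSTing `body` to token_url returns JSON whose "access_token" value goes
# into the next HTTP action's header as: Authorization: Bearer <access_token>
```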
@Kent, thanks for sharing this video. I have one question about "Connection Name" to clarify: do we need to use the on-premises data gateway or a machine configuration for the connection? As I understand it, connecting to local file shares requires one of those. Please let me know.