
Integrating Azure ML and Power BI 

Kevin Feasel

In this video, I show off how easy it is to integrate Azure ML and Power BI, at least once you get past all of the trouble trying to integrate them.
LINKS AND INFO
Catallaxy Services -- www.catallaxyservices.com
GitHub repo for Azure ML and Power BI integration -- github.com/feaselkl/AzureML-L...
GitHub Repo for Practical MLOps with Azure ML -- github.com/feaselkl/Practical...
Tutorial: Consume Azure Machine Learning models in Power BI -- learn.microsoft.com/en-us/pow...
Azure Machine Learning integration in Power BI -- learn.microsoft.com/en-us/pow...
Connect to AI Insights in Power BI -- learn.microsoft.com/en-us/pow...
Azure Machine Learning model doesn't show up in Power BI Pro Desktop -- community.fabric.microsoft.co...
Creating a Power BI compatible endpoint -- learn.microsoft.com/en-us/azu...
InferenceSchema Python package -- github.com/Azure/InferenceSchema
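The "Power BI compatible endpoint" link above comes down to decorating the scoring script with InferenceSchema so the deployed endpoint publishes a swagger schema that Power BI can read. Below is a minimal sketch of such an entry script; the model file, feature names, and sample values are hypothetical, and the real script must match your actual model's signature:

```python
# score.py -- entry script for an Azure ML real-time endpoint.
# The input_schema/output_schema decorators generate the swagger
# document that Power BI uses to discover the model's signature.
import joblib
import numpy as np
import pandas as pd
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType

# Hypothetical sample rows describing the expected input and output shapes.
input_sample = pd.DataFrame({"feature1": [1.0], "feature2": [2.0]})
output_sample = np.array([0.0])


def init():
    # Azure ML calls init() once when the container starts.
    global model
    model = joblib.load("model.pkl")  # path depends on your deployment


@input_schema("data", PandasParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
    # Azure ML calls run() per request; data arrives as a DataFrame
    # matching input_sample's columns.
    return model.predict(data).tolist()
```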

Science

Published: 4 Dec 2023

Comments: 15
@datascienceandaiconcepts5435 · 6 months ago
nice
@user-lz8wv7rp1o · 6 months ago
great great great
@PAwader · 3 months ago
Great tutorial. I did notice that if I try to run an Azure ML endpoint with a good number of rows, around 100k, I get a 502 Bad Gateway response. However, if I limit my data set to under 100k rows, the response returns successfully. I do not want to limit the data in my tables. Is there a way to limit the number of rows that get sent to the ML endpoint, or another workaround?
@KevinFeasel · 3 months ago
Unfortunately, I have bad news and slightly less bad news. The bad news is that the 502 Bad Gateway response you're getting is (most likely) because the response to the Power BI gateway is too large, above 8MB. Here's a similar issue regarding Application Insights data: stackoverflow.com/questions/41869170/query-from-powerbi-to-ai-suddenly-fails-with-502-bad-gateway

To fix this, there are three options I can see:
1. As you mentioned, reduce the amount of data you send (and receive) as part of the Power BI to Azure ML connection. That's not a desirable option for you.
2. Feed the data to Azure ML using a separate process, store the results, and then load those into Power BI. You lose the benefit of the AML-PBI direct integration and may have some delay in scoring data, but you would not run into this issue.
3. A hybrid approach: have a back-end process call your scoring endpoint and store the results someplace you can access later. Then, in Power BI, have two Power Query tables: one with the scored results and one for records you ingested after the last time your back-end scoring process ran, which you score just in time. Union the two tables together and the output looks normal again.

Some enterprising mind might know of a good fourth way, but I'm not aware of one.
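The separate-process option above can be sketched as a batching helper that keeps each request, and therefore each response, comfortably under the gateway's size limit. The endpoint URL, API key, and payload shape below are placeholders and must match whatever your scoring script actually expects:

```python
import json
import urllib.request


def chunked(rows, batch_size):
    """Yield successive batches of rows so each request stays small."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]


def score_all(rows, endpoint_url, api_key, batch_size=10_000):
    """Score rows in batches against an Azure ML online endpoint.

    endpoint_url and api_key are placeholders; tune batch_size so that
    each response body stays under the ~8MB gateway limit.
    """
    results = []
    for batch in chunked(rows, batch_size):
        body = json.dumps({"data": batch}).encode("utf-8")
        req = urllib.request.Request(
            endpoint_url,
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {api_key}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            results.extend(json.loads(resp.read()))
    return results
```

The scored results would then be written to a table that Power BI reads directly, instead of calling the endpoint from Power Query.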
@caiyu538 · 6 months ago
Could you do a video on connecting Azure ML to Tableau?
@KevinFeasel · 6 months ago
I'm not particularly familiar with Tableau, but I'll at least add this to my list to check out and see if I can create a video.
@kamelsanny6951 · 2 months ago
Hi Kevin. I tried to deploy to a web service, creating the conda file and adding azureml-inference-server-http to it. But I got the error "PackagesNotFoundError: the following packages are not available from current channels: azureml-inference-server-http". I tried to install it, but it seems it is already installed. Could you please help me with that? Thanks.
@KevinFeasel · 2 months ago
Hmm, that's interesting. I do see azureml-inference-server-http available in pip, so the package hasn't been removed. If there's a typo, that could explain things, though I didn't see any typos based on the error message you have. One thing you could try is running the container locally to see if you get any additional information. I have some instructions at github.com/feaselkl/AzureML-Local-Deployment/tree/master and a video at ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-bue85m7lbjQ.html. If you can successfully run the container locally, that would indicate an issue somewhere in Azure ML.
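For what it's worth, one common cause of that PackagesNotFoundError is listing azureml-inference-server-http among the conda dependencies: the package is published on PyPI rather than on conda channels, so conda cannot resolve it and it has to go in the pip: subsection of the environment file instead. A sketch, with the environment name and Python version purely illustrative:

```yaml
name: aml-inference-env
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pip:
      # pip-only packages go here, not in the conda dependency list
      - azureml-inference-server-http
      - azureml-defaults
```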
@muhammadtalmeez3276 · 1 month ago
Can we integrate LLMs into a Power BI dashboard to ask questions of the data? If yes, how?
@KevinFeasel · 1 month ago
As of today, the answer is kind of complicated. You can use Azure OpenAI within Power Query to enrich data: techcommunity.microsoft.com/t5/educator-developer-blog/how-to-use-azure-open-ai-to-enhance-your-data-analysis-in-power/ba-p/4041036

And there is Copilot for Power BI within Microsoft Fabric, but it requires Microsoft Fabric and at least an F64 SKU: learn.microsoft.com/en-us/power-bi/create-reports/copilot-introduction

There is at least one third-party custom component that might do what you want in AI Lens for Power BI, though there will likely be additional costs for licensing it: www.lensvisual.io/
@muhammadtalmeez3276 · 1 month ago
@KevinFeasel Thanks, Kevin, for the detailed answer. But instead of OpenAI or Copilot, I want to use an open-source model from Hugging Face. Is that feasible?
@KevinFeasel · 1 month ago
@muhammadtalmeez3276 Probably not without a significant amount of development work on your end. Just spitballing an answer, you'd probably need to host your model via API and then create a custom component to send the user prompt plus data to that API and get the response back. It's the "plus data" part that would make this a real challenge, especially if you intended to use it with sliders and to view data on charts. It's not something I've done before, though I suppose it is technically possible, just a major endeavor.
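For reference, the "host your model via API" step could start from something as small as this standard-library sketch; generate_answer is a stub standing in for the actual Hugging Face model call, and in practice you would use a proper serving framework (FastAPI, text-generation-inference, etc.) and a real inference backend:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate_answer(prompt, rows):
    # Placeholder: call your Hugging Face model here instead
    # (e.g. a transformers pipeline or a hosted inference server).
    return f"Received {len(rows)} rows for prompt: {prompt}"


class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"prompt": "...", "rows": [...]}
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        answer = generate_answer(payload.get("prompt", ""),
                                 payload.get("rows", []))
        body = json.dumps({"answer": answer}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ScoreHandler).serve_forever()
```

A Power BI custom component (or Power Query step) would then POST the user's prompt plus the relevant rows to this endpoint and render the returned answer.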
@muhammadtalmeez3276 · 1 month ago
@KevinFeasel Thank you so much, Kevin. I have been struggling with this for the last three days. Your answer is very helpful in concluding my research.
@PatrickBateman12420 · 6 months ago
funny; real time vs. online 🙂
@pmo4325 · 4 days ago
How absurd that MS can't just make this a simple one click solution!