Netwoven Inc
We help modernize and secure your business processes and IT systems using Microsoft technologies.
Data Science for Business with Microsoft Fabric
1:00:43
3 months ago
Planning of Microsoft 365 Copilot Rollout Strategy
1:03:10
5 months ago
Webinar: Data Observability with Microsoft Fabric
1:00:12
9 months ago
Webinar: The Roadmap to Security Modernization
57:33
10 months ago
Webinar Power BI: More Than Just Visuals
55:43
10 months ago
The Roadmap to Security Modernization
1:41
11 months ago
Microsoft Power BI - Desktop Demo
12:07
2 years ago
Microsoft Power BI - An Overview
4:46
2 years ago
Azure AD - Setup Azure AD Terms of Use
11:05
3 years ago
Comments
@crunchraven000
@crunchraven000 1 month ago
Do you know how I can activate an Azure resource role?
@richardcollins9862
@richardcollins9862 4 months ago
Excellent presentation
@georgewashington3012
@georgewashington3012 5 months ago
Are sensitivity labels truly required for Copilot for M365? Labels have their own issues (compatibility, performance, end user training, etc.). I’m hoping I can simply get by with double checking permissions everywhere prior to rollout as label deployment would be a cultural shift and an even bigger headache than deploying Copilot. Besides, what would be the benefit in deploying labels as it relates to Copilot? We’d want people to feel free to use ALL sensitivities of data with Copilot for M365. After all, one reason it’s so expensive is the robust privacy and security controls built into Copilot for M365. Restricting the types of data users can use with Copilot would only serve to frustrate users and create angry calls to the Service Desk. Same thing for DLP. I realize DLP is useful in its own right, but what relevance does it have for Copilot? It isn’t as though users will exfiltrate data to external storage services/devices/recipients to a higher degree simply because they have Copilot for M365. Employees are expected to exercise good judgement when sending data and Copilot for M365 doesn’t change that.
@netwoven
@netwoven 5 months ago
Sensitivity labels are not required to use Copilot for M365. We are recommending, however, that you take steps to secure sensitive or confidential information from being accessed and incorporated into Copilot results. Sensitivity labels are one way to do this, but there are other approaches you can take as well. Double-checking access permissions is a good idea. We would also suggest monitoring access logs for selected content, to confirm that Copilot is not inadvertently accessing content it should not see.
@sudhanshukaushik9980
@sudhanshukaushik9980 9 months ago
Wish I could give you 10 likes myself. Great stuff! Thanks!
@richardwaldron1684
@richardwaldron1684 11 months ago
Great explanation and demo, thanks for posting.
@aureliaauma2552
@aureliaauma2552 1 year ago
Kindly send me a link so that I can download Dynamics 365.
@coolbeing163
@coolbeing163 1 year ago
Thanks. Can we do some manipulation of the data while importing? E.g., if my Student ID is "XXX-BAC-StudentID-123" and so on... I want to extract ONLY StudentID-123 and discard the prefixes. Can we do that?
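A minimal sketch of the kind of prefix-stripping being asked about, shown in Python/pandas purely to illustrate the logic (in Power BI itself the same split can be done in Power Query during import); the file and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical source file with a "StudentID" column like "XXX-BAC-StudentID-123".
df = pd.read_csv("students.csv")

# Keep only the last two hyphen-separated segments, e.g. "StudentID-123".
df["StudentID"] = df["StudentID"].str.split("-").str[-2:].str.join("-")

print(df.head())
```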
@emergentform1188
@emergentform1188 1 year ago
As a developer who has been doing complex reporting for a variety of industries for 25 years, I can assure everyone of this: These incredibly weak measures offer very little in the way of reporting capacity that a company would actually need. SSRS and Power BI are great tools for sure, but the problem is that Microsoft does not allow access to cloud-hosted production data. Power BI can connect to the "data warehouse", but that warehouse is incredibly useless and doesn't even work because it's just too slow. In short, Microsoft is SCAMMING you by telling you that it's possible to use Power BI to generate reports (if your D365 database is cloud hosted by Microsoft). And the SSRS option is only available to developers with the knowledge and access to be able to customize D365, and that will cost a fortune. Microsoft has set up this scam to trick customers into believing that reporting is "easy", but then customers only discover after implementation, when it's too late, that it was all lies. If you want meaningful reporting you need direct access to the database, or a copy of the data somewhere where you can access it and write database code to do the fetching and transformations. These rinky-dink methods touted by Microsoft are like bringing a bicycle to a Formula 1 race. Microsoft knows this. They want to sell report development services so they and their partners have a permanent cash cow. This bait and switch has understandably ticked a LOT of companies off, and so Microsoft will apparently eventually have a way to push D365 to a separate Azure data lake, where it will be accessible for proper reporting, but that's going to come at a price of course; those tools aren't cheap. Sorry to break the bad news to you all. I know this because I've been dealing with the fallout of Microsoft's lies for the last 3 years, and found a way to work around their numerous roadblocks to generate useful reports. A word of advice to companies thinking of implementing D365: don't. There are far better options.
@blackfincloudservices2844
@blackfincloudservices2844 5 months ago
Ahh ... where to begin? First, with empathy. It is true that some reporting choices for Dynamics are a bit daunting and somewhat limited. SSRS is a good example - while it's certainly possible for SOME end-users to take a course to learn how to develop SSRS reports, it's just not realistic that this could be a widespread option for the end-user community. As well, SSRS against the online version of Dynamics does not support SQL queries, so FetchXML must be used, which is also not exactly what we might call intuitive. I will also admit that the performance of some reporting choices is not ideal against big data sets with this platform. In the case of large data sets it would be a good idea to report against a separate source. It's possible to use something like Azure Data Factory to periodically copy sets of data for the use of reporting - and it's not break-the-bank expensive. And while this concept would have been quite expensive in the past, involving a lot of development work, the more graphical tools such as ADF have made the integration process much simpler than it was. Having said that, there are some serious inaccuracies in the polemic above. For instance, it is absolutely possible to connect Power BI to the production version of Microsoft Dynamics Customer Engagement, now known as the Dataverse. Just open either the cloud version of Power BI or the Desktop version and click Get Data - the ways in which you can connect to that data source are well documented - and honestly, by no means rocket science. The idea of the "data warehouse" being the only thing you can connect to related to Production Dynamics - I honestly don't know what you mean by that. As you well know, Power BI can connect to many sources - a given "data warehouse" being just one. Also, as the video explains - what defines reporting? In the 17 years that I have been involved with consulting on this platform, most of the time, the most popular "reports" with many end-users are two-dimensional representations which are essentially Excel spreadsheets. And the argument that if you want meaningful reporting you need to export the data to Azure Data Lake and attack it with "meaningful" reporting tools - well, now you sound like the developer that you are. So, not sure if it's appropriate advice to end-users or business people who need to make a business decision as to whether or not a system will work for them. Moreover, it is absolutely TRUE that many end-users of Microsoft Dynamics make extensive use of the native charts and graphs against Advanced Find views to produce perfectly fine interactive dashboards that provide the ability to drill down and answer questions. There are many people who consider those to be reports. What's more, you can now take any view you have created or that the system presents and simply click "Export to Excel Online", work on that data in Excel Online, click Save, and it writes it back to the database. This is an operational advantage that many systems simply do not have. People love Excel. Sorry, but I'm not aware of many other CRM systems where this is possible. So it's not all "lies, lies, lies" from Microsoft. I spent 10 years in the open source world, and I still log into Linux on a weekly basis for some clients, but I don't find the need to bash Microsoft and call them "liars." And I have used other CRM systems - some are absolutely great and well-suited to a vertical market.
But there is a lot to like with this system - as well as the low-code and affordably-priced Power Platform, which is essentially Dynamics with specific IP stripped away and presented as a starting point to quickly develop business applications. But I digress ... back to reporting. Power BI is, of course, the choice for more sophisticated reports, and while I agree that there is a learning curve to create Power BI reports - I personally know plenty of end-users who have an analytical bent and have learned to create Power BI reports for themselves against the data in Dynamics. As well, it's pretty trivial to embed a Power BI report into Microsoft Dynamics as an embedded dashboard. Every single system has benefits and weaknesses. Microsoft Dynamics 365 has plenty of benefits, and plenty of weaknesses. This is true for all CRM systems. Don't even get me started on Salesforce -- which is madly loved by millions. At the end of the day, none of them should be immediately and completely dismissed. They should be carefully evaluated to determine for your given business whether the benefits outweigh the weaknesses. Take the time to learn, and decide for yourself, for your own business. It's worth the effort to do so.
@emergentform1188
@emergentform1188 5 months ago
@blackfincloudservices2844 Wow, I very much appreciate this long and detailed comment. Let me give you some background on where I'm coming from. 4 years ago I joined a company already using D365 F&O, cloud hosted by MS. During the installation of D365 they (the Finance staff with no real report dev experience) were told that they could connect directly to D365 data from PBI and generate their own reports. They were also told this was "easy". That's not a lie, it's 2 lies. I had determined, and confirmed twice with Microsoft support on 2 separate occasions (2 tickets), that PBI can only connect to the data warehouse (which is largely useless for serious reporting anyway) and the connection is too slow to even work anyway (except for tiny tables with no more than a few hundred records). Furthermore, even if that did work as promised (and which was even documented in Microsoft's sales materials), the end users lack the skills to do anything meaningful with it. Plus the data warehouse is far too limited for the vast majority of the reporting they wanted anyway (and I also confirmed this directly with Microsoft support - I was pretty relentless in my hounding of them about this actually and was put in direct contact with technical guys who know what's up). I also inquired and pressed our Microsoft partner, a company certified by Microsoft to do D365 installations and support, to please help me in accessing the D365 database for the purposes of reporting. They didn't know how and couldn't provide an answer either. They just sent me random BS Microsoft articles with information I had already tried and confirmed directly with Microsoft doesn't actually work. So I asked them to open a ticket with Microsoft to inquire as well, and Microsoft's answer yet again was simply: that's not possible. They said that if we were hosting the database locally then the PBI connection to the data warehouse would work, but since it's cloud hosted then not a chance. Not that the data warehouse would have been much use to us anyway, but at the time we had nothing so it would've been better than nothing at least. So as you are likely aware, this lack of data access has seriously ticked off numerous organizations all over the world. Where does Microsoft get off thinking they have the right to sequester a company's own data from them? Especially when they are being paid so much for the service and use of the product. It literally boggles my mind and if I were supreme ruler of earth I would make that illegal. A company's data is its most precious resource, and for companies to restrict semi-real-time access to it is nothing short of crippling and unacceptable. But so many software companies do it and they get away with it because most often the people making the decision to adopt a piece of cloud-hosted software don't know to ask the right questions and demand proof of the claims being made around data access. And once the software is installed, a year and millions of dollars later, it's too late to back out and now you're at their mercy. That is the game being played here, I'm quite sure of that. The company I currently work for has been burned repeatedly by this same scam. So with so many disgruntled and misled organizations out there, Microsoft was promising they would FINALLY provide a means to kick out the data. They released that in early 2023, I believe. It's a mechanism that pushes full actual tables (not that mostly useless data warehouse nonsense) to an Azure data lake (where it's held in CSV files).
I set that up and tested it and it works great, hooray! Finally we have a way forward. Then oops, not so fast... The data lands in CSV files with no column names. The column names are actually held in separate JSON files, and Microsoft advises using Azure Synapse to mesh them together and create pipelines to push the data to an SQL database or whatever you want to do with it. Before going further, side note, let me just point out what an incredibly insane amount of work that is. My current reporting uses more than 70 tables, and that's a steadily growing list. That's a LOT of pipelines to set up, schedule, and maintain. Lots of opportunity for things to go wrong too, and having to set up all the datatype transformations since the data is all text in the CSV files. Meanwhile, we've had database replication technology for well over 25 years now. What Microsoft is suggesting here is absolutely archaic compared to database replication. Basically, the data lake method completely disassembles the tables into their constituent parts and requires you to reconstruct them yourself, just to make a copy of it in a separate database so you can finally access it (and apply your own indices too of course). What a huge step backwards from database replication. But hey, we're desperate for something close to real-time data access, and Microsoft has us over a barrel, so we gotta do it I guess... So I went ahead and did that, and it turns out it doesn't even work. The specific problem is that Azure Synapse (or any other pipeline technology in Azure) isn't able to interpret the date format being kicked out of D365 into the CSV files. The dates all come out as nulls. I confirmed this with Microsoft support. However, there is a way to still get that CSV data into a pipeline BUT then you lose the column names. Microsoft's recommendation was to do that and then manually assign the column names in the pipeline. Not only is that adding an insane amount of additional work to what is already a ridiculously labor-intensive and unnecessary development effort, but I'm not sure I'd even trust it, because column names would need to be assigned based on their sequence. If the columns are ever re-ordered or if a new column is inserted then that ETL pipeline breaks and it's a bit of a nightmare to track down and fix. I've learned the hard way from many years of experience not to rely on the order of columns in a provided dataset. I just revisited this about 6 weeks ago and lo and behold Microsoft still has not fixed the date issue. Meanwhile their long-promised data lake "solution" to the long-standing data access issue has been in production for ages now and it still doesn't even work. So to recap, Microsoft appeared to be capitulating to public demand to provide data access but they seem to have pulled another ruse with this archaic data lake solution which remains broken to this day. Did they ever actually intend to provide that access? Or was this just another false promise, leaving many companies with no other option but to get Microsoft or their partner$ to build reports for them with the D365 SSRS methodology (which is also itself extremely archaic and labor-intensive compared to using standard SQL Server and PBI). At this point I doubt Microsoft ever intends to provide direct or near-real-time production data access to D365. Why would they? They have a cash cow going by restricting data access.
Dismantling the tables and pushing them out as CSVs (with column names held in a separate JSON file) seems to be all they are willing to do, and then it's up to the companies to undertake the substantial development effort to reconstruct the tables from that. What a disaster. Also, I looked into the Dataverse thing and it's an absolute joke. There's far, far too little data available through that for us to do anything with. The vast majority of the tables I need aren't even showing. I'm not even sure what that's even supposed to be actually, or what the point of it is, because it's just so limited. So again I opened a ticket with Microsoft to inquire and they also confirmed what I had observed. It's just not usable for serious reporting. I'm not even sure why it exists at all. I've been at this a long time now. I've spoken to numerous people online facing the same problems due to Microsoft's lies and roadblocks. As of yet I still have not encountered a viable solution to this problem. So I'm left kicking out the database and restoring it locally on my PC and then generating the reports from that. Given that the data lake solution only partly works, and the Azure Synapse pipeline methodology (without column names) just seems too cumbersome, potentially unreliable, and labor-intensive to be a viable option, I'm thinking about writing my own ETL application in .NET to do what Azure Synapse is, as of yet, incapable of doing: take the data from the data lake CSV files, mesh it with the column names in the JSON files, and reconstruct the database tables in our own Azure SQL instance. Seems to be the only viable option, and that would probably be less work than setting up all those Azure pipelines. Oh, and another option is tapping directly into the D365 API and pulling the data out that way, but I'm not sure how viable that is for extracting tables with over a million records. Since I have the data available in CSVs in Azure Blob Storage, I'm more inclined to just use that. However, having said all this, if you feel I'm off the mark about anything I've said here, please alert me to the error of my ways. I would LOVE to be wrong and for there to truly be a way to access D365 data in close to real time without embarking on the major custom development effort mentioned (and not just the relatively useless data warehouse). To any non-techies reading this, like managers or finance staff thinking of adopting D365, here's my advice: don't. Not only does Microsoft lie egregiously and repeatedly about the data access for reporting, but the system itself is generally disliked by everyone in my company. They much preferred Navision, actually. I can say with absolute certainty that no one I work with likes it. They find it restrictive and cumbersome, and supporting it costs a fortune too. You can do better.
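For readers curious what the reconstruction step described above might look like, here is a minimal sketch in Python/pandas, assuming a header-less CSV export plus a JSON metadata file that lists the column names in order; the file names and JSON schema are hypothetical, and the real D365/Synapse Link export layout may differ:

```python
import json

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical file names; the real data lake export layout may differ.
with open("SalesTable.schema.json") as f:
    schema = json.load(f)

# Assumes the JSON lists column definitions, in order, under an "attributes" key.
columns = [attr["name"] for attr in schema["attributes"]]

# The exported CSV has no header row, so apply the recovered column names.
df = pd.read_csv("SalesTable.csv", header=None, names=columns)

# Load the reconstructed table into a reporting database
# (SQLite used here as a stand-in for an Azure SQL instance).
engine = create_engine("sqlite:///reporting.db")
df.to_sql("SalesTable", engine, if_exists="replace", index=False)
```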
@octaviansfetcu4458
@octaviansfetcu4458 1 year ago
Great video. Thanks for sharing! One question: how/where do you save/read the change token? Thanks.
@Radhe_kanha0
@Radhe_kanha0 1 year ago
I need to learn more about Nintex; can you please help me out?
@juliovilaca2120
@juliovilaca2120 1 year ago
Hello, I just created the same flow to generate the Req Number, but when I came back to the form, it still appears as "required", not allowing me to submit the form. Please help, thanks a lot!
@VincePangan
@VincePangan 1 year ago
Hi Julio. In your "Requests" list, change the Title field to NOT be required. You may have to update your Nintex form again as well. Let me know if this helps!
@juliovilaca2120
@juliovilaca2120 1 year ago
@VincePangan Great! It worked well. Thank you.
@kannibala1
@kannibala1 1 year ago
Hi, I have used the Node.js CLI to add a webhook URI to a SharePoint list/site. Is there any way to automate it? Like, any PowerShell scripts to add a webhook URL to SP?
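The question asks for PowerShell, but the same registration can be scripted against the SharePoint REST subscriptions endpoint from any language; a minimal Python sketch, assuming an access token is already available and using hypothetical site, list, and receiver values:

```python
from datetime import datetime, timedelta, timezone

import requests

# Hypothetical values; replace with your tenant, list ID, token, and receiver URL.
site_url = "https://contoso.sharepoint.com/sites/demo"
list_id = "00000000-0000-0000-0000-000000000000"
access_token = "<bearer token obtained from Azure AD>"

# Webhook subscriptions are limited to roughly 180 days; renew before expiry.
expiration = (datetime.now(timezone.utc) + timedelta(days=179)).isoformat()

payload = {
    "resource": f"{site_url}/_api/web/lists('{list_id}')",
    "notificationUrl": "https://example.azurewebsites.net/api/webhook",
    "expirationDateTime": expiration,
    "clientState": "optional-shared-secret",
}

resp = requests.post(
    f"{site_url}/_api/web/lists('{list_id}')/subscriptions",
    json=payload,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Accept": "application/json;odata=nometadata",
    },
)
resp.raise_for_status()
print(resp.json())  # subscription id, expiration, etc.
```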
@James-sc1lz
@James-sc1lz 2 years ago
How do you make sure it never expires?
@netwoven
@netwoven 2 years ago
Cloud-only accounts cannot be set to expire, so that is not an issue. Many people are actually asking for Microsoft to implement a way to expire AAD accounts. feedback.azure.com/d365community/idea/5d44d790-c525-ec11-b6e6-000d3a4f0789
@Quincypatty
@Quincypatty 2 years ago
Thank you for the video. However, I think I did something wrong and I don't know what. You showed how to edit the task form when configuring the assigned flexible task. If I generate a preview or publish it, the original form stays in edit mode. How can I change this to read-only?
@netwoven
@netwoven 2 years ago
Are you asking about the form for the SharePoint list item, or the task form? And how is the user going to access that form? If you are referring to the SharePoint list item form, then you should send a link to the view item form (viewitem.aspx). Please email info(at)netwoven.com for more information.
@leannefleming716
@leannefleming716 2 years ago
Thank you, this was fantastic, very clearly presented, and it gives a good overview of the various types of reports.
@netwoven
@netwoven 2 years ago
Thanks Leanne. Here are some additional resources you may also like: www.netwoven.com/category/modern-applications/dynamics-365/
@inatovrustam
@inatovrustam 2 years ago
Thank you. Good educational material.
@sawramdhavi2037
@sawramdhavi2037 2 years ago
In my Sales Hub display, there are no report tools. How do I bring them up?
@pasumarthiashik1099
@pasumarthiashik1099 2 years ago
Hello sir, can you provide your LinkedIn ID, so I can connect with you regarding Nintex?
@netwoven
@netwoven 2 years ago
Here's our LinkedIn ID: www.linkedin.com/company/netwoven-inc-/mycompany/?viewAsMember=true Also, you can reach out to us at info@netwoven.com for more information.
@browngentle1
@browngentle1 2 years ago
I have the same question as defiant1024. How do you get it working with MFA-enabled environments?
@tamboleo
@tamboleo 1 year ago
Did you manage to work around this?
@pravinwankhade1835
@pravinwankhade1835 2 years ago
How do we create our own endpoint URL... please guide?
@alexisjesus1635
@alexisjesus1635 2 years ago
Hello, I need help comparing two different dates with Run If.
@Catonkey1
@Catonkey1 2 years ago
Hi, where's the link to setting up Diagnostic Settings?
@muzzamilazam
@muzzamilazam 2 years ago
Thanks for making this video. It has helped me a lot as a newbie to Dynamics 365 CE.
@jlewis6698
@jlewis6698 2 years ago
Great video. What about excluding the account from any MFA?
@netwoven
@netwoven 2 years ago
Great question. We recommend excluding it from any conditional access policies, including ones that enforce MFA, in case there are any AAD MFA outages.
@James-sc1lz
@James-sc1lz 2 years ago
@netwoven Also exclude AD Sync accounts if you are using it.
@valavanchandran8573
@valavanchandran8573 1 year ago
@James-sc1lz Never ADSync
@touchtbo
@touchtbo 3 years ago
Aren't Workflows deprecated on SharePoint?
@ahmadganteng7435
@ahmadganteng7435 4 months ago
SharePoint 2013 workflow will be turned off for new tenants as of April 2, 2024. It will be removed from existing tenants and will be fully retired as of April 2, 2026.
@mohammedajeddig6381
@mohammedajeddig6381 3 years ago
Hey Netwoven. Thanks for the tutorial. Have you faced any problems when you try to open Nintex Workflow on the Request list? For me it doesn't even load up. Is there any configuration that I should make beforehand? Thanks in advance.
@netwoven
@netwoven 3 years ago
Hey Mohammed, we would love to help answer any questions you may have. Firstly, have you been able to open Nintex Workflow on any other lists? Secondly, do you see the button for Nintex Workflow?
@aniruddhamukherjee5231
@aniruddhamukherjee5231 3 years ago
Did anyone say "Steering the Ship"? Well, I can say for sure that this ship has now become a hi-tech aircraft carrier (ready to hit any target), after starting its journey as a paddle boat around two decades back!! Congrats Matt for staying on this 'boat to ship' journey of Netwoven and helping all of us with your extraordinary technical contributions all through these years.
@sukantaFun
@sukantaFun 3 years ago
A big congratulations, Matt!
@c016smith52
@c016smith52 3 years ago
Awesome, thanks so much for this. Very timely too, as I was just researching and trying to do this today. I was able to get it working already, thanks!
@netwoven
@netwoven 3 years ago
Glad this video helped! Please let us know if you have any other questions by emailing us at info@netwoven.com or if you would like to see any other topics covered in our channel.
@UmeshBeti
@UmeshBeti 3 years ago
@Netwoven, how do you subscribe a webhook to a document library and access it? I am trying to get notified whenever folders and files are deleted!
@netwoven
@netwoven 3 years ago
Umesh, webhooks allow item-delete events to be notified. When you get the notification and query the change log, you should see the item in the change log with DeleteObject = true. Supported events: docs.microsoft.com/en-us/sharepoint/dev/apis/webhooks/lists/overview-sharepoint-list-webhooks#list-event-types Change log delete item: docs.microsoft.com/en-us/sharepoint/dev/solution-guidance/query-sharepoint-change-log-with-changequery-and-changetoken#use-the-corelistitemchangemonitor-add-in
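A minimal sketch of that change-log query against the SharePoint REST API in Python, assuming an access token and list ID are already at hand; the site URL, token, and file-based storage of the last change token are hypothetical stand-ins, and the exact response shape may vary by OData mode:

```python
import json

import requests

site_url = "https://contoso.sharepoint.com/sites/demo"   # hypothetical site
list_id = "00000000-0000-0000-0000-000000000000"          # hypothetical list ID
headers = {
    "Authorization": "Bearer <token>",
    "Accept": "application/json;odata=verbose",
    "Content-Type": "application/json;odata=verbose",
}

# Re-read the change token saved after the previous query, if any.
try:
    with open("change_token.json") as f:
        last_token = json.load(f)["token"]
except FileNotFoundError:
    last_token = None

query = {
    "query": {
        "__metadata": {"type": "SP.ChangeQuery"},
        "Item": True,
        "DeleteObject": True,   # only interested in deletions here
    }
}
if last_token:
    query["query"]["ChangeTokenStart"] = {
        "__metadata": {"type": "SP.ChangeToken"},
        "StringValue": last_token,
    }

resp = requests.post(
    f"{site_url}/_api/web/lists('{list_id}')/GetChanges",
    headers=headers,
    data=json.dumps(query),
)
resp.raise_for_status()
changes = resp.json()["d"]["results"]

for change in changes:
    print("Deleted item id:", change.get("ItemId"))

# Persist the newest token so the next run only sees new changes.
if changes:
    with open("change_token.json", "w") as f:
        json.dump({"token": changes[-1]["ChangeToken"]["StringValue"]}, f)
```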
@anjanakrishnaveni611
@anjanakrishnaveni611 3 years ago
Fantastic video! Good job
@anjanakrishnaveni611
@anjanakrishnaveni611 3 years ago
Amazing video, very helpful! p.s. love your voice
@christophernowak3338
@christophernowak3338 4 years ago
Was hoping for a tutorial, not an ad....
@guruprasadmarathe
@guruprasadmarathe 4 years ago
Can I get the PPT?
@netwoven
@netwoven 4 years ago
Yes. Can you please drop an email at info@netwoven.com?
@oivvv9218
@oivvv9218 4 years ago
good one Alex :)
@netwoven
@netwoven 4 years ago
Thanks Oivv. Do you have anything we can help with?
@chanm1000
@chanm1000 6 years ago
With the approvals, will the approval outcomes be retained? I am noticing that workflows seem to hide this information from view after 90 days or so. Would like approvals to be auditable.
@SANTACRUZDRONES
@SANTACRUZDRONES 6 years ago
Yes, you can retain all the approvals and make them auditable. Please go to Netwoven.com and contact me, Alex Viera; I'll be happy to do a demo of these capabilities.
@patenik2
@patenik2 7 years ago
Awesome info and demo. Much better than PnP video from Microsoft.
@Isha3006
@Isha3006 7 years ago
Thanks for the awesome video. Can we send the notification to localhost instead of the Azure web site in your case? It is not working from my end if I do that. Could you please share your code so that I can see what is going wrong on my end?
@webdeveloperify
@webdeveloperify 4 years ago
You can use ngrok, which allows your local PC to be reachable from the internet with both HTTP and HTTPS endpoints.