My name is Matt Allford. I'm a DevOps Engineer at Parallo (a Crayon company), a content creator based in Australia, and a Pluralsight author.
On this YouTube channel I plan to share technical how-to and walkthrough videos, covering topics such as: public cloud (primarily Microsoft Azure), DevOps tooling and processes, automation, and scripting.
I hope you find the content enjoyable and valuable. Please feel free to reach out and connect with me online; the links are below!
Thanks for watching, glad it was helpful! I use a physical Wacom device with a pen, and for this one I was just using Microsoft Whiteboard. As you can tell I'm not very experienced with it yet and still figuring that bit out 🤣 When I was drawing boxes and arrows on the screen, that was using "ZoomIt" from Microsoft, part of the Sysinternals suite of software. There are a number of third-party apps that can achieve this kind of on-screen annotation too. www.wacom.com/en-au/products/pen-tablets/one-by-wacom
Thanks for the great content and for introducing me to this tool; really well presented. One question: normally with a key vault I'd set up a private endpoint and then remove all public access to help ensure it's secure. With the function service being hosted on a Consumption plan we don't have the option to integrate it into our private network, and I don't think we can just whitelist the service's public IPs (there's a huge range of CIDRs, and IP groups aren't supported in whitelists, so it feels unmanageable at best). Is there a nice solution to keep the key vault securely within the network whilst taking advantage of the cheaper Consumption plan? Otherwise, what are your opinions on the cost of switching plans to use the private network vs the benefits of network security on top of Key Vault's existing identity-based security?
Thanks for the feedback! And yeah, what you've described is just one of the trade-off decisions that you need to make as part of the architecture and design of your application(s). One thing to consider would be to use this key vault only for certificate storage; allowing public access is then probably a little less risky from a network perspective, compared to if you were storing other secrets and information. On top of that, it's just about the layers of security you're able to implement, and deciding what configuration strikes a suitable balance between usability, cost, and security. With all of that said, and I know it is still in preview, but have you seen the Flex Consumption option? I think it's a little more expensive than standard Consumption, but it supports VNET integration - learn.microsoft.com/en-us/azure/azure-functions/flex-consumption-plan
@MattAllford Good shout; I'd not come across that, but it looks ideal. Sadly my infra's deployed using IaC (Terraform), and whilst the FC1 (Flex Consumption) SKU was added last week, it looks like support for the `FunctionAppConfig` property of the function app (mandatory for FC1) isn't there yet. For now I'll try deploying a Basic plan, then will switch over to the cheaper flexible plan once support becomes available. Really appreciate your input; thanks again.
Hi Matt, I have two questions around setting this up for an Enterprise. 1. We have multiple organisations in our enterprise. The instructions and your video show that you need to get the Database ID for the setup, which is based on your organisation name. But you can set up an Azure Virtual Network at the Enterprise level. Do we use any one organisation's Database ID? 2. If we did set up multiple organisations, each with their own private network configuration, do they each need a separate subnet in our VNet? Or can they use the same subnet?
Hey Paul! Yeah, I realised after I filmed this that things were slightly different in an Enterprise, and I added a few sections in, but I can't recall how many. For your first question, yes, you still get a Database ID, but instead you pass in your enterprise slug; the specific docs are here: docs.github.com/en/enterprise-cloud@latest/admin/configuration/configuring-private-networking-for-hosted-compute-products/configuring-private-networking-for-github-hosted-runners-in-your-enterprise#1-obtain-the-databaseid-for-your-enterprise For question 2, given the setup in an Enterprise is done at the Enterprise level, you can then leverage it from multiple organisations. So you could probably go either way: set up specific runners and runner groups at the Enterprise level for each organisation, or just set up one at the Enterprise level to use across multiple orgs. Hope that helps!
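For anyone following along, the lookup in those docs is a GitHub GraphQL query along these lines - treat this as a sketch and check the linked page for the exact query, since field names can change:

```graphql
# Look up the databaseId for an Enterprise from its slug
# (run against the GitHub GraphQL API with an appropriately scoped token)
query ($slug: String!) {
  enterprise(slug: $slug) {
    databaseId
  }
}
```

The Organisation-level version is the same idea, just querying the organisation by login instead of the enterprise by slug.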
Thanks for the feedback! Good question - I'm honestly not sure. I don't think that functionality was there when this video was released, but it may be there today. Alternatively, leveraging LLMs to create the files is a great use of that technology.
Thanks for watching, happy to hear it was helpful! Point noted - might make for a good follow-up section. Not sure if you came across it, but there's a bit of info in the docs about using the API, if that's of interest: github.com/shibayan/keyvault-acmebot/wiki/
Great video mate! Thinking out loud: if I'm using a Virtual WAN, I would assume you just ensure there's a hub connection from the VNet to the vWAN and it will be able to find resources that way?
Thanks for watching, and sorry for the delay in response. You are correct! As long as the VNET where the GitHub runner NIC is located has routing and firewall access to the target resources, it will be good to go. It will abide by any network policies and configurations such as DNS that you have applied to the network it joins 👍
Hi there! Thanks for watching. There is some information on the wiki page of the tool for Route 53, linked below. Otherwise, this might be a good use case to get an LLM to help with the specific steps you're looking for. github.com/shibayan/keyvault-acmebot/wiki/DNS-Provider-Configuration#amazon-route-53 Hope that helps.
Hey Matt, the tutorial is really awesome. You have covered everything in an hour-long video. I liked that you also included some common mistakes which can happen during the setup, such as configuring the runners into the default group instead of the one that needs to be used, which is eventually going to deploy the NICs. Overall, it is very easy to follow.
I haven't come across that myself, sorry. There seems to be a bit of information out there about that error and JMX files. Have you gotten far with general troubleshooting, or even with an LLM like ChatGPT?
I also got a MalformedURLException. I used your sample directly, straight from Git. It appears as if the environment variables are somehow not being picked up. I checked the UDVs and they are correct. Is there any setting in Azure that would let it pick them up?
I get why it works for Azure (considering that the GitHub-hosted runners already live there), but it would be great to get integration with networks on other clouds, so there could be a consistent pattern.
Thank you Matt for this content. I purchased your "Zero to Hero" course. I noticed that there are a few videos still in production. Are you planning to finish them any time soon?
Thanks for the support! Yes, absolutely aiming to keep the content for the course rolling out. Unfortunately the last six months threw a few hurdles my way and caused a delay in production. Planning is well underway for the next section!
Came back to add an update and to thank you again, Matt - this tutorial was really great. I've managed to implement acmebot with a custom domain managed in Azure public DNS, along with integrating the key vault with two IIS servers using the Azure Key Vault extension, which runs on the Windows servers and periodically updates the certs used on the server from those in the key vault. We now have fully automated certs for our custom web domain and IIS servers.
Woo! That's a fantastic solution, great work, and I'm glad this helped you achieve a hands-off, low-cost automated solution :) Thanks for sharing the update, I love hearing when people put this sort of thing into practice!
Can we use HTTP-01 validation for subdomains? A redirect rule in Application Gateway for the ACME challenge that points at a static file in a storage account, where Let's Encrypt can update the key. I need wildcards and also single certificates for subdomains, and there isn't a solution that covers both and saves the certs to Key Vault.
Hey! Yep, the solution will automatically renew certificates 30 days before their expiry - github.com/shibayan/keyvault-acmebot/wiki/Frequently-Asked-Questions#automatic-renew-an-existing-certificate Hope this helps!
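Just to illustrate the timing logic described above (this is a sketch of the idea, not acmebot's actual code - the 30-day window is from the linked FAQ):

```python
from datetime import datetime, timedelta, timezone

# acmebot renews certificates within 30 days of their expiry (per the project FAQ)
RENEWAL_WINDOW_DAYS = 30

def renewal_due(expires_on: datetime, now: datetime) -> bool:
    """Return True once a certificate is inside the renewal window."""
    return now >= expires_on - timedelta(days=RENEWAL_WINDOW_DAYS)

now = datetime(2024, 6, 1, tzinfo=timezone.utc)

# A cert expiring in 45 days is not yet due; one expiring in 10 days is.
print(renewal_due(now + timedelta(days=45), now))  # False
print(renewal_due(now + timedelta(days=10), now))  # True
```

So as long as the timer-triggered function is running, certs are refreshed well before Let's Encrypt's 90-day lifetime runs out.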
Hi Matt, what is the best way to mitigate the risk of the DNS provider credentials being compromised? Will this solution work together with acme-dns?
Hey Simon. Are you referring to the protection of the API key being used to access your DNS provider? The best course is to store the API key as a secret in Key Vault, and then reference that secret from the function app. For example, the app setting "Acmebot:Cloudflare:ApiToken" on the function app could be set to reference the key vault secret containing the API key, rather than pasting it directly into the value (like I did in the video). Does that help?
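For anyone wanting the concrete shape of that, an App Service / Functions Key Vault reference looks roughly like this as an app setting value (vault and secret names here are made up for illustration):

```
Acmebot:Cloudflare:ApiToken = @Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/CloudflareApiToken/)
```

The function app's managed identity needs get permission on secrets in that vault for the reference to resolve.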
Thank you Matt for this excellent content. I especially appreciate that you explain "small" nuances, such as @11:40 where you explain the difference between "requiredonly" and "all" parameters, where you even explain (just with a short sentence) what "requiredonly" parameters mean (parameters without default values). These little things help me make sure I understand what it means to have requiredonly and all parameters. Keep up the good work.
Hey Patrick. Thanks for the comment and the feedback. I really appreciate you took the time to let me know - as a content creator that really helps a lot! Glad the video was useful for you 🙂
This solution is not cost effective. For each renewal of a certificate in Key Vault, Microsoft charges $3.00. If a Let's Encrypt certificate has to be renewed 4 times a year, you end up paying Key Vault charges of $12 for each certificate. Check the Azure Key Vault pricing documentation.
Hi there. Sorry about the delay in response, I missed this comment. The $3 renewal charge is not relevant to this solution - that applies when Key Vault itself is processing the renewal. This solution performs the renewal outside of Key Vault and just uses Key Vault to store the certificate. Hope that helps!
Very good explanation. Well done! A couple of things though: 1) please make more if you're able to, and stay focused on one thing at a time. I know how difficult it is to create content alongside a full-time job, but you have a golden voice and a way of explaining concepts that the vast majority of people on YouTube just don't have. 2) try to create 10-minute videos without sacrificing the level of detail. This is not for me, it is for the younger generation who run like a chicken from one video to another, plus it helps the YouTube algorithm give you better reach. Finally, please either replace the tree behind you or treat it; in your last video the tree looked greener, but I know your trick ✂😉
Thanks Wasim, appreciate the watch and the feedback! At the end of the day, the content I create here is something I do to share knowledge and try to deep dive into various topics. If the next generation or the algorithm aren't happy with that, then that's ok 🙂 I try to create content I'd enjoy consuming. There's definitely a backlog of content to create (over 40 topics and ideas), but as you noted there are often more important things that demand my time.
@MattAllford thanks for your quick reply. I agree and disagree with you at the same time; I realise that when you do something like YouTube, you usually want to talk about subjects your way, leaving behind the constraints we often have in our work environments. However, opportunist that I am, I believe that you have all the qualities that could make you huge in the field, on YouTube or any other platform. I really hope to see more of your content soon.
Great video Matt. Is there a Q&A board for asking questions on Azure Load Testing? The Microsoft Q&A board for Azure Load Testing is dead. I have several issues with my JMeter script: it works well locally but chokes in Azure Load Testing. The documentation is useless. There are no examples or documentation on how to ensure that cookies are extracted and passed in with every subsequent request after logging in. I have an HTTP Cookie Manager at the plan level in the JMX; locally that works great and I can test all areas of the website. But since the cookie extraction doesn't work, I can't get past the login process in Azure Load Testing. No help to be found anywhere. Azure Load Testing is not ready for prime time.
Thanks for watching! Sorry to hear the docs are falling short. Do you have an active subscription with a support agreement? It could be worth opening a service request via the Azure Portal. Otherwise, I believe the team moved from GitHub issues over to the VS developer community for load testing ideas, feedback, and issues. URL is below, I hope this helps! developercommunity.visualstudio.com/loadtesting
Hi, I've watched your video, and it is so helpful for automation. I was wondering, is there any way to add multiple DNS zones to one function app?
Hi there, sorry I did not see this comment earlier. I’m not immediately aware of the ability to add multiple DNS zones to a single function app, but I can see why that’s a valid request. I’d suggest logging an issue on the GitHub page to see if that functionality is available today, and if not then make it a feature request!
Hi, quick query: when the test engines are running, will they bring down the performance of the actual App Service where my app is hosted? (Trying to understand if these load tests would hamper the experience of users on the website at the same time.)
Howdy. Yes, this is a live load test against the target endpoint you provide when performing the test, so it will generate real load and therefore impact the performance. It's not a simulation, it is real load hitting the target. So you're right in thinking through whether you'd want to do this against, say, a production endpoint, or maybe have another environment that is used for load testing, with the same architecture as your production site, but doesn't have active users that will be impacted by the test.
@MattAllford The catch I see with creating a separate environment is paying for the one instance even when load testing is not being performed. In my research, one way I've noticed this can be tackled is by using Azure Container Apps, which can scale down to 0 instances (no charge) when not in use, with autoscaling in place to scale out to 1 instance or more once the requests start coming in (the only issue I see here is that the initial requests the load test sends would fail with a 503, as there wouldn't be an instance available to service them?). Not an ideal solution, however, so let me know if a different method comes to mind. Thanks a bunch for the response, great video! :)
@user-ht4mn2ud1o I'm glad you enjoyed the video! This is really one of those "it depends" kind of things, but I really do mean that. If the infrastructure isn't too large and complex, maybe you can deploy it via pipelines using Infrastructure as Code, deploy the application, load test against that environment, and then delete the environment. This could all be done using pipelines such as Azure DevOps or GitHub Actions. A lot of customers I work with will typically have multiple environments, so they will usually have a test/dev/staging/uat/whatever environment separate to production, and one of those environments can (usually) be leveraged to run load tests against. Or, as you said, if the endpoint is running on infrastructure like serverless functions or container apps that can scale to 0, maybe you can have a pipeline that scales it up, potentially even pauses if the app needs a little bit of time to get "warm", and then your load test could be invoked in that same pipeline.
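A rough sketch of the ephemeral-environment pattern as a GitHub Actions workflow - resource names, template paths, and secret names here are all placeholders, and you'd adapt the load test step to however you invoke your tests:

```yaml
# Sketch: deploy infra, run a load test, tear everything down
name: ephemeral-load-test
on: workflow_dispatch

jobs:
  load-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deploy ephemeral environment
        run: |
          az group create --name rg-loadtest --location australiaeast
          az deployment group create \
            --resource-group rg-loadtest \
            --template-file infra/main.bicep
      - name: Run load test
        uses: azure/load-testing@v1
        with:
          loadTestConfigFile: loadtest/config.yaml
          loadTestResource: my-load-test-resource
          resourceGroup: rg-loadtest
      - name: Tear down
        if: always()   # delete the environment even if the test fails
        run: az group delete --name rg-loadtest --yes --no-wait
```

The `if: always()` on the teardown step is the important bit, so a failed test run doesn't leave billable infrastructure behind.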
Thank you for posting the video. In the test criteria, when I add 10 metrics to the load test I get a failure saying I can't have 10 metric points. Can you please help with this?
Hello Matt, thanks for this video. Just a quick question: is there a way we can add the certificates to the dashboard too, in an automated way, please?
Thanks for watching. I'm not 100% sure what you're referring to, sorry. I suspect your best bet might be to add an issue on the GitHub repo for the project with a feature request.
Hey, There is a note in the readme to say the Azure Naming Tool has moved to a new repository. Here is the new location - github.com/mspnp/AzureNamingTool I’ll update the description on this video when I get a moment. Cheers, Matt.
@MattAllford Hi Matt, I can't seem to deploy, as it complains about not having a Docker image there. Because of the repo location change, it seems to be missing the Dockerfile? Any ideas?
Hi Satish. There's probably a bit much there for a YouTube comment. Have you tried enabling it in the portal and replicating the configuration in Bicep? I'd suggest your question might be better posed on a forum such as Reddit or Stack Overflow, where some more verbose discussion can occur.