Matt Allford
Australia
Joined 22 Jun 2020
Hi!
My name is Matt Allford. I'm a DevOps Engineer working for Parallo (a Crayon company), a content creator based in Australia, and a Pluralsight author.
On this YouTube channel I plan to share technical how-to and walkthrough videos, covering topics such as:
- Public Cloud (primarily Microsoft Azure)
- DevOps Tooling and Processes
- Automation
- Scripting
I hope you find the content enjoyable and valuable. Please feel free to reach out and connect with me online, the links are below!
GitHub Hosted Runner Azure VNET Integration: Accessing Private Resources Made Easy
Traditionally, accessing private resources from GitHub Actions required setting up and managing self-hosted runners. Now, with new features available on GitHub's Team and Enterprise plans, you can configure GitHub-hosted runners to connect directly to your Azure virtual networks. This enables direct network access from the hosted runner to private resources, whether they're hosted in Azure or reachable through hybrid connectivity from your Azure VNET to other cloud environments, or even on-premises!
Let's take a look at how to set this up and walk through the end-to-end configuration!
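If you'd rather script the databaseId lookup than run the GraphQL query by hand, here's a minimal sketch using Python and the GitHub GraphQL API (see the "Obtaining the GitHub Database ID" chapter below). It assumes a personal access token with permission to read the organization in a GITHUB_TOKEN environment variable, and "my-org" is a placeholder organization login.
```python
# Minimal sketch: look up the organization databaseId needed when creating the
# Azure network settings resource. GITHUB_TOKEN and "my-org" are placeholders.
import os
import requests

query = """
query($login: String!) {
  organization(login: $login) {
    login
    databaseId
  }
}
"""

response = requests.post(
    "https://api.github.com/graphql",
    headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
    json={"query": query, "variables": {"login": "my-org"}},
    timeout=30,
)
response.raise_for_status()
print(response.json()["data"]["organization"]["databaseId"])
```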
// SUBSCRIBE ✅
ua-cam.com/users/mattallford
// RESOURCES & REFERENCES 📃
GitHub Repository:
🔗github.com/mattallford/github-hosted-runner-azure-networking
About Azure private networking for GitHub-hosted runners in your organization:
🔗docs.github.com/en/organizations/managing-organization-settings/about-azure-private-networking-for-github-hosted-runners-in-your-organization
Configuring private networking for GitHub-hosted runners in your organization:
🔗docs.github.com/en/organizations/managing-organization-settings/configuring-private-networking-for-github-hosted-runners-in-your-organization
About Ubuntu and Windows Larger Runners:
🔗docs.github.com/en/enterprise-cloud@latest/actions/using-github-hosted-runners/about-larger-runners/about-larger-runners#about-ubuntu-and-windows-larger-runners
// FOLLOW ME 👉
Blog - mattallford.com
LinkedIn - www.linkedin.com/in/mattallford/
Twitter - mattallford
GitHub - github.com/mattallford
// CHAPTERS 🕛
0:00 Introduction
0:56 What Problem is Being Solved?
2:51 Example GitHub Workflow
6:05 The Demo Environment
6:46 What Did We Do Before?
10:45 GitHub Runner VNET Integration
13:14 Reviewing the GitHub Documentation
18:23 GitHub Enterprise Configuration
20:00 High Level Configuration Workflow
22:00 IaC for My Demo Environment
24:43 Prerequisites
26:59 Be Aware of the Deny Outbound NSG Rule!
29:35 Obtaining the GitHub Database ID
33:10 Configure Azure Resources
40:18 GitHub Networking Configuration
42:08 GitHub Runner Groups
43:49 Create a New Hosted Runner
45:45 GitHub Runner Billing and Spending Limit
48:28 Update and Test The Workflow
51:46 Troubleshooting Time - I Made a Mistake!
54:55 Test The Workflow - Take 2
57:06 Summary
59:02 A Final Thought About Managing Cost
Views: 1,805
Videos
Bicep Deployment Pane
737 views • 1 year ago
In the latest version of Azure Bicep (0.20.4), the team have released an experimental feature in the Visual Studio Code extension, providing a deployment pane for validating, performing what-if analysis, and deploying Azure Bicep templates to Azure. Join me as I show you how to enable and use this new feature. // SUBSCRIBE ✅ ua-cam.com/users/mattallford // RESOURCES & REFERENCES 📃 Using the Dep...
Azure Bicep Native Parameter Files
3K views • 1 year ago
Azure Bicep now supports writing parameter files natively in the Bicep language, using .bicepparam files. Join me in this video where I go through creating parameter files from scratch, using the Bicep CLI, and Visual Studio Code. We'll also explore how to deploy Bicep to Azure with a bicep parameter file, and explore some other features and functionality. If you're interested to learn more abo...
Azure Naming Tool Overview
2.1K views • 1 year ago
Remembering the constraints and caveats of how to name your resources in Azure can be challenging. Between minimum and maximum lengths, allowed characters, and using unique names, there's too much to remember. Microsoft have some guidance around naming your resources as part of the Cloud Adoption Framework, and to make everyone's life much easier, there's a fantastic utility called the Azure Na...
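Purely as an illustration (and not part of the Azure Naming Tool itself), a quick local check of one of those naming rules might look like the sketch below. The storage account rule of 3-24 lowercase letters and digits is documented; global uniqueness can only be verified against Azure, and the helper name here is hypothetical.
```python
# Illustrative only: validate one Azure naming rule locally. Storage account
# names must be 3-24 characters of lowercase letters and digits (uniqueness
# across Azure cannot be checked with a regex).
import re

STORAGE_ACCOUNT_PATTERN = re.compile(r"^[a-z0-9]{3,24}$")

def is_valid_storage_account_name(name: str) -> bool:
    """Return True if the name satisfies the length/character rules."""
    return bool(STORAGE_ACCOUNT_PATTERN.fullmatch(name))

print(is_valid_storage_account_name("stmattallforddemo001"))  # True
print(is_valid_storage_account_name("My-Storage-Account"))    # False: uppercase and hyphens
```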
Free Automated SSL Certificates in Azure KeyVault with ACME Bot
12K views • 2 years ago
Azure KeyVault ACME Bot is a solution provided by Tatsuro Shibamura to manage and automate SSL certificates in Azure KeyVault. The SSL certificates can be generated from a free provider such as Let's Encrypt, and the whole solution will cost next to nothing to run! Join me as I cover an overview of how the KeyVault ACME Bot solution works, and then we'll walk through a deployment and generation ...
Azure Deployment Scripts - Real World Use Cases
1.2K views • 2 years ago
Did you know you can run PowerShell and AzureCLI commands and scripts in the middle of your ARM Template or Bicep deployments? The scripts run in an Azure container, and can allow you to make your IaC deployments more versatile. In this video, I show you two real world examples where this has come in handy for me, and some patterns around using secrets and managed identities. This video is also...
PowerShell Crescendo First Look
716 views • 2 years ago
PowerShell Crescendo was made generally available by Microsoft in March 2022. It's a new framework to allow you to create PowerShell cmdlets that use any underlying command line tool, with all of the PowerShell standards and goodness, while also returning the results as PowerShell objects, rather than text. // SUBSCRIBE ✅ ua-cam.com/users/mattallford // RESOURCES & REFERENCES 📃 GitHub Gist, if ...
Azure Load Testing Detailed Walkthrough
16K views • 2 years ago
Azure Load Testing can help you gain confidence in your application and architecture, by knowing it will scale and perform under demand to meet business requirements, and if it doesn't, you will be able to quickly understand where the bottleneck in your application architecture lies. You can easily integrate load tests to your CI/CD process to build confidence during the build and release proce...
SFTP Support in Azure Blob Storage
8K views • 3 years ago
Azure Storage now has support for accessing the blob service using the Secure File Transfer Protocol (SFTP). At the time of publishing this is in preview, but this has been a long time coming, as providing SFTP access previously required a custom solution, typically bringing some complexity and management overhead. Join me in this clip as I go from zero, to having a storage account provisioned ...
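If you want to smoke-test the SFTP endpoint from a script rather than an interactive client, here's a minimal sketch using Python and paramiko. The storage account name, local user, and password are placeholders, and the username format shown assumes a home directory has been set for the local user.
```python
# Minimal sketch: connect to the Blob Storage SFTP endpoint with paramiko.
# "mystorageacct" and "sftpuser" are placeholders; the password is the one
# Azure generates for the local user. Without a home directory configured,
# the username format is <account>.<container>.<user> instead.
import paramiko

host = "mystorageacct.blob.core.windows.net"
username = "mystorageacct.sftpuser"
password = "<local-user-password>"

transport = paramiko.Transport((host, 22))
try:
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    print(sftp.listdir("."))  # list the contents of the user's home directory
    sftp.close()
finally:
    transport.close()
```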
Azure Bastion VNET Peering Support
795 views • 3 years ago
At the end of 2019, we saw Azure Bastion become generally available. One of the most requested features was to enable support of using a single Azure Bastion to access resources in peered virtual networks (VNETS). On the 14th of May 2021, Bastion with VNET Peering became generally available. This feature isn't super exciting to implement, which is not a bad thing as it "just works". Join me as ...
Introduction To Azure Template Specs
912 views • 3 years ago
Azure Template Specs provide a way to store Azure Resource Manager templates in Azure as a resource, for secure and easy deployment. You can easily provide access to other users in your organisation using Azure RBAC, rather than having them need to know how to deploy from an ARM template, or abstracting it through another tool like a CI/CD pipeline. Join me as we take a look at the capabilities...
Azure Automation Tasks
1.4K views • 3 years ago
Azure Automation Tasks are pre-defined workflows that can be configured on a per-resource basis in the Microsoft Azure Cloud, built on top of Azure Logic Apps. Join me as we take a look at some of the early preview capabilities of this new feature. Microsoft Docs: docs.microsoft.com/en-us/azure/logic-apps/create-automation-tasks-azure-resources 0:00 Introduction 2:13 Virtual Machine Automation ...
Microsoft PowerShell Secret Management - Preview 3
732 views • 4 years ago
Join me as I take a look at the newly released Preview 3 of the Microsoft PowerShell Secret Management Module! Microsoft Resources: Blog Post - devblogs.microsoft.com/powershell/secretmanagement-preview-3/ Design Changes - devblogs.microsoft.com/powershell/secretmanagement-module-preview-design-changes/ Secret Management GitHub - github.com/PowerShell/SecretManagement Secret Store Github - gith...
Microsoft PowerShell Module for Azure Functions
632 views • 4 years ago
In early 2020, Microsoft released a new PowerShell module to deploy and manage Microsoft Azure Function Apps. In this clip, we'll take a look at some of the cmdlets that are available to us in version 1 and go through the process of deploying a new function app, updating some settings and then removing the function app.
Amazing Video!!!
Thank you! Really glad you liked it 🙂
Awesome video!!! I could not have asked for a much better video, this really helped with the solution I was looking for... Thank you very much!!!
Thanks so much for the feedback, really glad to hear it was helpful!
Hi! In the JSON format I could do something like "${VAR_X}" in the JSON file, and later when I did 'azd up' it would prompt me asking for a value for VAR_X. I'm not finding how to do this in the new bicepparam, do you know?
Interesting. I’m not 100% following sorry, can you provide some more info? Alternatively, logging an issue on the Bicep GitHub might be the go, you can easily attach screenshots / output / examples, and there are people much more clever than I there 😀
This is great Matt, I've a few follow-up questions about these Enterprise Identities. At which stage of this integration did the following get created in your Azure tenant? GitHub CPS Network Service id: 85c49807-809d-4249-86e7-192762525474, GitHub Actions API id: 4435c199-c3da-46b9-a61d-76de3f2c9f82. And did you manually assign the permissions listed in the GitHub doc to these identities?
Can you make a video on what to learn to become a basic Azure cloud engineer, with examples?
Hey there! While I’m sure I could, there are lots of good resources out there on this topic already! One place I would recommend starting is below learntocloud.guide
Hey! Incredible content. Very clear explanation and useful information. Would love more devops content from you
Thanks so much! Sorry about the delayed reply, been a bit busy! There’s definitely a lot of topics in the pipeline, but very little time to achieve them 😅
All your videos are interesting and really unique Matt. Great job 👏
Thanks very much - I really appreciate the feedback. Glad you enjoyed the video!
The best tutorial I've ever seen around ACME + Azure KeyVault. It works very well, thank you very much!
Thanks for watching, I’m glad you found it helpful!
good information, Can i used this one as well when im using Digicert to integrate this to azureKeyVault?
Thanks for watching! I’m not sure if it can be integrated directly with Digicert sorry. From memory there are supported CAs on the main readme in the project on GitHub, and it might be worth raising an issue on the project if you have a feature request.
How did you make the GitHub runner use the identity defined at 22:37?
Hey! That's the service principal that was set up to authenticate to Azure using the azure/login@v1 action in the workflow (lines 27-32 in the GitHub workflow in the repository). When the workflow runs, it logs into Azure using this principal, and that's the same one I'm defining to give access to the key vault in the Bicep template. Hope that makes sense?
@@MattAllford Hey! Yep, that makes sense, but I am wondering how is the runner allowed to log in using the principal? And where does it come from, did you just create it yourself?
@@emilkordahl4113 So in this scenario, I created a Microsoft Entra Application with a service principal, and then gave this the required access it needs in Azure. From there, you store the information about the service principal in a GitHub secret, and then reference those secrets with the azure/login@v1 action. Part of this also requires you to configure the app registration to allow tokens from GitHub to leverage the app registration / service principal. John Savill has a great overview of OIDC authentication from GitHub to Azure (which is what I'm using) over here - ua-cam.com/video/XkhkkLBkAT4/v-deo.html
@@MattAllford Ah great, thanks for the quick response!
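For anyone following the thread above, here's a minimal, hedged sketch of what identity-based access to Key Vault looks like from Python. The vault URL and secret name are placeholders, and in the actual workflow the azure/login action performs the sign-in rather than this code.
```python
# Minimal sketch: read a Key Vault secret as a signed-in principal.
# DefaultAzureCredential picks up whatever credentials are available in the
# environment (for example the Azure CLI session created by azure/login, or
# service principal environment variables). Vault URL and secret name below
# are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://my-demo-vault.vault.azure.net",
    credential=credential,
)

secret = client.get_secret("my-demo-secret")
print(secret.name, "retrieved successfully")  # avoid printing the value itself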
Great video thanks Matt. It would be awesome to do another take with an Azure web app deployment.
Thank you! Good call, there’s been a few updates to the tool now as well which might be worth showing.
Awesome video Matt, thank you a lot. You explain very well and clear. =)
Thanks so much for taking the time to leave a comment! Really glad it was helpful!
Great videos, exactly what I'm looking for. I have one question, what tool did you use to draw the screen ?
Thanks for watching, glad it was helpful! I use a physical Wacom device with a pen, and for this one I was just using Microsoft Whiteboard. As you can tell I'm not very experienced with it yet and still figuring that bit out 🤣 When I was drawing boxes / arrows on the screen, that was using "ZoomIt" from Microsoft, part of the Sysinternals suite of software. There are a number of 3rd party apps that can achieve this on-screen annotation too. www.wacom.com/en-au/products/pen-tablets/one-by-wacom
Thanks for the great content / introducing me to this tool; really well presented. One question: normally with a key vault I'd set up a private endpoint then remove all public access to help ensure it's secure. With the function service being hosted on a consumption plan we don't have the option to integrate that into our private network, and I don't think we can just whitelist the service's public IPs (i.e. there's a huge range of CIDRs, and IP groups aren't supported in whitelists, so it feels unmanageable at best). Is there a nice solution to keep key vault securely within the network whilst taking advantage of the cheaper consumption plan; or else what are your opinions on the cost of switching plans to use the private network vs the benefits of network security on top of Key Vault's existing identity based security?
Thanks for the feedback! And yeah, what you've described is just one of the trade-off decisions that you need to make as part of the architecture and design of your application(s). One thing to consider would be to use this key vault only for certificate storage; allowing public access from a network perspective is then probably a little less risky, compared to if you were storing other secrets and information. On top of that, it's just about the layers of security you're able to implement, and deciding what level is a suitable configuration between usability, cost, and security. With all of that said (and I know it is still in preview), have you seen the Flex Consumption option? It's a little more expensive I think than standard consumption, but it supports VNET integration - learn.microsoft.com/en-us/azure/azure-functions/flex-consumption-plan
@@MattAllford Good shout; I'd not come across that, but it looks ideal. Sadly my infra's deployed using IaC (Terraform), and whilst the FC1 SKU (flex consumption) was added last week, it looks like support for the (mandatory for FC1) `FunctionAppConfig` property of the function app isn't yet there. For now I'll try deploying a Basic plan, then will switch over to the cheaper flexible plan once it becomes available. Really appreciate your input; thanks again.
This works beautifully for my wildcard requirements. Azure | AWS Route 53. Thanks for this.
Awesome to hear, glad it helped you get up and running with the wildcard!
Hi Matt, I have 2 questions around setting this up for Enterprise. 1. We have multiple organisations in our enterprise. The instructions and your video show you need to get the Database ID for the setup, and this is based on your Organisation Name. But you can set up an Azure Virtual Network at the Enterprise level. Do we use an Organisation Database ID? 2. If we did set up multiple organisations, each with their own private network configuration, do they each need a separate subnet in our VNet? Or can they use the same subnet?
Hey Paul! Yeah, I realised after I filmed this that things were slightly different in an Enterprise, and I added a few sections in, but I can't recall how many. For your first question, yes, you still get the Database ID, but instead you pass in your enterprise slug; the specific docs are here: docs.github.com/en/enterprise-cloud@latest/admin/configuration/configuring-private-networking-for-hosted-compute-products/configuring-private-networking-for-github-hosted-runners-in-your-enterprise#1-obtain-the-databaseid-for-your-enterprise For question 2, given the setup in an Enterprise is done at the Enterprise level, you can then leverage it from multiple organisations. So you could probably go either way you want, where you set up specific runners and runner groups at the enterprise level for each organisation, or you could just set up one at the enterprise level to use across multiple orgs. Hope that helps!
Very informative videos! Can we generate a .jtl file if we are running the test in Azure?
Thanks for the feedback! Good question - I’m honestly not sure. I don’t think that functionality was there when this video was released, but it may be today. Alternatively leveraging LLMs to create the files is a great use of that technology.
@@MattAllford thanks will try
Great overview, NSG tip saved me some time. Thanks
Glad it helped! Thanks for watching!
Awesome tutorial, I am revisiting it again and again and following the steps. Thanks a lot!
You're most welcome, glad it is helpful!
Thanks, Matt, it was so helpful. It would be even more helpful if you could show a demo of the API to manage all these certs.
Thanks for watching, happy to hear it was helpful! Point noted - might make for a good follow up section. Not sure if you came across it, but there's a bit of info in the docs about using the API if that's of interest: github.com/shibayan/keyvault-acmebot/wiki/
Great video mate! Thinking out loud, if I'm using a Virtual WAN - I would assume you just ensure that there is a hub connection from the VNET to the vWAN and it will be able to find resources that way?
Thanks for watching, and sorry for the delay in response. You are correct! As long as the VNET where the GitHub runner NIC is located has routing and firewall access to the target resources, it will be good to go. It will abide by any network policies and configurations such as DNS that you have applied to the network it joins 👍
I created POST and GET requests to HTTPS endpoints using dynamic data. All thanks to this tutorial.
Awesome, love to hear that!
One of the best!!
Great, Matt. Can you please refer me to the documentation for creating API keys in AWS Route 53, as you did for Cloudflare? Thanks in advance.
Hi there! Thanks for watching. There is some information on the wiki page of the tool for Route 53, linked below. Otherwise this might be a good use case to get an LLM to help with the specific steps you're looking for? github.com/shibayan/keyvault-acmebot/wiki/DNS-Provider-Configuration#amazon-route-53 Hope that helps.
Hey Matt, The tutorial is really awesome. You have covered everything in an hour-long video. I liked the way that you have also added some intentional common mistakes which can happen during the setup, such as configuring the runners into the default group instead of the one that needs to be used, which is eventually going to deploy NIC cards. Overall, it is really very easy to follow.
Thank you mate, glad you liked it 🙂
Hi Matt, thank you for the great content. I get a "MalformedURLException" error after running tests with a JMX file. Do you have any idea about this?
I haven't come across that myself, sorry. There seems to be a bit of information out there about that error and JMX files. Have you gotten far with general troubleshooting, or even with an LLM like ChatGPT?
I also got MalformedURLException. I used your sample directly and used it from git. It appears as if somehow the environment variables are not being picked up. I checked the UDVs and they are correct. Any settings in Azure to somehow let it pick them up?
great explanation
Glad it was helpful!
That's an amazing walkthrough... Very informative and to the point...
Thank you for watching and taking the time to comment - I really appreciate the feedback!
I get why it works for Azure (considering that the GitHub Hosted runners already live there), but it would be great to get integration to networks on other clouds, so there could be a consistent pattern.
100% agree! I’m sure they’ll see a big uptake in this integration, and can then hopefully bring it to other cloud platforms too.
Thx. Great video.
Thanks for watching! I’m glad it was helpful.
Whenever I run the test in Azure Load Testing, I get a "Resource Not Found" error.
Hi there. Apologies, I missed this comment. I’m not sure what is going on, sorry. Are you able to engage Azure support to help troubleshoot?
Thank you Matt for this content. I purchased your "Zero to Hero" course. I noticed that there are a few videos still in production. Are you planning to finish them any time soon?
Thanks for the support! Yes, absolutely aiming to keep the content for the course rolling out. Unfortunately the last 6 months threw a few hurdles my way and caused a delay in production. Planning is well underway for the next section!
This is amazing!! Shout out to the Aussie and the Github creator!!
Thank you, glad you enjoyed it!
Came back to add an update, want to thank you again Matt, this tutorial was really great, I've managed to implement ACMEbot with a custom domain managed in Azure public DNS, along with integrating the key vault with two IIS servers using the Azure Keyvault Extension which runs on the windows servers and will periodically update the certs used on the server from those in the key vault. We now have fully automated certs for our custom web domain / iis servers.
Woo! That's a fantastic solution, great work, and I'm glad this helped you achieve a hands off, low cost automated solution :) Thanks for sharing the update, I love hearing when people put this sort of thing in to practice!
Hello @Saqibss, how did you link Azure DNS to the function app? What did you add in the app settings of the function app? It's not working for me.
@@flapa2010 see the documentation for ACMEbot and Azure DNS. Open the Access Control (IAM) of the target DNS zone, or the resource group containing the DNS zone, and assign the DNS Zone Contributor role to the deployed function application.
@@flapa2010 check the documentation you need to add the Subscription ID that the DNS zone resides in under "Acmebot:AzureDns:SubscriptionId" and then ensure the function app has the DNS contributor permission on the DNS zone (under IAM).
Azure downloads are very slow, I am trying to download a 5.5 GB archive and I am getting 150 kb per second... :/ I DON'T RECOMMEND AZURE.
That's the best tutorial on SSL certificate automation I've found using the stack I was interested in. Thank you very much
Thanks for the feedback, I’m glad it was helpful!
Can we use HTTP-01 validation for subdomains? A redirect rule in Application Gateway for the ACME challenge that checks a static file in a storage account where Let's Encrypt can update the key. I need wildcards and also single certificates for subdomains, and there's not a solution that covers both and saves the certs to Key Vault.
I’m not sure about the specifics of that one, sorry.
Will it auto-renew the certificate once expiry is nearby? If yes, how many days before expiry does it consider a cert due for renewal?
Hey! Yep, the solution will automatically renew certificates 30 days before their expiry - github.com/shibayan/keyvault-acmebot/wiki/Frequently-Asked-Questions#automatic-renew-an-existing-certificate Hope this helps!
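If you want to check that 30-day window yourself, here's a minimal sketch with the Azure SDK for Python; the vault URL is a placeholder and the threshold simply mirrors the renewal behaviour described above.
```python
# Minimal sketch: list Key Vault certificates expiring within 30 days,
# mirroring the renewal window mentioned above. The vault URL is a placeholder
# and the identity running this needs certificate list/read permissions.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.keyvault.certificates import CertificateClient

client = CertificateClient(
    vault_url="https://my-demo-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)

cutoff = datetime.now(timezone.utc) + timedelta(days=30)
for props in client.list_properties_of_certificates():
    if props.expires_on and props.expires_on < cutoff:
        print(f"{props.name} expires on {props.expires_on:%Y-%m-%d}")
```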
Hi Matt, what is the best way to mitigate the risk of the DNS provider credentials being compromised? Will this solution work together with acme-dns?
Hey Simon. Are you referring to the protection of the API key being used to access your DNS provider? The best course is to store the API key as a secret in Key Vault, and then reference that secret from the function app. For example, the app setting "Acmebot:Cloudflare:ApiToken" on the function app could be set to reference the key vault secret containing the API Key, rather than pasting it directly in to the value (like I did in the video). Does that help?
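To illustrate that pattern (the names are placeholders, and this is a sketch rather than the exact steps from the video), updating the app setting to a Key Vault reference with the Azure SDK for Python could look like this:
```python
# Minimal sketch: point the Acmebot Cloudflare token app setting at a Key
# Vault reference instead of a plain value. All names (subscription, resource
# group, function app, vault, secret) are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient
from azure.mgmt.web.models import StringDictionary

subscription_id = "<subscription-id>"
resource_group = "rg-acmebot-demo"
function_app = "func-acmebot-demo"

client = WebSiteManagementClient(DefaultAzureCredential(), subscription_id)

# Read the existing settings so the update doesn't drop anything.
settings = client.web_apps.list_application_settings(resource_group, function_app)
settings.properties["Acmebot:Cloudflare:ApiToken"] = (
    "@Microsoft.KeyVault(SecretUri=https://my-demo-vault.vault.azure.net/secrets/CloudflareApiToken/)"
)

client.web_apps.update_application_settings(
    resource_group,
    function_app,
    app_settings=StringDictionary(properties=settings.properties),
)
```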
Great video Matt, thanks a lot!
My pleasure! Thank you for watching!
Thank you Matt for this excellent content. I especially appreciate that you explain "small" nuances, such as @11:40 where you explain the difference between "requiredonly" and "all" parameters, where you even explain (just with a short sentence) what "requiredonly" parameters mean (parameters without default values). These little things help me make sure I understand what it means to have requiredonly and all parameters. Keep up the good work.
Hey Patrick. Thanks for the comment and the feedback. I really appreciate you took the time to let me know - as a content creator that really helps a lot! Glad the video was useful for you 🙂
You saved me a bunch of time! Thank you!
I love to hear that! Thank you for watching and I’m glad it helped.
This solution is not cost effective. For each renewal of a certificate in Key Vault, Microsoft charges $3.00. If a Let's Encrypt certificate has to be renewed 4 times a year, you end up paying Key Vault charges of $12 for each certificate. Check the documentation for pricing of Azure Key Vault.
Hi there. Sorry about the delay in response, I missed this comment. The $3 renewal is not relevant with this solution - that’s applicable when Key Vault itself is processing the renewal. This solution performs the renewal outside of key vault, and is just using key vault to store the certificate. Hope that helps!
This works perfectly. Thanks Allford.
Awesome! Glad it was helpful!
Great video! Very useful and highly informative. Thank you 🙂
Thank you for watching! I’m glad it helped.
Very good explanation, well done! A couple of things though: 1) please do more if you're able to, and stay focused on one thing at a time. I know how difficult it is to create content with a full-time job, but you have a golden voice and a way of explaining concepts that the vast majority of people on YouTube just don't have. 2) try to create 10 min videos without sacrificing the level of detail; this is not for me, it is for the younger generation who run like a chicken from one video to another, plus for the YouTube algorithm to help you get a better reach. Finally, please either replace the tree behind you or treat it, in your last video the tree looked greener, but I know your trick ✂😉
Thanks Wasim, appreciate the watch and the feedback! At the end of the day, the content I create here is something I do to share knowledge and try to deep dive in to various topics. If the next generation or the algorithm aren’t happy with that, then that’s ok 🙂 I try to create content I’d enjoy consuming. There’s definitely a backlog of content to create (over 40 topics / ideas), but as you noted there are often more important things that demand my time.
@MattAllford thanks for your quick reply. I agree and disagree with you at the same time; I realise that when you do something like YouTube, you usually want to talk about subjects your way, leaving behind the constraints we often have in our work environments; however, as the opportunist I am, I believe that you have all the qualities that can make you huge in the field on YouTube or any other platform. I really hope to see more of your content soon.
Great work! Clear and concise instructions. Thanks for that. 😍
Glad it was helpful!
Great video Matt. Is there a Q&A board for asking questions on Azure Load Testing? The Microsoft Q&A board for Azure Load Testing is dead. I have several issues with my JMeter script: it works well locally but chokes in Azure Load Testing. The documentation is useless. There are no examples / documentation on how to ensure that cookies are extracted and passed in with every subsequent request after logging in. I have an HTTP Cookie Manager at the plan level in the JMX and that works great locally, and I can test all areas of the website. But since the cookie extraction doesn't work, I can't get past the login process in Azure Load Testing. No help to be found anywhere. Azure Load Testing is not ready for prime time.
Thanks for watching! Sorry to hear the docs are falling short. Do you have an active subscription with a support agreement? It could be worth opening a service request via the Azure Portal. Otherwise, I believe the team moved from GitHub issues over to the VS developer community for load testing ideas, feedback, and issues. URL is below, I hope this helps! developercommunity.visualstudio.com/loadtesting
Great tip!
Hi, I've checked out your video and it is so helpful for the automation. I was wondering, is there any way to add multiple DNS zones to one function app?
Hi there, sorry I did not see this comment earlier. I’m not immediately aware of the ability to add multiple DNS zones to a single function app, but I can see why that’s a valid request. I’d suggest logging an issue on the GitHub page to see if that functionality is available today, and if not then make it a feature request!