Tag Archives: Azure

Azure: Why is my OS disk bigger than I asked for?

When spinning up a VM from a Marketplace image using the Azure Portal you don’t get a choice of OS disk size, and if you specify a size in an API call it’s ignored. For example, when deploying Ubuntu images a 32 GB default OS disk is always created.

This is because the size is defined in that marketplace template. We can use the Azure CLI to pull out this information.

az vm image list
returns a list of Marketplace Images. Then:

az vm image show --urn "Canonical:UbuntuServer:18.04-LTS:latest"
returns:
{
 "automaticOsUpgradeProperties": {
    "automaticOsUpgradeSupported": true
 },
 "dataDiskImages": [],
 "hyperVgeneration": "V1",
 "id": "/Subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/Providers/Microsoft.Compute/Locations/westus/Publishers/Canonical/ArtifactTypes/VMImage/Offers/UbuntuServer/Skus/18.04-LTS/Versions/18.04.201911130",
 "location": "westus",
 "name": "18.04.201911130",
 "osDiskImage": {
     "operatingSystem": "Linux",
     "sizeInBytes": 32213303808,
     "sizeInGb": 31
  },
 "plan": null,
 "tags": null
}

The “sizeInGb” entry shows us that a 31 GB OS disk is part of the template provided by Canonical. Other templates are similar: CentOS is 1 GB smaller at 30 GB, and RHEL is 64 GB.
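As a side note, the two size fields in that output are consistent with each other: 32,213,303,808 bytes is just over 30 GiB, which appears to be rounded up to the 31 reported in “sizeInGb”. A quick shell check of the arithmetic:

```shell
# Check the byte-to-GiB maths from the image metadata above.
size_bytes=32213303808           # "sizeInBytes" from az vm image show
gib=$((1024 * 1024 * 1024))      # bytes per GiB
echo "whole GiB: $((size_bytes / gib))"       # 30
echo "leftover bytes: $((size_bytes % gib))"  # ~1 MB over, hence the reported 31
```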

If a smaller OS disk is required then a custom template can be used in place of the Marketplace one, but there’s a certain level of maintenance required to keep that up to date.

PowerShell Get-Command: finding the cmdlet

A recent Slack chat reminded me that PowerShell’s Get-Command cmdlet is a good way of finding what commands to use when you encounter a new problem. However, it goes beyond typing “Get-Command” and getting a huge list back – my laptop just gave me 7,659 commands to choose from – which can be unusable. Here are some quick tips on focussing your search by using the built-in arguments.

1. -Module

PowerShell and its extensions are composed of modules. If you want to use the cmdlets for interacting with a VMware environment, you install their “PowerCLI” module. Get-Command can return just the cmdlets from a specific module; for example, we can list all the cmdlets from the VMware modules:

Get-Command -Module VMware.*

Or we can list the commands in the Azure Compute PowerShell module:

Get-Command -Module Az.Compute

2. -Verb

If you’ve used PowerShell before, you’ll know that cmdlet names all follow the format verb (“a doing word”, as I was taught at school), followed by a dash, followed by a noun. So we have Measure-Object, Remove-Disk, and even Get-Command itself. The -Verb argument can be used to only show us cmdlets with a given verb; for example, to see only the “Get” cmdlets we use:

Get-Command -Verb Get

3. -Noun

After the dash we have the noun: a disk, a network connection, a user account, and so on. To find all the cmdlets that work on or with services:

Get-Command -Noun Service

4. Combining the above

Of course we can make this even more powerful by combining these arguments together and with wildcards. Let’s say we want to find all the cmdlets for working with VMware vSphere tags:

Get-Command -Module VMware* -Noun *Tag*

Or we can find all the Azure Get commands for working with resources, resource groups, resource locks, and so on:

Get-Command -Module Az.* -Verb Get -Noun *resource*

Azure: Email a Backup Report with PowerShell and Office365

This PowerShell snippet compiles a daily report of backup jobs on all the Recovery Services Vaults within the current subscription. It then uses the Office 365 SMTP server to mail this report out to chosen recipients – if you’re not using O365 then just change the SmtpServer, Port, and UseSsl arguments as appropriate in the Send-MailMessage cmdlet.

$Jobs = foreach ($RSV in Get-AzRecoveryServicesVault) {
    Get-AzRecoveryServicesBackupJob -VaultId $RSV.ID -Operation "Backup" `
        -From ((Get-Date).AddDays(-1).ToUniversalTime()) |
        Select-Object WorkloadName, Operation, Status, StartTime, EndTime, Duration
}
# Build the HTML body; Out-String flattens ConvertTo-Html's output into a single string
$Body = "<h1>Daily Azure Backup Report: " + (Get-AzSubscription).Name + "</h1>" +
    "<code>" + ($Jobs | ConvertTo-Html | Out-String) + "</code>"
Send-MailMessage -Body $Body -BodyAsHtml -From "[email protected]" `
    -To "[email protected]" -SmtpServer smtp.office365.com -Port 587 `
    -Subject "Azure Backup Report" -UseSsl `
    -Credential (Get-Credential -Message "Office 365 credentials")

If the email should go to multiple recipients then pass an array of addresses as follows:

Send-MailMessage -To @("[email protected]","[email protected]")

Obviously, to automate this you’ll need to feed the credentials in using whatever secure platform you have available, rather than prompting for them in the script. The resulting email looks something like this:
[screenshot: the emailed backup report]
There’s plenty of scope for customisation of the email – the style and look can be changed by manipulating the HTML that’s generated in the snippet, and the information included can be changed by modifying the Select-Object parameters.

Azure Arc Announcement

Microsoft released an 87-page “Book of New” listing the announcements from this week’s Ignite conference, and right at the top is Azure Arc. It’s not just alphabetical order that puts this new product there; in my opinion this is a real step forward by Microsoft towards fulfilling the early promise of their Azure Hybrid Cloud model.

Arc’s first feature provides the ability to run Azure data services – Azure SQL Server and friends – on any platform, be it on-premises, on an edge device, or in the public cloud. We saw VMware advertising this from their point of view in the VMworld Europe keynote this week. Bringing Platform-as-a-Service to your own platform, or to another cloud provider’s, is an interesting concept and vital to the idea of a true hybrid environment where you can run any app on any cloud.

Whilst Azure Stack provided “Azure consistent hardware” in your datacentre, Azure Arc continues this journey – in essence expanding what “Azure consistent” means to the customer in terms of data services.

Azure Arc also extends the security, governance, and management features of Azure into other environments – coming back to a single architecture.

Azure hybrid innovation anywhere infographic

For me this is the key feature of this technology. With Azure Arc sitting at the heart of the Azure Hybrid model, we’re one step closer to that utopia where the datacentre is abstracted away, in the same way that virtualisation abstracted away the server hardware. You can get this abstraction in the public clouds, but there are still workloads with regulatory, financial, or technical reasons for staying on-premises (or even in a different public cloud), and until now managing these alongside Azure has meant two different platforms.


Previously Azure Stack (and to a certain extent Azure Stack HCI) came close to providing this true hybrid functionality for Microsoft, but there was still a disconnect – you have to visit a separate Azure portal to manage your on-premises Azure Stack “region”, for example.

In the Arc environment, an Azure agent is deployed to non-Azure VMs (or physical servers), which then appear in the Azure Portal as regular resources. Policies can be applied and compliance audited (remediation is expected in the “next few months”). The people in your security team who got excited about what was possible with policies in Azure can now apply the same policy features to VMs in your datacentre, and from the same interface.


As I implied above, this is still a journey in progress and I believe Microsoft have further to travel down this roadmap, but it’s definitely a big step along the way, providing very useful features now and the promise of an even brighter future.

As you would expect, there are a number of recorded sessions from Microsoft Ignite 2019 covering this new product following its announcement in the keynotes. If you’re interested in finding out more, I would suggest starting with BRK2208: Introducing Azure Arc. Azure Arc is currently available in preview and usable from the portal today.


Improving Documentation via the Community

Have you ever had to deal with incorrect documentation? Or been frustrated by a typo? Or been annoyed that a how-to guide uses an old version of an interface?

Now you can fix it!

Many software providers are now using community-editable documentation online. This isn’t a Wikipedia-style free-for-all, but a carefully moderated process ensuring that the resulting document is accurate. If you come across an error in an online doc, or even a PowerShell help page, check whether you can submit edits.

Continuous deployment pipelines mean that these edits can make it into live documentation in a matter of hours or days – impressive turnaround if you’ve ever submitted errata for a printed book, or raised a bug request to get online documentation fixed.

docs.Microsoft.com

If you visit a Microsoft Docs page, you’ll see an Edit link at the top of the screen (see (1) in the screenshot below). Clicking on this takes you to a page on GitHub with the source of the document. Click there to edit the file and a fork will be made under your own profile – make your edits and submit a pull request and, once approved, your updates will appear on the original website. You’ll even get a little credit (see (2) in the screenshot below) for your contribution.

[screenshot: docs.microsoft.com article showing the Edit link (1) and contributor credits (2)]

In this particular example I was following the step-by-step guide and noticed that the wording in the document no longer matched the Azure Portal. I was quickly able to suggest a fix and later that day the page was updated and anyone else following the instructions wouldn’t be misled. Two minutes of my time hopefully saved ten minutes of head-scratching by someone else.
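Behind that Edit button the mechanics are ordinary git. Here’s a rough local sketch of the edit-and-commit half of the flow – the fork and pull-request steps happen on GitHub itself, and the branch and file names are just examples:

```shell
# Simulate the repository side of a docs fix: topic branch, edit, commit.
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git -c user.name=demo -c [email protected] commit -q --allow-empty -m "initial import"
git checkout -qb fix-portal-wording                 # work on a topic branch
echo "Updated step-by-step wording" > how-to.md     # the actual edit
git add how-to.md
git -c user.name=demo -c [email protected] commit -qm "Fix wording to match the current Azure Portal"
git log --oneline --no-decorate                     # the fix, ready to push and raise a PR from
```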

VMware PowerCLI Example Scripts

As the name suggests, this is the source code for some example PowerCLI scripts, published by VMware and supported by members of the #vCommunity. If you find an error in the scripts you can pop over to GitHub and correct it – and remember this isn’t just the code of the script, but also its accompanying documentation.

[screenshot: a corrected typo in a PowerCLI script’s help file]

In this example a typo in the Get-Help file was spotted and quickly corrected. Whilst the spelling mistake wasn’t a show-stopper, this shows how quick and easy it is to contribute to these projects without being a coding guru.

Summary

Many of these projects use GitHub, and learning how to use that version control platform isn’t arduous – especially for small changes like these – and is a useful skill to pick up if you don’t already have it. The important message here is that you don’t need to be a developer to contribute to the code.

So, next time you spot a mistake in documentation, see if you can fix it yourself and help the next person who comes along.