Automation–Microsoft Azure Automation–The Evolution of Cloud Automation


Hello Readers!

By now you have likely seen the announcements for Microsoft Azure Automation (currently in preview). If you haven’t, I strongly recommend you check it out. All it takes is a Microsoft Account, Subscription to Microsoft Azure, and a desire to Automate from the Cloud!


Microsoft Azure Automation


First, a small introduction…

From the service description link above, you can read all about Azure Automation, but here are a few other links and references about this new Microsoft Azure Service, to help you get started.

Service Information…

Getting Started…

The System Center Orchestrator Engineering Blog has a great post outlining the first steps you need to know when getting started with this new Microsoft Azure Service: Managing Azure Services with the Microsoft Azure Automation Preview Service

SC Automation Product Team & Community Contributions

As soon as you set up your first Automation account, you will start to see the similarities between Microsoft Azure Automation and Service Management Automation (SMA).

[Image]

That said, everything below the image of the gear is new for Azure Automation!

And the highlighted bit above is where you will find the first batch of available SC Automation Product Team & Community Contributions on Script Center!

As you may have guessed, that is exactly where my first Microsoft Azure Automation Script Center Contribution lives…more on that, below…Read On…


Evolution


I am not saying this is the last stop in the evolution of cloud automation, but it is certainly the next step.

So, what do I mean by evolution?

This isn’t some epic revelation. For me, it comes down to a simple progression of ever-improving technology, which also happened to shift upward to the Cloud.

Let me see if I can call on an image to assist with my meaning…

[Image]

And to drive my point home, here is a bulleted list of chronological blog posts to track this (my) evolution:

  1. June 1st, 2010 – Introduction (my first TechNet Blog Post, introducing Automation with Opalis Integration Server and System Center)
  2. February 21st, 2013 – Orchestrating Windows Azure – Solving the Public Cloud Puzzle with System Center 2012 SP1 (though much time passed and many On-Prem example Runbooks were created, this was my first venture into On-Prem Automation (System Center Orchestrator) of Public Cloud Resources in Azure)
  3. November 14th, 2013 – Automating Hybrid Clouds with Windows Azure and PowerShell (my evolution from System Center Orchestrator to straight PowerShell and PowerShell Workflow – still Automating the Public Cloud (Azure) from On-Prem)
  4. Today – Provision Azure Environment Resources (my latest adventure, this time with the latest technology – Microsoft Azure Automation!)

Cloud Automation


As you may have guessed, my definition of “Cloud Automation” has varied and evolved over time - right along with the technologies and methods that brought my blog posts/examples to bear.

Back in June of 2010, I couldn’t have imagined where I would be today – executing PowerShell Workflow from the Public Cloud. Wow. Back then it was all about On-Prem tools integrating with On-Prem targets. Today, the sky (pun intended) is the limit.

So where am I with all this today? What do I have for you?

My latest creation, an Azure Automation Runbook: Provision Azure Environment Resources (available right now from Script Center!)


Provision Azure Environment Resources


This is where we can see proof of evolution.

As you saw in the bulleted list of chronological blog posts (above), my first venture into Automating the Public Cloud leveraged Orchestrator + The Integration Pack for Windows Azure. My second release leveraged PowerShell and PowerShell Workflow + Windows Azure Cmdlets. And for today’s post, I am leveraging Microsoft Azure Automation + Microsoft Azure Cmdlets.

Let’s get down to the goods. And actually, for the first time in a long time, my published example came out a couple days before the blog post / teaser!


Script Center Contribution and Download

The download is the example: New-AzureEnvironmentResources.ps1

Here is a brief description:

This runbook creates a number of Azure Environment Resources (in sequence): Azure Affinity Group, Azure Cloud Service, Azure Storage Account, Azure Storage Container, Azure VM Image, and Azure VM. It also requires the Upload of a VHD to a specified storage container mid-process.
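
To give a sense of what that sequence involves, here is a rough sketch using the classic (Service Management) Azure cmdlets of that era. This is illustrative only, not the runbook's actual contents (those are in the Script Center download), and every name and value below is a hypothetical placeholder.

# Rough sketch of the creation sequence with the classic Azure (Service Management) cmdlets.
# All names/values below are hypothetical placeholders, not the runbook's actual code.
$ProjectName        = "contosoproj"
$Location           = "East US"
$StorageAccountName = "contosoprojstorage"
$AdminUsername      = "ProjAdmin"
$Password           = "PlaceholderP@ssw0rd"
$UploadedVhdUri     = "https://contosoprojstorage.blob.core.windows.net/vhds/contosoproj.vhd"

New-AzureAffinityGroup -Name "$ProjectName-AG" -Location $Location
New-AzureService -ServiceName "$ProjectName-CS" -AffinityGroup "$ProjectName-AG"
New-AzureStorageAccount -StorageAccountName $StorageAccountName -AffinityGroup "$ProjectName-AG"
Set-AzureSubscription -SubscriptionName "MySubscription" -CurrentStorageAccount $StorageAccountName
New-AzureStorageContainer -Name "vhds"

# ...the VHD upload to the "vhds" container happens mid-process (see "Upload of a VHD" below)...

Add-AzureVMImage -ImageName "$ProjectName-Image" -MediaLocation $UploadedVhdUri -OS "Windows"
New-AzureQuickVM -Windows -ServiceName "$ProjectName-CS" -Name "$ProjectName-VM" -ImageName "$ProjectName-Image" `
    -InstanceSize "Small" -AdminUsername $AdminUsername -Password $Password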

A detailed Description, full set of Requirements, and the actual Runbook Contents are available within the Script Center Contribution (not to mention, the actual download).

Download the Provision Azure Environment Resources Example Runbook from Script Center here:

[Download button]


A bit more about the Requirements…

Runbook Parameters

  • Azure Connection Name

    REQUIRED. Name of the Azure connection setting that was created in the Automation service.
        This connection setting contains the subscription id and the name of the certificate setting that
        holds the management certificate. It will be passed to the required and nested Connect-Azure runbook.

  • Project Name

    REQUIRED. Name of the Project for the deployment of Azure Environment Resources. This name is leveraged
        throughout the runbook to derive the names of the Azure Environment Resources created.

  • VM Name

    REQUIRED. Name of the Virtual Machine to be created as part of the Project.
  • VM Instance Size

    REQUIRED. Specifies the size of the instance. Supported values are as follows, with their (cores, memory):
        "ExtraSmall" (shared core, 768 MB),
        "Small"      (1 core, 1.75 GB),
        "Medium"     (2 cores, 3.5 GB),
        "Large"      (4 cores, 7 GB),
        "ExtraLarge" (8 cores, 14 GB),
        "A5"         (2 cores, 14 GB),
        "A6"         (4 cores, 28 GB),
        "A7"         (8 cores, 56 GB)

  • Storage Account Name

    OPTIONAL. This parameter should only be set if the runbook is being re-executed after an existing
    and unique Storage Account Name has already been created, or if a new and unique Storage Account Name
    is desired. If left blank, a new and unique Storage Account Name will be created for the Project. The
    format of the derived Storage Account Names is:
        $ProjectName (lowercase) + [Random lowercase letters and numbers] up to a total Length of 23
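
For illustration, one way to derive a name in that format might look like the following sketch (not necessarily the runbook's exact implementation; the project name is a placeholder):

# Illustrative sketch of deriving a Storage Account Name in the format described above:
# project name lowercased + random lowercase letters/numbers, up to 23 characters total.
$ProjectName = "Contoso"                                   # hypothetical project name
$prefix      = $ProjectName.ToLower()
$charSet     = "abcdefghijklmnopqrstuvwxyz0123456789".ToCharArray()
$suffix      = -join (1..(23 - $prefix.Length) | ForEach-Object { Get-Random -InputObject $charSet })
$StorageAccountName = $prefix + $suffix
$StorageAccountName                                        # e.g. contoso followed by random characters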


Other Requirements

  • An existing connection to an Azure subscription (requires the Connect-Azure runbook and its pre-requisites)

  • The Upload of a VHD to a specified storage container mid-process. At this point in the process, the runbook will intentionally suspend and notify the user; after the upload, the user simply resumes the runbook and the rest of the creation process continues.

  • Six (6) Automation Assets (to be configured in the Assets tab). These are suggested, but not necessarily required. Replacing the "Get-AutomationVariable" calls within this runbook with static or parameter variables is an alternative method. For this example though, the following dependencies exist:
        VARIABLES SET WITH AUTOMATION ASSETS:
             $AGLocation = Get-AutomationVariable -Name 'AGLocation'
             $GenericStorageContainerName = Get-AutomationVariable -Name 'GenericStorageContainer'
             $SourceDiskFileExt = Get-AutomationVariable -Name 'SourceDiskFileExt'
             $VMImageOS = Get-AutomationVariable -Name 'VMImageOS'
             $AdminUsername = Get-AutomationVariable -Name 'AdminUsername'
             $Password = Get-AutomationVariable -Name 'Password'

Note     The entire runbook is heavily checkpointed and can be run multiple times without resource recreation.
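
In PowerShell Workflow terms, that checkpoint and suspend/resume behavior follows a pattern roughly like this (a simplified sketch, not the published runbook's code):

# Simplified sketch of the checkpoint + suspend/resume pattern (illustrative only).
workflow Sample-CheckpointedProvisioning {

    # ...create the Affinity Group, Cloud Service, Storage Account, and Container...
    Checkpoint-Workflow     # persist progress; a resumed or re-run job picks up from here

    Write-Output "Upload the VHD to the storage container, then resume this runbook."
    Suspend-Workflow        # the job suspends here until the user resumes it

    # ...create the VM Image and the VM from the uploaded VHD...
    Checkpoint-Workflow
}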


Upload of a VHD

Waaaaait a minute! That seems like a pretty big step, how am I going to accomplish that?

I am so glad you asked.

Remember back in this post (Automation–Automating Hybrid Clouds with Windows Azure and PowerShell (Part 3): Public Cloud Environment Provisioning PowerShell Workflow Examples) where I illustrated how to provision Azure resources with PowerShell Workflow? Buried within those example scripts was a workflow for Upload-LocalVHDtoAzure. This is what I suggest you use, but it is completely up to you.

To make this easier (for all of us), I created a separate PowerShell Workflow Script to take care of this step. In fact, it is the same one I used during the creation and testing of New-AzureEnvironmentResources.ps1.

Here it is (the contents of a file I called Upload-LocalVHDtoAzure.ps1):

param
(
    [Parameter(Mandatory=$true)]
    [string]$AzureSubscriptionName,
    [Parameter(Mandatory=$true)]
    [string]$ProjectName,
    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName
)

workflow Upload-LocalVHDtoAzure {

    param
    (
        [string]$StorageContainerName,
        [string]$VHDName,
        [string]$SourceVHDPath,
        [string]$DestinationBlobURI,
        [bool]$OverWrite
    )

    $AzureSubscriptionForWorkflow = Get-AzureSubscription

    # Check whether the VHD blob already exists in the target container
    $AzureBlob = Get-AzureStorageBlob -Container $StorageContainerName -Blob $VHDName -ErrorAction SilentlyContinue

    if(!$AzureBlob -or $OverWrite) {

        # Upload (or overwrite) the local VHD to the destination blob URI
        $AzureBlob = Add-AzureVhd -LocalFilePath $SourceVHDPath -Destination $DestinationBlobURI -OverWrite:$OverWrite
    }

    Return $AzureBlob

}

# Source and destination naming for the VHD upload
$GenericStorageContainerName = "vhds"

$SourceDiskName = "toWindowsAzure"
$SourceDiskFileExt = "vhd"
$SourceDiskPath = "D:\Drop\Azure\toAzure"
$SourceVHDName = "{0}.{1}" -f $SourceDiskName,$SourceDiskFileExt
$SourceVHDPath = "{0}\{1}" -f $SourceDiskPath,$SourceVHDName

$DestinationVHDName = "{0}.{1}" -f $ProjectName,$SourceDiskFileExt
$DestinationVHDPath = "https://{0}.blob.core.windows.net/{1}" -f $StorageAccountName,$GenericStorageContainerName
$DestinationBlobURI = "{0}/{1}" -f $DestinationVHDPath,$DestinationVHDName
$OverWrite = $false

# Select the subscription and point it at the target Storage Account
Select-AzureSubscription -SubscriptionName $AzureSubscriptionName
Set-AzureSubscription -SubscriptionName $AzureSubscriptionName -CurrentStorageAccount $StorageAccountName

# Run the upload workflow as a job and stream its results back
$AzureBlobUploadJob = Upload-LocalVHDtoAzure -StorageContainerName $GenericStorageContainerName -VHDName $DestinationVHDName `
    -SourceVHDPath $SourceVHDPath -DestinationBlobURI $DestinationBlobURI -OverWrite $OverWrite -AsJob
Receive-Job -Job $AzureBlobUploadJob -AutoRemoveJob -Wait -WriteEvents -WriteJobInResults

Note     This is just one method of uploading a VHD to Azure for a specified Storage Account. I have parameterized the entire script so it could be run from the command line as a PS1 file. Obviously you can do with this as you please.
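
For example, an invocation from an On-Prem PowerShell session might look like this (all three values are placeholders):

# Hypothetical example invocation (placeholder values):
.\Upload-LocalVHDtoAzure.ps1 -AzureSubscriptionName "MySubscription" `
    -ProjectName "ContosoProj" -StorageAccountName "contosoprojstorage"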

Important     This script has to be run On-Prem, likely where the VHD you intend to upload exists. All On-Prem pre-requisites for connection to Azure apply (see this blog post for more details).
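
If that machine has never been connected to your Azure subscription before, one common way to satisfy those pre-requisites at the time was the publish settings file flow with the classic Azure PowerShell module (the linked post covers the details). A minimal sketch, with a hypothetical file path:

# Minimal sketch of connecting an On-Prem machine to Azure (classic Azure module):
Import-Module Azure
Get-AzurePublishSettingsFile                     # opens a browser to download the .publishsettings file
Import-AzurePublishSettingsFile "C:\Temp\MySubscription.publishsettings"   # hypothetical path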


Testing and Proof of Execution

I figured you might want to see the results of my testing during my development of the Provision Azure Environment Resources example…so here are some screen captures from the Azure Automation interface:

Dashboard

[Screen capture]

Runbooks

[Screen capture]

Assets

[Screen capture]

Azure All Items View

You know, to prove that I created something with these scripts…

[Screen capture]


Thanks for checking out my blog post! For more information, tips/tricks, and example solutions for Automation within System Center, Windows Azure Pack, Microsoft Azure, etc., be sure to check out the other blog posts from Building Clouds in the Automation Track!

enJOY!

