The following README will guide you on how to use the provided Terraform plan to deploy a Windows Server with Microsoft SQL Server 2019 (Developer edition) in an Amazon Web Services (AWS) EC2 instance and connect it as an Azure Arc enabled SQL Server resource.
By the end of this guide, you will have an AWS EC2 instance running Windows Server 2019 with SQL Server 2019, projected as an Azure Arc enabled SQL Server, and a running SQL assessment with data injected into an Azure Log Analytics workspace.
- Clone this repo:

  ```shell
  git clone https://github.com/microsoft/azure_arc.git
  ```

- Install or update Azure CLI. Azure CLI should be running version 2.7 or later. Use `az --version` to check your currently installed version.
- Create a free Amazon Web Services account if you don't already have one.
- Create an Azure Service Principal (SP).

  To connect the EC2 instance to Azure Arc, an Azure Service Principal assigned the "Contributor" role is required. To create it, log in to your Azure account and run the commands below (this can also be done in Azure Cloud Shell).

  ```shell
  az login
  az ad sp create-for-rbac -n "<Unique SP Name>" --role contributor
  ```

  For example:

  ```shell
  az ad sp create-for-rbac -n "http://AzureArcServers" --role contributor
  ```

  Output should look like this:

  ```json
  {
    "appId": "XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    "displayName": "AzureArcServers",
    "name": "http://AzureArcServers",
    "password": "XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    "tenant": "XXXXXXXXXXXXXXXXXXXXXXXXXXXX"
  }
  ```

  Note: It is optional but highly recommended to scope the SP to a specific Azure subscription and resource group.
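Since the `appId`, `password`, and `tenant` values from this output feed the Terraform environment variables later in this guide, you may want to script their extraction instead of copying them by hand. A minimal sketch, assuming the single-line JSON shape shown above (the sample values and the `json_field` helper are illustrative, not part of the plan):

```shell
# Hard-coded sample for illustration; in practice capture the real output,
# e.g. sp_json=$(az ad sp create-for-rbac -n "<Unique SP Name>" --role contributor)
sp_json='{ "appId": "11111111-1111-1111-1111-111111111111", "displayName": "AzureArcServers", "password": "sample-password", "tenant": "22222222-2222-2222-2222-222222222222" }'

json_field() {
  # Naive single-line JSON extraction; fine for this shape, prefer jq for real work
  printf '%s' "$sp_json" | sed -n "s/.*\"$1\": *\"\([^\"]*\)\".*/\1/p"
}

export TF_VAR_servicePrincipalAppId=$(json_field appId)
export TF_VAR_servicePrincipalSecret=$(json_field password)
export TF_VAR_servicePrincipalTenantId=$(json_field tenant)
echo "appId=$TF_VAR_servicePrincipalAppId tenant=$TF_VAR_servicePrincipalTenantId"
```

The exported names match the Terraform variables used later in this guide, so the values are ready when you reach the export step.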
- Create an AWS user IAM key. An access key grants programmatic access to your resources, which we will be using later on in this guide.

  - Navigate to the IAM Access page.
  - Select Users from the side menu.
  - Select the user you want to create the access key for.
  - Select the Security credentials tab for the selected user.
  - Under Access keys, select Create access key.
  - The popup window will show you the Access key ID and Secret access key. Save both of these values to configure the Terraform plan variables later.
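If you already keep the key pair in a standard AWS credentials file, you can load it from there instead of pasting the values inline. A minimal sketch, assuming the usual `[default]` profile INI format (a sample file is created here purely for illustration):

```shell
# Illustrative sample file; in practice point at your real ~/.aws/credentials
cat > /tmp/sample_aws_credentials <<'EOF'
[default]
aws_access_key_id = AKIAEXAMPLEKEY
aws_secret_access_key = examplesecret
EOF

# Read each key out of the INI-style file and export it for Terraform
export TF_VAR_AWS_ACCESS_KEY_ID=$(sed -n 's/^aws_access_key_id *= *//p' /tmp/sample_aws_credentials)
export TF_VAR_AWS_SECRET_ACCESS_KEY=$(sed -n 's/^aws_secret_access_key *= *//p' /tmp/sample_aws_credentials)
echo "Loaded key id: $TF_VAR_AWS_ACCESS_KEY_ID"
```

This keeps the secret out of your shell history; the exported names match the Terraform variables used later in this guide.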
For you to get familiar with the automation and deployment flow, below is an explanation.

- The user exports the Terraform environment variables (a one-time export), which are used throughout the deployment.

- The user executes the Terraform plan, which deploys the EC2 instance and also:

  - Creates an administrator Windows user account, enables WinRM on the VM, and changes the Windows computer name.

  - Generates and executes the sql.ps1 script. This script will:

    - Install the Azure CLI, Azure PowerShell module, and SQL Server Management Studio (SSMS) Chocolatey packages.

    - Create a runtime logon script (LogonScript.ps1) that runs upon the user's first logon to Windows. The runtime script will:

      - Install SQL Server Developer edition
      - Enable the SQL TCP protocol on the default instance
      - Create a SQL Server Management Studio desktop shortcut
      - Restore the AdventureWorksLT2019 sample database
      - Onboard both the server and SQL Server to Azure Arc
      - Deploy an Azure Log Analytics workspace
      - Install the Microsoft Monitoring Agent (MMA)
      - Enable Log Analytics solutions
      - Deploy the MMA Azure extension ARM template from within the VM
      - Configure the SQL assessment

    - Disable and prevent Windows Server Manager from running on startup

- Once the Terraform plan deployment has completed, upon the user's initial RDP login to Windows, the LogonScript.ps1 script will run automatically and execute all of the above.
Before executing the Terraform plan, you must set the environment variables which will be used by the plan. These variables are based on the Azure Service Principal you've just created, your Azure subscription and tenant, and your AWS account.
- Retrieve your Azure subscription ID and tenant ID using the `az account list` command.
The Terraform plan creates resources in both Microsoft Azure and AWS. It then executes a script on the virtual machine to install all the necessary artifacts.

Both the script and the Terraform plan itself require certain information about your AWS and Azure environments. Edit the variables according to your environment and export them using the commands below:

```shell
export TF_VAR_subId='Your Azure Subscription ID'
export TF_VAR_servicePrincipalAppId='Your Azure Service Principal App ID'
export TF_VAR_servicePrincipalSecret='Your Azure Service Principal App Password'
export TF_VAR_servicePrincipalTenantId='Your Azure tenant ID'
export TF_VAR_location='Azure Region'
export TF_VAR_resourceGroup='Azure Resource Group Name'
export TF_VAR_AWS_ACCESS_KEY_ID='Your AWS Access Key ID'
export TF_VAR_AWS_SECRET_ACCESS_KEY='Your AWS Secret Key'
export TF_VAR_key_name='Your AWS Key Pair name'
export TF_VAR_aws_region='AWS region'
export TF_VAR_aws_availabilityzone='AWS Availability Zone region'
export TF_VAR_instance_type='EC2 instance type'
export TF_VAR_hostname='EC2 instance Windows Computer Name'
export TF_VAR_admin_user='Guest OS Admin Username'
export TF_VAR_admin_password='Guest OS Admin Password'
```
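Before running Terraform, it can help to confirm that none of the exports were missed. A minimal sketch (not part of the official plan) that reports any of the variables above that are not yet set in the current shell:

```shell
# List of every variable the Terraform plan expects
required="TF_VAR_subId TF_VAR_servicePrincipalAppId TF_VAR_servicePrincipalSecret
TF_VAR_servicePrincipalTenantId TF_VAR_location TF_VAR_resourceGroup
TF_VAR_AWS_ACCESS_KEY_ID TF_VAR_AWS_SECRET_ACCESS_KEY TF_VAR_key_name
TF_VAR_aws_region TF_VAR_aws_availabilityzone TF_VAR_instance_type
TF_VAR_hostname TF_VAR_admin_user TF_VAR_admin_password"
missing=0
for v in $required; do
  eval "val=\${$v:-}"            # indirect lookup; empty string if unset
  if [ -z "$val" ]; then
    echo "MISSING: $v"
    missing=$((missing + 1))
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "All Terraform variables are set."
else
  echo "$missing variable(s) still need to be exported."
fi
```

Run this in the same shell session where you performed the exports; Terraform only picks up `TF_VAR_*` variables from the environment it is launched in.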
- From the folder within your cloned repo where the Terraform binaries are, run the below commands to download the needed Terraform providers and to run the plan:

  ```shell
  terraform init
  terraform apply --auto-approve
  ```
Once the Terraform plan deployment has completed, a new Windows Server VM will be up and running, and an empty Azure resource group will be created.
- Download the RDP file and log in to the VM (using the values of the TF_VAR_admin_user and TF_VAR_admin_password environment variables), which will initiate the LogonScript run. Let the script run its course; it will close the PowerShell session when completed.

  Note: The script will take ~10-15 minutes to complete.
- Open Microsoft SQL Server Management Studio (a Windows shortcut will be created for you) and validate that the AdventureWorksLT2019 sample database is deployed as well.
- In the Azure portal, notice that you now have an Azure Arc enabled server resource (with the MMA agent installed via an extension), an Azure Arc enabled SQL Server resource, and an Azure Log Analytics workspace deployed.
Now that you have both the server and SQL Server projected as Azure Arc resources, the last step is to initiate the SQL assessment run.
- On the SQL Azure Arc resource, click "Environment Health", followed by clicking "Download configuration script".

  Since the LogonScript run in the deployment step took care of deploying and installing the required binaries, you can safely delete the downloaded AddSqlAssessment.ps1 file.

  Clicking "Download configuration script" simply sends a REST API call to the Azure portal, which makes "Step3" available and results in a grayed-out "View SQL Assessment Results" button.
- After a few minutes, you will notice that the "View SQL Assessment Results" button becomes available to click. At this point, the SQL assessment data and logs are being injected into Azure Log Analytics.

  Initially, the amount of data will be limited, as it takes a while for the assessment to complete a full cycle, but after a few hours you should see much more data coming in.
To delete the environment, use the `terraform destroy --auto-approve` command, which will delete the AWS and Azure resources.