
Add support for SAS URL in Read-DbaBackupHeader and Get-DbaBackupInformation #9289

Conversation

@david-garcia-garcia (Contributor) commented Mar 25, 2024

Type of Change

New feature

Unless I am missing something, it is currently not possible to automatically restore a database from a collection of backups stored in Azure Blob Storage. With the fixes in this PR, I hope to build restore automation that supports backups written to Azure Blobs by the Hallengren backup solution.

Purpose

Add support for Shared Access Signature (SAS) URLs in Get-DbaBackupInformation and Read-DbaBackupHeader.

Approach

Small tweaks to support URLs with SAS tokens. See comments in code.

Commands to test

Read-DbaBackupHeader and Get-DbaBackupInformation

# $uri, $pathSegments and $storageAccountName come from parsing the SAS URL
$containerName = $pathSegments[0]
$prefix = if ($pathSegments.Length -gt 1) { $pathSegments[1] } else { "" }
$sasToken = $uri.Query.TrimStart('?')
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
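For context, a minimal self-contained sketch of how those variables could be derived from a container-level SAS URL and used to enumerate blobs (assumes the Az.Storage module; the URL, account, container and prefix names are placeholders, not values from this PR):

```powershell
# Sketch: enumerate blobs behind a container SAS URL (placeholder values throughout).
$url = "https://myaccount.blob.core.windows.net/backups/FULL?sv=...&sig=..."
$uri = [System.Uri]$url

# Account name is the first label of the host (myaccount.blob.core.windows.net).
$storageAccountName = $uri.Host.Split('.')[0]

# First path segment is the container; anything after it is an optional blob prefix.
$pathSegments = $uri.AbsolutePath.TrimStart('/').Split('/', 2)
$containerName = $pathSegments[0]
$prefix = if ($pathSegments.Length -gt 1) { $pathSegments[1] } else { "" }

# The query string without the leading '?' is the SAS token.
$sasToken = $uri.Query.TrimStart('?')

$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
Get-AzStorageBlob -Container $containerName -Context $ctx -Prefix $prefix
```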
@david-garcia-garcia (author) commented on this change:

This was the only way I could find to enumerate the blobs in a container through PowerShell. I really don't like the approach, but it is the only one I could come up with.

@@ -128,7 +128,15 @@ function Read-DbaBackupHeader {
        $restore = New-Object Microsoft.SqlServer.Management.Smo.Restore

        if ($DeviceType -eq 'URL') {
-           $restore.CredentialName = $AzureCredential
+           if (-not [String]::IsNullOrWhiteSpace($AzureCredential)) {
@david-garcia-garcia (author) commented on this change:

The deal here is that Read-DbaBackupHeader actually works today: to use a SAS URL, you have to strip the SAS token from the URL and register the SAS credential in SQL Server. This change is for convenience; you either use an explicit credential or a SAS URL, and if it is a SAS URL, SQL Server wants it without the SAS token.
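A minimal sketch of what "without the SAS token" means in practice (the URL is a placeholder; `GetLeftPart` is one way to drop the query string before handing the URL to SMO):

```powershell
# Sketch: strip the SAS token from a blob URL (placeholder URL).
# SQL Server matches the tokenless URL against a credential configured on the instance.
$blobUrl = [System.Uri]"https://myaccount.blob.core.windows.net/backups/db.bak?sv=...&sig=..."
$urlWithoutSas = $blobUrl.GetLeftPart([System.UriPartial]::Path)
# $urlWithoutSas is now https://myaccount.blob.core.windows.net/backups/db.bak
```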

@@ -273,6 +273,24 @@ function Get-DbaBackupInformation {
                    Write-Message -Level VeryVerbose -Message "File"
                    $Files += $f.FullName
                }
+           } elseif ($f -match "^http") {
@david-garcia-garcia (author) commented on this change:

I can foresee a nice WTF moment if you call this method with a container SAS URL but have NOT created the SAS credential in SQL Server: this method will properly enumerate the individual blobs, but Read-DbaBackupHeader will then fail with a permission error (the token could list the blobs, but SQL Server could not read them).
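To avoid that failure, the SAS credential has to exist on the instance before the headers are read. A hedged sketch of registering one via dbatools (container URL, token and instance are placeholders; for SAS auth, SQL Server requires the credential name to be the container URL and IDENTITY to be the literal string 'SHARED ACCESS SIGNATURE'):

```powershell
# Sketch: register a container-level SAS credential in SQL Server (placeholder values).
$containerUrl = "https://myaccount.blob.core.windows.net/backups"
$sasToken = "sv=...&sig=..."   # SAS token without the leading '?'

$sql = @"
CREATE CREDENTIAL [$containerUrl]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '$sasToken';
"@

Invoke-DbaQuery -SqlInstance $sqlInstance -Query $sql
```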

@david-garcia-garcia (author) commented:

No need for any of this; you can extract the URLs manually and pass them to Get-DbaBackupInformation:

$ctx = New-AzStorageContext -StorageAccountName $backupUrl.storageAccountName -SasToken $backupUrl.sasToken
$blobs = Get-AzStorageBlob -Container $backupUrl.container -Context $ctx -Prefix $backupUrl.prefix |
    Where-Object { ($_.AccessTier -ne 'Archive') -and ($_.Length -gt 0) }

$blobUrls = $blobs | ForEach-Object { $backupUrl.baseUrl + $_.Name }
$files = Get-DbaBackupInformation -SqlInstance $sqlInstance -Path $blobUrls | Where-Object { $_.Database -eq $databaseName }
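From there, the collected backup history can presumably be piped into Restore-DbaDatabase to complete the automation (a sketch, not part of this PR; instance and database names are placeholders, and -TrustDbBackupHistory tells dbatools to reuse the header information already gathered):

```powershell
# Sketch: restore from the enumerated backup information (placeholder values).
$files | Restore-DbaDatabase -SqlInstance $sqlInstance -DatabaseName $databaseName -TrustDbBackupHistory
```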
