Saturday, August 15, 2020

On-premises SQL Server – Database Backup to Azure Blob Storage

As part of a hybrid solution, and to reduce local backup storage costs while increasing backup availability, many companies use cloud storage for their backup solutions. Starting with SQL Server 2012, Microsoft supports writing on-premises database backups directly to Azure Blob Storage.

Azure Storage Services:

Microsoft Azure introduced the general-purpose v2 storage account a while back, addressing several issues and adding new features. Currently, Azure Storage provides five different types of storage accounts; according to the Microsoft documentation, the offerings are:

  • General-purpose v2 accounts: Basic storage account type for blobs, files, queues, and tables. Recommended for most scenarios using Azure Storage.
  • General-purpose v1 accounts: Legacy account type for blobs, files, queues, and tables. Use general-purpose v2 accounts instead when possible.
  • BlockBlobStorage accounts: Storage accounts with premium performance characteristics for block blobs and append blobs. Recommended for scenarios with high transaction rates, or scenarios that use smaller objects or require consistently low storage latency.
  • FileStorage accounts: Files-only storage accounts with premium performance characteristics. Recommended for enterprise or high performance scale applications.
  • BlobStorage accounts: Legacy Blob-only storage accounts. Use general-purpose v2 accounts instead when possible.

Three Types of Blobs:

In Azure, there are three different types of blobs: block blobs, append blobs, and page blobs. Once a blob is created in a container, its type cannot be changed. Briefly, the differences are:

  • Block Blobs: For large objects that don't use random read and write operations, for example - photos, videos, backup files.
  • Page Blobs: Optimized for random read and write operations, for example - VHD files.
  • Append Blobs: Optimized for append operations, for example - log files.

Which blob type to choose for SQL Server database backups?

For SQL Server backups, general-purpose v2 storage is recommended because it offers more features with predictable performance. Starting with SQL Server 2016, block blobs with Shared Access Signature (SAS) tokens are the recommended approach.

For SQL Server database backups, we can choose either a block blob or a page blob; which one is created depends on the credential used in the T-SQL syntax.

To back up SQL Server databases to Azure Storage, we need either the storage “Access Key” or a “Shared Access Signature” (SAS token).

  • When the Storage “Access Key” is used in the credential, a page blob will be created.
  • When the “Shared Access Signature” is used, a block blob will be created.

To Create Page Blob:

CREATE CREDENTIAL backupCredential
WITH IDENTITY = 'sqlbackup80',   -- the identity value is the Azure Storage Account name
     SECRET = 'SBBRDAFMPv9oTrFOCMatMLFNNxxW2jNuHIuC2qx3e5CsrWsFWtG2oypiiFBiXAXwlIzJewSqLX88txeDi+41uw==';

BACKUP DATABASE <DatabaseName>
TO URL = 'https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/<BackupFileName>.bak'
WITH CREDENTIAL = '<CredentialName>',
     COMPRESSION, STATS = 5;

To Create Block Blob:

CREATE CREDENTIAL [https://sqlbackup80.blob.core.windows.net/srvsql2016]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',   -- this is a mandatory string and should not be changed
     SECRET = 'sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2025-08-14T02:25:28Z&st=2020-08-13T18:25:28Z&spr=https&sig=OcJtUQRKPg7TcqmcGxnjCgSD26GtFRC47sZTbBxsfkY%3D';

BACKUP DATABASE <DatabaseName>
TO URL = 'https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/<BackupFileName>.bak'
WITH COMPRESSION, STATS = 5;

Block Blob – Backing up to URL: Block blob backups are substantially faster than page blob backups. However, backing up databases to a URL has some limitations. Before developing a SQL Server backup solution for Azure cloud storage, the following criteria must be considered:

  • Block blobs can only be used with SQL Server 2016 or later.
  • SQL Server backups to premium storage are not supported.
  • General-purpose v2 storage is recommended for block blobs.
  • SQL Server 2016 or later, together with a Shared Access Signature (SAS) token, must be used.
  • The SSMS GUI does not support “Storage Access Key” credentials.
  • The maximum size of a single backup file is limited to 195 GB.
  • In SQL Server 2016 or later, a backup can be striped across multiple blobs to support backup sizes of up to 12.8 TB (see the striping sketch after this list).
  • In SQL Server 2016, a backup can be striped across up to 64 devices.
  • When performing backups to a URL, the default MAXTRANSFERSIZE is 1048576 bytes (1 MB).
  • From SQL Server 2012 to SQL Server 2014, backups to URL do not support striping.
  • COMPRESSION is recommended for better backup performance.
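
For illustration, the following is a minimal sketch of a striped backup to URL with a larger transfer size (SQL Server 2016 or later). The stripe file names are hypothetical, and it assumes a SAS credential already exists for the storage account and container used later in this demonstration:

-- Striped backup to URL: the backup is split across multiple block blobs,
-- which raises the effective size limit and usually improves throughput.
-- Assumes a SAS credential named after the container URL already exists.
BACKUP DATABASE SalesDB
TO  URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_stripe1.bak',
    URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_stripe2.bak',
    URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_stripe3.bak',
    URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_stripe4.bak'
WITH COMPRESSION,
     MAXTRANSFERSIZE = 4194304,  -- 4 MB transfer size instead of the 1 MB default
     BLOCKSIZE = 65536,          -- 64 KB block size, commonly used with block blob backups
     STATS = 5;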

An error message: On SQL Server 2016 or later, if we attempt to use a “Storage Access Key” credential to back up databases to a general-purpose v2 storage account, we will receive the error below. However, if we use general-purpose v1 storage as the backup destination, we will not receive the error and the backup will succeed, but as a page blob.

Msg 3271, Level 16, State 1, Line 21

A nonrecoverable I/O error occurred on file "https://sqlbak80.blob.core.windows.net/dbbak80/SupportDB_FULL_2020_08_13_10:15_AM.BAK:" Backup to URL received an exception from the remote endpoint. Exception Message: The remote server returned an error: (409) Conflict..

Msg 3013, Level 16, State 1, Line 21

BACKUP DATABASE is terminating abnormally.

Step-by-step demonstration:

Let’s say we have an on-premises SQL Server 2017 instance with a database named SalesDB. We need to perform a full backup of this database to Azure storage instead of a locally attached or network shared drive. We also need to perform transaction log backups.

We have the following information for this demonstration:

  • Azure subscription: Any subscription type would work.
  • Azure Storage Account: Storage V2 type named sqlbackup80 (hierarchical namespace is enabled)
  • Blob Container: sql2017

Step #1: Generate the SAS token from the storage account’s “Shared access signature” blade (it can be signed with either key1 or key2). Note that the token starts with a “?” character, which must be removed; otherwise the connection to the storage account will fail.

 


Step #2: From SSMS, log in to Azure. You can use “Azure Storage…” from Object Explorer to connect to Azure, as shown in the following screenshot.

 

Step #3: First, we need to create a SQL Server credential using the SAS token. Following is the T-SQL syntax:

CREATE CREDENTIAL [https://<StorageAccountName>.blob.core.windows.net/<StorageAccountContainerName>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS_TOKEN>';

The IDENTITY value must be 'SHARED ACCESS SIGNATURE' and cannot be changed. Copy and paste the SAS token and remove the “?” from the beginning. The final statement should look like the following:

USE master
GO

CREATE CREDENTIAL [https://sqlbackup80.blob.core.windows.net/sql2017]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',   -- this is a mandatory string and should not be changed
     SECRET = 'sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-08-15T00:36:21Z&st=2020-08-14T16:36:21Z&spr=https&sig=YgwFqWUe0nJLew2rFTy%2Ba3qgkkoEDq%2Fg84UKFlg4BQM%3D'   -- this is the shared access signature key obtained in step #1
GO

Where “sqlbackup80” is the Storage Account Name and “sql2017” is the Container Name within the Storage Account.
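
Optionally, the credential can be confirmed from SQL Server by querying sys.credentials. A minimal check might look like the following:

-- Confirm the credential exists; for a SAS-based credential the
-- credential_identity column should show 'SHARED ACCESS SIGNATURE'.
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'https://sqlbackup80.blob.core.windows.net/sql2017';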

Step #4: Now, let’s back up the SalesDB database to Azure Storage.

BACKUP DATABASE <DatabaseName>
TO URL = 'https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/<BackupFileName>.bak'
WITH COMPRESSION, STATS = 5;
GO

So, the final backup statement looks like the following:

BACKUP DATABASE SalesDB
TO URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_2020_08_14.BAK'
WITH COMPRESSION, STATS = 5;
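
After the backup finishes, its destination can also be confirmed from the SQL Server side by querying the msdb backup history. The following is a minimal sketch:

-- List recent SalesDB backups and their destinations; device_type 9
-- indicates a backup written to URL (Azure Blob Storage).
SELECT TOP (10)
       bs.database_name,
       bs.backup_finish_date,
       bs.type AS backup_type,           -- D = full, L = log
       bmf.physical_device_name,
       bmf.device_type
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
     ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name = 'SalesDB'
ORDER BY bs.backup_finish_date DESC;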

Step #5: As we are using a general-purpose v2 account, we can create one or more folders inside the container. To create a folder, say “tlog”, we simply add the folder name between the container and the file name in the URL; the folder is created automatically. Following is an example of backing up the transaction log to that folder:

BACKUP LOG SalesDB
TO URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/tlog/SalesDB_Tlog_2020_08_13_01.TRN'
WITH COMPRESSION, STATS = 5;
GO

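
Optionally, a backup written to a URL can be validated from SQL Server with RESTORE VERIFYONLY; the SAS credential created in step #3 is used implicitly because its name matches the container URL. A minimal sketch against the full backup taken in step #4:

-- Verify that the full backup written to Azure Blob Storage is readable and complete.
RESTORE VERIFYONLY
FROM URL = N'https://sqlbackup80.blob.core.windows.net/sql2017/SalesDB_FULL_2020_08_14.BAK';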

Step #6: Now let’s verify the blob storage to review how it looks after backing up the database and transaction log. We can use the Azure Portal, SSMS, Storage Explorer, or PowerShell:

Using SSMS:


 

Using Storage Explorer:


 

Using PowerShell:

For the Database Backup Files:

$StorageAccountName = "sqlbackup80"
$StorageAccountKey  = "GLmPu26ok+mH65CrAUzRI4Hm1Zpe8uyFT8pPmFtPeDPV9E1PYXGIb5idGO4bxXV7PDpDt2NJbWQBeYvIQlgMrQ=="
$container = "sql2017"

$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# read the full backup files from the container
$bakfilelist = Get-AzureStorageBlob -Container $container -Context $context |
    Where-Object { $_.Name -like "*.bak" }

# create an empty array and collect the file name and timestamps of each backup
$bakobj = @()

foreach ($bfile in $bakfilelist)
{
    $bakobj += New-Object PSObject -Property ([ordered]@{
        'FileName'      = $bfile.Name
        'LocalDateTime' = $bfile.LastModified.LocalDateTime
        'UTCDateTime'   = $bfile.LastModified.UtcDateTime
    })
}

Write-Host 'Backup files'
$bakobj | Sort-Object LocalDateTime

Output:



 

For the Transaction Log Files:

$StorageAccountName = "sqlbackup80"
$StorageAccountKey  = "GLmPu26ok+mH65CrAUzRI4Hm1Zpe8uyFT8pPmFtPeDPV9E1PYXGIb5idGO4bxXV7PDpDt2NJbWQBeYvIQlgMrQ=="
$container = "sql2017"

$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# read the transaction log backup files from the tlog folder
$trnfilelist = Get-AzureStorageBlob -Prefix 'tlog/' -Container $container -Context $context |
    Where-Object { $_.Name -like "*.trn" }

# create an empty array and collect the file name and timestamps of each log backup
$trnobj = @()

foreach ($tfile in $trnfilelist)
{
    $trnobj += New-Object PSObject -Property ([ordered]@{
        'FileName'      = $tfile.Name
        'LocalDateTime' = $tfile.LastModified.LocalDateTime
        'UTCDateTime'   = $tfile.LastModified.UtcDateTime
    })
}

Write-Host 'Transaction log backup files'
$trnobj | Sort-Object LocalDateTime

Output: 


 


