
How to create a nightly backup of your SQL Azure database using the new Azure Mobile Services (Preview)

This handy MSDN blog post already details 8 (yes, EIGHT) different ways to back up your SQL Azure database. But the easiest technique of all, scheduling a job using Windows Azure Mobile Services, isn’t even mentioned. Perhaps that’s because WAMS is so new and, as of this writing, still in Preview.

This technique doesn’t rely on any third-party tools and is 100% cloud-based: it doesn’t require any traditional server-hosted resources to complete the backup, and it stores the backup itself as a .bacpac file in an Azure-hosted storage container of your choice.

(Please Note: as per Microsoft, “the .BACPAC file is not equivalent to a Backup as it does not contain Transaction Log and History Data and is not transactionally consistent by itself.”)
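The script in step 7 below assumes the target storage container already exists. If you haven’t created one yet, you can do that in the portal, or with a small Node.js snippet like this one (a minimal sketch using the azure SDK’s createContainerIfNotExists; the values in <brackets> are placeholders to swap for your own):

var azure = require('azure');

// Placeholders - replace with your own storage account details
var storageAccountName = '<azure storage account>';
var storageAccountKey = '<azure storage account key>';
var storageContainerName = '<azure storage container name for the nightly DB backups>';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey);

// Creates the container only if it doesn't already exist
blobService.createContainerIfNotExists(storageContainerName, function(error) {
    if (!error) {
        console.log('Container ' + storageContainerName + ' is ready.');
    }
    else {
        console.error('Could not create container: ' + error);
    }
});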

To perform a SQL Azure Database Backup Using Windows Azure Mobile Services:

1. Log into Azure and navigate to the “Mobile Services” section.

2. Click the “New” button. The “Compute > Mobile Service > Create” menu should appear. Click “Create”.

3. Give your new Mobile Service a name that’s unique to your account. Select the Database option and make sure you pick the same Region in which your database resides.

4. Enter your database login details. This grants the service access to the database and creates a new, randomly named user that is specific to that Azure Mobile Service. You should now be able to see this user listed in the database:

(Note: As mentioned on the Fabric Controller blog, this user doesn’t yet have permission to run stored procedures. If you want to run a stored proc in a nightly job, make sure you grant this user EXECUTE permission on that specific stored proc; see the sketch below.)
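For instance, if a nightly job also needs to call a stored procedure, a scheduled job script can use the mssql object that Mobile Services provides, once that permission is in place. A minimal sketch (dbo.PurgeOldRecords is just a hypothetical procedure name, and the GRANT shown in the comment would be run separately under an admin login):

// Run once against the database as an admin login (not from the job script):
//   GRANT EXECUTE ON OBJECT::dbo.PurgeOldRecords TO [<your mobile service user>];

// A scheduled job script named NightlyStoredProcJob (hypothetical)
function NightlyStoredProcJob() {
    // mssql is available to Mobile Services scheduled job scripts
    mssql.query('EXEC dbo.PurgeOldRecords', {
        success: function(results) {
            console.log('Stored procedure completed.', results);
        },
        error: function(err) {
            console.error('Stored procedure failed: ' + err);
        }
    });
}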

5. Select your new Mobile Service and click “Scheduler”, then “Create a Scheduled Job”. Enter a job name, like “NightlyDBBackup”, and set the schedule to something like once every day at midnight.

6. Now click into the job you’ve created and select “Script” from the top nav:

7. Copy and paste the script below. Don’t forget to swap out all of the placeholder values in <brackets> at the top with your own. Note: run this script at your own risk.

function NightlyDBBackup() {

    console.log('Start of NightlyDBBackup task...');

    var azure = require('azure');
    var request = require('request');
    var dbName = '<azure database name>';
    var dbUserName = '<azure database user name>';
    var dbPassword = '<azure database user password>';
    var dbServerName = '<azure database server name>.database.windows.net';
    var storageAccountName = '<azure storage account>';
    var storageAccountKey = '<azure storage account key - XXXXXXXXXXXXXXXXXXXXXXXXX>';
    var storageHost = storageAccountName + '.blob.core.windows.net';
    var storageContainerName = '<azure storage container name where you want to store the nightly DB backups>';

    // The code below saves a different file for every day of the month (up to 31 backup files).
    // To save a file for every day of the week instead, use d.getDay(); (7 backup files)
    var d = new Date();
    var dayOfMonth = d.getDate();
    var backupFileName = 'DayOfMonthBackup_' + dayOfMonth + '.bacpac';
    var blobService = azure.createBlobService(storageAccountName, storageAccountKey, storageHost);

    /* The export endpoint depends on the datacenter of the DB:
        North Central US   https://ch1prod-dacsvc.azure.com/DACWebService.svc/Export
        South Central US   https://sn1prod-dacsvc.azure.com/DACWebService.svc/Export
        North Europe       https://db3prod-dacsvc.azure.com/DACWebService.svc/Export
        West Europe        https://am1prod-dacsvc.azure.com/DACWebService.svc/Export
        East Asia          https://hkgprod-dacsvc.azure.com/DACWebService.svc/Export
        Southeast Asia     https://sg1prod-dacsvc.azure.com/DACWebService.svc/Export
        East US            https://bl2prod-dacsvc.azure.com/DACWebService.svc/Export
        West US            https://by1prod-dacsvc.azure.com/DACWebService.svc/Export
    */
    var datacenterEndpoint = "https://by1prod-dacsvc.azure.com/DACWebService.svc/Export";

    // Look for an existing blob with the same name. If found, delete it.
    blobService.listBlobs(storageContainerName, function(error, blobs) {
        if (!error) {
            console.log('Attempting to find existing blob named: ' + backupFileName + ' in container ' + storageContainerName + '.');

            for (var index in blobs) {
                if (blobs[index].name === backupFileName) {

                    console.log("Blob name found in storage: " + blobs[index].name);
                    console.log("Name of blob backup file according to today's day: " + backupFileName);

                    blobService.deleteBlob(
                        storageContainerName
                        , backupFileName
                        , function(deleteError) {
                            if (!deleteError) {
                                console.log('deletion of blob ' + backupFileName + ' successfully queued.');
                            }
                            else {
                                console.error('deletion of blob ' + backupFileName + ' failed: ' + deleteError);
                            }
                        }
                    );
                }
            }
        }
    });

    // Now initiate the backup (export) to a .bacpac file via the DAC web service
    var body = {
        BlobCredentials : {
            __type : "BlobStorageAccessKeyCredentials:#Microsoft.SqlServer.Management.Dac.ServiceTypes",
            Uri : "https://" + storageHost + "/" + storageContainerName + "/" + backupFileName,
            StorageAccessKey : storageAccountKey
        },
        ConnectionInfo : {
            DatabaseName : dbName,
            Password : dbPassword,
            ServerName : dbServerName,
            UserName : dbUserName
        }
    };

    request.post({
        uri: datacenterEndpoint,
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify(body)
    }, function(e, r, b) {
        if (e || r.statusCode != 200) {
            console.error('backup failed:', e || r.statusCode, b);
        }
        else {
            console.log('backup successfully queued.', b);
        }
    });

    console.log('End of NightlyDBBackup task.');
}
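One thing worth noting about the script: the deleteBlob call and the export request both run asynchronously, so the export can be submitted before the old blob has actually been removed. If that ever causes a conflict for you, one possible refinement (a sketch only, reusing the same variables and the body object defined in the script above) is to kick off the export from inside the delete callback:

    // One possible alternative to the listBlobs/deleteBlob + request.post flow above,
    // so the export is only submitted after the delete attempt has completed:
    blobService.deleteBlob(storageContainerName, backupFileName, function(deleteError) {
        if (deleteError) {
            console.log('No existing blob was deleted (it may not exist yet): ' + deleteError);
        }
        request.post({
            uri: datacenterEndpoint,
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(body)
        }, function(e, r, b) {
            if (e || r.statusCode != 200) {
                console.error('backup failed:', e || r.statusCode, b);
            }
            else {
                console.log('backup successfully queued.', b);
            }
        });
    });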

8. Save, then enable the job. Then click the “Run Once” option to test it out. The console.log lines from the script will show up in the Job > Logs section, like so:

9. Finally, navigate to your Azure storage container and see if the SQL Azure .bacpac file was successfully created.
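If you’d rather check this from a script instead of the portal, a quick sketch that reuses the same listBlobs call from the job script (again with placeholder account details) could look like this:

var azure = require('azure');

// Placeholders - the same storage details used by the backup job
var storageAccountName = '<azure storage account>';
var storageAccountKey = '<azure storage account key>';
var storageContainerName = '<azure storage container name>';
var backupFileName = 'DayOfMonthBackup_' + new Date().getDate() + '.bacpac';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey);

// List the container's blobs and look for today's backup file
blobService.listBlobs(storageContainerName, function(error, blobs) {
    if (error) {
        console.error('Could not list blobs: ' + error);
        return;
    }
    var found = blobs.some(function(blob) { return blob.name === backupFileName; });
    console.log(found
        ? 'Backup file ' + backupFileName + ' was found.'
        : 'Backup file ' + backupFileName + ' was NOT found.');
});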

10. As with any backup process, the final step should be to verify that the backup actually worked by restoring the database from the .bacpac file. This step breaks down into a few substeps:

  1. Navigate to your SQL Azure database.
  2. Select “Import” at the bottom of the screen to bring up the Bacpac import screen:

  3. Next, click the folder icon in the BACPAC URL field and navigate to the exact .bacpac file you wish to restore. Enter a name for your database (you can enter a completely new name here and a brand new database will be created) and select the server.

  4. Voilà, you’re finished! Simply wait a few minutes for your database to be fully imported, then test it with some queries to confirm that everything is there.

So there you have it – yet one more technique for scheduling a nightly backup of your SQL Azure database, this one hosted entirely in Azure from start to finish, with no third-party tools involved. Enjoy!

This article is part of the GWB Archives. Original Author: Ben Barreth
