
Sunday, July 20, 2014

When to use Azure Files vs Azure Blobs vs Azure Disks

We have three main ways to store data in the Azure cloud – Files, Blobs, and Disks. All three abstractions take advantage of the Azure Storage stack and the Azure platform. The following sections compare the advantages of each abstraction.

Azure Files: Provides an SMB 2.1 interface in addition to the REST interface for access to files. The SMB interface enables applications running in the cloud to use native file system APIs to read from and write to files. A file share is best for:

  • Lifting and shifting applications to the cloud that already use native file system APIs to share data between parts of the application
  • Storing development and debugging tools that need to be accessed from many cloud instances
  • Applications that want REST access to data from anywhere and SMB access to data from within the region where the storage account is located
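As a sketch of the SMB scenario above, a file share in the storage account can be mounted from a Windows VM in the same region with the standard net use command (the account name, share name, and key below are placeholders for your own values):

```shell
# Mount an Azure Files share over SMB 2.1 from a Windows VM in the same region.
# <account>, <share>, and <storage-account-key> are placeholders.
net use Z: \\<account>.file.core.windows.net\<share> /u:<account> <storage-account-key>

# The share is now available as drive Z: to native file system APIs.
dir Z:\
```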

Azure Blobs: Provides a REST interface for massively scalable object storage. Blob storage is best for:

  • Applications that use the REST APIs to store unstructured data in Azure blobs at massive scale (a single container can hold up to 500 TB of data)
  • Supports streaming scenarios and random access
  • Access to data from anywhere via REST
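To illustrate the random-access point above, here is a minimal sketch using the WindowsAzure.Storage client library of the era; the container name "mycontainer" and blob name "myblob" are placeholders, and a real storage connection string is assumed:

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Sketch: random-access read of a byte range from a block blob,
// instead of downloading the entire blob.
class BlobRangeReadSample
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("<<Storage Connection String>>");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlockBlob blob = client.GetContainerReference("mycontainer")
                                    .GetBlockBlobReference("myblob");

        // Download only the first 1024 bytes of the blob.
        using (var stream = new MemoryStream())
        {
            blob.DownloadRangeToStream(stream, 0, 1024);
            Console.WriteLine("Read {0} bytes", stream.Length);
        }
    }
}
```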

Azure Disks: Provides persistent disks that can be attached to an Azure virtual machine; the data is made durable by storing the disk as a fixed-format VHD in a page blob in Azure Storage. When used as a VHD, the disk can be attached to a single VM but not shared across VM instances.

  • Lift-and-shift applications that use native file system APIs to read and write data to persistent disks but do not require this data to be available to other cloud VMs. Note that a single disk can be attached to only a single VM at any given time.
  • Data on the disk that does not need to be accessed from outside the VM, since it is stored in a VHD formatted with NTFS. However, the VHD can be downloaded from anywhere via the REST API and then loaded onto a local virtual machine.
  • Disks can be snapshotted to create point-in-time, read-only backups. When recovery needs to take place, a snapshot can be copied to a different blob and the data recovered.
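The snapshot-and-copy recovery flow above can be sketched against the VHD's page blob directly; the container name "vhds" and blob names are placeholders, and a real storage connection string is assumed:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Sketch: snapshot the page blob backing a disk, then copy the snapshot
// to a separate blob so the data can be recovered from it.
class DiskSnapshotSample
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("<<Storage Connection String>>");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudPageBlob vhd = client.GetContainerReference("vhds")
                                  .GetPageBlobReference("mydisk.vhd");

        // Create a read-only, point-in-time snapshot of the disk's VHD.
        CloudPageBlob snapshot = vhd.CreateSnapshot();

        // To recover, copy the snapshot to a different blob.
        CloudPageBlob restored = client.GetContainerReference("vhds")
                                       .GetPageBlobReference("mydisk-restored.vhd");
        restored.StartCopyFromBlob(snapshot);
    }
}
```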

Creating an Azure Queue and adding a message to it

// Assumes the WindowsAzure.Storage and Microsoft.WindowsAzure.Configuration packages
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

// Retrieve the storage account from the connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the queue client
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

// Retrieve a reference to a queue
CloudQueue queue = queueClient.GetQueueReference("myqueue");

// Create the queue if it doesn't already exist
queue.CreateIfNotExists();

// Create a message and add it to the queue
CloudQueueMessage msg = new CloudQueueMessage("Hello World");
queue.AddMessage(msg);
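Once added, the message can be consumed with GetMessage and DeleteMessage, continuing from the queue reference above (a sketch; the 30-second figure is the SDK's default visibility timeout):

```csharp
// Sketch: read and delete the message added above. GetMessage makes the
// message invisible to other readers for 30 seconds by default; it must be
// deleted explicitly, or it will reappear on the queue after the timeout.
CloudQueueMessage retrieved = queue.GetMessage();
if (retrieved != null)
{
    Console.WriteLine(retrieved.AsString);   // the message text added above
    queue.DeleteMessage(retrieved);
}
```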
 

Uploading simple Blobs to Azure Storage


using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobStorageHelper
{
    public static void FileUpload(string filePath)
    {
        var credentials = new StorageCredentials("<<Azure Storage Name>>",
                                    "<<Primary Key>>");
        var client = new CloudBlobClient(new Uri("<<Azure Blob Storage Path - Full URL>>"), credentials);

        // Retrieve a reference to a container. (Create one in the management
        // portal, or call container.CreateIfNotExists().)
        var container = client.GetContainerReference("<<Container Name>>");

        // Retrieve a reference to a block blob in that container.
        var blockBlob = container.GetBlockBlobReference("<<File Name>>");

        // Create or overwrite the blob with the contents of the local file.
        using (var fileStream = System.IO.File.OpenRead(filePath))
        {
            blockBlob.UploadFromStream(fileStream);
        }
    }
}
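Calling the helper is then a one-liner; the path below is a placeholder for a file on your machine:

```csharp
// Sketch: upload a local file with the helper above (the path is a placeholder).
BlobStorageHelper.FileUpload(@"C:\temp\myfile.gif");
```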

Migration of a SQL Server database and its data to Azure SQL Database


The BULK INSERT T-SQL statement is not supported in Azure SQL Database. You can use BCP to load data instead; the user who connects needs write permissions on the target database or tables, which you can grant by adding that user to the "db_datawriter" database role in that specific database.
 
To perform the migration, follow these steps:
 
  1. Use the BCP utility to export the data from your SQL Server into a file
  2. Then import the data from the file into Azure SQL Database
Writing BCP scripts by hand is cumbersome and requires a lot of effort, so consider using a utility that generates the scripts for migrating the schema and its data.
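The two steps above can be sketched as a pair of bcp commands; the server, database, table, and login names below are placeholders:

```shell
# Sketch: export a table from on-premises SQL Server to a native-format file
# (-n = native format, -T = Windows authentication; names are placeholders).
bcp MyDatabase.dbo.MyTable out MyTable.dat -n -S localhost -T

# Import the file into Azure SQL Database
# (note the user@server login form required by Azure SQL).
bcp MyDatabase.dbo.MyTable in MyTable.dat -n -S myserver.database.windows.net -U myuser@myserver -P <password>
```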
 
 
 
Note: There are some features that are not supported by Azure SQL Database and require manual intervention or a workaround in your database or application. Please refer to the following Microsoft link for more details: http://msdn.microsoft.com/en-us/library/ff394115.aspx