23 Jul 2020

Integrating Azure storage with Forge Design Automation


Most of our Design Automation samples use OSS (the Forge Object Storage Service) as cloud storage. However, more and more customers, especially enterprise customers, use Azure as their cloud storage solution and have been requesting an easy way to integrate Azure cloud storage with the Design Automation service.

The DA service is cloud-storage agnostic: as long as it can get or put files over HTTP, any cloud storage will work for you.
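For example, when you submit a Design Automation v3 workitem, the signed Azure URLs simply go into the workitem arguments. A sketch of such a payload (the activity id, argument names, container, and file names here are placeholders, not values from this post):

```json
{
  "activityId": "myNickname.MyActivity+prod",
  "arguments": {
    "inputFile": {
      "url": "https://<account>.blob.core.windows.net/<container>/input.rvt?<read-only-SAS>",
      "verb": "get"
    },
    "result": {
      "url": "https://<account>.blob.core.windows.net/<container>/result.rvt?<write-only-SAS>",
      "verb": "put"
    }
  }
}
```

Design Automation downloads the `inputFile` URL with GET and uploads the `result` URL with PUT, which is why the SAS permissions on each URL must match the verb.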

1. We need the storage connection string of the Azure Storage account; we assume that you already have a storage account in place.

Go to https://portal.azure.com/


2. Get the Access Keys and the Storage Connection String
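The connection string you copy from the portal has the following shape (the account name and key below are placeholders, not real credentials):

```
DefaultEndpointsProtocol=https;AccountName=<your-account-name>;AccountKey=<your-account-key>;EndpointSuffix=core.windows.net
```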




The following code creates an Azure container (if it does not exist yet) and builds a signed URL for the resource.

We need to be careful while creating signed URLs: a URL should be signed read-only if we are passing it into the Design Automation service for download, i.e., GET.

And a URL should be signed write-only if we are passing it into the Design Automation service for upload, i.e., PUT or POST.
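The method below relies on a UrlType enum that is not shown in the post; since the code calls urlType.HasFlag(...), a flags enum along these lines is assumed (the member names match the call sites, but the exact definition may differ in the original sample):

```csharp
// Assumed definition (not shown in the original post).
[Flags]
public enum UrlType
{
    downLoadUrl = 1,
    upLoadUrl = 2
}
```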

public async Task<string> GetAzureUrlAsync(UrlType urlType, string blobName, string filePath = "")
{
    bool isReadOnly = true;
    string url = "";

    switch (urlType)
    {
        case UrlType.downLoadUrl:
            //We need to get an Azure SAS read URL
            isReadOnly = true;
            break;
        case UrlType.upLoadUrl:
            //We need to get an Azure SAS write URL
            isReadOnly = false;
            break;
        default: break;
    }
    try
    {
        //Parse the connection string and return a reference to the storage account.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConfig.StorageConnectionString);
        //Create the blob client object.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        //Get a reference to a container to use for the sample code, and create it if it does not exist.
        CloudBlobContainer container = blobClient.GetContainerReference(AzureContainer);
        bool isCreated = await container.CreateIfNotExistsAsync();
        if (isCreated)
            Console.WriteLine($"\n\t Container {AzureContainer} is created");
        string sasBlobToken;

        //Get a reference to a blob within the container.
        //Note that the blob may not exist yet, but a SAS can still be created for it.
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        if (File.Exists(filePath))
            //Upload input file from local disk to Azure blob
            await blob.UploadFromFileAsync(filePath);

        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used to define the parameters of an ad-hoc SAS.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // Set start time to five minutes before now to avoid clock skew.
            SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
            Permissions = isReadOnly ? SharedAccessBlobPermissions.Read :
                          SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
        };

        //Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
        url = blob.Uri + sasBlobToken;
        if (urlType.HasFlag(UrlType.downLoadUrl))
        {
            DownloadUrl = url;
            Console.WriteLine($"\tSuccess: signed resource for {blobName} created!\n\t{DownloadUrl.Mask()}");
        }
        else
        {
            UploadUrl = url;
            Console.WriteLine($"\tSuccess: signed resource for {blobName} created!\n\t{UploadUrl.Mask()}");
        }
    }
    catch (StorageException ex)
    {
        Console.WriteLine($"!!Error!! ->\n\t{ex.StackTrace}");
    }
    return url;
}

To use the method, suppose I want to get a download URL, i.e., the signed URL should be read-only.

DownloadUrl = await GetAzureUrlAsync(UrlType.downLoadUrl, inputFileNameOSS, FilePaths.InputFile);

The URL type is read-only, and because an input file from the local disk is passed, the file is uploaded to the created Azure container and a read-only signed URL is returned.
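Similarly, to get a write-only upload URL for an output file, omit the local file path so nothing is uploaded; the blob name variable here is just an illustrative placeholder:

```csharp
// Write-only SAS URL for the result blob; no filePath, so no upload happens.
UploadUrl = await GetAzureUrlAsync(UrlType.upLoadUrl, outputFileNameOSS);
```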

You can find the complete source code in the accompanying sample.


