Import Export Job Management through REST API

Last updated: 8/29/2016

This sample demonstrates how to create and manage (Create, List, Update, Get, Delete) an Import/Export job using the REST API. It works only against classic storage accounts. You can create an Import/Export job in your target classic storage account using either the Classic portal or the Import/Export REST API; this sample uses the latter.

This sample contains the Import/Export REST API Swagger definitions, which enable you to generate Import/Export models for different languages such as Ruby, C#, Node.js, Java, and Python using AutoRest. The Swagger definition is located at importexport.json. If you want to use C#, this sample already contains the generated files, so you do not need to generate your own.

This sample provides only the C# version of the Swagger-generated code. In this sample, StorageImportExportClient is a wrapper over the existing REST API as described by its Swagger (OpenAPI) specification.

Prerequisites

  1. Access to an Azure subscription and a classic storage account.
  2. Access to the management certificate of the subscription in which the storage account lives.
  3. Create a new self-signed certificate.
  4. Upload the certificate to the subscription as a management certificate.

How to get started with an Import job

  1. Create a storage account.
  2. Procure the disk and adapter.
  3. Download the client tool.
  4. Prepare the disk.
  5. Create a job with the REST API using this sample.
  6. Ship the disks to the Azure datacenter using the address obtained from the GetLocation API.
  7. Track your job with the REST API using this sample.
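The two REST steps above (creating and tracking the job) can be sketched against the sample's client roughly as follows. Only GetJob's shape is taken from elsewhere in this README; the PutJob call, its parameters, and the State property are assumptions about the generated model, shown for illustration only:

```csharp
// Hypothetical sketch, assuming the generated client mirrors the REST
// operations. Only GetJob appears verbatim elsewhere in this sample;
// PutJob and the properties read below are illustrative assumptions.
var client = new StorageImportExportClient("<azure subscription id>", clientCert);

// Create the import job from the filled-in template config
// (method name and parameters assumed).
client.PutJob("<storage account name>", "<job name>", "TemplateImportConfig.xml");

// Track the job after shipping the disks.
var job = client.GetJob("<storage account name>", "<job name>");
Console.WriteLine(job.State);
```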

How to get started with an Export job

  1. Procure a disk.
  2. Create a job with the REST API, using this sample, to specify the data you intend to export.
  3. Ship the disks to the Azure datacenter using the address obtained from the GetLocation API.
  4. Track your job with the REST API using this sample.
  5. Retrieve the BitLocker key with the GetJob REST API using this sample.
  6. Unlock your drive and retrieve your data using this key when the disk arrives back to you.
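Retrieving the BitLocker key can be sketched as below. GetJob's signature matches its use elsewhere in this sample; the DriveList, DriveId, and BitLockerKey property names are assumptions about the generated model:

```csharp
// Hypothetical sketch: read the BitLocker keys back from an export job.
// Property names on the returned job item are illustrative assumptions.
var job = client.GetJob("<storage account name>", "<job name export>");
foreach (var drive in job.DriveList)
{
    Console.WriteLine($"{drive.DriveId}: {drive.BitLockerKey}");
}
```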

Running this sample

  1. Download the solution and open the RestAPISample.sln in Visual Studio.
  2. Open TemplateExportConfig.xml or TemplateImportConfig.xml and set the following values:

        <Location>{your azure storage location name}</Location>
        <StorageAccountName>{your storage account name}</StorageAccountName>
        <StorageAccountKey>{your storage account key}</StorageAccountKey>
        <Name>{contact person name}</Name>
        <Email>{contact email}</Email>
        <Address>{contact postal address}</Address>
        <Phone>{contact number}</Phone>
        <CarrierName>{carrier name like FedEx, DHL}</CarrierName>
        <CarrierAccountNumber>{account number for return shipping}</CarrierAccountNumber>
        <Blob BlobPaths="" BlobPathPrefixes="/export/"/>

In addition, the following fields for an Import job come from your journal file.

        <DriveId>{your drive serial number}</DriveId>
        <BitLockerKey>{drive bitlocker key}</BitLockerKey>
        <ManifestFile>{path to manifest file}</ManifestFile>
        <ManifestHash>{your manifest hash}</ManifestHash>
  3. Update the endpoint in storageimportexportlib.cs. It is https://management.core.windows.net for public Azure and https://management.core.usgovcloudapi.net for US Gov Cloud.

     this.BaseUri = new Uri("<management end-point>");
  4. Update the management certificate thumbprint value in Program.cs. If you don't know about management certificates, refer to the management certificates guide.

     var clientCertificateThumbprint = "<client certificate thumbprint>";
  5. Update the Azure subscription ID.

     var client = new StorageImportExportClient("<azure subscription id>", clientCert);
  6. Update the name of the import or export job. This can be any string value that identifies your job.

  7. Update the storage account name and the job name you wish to view.

     var jobItem = client.GetJob("<storage accountname>", "<jobname import>");
  8. Set breakpoints and step through the project using F10.
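A minimal way to load the management certificate by its thumbprint, tying the certificate and client-construction steps above together, is shown below. It uses the standard .NET X509Store APIs; the store location is an assumption about where you installed the certificate, and the helper name is hypothetical:

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

// Load the management certificate by thumbprint from the CurrentUser\My
// store (adjust StoreLocation if you installed it for the local machine).
static X509Certificate2 FindCertificate(string thumbprint)
{
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    try
    {
        store.Open(OpenFlags.ReadOnly);
        var matches = store.Certificates.Find(
            X509FindType.FindByThumbprint, thumbprint, validOnly: false);
        if (matches.Count == 0)
            throw new InvalidOperationException(
                "Management certificate not found: " + thumbprint);
        return matches[0];
    }
    finally
    {
        store.Close();
    }
}
```

The result can then be passed as the clientCert argument to the StorageImportExportClient constructor shown above.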

More information