
Migrate S3 storage to an Azure Blob container

Recently I got a requirement to migrate a complete AWS infrastructure to Azure.

For this, I took two days to go through the AWS environment and prepare a checklist of the components in use and which of them are business critical. While doing the inventory of the resources, I came to S3. It held 34TB of data, mostly images and logs. The client said not to delete anything, so there was no chance to look for a purge mechanism.

Let's move on to the implementation and the steps I followed.

🔧 Prerequisites

  1. Azure Subscription
    • Ensure you have enough storage quota. Each storage account can support up to 5PB depending on the redundancy and performance tier.
  2. Install Azure CLI
  3. Install AzCopy (v10+)
  4. Create a local staging server (optional)
    • Use an EC2 or Azure VM with high bandwidth and storage (if staging is needed).

The prerequisites above were suggested by ChatGPT, but I used only a few of them.
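Before starting, it is worth confirming the tooling is actually available on the machine you will run the migration from. A quick sanity check, assuming both tools are already on your PATH:

az --version

azcopy --version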

Check out our cloud migration services.

Step 1. 

Install AzCopy locally or on any remote system to act as a mediator between S3 and Azure Blob.

I followed this Microsoft document: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?tabs=dnf#download-the-azcopy-portable-binary

After downloading, extract it and add the AzCopy binary's path to your environment variables. The document above will guide you through the installation process.
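As a rough sketch of what that looks like on a Linux staging server (the download shortlink and the extracted folder name come from the Microsoft document and may change between releases):

wget https://aka.ms/downloadazcopy-v10-linux -O azcopy.tar.gz

tar -xzf azcopy.tar.gz

sudo cp ./azcopy_linux_amd64_*/azcopy /usr/local/bin/

azcopy --version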

Step 2.

Configure the environment variables:

On Windows (Command Prompt), set the credentials as environment variables:

set AWS_ACCESS_KEY_ID=AKDJUUS**F**********T**

set AWS_SECRET_ACCESS_KEY=x**Hfbc1D7***************KLEoSr******

Replace the values of AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY with your own. The commands above are for the Windows Command Prompt; for other operating systems, please refer to the Microsoft documentation.
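For reference, on Linux or macOS the same credentials are exported as plain environment variables in the shell that will run AzCopy, for example:

export AWS_ACCESS_KEY_ID=<your-access-key-id>

export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>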

Step 3.

Create an Azure storage account:


After creating the storage account, go to Containers and create a container, which is the equivalent of an AWS S3 bucket.



Create a container and give it the same name as your AWS bucket, so fewer changes are needed later if required.
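If you prefer the Azure CLI over the portal, a minimal sketch looks like this. The storage account name s3migrate and container name demo-bucket match the examples later in this post; the resource group and location are placeholders you would replace:

az storage account create --name s3migrate --resource-group <resource-group> --location <location> --sku Standard_LRS

az storage container create --account-name s3migrate --name demo-bucket --auth-mode login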

Step 4.

Set up access to the Azure Blob container by generating a new SAS token.

- Click the three dots on the right of the container and choose Generate SAS to get the token and URL.

This token comes with a start and end date, so be careful while creating the SAS token. You will also need to choose permissions such as Read, Write, and Create.

I changed the permissions and the end date and left the IP field blank, since I did not want to restrict access to any specific IP.
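The SAS token can also be generated from the Azure CLI instead of the portal, which is handy for scripting. A sketch, assuming the account and container names used in this post, a storage account key you supply, and an expiry date you set yourself:

az storage container generate-sas --account-name s3migrate --name demo-bucket --account-key <storage-account-key> --permissions racwdl --expiry 2025-05-19T17:35Z --https-only --output tsv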

Step 5.

Now my environment is ready and everything is in place, so let's prepare the AzCopy parameters.

Source: "https://s3.amazonaws.com/demo-bucket"

Public S3 URL format: https://s3.amazonaws.com/{bucket-name}.

Destination: https://s3migrate.blob.core.windows.net/demo-bucket

SAS Token: ?sp=racwdl&st=2025-05-19T09:35:40Z&se=2025-05-19T17:35:40Z&spr=https&sv=2024-11-04&sr=c&sig=oApfv2h3Farkle5h9JDGCChOlnFiRh4R%2BIrCYoF3Pkk%3D


Flags:

--recursive=true

  • Tells AzCopy to recursively copy all folders/files under the source bucket.

--from-to=S3Blob

  • Explicitly tells AzCopy you're copying:
    • From: S3
    • To: Azure Blob

Now prepare the final migration command:

azcopy copy "https://s3.amazonaws.com/demo-bucket" "https://s3migrate.blob.core.windows.net/demo-bucket?sp=racwdl&st=2025-05-19T09:35:40Z&se=2025-05-19T17:35:40Z&spr=https&sv=2024-11-04&sr=c&sig=oApfv2h3Farkle5h9JDGCChOlnFiRh4R%2BIrCYoF3Pkk%3D" --recursive=true --from-to=S3Blob


It will start copying the files at high speed.
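If you want to preview what would be transferred before committing to a 34TB copy, recent v10 builds of AzCopy support a dry run, and interrupted transfers can be inspected and resumed from the job history (resume may ask you to supply the source and destination SAS again). A sketch, assuming a build that includes these options:

azcopy copy "https://s3.amazonaws.com/demo-bucket" "https://s3migrate.blob.core.windows.net/demo-bucket?<SAS-token>" --recursive=true --from-to=S3Blob --dry-run

azcopy jobs list

azcopy jobs resume <job-id>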


There are also programmatic ways to do this, where you first download the data to local storage and then move it to Azure, but for large data sets you will want to go with AzCopy.
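As a rough sketch of that staged approach, assuming the AWS CLI is configured on the staging machine and using the same names as above: first pull the bucket down locally, then push the local copy to the Blob container.

aws s3 sync s3://demo-bucket ./demo-bucket

azcopy copy "./demo-bucket" "https://s3migrate.blob.core.windows.net/demo-bucket?<SAS-token>" --recursive=true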


Hope you find this helpful!!!
