One of the most common things people want to do with Snowflake is simple in theory: load their data from cloud storage into a Snowflake table so they can start analyzing it.
If your team uses Microsoft Azure Blob Storage to store data products - CSVs, JSON files, Parquet datasets - then loading them into Snowflake is a natural next step. After all, Snowflake is designed to be the central hub for analytics, and Azure Blob is one of the most popular and cost-effective ways to store large amounts of raw or processed data.
But here’s the catch: while Snowflake can read from Azure Blob, the process isn’t exactly plug-and-play.
You have to:
- Set up a Storage Integration in Snowflake (or generate a SAS token in Azure)
- Get an Azure admin to grant consent and assign permissions on your container
- Create an external stage pointing at your blob container
- Define target tables and write COPY INTO scripts with the right file formats
- Debug any permissions or parsing errors along the way
It’s powerful, flexible - and a bit technical. If you’re comfortable with cloud IAM roles, SQL scripts, and debugging permissions errors, the official way works fine. But for many teams, especially those who just want to focus on delivering insights, this level of complexity can be frustrating and slow.
The good news? You don’t actually have to wrestle with all those steps anymore.
With Amplify Data, you can easily and securely load data products from Microsoft Azure Blob Storage into Snowflake without worrying about storage integrations, SAS tokens, or complex SQL scripts.
Amplify provides a user-friendly platform that handles all the plumbing for you:
✅ Connect your Azure Blob container
✅ Map your files to Snowflake tables
✅ Automate the ingestion
✅ And start querying your data within minutes
If you’re interested in the nitty-gritty of how it works behind the scenes, keep reading: the detailed steps are below.
But if you just want to save time and focus on your data, check out Amplify Data and skip the complexity entirely.
Step 1: Upload your data to Azure Blob Storage

You or your team uploads the data (e.g. CSV, JSON, Parquet) to a container in your Azure Storage account. For example, your file might be at:
https://<your-storage-account>.blob.core.windows.net/<your-container>/data.csv
Step 2: Grant Snowflake access to your container

You need to give Snowflake permission to read from your Azure Blob container. You can use either a Storage Integration (Snowflake's recommended approach) or a SAS token.
With a Storage Integration, run the following in Snowflake:
CREATE STORAGE INTEGRATION azure_int
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = AZURE
ENABLED = TRUE
STORAGE_ALLOWED_LOCATIONS = ('azure://<your-storage-account>.blob.core.windows.net/<your-container>/');
This creates the integration. Describe it to retrieve the AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_ID:
DESC STORAGE INTEGRATION azure_int;
Your Azure admin must grant consent using that URL and assign the Storage Blob Data Reader role to the Snowflake service principal on the container.
Step 3: Create an external stage

Once permissions are set up, create an external stage pointing to your blob container.
With the Storage Integration:
CREATE STAGE my_stage
URL='azure://<your-storage-account>.blob.core.windows.net/<your-container>/'
STORAGE_INTEGRATION = azure_int
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"');
If you're using a SAS token instead of an integration:
CREATE STAGE my_stage
URL='azure://<your-storage-account>.blob.core.windows.net/<your-container>/'
CREDENTIALS = (AZURE_SAS_TOKEN = '<your-sas-token>')
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"');
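Tip: since the same CSV options get repeated in every stage and COPY command, you can define a named file format once and reference it instead (the name my_csv_format here is just an illustration):
CREATE FILE FORMAT my_csv_format
TYPE = CSV
FIELD_OPTIONALLY_ENCLOSED_BY = '"'
SKIP_HEADER = 1;
Any inline FILE_FORMAT = (...) clause below can then be replaced with FILE_FORMAT = (FORMAT_NAME = 'my_csv_format').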
Check what files are available in the stage:
LIST @my_stage;
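You can also preview the staged data before loading it. Snowflake lets you query staged files directly, addressing columns by position ($1, $2, ...), so a quick sanity check might look like:
SELECT $1, $2 -- staged files expose positional columns, not names
FROM @my_stage/data.csv
LIMIT 10;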
Step 4: Create a table and load your data

Create a target table:
CREATE TABLE my_table (
column1 STRING,
column2 STRING,
...
);
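For example, if data.csv held customer records (hypothetical columns, purely for illustration), the table might look like:
CREATE TABLE my_table (
id NUMBER, -- hypothetical columns: match these to your actual file
name STRING,
signup_date DATE
);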
Load data:
COPY INTO my_table
FROM @my_stage/data.csv
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"' SKIP_HEADER=1);
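If you want to test a load without actually writing any rows, COPY INTO supports a validation mode that returns parsing errors instead of loading:
COPY INTO my_table
FROM @my_stage/data.csv
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"' SKIP_HEADER=1)
VALIDATION_MODE = RETURN_ERRORS; -- dry run: reports errors, loads nothing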
You can also load multiple files with a pattern:
COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"' SKIP_HEADER=1)
PATTERN='.*\.csv';
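After loading, you can verify what was ingested with the COPY_HISTORY table function (assuming here that the load ran within the last hour):
SELECT file_name, row_count, status
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
TABLE_NAME => 'MY_TABLE',
START_TIME => DATEADD(hour, -1, CURRENT_TIMESTAMP())
));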
Skip the technical steps and load data products from Microsoft Azure Blob Storage into Snowflake easily using Amplify (powered by Monda). Find out more or get a demo.