Imagine this: you’ve built powerful data products inside Snowflake - cleaned, transformed, and ready to consume. Now your clients or data consumers want those results delivered - fast - to their own infrastructure. And that infrastructure is often Amazon S3, the go-to destination for shared analytics.
Exporting data from Snowflake to S3 is one of the most common workflows you’ll encounter in modern data pipelines. You use the reliable COPY INTO command, set up external stages, configure IAM permissions, design storage-integration objects, and schedule tasks or ETL jobs. That flow is rock-solid. But let’s be honest: from policy attachments and IAM roles to cryptic SQL and automation scheduling, it’s a pretty tech-heavy setup. One minor permission mismatch, and your data pipeline grinds to a halt. Not exactly the breezy “zero friction” flow that non-engineer teams crave.
If you’d rather skip the cloud‑infra gymnastics, there’s great news: Amplify Data is built for exactly this scenario.
Amplify’s no-code platform lets you, as a data provider, securely expose curated Snowflake Data Products and have them delivered to destinations like S3, without writing COPY statements or wrestling with IAM policies.
In short, Amplify shifts the heavy lifting of data operations to a no‑code platform, freeing your team to focus on building data products, not managing infra.
If you want to share data products from Snowflake to Amazon S3 without using Amplify, there isn’t a direct “one-click” export of a Snowflake Data Product / Data Share to S3 because Snowflake Data Sharing is meant to stay within the Snowflake ecosystem.
However, you can absolutely extract data from Snowflake and deliver it into Amazon S3 in a secure, automated way.
Here is how you can do it, step by step:
Option 1: Use Snowflake’s COPY INTO command
Snowflake has a built-in COPY INTO command that exports query results or table data to external stages, such as an S3 bucket.
Steps:
✅ Create an S3 bucket and ensure you have an AWS IAM user or role with access to it.
✅ Create a Snowflake storage integration and an external stage that point to the S3 bucket (see the storage-integration sketch below).
✅ Use COPY INTO to export the data.
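The storage integration referenced in the example further down is what lets Snowflake assume an IAM role in your AWS account instead of storing credentials. Here’s a minimal sketch, assuming a placeholder role ARN and bucket path:

```sql
-- create a storage integration that maps to an IAM role in your AWS account
-- (the role ARN and bucket path are placeholders)
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-unload-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket-name/snowflake-export/');

-- returns the IAM user and external ID Snowflake will use;
-- add both to the IAM role's trust policy in AWS
DESC INTEGRATION my_s3_integration;
```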
Example:
```sql
-- create an external stage pointing to your S3 bucket
CREATE STAGE my_s3_stage
  URL = 's3://my-bucket-name/snowflake-export/'
  STORAGE_INTEGRATION = my_s3_integration;

-- unload data from a table to S3
COPY INTO @my_s3_stage/orders/
FROM orders
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
OVERWRITE = TRUE;
```
You can also export query results:
```sql
COPY INTO @my_s3_stage/orders_export/
FROM (
  SELECT order_id, customer_id, total_amount
  FROM orders
  WHERE order_date >= '2024-01-01'
)
FILE_FORMAT = (TYPE = PARQUET);
```
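Depending on how your consumers will read the files, a couple of unload options are worth knowing about. This is a sketch with illustrative values:

```sql
-- include a header row and cap file size so downstream readers get predictable chunks
COPY INTO @my_s3_stage/orders_export/
FROM orders
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
HEADER = TRUE
MAX_FILE_SIZE = 104857600  -- ~100 MB per file
OVERWRITE = TRUE;
```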
The exported files will then appear under the prefix you specified in your S3 bucket.
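To sanity-check the unload without leaving Snowflake, you can list what landed on the stage (using the stage and path from the first example):

```sql
-- list the files written to the external stage path
LIST @my_s3_stage/orders/;
```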
Option 2: Use an ETL or orchestration tool
If you don’t want to manage the COPY INTO jobs yourself, you can use an ETL or orchestration tool to move Snowflake data to S3. These tools can schedule, transform, and export data to S3 more robustly.
Option 3: Write a custom script
You can also write a Python script that queries Snowflake, saves the result to a local file, and uploads that file to S3. Here’s a mini example:
```python
import snowflake.connector
import pandas as pd
import boto3

# query Snowflake (fill in your account, user, and authentication details)
conn = snowflake.connector.connect(...)
df = pd.read_sql("SELECT * FROM orders", conn)

# save locally as Parquet (requires pyarrow or fastparquet)
df.to_parquet("orders.parquet")

# upload the file to S3
s3 = boto3.client('s3')
s3.upload_file("orders.parquet", "my-bucket-name", "orders/orders.parquet")
```
Once you decide which method to use, you can schedule it, for example with a Snowflake Task (for the COPY INTO approach) or with your orchestration tool of choice.
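For the native route, a Snowflake Task is one way to schedule the unload without any external tooling. A minimal sketch, assuming a warehouse named my_wh and the stage from the earlier examples:

```sql
-- run the unload every day at 06:00 UTC
CREATE TASK export_orders_daily
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 6 * * * UTC'
AS
  COPY INTO @my_s3_stage/orders/
  FROM orders
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  OVERWRITE = TRUE;

-- tasks are created suspended; resume it to start the schedule
ALTER TASK export_orders_daily RESUME;
```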
If you love full control and don’t mind the extra complexity, stick with Snowflake’s native COPY INTO + storage integration path. But if you want a fast, secure, and maintenance-free way to deliver curated Snowflake data to S3 (or other destinations), check out Amplify Data for a much friendlier experience.