How to Share Data Products from Snowflake to Amazon S3


Imagine this: you’ve built powerful data products inside Snowflake - cleaned, transformed, and ready-to-consume. Now your clients or data consumers want those results delivered - fast - to their own infrastructure. And that infrastructure is often Amazon S3, the go-to destination for shared analytics.

Exporting data from Snowflake to S3 is one of the most common workflows you’ll encounter in modern data pipelines. You use the reliable COPY INTO command, set up external stages, configure IAM permissions, create storage integration objects, and schedule tasks or ETL jobs. That flow is rock-solid. But let’s be honest: between policy attachments, IAM roles, cryptic SQL, and automation scheduling, it’s a pretty tech-heavy setup. One minor permission mismatch, and your data pipeline grinds to a halt. Not exactly the breezy “zero friction” flow that non‑engineer teams crave.

⚡️ The Simpler Way: Amplify Data

If you’d rather skip the cloud‑infra gymnastics, there’s great news: Amplify Data is built for exactly this scenario.

Amplify’s no‑code platform lets you securely expose curated Snowflake Data Products and have them delivered to destinations like S3, without writing COPY statements or wrestling with SQL. As a data provider, you can:

  • Configure filters, row- and column-level visibility, and file formats through a clean point‑and‑click UI
  • Choose S3 as a target without touching a line of infra code or credentials
  • Track usage, delivery success, and customer engagement in real time, right from your dashboard

In short, Amplify shifts the heavy lifting of data operations to a no‑code platform, freeing your team to focus on building data products, not managing infra.

🧭 Technical Guide to Sharing Data Products from Snowflake to Amazon S3

If you want to share data products from Snowflake to Amazon S3 without using Amplify, there isn’t a direct “one-click” export of a Snowflake Data Product / Data Share to S3 because Snowflake Data Sharing is meant to stay within the Snowflake ecosystem.

However, you can absolutely extract data from Snowflake and deliver it into Amazon S3 in a secure, automated way.

Here is how you can do it, step by step:

📄 Options to share Snowflake data to S3

1. Unload data to S3 using COPY INTO

Snowflake has a built-in COPY INTO command that exports query results or table data to an external stage, such as one that points to an S3 bucket.

Steps:

✅ Create an S3 bucket and ensure you have an AWS IAM user or role with access to it.

✅ Create a Snowflake storage integration and an External Stage that point to the S3 bucket (the integration is sketched right after these steps).

✅ Use COPY INTO to export data.
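
The stage example below references a storage integration named my_s3_integration. A minimal sketch of creating one looks like this; the account ID, IAM role name, and bucket path are placeholders to swap for your own, and creating the integration typically requires the ACCOUNTADMIN role (or one with the CREATE INTEGRATION privilege):

-- create a storage integration that lets Snowflake assume your IAM role
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  -- placeholder ARN: the IAM role Snowflake should assume to reach the bucket
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-export-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket-name/snowflake-export/');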

Example

-- create an external stage pointing to your S3 bucket
CREATE STAGE my_s3_stage
  URL='s3://my-bucket-name/snowflake-export/'
  STORAGE_INTEGRATION = my_s3_integration;

-- unload data from a table to S3
COPY INTO @my_s3_stage/orders/
FROM orders
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
OVERWRITE = TRUE;

You can also export query results:

COPY INTO @my_s3_stage/orders_export/
FROM (
    SELECT order_id, customer_id, total_amount
    FROM orders
    WHERE order_date >= '2024-01-01'
)
FILE_FORMAT = (TYPE = PARQUET);



The exported files will then appear under the specified prefix in your S3 bucket.
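
If you want to confirm the unload from inside Snowflake before checking the bucket, you can list the files through the stage (the path matches the first example above):

-- list the unloaded files via the external stage
LIST @my_s3_stage/orders/;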

2. Use Snowflake + ETL Tools

If you don’t want to manage the COPY INTO jobs yourself, you can use an ETL or orchestration tool to move Snowflake data to S3:

  • Fivetran (Snowflake → S3)
  • Matillion
  • Airflow (with Snowflake and S3 operators)
  • dbt + custom S3 hooks

These tools handle scheduling, retries, and transformations for you, which makes recurring exports to S3 easier to operate.

3. Programmatically using Python/SDK

You can also write a Python script that:

  • Queries Snowflake (using the Snowflake Python Connector)
  • Saves results as CSV/Parquet
  • Uploads the file to S3 (using boto3)

Here’s a mini example:

import snowflake.connector
import pandas as pd
import boto3

# query Snowflake (fill in your account, user, and authentication details)
conn = snowflake.connector.connect(...)
df = pd.read_sql("SELECT * FROM orders", conn)
conn.close()

# save locally as Parquet (requires pyarrow or fastparquet)
df.to_parquet("orders.parquet")

# upload the file to S3 (boto3 reads credentials from your AWS environment)
s3 = boto3.client('s3')
s3.upload_file("orders.parquet", "my-bucket-name", "orders/orders.parquet")



Security & Permissions

  • In Snowflake, set up a storage integration and IAM role for secure S3 access (the sketch below shows how to pull the values the role’s trust policy needs).
  • Lock down the S3 bucket: scope the bucket policy to the Snowflake IAM role, enable server-side encryption, and block public access.
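
Once the storage integration exists, describing it returns the values that belong in the IAM role's trust policy (the integration name matches the earlier sketch):

-- returns STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID,
-- which go into the IAM role's trust relationship
DESC STORAGE INTEGRATION my_s3_integration;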

Automating the flow

Once you decide which method to use, you can schedule it with:

  • Snowflake Tasks & Streams (for COPY INTO; a task sketch follows this list)
  • Airflow
  • Lambda + EventBridge
  • Cron jobs (for scripts)
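
Staying native to Snowflake, a minimal sketch of a task that re-runs the earlier COPY INTO on a schedule could look like this; the task name, warehouse, and cron expression are placeholders:

-- schedule the unload to run every day at 06:00 UTC
CREATE OR REPLACE TASK export_orders_daily
  WAREHOUSE = my_warehouse  -- placeholder warehouse
  SCHEDULE = 'USING CRON 0 6 * * * UTC'
AS
  COPY INTO @my_s3_stage/orders/
  FROM orders
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  OVERWRITE = TRUE;

-- tasks are created suspended; resume to start the schedule
ALTER TASK export_orders_daily RESUME;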

TL;DR

If you love full control and don’t mind the extra complexity, stick with Snowflake’s native COPY INTO + storage integration path. But if you want a fast, secure, and maintenance‑free way to deliver curated Snowflake data to S3 (or other destinations), check out Amplify Data for a much friendlier experience.
