How to Share Data Products from Snowflake to Google Cloud Storage (GCS)


One of the most common and powerful data engineering workflows today is sharing data from Snowflake to Google Cloud Storage (GCS). Whether you're building data pipelines, powering machine learning models, or just syncing data between cloud platforms, this workflow helps teams stay agile and connected across their data ecosystem.

But here's the catch: while Snowflake-to-GCS exports are extremely useful, they're not exactly plug-and-play. The process involves several moving parts: configuring storage integrations, handling IAM permissions, staging files, managing formats like CSV or Parquet, and sometimes even writing custom Python scripts just to get data from point A to point B. For many teams, especially those without a dedicated data platform engineer, this can be a headache.

⚡ Skip the complexity with Monda

If you're looking for a faster, easier, and code-free way to share data products from Snowflake to GCS, Monda has you covered.

Instead of wrestling with stages or complex ETL jobs, Monda lets you:

  • Seamlessly share datasets from Snowflake to GCS
  • Automate exports and syncs on your schedule
  • Collaborate across teams and clouds without writing a line of code

It’s built for modern data teams that want enterprise-grade data product sharing… minus the engineering lift.

In this guide, we'll walk through the manual process so you can understand what's going on under the hood. But if you're looking for the simplest, most scalable way to operationalize Snowflake → GCS data products, start here with Monda.

🧭 Technical Step-by-Step Guide

To share data from Snowflake to Google Cloud Storage (GCS), you need to extract data from Snowflake and upload it to a GCS bucket. This process typically involves:

  1. Exporting data from Snowflake (e.g., to CSV or Parquet).
  2. Staging the data, optionally in a temporary location.
  3. Uploading the files to Google Cloud Storage using tools like gsutil, the gcloud CLI, or a client SDK (see the sketch below).
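
As a rough sketch of step 3 (assuming a file has already been exported locally; the bucket and path are placeholders), the upload can be a single command (Bash):

# Upload a locally exported file to a GCS bucket
gsutil cp data.csv gs://your-bucket-name/folder/data.csv

# Equivalent with the newer gcloud CLI
gcloud storage cp data.csv gs://your-bucket-name/folder/data.csv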

Option 1: Use Snowflake + Python (e.g., with pandas and the GCS SDK)

1. Install required packages (Bash)

# the [pandas] extra pulls in pyarrow, which fetch_pandas_all requires
pip install "snowflake-connector-python[pandas]" pandas gcsfs google-cloud-storage

2. Export Snowflake data to a pandas DataFrame and write to GCS (Python)

import pandas as pd
import snowflake.connector
from google.cloud import storage

# Snowflake connection
conn = snowflake.connector.connect(
    user='YOUR_USER',
    password='YOUR_PASSWORD',
    account='YOUR_ACCOUNT',
    warehouse='YOUR_WAREHOUSE',
    database='YOUR_DATABASE',
    schema='YOUR_SCHEMA'
)

# Query data into a DataFrame (fetch_pandas_all needs the
# pyarrow-backed pandas extra of the connector)
cursor = conn.cursor()
cursor.execute("SELECT * FROM your_table")
df = cursor.fetch_pandas_all()
cursor.close()
conn.close()

# Write to a local CSV file (the upload step below reads it)
df.to_csv('data.csv', index=False)

# Upload to GCS using Application Default Credentials;
# bucket() avoids the extra metadata call that get_bucket() makes
client = storage.Client()
bucket = client.bucket('your-bucket-name')
blob = bucket.blob('folder/data.csv')
blob.upload_from_filename('data.csv')
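

Alternatively, since gcsfs is installed, pandas can write straight to a gs:// path, skipping both the local file and the explicit storage client. A minimal sketch, assuming Application Default Credentials are already configured (Python):

# Write the DataFrame directly to GCS via gcsfs (no intermediate local file);
# relies on Application Default Credentials (gcloud auth application-default login)
df.to_csv('gs://your-bucket-name/folder/data.csv', index=False)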


Option 2: Use a Snowflake External Stage to GCS

First, configure GCS credentials in Snowflake using a storage integration (SQL):

CREATE STORAGE INTEGRATION my_gcs_integration
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = GCS
ENABLED = TRUE
STORAGE_ALLOWED_LOCATIONS = ('gcs://your-bucket-name/your-path/');
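
Creating the integration provisions a Snowflake-managed GCP service account. Describe the integration to find that account's identity (the STORAGE_GCP_SERVICE_ACCOUNT property), then grant it an appropriate role, such as Storage Object Admin, on the bucket in GCP IAM (SQL):

-- Retrieve the service account Snowflake uses to access GCS;
-- grant it access to the bucket in GCP IAM before proceeding
DESC STORAGE INTEGRATION my_gcs_integration;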

Then create an external stage (SQL):

CREATE STAGE my_gcs_stage
 URL = 'gcs://your-bucket-name/your-path/'
 STORAGE_INTEGRATION = my_gcs_integration
 FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"');
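
You can sanity-check connectivity before unloading; once the IAM grant is in place, LIST should succeed, returning an empty result if the path has no files yet (SQL):

-- Verify the stage can reach the bucket; fails if IAM is misconfigured
LIST @my_gcs_stage;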


Finally, unload the data from Snowflake to GCS (SQL):

COPY INTO @my_gcs_stage/data_export_
 FROM (SELECT * FROM your_table)
 FILE_FORMAT = (TYPE = CSV)
 SINGLE = TRUE
 OVERWRITE = TRUE;

Because SINGLE = TRUE is set, this unloads the data as one CSV file in your GCS bucket.

Notes

  • Snowflake accesses GCS through the service account created for the storage integration; you must grant that account an appropriate role (such as Storage Object Admin) on the bucket, as shown above.
  • For production workflows, consider orchestrating this with Airflow, dbt, or Cloud Functions.
  • The file format can be changed to PARQUET, JSON, etc., in FILE_FORMAT (see the Parquet sketch below).
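
To illustrate that last note, here is a sketch of the same unload in Parquet; HEADER = TRUE tells Snowflake to keep the original column names in the Parquet schema (without it, the unloaded columns may get generic names) (SQL):

-- Unload the same query as Parquet, preserving column names
COPY INTO @my_gcs_stage/data_export_
 FROM (SELECT * FROM your_table)
 FILE_FORMAT = (TYPE = PARQUET)
 HEADER = TRUE
 OVERWRITE = TRUE;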

🛠 The Manual Way: Powerful but Cumbersome

Exporting data from Snowflake to Google Cloud Storage manually gives you full control, but it comes at a cost. You’ll need to:

  • Set up and manage GCS buckets and IAM permissions
  • Create Snowflake storage integrations and external stages
  • Write and maintain SQL COPY INTO commands or Python scripts
  • Handle file formats, data validation, retries, and scheduling

This approach is flexible but time-consuming, and it often requires deep familiarity with both Snowflake and GCP's security model.

⚡ The Monda Way: Fast, Secure, and No-Code

With Monda, you can skip all the complexity:

  • Connect Snowflake and GCS in minutes
  • Select tables or queries to share - no SQL or scripts required
  • Set up automated syncs with built-in governance and observability
  • Collaborate with other teams or partners across cloud boundaries

Monda is purpose-built to make Snowflake-to-GCS data sharing fast, secure, and repeatable, so your team can focus on insights, not infrastructure.
