One of the most common and powerful data engineering workflows today is sharing data from Snowflake to Google Cloud Storage (GCS). Whether you're building data pipelines, powering machine learning models, or just syncing data between cloud platforms, this workflow helps teams stay agile and connected across their data ecosystem.
But here's the catch: while Snowflake to GCS exports are extremely useful, they’re not exactly plug-and-play. The process involves several moving parts: configuring storage integrations, handling IAM permissions, staging files, managing formats like CSV or Parquet, and sometimes even writing custom Python scripts just to get data from point A to point B. For many teams, especially those without a dedicated data platform engineer, this can be a headache.
If you're looking for a faster, easier, and code-free way to share data products from Snowflake to GCS, Monda has you covered.
Instead of wrestling with stages or complex ETL jobs, Amplify lets you share data products from Snowflake to GCS without writing a line of code.
It’s built for modern data teams that want enterprise-grade data product sharing… minus the engineering lift.
In this guide, we'll walk through the manual process so you can understand what's going on under the hood. But if you're looking for the simplest, most scalable way to operationalize Snowflake → GCS data products, start here with Monda.
To share data from Snowflake to Google Cloud Storage (GCS), you need to extract data from Snowflake and upload it to a GCS bucket. This typically involves two steps: extracting the data from Snowflake (with a SQL unload or a client library), then uploading the resulting files to GCS using gsutil, gcloud, or SDKs. The first approach below handles both steps in a single Python script.

1. Install required packages (Bash)
```bash
pip install "snowflake-connector-python[pandas]" google-cloud-storage
```
2. Export Snowflake data to Pandas and write to GCS (Python)
```python
import pandas as pd
import snowflake.connector
from google.cloud import storage

# Connect to Snowflake
conn = snowflake.connector.connect(
    user='YOUR_USER',
    password='YOUR_PASSWORD',
    account='YOUR_ACCOUNT',
    warehouse='YOUR_WAREHOUSE',
    database='YOUR_DATABASE',
    schema='YOUR_SCHEMA'
)

# Query the table into a DataFrame (fetch_pandas_all requires pyarrow)
cursor = conn.cursor()
cursor.execute("SELECT * FROM your_table")
df = cursor.fetch_pandas_all()
cursor.close()
conn.close()

# Write the DataFrame to a local CSV file (uploaded below)
df.to_csv('data.csv', index=False)

# Upload the file to GCS (authenticates via Application Default Credentials)
client = storage.Client()
bucket = client.get_bucket('your-bucket-name')
blob = bucket.blob('folder/data.csv')
blob.upload_from_filename('data.csv')
```
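If you'd rather skip the intermediate file on disk, the same client can upload the CSV straight from memory. A minimal sketch, reusing the df and bucket objects from the script above:

```python
# Serialize the DataFrame in memory and upload it directly to GCS
blob = bucket.blob('folder/data.csv')
blob.upload_from_string(df.to_csv(index=False), content_type='text/csv')
```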
If you prefer to stay in SQL, you can unload data directly from Snowflake instead. First, configure GCS credentials in Snowflake using a storage integration (SQL):

```sql
CREATE STORAGE INTEGRATION my_gcs_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = GCS
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://your-bucket-name/your-path/');
```
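Snowflake provisions a GCP service account for the integration, and your bucket has to grant that account access before any unload will succeed. You can look up the account name with DESC STORAGE INTEGRATION:

```sql
-- Shows the integration's properties, including STORAGE_GCP_SERVICE_ACCOUNT;
-- grant that service account access to your bucket in the GCP console
DESC STORAGE INTEGRATION my_gcs_integration;
```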
Then create an external stage (SQL):

```sql
CREATE STAGE my_gcs_stage
  URL = 'gcs://your-bucket-name/your-path/'
  STORAGE_INTEGRATION = my_gcs_integration
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```
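Before unloading, it can be worth sanity-checking that the stage points where you expect. DESC STAGE shows the URL, integration, and file format it was created with:

```sql
-- Verify the stage's bucket path, integration, and file format settings
DESC STAGE my_gcs_stage;
```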
Finally, unload the data from Snowflake to GCS (SQL):

```sql
COPY INTO @my_gcs_stage/data_export_
FROM (SELECT * FROM your_table)
FILE_FORMAT = (TYPE = CSV)
SINGLE = TRUE
OVERWRITE = TRUE;
```
This will unload the data as a CSV file into your GCS bucket.
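To confirm the file landed, you can list the stage contents, which shows the files Snowflake sees at the GCS path:

```sql
-- Lists files in the GCS location the stage points to
LIST @my_gcs_stage;
```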
A few notes:

- The service account Snowflake created for the integration must be granted sufficient permissions on the bucket (e.g., Storage Object Admin).
- You can unload other formats such as PARQUET, JSON, etc. by changing the FILE_FORMAT.
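For example, a Parquet unload only needs a different FILE_FORMAT; this sketch assumes the same stage and table as above (HEADER = TRUE preserves the original column names in the Parquet files):

```sql
-- Unload as Parquet instead of CSV
COPY INTO @my_gcs_stage/data_export_parquet_
FROM (SELECT * FROM your_table)
FILE_FORMAT = (TYPE = PARQUET)
HEADER = TRUE
OVERWRITE = TRUE;
```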
Exporting data from Snowflake to Google Cloud Storage manually gives you full control, but it comes at a cost. You'll need to:

- configure storage integrations and IAM permissions,
- stage files and manage formats like CSV or Parquet,
- and maintain COPY INTO commands or Python scripts.

This approach is flexible, but time-consuming, and it often requires deep familiarity with both Snowflake and GCP's security model.
With Amplify Data, you can skip all of this complexity and deliver data from Snowflake to GCS without any of the manual setup described above.
Amplify is purpose-built to make Snowflake-to-GCS data sharing fast, secure, and repeatable, so your team can focus on insights, not infrastructure.