How to Share Data Products from Snowflake to Google BigQuery


One of the most common patterns we see in modern data teams is using Snowflake as the source of truth while delivering analytics and insights to business users or applications running on Google BigQuery.

It makes perfect sense — Snowflake is fantastic for collecting, cleaning, and modeling your data, and BigQuery is deeply integrated with Google Cloud’s ecosystem, making it a natural choice for downstream reporting, ML workflows, or dashboards in tools like Looker and Looker Studio.

But here’s the catch: sharing data from Snowflake to BigQuery isn’t as straightforward as it sounds.

Since Snowflake and BigQuery are separate cloud platforms without native interoperability, you typically need to set up a fairly technical workflow: exporting data to Google Cloud Storage, configuring permissions, loading it into BigQuery, and often orchestrating everything with ETL tools or custom scripts.

These steps work — but they can be time-consuming, brittle, and complex to maintain, especially if your data is changing frequently or you’re supporting multiple stakeholders.

⚡ The Easiest Way to Share Data Products from Snowflake to BigQuery

If you want to skip all the complexity and focus on delivering data products instead of wrestling with pipelines, you can use Amplify Data.

Amplify provides a platform purpose-built for sharing Snowflake data products directly to BigQuery (and other destinations) — securely, reliably, and without all the manual setup.

With Amplify, you can publish your Snowflake data products and let your BigQuery consumers access them seamlessly — no scripts, no storage buckets, no headaches.

In the rest of this guide, we’ll walk through the traditional steps so you understand what’s going on under the hood. But if you’re ready to save time and focus on impact, head over to Amplify Data and start sharing smarter today.

🧭 Technical Guide to Sharing Data Products from Snowflake to BigQuery

Sharing data products from Snowflake to Google BigQuery involves extracting data from Snowflake and loading it into BigQuery, as there's no native direct share functionality between the two platforms. Here are common methods to achieve this:

1. Using Cloud Storage as an Intermediary (Common Approach)

Step-by-Step:

A. Export data from Snowflake to cloud storage (e.g., Google Cloud Storage - GCS):

  1. Authenticate Snowflake with GCS by creating a storage integration and an external stage.
  2. Use the COPY INTO command to export Snowflake data to GCS in CSV, JSON, or Parquet format.

sql
-- Create an external stage that points at your GCS bucket
CREATE STAGE my_gcs_stage
  URL = 'gcs://your-bucket/path/'
  STORAGE_INTEGRATION = my_gcs_integration;

-- Unload the table to the stage as Parquet files
COPY INTO @my_gcs_stage/my_export_
FROM my_table
FILE_FORMAT = (TYPE = 'PARQUET');

Note: You must configure a Snowflake storage integration to securely access GCS.
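
If you haven’t set one up yet, here’s a minimal one-time sketch using the Python connector. The integration name, bucket path, and credentials below are placeholder assumptions, and creating storage integrations requires an elevated role such as ACCOUNTADMIN:

python
import snowflake.connector

# Placeholder credentials -- storage integrations are typically
# created once by an account administrator.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    role="ACCOUNTADMIN",
)

# Create the integration the stage above refers to
conn.cursor().execute("""
    CREATE STORAGE INTEGRATION my_gcs_integration
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'GCS'
      ENABLED = TRUE
      STORAGE_ALLOWED_LOCATIONS = ('gcs://your-bucket/path/')
""")

# DESC shows the STORAGE_GCP_SERVICE_ACCOUNT that Snowflake generated
for row in conn.cursor().execute("DESC STORAGE INTEGRATION my_gcs_integration"):
    print(row)

Snowflake generates a GCP service account for the integration; grant that account access to your bucket before using the stage (see the security tip later in this guide).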

B. Import data from GCS to BigQuery:

bash
# Load the exported Parquet files from GCS into a BigQuery table
bq load --source_format=PARQUET \
  your_dataset.your_table \
  "gs://your-bucket/path/my_export_*.parquet"

Alternatively, load the files through the BigQuery web UI, or set up a scheduled transfer from GCS so new exports are picked up automatically (sketched below).
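
For recurring loads, the BigQuery Data Transfer Service can pull new files from GCS on a schedule. A minimal sketch with the google-cloud-bigquery-datatransfer client — the project, dataset, table, and path names are placeholder assumptions:

python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# A GCS -> BigQuery transfer that picks up new Parquet exports daily
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="your_dataset",
    display_name="Snowflake exports from GCS",
    data_source_id="google_cloud_storage",
    params={
        "data_path_template": "gs://your-bucket/path/my_export_*.parquet",
        "destination_table_name_template": "your_table",
        "file_format": "PARQUET",
    },
    schedule="every 24 hours",
)

created = client.create_transfer_config(
    parent=client.common_project_path("your-project"),
    transfer_config=transfer_config,
)
print(f"Created transfer: {created.name}")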

2. Using ETL Tools (No-Code / Low-Code Options)

Several third-party ETL platforms support Snowflake-to-BigQuery data pipelines:

  • Fivetran
  • Airbyte
  • Hevo Data
  • Matillion
  • Dataform (Google Cloud)

These platforms typically allow:

  • Scheduled syncs or real-time streaming
  • Schema mapping and transformation
  • Easier monitoring and logging

3. Using Python or Custom Code

You can build a custom pipeline using Python with snowflake-connector-python and google-cloud-bigquery. A minimal sketch, with placeholder connection details:

python
import snowflake.connector
from google.cloud import bigquery

# Connect to Snowflake (placeholder credentials) and query the table
sf_conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)
cursor = sf_conn.cursor()
cursor.execute("SELECT * FROM my_table")

# Requires the pandas extra: pip install "snowflake-connector-python[pandas]"
df = cursor.fetch_pandas_all()

# Upload the DataFrame to BigQuery (the table is created if it doesn't exist)
bq_client = bigquery.Client()
bq_client.load_table_from_dataframe(
    df, "your-project.your_dataset.your_table"
).result()

4. Using Apache Airflow (for Managed Pipelines)

If you're orchestrating workflows, an Airflow DAG with Snowflake and BigQuery operators can automate the full pipeline.
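
For example, a two-task DAG sketch using the Snowflake and Google provider packages. Connection IDs, bucket, and table names are placeholder assumptions, and exact operator names vary by provider version:

python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_to_bigquery",
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Re-run the unload from section 1 on a schedule
    export_to_gcs = SnowflakeOperator(
        task_id="export_to_gcs",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO @my_gcs_stage/my_export_
            FROM my_table
            FILE_FORMAT = (TYPE = 'PARQUET')
            OVERWRITE = TRUE;
        """,
    )

    # Load the fresh Parquet files into BigQuery
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="your-bucket",
        source_objects=["path/my_export_*.parquet"],
        destination_project_dataset_table="your-project.your_dataset.your_table",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    export_to_gcs >> load_to_bq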

Tips:

  • Performance: Use compressed columnar formats (Parquet, ORC) for better performance and cost efficiency.
  • Security: Make sure to configure IAM roles and service accounts properly when accessing GCS or BigQuery (see the sketch after this list).
  • Automation: Consider using scheduled jobs or triggers to refresh data regularly.
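
On the security point: the service account Snowflake generates for your storage integration (shown by DESC STORAGE INTEGRATION) needs access to the bucket. A sketch with the google-cloud-storage client — the bucket name and service account email are placeholders:

python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-bucket")

# Grant the Snowflake integration's service account access to the bucket.
# objectAdmin covers both reading and writing unloaded files; use a
# narrower role if the account only needs to read.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {
        "role": "roles/storage.objectAdmin",
        "members": {"serviceAccount:snowflake-sa@example.iam.gserviceaccount.com"},
    }
)
bucket.set_iam_policy(policy)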

To Recap

🛠 Manual Way:

  • Full control over the pipeline (stages, storage, permissions, load jobs).
  • Requires in-depth Snowflake and Google Cloud knowledge.
  • Time-consuming and complex to set up.
  • High customization potential but prone to misconfigurations.
  • Manual handling of scheduling, security, and monitoring.

⚡ Amplify:

  • Publishes Snowflake data products directly to BigQuery.
  • No storage buckets, scripts, or ETL tools to maintain.
  • Fast and easy to set up, no deep cloud expertise required.
  • Less flexibility than a fully custom pipeline.
  • Managed delivery, security, and monitoring.
Skip the technical steps and share data products from Snowflake to Google BigQuery easily using Amplify (powered by Monda). Find out more or get a demo.
