---
title: Ingest from Snowflake using Azure Blob Storage
meta:
  description: Learn how to send data from Snowflake to Tinybird using Azure Blob Storage.
---

# Ingest from Snowflake using Azure Blob Storage

Read on to learn how to send data from Snowflake to Tinybird, for example when you need to run periodic full replaces of a table or a one-off ingest.

This process relies on [unloading](https://docs.snowflake.com/en/user-guide/data-unload-overview), or bulk exporting, data as gzipped CSVs and then ingesting them using the Data Sources API.

## Prerequisites

To follow these steps, you need a Tinybird account, access to Snowflake, and permission to create SAS tokens for Azure Blob Storage.

{% steps %}

## Unload the Snowflake table

Snowflake lets you [unload](https://docs.snowflake.com/en/user-guide/data-unload-overview) query results as flat files to an external storage service. For example:

```sql
COPY INTO 'azure://myaccount.blob.core.windows.net/unload/'
  FROM mytable
  CREDENTIALS = ( AZURE_SAS_TOKEN='****' )
  FILE_FORMAT = ( TYPE = CSV  COMPRESSION = GZIP )
  HEADER = FALSE;
```

The most basic implementation is [unloading directly](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location#unloading-data-from-a-table-directly-to-files-in-an-external-location), but for production use cases consider adding a [named stage](https://docs.snowflake.com/en/user-guide/data-unload-azure#unloading-data-into-an-external-stage), as the Snowflake docs suggest. Stages give you fine-grained control over access rights, as in the sketch below.
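For example, a minimal sketch of unloading through a named stage. The stage name `my_unload_stage` is a placeholder; adjust the URL, credentials, and table to your setup:

```sql
-- Hypothetical stage name; the URL and SAS token are placeholders.
CREATE OR REPLACE STAGE my_unload_stage
  URL = 'azure://myaccount.blob.core.windows.net/unload/'
  CREDENTIALS = ( AZURE_SAS_TOKEN = '****' )
  FILE_FORMAT = ( TYPE = CSV COMPRESSION = GZIP );

-- Unload through the stage, so credentials live in one place
-- instead of being repeated in every COPY INTO statement.
COPY INTO @my_unload_stage
  FROM mytable
  HEADER = FALSE;
```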

## Create a SAS token for the file

Using [Azure CLI](https://learn.microsoft.com/en-us/cli/azure/install-azure-cli), generate a [shared access signature (SAS) token](https://learn.microsoft.com/en-us/azure/ai-services/translator/document-translation/how-to-guides/create-sas-tokens?tabs=blobs) so Tinybird can read the file:

```shell
az storage blob generate-sas \
    --account-name myaccount \
    --account-key '****' \
    --container-name unload \
    --name data.csv.gz \
    --permissions r \
    --expiry <expiry-ts> \
    --https-only \
    --output tsv \
    --full-uri

> 'https://myaccount.blob.core.windows.net/unload/data.csv.gz?se=2024-05-31T10%3A57%3A41Z&sp=r&spr=https&sv=2022-11-02&sr=b&sig=PMC%2E9ZvOFtKATczsBQgFSsH1%2BNkuJvO9dDPkTpxXH0g%5D'
```

{% callout type="tip" %}
You can follow the same approach with S3 and GCS by generating presigned URLs.
{% /callout %}
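For example, with the AWS CLI you can presign an S3 object in one command. This is a sketch; the bucket and key are placeholders:

```shell
aws s3 presign s3://my-bucket/unload/data.csv.gz --expires-in 3600
```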

## Ingest into Tinybird

Take the generated URL and make a call to Tinybird. You need a [token](/administration/auth-tokens) with `DATASOURCES:CREATE` permissions:

```shell
curl \
-H "Authorization: Bearer <DATASOURCES:CREATE token>" \
-X POST "{% user("apiHost") %}/v0/datasources?name=my_datasource_name" \
-d url='https://myaccount.blob.core.windows.net/unload/data.csv.gz?se=2024-05-31T10%3A57%3A41Z&sp=r&spr=https&sv=2022-11-02&sr=b&sig=PMC%2E9ZvOFtKATczsBQgFSsH1%2BNkuJvO9dDPkTpxXH0g%5D'
```
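Ingesting from a URL is asynchronous, so the call returns a job. You can poll the Jobs API until the import finishes. A minimal sketch, assuming `<job_id>` is the `id` returned by the previous call:

```shell
curl \
-H "Authorization: Bearer <DATASOURCES:CREATE token>" \
"{% user("apiHost") %}/v0/jobs/<job_id>"
```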

Once the job finishes, you have your Snowflake table in Tinybird.

{% /steps %}

## Automation

To adapt this to production scenarios, such as appending data on a schedule or replacing data that has been updated in Snowflake, you might need to define scheduled actions that move the data.
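For example, a scheduled job could re-run the unload, generate a fresh SAS URL, and call the Data Sources API with `mode=replace` to swap the contents of the Data Source. A minimal sketch of that last step; `<fresh-sas-url>` stands for the newly signed URL:

```shell
curl \
-H "Authorization: Bearer <DATASOURCES:CREATE token>" \
-X POST "{% user("apiHost") %}/v0/datasources?mode=replace&name=my_datasource_name" \
-d url='<fresh-sas-url>'
```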

## Limits

Because you're using the Data Sources API, its [limits](/api-reference#limits) apply.

You might need to adjust your [`COPY INTO <location>`](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location) expression by adding `PARTITION BY` or `MAX_FILE_SIZE`. For example:

```sql
COPY INTO 'azure://myaccount.blob.core.windows.net/unload/'
  FROM mytable
  CREDENTIALS = ( AZURE_SAS_TOKEN='****' )
  FILE_FORMAT = ( TYPE = CSV  COMPRESSION = GZIP )
  HEADER = FALSE
  MAX_FILE_SIZE = 5000000000;
```
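If single files are still too large, `PARTITION BY` splits the unload into multiple smaller files that you can ingest one by one. A sketch, assuming a hypothetical `dt` date column in `mytable`:

```sql
COPY INTO 'azure://myaccount.blob.core.windows.net/unload/'
  FROM mytable
  PARTITION BY ( TO_VARCHAR(dt) )  -- dt is a hypothetical date column
  CREDENTIALS = ( AZURE_SAS_TOKEN='****' )
  FILE_FORMAT = ( TYPE = CSV  COMPRESSION = GZIP )
  HEADER = FALSE
  MAX_FILE_SIZE = 5000000000;
```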

## Next steps

See the following resources:

- [Ingest from Snowflake using AWS S3](./ingest-from-snowflake-using-aws-s3)
- [Ingest from Snowflake using incremental updates](./ingest-from-snowflake-using-incremental-updates)
- [GCS Connector](/forward/get-data-in/connectors/gcs)
