

Your data in Snowflake turned into actionable API Endpoints 

Write SQL queries on your data held in Snowflake and directly expose the results, or queries on those results, as API endpoints. Easy to maintain. Always up-to-date. Fast as can be.


Trusted by companies like...

The Hotels Network · Feedback Loop · Stay · Plytix · Audiense · Situm · Genially

Connect to your Snowflake database using your Snowflake account details and your Google Cloud project and bucket details, and start building APIs right away.

Data Sources: Snowflake

Use an SQL query to get the data you need from Snowflake and then run SQL queries on that data in Tinybird.

SQL-based

All your Data APIs in one place, automatically documented and scaled. Consistent results for your Data/Dev Teams.

TTLs and Roll-ups

Use Auth tokens to control access to API endpoints. Implement access policies as you need. Support for row-level security.
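Tinybird endpoints accept a token either as a `token` query parameter or as a Bearer token in the `Authorization` header. A minimal sketch in Python of both options (the token value and pipe name below are placeholders, not real credentials):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder values for illustration only; use your own read token and pipe name.
TOKEN = "<your-read-token>"
BASE = "https://api.tinybird.co/v0/pipes/avg_triptime.json"

# Option 1: pass the token as a query parameter.
url_with_token = f"{BASE}?{urlencode({'token': TOKEN})}"

# Option 2: send the token in the Authorization header instead.
req = Request(BASE, headers={"Authorization": f"Bearer {TOKEN}"})
```

Scoped read tokens let you hand out access per endpoint, so a leaked token only exposes the data that endpoint serves.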


Build in minutes, not weeks

Ingest, query and build APIs for your data at scale in a matter of minutes. Forget about ETLs, performance tuning and complex security rules.

$ tb \
datasource append $DESTINATION_DATA_SOURCE \
   --connector snowflake \
   --sql "select * from $TABLE"
** 🐔
starting export process from snowflake
** 🥚
starting import process
** 🐥 done
Total rows in taxi: 232341
Data appended to data source 'taxi' successfully!
Data pushed to taxi
NODE avg_triptime_endpoint
SQL >
    %
    SELECT
        toDayOfMonth(pickup_datetime) AS day,
        avg(dateDiff('minute', pickup_datetime, dropoff_datetime)) AS avg_trip_time_minutes
    FROM tripdata
    {% if defined(start_date) and defined(end_date) %}
    WHERE pickup_datetime BETWEEN {{Date(start_date)}} AND {{Date(end_date)}}
    {% end %}
    GROUP BY day

$ tb push endpoints/avg_triptime.pipe
** Processing endpoints/avg_triptime.pipe
** Building dependencies
** Running avg_triptime
** => Test endpoint at
** 'avg_triptime' created
** Not pushing fixtures
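Once pushed, the pipe is live as a parameterized REST endpoint: the template parameters (`start_date`, `end_date`) become query-string parameters. A sketch of calling it from Python, assuming the standard `/v0/pipes/<name>.json` URL shape; the token and dates are placeholders:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Placeholder token; substitute a read token from your workspace.
TOKEN = "<your-read-token>"

params = urlencode({
    "token": TOKEN,
    "start_date": "2023-01-01",
    "end_date": "2023-01-31",
})
url = f"https://api.tinybird.co/v0/pipes/avg_triptime.json?{params}"

# Uncomment to fetch; the JSON response carries the rows under "data".
# result = json.load(urlopen(url))
# rows = result["data"]
```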


Ingest CSV files quickly and easily

Automate ingestion of data from your Snowflake database.


Create your Pipes

Filter, clean or enrich your data using Pipes, a new way of chaining SQL queries designed to ease development and maintenance.


Publish your API endpoints

Share access securely to your data in a click and get full OpenAPI and Postman documentation for your APIs.

We accelerate your data, no matter where it is.

Connect data from Relational Databases, Data Warehouses and Data Streams.

Amazon Redshift

Amazon S3

Google BigQuery

Apache Kafka