
Google Cloud Storage

From files in Google Cloud Storage to low-latency APIs

All that data you have in Google Cloud Storage can easily be put to work building data products. Automate ingestion and publish low-latency API endpoints with Tinybird.


Trusted by companies like...

The Hotels Network
Feedback Loop
Stay
Plytix
Audiense
Situm
Genially

You have your CSV files in a Google Cloud Storage bucket. Now you can easily build APIs on top.

We detect new files in your Google Cloud Storage bucket and ingest them automatically, at up to millions of rows per second.

SQL based

After ingestion, run fast, serverless transformations using our Data Pipes.

Secure

Use Auth tokens to control access to API endpoints. Implement access policies as you need. Support for row-level security.

Build in minutes, not weeks

Ingest, query, and build APIs for your data at scale in a matter of minutes. Forget about ETLs, performance tuning, and complex security rules.

$ tb datasource append tripdata https://storage.googleapis.com/nyc_taxi_example/fhv_tripdata_2021-03.csv
🥚 starting import process 
🐥 done

** Appended 973916 new rows 
** Total rows in tripdata: 23854144
** Data appended to data source 'tripdata' successfully! 
** Data pushed to tripdata

NODE avg_triptime_endpoint
SQL >
  SELECT
    toDayOfMonth(pickup_datetime) as day,
    avg(dateDiff('minute', pickup_datetime, dropoff_datetime)) as avg_trip_time_minutes
  FROM tripdata
  {% if defined(start_date) and defined(end_date) %}
    WHERE pickup_datetime BETWEEN {{Date(start_date)}} AND {{Date(end_date)}}
  {% end %}
  GROUP BY day

$ tb push endpoints/avg_triptime.pipe
** Processing endpoints/avg_triptime.pipe
** Building dependencies
** Running avg_triptime
** => Test endpoint at
https://api.tinybird.co/v0/pipes/avg_triptime.json
** 'avg_triptime' created
** Not pushing fixtures

1

Ingest CSV files fast and easily

Automate ingestion of files from your Google Cloud Storage bucket through our REST API. Transform or augment data on ingest if needed.
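In production you would typically trigger ingestion from a bucket notification rather than the CLI. A minimal sketch, assuming the Data Sources API's `mode=append` with a remote `url` parameter; the token value is a placeholder for a token with write scope:

```shell
# Sketch: append a CSV from a public GCS bucket via Tinybird's REST API.
# TB_TOKEN is a placeholder -- substitute a token of your own.
TB_TOKEN="<your_auth_token>"
FILE_URL="https://storage.googleapis.com/nyc_taxi_example/fhv_tripdata_2021-03.csv"
API="https://api.tinybird.co/v0/datasources?mode=append&name=tripdata&url=${FILE_URL}"

# From a GCS notification handler you would then run:
#   curl -X POST -H "Authorization: Bearer ${TB_TOKEN}" "$API"
echo "$API"
```

Pointing the API at the file's URL avoids downloading it locally first; Tinybird fetches and ingests it server-side.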

2

Create your Pipes

Filter, clean or enrich your data using Pipes, a new way of chaining SQL queries designed to ease development and maintenance.

3

Publish your API endpoints

Securely share access to your data in a click, and get full OpenAPI and Postman documentation for your APIs.
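Once published, the endpoint built from the pipe above is plain HTTP: its template parameters (`start_date`, `end_date`) become query-string arguments, and an Auth token gates access. A minimal sketch, with a placeholder token:

```shell
# Sketch: query the published avg_triptime endpoint over HTTP.
# TOKEN is a placeholder -- substitute a read token for this pipe.
TOKEN="<your_read_token>"
URL="https://api.tinybird.co/v0/pipes/avg_triptime.json?start_date=2021-03-01&end_date=2021-03-31&token=${TOKEN}"

# Fetch with: curl "$URL"
echo "$URL"
```

Omitting `start_date`/`end_date` simply skips the `WHERE` clause in the pipe, returning averages over the whole data source.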

We accelerate your data, no matter where it is.

Connect data from Relational Databases, Data Warehouses and Data Streams.

Amazon Redshift

Amazon S3

Google BigQuery

Apache Kafka

PostgreSQL

MySQL

Snowflake

FAQ