
Turn your Kafka data into actionable API endpoints your teams can consume

Instead of building a new consumer every time you want to make sense of your data streams, write an SQL query and expose an API endpoint. Easy to maintain. Always up to date.

No hassle

Connect to Kafka and start building APIs right away. Forget about everything else.

Fast & Smart

Ingest millions of rows per second. Fix import errors on the fly.

SQL-based

Run fast, embedded transformations using our Data Pipes.

Secure

Implement Data Source-level or row-level permissions using our Auth tokens.
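
As an illustration, a scoped read token can be minted through the Tokens API. This is a minimal sketch: the token name and the SQL filter are made-up examples, and it assumes a Data Source named shopping_cart_events like the one created in the walkthrough below.

# hypothetical token name and row filter, for illustration only
$ curl \
    -H "Authorization: Bearer $ADMIN_TOKEN" \
    -X POST "https://api.tinybird.co/v0/tokens" \
    -d "name=cart_events_reader" \
    -d "scope=DATASOURCES:READ:shopping_cart_events:event_name='order_placed'"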

Turn streams into answers in minutes, not weeks

Ingest, enrich and build APIs over your Kafka data streams at any scale in a matter of minutes. Let us worry about performance so you can focus on enabling your teams.

1. One topic, one data source

   Tinybird consumes your topics in real time into Data Sources that can be queried individually via SQL.

2. Enrich your streams

   As data comes in, you can enrich it with additional business-relevant data via our Data Pipes and prepare it for consumption.

3. Publish API endpoints

   Share access to your data securely with just one click, and get full OpenAPI and Postman documentation for your APIs.
$ curl \
    -H "Authorization: Bearer $TOKEN" \
    -X POST "https://api.tinybird.co/v0/datasources" \
    -d "name=shopping_cart_events" \
    -d "bootstrap_servers=you.confluent.cloud" \
    -d "username=userkey" \
    -d "password=yourconfluentcloudsecret" \
    -d "topic=shopping-cart-events" \
    -d "group_id=cart_charts"
> Topic connected! Receiving data...
$

$ curl \
    -H "Authorization: Bearer $TOKEN" \
    -X POST "https://api.tinybird.co/v0/pipes" \
    -d "name=shopping_cart_events_per_hour" \
    -d "sql=SELECT toStartOfHour(event_time) AS hour, event_name, count() AS c FROM shopping_cart_events GROUP BY hour, event_name"
> Pipe created
{
  "id": "t_e2f96173af3da01c6e08ab647ea74404",
  "name": "shopping_cart_events_per_hour",
  "nodes": [
    {
      "id": "t_3a11bf1e1fa4e98ff214ff99fc6ad2a6",
      "name": "shopping_cart_events_per_hour_0",
      "sql": "SELECT toStartOfHour(event_time) AS hour, event_name, count() AS c FROM shopping_cart_events GROUP BY hour, event_name",
      "created_at": "2020-06-21 08:38:45.752960",
      "updated_at": "2020-06-21 08:38:45.752960"
    }
  ],
  "endpoint": null,
  "created_at": "2020-06-21 08:38:45.752936",
  "updated_at": "2020-06-21 08:38:45.752965"
}
$ curl \
    -H "Authorization: Bearer $TOKEN" \
    -X PUT \
    -d "t_3a11bf1e1fa4e98ff214ff99fc6ad2a6" \
    "https://api.tinybird.co/v0/pipes/shopping_cart_events_per_hour/endpoint"
> Pipe published!
{
  "id": "t_e2f96173af3da01c6e08ab647ea74404",
  "name": "shopping_cart_events_per_hour",
  "nodes": [...],
  "endpoint": "shopping_cart_events_per_hour",
  "created_at": "2020-06-19 08:38:13.752936",
  "updated_at": "2020-06-19 08:49:59.864445"
}
$

$ curl \
    -H "Authorization: Bearer $TOKEN" \
    "https://api.tinybird.co/v0/pipes/shopping_cart_events_per_hour.json"
{
  "meta": [
    { "name": "hour", "type": "DateTime" },
    { "name": "event_name", "type": "String" },
    { "name": "c", "type": "Int16" }
  ],
  "data": [
    { "hour": "2020-06-22 10:00:00", "event_name": "item_added", "c": 12332 },
    { "hour": "2020-06-22 10:00:00", "event_name": "item_removed", "c": 2435 },
    { "hour": "2020-06-22 10:00:00", "event_name": "quantity_changed", "c": 129 },
    { "hour": "2020-06-22 10:00:00", "event_name": "order_placed", "c": 685 }
  ],
  "rows": ...,
  "statistics": {
    "elapsed": 0.037134251,
    "rows_read": 23854144,
    "bytes_read": 190833152
  }
}
$

Accelerate data from almost anywhere

Connect and ingest data from relational databases, data warehouses, and data streams quickly and easily (see the sketch after this list).

Amazon Redshift
Google BigQuery
MySQL
Snowflake
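
For example, a one-off load from a warehouse export can be as simple as appending a CSV by URL. A minimal sketch, assuming a hypothetical Data Source named orders and a reachable CSV export URL:

# hypothetical Data Source name and URL; appends a CSV export to "orders"
$ curl \
    -H "Authorization: Bearer $TOKEN" \
    -X POST "https://api.tinybird.co/v0/datasources?name=orders&mode=append" \
    -d "url=https://your-bucket.s3.amazonaws.com/orders_export.csv"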