Changelog
New updates and improvements in Tinybird

Massive update after a busy launch week #2
The last few weeks have been incredibly busy as we prepared for our second-ever launch week while also working on dozens of fixes and improvements.

Au revoir Recently Used
After some internal research into how users browse their Tinybird Workspaces, we've decided to remove the Recently Used block from the Sidebar and add dynamic reordering to the Pipes and Data Sources sections.
Tinyfixes all around
We've made several improvements to the CLI, the UI, and more!

Launch week’s hangover!
The flock has been really busy over the last couple of weeks preparing our first-ever launch week. We are now quickly improving our newly released features based on your feedback and preparing for the next one, but in the meantime, here is a quick recap.

A new way to copy data within Tinybird is in beta preview
It's a new API that enables a Pipe to write into a Tinybird Data Source, either on demand or as a scheduled operation.
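As a rough sketch of how an on-demand copy might be triggered over HTTP while the API is in beta preview (the `/v0/pipes/{pipe}/copy` path and the `build_copy_request` helper are assumptions for illustration, not the confirmed contract; check the beta documentation for the real endpoint):

```python
import urllib.parse
import urllib.request


def build_copy_request(token: str, pipe_name: str) -> urllib.request.Request:
    """Build (but do not send) a POST that would trigger an on-demand copy.

    NOTE: the /v0/pipes/{pipe}/copy path is an assumption for illustration;
    consult the Copy API beta documentation for the actual endpoint.
    """
    url = f"https://api.tinybird.co/v0/pipes/{urllib.parse.quote(pipe_name)}/copy"
    req = urllib.request.Request(url, data=b"", method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    return req


# Inspect the request before sending it with urllib.request.urlopen(req)
req = build_copy_request("p.XXXX", "daily_aggregation_pipe")
print(req.full_url)
```

A scheduled copy would presumably add a cron-style schedule parameter when the Pipe is configured, rather than changing this per-request call.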

New BigQuery connector in Private Beta
Bring your data from BigQuery and build production-ready analytics applications.

Introducing the new Data Ingestion Experience
Say hi to our redesigned version of the Data Ingestion UI.

Observability for the BI connector
Until recently, BI Connector usage was not available to track. We have now made two new Service Data Sources available that show stats about BI Connector consumption.
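These stats can be read through the SQL API like any other Data Source. A minimal sketch, assuming a `tinybird.bi_stats_rt` service Data Source (the name, the `count()` query, and the `build_bi_stats_request` helper are illustrative; see the Service Data Sources docs for the actual names and schema):

```python
import urllib.parse
import urllib.request


def build_bi_stats_request(token: str) -> urllib.request.Request:
    """Build a GET against the Query API that reads BI Connector stats.

    NOTE: the tinybird.bi_stats_rt name is assumed for illustration;
    check the Service Data Sources documentation for the real schema.
    """
    query = "SELECT count() AS queries FROM tinybird.bi_stats_rt"
    url = "https://api.tinybird.co/v0/sql?" + urllib.parse.urlencode({"q": query})
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    return req


req = build_bi_stats_request("p.XXXX")
print(req.full_url)
```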

Increased observability capabilities in Service Data Sources
Several improvements and bug fixes shipped this week around ingestion and around progress tracking for ingestion and materialization. And, of course, there is great new observability data available in our extended and new Service Data Sources.

Notable fixes and improvements
We've shipped bug fixes and small features to improve your product experience. Here are some of the most recent ones.

Quickstart on loading events to Tinybird
The most recent changes to Tinybird include the simplest possible ingestion method, plus CLI tools for testing.
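That simplest method is posting newline-delimited JSON straight to the Events API. A minimal sketch (the `events_example` Data Source name and the `build_events_request` helper are placeholders for illustration):

```python
import json
import urllib.parse
import urllib.request


def build_events_request(
    token: str, datasource: str, events: list[dict]
) -> urllib.request.Request:
    """Build a POST of newline-delimited JSON to the Events API.

    One JSON object per line; the target Data Source name is passed
    as the `name` query parameter. Names here are placeholders.
    """
    body = "\n".join(json.dumps(e) for e in events).encode("utf-8")
    url = "https://api.tinybird.co/v0/events?" + urllib.parse.urlencode(
        {"name": datasource}
    )
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    return req


# Send with urllib.request.urlopen(req) once the token and name are real
req = build_events_request("p.XXXX", "events_example", [{"event": "signup", "user_id": 1}])
print(req.full_url)
```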

Data Source descriptions and beta testing of Parquet ingestion
Document your Data Sources by adding descriptions, just like you do for Pipes and Nodes.