---
title: "Import Postgres tables into Tinybird with PostgreSQL Table Function"
excerpt: "The PostgreSQL table function connects your Postgres data to Tinybird directly. Query across systems without complex ETL pipelines."
authors: "Cameron Archer"
categories: "Product updates"
createdOn: "2024-07-17 00:00:00"
publishedOn: "2024-07-29 00:00:00"
updatedOn: "2026-01-15 00:00:00"
status: "published"
---

<p>Today, we announce support for the <a href="https://www.tinybird.co/docs/ingest/postgresql" rel="noreferrer">PostgreSQL Table Function</a> in Tinybird. You can now export tables from PostgreSQL and sync data from your Postgres database into a Tinybird Data Source with just a few lines of SQL.</p><p>For example, the following SQL gets all columns from a remote Postgres table hosted on Supabase, passing the username and password as secrets, and filtering for rows whose <code>stock_date</code> is greater than the maximum <code>stock_date</code> in the corresponding Tinybird Data Source, <code>stock_prices</code>.</p>
<!--kg-card-begin: html-->
<iframe width="100%" src="https://snippets.tinybird.co/XQAAAAKFAQAAAAAAAABBKUqGk9nLKzhV5Sio6Br37u8N8vqBSXtopF4amPgEg1e13Ay5MEUBXoIt4KW2bgGeMgk7koGvsJLeCF2RIv58TWN2SBYEZaVCYMstU7yZYEICbNBzvb6fjmTVmKy9DIJfNEsr6f-gyVEI4iYoPoa84Kl3NzQQOgnQBLu4f-XkBjmWOKCG7x_TO3dXlyCHY1EJkcHEipcIw2H_5HvV2x7DiDiHBdSeDDEMVeFKoxESsLy3daA7dlbvxxLNqtXtG1eNdoQILxXRkMw4dY2cKQbG3b9tSO9tfanBGj4pgt_wMVyl1AEt99-uHIti5tW9kQLsFct3uj5mKMbCXWDZVolBtd0mGHasPnXewQjc_ese-cLX-7oAkA/embed"></iframe>
<!--kg-card-end: html-->
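<p>In case the embedded snippet doesn't load, here is a rough text sketch of that query. The Supabase host and secret names are assumptions for illustration, not the exact values from the snippet:</p>

```sql
SELECT *
FROM postgresql(
    -- hypothetical Supabase connection details
    'aws-0-us-east-1.pooler.supabase.com:5432',
    'postgres',       -- database
    'stock_prices',   -- remote table
    {{tb_secret('pg_username')}},
    {{tb_secret('pg_password')}}
)
-- push the filter down to Postgres: only fetch rows newer than
-- what the Tinybird Data Source already holds
WHERE stock_date > (SELECT max(stock_date) FROM stock_prices)
```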
<p>You can run the PostgreSQL Table Function within Tinybird's Copy Pipes using a replace or append strategy to regularly bring data into Tinybird and enrich your events with Postgres data.</p><p>Read on for more information about Tinybird's support for Postgres table imports, our reasons for supporting this feature, links to resources, and what's next.</p><h2 id="first-postgres-then%E2%80%A6">First Postgres, then…</h2><p>PostgreSQL is the most popular database in the world, chosen by developers due to its ease of use, vanilla SQL, many deployment options, and broad community.</p><p>Postgres is the perfect "general purpose" database, and that's why SaaS builders choose it to underpin their applications.</p><p>To some extent, Postgres can even support analytics. At a relatively modest scale, however, Postgres begins to break down as an analytics database. There are plenty of extensions that can make Postgres more analytics-friendly.</p><p>Those implementations address part but not all of the problem: they may add columnar storage but still rely on the existing query engine, or they may scale vertically on a single node but not horizontally for distributed query compute. Even the most fine-tuned Postgres instances can struggle in high-scale, high-performance, real-time scenarios.</p><p>To build scalable, real-time, user-facing analytics, developers typically need to migrate their analytics data off Postgres and into a more specialized DBMS. This can be painful, because it often means spinning up new infrastructure, learning new tools, and supporting custom extract-load services to perform the migration (and keep the databases in sync).</p><p>Tinybird's support for the PostgreSQL Table Function makes it easy to directly access data in Postgres and bring it over to Tinybird's purpose-built real-time platform for user-facing analytics. 
If you're wondering about the best way to replicate Postgres data into Tinybird, this is it.</p><blockquote>“Tinybird was the clear winner in terms of how quickly we could get up and running to validate our ideas and build a great UX. We chose Tinybird because we could just throw data into it, build and iterate our analytics with SQL, and then immediately expose it as an API when it was ready, all without maintaining any external dependencies.” <br><br>- Aayush Shah, Co-Founder and CTO of Blacksmith</blockquote><p>If you're a developer who has built an app on top of Postgres, the PostgreSQL Table Function will help you:</p><ol><li>Export tables from PostgreSQL and migrate time-series and event data from a Postgres instance to Tinybird to scale real-time analytics workloads.</li><li>Replicate Postgres data into Tinybird to hydrate analytical queries with dimensional data.</li></ol><h2 id="using-the-postgresql-table-function-in-tinybird">Using the PostgreSQL Table Function in Tinybird</h2>
<p>Unlike Tinybird's native Data Source Connectors, the PostgreSQL Table Function leverages Tinybird Pipes: You write a bit of SQL to <code>SELECT FROM</code> your remote Postgres database and either replace or append that data to a Tinybird Data Source using Copy Pipes.</p><p>Below is a summary of how to use the PostgreSQL Table Function to both backfill historical data and append new data in Postgres tables. For a more thorough understanding of these features, including capabilities, requirements, and limitations, please <a href="https://www.tinybird.co/docs/ingest/postgresql" rel="noreferrer"><u>refer to the documentation</u></a>.</p><h3 id="new-environment-variables-api">New: Environment Variables API</h3><p>The PostgreSQL Table Function relies on the new Tinybird Environment Variables API. With the Environment Variables API, you can securely store secrets and other variables in Tinybird for use with table functions and (in the future) other external connections.</p><p>You can access stored secrets in Tinybird Pipes using <code>{{tb_secret('secret_name')}}</code>. For more details on the Environment Variables API, including how to define and store a secret in Tinybird, please refer to the <a href="https://www.tinybird.co/docs/api-reference/environment-variables-api" rel="noreferrer">Environment Variables API documentation</a>.</p><h3 id="step-1-define-a-backfill-query">Step 1. Define a backfill query</h3><p>In most cases, you'll start with a backfill of historical data, copying the full Postgres table over to Tinybird. To do this, create a Tinybird Pipe and <code>SELECT</code> data from your Postgres table using the <code>postgresql()</code> table function.</p><p>For example, consider a Postgres instance hosted on Supabase:</p>
<!--kg-card-begin: html-->
<iframe width="100%" src="https://snippets.tinybird.co/XQAAAAJjAQAAAAAAAABBKUqGk9nLKzhRrkSZ0VcSimwPiawteTMH3pr1cvfHej_frbwArBa7hFglQKfyIf68oN9QX7B82dmZ12aToYf0m839aGLoqZkofaIIleIkgrtsIA1oIMym63YOAUNvmRZN0HOy-Dv6KX4G3-BlQ4mr-9dbiE_8Rj-qZCQYRu2TXbwbU07e2WXjE8L2Wu_y4zPKdg9tInvy-mOpN0kNoKagyoqfHlrl79Alo75NqXiD0ivdpm6DZgYLQFH5kyZXi1-pxYnUVBIaxFJOkbdBUQZGDa7w26XM9GZPfJFtE_74Q4dSEsfDuj6FI0AEROCGK9HO9k0v7Ignhv2htt5X7wFOFV_RKLREzmWmCPH_liAqAA/embed"></iframe>
<!--kg-card-end: html-->
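<p>If you manage your Tinybird resources as files, the same backfill can be expressed as a Copy Pipe definition. The following is a sketch only; the connection details, node name, and Data Source name are assumptions, and the <code>%</code> line enables Tinybird's templating so <code>tb_secret()</code> resolves:</p>

```sql
NODE backfill_from_postgres
SQL >
    %
    SELECT *
    FROM postgresql(
        'aws-0-us-east-1.pooler.supabase.com:5432',  -- hypothetical host
        'postgres',
        'stock_prices',
        {{tb_secret('pg_username')}},
        {{tb_secret('pg_password')}}
    )

TYPE copy
TARGET_DATASOURCE stock_prices
COPY_SCHEDULE @on-demand
COPY_MODE replace
```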
<p>This query uses the <code>postgresql()</code> table function to select data from the Postgres instance, using credentials stored in <code>tb_secrets</code>.</p><p>If you don't need the full table, you can push down filters to Postgres. These filters are applied in Postgres, not Tinybird, reducing your Postgres scan size for better performance.</p><p>Filtering can also be useful when you want multiple Tinybird Data Sources based on certain filters, or if you're performing a large migration and need to partition your Postgres table into multiple backfill jobs.</p><h3 id="step-2-create-a-replacing-copy-pipe">Step 2. Create a replacing Copy Pipe</h3><p>You can use Tinybird's Copy Pipes to copy the results of this query into a new Tinybird Data Source table. Simply export the query node as a Copy Pipe, define your resulting table schema, select a copy strategy, and define a schedule.</p><figure class="kg-card kg-image-card"><img src="https://tinybird-blog.ghost.io/content/images/2024/07/postgres_backfill_2-min.png" class="kg-image" alt="" loading="lazy" width="1600" height="1096" srcset="https://tinybird-blog.ghost.io/content/images/size/w600/2024/07/postgres_backfill_2-min.png 600w, https://tinybird-blog.ghost.io/content/images/size/w1000/2024/07/postgres_backfill_2-min.png 1000w, https://tinybird-blog.ghost.io/content/images/2024/07/postgres_backfill_2-min.png 1600w" sizes="(min-width: 720px) 720px"></figure><p>For backfills, use the <code>REPLACE</code> strategy to import the full table. For a one-off import, you can set up an On Demand schedule that allows you to manually trigger a full replace at any time. Alternatively, you can schedule a periodic (e.g. weekly) replace job to clean up any discrepancies from delta updates.</p><p>Complete the Copy Pipe definition and run the initial Copy Job. 
You've now copied your Postgres table into a properly indexed database table for real-time analytics.</p><figure class="kg-card kg-image-card"><img src="https://tinybird-blog.ghost.io/content/images/2024/07/postgres_backfill-min.png" class="kg-image" alt="" loading="lazy" width="1600" height="1096" srcset="https://tinybird-blog.ghost.io/content/images/size/w600/2024/07/postgres_backfill-min.png 600w, https://tinybird-blog.ghost.io/content/images/size/w1000/2024/07/postgres_backfill-min.png 1000w, https://tinybird-blog.ghost.io/content/images/2024/07/postgres_backfill-min.png 1600w" sizes="(min-width: 720px) 720px"></figure><p>If you're using the PostgreSQL Table Function to sync dimensional data into Tinybird, you can proceed to the next step. If you're performing a one-off migration of event data, you can then point your event streams at this new Tinybird Data Source using the Events API or one of our native streaming Connectors.</p><h3 id="step-3-build-an-incremental-update-query">Step 3. Build an incremental update query</h3><p>With the Postgres table copied into Tinybird, you can now set up a Copy Pipe to incrementally append updates from your Postgres table to your Tinybird Data Source.</p><p>To do so, you can copy over the initial backfill query, and add a <code>WHERE</code> clause to filter only Postgres records by timestamps greater than the latest timestamp in your Tinybird Data Source, for example:</p>
<!--kg-card-begin: html-->
<iframe width="100%" src="https://snippets.tinybird.co/XQAAAAKFAQAAAAAAAABBKUqGk9nLKvXMY9JRhPTh3mdXXaxcbqZy3YnN0nQdwYh9SG6KnGssCCUK_oAWg0jNLQLfyVa8awytbQ0QZrkV4T7o-_s0Ue7EL5dkHpjnS5VQvQ4Jr3IqaG5DStkVnT9bqUf051FhHrDmhbSZ6Uiej2QoSCF0zDGF5NJFoDprJK7LnPKSdRDv2cAkZoMcZbZ-58JS7KXjh4T-UvnrS5o7ADPIHmYRpyOICRMPljeJGe2IEIEdxaQdhbskSbtVReiSOBIZX-T744pRYCCgwgRyDz5a49Dj2eJ60FxfjjcyL7xT9IU1eOUVOTdYgV4tW5c9Ja6ug6zH-gtZ1mWD719Vaikb-IFvz8n4Yl-9BZujKwr_8GdAhw/embed"></iframe>
<!--kg-card-end: html-->
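<p>As a text sketch of the incremental query above, under the assumption that the remote table is called <code>orders</code> and using the column and Data Source names from the description (the host and secret names remain hypothetical):</p>

```sql
SELECT *
FROM postgresql(
    'aws-0-us-east-1.pooler.supabase.com:5432',  -- hypothetical host
    'postgres',
    'orders',  -- hypothetical remote table
    {{tb_secret('pg_username')}},
    {{tb_secret('pg_password')}}
)
-- the subquery runs in Tinybird; the resulting comparison is
-- pushed down to Postgres, so only new rows are scanned there
WHERE pg_timestamp > (
    SELECT max(pg_timestamp) FROM tinybird_postgres_datasource
)
```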
<p>There's some great functionality here: You can push the filter down to Postgres <em>and </em>use a subquery over a Tinybird Data Source to define that filter. This query pushes the <code>WHERE</code> filter down to Postgres, selecting only rows in the Postgres table whose value in the <code>pg_timestamp</code> column is greater than the latest timestamp in the corresponding column of the Tinybird Data Source (in this case called <code>tinybird_postgres_datasource</code>).</p><h3 id="step-4-create-an-appending-copy-pipe">Step 4. Create an appending Copy Pipe</h3><p>Now, you can publish this incremental query as a Copy Pipe using the <code>APPEND</code> strategy, setting a reasonable schedule based on your data freshness requirements. Based on the schedule you define, Tinybird will trigger incremental Copy Jobs to append new data from your Postgres table into your Tinybird Data Source.</p><figure class="kg-card kg-image-card"><img src="https://tinybird-blog.ghost.io/content/images/2024/07/postgres_append-min.png" class="kg-image" alt="" loading="lazy" width="1600" height="1096" srcset="https://tinybird-blog.ghost.io/content/images/size/w600/2024/07/postgres_append-min.png 600w, https://tinybird-blog.ghost.io/content/images/size/w1000/2024/07/postgres_append-min.png 1000w, https://tinybird-blog.ghost.io/content/images/2024/07/postgres_append-min.png 1600w" sizes="(min-width: 720px) 720px"></figure><h3 id="step-5-build-something">Step 5. Build something!</h3><p>You've successfully set up a fully managed pipeline from Postgres to Tinybird. The rest is up to you. Use Tinybird Pipes to shape your data, enrich events with Postgres dimensions, and publish low-latency, scalable REST Endpoints to power your user-facing analytics.</p><h2 id="get-started-with-the-postgresql-table-function">Get started with the PostgreSQL Table Function</h2><p>The PostgreSQL Table Function is now available in Private Beta. 
To gain access to these features, contact Tinybird's Customer Support team at <a href="mailto:support@tinybird.co"><u>support@tinybird.co</u></a>.</p><p>For more details about the PostgreSQL Table Function, <a href="https://www.tinybird.co/docs/ingest/postgresql" rel="noreferrer"><u>refer to the documentation</u></a>. To see an example Postgres to Tinybird migration, check out the screencast below.</p><p>As always, if you have any questions about how to replicate Postgres data into Tinybird, please join our <a href="https://www.tinybird.co/community"><u>Slack Community</u></a>.</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/P2ablq_4MX0?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="" title="Import and sync Postgres data into Tinybird with the PostgreSQL Table Function"></iframe></figure><h3 id="first-time-with-tinybird">First time with Tinybird?</h3><p>If you're currently working with Postgres and looking for a real-time analytics database to scale your performance, try Tinybird. The Build plan is free, with no credit card requirement and no time limit. <a href="https://www.tinybird.co/signup" rel="noreferrer">Sign up today</a> to unify data from Postgres and other sources, shape it with SQL, and publish your queries as scalable REST Endpoints that you can integrate into your application.</p>
