
How Order Editing Replaced a Failing DynamoDB Pipeline with Tinybird

A three-person team running $5B in Shopify GMV couldn't afford to babysit analytics infrastructure. Here's how they stopped debugging reconciliation failures and started shipping features.

About the company

Order Editing is a Shopify app that lets customers edit their own orders after checkout instead of filing support tickets. The app processes over $5B in Shopify GMV and is used by brands like HexClad, Nike, Adidas, Reebok, Oh Polly, and Mejuri. The company is bootstrapped, based between New Zealand and Toronto, and runs on a three-person engineering team.

  • 3M events per day
  • 2K+ Shopify merchants
  • ~$300 monthly cost

We're not babysitting infrastructure anymore. That sounds small, but it's huge for a three-person team. We're building features now instead of doom scrolling error logs while eating lunch.

Kiril

Engineering at Order Editing

Problem

Order Editing started where most Shopify apps start: with Shopify's built-in analytics primitives. That worked until it didn't.

As their merchant base grew to include enterprise retailers processing 10k-30k orders, the team needed to track revenue per edit, order modification patterns, and per-store analytics across thousands of Shopify stores. They built a custom pipeline on DynamoDB with materialized views.

It broke in the ways custom analytics pipelines usually break: numbers stopped reconciling, materialized views became impossible to debug, and AWS costs climbed without a clear ceiling. Roughly 50% of team bandwidth was consumed by rate-limit debugging, reconciliation gaps, and merchant-specific investigations.

The numbers weren't just wrong. They were inconsistently wrong. A merchant would flag a discrepancy, the team would investigate, and half the time they couldn't even reproduce it. Revenue totals, edit counts, usage metrics: all unreliable.

Why Tinybird

The team evaluated PostHog next, but it wasn't the right fit.

We looked at PostHog. The analytics capabilities were there, but we didn't need a full product analytics suite. We needed a fast, queryable data layer we could build our own dashboards on top of. Integrating their UI into our Shopify admin experience was more complexity than it was worth.

Kiril

Engineering at Order Editing

Tinybird fit because it solved the specific problem: get event data in, query it fast, expose it via API. No extra UI to wrangle, no analytics platform to integrate around.

Results

  • 3M events/day ingested for approximately $300/month
  • All Shopify admin analytics now powered by Tinybird API endpoints
  • QPS constraints drove better architecture: batching analytics for 50 stores into a single query
  • A three-person engineering team runs analytics infrastructure that serves enterprise retailers, without a dedicated data engineer

Tinybird x Order Editing

When the numbers don't reconcile, enterprise merchants notice

For a Shopify app handling $5B in GMV, analytics accuracy isn't optional. When a brand like HexClad opens Order Editing's admin panel, the revenue-per-edit numbers need to be right. When they're not, conversations get awkward fast.

The Order Editing team counted roughly 20 support tickets directly tied to reconciliation issues, but that was the tip of the iceberg. Account managers absorbed merchant frustration before it ever became a ticket. The head of product brought it up in every meeting, venting about client calls where he got caught flat-footed because the numbers on screen didn't make sense.

Imagine sitting across from a brand like HexClad and not being able to explain why their revenue numbers look off. Once a high-volume enterprise merchant started seeing wrong numbers, it stopped being a 'we'll fix it eventually' problem.

Kiril

Engineering at Order Editing

The DynamoDB architecture had worked for smaller merchants. But as soon as larger merchants with 10k-30k order volumes came in, reconciliations got longer, rate limits hit harder, and jobs started failing mid-stream.

Order Editing dashboard

The decision happened on a ping pong table in New Zealand

The breaking point came during a company offsite in New Zealand. Three engineers sat around a ping pong table in a garage, volleying ideas back and forth: different approaches, trade-offs, architecture options. After 90 minutes of brainstorming, they started building.

The team ingests events through an SQS queue feeding into Tinybird. It's a pattern they arrived at for reliability, even though it's not the typical recommended path. It works.

Shopify events (webhooks + app events + pixel-derived events)
    → ingestion service
    → SQS buffer
    → Tinybird ingestion
    → Tinybird endpoints
    → Shopify admin analytics UI

SQS sits in the middle for durability and replayability. When traffic bursts or something hiccups downstream, events don't get lost.
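A minimal sketch of that forwarding step, under stated assumptions: the `shopify_events` datasource name, the `TINYBIRD_TOKEN` environment variable, and the simplified message shape are illustrative, not Order Editing's actual config. The key property is that the SQS batch is only acknowledged after Tinybird accepts it, which is what makes the queue replayable:

```javascript
// Sketch of the SQS → Tinybird hop. Datasource name, token env var,
// and message shape are assumptions for illustration.

// SQS delivers messages with a Body string; Tinybird's Events API
// accepts newline-delimited JSON (NDJSON) in a single POST.
function toNdjson(messages) {
  return messages.map((m) => m.Body).join('\n');
}

async function forwardBatch(messages) {
  const res = await fetch('https://api.tinybird.co/v0/events?name=shopify_events', {
    method: 'POST',
    headers: { Authorization: `Bearer ${process.env.TINYBIRD_TOKEN}` },
    body: toNdjson(messages),
  });
  // Only delete the SQS batch once Tinybird accepts it, so a failed
  // send stays on the queue and gets retried instead of lost.
  return res.ok;
}
```

Returning `res.ok` lets the consumer decide whether to delete the messages or leave them for redelivery.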

The team also uses Shopify's Web Pixels API as a structured event source. Web Pixels run in sandboxed iframes on the storefront and subscribe to customer events like page_viewed, product_viewed, checkout progression, and cart updates that server-side webhooks can't capture. The API provides access to cart and customer context at render time, enabling client-side behavioral tracking that complements webhook data.

// App Web Pixel example
import { register } from '@shopify/web-pixels-extension';

register(({ analytics, browser, init }) => {
  analytics.subscribe('product_viewed', (event) => {
    // Send to your analytics pipeline
  });

  analytics.subscribe('checkout_completed', (event) => {
    // Capture conversion with full cart context from init.data
  });
});

Order Editing subscribes to events like checkout_completed and custom events tied to order edits. These pixel-derived events flow through the same SQS → Tinybird pipeline alongside webhooks, giving merchants a unified view of both server-side transactions and client-side behavior.
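As a hedged sketch of what the "send to your analytics pipeline" step inside those handlers could look like: the ingestion URL and payload shape below are assumptions for illustration, not the app's actual contract.

```javascript
// Hypothetical payload builder; field names are assumptions.
function buildPayload(eventName, event) {
  return JSON.stringify({
    name: eventName,
    clientId: event.clientId,
    timestamp: event.timestamp,
    data: event.data,
  });
}

// Called from inside analytics.subscribe(...) handlers. keepalive lets
// the request survive checkout page navigation, which matters for
// events like checkout_completed.
function sendEvent(eventName, event) {
  return fetch('https://ingest.example.com/events', { // hypothetical endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildPayload(eventName, event),
    keepalive: true,
  });
}
```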

Migration: two months, mostly backfill

The migration took about two months, mostly spent building and testing a backfill script for six months of historical data.

There were bumps: an initial deduplication issue caused by choosing the wrong attribute as the unique key, and some back-and-forth on materialized view backfill behavior that wasn't obvious from the docs. Claude Code helped with troubleshooting along the way.

Before Tinybird, as soon as larger merchants with 10k-30k order volumes came in, reconciliations got longer, rate limits hit harder, and jobs started failing mid-stream. At some point you stop debugging and start questioning your life choices.

Kiril

Engineering at Order Editing

Before: surviving. After: building

The DynamoDB pipeline kept the team in survival mode. Now they build features they're proud to show merchants, like upsell performance down to the variant level, which they wouldn't have even attempted before.

The architectural constraint that initially felt limiting (QPS limits requiring batched queries) turned out to produce better design. Pulling analytics for up to 50 stores in a single API call is more efficient than the one-query-per-store approach they would have built otherwise.
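A sketch of that batching pattern, with the caveat that the `store_analytics` pipe name and `store_ids` parameter are hypothetical: chunk store IDs into groups of 50 and issue one endpoint call per chunk instead of one call per store.

```javascript
// Split a list into fixed-size batches (50 stores per query here).
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

// Hypothetical endpoint call: one request per batch of 50 stores,
// instead of one request per store, keeps the app under QPS limits.
async function fetchStoreAnalytics(storeIds) {
  const results = [];
  for (const batch of chunk(storeIds, 50)) {
    const url = new URL('https://api.tinybird.co/v0/pipes/store_analytics.json');
    url.searchParams.set('store_ids', batch.join(','));
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${process.env.TINYBIRD_TOKEN}` },
    });
    const { data } = await res.json();
    results.push(...data); // one row per store in the batch
  }
  return results;
}
```

For 2,000 stores, this is 40 requests instead of 2,000, which is the difference between staying under the QPS ceiling and not.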

  • Before: ~50% of team bandwidth on infrastructure → After: building features
  • Before: inconsistent reconciliation failures → After: reliable numbers
  • Before: 20+ support tickets from bad data → After: merchants not complaining (a first)
  • Before: debugging rate limits → After: shipping upsell analytics

Honest friction points

The migration wasn't frictionless. A few things the team ran into:

  • QPS limits (40 queries per second) required rethinking query patterns. For an app serving thousands of stores, this was a real constraint that needed architectural work.
  • Materialized view backfill documentation didn't clearly explain what was possible, which added debugging time during migration.
  • AI tooling trained on old CLI: most AI assistants still reference outdated patterns, which led to confidently wrong answers during development.

The team worked through all of these, but they're worth noting for anyone building a similar multi-tenant analytics layer on Tinybird.

What's next

The team is exploring an internal analytics app built with Cursor and MCP integration, plus a Slack integration for natural-language queries against their Tinybird data. Tinybird Forward's browser-based data exploration is also on their radar.

For a bootstrapped three-person team serving enterprise brands, the shift from babysitting infrastructure to building product features isn't incremental improvement. It's the difference between surviving and competing.

Why aren't you on Tinybird yet? If you need a fast, worry-free, API-first analytics backend, Tinybird is a no-brainer choice. Wish we integrated sooner.

Kiril

Engineering at Order Editing


Ship faster with Tinybird

The data infrastructure and tooling to ship enterprise-grade analytical features faster and at a fraction of the cost

Try it for free