---
title: TOO_MANY_ROWS_OR_BYTES ClickHouse error
meta:
  description: Learn how to fix the TOO_MANY_ROWS_OR_BYTES error in ClickHouse and Tinybird. Understand what causes it and see examples of how to resolve it.
---

# TOO_MANY_ROWS_OR_BYTES ClickHouse error

{% callout %}
This error means your query result exceeds the configured row or byte limits. Add a LIMIT clause, filter the data, or otherwise reduce the result set so it fits within those limits.
{% /callout %}

The `TOO_MANY_ROWS_OR_BYTES` error in ClickHouse (and Tinybird) happens when a query result exceeds the configured `max_result_rows` or `max_result_bytes` limits while `result_overflow_mode` is set to `throw` (the default). It's a safety mechanism that prevents queries from consuming excessive resources.

## What causes this error

You'll typically see it when:

* Query returns too many rows (exceeds `max_result_rows` limit)
* Query returns too much data (exceeds `max_result_bytes` limit)
* Missing LIMIT clause on large datasets
* Aggregations over high-cardinality GROUP BY keys produce too many groups
* Window functions generate excessive intermediate results
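You can reproduce the error deliberately by setting the limits very low on any query. This sketch uses the built-in `system.numbers` table, which is available in any ClickHouse server:

```sql {% title="Reproduce: result limit set deliberately low" %}
SELECT number
FROM system.numbers
LIMIT 100
SETTINGS max_result_rows = 10, result_overflow_mode = 'throw'
```

The query produces 100 rows but the result limit is 10, so ClickHouse raises `TOO_MANY_ROWS_OR_BYTES` instead of returning the result.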

## Example errors

```sql {% title="Fails: no LIMIT on large table" %}
SELECT * FROM events
```

```sql {% title="Fails: too many rows in aggregation" %}
SELECT user_id, event_type FROM events GROUP BY user_id, event_type
```

```sql {% title="Fails: large result set from window function" %}
SELECT *, ROW_NUMBER() OVER (ORDER BY timestamp) FROM events
```

```sql {% title="Fails: cross join producing too many rows" %}
SELECT * FROM users CROSS JOIN events
```

## How to fix it

### Add LIMIT clause

The most common fix is to limit the number of rows returned:

```sql {% title="Fix: add LIMIT" %}
SELECT * FROM events LIMIT 1000
```

### Use more specific WHERE clauses

Filter data before processing to reduce result size:

```sql {% title="Fix: add WHERE clause" %}
SELECT user_id, event_type
FROM events
WHERE timestamp >= now() - INTERVAL 1 DAY
GROUP BY user_id, event_type
```

### Optimize aggregations

Use more specific grouping or add LIMIT to aggregations:

```sql {% title="Fix: limit aggregation results" %}
SELECT user_id, event_type, COUNT(*) as count
FROM events
GROUP BY user_id, event_type
LIMIT 10000
```

### Use subqueries for large datasets

Break down complex queries into smaller parts:

```sql {% title="Fix: use subquery with LIMIT" %}
SELECT * FROM (
    SELECT user_id, COUNT(*) as count
    FROM events
    GROUP BY user_id
    LIMIT 1000
) t
WHERE count > 10
```

## Common patterns and solutions

### Pagination

Use LIMIT with OFFSET for pagination. Note that ClickHouse still reads and discards the skipped rows, so large offsets get progressively slower:

```sql {% title="Pagination example" %}
SELECT * FROM events
ORDER BY timestamp DESC
LIMIT 100 OFFSET 200
```
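For deep pagination, keyset pagination avoids the cost of scanning skipped rows by filtering on the last value seen on the previous page. The cursor value below is a hypothetical placeholder; in practice it comes from your previous page of results:

```sql {% title="Keyset pagination (cursor value is hypothetical)" %}
SELECT * FROM events
WHERE timestamp < '2024-01-01 00:00:00' -- min timestamp from the previous page
ORDER BY timestamp DESC
LIMIT 100
```

This stays fast at any page depth because the WHERE clause prunes data before sorting, instead of reading and discarding `OFFSET` rows.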

### Top N queries

Use LIMIT for top N results:

```sql {% title="Top 10 users by event count" %}
SELECT user_id, COUNT(*) as event_count
FROM events
GROUP BY user_id
ORDER BY event_count DESC
LIMIT 10
```

### Sampling large datasets

Use SAMPLE on MergeTree tables that define a sampling key (`SAMPLE BY` in the table definition):

```sql {% title="Sample 1% of data" %}
SELECT * FROM events SAMPLE 0.01
```

### Time-based filtering

Filter by time ranges to reduce data:

```sql {% title="Last 24 hours only" %}
SELECT * FROM events
WHERE timestamp >= now() - INTERVAL 1 DAY
LIMIT 1000
```

## Advanced solutions

### Use approximate functions

For large datasets, consider approximate functions such as `uniq` or `uniqCombined`, which use far less memory than an exact distinct count:

```sql {% title="Approximate count" %}
SELECT uniq(user_id) AS approximate_user_count
FROM events
```
```

### Optimize JOIN operations

Use proper JOIN conditions to avoid Cartesian products:

```sql {% title="Proper JOIN instead of CROSS JOIN" %}
SELECT u.user_id, e.event_type
FROM users u
JOIN events e ON u.user_id = e.user_id
LIMIT 1000
```

### Use Materialized Views

For frequently accessed aggregations, use Materialized Views:

```sql {% title="Query from Materialized View" %}
SELECT * FROM events_daily_stats LIMIT 1000
```
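The query above assumes a pre-aggregated view named `events_daily_stats` already exists. As a sketch, such a view could be defined in plain ClickHouse like this (the schema and column names are assumptions for illustration):

```sql {% title="Sketch: defining the Materialized View (hypothetical schema)" %}
CREATE MATERIALIZED VIEW events_daily_stats
ENGINE = SummingMergeTree
ORDER BY (day, user_id)
AS
SELECT
    toDate(timestamp) AS day,  -- collapse events to one row per user per day
    user_id,
    count() AS event_count
FROM events
GROUP BY day, user_id
```

Queries against the view then scan the much smaller pre-aggregated data instead of the raw events table.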

## Tinybird-specific notes

In Tinybird, this error often occurs when:

* Querying large Data Sources without proper filtering
* Building Pipes that generate too many intermediate results
* Using Materialized Views that exceed size limits
* Running queries on high-cardinality columns

To debug in Tinybird:

1. Check your Data Source size and add appropriate filters
2. Use the Query Builder to test with LIMIT clauses first
3. Consider breaking large Pipes into smaller nodes
4. Use Materialized Views for pre-aggregated data
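Tinybird Pipes also support query parameters, which let API consumers control the result size without ever removing the LIMIT. A sketch of a node with a parameterized, capped LIMIT (the parameter name `max_rows` is an assumption):

```sql {% title="Tinybird node with a parameterized LIMIT (sketch)" %}
SELECT user_id, count() AS event_count
FROM events
GROUP BY user_id
ORDER BY event_count DESC
LIMIT {{ Int32(max_rows, 100) }}
```

The second argument to `Int32` is the default value used when the caller doesn't pass the parameter.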

## Configuration limits

You can check and adjust limits in ClickHouse:

```sql {% title="Check current limits" %}
SELECT name, value
FROM system.settings
WHERE name IN ('max_result_rows', 'max_result_bytes', 'result_overflow_mode')
```

Common settings to adjust:
* `max_result_rows` - maximum number of rows in the result
* `max_result_bytes` - maximum size of the result in bytes
* `result_overflow_mode` - what happens when a limit is hit: `throw` (default, raises this error) or `break` (returns a truncated result)
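These limits can also be overridden per query with a SETTINGS clause. For example, raising the row limit and switching to `break`, which truncates the result instead of raising the error:

```sql {% title="Override limits per query" %}
SELECT * FROM events
SETTINGS max_result_rows = 10000000, result_overflow_mode = 'break'
```

With `break`, the query stops once the limit is reached and returns a partial result, so only use it when an incomplete result is acceptable.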

## See also

* [ClickHouse Query Optimization](/work-with-data/optimization)
* [Working with Large Datasets](/forward/work-with-data/optimize)
* [Common function errors](/sql-reference/clickhouse-errors/ILLEGAL_TYPE_OF_ARGUMENT)
