Working with global applications means dealing with timestamps from different timezones, and ClickHouse's toTimeZone function handles this conversion while preserving the exact moment in time. The function takes a DateTime value and shifts its display to match any target timezone without changing when the event actually occurred.
This guide walks through the complete syntax, common pitfalls, performance considerations, and practical examples for converting datetime values between timezones in ClickHouse. You'll learn how to handle daylight saving transitions, work around dynamic timezone limitations, and build timezone-aware analytics APIs.
What toTimeZone does and when to use it
The toTimeZone function takes a DateTime or DateTime64 value and converts it from one timezone to another while keeping the same moment in time. Think of it like changing the clock display without changing when something actually happened.
When you call toTimeZone(datetime_value, 'Target/TimeZone'), ClickHouse shifts the time representation to match the target timezone. If you have a UTC timestamp showing 10:00 AM, converting it to US Eastern time during winter gives you 5:00 AM - same moment, different display.
You'll find yourself reaching for timezone conversion in a few common scenarios:
- User localization: Show dashboard times in each user's local timezone instead of UTC
- Business reporting: Convert server logs to company headquarters timezone for daily reports
- Multi-region analytics: Display regional performance data in local business hours
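The "same moment, different clock face" idea isn't specific to ClickHouse; a quick Python sketch using the standard zoneinfo module (an illustration, not ClickHouse code) shows the same semantics:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A stored UTC instant, like a server-side event timestamp
utc_event = datetime(2024, 1, 15, 10, 0, tzinfo=timezone.utc)

# The same conversion idea as toTimeZone: same instant, new wall clock
eastern = utc_event.astimezone(ZoneInfo("America/New_York"))

print(utc_event.isoformat())  # 2024-01-15T10:00:00+00:00
print(eastern.isoformat())    # 2024-01-15T05:00:00-05:00
print(utc_event == eastern)   # True: equal because they are the same moment
```

Equality holds because aware datetimes compare by instant, which is exactly how toTimeZone treats its input.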
Basic syntax and parameters
The function follows a simple pattern: toTimeZone(your_datetime_column, 'Timezone/Name'). The first part is your datetime data, and the second tells ClickHouse where to convert it.
SELECT
event_time,
toTimeZone(event_time, 'Europe/Berlin') AS berlin_time
FROM events;
Parameter requirements
Your datetime input works with DateTime or DateTime64 columns, but strings won't work directly. If you have string timestamps, you'll need to convert them first with toDateTime().
The timezone parameter only accepts string literals - you can't use column values or variables here. ClickHouse needs to know the timezone at query planning time, not during execution.
Return types
Whatever type you put in comes back out. DateTime input gives you DateTime output, DateTime64 keeps its precision level. The difference is that the returned value carries new timezone information that changes how ClickHouse displays the timestamp.
-- Input: 2024-01-15 10:00:00 (UTC)
-- Output: 2024-01-15 11:00:00 (Europe/Berlin)
Constant vs non-constant zone arguments
Here's where things get tricky. ClickHouse requires the timezone to be a constant string, which means this works:
SELECT toTimeZone(created_at, 'US/Pacific') FROM orders;
But this fails:
SELECT toTimeZone(created_at, user_timezone) FROM orders;
-- Error: Timezone must be const string
The limitation exists because ClickHouse needs to resolve timezone conversions when it plans the query, not when it runs.
Supported time zone names in ClickHouse
ClickHouse uses the IANA timezone database - the same standard that powers Linux systems and most programming languages. These names follow the 'Continent/City' pattern like 'America/New_York' or 'Europe/London'.
Querying system.time_zones
You can explore available timezones directly in ClickHouse:
SELECT name FROM system.time_zones
WHERE name LIKE '%New_York%' OR name LIKE '%UTC%'
ORDER BY name;
This query shows you the exact strings ClickHouse recognizes, which helps avoid typos that cause runtime errors.
Common aliases and abbreviations
While full IANA names work best, ClickHouse supports some common shortcuts:
- UTC variants: 'UTC', 'GMT', 'Zulu'
- US regions: 'US/Eastern', 'US/Central', 'US/Mountain', 'US/Pacific'
- European zones: 'CET', 'EET', 'WET'
However, complete canonical names like 'America/New_York' are the safer choice: shortcuts like 'US/Eastern' are deprecated aliases in the tzdata database, and abbreviations like 'CET' can be ambiguous across systems.
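You can verify that an alias and its canonical name resolve to the same rules with a small Python check against the same IANA database (illustrative, not ClickHouse code):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

instant = datetime(2024, 7, 1, 12, 0, tzinfo=timezone.utc)  # DST active in July

# 'US/Eastern' is a tzdata alias (link) that resolves to 'America/New_York'
alias_local = instant.astimezone(ZoneInfo("US/Eastern"))
full_local = instant.astimezone(ZoneInfo("America/New_York"))

print(alias_local.strftime("%H:%M %z"))  # 08:00 -0400
print(full_local.strftime("%H:%M %z"))   # 08:00 -0400
```

Both render the same wall clock and offset, so the preference for canonical names is about clarity and portability, not different behavior.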
Convert UTC timestamps to local zones step by step
Let's walk through converting UTC order data to US Eastern time with a real example.
1. Select raw UTC data
Start by looking at your source data:
SELECT
order_id,
created_at,
total_amount
FROM orders
WHERE created_at >= '2024-01-01'
LIMIT 5;
This shows your original UTC timestamps before any conversion.
2. Apply toTimeZone with literal zone
Add the timezone conversion:
SELECT
order_id,
created_at AS utc_time,
toTimeZone(created_at, 'US/Eastern') AS eastern_time,
total_amount
FROM orders
WHERE created_at >= '2024-01-01'
LIMIT 5;
The hour values shift by 4-5 hours depending on whether daylight saving time is active.
3. Validate daylight saving shifts
Test across daylight saving boundaries:
SELECT
created_at AS utc_time,
toTimeZone(created_at, 'US/Eastern') AS eastern_time,
formatDateTime(toTimeZone(created_at, 'US/Eastern'), '%Y-%m-%d %H:%M:%S %z') AS with_offset
FROM orders
WHERE created_at BETWEEN '2024-03-09' AND '2024-03-11'
ORDER BY created_at;
You'll see the offset change from -05:00 to -04:00 as daylight saving begins, confirming automatic DST handling.
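If you want to sanity-check those DST offsets without a ClickHouse server, the same 2024 transition can be reproduced in Python with zoneinfo (illustrative only):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")

# One UTC timestamp before and one after the 2024-03-10 07:00 UTC switch
before = datetime(2024, 3, 10, 6, 30, tzinfo=timezone.utc).astimezone(eastern)
after = datetime(2024, 3, 10, 7, 30, tzinfo=timezone.utc).astimezone(eastern)

print(before.strftime("%H:%M %z"))  # 01:30 -0500 (EST)
print(after.strftime("%H:%M %z"))   # 03:30 -0400 (EDT; 2:00-3:00 local never exists)
```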
Convert and format in one query
Often you want both timezone conversion and formatting for display purposes.
formatDateTime with zone placeholder
The formatDateTime function includes timezone placeholders:
SELECT
order_id,
formatDateTime(toTimeZone(created_at, 'Europe/Berlin'), '%Y-%m-%d %H:%M:%S %z') AS formatted_berlin
FROM orders;
The %z placeholder shows the timezone offset (+0100), while %Z displays the abbreviation (CET).
Combining toTimeZone and formatDateTime
Nest the functions for single-step conversion and formatting:
SELECT
order_id,
formatDateTime(
toTimeZone(created_at, 'Asia/Tokyo'),
'%b %d, %Y at %I:%M %p JST'
) AS tokyo_display
FROM orders
WHERE created_at >= today() - INTERVAL 1 DAY;
This produces readable output like "Jan 15, 2024 at 07:30 PM JST" for user interfaces.
Convert using dynamic zone values
While toTimeZone requires constant timezone strings, you can work around this limitation.
Using zone stored in another column
Direct column usage fails:
-- This produces an error
SELECT toTimeZone(event_time, user_timezone) FROM user_events;
-- Error: Timezone must be const string
Work around with CASE expressions
For limited timezone sets, use CASE statements:
SELECT
event_time,
CASE user_region
WHEN 'US_EAST' THEN toTimeZone(event_time, 'US/Eastern')
WHEN 'US_WEST' THEN toTimeZone(event_time, 'US/Pacific')
WHEN 'EUROPE' THEN toTimeZone(event_time, 'Europe/London')
ELSE event_time
END AS localized_time
FROM user_events;
For larger timezone mappings, run a separate query per timezone group, or read the UTC value together with the timezone name and convert on the client side.
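The client-side approach is straightforward in most languages; here is a Python sketch with zoneinfo, where the constant-timezone restriction doesn't apply (the row shape is hypothetical):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical rows as they might come back from ClickHouse: UTC time + tz name
rows = [
    {"event_time": datetime(2024, 1, 15, 14, 30, tzinfo=timezone.utc), "tz": "US/Eastern"},
    {"event_time": datetime(2024, 1, 15, 14, 30, tzinfo=timezone.utc), "tz": "Asia/Tokyo"},
]

# The per-row conversion ClickHouse can't do in SQL is trivial client-side
localized = [
    (row["tz"], row["event_time"].astimezone(ZoneInfo(row["tz"])))
    for row in rows
]
for tz_name, local in localized:
    print(tz_name, local.strftime("%Y-%m-%d %H:%M"))
```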
toTimeZone vs toDateTime vs toDateTime64
Each function serves different conversion needs:
Function | Input Type | Primary Use | Timezone Handling |
---|---|---|---|
toTimeZone | DateTime/DateTime64 | Convert existing timestamps | Changes timezone metadata |
toDateTime | String/Number | Parse new timestamps | Sets initial timezone |
toDateTime64 | String/Number | Parse with sub-second precision | Sets timezone with microseconds |
When to prefer toDateTime
Use toDateTime when parsing string representations with timezone information:
-- Parsing strings with timezone info
SELECT toDateTime('2024-01-15 10:30:00', 'Europe/Berlin') AS parsed_time;
-- Converting existing DateTime values
SELECT toTimeZone(existing_datetime, 'Europe/Berlin') AS converted_time;
The key difference: toDateTime creates new DateTime values from strings, while toTimeZone converts existing ones.
Why use DateTime64 for sub-second precision
When working with high-frequency data, toTimeZone preserves precision levels:
SELECT
high_precision_timestamp,
toTimeZone(high_precision_timestamp, 'US/Pacific') AS pacific_precise
FROM trading_events
WHERE high_precision_timestamp >= now() - INTERVAL 1 HOUR;
The conversion maintains all decimal places in your timestamp precision.
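The precision-preservation guarantee mirrors how aware datetimes behave generally; this Python sketch (not ClickHouse code) shows sub-second detail surviving a conversion:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A timestamp with microsecond precision, analogous to DateTime64(6)
ts = datetime(2024, 1, 15, 10, 0, 0, 123456, tzinfo=timezone.utc)
pacific = ts.astimezone(ZoneInfo("US/Pacific"))

print(pacific.isoformat())   # 2024-01-15T02:00:00.123456-08:00
print(pacific.microsecond)   # 123456 - sub-second precision survives conversion
```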
Performance and index impact of timezone conversion
Timezone conversion affects query performance, especially in sorting and filtering operations.
Effect on sorting keys
Using toTimeZone in ORDER BY prevents ClickHouse from using existing datetime indexes:
-- Slower: conversion during sorting
SELECT * FROM events
ORDER BY toTimeZone(event_time, 'US/Pacific') DESC;
-- Faster: sort on original column
SELECT
event_id,
toTimeZone(event_time, 'US/Pacific') AS pacific_time
FROM events
ORDER BY event_time DESC;
The second approach leverages existing indexes while providing converted timestamps.
Materialized view strategies
Pre-convert timezones in materialized views for frequently queried zones:
NODE localized_events
SQL >
SELECT
event_id,
event_time AS utc_time,
toTimeZone(event_time, 'US/Eastern') AS eastern_time,
toTimeZone(event_time, 'Europe/London') AS london_time,
user_id,
event_type
FROM raw_events
TYPE MATERIALIZED
DATASOURCE localized_events_mv
This eliminates runtime conversion costs for common timezone combinations.
Determinism and caching
The toTimeZone function produces identical outputs for identical inputs, allowing ClickHouse to cache conversion results within query execution. However, DST transitions create edge cases where the same local time can represent two different UTC moments during "fall back" periods.
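The "fall back" ambiguity is easy to reproduce in Python, where the fold attribute distinguishes the two occurrences of the repeated local hour (illustrative, using 2024's US transition):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")

# 2024-11-03 01:30 local happened twice in US Eastern ("fall back")
first = datetime(2024, 11, 3, 1, 30, tzinfo=eastern)           # fold=0: EDT
second = datetime(2024, 11, 3, 1, 30, fold=1, tzinfo=eastern)  # fold=1: EST

print(first.astimezone(timezone.utc))   # 2024-11-03 05:30:00+00:00
print(second.astimezone(timezone.utc))  # 2024-11-03 06:30:00+00:00
```

One local wall-clock reading maps to two distinct UTC instants, which is why round-tripping local times back to UTC is risky around DST transitions.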
Common mistakes to avoid
Several patterns lead to incorrect results or performance issues:
- Invalid timezone strings: Using 'PST' instead of 'US/Pacific'
- Wrong input types: Passing strings without DateTime casting
- Double conversion: Applying timezone conversion multiple times
Passing invalid zone strings
ClickHouse rejects unrecognized timezone names immediately:
-- This fails
SELECT toTimeZone(event_time, 'PST') FROM events;
-- Error: Unknown time zone: PST
-- Use this instead
SELECT toTimeZone(event_time, 'US/Pacific') FROM events;
Always verify timezone names against the system.time_zones table before production use.
Forgetting to cast strings to DateTime
String datetime values need explicit conversion:
-- This produces an error
SELECT toTimeZone('2024-01-15 10:00:00', 'Europe/Berlin');
-- Cast strings first
SELECT toTimeZone(toDateTime('2024-01-15 10:00:00'), 'Europe/Berlin');
Applying conversion twice
Converting already-converted timestamps - typically after a round trip through strings or client code that re-reads the wall clock as UTC - produces incorrect results:
-- Wrong: double conversion
SELECT toTimeZone(toTimeZone(utc_time, 'US/Eastern'), 'US/Eastern') FROM events;
-- Correct: single conversion
SELECT toTimeZone(utc_time, 'US/Eastern') FROM events;
Track which columns contain converted values to avoid this multiplication error.
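The mechanics of the double-conversion bug are easiest to see in Python: once a wall clock has been shifted, re-labeling it as UTC and converting again silently moves the instant (illustrative sketch):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("US/Eastern")
utc_time = datetime(2024, 1, 15, 10, 0, tzinfo=timezone.utc)

once = utc_time.astimezone(eastern)  # correct: 05:00 EST

# Re-labeling the already-converted wall clock as UTC and converting again
# is the classic double-conversion bug: the instant silently shifts
twice = once.replace(tzinfo=timezone.utc).astimezone(eastern)

print(once.strftime("%H:%M"))   # 05:00
print(twice.strftime("%H:%M"))  # 00:00 - five hours wrong
```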
Best practices for timezone safety
Following these guidelines prevents timezone-related bugs in production systems.
Store timestamps in UTC
Keep all stored timestamps in UTC, converting only during read operations:
-- Good: UTC storage schema
CREATE TABLE events (
event_id UInt64,
event_time DateTime('UTC'),
user_id UInt64
) ENGINE = MergeTree()
ORDER BY (event_time, user_id);
This approach eliminates ambiguity about stored data timezone and simplifies data pipeline logic.
Convert at read time only
Apply timezone conversion in SELECT queries rather than during data ingestion:
-- Convert for display, keep storage in UTC
SELECT
event_id,
toTimeZone(event_time, 'US/Pacific') AS display_time,
user_id
FROM events
WHERE event_time >= today() - INTERVAL 7 DAY;
This pattern maintains data consistency while providing flexible timezone display options.
Document your zone assumptions
Clearly document timezone expectations in schema definitions, API documentation, and data pipeline code. Include comments specifying whether datetime columns contain UTC or local times, and note any conversion requirements for downstream consumers.
Building an example time-series analytics API with Tinybird
Time series data is everywhere, and crypto is a good example: transactions happen 24/7 across every timezone. Here's how to build a timezone-aware crypto analytics API on Tinybird's managed ClickHouse platform that converts UTC transaction data to each user's local timezone.
Step 1: Create a data source for crypto transactions
First, create a data source to store global crypto transactions. Create a crypto_transactions.datasource file:
SCHEMA >
`transaction_id` String,
`timestamp_utc` DateTime64(3, 'UTC'),
`user_id` String,
`wallet_address` String,
`transaction_type` LowCardinality(String),
`cryptocurrency` LowCardinality(String),
`amount_crypto` Float64,
`amount_usd` Float64,
`fees_usd` Float64,
`exchange` LowCardinality(String),
`user_timezone` String
ENGINE "MergeTree"
ENGINE_PARTITION_KEY "toYYYYMM(timestamp_utc)"
ENGINE_SORTING_KEY "timestamp_utc, user_id"
Deploy the data source:
tb --cloud deploy
Ingest sample crypto transaction data using the Events API, which expects newline-delimited JSON (one event per line, not a JSON array):
curl -X POST \
  -H "Authorization: Bearer $TINYBIRD_TOKEN" \
  "https://api.tinybird.co/v0/events?name=crypto_transactions" \
  -d '{"transaction_id": "tx_001", "timestamp_utc": "2024-12-01 14:30:00", "user_id": "user_001", "wallet_address": "0x123...abc", "transaction_type": "buy", "cryptocurrency": "BTC", "amount_crypto": 0.025, "amount_usd": 2450.00, "fees_usd": 12.25, "exchange": "coinbase", "user_timezone": "America/New_York"}
{"transaction_id": "tx_002", "timestamp_utc": "2024-12-01 22:15:00", "user_id": "user_002", "wallet_address": "0x456...def", "transaction_type": "sell", "cryptocurrency": "ETH", "amount_crypto": 1.5, "amount_usd": 3750.00, "fees_usd": 18.75, "exchange": "binance", "user_timezone": "Europe/London"}
{"transaction_id": "tx_003", "timestamp_utc": "2024-12-02 03:45:00", "user_id": "user_003", "wallet_address": "0x789...ghi", "transaction_type": "swap", "cryptocurrency": "SOL", "amount_crypto": 50.0, "amount_usd": 12500.00, "fees_usd": 25.00, "exchange": "kraken", "user_timezone": "Asia/Tokyo"}'
Step 2: Create a timezone-aware transaction analytics API
Create a pipe that converts UTC timestamps to the user's local timezone for portfolio analysis. Create user_transaction_history.pipe:
TOKEN "crypto_analytics_read" READ
NODE transaction_analysis
SQL >
%
SELECT
    transaction_id,
    user_id,
    -- ClickHouse requires a constant timezone, so the pipe takes it as a
    -- template parameter (pass each user's stored timezone from your app)
    toTimeZone(timestamp_utc, {{String(tz, 'UTC')}}) AS local_timestamp,
    toDate(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) AS local_date,
    toHour(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) AS local_hour,
    -- Original UTC timestamp for reference
    timestamp_utc,
    user_timezone,
    transaction_type,
    cryptocurrency,
    amount_crypto,
    amount_usd,
    fees_usd,
    exchange,
    -- Classify daily trading patterns in the display timezone
    CASE
        WHEN toHour(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) BETWEEN 9 AND 17
            THEN 'market_hours'
        WHEN toHour(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) BETWEEN 18 AND 23
            THEN 'evening'
        ELSE 'night_early_morning'
    END AS trading_period,
    -- Running total within the local day; ordering by the UTC column gives
    -- the same order and lets ClickHouse use the sorting key
    sum(amount_usd) OVER (
        PARTITION BY user_id, toDate(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}}))
        ORDER BY timestamp_utc
    ) AS daily_volume_running_total
FROM crypto_transactions
WHERE timestamp_utc >= {{DateTime(start_time, '2024-12-01 00:00:00')}}
    AND timestamp_utc < {{DateTime(end_time, '2024-12-03 00:00:00')}}
    {% if defined(user_id_filter) %}
    AND user_id = {{String(user_id_filter)}}
    {% end %}
    {% if defined(timezone_filter) %}
    AND user_timezone = {{String(timezone_filter)}}
    {% end %}
    {% if defined(crypto_filter) %}
    AND cryptocurrency = {{String(crypto_filter)}}
    {% end %}
    {% if defined(transaction_type_filter) %}
    AND transaction_type = {{String(transaction_type_filter)}}
    {% end %}
ORDER BY timestamp_utc DESC
LIMIT {{Int32(limit, 100)}}
TYPE ENDPOINT
Deploy the API endpoint:
tb --cloud deploy
Step 3: Create timezone-aware portfolio summary API
Create another pipe for aggregated portfolio insights by timezone. Create portfolio_summary_by_timezone.pipe:
TOKEN "portfolio_read" READ
NODE portfolio_summary
SQL >
%
SELECT
    user_timezone,
    -- ClickHouse requires a constant timezone, so the display timezone is a
    -- template parameter rather than the per-row user_timezone column
    toDate(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) AS trading_date,
    transaction_type,
    cryptocurrency,
    count() AS transaction_count,
    sum(amount_crypto) AS total_crypto_amount,
    sum(amount_usd) AS total_usd_volume,
    sum(fees_usd) AS total_fees,
    avg(amount_usd) AS avg_transaction_size,
    min(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) AS first_transaction_local,
    max(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})) AS last_transaction_local,
    -- Most active trading hour in the display timezone
    topK(1)(toHour(toTimeZone(timestamp_utc, {{String(tz, 'UTC')}})))[1] AS most_active_hour_local
FROM crypto_transactions
WHERE timestamp_utc >= {{DateTime(start_date, '2024-12-01 00:00:00')}}
    AND timestamp_utc < {{DateTime(end_date, '2024-12-03 00:00:00')}}
    {% if defined(timezone_filter) %}
    AND user_timezone = {{String(timezone_filter)}}
    {% end %}
    {% if defined(crypto_filter) %}
    AND cryptocurrency = {{String(crypto_filter)}}
    {% end %}
GROUP BY user_timezone, trading_date, transaction_type, cryptocurrency
ORDER BY total_usd_volume DESC
LIMIT {{Int32(limit, 50)}}
TYPE ENDPOINT
Deploy the portfolio API:
tb --cloud deploy
Step 4: Query the timezone-aware crypto APIs
Your crypto analytics APIs are now available with timezone conversion. Query them for different analysis scenarios:
# Get a user's transaction history, displayed in their local timezone
curl "https://api.tinybird.co/v0/pipes/user_transaction_history.json?start_time=2024-12-01%2000:00:00&end_time=2024-12-02%2023:59:59&user_id_filter=user_001&tz=America%2FNew_York" \
  -H "Authorization: Bearer $TINYBIRD_TOKEN"
# Analyze Bitcoin transactions across all timezones (times displayed in UTC)
curl "https://api.tinybird.co/v0/pipes/user_transaction_history.json?start_time=2024-12-01%2000:00:00&end_time=2024-12-02%2023:59:59&crypto_filter=BTC" \
  -H "Authorization: Bearer $TINYBIRD_TOKEN"
# Get a portfolio summary for European users, displayed in London time
curl "https://api.tinybird.co/v0/pipes/portfolio_summary_by_timezone.json?start_date=2024-12-01%2000:00:00&end_date=2024-12-02%2023:59:59&timezone_filter=Europe%2FLondon&tz=Europe%2FLondon" \
  -H "Authorization: Bearer $TINYBIRD_TOKEN"
The APIs return timezone-converted data showing local timestamps alongside portfolio insights:
{
"meta": [
{"name": "transaction_id", "type": "String"},
{"name": "user_id", "type": "String"},
{"name": "local_timestamp", "type": "DateTime"},
{"name": "local_date", "type": "Date"},
{"name": "local_hour", "type": "UInt8"},
{"name": "timestamp_utc", "type": "DateTime64"},
{"name": "user_timezone", "type": "String"},
{"name": "transaction_type", "type": "String"},
{"name": "cryptocurrency", "type": "String"},
{"name": "amount_crypto", "type": "Float64"},
{"name": "amount_usd", "type": "Float64"},
{"name": "trading_period", "type": "String"},
{"name": "daily_volume_running_total", "type": "Float64"}
],
"data": [
{
"transaction_id": "tx_001",
"user_id": "user_001",
"local_timestamp": "2024-12-01 09:30:00",
"local_date": "2024-12-01",
"local_hour": 9,
"timestamp_utc": "2024-12-01 14:30:00",
"user_timezone": "America/New_York",
"transaction_type": "buy",
"cryptocurrency": "BTC",
"amount_crypto": 0.025,
"amount_usd": 2450.0,
"trading_period": "market_hours",
"daily_volume_running_total": 2450.0
}
],
"rows": 1,
"statistics": {
"elapsed": 0.004,
"rows_read": 3,
"bytes_read": 384
}
}
This approach provides:
- Timezone-aware transaction history with local timestamps for better user experience
- Trading pattern analysis showing when users are most active in their local time
- Portfolio insights aggregated by local trading days rather than UTC days
- Flexible filtering by user, timezone, cryptocurrency, and transaction type
- Running totals calculated correctly within each user's local trading day
Your time series crypto transaction data becomes available as production-ready APIs that automatically handle timezone conversion for global users. Start building with Tinybird's free plan to create timezone-aware analytics APIs.
Additional resources
- Date and Time Functions
- Type Conversion Functions
- DateTime Data Types
- Database Timestamps and Timezones Best Practices
- Tinybird vs. ClickHouse: What's the difference?
Frequently asked questions about ClickHouse timezone conversion
Is toTimeZone deterministic across ClickHouse versions?
Yes, toTimeZone produces consistent results across ClickHouse versions because the conversion logic itself is stable between releases. The bundled IANA timezone data does get updated periodically, though, so conversions near newly redefined transitions can change when the tzdata version changes.
How do I list daylight saving transitions for a zone?
Query specific dates around March and November to observe DST behavior:
SELECT
    d,
    toTimeZone(toDateTime(d || ' 12:00:00'), 'US/Eastern') AS eastern_time,
    formatDateTime(toTimeZone(toDateTime(d || ' 12:00:00'), 'US/Eastern'), '%z') AS offset
FROM (
    SELECT arrayJoin(['2024-03-10', '2024-03-11', '2024-11-03', '2024-11-04']) AS d
);
Can I index a column produced by toTimeZone?
Not directly on an existing table's primary index, but you can create materialized views with pre-converted timezone values and index those columns. This approach provides performance benefits while maintaining timezone conversion functionality.