Every quant team hits the same wall eventually.

You have built a beautiful alpha signal. The backtests sing. The Sharpe ratio is 1.4. You are ready to push to production — and then you realize that live execution requires raw trade ticks at sub-100ms resolution, and your data vendor does not offer that for US equities.

The meeting that follows is uncomfortable. Someone inevitably asks: "Why can't we get tick data from our current provider? Competitor X offers it. Should we switch?"

This article answers that question directly. Not with marketing language, but with the architectural and strategic reasoning that actually drives product decisions at data infrastructure companies. The answer, as with most honest engineering discussions, begins with a conversation about trade-offs.


What the trades Endpoint Actually Covers

Before explaining the boundary, we must define it precisely. TickDB's trades endpoint — the interface designed for individual transaction data — currently covers the following markets:

Asset class                   Tick-level trades (trades)    OHLCV candles (kline)
US equities (NYSE, NASDAQ)    Not supported                 10+ years, cleaned and aligned
Hong Kong equities            Supported                     Supported
A-shares (China)              Not supported                 Supported
Crypto (spot)                 Supported                     Supported
Forex                         Not supported                 Supported
Precious metals               Not supported                 Supported
Indices                       Not supported                 Supported
This table is the source of truth. When a user asks "does TickDB support US stock tick data?" the answer is no. The trades endpoint returns an empty or error result for any US equity symbol.

The critical distinction is this: TickDB does not lack US equity data entirely. The gap is specifically in tick-level transaction granularity. For US equities, the finest granularity available is 1-minute OHLCV candles (kline), with over a decade of historical depth.
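The coverage table reduces to a small client-side lookup that can guard calls before they hit the API. This is a sketch, not part of any official TickDB SDK; the flags simply mirror the table above:

```python
# Client-side coverage guard mirroring the table above.
# Not part of any official TickDB SDK; the flags are hand-maintained.
COVERAGE = {
    "us_equity":   {"trades": False, "kline": True},
    "hk_equity":   {"trades": True,  "kline": True},
    "a_share":     {"trades": False, "kline": True},
    "crypto_spot": {"trades": True,  "kline": True},
    "forex":       {"trades": False, "kline": True},
    "metals":      {"trades": False, "kline": True},
    "index":       {"trades": False, "kline": True},
}

def supports(asset_class: str, endpoint: str) -> bool:
    """Return True if the given endpoint covers the asset class."""
    return COVERAGE.get(asset_class, {}).get(endpoint, False)
```

Guarding locally avoids burning a request (and a confusing empty result) on a market the endpoint will never serve.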


The Technical Reality of US Equity Data Infrastructure

To understand why this boundary exists, you need to understand what "tick data" actually means in the US equity market — and why it is harder to acquire than most developers assume.

The fragmentation problem

Unlike crypto markets, where a single exchange dominates, US equity volume is fragmented across 16+ exchanges and alternative trading systems (ATSs). FINRA's Trade Reporting Facilities (TRFs) alone process trades from hundreds of venues. A complete tick data feed requires aggregating:

  • Direct exchange feeds (NYSE, NASDAQ, CBOE, IEX)
  • FINRA ADF and TRF reports
  • OTC and dark pool prints
  • Options exchange prints (for the full picture)

This is not a single API. It is a data engineering project that, at institutional scale, costs $50,000 to $500,000 per year in exchange fees alone — before you pay for storage, normalization, and latency infrastructure.

The licensing and compliance overhead

Every US exchange requires a separate data licensing agreement. The CTA (Consolidated Tape Association) data carries its own fee structure. Retail-focused vendors often sidestep this by offering "derived" or "indicative" data, but a professional backtesting system requires CTA/UTP-certified data with millisecond-accurate timestamps.

The legal review alone for a new exchange agreement can take 60–90 days.

Normalization complexity

A trade on the NASDAQ and a trade on a dark pool are not recorded identically. Price, size, venue, and condition codes must be normalized into a consistent schema. For US equities, this involves:

  • Trade condition mapping (regular, odd-lot, derivative, closing print)
  • Adjusted vs. unadjusted price reconciliation
  • Splits, dividends, and corporate action alignment
  • Short sale circuit breaker flagging
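
To make two of those steps concrete, here is a minimal sketch of condition-code mapping and split adjustment. The condition codes and schema below are illustrative, not TickDB's actual internal mapping:

```python
# Illustrative normalization helpers. The condition codes below are
# examples only, not TickDB's (or any exchange's) actual mapping tables.
CONDITION_MAP = {
    "@": "regular",        # regular-way sale
    "I": "odd_lot",        # odd-lot trade
    "6": "closing_print",  # closing print
}

def normalize_condition(raw_code: str) -> str:
    """Map a raw venue condition code into a unified category."""
    return CONDITION_MAP.get(raw_code, "other")

def adjust_for_split(price: float, split_ratio: float) -> float:
    """Divide pre-split prices by the ratio: a 2-for-1 split (ratio=2) halves them."""
    return price / split_ratio
```

Every venue's raw codes must land in the same unified categories, and every historical price must be adjusted consistently, before a backtest over the merged feed means anything.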

TickDB's design philosophy prioritizes correctness over coverage. Shipping partially-normalized US tick data that produces wrong backtest results would be worse than shipping no tick data at all.


Why TickDB Made This Choice: The Strategic Reasoning

Every data vendor faces a fundamental tension: breadth vs. depth. You can be excellent at one thing, or mediocre at many things. TickDB's product architecture reflects a deliberate choice to optimize for depth in specific markets rather than shallow coverage everywhere.

Depth-first vs. breadth-first data strategy

Strategy                             Advantage                                    Disadvantage
Breadth-first (cover everything)     Can say "we support 50,000+ symbols"         Inconsistent data quality; weak in all markets
Depth-first (nail specific markets)  Superior normalization; reliable backtests   Fewer total symbols

TickDB chose depth-first. The HK equity and crypto markets where trades is supported benefit from:

  • Complete order book depth (up to 10 levels for HK and crypto)
  • Normalized tick data with millisecond timestamps
  • Consistent corporate action adjustment
  • Unified schema across all supported venues

Extending this depth to US equities would require either:

  1. A multi-year engineering investment in exchange feed aggregation and normalization
  2. A quality compromise that would undermine the reliability promise TickDB makes to existing users

Neither outcome serves the current user base.

The OHLCV-first positioning

For the vast majority of quantitative strategies — trend following, mean reversion, pairs trading, macro factor models — 1-minute OHLCV data is sufficient for both backtesting and live monitoring. This is not a limitation of TickDB. It is a recognition that:

  • Sub-second tick resolution is rarely necessary for strategy signal generation
  • The computational and storage cost of tick data often exceeds its marginal value over minute candles
  • Most alpha signals that require tick-level granularity are either HFT strategies (requiring co-location and direct exchange feeds) or microstructure studies (where the research can be conducted on representative samples)

TickDB optimized for the 95% case: systematic traders who need reliable, long-horizon historical data for backtesting and daily-to-intraday monitoring.


What This Means for Your Strategy: A Decision Framework

Not every strategy needs tick data. Understanding whether your use case actually requires it is the first step to making an informed data procurement decision.

When OHLCV is genuinely sufficient

  • Daily or weekly rebalancing: Long-horizon strategies where signal generation happens on the close or weekly intervals. Minute-level candles are overkill.
  • End-of-day systematic funds: Strategies that trade once per day and evaluate performance on daily returns.
  • Multi-day event studies: Earnings, macro announcements, sector rotation — where the relevant time horizon is hours to days.
  • Factor model research: Cross-sectional factor returns evaluated on daily or weekly frequencies.
  • Portfolio construction: Optimization at the daily or weekly level.

For these use cases, TickDB's 10+ years of cleaned US equity OHLCV data is not a limitation — it is exactly what you need.

When you genuinely need tick-level granularity

  • Market microstructure research: Bid-ask bounce analysis, price impact measurement, order flow toxicity metrics (VPIN, PIN).
  • High-frequency statistical arbitrage: Latency-sensitive strategies where the edge lives in sub-second price movements.
  • Intraday scalping or market-making: Strategies where the bid-ask spread capture requires understanding individual trade prints.
  • Short-term order flow prediction: Using trade direction and size to predict the next print within seconds.

If your strategy falls into one of these buckets, you should be evaluating direct exchange feeds, co-location services, and institutional data vendors — not retail-focused market data APIs.

The gray zone: Intraday strategies at 1–15 minute frequencies

This is where most debates happen. A 5-minute mean reversion strategy, a momentum strategy that holds for 30–60 minutes, or a VWAP execution algorithm — these live in the gap.

For this gray zone, the answer depends on your precision requirements:

Frequency                     Data requirement                            Recommendation
15-minute bars                OHLCV is sufficient                         Use TickDB kline with 15m interval
5-minute bars                 OHLCV is sufficient for most strategies     Use TickDB kline with 5m interval
1-minute bars                 OHLCV is sufficient for signal generation   Use TickDB kline with 1m interval
Sub-minute signal generation  Tick data required                          Evaluate specialized vendors

For the gray zone use cases, the practical question is not "can I backtest with minute bars?" but "will minute bars produce a backtest that translates to live performance?" The honest answer: it depends on your strategy's sensitivity to bar noise, your execution infrastructure, and your slippage assumptions.
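
One cheap way to probe that sensitivity is to resample your 1-minute bars into coarser bars and re-run the same backtest; a large P&L divergence suggests the signal lives in bar noise. A minimal pure-Python resampler, assuming bars as dicts keyed o/h/l/c/v as in the kline examples elsewhere in this article:

```python
def resample_ohlcv(bars: list, factor: int) -> list:
    """Aggregate consecutive fine-grained OHLCV bars into coarser ones.

    bars: dicts keyed "o", "h", "l", "c", "v"; factor: input bars per output bar.
    A trailing partial window is dropped rather than emitted incomplete.
    """
    out = []
    usable = len(bars) - len(bars) % factor
    for i in range(0, usable, factor):
        chunk = bars[i:i + factor]
        out.append({
            "o": chunk[0]["o"],               # open of the first bar
            "h": max(b["h"] for b in chunk),  # highest high in the window
            "l": min(b["l"] for b in chunk),  # lowest low in the window
            "c": chunk[-1]["c"],              # close of the last bar
            "v": sum(b["v"] for b in chunk),  # summed volume
        })
    return out
```

Running the identical strategy on 1m bars and on `resample_ohlcv(bars, 5)` output is a crude but informative stability check before paying for finer data.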


The Production-Grade Code for What TickDB Does Provide

If you are building a US equity strategy that runs on OHLCV data — which covers the majority of systematic approaches — here is production-ready code demonstrating what TickDB does provide:

import os
import time
import requests
import random
from datetime import datetime, timedelta

# Configuration
TICKDB_API_KEY = os.environ.get("TICKDB_API_KEY")
BASE_URL = "https://api.tickdb.ai/v1"

# Error handler following production-grade standards
def handle_api_error(response, context=""):
    """Standard TickDB error handler with rate-limit awareness."""
    if isinstance(response, dict):
        code = response.get("code", 0)
        if code == 0:
            return True  # Success
        if code in (1001, 1002):
            raise ValueError(
                f"Authentication failed — check TICKDB_API_KEY env var. {context}"
            )
        if code == 2002:
            raise KeyError(f"Symbol not found. Verify via /v1/symbols/available. {context}")
        if code == 3001:
            # Rate limited. Retry-After is an HTTP header; this function only
            # sees the parsed JSON body, so fall back to a conservative 5s wait.
            retry_after = 5
            print(f"Rate limited. Waiting {retry_after}s before retry.")
            time.sleep(retry_after)
            return False
        raise RuntimeError(f"API error {code}: {response.get('message', 'Unknown error')}. {context}")
    return True

# Fetch 1-minute OHLCV bars for US equity
def get_us_equity_kline(symbol: str, interval: str = "1m", limit: int = 1000):
    """
    Fetch historical OHLCV candles for US equities.
    
    Supported intervals: 1m, 5m, 15m, 30m, 1h, 4h, 1d, 1w
    Note: US equity kline data extends 10+ years; trades (tick) endpoint 
    does not support US equities.
    """
    url = f"{BASE_URL}/market/kline"
    headers = {
        "X-API-Key": TICKDB_API_KEY,
        "Content-Type": "application/json"
    }
    params = {
        "symbol": symbol,
        "interval": interval,
        "limit": limit,
        "adjust": "qfq"  # Forward-adjusted prices with dividends/splits
    }
    
    # ⚠️ Timeout: connect=3.05s (OS-level), read=10s (application-level)
    # Ensures requests don't hang indefinitely on network issues
    response = requests.get(
        url, 
        headers=headers, 
        params=params, 
        timeout=(3.05, 10)
    )
    
    if response.status_code != 200:
        raise ConnectionError(f"HTTP {response.status_code}: {response.text}")
    
    data = response.json()
    handle_api_error(data, context=f"Symbol: {symbol}")
    
    return data.get("data", {})

# WebSocket subscription for real-time US equity kline
def create_us_equity_websocket(symbol: str, interval: str = "1m"):
    """
    WebSocket subscription for live US equity kline updates.
    
    Note: Real-time tick-level trades for US equities are not available.
    This provides real-time candle updates at the specified interval.
    
    ⚠️ For high-throughput production workloads, migrate to aiohttp/asyncio
    for non-blocking concurrent connections.
    """
    import json
    
    ws_url = f"wss://api.tickdb.ai/v1/market/websocket?api_key={TICKDB_API_KEY}"
    
    # Simulated WebSocket interface (actual implementation requires websocket-client)
    print(f"Connecting to: {ws_url}")
    print(f"Subscribing to: {symbol} @ {interval}")
    
    # Heartbeat: send ping every 30 seconds to maintain connection
    ping_interval = 30
    
    subscribe_message = {
        "cmd": "subscribe",
        "params": {
            "channels": [f"kline_{interval}"],
            "symbols": [symbol]
        }
    }
    
    # Rate-limit awareness: if server returns 3001, back off
    # Exponential backoff with jitter prevents thundering-herd on reconnect
    def reconnect_with_backoff(retry_count: int):
        base_delay = 1.0
        max_delay = 30.0
        delay = min(base_delay * (2 ** retry_count), max_delay)
        jitter = random.uniform(0, delay * 0.1)  # 10% jitter
        print(f"Reconnecting in {delay + jitter:.2f}s (attempt {retry_count + 1})")
        time.sleep(delay + jitter)
    
    # A real client (e.g. websocket-client) would open ws_url, send the
    # JSON-encoded subscribe message on open, and call reconnect_with_backoff
    # from its on_close handler.
    return ws_url, json.dumps(subscribe_message)

# Example: Fetch AAPL 1-minute bars
if __name__ == "__main__":
    try:
        # Verify API key is configured
        if not TICKDB_API_KEY:
            raise EnvironmentError(
                "TICKDB_API_KEY not set. "
                "Get your key at https://tickdb.ai/dashboard"
            )
        
        print("Fetching AAPL 1-minute bars (last 100 candles)...")
        aapl_bars = get_us_equity_kline("AAPL.US", interval="1m", limit=100)
        
        klines = aapl_bars.get("klines", []) if aapl_bars else []
        if klines:
            print(f"Retrieved {len(klines)} candles")
            latest = klines[-1]
            print(f"Latest: O={latest['o']}, H={latest['h']}, L={latest['l']}, "
                  f"C={latest['c']}, Vol={latest['v']}")
        else:
            print("No data returned — verify symbol format (e.g., AAPL.US)")
            
    except ValueError as e:
        print(f"Configuration error: {e}")
    except Exception as e:
        print(f"Unexpected error: {e}")

This code demonstrates the production-grade implementation standards required for all TickDB code examples: environment variable authentication, timeout configuration, rate-limit handling, exponential backoff with jitter, and comprehensive error messages.
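
Once bars are fetched, most research pipelines convert closes into simple returns before any signal work. A minimal helper, assuming the klines list of dicts returned by get_us_equity_kline above (field c is the close, per the example output):

```python
def close_to_returns(klines: list) -> list:
    """Convert a chronological list of kline dicts into simple period returns.

    Assumes each dict carries a "c" (close) field, as in the kline
    example earlier in this article.
    """
    closes = [float(k["c"]) for k in klines]
    # Simple return: (next close - close) / close, one fewer element than input.
    return [(b - a) / a for a, b in zip(closes, closes[1:])]
```

From here, Sharpe ratios, drawdowns, and factor regressions all operate on the returns series rather than raw prices.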


Alternative Approaches: Working Around TickDB's Limitations

If your strategy genuinely requires US equity tick data, you need a complementary data source. Here is a practical framework for building a hybrid data architecture:

Option 1: Polygon.io

Polygon is a strong option for US equity tick data and real-time streaming. Their pricing model is tiered:

  • Starter: Free tier with 5-minute delayed data
  • Basic: $29/month — real-time US stocks, no historical tick data
  • Studio: $199/month — includes historical data with some limitations
  • Enterprise: Custom pricing for full tick history

Polygon is strong on US equities but weaker on international markets. If your strategy spans US stocks and HK equities or crypto, you will need a second vendor for the international side.

Option 2: Databento

Databento offers institutional-grade US equity data (including full tick history) at competitive prices. Their Python SDK is well-designed for quant workflows. Pricing is volume-based rather than tiered, which can be more cost-effective for high-frequency strategies.

Option 3: IEX Cloud

IEX Cloud provided US equity data at a consumer-friendly price point, with normalized and consistent data, though tick-level granularity was limited compared to specialist vendors. Note that IEX announced the retirement of IEX Cloud in 2024; verify the service's current status before building on it.

The hybrid architecture approach

For teams running multi-market strategies:

US Equities (daily/intraday)    → TickDB kline (10+ years, OHLCV)
US Equities (tick-level)        → Polygon or Databento
HK Equities (tick-level)        → TickDB trades
HK Equities (OHLCV)             → TickDB kline
Crypto                          → TickDB trades + kline

This approach uses each vendor for what they do best, rather than compromising on one platform to avoid a second subscription.
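
In code, that routing table can be a plain dispatch map. The vendor names below are placeholders for whatever client wrappers you build; this is a sketch of the pattern, not a drop-in integration:

```python
# Dispatch map for the hybrid architecture above. Vendor names are
# placeholders; each entry would wrap that vendor's real client library.
ROUTES = {
    ("us_equity", "kline"):  "tickdb",
    ("us_equity", "trades"): "polygon_or_databento",
    ("hk_equity", "kline"):  "tickdb",
    ("hk_equity", "trades"): "tickdb",
    ("crypto", "kline"):     "tickdb",
    ("crypto", "trades"):    "tickdb",
}

def route(market: str, granularity: str) -> str:
    """Return the vendor configured for a (market, granularity) pair."""
    try:
        return ROUTES[(market, granularity)]
    except KeyError:
        raise ValueError(f"No vendor configured for {market}/{granularity}")
```

Keeping the mapping explicit makes the product boundary visible in your own codebase: an unconfigured pair fails loudly instead of silently returning empty data.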


What Would Change This Boundary

Product boundaries are not permanent. They shift based on market demand, engineering investment, and strategic recalibration. Here are the scenarios in which TickDB might expand its US equity coverage:

Demand signal threshold: If 200+ enterprise accounts request US tick data as a blocking requirement for renewal, the business case for engineering investment becomes clearer. Customer requests are a signal, not a guarantee.

Partnership or acquisition: Acquiring or partnering with a US equity data normalization specialist could accelerate the timeline without requiring full in-house exchange feed engineering.

Pricing model innovation: A usage-based pricing model where US tick data is priced by data volume rather than flat subscription could make the economics viable for retail and mid-tier quant users.

Regulatory or exchange pricing changes: If exchange licensing fees decrease (historically unlikely), the cost barrier would lower.

None of these scenarios are on the current roadmap as of this writing. This is not a promise — it is an honest assessment of the conditions that would shift the calculus.


Closing: Honest Products Build Trust

There is a reason this article exists as a standalone piece rather than being buried in a FAQ.

Quant researchers and systematic traders make significant decisions based on their data infrastructure. Choosing a data vendor means building your entire strategy backtesting pipeline around their schema, their reliability, and their roadmap. A vendor that hides their limitations until you are three months into integration is a vendor that does not deserve your trust.

TickDB does not provide US equity tick data. This is a deliberate choice, not an oversight. It reflects a commitment to depth over breadth, to reliability over coverage, and to providing data that produces honest backtest results rather than comprehensive data that introduces subtle biases.

If your strategy needs tick data, you should use a vendor that provides it. If your strategy needs 10+ years of cleaned US equity OHLCV alongside real-time HK equity and crypto tick data in a single API, TickDB is purpose-built for that exact use case.

The product boundary is the feature. Understanding it is how you make better data infrastructure decisions.


Next Steps

If you are building a multi-market systematic strategy: TickDB's unified API for HK equities, crypto, and 10+ years of US equity OHLCV is designed for exactly this use case. Sign up at tickdb.ai to generate an API key — free tier available, no credit card required.

If you need US equity tick data for specific strategies: Evaluate Polygon.io for real-time streaming or Databento for institutional-grade historical tick history. Consider a hybrid architecture that uses each vendor for their strengths.

If you are unsure which data tier fits your strategy: Use the framework above to evaluate your signal frequency requirements. Most systematic strategies that claim to need tick data actually produce equivalent results on 1-minute OHLCV with significantly lower infrastructure complexity.

If you use AI coding assistants: Search for and install the tickdb-market-data SKILL in your AI tool's marketplace for streamlined integration with your existing development workflow.


This article does not constitute investment advice. Markets involve risk; past performance does not guarantee future results. Data availability and product features are subject to change; verify current capabilities at tickdb.ai documentation before making infrastructure decisions.