Reverse ETL for the code-first data stack.
drt syncs data from your data warehouse to external services — declaratively, via YAML and CLI.
Think `dbt run` → `drt run`. Same developer experience, opposite data direction.
```bash
pip install drt-core   # core (DuckDB included)
drt init && drt run
```

| Problem | drt's answer |
|---|---|
| Census/Hightouch are expensive SaaS | Free, self-hosted OSS |
| GUI-first tools don't fit CI/CD | CLI + YAML, Git-native |
| dbt/dlt ecosystem has no reverse leg | Same philosophy, same DX |
| LLM/MCP era makes GUI SaaS overkill | LLM-native by design |
No cloud accounts needed — runs locally with DuckDB in about 5 minutes.
```bash
pip install drt-core
```

For cloud sources, install the matching extra:

```bash
pip install 'drt-core[bigquery]'   # or drt-core[postgres], etc.
```
```bash
mkdir my-drt-project && cd my-drt-project
drt init   # select "duckdb" as source
```

Create a sample `users` table:

```bash
python -c "
import duckdb
c = duckdb.connect('warehouse.duckdb')
c.execute('''CREATE TABLE IF NOT EXISTS users AS SELECT * FROM (VALUES
  (1, 'Alice', '[email protected]'),
  (2, 'Bob', '[email protected]'),
  (3, 'Carol', '[email protected]')
) t(id, name, email)''')
c.close()
"
```

Define a sync:

```yaml
# syncs/post_users.yml
name: post_users
description: "POST user records to an API"
model: ref('users')
destination:
  type: rest_api
  url: "https://httpbin.org/post"
  method: POST
  headers:
    Content-Type: "application/json"
  body_template: |
    { "id": {{ row.id }}, "name": "{{ row.name }}", "email": "{{ row.email }}" }
sync:
  mode: full
  batch_size: 1
  on_error: fail
```

Run it:

```bash
drt run --dry-run   # preview, no data sent
drt run             # run for real
drt status          # check results
```

See `examples/` for more: Slack, Google Sheets, HubSpot, GitHub Actions, etc.
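The `body_template` is rendered once per row (with `batch_size: 1`, each row becomes one POST). A rough, stdlib-only sketch of that per-row substitution, not drt's actual rendering engine:

```python
import json
import re

def render_template(template: str, row: dict) -> str:
    # Substitute each {{ row.<field> }} placeholder with the row's value.
    return re.sub(
        r"\{\{\s*row\.(\w+)\s*\}\}",
        lambda m: str(row[m.group(1)]),
        template,
    )

template = '{ "id": {{ row.id }}, "name": "{{ row.name }}", "email": "{{ row.email }}" }'
row = {"id": 1, "name": "Alice", "email": "[email protected]"}

payload = render_template(template, row)
print(payload)  # one JSON body per row, ready to POST
```

With `mode: full` the whole table is rendered and sent each run; incremental mode (v0.2+) instead filters rows by a `cursor_field` watermark.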
```bash
drt init                    # initialize project
drt list                    # list sync definitions
drt sources                 # list available source connectors
drt destinations            # list available destination connectors
drt run                     # run all syncs
drt run --select <name>     # run a specific sync
drt run --all               # discover and run all syncs
drt run --select tag:<tag>  # run syncs matching a tag
drt run --threads 4         # parallel sync execution
drt run --dry-run           # dry run
drt run --verbose           # show row-level error details
drt run --output json       # structured JSON output for CI/scripting
drt run --log-format json   # structured JSON logging to stderr
drt run --profile prd       # override profile (or DRT_PROFILE env var)
drt run --cursor-value '…'  # override watermark cursor for backfill
drt test                    # run post-sync validation tests
drt test --select <name>    # test a specific sync
drt validate                # validate sync YAML configs
drt status                  # show recent sync status
drt status --output json    # JSON output for status
drt serve                   # start HTTP webhook endpoint
drt mcp run                 # start MCP server (requires drt-core[mcp])
drt --install-completion    # install shell completion (bash/zsh/fish)
drt --show-completion       # show completion script
```

Shell completion is supported for bash, zsh, and fish:
```bash
# Recommended: auto-install for your current shell (idempotent)
drt --install-completion

# Or manually add to your shell config (run once from the target shell)
drt --show-completion >> ~/.bashrc                            # bash
drt --show-completion >> ~/.zshrc                             # zsh
drt --show-completion > ~/.config/fish/completions/drt.fish   # fish
```

Note: `--show-completion` outputs the script for your current shell, so run it from the shell you want to configure. The manual `>>` append is not idempotent; run it once only.
After installation, restart your shell and tab-complete commands and options.
Connect drt to Claude, Cursor, or any MCP-compatible client so you can run syncs, check status, and validate configs without leaving your AI environment.
```bash
pip install 'drt-core[mcp]'
drt mcp run
```

Claude Desktop (`~/Library/Application Support/Claude/claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "drt": {
      "command": "drt",
      "args": ["mcp", "run"]
    }
  }
}
```

Available MCP tools:
| Tool | What it does |
|---|---|
| `drt_list_syncs` | List all sync definitions |
| `drt_run_sync` | Run a sync (supports `dry_run`) |
| `drt_get_status` | Get last run result(s) |
| `drt_validate` | Validate sync YAML configs |
| `drt_get_schema` | Return JSON Schema for config files |
| `drt_list_connectors` | List available sources and destinations |
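These tools are invoked over MCP's standard JSON-RPC `tools/call` method. As a sketch, a client asking for a dry-run sync would send a request shaped roughly like this (the `arguments` keys used here are assumptions for illustration, not documented drt parameters):

```python
import json

# Hypothetical MCP tools/call request for drt_run_sync; the argument
# names ("sync_name", "dry_run") are assumptions, not drt's documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "drt_run_sync",
        "arguments": {"sync_name": "post_users", "dry_run": True},
    },
}
print(json.dumps(request, indent=2))
```

In practice your MCP client (Claude Desktop, Cursor, etc.) builds these requests for you; the sketch only shows what crosses the wire.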
Install the official Claude Code skills to generate YAML, debug failures, and migrate from other tools — all from the chat interface.
```
/plugin marketplace add drt-hub/drt
/plugin install drt@drt-hub
```

Tip: Enable auto-update so you always get the latest skills when drt is updated: `/plugin` → Marketplaces → drt-hub → Enable auto-update.
Copy the files from `.claude/commands/` into your drt project's `.claude/commands/` directory.
| Skill | Trigger | What it does |
|---|---|---|
| `/drt-create-sync` | "create a sync" | Generates valid sync YAML from your intent |
| `/drt-debug` | "sync failed" | Diagnoses errors and suggests fixes |
| `/drt-init` | "set up drt" | Guides through project initialization |
| `/drt-migrate` | "migrate from Census" | Converts existing configs to drt YAML |
| Connector | Status | Install | Auth |
|---|---|---|---|
| BigQuery | ✅ v0.1 | `pip install drt-core[bigquery]` | Application Default / Service Account Keyfile |
| DuckDB | ✅ v0.1 | (core) | File path |
| PostgreSQL | ✅ v0.1 | `pip install drt-core[postgres]` | Password (env var) |
| Snowflake | ✅ v0.5 | `pip install drt-core[snowflake]` | Password (env var) |
| SQLite | ✅ v0.4.2 | (core) | File path |
| Redshift | ✅ v0.3.4 | `pip install drt-core[redshift]` | Password (env var) |
| ClickHouse | ✅ v0.4.3 | `pip install drt-core[clickhouse]` | Password (env var) |
| MySQL | ✅ v0.5 | `pip install drt-core[mysql]` | Password (env var) |
| Databricks | ✅ v0.6 | `pip install drt-core[databricks]` | Access Token (env var) |
| SQL Server | ✅ v0.6 | `pip install drt-core[sqlserver]` | Password (env var) |
| Connector | Status | Install | Auth |
|---|---|---|---|
| REST API | ✅ v0.1 | (core) | Bearer / API Key / Basic / OAuth2 |
| Slack Incoming Webhook | ✅ v0.1 | (core) | Webhook URL |
| Discord Webhook | ✅ v0.4.2 | (core) | Webhook URL |
| GitHub Actions | ✅ v0.1 | (core) | Token (env var) |
| HubSpot | ✅ v0.1 | (core) | Token (env var) |
| Google Ads | ✅ v0.6 | (core) | OAuth2 Client Credentials |
| Google Sheets | ✅ v0.4 | `pip install drt-core[sheets]` | Service Account Keyfile |
| PostgreSQL (upsert) | ✅ v0.4 | `pip install drt-core[postgres]` | Password (env var) |
| MySQL (upsert) | ✅ v0.4 | `pip install drt-core[mysql]` | Password (env var) |
| ClickHouse | ✅ v0.5 | `pip install drt-core[clickhouse]` | Password (env var) |
| Parquet file | ✅ v0.5 | `pip install drt-core[parquet]` | File path |
| Microsoft Teams Webhook | ✅ v0.5 | (core) | Webhook URL |
| CSV / JSON / JSONL file | ✅ v0.5 | (core) | File path |
| Jira | ✅ v0.5 | (core) | Basic (email + API token) |
| Linear | ✅ v0.5 | (core) | API Key (env var) |
| SendGrid | ✅ v0.5 | (core) | API Key (env var) |
| Notion | ✅ v0.6 | (core) | Bearer Token (env var) |
| Twilio SMS | ✅ v0.6 | (core) | Basic (Account SID + Auth Token) |
| Intercom | ✅ v0.6 | (core) | Bearer Token (env var) |
| Email SMTP | ✅ v0.6 | (core) | Username / Password (env var) |
| Salesforce Bulk API 2.0 | ✅ v0.6 | (core) | OAuth2 (username-password) |
| Staged Upload | ✅ v0.6 | (core) | Configurable per provider |
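Token-based destinations read credentials from environment variables via `${VAR}` substitution (available since v0.5). A hypothetical sketch using only the destination keys shown in the quickstart; the endpoint URL and `API_TOKEN` variable name are illustrative:

```yaml
# syncs/notify.yml (hypothetical example)
name: notify
model: ref('users')
destination:
  type: rest_api
  url: "https://api.example.com/contacts"   # illustrative endpoint
  method: POST
  headers:
    Authorization: "Bearer ${API_TOKEN}"    # substituted from the environment
    Content-Type: "application/json"
  body_template: |
    { "email": "{{ row.email }}" }
sync:
  mode: full
```

Keeping the token in an env var (or `secrets.toml`, v0.5+) keeps credentials out of Git-tracked YAML.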
| Connector | Status | Install |
|---|---|---|
| Dagster | ✅ v0.4 | pip install dagster-drt |
| Prefect | ✅ v0.6 | (core) |
| Airflow | ✅ v0.6 | (core) |
| dbt manifest reader | ✅ v0.4 | (core) |
- Detailed plans & progress → GitHub Milestones
- Looking to contribute? → Good First Issues
| Version | Focus |
|---|---|
| v0.1 ✅ | BigQuery / DuckDB / Postgres sources · REST API / Slack / GitHub Actions / HubSpot destinations · CLI · dry-run |
| v0.2 ✅ | Incremental sync (cursor_field watermark) · retry config per-sync |
| v0.3 ✅ | MCP Server (drt mcp run) · AI Skills for Claude Code · LLM-readable docs · row-level errors · security hardening · Redshift source |
| v0.4 ✅ | Google Sheets / PostgreSQL / MySQL destinations · dagster-drt · dbt manifest reader · type safety overhaul |
| v0.5 ✅ | Snowflake / MySQL sources · ClickHouse / Parquet / Teams / CSV+JSON / Jira / Linear / SendGrid destinations · drt test · --output json · --profile · ${VAR} substitution · dbt manifest · secrets.toml · Docker |
| v0.5.4 ✅ | destination_lookup — resolve FK values by querying destination DB during sync (MySQL / Postgres / ClickHouse) |
| v0.6 ✅ | Databricks / SQL Server sources · Notion / Twilio / Intercom / Email SMTP / Salesforce Bulk / Staged Upload destinations · Airflow / Prefect integrations · drt serve · drt sources / drt destinations · --threads parallel execution · --log-format json · --cursor-value · watermark.default_value · test validators (freshness, unique, accepted_values) · JSON Schema validation · GOVERNANCE.md |
| v0.7 | DWH destinations (Snowflake / BigQuery / ClickHouse / Databricks) · Cloud storage (S3 / GCS / Azure Blob) |
| v0.8 | Lakehouse sources (Delta Lake / Apache Iceberg) |
| v1.x | Rust engine (PyO3) |
Community-maintained Dagster integration. Expose drt syncs as Dagster assets with full observability.
```bash
pip install dagster-drt
```

```python
from dagster import AssetExecutionContext, Definitions
from dagster_drt import drt_assets, DagsterDrtResource

@drt_assets(project_dir="path/to/drt-project")
def my_syncs(context: AssetExecutionContext, drt: DagsterDrtResource):
    yield from drt.run(context=context)

defs = Definitions(
    assets=[my_syncs],
    resources={"drt": DagsterDrtResource(project_dir="path/to/drt-project")},
)
```

See the dagster-drt README for full API docs (Translator, Pipes support, DrtConfig dry-run, MaterializeResult).
drt is designed to work alongside, not against, the modern data stack.
We welcome contributions of all sizes — from typo fixes to new connectors. drt has a transparent contributor ladder so your work builds toward greater trust and responsibility over time.
- Get started: CONTRIBUTING.md — setup, workflow, and your first connector tutorial
- Pick something to work on: Good First Issues
- Understand how decisions are made: GOVERNANCE.md
drt is an independent open-source project and is not affiliated with, endorsed by, or sponsored by dbt Labs, dlt-hub, or any other company.
"dbt" is a registered trademark of dbt Labs, Inc. "dlt" is a project maintained by dlt-hub.
drt is designed to complement these tools as part of the modern data stack, but is a separate project with its own codebase and maintainers.
Apache 2.0 — see LICENSE.