Quick-deploy dev environments for one-off projects in your homelab.
Inspired by Fly.io's Sprites, Spark creates ephemeral development environments in your Kubernetes cluster. Perfect for vibe coding, experiments, and quick prototypes.
Each spark automatically gets:
- Random name: Auto-generated adjective-noun combinations (e.g., `brave-dolphin`, `wise-falcon`)
- Dev environment: Debian container with SSH, Claude Code CLI, and your dotfiles
- Network access: Tailscale connectivity for external access
- Database: Dedicated PostgreSQL database with connection string pre-configured
- Secrets: Environment variables for `ANTHROPIC_API_KEY`, `GITHUB_TOKEN`, and `DATABASE_URL`
- Git integration: Optional automatic cloning of a repository
- Persistent storage: 10GB volume mounted at `/home/user`
- Kubernetes cluster with kubectl configured
- PostgreSQL instance accessible from the cluster
- Tailscale operator installed in the cluster
- Dedicated PostgreSQL user `spark` with CREATEDB privilege (see DATABASE_SETUP.md)
- Environment variables:
  - `ANTHROPIC_API_KEY` - Your Anthropic API key
  - `POSTGRES_PASSWORD` - Password for the `spark` PostgreSQL user
- SSH public key at `~/.ssh/id_ed25519.pub` (or set `SSH_PUBLIC_KEY_PATH`)
- `GITHUB_TOKEN` (optional) - For private repository access
Install with `go install`:

```
go install github.com/t-eckert/homelab/spark@latest
```

Or build from source:

```
git clone https://github.com/t-eckert/homelab.git
cd homelab/spark
go build -o spark
sudo mv spark /usr/local/bin/
```

Create a new spark:

```
spark create
```

This will:
- Generate a random name (e.g., `brave-dolphin`)
- Create a PostgreSQL database
- Deploy a Kubernetes pod with your dev environment
- Wait for the pod to be ready
- Automatically SSH into the container
Create with a git repository:

```
spark create --repo https://github.com/username/project.git
```

The repository will be cloned to `/home/user/project`.
List active sparks:

```
spark list
```

Connect to an existing spark:

```
spark shell brave-dolphin
```

Delete a spark:

```
spark delete brave-dolphin
```

This removes the Kubernetes resources and PostgreSQL database.
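Creating and deleting a spark's database reduce to `CREATE DATABASE` and `DROP DATABASE` statements issued as the `spark` user. A hedged sketch of how those statements might be built (the actual logic lives in `internal/db/postgres.go`; `quoteIdent` is an illustrative helper, and the real code may use `pq.QuoteIdentifier` instead):

```go
package main

import (
	"fmt"
	"strings"
)

// quoteIdent double-quotes a PostgreSQL identifier, escaping embedded
// quotes, so hyphenated spark names like brave-dolphin are valid.
func quoteIdent(name string) string {
	return `"` + strings.ReplaceAll(name, `"`, `""`) + `"`
}

func createDatabaseSQL(spark string) string {
	return "CREATE DATABASE " + quoteIdent(spark)
}

func dropDatabaseSQL(spark string) string {
	return "DROP DATABASE IF EXISTS " + quoteIdent(spark)
}

func main() {
	fmt.Println(createDatabaseSQL("brave-dolphin")) // CREATE DATABASE "brave-dolphin"
	fmt.Println(dropDatabaseSQL("brave-dolphin"))   // DROP DATABASE IF EXISTS "brave-dolphin"
}
```

Quoting matters here because spark names contain hyphens, which are not valid in unquoted PostgreSQL identifiers.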
Spark uses environment variables for configuration:
| Variable | Default | Description |
|---|---|---|
| `ANTHROPIC_API_KEY` | required | Anthropic API key for Claude Code |
| `POSTGRES_PASSWORD` | required | Password for the `spark` PostgreSQL user |
| `POSTGRES_HOST` | `postgres.postgres.svc.cluster.local` | PostgreSQL hostname |
| `POSTGRES_PORT` | `5432` | PostgreSQL port |
| `POSTGRES_USER` | `spark` | PostgreSQL username |
| `POSTGRES_DB` | `homelab` | PostgreSQL database to connect to |
| `SSH_PUBLIC_KEY_PATH` | `~/.ssh/id_ed25519.pub` | Path to SSH public key |
| `GITHUB_TOKEN` | - | GitHub token for private repos (optional) |
Each spark creates the following resources in the `spark` namespace:
- Deployment: Single replica running Debian with init script
- Service: LoadBalancer with Tailscale integration
- PersistentVolumeClaim: 10GB storage for `/home/user`
- ConfigMap: SSH authorized keys and configuration
- Secret: Database credentials, API keys, GitHub token
The Debian container runs an init script that:
- Installs system dependencies (SSH, git, curl, etc.)
- Creates a non-root user (`user`) with sudo access
- Configures SSH with your public key
- Installs Claude Code CLI
- Clones your dotfiles from `github.com/t-eckert/dotfiles`
- Optionally clones a specified git repository
- Starts SSH daemon
A PostgreSQL database is created with the same name as the spark. The connection string is available in the container as `$DATABASE_URL`:

```
host=postgres.postgres.svc.cluster.local port=5432 user=spark password=*** dbname=brave-dolphin sslmode=disable
```
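That string follows lib/pq's keyword/value DSN format, so assembling it is plain string formatting. A sketch, assuming the values come from the configuration above (`buildDSN` is an illustrative name, not the project's actual function):

```go
package main

import "fmt"

// buildDSN assembles a lib/pq keyword/value connection string for a
// spark's dedicated database. Note: values containing spaces or quotes
// would need additional escaping in this format.
func buildDSN(host string, port int, user, password, dbname string) string {
	return fmt.Sprintf(
		"host=%s port=%d user=%s password=%s dbname=%s sslmode=disable",
		host, port, user, password, dbname,
	)
}

func main() {
	dsn := buildDSN("postgres.postgres.svc.cluster.local", 5432, "spark", "secret", "brave-dolphin")
	fmt.Println(dsn)
	// The container would then use this as $DATABASE_URL, e.g. with
	// sql.Open("postgres", dsn) via github.com/lib/pq.
}
```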
Sparks are accessible via Tailscale:
```
ssh user@spark-brave-dolphin
```

The Tailscale operator creates a proxy pod that handles the LoadBalancer service.
```
spark/
├── cmd/                  # CLI commands
│   ├── root.go           # Root command and help
│   ├── create.go         # Create command
│   ├── list.go           # List command
│   ├── shell.go          # Shell command
│   └── delete.go         # Delete command
├── internal/
│   ├── k8s/              # Kubernetes client and resources
│   │   ├── client.go     # K8s API operations
│   │   └── resources.go  # Resource templates
│   ├── db/               # PostgreSQL operations
│   │   └── postgres.go   # Database creation/deletion
│   ├── config/           # Configuration loading
│   │   └── config.go     # Environment variable parsing
│   └── names/            # Name generation
│       └── generator.go  # Random adjective-noun names
├── main.go               # Entry point
└── go.mod                # Dependencies
```
Build the binary:

```
go build -o spark
```

Dependencies:

- `github.com/spf13/cobra` - CLI framework
- `k8s.io/client-go` - Kubernetes API client
- `k8s.io/api` - Kubernetes API types
- `github.com/lib/pq` - PostgreSQL driver
This project is inspired by Fly.io's Sprites, which provides ephemeral dev environments with excellent UX. Spark brings a similar experience to self-hosted Kubernetes environments.
MIT