I build tools at the intersection of code and generative AI — agentic applications, ComfyUI-powered media pipelines, and developer frameworks. If a workflow can be automated with AI, I'll wrap it in an agent or a skill.
I focus on systems where LLMs don't just generate text: they take actions, call tools, and complete multi-step tasks autonomously.
- Skill-based agents — composable tools/skills that give LLMs real capabilities
- Agentic apps — full-stack applications where an agent is the core runtime (see aurora)
- Spec-driven development — structured specs to guide agents through complex tasks (nvst)
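The skill pattern above can be sketched in plain TypeScript. The names and dispatch shape here are illustrative only, not the API of any framework listed below:

```typescript
// A minimal sketch of the "skill" pattern: composable, named capabilities
// that an agent runtime can look up and invoke. In a real agent, an LLM
// would pick the skill and arguments; here we dispatch directly.

type Skill = {
  name: string;
  description: string; // shown to the LLM so it can choose the right skill
  execute: (input: Record<string, string>) => string;
};

// Registry of available skills, keyed by name.
const skills: Record<string, Skill> = {
  generate_icon: {
    name: "generate_icon",
    description: "Generate an SVG icon from a text prompt",
    execute: (input) => `<svg><!-- icon for: ${input.prompt} --></svg>`,
  },
};

// Dispatch a skill call by name, as an agent loop would after the LLM
// selects a tool.
function runSkill(name: string, input: Record<string, string>): string {
  const skill = skills[name];
  if (!skill) throw new Error(`Unknown skill: ${name}`);
  return skill.execute(input);
}

console.log(runSkill("generate_icon", { prompt: "rocket" }));
// → <svg><!-- icon for: rocket --></svg>
```

Keeping each skill self-describing is what makes them composable: the same registry can back an agent loop, a CLI, or an MCP server.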
Mastra · Vercel AI SDK · Assistant UI · GitHub Copilot SDK · MCP · ElysiaJS / Bun
| Project | Description |
|---|---|
| reelpod-studio | Creator tool for AI-generated lo-fi music & visuals — audio-reactive R3F visualizers, ACEStep + diffusers backend |
| zikon | CLI for generating icons & SVG assets from text prompts via diffusion models — installable as an agent skill |
| aurora | Agentic RPG — a Mastra agent narrates, builds the world & generates media in real time |
| comfy-diffusion | Python library exposing ComfyUI's inference engine as importable modules — no server needed |
| nerds-vibecoding-survivor-toolkit | Spec-driven agentic development framework & CLI built on Bun |
| ComfyUI_ComfyAssistant | Natural language agent to control & explore ComfyUI workflows |
| ComfyUI_DSS_Wrapper | DiffSynth Studio wrapper — zero-shot image-to-LoRA nodes |
| comfyui_api_executor_nodes | Custom nodes for ComfyUI workflow execution via API |
| mushin-ai | Minimal AI-powered notepad / second brain |
| niji-prompt-generator | Browser extension for generating Midjourney prompts via OpenAI |
| comfyui-base-image | Docker container for running ComfyUI on RunPod |
| datasets-toolbox | Utility toolkit for managing AI training datasets |
Python · TypeScript · Bun · ElysiaJS · React · ComfyUI · Docker · OpenAI API · Anthropic API · Mastra · Vercel AI SDK · Assistant UI · GitHub Copilot SDK · MCP · LoRA / DiffSynth · RunPod