rc-mcp-generator is a Gemini CLI extension and deterministic TypeScript generator that creates standalone minimal Rocket.Chat MCP servers for only the workflows and API operations a project actually needs.
In one line:
Instead of shipping one huge Rocket.Chat MCP server with every possible tool, this project generates a small deployable MCP server with only the selected tools, tests, and configuration.
Rocket.Chat exposes a large API surface. A full MCP server for all of it adds unnecessary tool definitions, schema payload, and context overhead to every agent loop. This project solves that by generating a new MCP server per project, containing only the selected subset of capabilities.
That gives you:
- less context bloat
- fewer irrelevant tools
- lower token waste
- simpler tool selection for agents
- a standalone generated server you can run, inspect, and deploy
This MVP proves both required paths:
- Direct generator path
- generate a minimal Rocket.Chat MCP server from the CLI
- run the generated server
- connect with MCP Inspector
- execute a real Rocket.Chat action
- Gemini CLI extension path
- install this repo as a Gemini CLI extension
- let Gemini call discovery/generation tools
- generate a standalone MCP server through Gemini
- run that generated server and execute a real Rocket.Chat action
There are two layers in this repo:

- `packages/generator` - the main product
  - Gemini CLI extension MCP server over stdio
  - endpoint discovery/search/suggestion
  - deterministic server generation
  - validation and minimality analysis
- `packages/mcp-server` - reusable generated-server foundation
  - typed Rocket.Chat client
  - Streamable HTTP MCP server
  - high-level Rocket.Chat workflow tools reused as generation templates
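To make the typed-client idea concrete, here is a minimal sketch of how a request to Rocket.Chat's `chat.postMessage` REST endpoint could be built. The `RcConfig` shape and `buildPostMessageRequest` helper are illustrative assumptions, not the actual `rc-client.ts` code; the `X-Auth-Token` and `X-User-Id` headers are Rocket.Chat's standard REST authentication headers.

```typescript
// Illustrative sketch only - not the actual packages/mcp-server client.
// Rocket.Chat's REST API authenticates with X-Auth-Token / X-User-Id headers.

interface RcConfig {
  serverUrl: string; // e.g. http://localhost:3000
  authToken: string; // RC_AUTH_TOKEN
  userId: string;    // RC_USER_ID
}

// Pure helper: build the request for POST /api/v1/chat.postMessage.
function buildPostMessageRequest(cfg: RcConfig, channel: string, text: string) {
  return {
    url: `${cfg.serverUrl}/api/v1/chat.postMessage`,
    method: "POST" as const,
    headers: {
      "X-Auth-Token": cfg.authToken,
      "X-User-Id": cfg.userId,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ channel, text }),
  };
}

const req = buildPostMessageRequest(
  { serverUrl: "http://localhost:3000", authToken: "TOKEN", userId: "UID" },
  "#general",
  "hello",
);
console.log(req.url); // http://localhost:3000/api/v1/chat.postMessage
```

Keeping request construction pure like this is also what makes the generated servers easy to unit-test without a live Rocket.Chat instance.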
```mermaid
flowchart LR
  A[Developer or Gemini CLI] --> B[rc-mcp-generator]
  B --> C[OpenAPI extraction]
  B --> D[Workflow registry]
  B --> E[Suggestion / discovery tools]
  B --> F[Deterministic scaffolder]
  F --> G[Generated standalone MCP server]
  G --> H[MCP Inspector / Claude / Cursor / Gemini]
  G --> I[Rocket.Chat server]
```
```mermaid
sequenceDiagram
  participant U as User
  participant G as Gemini CLI
  participant X as rc-mcp-generator extension
  participant O as Rocket.Chat OpenAPI
  participant S as Generated MCP Server
  participant R as Rocket.Chat
  U->>G: "Generate a minimal Rocket.Chat MCP server"
  G->>X: rc_search_endpoints / rc_discover_endpoints / rc_list_workflows
  X->>O: load current Rocket.Chat OpenAPI specs
  O-->>X: endpoint metadata
  G->>X: rc_generate_server
  X-->>G: generated server directory
  U->>S: start generated server
  U->>S: connect via MCP Inspector
  S->>R: execute selected Rocket.Chat action
  R-->>S: API response
```
The extension exposes these MCP tools:
- `rc_list_workflows`
- `rc_search_endpoints`
- `rc_discover_endpoints`
- `rc_suggest_endpoints`
- `rc_generate_server`
- `rc_validate_server`
- `rc_analyze_minimality`
The workflow registry currently includes 10 platform-level operations:
- `send_channel_message`
- `create_project_room`
- `onboard_user`
- `search_messages`
- `archive_project_channel`
- `get_user_mentions`
- `post_standup`
- `create_support_ticket`
- `broadcast_announcement`
- `export_channel_summary`
These are not raw API wrappers. They are higher-level operations built for typical Rocket.Chat project workflows.
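For illustration, a registry entry might look like the sketch below. The field names (`id`, `description`, `operationIds`) are assumptions about the shape, not the actual registry schema; the operation IDs shown are the ones used elsewhere in this README.

```typescript
// Hypothetical workflow registry entry - illustrative shape only.
interface WorkflowEntry {
  id: string;             // workflow identifier accepted by --workflows
  description: string;
  operationIds: string[]; // underlying Rocket.Chat OpenAPI operations
}

const registry: WorkflowEntry[] = [
  {
    id: "send_channel_message",
    description: "Post a message to a named channel",
    operationIds: ["post-api-v1-chat.sendMessage"],
  },
  {
    id: "search_messages",
    description: "Search messages in a channel",
    operationIds: ["get-api-v1-chat.search"],
  },
];

function findWorkflow(id: string): WorkflowEntry | undefined {
  return registry.find((w) => w.id === id);
}
```

Mapping each workflow to a small set of operation IDs is what lets the generator emit only the endpoints a selected workflow actually touches.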
rc_generate_server creates a standalone project with:
- `src/server.ts`
- `src/rc-client.ts`
- `src/tools/*.ts`
- `tests/*.test.ts`
- `.env`
- `.env.example`
- `README.md`
- `GEMINI.md`
- `gemini-extension.json`
- `Dockerfile`
The generated server runs over Streamable HTTP and reads:
- `RC_SERVER_URL`
- `RC_AUTH_TOKEN`
- `RC_USER_ID`
- `PORT`
- `ENABLED_TOOLS`
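As a sketch of how a generated server might consume these variables (assumed behavior: `ENABLED_TOOLS` is a comma-separated allowlist where an empty value enables everything, and `PORT` defaults to 3000; this is illustrative, not the generated code):

```typescript
// Illustrative config loader - not the actual generated src/server.ts.
interface ServerConfig {
  rcServerUrl: string;
  rcAuthToken: string;
  rcUserId: string;
  port: number;
  enabledTools: string[]; // empty array = all tools enabled
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const required = (name: string): string => {
    const v = env[name];
    if (!v) throw new Error(`Missing required env var: ${name}`);
    return v;
  };
  return {
    rcServerUrl: required("RC_SERVER_URL"),
    rcAuthToken: required("RC_AUTH_TOKEN"),
    rcUserId: required("RC_USER_ID"),
    port: Number(env.PORT ?? 3000),
    enabledTools: (env.ENABLED_TOOLS ?? "")
      .split(",")
      .map((t) => t.trim())
      .filter(Boolean),
  };
}
```

Failing fast on missing credentials keeps misconfiguration visible at startup instead of surfacing later as opaque 401 responses from Rocket.Chat.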
From the repo root:
```bash
cd /home/samar/Projects/rc-mcp-generator
pnpm install
pnpm typecheck
pnpm build
pnpm test
```

This is the simplest mentor demo.
From the repo root:
```bash
node packages/generator/dist/cli.js generate \
  --output ./generated/my-server \
  --workflows send_channel_message,post_standup \
  --operation-ids post-api-v1-chat.sendMessage,get-api-v1-chat.search \
  --server-name my-rc-mcp-server \
  --rc-url http://localhost:3000 \
  --rc-auth-token YOUR_REAL_TOKEN \
  --rc-user-id YOUR_REAL_USER_ID

node packages/generator/dist/cli.js validate ./generated/my-server

cd /home/samar/Projects/rc-mcp-generator/generated/my-server
npm install
npm run build
npm test
```

Edit `generated/my-server/.env`:

```bash
RC_SERVER_URL=http://localhost:3000
RC_AUTH_TOKEN=YOUR_REAL_TOKEN
RC_USER_ID=YOUR_REAL_USER_ID
ENABLED_TOOLS=
PORT=4000
```

Use port 4000 because Rocket.Chat itself usually runs on 3000.

```bash
npm start
```

In another terminal:

```bash
curl http://127.0.0.1:4000/health
```

Expected:

```json
{"ok":true,"service":"rc-mcp-server"}
```

Open MCP Inspector and configure:
- Transport: Streamable HTTP
- URL: `http://localhost:4000/mcp`
Then run `send_channel_message` with:

```json
{
  "channelName": "general",
  "text": "hello from generated my-server"
}
```

Expected result:

- the tool returns success in Inspector
- the message appears in Rocket.Chat `#general`
This proves the "extension/customization of gemini-cli" part.
From the repo root:
```bash
pnpm build
gemini extensions link .
```

This uses the root manifest at `gemini-extension.json`.

```bash
gemini
```

Use prompts like:
List the Rocket.Chat workflows available in the rc-mcp-generator extension.
Search Rocket.Chat endpoints related to sending messages and searching messages.
Discover the Rocket.Chat messaging domain and expand the Chat tag.
Expected:
- Gemini calls the extension tools
- Gemini returns workflows, endpoint search results, and expanded endpoint discovery data
Example prompt:
Generate a minimal Rocket.Chat MCP server in ./generated/gemini-server with workflows send_channel_message and post_standup and operationIds post-api-v1-chat.sendMessage,get-api-v1-chat.search. Use Rocket.Chat URL http://localhost:3000, auth token YOUR_REAL_TOKEN, and user ID YOUR_REAL_USER_ID.
Expected:
- Gemini calls `rc_generate_server`
- a new folder appears at `generated/gemini-server` or `generated/gemini-server-*`
From a normal terminal:
```bash
cd /home/samar/Projects/rc-mcp-generator/generated/gemini-server
npm install
npm run build
npm test
```

Edit its `.env` to avoid a port conflict:

```bash
RC_SERVER_URL=http://localhost:3000
RC_AUTH_TOKEN=YOUR_REAL_TOKEN
RC_USER_ID=YOUR_REAL_USER_ID
ENABLED_TOOLS=
PORT=4001
```

Then start it:

```bash
npm start
```

Verify:

```bash
curl http://127.0.0.1:4001/health
```

Connect MCP Inspector to:

- Transport: Streamable HTTP
- URL: `http://localhost:4001/mcp`
Then run send_channel_message again.
If the message appears in Rocket.Chat, the Gemini-driven path is proven.
You do not need a custom API key for this project itself.
You do need:
- Gemini CLI installed and authenticated, if testing the Gemini extension path
- Rocket.Chat credentials for the generated server:
  - `RC_SERVER_URL`
  - `RC_AUTH_TOKEN`
  - `RC_USER_ID`
Workspace checks:
```bash
pnpm typecheck
pnpm build
pnpm test
```
Generated output checks were also verified:
- generated server installs successfully
- generated server builds successfully
- generated server tests pass
- generated server runs over Streamable HTTP
- tool calls succeed against a live local Rocket.Chat instance
- The first live generation run fetches Rocket.Chat OpenAPI specs from GitHub.
- Generated `.env` files may default to `PORT=3000`; for local testing, change them to a free port like `4000` or `4001`.
- The `generated/` directory in this repo may contain previous smoke/demo outputs from local verification runs.
For a proposal or mentor review, the GitHub repository is the main deliverable.
Recommended package to share:
- GitHub repo link
- short demo video or GIF
- 2 screenshots:
- MCP Inspector tool call success
- Rocket.Chat message appearing in the workspace
Hosting is optional for the MVP.
You do not need to host the generator itself to prove the project idea. A clean repo plus a short reproducible demo is enough. If you want extra polish, you can host one generated MCP server later, but that is not necessary to demonstrate the MVP.