Skip to content

fix(agents-core): upgrade dolt and remove custom jsonb#3096

Merged
miles-kt-inkeep merged 3 commits intomainfrom
fix/remove-custom-jsonb
Apr 10, 2026
Merged

fix(agents-core): upgrade dolt and remove custom jsonb#3096
miles-kt-inkeep merged 3 commits intomainfrom
fix/remove-custom-jsonb

Conversation

@miles-kt-inkeep
Copy link
Copy Markdown
Contributor

No description provided.

@changeset-bot
Copy link
Copy Markdown

changeset-bot Bot commented Apr 9, 2026

⚠️ No Changeset found

Latest commit: 9a0baf6

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR

@vercel
Copy link
Copy Markdown

vercel Bot commented Apr 9, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
agents-api Ready Ready Preview, Comment Apr 9, 2026 10:29pm
agents-docs Ready Ready Preview, Comment Apr 9, 2026 10:29pm
agents-manage-ui Ready Ready Preview, Comment Apr 9, 2026 10:29pm

Request Review

@pullfrog
Copy link
Copy Markdown
Contributor

pullfrog Bot commented Apr 9, 2026

TL;DR — Upgrades Doltgres from 0.55.x to 0.56.1 and removes the custom jsonb workaround — plus its tests and one-off repair script — that existed to paper over a JSON escape-sequence bug in older Dolt versions. The schema now uses Drizzle's native jsonb directly.

Key changes

  • Bump Doltgres to 0.56.1 — updates the container image in both CI and local Docker Compose, picking up the upstream fix for the JSON backslash-parsing bug.
  • Delete dolt-safe-jsonb.ts and its test suite — removes the custom DoltSafeJsonb column type, encode/decode helpers, and 106 lines of tests that are no longer needed.
  • Switch manage-schema.ts to native jsonb — replaces the local import with Drizzle's built-in jsonb from drizzle-orm/pg-core.
  • Delete fix-doltgres-corrupt-jsonb.mjs — removes the 616-line one-off migration script that repaired rows corrupted by the old parser bug.

Summary | 6 files | 3 commits | base: mainfix/remove-custom-jsonb


Native jsonb after Doltgres 0.56.1

Before: Doltgres had an off-by-one error in its JSON escape-sequence state machine, so a custom DoltSafeJsonb column type encoded backslashes as a Unicode Private-Use-Area placeholder (U+E000) on write and decoded them on read. A separate 616-line script (fix-doltgres-corrupt-jsonb.mjs) existed to repair already-corrupted rows across all Dolt branches.
After: Doltgres 0.56.1 fixes the parser bug upstream, so the workaround, its tests, and the repair script are all removed. manage-schema.ts imports jsonb directly from drizzle-orm/pg-core.

No migration is needed since the underlying column type was always jsonb; only the application-layer encoding changes. Existing rows that still contain U+E000 placeholders would need a separate data migration if any survive, but the repair script was already run against production data.

manage-schema.ts · dolt-safe-jsonb.ts · dolt-safe-jsonb.test.ts · fix-doltgres-corrupt-jsonb.mjs

Pullfrog  | View workflow run | Triggered by Pullfrog | Using Claude Opus𝕏

Copy link
Copy Markdown
Contributor

@pullfrog pullfrog Bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Two issues — one build-breaking, one potential data-corruption risk.

  1. The test file packages/agents-core/src/__tests__/db/dolt-safe-jsonb.test.ts imports from the deleted module and will break the build/test suite. It needs to be deleted alongside dolt-safe-jsonb.ts.

  2. The old DoltSafeJsonb class encoded backslashes as U+E000 on write and decoded them back on read. If any rows in production were written while dolt-safe-jsonb.ts was active, those rows now contain literal U+E000 characters in their JSONB values. Removing the decode-on-read path means those values will be returned with U+E000 instead of backslashes. Before merging, confirm that either (a) a data migration has been run to replace U+E000 → \ in all affected rows, or (b) no production data was written through this codepath.

Pullfrog  | Fix all ➔Fix 👍s ➔View workflow run | Using Claude Opus𝕏

return new DoltSafeJsonbBuilder(name as any) as unknown as ReturnType<
typeof import('drizzle-orm/pg-core').jsonb
>;
}
Copy link
Copy Markdown
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This file is deleted, but packages/agents-core/src/__tests__/db/dolt-safe-jsonb.test.ts still imports encodeBackslashes and decodeBackslashes from it. That test file needs to be deleted too — otherwise the build/test suite will break.

Also, scripts/fix-doltgres-corrupt-jsonb.mjs has comments referencing dolt-safe-jsonb.ts (lines 13, 24, 213) and still encodes values with U+E000. If the workaround is no longer needed, that script's encoding logic should also be updated or the script removed entirely.

foreignKey,
index,
integer,
jsonb,
Copy link
Copy Markdown
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Switching to Drizzle's stock jsonb removes the mapFromDriverValue override that decoded U+E000 → \ on read. Any rows written while dolt-safe-jsonb.ts was active will still contain U+E000 placeholders in the database. Without the decode step, those values will now surface U+E000 to application code.

Has a data migration been run to replace U+E000 back to \ in all affected JSONB columns? If not, this change will silently corrupt reads for those rows.

Copy link
Copy Markdown
Contributor

@claude claude Bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

PR Review Summary

(4) Total Issues | Risk: High

🔴❗ Critical (2) ❗🔴

🔴 1) dolt-safe-jsonb.test.ts Orphaned test file imports deleted module

Issue: The test file packages/agents-core/src/__tests__/db/dolt-safe-jsonb.test.ts imports decodeBackslashes and encodeBackslashes from ../../db/manage/dolt-safe-jsonb, which was deleted in this PR.

Why: This will cause TypeScript compilation errors and test suite failures when pnpm check or pnpm test is run. CI will fail with "Cannot find module" error.

Fix: Delete the orphaned test file since the module it tests no longer exists:

rm packages/agents-core/src/__tests__/db/dolt-safe-jsonb.test.ts

Refs:


🔴 2) Data migration Existing U+E000-encoded data will be read incorrectly

Issue: The deleted dolt-safe-jsonb.ts module encoded backslashes as U+E000 on write and decoded them back to backslashes on read. With this module removed, any existing data containing U+E000 characters will no longer be decoded.

Why: Any JSONB data containing backslashes that was written while the custom encoder was active will now be read incorrectly. The U+E000 placeholder character will appear in strings instead of backslashes. This affects all tables with JSONB columns that stored user content with backslashes (file paths, regex patterns, escape sequences in prompts, tool configs, etc.).

The migration script scripts/fix-doltgres-corrupt-jsonb.mjs also encodes data using U+E000, confirming there is data in production that was intentionally encoded this way.

Before (with custom jsonb):

Write: "hello\world" → stored as "hello\uE000world"  
Read:  "hello\uE000world" → returned as "hello\world"

After (with standard drizzle jsonb):

Read: "hello\uE000world" → returned as "hello\uE000world" ❌

Fix: Before merging this PR:

  1. Run a data migration to convert all U+E000 characters back to backslashes in existing JSONB data across all Dolt branches, OR
  2. Verify with the team that no data exists with U+E000 encoding (unlikely given the fix script exists), OR
  3. Keep the decode-only portion of the custom jsonb to handle legacy data while writing new data without encoding

Refs:

🟠⚠️ Major (2) 🟠⚠️

🟠 1) scripts/fix-doltgres-corrupt-jsonb.mjs Migration script still encodes with U+E000

Issue: The fix script fix-doltgres-corrupt-jsonb.mjs contains an encodeBackslashes function (lines 213-225) that encodes data using U+E000. After this PR merges, if this script is run, it will write U+E000 characters that are never decoded.

Why: Running this script after the PR is merged would cause permanent data corruption from the user's perspective — U+E000 characters would remain in the data forever.

Fix: Update the fix script to either:

  • Remove the encodeBackslashes call since Dolt 0.56.1 no longer needs the workaround
  • Add a warning/error if run after Dolt 0.56.1 upgrade
  • Archive/deprecate the script with a note that it's only for pre-0.56.1 databases

Refs:


🟠 2) Deployment Deployment order dependency for self-hosted users

Issue: The Doltgres upgrade and removal of the encoding workaround should be coordinated with a data migration. Self-hosted deployments need clear upgrade instructions.

Why: The upgrade sequence matters:

  1. Upgrade Dolt to 0.56.1
  2. Run data migration to decode U+E000 characters back to backslashes
  3. Deploy code without the encoder

If steps are done out of order, data could be double-encoded or not decoded properly.

Fix: Document the required deployment order in release notes or changelog. Consider adding a one-time migration that runs automatically on startup to decode U+E000 characters.

💭 Consider (1) 💭

💭 1) .github/workflows/cypress.yml:40 Inconsistent Dolt version pinning across CI workflows

Issue: The Cypress workflow uses pinned version 0.56.1 while ci.yml uses latest.

Why: The inconsistency means the two CI workflows may test against different Dolt versions. Pinned versions are more deterministic, while latest automatically gets fixes but may introduce unexpected changes.

Fix: Consider aligning the version strategy — either pin both to 0.56.1 for reproducibility or use latest for both.

Refs:


🚫 REQUEST CHANGES

Summary: This PR has a build-breaking issue (orphaned test file) that will cause CI to fail, and a potential data corruption concern if there is existing data encoded with U+E000 characters. The orphaned test file must be deleted. The data migration concern needs clarification — please confirm whether a migration was run to decode existing U+E000 data before this change, or verify that no such data exists.

Discarded (0)

No findings were discarded.

Reviewers (3)
Reviewer Returned Main Findings Consider While You're Here Inline Comments Pending Recs Discarded
pr-review-standards 1 1 0 0 0 0 0
pr-review-devops 3 0 1 0 0 0 2
pr-review-breaking-changes 4 3 0 0 0 0 1
Total 8 4 1 0 0 0 3

Note: Orphaned test file finding was flagged by multiple reviewers (deduplicated). Version inconsistency finding was marked pre-existing by devops reviewer.

Copy link
Copy Markdown
Contributor

@claude claude Bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

PR Review Summary

(0) Total Issues | Risk: Low

✅ Prior Issues Resolved

The PR has been updated since the last review and has addressed the previously raised critical and major issues:

Prior Issue Status Resolution
🔴 Orphaned test file (dolt-safe-jsonb.test.ts) ✅ Resolved Deleted in commit e6aed96b7
🟠 Migration script still encodes U+E000 ✅ Resolved Script deleted in commit e6aed96b7

🕐 Pending Recommendations (2)

These items were raised in the prior review and still require clarification:

  • 🔴 Data migration concern — The custom DoltSafeJsonb encoder wrote data with U+E000 placeholders. If any production data contains U+E000 characters, it will now be returned as-is instead of decoded to backslashes. Please confirm: (a) a migration was run to decode U+E000 → \ in affected rows, or (b) no production data was written through this codepath.

  • 🟠 Deployment order documentation — Self-hosted deployments may need guidance on upgrade sequencing (Dolt upgrade → data migration → code deploy). Consider documenting in release notes if applicable.

💭 Consider (1) 💭

💭 1) cypress.yml:40 / ci.yml:274 Dolt version pinning inconsistency

Issue: The Cypress workflow pins to 0.56.1 while ci.yml, docker-compose.dbs.yml, and docker-compose.isolated.yml use latest.

Why: Inconsistent pinning may cause CI workflows to test against different Dolt versions. This is pre-existing and not introduced by this PR.

Fix: Consider aligning version strategy across all compose/workflow files for reproducibility.

Refs:


💡 APPROVE WITH SUGGESTIONS

Summary: The code changes look good! The previously identified critical issues (orphaned test file, migration script) have been properly resolved. The remaining concerns are about data migration confirmation and deployment documentation — please clarify whether production data contains U+E000-encoded values that need migration before this change goes live.

Discarded (0)

No findings were discarded.

Reviewers (1)
Reviewer Returned Main Findings Consider While You're Here Inline Comments Pending Recs Discarded
orchestrator 5 0 1 0 0 2 0
Total 5 0 1 0 0 2 0

Note: Delta review — sub-reviewers were not dispatched as changes since last review were file deletions addressing prior feedback.

@github-actions
Copy link
Copy Markdown
Contributor

github-actions Bot commented Apr 9, 2026

Preview URLs

Use these stable preview aliases for testing this PR:

These point to the same Vercel preview deployment as the bot comment, but they stay stable and easier to find.

Raw Vercel deployment URLs

@itoqa
Copy link
Copy Markdown

itoqa Bot commented Apr 9, 2026

Ito Test Report ❌

10 test cases ran. 5 failed, 1 additional finding, 4 passed.

Overall, the unified run failed with 6 of 10 tests failing (4 passing), indicating significant reliability issues despite some core safety checks working as expected. The most critical findings were multiple High-severity server-side JSONB persistence defects where valid escaped/backslash-heavy payloads cause 500s or can corrupt MCP tools, Data Components, and agent context-config records (including rapid multi-submit scenarios that poison subsequent list/get/delete paths), plus a Medium-severity API-validation gap that accepts dangerous transformation strings rejected by the UI, while unauthorized deep links, mobile JSON form usability, invalid-control-character blocking, and edit-flow refresh/back-forward resilience all passed.

❌ Failed (5)
Category Summary Screenshot
Adversarial ⚠️ Rapid escaped-string create/save requests can 500 and leave malformed JSONB rows that break subsequent list/get/delete operations. ADV-3
Happy-path ⚠️ Escaped MCP create payload leads to malformed persisted config and 500 on tools list read. ROUTE-1
Happy-path ⚠️ Escaped override payload with Windows-style paths triggers 500 instead of stable save+reload. ROUTE-2
Happy-path ⚠️ Creating a Data Component with escaped backslash-heavy JSON schema fails with a server 500 instead of persisting valid schema data. ROUTE-3
Happy-path ⚠️ Creating agent context configs with valid escaped path/token-like patterns can fail with HTTP 500 due to JSON/control-character handling in persistence. ROUTE-4
⚠️ Rapid create with escaped JSON can corrupt persisted tool and data-component rows
  • What failed: The system can return 500 during rapid creates, and subsequent list/get/delete endpoints also return 500 because malformed JSON content has been persisted; expected behavior is a safe rejection or atomic write that never corrupts stored rows.
  • Impact: A single malformed write can poison normal CRUD flows for project configuration entities, blocking management operations until manual data repair. This can disrupt admin workflows and require direct DB intervention.
  • Steps to reproduce:
    1. Open the Manage UI for tenant/project default/activities-planner.
    2. Create an MCP server using escaped-string content in prompt/config fields.
    3. Trigger rapid repeated Create submits (double/triple click) for the same create flow.
    4. Call tool list/get/delete endpoints and observe repeated 500 responses after the failed create.
    5. Repeat the same rapid create pattern for a Data Component payload containing escaped sequences and verify list/get/delete also fail with 500s.
  • Stub / mock context: Rapid multi-submit behavior was intentionally triggered to stress write safety, and local bypass authentication was used for deterministic API verification. No mocked backend responses or route interception were used for this test.
  • Code analysis: I inspected the Manage API create handlers and data-access writes for both tools and data components. The routes forward validated request bodies directly into insert operations with no idempotency protection for rapid submits and no guard to prevent malformed JSONB persistence from cascading into read/delete failures.
  • Why this is likely a bug: Production create paths can persist malformed JSON-adjacent payload states during rapid submissions, and that persisted state then deterministically breaks downstream CRUD reads/deletes with server-side 500s.

Relevant code:

agents-api/src/domains/manage/routes/tools.ts (lines 217-233)

async (c) => {
    const db = c.get('db');
    const { tenantId, projectId } = c.req.valid('param');
    const body = c.req.valid('json');
    const credentialStores = c.get('credentialStores');
    const userId = c.get('userId');

    logger.info({ body }, 'body');

    const id = body.id || generateId();

    const tool = await createTool(db)({
      ...body,
      tenantId,
      projectId,
      id,
    });

packages/agents-core/src/data-access/manage/tools.ts (lines 624-637)

export const createTool = (db: AgentsManageDatabaseClient) => async (params: ToolInsert) => {
  const now = new Date().toISOString();

  const [created] = await db
    .insert(tools)
    .values({
      ...params,
      createdAt: now,
      updatedAt: now,
    })
    .returning();

  return created;
};

packages/agents-core/src/data-access/manage/dataComponents.ts (lines 94-132)

export const createDataComponent =
  (db: AgentsManageDatabaseClient) =>
  async (params: DataComponentInsert): Promise<DataComponentSelect> => {
    if (params.props) {
      const propsValidation = validatePropsAsJsonSchema(params.props);
      if (!propsValidation.isValid) {
        const errorMessages = propsValidation.errors
          .map((e) => `${e.field}: ${e.message}`)
          .join(', ');
        throw new Error(`Invalid props schema: ${errorMessages}`);
      }
    }

    const dataComponent = await db
      .insert(dataComponents)
      .values(params as any)
      .returning();

    return dataComponent[0];
  };
⚠️ Create MCP server with escaped JSON content and verify round-trip
  • What failed: The created row leaves malformed JSON in tools.config, and subsequent GET /tools reads return 500 instead of returning tool rows.
  • Impact: MCP tool management can become unavailable for affected projects after a single escaped-input create flow. Users and automation cannot reliably list or load tools until corrupted rows are repaired.
  • Steps to reproduce:
    1. Open the MCP server create page for the target project.
    2. Enter prompt content with escaped path/backslash/newline patterns and create the server.
    3. Call or open the project tools list endpoint and observe the 500 response.
  • Stub / mock context: Authentication and project-scope gates were locally bypassed so MCP create/list flows could run on localhost fixture projects. The tool create/list API behavior itself was exercised directly without route-level response mocking.
  • Code analysis: The tools table persists raw JSONB config, create writes payloads directly, and list reads all rows directly; with the current JSONB path this allows malformed persisted config to crash list queries.
  • Why this is likely a bug: The create/read production path can persist and later crash on escaped config content, which violates expected round-trip behavior for valid MCP tool input.

Relevant code:

packages/agents-core/src/db/manage/manage-schema.ts (lines 451-461)

export const tools = pgTable(
  'tools',
  {
    ...projectScoped,
    ...uiProperties,
    config: jsonb('config')
      .$type<{
        type: 'mcp';
        mcp: ToolMcpConfig;
      }>()
      .notNull(),

packages/agents-core/src/data-access/manage/tools.ts (lines 624-637)

export const createTool = (db: AgentsManageDatabaseClient) => async (params: ToolInsert) => {
  const now = new Date().toISOString();

  const [created] = await db
    .insert(tools)
    .values({
      ...params,
      createdAt: now,
      updatedAt: now,
    })
    .returning();

  return created;
};

packages/agents-core/src/data-access/manage/tools.ts (lines 604-612)

const [toolsDbResults, totalResult] = await Promise.all([
  db
    .select()
    .from(tools)
    .where(whereClause)
    .limit(limit)
    .offset(offset)
    .orderBy(desc(tools.createdAt)),
  db.select({ count: count() }).from(tools).where(whereClause),
]);
⚠️ Edit existing MCP tool overrides and verify JSON stability
  • What failed: Save path returns 500 for the required escaped path scenario, so override JSON does not remain stable across save and reload.
  • Impact: Teams cannot safely store or edit valid escaped override patterns for MCP tools. Configuration updates fail and can block production tool customization workflows.
  • Steps to reproduce:
    1. Open an existing MCP server edit page.
    2. Submit tool override transformation content containing escaped regex and Windows-style backslash paths.
    3. Reload the tool and observe API 500 instead of stable persisted override values.
  • Stub / mock context: This test ran with local auth and tenant/project protections bypassed to exercise the MCP override edit API in a controlled localhost fixture project. No synthetic success responses were injected for the failing save/read requests.
  • Code analysis: Override payloads are persisted into the same JSONB tools.config field used by MCP config, and list/read code directly re-reads that column; escaped path payloads trigger the same malformed JSON persistence and read failure mode.
  • Why this is likely a bug: Production edit/read logic cannot reliably round-trip escaped override content, so valid override updates break instead of persisting deterministically.

Relevant code:

packages/agents-core/src/db/manage/manage-schema.ts (lines 456-461)

config: jsonb('config')
      .$type<{
        type: 'mcp';
        mcp: ToolMcpConfig;
      }>()
      .notNull(),

packages/agents-core/src/data-access/manage/tools.ts (lines 627-633)

const [created] = await db
    .insert(tools)
    .values({
      ...params,
      createdAt: now,
      updatedAt: now,
    })
    .returning();

packages/agents-core/src/data-access/manage/tools.ts (lines 606-612)

db
      .select()
      .from(tools)
      .where(whereClause)
      .limit(limit)
      .offset(offset)
      .orderBy(desc(tools.createdAt)),
    db.select({ count: count() }).from(tools).where(whereClause),
⚠️ Create data component with complex escaped JSON schema fails to persist
  • What failed: The create request ends in a server 500 with Bad escaped character in JSON instead of creating and returning the component.
  • Impact: Users cannot reliably create components that include common escaped-path or escape-sequence schema values. This blocks configuration workflows that depend on valid backslash-heavy JSON schema fields.
  • Steps to reproduce:
    1. Open the Data Component create page in the Manage UI.
    2. Enter a valid JSON schema that includes escaped backslashes and escaped sequence literals.
    3. Submit the form with Save.
    4. Observe the request fail with a server 500 instead of creating the component.
  • Stub / mock context: No stubs, mocks, or bypasses were applied for this test in the recorded run.
  • Code analysis: I inspected UI validation and API create paths and found they parse JSON and pass it straight into DB JSONB insertion without any escape-safety transform, while the schema now uses native jsonb directly.
  • Why this is likely a bug: Valid escaped JSON schema input should not trigger DB parser failure, and the current create path lacks the prior escape-safe handling before JSONB persistence.

Relevant code:

packages/agents-core/src/validation/extend-schemas.ts (lines 19-22)

export function transformToJson<T extends string>(value: T, ctx: z.RefinementCtx<T>) {
  try {
    return JSON.parse(value);
  } catch {

packages/agents-core/src/data-access/manage/dataComponents.ts (lines 97-129)

if (params.props) {
      const propsValidation = validatePropsAsJsonSchema(params.props);
      if (!propsValidation.isValid) {
        const errorMessages = propsValidation.errors
          .map((e) => `${e.field}: ${e.message}`)
          .join(', ');
        throw new Error(`Invalid props schema: ${errorMessages}`);
      }
    }

    const dataComponent = await db
      .insert(dataComponents)
      .values(params as any)
      .returning();

packages/agents-core/src/db/manage/manage-schema.ts (lines 356-363)

export const dataComponents = pgTable(
  'data_components',
  {
    ...projectScoped,
    ...uiProperties,
    props: jsonb('props').$type<JsonSchemaForLlmSchemaType>().notNull(),
    render: jsonb('render').$type<{
⚠️ Context config create fails on valid escaped JSON payloads
  • What failed: The create request returns HTTP 500 for valid JSON payloads with escaped-pattern content; expected behavior is successful persistence (201) without server JSON parse errors.
  • Impact: Admin users can fail to save valid context configuration payloads and are blocked from configuring agent context behavior. This causes reliability issues in production-like workflows whenever specific escaped sequences are present.
  • Steps to reproduce:
    1. Open an agent details page at /{tenantId}/projects/{projectId}/agents/{agentId} and open the context config editor.
    2. Enter valid JSON with escaped path/token-like patterns (for example C:\temp\token and escaped token patterns) in headers schema and context variables.
    3. Save to create the context configuration.
    4. Observe the create request returning HTTP 500 instead of 201.
  • Stub / mock context: No stubs, mocks, or bypasses were applied for this test in the recorded run.
  • Code analysis: I traced the create flow through the manage API route and data-access layer. The route passes request JSON directly into createContextConfig, and createContextConfig writes headersSchema and contextVariables straight into JSONB columns; with the current schema using native jsonb directly, there is no protective transformation in this path for problematic escape sequences.
  • Why this is likely a bug: Valid JSON inputs that should be accepted trigger server-side JSONB parse failure in the production create path, indicating a real persistence defect rather than a test harness artifact.

Relevant code:

packages/agents-core/src/db/manage/manage-schema.ts (lines 1-16)

import {
  boolean,
  doublePrecision,
  foreignKey,
  index,
  integer,
  jsonb,
  numeric,
  pgTable,
  primaryKey,
  text,
  timestamp,
  unique,
  varchar,
} from 'drizzle-orm/pg-core';

packages/agents-core/src/data-access/manage/contextConfigs.ts (lines 80-92)

const contextConfig = await db
  .insert(contextConfigs)
  .values({
    id,
    tenantId: params.tenantId,
    projectId: params.projectId,
    agentId: params.agentId,
    headersSchema: (params.headersSchema ?? null) as any,
    contextVariables: (contextVariables ?? null) as any,
    createdAt: now,
    updatedAt: now,
  })
  .returning();

agents-api/src/domains/manage/routes/contextConfigs.ts (lines 149-157)

const configData = {
  tenantId,
  projectId,
  agentId,
  ...body,
};
const contextConfig = await createContextConfig(db)(configData);

return c.json({ data: contextConfig }, 201);
✅ Passed (4)
Category Summary Screenshot
Adversarial Unauthenticated MCP and Data Component deep links redirected to login before protected edit content rendered. ADV-2
Adversarial Mobile viewport JSON form remained usable with clear validation and successful save+reload persistence. ADV-4
Edge Invalid JSON with a literal null byte was rejected, Save stayed disabled, and persisted data remained unchanged. EDGE-3
Edge Unsaved transient edits were discarded after back/forward, and only the intended saved description persisted after refresh and deep-link reload. EDGE-4
ℹ️ Additional Findings (1)

These findings are unrelated to the current changes but were observed during testing.

Category Summary Screenshot
Adversarial 🟠 Dangerous transformation strings are accepted and saved instead of being rejected. ADV-1
🟠 Block dangerous transformation patterns in MCP tool overrides
  • What failed: The API accepts and persists dangerous transformation strings with HTTP 200 instead of rejecting them.
  • Impact: Invalid or unsafe transformation expressions can be stored and later cause unstable behavior when transformations are executed. This also creates inconsistent behavior between UI validation and direct API usage.
  • Steps to reproduce:
    1. Open an existing MCP tool override update flow or call the update API directly.
    2. Send transformation values containing patterns like eval(, function(, ${...}, or __proto__.
    3. Observe HTTP 200 and persisted unsafe transformation data instead of validation rejection.
  • Stub / mock context: Unsafe and benign transformation payloads were submitted directly through local API flows to verify persistence behavior, which bypasses front-end form guards by design. The run used localhost auth bypass settings so only local non-production services were exercised.
  • Code analysis: Server-side tool schemas allow any string/object for transformation with no dangerous-pattern guard, while UI form validation has explicit dangerous-pattern blocking; API clients can bypass UI checks and still persist unsafe values.
  • Why this is likely a bug: The product enforces dangerous-pattern blocking in the UI but not in API validation, so unsafe transformations are accepted in production data paths.

Relevant code:

packages/agents-core/src/validation/schemas.ts (lines 1232-1244)

toolOverrides: z
            .record(
              z.string(),
              z.object({
                displayName: z.string().optional(),
                description: z.string().optional(),
                schema: z.any().optional(),
                transformation: z
                  .union([
                    z.string(), // JMESPath expression
                    z.record(z.string(), z.string()), // object mapping
                  ])
                  .optional(),

packages/agents-core/src/validation/schemas.ts (lines 2129-2133)

export const ToolUpdateSchema = ToolInsertSchema.partial();

export const ToolApiSelectSchema = createApiSchema(ToolSelectSchema).openapi('Tool');
export const ToolApiInsertSchema = createApiInsertSchema(ToolInsertSchema).openapi('ToolCreate');
export const ToolApiUpdateSchema = createApiUpdateSchema(ToolUpdateSchema).openapi('ToolUpdate');

agents-manage-ui/src/components/mcp-servers/form/validation.ts (lines 22-30)

const DANGEROUS_PATTERNS = [
  /\$\{.*\}/, // Template injection
  /eval\s*\(/i, // Eval calls
  /function\s*\(/i, // Function definitions
  /constructor/i, // Constructor access
  /prototype/i, // Prototype manipulation
  /__proto__/i, // Proto access
];

Commit: 9a0baf6

View Full Run


Tell us how we did: Give Ito Feedback

@miles-kt-inkeep miles-kt-inkeep added this pull request to the merge queue Apr 10, 2026
Merged via the queue into main with commit b77aad1 Apr 10, 2026
26 checks passed
@miles-kt-inkeep miles-kt-inkeep deleted the fix/remove-custom-jsonb branch April 10, 2026 21:01
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant