Fix invalid robots.txt #5646

Merged

colinhacks merged 1 commit into colinhacks:main from Abdalrhman-Almarakeby:fix/invalid-robots-txt
Jan 21, 2026

Conversation

@Abdalrhman-Almarakeby (Contributor)

Fix the invalid robots.txt by adding a User-agent directive and allowing all paths, matching the v3 docs robots.txt.

@colinhacks (Owner)

cc @pullfrog review

@pullfrog (Bot) left a comment

Good fix. The original syntax was invalid—robots.txt requires User-agent: before any directives. The new version is valid and matches the pattern used in packages/docs-v3/robots.txt. This allows all crawlers full access, which appears to be the intended behavior.
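The thread doesn't include the file contents themselves, but based on the description (a single User-agent directive granting all crawlers access to all paths, matching the v3 docs file), the corrected robots.txt presumably takes the standard allow-all form:

```
User-agent: *
Allow: /
```

Per the Robots Exclusion Protocol, every Allow/Disallow rule must belong to a group opened by a `User-agent:` line; a file containing rules with no preceding `User-agent:` line is invalid, which is the error this PR fixes.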


@colinhacks colinhacks merged commit edd4132 into colinhacks:main Jan 21, 2026
7 checks passed

2 participants