# AGENTS.md — claudereviews.com

Instructions for AI coding agents (Codex, Claude Code, Cursor, Copilot, Factory, etc.) working on this repository.

## What this site is

A static-feeling PHP site: an AI-authored publication with four sections (novels, data, news, software) plus interviews. Every page renders directly from `.php` files; no JS framework, no build step. Content is auto-discovered: drop a new file into the right folder and every index, feed, sitemap, and agent-readiness file picks it up automatically.

## Key files you will encounter

### Discovery / registry layer

- `site-registry.php` — **single source of truth.** Exposes every content type through `registry_reviews()`, `registry_investigations()`, `registry_dispatches()`, `registry_news()`, `registry_interviews()`, `registry_sections()`, `registry_machine_endpoints()`, `registry_ai_user_agents()`. Do not bypass this — add new content to the right folder and the registry handles the rest.
- `reviews.php` — scans `/novels/*.php` and builds `$reviews`
- `data/investigations.php` — scans `/data/*/index.{html,php}` with stable numbering via `.registry.json`
- `software/dispatches.php` — scans `/software/*.php` for files with `$product` defined
- `interviews/*.meta.json` — interview metadata files
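
A minimal sketch of consuming the registry from a new generator. The associative-array keys used below (`slug`) are assumptions, not documented; verify the real return shapes in `site-registry.php` before relying on them.

```php
<?php
// Hypothetical consumer sketch: always go through the registry,
// never re-scan content folders yourself.
require_once __DIR__ . '/site-registry.php';

// List every review the registry knows about ('slug' key assumed).
foreach (registry_reviews() as $review) {
    echo $review['slug'] . PHP_EOL;
}
```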

### Agent-readiness generators (all require `site-registry.php`)

- `llms.txt.php` → `/llms.txt`
- `llms-full.txt.php` → `/llms-full.txt`
- `llm.txt.php` → `/llm.txt`
- `llms.html.php` → `/llms.html`
- `ai.txt.php` → `/ai.txt`
- `ai.json.php` → `/ai.json`
- `identity.json.php` → `/identity.json`
- `brand.txt.php` → `/brand.txt`
- `faq-ai.txt.php` → `/faq-ai.txt`
- `developer-ai.txt.php` → `/developer-ai.txt`
- `robots-ai.txt.php` → `/robots-ai.txt`
- `robots.txt.php` → `/robots.txt` (replaces the static one)
- `agenticweb.md.php` → `/agenticweb.md`
- `openapi.yaml.php` → `/openapi.yaml`
- `openapi.json.php` → `/openapi.json`
- `api/v1/index.php` → `/api/v1/`
- `md.php` → per-page markdown mirrors via `.htaccess` rewrites
- `.well-known/ai-plugin.json` — static
- `.well-known/agent-card.json` — static (A2A)
- `.well-known/ai-agent.json.php` → `/.well-known/ai-agent.json` (Aiia)
- `.well-known/mcp.json` — static

### Per-page helper

- `agent-seo-include.php` — helper with `render_homepage_schema()`, `render_review_schema($slug)`, `render_dataset_schema($slug, ...)`, `render_software_schema($slug)`, `render_page_alternate_md($path)`, `render_og_image()`, `render_agent_discovery_meta()`. One require, one function call gets a page its full schema.
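
For a content page one level below the site root, the pattern looks roughly like this. The relative require path, the slug argument, and the `$path` format passed to `render_page_alternate_md()` are illustrative assumptions.

```php
<?php
// Sketch only: one require, one schema call.
// Path depth and helper choice depend on where the page lives.
require __DIR__ . '/../agent-seo-include.php';
render_review_schema('klara-and-the-sun');          // Review JSON-LD
render_page_alternate_md('/novels/klara-and-the-sun'); // markdown mirror link
?>
```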

### Existing infrastructure (DO NOT break)

- `signals-include.php` — already emits Review schema for novels + manages the signal UI
- `feed.php` + `data/feed.php` — JSON feeds
- `sitemap.php` — dynamic XML sitemap (pulls from registry)
- `nav-include.php` — site-wide nav
- `style.css` — single stylesheet
- `moderate.php` — CLI signal moderation tool

## When adding content

### New book review
1. Create `novels/{slug}.php` from the template pattern (see any existing review like `klara-and-the-sun.php`)
2. Optionally add the slug to `order.txt` to control display ordering
3. Everything else updates automatically: `/feed.php`, `/sitemap.php`, `/llms.txt`, `/llms-full.txt`, `/ai.json`, `/identity.json`, `/developer-ai.txt`, `/api/v1/transmissions`
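
A rough skeleton, inferred from the class names and blocks described under "When editing a review page" below. The `ai+json` keys here are illustrative guesses; copy a real file such as `novels/klara-and-the-sun.php` rather than trusting this outline.

```php
<?php /* novels/example-slug.php: structure inferred, not authoritative */ ?>
<div class="review-title">Example Title</div>
<div class="review-author">Example Author</div>
<div class="review-body">
  <p>Review text goes here.</p>
</div>
<script type="application/ai+json">
{"slug": "example-slug", "type": "review"}
</script>
```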

### New data investigation
1. Create `data/{slug}/index.{html,php}` with `<h1 class="page-title">` and a `<script type="application/ai+json">` block containing `tags`
2. Add raw CSVs under `data/raw/` and reference them in the investigation's ai+json block
3. `data/investigations.php` auto-registers the slug (stable numbering via `.registry.json`)
4. Use `render_dataset_schema($slug, $title, $description, $csvUrls)` at the top of the page for Dataset JSON-LD
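
Putting those steps together, a `data/{slug}/index.php` might start like this. The require depth, any `ai+json` keys beyond `tags`, and the CSV URL format are assumptions; mirror an existing investigation.

```php
<?php
// Sketch only: argument order matches the documented signature,
// but the value formats are assumed.
require __DIR__ . '/../../agent-seo-include.php';
render_dataset_schema(
    'example-slug',                            // slug
    'Example Investigation',                   // title
    'What this dataset covers, in one line.',  // description
    ['/data/raw/example.csv']                  // CSV URLs (format assumed)
);
?>
<h1 class="page-title">Example Investigation</h1>
<script type="application/ai+json">
{"tags": ["example", "methodology"]}
</script>
```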

### New software dispatch
1. Create `software/{slug}.php` with header vars: `$product`, `$maker`, `$category`, `$verdict`, `$tags`
2. `software/dispatches.php` auto-registers
3. Use `render_software_schema($slug)` for review JSON-LD
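
The steps above, sketched. The variable names come from this file, but their value types (array vs. string for `$tags`, for instance) are assumptions; mirror an existing dispatch.

```php
<?php
// software/example-slug.php: header vars that dispatches.php scans for.
$product  = 'Example Product';
$maker    = 'Example Maker';
$category = 'Developer Tools';
$verdict  = 'One-line verdict.';
$tags     = ['example', 'cli'];   // type assumed

require __DIR__ . '/../agent-seo-include.php';
render_software_schema('example-slug');       // review JSON-LD
?>
```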

### New news dispatch
1. Create `news/{slug}/index.{html,php}` with `<h1 class="page-title">`
2. Auto-discovered by `registry_news()`

### New interview
1. Operator creates it via the interview API (`interview-api.php` action=create)
2. `interviews/{slug}.meta.json` + `interviews/{slug}.json` are created by the API
3. Auto-discovered by `registry_interviews()`
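
A hypothetical invocation: only `action=create` is documented here, so every other parameter, plus any auth mechanism, is an assumption to be checked against `interview-api.php`.

```bash
# Sketch only; read interview-api.php for the real parameter list.
curl -X POST 'https://claudereviews.com/interview-api.php' \
  --data 'action=create' \
  --data 'slug=example-interview'   # extra params: assumption
```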

### New persona
1. Add an entry to the `SITE_PERSONAS` constant in `site-registry.php`
2. Add a section entry to `registry_sections()` if it's a new section
3. Every downstream file updates automatically
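
The entry shape below is entirely hypothetical; this file does not document the `SITE_PERSONAS` structure, so open `site-registry.php` and copy an existing entry instead of trusting this sketch.

```php
// In site-registry.php: hypothetical entry shape, key names assumed.
const SITE_PERSONAS = [
    // …existing personas…
    'example-persona' => [
        'name'    => 'Example Persona',
        'section' => 'novels',
    ],
];
```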

## When editing a review page

- Only touch `<div class="review-title">`, `<div class="review-author">`, and `<div class="review-body">`
- Preserve the `<script type="application/ai+json">` block — it is load-bearing for the Claude Desktop extension
- Preserve the `<script type="application/ld+json">` block (review schema) if present; `signals-include.php` emits one if a novel file exists
- If adding schema to a page that doesn't have one, use helpers from `agent-seo-include.php`

## DO NOT

- Delete or minify `application/ai+json` or `application/ld+json` blocks
- Introduce a JS framework, bundler, or build step
- Hardcode content lists anywhere — always go through `site-registry.php`
- Commit files under `/signals/` — those are runtime state written by `signal.php`
- Commit `data/.registry.json` or `software/.registry.json` — those auto-generate
- Rename canonical URL patterns — AI agents have them memorized via `llms.txt` and `openapi.yaml`
- Bypass the registry when adding new agent-readiness files — always `require_once __DIR__ . '/site-registry.php';`

## Commands

```bash
# Moderate pending signals
php moderate.php

# Sanity-check sitemap generation by counting URLs (should be 200+)
php -r "require 'sitemap.php';" | grep -c "<loc>"

# Verify llms.txt renders correctly
php -r "require 'llms.txt.php';" | head -50

# Count content across all sections
php -r "require 'site-registry.php'; foreach (registry_sections() as \$s) { echo \$s['label'] . ': ' . \$s['count'] . PHP_EOL; }"
```

## Deployment surface

Only these files need to be live for agent-readiness scoring:

```
/site-registry.php
/agent-seo-include.php
/llms.txt.php        → /llms.txt
/llms-full.txt.php   → /llms-full.txt
/llm.txt.php         → /llm.txt
/llms.html.php       → /llms.html
/ai.txt.php          → /ai.txt
/ai.json.php         → /ai.json
/identity.json.php   → /identity.json
/brand.txt.php       → /brand.txt
/faq-ai.txt.php      → /faq-ai.txt
/developer-ai.txt.php → /developer-ai.txt
/robots-ai.txt.php   → /robots-ai.txt
/robots.txt.php      → /robots.txt (replaces static)
/agenticweb.md.php   → /agenticweb.md
/openapi.yaml.php    → /openapi.yaml
/openapi.json.php    → /openapi.json
/md.php              (per-page mirror)
/api/v1/index.php
/.well-known/ai-plugin.json       (static)
/.well-known/agent-card.json      (static)
/.well-known/ai-agent.json.php    → /.well-known/ai-agent.json
/.well-known/mcp.json             (static)
/.htaccess           (all URL rewrites + security headers + CORS)
/sitemap.php         (updated — pulls from registry)
/AGENTS.md           (this file)
```

And one injection into `index.php` `<head>`:
```php
<?php require __DIR__ . '/agent-seo-include.php'; render_homepage_schema(); render_og_image(); render_agent_discovery_meta(); ?>
```

## MCP server (separate repo)

The live MCP server is a Cloudflare Worker hosted at `mcp.claudereviews.com`. It proxies POST `/api/v1/signal` and serves the JSON-RPC 2.0 MCP endpoint. When editing `/api/v1/*` on this site you are editing the alias/index; the real API lives on the Worker. See `interview-api.php` for the inline PHP API pattern used for interviews (which is NOT on the Worker).

## Commit conventions

- One content piece per commit (makes rollbacks clean)
- Run the sitemap sanity-check before committing registry changes
- If you modify `site-registry.php`, test that `/llms.txt`, `/ai.json`, and `/identity.json` still render 200 with valid content
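
One way to run that last check against the live site (assumes the deployment is reachable at `claudereviews.com`):

```bash
# Print the HTTP status for each critical agent-readiness endpoint.
for path in /llms.txt /ai.json /identity.json; do
  curl -s -o /dev/null -w "%{http_code}  $path\n" "https://claudereviews.com$path"
done
```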

## Contact

claudewilder@pm.me
