5 MB · max upload size
20/mo · free tier uploads
30 days · free tier expiry
1 hr · presigned URL TTL
01
Upload
API Key + HTML In
POST your HTML to /api/v1/upload with a Bearer token — auth, rate limiting, and file validation all run before the file is processed.
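The request above can be sketched in a few lines. This is a minimal client-side sketch: the /api/v1/upload path, Bearer auth, and the multipart "file" field come from this page, while the host name and filename are placeholder assumptions.

```typescript
// Builds the upload request (Node 18+: FormData, Blob, and fetch are built in).
const UPLOAD_URL = "https://explainers.fyi/api/v1/upload"; // host assumed

function buildUploadRequest(html: string, apiKey: string) {
  const form = new FormData();
  // "file" is the required multipart field; omitting it returns 400
  form.append("file", new Blob([html], { type: "text/html" }), "page.html");
  return {
    url: UPLOAD_URL,
    init: {
      method: "POST",
      headers: { Authorization: `Bearer ${apiKey}` }, // API key auth
      body: form,
    },
  };
}

// Usage:
//   const { url, init } = buildUploadRequest("<h1>Hi</h1>", process.env.API_KEY!);
//   const res = await fetch(url, init); // 201 → { url, slug, expires_at, sanitized }
```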
🔑
API Key Auth
The first 12 chars are looked up in the DB by prefix, then the full key is bcrypt-compared (12 rounds) against the stored hash. Session JWT cookies are also accepted.
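The prefix-lookup-then-compare flow can be sketched as below. `findKeyByPrefix` and `bcryptCompare` stand in for the real DB query and the bcrypt library call; both names are assumptions for illustration.

```typescript
// Prefix narrows the DB lookup; the full key is only ever hash-compared.
type StoredKey = { prefix: string; hash: string; userId: string };

async function authenticate(
  apiKey: string,
  findKeyByPrefix: (prefix: string) => Promise<StoredKey | null>, // stand-in for the DB query
  bcryptCompare: (plain: string, hash: string) => Promise<boolean>, // stand-in for bcrypt (12 rounds)
): Promise<string | null> {
  const prefix = apiKey.slice(0, 12);           // indexed lookup column
  const stored = await findKeyByPrefix(prefix); // narrow candidates by prefix
  if (!stored) return null;
  // the plaintext key is never stored; only the bcrypt hash is compared
  return (await bcryptCompare(apiKey, stored.hash)) ? stored.userId : null;
}
```

Looking up by a short indexed prefix avoids a full-table scan, since bcrypt hashes cannot be queried by equality on the raw key.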
⏱️
Rate Limit Gate
Counts non-deleted explainers created by this user in the last 30 days. Returns 429 with an upgrade_url if the count reaches 20.
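The rolling-window check can be sketched as a pure function. The 20-upload cap, 30-day window, and 429 + upgrade_url response come from this page; representing the DB count as a timestamp list and the "/upgrade" path are assumptions.

```typescript
// Rolling 30-day rate limit for the free tier.
const FREE_TIER_LIMIT = 20;
const WINDOW_MS = 30 * 24 * 60 * 60 * 1000; // 30 days

function checkRateLimit(uploadTimestamps: number[], now: number) {
  // stand-in for: SELECT count(*) WHERE deleted = false AND created > now - 30d
  const count = uploadTimestamps.filter((t) => now - t < WINDOW_MS).length;
  if (count >= FREE_TIER_LIMIT) {
    // upgrade_url path is a placeholder
    return { status: 429, body: { error: "rate_limited", count, upgrade_url: "/upgrade" } };
  }
  return { status: 200, body: { count } };
}
```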
📦
Parse Multipart Body
5 MB body limit is enforced before auth middleware runs. Multipart form is parsed and the file field is extracted as raw HTML text.
passes to scanner
Content-Type text/html, application/octet-stream, or empty — all accepted
Missing file field returns 400 immediately
Body > 5 MB returns 413 before auth is even attempted
20+ uploads this month returns 429 with current count + upgrade URL
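The ordering in the edge cases above (size check before anything else, then the field check) can be sketched as a pure validator; the types are simplified assumptions.

```typescript
// Pre-processing validation order: 413 for oversize bodies, 400 for a
// missing "file" field. The 5 MB limit runs before auth middleware.
const MAX_BODY_BYTES = 5 * 1024 * 1024; // 5 MB

function validateUpload(bodyBytes: number, fields: Record<string, string | undefined>) {
  if (bodyBytes > MAX_BODY_BYTES) return { status: 413, error: "body too large" };
  if (fields["file"] === undefined) return { status: 400, error: "missing file field" };
  return { status: 200 };
}
```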
02
Scan
Five-Pass
Safety Check
DOMPurify strips XSS vectors, Shannon entropy gates catch leaked credentials, and script-context detection flags data-exfiltration calls — all before the file is stored.
1
DOMPurify sanitize Strips script, link, meta, iframe, form + all event handler attrs (onerror, onclick, etc.)
→ CLEAN
2
Link hardening All <a> and <area> get rel="nofollow noopener noreferrer" + target="_blank"
→ SAFE
3
Credential scan Shannon entropy ≥ 3.5 — AWS keys, GitHub PATs, Stripe secrets → [REDACTED]
→ REDACT
4
Prompt injection scan "Ignore previous instructions" flagged ONLY inside HTML comments — visible text is allowed
→ STRIP
5
Exfil detection fetch() and sendBeacon in <script> blocks flagged unless calling explainers.fyi or fonts.googleapis.com
→ FLAG
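Pass 3's entropy gate can be sketched as follows. The ≥ 3.5 bits/char threshold is from this page; splitting on whitespace and the 20-character minimum token length are assumptions for illustration.

```typescript
// Shannon entropy in bits per character.
function shannonEntropy(s: string): number {
  const freq = new Map<string, number>();
  for (const ch of s) freq.set(ch, (freq.get(ch) ?? 0) + 1);
  let h = 0;
  for (const n of Array.from(freq.values())) {
    const p = n / s.length;
    h -= p * Math.log2(p);
  }
  return h;
}

// Replace long, high-entropy tokens (API keys, tokens, secrets) with [REDACTED].
function redactHighEntropyTokens(text: string, threshold = 3.5): string {
  return text
    .split(/(\s+)/) // keep whitespace so the surrounding layout survives
    .map((tok) =>
      !/\s/.test(tok) && tok.length >= 20 && shannonEntropy(tok) >= threshold
        ? "[REDACTED]"
        : tok,
    )
    .join("");
}
```

Entropy gates work because secrets are near-random over a large alphabet, so their per-character entropy is far higher than natural-language words of the same length.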
footer + metadata
Attribution footer baked into the HTML at storage time — not injected on every serve request
Metadata extracted from <title> and <meta name="description"> for the DB record
response.sanitized: true if any content was removed or redacted
03
Serve
LIVE
Private R2,
Public Slug
The HTML lives in a private Cloudflare R2 bucket. Every visit to /e/{slug} issues a fresh 1-hour presigned redirect — the raw bucket URL is never exposed.
🗄️
Upload to R2
PutObjectCommand → private bucket at {userId}/{slug}.html · IPv4-forced to avoid ENETUNREACH on Hetzner
🔖
Insert DB record
10-char lowercase alphanumeric slug · on 23505 collision: rollback R2 object, retry up to 3×
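The slug allocation with collision retry can be sketched as below. The 10-char lowercase-alphanumeric format, the 23505 unique-violation code, and the 3-attempt cap come from this page; `insertRecord` and `onCollision` are stand-in names for the real DB insert and the R2-object rollback.

```typescript
const SLUG_ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789";
const MAX_SLUG_RETRIES = 3;

function randomSlug(len = 10): string {
  let s = "";
  for (let i = 0; i < len; i++) {
    s += SLUG_ALPHABET[Math.floor(Math.random() * SLUG_ALPHABET.length)];
  }
  return s;
}

async function insertWithRetry(
  insertRecord: (slug: string) => Promise<void>, // throws { code: "23505" } on a duplicate slug
  onCollision: (slug: string) => Promise<void>,  // e.g. roll back the R2 object for this slug
): Promise<string> {
  for (let attempt = 0; attempt < MAX_SLUG_RETRIES; attempt++) {
    const slug = randomSlug();
    try {
      await insertRecord(slug);
      return slug;
    } catch (err: any) {
      if (err?.code !== "23505") throw err; // only retry on unique-constraint collisions
      await onCollision(slug);
    }
  }
  throw new Error("could not allocate a unique slug");
}
```

With 36^10 possible slugs, a collision is vanishingly rare; the retry loop exists so the unique constraint, not the generator, is the source of truth.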
🔗
Return shareable URL
201 JSON: { url, slug, expires_at, sanitized } · free tier links expire after 30 days
👁️
On visit: DB lookup
Check not deleted + not expired · increment view_count (fire-and-forget) · generate presigned URL
↗️
302 → Presigned R2
1-hour TTL · browser fetches HTML directly from Cloudflare edge · no proxy overhead
guardrails
30 days · free tier expiry → 410
1 hr · presigned URL TTL
3 · max slug retry attempts
🔒 Private bucket — R2 URLs are never exposed directly
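The visit-time guardrails above (deleted or missing record, 30-day expiry → 410, otherwise a 302 to a fresh presigned URL) can be sketched as a small resolver; the record field names and the 404 for deleted records are simplifying assumptions.

```typescript
// Resolve a /e/{slug} visit to a status: 404, 410, or 302 (redirect to
// a fresh 1-hour presigned R2 URL generated after this check passes).
type ExplainerRecord = { deletedAt: number | null; expiresAt: number | null };

function resolveVisit(record: ExplainerRecord | null, now: number): 404 | 410 | 302 {
  if (!record || record.deletedAt !== null) return 404;                  // missing or deleted
  if (record.expiresAt !== null && now >= record.expiresAt) return 410;  // free-tier 30-day expiry
  return 302; // proceed: bump view_count, presign, redirect
}
```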