The Problem
Every year, 7.5 million children in the U.S. receive special education services through an Individualized Education Program — the IEP. The document is a legal contract between a school district and a family. It determines what services a child receives, what goals they’ll work toward, and what accommodations protect them in the classroom.
Most parents can’t read it.
Not because they’re uneducated. Because IEPs are written in compliance language, by compliance teams, for compliance purposes. A parent sits across the table from a school psychologist, a special education coordinator, a general education teacher, a speech-language pathologist, and sometimes an administrator — all of whom wrote the document, know the law, and control the process.
The parent has a 30-page PDF and whatever they Googled the night before.
I know this world personally — through GrowTale, through the special education system, through years of sitting in those meetings. But IEP Says isn’t a passion project. It’s a product built with the same methodology I’ve applied to Fortune 500 engagements: research first, frameworks second, technology third.
The Product
IEP Says does one thing: it reads the document the parent can’t, and tells them what the school won’t.
Upload a PDF or photograph each page with your phone. The system runs the document through a multi-step AI analysis pipeline, cross-references it against state and federal special education law, and delivers a plain-English report that tells you:
- What the document actually says (translated from compliance language)
- Where it falls short of legal requirements (flagged by severity)
- What questions to ask at the next meeting (specific, not generic)
- How each goal measures up to quality standards (SMART criteria + state requirements)
- What services and accommodations might be missing
The report isn’t a summary. It’s a meeting prep tool. The parent who walks in with an IEP Says report walks in with the same information the school team has — sometimes more.
The Brand
The brand identity for IEP Says was designed around a single constraint: this is a product for people in crisis. Parents searching for help with their child’s IEP are stressed, overwhelmed, and often angry. The brand needed to feel trustworthy and calm — not clinical, not playful.
Wordmark & Identity System
“Says” is the brand. “IEP” is the qualifier. The coral comma-dot is the signature mark — a teardrop that anchors every lockup. The system includes stacked and inline wordmarks at four sizes, an “S.” icon mark for app icons and favicons, and the standalone comma mark for loading states and bullet replacements.
Color & Typography
Navy (#1A365D) and Coral (#F06050). Navy carries authority and trust — it’s the color of the professional side of the table. Coral is the human warmth — the accent that says “we’re on your side.” The palette extends into full tonal scales for both brand colors, with semantic greens, ambers, and reds for the severity system that drives the analysis output.
Typography pairs Poppins (headings, wordmark) with Inter (body, UI). Poppins has enough personality to feel approachable without sacrificing readability at small sizes. Inter is invisible in the best way — it stays out of the parent’s way when they’re reading a 2,000-word article at midnight. The type scale runs from display (3rem/700) down to small (0.75rem/400), all defined in the Tailwind config.
Interaction & Motion
The homepage features scroll-driven storytelling: every section reveals on scroll with staggered fade-in animations. The SVG meeting illustration draws itself as the parent scrolls — five school staff figures rendered in coral strokes. A bespoke 9-second hero animation walks through the complete user journey (upload → score → walkthrough → toolkit) using animated counters, a custom typewriter effect, and SVG progress rings. All animations respect prefers-reduced-motion.
The AI Pipeline
This is where the CX work becomes technical architecture. The analysis pipeline isn’t one big prompt — it’s a six-step orchestration that mirrors how a human advocate would read an IEP.
Document Ingestion
PDF upload or multi-image photo capture. For PDFs, a WASM-based renderer converts each page to JPEG and extracts OCR text. For photos, a QR-code handoff lets parents scan pages with their phone camera. Claude vision handles OCR on photographed pages — 8 images concurrently, 4 parallel threads, 30-second per-page timeout.
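The concurrency pattern described above can be sketched with two small helpers — a bounded-concurrency mapper and a per-task timeout wrapper. This is a minimal illustration, not the production code; names like `mapWithConcurrency` are hypothetical.

```typescript
// Sketch: run page-OCR tasks with a concurrency cap and a per-task timeout.
// Helper names are illustrative, not taken from the actual codebase.

async function withTimeout<T>(p: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms);
  });
  try {
    return await Promise.race([p, timeout]);
  } finally {
    clearTimeout(timer);
  }
}

async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T, index: number) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Spawn `limit` workers; each pulls the next unclaimed index until done.
  const workers = Array.from({ length: Math.min(limit, items.length) }, async () => {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  });
  await Promise.all(workers);
  return results;
}
```

With a 30-second timeout, a per-page call becomes `withTimeout(ocrPage(img), 30_000, "ocr")`, fanned out through `mapWithConcurrency(pages, 4, ...)`.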
Structure Detection
Claude Haiku classifies the document type (IEP, 504 plan, evaluation, progress report), identifies section boundaries, and extracts child information: name, grade, disability categories, school. This step takes ~5 seconds and determines how the heavier analysis steps are configured.

Section Analysis
Claude Opus analyzes each IEP section independently — 8 sections running in parallel. The system prompt includes the full state compliance context (loaded from the 50-state data engine), federal IDEA baseline, and the services catalog. Each section gets a plain-English translation, compliance flags with severity levels, and specific recommendations. Prompt caching reduces effective token usage by ~80%.
Goal Evaluation
Every IEP goal gets scored against SMART criteria and state-specific requirements. The system flags vague language, missing baselines, unrealistic timelines, and goals that don’t connect to present levels of performance. This is the analysis parents value most — it turns “Annual Goal: Student will improve reading skills” into “This goal has no baseline, no measurement criteria, and no timeline. Here’s what to ask for instead.”
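The actual scoring is done by the model against state requirements, but the shape of a SMART-style check can be sketched with a toy heuristic (the function and its keyword patterns are purely illustrative):

```typescript
// Toy heuristic only: flags a goal for missing baseline, measurement, or
// timeline language. The real evaluation is model-driven and state-aware.

interface GoalFlags {
  hasBaseline: boolean;
  hasMeasurement: boolean;
  hasTimeline: boolean;
}

function checkSmartCriteria(goalText: string): GoalFlags {
  const t = goalText.toLowerCase();
  return {
    // e.g. "from a baseline of 40 wpm" or "currently reads at..."
    hasBaseline: /baseline|currently|present level/.test(t),
    // e.g. "with 80% accuracy in 4 of 5 trials"
    hasMeasurement: /\d+\s*%|accuracy|trials|wpm|out of \d+/.test(t),
    // e.g. "by the annual review" or "within 36 instructional weeks"
    hasTimeline: /by |within |weeks|annual/.test(t),
  };
}
```

Run against "Student will improve reading skills", all three flags come back false, which is exactly the report's point: a goal that measures nothing cannot fail.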
Service & Accommodation Review
Extracted services are matched against a 20-item services catalog and cross-referenced with the accommodations database. The system identifies gaps — services the child’s disability profile suggests but the IEP doesn’t include — and flags intensity concerns (e.g., 30 minutes/week of speech therapy for a child with severe articulation disorder).
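At its simplest, the gap check is a set difference between services the disability profile suggests and services the IEP actually lists. A minimal sketch (names and inputs are invented for illustration):

```typescript
// Toy sketch of gap detection: which suggested services are absent from the IEP?
// Normalization is deliberately simple; the real matching is semantic.

function findServiceGaps(suggested: string[], inIep: string[]): string[] {
  const present = new Set(inIep.map((s) => s.toLowerCase().trim()));
  return suggested.filter((s) => !present.has(s.toLowerCase().trim()));
}
```

In production the matching is semantic rather than string-based (see the vector search layer below in the source's own description), but the output contract is the same: a list of candidate gaps for the parent to raise.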
Synthesis & Enrichment
The application layer combines raw AI outputs into a unified analysis. An enrichment engine adds status indicators, severity classifications, cross-section conflict detection, and generates the final deliverables: top action items, meeting preparation questions, and email templates for follow-up.
Why Multi-Step Matters
A single-prompt approach would be cheaper and faster. But IEPs are adversarial documents in a way that most AI-analyzed content isn’t. Schools have compliance incentives to use language that technically satisfies requirements while practically meaning very little. “Student will make progress in math” is legally defensible and educationally meaningless.
Each pipeline step is tuned to detect a different category of evasion. Structure detection catches missing sections. Section analysis catches vague language. Goal evaluation catches goals that can’t fail (because they don’t measure anything). Service review catches services that are present on paper but insufficient in practice.
The pipeline doesn’t just read the document. It reads between the lines.
The Data Engine
The AI pipeline is only as good as the compliance knowledge behind it. Federal special education law (IDEA) provides the baseline, but every state layers on its own requirements, timelines, and procedural rules. A California IEP has different legal requirements than a New Hampshire one.
I built a 50-state compliance database.
The Scale
- 50 states + DC covered with baseline data
- 20 states fully verified with primary-source citations (NH, MA, CA, TX, NY, OH, FL, IL, PA, GA, NC, NJ, MI, VA, MN, WA, WI, AZ, IN, CO)
- 26 compliance topics per state: required sections, evaluation procedures, timeline requirements, eligibility criteria, reevaluation rules, goal standards, related services, accommodations, modifications, assistive technology, procedural safeguards, parent rights, due process, dispute resolution, behavior supports, transition planning, progress monitoring, funding, professional development, least restrictive environment, and more
- Quality scoring on each topic (65–99%), with multi-pass validation against primary sources
The Research Pipeline
This isn’t scraped data. Each state’s regulations were researched the same way I’d research a client engagement:
- Primary source identification — state education codes, administrative rules, department of education guidance documents
- Synthesis — Claude processes and structures the raw legal text into actionable requirements, timelines, and citations
- Multi-pass validation — a separate validation agent checks every claim against the source material, targeting 96%+ accuracy
- Correction — any inaccurate or unsupported claims are flagged, corrected, and re-validated
- Gap auditing — automated reports identify remaining coverage gaps per state
Each state JSON file contains the full topic structure, key requirements, relevant timelines, and source citations with URLs back to the original legal text. When the AI pipeline analyzes an IEP, it loads the correct state context and uses it as grounding — not as a reference, but as the system prompt’s compliance authority.
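A sketch of what one topic inside a state file might look like. The field names, citation, and URL below are placeholders, not the actual schema; the 60-day requirement shown mirrors the federal IDEA baseline rather than any specific state's rule.

```typescript
// Hypothetical shape of one topic in a per-state compliance file.
// All values are illustrative placeholders.

interface StateTopic {
  topic: string;
  requirements: string[];
  timelines: { event: string; deadline: string }[];
  citations: { citation: string; url: string }[];
  qualityScore: number; // 65–99, per the validation pipeline
}

const exampleEvaluationTopic: StateTopic = {
  topic: "evaluation-procedures",
  requirements: [
    "Initial evaluation completed within 60 days of parental consent (federal baseline)",
  ],
  timelines: [{ event: "initial-evaluation", deadline: "60 days from consent" }],
  citations: [
    // Placeholder citation and URL, not a real legal reference
    { citation: "State Ed Code § 000.0", url: "https://example.com/state-code" },
  ],
  qualityScore: 96,
};
```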
Vector Search Layer
State regulation data and the accommodations catalog are embedded into pgvector using BGE-base-en-v1.5 (768 dimensions) via a Cloudflare Workers proxy. This powers two features:
- Semantic chat — parents can ask natural-language questions about their state’s special education law and get relevant regulation snippets ranked by similarity
- Accommodation matching — the system semantically matches a child’s disability profile and current services to relevant accommodations they might be missing
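In production the ranking happens inside Postgres via pgvector's distance operators; a local sketch of the underlying math (with tiny stand-in vectors instead of 768-dimension BGE embeddings) looks like this:

```typescript
// Sketch of similarity ranking. In production the comparison runs in Postgres
// via pgvector; this local version just shows the cosine math.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function rankBySimilarity<T extends { embedding: number[] }>(
  query: number[],
  items: T[],
  topK: number
): T[] {
  return [...items]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding)
    )
    .slice(0, topK);
}
```

The same ranking serves both features: regulation snippets for chat, accommodation candidates for gap matching.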
The Content Strategy
IEP Says isn’t just a tool. It’s a content platform.
Learning Hub: 41 Published Articles
Every article is written for the parent Googling at 11pm the night before their IEP meeting. Topics range from foundational (“How to Read an IEP”) to specific (“IEP Red Flags: What to Look For”) to disability-focused (“Autism and the IEP: What Parents Need to Know”).
Each article includes:
- 3,000–8,000 words of substantive guidance (not thin SEO content)
- State-specific notes for all 20 verified states
- Plain English sidebars that translate jargon
- FAQ structured data for search visibility
- Internal links to the product features that address the article’s topic
Accommodations Directory: 451+ Entries
A searchable catalog of IEP accommodations with plain-language explanations, filterable by disability category (17), accommodation type (14), grade level, and subject area. Each entry includes a parent-friendly “tip” explaining why the accommodation matters and how to request it.
The directory started as a separate product (IEP Directory), validated demand, and was merged into IEP Says as an integrated feature — a natural extension of the analysis workflow.
SEO Infrastructure
- XML sitemap with all article, state, accommodation, and glossary pages
- Structured data (Organization, WebApplication, FAQ, BreadcrumbList)
- llms.txt for AI crawler discoverability
- Google Search Console and Bing Webmaster integration for index monitoring
- State-specific landing pages for all 50 states (SEO long-tail for “[state] IEP requirements”)
The Business Model
Per-Document Paywall
- Free: Upload and get a preview — enough to see the value, not enough to act on it
- Paid: $9.99 unlocks the full report, chat, and email templates
- Stripe Embedded Checkout — no redirect, no friction
The pricing is deliberate. $9.99 is less than a single hour of a special education advocate’s time (typically $75–200/hr). The value frame isn’t “AI document analysis” — it’s “meeting preparation that would otherwise cost $200 or not happen at all.”
Why Not Subscription
Most parents need IEP Says 1–3 times per year (annual review, triennial evaluation, maybe a dispute). A monthly subscription would feel extractive for a product with episodic usage. Per-document pricing aligns cost with value delivered.
Future opportunities include subscription tiers for families with multiple children, school district site licenses, and advocacy organization partnerships — but the unit economics work at the per-document level first.
The Architecture
IEP Says is a full-stack production system, not a wrapper around an API call.
Stack
| Layer | Technology |
|---|---|
| Frontend | Next.js 14.2, React 18, TypeScript, Tailwind CSS v3 |
| UI | shadcn/ui + Radix UI primitives |
| Backend | Next.js API routes, Supabase (Postgres + Auth + Storage) |
| AI | Anthropic API — Opus (analysis), Haiku (structure), Sonnet (validation) |
| Vector Search | pgvector + Cloudflare Workers AI (BGE-base-en-v1.5) |
| Payments | Stripe (Embedded Checkout) |
| Email | Resend (transactional) |
| Monitoring | Sentry (errors + APM) |
| Deployment | Vercel |
Security & Compliance
IEP documents contain some of the most sensitive data a family has — their child’s disability diagnosis, behavioral notes, academic performance, and sometimes medical records.
- FERPA-aware design — no document content logged, optional auto-purge after 90 days
- Row-Level Security on every table — users can only access their own data
- No AI training — explicit opt-out in every Anthropic API call
- CSRF protection with origin validation on all non-GET requests
- Input sanitization — Zod schemas on every API route, HTML sanitization before rendering
- Rate limiting — free tier capped at 3 documents/month
Performance Engineering
The analysis pipeline must complete within Vercel’s 300-second function timeout. Budget allocation:
- Document ingestion: ~30s
- Structure detection: ~5s
- Section analysis (8 parallel): ~60s
- Goal evaluation: ~30s
- Service review: ~30s
- Enrichment: ~5s
- Buffer: ~140s for retries and edge cases
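A budget like the one above can be enforced with a small guard that each step consults before starting, so the pipeline can return a partial result instead of hitting the platform's hard timeout. A sketch (the class and method names are hypothetical):

```typescript
// Illustrative time-budget tracker. `now` is injectable for deterministic tests.

class TimeBudget {
  private readonly deadline: number;

  constructor(totalMs: number, now: number = Date.now()) {
    this.deadline = now + totalMs;
  }

  // Milliseconds left before the hard deadline, floored at zero.
  remainingMs(now: number = Date.now()): number {
    return Math.max(0, this.deadline - now);
  }

  // Can a step with the given estimated cost still fit?
  canAfford(stepMs: number, now: number = Date.now()): boolean {
    return this.remainingMs(now) >= stepMs;
  }
}
```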
Prompt caching reduces effective input tokens by ~80% across section analyses (the 37KB state compliance context is cached after the first section, then read for the remaining seven). Retry logic with exponential backoff and jitter handles transient rate limits gracefully.
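The retry behavior can be sketched with the common "full jitter" variant of exponential backoff, where each delay is drawn uniformly from [0, min(cap, base * 2^attempt)]. Function names and default values here are illustrative; the `rng` parameter exists only so the math is testable deterministically.

```typescript
// Sketch: exponential backoff with full jitter.

function backoffDelayMs(
  attempt: number,
  baseMs = 1000,
  capMs = 30_000,
  rng: () => number = Math.random
): number {
  const ceiling = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.floor(rng() * ceiling);
}

async function withRetries<T>(fn: () => Promise<T>, maxAttempts = 5): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err; // e.g. a transient 429 from the model API
      if (attempt < maxAttempts - 1) {
        await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
      }
    }
  }
  throw lastErr;
}
```

The jitter matters under parallelism: eight section analyses rate-limited at once would otherwise retry in lockstep and collide again.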
What IEP Says Demonstrates
Every enterprise CX engagement I’ve run follows the same arc: understand the stakeholders, map the experience, identify where the system fails the human, build the framework that fixes it, and measure whether it worked.
IEP Says is that arc, end to end, without a client brief.
The research methodology is the same. Stakeholder interviews with parents, educators, and therapists. Journey mapping from “parent receives IEP draft” through “parent leaves the meeting.” Service blueprinting that reveals where the system’s incentives diverge from the family’s needs.
The strategic synthesis is the same. Fifty states of regulatory complexity distilled into a structured knowledge system. Competing information sources (federal law, state code, district policy, school practice) unified into a single compliance authority.
The framework thinking is the same. The six-step pipeline isn’t an engineering artifact — it’s a service blueprint encoded as software. Each step maps to a moment in the parent’s journey where expertise is needed and unavailable.
The measurement systems are the same. Quality scoring. Severity classification. Coverage tracking. The same accountability infrastructure I build for Fortune 500 CX programs, applied to an AI product.
The difference between a CX strategist and a product builder used to be the ability to ship. That gap has closed. What hasn’t changed is the ability to know what to build — and more importantly, what not to build. That still requires the research, the synthesis, and the judgment that comes from years of doing this work with real people.
IEP Says is live at iepsays.com.
What I’m building toward: a world where no parent walks into an IEP meeting outgunned. The school team will always have more people in the room. The parent should never have less information.