Overview
How can schools, states, and vendors report student achievement to parents in a way parents actually understand — and that links to real-world meaning (careers, college readiness, civic life, specific skills) rather than just test-score abstractions like "Approaches Standards"? That's the question this wiki is mapping: who is working on it, what strategies they're trying, and where the gaps are.
What we know so far
The fundamental problem is a perception gap between what parents believe about their child's academic achievement and what standardized data shows. Learning Heroes and Gallup have documented the parent-side version: roughly 9 in 10 parents believe their child is at or above grade level, while NAEP data puts it at fewer than half. Smarter Balanced and National PTA, from the vendor side, describe the same problem as an "assessment system literacy gap." The 2022 SBAC × PTA focus groups (Edge Research, 29 parents of 3rd–8th graders, including a dedicated Spanish-speaking group) sharpened that diagnosis: parents understand level descriptors in the abstract but stumble when claim-level subscores conflict with the headline score, when predictive language is used, and when K–8 results are framed in college-readiness terms ("college is not necessarily a goal for everyone"). The same study extended the diagnosis to a parallel teacher assessment system literacy gap: parents deprioritize test results when teachers minimize their importance.
Three broad strategic responses are visible in the landscape:
- Improve communication around existing scores. Learning Heroes' approach — make the same data more legible. The clearest concrete artifact is the HCM Strategists × Learning Heroes multi-state model report card, implemented across South Dakota, Texas, New Mexico, and Massachusetts (with CCSSO technical assistance for additional states). South Dakota rebuilt its state report card from scratch; New Mexico added a parent portal with explainer videos; Massachusetts ran parent usability sessions. School-level reporting, not student-level.
- Redefine what scores mean. Illinois lowered IAR cut scores in 2025 so that reported proficiency rates would line up more intuitively with grade-level expectations, moving ELA proficiency from 39.4% to 52.0% (the mechanism is sketched just after this list). Critics called it a "change in the goal posts"; the move also creates a secondary communication problem, since the same labels now mean something different from what they meant a year ago.
- Translate scores into named real-world meanings. ACT's NCRC is the clearest concrete example: a tiered credential (Bronze/Silver/Gold/Platinum) mapped directly to the share of jobs those skills cover (see the second sketch after this list). "You're qualified for 93% of jobs in our database" is intelligible in a way "Approaches Standards" is not. Arkansas integrated the NCRC into K–12 student accountability via Act 319 of 2021. The 2025 ACT Work Ready Communities adopter landscape clarified that Arkansas is the only state taking this K–12 route: Louisiana has more NCRCs (315k+) and New York is at pilot scale, but both treat the NCRC as a workforce-development credential rather than as an achievement signal feeding back into schools.
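A minimal sketch of the cut-score mechanism behind the "redefine what scores mean" strategy. The scale scores and cut points below are assumed for illustration, not actual IAR values; the point is that the same student scores produce two different reported proficiency rates purely because the boundary for the label moved.

```python
# Hypothetical scale scores for one grade's ELA results (not real IAR data).
scores = [712, 725, 738, 741, 749, 752, 760, 768, 771, 785]

OLD_CUT = 750  # illustrative "proficient" cut score before the change
NEW_CUT = 740  # illustrative lowered cut score

def proficiency_rate(scores, cut):
    """Share of students scoring at or above the cut."""
    return sum(s >= cut for s in scores) / len(scores)

print(f"Old cut ({OLD_CUT}): {proficiency_rate(scores, OLD_CUT):.0%} proficient")
print(f"New cut ({NEW_CUT}): {proficiency_rate(scores, NEW_CUT):.0%} proficient")
# Same students, same scores; only the label boundary moved. That is the
# secondary communication problem noted above: the label's meaning changed.
```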
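A companion sketch of the score-to-real-world translation the NCRC performs. The tier names are ACT's, but the job-coverage shares and the `parent_message` helper are illustrative assumptions, not ACT's published WorkKeys figures (the source quote cites "93% of jobs" for one tier). The shape of the mapping is what matters: a test performance becomes a named tier, and the tier becomes a statement about jobs rather than about the test.

```python
# Illustrative only: tier names are ACT's, coverage shares are assumed here.
TIER_JOB_COVERAGE = {
    "Bronze": 0.16,
    "Silver": 0.67,
    "Gold": 0.93,
    "Platinum": 0.99,
}

def parent_message(tier: str) -> str:
    """Translate a credential tier into a real-world statement for parents."""
    share = TIER_JOB_COVERAGE[tier]
    return f"{tier} credential: qualified for about {share:.0%} of profiled jobs"

print(parent_message("Gold"))
# -> "Gold credential: qualified for about 93% of profiled jobs"
```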
The vendor landscape is not converging on a shared format. Smarter Balanced reports proficiency bands. NWEA, through MAP Growth, has abandoned bands entirely: it reports a continuous RIT scale score plus a norm-group percentile and a growth percentile, framed as "is your child growing at a healthy rate?" rather than "is your child proficient?" Both are widely used, and both have the same blind spot: they translate scores into other test-internal numbers (norms, percentiles, levels) rather than into anything outside testing. NWEA's recommended fallback when a parent doesn't understand a term is to ask the teacher.
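A minimal sketch of the two data shapes a parent actually receives under these formats. The field names and values are assumptions for illustration, not any vendor's actual schema; the relevant observation is that every field in both records is test-internal.

```python
from dataclasses import dataclass

@dataclass
class BandReport:
    """Proficiency-band style (state assessments): a label tied to cut scores."""
    scale_score: int
    level: int               # e.g. 1-4
    level_label: str         # e.g. "Approaches Standards"

@dataclass
class NormReport:
    """Norm-referenced style (MAP Growth-like): scale score plus percentiles."""
    rit_score: int           # continuous scale score
    norm_percentile: int     # standing relative to a national norm group
    growth_percentile: int   # growth relative to students who started similarly

# Both records answer "how did my child do on the test?" in test-internal
# terms (levels, norms, percentiles); neither points at anything outside
# testing, which is the shared blind spot described above.
band = BandReport(scale_score=2481, level=2, level_label="Approaches Standards")
norm = NormReport(rit_score=211, norm_percentile=48, growth_percentile=61)
```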
On the competency side, Portrait of a Graduate (PoG) frameworks, championed by NGLC, attempt to frame achievement as real-world capacities (communication, critical thinking, wayfinding) rather than proficiency bands. The published framework material remains thin on parent-reporting tools, but vendors are starting to fill the gap. SpacesEDU is the first concrete tooling example in the wiki: district-defined PoG templates, multimedia evidence portfolios, and a "live visual report card" families can open daily. Whether those affordances translate into improved parent comprehension, versus just becoming another platform parents have to log into, has not been independently evaluated.
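For contrast with the score-based records above, a rough sketch of what a competency-portfolio record of this kind might contain. The structure and field names are assumptions drawn from the marketing description, not SpacesEDU's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Evidence:
    """One artifact a student attaches to a competency (assumed structure)."""
    title: str
    media_type: str          # e.g. "video", "audio", "text"
    teacher_comment: str = ""

@dataclass
class CompetencyEntry:
    """One district-defined Portrait of a Graduate competency."""
    name: str                # e.g. "Communication", "Critical thinking"
    evidence: List[Evidence] = field(default_factory=list)

# A "live visual report card" would render these entries continuously,
# rather than waiting for an end-of-year score report.
portfolio = [
    CompetencyEntry("Communication",
                    [Evidence("Podcast episode on local water quality", "audio")]),
    CompetencyEntry("Critical thinking"),
]
```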
Key tensions
- Improve the signal vs. redefine the signal. HCM × Learning Heroes and Illinois diagnose the same problem and propose opposite fixes — one redesigns the report, the other moves the cut scores. Both have shipped artifacts; comparative effects are not measured.
- Bands vs. scale-only vs. tier credentials. Three categorically different vendor-facing patterns now sit side by side in the wiki: bands with cut scores (state assessments), scale-only with norm comparison (NWEA), and tiers tied to real-world meaning (NCRC). Only the third closes the meaning gap, and only at graduation age.
- Framework vs. credential. PoG offers community-owned, lifelong-skills framing at the cost of measurement rigor and cross-district comparability. NCRC offers portability and literal legibility at the cost of being vendor-defined and graduation-era only. Could be competing answers or complementary layers; no documented district runs both.
- Framework-strong, tooling-weak, but narrowing. SpacesEDU starts to close the PoG tooling gap, but the question shifts from "is there a tool?" to "does the tool actually change parent comprehension?" On that question, vendor marketing is silent.
- Concentrated qualitative-research supplier. Two of the most-cited recent qualitative findings on parent comprehension — the SBAC × PTA 2022 study and the HCM × Learning Heroes multi-state research — were both conducted by Edge Research. The wiki's qualitative evidence base is narrower than it appears.
Where to start reading
- Perception gap — the anchor concept; most pages ultimately reference back here.
- B-flation (Learning Heroes × Gallup) — canonical statement of the parent-side gap, with numbers.
- SBAC × PTA Findings (2022) — the closest look at how parents actually read state assessment reports.
- Chalkbeat on the Illinois cut-score change — the clearest case of the "redefine meaning" strategy.
- ACT WorkKeys / NCRC and Work Ready Communities adopters — the score-to-real-world translation done literally, plus why it's a one-state K–12 pattern.
- HCM × Learning Heroes report-card redesign — the "improve the signal" strategy executed at state scale.
- SpacesEDU product pages — first concrete competency-portfolio tool.
- Momentum and sentiment across the field — cross-cutting view: activity is broad and recent, but almost no one measures whether what they shipped actually helps parents.
What we don't know yet
- Whether anyone connects score bands to specific real-world competencies at the K–8 level (the "you can read a nutrition label" version of proficiency reporting), rather than only through graduation-era workforce credentials. Nothing ingested so far does this.
- Whether tools like SpacesEDU actually improve parent comprehension or just shift the form of the report. Vendor marketing makes the affordance argument; no independent evaluation is yet in the wiki.
- Where the "real-world applications" framing is taking hold beyond Arkansas: is it confined to career-readiness vendors and competency-based schools, or is it spreading into mainstream state K–12 accountability? Current evidence suggests Arkansas remains alone.
- Whether the Spanish-speaking subgroup in the SBAC × PTA 2022 focus groups surfaced findings distinct from the other four groups; the report does not segment results by group, and bilingual reporting remains under-addressed.
- Whether the "teacher assessment system literacy" frame, newly named on the vendor side, will produce concrete teacher-side initiatives or remain a diagnosis without a program.