EdReports v2.0 Reports and Tools: Frequently Asked Questions
Learn more about EdReports v2.0 reports and K-12 curriculum review tools covering ELA, math, science, and supports for multilingual learners (MLL).
This resource provides answers to common questions about EdReports v2.0 reports and review tools.
- To access all EdReports’ current and previous review tools, visit our Review Tools page.
- For general questions about EdReports, see our main FAQs page.
In 2024, EdReports revised all its review tools for evaluating comprehensive, K-12 instructional materials in English language arts (ELA), math, and science. We began using these “version 2.0” tools for all new reviews of relevant materials starting in early 2025, with the first reports published in July 2025.
EdReports is committed to continuous learning and innovation to meet the evolving needs of the education community. We examine our review tools and review process on an ongoing basis, updating them as needed to ensure our reports provide maximum value to the field. Our v2.0 tool revisions reflect this commitment, marking our first multi-subject update since 2020.
Page contents
- FAQs: Key v2.0 updates and timings
- FAQs: ELA
- FAQs: Math
- FAQs: Multilingual learner (MLL) tools and reviews
- FAQs: Science
- FAQs: Earlier reports and review tools
- FAQs: Tool revisions
- FAQs: Listening and Learning Tours and Advisories
FAQs: Key v2.0 updates and timings
Q: When were the first v2.0 reports released?
A: Inaugural v2.0 reports for K-12 comprehensive materials were released in July 2025. Further reports will be released on a rolling basis; see which materials are currently under review and in queue on our Upcoming Reviews page.
Q: What changes have you made in version 2.0 tool revisions?
A: Version 2.0 review tools are available on our Review Tools page. Key updates include:
- Significant enhancements to ELA criteria to ensure stronger alignment with the science of reading and structured literacy practices.
- New, multilingual learner (MLL)-specific Review Criteria for each content area to enable broader and deeper evaluation of MLL supports.
- Deepened emphasis on the Standards for Mathematical Practice in math criteria and phenomena-driven three-dimensional instruction in science criteria.
- Streamlining of criteria and gateway structures across all content areas to increase consistency, clarity, and efficiency, and to facilitate a more nimble review process.
Our updated criteria also represent continuity alongside enhancements. They maintain EdReports’ decade-long laser focus on alignment to college- and career-ready standards, grade-level content, evidence-based practices, and other markers of quality, including teacher and student supports.
Q: Do version 2.0 tool revisions include pre-Kindergarten (pre-K) materials?
A: No; while our pre-K review tools and process are informed by our learnings from 10 years of reviewing K-12 materials, we’re developing these new tools separately from K-12 tool revisions due to the many points of difference between pre-K and other grade bands.
Learn more via our pre-K FAQs, resources, and Review Tools page.
Back to top of page / contents
FAQs: English language arts (ELA)
Q: What are the key highlights and innovations of v2.0+ ELA reports and review tools?
A: The key highlights and innovations of v2.0 ELA reports and review tools include the following:
- Areas spanning K-12:
- Integration of knowledge building throughout comprehension criteria.
- Focus on a clear core instructional pathway plus guidance for using supplemental materials.
- Enhanced emphasis on sentence-level writing and explicit reading-writing connections.
- Evaluation for a range of full texts and excerpts, cohesive text sets that build knowledge, and varied perspectives and representation.
- Areas specific to K-5:
- Tighter alignment to structured literacy and research-based practices, incorporating standards as appropriate.
- Introduction of a dedicated indicator scoring materials based on the absence of three-cueing.
- Increased focus on phonemic awareness.
- Deepened foundational skills indicators in grades 3-5 to better align to developmentally appropriate research.
Q: What’s the difference between ELA v2.0 and v2.1 reports and tools?
A: All ELA review tools labeled v2.0 and v2.1 are part of EdReports’ “2.0 family,” representing our most rigorous and current tools. These tools reflect the latest research and priorities for instructional materials, including alignment to the science of reading, enhanced multilingual learner criteria, and improved usability.
The v2.0 tools were developed in 2024 and served as the foundation for our inaugural 2.0 reviews. During the first review cycle in 2025, we made targeted refinements to our K-5 ELA tools. Because these changes were substantive, those tools are now labeled v2.1. The main areas of substantive change between v2.0 and v2.1 tools are:
- Gateway 1 (foundational skills) of our 3-5 tool
- Gateway 3 (teacher and student supports) of our K-2 foundational skills supplements tool
ELA criteria for grades 6-12 have not changed since their initial draft release and continue to carry the v2.0 designation.
For more information, see our ELA Review Tools page.
Q: How do v2.0+ ELA reports and tools compare to earlier reports and tools?
A: For a detailed comparison, see Comparing Current and Earlier ELA Tools on our ELA Review Tools page. This page compares current and earlier tools across multiple review components, including knowledge building; instructional pathways and program “bloat”; writing; vocabulary; text quality and complexity; assessments; structured literacy practices and standards; phonics and three-cueing; phonemic and phonological awareness; and foundational skills, both consistency across contexts and grades 3-5.
Back to top of page / contents
FAQs: Math
Q: What are the key highlights and innovations of v2.0 math reports and review tools?
A: Key highlights and innovations of v2.0 math reports and review tools include the following:
- Increased requirements to focus on the major work of the grade in K-2 materials, including fundamental math skills.
- All eight Standards for Mathematical Practice now scored separately, using binary scoring, across all grade levels.
- Increased precision and consistency around rigor and balance across all grade levels.
To learn more, see 3 Ways EdReports Reviews Support Strong Foundations in Math.
Q: How do v2.0 math reports and tools compare to earlier reports and tools?
A: For a detailed comparison, see Comparing Current and Earlier Math Tools on our Math Review Tools page. This page compares current and earlier tools across multiple review components, including rigor and balance, the Standards for Mathematical Practice, assessments, and minimum K-8 thresholds for major work of the grade.
Back to top of page / contents
FAQs: Multilingual learner (MLL) tools and reviews
Q: Why have you created dedicated tools to evaluate materials for MLL supports?
A: Supporting multilingual learners is an area of critical need, today more than ever. An estimated 4.9 million children in U.S. public schools are learning English. Millions of these students spend much of their day in general education classrooms, often with teachers who are not specifically trained to work with them.
While we have reviewed all comprehensive K-12 materials for MLL supports since 2020, we first prototyped MLL-specific tools in 2022 in order to broaden and deepen criteria in this essential aspect of curriculum quality. These tools were piloted in 2022 and 2023 and have been further revised since, leading to the creation of dedicated, standalone MLL tools for each K-12 content area in version 2.0 review tools.
Q: How do MLL scores impact overall series ratings or whether a review proceeds to all gateways?
A: MLL scoring does not impact the scores or review process for core content for inaugural version 2.0 reports. In keeping with all existing EdReports reviews, the overall series rating for materials is based solely on scores for core content indicators (ELA, math, or science) and will not incorporate MLL scores.
In published reports, MLL indicators and scores are shown alongside relevant core content indicators to illustrate where multilingual learners are and are not supported within the content. However, the two sets of scores are reported separately.
MLL scores also do not impact the gateway process of core content reviews. For example, if a program meets expectations for core content in Gateway 1 but not for the corresponding MLL indicators in that gateway, the review still proceeds to Gateway 2.
Q: Why do MLL scores not count toward overall scores? Will that change in the future?
A: At EdReports, we know how important it is to shine a light on both the quality of core content and how well materials support multilingual learners. That’s why we report MLL and core scores separately for now. The field is still evolving in its understanding of what high-quality MLL supports look like, and presenting the scores separately helps ensure clarity and transparency for users. We’re continuing to learn alongside the field and may consider incorporating MLL scores into overall ratings in the future as we refine our tools and processes.
Q: What does it mean when a report shows “Scores Pending” for Multilingual Learner Supports?
A: In some cases, we publish the MLL section of a report separately from the core content sections (ELA, math, and science). Because MLL reviews require both content expertise and specialized knowledge, they’re more complex and time-intensive than core content reviews.
To keep ELA, math, and science reports flowing to the field, we publish them as soon as they are ready and release MLL scores and evidence on a later timeline to ensure quality, while still getting information into educators’ hands quickly.
Q: What do the MLL criteria ask reviewers to look for in materials? How are MLL tools and scoring structured?
A: EdReports MLL review tools use research-based aspects of success for MLLs to highlight where and how multilingual students can be successful within the materials. There are four categories of MLL criteria: MLLs’ Full and Complete Participation in Grade-Level Content, Coherence of MLL Supports, Teacher Guidance, and Assessment. Each criterion comprises several indicators.
The MLL criteria are the same across content areas and grade levels. The exact placement of MLL indicators in each tool varies slightly in ways that reflect the priorities and structures of its equivalent core content tool.
While MLL tools and scoring are separate from those for core content, the naming convention of each MLL indicator references its corresponding indicator in the relevant core content tool. This is to illustrate where multilingual learners are and are not supported within the core content.
Q: How do MLL reviews work? Who conducts MLL reviews?
A: Starting in early 2025, all comprehensive, K-12 materials in ELA, math, and science reviewed by EdReports are reviewed for MLL supports using the relevant, subject-specific MLL tool.
The core content and MLL reviews conducted for each program are separate processes involving separate review teams, although they take place along similar timeframes. MLL reviews are conducted by educator reviewers with expertise in both the relevant core content area and in MLL supports.
Q: Do you have criteria to evaluate MLL supports in ELA foundational skills content?
A: Not yet, but we hope to develop these in the future. The intersection of the science of reading and MLL supports is an evolving area of interdisciplinary research and collaboration, and we aim to ensure that our MLL indicators for foundational skills reflect the latest expert consensus.
Q: Will you review English language development supplements or materials in languages other than English for dual language/bilingual classes?
A: EdReports reviews comprehensive, year-long, K-12 materials in English language arts (ELA), math, and science, as well as K-2 ELA foundational skills supplements. Currently, we don’t review any other types of supplemental materials. However, we constantly monitor the landscape of materials used in U.S. classrooms and consider how we can best help decision-makers to make choices that advance more coherent, high-quality academic experiences for all students. Our expansion into reviewing pre-K materials is one example of this approach.
EdReports currently only reviews materials in English. From version 1.5 tools (2020) onwards, all of our review tools for K-12 comprehensive materials prompt reviewers to provide information about how materials support the use of home language in the classroom and whether home language is viewed as an asset in learning core content.
For more information on the types of materials we review, see our main FAQs page.
Back to top of page / contents
FAQs: Science
Q: What are the key highlights and innovations of v2.0 science reports and review tools?
A: Key highlights and innovations of v2.0 science reports and review tools include the following:
- Increased clarity and consistency across all grade levels in review criteria and scoring indicators related to:
- Phenomena and problems driving learning
- Three-dimensional learning and assessment
- Coherence and scope of materials
These refinements help report users better understand the intent of all indicators and continue to support the field’s growth in its implementation of the Next Generation Science Standards (NGSS) and phenomenon-driven, three-dimensional instruction.
Q: How do v2.0 science reports and tools compare to earlier reports and tools?
A: For a detailed comparison, see Comparing Current and Earlier Science Tools on our Science Review Tools page. This page compares current and earlier tools across multiple review components, including phenomena and problems, three-dimensional learning and assessment, coherence and scope of materials, and assessments.
Back to top of page / contents
FAQs: Earlier reports and review tools
Q: How do version 2.0 tool revisions affect materials that have already been reviewed by EdReports? Will they be re-reviewed against the new criteria? Can materials gain or lose points?
A: Tool revisions do not affect existing, published reports. EdReports does not update completed reports retroactively when we revise our review tools. We’re always willing to consider re-reviewing materials if they have been substantively updated, but that decision is prompted by changes to the materials, not to our review tools.
Each report reflects a specific point in time, using the most current versions of both the materials and our review tools. We believe every report we've published offers valuable evidence and insights for school systems as they explore potential materials.
Q: Could programs that were rated green on the previous tools be rated yellow or red on the new tools, or vice versa?
A: Different copyrights of the same title may receive different ratings because publishers often make significant content changes between editions.
When a publisher releases a new edition under a new copyright or creates a new digital edition,* we treat it as a new set of materials. This means we conduct a new review using the latest tools and generate a new report.
All our reviews, including those of current and previous editions, are available on the EdReports website. The same is true for our current and previous review tools. This transparency helps inform the field about how materials and EdReports tools have changed over time.
* Digital copyrights are often unchanged even when the materials are updated (unlike print materials).
Q: How should users approach reports that were created using older review tool versions?
A: Reports created with earlier versions of our review tools (v1.0 and v1.5) contain valuable insights, but may not fully capture the most recent educational priorities and research. For more information and guidance, see How to Use EdReports’ Earlier Reports and Review Tools.
Q: Can publishers request an updated review of their program using the new tools?
A: This depends on several factors. Ultimately, EdReports strives to ensure that our reviews accurately reflect what is being used in the field, and we stand ready to re-review materials when they have been substantively updated.
When a publisher releases a new edition of a program under a new copyright or effectively creates a new digital edition of their program,* we consider it a new set of materials and start a new review process using the latest available tools.
If the publisher has made substantial changes under the same copyright, we ask them to indicate the scale and substance of the changes made since our review. Depending on the answer, we may conduct a new, full review process or only update the affected parts of the review.
* Digital copyrights are often unchanged even when the materials are updated (unlike print materials).
Back to top of page / contents
FAQs: Tool revisions
Q: What does EdReports mean by “review tools” and “tool revisions”?
A: Each set of EdReports review tools covers a content area (for example, science) and grade band (for example, K-5), and comprises two documents:
- A Review Criteria document identifying the “indicators,” or specific items, against which EdReports’ educator reviewers evaluate the quality of instructional materials.
- An Evidence Guide elaborating details for each indicator including its purpose, information on how to collect evidence, guiding questions and discussion prompts for reviewers, and scoring criteria.
By “tool revisions,” EdReports means revising or updating its review tools, as well as making any related updates to its review process and report formats necessitated by the tool revisions.
Q: What are “version 2.0" review tools?
A: Each “version” of EdReports review tools is analogous to a generation of software or technology products: each builds on the strong foundation of previous iterations with new innovations and improvements.
Our 2024-25 development of version 2.0 tools was a multi-subject update covering all review tools for K-12 comprehensive materials in ELA, math, and science, similar in scope to our 2020 version 1.5 tool revisions. For more information on the details of version 2.0 tool revisions, see this article and below. To view all current and previous review tools, see our Review Tools page.
Q: Why does EdReports revise its review tools? What is the goal of these revisions?
A: EdReports is dedicated to continuous improvement, striving to ensure that our review tools and processes stay relevant, rigorous, and aligned with advances in curriculum, technology, and research on effective learning methods. By regularly refining our tools, we aim to help key stakeholders—decision-makers, students, and publishers—in the following ways:
- For states, districts, and educators: Our goal is to equip curriculum decision-makers with reliable, evidence-based reviews that support informed decisions about program adoptions and allow them to demand high-quality, standards-aligned instructional materials.
- For students: Revisions help increase access to the high-quality, aligned instructional materials that all students need in order to thrive.
- For publishers: Updates to our tools provide clear, consistent expectations, supporting publishers in developing materials that meet the evolving needs of the field.
Q: How frequently does EdReports plan on updating its tools in the future?
A: EdReports is dedicated to continuous learning and growth, addressing the needs of students, teachers, districts, and states, and incorporating feedback from our users and the field. We don't follow a fixed schedule for updating our review tools because the field evolves rapidly; the needs of, and input from, the field primarily determine the timing.
We strive to balance responsiveness to new research findings and emerging trends, such as digital innovations and the use of AI, with near-term stability. This is to ensure our review tools and reports provide reliable guidance for publishers' design work and allow states to signal quality to districts with confidence.
Q: What does EdReports’ tool revision process involve?
A: Since our last multi-subject tool revision in 2020, EdReports has been gathering input from a wide range of education stakeholders, including educators, experts, and decision-makers, to ensure our reviews keep pace with curriculum innovations and meet the field's evolving needs.
For version 2.0 tool revisions, we incorporated insights gained from stakeholder conversations and from all reviews conducted since our 2020 revisions.
We also worked closely with expert advisory groups and gathered targeted feedback through our 2024 Listening and Learning Tours for each K-12 comprehensive review focus area (ELA, math, MLL, and science), as well as through surveys, focus groups, and one-on-one discussions. These approaches enabled us to engage deeply with the field to understand how our tools and reports could better support today’s dynamic educational landscape. The resulting updates address the needs of the field and reinforce our commitment to the highest standards for instructional quality.
Q: How did EdReports engage publishers in the tool revision process?
A: As an independent nonprofit, EdReports regularly communicates with and seeks input from publishers while materials are being reviewed, inviting feedback on ways to continuously improve our process. This approach helped to inform v2.0 tool development alongside feedback from many other stakeholders.
Back to top of page / contents
FAQs: Listening and Learning Tours and Advisories
Q: How does EdReports use Listening and Learning Tours?
A: During each review tool development or revision process, EdReports conducts a Listening and Learning Tour for each review focus area in order to understand the market landscape, assess the status of standards, identify needs, and engage various stakeholders for feedback. We seek input on a range of topics including technological innovations, the evolving nature of curriculum, the latest research, and how our reports can better serve users, the field, and decision-making around instructional materials.
Q: Who participates in Listening and Learning Tours?
A: Participants include classroom educators, researchers, nonprofits, states, districts, publishers, and students, ensuring comprehensive input and clarity on next steps. Depending on the review focus area, the tour involves up to hundreds of individuals through interviews, surveys, and focus groups with educator reviewers, professional organizations, curriculum leaders, and more.
Following initial conversations, EdReports continues to engage tour participants through specific follow-up discussions and contributions as we consider tool developments and adjustments.
Q: What are EdReports’ “advisories”? How is their involvement different from other Listening and Learning Tour contributors?
A: During each review tool development or revision process, EdReports convenes an “advisory” group for each review focus area. These groups consist of experienced classroom educators, researchers, curriculum experts, state and district leaders, and representatives from partner organizations. The advisories are subject-specific and often grade-band-specific, with members selected for their content expertise. Together, they provide a comprehensive range of independent field perspectives to guide the tool revision process for each subject.
Advisory members play a key role in bringing in other organizations and experts for the Listening and Learning Tour and actively participate in its sessions. They also help EdReports teams synthesize feedback from the tour and offer ongoing input on the creation and refinement of draft tools.