How does Spring Math work?

The Spring Math process begins with assessing your students' current skill level. Assessments take less than five minutes and are group administered. Spring Math summarizes screening results in an easy-to-read format with intervention recommendations.

There are two types of interventions in Spring Math: classwide and individual. Classwide intervention provides weekly adjustment of skill difficulty, new materials each week, summary reports reflecting classwide progress, and implementation support to promote sustained use of the intervention. Individual intervention begins with a “drill-down” assessment process to identify the right intervention for the student.

For both intervention types, Spring Math guides the teacher step-by-step. All intervention packets are dynamically generated as they are needed to ensure they match current skill development. Spring Math's proprietary decision rules guide instructional decisions, accelerating student growth.

Spring Math avoids two of the most common causes of intervention failure: choosing the wrong intervention and failing to use the intervention. Data from a randomized controlled pilot study showed that Spring Math-directed intervention produced closer alignment between intervention and student need, greater teacher use of the intervention, and stronger student learning in mathematics.

Spring Math simplifies intervention delivery by minimizing alternative causes of low mathematics performance, using assessment data to select the right intervention for the child, and delivering to the teacher all needed materials to conduct the intervention. The teacher only has to do the fun part—work with the student or the class as a whole for 15-20 minutes per day to raise achievement.

The system can be used to track core instruction in real time to verify mastery of essential grade-level skills. When a core concept has not been mastered, the system will notify the teacher and provide a brief classwide lesson plan or instructional protocol to re-teach the skill, along with a follow-up assessment to verify that it was successful.

View the Theory of Change for Spring Math.


How are the screening skills selected?

Screening skills are skills that students should have been introduced to before the screening and that they will need to master before learning the next skills appropriate for their grade. At each grade level, there are 3-5 screening assessments for fall, winter, and spring. The assessments emphasize certain skills, called “tool” skills, more than others. Tool skills open the door to deeper understanding of big ideas in mathematics. Spring Math selects tool skills based on the latest research in effective math instruction.

All screening skills align with the Common Core State Standards by grade level. For some grades Spring Math provides screening and instruction on skills not listed in the Common Core. These are skills that pave the way for a Common Core skill that appears in a later grade.

View a list of screenings by grade and time of year.


Why do the screening measures seem so hard for my students?

Academic screening can create anxiety for teachers. They may feel that if students don't perform well, it reflects badly on them. In reaction, some screening systems have created measures that are too easy. In the short term, this may produce better screening results, which may feel good to decision makers. But the risk of this approach is that students may not receive the support they need to grow to mastery.

Spring Math selects screening skills to reflect student progress toward mastering grade-level content. Students need to master these skills before moving on to the next skills appropriate for their grade. If a student fails an easier screening, they likely need more help to reach proficiency, but many students who pass easier screenings may actually need intervention as well. Although the measures used in Spring Math are challenging, studies have shown they do a better job of predicting when a student needs help.

Spring Math measures align with the Common Core State Standards for mathematics. For some grades Spring Math provides screening and instruction on skills not listed in the Common Core; these skills pave the way for a Common Core skill that appears in a later grade. For example, at grade 3, Spring Math assesses multiplication and division facts 0-12, whereas the Common Core standard is 0-10, because mastery of factors and products through 12 is a basis for creating equivalent fractions, a key part of grade 4 instruction.


Why are the assessments timed?

The assessments are timed to minimize the amount of time devoted to assessment. Over-assessment is a problem in most school systems; it robs teachers of precious instructional time. Spring Math uses short, curriculum-based assessments to reliably and accurately determine whether a student is at risk, using the shortest interval of time that still yields a sound measure of student proficiency. Once at-risk students are identified, they are given a series of brief follow-up assessments that determine where the problems lie so that the right intervention can be recommended. Recent research shows this more efficient approach to screening performs as well as or better than longer-duration methods.

Timed assessments also provide a more accurate indicator of skill mastery. Imagine two students who both score 100% correct. The first responds quickly and without hesitation, can solve a problem multiple ways, and can teach a friend how to solve it. The second is halting, unsure, and has to count hash marks to find the correct answer. Clearly the first student is more proficient, even though both students would earn the same score on an untimed assessment. Timed assessments tell teachers more about student proficiency: more proficient students answer more problems correctly within the time limit.
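
As a rough illustration (the numbers and function below are hypothetical, not Spring Math benchmarks or code), a timed score captures fluency as a rate of correct answers per minute, which separates two students who are equally accurate:

```python
# Hypothetical illustration: fluency as a rate separates two equally accurate students.

def fluency_rate(correct_answers: int, minutes: float) -> float:
    """Correct answers per minute on a timed probe."""
    return correct_answers / minutes

quick_student = fluency_rate(correct_answers=24, minutes=2)   # responds without hesitation
halting_student = fluency_rate(correct_answers=6, minutes=2)  # accurate but counts to find answers

print(quick_student, halting_student)  # 12.0 vs 3.0 correct per minute, same 100% accuracy
```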


How are Spring Math assessments different from those in other math assessment tools?

Spring Math’s unique, gated approach to math assessment saves teachers time by providing clear results on what skills a student has and hasn’t acquired. Spring Math provides the speed of curriculum-based measurement as well as the sensitivity to growth of specific-skill mastery assessment.

Spring Math assesses approximately 130 skills for core instruction in grades K-8 and remedies gaps in learning for grades K-12. The skills offer comprehensive but strategic coverage of the Common Core State Standards.

Spring Math assesses mastery of number operations, pre-algebraic thinking, and mathematical logic. It also measures understanding of “tool skills.” Tool skills provide the foundation a child needs to question, speculate, reason, solve, and explain real-world problems. This approach is unique among math assessment tools. Spring Math emphasizes tool skills across grades with grade-appropriate techniques and materials. The tool skills include:

  • combining whole numbers and variables
  • taking whole numbers and variables    
  • multiplicative reasoning including factors, products, and exponents
  • fractions, proportions, and division
  • quantity comparison, ordinal position, and place value
  • solving for unknowns
  • and creating equivalent quantities.


Many of the intervention materials in Spring Math provide an opportunity for mastering more than one tool skill.

View a complete list of skills assessed. 


How does Spring Math determine if a student is at risk or not?

Spring Math decision rules come from 40 years of research in math teaching and curriculum-based measurement. They are designed to increase alignment between student proficiency, skills, and teaching methods. A student who scores in the risk range on a Spring Math measure is unlikely to retain the skill or to use it to solve similar and more complex problems. At-risk students are likely to make errors when solving problems that require the skill and are unlikely to master related, more complex skills in the future.


Why are assessments given as part of the interventions?

Research shows that regular skill assessment is an important part of effective math instruction. Most screening tools don't recommend interventions based on student assessment scores, and online math practice tools give students practice without real feedback on skill mastery. Spring Math does both: it uses weekly assessment scores to recommend the right intervention for each student, and its intervention materials are dynamically generated to make sure students are practicing the right skills each day.

It can be challenging and time consuming to perform high-quality assessments in the classroom on a regular basis. Spring Math makes it easier by providing assessment materials that are relevant to the skills the student is practicing. These materials are also generated dynamically, so they are available right when you need them.

Research results on follow-up assessments to track the effects of skill intervention have been promising. For more information refer to “Innovation Configuration for Mathematics” (VanDerHeyden & Allsopp, 2014).


Why do the risk criteria differ across grades for the same skill?

Spring Math verifies mastery of prerequisite skills before instruction begins on grade-level skills. In some cases, Spring Math may recommend intervention for an older student on a lower-level skill. This happens when the student needs help closing the gap between what they can already do and what they have not yet mastered in order to get back on track.

This approach is sensible in math. For example, some students struggle to create equivalent fractions, which they must understand in order to add and subtract with fractions that have unlike denominators. Sometimes the inability to create equivalent fractions is partially caused by the student not having mastered multiplication facts.

When an older student is recommended for intervention on a lower-level skill, the student may need to reach a higher score than a younger student to be considered proficient on that skill. A higher rate of performance is necessary to predict retention, functional use of the skill, and learning of related and more complex skills, and it also reflects the older student’s ability to read, write, or type faster than a younger student.


What research evidence supports the use of Spring Math?

Spring Math benefits from research conducted not only by the developer (Amanda VanDerHeyden) but also by respected scholars in education and psychology. The design of Spring Math, from the screening assessments, skill content and sequences, and intervention protocols to the summary reports and implementation support features, draws on the best available research evidence. Spring Math has been under development for over a decade. All assessments have been examined to ensure they meet the highest standards for assessment construction, reliability, validity, and decision accuracy. Many studies have been conducted and published in peer-reviewed journals in psychology and education. View a list of references to these studies.

Skill sequences used in assessment and intervention were developed in coordination with content-area experts as part of a wide-scale district trial of Response to Intervention implementation (VanDerHeyden, Witt & Gilbertson, 2007). These skill sequences have been cross-validated against the Common Core State Standards with subtle adjustments to ensure fit. View the alignment of content in Spring Math and the Common Core State Standards.

The design of intervention features leverages the latest best practices and research evidence. These features include:

  • aligning intervention tactic and skill difficulty with student proficiency 
  • decision rules to determine the intervention tactics that produce maximal growth
  • sequence of intervention moving from prerequisite to goal skills
  • specific intervention tactics including modeling, guided practice, immediate versus delayed corrective feedback, verbal rehearsal strategies and “think aloud” problem solving, and scripted conceptual understanding tactics specific to each skill.

A randomized controlled pilot study of Spring Math conducted in the Boston public schools examined the effect of Spring Math on intervention skill and strategy alignment, intervention use, and mathematics learning. At-risk students in grades 1-5 (N = 39) were randomly assigned to use Spring Math or a teacher-selected intervention for four weeks. Brief skill assessments were administered each week following standard curriculum-based measurement procedures to track growth of each group. Growth was computed as answers correct per two minutes per week on the intervention skill and the generalization skill. Integrity was estimated by permanent product as days per week of intervention use. The study showed that intervention skill and strategy alignment were superior with Spring Math, with alignment approximating the base rate (or accuracy based upon chance alone) in the teacher-selected interventions. Statistically significant differences were detected on all outcome measures. Future studies will be conducted to measure student gains and to guide future adjustments of the tool.
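
As a concrete sketch of the growth metric only (illustrative code, not the study's analysis), weekly growth can be estimated from a series of weekly probe scores like this:

```python
# Minimal sketch, not the study's actual analysis code: estimate weekly growth
# as the average week-to-week change in answers correct per two-minute probe.

def weekly_growth(weekly_scores: list[float]) -> float:
    """`weekly_scores` holds one score (answers correct in two minutes) per week,
    in chronological order; returns the average change per week."""
    if len(weekly_scores) < 2:
        return 0.0
    changes = [later - earlier for earlier, later in zip(weekly_scores, weekly_scores[1:])]
    return sum(changes) / len(changes)

# Hypothetical four-week series on the intervention skill:
print(weekly_growth([8, 11, 15, 18]))  # ≈ 3.3 answers correct per two minutes, per week
```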

View a current list of intervention studies.


What research was used as the basis for developing the assessments?

Skill sequences used in assessment and intervention were developed in coordination with content-area experts as part of a wide-scale district trial of RtI implementation (VanDerHeyden, Witt & Gilbertson, 2007). Skill sequences have been cross-validated against the Common Core State Standards. View a report on alignment of content in Spring Math and the Common Core State Standards.

  • Assessments were built using the science of curriculum-based assessment. Measures were constructed to sample specific skills, yield reliable and valid scores, and to allow for brief, repeated assessment to model progress over time and in response to instructional changes.
  • Research has estimated score reliability of computation CBM probes for mathematics from r = .67 to above r = .90 (Foegen, Jiban, & Deno, 2007) for alternate form, test-retest, inter-scorer agreement, and internal consistency. Delayed alternate form reliability of the scores obtained on Spring Math measures used in one study was r = .85 (Burns et al., 2006). Criterion validity evidence has been somewhat variable in mathematics CBM with correlation coefficients typically falling in the moderate range (r = .3 to r = .6; Foegen et al., 2007). Scores obtained on the computation CBMs used in one study were found to correlate with Stanford Achievement Test (Harcourt, 1997) scores in the moderate range of r = .27 to r = .40 (VanDerHeyden & Burns, 2008).
  • For the kindergarten measures, research has found that alternate-form correlations ranged from r = .70 to r = .84 and that scores correlated with the math composite score on the Comprehensive Inventory of Basic Skills, Revised (CIBS-R; Brigance, 1999) at r = .44 to r = .61 (VanDerHeyden, Witt, Naquin, & Noell, 2001).

Usability testing has also been conducted for all the assessments generated within Spring Math to ensure equivalence across measures, meaning that any change in performance from one assessment occasion to the next is due to student growth, not measurement error.


What does the “weeks with scores” metric mean?

“Weeks with scores” represents the percentage of weeks during an intervention that a score was entered. For example, if a student has been in individual intervention for 10 weeks and scores were entered for 6 of those weeks, their weeks with scores percentage would be 60%. This metric is meant to bring attention to any intervention that has not had a score entered for at least 80% of the weeks.
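
As a concrete sketch (illustrative code, not Spring Math's implementation), the calculation behind the example above looks like this:

```python
# Hypothetical sketch of the "weeks with scores" metric described above:
# the percentage of intervention weeks with a score entered, flagged below 80%.

def weeks_with_scores(weeks_in_intervention: int, weeks_scored: int) -> tuple[float, bool]:
    """Return the percentage of weeks with scores and whether it meets the 80% target."""
    pct = 100.0 * weeks_scored / weeks_in_intervention
    return pct, pct >= 80.0

pct, consistent = weeks_with_scores(weeks_in_intervention=10, weeks_scored=6)
print(f"{pct:.0f}% of weeks have scores; flagged for attention: {not consistent}")
# -> 60% of weeks have scores; flagged for attention: True
```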

Research in intervention implementation suggests that consistency is the most significant factor in determining whether an intervention is successful. There are many reasons why score entry may be inconsistent. Sometimes the child is absent, or the school has special programming that interferes with intervention time. The point of the “weeks with scores” metric is not to comment on why score entry may be inconsistent; it is to call attention to an intervention that may have an issue. When a consistency issue arises, the coach or principal should work with the teacher to see how they can improve consistency. They should ask, “Have we as adults done everything we can to deliver this intervention consistently? Is there anything we can do better?”

Day-to-day life in schools and human nature can work against consistent implementation of interventions. Most people can relate to how common it is to start but not continue a diet, to fail to follow a budget, or to not finish an antibiotic prescription. Consistent implementation is the Achilles' heel of efforts to improve mathematics achievement. The “weeks with scores” metric helps ensure the best outcome for the student.


What are "tool skills"?

Tool skills provide the foundation a child needs to learn more advanced math skills and solve real-world math problems. Many are “life” skills that will serve students not only through the rest of their formal schooling but also beyond it. Spring Math materials help students master tool skills across all grades. Below is a list of all the Spring Math tool skills.

Combining Whole Numbers and Variables
Includes addition of all number quantities and collecting like terms.

Taking Whole Numbers and Variables
Includes subtraction of all number quantities and collecting like terms.

Multiplicative Reasoning
Includes working with factors, exponents, and whole numbers.

Proportional Reasoning
Includes fractions (placing on a number line, comparing quantities, conducting operations, and creating equivalent quantities), decimals, percentages, and division.

Quantity Comparison, Ordinal Position, and Place Value
Quantity comparison helps students predict correct answers when completing some math problems. When children can predict the general range for an answer, they are able to anticipate their own errors and solve problems with more confidence. Understanding of place value also helps students compare quantities and understand how and why standard algorithms work.

Solving for Unknowns
Addressed across all grades. In Kindergarten, students are asked to change existing dot quantities to specific dot quantities ranging from 1 to 10 by adding or removing dots. In grades 1-4, fact families are used to facilitate student proficiency in solving for an unknown. Applied problems include solving for remaining time and units of measurement so children are well-acquainted with the practical value of this skill. In the middle to upper grades, solving for unknowns is reflected in solving equations, creating equivalent equations, distributing and collecting terms, simplifying expressions, and solving systems of linear equations.

Creating Equivalent Quantities
This skill is supported with the use of manipulatives when appropriate, place value strategies, and number line demonstrations and activities. It is an example of a "math life skill" that effective problem solvers use. When adults make change, they commonly do “mental math”: they convert one quantity to a nearby round value and then adjust the answer by the remaining ones (e.g., to reduce a $21.00 item by $13.00, take $21.00 - $10.00 = $11.00, then $11.00 - $3.00 = $8.00). Being able to convert quantities allows for flexibility in problem solving, which in turn lets the problem solver turn a more challenging problem into an easier one.
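
To make the strategy concrete, here is a minimal sketch (illustrative only, not part of Spring Math) of the decomposition used in the making-change example above:

```python
# Illustrative only: the "subtract the tens, then the ones" decomposition from
# the making-change example above (21 - 13).

def subtract_by_decomposition(total: int, amount: int) -> int:
    """Subtract `amount` by removing its tens first, then its remaining ones."""
    tens, ones = (amount // 10) * 10, amount % 10
    after_tens = total - tens   # 21 - 10 = 11
    return after_tens - ones    # 11 - 3 = 8

print(subtract_by_decomposition(21, 13))  # 8
```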