Student growth percentile (SGP) data are an important indicator of student performance that allows teachers and school leaders to evaluate the effectiveness of their classroom practices. Educators who are successful in driving student achievement will typically see their students' SGP scores move up the scale over time, while educators who are not effective will see their students' SGP scores drop. Interpreting SGPs, however, requires careful consideration of the methodology used to generate them.
SGPs are calculated by comparing current assessment scores with previous test results for each student. Each student's growth is then compared with the growth of other students who have similar prior assessment scores (their academic peers). While the underlying calculations are complex, SGPs are designed to describe student growth in terms that are understandable to teachers and parents.
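The peer-comparison idea above can be illustrated with a toy sketch. This is not the SGP package's actual algorithm (which uses quantile regression over prior score histories); it simply ranks each student's current score among hypothetical "academic peers" who share the same prior score, using made-up data:

```python
# Illustrative sketch only: percentile rank of each student's current
# score among peers with the same prior score. The real SGP methodology
# uses quantile regression, not exact prior-score matching.
from collections import defaultdict

def simple_growth_percentiles(records):
    """records: iterable of (student_id, prior_score, current_score)."""
    peers = defaultdict(list)
    for _, prior, current in records:
        peers[prior].append(current)
    sgp = {}
    for student, prior, current in records:
        group = peers[prior]
        below = sum(1 for score in group if score < current)
        # Percentile rank of this student's growth within the peer group
        sgp[student] = round(100 * below / len(group))
    return sgp

students = [("a", 500, 540), ("b", 500, 520), ("c", 500, 560)]
result = simple_growth_percentiles(students)
```

Here students "a", "b", and "c" all start from the same prior score, so they are each other's peers, and their SGPs reflect only how their current scores rank within that group.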
Our research shows that SGPs derived from standardized test scores are estimated with error and are therefore noisy measures of the underlying latent achievement attributes. This error makes it difficult to interpret estimates of true SGPs at the individual level. Moreover, the relationships observed between true SGPs and student characteristics may be driven by unobserved student-level factors correlated with both.
A key finding of our study is that, when analyzed at the teacher level, estimated SGPs are susceptible to high levels of variance due to individual-level relationship effects and teacher sorting. These problems are mitigated to some extent by using a value-added model that regresses student test scores on teacher fixed effects and prior achievement covariates. Such a model also removes the correlation between true SGPs and student background variables, allowing for a more transparent interpretation of aggregated SGPs.
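A minimal sketch of such a value-added regression, assuming entirely made-up scores and teacher labels: current scores are regressed on prior scores plus teacher dummy variables (the fixed effects), solved by ordinary least squares.

```python
# Hedged sketch of a simple value-added model: regress current scores on
# prior scores and teacher fixed effects (dummy variables). All data
# here are fabricated for illustration.
import numpy as np

def value_added_effects(prior, current, teachers):
    teachers = list(teachers)
    labels = sorted(set(teachers))
    n = len(prior)
    # Design matrix: intercept, prior score, teacher dummies. The first
    # teacher is the reference category to avoid collinearity.
    X = np.column_stack(
        [np.ones(n), np.asarray(prior, dtype=float)]
        + [[1.0 if t == lab else 0.0 for t in teachers] for lab in labels[1:]]
    )
    beta, *_ = np.linalg.lstsq(X, np.asarray(current, dtype=float), rcond=None)
    # Estimated effects relative to the reference teacher
    return dict(zip(labels[1:], beta[2:]))

effects = value_added_effects(
    prior=[400, 450, 500, 400, 450, 500],
    current=[200, 225, 250, 210, 235, 260],
    teachers=["A", "A", "A", "B", "B", "B"],
)
```

In this toy data set teacher "B" adds a constant amount to every student's score after conditioning on prior achievement, which is exactly the kind of effect the teacher dummies pick up.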
Educators whose students consistently perform above the 50th percentile on their subject-area SGPs will likely be recognized for their accomplishments. Conversely, those whose students consistently fall below the 50th percentile on their median SGP (mSGP) may need to review their practice to determine whether it is producing the desired results.
The sgpData exemplar data set models the format of the data that can be used with the lower level studentGrowthPercentiles and studentGrowthProjections functions. It contains five years of annual, vertically scaled assessments in WIDE format. The sgpData set is available on the DESE website and provides an example of what is needed to run these analyses.
We recommend that schools and districts use this exemplar when setting up their own data sets for these analyses. The lower level SGP functions work with WIDE data, whereas the higher level functions that wrap them require LONG data; in general, we advise running any analysis with the SGP functions on a LONG formatted data set. Data preparation issues that cause errors in SGP calculations are often symptomatic of more significant problems in the analysis process, which is why we focus our research and training efforts on them. Our goal is to reduce the amount of data preparation required for this work and to improve the quality of the analyses performed by schools and teachers.
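The WIDE-to-LONG distinction above can be sketched as follows. WIDE data has one row per student with one score column per year; LONG data has one row per student-year. The column names and records here are hypothetical, not the sgpData schema:

```python
# Sketch of reshaping WIDE records (one row per student, one column per
# year) into LONG format (one row per student-year). Column names and
# data are hypothetical examples, not the actual sgpData layout.
def wide_to_long(rows, id_col, year_cols):
    long_rows = []
    for row in rows:
        for year in year_cols:
            if row.get(year) is not None:  # skip years with no score
                long_rows.append(
                    {"ID": row[id_col], "YEAR": year, "SCALE_SCORE": row[year]}
                )
    return long_rows

wide = [
    {"ID": "s1", "2021": 480, "2022": 510, "2023": None},
    {"ID": "s2", "2021": 455, "2022": 470, "2023": 495},
]
long_data = wide_to_long(wide, "ID", ["2021", "2022", "2023"])
```

One practical advantage of LONG format, visible even in this sketch, is that missing years simply produce no row, rather than an empty cell that every downstream step must handle.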