The student growth percentile (SGP) is a measure of student performance that ranks a student's growth relative to that of academically similar peers. SGPs are often used to evaluate teachers, and can be used to predict future performance in specific subjects such as mathematics or English language arts. They are reported as percentile ranks ranging from 1 to 99, with higher values indicating greater growth.
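The idea of ranking growth within a peer group can be sketched in a few lines. This is a minimal illustration with made-up data, not the actual SGP methodology (which uses quantile regression on prior score histories); here "academically similar peers" is simplified to students sharing the same prior score, and each student's current score is ranked within that group and mapped to the 1-99 reporting range.

```python
# Hypothetical sketch of an SGP-style percentile rank: group students by
# prior score, rank current scores within the group, scale to 1-99.
import pandas as pd

df = pd.DataFrame({
    "student": ["a", "b", "c", "d", "e", "f"],
    "prior":   [300, 300, 300, 350, 350, 350],   # prior-year scale scores
    "current": [310, 325, 305, 360, 340, 370],   # current-year scale scores
})

def percentile_rank(s):
    # rank within the peer group, scaled and clipped to the 1-99 SGP range
    pct = s.rank(pct=True) * 100
    return pct.clip(1, 99).round().astype(int)

df["sgp_like"] = df.groupby("prior")["current"].transform(percentile_rank)
print(df)
```

A real implementation conditions on the full history of prior scores rather than a single exact-match prior, but the within-peer-group ranking logic is the same.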
Unlike scale scores, SGPs are not on a linear scale and can fluctuate substantially from year to year. They are, however, highly correlated across subject areas and grade levels. These patterns suggest that true SGPs reflect latent achievement attributes, and that the correlations depend on covariates such as student background characteristics.
We examine the distributional properties of these relationships by estimating SGPs for students with known background variables. We use the WIDE and LONG format data sets provided with the SGPdata package, which are designed to represent longitudinal (time-dependent) student assessment data. In the WIDE format, each case/row represents one student; in the LONG format, the time-dependent data for a given student are spread across multiple rows.
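The WIDE-to-LONG distinction can be illustrated with a small reshape. The table below is made up for illustration and is not one of the SGPdata sets (those are distributed as R data frames); the column names are hypothetical, but the layout change is the one described above: one row per student becomes one row per student per year.

```python
# Illustrative WIDE -> LONG reshape with pandas.melt (hypothetical columns).
import pandas as pd

wide = pd.DataFrame({
    "ID": [1001, 1002],
    "SCALE_SCORE_2021": [480, 455],
    "SCALE_SCORE_2022": [510, 470],
})

# melt turns the per-year score columns into (YEAR, SCALE_SCORE) rows
long = wide.melt(id_vars="ID", var_name="YEAR", value_name="SCALE_SCORE")
long["YEAR"] = long["YEAR"].str[-4:].astype(int)   # extract year suffix
print(long.sort_values(["ID", "YEAR"]).to_string(index=False))
```

The LONG layout is generally more convenient for longitudinal analyses, since adding another year adds rows rather than columns.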
Our results indicate that a student's true SGP is positively correlated with their own prior test scores and negatively correlated with their teacher's average prior test score. These patterns persist even when the prior test scores are regressed on teacher fixed effects and student background variables. This suggests that the interpretability and transparency benefits of SGPs must be weighed against the cost of admitting this source of bias into SGP estimates.
In practice, the use of SGPs is motivated by the desire to evaluate both individual student growth and educator effectiveness more fairly and more informatively than by examining unadjusted performance levels alone. SGPs can also be aggregated to the teacher or school level, where they are treated as an important indicator of educator quality. These benefits must be weighed against a source of bias in aggregated SGPs that is easy to avoid in a value-added model, which regresses student test scores on teacher fixed effects and prior test scores while controlling for student background variables.
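The value-added comparison model mentioned above can be sketched with synthetic data. This is not the paper's specification, only a minimal version of the idea: current scores are regressed on prior scores plus one dummy per teacher, so each teacher's fixed effect is estimated net of the prior-achievement mix of their students (the adjustment that aggregated SGPs lack). Background covariates would enter as additional columns.

```python
# Hedged sketch of a value-added regression: current score on prior score
# and teacher fixed effects, via OLS on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n, n_teachers = 300, 3
teacher = rng.integers(0, n_teachers, n)          # teacher assignment
prior = rng.normal(500, 40, n)                    # prior-year scores
effect = np.array([-5.0, 0.0, 5.0])               # true teacher effects
current = 50 + 0.9 * prior + effect[teacher] + rng.normal(0, 10, n)

# Design matrix: prior score plus one dummy per teacher (no global
# intercept, so each dummy absorbs that teacher's intercept).
X = np.column_stack(
    [prior] + [(teacher == t).astype(float) for t in range(n_teachers)]
)
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
print("slope on prior score:", beta[0])
print("teacher intercepts:", beta[1:])
```

Differences between the estimated teacher intercepts recover the simulated teacher effects, because the prior-score term soaks up differences in incoming achievement across classrooms.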
To assess the accuracy of SGP estimators, we condition on different amounts of data and find that estimators conditioned on more information produce more accurate results than those conditioned on less. This is consistent with the assumption that SGPs are a function of the current and prior test scores, and with the expected behavior of the error term associated with a random variable. We can see this relationship in Figure 1, which shows the RMSE of conditional mean estimators of e_{4,2,i} conditioned on various combinations of the current and prior test scores, with varying reliability l.
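The qualitative pattern in Figure 1 can be reproduced with a toy simulation. This is a hypothetical setup, not the paper's: a latent trait with unit variance is observed through k noisy scores whose reliability is l, and the conditional (shrinkage) mean of the latent trait is computed under normal theory. The RMSE falls both as more scores are conditioned on and as reliability rises.

```python
# Toy simulation: RMSE of a normal-theory conditional-mean estimator of a
# latent trait, as a function of score reliability l and the number of
# conditioning scores k. (Illustrative assumptions, not the paper's model.)
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def rmse(l, k):
    theta = rng.normal(0, 1, n)                  # latent trait, variance 1
    noise_sd = np.sqrt((1 - l) / l)              # so var(theta)/var(score) = l
    scores = theta[:, None] + rng.normal(0, noise_sd, (n, k))
    xbar = scores.mean(axis=1)
    shrink = 1.0 / (1.0 + noise_sd**2 / k)       # E[theta | k scores] weight
    est = shrink * xbar
    return np.sqrt(np.mean((est - theta) ** 2))

for l in (0.7, 0.9):
    print(f"l={l}: k=1 -> {rmse(l, 1):.3f}, k=3 -> {rmse(l, 3):.3f}")
```

Under these assumptions the RMSE has a closed form, sqrt(noise_sd^2 / (noise_sd^2 + k)), which the simulated values track closely.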