Research Report
2024-01
Has the Predictive Validity of High
School GPA and ACT Scores on
Postsecondary Enrollment Changed
Over Time?
XIUHAN CHEN, M.ED., AND EDGAR I. SANCHEZ, PH.D.
Conclusions
Accounting for student characteristics and school effects, using both high school GPA (HSGPA)
and ACT Composite score was more predictive of postsecondary enrollment than using either
predictor alone. When used together, the predictive power of HSGPA increased from 2010 to
2021, while the predictive power of the ACT Composite score decreased during this period.
HSGPA had greater predictive power than did the ACT Composite score after 2014. We believe
the adoption of test-optional admissions policies before and especially after the COVID-19
pandemic helps explain the increase in predictive power for HSGPA.
So What?
Given the extensive evidence that HSGPA inflation has increased dramatically in recent years,
the predictive power of HSGPA alone is threatened. This can result in underprepared students
entering postsecondary education and experiencing less favorable outcomes such as higher
drop-out rates and higher student loan debt. Combining ACT scores with HSGPA helps provide
a more comprehensive picture of which students are prepared to enroll in postsecondary
education. This can help postsecondary institutions more accurately predict which students may
continue their education after high school.
Now What?
Given the potential threat to the predictive power of HSGPA posed by grade inflation, using ACT
scores can help offset the uncertainty of the interpretation of HSGPA and help differentiate
between students with the same HSGPA (e.g., students who all have a 4.0 HSGPA). HSGPA
and ACT scores measure different aspects of student achievement; therefore, they provide
complementary information. We recommend using both HSGPA and ACT scores to predict
postsecondary enrollment because using both yields the strongest predictive power.
About the Authors
Xiuhan Chen, M.Ed.
Xiuhan Chen is a Ph.D. student at the
University of Missouri studying Statistics,
Measurement, and Evaluation in Education.
Her research focuses on the combination of
causality and psychometrics. Throughout her
academic journey, Ms. Chen has aimed to
deepen the understanding of the causal
relationship between human behaviors and
personality traits.
Edgar I. Sanchez, Ph.D.
Edgar I. Sanchez is a lead research scientist
at ACT, where he studies postsecondary
admissions, national testing programs, test
preparation efficacy, and intervention
effectiveness. Throughout his career, Dr.
Sanchez has focused his research on the
transition between high school and college
and supporting the decision-making capacity
of college administrators, students, and their
families. His research has been widely cited
in academic literature and by the media,
including The Wall Street Journal, The
Washington Post, USA Today, and the
education trade press.
Acknowledgements
The authors wish to thank Joann Moore, Jill McVey, Jeff
Allen, and Jeffery Steedle for their comments on earlier
drafts of this report.
Introduction
Grades are among the most important indicators of students’ academic success, skill, and
knowledge. Not only do grades influence academic awards, academic intervention, and
advanced course placement (Feldman, 2018), they also affect athletic or extracurricular
eligibility, employment, driving permits, car insurance rates, college admission, scholarships,
and financial aid (Griffin & Townsley, 2021). In American high schools, high school grade point
average (HSGPA) is one quantitative measure of grades (ACT, 2005). HSGPA is often
compared with standardized test scores (such as ACT® test scores). Both ACT scores and
HSGPA are common measures of high school students’ academic achievement. These
measures are typically used to predict students’ academic performance or future success, such
as first-semester college GPA, first-year college GPA, and college completion. However,
HSGPA may not always be an accurate measure of a student’s true ability (ACT, 2005). The
presence of grade inflation raises questions about the extent to which we should rely on grades
to measure academic achievement or predict future success.
Grade inflation refers to an increase in grades without a concurrent increase in students’ true
ability or other objective measures of academic performance (Bejar & Blew, 1981; Camara et
al., 2004; Gershenson, 2018; Godfrey, 2011). In other words, students’ grades may not reflect
their true level of content mastery (Chowdhury, 2018). The evidence of grade inflation has been
well documented, with research consistently demonstrating that HSGPA has steadily increased
over the past several decades, while standardized assessment scores have remained
unchanged or declined (ACT, 2005; Bejar & Blew, 1981; Camara et al., 2004; Gershenson,
2020; Godfrey, 2011; Ziomek & Svec, 1995). For example, Ziomek and Svec (1995) chose a
sample of 530,000 students from 5,136 public schools between the 1989–1990 and 1993–1994 school years to demonstrate that there was a steady increase in HSGPA while ACT
scores remained constant. A more recent study found that from 2009 to 2019 alone, HSGPA, as
reported on a 0.0 to 4.0 scale, increased from 3.00 to 3.11, while National Assessment of
Educational Progress (NAEP) scores remained constant (U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics [NCES], National Assessment of Educational Progress [NAEP], 2022a). A second recent study found that from 2010 to 2021, HSGPA among ACT-tested students increased from 3.17 to 3.36 (Sanchez & Moore, 2022).
Detecting grade inflation is challenging, as it requires both grades across time and a stable
measure against which to compare them (ACT, 2005; Bejar & Blew, 1981). Finefter-Rosenbluh
and Levinson (2015) summarized three approaches to assessing grade inflation. First, grade
inflation can be understood by comparing high school grades to standardized test scores using
longitudinal data. In the past, HSGPA has been compared to scores from several standardized
tests, such as ACT scores (ACT, 2005; Bellott, 1981; Woodruff & Ziomek, 2004; Zhang &
Sanchez, 2013), SAT scores (Bejar & Blew, 1981; Godfrey, 2011), NAEP data (U.S.
Department of Education, NCES, NAEP, 2020), and end-of-course exams (Gershenson, 2018). Second, grade inflation can be understood as grade compression. For example, Hurwitz and Lee (2018) used nationally representative data to estimate grade inflation among all high school students who took the SAT. They found that the percentage of students with A averages
increased from 38.9% to 47.0% between 1998 and 2016. Meanwhile, the percentage of students with B averages decreased by 4.2 percentage points, and the percentage of students with C averages decreased by 3.8 percentage points. Similarly, Sanchez and Moore (2022) found a decrease in the percentage of students reporting a B average HSGPA from 2010 to 2021 (46.8% and 36.2%, respectively) and an increase in the percentage of students reporting an A average HSGPA over the same period (40.3% and 54.9%, respectively).¹ Lastly, grade inflation can be detected by comparing grades among high schools.
Nord (2011) reported that grade inflation was concentrated in wealthier and white-majority high
schools. Similarly, Hurwitz and Lee (2018) observed greater grade inflation among white, Asian,
wealthy, and private school students than among students from low-income families and
students in public schools.
There are several possible reasons grade inflation occurs in high schools. Some researchers
argue that one reason is that high school students are taking advanced coursework that may
include “bonus” points that boost their HSGPA (Camara et al., 2004; Hurwitz & Lee, 2018).
Subjectivity in grade assignment among teachers also plays a role within and between schools.
For example, some teachers assign grades more strictly, while others are more lenient.
Gershenson (2018) also argued that teachers may be motivated to provide positive evaluations
of student performance not only to satisfy students and parents but also to enhance the
reputation of their schools or classrooms.
In recent years, grade inflation has become more pronounced, especially after the COVID-19
pandemic. Because of the COVID-19 outbreak, most American schools transitioned to remote
and distance learning in spring 2020 (U.S. Department of Education, NCES, NAEP, 2022b). In
response, some school districts moved away from the traditional A–F grading policy to more
flexible policies implemented by districts, schools, or classroom teachers (Arundel, 2020; Cano,
2020; Sawchuk, 2020). Grading policies included pass/fail grades, credit/no credit grades, and
“do no harm” grading policies (Castro et al., 2020). These changes in mode of instruction and
assignment of grades raise concerns about how grades were assigned during this time, and the
reliability and accuracy of grades from this period deserve careful consideration. Moreover,
grades given during the pandemic may be more a function of district or school policies than
accurate reflections of differences in academic achievement across students (Sanchez &
Moore, 2022).
With access to additional electronic devices while learning at home during the pandemic, some
students may have used multiple devices to check or change answers while completing
assignments or exams (Schramm et al., 2021). Consequently, these students may have
received higher grades on exams than they would have otherwise. According to Gonzalez et al.
(2020), higher grades during remote schooling may be correlated with cheating on online
exams. Moreover, students from low-income families may lack basic technology access, such
as access to high-speed internet and an adequate number of digital devices, thereby
perpetuating social inequities (Herold, 2020). This may have resulted in students from higher-
income families receiving artificially inflated grades because of their greater access to
technology that allows them to check answers.
These circumstances have increased the uncertainty in evaluating students’ achievement during
the COVID-19 pandemic. An early case study has revealed some evidence that high school
grades improved slightly during COVID-19 (Schramm et al., 2021), and additional research has
found that grade inflation became more apparent in 2020 and 2021 (Sanchez & Moore, 2022).
Grade inflation is potentially problematic for students, postsecondary institutions and employers,
and society at large (Finefter-Rosenbluh & Levinson, 2015; Silva et al., 2023). For students,
inflated grades can create a false sense of proficiency, leading students to believe that they
have mastered core content skills (Chowdhury, 2018; Finefter-Rosenbluh & Levinson, 2015). As
a result, students may study less, and parents may not understand that their children may need
help to catch up (Gershenson, 2018). Grade inflation may prevent students from reaching their
full potential (Gershenson, 2018) and subsequently may reduce educational opportunities. For
example, some students may take less rigorous courses and subjects in which they can receive
higher grades in order to boost HSGPA (Chowdhury, 2018).
Additionally, grade inflation increases the rate at which underprepared high school students
enter college, increasing the risk that these students will later drop out of college (Gershenson,
2018). Inflated grades may also influence postsecondary institutions and employment decisions
(Finefter-Rosenbluh & Levinson, 2015). Grade inflation may provide misleading information to
colleges and employers who use grades as indices of ability and content mastery (Gershenson,
2018).
Grade inflation diminishes the ability of postsecondary institutions to distinguish among
prospective students (Finefter-Rosenbluh & Levinson, 2015; Godfrey, 2011; Silva et al., 2023).
Similarly, employers who use inflated grades as a criterion in making hiring decisions may not
be able to make meaningful distinctions among job applicants; likewise, they may not be able to
tell whether an applicant has truly acquired the skills or knowledge required for the job (Finefter-
Rosenbluh & Levinson, 2015).
Finally, grade inflation poses societal problems because it may increase social disparities and
inequities (Chowdhury, 2018; Finefter-Rosenbluh & Levinson, 2015). Students from families
with high socioeconomic status and those attending private or affluent high schools are more
likely to receive inflated grades (Gershenson, 2018; Nata et al., 2014; Neves et al., 2017). More
recent research has found that students at schools with higher percentages of traditionally
underserved students experienced higher rates of grade inflation and students at schools with
higher percentages of students eligible for free or reduced-price lunch experienced higher grade
inflation (Sanchez & Moore, 2022). These two studies examined affluence slightly differently
and found somewhat divergent results. That said, when privileged students receive inflated
grades, they may also receive additional unearned advantages in college and graduate
admissions, further entrenching their elite status. Although these students are not directly
responsible for this situation, this unearned advantage reinforces inequalities (Finefter-
Rosenbluh & Levinson, 2015).
Grade inflation also weakens the predictive validity of HSGPA (Zhang & Sanchez, 2013). The
ceiling effect of grades (when students score at or near the maximum for the HSGPA
distribution) makes it difficult to compare the academic performances of different students,
thereby reducing the practical range of HSGPA. As more students approach the “ceiling,” less
variability in grades is observed, which leads to less predictive power (Chan et al., 2007;
Hurwitz & Lee, 2018).
Much research has focused on the predictive power of HSGPA and standardized admissions
test scores (e.g., ACT and SAT). Although the findings consistently demonstrate that HSGPA
and test scores are highly correlated (Zhang & Sanchez, 2013), conclusions diverge when it
comes to the predictive power of HSGPA and test scores. Some research indicates that HSGPA
is a better predictor of college completion, college cumulative GPA, and first-year grades than
college admissions tests (Allensworth & Clark, 2020; Bowen et al., 2009; Galla et al., 2019;
Geiser & Santelices, 2007; Zwick, 2006). For example, a study that employed a sample of
80,000 first-year students in the University of California system found that HSGPA is
consistently the strongest predictor of four-year college outcomes, such as cumulative college
grades and graduation (Geiser & Santelices, 2007). A 1999 study of nearly 150,000 first-year
students illustrated that HSGPA was more predictive of college completion than admissions
tests (Bowen et al., 2009). More recently, Galla et al. (2019) replicated the 1999 predictive
validity study findings using a national sample of 47,303 students who applied to college in the
2009–2010 academic year. This was later confirmed by Allensworth and Clark (2020).
Other studies have found that test scores are more predictive than HSGPA. For example,
Ramist et al. (1994) examined data from 45 colleges to demonstrate that SAT scores were more
predictive of individual college course grades than HSGPA. More recently, Gershenson (2018)
used a sample of nearly all high school students in North Carolina from 2005 to 2016 to show
that Algebra 1 end-of-course exam scores predict ACT math scores much better than do high
school course grades.
HSGPA and ACT scores have different predictive utility depending on the postsecondary
outcome examined. For example, HSGPA was slightly more accurate than ACT scores in
predicting whether students earn a 2.00 or higher first-year college GPA (FYGPA; ACT, 2022;
Noble & Sawyer, 2002). Yet the ACT Composite score and HSGPA were equally accurate in
predicting whether students earn a 3.00 or higher FYGPA (ACT, 2022). Likewise, HSGPA is better than the ACT Composite score at predicting minimal success (retention through the first year and attaining a 2.0 or higher FYGPA), while the ACT Composite score is better than HSGPA at predicting high-level success (retention through the first year and attaining a 3.5 or higher FYGPA) and very high-level success (retention through the first year and attaining a 3.7 or higher FYGPA). In addition, ACT Composite scores had incremental validity beyond HSGPA alone (Sawyer, 2010).
However, it is undeniable that together, HSGPA and test scores are more predictive of future
success in college than either predictor alone (ACT, 1997; ACT, 2022; Bridgeman et al., 2008;
Camara & Echternacht, 2000; Kobrin et al., 2008; Mattern & Patterson, 2011; Noble & Sawyer,
2002; Sawyer, 2010; Westrick et al., 2015; Westrick et al., 2019; Willingham et al., 1990; Zwick,
2006). Moreover, HSGPA and test scores demonstrate strong contributions in predicting postsecondary success across all race/ethnicity and gender groups (Bridgeman et al., 2008).
Although the predictive validity of HSGPA and test scores on postsecondary outcomes after
enrollment has been extensively established, it is noteworthy that relatively little research has
been done on predicting college enrollment using HSGPA and test scores (ACT, 2010; ACT,
2022). Much research has indicated that college enrollment is positively correlated with high
school grades (Okpych & Courtney, 2017; Zhang & Sanchez, 2013). However, given the increase in grade inflation, the longitudinal predictive power of HSGPA alone, test scores alone, and HSGPA and test scores combined for predicting college enrollment deserves further exploration. This study extends Sanchez and Moore's (2022) study of the predictive validity of HSGPA and ACT scores, employing multilevel logistic regression to examine the predictive power of HSGPA and ACT test scores for postsecondary enrollment between 2010 and 2021, including students who tested during the pandemic.
The study addresses the following research questions:
1. Has the predictive power of HSGPA on postsecondary enrollment changed over time?
2. Has the predictive power of the ACT Composite score on postsecondary enrollment changed over time?
3. Has the predictive power of HSGPA and ACT Composite scores combined on postsecondary enrollment changed over time?
4. Does a model which incorporates only HSGPA, only ACT Composite scores, or both best predict postsecondary enrollment?
Methods
Analytical Sample
The present research incorporates student data from the 2010 to 2021 ACT-tested high school
graduating cohorts. Between 2010 and 2020, we included every other year due to
computational limitations. The 2021 cohort was included to explore the change in predictive
validity post-pandemic. This data includes the most recent ACT scores for students who took
the ACT more than once. The sample was constrained to public high schools that were matched
to 2- and 4-year college enrollment data from the National Student Clearinghouse.² Students in the sample either took the ACT on a national test day or through a state or district contract. Only
schools with at least 30 tested students each year were included to ensure the stability of
multilevel modeling estimates. As such, the individual year statistics may not match those of the
official graduating class. The study sample consisted of 16,323,815 students.
Measures
Student demographics. As part of the ACT registration process, students were asked to report
certain demographic information. In this study, we used the following student characteristics:
race/ethnicity, gender, and family income.
Course grade information. Students were also asked to provide information related to courses
they took in high school and their grades for each course. Self-reported grades in up to 23
courses in English, mathematics, social studies, and natural science were averaged to calculate
students’ cumulative HSGPA. Evidence has shown that students’ self-reported HSGPA is highly
correlated with students’ transcript GPA (Sanchez & Buddin, 2015). Additionally, self-reported
data has been shown to be a good substitute for transcript-reported grades for research
purposes (Camara et al., 2004; Kuncel et al., 2005; Shaw & Mattern, 2009).
Enrollment information. Enrollment data for the fall after high school graduation was retrieved
from the National Student Clearinghouse (NSC). ACT-tested students in the sample were
matched to NSC records.
Data Analysis
We started by describing students' characteristics and the number of schools. We then used hierarchical logistic regression with the adaptive Gauss-Hermite quadrature approximation method to estimate postsecondary enrollment for each year (see Appendix).³ In the multilevel model, Level 1 is students and Level 2 is schools; students were nested in schools. Enrollment was regressed on HSGPA, ACT Composite score, test type (i.e., national vs. state and district testing), race/ethnicity, gender, and family income.⁴
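As an illustration, a two-level model of this form could be fit in R with the lme4 package as sketched below. This is not the authors' code: the data set and variable names (students, enrolled, hsgpa_z, act_z, school_id, and so on) are hypothetical, and the number of quadrature points is an arbitrary choice.

```r
# Hypothetical sketch of the two-level logistic enrollment model.
# Level 1: students; Level 2: schools (random intercept per school).
library(lme4)

fit <- glmer(
  enrolled ~ hsgpa_z + act_z + test_type + race + gender + income +
    (1 | school_id),                  # school-level random intercept
  data   = students,
  family = binomial(link = "logit"),
  nAGQ   = 7                          # adaptive Gauss-Hermite quadrature points
)

summary(fit)     # coefficients on the log-odds scale
exp(fixef(fit))  # odds ratios
```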
The Level 2 intercepts were allowed to vary across schools. Nakagawa and Schielzeth (2013) proposed a simple method of obtaining pseudo-R² in multilevel logistic regression, which computes the variance of the predicted outcomes using their logit form. We calculated the marginal and conditional R² to signify goodness-of-fit of the models. A marginal R² is based on the fixed effects only, while a conditional R² is based on both the fixed and random effects (Huang, 2022; Nakagawa & Schielzeth, 2013).⁵ The conditional R² value represents the total model variance explained at both Level 1 and Level 2. The marginal R² represents the percentage of student-level variance explained by our models. HSGPA and ACT Composite score were standardized across all years. Missing data on family income were imputed five times using multiple imputation, conducted using the multivariate normal distribution in SAS Proc Impute. The multiple imputation model included parental income, race/ethnicity, gender, HSGPA, and ACT Composite score. One model was run per imputation in R, and the results were then pooled to give final estimates. The data analysis for this study was performed using R (R Core Team, 2022).
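The sketch below shows one way these quantities can be computed from a fitted lme4 model, following the logit-scale variance decomposition described above; it assumes the hypothetical `fit` object and `school_id` grouping from the earlier sketch.

```r
# Marginal and conditional pseudo-R² on the logit scale
# (Nakagawa & Schielzeth, 2013; Huang, 2022).
var_fixed  <- var(predict(fit, re.form = NA))    # variance of the fixed-effect
                                                 # linear predictor (logits)
var_school <- as.numeric(VarCorr(fit)$school_id) # random-intercept variance
var_resid  <- pi^2 / 3                           # level-1 logistic residual variance

r2_marginal    <- var_fixed / (var_fixed + var_school + var_resid)
r2_conditional <- (var_fixed + var_school) / (var_fixed + var_school + var_resid)
```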
Results
Descriptive Statistics
Table 1 summarizes the number of schools and student characteristics in the analyses. From
2010 to 2021, the number of schools ranged from 19,283 to 22,700. Across years, there were
slightly more female students than male students. Among these students, the percentage of Black students decreased from 13.5% to 10.7% between 2010 and 2021, and the percentage of White students decreased from 63.2% to 61.1%. The proportion of Asian students and Hispanic students increased from 2010 to 2020 but dropped slightly in 2021. From 2010 to 2021, the proportion of students from low-income families (defined as students from a family with an income less than $36,000 per year) decreased, while the proportion of students from high-income families (those coming from a family with an income greater than $100,000 per year) increased. College enrollment decreased from 72.7% in 2010 to 67.3% in 2021. Students' ACT Composite scores increased by 0.6 scale score points between 2010 and 2021, while students' HSGPA steadily increased, with an especially pronounced increase in recent years.
In the study sample, the percentages of students taking the ACT as part of the National testing
program (administered on Saturdays) decreased from 82.3% in 2010 to 62.4% in 2021, while
the percentage of students who participated in the State and District testing program (typically
during a school day) increased from 17.7% in 2010 to 37.6% in 2021. In 2016, about 30% of the
graduating class’s most recent record was from a State and District testing program. The State
and District testing program administers the ACT to nearly all students within a state or district.
The ACT National testing program is more likely to be used by college-bound students who plan
to use ACT scores as part of their college application package. The number of students who
use the National testing program has decreased; one possible contributor to this change (in
addition to the rise of State and District testing) is the growth of test-optional policies before and,
importantly, after the COVID-19 pandemic. On March 11, 2020, the World Health Organization
declared COVID-19 a global pandemic (WHO, 2020). In the United States, ACT canceled the
April, June, and July 2020 national test administrations and resumed limited testing in June
2020. As a result of the pandemic, many post-secondary institutions instituted test-optional
policies.
Table 1. Descriptive Statistics
Characteristic 2010 2012 2014 2016 2018 2020 2021
Number of students 1,353,918 1,424,173 1,522,907 1,644,802 1,367,430 1,014,662 748,592
Number of high schools 21,111 21,890 22,375 22,700 22,248 21,448 19,283
Testing program N (%)
National 82.3% 82.4% 77.3% 70.3% 73.1% 75.1% 62.4%
State and District 17.7% 17.6% 22.7% 29.7% 26.9% 24.9% 37.6%
Race/ethnicity N (%)
Asian 4.3% 4.2% 4.6% 4.8% 5.5% 5.8% 5.3%
Black 13.5% 13.3% 13.0% 12.8% 12.2% 11.6% 10.7%
Hispanic 9.9% 13.8% 15.1% 16.1% 16.2% 16.0% 12.8%
White 63.2% 59.7% 57.7% 55.7% 55.5% 56.4% 61.1%
Other 3.7% 4.4% 5.0% 5.3% 5.7% 5.9% 5.7%
Prefer not to respond 2.5% 3.6% 3.5% 3.6% 3.3% 3.2% 3.0%
Missing 2.9% 0.8% 1.1% 1.6% 1.7% 1.0% 1.4%
Student gender N (%)
Female 55.5% 55.3% 54.8% 53.9% 54.9% 56.4% 54.0%
Male 44.4% 44.6% 44.8% 44.9% 44.6% 43.2% 44.4%
Other/missing 0.1% 0.2% 0.4% 1.1% 0.4% 0.4% 1.6%
Enrolled in college N (%)
Yes 72.7% 71.6% 71.8% 67.8% 71.2% 68.8% 67.3%
No 27.3% 28.4% 28.2% 32.2% 28.8% 31.2% 32.7%
Family income N (%)
< $36K 27.5% 25.8% 25.1% 24.2% 22.1% 18.8% 15.9%
$36K–$60K 18.6% 17.2% 17.0% 16.9% 15.7% 13.8% 12.4%
$60K–$100K 19.7% 18.8% 18.7% 18.5% 18.1% 17.1% 16.7%
> $100K 14.2% 16.0% 18.1% 20.0% 23.9% 26.7% 30.0%
Missing 20.0% 22.2% 21.2% 20.4% 20.2% 23.6% 24.9%
ACT Composite mean (SD) 21.2 (5.20) 21.2 (5.23) 21.3 (5.35) 21.3 (5.48) 21.6 (5.65) 21.9 (5.83) 21.8 (5.97)
HSGPA mean (SD) 3.20 (0.64) 3.23 (0.63) 3.24 (0.63) 3.25 (0.63) 3.33 (0.61) 3.41 (0.58) 3.43 (0.59)
Q1: Has the Predictive Power of HSGPA on Postsecondary Enrollment
Changed Over Time?
Figure 1 shows the standardized coefficients of HSGPA on postsecondary enrollment from 2010
to 2021, controlling for test type, gender, race/ethnicity, and family income. From 2010 to 2016,
the predictive power of HSGPA changed little. The predictive power of HSGPA increased from 0.57 in 2016 to 0.63 in 2018, returned to previous levels in 2020, and then increased to its peak in 2021. Figure 2 presents the odds ratios and 95% confidence intervals of HSGPA on postsecondary enrollment from 2010 to 2021. It shows the same pattern, with the predictive power of HSGPA changing most dramatically in 2018 and 2021.
Figure 1. Standardized Coefficients of HSGPA from 2010 to 2021
Note: For 2020, one of the five imputed data sets failed to meet one of two convergence criteria.
In that imputation, the optimizer convergence criterion was met but the gradient convergence
criterion was not. We calculated the pooled coefficient with the remaining four imputations, and
the coefficient was the same within three decimal places. Since there was no practical
difference, we report the results from all five imputations.
Figure 2. Odds Ratios of HSGPA from 2010 to 2021
Note: For 2020, one of the five imputed data sets failed to meet one of two convergence
criteria. In that imputation, the optimizer convergence criterion was met but the gradient
convergence criterion was not. We calculated the pooled coefficient with the remaining four
imputations, and the coefficient was the same within three decimal places. Since there was no
practical difference, we report the results from all five imputations.
Q2: Has the Predictive Power of the ACT Composite Score on
Postsecondary Enrollment Changed Over Time?
Figure 3 shows the standardized coefficients of the ACT Composite score in predicting
postsecondary enrollment from 2010 to 2021, controlling for test type, gender, race/ethnicity,
and family income. The standardized coefficient decreased from 0.68 to 0.49 between 2010 and
2021. It reached its lowest coefficient in 2020 during the pandemic. Figure 4 shows the odds
ratios of the ACT Composite score in predicting postsecondary enrollment from 2010 to 2021. In
2010, the odds ratio was 1.97, indicating that with a one standard deviation increase in the ACT Composite score, the odds of enrollment increased by 97%.⁶ However, the odds ratio dropped to 1.64 in 2021. The decrease in standardized coefficients and odds ratios indicates
that the predictive power of the ACT Composite score on postsecondary enrollment has
changed over time.
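The odds ratios reported here are simply the exponentiated standardized coefficients; for example, the 2010 value follows from the ACT Composite coefficient in Table A3:

```r
# Odds ratio = exp(coefficient): the 2010 standardized ACT Composite
# coefficient (Table A3) is 0.677.
exp(0.677)  # = 1.968, i.e., about a 97% increase in the odds of enrollment
            # per one standard deviation increase in ACT Composite score
```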
Figure 3. Standardized Coefficients of the ACT Composite Score from 2010 to 2021
Figure 4. Odds Ratios of ACT Composite Score from 2010 to 2021
Q3: Has the Predictive Power of Using Both HSGPA and ACT
Composite Scores on Postsecondary Enrollment Changed Over
Time?
Figure 5 and Figure 6 show the standardized coefficients and odds ratios of HSGPA and ACT
Composite scores in predicting postsecondary enrollment from 2010 to 2021, while controlling
for test type, gender, race/ethnicity, and family income. From 2010 to 2021, the HSGPA
coefficient increased from 0.40 to 0.51, and the odds ratios of HSGPA increased from 1.49 to
During the same period, the coefficients of the ACT Composite score decreased from 0.43 to 0.24, and the odds ratios of the ACT Composite score decreased from 1.53 to 1.27.
Before 2014, the ACT Composite score was slightly more predictive of postsecondary
enrollment than HSGPA. After 2014, HSGPA was more predictive of postsecondary enrollment
than the ACT Composite score. The gap in predictive power between HSGPA and the ACT Composite score has widened since 2014 and became especially pronounced after 2018. This may be because of the growth of test-optional policies. As the ACT has decreased in importance during recent admissions cycles, it is logical that its predictive power would also decrease. Similarly, as HSGPA has grown in importance in recent admissions cycles, its predictive power should also increase.
Figure 5. Standardized Coefficients of HSGPA and ACT Composite Score from 2010 to 2021
Figure 6. Odds Ratios of HSGPA and ACT Composite Score from 2010 to 2021
Note: The odds ratios for HSGPA in 2014 and 2016 are the same when rounded to two decimal places.
In 2020, the predictive power of both HSGPA and the ACT Composite score in predicting postsecondary enrollment declined; however, the predictive power of the ACT Composite score declined more dramatically than did that of HSGPA. This decline may be related to the dramatic changes required during the COVID-19 pandemic. Overall, HSGPA now has more predictive power than the ACT Composite score.
Q4: Does a Model Which Incorporates Only HSGPA, Only ACT Composite, or Both Best Predict Postsecondary Enrollment?
Figure 7 shows the conditional R² from hierarchical logistic regression models of HSGPA alone, ACT Composite score alone, and HSGPA and ACT Composite score together predicting postsecondary enrollment from 2010 to 2021 while controlling for test type, gender, race/ethnicity, and family income. The conditional R² for the three models steadily increased from 2010 to 2018, then dropped in 2020 and 2021. The model with both predictors had the largest R² value in all years examined. In other words, while HSGPA alone was more predictive than ACT Composite score alone in predicting postsecondary enrollment, using both measures of academic achievement together was more effective at predicting postsecondary enrollment. This finding is consistent with previous research. The conditional R² for the ACT Composite-alone model and the HSGPA-alone model were similar between 2010 and 2016. In years where they differed, the difference in conditional R² was very small, no more than 0.02.
Figure 7. Conditional R² of HSGPA and ACT Composite Score from 2010 to 2021
Figure 8 shows the marginal R² from hierarchical logistic regression models of HSGPA alone, ACT Composite score alone, and HSGPA and ACT Composite score together predicting postsecondary enrollment from 2010 to 2021 while controlling for test type, gender, race/ethnicity, and family income. As found in the conditional R² models, ACT Composite score combined with HSGPA was more predictive of postsecondary enrollment across the years examined. Here again, while the marginal R² for each model differed slightly, there may not be a practical difference in predictive validity.
Figure 8. Marginal R² of HSGPA and ACT Composite Score from 2010 to 2021
Discussion
This study provided evidence of changes in the predictive validity of HSGPA and ACT
Composite score before and during the first year of the pandemic. This study found little
practical difference in the predictive validity of the three models examined (HSGPA, ACT
Composite, and both HSGPA and ACT Composite). It is possible that we found these results
because we included student-level covariates that had similar explanatory power across
models. In this study, we used data from 2010 to 2021 to explore the predictive power of
HSGPA and ACT Composite score on postsecondary enrollment using hierarchical logistic
regression.
We first explored the change in the predictive power of HSGPA from 2010 to 2021. From the
results of the hierarchical logistic regression, we saw evidence of the predictive power of
HSGPA on postsecondary enrollment changing over time. From 2010 to 2021, the predictive power of HSGPA steadily increased, reaching its highest value in 2021. We then explored the change in the predictive power of the ACT Composite score from 2010 to 2021. In contrast to HSGPA, the predictive power of the ACT Composite score decreased over time, reaching its lowest value in 2021.
In this analysis, we also compared the predictive power of HSGPA alone, ACT Composite score
alone, and HSGPA and ACT Composite score together on postsecondary enrollment from 2010
to 2021. We found evidence that HSGPA currently has more predictive power for postsecondary
enrollment than the ACT Composite score after accounting for students’ characteristics and
school effects. In the years examined, HSGPA was not always the stronger predictor when
compared to the ACT Composite score in predicting postsecondary enrollment. Before 2014,
the ACT Composite score had stronger predictive power than HSGPA. After 2014, HSGPA had
more predictive power than the ACT Composite score, and the gap in predictive power between
HSGPA and the ACT Composite score in predicting postsecondary enrollment has increased
since 2014.
There are several potential reasons for this trend. First, test-optional policies meant that students were no longer required to submit ACT scores as part of their applications to postsecondary institutions. In a test-optional application, HSGPA became the primary, and at times only, quantitative measure of student achievement. Based on data from the Integrated Postsecondary
Education Data System (U.S. Department of Education, National Center for Education
Statistics, Integrated Postsecondary Education Data System, 2023), the number of
postsecondary institutions requiring ACT or SAT test scores steadily decreased from 2010
(1,183) through 2019 (985). There were more dramatic decreases in the number of
postsecondary institutions requiring test scores in 2020 (562) and 2021 (163). Therefore,
HSGPA received more attention than test scores.
Second, COVID-19 sped up the adoption of test-optional policies. During COVID-19, most high
schools transitioned to remote or distance learning. Most postsecondary institutions instituted
test-optional policies because of students’ limited access to ACT and SAT testing. Moreover,
there may have been changes in the ways in which HSGPA and test scores were interpreted
and used for decision making in the admissions process. In addition, students’ perceptions of
the value of a postsecondary education may be changing over time. Both of the latter points
deserve further study.
As the role of test scores in admissions has changed in recent years, the importance and high-
stakes nature of the use of HSGPA for admissions highlight the need for valid and reliable
grades. As the importance of HSGPA increases in postsecondary enrollment, there is also
evidence that grade inflation has been increasing at a significant rate in recent years. This
raises concerns about the way HSGPA is understood and used by postsecondary institutions.
We must make sure HSGPA remains a valid indicator in the absence of test scores. This is of
particular concern if underprepared students are entering postsecondary education and
experiencing less favorable outcomes.
As prior studies have demonstrated, test scores have greater predictive validity for higher levels of success in college (e.g., successively higher levels of first-year college GPA), but both measures of student achievement have uses depending upon their application.
models examined were found to have similar predictive power on enrollment in any
postsecondary institution. We must bear in mind, however, that these models predict enrollment
at both 2- and 4-year institutions. This criterion of continuing education after high school is
useful for guiding high school students but should be coupled with what we know about the
predictive validity of both measures for other postsecondary outcomes such as college GPA,
retention, and degree completion.
This study clearly demonstrated that using HSGPA and ACT Composite score combined is
more effective for predicting enrollment than using HSGPA alone, as may be the case in test-optional admissions policies. HSGPA and ACT scores show different aspects of students'
achievement and therefore provide complementary information. Being able to use both HSGPA
and a standardized test score such as an ACT score provides a better understanding of student
achievement and content mastery. For those reasons, we recommend the use of both HSGPA
and ACT Composite score in predicting postsecondary enrollment.
Limitations
As noted, this study was concerned with predicting students’ enrollment in either a 2- or 4-year
institution. Most 2-year institutions have open-admissions policies, which means that ACT
scores might not be required or used in the admissions process. This study found similar
predictive validity for HSGPA and ACT Composite score when used alone and together. If we
were to change the scope of the question to include only 4-year institutions, the results would
likely change. To test this assertion, we reran the three models to predict 4-year enrollment in
2021. In that analysis, we found that the use of both measures simultaneously accounted for a
greater percentage of variance in enrollment (about 25%) than did using either HSGPA or ACT
Composite alone (21% and 22%, respectively). This suggests that when we focus on enrollment
in 4-year institutions, where admissions decisions are typically more selective than in 2-year
institutions, using both measures of achievement is a better option, consistent with the overall
recommendations of this study.
References
ACT. (1997). Prediction research services tables.
ACT. (2005). Are high school grades inflated? Issues in College Readiness [Research report].
ACT. (2010). Mind the gaps: How college readiness narrows achievement gaps in college
success. https://www.act.org/content/dam/act/unsecured/documents/MindTheGaps.pdf
ACT. (2022). The ACT technical manual.
https://www.act.org/content/dam/act/unsecured/documents/ACT_Technical_Manual.pdf
Allensworth, E. M., & Clark, K. (2020). High school GPAs and ACT scores as predictors of
college completion: Examining assumptions about consistency across high schools.
Educational Researcher, 49(3), 198–211.
Arundel, K. (2020, December). How educators are tweaking grading approaches in response to
the pandemic. K–12 Dive. Retrieved from https://www.k12dive.com/news/how-educators-are-tweaking-grading-approaches-in-response-to-the-pandemic/591729/
Bejar, I. I., & Blew, E. O. (1981). Grade inflation and the validity of the Scholastic Aptitude
Test. American Educational Research Journal, 18(2), 143–156.
Bellott, F. K. (1981). Relationships of declining test scores and grade inflation.
Bowen, W. G., Chingos, M. M., & McPherson, M. (2009). Crossing the finish line: Completing college at America's public universities. Princeton University Press.
Bridgeman, B., Pollack, J., & Burton, N. (2008). Predicting grades in different types of college
courses. ETS Research Report Series, 2008(1), i–27.
Camara, W. J., & Echternacht, G. (2000). The SAT® I and high school grades: Utility in predicting success in college (Research Note No. 10). College Entrance Examination Board, Office of Research and Development.
Camara, W., Kimmel, E., Scheuneman, J., & Sawtell, E. A. (2004). Whose grades are inflated?
Research Report No. 2003-4. College Entrance Examination Board.
Cano, R. (2020). How coronavirus has changed grading policies. Cal Matters.
https://calmatters.org/education/2020/05/how-coronavirus-has-changed-grading-policies/
Castro, M., Choi, L., Knudson, J., & O'Day, J. (2020). Grading policy in time of COVID-19:
Considerations and implications for equity. Policy and Practice Brief. California
Collaborative on District Reform.
Chan, W., Hao, L., & Suen, W. (2007). A signaling theory of grade inflation. International
Economic Review, 48(3), 1065–1090.
Chowdhury, F. (2018). Grade inflation: Causes, consequences and cure. Journal of Education
and Learning, 7(6), 86–92.
Feldman, J. (2018). Grading for equity: What it is, why it matters, and how it can transform
schools and classrooms. Corwin Press.
Finefter-Rosenbluh, I. & Levinson, M. (2015). What is wrong with grade inflation (if anything)?
Philosophical Inquiry in Education, 23(1), 3–21.
Galla, B. M., Shulman, E. P., Plummer, B. D., Gardner, M., Hutt, S. J., Goyer, J. P., D'Mello, S. K., Finn, A. S., & Duckworth, A. L. (2019). Why high school grades are better predictors of on-time college graduation than are admissions test scores: The roles of self-regulation and cognitive ability. American Educational Research Journal, 56(6), 2077–2115.
Geiser, S., & Santelices, M. V. (2007). Validity of high-school grades in predicting student
success beyond the freshman year: High-school record vs. standardized tests as
indicators of four-year college outcomes. Research & Occasional Paper Series: CSHE.
6.07. Center for Studies in Higher Education.
Gershenson, S. (2018). Grade inflation in high schools (2005–2016). Thomas B. Fordham
Institute.
Gershenson, S. (2020). Great expectations: The impact of rigorous grading practices on student
achievement. Thomas B. Fordham Institute.
Godfrey, K. E. (2011). Investigating grade inflation and non-equivalence. Research Report 2011-2. College Board.
Gonzalez, T., de la Rubia, M. A., Hincz, K. P., Comas-Lopez, M., Subirats, L., Fort, S., & Sacha,
G. M. (2020). Influence of COVID-19 confinement on students’ performance in higher
education. PLOS ONE, 15(10), Article e0239490.
Griffin, R., & Townsley, M. (2021). Points, points, and more points: High school grade inflation
and deflation when homework and employability scores are incorporated. Journal of
School Administration Research and Development, 6(1), 1–11.
Herold, B. (2020). The disparities in remote learning under coronavirus (in charts). Education
Week, 10.
Huang, F. L. (2022). Practical multilevel modeling using R. Sage Publications.
Hurwitz, M., & Lee, J. (2018). Grade inflation and the role of standardized testing. Measuring success: Testing, grades, and the future of college admissions, 64–93.
Kobrin, J. L., Patterson, B. F., Shaw, E. J., Mattern, K. D., & Barbuti, S. M. (2008). Validity of the SAT® for predicting first-year college grade point average. Research Report No. 2008-5. College Board.
Kuncel, N. R., Credé, M., & Thomas, L. L. (2005). The validity of self-reported grade point
averages, class ranks, and test scores: A meta-analysis and review of the
literature. Review of Educational Research, 75(1), 63–82.
Mattern, K. D., & Patterson, B. F. (2011). Validity of the SAT® for predicting fourth-year grades: 2006 SAT validity sample. Statistical Report 2011-7. College Board. http://files.eric.ed.gov/fulltext/ED563098.pdf
Nakagawa, S., & Schielzeth, H. (2013). A general and simple method for obtaining R² from generalized linear mixed-effects models. Methods in Ecology and Evolution, 4(2), 133–142.
Nata, G., Pereira, M. J., & Neves, T. (2014). Unfairness in access to higher education: A 11-year comparison of grade inflation by private and public secondary schools in Portugal. Higher Education, 68(6), 851–874.
Neves, T., Ferraz, H., & Nata, G. (2017). Social inequality in access to higher education: Grade inflation in private schools and the ineffectiveness of compensatory education. International Studies in Sociology of Education, 26(2), 190–210.
Noble, J., & Sawyer, R. (2002). Predicting different levels of academic success in college using
high school GPA and ACT Composite score. ACT Research Report Series.
Nord, C. (2011). America’s high school graduates: Results from the 2009 NAEP high school
transcript study. DIANE Publishing.
Okpych, N. J., & Courtney, M. E. (2017). Who goes to college? Social capital and other
predictors of college enrollment for foster-care youth. Journal of the Society for Social
Work and Research, 8(4), 563–593.
R Core Team (2022). R: A language and environment for statistical computing. R Foundation for
Statistical Computing. https://www.r-project.org/
Ramist, L., Lewis, C., & McCamley-Jenkins, L. (1994). Student group differences in predicting college grades: Sex, language, and ethnic groups. ETS Research Report Series, 1994(1), i–41.
Sanchez, E., & Buddin, R. (2015). How accurate are self-reported high school courses, course
grades, and grade point average? ACT Working Paper Series No. WP-201503.
Sanchez, E. I., & Moore, R. (2022). Grade inflation continues to grow in the past decade.
Research Report. ACT.
Sawchuk, S. (2020). Grading students during the coronavirus crisis: What's the right call? Education Week. https://www.edweek.org/teaching-learning/grading-students-during-the-coronavirus-crisis-whats-the-right-call/2020/04
Sawyer, R. (2010). Usefulness of high school average and ACT scores in making college
admission decisions. ACT Research Report Series 2010-2. ACT.
Schramm, H., Rubin, I., & Schramm, N. (2021). Covid-19 and high school grades: An early case study. Significance, 18(2), 6.
Shaw, E. J., & Mattern, K. D. (2009). Examining the accuracy of self-reported high school grade
point average. Research Report No. 2009-5. College Board.
Silva, P. L., DesJardins, S., Biscaia, R., Sá, C., & Teixeira, P. (2023). Public and private school grade inflation patterns in secondary education. IZA Discussion Paper No. 16016.
U.S. Department of Education, Institute of Education Sciences, National Center for Education
Statistics (NCES), National Assessment of Educational Progress (NAEP). (2020). NAEP
long-term trend assessment results: Reading and mathematics. Retrieved from
https://www.nationsreportcard.gov/ltt/?age=9
U.S. Department of Education, Institute of Education Sciences, National Center for Education
Statistics (NCES), National Assessment of Educational Progress (NAEP). (2022a).
2019 NAEP high school transcript study (HSTS) results. Retrieved from
https://www.nationsreportcard.gov/hstsreport/#home
U.S. Department of Education, Institute of Education Sciences, National Center for Education
Statistics (NCES), National Assessment of Educational Progress (NAEP).
(2022b). U.S. education in the time of COVID. Retrieved from
https://nces.ed.gov/surveys/annualreports/topical-studies/covid/
U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS). (2010–2020). Admissions survey. Retrieved August 16, 2023, from https://nces.ed.gov/ipeds/survey-components/6
Westrick, P. A., Le, H., Robbins, S. B., Radunzel, J. M., & Schmidt, F. L. (2015). College performance and retention: A meta-analysis of the predictive validities of ACT® scores, high school grades, and SES. Educational Assessment, 20(1), 23–45.
Westrick, P., Marini, J., Young, L., Ng, H., & Shmueli, D. (2019). Validity of the SAT® for
predicting first-year grades and retention to the second year (College Board Research
Report 2019-5). The College Board.
World Health Organization (WHO). (2020). Timeline: WHO's COVID-19 response. Retrieved from https://www.who.int/emergencies/diseases/novel-coronavirus-2019/interactive-timeline
Willingham, W. W., Lewis, C., Morgan, R., & Ramist, L. (1990). Predicting college grades: An
analysis of institutional trends over two decades. Educational Testing Service.
Woodruff, D. J., & Ziomek, R. L. (2004). High school grade inflation from 1991 to 2003.
Research Report Series 2004-04. ACT.
Zhang, Q., & Sanchez, E. I. (2013). High school grade inflation from 2004 to 2011. ACT
Research Report Series, 2013(3). ACT.
Ziomek, R. L., & Svec, J. C. (1995). High school grades and achievement: Evidence of grade
inflation. NASSP Bulletin, 81(587), 105–113.
Zwick, R. (2006). Higher education admission testing. In R. Brennan (Ed.), Educational measurement (4th ed., pp. 647–679). American Council on Education, Praeger.
Appendix
Table A1. HLM Model Coefficients: HSGPA Only
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 1.405*** 1.222*** 1.486*** 1.322*** 1.498*** 1.227*** 1.141***
HSGPA 0.574*** 0.574*** 0.589*** 0.566*** 0.630*** 0.571*** 0.641***
Test type: State and District -1.234*** -1.218*** -1.262*** -1.169*** -1.272*** -1.204*** -0.794***
Gender: Male -0.116*** -0.149*** -0.209*** -0.200*** -0.275*** -0.284*** -0.261***
Gender: Other/missing -0.289*** -0.478*** -1.094*** -0.676*** -0.352*** -0.409*** -0.366***
Race: Black 0.234*** 0.286*** 0.168*** 0.129*** 0.166*** 0.153*** 0.191***
Race: Hispanic -0.146*** -0.029* -0.120*** -0.081*** -0.080*** -0.090*** -0.128***
Race: Missing -0.312*** 0.047 -0.120*** -0.113*** -0.104*** -0.059* -0.111**
Race: Other -0.033* 0.037* -0.073*** -0.092*** -0.106*** -0.102*** -0.119***
Race: Prefer not to respond -0.147*** -0.014 -0.139*** -0.147*** -0.221*** -0.185*** -0.195***
Race: White 0.128*** 0.206*** 0.037** 0.028** 0.000 -0.022 -0.063***
Family income: Linear 0.395*** 0.361*** 0.371*** 0.307*** 0.370*** 0.292*** 0.331***
Family income: Quadratic -0.101*** -0.081*** -0.073*** -0.091*** -0.073*** -0.074*** -0.065***
Family income: Cubic -0.001 -0.006 0.003 0.001 -0.002 -0.003 -0.002
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Table A2. HLM Model Odds Ratios: HSGPA Only
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 4.077*** 3.392*** 4.421*** 3.750*** 4.474*** 3.410*** 3.130***
HSGPA 1.776*** 1.776*** 1.802*** 1.761*** 1.877*** 1.770*** 1.898***
Test type: State and District 0.291*** 0.296*** 0.283*** 0.311*** 0.280*** 0.300*** 0.452***
Gender: Male 0.890*** 0.862*** 0.811*** 0.819*** 0.760*** 0.753*** 0.771***
Gender: Other/missing 0.749*** 0.620*** 0.335*** 0.509*** 0.703*** 0.664*** 0.694***
Race: Black 1.264*** 1.332*** 1.183*** 1.137*** 1.181*** 1.165*** 1.211***
Race: Hispanic 0.864*** 0.971* 0.887*** 0.922*** 0.923*** 0.914*** 0.880***
Race: Missing 0.732*** 1.049 0.887*** 0.894*** 0.902*** 0.942* 0.895**
Race: Other 0.967* 1.038* 0.930*** 0.912*** 0.899*** 0.903*** 0.888***
Race: Prefer not to respond 0.864*** 0.987 0.870*** 0.863*** 0.801*** 0.831*** 0.823***
Race: White 1.136*** 1.228*** 1.038** 1.029** 1.000 0.978 0.939***
Family income: Linear 1.484*** 1.435*** 1.449*** 1.360*** 1.448*** 1.339*** 1.392***
Family income: Quadratic 0.904*** 0.922*** 0.929*** 0.913*** 0.929*** 0.928*** 0.937***
Family income: Cubic 0.999 0.994 1.003 1.001 0.998 0.997 0.998
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Table A3. HLM Model Coefficients: ACT Composite Only
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 1.572*** 1.436*** 1.730*** 1.532*** 1.742*** 1.446*** 1.425***
ACT Composite 0.677*** 0.658*** 0.647*** 0.580*** 0.615*** 0.469*** 0.492***
Test type: State and District -1.369*** -1.353*** -1.386*** -1.279*** -1.369*** -1.298*** -0.846***
Gender: Male -0.253*** -0.291*** -0.360*** -0.347*** -0.442*** -0.430*** -0.439***
Gender: Other/missing -0.462*** -0.577*** -1.148*** -0.748*** -0.411*** -0.561*** -0.614***
Race: Black 0.267*** 0.291*** 0.153*** 0.128*** 0.186*** 0.186*** 0.215***
Race: Hispanic -0.118*** -0.047*** -0.158*** -0.093*** -0.071*** -0.058*** -0.101***
Race: Missing -0.230*** 0.022 -0.183*** -0.136*** -0.136*** -0.028 -0.036
Race: Other -0.151*** -0.080*** -0.201*** -0.174*** -0.172*** -0.121*** -0.145***
Race: Prefer not to respond -0.265*** -0.115*** -0.261*** -0.227*** -0.281*** -0.207*** -0.220***
Race: White 0.030* 0.089*** -0.090*** -0.054*** -0.063*** -0.025* -0.073***
Family income: Linear 0.340*** 0.319*** 0.335*** 0.283*** 0.338*** 0.282*** 0.345***
Family income: Quadratic -0.114*** -0.089*** -0.083*** -0.100*** -0.093*** -0.095*** -0.096***
Family income: Cubic -0.009 -0.015** -0.006 -0.008 -0.013* -0.008 -0.012
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Table A4. HLM Model Odds Ratios: ACT Composite Only
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 4.815*** 4.203*** 5.639*** 4.628*** 5.707*** 4.244*** 4.159***
ACT Composite 1.968*** 1.930*** 1.910*** 1.786*** 1.849*** 1.599*** 1.635***
Test type: State and District 0.254*** 0.259*** 0.250*** 0.278*** 0.254*** 0.273*** 0.429***
Gender: Male 0.777*** 0.747*** 0.698*** 0.706*** 0.642*** 0.651*** 0.645***
Gender: Other/missing 0.630*** 0.562*** 0.317*** 0.473*** 0.663*** 0.571*** 0.541***
Race: Black 1.306*** 1.338*** 1.165*** 1.136*** 1.204*** 1.205*** 1.240***
Race: Hispanic 0.889*** 0.955*** 0.854*** 0.911*** 0.931*** 0.944*** 0.904***
Race: Missing 0.794*** 1.022 0.832*** 0.873*** 0.872*** 0.972 0.964
Race: Other 0.860*** 0.923*** 0.818*** 0.840*** 0.842*** 0.886*** 0.865***
Race: Prefer not to respond 0.767*** 0.891*** 0.770*** 0.797*** 0.755*** 0.813*** 0.803***
Race: White 1.030 1.094*** 0.914*** 0.947*** 0.939*** 0.975* 0.930***
Family income: Linear 1.404*** 1.376*** 1.398*** 1.327*** 1.403*** 1.325*** 1.413***
Family income: Quadratic 0.892*** 0.915*** 0.920*** 0.905*** 0.911*** 0.910*** 0.908***
Family income: Cubic 0.991 0.985** 0.994 0.992 0.987* 0.992 0.988
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Table A5. HLM Model Coefficients: ACT Composite and HSGPA
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 1.434*** 1.263*** 1.533*** 1.339*** 1.506*** 1.214*** 1.130***
HSGPA 0.395*** 0.393*** 0.415*** 0.410*** 0.462*** 0.444*** 0.513***
ACT Composite 0.427*** 0.416*** 0.396*** 0.336*** 0.358*** 0.245*** 0.238***
Test type: State and District -1.207*** -1.193*** -1.227*** -1.141*** -1.232*** -1.176*** -0.765***
Gender: Male -0.160*** -0.198*** -0.258*** -0.240*** -0.320*** -0.318*** -0.295***
Gender: Other/missing -0.341*** -0.516*** -1.127*** -0.719*** -0.369*** -0.459*** -0.452***
Race: Black 0.388*** 0.431*** 0.298*** 0.253*** 0.318*** 0.284*** 0.324***
Race: Hispanic -0.045** 0.054*** -0.048* -0.001 0.019 0.000 -0.036*
Race: Missing -0.199*** 0.129*** -0.061*** -0.037 -0.025 0.030 0.009
Race: Other -0.024 0.050** -0.067*** -0.064*** -0.062*** -0.047** -0.062***
Race: Prefer not to respond -0.168*** -0.012 -0.148*** -0.133*** -0.189*** -0.149*** -0.160***
Race: White 0.114*** 0.191*** 0.015 0.031** 0.020 0.019 -0.017
Family income: Linear 0.303*** 0.281*** 0.291*** 0.235*** 0.287*** 0.231*** 0.272***
Family income: Quadratic -0.104*** -0.082*** -0.076*** -0.092*** -0.080*** -0.081*** -0.072***
Family income: Cubic -0.007 -0.012* -0.003 -0.004 -0.008 -0.007 -0.008
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Table A6. HLM Model Odds Ratios: ACT Composite and HSGPA
Characteristic 2010 2012 2014 2016 2018 2020 2021
Intercept 4.195*** 3.536*** 4.633*** 3.815*** 4.510*** 3.367*** 3.097***
HSGPA 1.485*** 1.482*** 1.515*** 1.507*** 1.588*** 1.559*** 1.670***
ACT Composite 1.532*** 1.516*** 1.486*** 1.400*** 1.430*** 1.278*** 1.269***
Test type: State and District 0.299*** 0.303*** 0.293*** 0.320*** 0.292*** 0.308*** 0.465***
Gender: Male 0.852*** 0.821*** 0.773*** 0.786*** 0.726*** 0.728*** 0.745***
Gender: Other/missing 0.711*** 0.597*** 0.324*** 0.487*** 0.691*** 0.632*** 0.637***
Race: Black 1.474*** 1.539*** 1.347*** 1.288*** 1.374*** 1.329*** 1.382***
Race: Hispanic 0.956** 1.056*** 0.953*** 0.999 1.019 1.000 0.965***
Race: Missing 0.819*** 1.138*** 0.941* 0.964 0.975 1.031 1.010
Race: Other 0.976 1.051** 0.935*** 0.938*** 0.940*** 0.954** 0.940**
Race: Prefer not to respond 0.845*** 0.988 0.863*** 0.876*** 0.828*** 0.862*** 0.853***
Race: White 1.121*** 1.210*** 1.016 1.031** 1.020 1.020 0.983
Family income: Linear 1.354*** 1.324*** 1.337*** 1.265*** 1.332*** 1.260*** 1.313***
Family income: Quadratic 0.901*** 0.921*** 0.927*** 0.912*** 0.923*** 0.923*** 0.930***
Family income: Cubic 0.993 0.988* 0.997 0.996 0.992 0.993 0.992
Note: * p < 0.05, ** p < 0.01, *** p < 0.001; the reference categories in the model were National testing program, female, and Asian.
Notes
1. HSGPA is reported on a scale of 0.0 to 4.0. A letter grade of A was defined as an HSGPA of 3.5 or higher, a letter grade of B as an HSGPA between 2.5 and 3.5, a letter grade of C as an HSGPA between 1.5 and 2.5, a letter grade of D as an HSGPA between 1.0 and 1.5, and an HSGPA below 1.0 was considered an F.
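A minimal R illustration of this mapping (the `hsgpa` vector is a hypothetical example):

```r
# Map HSGPA values to letter grades using the cutoffs in Note 1.
hsgpa <- c(3.9, 3.2, 2.1, 1.2, 0.7)
cut(hsgpa,
    breaks = c(-Inf, 1.0, 1.5, 2.5, 3.5, Inf),
    labels = c("F", "D", "C", "B", "A"),
    right  = FALSE)  # left-closed intervals, so exactly 3.5 counts as an A
# [1] A B C D F
```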
2. The National Student Clearinghouse is a nonprofit education service whose database contains enrollment information for 97% of students at postsecondary institutions (https://nscresearchcenter.org/current-term-enrollment-estimates/).
3. To deal with nonconvergence issues, we changed the default optimization method from Nelder-Mead optimization to the quadratic approximation approach.
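In lme4, for instance, such a switch can be made through glmerControl; the sketch below assumes the hypothetical `fit` object from the Methods sketch and takes BOBYQA (bound optimization by quadratic approximation) as the quadratic approximation method, which may differ from the authors' exact settings.

```r
# Refit a nonconverging model with the BOBYQA optimizer instead of
# the Nelder-Mead simplex (hypothetical `fit` object).
fit2 <- update(fit, control = glmerControl(optimizer = "bobyqa"))
```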
4. Family income was included in the models as an ordered factor. In R, ordered factors in logistic regression are tested for higher-order terms. As such, the models included a test for the linear, quadratic, and cubic terms.
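As a small illustration, R expands an ordered factor into orthogonal polynomial contrasts, which is where the linear, quadratic, and cubic income terms in Tables A1–A6 come from (the income bands shown are those from Table 1):

```r
# Ordered factors receive polynomial contrasts (.L, .Q, .C) by default.
income <- factor(c("<$36K", "$36K-$60K", "$60K-$100K", ">$100K"),
                 levels  = c("<$36K", "$36K-$60K", "$60K-$100K", ">$100K"),
                 ordered = TRUE)
contrasts(income)  # columns: .L (linear), .Q (quadratic), .C (cubic)
```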
5. The simplified equations from Huang (2022) are used. The marginal R² formula is $R^2_m = \sigma_f^2 / (\sigma_f^2 + \sigma_u^2 + \pi^2/3)$, and the conditional R² formula is $R^2_c = (\sigma_f^2 + \sigma_u^2) / (\sigma_f^2 + \sigma_u^2 + \pi^2/3)$, where $\sigma_f^2$ is the variance of the fixed effects, $\sigma_u^2$ is the variance of the random intercept, and $\pi^2/3$ is the level-1 residual variance of the standard logistic distribution.
6. Across years, the standard deviation for the standardized ACT Composite score was about 1.0.
ABOUT ACT
ACT is a mission-driven, nonprofit organization
dedicated to helping people achieve education and
workplace success. Grounded in more than 60 years
of research, ACT is a trusted leader in college and
career readiness solutions. Each year, ACT serves
millions of students, job seekers, schools, government
agencies, and employers in the U.S. and around the
world with learning resources, assessments, research,
and credentials designed to help them succeed from
elementary school through career.
For more information, visit act.org