Crossing the Finish Line, an important new book by former Princeton president William Bowen, former Macalester College president Michael McPherson, and Matthew Chingos, relied on two massive databases on the entering class of 1999 (one on 96,000 first-time freshmen and 30,000 entering transfer students at 21 flagship universities, and the other on 108,000 freshmen and 42,000 transfers at less selective state colleges and universities in four states: Maryland, North Carolina, Ohio, and Virginia) to compile a wide-ranging book of empirical research on topics impacting American higher education. This is the sixth in a series of posts on their findings (see previous installments on affirmative action, financial aid, transfer students, college admissions, and college dropouts).
Crossing the Finish Line has things to say about virtually every important factor in college life, but by far the most important thing is this:
The SAT and ACT do not matter in predicting college success.
I have been an unequivocal supporter of using the SAT/ACT* in making college admissions decisions (see here and here), but this sample of students and the rigor of this study are impossible to ignore. Here’s what the authors found:
- Taken separately, high school GPA is a better predictor of college graduation rates than SAT/ACT score. This finding holds true across institution types, and it gets stronger the less selective an institution is. High school GPA is three to five times more important in predicting college graduation than SAT/ACT score.
- SAT and ACT scores are proxies for high school quality. When the authors factored in which high schools students attended (i.e., high school quality), the predictive power of high school GPA went up, and the predictive power of SAT/ACT scores fell below zero.
- High school quality mattered, but not nearly as much as the student’s GPA. Other research, most notably on Texas’ ten percent admission rule, has shown this before. It’s somewhat counterintuitive, but it suggests that a student’s initiative to succeed, complete their work, and clear any hurdles that come up matters more than the quality of their high school.
What should various actors do with this information? First, colleges and universities need to take a hard look at this new research. It’s one thing when a few rogue institutions go test-optional and claim it works well for them. It’s quite another when a robust, empirical study with a sample of thousands of students at 50 different public institutions shows the SAT and ACT to be so poor at predicting student success.
Second, higher education rankings need to drop the SAT and acceptance rate as measures of institutional quality. These were always indicators of prestige rather than quality, but the rankings need to rethink their inclusion now. Back in August I wrote about how institutions going SAT-optional actually improved their SAT scores. Because only students with high scores submitted them, the colleges were able to report much higher averages. And, since those institutions received more applications, they were able to reject more students. All of this boosted their rankings in the eyes of US News.
These steps would pave the way for a more rational college admissions process. We’d get rid of the SAT, that manipulable gatekeeper to higher education opportunity, and move toward an increased reliance on high school GPA, a better predictor of college success. I’ll eat crow in the process, but the research says it’s the right thing to do.
*The SAT and ACT are two very different tests, of course, but both claim the same benefit of predicting a first-year college student’s grades, they are becoming increasingly similar, and their scores are highly correlated. The authors use them interchangeably, as do most statisticians when trying to control for the incoming academic achievement of a sample of students who take one entrance exam or the other.