I awoke yesterday with SATs and GPAs on my mind. It wasn't a nightmare that pulled me from sleep but a segment on NPR's Morning Edition questioning college entrance exams. Though half-asleep, I caught the gist: a new study says high school grades predict college grades as well as the usual test suspects do. It got me out of bed fast, a rarity.
Grades, standardized tests. Go ahead, try to find another controversial issue that speaks to this nation's ambivalent relationship with education, academic achievement, intelligence, the brain itself. All this plus shades of socio-economic disparity and injustice, racial/ethnic bias, American students' falling rank on international test scores, and our uncertain future in the competitive global marketplace.
Oh, I got outta bed and took a closer peek, and here's what I found out about the study, conducted by William Hiss, a former dean now retired from Bates College, one of the first colleges to go "test optional." The research involves schools with "test-optional" admissions, meaning applicants are not required to submit SAT or ACT scores and admission decisions are not based on them (with the exception of some students at the large public universities*):
Hiss’ study, “Defining Promise: Optional Standardized Testing Policies in American College and University Admissions,” examined data from nearly three dozen “test-optional” U.S. schools, ranging from small liberal arts schools to large public universities, over several years… there was virtually no difference in grades and graduation rates between test “submitters” and “nonsubmitters.” Just 0.05 of a GPA point separated the students who submitted their scores to admissions offices and those who did not. And college graduation rates for “nonsubmitters” were just 0.6 percent lower than those of students who submitted their test scores. via Bates News
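For scale, and this is my arithmetic, not the study's: 0.05 of a GPA point is 1.25 percent of a 4.0 scale. Here's a toy sketch of how such a comparison gets computed; every number in it is invented to stand in for the study's data.

```python
# Toy sketch of the submitter vs. nonsubmitter comparison.
# Every number here is invented for illustration; none come from the study.
submitters    = {"mean_gpa": 2.88, "grad_rate": 0.650}
nonsubmitters = {"mean_gpa": 2.83, "grad_rate": 0.644}

gpa_gap  = submitters["mean_gpa"] - nonsubmitters["mean_gpa"]
grad_gap = submitters["grad_rate"] - nonsubmitters["grad_rate"]

print(f"GPA gap: {gpa_gap:.2f} points ({gpa_gap / 4.0:.2%} of a 4.0 scale)")
print(f"graduation-rate gap: {grad_gap:.1%}")
```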
A few caveats. The study was not published in a peer-reviewed research journal; it appears on the website of the National Association for College Admission Counseling.
It doesn't measure any outcomes other than major, college GPA (first-year and cumulative), and college graduation rates. Nothing beyond college (e.g., graduate school entrance, labor market participation, Rhodes scholarships, social media start-up success).
It doesn’t address student performance in schools that require standardized test scores.
It doesn’t answer whether students who attend test-optional colleges and universities differ from those attending traditional schools.
None of these caveats prevented many media outlets, news organizations included, from picking the story up yesterday.
I'd love to see this replicated, accounting for students who attend schools that require tests. Now that would be interesting, because I want to know how students at those schools compare. You know the hotbeds of research have that data; if only they'd reveal how kids with varying test scores perform on their campuses. I bet those studies exist… I'll check into it. That said, such a study would still miss the kids who never apply because of low test scores but who might otherwise do well on campus (a toy illustration of that problem follows).
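Here's that toy illustration: a little simulation of selection on test scores. Everything in it is invented; the latent "readiness" variable, the noise levels, and the 450 cutoff are my assumptions, not anything from the study or the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (all parameters invented): a latent "readiness" drives both
# the test score and college GPA, with plenty of independent noise in each.
n = 100_000
readiness = rng.normal(0, 1, n)
test_score  = 500 + 100 * (0.6 * readiness + 0.8 * rng.normal(0, 1, n))
college_gpa = 3.0 + 0.4 * (0.6 * readiness + 0.8 * rng.normal(0, 1, n))

# How well does the test track college GPA when we can see everyone?
r_everyone = np.corrcoef(test_score, college_gpa)[0, 1]

# Selection: suppose students scoring below 450 never apply, so their
# college GPAs are never observed. The correlation in the restricted
# sample shrinks, and the excluded group is invisible to the study.
applied = test_score > 450
r_applicants = np.corrcoef(test_score[applied], college_gpa[applied])[0, 1]

# The kids the study can't see: never applied, yet would earn a 3.0+.
missing_but_fine = ((~applied) & (college_gpa >= 3.0)).mean()

print(f"correlation, everyone:   {r_everyone:.2f}")
print(f"correlation, applicants: {r_applicants:.2f}")
print(f"never applied but would do fine: {missing_but_fine:.1%} of all students")
```

The point isn't the particular numbers; it's that you can't learn anything about the missing kids from a sample that excludes them.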
I should mention that both authors of the study graduated from Bates, so we're not looking at completely unbiased investigators here. That's somewhat disappointing; I wish they'd gone outside the circle. Did you catch the title of the study? In fact, Bates itself is hardly unbiased in the matter. Here's how the college introduces the study:
NPR’s Morning Edition reports on a new study of 33 U.S. colleges and universities that yields more evidence to support what Bates instinctively knew back in 1984: that standardized tests do not accurately predict college success. And at their worst, NPR reports, standardized tests can narrow the door of college opportunity when America needs to give students more access to higher education, not less.
As if they needed to do the study at all, having known the results since 1984. Instinctively knew. Yikes. It appears Bates needs to tighten up the STEM curriculum. I have to correct their interpretation of the results:
…standardized tests do not accurately predict college success.
Not quite.** The results don't show test scores to be inaccurate predictors of college success (that would be your high school principal who said you'd never get into that college; true story). Nor do they show high school GPA to be an accurate predictor. Neither is terrific, but they're the best we've got, and here grades win top predictor (not always so in the research). Test scores still beat any number of other measures more subject to bias and poor reliability, such as extracurricular participation, the dreaded student essay, and teacher recommendations. And the SAT and ACT have one clear advantage over high school GPA: they're standardized, a yardstick we can hold up to everybody, even if we can't agree on, or don't exactly know, what they measure.
Incidentally, standardized tests came onto the college scene in part to limit discrimination and bias in the admissions process. I don't love the SAT or its elevated status, but I'm not comfortable with completely trashing it. It's the only standardized information in the admissions process. So what do we do, throw it out and rely more heavily on high school GPA and how many charities a kid created? How many chess tournaments won? Marathons run? eBooks published? And if we're basically relying more heavily on GPA, well, I'm sorry, do you remember your high school valedictorian? I'm too old to remember names but do recall the term "brown-noser" from the mid-1980s. Yes, it takes effort and perseverance, some grit if you will, to earn high marks, but that's a particular set of skills too, not the one sure path to successful adulthood.
BTW, the authors argue we should throw out standardized tests because not only are they bad at predicting success, but the scores "truncate" whole swaths of students who would do well at these schools, especially first-generation college students, minority students, and students from lower-income households. No doubt those are valid concerns in the admissions process. But this very argument suggests that more is at play in admissions, or should be, than simply academic measures, than simply high school GPA. The goal of college admissions has never been simply to populate the campus with the students most likely to succeed (or, for that matter, to merely elevate the national ranking, diversify the campus, ensure the future of the nation in the global market, cure cancer, win football championships, build the new dorm/library/athletic field, or otherwise placate alumni, parents, faculty, Nobel prize committees, etc.).
Not that I'd care to work in college admissions. The uncertainty and randomness of the process do not thrill me, either as a bystander or as a parent. Answers? Questions? Quirks? Thoughts?
*The public universities considered test scores only for students whose high school GPAs didn't automatically make the cut for admission.
** In fact, this study doesn't directly compare HS GPA and SAT scores; at least I didn't see such a comparison in the 70-page report (I could have missed it; let me know if you find it). And remember, a good portion of the students didn't report their scores at all, even though schools often still collected them as part of the enrollment process (they just weren't shown to admissions). So even though the data and graphs seem to show GPA as the better predictor, the effect sizes for the two are never reported, nor are the analyses that would support that comparison. The research doesn't actually pit the two against each other. That's the first thing I'd do if I had the data; a sketch of what I mean follows. So the description of the study is slightly off. But I do like all the charts and graphs, because they provide the kind of detail that often goes missing in more streamlined academic papers. It would be way too long for a journal article, but you can see it's valuable nonetheless.
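Since I brought it up, here's roughly what that first analysis would look like, sketched on simulated data because the study's raw data isn't public. The variable names, the invented coefficients, and the use of statsmodels are all my own assumptions, not anything from the report: standardize college GPA, HS GPA, and test score, regress the first on the other two, and compare the coefficients.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated stand-in for the data I'd want: HS GPA, test score, college GPA.
# The coefficients below are invented; only the analysis recipe matters.
n = 10_000
ability     = rng.normal(0, 1, n)
hs_gpa      = 0.7 * ability + 0.7 * rng.normal(0, 1, n)
test_score  = 0.6 * ability + 0.8 * rng.normal(0, 1, n)
college_gpa = 0.5 * ability + 0.9 * rng.normal(0, 1, n)

# Standardize everything so the two coefficients are directly comparable.
def z(v):
    return (v - v.mean()) / v.std()

X = sm.add_constant(np.column_stack([z(hs_gpa), z(test_score)]))
fit = sm.OLS(z(college_gpa), X).fit()

print("standardized coefficients (intercept, HS GPA, test score):")
print(fit.params.round(3))
print("R-squared:", round(fit.rsquared, 3))
```

On real data, whichever standardized coefficient comes out larger carries more independent predictive weight; that, with a confidence interval around it, is exactly the head-to-head comparison the report never makes.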