Since the inception of the SAT in 1926, the admissions world has debated (1976, 2001, 2008, 2015, 2018, 2019) the impact and validity of the SAT (and later the ACT, CLT, CCTST, etc.) on the pool of applicants and enrolled students at a university. Recently, more and more colleges have been asking whether they should diminish the role of testing in their admissions process and declare a test-optional admissions policy. The debate has heated up with the release of Measuring Success: Testing, Grades, and the Future of College Admissions and the announcement of the University of Chicago’s test-optional policy, prompting many institutions to look inward at their own use of test scores.

However, whether colleges and universities should stop using the SAT and ACT might just be the wrong question. It’s certainly the wrong question at a particular set of schools. While there are some schools where the additional predictive information from tests (generally .02 to .1) lends support to difficult admissions decisions, I think there are just as many institutions that should be test optional but are holding on to the tests. In requiring these tests, are these schools doing more harm than good to themselves? By holding on to these tests, are they abdicating their duty to fairly evaluate all candidates and relinquishing that authority to the SAT and ACT?
Are New York City’s teachers as smart as their students? John Sexton, the former president of New York University, thinks not. During a talk on the future of American universities at the Library of Congress last week, he claimed that over the past five years, New York City public schools have been hiring “teachers that have lower SAT scores than the students you are graduating. That’s a ticket for failure, because you’re hiring from the bottom half of the existing class.”