[This three-part report can also be downloaded as a PDF.]
Promises and Problems with the New PSAT
The need to roll out the PSAT while still constructing the SAT puts College Board test developers and executives in a bind. On the one hand, PSAT reporting must rely in part on preliminary data, since the exam represents a work in progress toward the redesigned SAT. On the other hand, the merits of the PSAT are the best evidence the public has for judging the credibility of the new SAT.

Moreover, the PSAT serves important purposes of its own. Students make vital decisions between the SAT and ACT based on PSAT reports, and the exam gives students feedback on their college readiness. PSAT data is now linked to Khan Academy, where students can work on the areas the PSAT identified for improvement. On the school side, College Board has encouraged weaving the PSAT into a variety of counseling and tracking roles.

Encouraging those uses comes with the responsibility of educating the educators on proper interpretation and use of scores. It’s not clear that this standard has yet been met for the 2015 PSAT/NMSQT. Counselors are struggling to interpret some of the changes that have been made and may not be aware of some of the shifts or inconsistencies that exist. There have also been mechanical challenges in rolling out the new test and its score reporting, frustrating counselors and students. Those issues have been covered elsewhere. Instead, this three-part analysis focuses on issues of interpretation and validity: Does the PSAT live up to its promise, and what does it portend for the SAT?
Compass Education Group’s analysis is based on examination of student and counselor PSAT reports; discussions with students, parents, and counselors; and the tables and publications provided by College Board. The most important source document is College Board’s PSAT/NMSQT Understanding Scores 2015; additional links will be provided where applicable. Rather than attempt a definitive exploration of the 2015 PSAT, this report examines three problematic areas of the new exam.
1. A series of changes has greatly increased the percentile scores that students and educators see on PSAT score reports. College Board has not been transparent about all of these changes or about the ways in which they can distort score interpretation.
2. A historically narrow gap between sophomore and junior performance does not seem credible and raises questions about how scoring, scaling, and weighting were performed and reported.
3. A dramatic lowering of the college and career readiness benchmark for the “verbal” portion of the PSAT and SAT calls for deeper examination and reveals potential structural problems with the new exam.