Two studies find admissions scattergrams reduce applications to elite colleges


Among high-achieving students, an unidentified college that had received the third-most applications dropped out of the top 10 after Naviance was introduced. High-achieving students became much more likely to apply to local colleges, which were relatively unpopular choices before Naviance.

Sabina Tomkins, an assistant professor at the University of Michigan’s School of Information and lead author of the study, doesn’t know exactly why students were deterred, but she said there are two likely explanations. One is that students are intimidated when they see that their scores are slightly below the average of previously admitted students. Some kids might want to avoid the risk of rejection altogether and play it safe, applying only to places where they’re more likely to be accepted. 

Another possibility is that the scattergrams have an unintended marketing or advertising effect. Students may feel more motivated to apply to the most popular schools where they see masses of green checks, showing that many previous peers have been admitted. Students can’t see the scattergrams for the least popular schools. To preserve student privacy, high schools commonly suppress scattergrams for schools to which fewer than five or 10 alumni have applied. Small or far-away elite schools can often fall into this suppressed category. “When the school doesn’t show up as a scattergram, it might not cross their mind in the same way it would have before,” said Tomkins. 

Tomkins only had application data and doesn’t know where students enrolled in college. But if students are applying to fewer elite schools, they’re likely getting into and matriculating at fewer of them too, Tomkins said.

An earlier study, published in 2021 in the Journal of Labor Economics, also found that Naviance’s scattergrams deterred students from applying to and enrolling in the most selective colleges. That study looked at only 8,000 students in one unidentified school district in the mid-Atlantic region. When it was released, some critics questioned whether the unintended consequences of scattergrams held true nationwide. The larger 2023 study bolsters the evidence that more information isn’t always a good thing for all students.

Importantly, both studies also found that the scattergrams encouraged lower-achieving students. They were more likely to apply to four-year colleges after seeing that their grades and test scores were similar to those of previous students who had been accepted. Before their schools purchased Naviance, more of these students avoided four-year colleges and opted for two-year community colleges instead. A separate body of research has generally found that starting at a four-year college, while more expensive, increases the likelihood of earning a bachelor’s degree and higher wages after graduation. 

Whether we should care about students attending the most prestigious and elite colleges is a matter of debate. Authors of the 2023 study pointed me to Harvard economist Raj Chetty’s research, which has found that going to an Ivy League university or four other elite colleges, instead of a top flagship public college, increases the likelihood of becoming a CEO or a U.S. senator and substantially increases a graduate’s chances of earning in the top 1%. However, attending an Ivy instead of a top public flagship didn’t increase a graduate’s income on average. 

The scattergram studies looked only at high schools that had purchased Naviance’s product. The company was the first to market scattergrams to schools in 2002 and says its product reaches nine million of the nation’s 15 million high school students. According to GovSpend, which tracks government contracts, public high schools have spent well over $100 million on Naviance, which, in addition to scattergrams, also allows high school counselors to manage their students’ college applications and send transcripts to colleges. Competitors include Scoir, Cialfo and MaiaLearning, which all offer similar scattergrams. 

PowerSchool, the company that owns Naviance, points out that analyzing small slices of its customer base, as the academic researchers have, can be misleading. According to data PowerSchool shared with me, 38% of the six million college applications that flow through its platform each year are sent to “reach” schools, schools where it would be challenging for a student to gain acceptance based on their grades and test scores. A spokesperson said that applications to reach schools have been increasing annually, proof that its product “does not discourage students from applying to their reach or target schools.” 

The company also highlighted the benefits for lower-achieving students, asserting that the scattergrams “increase equity.” Indeed, the earlier 2021 study found that Black, Hispanic and low-income students were especially likely to apply to and enroll in four-year colleges after using Naviance.

I talked with a half dozen college counselors who work with high school students and they said they generally didn’t see high-achieving students getting discouraged after seeing scattergrams. “If anything, I see the opposite,” said Scott White, an independent college counselor in New Jersey and a former high school guidance counselor for over 30 years. “Students are over-applying, not under-applying. They throw in dream applications. If you look at the Naviance scattergrams, they are not in profile. ‘I know I’m not gonna get in there, but I’m gonna apply there anyway.’  That is incredibly common.” 

Amy Thompson, a college counselor at York High School outside of Chicago, told me that the scattergrams are a “big hit” with high school students and get them engaged in the college process because clicking on the data can be fun and even addictive. 

Only one counselor told me he had seen a case where a student was discouraged after seeing scattergrams, but he said it was an unusual experience. That doesn’t mean the researchers’ data analysis is wrong. It’s common for data to point out things that we’re not aware of or that we cannot readily see. 

The biggest drawback to scattergrams, according to veteran college counselors, is that the information is incomplete and can give students the false sense that admissions decisions at elite schools are primarily based on grades and test scores. The scattergrams don’t show whether a student was an athlete, a musician or from a wealthy family with many generations of alumni. Students might see a green check with a low test score and not appreciate that the student had other factors weighing in his or her favor. 

Counselors told me the scattergrams are most useful and accurate for large state schools, where there is a lot of data and the academic range of past admittees helps students identify safety and target schools. The more competitive the college, and the more the college looks at factors other than grades and test scores, the less useful the scattergrams. 

And just like the stock market, past performance is no guarantee of future results. Schools fall in and out of favor. What was a safety school one year can unexpectedly rise in selectivity. A school that was once hard to get into can lower its standards in an effort to fill seats.

I don’t know that I care so much about kids not applying to enough Ivy League schools. But it’s fascinating how the information age changes our behavior for better and for worse, and how kids are influenced by spending hours and hours clicking on websites and absorbing masses of data.

This story about scattergrams was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Proof Points newsletter.
