"After all their ballyhoo about how the new test was going to be a better tool for college admissions, it's not," said Robert Schaeffer, director of the group FairTest. "It's longer and more expensive. That's all you can say about it."
The College Board defended the SAT, saying that no predictor of college success is perfect, but that the exam is a remarkably good one. It emphasized the finding that the writing test actually does a slightly better job of predicting freshman-year college grade point average than do the math or critical reading sections, both of which are multiple choice.
"Both tests are very valid, the old one and the new one," said Laurence Bunin, the senior vice president who oversees the SAT program. "What's important here is that the new SAT places an emphasis on writing" and offers a valid test of another skill that is "critical to college success."
The SAT now runs three hours, 45 minutes — or 45 minutes longer than the old version — and will cost $45 in 2008-09, up from $29.50, though aid is available. The ACT, the other leading college admissions exam, has an optional writing section.
The College Board added the writing test, including a 25-minute essay, to help colleges make more finely tuned decisions about students' skills. College admissions officers can even download a student's essay and read it. The multiple-choice sections were also changed somewhat in 2005.
The College Board, a not-for-profit group, claimed the test would elevate the place of writing in high school classrooms. It backed up that argument last year with a survey reporting 88 percent of teachers said writing had become a bigger priority in their schools.
From the start, however, some teachers criticized the exam, arguing it encouraged formulaic writing and was susceptible to coaching.
The findings released Tuesday are the most comprehensive study yet of the new exam, covering about 150,000 students.
The analysis measured the connection between SAT performance for the high school class of 2006 and those students' freshman-year college grades.
The correlation scale ranges from minus 1 to 1. A correlation of zero would indicate no connection between scores and grades, and 1 would show a perfect correlation — basically, that high scorers on the SAT are guaranteed to earn high college grades.
The study found high school GPA had a .54 correlation with college grades, which is considered fairly strong. Individually, all three SAT sections had lower correlations, but taken together the correlation was .53.
Combining high school GPA with the three SAT scores was stronger still, at .62. But that was just .01 higher than if the writing exam weren't included.
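For readers curious how figures like .53 and .54 are arrived at, they are Pearson correlation coefficients, which measure how closely two sets of numbers move together. A minimal sketch, using made-up GPA and score pairs for illustration only, not the study's actual data:

```python
# Pearson correlation: the statistic behind the study's .53/.54 figures.
# The score/GPA pairs below are hypothetical, invented for illustration.

def pearson_r(xs, ys):
    """Return the Pearson correlation of two equal-length number lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical SAT composite scores and freshman GPAs for six students
sat = [1850, 2100, 1600, 1950, 2250, 1700]
gpa = [3.1, 3.6, 2.8, 3.0, 3.8, 3.2]

print(round(pearson_r(sat, gpa), 2))
```

A result of 1 would mean scores and grades rise in perfect lockstep; 0 would mean knowing a student's score tells you nothing about the grades; values around .5, as in the study, mean scores track grades meaningfully but far from perfectly.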
There were numerous studies of the old SAT's predictive value. A 2001 analysis that combined about 3,000 validity studies found the correlation ranged from .44 to .62.
The latest research also found that the new SAT, like the old one, continues to predict college grades with varying levels of accuracy for different groups. For instance, SAT scores "overpredict" the college grades of women, and are less accurate for minorities than for whites.
Critics contend those variations reveal fundamental problems with the SAT that should limit how it is used.
"My view is that, systemically, these tests aren't working as well as they should," said William E. Sedlacek, a testing expert and a retired professor at the University of Maryland.
But the College Board noted SAT scores are still a better predictor for minorities than high school grades are. "What that suggests is that it's very important for these minority students to have a fair benchmark, a fair, merit-based way to be evaluated in the college admission process," Bunin said.
Many colleges have said they would wait for research like this study before making long-term decisions about how to use students' SAT writing scores. Currently, some give the new section equal weight with the math and critical reading. Others look at writing scores selectively, while some ignore them completely.
Dozens have dropped the SAT altogether as an admissions requirement.
Stephen Farmer, director of admissions at the University of North Carolina-Chapel Hill, said the findings echo UNC's own preliminary research.
"What we haven't seen on our campus is that writing tells us much that critical reading does not," he said. "For that reason we probably use writing less than we might have."
Typically, "we've used it mainly when there's been a discrepancy between critical reading and a writing score, when a writing score has helped a student with low critical reading," he said.
Standardized tests are "useful if limited tools," Farmer said. "The problems crop up when people forget either about the usefulness of them, or about their limitations."