Initial Thoughts on the New SAT

June 30, 2014
Filed under All Posts, Featured, SAT Watch

Back in late March, I was quoted in the award-winning student newspaper, Redwood Bark, as part of a featured article by Chloe Wintersteen on the recently announced changes to the SAT.  Chloe does a very nice job of summarizing the changes to the test and some of the issues regarding the College Board’s collaboration with Khan Academy. I highly recommend this article to anybody interested in learning more.

Additionally, I have included below my extended comments to Chloe, which obviously didn’t all make the article.

________

Hi Chloe,

Thanks very much for contacting me. Happy to answer your questions. Please see below:

-Are you in favor of the changes or against them and why?

Some of the changes planned for the SAT are positive developments. Eliminating the guessing penalty, an anachronistic and counterproductive feature, is long overdue. Shortening the multiple-choice sections is also welcome, as the current long-form format emphasizes raw mental endurance over true knowledge and reasoning.

For many of the changes, however, it is simply too early to tell. Other than a broad outline of the content, the College Board really hasn’t been very specific about what actually is going to change. (I’m not sure they actually know all of it themselves at this point, as the President of the College Board has said that the development is still a work in progress.) Eliminating grammar multiple-choice questions, streamlining the math topics, emphasizing evidence in passage reading, dumbing down the vocabulary, and lengthening the essay all sound interesting in theory, but just how well these modifications play out in practice remains to be seen.

-Khan Academy will be offering free test preparation to all students. How will this affect Marin SAT Prep? Will your job still be relevant after the changes have been implemented? Why? (ed. Bless your heart, Chloe. 😉 )

I applaud Khan Academy for its efforts in continuing to provide free SAT materials online over the last five or so years, as well as now in its official association with the College Board. It’s important that students have as much access to information about these tests as possible. Indeed, we at Marin SAT Prep have long shown our commitment to that goal by making our training materials available online for anyone to use at SATUnlocked.com. It’s also why, for many years, I have published my ‘SAT Tutor’s Blog’ (sat-tutors-blog.com), which provides a wealth of free training and advice.

With that said, I believe the statements by the President of the College Board that its association with Khan Academy will somehow reduce or eliminate the need for test prep services are both hubristic and hyperbolic (two vocabulary words that you probably won’t see on the new SAT).

Currently, Khan Academy provides videos of the answers to questions in the Official SAT Study Guide, a practice we can assume will continue, yet just how much more in terms of an actual, comprehensive training program Khan Academy can provide remains a big, unanswered question. Even assuming its new test prep program provides a halfway decent training curriculum, the reality is that web videos simply cannot take the place of an actual person sitting down one-on-one with you to train you thoroughly over multiple weeks on every aspect of the test. Indeed, with all the SAT web videos already available, both at Khan Academy and many other places, why is the demand for private test prep tutoring greater than ever?

Moreover, as good as any website may be, it will have a hard time helping you with your test anxiety, planning your test prep calendar over multiple tests, tailoring your homework to your individual needs, encouraging you and building your confidence, and doing all the other holistic things that a live person can.

Remember too that because of its official association with the College Board, Khan Academy is significantly hampered in its ability to actually help you. Simply put, the people who write the test are not going to tell you how to beat it. Certainly, Khan Academy, in its new official role, can never advise you on aspects of test prep that do not actively promote the SAT, such as something as simple as whether the ACT may actually be a better test for you.

-How will the changes affect how you prepare students for the test?

Obviously, we are going to have to wait to see more details on the content of the new SAT, but in terms of our overall approach, which has been extremely successful for many years, we do not anticipate too many changes. In terms of curriculum, we may find ourselves spending a little less time on a multitude of math topics and spending more time on the essay as well as training students to spot evidence in the reading passages, but with all of the changes, we will still continue our student-centered approach that tailors our comprehensive training to the student’s specific needs.

-Some believe the changes are simplifying the SAT too much. Do you agree or disagree?

Too early to say for the most part. I do think it is a shame that the new SAT will eliminate arcane and esoteric vocabulary words, which I personally think are important in preserving the fullness and richness of the English language. Generally, in designing this new SAT, the College Board appears focused on identifying students with an aptitude for graduate-level professional studies, where business vocabulary, evidence spotting, and ratio-based math skills are quite useful, rather than on identifying students with creative, “out-of-the-box” reasoning skills who would perform well in undergraduate humanities courses, which the current SAT is actually quite good at doing.

-Any other comments?

We at Marin SAT Prep believe the changes to the SAT are a great opportunity for us. We are already planning our new training curriculum and materials, and our goal, as it always has been, is to be able to provide the absolute best test prep program available anywhere for students preparing for the SAT.


SAT Executive Director Comments on College Readiness

November 28, 2012
Filed under All Posts, SAT Watch, Scores & More

Earlier this Fall, the College Board released a study on the use of the SAT in determining a student’s likelihood of college and career success.  Jennifer Karan, Executive Director of the SAT Program at the College Board, was kind enough to send me her thoughts on the report, which are reprinted below:

Does Parental Education Affect SAT Achievement?

The College Board’s most recent SAT Report on College & Career Readiness shows that the class of 2012 was the most diverse class of SAT takers in history – and representative of the diversity in our nation’s classrooms.

While our efforts to democratize access to higher education are yielding some positive results, the report shows that non-school factors, including parental education, can have an outsized impact on students’ college readiness.

Sixty percent of SAT takers whose parents had attained a bachelor’s degree met the SAT College and Career Readiness Benchmark, compared to only 27 percent of SAT takers whose parents had not attained a four-year college degree. The differences in college readiness by level of parental education are another indicator of the inequities in our educational system, as students from families with lower levels of parental education are more likely to attend less-resourced schools where they may not have access to, or are not encouraged to complete, a rigorous core curriculum, which is a basic requirement for college readiness.

In concert with many of the efforts currently underway, lawmakers and policymakers should continue to focus on ensuring that underserved and first-generation college students have access to the SAT and other educational opportunities. Without these proactive efforts, it is likely that the number of Americans with college degrees over the long term will not grow, causing further inequity in educational achievement and income disparity.

Of the students in the class of 2012 who reported being first-generation college goers, nearly half – 46 percent – were minority students. Specifically, 62 percent of all Hispanic SAT takers and 48 percent of all African American SAT takers in the class of 2012 were first-generation college goers.

A central part of the College Board’s efforts to expand access to the exam for underserved students is the SAT Fee Waiver Program. In the 2011-2012 academic year, the College Board expended over $44 million on SAT Fee Waivers and related expenses, enabling 22 percent of SAT takers to have access to the test and other services free of charge.

Jennifer’s observations dovetail nicely with my own previous comments about the outsized role that parental education plays in determining a student’s likelihood of success:

The correlation between family income and/or race and SAT performance may be in some ways misleading. It’s not necessarily that students are simply ‘buying’ better scores or that the test is culturally biased against minorities, so much as the parents of better scoring students tend to be better educated themselves, and therefore have developed skill sets that can be passed down to help their children perform more optimally. Since better educated parents are also more likely to be both wealthy and white, these socio-economic discrepancies appear amplified in the SAT score disparities.

Bottom line: the College Board study further supports the notion that, in terms of a student’s college success, nothing succeeds like parental success.

 


SAT & ACT test center security changes could create problems (updated)

 

 

In response to publicity from last Fall’s SAT cheating scandal in Long Island, the makers of the SAT (ETS) and ACT (ACT Inc.) announced new measures that they believe will help prevent students from hiring impersonators to take the tests for them.

The SAT security changes go into effect in the Fall of 2012. Frankly, it’s not at all clear the test makers have fully considered the implications of these changes.

Privacy Concerns About Photo Database

The issue that will probably get the most attention involves the new photo requirement.

“Students will be required to submit a current, recognizable photo during registration that will be included on a new photo admission ticket.

“Test center supervisors also will have access to a printable on-line register of the photos uploaded during registration for each student registered to test at that test center.”

In an age where teen privacy (online and offline) is already such a big concern for parents and students, allowing a random stranger at an SAT test center to print out a student’s photo and registration details (including full name, gender, date of birth, test taken, and high school) is certain to generate controversy.

But that is only the tip of the SAT privacy iceberg. ETS also has plans to compile a photo database of nearly every college applicant in the country, without any guarantees as to the database’s use or security.

“A registration data repository will be created containing the information and photo provided by the test-taker at the time of registration and used to produce the photo admission ticket required for test center admittance. High schools, colleges and universities, and other institutions that receive SAT scores will have access to the repository, as will the ETS Office of Testing Integrity. The registration data repository will not include test scores.”

‘High schools, colleges and universities, and other institutions that receive SAT scores’ is so broad as to basically grant access to employees of any and all educational (and even non-educational) institutions the SAT chooses, while in turn providing the individual test taker with zero control over who can access his or her photo and personal information or how this information is eventually used.

ETS will basically create its own Facebook-like page on every test taker that all its friends can see and  over which the individual test taker has no control.

All of this raises the question, of course, of whether any of this will actually prevent test-taking impersonations. After all, what is to stop an impersonator from simply submitting his or her own picture in place of the actual test taker’s? ETS would argue that allowing so many others (college admissions officers, high school counselors, test center employees, etc.) to cross-check the test taker against the picture on the registration will act as a deterrent to would-be impersonators and their potential clients.

The effectiveness of the deterrent remains to be seen, but even so, where should we draw the line between the security of the SAT and the security of a teenager’s likeness and personal information? ETS has apparently decided there is no line, and that its own concerns about a relatively tiny number of cheaters (ETS says there were 150 score cancellations due to impersonations last year) wholly trump the legitimate privacy and safety concerns of millions of honest test takers.

Other Hidden Problems

While privacy concerns will certainly feature prominently in any pushback the test makers receive about their new policy, as an SAT tutor I also see other changes that won’t get a lot of attention but will almost certainly impact many would-be test takers.

Eliminating Standby Registration and Test Changes Will Hurt Seniors

One seemingly innocuous change is the elimination of standby registration.

“Students will be required to preregister for the SAT and SAT Subject Tests. Standby (walk-in) testing will no longer be permitted.”

The elimination of standby registration will be especially problematic for Seniors deciding when and whether to take their last tries at the SAT in October, November and/or December. These Fall test dates are typically scheduled for the first Saturday of each month. The scores are then released on the third Thursday after the test date, which only leaves a little over a week between the time a student receives his or her scores and the next SAT test date.

The problem here is that the date when the scores are released is well past the registration deadline for the next SAT, which means a student has no way of knowing what the scores from the previous test are before making a call on whether to try again on the next test date.

Previously, the way around this dilemma had been to wait for the scores to come out and, if the student wanted to try again, he or she could go standby at the next test date. This will no longer be possible under the new rules. (The problem also applies to the May/June test schedule.)

What also won’t be possible is changing the type of test students can take.

“Students who want to change the type of test they intend to take (i.e., SAT rather than SAT Subject Tests or vice versa) must do so in advance. Test-type changes will no longer be permitted on test day.”

Again, the problem with the tight Fall SAT test schedule is that by the time the first test’s scores come back, the deadline for changing the type of test has also passed, so November and December test takers who are wondering whether to take the SAT or switch to Subject Tests will not be able to rely on their test scores from the previous sitting to make an informed decision.

Basically, under the new rules, if a student is unsure about how many times and/or which SAT tests to take in the Fall, the student will now simply have to double-register for successive Fall tests and make a best guess about which tests to sign up for.

One important note: if the student, after getting the scores back from the first test, decides not to take the second test, the student will most likely forfeit the registration fee for that second test.

Can Non-High School Students Still Even Register?

Another seemingly minor, but potentially problematic change in the test center security rules is the requirement that test applicants must now register with the names of their high schools.

“Students will be required to provide the name of their attending high school during registration. Once SAT registration opens for the 2012-13 school year, registrations submitted without attending high school will not be processed.”

Taken at face value, this rule would appear to eliminate literally hundreds of thousands of potential SAT test takers who do not attend formal high school. So-called ‘non traditional test takers’ (PDF) include: 7th and 8th grade students who apply to ‘talent identification’ programs (like Johns Hopkins Center for Talented Youth), non-adults no longer in high school, and adult test takers. Others affected by the new rule would also appear to include home school students, community college students, and some international students.

I would assume some accommodation in the security rules will be made for these non-high school test takers, but as written, the rules would appear to preclude anybody not in high school from even registering for the SAT.

Update:

Citing concerns over potential bias in the college admissions process, ACT has announced a change in its policy and will no longer provide colleges and universities with access to student photos. The College Board, however, said in a statement that it will not change its own policy.

The College Board chose not to attach test-taker photos to the score reports sent to high schools and colleges, as such a process could be exploited no matter who the recipient. Instead, we are creating a separate, secure, password-protected database that tracks both the user and the date and time the database was accessed as an additional layer of security that will all but eradicate any attempts at test-taker impersonation. We expect the mere existence of this “name and photo” database will stand as a strong deterrent to any test-taker who might still consider impersonation while the access-controlled functionality of the database will prevent misuse by score recipients.

Note the emphasis on “secure, password-protected database” and the new mention of a time stamp as an “additional layer of security”. One would have assumed both the password requirement and the time stamp (both actually part of the same level of security) would have been a given from the outset, but the fact that ETS and the College Board now feel it necessary to mention these at least shows that the test makers are finally acknowledging that their millions of test takers do indeed have some legitimate privacy concerns. Not that the College Board has actually changed its policy, of course, but perhaps it’s a start.

Even so, the College Board still needs to be a lot more specific about its database clients before students and parents can feel even remotely confident that their photos and personal information will not be abused.

 


AP nixes guessing penalty. SAT next?

August 30, 2010
Filed under All Posts, SAT Watch, Scores & More

Beginning in May 2011, the College Board will eliminate the ‘guessing penalty’ for AP exams.

Under the old College Board policy, AP scores were based on the total number of correct answers minus a fraction for every incorrect answer—one-third of a point for questions with four possible answers and one-fourth of a point for questions with five possible answers. AP students were trained to work the odds by eliminating one or more possible answers and then making an “educated guess.” In fact, the College Board traditionally supported this strategy saying, “…if you have SOME knowledge of the question, and can eliminate one or more answer choices, informed guessing from among the remaining choices is usually to your advantage.”
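To make the arithmetic concrete, here is a minimal sketch (in Python, using made-up numbers) of how the old formula scoring compares to simple rights-only scoring, and why eliminating even one answer choice tipped the odds in favor of guessing under the old rules. The function names and example figures are my own illustration, not anything published by the College Board:

```python
# A minimal sketch of the old formula scoring described above, assuming
# five-choice questions (so each wrong answer costs 1/4 point).

def formula_raw_score(correct, wrong, choices=5):
    """Old scoring: +1 per correct answer, minus 1/(choices - 1) per wrong answer."""
    return correct - wrong / (choices - 1)

def rights_only_raw_score(correct, wrong):
    """Scoring without a guessing penalty: wrong answers simply earn nothing."""
    return correct

def expected_guess_value(choices=5, eliminated=0):
    """Expected points from guessing after ruling out `eliminated` choices."""
    remaining = choices - eliminated
    p_right = 1 / remaining
    penalty = 1 / (choices - 1)
    return p_right * 1 - (1 - p_right) * penalty

print(formula_raw_score(correct=40, wrong=12))      # 37.0 with the penalty
print(rights_only_raw_score(correct=40, wrong=12))  # 40 without it
print(expected_guess_value(eliminated=0))           # 0.0: blind guessing is a wash
print(expected_guess_value(eliminated=1))           # 0.0625: 'educated guessing' pays off
```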

The College Board similarly applies a 1/4-point guessing penalty for each incorrect SAT multiple-choice answer, so it’s not a stretch to assume that a change in AP scoring may presage a change in SAT scoring down the road:

Robert Schaeffer, public education director of the National Center for Fair and Open Testing, said he viewed it as significant that the College Board was changing any policy related to guessing, since the organization has argued since the 1950s that a penalty was needed. He said he looked forward to seeing how the College Board would justify having one policy for AP and another for the SAT.

For the moment, the College Board maintains a studiously ambiguous stance on the prospects of change in SAT scoring policy:

As for the SAT, the College Board spokeswoman indicated that the change is being announced only for AP. “The SAT Program has no immediate plans to change scoring processes, and will keep the public informed if that position changes,” she said.

I wouldn’t exactly call that a firm statement in support of the existing SAT scoring system. Would you?

The sudden impetus for the change may come from the increased popularity of the ACT, which does not use a guessing penalty:

Schaeffer also said that the guessing penalty is “a major competitive disadvantage for the SAT” vs. the ACT. “While the ACT is not a better test in any psychometric sense, the lack of a guessing penalty is one of the ways it is more consumer-friendly,” he said.

Although I agree with Mr. Schaeffer that the lack of a guessing penalty most likely contributes to the ACT’s increasing popularity, I do not believe the difference in scoring policy is purely cosmetic.

The SAT’s guessing penalty distorts the test’s ability to evaluate student performance accurately because it makes the test more about evaluating a student’s level of self-confidence, and less about evaluating his or her level of actual knowledge.

With the guessing penalty in play, it’s not enough just to choose an answer. For each question, the student also has to decide whether he or she is confident enough in the choice to risk a quarter point reduction for being wrong. This extra layer of decision making tends to discourage less assertive students, who will often shy away from those questions whose answers they are not wholly sure of, including questions where they would otherwise guess correctly were it not for their fear of the guessing penalty.

The result is that the guessing penalty ends up favoring the bold, guessing student over the more cautious, selective student – exactly the opposite outcome from what the guessing penalty is supposed to accomplish.

Studies suggest that the guessing penalty may also contribute to the persistent lag in the SAT performance of female test takers (especially in Math).

Research indicates that males are more likely to take risks on the test and guess when they do not know the answer, whereas females tend to answer the question only if they are sure they are correct. Unwillingness to make educated guesses on this exam has been shown to have a significant negative impact on scores.

The ACT does not have a guessing penalty, which may be one reason why the gender gap on that test is much smaller.

In my own teaching experience, I find that female SAT students often display a greater tendency to skip questions when they are not completely sure of the answer – even when the answer they would have picked turns out to be the correct one. These less assertive students lose points they would otherwise earn were there no guessing penalty to discourage them from answering – points more assertive students earn even though they may have no better understanding of why a particular answer is correct.

Bottom line: if and when the College Board finally does away with the SAT guessing penalty, it will be doing itself and its test takers a big favor – not only because it will make the SAT more ‘consumer friendly’ but also, and more importantly, because it will help SAT scoring better reflect each student’s level of academic performance regardless of his or her level of personal self-confidence.


SAT score map for Top 20 US Universities

August 23, 2010
Filed under All Posts, SAT Watch, Scores & More, Tutor's Lounge

Nancy Xiao at teachstreet sent me this cool map of SAT scores for universities listed in US News and World Report’s Top 20 ranking for 2010.


Via: SAT Prep Courses

Rankings lists always generate a lot of debate about what the ‘best’ schools really are, and this list, with its rather obvious northeastern bias, is sure to be no exception. And before anyone gets too excited, be aware that the map is based on US News & World Report’s university rankings and does not include liberal arts colleges (links to 2011 rankings).

Yet regardless of which particular universities you think deserve to be in the Top 20 list, Nancy’s teachstreet map provides a good illustration of the general level of SAT scores needed for admission to America’s elite colleges and universities.

Generally, top US schools require a minimum cumulative SAT score of around 2100 for a chance at admission, while the ‘rest of the best’ require a minimum score of around 2000 for consideration.

Thanks for illustrating that, Nancy!


Q&A with SAT Expert Dr. Gary Gruber

March 2, 2010
Filed under All Posts, SAT Watch, Tutor's Lounge

Before many of us were even old enough to take the SAT, Dr. Gary Gruber was already helping students improve their test scores. Thirty years later, Dr. Gruber remains one of the foremost authorities on SAT & ACT test preparation, having published more than 30 test prep books that have sold over 7 million copies.

Dr. Gruber’s test prep books include:

Gruber’s Complete SAT Guide
Gruber’s Complete ACT Guide 2010
Gruber’s SAT 2400
Gruber’s SAT Word Master
Gruber’s Complete SAT Reading Workbook
Gruber’s Complete SAT Writing Workbook
Gruber’s Complete SAT Math Workbook

Dr. Gruber was kind enough to answer a few questions about his long experience in test prep, his thoughts about the new SAT, and his recommendations for tutors and students.

How did you get started in test prep? Do you still personally train students?

When, in fifth grade, I received a 90 IQ (below average) on an IQ test, my father, who was a high school teacher at the time, was concerned, so he was able to get me an IQ test, hoping I could study it and increase my score. However, when I looked at the test, I was so fascinated by what the questions were trying to assess that I started to figure out what strategies and thinking could have been used for the questions, and I saw interesting patterns in what the test-maker was trying to test. I increased my IQ to 126 and then to 150. The initial experience of scoring so low on a first IQ test and being branded as “dull minded” actually developed my fascination and research with standardized tests, and I was determined to afford all other students my knowledge and experience so they would show their true potential as I did. So I constantly write books, newspaper and magazine articles and columns, and software, and I personally teach students and teachers.

The College Board revamped the SAT in 2005. How has the new SAT changed from the old SAT? Do you think the new SAT is harder or easier than the old SAT?

The College Board took out the Analogies and Quantitative Comparisons and included an Essay section. In the Reading section, shorter reading passages and questions relating to “double-reading passages” were added. The new math section was enhanced with items from third-year college-preparatory math.

What is the ‘Gruber method’ and how does it differ from other test prep methods?

The unique aspect of my method is that I provide a mechanism and process through which the student internalizes the use of strategies and thinking skills, and I then reinforce those methods so that students can answer questions on the SAT or ACT without panic or brain wracking. This is actually a “fun” process. The Gruber method focuses on the student’s patterns of thinking and how the student should best answer the questions. I have also developed a nationally syndicated test, the only one of its kind, which actually tracks a student’s thinking approach for the SAT (and ACT) and directs the student to exactly what strategies are necessary for them to learn. Instead of just learning how to solve one problem at a time, if you learn a Gruber strategy you can solve that problem and thousands of other problems.

How do you ensure that the practice questions in your books are accurate reflections of what students will see on the actual tests?

There are two processes. For the first, I am constantly and critically analyzing all the current questions and patterns on the actual tests. The second process is based on the fact that I am directly in touch with the research development teams for any new items or methods used in the questions on any upcoming test, so I am probably the only one besides the actual SAT or ACT people who knows exactly what is being tested and why it is being tested on current and upcoming tests.

What percentage of test prep study time should students spend learning vocabulary words?

The student should not spend too much time on this—perhaps 4 hours at most. The time should be invested in learning the Important Prefixes and Roots I have developed and the 3 Vocabulary Strategies. The student might also want to learn the 291 words and their opposites, which I have developed based on research of hundreds of SATs.

What advice can you give to students suffering from test anxiety?

I find that when students learn specific strategies, they see how a strategy can be used for a multitude of questions, and when they see a question on an actual SAT that uses the strategy, it reinforces their confidence and reduces the panic. They can also treat the SAT as a game by using my strategic approaches, which reduces the panic as well.

SAT vs. ACT: How should students decide which test to take?

The correlation happens to be very high for both tests, in that if you score well on one you will score equivalently well on the other. However, the ACT is more memory-oriented than the SAT. The material is about the same; for example, there is Grammar on both tests. Math is about the same, except the ACT is less strategically oriented. There is Reading on both tests, and they test about the same things. However, on the ACT there is a whole section on scientific data interpretation (the SAT has some questions on this topic in the Math). Fortunately, you don’t have to know the science subject matter on the ACT. If you are more prone to memory, I would take the ACT. If you are more prone to strategizing or you like puzzles, I would take the SAT. In any event, I would check with the schools that you are applying to and find out which test they prefer.

What is the single most important piece of advice you can give to students taking the SAT or ACT?

Learn some specific strategies, which can be found in my books. This will let you think mechanically without wracking your brains. When answering the questions, don’t concentrate or panic about finding the answer. Try to extract something in the question which is curious and/or which will lead you to a next step in the question. Through this “processing” of the question, you will be able to get to an answer.

What is the single most important piece of advice you can give to tutors teaching the SAT or ACT?

Make sure that you learn the specific strategies and teach students those strategies using many different questions which employ the strategy, so the student will see variations on how that strategy is used.

What recommendations can you give to tutors who want to use your books in their test prep programs?

In Sections VI and VII in the INTRODUCTION to the SAT book there are programs for 4 hours and longer for studying for the SAT. You can use this information to create a program for teaching the student.

In Sections III and IV in the INTRODUCTION to the ACT book there are programs for 4 hours and longer for studying for the ACT. You can use this information to create a program for teaching the student.

Always try to reinforce the strategic approach, where the student can hone and internalize strategies so that they can use them for multitudes of questions.

Thank you, Dr. Gruber!


College Board SAT Class of 2009 Report

August 26, 2009
Filed under All Posts, SAT Watch, Scores & More, Tutor's Lounge

From the New York Times:

Average SAT scores in reading and writing declined by one point this year, while math scores held steady, according to a report on the high school class of 2009 released Tuesday by the College Board.

Average scores on the three sections of the SAT were 501 in critical reading, 493 in writing, and 515 in mathematics. Scores for each section of the test range from 200 to 800.
Average scores last year, for the high school class of 2008, were 502 in reading, 494 in writing, and 515 in math.

More than 1.5 million college-bound seniors took the SAT, the largest group that had ever taken the test.

Males continue to outperform females on Math and Critical Reading (slightly), while females outperform males on Writing.


Ethnic disparities in performance continue:

In critical reading, non-Hispanic white students on average scored 528, compared with 516 for Asian students, 455 for Hispanic ones and 429 for African-Americans. In math, Asian students averaged 587, compared with 536 for non-Hispanic whites, 461 for Hispanics and 426 for blacks. In writing, Asians averaged 520, compared with 517 for non-Hispanic whites, 448 for Hispanics and 421 for blacks.

There also remains a strong correlation between family income and SAT performance:

The average scores for all three sections of the test directly reflected students’ family wealth. Students from families with an annual income above $200,000 scored, on average, 68 points higher in critical reading than students from families earning less than $20,000 per year, with similar disparities for math and writing.

Critics of the SAT typically point to disparities like these to claim that the test favors wealthier white students, and to a certain extent these criticisms may be justified.  However, there is also another factor at work here:

An even sharper correlation showed up between students’ average scores and the highest educational attainment of their parents. Students whose parents did not graduate from high school averaged 420 in critical reading, 139 points lower than students whose parents had a graduate degree, who averaged 559.

The correlation between family income and/or race and SAT performance may be in some ways misleading.  It’s not necessarily that students are simply ‘buying’ better scores or that the test is culturally biased against minorities, so much as the parents of better scoring students tend to be better educated themselves, and therefore have developed skill sets that can be passed down to help their children perform more optimally.   Since better educated parents are also more likely to be both wealthy and white, these socio-economic discrepancies appear amplified in the SAT score disparities.

That’s not to say that factors of  race and income do not affect SAT performance, but simply that the relative impact of these factors on student success may be overstated when compared to the impact of parental education.


Why the SAT is not a great measure of Intelligence

February 14, 2009
Filed under All Posts, SAT Watch

The econblog Marginal Revolution points to an interesting academic study of intelligence (PDF) published by Professors Keith E. Stanovich and Richard F. West.

In 7 different studies, the authors observed that a large number of thinking biases are uncorrelated with cognitive ability. These thinking biases include some of the most classic and well-studied biases in the heuristics and biases literature, including the conjunction effect, framing effects, anchoring effects, outcome bias, base-rate neglect, “less is more” effects, affect biases, omission bias, myside bias, sunk-cost effect, and certainty effects that violate the axioms of expected utility theory. In a further experiment, the authors nonetheless showed that cognitive ability does correlate with the tendency to avoid some rational thinking biases, specifically the tendency to display denominator neglect, probability matching rather than maximizing, belief bias, and matching bias on the 4-card selection task. The authors present a framework for predicting when cognitive ability will and will not correlate with a rational thinking tendency.

Basically, the paper reports on various experiments that purport to determine how ‘cognitive ability’ (i.e., intelligence) affects an individual’s decision making processes.  What caught my attention was the study’s use of SAT scores to determine the cognitive ability of individuals in the test group.

The participants were 434 undergraduate students (102 men and 332 women) recruited through an introductory psychology subject pool at a medium-sized state university in the United States. Their mean age was 19.0 years….

Students were asked to indicate their verbal, mathematical, and total SAT scores on the demographics form. The mean reported verbal SAT score of the students was 577 (SD 68), the mean reported mathematical SAT score was 572 (SD 69), and the mean total SAT score was 1149 (SD 110). The institution-wide averages for this university in 2006 were 565, 575, and 1140, respectively…

The total SAT score was used as an index of cognitive ability in the analyses reported here because it loads highly on psychometric g (Frey & Detterman, 2004; Unsworth & Engle, 2007). For the purposes of some of the analyses described below, the 206 students with SAT scores below the median (1150) were assigned to the low-SAT group, and the 228 remaining students were assigned to the high-SAT group. Parallel analyses that are fully continuous and that did not involve partitioning the sample are also reported.
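For readers who want to see what that grouping amounts to in practice, here is a minimal sketch (in Python, with hypothetical scores; the study’s actual data are not reproduced here) of the median-split assignment the authors describe:

```python
# Illustration of the median split described in the quoted passage: students
# whose total SAT score falls below the reported median (1150) go into the
# low-SAT group; the remaining students go into the high-SAT group.
# The scores below are hypothetical.

reported_median = 1150

hypothetical_scores = [1020, 1080, 1140, 1150, 1210, 1290, 1350]

low_sat = [s for s in hypothetical_scores if s < reported_median]
high_sat = [s for s in hypothetical_scores if s >= reported_median]

print("low-SAT group:", low_sat)    # [1020, 1080, 1140]
print("high-SAT group:", high_sat)  # [1150, 1210, 1290, 1350]
```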

A serious flaw in this research is the erroneous conflation of student-reported SAT scores with cognitive ability. As a professional SAT tutor and educator who has individually taught many hundreds of students to achieve substantially higher scores, I do not believe that SAT performance can or should be considered an appropriate baseline measure of cognitive ability, as this study does.

Simply put, proper SAT training can improve student scores by literally hundreds of points, and while there is indeed a score ceiling that each student usually reaches, unless the study provides statistical controls for whether a student has been ‘optimized’ to achieve this ceiling, its reliance on SAT scores as a test of a student’s cognitive ability must be considered fatally flawed.

Moreover, even if a student has not had any training, the College Board’s own study (PDF) shows that simply by taking the test multiple times, a student can improve his or her scores. Without controlling for how many times a student took the test, let alone whether the student is reporting best scores in individual subjects from the same test or from different tests, there is simply no way to say that a student’s final reported SAT score is a legitimate cognitive measure on which to base further experiments on biases.

It should also be noted that many other non-intelligence factors, both internal and external, also affect SAT scores. Parental and peer pressure can have a severe (usually negative) impact on student performance. Mental fatigue (often caused by student overscheduling) and physical fatigue (common among student athletes) are also factors. Likewise, a student’s maturity level (both emotional and physical) is an important variable. There are others.

In sum, the reliance on self-reported SAT scores as the definitive indicator of a student’s cognitive ability skews the results of this study to such a significant degree that they must be questioned. I’m not saying that the conclusions of Profs. Stanovich & West do not have merit; just that without a more accurate and controlled baseline of cognitive ability, there is simply no way to tell.
