Faking It?

Jonathan E. Martin
September 11, 2013

“The skills easiest to test are also easiest to digitize, automate, and outsource.”
Andreas Schleicher, Head of the Program for International Student Assessment (PISA)

“Whole children,” with gifts and strengths both cognitive and non-cognitive, are the greatest contributors to our school communities. Accordingly, schools are re-examining the non-cognitive assessment of their applicants – reviewing what they do now, and experimenting with what they might add.

One recent article, echoing a 2008 ETS report, notes: “Self assessments are [the] most common [technique], and are responsible for most of what we know about the relationship of non-cognitive factors and educational outcomes.” Yet in the fields of organizational psychology and psychometrics, faking in self-assessment is so large a concern that entire conferences and books are regularly devoted to the matter.

Shall we thus abandon self-assessments? We need to be judicious, to be sure. However, this book does lay out multiple strategies and prescriptions for continued, restricted use of such tools.

1. Make the questionnaire more complicated. The scholarly term for this practice is “evaluative neutralization”: “Desirable-sounding and undesirable-sounding items are rephrased to sound more neutral and test takers will not be tempted to alter their responses.” (p. 320)

2. Ask for verification. So-called “biodata” and “elaboration” tactics require survey takers not just to rate themselves but to supply evidence supporting their claims. Instead of simply stating “yes, I am a hard worker,” students must provide an example of that hard work. (p. 320)

3. Force choices among multiple positive options. In this technique, a school or testing organization determines a small set of specific sought-after attributes, say four or five, and then generates another ten to fifteen attractive characteristics. Applicants must then select from a list made up entirely of positive attributes, not knowing which ones the organization actually prefers; a brief scoring sketch follows this list. (p. 321)

4. Warn applicants not to cheat. This strategy has been widely studied, but the results are inconclusive at best. Only warnings that convince applicants the organization can effectively detect faking AND will punish fakers seem to have statistically significant effects, and even these can depress the responses of honest respondents, who become anxious about being accused of faking. (p. 323)

5. Make it more like a test. This strategy is considerably more complex than the previous four; it asks applicants to self-assess by answering test-like multiple-choice questions about situations and scenarios. Situational Judgment Tests (SJTs) are increasingly common; Robert Sternberg, for instance, has advocated and employed them in his college admissions work at Tufts and elsewhere. They are appealing, but unless their design is emphatically intended to combat faking, they can fall prey to the same problems.

6. Use third-party reporting. This is hardly innovative; we’ve all been using teacher recommendations for years. ETS explored a wide variety of options before settling, for graduate school applicants, on a Personal Potential Index built exclusively upon “ratings by others.” (Kyllonen, 2008)
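To make the forced-choice idea in strategy 3 a bit more concrete, here is a minimal sketch of how such an item might be tallied. Everything in it (the attribute names, the five-pick limit, and the simple match-count scoring rule) is a hypothetical illustration, not something drawn from the book or from any published instrument.

```python
# Hypothetical sketch of scoring one forced-choice item; the names and
# rules below are illustrative assumptions, not taken from the book or
# from any real admissions instrument.

# Attributes the school actually values (kept hidden from applicants).
TARGET_ATTRIBUTES = {"persistence", "curiosity", "collaboration", "self-regulation"}

# The full list shown to applicants: the targets mixed in with other
# equally attractive-sounding options, so every choice looks desirable.
PRESENTED_ATTRIBUTES = TARGET_ATTRIBUTES | {
    "ambition", "creativity", "leadership", "independence",
    "optimism", "adaptability", "empathy", "decisiveness",
}

def score_forced_choice(selected, max_picks=5):
    """Count how many of an applicant's picks match the hidden targets."""
    if not selected <= PRESENTED_ATTRIBUTES:
        raise ValueError("Selections must come from the presented list.")
    if len(selected) > max_picks:
        raise ValueError(f"Select at most {max_picks} attributes.")
    return len(selected & TARGET_ATTRIBUTES)

# Example: five plausible-sounding picks, only two of which happen to
# match the school's hidden targets.
picks = {"ambition", "curiosity", "leadership", "persistence", "optimism"}
print(score_forced_choice(picks))  # prints 2
```

The point of the design is simply that every option looks equally desirable, so an applicant cannot raise a score by guessing which answers sound best.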

Gloomy as this book is about faking, it does not, on balance, conclude that we should eliminate self-assessments. Kyllonen et al. offer excellent case studies of effective non-cognitive self-assessment in higher education (pp. 290-92), and the book’s concluding chapter rallies users not to surrender. Paul Sackett, Ph.D., University of Minnesota, argues: “there is at least the potential of checks and balances, rather than simply taking the personality score at face value…In sum, although improved methods of preventing, detecting, and correcting for faking will be welcome developments, it is not the case that there is a need to suspend operations while waiting for these developments.” (p. 342)

Report from the Field:

“The whole child, not just the cognitive ability, is what we are committed to assessing at Galloway,” report Polly Williams and Elizabeth King, former and current admissions directors, respectively. They are quick to say this is not to dismiss the importance of intellectual ability, only that they know “traditional measures don’t work in isolation to capture the breadth of what kids can do.”

Both Williams and King have backgrounds in special-needs and dyslexia-focused educational programs, and they draw on that background in their assessment work. They know from that experience that often “brilliant kids are not able to demonstrate their brilliance in typical ways.” Accordingly, the mission of their admissions operation is to identify the strengths and weaknesses of every individual child, honoring the unique qualities of each, and to build a balanced class well suited to their particular educational program, rather than simply selecting the top academic performers.

Because their school has long been committed to what is now called 21st Century learning, they’ve identified three major assessment domains and have developed assessment tools for each. First on their list are the so-called executive functioning abilities, such as time management, organization, and prioritization. For this, they use a self-assessment survey they’ve built from various published resources.

Second is perseverance and grit, for which they’ve recently begun using the Duckworth grit assessment available on her website. It is still early, but in an initial analysis they’ve found that strong grit scores correlate well with predicted student success. As Williams and King explain, “We are excited about the use of this tool, because perseverance in the face of challenge is so critical to learning in a project-based environment like Galloway’s.”

Third is the interpersonal: the ability to interact and collaborate effectively. To evaluate this in admission to middle and high school, each applicant participates in a group activity – usually building a tower out of various parts – while Galloway educators observe carefully and evaluate with their collaboration rubrics.

The Galloway team is confident and optimistic about their progress on this critically important work. “We are well on our way to defining which ‘soft skills’ are most important to us and in developing assessments for them. This is doable and worth doing.”

Jonathan Martin has 15 years of experience as an independent school head, most recently as Head of St. Gregory College Preparatory School (AZ). He holds degrees from Harvard University (BA, Government, cum laude); Starr King School for the Ministry (M.Div., Unitarian ministry preparation); and the University of San Francisco School of Education (MA, Private School Administration). In 2008, he was a Visiting Fellow at the Klingenstein Center at Teachers College, Columbia University. He previously headed Saklan Valley School (CA) and Maybeck High School (CA). In the first stage of his educational career, he taught History, Social Studies, and English at Maybeck, and served in a role equivalent to Dean of Students. From 2010 to 2012 he was a board member, and Program & Professional Development Chair, of the Independent School Association of the Southwest (ISAS). He was a contributor to the new National Association of Independent Schools publication A Strategic Imperative: A Guide to Becoming a School of the Future.
