Testing for Learning Proficiency

Jonathan E. Martin
September 30, 2014

As we continue to see here on the Think Tank blog, there is an abundance of things we should consider assessing and measuring in our schools. One we haven’t previously addressed strikes this writer as having especially strong merit: learning ability. Rather than measuring what students already know, or their underlying intelligence (which has its importance, to be sure, but which we recognize is sometimes limited in its classroom manifestation), we could be measuring learning proficiency.

A recent book from MIT Press entitled Measuring What Matters Most: Choice-Based Assessments for the Digital Age, by Schwartz and Arena, explains that “most current assessments [are] sequestered problem solving (SPS). In the typical SPS assessment, students are sequestered, like a jury, from learning opportunities and outside resources that might contaminate the validity of the assessment. Learning during a test would be cheating.” SPS measures what students already know, but not what they can learn, because no such opportunity is provided.

On the game show “Are You Smarter than a 5th Grader?”, for example, the fifth graders can often answer questions from their grade school curriculum better than some of the adult contestants. But that doesn’t mean that, when presented with a complex and challenging problem requiring research, analysis, and application over an extended time, a company would prefer to employ the ten-year-old TV show winners rather than an adult experienced in the workforce.

Instead, drawing on previous scholarly work by Feuerstein, Bransford, and Schwartz, the authors argue for “dynamic assessment” in a format termed “preparation for future learning” (PFL). “In a PFL assessment, there are resources for learning during the test, and the question is whether students learn from them.” To the authors and this writer, PFL offers powerful additional insight when we are seeking to understand our students’ and our applicants’ proficiencies. In a fast-changing world where it is difficult or impossible to predict what kinds of challenges await our students, instilling in them the ability to be effective learners and skillful users of new information to address novel challenges is perhaps the most important thing we can do for them.

Arena and Schwartz report on a research study in which two groups of students were taught the same content through different approaches: one via “direct” instruction, the other via “guided discovery/inventing.” A test followed, designed in two versions, both of which included a challenge problem demanding knowledge beyond what the students had studied to date. Half the children in both instructional groups received a version of the test without any resources for learning during the test, and they did about the same on the challenge problem (30% correct for the “invent” group, 32% for the “direct” group).

The other half received a version that included a “worked example” showing students how to solve a new type of problem; embedded in it, in a non-obvious way, was information key to addressing the challenge problem. On this version, again only 32% of the “direct” instruction group solved the challenge problem, but 61% of the “invent” instruction group did. The invent group had developed a much stronger PFL ability thanks to their different, and demonstrably better, instruction.

The implications seem clear: educators should review the balance of their current assessment strategies to see whether they have the desired mix of static and dynamic assessments, and they should explore whether (and how) they might improve their assessments to better employ PFL strategies. This is especially true for those seeking to shift their instruction toward inquiry-driven, project- and problem-based learning. A wide variety of methods could serve: more open book, open notes, even open computer/internet testing, as well as more frequent use of “pre-testing,” in which students are given sample end-of-unit tests before unit instruction begins and are then coached to use classroom and informational resources to build the understanding necessary to address the kinds of problems they saw on the pre-test.

Assessment for admission might follow suit: How can we better balance static and dynamic assessment? How can we better evaluate whether our students have the skills and dispositions to be powerful learners in our school?

Jonathan Martin has 15 years of experience as an independent school head, most recently as Head of St. Gregory College Preparatory School (AZ). He holds degrees from Harvard University (BA, Government, cum laude); Starr King School for the Ministry (M.Div., Unitarian ministry preparation); and the University of San Francisco School of Education (MA, Private School Administration). In 2008, he was a Visiting Fellow at the Klingenstein Center at Teachers College, Columbia University. He previously headed Saklan Valley School (CA) and Maybeck High School (CA). In the first stage of his educational career, he taught History, Social Studies, and English at Maybeck and served in a role equivalent to Dean of Students. From 2010 to 2012 he was a board member, and Program & Professional Development Chair, of the Independent School Association of the Southwest (ISAS). He was a contributor to the new National Association of Independent Schools publication A Strategic Imperative: A Guide to Becoming a School of the Future.
