
Standardized testing the video game: Coming to a student near you


Place an image in your mind of this: standardized testing. Depending on where you stand, your blood might boil with rage, or a sense of accountability may rise like a patriotic anthem. In either case, you most likely picture students in rows, staring at pages of multiple-choice bubbles, attempting to avoid “distractors.” The two groups developing the assessments that align with the Common Core State Standards, SMARTER Balanced and the Partnership for Assessment of Readiness for College and Careers, aim to dramatically overhaul that vision of testing.

Like all things Common Core, there are loads of questions swirling around the development of these tests. My aim in this post is to make sure you are aware of what is ahead and, more importantly, to suggest that you help your school keep an eye on what matters most — your students’ learning. It is a time ripe for jumping headlong into every widget, gadget, and clicker on the market and spending hours upon hours in computer labs, but that would miss the whole point.

First, let’s begin with that new image of standardized testing.

“Technology-enhanced items,” as the consortia refer to them, are not new. In fact, research in the field of “computer-based assessment” has been going on for years, and a few states have already dabbled in it. Picture a student you know — maybe even your own child — sitting down on the day of the test and, instead of holding paper and pencil, sitting in front of a computer, laptop, or tablet. Next, visit this website. The grid there, from the University of Oregon, displays a broad range of computer-based assessment prompts and ranks the challenge of each type. Move from simpler tasks at the top and left to more complex ones down and to the right. Click in any box, and you can interact a bit with how that type of item may work. It’s a far cry from A, B, C or D (I’m particularly intrigued by the spinning controls of 4C, “The Wall Shadow,” and realize how little science I have retained with 5D, “Protein Table”).

Other examples of computer-based assessments abound:

  • The 2009 NAEP Science assessment included a variety of “interactive computer tasks.”
  • SMARTER Balanced has released examples, though they are a bit hard to find. Scroll to “Technology Enhanced Item Supporting Materials (ZIP)” on this page to download the movie files.
  • The University of Oregon grid is referenced here in a much longer webinar on test development that is specific to the testing consortia.

The consortia describe the purposes of these technology enhancements with much fanfare and promise. To be frank, if their wishes are delivered upon, they would lead to some improvement, some slight silver lining to our current obsession with testing: namely, a more refined view of where students are beyond the equally obscure “below grade level” or “above grade level.” They highlight the ability to assess students’ use of technology, such as Writing Standard 8’s expectation that students can gather information “from print and digital” sources, as well as a faster turnaround time for reporting, as short as two weeks.

Now, to the more essential point — our students’ learning.

I ended a previous SmartBlog post with this caution: “Remember that the day of any test, students work alone. Without us. They employ not what we have ‘taught’ but what they have ‘learned.’” In regard to computer-based testing, this is even more true.

Consider one fourth-grade example found in that SMARTER Balanced ZIP file. In it, a student is asked to read part of a story that contains only description and no dialogue. The prompt states: “This is the beginning of a story written by a student who wants to add dialogue. Decide where the three pieces of dialogue should be placed. Click on them and move them into the correct order.” Then the child must do just that. Instead of simply selecting from four choices, a fourth-grader interacting with this prompt drags several sentences containing dialogue around until they believe the sentences are in the correct order. In another example, listed as eighth grade, a student is presented with a passage and then this prompt: “Joy Hakim, the author of this passage, admires Sojourner Truth. How can you tell that the above statement is true? Click on a sentence in the passage that could be used as evidence to support this statement.” Then, again, instead of selecting one of four choices, the student can click on any sentence in the entire passage to back up that claim.

Yes, having some familiarity with technology can help. However, in order to answer these questions well, time spent in a computer lab is secondary to a deep internalization of skills. To answer the first prompt well, a student needs to have gained independence with the many aspects of dialogue: what it is, how it is written, its purposes in a narrative, and how it moves a plot ahead. A student who has written dialogue, read it in a number of texts, and reflected on its uses will certainly perform better on that prompt. The same holds true for the second: a student who both writes and reads informational texts and is mindful of how details are used to develop a point of view will find this an extension of already familiar work. Learning to point and click well is only a small percentage of the rigor of those tasks.

In essence, these tests — whatever your view of them — are attempting to move away from months of mind-numbing, isolated test-preparation drills and toward supporting students as they develop skills. This means we must watch that all of our teaching leads to independence, not co-dependence. Students who read, really read — not just listen to adults talk to them about reading — and students who write, really write, will be strides ahead. As the assessments become more and more technology driven, a smart response is more and more reading and writing.

Christopher Lehman (@iChrisLehman) is an author, a speaker and a lead staff developer with the Teachers College Reading and Writing Project at Columbia University. Check out his newest book, “Energize Research Reading and Writing.”