Data-Driven Instruction: Simple, Compelling, and Wrong

As the news slowly spreads that it is not a good idea to teach reading as a set of transferable strategies and skills defined by standards and derived from benchmark assessments, a formidable barrier remains: Data-Driven Instruction (DDI).

“This practice [DDI] arose from a simple logic: To improve student outcomes, teachers should study students’ prior test performance, learn what students struggle with, and then adjust the curriculum or offer students remediation where necessary.” – “Does Studying Student Data Really Raise Test Scores?”

What started as “simple logic” has since mutated into coercing teachers into teaching reading in a Frankenstein-esque manner with short passages and repetitive drilling of standards-based questions. We’re more focused on practicing question stems than on learning content.

Robert Slavin writes how “benchmark assessments fall into the enormous category of educational solutions that are simple, compelling, and wrong.” Let’s explore how DDI distorts and fragments the very nature of teaching and how we might get things back in order.

Literally Teaching To The Test

In my second year of teaching, I attended a Driven by Data professional development training led by Paul Bambrick-Santoyo. In the workshop, I was taught that curriculum planning and pacing were to start with, and obsessively revolve around, high-stakes assessments and academic standards. Pacing guides and curricula were to be organized by standards, not content.

While this model of planning and assessment might (?) work well with mathematics standards, which are generally more closely tied to content, DDI is a nightmare when applied to a subject like ELA with its litany of reading comprehension strategies and skills.

Timothy Shanahan writes how “reading comprehension questions are a horse of a different color. There is no reason to think that practicing answering particular types of comprehension questions would improve test performance.”

And yet this Driven by Data training was clearly instructing school leaders and teachers like myself to spend weeks and months teaching and assessing to reading comprehension questions derived from standards like:

“Determine a theme or central idea of a text and how it is conveyed through particular details.” – RL 6.2 

What did this look like? In weekly PLCs, each reading standard taught was to be “unpacked” to list what students should “show” and “know” to be successful on an exemplar assessment item. Then, students would practice that standard typically using IXL or TpT worksheets.


After a few weeks, teachers would set quizzes and comprehensive assessments on the standards taught, then spend hours grading and filling out spreadsheets (data trackers) to provide question-level analysis. Using the spreadsheet, we would identify which standards students were not proficient in, and then devise a “re-teach plan.” But these re-teach plans often just included additional practice with the same multiple-choice questions attached to short passages – maybe in a smaller group. Nothing fundamentally changed in the instruction.
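To make concrete what these data trackers amounted to, here is a minimal sketch (in Python, with invented student names, standards, and a made-up 70% proficiency cutoff) of the kind of question-level analysis the spreadsheets produced: tally each student’s results by standard, flag anyone below the cutoff, and queue them for the “re-teach plan.”

```python
# Illustrative sketch of a DDI "data tracker": question-level analysis
# rolled up by standard. All names, standards, and the cutoff are
# invented for illustration, not taken from any real tracker.
from collections import defaultdict

# Each record: (student, standard assessed, answered correctly?)
quiz_results = [
    ("Ava",  "RL.6.2", True),  ("Ava",  "RL.6.2", False),
    ("Ava",  "RL.6.4", True),
    ("Ben",  "RL.6.2", False), ("Ben",  "RL.6.2", False),
    ("Ben",  "RL.6.4", True),
    ("Cara", "RL.6.2", True),  ("Cara", "RL.6.2", True),
    ("Cara", "RL.6.4", False),
]

PROFICIENCY_CUTOFF = 0.70  # arbitrary threshold, as in many trackers

def reteach_plan(results, cutoff=PROFICIENCY_CUTOFF):
    """Return {standard: [students below cutoff]} -- the 're-teach' list."""
    tally = defaultdict(lambda: [0, 0])  # (student, standard) -> [correct, total]
    for student, standard, correct in results:
        tally[(student, standard)][0] += int(correct)
        tally[(student, standard)][1] += 1
    plan = defaultdict(list)
    for (student, standard), (right, total) in sorted(tally.items()):
        if right / total < cutoff:
            plan[standard].append(student)
    return dict(plan)

print(reteach_plan(quiz_results))
# → {'RL.6.2': ['Ava', 'Ben'], 'RL.6.4': ['Cara']}
```

Note what the output does and does not tell you: it names who missed which standard, but says nothing about why they missed it, which is exactly the ambiguity discussed below.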

Crucially, the DDI model did not allow teachers to adjust their approach to teaching reading outside the narrow confines of academic standards and reading comprehension questions; instead, students were just given double or triple doses of what was already not working.

Going Mad

How do we rid ourselves of this DDI madness? How do we dissuade school leaders from building “data rooms” and poring over benchmark assessments displaying how well students can find the main idea? We can start by recognizing three key flaws:

  • ELA standards give no clear signal to follow. When assessment is centered around students’ ability or inability to “find the main idea,” the information we get is not at all clear or helpful in determining what a student actually struggles with. Was it a lack of background knowledge and general language comprehension? Or confusion around the question? Or lagging decoding skills? Teachers are left to guess at what students need additional instruction in, which makes the re-teaching plans even more futile.
  • DDI fragments teaching and learning. The curricula and pacing guides developed for DDI result in choppy, atomized lessons and units based on standards instead of content knowledge. It’s nearly impossible to build a cohesive curricular narrative when all you see are standards. DDI somehow manages to destroy the magic of teaching, reading, and learning while also being wholly ineffective.
  • DDI is an energy sinkhole. A belief in DDI creates a false sense of control over student learning and teacher accountability; that control is a complete mirage. DDI squanders precious teacher time and energy that could be better spent intellectually preparing lessons or incorporating improved instructional resources and techniques. Alex Quigley writes how “for every initiative we add, what are we taking away? Our success, and the efforts and energy levels of our teachers, may well depend on it.”

Efficient and Manageable Assessment

While I have spent most of this piece articulating how DDI has mutated and resulted in all sorts of unforeseen and troublesome consequences, I’d like to end with what might be an ideal model of assessment for a subject like ELA or social studies.

Graham Nuthall writes in The Hidden Lives of Learners how “if you want to evaluate student learning and use that evaluation to improve your teaching, then you need to realise that nothing less than knowledge of how the students’ beliefs and understandings have actually changed will serve these purposes” (p. 52).

Mike Schmoker notes how “the effective use of data depends on simplicity and economy…data analysis shouldn’t result in overload and fragmentation.”

How then can we ensure that assessment and data collection captures how students’ understandings have changed while also being manageable and useful for future instruction? Here are four steps that I think get us close.

First, identify the purpose of the assessment. How will the assessment be used, and how will it inform future instruction? Is baseline reading ability (phonological awareness, letter sounds, oral reading, etc.) being screened, or content knowledge and language comprehension? Don’t waste your time or energy trying to assess a “Reading Level.” Daisy Christodoulou states how “clarity of purpose is vital for great assessment… until you are clear about exactly what your different purposes are, you won’t be able to use the right assessments.”

Second, clearly identify key concepts and content knowledge to be taught. If you’re teaching for comprehension, know that building background knowledge is a best bet. Knowledge organizers and well-sequenced curricula greatly help with developing this foundation. It also helps me to think of these key concepts and ideas as threads of knowledge to be woven together into a quilt of ever-greater understanding.

Third, measure change in learning. This can and should be done in a few different ways. Before a unit, set an open-ended writing prompt (write everything you know about ________) with a list of the key concepts (Tier 3 vocabulary) that will be taught. Then, after a few weeks of instruction, give that same prompt again. In addition, periodically give small, low-stakes multiple-choice quizzes (technology makes this very manageable and efficient) on key concepts and vocabulary to tap into the power of retrieval practice and the testing effect. Use a mixture of factual and conceptual questions, and gradually increase the challenge as students demonstrate success.
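As a rough illustration of the before-and-after prompt idea, one simple thing technology can do is check which of a unit’s key concepts actually show up in a student’s pre- and post-unit writing. This is a crude proxy (real use would need stemming and synonyms), and the concepts and sample responses below are invented:

```python
# Sketch: compare pre- and post-unit writing samples against the
# unit's key concepts (Tier 3 vocabulary). Concepts and responses
# are invented for illustration only.
KEY_CONCEPTS = {"photosynthesis", "chlorophyll", "glucose", "stomata"}

def concepts_used(response, concepts=KEY_CONCEPTS):
    """Return the set of key concepts mentioned in a writing sample."""
    words = {w.strip(".,;:!?").lower() for w in response.split()}
    return concepts & words

pre  = "Plants need sun and water to grow."
post = "In photosynthesis, chlorophyll absorbs light and the plant makes glucose."

before, after = concepts_used(pre), concepts_used(post)
print(sorted(after - before))  # concepts newly in use after the unit
# → ['chlorophyll', 'glucose', 'photosynthesis']
```

The output is a picture of change, not a proficiency score: it shows which ideas have entered a student’s working vocabulary since the unit began.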

Lastly, after looking over students’ quiz results and writing, spotlight and correct misconceptions using whole-class feedback. Do not spend hours marking every single error; instead, keep an eye out for common errors and give timely feedback that nearly all students can benefit from.
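The triage step above can also be sketched in code. Rather than marking every paper, surface the most common wrong answer on each quiz question and address it with the whole class. Question IDs, answer keys, and responses here are all invented:

```python
# Sketch of triaging quiz responses for whole-class feedback: find
# the most common wrong answer per question. All data is invented.
from collections import Counter

answer_key = {"Q1": "B", "Q2": "D"}
responses = {
    "Ava":  {"Q1": "B", "Q2": "C"},
    "Ben":  {"Q1": "A", "Q2": "C"},
    "Cara": {"Q1": "B", "Q2": "D"},
    "Dev":  {"Q1": "A", "Q2": "C"},
}

def common_errors(key, responses):
    """For each question, return (most common wrong answer, count) or None."""
    out = {}
    for q, correct in key.items():
        wrong = Counter(r[q] for r in responses.values() if r[q] != correct)
        out[q] = wrong.most_common(1)[0] if wrong else None
    return out

for q, err in common_errors(answer_key, responses).items():
    if err:
        print(f"{q}: {err[1]} students chose {err[0]} (correct: {answer_key[q]})")
# → Q1: 2 students chose A (correct: B)
# → Q2: 3 students chose C (correct: D)
```

A clustered wrong answer like Q2’s is a signal worth a few minutes of whole-class discussion; a one-off error usually is not.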

Put Simply

  • Data-Driven Instruction and the obsession with benchmark assessments based on academic standards fragment and distort curriculum and instruction.
  • Ideally, assessment should be simple, manageable, and demonstrate how students’ understanding has changed.
    1. Identify the purpose
    2. Clearly identify key concepts and content knowledge to be taught
    3. Measure change in learning
    4. Spotlight and correct misconceptions


4 thoughts on “Data-Driven Instruction: Simple, Compelling, and Wrong”

  1. The resource you linked next to “Don’t waste your time or energy trying to assess a ‘Reading Level’” is not working. I am curious about the source of this statement. Can you direct me to the right place?


  2. While this model of planning and assessment might (?) work well with mathematics standards.

    I’d be massively surprised if it did.

    If a student keeps getting something wrong, there is a reason. Relentlessly targeting the particular skill you want them to learn won’t find the underlying misconceptions, and will quickly become very boring. Your “Put Simply” is perfect for Maths as well.

    Maths teachers do tend to teach to the test as exams come up, and I do it too, but they are idiots if they teach that way through the year. It leaves deeper misunderstanding uncorrected, fails to build links between concepts and is dull for the students.

    Even on its own terms DDI is useless, as it builds no resilience if the exam questions differ slightly from what was drilled. I’ve seen this a number of times where poor teachers have been far too focused on recent exams at the expense of teaching the subject broadly, and a small change in question types has left their students without the tools to cope. Sadly, I have even seen teachers complain that a question type changed slightly and that this was “unfair”.

