GRANTED, AND… ~ THOUGHTS ON EDUCATION BY GRANT WIGGINS

MY REPLY TO WILLINGHAM, PART 2
Monday, 25 May 2015 · Posted by grantwiggins in General
In Part 1 of my reply to Willingham's article on reading comprehension strategies, published recently in the _Washington Post_, I took issue with his reasoning and analogies. In Part 2, let me get right to the evidence. I propose that he has quoted very selectively and drawn questionable conclusions from the research he does cite.
In his _Washington Post_ article, he doesn't cite research specifically; rather, he refers us to a paper he co-authored with Lovette, published in the _Teachers College Record_. Oddly, that article is almost identical to the Post article. The only differences are that he provides a paragraph of citations to support the claims he wishes to make on the duration of intervention related to the strategies, and he changes the baseball analogy to golf.

Here is the key paragraph from his _Washington Post_ article:

    Gail Lovette and I (2014) found three quantitative reviews of RCS instruction in typically developing children and five reviews of studies of at-risk children or those with reading disabilities. All eight reviews reported that RCS instruction boosted reading comprehension, but NONE reported that practice of such instruction yielded further benefit.

Here are the two related paragraphs from the _TC Record_ article; note the different, less sweeping conclusion:

    RCS instruction has a serious limitation. Its success is not due to the slow-but-steady improvement of comprehension skills, but rather to the learning of a bag of tricks. The strategies are helpful, but they are quickly learned and don't require a lot of practice. And there is actually plenty of data showing that extended practice of RCS instruction yields no benefit compared to briefer review.

    We know of eight quantitative reviews of RCS instruction, some summarizing studies of typically developing children (Fukkink & de Glopper, 1998; Rosenshine, Meister, & Chapman, 1996; Rosenshine & Meister, 1994) and some summarizing studies of at-risk children or those identified with a learning disability (Berkeley, Scruggs, & Mastropieri, 2009; Elbaum, Vaughn, Tejero Hughes, & Watson Moody, 2000; Gajria, Jitendra, Sood, & Sacks, 2007; Suggate, 2010; Talbott, Lloyd, & Tankersley, 1994); none of these reviews shows that more practice with a strategy provides an advantage. Ten sessions yield the same benefit as fifty sessions.
    The implication seems obvious: RCS instruction should be explicit and brief.
Thus, in his _Washington Post_ article he actually overstates what he and Lovette claimed in the original article. There is no justification for his claim in the _Post_ article that "All eight reviews reported that RCS instruction boosted reading comprehension, but NONE reported that practice of such instruction yielded further benefit." In fact, even the original claim is overstated, as we shall see.

WHAT THE CITED STUDIES ACTUALLY SAY

Here is what we learn when we actually go to the studies that Willingham cites to make his point:

* The Rosenshine, Meister, and Chapman study (1996) looked at only one strategy – generating questions about the text – not many reading strategies. Yet it appears that Willingham's sweeping conclusion that 10 sessions are as good as 50 for all strategy instruction was drawn from this one analysis. Here is the text from that study:
    _Length of training._ The median length of training for studies that used each type of procedural prompt is shown in Table 4. We uncovered no relationship between length of training and significance of results. The training period ranged from 4 to 25 sessions for studies with significant results, and from 8 to 50 sessions for studies with nonsignificant results.

* Here, from a second study Willingham cites (Gajria et al., 2007), are the authors' cautious comments on the amount of time spent on strategies:
    Unfortunately, the limited database does not allow us to infer the capacity of strategy use to achieve maintenance or transfer. Also, more research is needed to draw conclusions about the duration and length of treatments needed to positively affect maintenance and transfer effects. Although the database is larger for treatment intensity than for maintenance and transfer effects, we cannot make persuasive conclusions about the potential relationship between these variables.
* Willingham is clearly leaning on the research in Elbaum et al. (2000), since that study shows that duration of treatment is not as salient as we might think. (Suggate and Gajria et al. also quote from this study.) However, Willingham chose not to mention a critical distinction in that study that bears on his claim. Here is the salient section:
    Intervention intensity was examined in two ways: by _duration_, coded as the number of weeks over which the intervention was carried out, and _total instructional time_, coded as the number of hours of instruction provided to each student. Information on the duration of the intervention was available for 30 samples of students; information on total instructional time was available for 27 samples. The interventions ranged in duration from 8 to 90 weeks and in total instructional time from 8 to 150 hr. Duration of the intervention was reliably associated with the variation in effect sizes, QB(1) = 7.9; interventions lasting up to 20 weeks had a mean weighted effect size of 0.65, compared with 0.37 for those lasting longer than 20 weeks. Total instructional time, however, was not reliably associated with effect size variation, QB(1) = 0.35. We further examined the relation between intervention duration and intensity. The mean instructional time for interventions lasting up to 20 weeks was 63 hr; the mean time for interventions lasting longer than 20 weeks was 61 hr. Duration and total instructional time did not significantly covary (r = .116, ns). _This finding suggested that the same amount of instructional time, delivered more intensively, tends to have more powerful effects._

Furthermore, the Elbaum study focused exclusively on one-on-one tutoring in both phonics and strategies, not teacher instruction and student practice of strategies in class. Even so, most of the interventions were far longer than Willingham lets on. For example:

    One study that contrasted a standard Reading Recovery program with a modified Reading Recovery program (Iversen & Tunmer, 1993) reported that students in the modified program were discontinued after an average of 41.75 lessons, compared with 57.31 lessons for students in the standard program.
    The effect size for students in the modified program was comparable to that of students in the standard program, suggesting that it is possible to achieve the same outcomes in a much shorter period of time by modifying the content of instruction. This finding suggests that efficiency, or the amount of progress over time, may be a useful variable to consider in conducting future studies.

That's a far cry from "10 quick lessons"…

More disconcertingly, not once in either article does Willingham discuss the results of the most well-known and well-studied interventions using multiple strategies – PALS, POSSE, CSR, TSI, CORI – all of which show significant gains through a significant investment of time, and many of which are highlighted in the various meta-analyses.
Here, for example, is the data on PALS:

* In the study, 20 teachers implemented PALS for 15 weeks, and another 20 teachers did not. Students in the PALS classrooms demonstrated greater reading progress on all three measures of reading achievement used: words read correctly during a read-aloud, comprehension questions answered correctly, and missing words identified correctly in a cloze (maze) test. The program was effective not only for students with learning disabilities but also for students without disabilities, including low and average achievers.

Michael Pressley, author of _Reading Instruction That Works_, arguably did more direct and indirect research on reading strategies than anyone, and his work is cited in almost every review of the research. Here is what he says about duration and results:

* As far as policymakers are concerned, however, the gold standard is that an educational intervention make a difference with respect to performance on standardized tests. What was striking in these validations was that a semester to a year of transactional strategies instruction made a definitive impact on standardized tests…

In light of this set of quotes, does the following Willingham conclusion seem warranted to you?

    RCS instruction has a serious limitation. Its success is not due to the slow-but-steady improvement of comprehension skills, but rather to the learning of a bag of tricks. The strategies are helpful, but they are quickly learned and don't require a lot of practice.

"TRICKS" AND TRANSFER

Willingham is clearly having some fun referring to the strategies as "tricks," but he might have taken a page from the research he cites instead, because Rosenshine, Meister, and Chapman say this about the strategies:

    In contrast, reading comprehension, writing, and study skills are examples of _less-structured tasks_. Such a task cannot be broken down into a fixed sequence of subtasks or steps that consistently and unfailingly lead to the desired end result.
    Unlike well-structured tasks, less-structured tasks are not characterized by fixed sequences of subtasks, and one cannot develop algorithms that students can use to complete these tasks. Because less-structured tasks are generally more difficult, they have also been called _higher-level tasks_. However, it is possible to make these tasks more manageable by providing students with cognitive strategies and procedures. A cognitive strategy is a heuristic. That is, a cognitive strategy is not a direct procedure or an algorithm to be followed precisely but rather a guide that serves to support learners as they develop internal procedures that enable them to perform higher-level operations. Generating questions about material that is read is an example of a cognitive strategy. Generating questions does not lead directly, in a step-by-step manner, to comprehension. Rather, in the process of generating questions, students need to search the text and combine information, and these processes help students comprehend what they read.
Such heuristic thinking is essential to transfer; it's hardly a trick, as we know from all the research on how general ideas and schemas bridge seemingly unique experiences (cf. Chapter 3 in _How People Learn_). Yet Willingham does not mention transfer once, though it is a worry in almost every study he cites. Why worry? Because results on the experimental post-test, designed by the researchers to assess their intervention on specific strategies, are typically much higher than results on a later standardized test of reading comprehension, where no prompts or reminders about the particular intervention studied are provided – that is, where transfer is required. Here are two relevant quotes, the first from the paper by Gajria et al. cited by Willingham, and the second from Allington and McGill-Franzen in the _Handbook of Research on Reading Comprehension_, which I have quoted from before:
    Unfortunately, the limited database does not allow us to infer the capacity of strategy use to achieve maintenance or transfer. Also, more research is needed to draw conclusions about the duration and length of treatments needed to positively affect maintenance and transfer effects (Gersten et al., 2001). Although the database is larger for treatment intensity than for maintenance and transfer effects, we cannot make persuasive conclusions about the potential relationship between these variables. Furthermore, few studies helped children develop a deep understanding of complex text by effectively processing structural elements of expository text (e.g., Bakken et al., 1997; Smith & Friend, 1986) or stressed the social aspect of collaborative learning (e.g., Englert & Mariage, 1991; Klingner et al., 2004; Lederer, 2000) that Gersten et al. (2001) noted is critical to mediate learning and transfer effects.

    Improving performance is possible. However, there is less evidence that comprehension-focused interventions produce either autonomous use of comprehension strategies or longer-term improvements in comprehension proficiencies. The lack of evidence stems from the heavy reliance on smaller sample sizes and shorter-term intervention designs as well as limited attention to a gold standard of transfer of training to autonomous use.
Arguably, transfer can be achieved only through many interventions, a gradual-release model, and lots of practice with multiple strategies simultaneously over a long period of time – as the research repeatedly says and as common sense tells us. Indeed, to close with one more sports analogy: the drills do not transfer easily to the game. It basically takes a full season of scrimmages, debriefings, and lots of practice trying to apply the drills to game situations to make that transfer happen. Nor are the drills "tricks," even though they ultimately fade away in fluent, automatic performance. And that's arguably a more apt analogy for reading the research than Willingham's discussion of sports and furniture-building.
PS: I neglected in the first post to copy and paste my comments on one of the other research studies that Willingham cites: Berkeley, Scruggs, and Mastropieri (2010). Here is what they say about intervention duration:

    For criterion-referenced measures, mean weighted treatment effect sizes were highest for treatments of medium duration (more than 1 week but less than 1 month). Differences among treatments of varying length were statistically different according to a homogeneity test, _Q_(2, _N_ = 30) = 6.68, _p_ = .04. However, differences on norm-referenced tests by study duration were not statistically significant (_p_ = .83). That treatments of moderate length were associated with higher effect sizes than either shorter or longer treatments is not easily explained.

Note that only three of the studies examined took place over more than 1 month, due to the parameters of their study (a focus on remedial education for special-needs students). As we have seen, many such studies exist for regular students, with strong effect sizes. Nor do these data quite support Willingham's conclusion about the value of practice.
ABOUT THE AUTHOR

Grant Wiggins is the co-author of _Understanding by Design_ and the author of _Educative Assessment_ and numerous articles on education. He is the President of Authentic Education in Hopewell, NJ. You can read more about him and his work at the AE site.