6 The Ontario Secondary School Literacy Test
Creative Higher-Order Thinking?
Jan Sobocan
The Ontario Secondary School Literacy Test (OSSLT) is designed to “prepare students with the knowledge and higher-order thinking skills they will need to solve increasingly complex problems and make decisions in a richly diverse, information-driven society” (Ministry of Education 2003a, 6 [emphasis added]). The test seems to substitute the concept “literacy” for the less fashionable 1980s phrase “critical thinking,” at least to the extent that the “critical” in “critical thinking” represents “higher-order” thinking. In the Ontario Curriculum, whose professed goal is teaching students a skill set that will enable them to solve “increasingly complex problems” and “make decisions,” the critical thinking that is implied includes creative thinking.
It is in view of these considerations that I set out to answer a number of related questions: What is the relationship between literacy and critical thinking? What is the relationship between critical thinking and creative thinking? And can an instrument like the OSSLT solicit and validly test for higher-order thinking, in particular, creative critical thinking?
Literacy and Creative Critical Thinking
In the Ontario Curriculum, “literacy” is defined as “the skills and knowledge in reading, writing, speaking, listening, representing, and viewing that empower learners to make meaningful connections between what they know and what they need to know” (Ministry of Education 2003a, 6). But when one reviews the OSSLT, the 2003 Literacy Report, and its companion document Think Literacy Success: Cross-Curricular Approaches Grades 7-12 (Ministry of Education 2003b), it is clear that literacy involves more than making connections. In particular, it comprises the skills students need to acquire and to critically assess the information given to them: skills that enable them to make good inferences and judgments (70). Such judgments require that students “assess different viewpoints and perspectives… and thin[k] critically about important concepts, issues and ideas” (74). Literacy is thus interpreted as the ability to read and assess various types of texts critically in order to make informed judgments about what to believe; to make better decisions at home, work, and school; and to speak and write persuasively (70).
In the supplementary documents provided for Ontario teachers and principals, instructors of all subjects are encouraged to foster higher levels of literacy in students in a variety of ways. They include ways of helping students learn how to critically digest various media, to review and reflect on information from a variety of disciplines in order to generate questions, and among other skills, to communicate opinions clearly (Ministry of Education 2003a, 7). Reviewing is deciding what the most important information is, and using this information to make reasonable inferences or to develop a persuasive piece of writing, or both. “Reflecting” implies developing key questions and generating questions for reflection (Ministry of Education 2003b, 12, 70), constructing arguments (giving relevant reasons for opinions), and considering alternative points of view or assessing various perspectives (41, 74). More generally, instructors are to help students understand the importance of — and how to ask — key questions when making judgments. Such questioning is said to help students learn how to “process information… to assess the importance and relevance of the information, and apply it in a new context” (74).
Many of these thinking activities are familiar to those taking reasoning skills courses at North American universities, as is evident in the critical thinking textbooks used to teach and test students in these courses (Gratton 2001). Reflection is often taken to be the heart of critical thinking, and the connection between it and literacy skills is made readily apparent by Ennis (1996), who provides an approach to critical thinking that focuses on reading and writing, and on interpretation and evaluation skills. The latter skills include identifying main ideas and issues, asking key questions, and constructing arguments (which implies the ability to provide relevant reasons, evaluate context, self-evaluate, and so on). His textbook content, like the content in the Think Literacy documents, provides examples and practice exercises that aim to teach students how to “make reasonable decisions about what to believe or do” (xvii). The skills generally regarded as the critical aspects of a thinking process are the skills involved in argumentation: the construction, interpretation, and evaluation of arguments and information as well as situations or contexts (Sobocan 2003). Many of these skills are also used in creative thinking. The ones I will discuss are detecting bias and hidden assumptions, considering alternative points of view, imagining authors’ intentions and intended audiences, and making inferences.
Most contributors to this volume would likely agree that considering alternative points of view, ferreting out assumptions, and making good inferences are a few of the essential elements of a step-by-step thinking process used in the critical evaluation of arguments and information.[1] Such elements are incorporated in my fellow authors’ working definition of critical thinking as “skilled, active analysis and evaluation, done with a strong emphasis on the identification and due consideration of alternative interpretations and points of view.”[2] The component of this description that is most relevant to literacy education, and most obviously concerned with creative thinking, is the “consideration of alternative interpretations and points of view.”
In the reading and writing sections of the 2007 literacy test, respectively, students are asked to consider “all Canadians” in deciding whether it is good to have honorary citizens, and whether “every student should be required to take a Physical Education class every year of high school” (EQAO Educator Resources OSSLT 2007). Such judgments require consideration of a wide range of views when one considers the diverse citizenry of Canada, or a population of high school students (also relatively diverse). For many critical thinking theorists, considering alternative views is part of a judgment of the quality of an argument, but for educators it remains to be seen whether considering a range of views is to be evaluated as critical thinking or as creative inquiry.
The significance of alternative points of view in creative critical thinking is evident in the role they play in the making of inferences or the development of arguments. As suggested in Think Literacy Success (Ministry of Education 2003a, 70-4) and in Ennis (1996, 365), persons who draw conclusions are thinking critically only when they have searched out and considered points of view other than their own. Hare (this volume) makes precisely this point when he holds that critical thinkers must be able to imagine different perspectives or a variety of communication styles.
In many significant cases, critical thinkers must be able to imagine alternatives to the views presented to them because something that is presented as a fact is questionable; or because only one view of a controversial issue or ambiguous situation is communicated. This alternative viewing (which is implied whenever one detects bias) is an especially important aspect of the careful reading of a text, and an aspect of literacy that is described in curriculum documents as reading “between” or “beyond” the lines (Ministry of Education 2003b, 14).
In its account of literacy, the Ontario Curriculum states that higher-order thinking incorporates the way in which we process what we read, verbal reasoning, and written communication skills (ibid., 26). In the documentation on literacy, one can discern three broad categories of critical thinking that imply creative thinking skills: generating questions and ideas; developing opinions or constructing arguments; and visualizing and understanding unseen text (more on this below). In all three cases, the ability to creatively imagine and consider alternative points of view is a core element of literacy. I will therefore emphasize this core ability in turning specifically to the OSSLT (2003), and considering the extent to which the OSSLT does (or could) test for creative aspects of critical thought.
The Ontario Secondary School Literacy Test
In 1998 the Ontario government publicly announced the OSSLT as a diploma requirement that would be administered by the province’s Education Quality and Accountability Office (EQAO). This particular performance or “achievement” test was first said to measure basic reading and writing skills to identify at-risk students. The test was implemented during the 2001-02 school year, when it was administered to Grade 10 students who had enrolled in Grade 9 in September 2000. The test takes five hours over two days, and is divided into reading and writing sections.
The OSSLT is described as a “useful quality assurance measure that shows the extent to which Ontario students are meeting a common, basic standard for literacy across the province” (EQAO 2001-02 Report of Provincial Results, 1). I believe the test attempts to measure much more than basic or minimum competency. This claim is supported, first, by the government’s definition of literacy, which extends the understanding and importance of literacy to a “notion of literacy as freedom” (Ministry of Education 2003a, 7). Second, as the reading and writing questions above indicate, literacy encompasses both basic reading and writing and higher-order thinking (as in taking and defending a controversial position on what it means to be a citizen). Given the latter consideration, much more than a basic disciplinary understanding is clearly required. Before considering further what level of thinking is tested, and whether the test validly measures what it intends to measure or tests creative higher-order thinking, I will briefly discuss some of the social and political consequences intended by the EQAO with respect to the OSSLT.
The EQAO has stated that the results of the OSSLT can be used, provided they are used efficiently and ethically, to give at-risk students the remedial support they need to graduate (Lipman 2004, 168). Those students who cannot perform the foundational reading and writing tasks necessary for learning are to be provided with the additional help they need to meet the standards set by the test. Students can achieve these standards by passing an additional test or by completing a literacy course. There are no OSSLT performance-based financial incentives for schools, and although students could take the test up to three times, the OSSLT was nevertheless often identified as a “high-stakes” standardized selection test (Murphy 2001, 146).
Although the stakes in Canada are not as high as in the United States (as Giancarlo-Gittens discusses in this volume), in a climate where the validity and usefulness of test results are consistently questioned (Gorrie 2004, for example) it is not surprising that the OSSLT has been generally criticized as a waste of precious education monies. More specifically, the test is said to be unfair to those whose native language is not English and to those with a lower socio-economic status. A number of commentators have said that it employs inconsistent grading criteria, implying invalid diagnoses of levels of literacy (Lipman 2004; Ricci 2004). Others have claimed that it compromises teacher autonomy (Runte 1998), that it creates undue stress for students, and that the money it costs would be better spent on books. These criticisms, together with the prevalent (and unauthorized) use of the test results to rank schools and the persistent question of why the government spends $15 million annually on minimum competency testing, have convinced many that the test is not worthwhile. I take a different view.
I disagree with the widespread sentiment that there is nothing redeemable in the OSSLT or in accountability programs in general. In particular, I contend that the generalization that the test does not serve to diagnose and improve student learning is a hasty one. Instead of rejecting the OSSLT entirely, and lamenting neo-conservative agendas and testing “regimes” in general, I believe it is more productive to try to improve the tests themselves. By doing so, perhaps accountability initiatives can move closer to helping teachers, schools, and boards not only diagnose at-risk students but achieve higher educational standards. In keeping with this, I will attempt to show that the 2003 version of the Ontario Secondary School Literacy Test does have significant potential, particularly for the testing of higher-order critical and creative thinking abilities.
In the discussion that follows, I examine some of the types of questions that the 2003 OSSLT uses to solicit critical thinking processes that include creative elements. This analysis will provide the basis for two general conclusions: first, a conclusion about the extent to which the test solicits creative higher-order thinking, and second, a conclusion about the extent to which standardized instruments of this sort might validly test such skills.[3]
The OSSLT: Critical Thinking?
I asserted earlier that a core element of creative critical thinking is the ability to imagine, analyze, interpret, and evaluate alternative points of view. In examining the OSSLT, I will argue that it provides some significant opportunities for testing these abilities, though I am more interested in the potential for such testing than in the details of the OSSLT. The OSSLT is much too long to be systematically studied here, especially because its answer keys and rubrics are not publicly available. Rather than attempt a systematic study of the test, therefore, I will consider particular aspects of it that can illustrate its potential (and sometimes its failure) in testing and promoting creative critical thinking.
The OSSLT (2003) comprises a series of multiple-choice and short-answer questions in the reading section. The writing section asks students to write a summary, a three-paragraph opinion piece, a news-style report, and an information paragraph (Ricci 2004, 79). In his contribution to this volume, Groarke criticizes the use of multiple-choice questions in the California Critical Thinking Skills Test. Murphy (2001) raises similar concerns about multiple-choice questions. Such questions are problematic in a critical thinking test because they do not ask students to demonstrate the reasoning behind their answers, though it is this reasoning (not their answers) that most determines whether they are engaged in critical thinking. In many cases, critical thinkers may reasonably defend different answers to multiple-choice questions, especially when there is room for “reading between” or “reading beyond” the sentences on the page. What matters is the evidence they adduce for reading something in a particular way, not the reading itself.
As in the California Critical Thinking Skills Test (see Groarke, this volume), some of the multiple-choice items in the reading section of the OSSLT do not allow for “between” or “beyond” (critical) readings of the text. In one question, students are asked to select the best meaning of the word “swear” in a paragraph with which they are provided. In the paragraph a witness is asked whether she still believes, after discovering that the defendant has an identical twin, that she saw the defendant involved in a crime. The sentence begins “Can you still swear that the man you saw…” and test-takers are asked to define “swear” as (a) “trust”; (b) “curse”; (c) “think”; or (d) “claim.” But one could reasonably argue that “trust,” “think,” and “claim” are all interchangeable with the term “swear” in this case — perhaps in any case. In a test question of this sort, one cannot reasonably discern whether the student is thinking critically, particularly in “comprehending subtle meanings in texts” (Ministry of Education 2003b, 40). Such questions are not higher-order thinking items, because the test-taker is not given the opportunity to explain how he or she may be reading beyond the text. A grader, then, could only guess at the test-taker’s reasoning. One can easily see how the question could go beyond minimum competency to “critical” thinking if the test-taker were asked to explain the differences in meaning.
Other questions in the OSSLT better measure the critical and creative skills that are an integral part of literacy. Parts of the reading section of the test ask students to “provide various interpretations of the situations described in each statement” (ibid., 41). Consider, for example, questions about the situation described in the courtroom scenario discussed above. Defence counsel is trying to discredit the testimony of a woman who claims to have seen a defendant drop a murder weapon in the dead of night. Though the witness has never worn glasses, and saw the man under a streetlight, the defence lawyer has pointedly argued that the man she saw could be the defendant’s twin brother (the defence points out the twin, who is dressed exactly like the defendant). One of the questions on the OSSLT is whether or not the witness is “believable.” Students are asked to state “why or why not.”
This is a question with creative potential because one may imagine reasons for alternative positions or answers to the question of whether this witness is reliable. She could be said to be believable because she has good eyesight; because the accused was directly under a streetlight; and most importantly, because there was no motive established for the twin. Yet another answer might be that it seems ludicrous that the twin would incriminate himself by appearing in the courtroom. An alternative view is that the witness is not believable: the similarity of the twins must raise a reasonable doubt in the minds of the jurors (even if there has been a clever collaboration). This range of possible answers does not exhaust the possible reasons for believing or not believing the witness. Reasons why she is or is not believable will be good grounds for creative critical thinking as long as they consider “various interpretations of the situation described in the passage.”
Though this illustrates one way in which a test question can solicit creative critical thinking, one might criticize this question on the grounds that a student is given only three lines on which to write his or her answer. Such limited space inhibits a creative answer. To the extent that it is desirable to have questions that promote critical thinking, it would be best to have additional space for students to explicitly state reasons why the witness might or might not be considered reliable and to choose from among them. In the current test, there is simply not enough space for students to be able to illustrate the requisite creative critical thinking — or even to construct a convincing argument that would support their beliefs or chosen answers. While the question allows for a range of answers or alternative views, the test structure itself does not ensure that students engage in the creative thinking that this question evokes.
It is not difficult to find other examples that illustrate the unrealized potential for creative critical thinking in the OSSLT. Consider a group of questions in the test’s reading section about a public notice on water conservation entitled “Be Water Wise.” The notice organizes information about water conservation according to various environments: home, farm, and along rivers. In relation to the presentation and classification of information, one test question asks why the title “Be Water Wise” is a good title, and provides one line for an answer. Another question asks test-takers to explain why it was a good idea to use boxes to frame the information. These two types of questions attempt to measure critical thinking by asking students to give their reasoning for their answers, and provide what could have been an opportunity for creative thinking. Yet the questions are formatted and asked in ways that limit creative critical thinking.
The above questions limit the development of an opinion (and in this way creative thinking) and the expression of alternative views. By assuming that both the title and organization of the piece in the question are “good,” the questions leave no room for the creative thought that possibly this is not so — that there might be a better way to organize the text, for example. Similarly, the questions’ design detracts from one of the tenets of good critical thinking: a position on qualitative matters such as these is never unequivocally true, and any judgment or use of information, for that matter, should be evaluated in terms of both its strengths and weaknesses. One might, for example, argue that the title “Be Water Wise” is good in one way — it is catchy because of the alliteration. But one could also imagine someone arguing that the title falls short of the mark. The public notice was published during a period of drought, for example, and one might point out that in these circumstances the title should convey the necessity of water conservation and the seriousness of the situation much more directly. Perhaps a title like “Don’t Be a Water Waster!” would work better, given the persuasiveness of its rhetoric for an audience that tends not to read past the headlines of news articles and government-issued brochures. Once again, an interest in creative critical thinking could best be promoted by a question format that asks students to consider alternative points of view.
I have argued that most of the reading questions that I have briefly analyzed attempt to solicit creative critical thinking, but that ultimately the question format and wording prevent a valid test of it. To the extent that higher-order literacy requires that students “develop greater awareness that texts can be understood on more than one level” (Ministry of Education 2003a, 40), the reading questions on the OSSLT could do more to require students to imagine and analyze different points of view.
What about the writing portion of the test? The Ontario Curriculum documents suggest that good literacy teaching will place an emphasis on verbal reasoning, written communication skills, and strategies for writing in a variety of forms (ibid., 9). Do the types of questions in the writing section adequately test such abilities, and if so, which ones? And do they pose questions that measure creative critical thinking? It is difficult to answer such questions in detail without the rubrics used to judge answers to the test questions, but one can assess, to some extent, the content validity of the writing segment by considering how well the structure of the questions promotes a potential for creative answers.
The Think Literacy Success curriculum documents suggest that a key part of literacy is, in a writing context, “visualization” of “unseen text,” “unseen text” being “the information that resides in the reader’s head: ideas, opinions, essential background knowledge” (Ministry of Education 2003b, 56). This sort of visualization involves the consideration of views other than those that are literally presented, and is indeed creative, because the missing text and the corresponding point of view must be “imagined.” Visualization can also be considered an aspect of critical thinking, because in order to imagine what is missing, students must generate key questions and arguments that would lead them to reasonable conclusions (about the author’s intentions, about the logical structure of the text, about how other readers might interpret the passage, and so on).
Even in the reading section of the OSSLT, the type of creativity I describe is tested in questions that ask students to imagine the intended audience for a paragraph: an imagining that is the first step in visualization. Still more significantly, the writing section of the OSSLT includes questions that solicit opinion paragraphs, asking students to write a short argument on a topic with a specific audience in mind. Test-takers are instructed to support their ideas with evidence in the way of proof, facts, examples, and so on. In this process, students taking the test must imagine how the audience in question will be persuaded. This imagining is creative in the sense that students must “develop content and opinions for persuasive writing” (ibid., 70 [emphasis added]). The “stepping inside the shoes of another” that this requires is a type of role-playing, a paradigm of creative activity that requires higher-order thinking, particularly when students must include in their answers the purpose for arguing one point or another, or for choosing a particular style of communication. In role-playing, imagining, and assessing various purposes and audiences, choosing requires both imagining (creative thinking) and seeking out good reasons (critical thinking). However obvious the point, in these interrelated choices the creative and critical elements of thinking are inseparable parts of the thinking process that informs good choices, and therefore good answers to the questions asked.
More generally, the opinion writing required by the OSSLT illustrates the key features of questions that elicit and test creative critical thinking. First, such questions ask a test-taker to construct (rather than simply criticize) an argument. Unlike the questions in the reading section of the test, the writing section gives test-takers more space to reflect, review, and generate questions. In keeping with the literacy documents’ claim that literacy necessarily includes the analysis of text, the drawing of conclusions, and the assessment of different points of view (Ministry of Education 2003a, 40, 70), the writing section asks students to demonstrate such skills. In a question that asks whether Canada should “join” the United States, for example, students have the room to generate a question about, say, the meaning of “join” and can proceed to argue from that standpoint. If the scoring criteria are flexible, a student who exercises higher-order thinking might even entertain the possibility of answers based on different political orientations, or two or three ways in which Canada could “join” the United States.
The writing questions on the OSSLT, therefore, might be improved by including fewer questions that ask students to identify the main point of a particular paragraph or story. This sort of question inhibits creativity: students can choose only one answer, an answer that surely must be keyed as the only “passing” or “correct” answer.
In soliciting and testing creativity, it is better to have students independently explore alternative points and conclusions, thinking that would involve higher-order activities like ferreting out an implicit premise or conclusion (or in Ministry of Education terminology, a “hidden” or “unseen” premise or conclusion).
Creation and Evaluation: At Odds?
In attempting to address whether the OSSLT has the potential to (or does) measure creative critical thinking, I have tried to offer some practical insights that may be considered in the design of future tests (particularly literacy tests). A quick look at the political and validity aspects of the OSSLT, alongside its associated documents, shows that the test is well intentioned and has the potential to accurately measure more than minimum competency critical thinking or literacy (and thus the potential for government money to be better spent). Still, an important question remains: How can the design, validity, and political consequences of such tests be improved to allow for answers that can be claimed to be creative, critical thinking?
Received wisdom suggests that creativity cannot be measured in multiple-choice formats, and that it is not accurately measured by standardized formats (Ricci 2004, 80; Ryan 2004). I have agreed with some of the criticisms of multiple-choice questions. Multiple-choice questioning is problematic because it denies test-takers the opportunity to provide evidence of their own thinking, in particular the reasoning behind their choices of answers. And this is what matters when judging whether students are engaged in critical creative thinking.
It does not follow, however, that it is impossible to test for critical creativity in any formatted way. On the contrary, I hope that I have shown that standardized formats that ask for written answers (and possibly even multiple-choice answers supplemented with written answers) can do more than test for minimum competency in critical and creative thinking. I believe such question formats could lead to stronger inter-rater reliability and thus have the potential to lead to the achievement of higher educational standards in Canadian classrooms.[4] I have argued that many of the kinds of questions already contained in the OSSLT show how we might test not only critical but also creative thinking outside of portfolio or authentic assessment formats — formats that are prohibitively expensive and impractical to administer in a system of accountability.
If it is true that “[a]ccording to research, students who lack literacy strategies and skills need the… [a]ctivities that involve higher-level thinking, reasoning, and communication” (Ministry of Education 2003a, 8), then we would do well to construct test questions that ask students to imagine and consider alternative points of view, to develop opinions, to visualize, and so on. In constructing a test instrument that validly measures creative critical thinking, three rules of thumb should be followed. First, multiple-choice items should be avoided; if they are used, they should be combined with short-answer questions that provide sufficient space for students to justify their choice of answers, and that require them to do so. Second, questions should be constructed in a way that widens the range of answers that test-takers can give in response to a question. Among other things, this means that questions should not prejudice the issue with an explicit value judgment that some claim, remark, or discourse is good. Third, writing questions should be designed in a manner that pushes students beyond pre-set answers, toward the consideration of alternative points of view (and, ideally, beyond the typically polarized “why or why not” choice of answers).
This leaves open questions of grading in such contexts, and raises one last major concern about content validity. I have not looked at the scoring criteria of the OSSLT because I do not have access to them or to the specific grading procedures. One cannot make concrete suggestions about how to improve the test without access to the scoring key (if there is one for a pass/fail evaluation format). More generally, one might blame a failure to analyze, understand, and improve the test on the lack of transparency with respect to grading criteria and how grading team supervisors make decisions about disputed judgments.
When it comes to the grading of creative critical thinking skills, there are many general issues to address. First, it is a problem that scoring criteria tend to emphasize minimum competency skills such as the mechanics of spelling and grammar, both of which are rote cognitive capacities (Ricci 2004, 83). Second, and also related to scoring, the pass/fail rubric developed for grading the literacy test limits the range of accurate responses to one not-so-apparent accurate answer (Lipman 2004; Ricci 2004). The OSSLT is indeed a pass/fail test, but the critical thinking it demands implies many more discrete levels of competency (a critical thinker thinks beyond minimum competency, and a simple pass does not distinguish between low, minimum, and higher-order competencies). I thus fail to see how such a rubric and the standard it sets could help Ontario teachers diagnose and correct a specific lack of competencies in individual students and improve the quality of education generally (both expressed aims of the EQAO).
Though I have argued that the OSSLT has potential for testing creative critical thinking, scoring limitations render the test invalid as a measure of higher-order thinking skills. From the point of view of creative thinking, simplistic scoring criteria of this sort raise the most common concern educational commentators have expressed in discussions of standardized testing initiatives: that they encourage teachers to “teach to the test,” which inhibits the exploration of alternative views and discourages independent thinking not only for test writers, but for evaluators and teachers alike (Runte 1998; see Giancarlo-Gittens and Hare in this volume). As serious as these problems are, I think it would be a mistake to conclude that there should be no attempts to design more valid tests of critical thinking, and more importantly, of creative critical thinking. To the extent that better tests can be constructed, the Ontario system could benefit from testing of this sort.
Conclusion
Though many commentators have raised issues about the validity of tests that claim to improve education, there has been little, if any, analysis of standardized tests that attempt to measure creative critical thinking. In the place of careful discussion, I believe there has been a blanket and educationally unhelpful critique of government-mandated standardized or performance testing that has usually presented such testing as an unfortunate correlate of Tory or neo-conservative governance (see Moll [2004] for many such critiques). I have provided an alternative view — one that would embrace a need for accountability, and provide suggestions that might help turn existing tests into instruments that allow students to read, write, and think more critically and creatively, and so allow teachers more space to think and to “provide creative and relevant instruction” (Ministry of Education 2003a, 9).
Standardized testing does hold promise for testing parts of a higher-order thinking process, a process that includes creativity. We can and should develop tests that measure higher-order thinking skills and that test for the kind of creativity captured in the literacy documents: tests that go beyond helping develop the abilities to “understand, think, apply and communicate in reading and writing” toward a level of effectiveness that would help our students have better relationships, become more discriminating consumers, and perhaps then be more effective citizens (Ministry of Education 2003a, 6).
References
Education Quality and Accountability Office (EQAO). 2007. Educator resources: OSSLT. Online. Available at http://www.eqao.com/Students/Secondary/10/10.aspx?Lang=E&gr=10&Aud=Students.
Education Quality and Accountability Office (EQAO). 2001-02. Ontario secondary school literacy test resource guide. Online. Available at http://www.eqao.com.
Ennis, R. 1996. Critical thinking. Upper Saddle River, NJ: Prentice Hall.
Gorrie, P. 2004. Literacy test a write-off? Toronto Star, February 15.
Lipman, P. 2004. The Ontario grade 10 literacy test and the neo-conservative agenda. In Passing the test: The false promises of standardized testing, ed. M. Moll, 166-71. Ottawa: Canadian Centre for Policy Alternatives.
Ministry of Education. 2003a. Think literacy success: The report of the expert panel on students at risk in Ontario. Toronto: Queen’s Printer.
Ministry of Education. 2003b. Think literacy success: Cross curricular approaches grades 7-12. Toronto: Queen’s Printer.
Moll, M., ed. 2004. Passing the test: The false promises of standardized testing. Ottawa: Canadian Centre for Policy Alternatives.
Murphy, S. 2001. No-one has ever grown taller as a result of being measured revisited: More educational measurement lessons for Canadians. In The erosion of democracy in education: Critique to possibilities, ed. R. Solomon and J. Portelli, 145-68. Calgary: Detselig Enterprises.
Ricci, C. 2004. Breaking the silence: An EQAO marker speaks out against standardized testing. Our Schools/Our Selves 13(2): 75-88.
Runte, R. 1998. The impact of centralized examinations on teacher professionalism. Canadian Journal of Education 23: 166-81.
Sobocan, J. 2003. Teaching informal logic and critical thinking. In Informal Logic @ 25 Symposium, ed. H. Hansen and C. Tindale. CD-ROM. Windsor, ON: University of Windsor, Informal Logic.
Notes
1. Ennis (1996) codifies this process as “FRISCO” (Focus, Reasons, Inference, Situation, Clarity, and Overview). Much of what is applied when using the FRISCO framework is outlined in the Think Literacy Success: Cross-Curricular Approaches document (Ministry of Education 2003b). There is too much entailed by “good” inference to be covered in the scope of this chapter.
2. Generally agreed to at The University of Western Ontario workshop and symposium organized to develop this book.
3. At this stage, I do not make any claim about the validity of the OSSLT in terms of inter-rater reliability. Because I do not have access to specific scoring criteria, I can only imagine what the limitations of current and potential scoring criteria might be.
4. For reasons other than what one might say in response to Ricci’s (2004) critique, I believe that stronger inter-rater reliability can be achieved in the evaluation of answers to more open-ended question formats (or answers to essay questions). Space does not allow me to enter into such reasoning here, but Hatcher deals with this subject more extensively in Chapter Eleven of this volume in relation to evaluating extended arguments.