1 Rigor and Reality

This chapter was written in an atmosphere of challenge and change in the teaching of logic. According to Howard Kahane, whose book on fallacies (Logic and Contemporary Rhetoric) went through many editions, his interest in that topic stemmed from the career of Spiro Agnew, Vice President of the United States from 1969 to 1973. Agnew’s highly imaginative and well-publicized rhetoric, incorporating such famous expressions as “nattering nabobs of negativism,” led several of Kahane’s students to ask what the tools of logic could offer for the evaluation of his claims and arguments. Kahane realized that formal logic had little to offer, a realization that led him to develop his own text, emphasizing the understanding of fallacies. In that work, the examples were taken from American politics, a selection that prompted Ralph Johnson and Tony Blair to produce their book, Logical Self-Defence, with Canadian material. That work was an important stimulus for my own.

In the early nineteen-eighties, “logic”, as used by philosophers, meant “formal logic”, and the standard presumption was that by studying logic, students could learn to reason, detect poor reasoning and argument, and construct good arguments. This presumption was coming into question at the time, but in my experience it was still strongly defended by many philosophers. In that context, I was astounded to hear in 1978 from Michael Scriven – still active in the field – that formal logic had little or nothing to offer as applied to real arguments expressed in natural language. There were problems of translation, argument type, structure, premise assessment, dialectical context, audience, and much else. For all its rigor and status, formal logic was of little use as applied to real arguments. To me the discovery was shocking. How could philosophers have been so wrong about practicalities? Were they deceiving themselves? They prided themselves on knowing how to argue, but were sadly lacking when it came to theorizing about it. I was not only shocked; I was excited about the many under-explored themes that emerged from this failure.

By 2017, the situation has changed significantly. Argumentation studies is a recognized field, interdisciplinary and international. International conferences are many, and are well attended by philosophers, academics in speech and communication studies, cognitive scientists, linguists, and others. The term “informal logic” names a thriving and recognized journal. To some extent, the problems described in this chapter have disappeared. Not entirely, though. Psychologists and neuroscientists may claim expertise in critical thinking, unaware of work on argumentation, apparently on the grounds that they understand some of the workings of the brain. Within philosophy, people are sometimes hired to teach critical thinking, informal logic, or argumentation in the absence of any expectation that they should have studied these fields; even in a difficult job market, it is possible to be hired to teach these subjects without having studied them. Among mainstream philosophers, these areas of study do not generally enjoy a high status. Certainly, they are not accorded the status given to formal logic. What is assumed in the hiring practice just described? That a philosopher will have that expertise simply because she is a philosopher? That some background in formal logic will do the trick? That the material is, at root, remedial in nature and simple to understand? I cannot claim to know the answer. The phenomenon described in this chapter persists to some extent. And it is regrettable.


There was a time when science was thought to be perfectly objective, immune from political influences, when scientific theories were deemed certain and true, when absolute scientific proof was thought to be possible. That time has passed. Philosophers of science and scientists themselves now hesitate to distinguish categorically between theory and observation, between metaphysics and science, between conviction by proof and the decision to accept.

Logic is deemed by logicians to be a science, but a comparable sense of the ambiguity of things has not reached them. They continue to regard logic as a fully objective study in which results can be shown true to anyone who understands the problem, and in which proofs are models of strictness and rigor. Here enter no values, no history, no politics, no uncertainty, no ambiguity. Logic is a science in the old-fashioned sense. Perhaps it is the last such science.

1. The Prestige of Formal Logic

Since the turn of the century, formal logic has been closely linked with mathematics. A formal logician does not do his work primarily in a natural language. Rather, he deals in artificial languages and formal systems. Here, symbols such as ‘v’, ‘→’, ‘-’, and ‘.’ are precisely defined and used to represent a core element of the meaning of such natural words as ‘or’, ‘if-then’, ‘not’, and ‘and’. In such systems, every term used has a perfectly precise definition; strict rules govern one’s every move. If natural language arguments are considered at all, they are re-stated so that their structure (or form) is represented in the symbols of a formal system. An argument valid in virtue of its form may be provably valid, provided that form is of the kind the system has been devised to deal with. Arguments in which connections depend on formal aspects not covered by the system, or on meaning, or on features that cannot be handled in deductive logic, do not come out very well by such tests.

The highly technical and intimidating nature of formal logic as currently practiced by professional logicians can be conveyed by a look at just a few recent titles in the field: ‘Large Matrices which Induce Finite Consequence Operations’; ‘Counterpart Theoretical Semantics for Modal Logics’; ‘S5 without Modal Axioms’; ‘More about the Lattice of Tense Logics’.

Formal systems are created structures, beloved in some quarters for their provision of endless opportunities to manipulate symbols in blissful isolation from the ambiguities of real language and the uncertainties of the real world. ‘If’ and ‘implies’ in English mean many things in many contexts, but the material implication symbol in standard propositional logic means one and only one thing: ‘p→q’ says that q is never false when p is true, and it says nothing other than this.
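The standard truth table for the material conditional makes the stipulation vivid (the table is added here purely by way of illustration):

p   q   p→q
T   T    T
T   F    F
F   T    T
F   F    T

The last two rows mark the distance from ordinary usage: ‘p→q’ comes out true whenever p is false, whatever q may be and however unrelated the two sentences are.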

Logic, understood as formal logic, has enormous prestige. In part, this is because the study of logic is supposed to help us construct sound arguments, reason well, and find flaws in shaky or deceptive arguments. Logic does not describe or explain the way people in fact think. Rather, it is an evaluative discipline, which originally was supposed to set forth standards distinguishing good reasoning from poor. Formal logic, however, is now so technical, so rarefied, and so specialized that it is greatly removed from this original concept of what logic is supposed to do. A person could study formal logic for years and gain no idea that it was supposed to have anything to do with differentiating good arguments from poor ones.1 Logic as it exists now is primarily the study of artificial formal systems. The idea that it has something to do with the construction and understanding of good arguments and the development of critical skills that apply to natural discourse surfaces largely in student texts and in the pedagogic rationalizations logic professors offer to curriculum committees.

That formal logic cannot capture all of the factors we need when we evaluate a real piece of argumentation in a natural language is, in an important way, quite obvious. In fact, no formal logician would seriously dispute this claim, for all recognize the differences between formal and natural languages, and the role of information in the premises of arguments. What is strange is that, in view of these substantial gaps between real arguments and the subject matter of formal logic, formal logic is still widely regarded as having something important to offer to the non-specialist.2 The dichotomy between formal systems and the chaotic reality of discourse is no mystery, but the persistence, in the face of this dichotomy, of formal logic as an educational and intellectual institution should give pause for thought.

So great is the prestige of formal logic that those who try to teach more practical argument and reasoning skills at universities are under some pressure to label their endeavors so as to avoid using the term ‘logic’. Such activities merit a less prestigious title: critical thinking, rhetoric, or communication skills, perhaps. The shocking truth is that if courses have the aim of treating natural argumentation, there is scarcely a pertinent body of academic expertise on which they can be based. Nonformal matters pertinent to the assessment of natural argumentation have been long neglected. (Since the Renaissance, some would say.) Of late, concerned instructors in departments of philosophy and literature have come to think that it is desirable to teach students how to identify and evaluate arguments expressed in ordinary English prose. When they set out to do this, they apparently make surprising discoveries. Textbooks in applied logic make peculiar reading for the philosopher trained to respect the traditions of formal logic. Their authors seem to have found that when they try to apply traditional logical categories to real arguments in natural languages, things do not work out well.3

2. Formal Validity

In formal logic, the category ‘valid’ is of the all-or-nothing kind. If an argument is such that, given its premises, it is absolutely impossible for its conclusion to be false, it is valid. If not, it is invalid. Period: there is nothing in between. But for many real arguments, things are not so clear-cut. For example, John Kenneth Galbraith, discussing inflation, once argued that the problem should not be avoided by our becoming content to accept inflation as a natural fact of life. He gave a number of distinct reasons for his view: inflation leads to social inequities and instability; inflation makes contracts difficult to arrange because prices for future dates may be uncertain; inflation causes difficulties in international trade. Now these factors clearly do not substantiate or demonstrate the conclusion in the logician’s strong sense of showing that it is absolutely impossible for that conclusion to be false. They are relevant to the point. They clearly go some way towards establishing it. Galbraith gives us an argument for his view. When we come to assess it, if we follow the canons of formal logic, we are left asking: is this argument valid or invalid? And somehow this doesn’t seem like the right question to be asking. We are inclined to assess the argument as one which goes part way toward establishing its conclusion, though not all the way.4 In the face of these and other difficulties, philosophically minded analysts of actual arguments have been driven to suggest that perhaps validity will have to be understood as a matter of degree, or redefined in terms of an ultimate consensus of reflective normal minds. From the standpoint of logical tradition, both proposals are shockingly heretical. Validity is supposed to be a formally demonstrable and absolute feature of an argument.

Then there are problems with the hallowed old distinction between inductive and deductive arguments. Most logic texts state that deductive arguments are those that ‘involve the claim’ that the truth of the premises renders the falsity of the conclusion impossible, whereas inductive arguments ‘involve’ the lesser claim that the truth of the premises renders the falsity of the conclusion unlikely, or improbable. This distinction proves difficult to apply to actual arguments.5 Few arguers are so considerate as to give us a clear indication as to whether they are claiming absolute conclusiveness in the technical sense in which logicians understand it. In assessing arguments we need to arrive at some view as to how good the reasons put forward are. Asking whether they were supposed to be conclusive or not is often not a useful stage in the assessment procedure. The distinction between deductive and inductive arguments is hard to apply to actual arguments, and not clearly useful. Yet modern logic has developed around this distinction.

Thus traditions wobble against the pressures of the surprisingly new task of analyzing actual arguments. In the face of such problems, the teaching of courses in practical logic is not easy. Surprisingly radical things are suggested by the authors of texts for these courses. With few exceptions, they do not publish their radical suggestions in professional journals of logic and philosophy. Either they are afraid to buck tradition, or their writings are edited out.6

3. Actual Arguments

The uninitiated reader might wonder why I refer so often to actual arguments and what contrast is intended here. An actual argument is simply a piece of discourse or writing in which someone tries to convince others (or himself or herself) of the truth of a claim by citing reasons on its behalf. I speak of actual arguments because I do not wish to speak of the contrived arguments: series of statements constructed by logicians to illustrate their principles and techniques. It is common practice for a logician to state a principle in a formal system and then invent an ‘argument’ to illustrate that principle. This custom provides bizarre exercises for logic students. A selection:

If she comes closer, she will seem even more beautiful. Provided that she marries you, she will seem even more beautiful. Hence if she does not marry you, she will not come closer.7
If he has ten children, then that character will be written on his face. If his character is written on his face, he cannot deceive us. So either he cannot deceive us, or he does not have ten children.8
If the weather is warm and the sky is clear, then either we go swimming or we go boating. It is not the case that if we do not go swimming, then the sky is not clear. Therefore, either the weather is warm or we go boating.9
Any author is successful if and only if he is well read. All authors are intellectuals. Some authors are successful but not well read. Therefore all intellectuals are authors.10

These sequences of sentences can be more or less represented in the apparatus of formal logic, but the so-called arguments they express are totally unrealistic. A common rationalization for setting out such treats before student eyes is that the interesting content of more realistic arguments, whether these deal with issues of the day or enduring intellectual concerns, could distract people from purely logical considerations. (For ‘purely logical’ here, read ‘purely formal and resolvable by a mechanical procedure’.) Critics suspect that the real reason for the strange examples of the logic texts may be that a purely formal analysis is only rarely helpful in evaluating a real argument. Students need exercises to develop skills. Since they cannot use formal skills on real arguments, they need invented ones.
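What such representation actually amounts to can be seen by symbolizing the second example above. Letting T stand for ‘he has ten children’, W for ‘that character is written on his face’, and C for ‘he cannot deceive us’ (the letters and the symbolization are mine, supplied for illustration), the argument has the form:

T → W
W → C
Therefore, C v -T

This form is provably valid in any standard propositional system: T → W and W → C yield T → C by hypothetical syllogism, and T → C is equivalent to -T v C. The proof is perfectly rigorous; yet the ‘argument’ it certifies remains as unreal as ever, and nothing in the symbols touches the question of whether anyone should accept the premises.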

The science of logic, formal logic, has progressed by ignoring real arguments and attending to constructed systems. Stipulated definitions and rules make objectivity and rigor possible, but only because the logician is not analyzing a real argument. He is doing something else, and doing it very precisely. But when he has completed his task, neither he nor anyone else can easily apply his formal rules and definitions to anything but his own systems. Nevertheless, formal logic is a venerable intellectual institution. Ordinary people, who know little about it, regard it as terribly technical and difficult, and often feel a little embarrassed about what they suspect would be their personal ineptitude at high-level symbol manipulation. (Tell someone you are writing a book on logic and he will give you a look of respect you will never get if you tell him you are writing on equality or the early Cold War.)

Because it deals with the artificial, formal logic can exhibit rigor and objectivity. In an artificial system, we can obtain certainty in results. After all, we have provided for this in the very construction of the system. Formal logicians primarily devote themselves to developing systems and to working out the implications of the rules and definitions around which they have decided to construct those systems. If they theorize about why this or that rule has been adopted, or what the interpretation of various systems might be, they call this study philosophy of logic, rather than logic per se. The genuine logician, in the modern world, is an adept manipulator of symbols, a creator of elegant proofs. Proofs in a formal system can be perfectly rigorous; they depend on no undefended move. With such proofs, we know exactly where we start, where we finish and why we are entitled to make the moves we do. It is the quest for rigor and certainty, the drive to achieve that impeccable result, which has led logicians more and more to concentrate on formal systems.

But such rigor and certainty are achieved at the cost of emptiness. Real arguments in natural language are not amenable to fully precise treatment. They deal with topics of controversy, disputed facts, plausible hypotheses, approximately correct analogies. To evaluate them, we must sort out ambiguities, see how diverse factors fit together, weigh pros and cons, consider the credibility of those on whom we may depend for testimony or expertise. Formal logic is, by its very nature, incompetent to address such matters. At best, it will apply to only some arguments in natural language, and then only after virtually all interesting questions about their interpretation, their content, and the substantive truth of the claims they contain have already been resolved.

4. An Example

An example seems the best way to illustrate the truth of this claim. Alasdair MacIntyre once argued that Christianity does not require any fully objective justification for beliefs about God. MacIntyre defended this conclusion by saying that anyone founding his religious belief on an objective justification proving God’s existence would, in effect, be unfree in his belief in God. He put it this way:

(S)uppose religion could be provided with a method of proof. Suppose for example . . . that the divine omnipotence was so manifest that whenever anyone denied a Christian doctrine he was at once struck dead by a thunderbolt. No doubt the conversion of England would ensue with a rapidity undreamt of by the Anglican bishops. But since the Christian faith sees true religion only in a free decision made in faith and love, the religion would by this vindication be destroyed. For all the possibility of free choice would have been done away. Any objective justification of belief would have the same effect. Less impressive than thunderbolts, it would equally eliminate all possibility of the decision of faith. And with that, faith too would have been eliminated.11

This is a complex passage, and one has to do some work to unearth the main argument. Essentially, it is:

  1. If people were struck dead by a thunderbolt whenever they denied a Christian doctrine, there would be mass conversions due to fear.

  2. Christian faith sees true religion only in free decisions made in faith and love.

  3. Conversion due to this kind of fear would not be due to free decisions made in faith and love.

    So,

  4. Conversion by thunderbolt would destroy Christian faith.

  5. Conversion on the basis of an objective justification of Christian belief would similarly eliminate all possibility of free decisions made in faith and love.

    Therefore,

  6. The objective justification of the beliefs of the Christian religion would eliminate the Christian faith.

To decide whether MacIntyre has a good case here, we have, essentially, to decide how he is using the concept of freedom, what sort of freedom it makes sense to expect in contexts where people are believing as opposed to acting, how good the comparison between believing from fear and believing on the basis of an objective justification is, and how sound his comments about the freedom of the decision to believe being essential to Christian faith are. None of these issues are formal. To bend and twist this argument by analogy into a representation to which the rules of a formal system would apply would be useless even if it were possible.

Now formal logic is a prestigious academic subject with important connections to the foundations of mathematics. It is not likely to go out of existence, and no one is recommending that it should. The problem is that logic is supposed by many to provide us with standards by which to assess arguments, and as it is taught today, formal logic does not adequately do this. Logic in western universities, as standardly taught in North America, has become formal logic, and formal logic, whatever its hardheaded virtues, simply does not provide a sufficient basis for the assessment of actual arguments.12 Formal logicians who try to support their academic endeavors by making courses compulsory on the grounds that students’ reasoning will thereby be improved should be met with scepticism. In view of the rupture between the rigorous precision of formal logicians and the complexities of real argumentation, such claims are at best the unthinking repetition of slogans and at worst outright dishonesty.

5. The Desire for Rigor and Certainty

Rigor and the resulting certainty are not ideals that formal logicians have adopted out of perversity and foisted upon an unsuspecting public. Rather, they have roots in the history of philosophy, roots that go back at least as far as Plato. Mathematics has long been the philosopher’s envy, mathematical knowledge the ideal of perfect knowledge. For mathematics, which is non-empirical, is in an important sense liberated from the real world. In mathematics there can be incontrovertible proofs and perfect agreement. Against its eternal verities, philosophical systems appear weak and insecure.13 Philosophers have long sought to emulate mathematical proof and produce mathematical certainty. According to Descartes, human knowledge was to be constructed so as to presume nothing. After the apprehension of an undeniable and fundamental truth, one would proceed by clear and incontrovertible proofs to deduce simple, certain propositions from which, in turn, more complex knowledge could be concluded. Spinoza set forth his Ethics as a system of axioms, theorems, and propositions, like a geometric system. Kant compared philosophy unfavorably with mathematics and speculated that the certainty of mathematics came from the fact that in that discipline the mind creates its own objects and is thereby able to know them with perfect certainty. The model of mathematical knowledge has had immense appeal and influence throughout the history of western philosophy. It is perhaps this model which motivates formal logicians. Within the artificialities of constructed systems, issues are clear, perfectly true answers possible, and completely rigorous proofs discoverable.

Nor is the attainment of perfect certainty an exclusively philosophical idea. People do not have to be taught to desire certainty. There is something satisfying and secure in the sense that one can know for certain that what one believes is true, that things must be as one thinks, that anyone who disagrees is simply mistaken. For teachers, such knowledge carries a position of authority not likely to be available in more realistic contexts, where premises may be only partially warranted, new information may outweigh that which is at hand, and analogies may appear persuasive and yet not entirely convincing. ‘Valid’ or ‘invalid’, we may pronounce of the stylized, contrived arguments in the formal logic textbooks. In real life, things are rarely so simple. And yet the ‘valid’/’invalid’ verdict is a more satisfying one than a qualified judgment to the effect that an argument gives relevant reasons but not conclusive reasons for accepting the conclusion. In formal logic, the desire for rigor, certainty, and categorical answers may be satisfied, but at the cost of applicability. To grasp ‘objectivity’ and rigor at this price seems more neurotic than scientific.

For more than two decades, Chaim Perelman has sought to distinguish argument from formal demonstration, defending the view that much of what needs to be said about actual argumentation transcends the bounds of formal logic. Perelman wrote in French, but his major works have been translated into English. Gilbert Ryle, P.F. Strawson, Stephen Toulmin, C.L. Hamblin, Carl Wellman, and F.L. Will are English-language philosophers who have expounded related views. In one way or another, several of the English-language thinkers have been influenced by Ludwig Wittgenstein who, in his later philosophy, railed against the imposition of scientific standards of rigor and precision in contexts where they had no proper place. Wittgenstein emphasized the value of attending to particular cases and the concrete phenomena against which (on his view) general theories should be tested. To follow his advice in developing a logic of argument would constitute a radical departure from tradition in logic. For as we have already seen, models of correct argument have been stipulated a priori and then used to evaluate actual instances of argument.

These ‘back to the phenomena’ thinkers have had exceptionally little influence on the development of logic. Though philosophers no longer venerate Wittgenstein, they continue to study his works. Yet the implications of his methodological views for logic itself are rarely remarked.14 The other authors mentioned are scarcely read in standard North American graduate programs in philosophy, and rarely considered in university courses in logic. Toulmin’s book on argument seems to have been the least successful of his many works, among philosophers at least. It was largely as a concession to sixties’ student demands for relevance that philosophy departments started to teach courses in which actual arguments, expressed in natural language, were the primary basis for study. Only recently have courses about critical thinking or ‘informal logic’ been the focus of attention. Such courses now proliferate but are regarded in many circles as poor siblings, feeble substitutes for the ‘real logic’ in which students should be taught, above all else, the ideal of rigor.

Many intellectual factors should have led to the loosening of the formalist model of successful argument. Descartes’ theory of knowledge has been subjected to severe attack in our century. Few epistemologists would accept Cartesian certainty as a reasonable goal for human knowledge; few believe that first principles should be proven or that knowledge can proceed from them in deductive steps that are intuitively clear. The formalist ideal of rigor and certainty, the model of successful argument that has emerged from formal logic, is Cartesian in concept and in origin. Yet it has not been subjected to criticism in logic as it has elsewhere. The distinction between deductive and inductive arguments, around which modern logic is erected, is closely related to another distinction philosophers have made, between analytic and synthetic statements. The latter distinction is now regarded with scepticism, as one which it is difficult to draw with exactitude. Such doubts, however, have not been transferred back to the former distinction. The hallowed tradition of inductive and deductive seems immune from serious criticism.

Curiously enough, philosophical arguments are not exclusively deductive in character. They are not, on the whole, valid in virtue of logical form alone, nor easily translatable into the technical symbols of formal systems. Nor are philosophical arguments like the arguments studied in inductive logic. In accepting the model of successful argument presumed in traditional logic, philosophers have put themselves in an absurd position, because they are unable to apply what passes as a theory of argument to many of their own arguments. This difficulty would be merely silly had it not such a venerable philosophical history. Scholars have often pointed out that Descartes, Hume, Kant, and the logical positivists all have difficulties in getting their own practice to conform to their own theory.15

These problems should have produced a more flexible basis for argument analysis than that provided by formal logic. They did not. Although rigor pulls away from applicability, formal logic continues to be regarded as a science that offers something relevant to real arguments. To serve this function, however, it would have to offer canons of reasoning and argument that are more than techniques for manipulating symbols in meticulous respect for the stipulated rules of an artificial game. Formal logic shows little tendency to develop in this way, yet its practitioners seek to maintain its authority as offering the ultimate standards against which natural arguments should be assessed. Its continuing authority in the face of this disparity is a phenomenon that calls for explanation. Sociologists of knowledge should take a look at it.

6. Paradigms

While awaiting their accounts, I am led to reflect on the philosophy of science of Thomas Kuhn. Philosophers have been impressed and greatly influenced by Kuhn’s work. They delight in applying it to the social and physical sciences but are not quite so ready to apply it to themselves. Certainly no one has been so presumptuous as to apply it to the venerable establishment of logic as the formal, entirely rigorous, science of reasoning.

Kuhn introduced to philosophy of science a novel concept, that of the scientific paradigm. A paradigm is an example or model of successful work which dominates an intellectual field, determining what problems will be seen as interesting, what methods will be seen as rational, and what the standard of a good solution to a problem will be. Kuhn argued that students in science are taught, in effect, to see the world in a particular way, to ask questions within a specified range, to look in particular directions and not others for answers to those questions. The paradigm dominates a science, making consensus and rapid progress possible. Scientific questions and projects remain within a delimited range, though the limitations become so routine to practitioners as to be virtually indiscernible. Kuhn regarded paradigms as necessary and therefore legitimate, but he emphasized that their force lies more in education and established institutions than in their pure objective truth. When there are paradigm shifts in science, these are grounded as much in psychological and historical factors as in data or experimentation. So radical is a paradigm shift that Kuhn terms it a ‘revolution’ in scientific theory.

It is illuminating to look at the current situation in logic from the perspective of Kuhn’s theory. Formally valid arguments seem to be functioning as a kind of paradigm. This paradigm works so strongly on formally trained logicians and philosophers that they are unable to take account of the obvious. The obvious is that rigor and reality are uneasy mates, that real argumentation is not easily or usefully amenable to formal treatment, and that there are many interesting nonformal questions about arguments. They cry out for attention.

In logic, a paradigm is at work. But it blinds us. I recommend revolution.

Notes

1. I once knew someone who did, and did very well in terms of marks. However, this comment is less accurate now (1987) than it was when this essay was first written (1980), due to the influence of practically oriented textbooks and, in some jurisdictions such as California, state-imposed requirements for education in critical thinking.

2. Compare Patrick Suppes, Introduction to Logic (Princeton: Van Nostrand, 1957); Michael Resnick, Elementary Logic (New York: McGraw-Hill, 1970); and Irving Copi, Symbolic Logic (New York: Macmillan, many editions). Copi’s text has been so widely used that it may be said to represent a consensus over some decades. It and other texts traditionally taught formal skills, but nevertheless commenced with bold promises as to the practical advantages a study of the formal subject matter would have, even for those not aspiring to be professional logicians or mathematicians.

3. For instance, S.N. Thomas, in his successful text, Practical Reasoning in Natural Language (Englewood Cliffs, N.J.: Prentice-Hall, 1974), recommends that validity be seen as a matter of degree and boldly claims that the distinction between inductive and deductive arguments is a useless dogma.

4. This reinterpretation is suggested by Carl Wellman in Challenge and Response: Justification in Ethics (Carbondale, Ill.: Southern Illinois University Press, 1971). It recalls C.S. Peirce’s notion of truth as the doctrine which trained minds will, in the final analysis, come to accept.

5. Definitions of ‘inductive’ vary. (Compare ‘The Great Divide’, Chapter 3.) The conception I’m alluding to here is common, however, as one may see from examining a variety of first-year formal logic texts, including those referred to in note 2.

6. This is less true than it was in 1980, due in part to the success of the Informal Logic Newsletter. Essays on the pedagogy of the ‘new logic’ have appeared in Teaching Philosophy, and relevant theoretical papers have appeared in Metaphilosophy, the Canadian Journal of Philosophy, and the American Philosophical Quarterly. Reviews of texts and pertinent books have appeared in these journals and in Dialogue and Canadian Philosophical Reviews. It would, however, still be accurate to say that there is a vast disparity between the popularity of applied logic courses among students and the perceived importance of related theoretical issues as research topics.

7. Resnick, op. cit., p. 105.

8. Ibid., p. 105.

9. Copi, op. cit., p. 27.

10. Ibid., p. 90.

11. Alasdair MacIntyre, in Metaphysical Beliefs (London: SCM Press, 1957).

12. Pure logic is formal logic. High-prestige logic is formal logic. Research logic is primarily formal logic. Logic for philosophy majors is primarily formal logic. But logic for nonprofessionals, textbook logic, and college logic are no longer primarily formal logic. My comment was true in 1980 but needs qualification in 1987.

13. I am told by those in the know that this overstates the epistemic virtues of mathematics, that nonformal reasoning is needed there also, and that formal proofs have been found wrong after being accepted by mathematicians for decades. If this is true, then ironically, the formalist model of proof may fail even to accurately capture its own paradigm of adequacy.

14. Notable among Perelman’s works is The New Rhetoric: A Treatise on Argumentation (Notre Dame: University of Notre Dame Press, 1969), written jointly with Mme. L. Olbrechts-Tyteca. Other works on argument in a ‘back to the phenomena’ vein are Stephen Toulmin, The Uses of Argument (Cambridge, Eng.: Cambridge University Press, 1958); C.L. Hamblin, Fallacies (London: Methuen, 1970); Carl Wellman, Challenge and Response (Carbondale, Ill.: Southern Illinois University Press, 1971); and F.L. Will, Induction and Justification (Ithaca, N.Y.: Cornell University Press, 1974). The most relevant essay by Ryle is in Dilemmas (London: Cambridge University Press, 1954). Strands of resistance to formal approaches to handling natural language also appear in P.F. Strawson’s Introduction to Logical Theory (London: Methuen, 1952). Wittgenstein’s approach is evident in his Blue Book, where he said, ‘Our craving for generality has another main source: our preoccupation with the method of science. I mean the method of reducing the explanation of natural phenomena to the smallest possible number of primitive natural laws; and, in mathematics, of unifying the treatment of different topics by using a generalization. Philosophers constantly see the method of science before their eyes, and are irresistibly tempted to ask and answer questions in the way science does. This tendency is the real source of metaphysics, and leads the philosopher into complete darkness … Instead of “craving for generality”, I could also have said “the contemptuous attitude towards the particular case”.’

15. Descartes sought to doubt all his prior beliefs and to found a system on the cogito, deductions from it, and basic principles of logic. But he presupposed beliefs from ordinary life and medieval philosophy more than he knew, and this was not accidental. Hume said that reasonings neither quantitative nor experimental should be committed to the flames, in apparent obliviousness to the fact that many of his own reasonings fell into that category. Kant stated that we could not know things-in-themselves, but seemed committed to knowing the human mind and its capacities as more than appearances. The logical positivists rejected as meaningless any claim which was not either made true by rules governing the use of symbols or verified by sense experience. Critics were quick to point out that their own key principle turns out to be meaningless if these criteria are applied to it.

License


Problems in Argument Analysis and Evaluation Copyright © 2018 by Trudy Govier & Windsor Studies in Argumentation is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
