Tuesday, 22 March 2011

Entrevista com Timothy Williamson


TIMOTHY WILLIAMSON (hereafter TW) has been the Wykeham Professor of Logic at Oxford University since 2000. He is a fellow of the British Academy and a foreign honorary member of the American Academy of Arts and Sciences. His main research interests are in philosophical logic, epistemology, metaphysics and philosophy of language. He is the author of:


- Identity and Discrimination (Blackwell, 1990)
- Vagueness (Routledge, 1994)
- Knowledge and Its Limits (Oxford University Press, 2000)
- The Philosophy of Philosophy (Blackwell, 2007)

He has also published over 120 articles.
CHEN BO (hereafter CB), PhD, is a Professor at the Institute of Foreign Philosophy/Department of Philosophy, Peking University, China, and was an academic visitor at Oxford University between August 2007 and August 2008.
CB: Could you first introduce yourself briefly? For example, your education, your academic career, your main research fields, your family, some hobbies, things like that?
TW: Although I'm British, I was born in Sweden (in 1955), because my parents were temporarily teaching English at Uppsala University. Both of them went on to teach English Literature at Oxford University, so I grew up in an academic atmosphere. I too was a student at Oxford. My first degree was in mathematics and philosophy. My subsequent doctoral thesis was on Karl Popper's idea of verisimilitude, according to which scientific theories can approximate better and better to the truth without ever being perfectly true. After that I taught philosophy at Trinity College Dublin from 1980 to 1988 and at Oxford University from 1988 to 1994. Then I moved to be Professor of Logic and Metaphysics at Edinburgh University before returning to Oxford in my present position. I've been a visiting professor or researcher at MIT, Princeton, the Australian National University, the Chinese University of Hong Kong and elsewhere. If I had to pick out one central theme in my research, I'd say: the gap between what is true and what can be known. That has led me to investigate many related questions, both technical ones in logic and broad philosophical questions, such as the nature of knowledge. As for my non-academic life, I'm married to Ana Mladenović, a pianist from Belgrade in Serbia; I was previously married to an Italian. I have a daughter of 17 and sons of 14 and 5. My work and family life keep me pretty busy. I used often to go walking in the mountains, but these days, alas, I rarely find time for that, and not even enough time for the poetry, novels and films I love. My philosophy has given me the opportunity to visit countries in many parts of the world and to some extent to see them not just as a tourist but from the inside, through the intelligent eyes of the philosophers who have been my hosts.
CB: You have an academic background in mathematics and philosophy, you are also a logician, and before university you worked in an atomic energy research station as a computer programmer. I'd like to know what contribution all this background and training made to your philosophical career.
TW: Computer programming provided useful training in how to clarify initially vague ideas to the point where they can be made formally precise. It also gave me a concrete understanding of what was meant by models of the mind as a computer. From mathematics I learnt the importance of thinking about the abstract structure of problems, and of discerning such structure by means of an elegantly suitable conceptual framework. Some of the mathematics I learnt, such as probability theory, is directly relevant to philosophy. Logic was strongly emphasized in my undergraduate philosophy degree because it provides the widest and most natural bridge between mathematics and philosophy. My philosophical training was firmly in the tradition of analytic philosophy. Altogether, it's not surprising that there is a formal aspect to most of my philosophical work, even though most of it is not purely formal. At its simplest, the formal method involves giving informal arguments for some formally expressed premises, and then arguing formally from those premises to a conclusion. These structural ways of thinking are useful in more ways than many philosophers realize. For instance, they make it much easier to find a counterexample to philosophical theories, because one can work out in advance what sort of structure it must have. I was lucky in the quality of my training. Oxford was (and is) a wonderful place to study philosophy. As an undergraduate I met my philosophy tutor for an hour once a week, either by myself or with one other student; I had to write an essay for each meeting, which the tutor commented on. Similarly, one has long individual supervisions on one's written work as a graduate student. There is no place to hide: one knows that every sentence one writes must be able to withstand scrutiny. Besides my teachers, many of my fellow-students were very able, and I learnt much from talking philosophy with them. 
I also gained confidence to follow through original ideas from working in a world centre of philosophy, where many of the teachers had international reputations and were making new contributions as it were in front of my eyes. When I was a student, the leaders of Oxford philosophy and logic included Michael Dummett, Peter Strawson, A. J. Ayer, Dana Scott, Gareth Evans, John McDowell, John Mackie, R. M. Hare, Derek Parfit, Christopher Peacocke and Crispin Wright. I thought: I can do that too.
CB: When you studied at Oxford, you met the “Davidsonic boom” as it was called. You said in 2005, “The excesses of the manner in which the Davidsonian programme was pursued struck me, and still strike me, as an abuse of formalization.” That statement is quite bold, direct and impressive. I'd like to know your current comments on Davidson's philosophy.
TW: Donald Davidson made important contributions to philosophy. In the 1960s he played a major role in stimulating research of more constructive and systematic kinds, and acted as a corrective to some unfortunate aspects of the influence of Wittgenstein and ordinary language philosophy. In the philosophy of mind, Davidson emphasized the causal significance of explanations of action in terms of the agent's beliefs and desires. In the philosophy of language, he laid out the programme of truth-conditional semantics, by which the truth-conditions of all sentences of the language can be derived from the semantic contributions of their constituent phrases, which requires a systematic approach to the language as a whole. He also made many highly influential contributions on specific problems: for instance, his account of self-deception and his theory of adverbs, according to which most sentences are implicitly talking about events. On the negative side, his style was too cryptic and elliptical, so that it was often unclear what his claims or arguments were. In response to questions or objections, he was defensive and guarded, doing little to articulate his ideas more explicitly. Unfortunately, some people allowed him to get away with that by treating him like a guru. There was also a tendency in the Davidsonian tradition to lay down formal constraints in a dogmatic rather than experimental spirit, without properly explaining the justification for them, and to use those constraints to dismiss legitimate theoretical alternatives. It didn't help that some of the people involved had a rather amateur knowledge of logic (“A little learning is a dangerous thing”). This all went too far towards a refusal of the scientific spirit of open debate, and obstructed progress. Fortunately, that quasi-religious attitude towards Davidson is much less common now than it once was.
CB: Let us go to your first book, Identity and Discrimination (Blackwell, 1990). Could you outline the most important ideas presented in this book?
TW: I was interested in situations where one wanted to give a particular criterion of identity for objects of some kind, but the criterion didn't have the right logical properties. For example, you might want to say that two groups of animals are of the same species if and only if they can successfully interbreed with each other. But that doesn't work, because there are cases in which group X can successfully interbreed with group Y, and group Y can successfully interbreed with group Z, but group X can't successfully interbreed with group Z – several small genetic differences can add up to a large genetic difference. By contrast, identity can't have that structure; it has to be transitive, in the logical sense: if U is identical with V, and V is identical with W, then it logically follows that U is identical with W. By set-theoretic reasoning, I showed that there is always a “best approximation” to the original criterion that, unlike the original one, does have the logical properties required for identity. One can therefore fall back to that “next best thing”. I was particularly interested in instances of this problem where the problematic original criterion was one of indiscriminability (in the cognitive sense, not in the purely logical sense). For instance, you might want to say that two objects have the same apparent colour if and only if the subject can't see any difference between them in colour. That doesn't work because there are some cases in which the subject can't see any difference in colour between A and B, and can't see any difference in colour between B and C, but can see a difference in colour between A and C – several invisible differences can add up to a visible difference. Again, genuine identity can't have that non-transitive structure. I applied the same structural approach to those cases, but I also became interested in the nature of indiscriminability itself. 
I argued that the indiscriminability of two things is a matter of the subject not being able to know that they are not identical. I showed how to formalize that idea and explore its logical consequences rigorously within the setting of epistemic logic. Although Identity and Discrimination is similar to my later books in style of argument, it doesn't reach the sort of clear, simple conclusions they do. As a result, it has had less impact than they have, although there has been a recent increase of interest in its account of indiscriminability.
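The non-transitive structure TW describes can be sketched in a few lines of code (a toy illustration of my own, not from the book; the colour values and the discrimination threshold are hypothetical):

```python
# Toy model: small, individually invisible colour differences adding up
# to a visible one. THRESHOLD is an assumed value for the smallest
# difference the subject can see.
THRESHOLD = 10

def indiscriminable(a, b):
    """True when the subject can see no colour difference between a and b."""
    return abs(a - b) < THRESHOLD

a, b, c = 100, 108, 116  # hypothetical colour values

print(indiscriminable(a, b))  # True
print(indiscriminable(b, c))  # True
print(indiscriminable(a, c))  # False: two invisible differences add up
```

Since genuine identity of apparent colour must be transitive, the relation above cannot be identity of apparent colour, which is exactly the structural problem TW describes.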
CB: Leibniz's principles include both the indiscriminability of identicals and the identity of indiscriminables (where “indiscriminability” is understood in the logical sense); in symbols together:
  • ∀x∀y (x = y ↔ ∀F(Fx ↔ Fy))
It seems to me that these principles apply only to abstract objects like numbers in mathematics, and are inapplicable to objects that persist through time, such as a person across his childhood, youth, middle age and old age; they are also inapplicable to physical objects extended across space. What do you think about this? How can we determine the identity of such objects in space and time?
TW: The indiscriminability of identicals applies to all objects without exception. It is tempting to think that although the old man is identical with the young child, the very same human being, they are discriminable because the old man has properties that the young boy lacks, such as being wrinkled. However, such apparent counterexamples dissolve once one is accurate about time references. The old man is identical with the person who was a young boy; the old man is wrinkled but he wasn't wrinkled when he was a young boy. It is false that the young boy still is unwrinkled. David Wiggins has given an excellent defence of the indiscriminability of identicals along these lines. The spatial case is similar. One and the same river, the Thames, runs through Oxford and then London; it is wide in London but not in Oxford. Thus different parts of the same river have incompatible properties – the stretch of the Thames that runs through London is wide, the stretch of the Thames that runs through Oxford is not wide – but that does not imply a counterexample to the indiscriminability of identicals. The identity of indiscriminables also applies to all objects without exception, provided that the property F can be non-qualitative, for example the property of being one and the same person as Ted, which applies to Ted but not to his twin brother Fred. Leibniz himself intended a restriction to purely qualitative properties, which apply to both or neither of two exactly similar objects, but I assume that you have in mind the more liberal modern interpretation of the identity of indiscriminables. On the modern interpretation, both principles are logical truths. Of course, logic by itself can't tell us whether objects we encounter in different places at different times are identical; we need to know something about the objects too!
CB: In your second book, Vagueness (Routledge, 1994), you develop an epistemic view of vagueness, on which vagueness is an epistemic phenomenon, a kind of ignorance: there really is a specific grain of sand whose removal turns the heap into a non-heap, but we cannot know which one it is. It sounds quite counter-intuitive, strange and even a little bit wild. Could you clarify your position and arguments for it? Why and how do you develop such a conception of vagueness?
TW: The central argument for epistemicism about vagueness is very simple and straightforward. It combines standard principles of classical logic and semantics with common sense observations that almost everyone accepts. For instance, 10,000 grains make a heap but 0 grains do not make a heap. It follows by quite ordinary logical or mathematical reasoning that there is a natural number n, somewhere between 10,000 and 0, such that n + 1 grains make a heap but n grains do not make a heap. By a standard principle about truth, we can add that it is true to say of n + 1 “That many grains make a heap” and of n “That many grains do not make a heap”. But it is obvious that we have no idea how to find out which number is n. That is, in the relevant vague sense of “heap”, we are in no position to know that n + 1 grains make a heap and n grains do not make a heap, where the number n is identified by a numeral such as “29”. Consequently, the extension of “heap” has a boundary but we are in no position to know where that boundary is. This solves the ancient sorites paradox: it appears that subtracting just one grain from a heap still leaves a heap; therefore if you subtract grains one by one from a heap of 10,000 grains there is still a heap when no grains are left. The principle that subtracting a grain from a heap always leaves a heap is false, but we cannot know exactly when there ceased to be a heap. Of course, many philosophers dislike that conclusion, and try to avoid it by changing classical principles of logic or semantics or somehow restricting their application. That unwillingness to apply basic principles as soon as difficulties are encountered strikes me as methodologically unsound. I also show in the book that the suggested non-classical alternatives introduce all sorts of difficulties of their own, including analogues of the ones their proponents use against epistemicism. I came to the view as a result of writing Identity and Discrimination.
In explaining the non-transitivity of discriminability, I had formulated what I call the “margin for error principle”, which says roughly that in order to know in a given situation you must avoid false belief in relevantly similar situations, otherwise your belief in the original situation is too unsafe to constitute knowledge. I realized that the margin for error principle would also explain how, given that vague concepts have sharp boundaries, we still cannot know where they are. Such an explanation makes epistemicism a much more attractive theoretical option. Those ideas came to me late in the writing of Identity and Discrimination. My conception of vagueness was still evolving as I wrote, and I did not arrive at a fully general theory of epistemicism until shortly after it was published.
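The classical reasoning TW appeals to can be made concrete in a short sketch (my own illustration, not from Vagueness; the predicate and hidden cutoff are hypothetical stand-ins for a vague concept's sharp but unknowable boundary):

```python
import random

def cutoff(is_heap, hi=10_000):
    """Given that is_heap(hi) holds and is_heap(0) fails, classical logic
    guarantees some n with is_heap(n + 1) and not is_heap(n); find it."""
    for n in range(hi):
        if is_heap(n + 1) and not is_heap(n):
            return n
    raise ValueError("premises fail: no cutoff found")

# A toy 'heap' predicate with a sharp boundary we are in no position to know.
secret = random.randint(1, 9_999)
n = cutoff(lambda grains: grains >= secret)
print(n == secret - 1)  # True: the boundary exists even though we can't guess it
```

The epistemicist point is that the real predicate "heap" behaves like this toy one, except that, unlike `secret`, its boundary cannot be discovered by search.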
CB: What is the problem of vagueness? What difficulties do many-valued logic, fuzzy logic and supervaluationism have to face when they are used to deal with vagueness and the sorites paradox (the paradox of the heap)? What advantages does classical logic have when it is used to deal with them?
TW: The central problem of vagueness is how logic and semantics should treat vague words and concepts, which include virtually all our words and concepts – including scientific words and concepts; although they are often more precise than everyday words and concepts, they are not perfectly precise. Does classical logic or semantics need to be modified or restricted in some way for vagueness? If so, how? If not, why not? The problem arises not only for philosophers but also for linguists and for computer scientists who want to build computers that can communicate with human beings in vague language. You have mentioned three of the most popular non-classical approaches. Many-valued and fuzzy logics change classical logic. For instance, they imply that contradictions can be half-true, whereas they should be counted as definitely false. Supervaluationism changes classical semantics. For instance, it says that an existential claim such as “There is a cut-off point for being a heap” can be true without having a true instance – according to supervaluationists there is no point of which it is true to say “That is a cut-off point for being a heap”. Classical logic automatically avoids these difficulties. A major difficulty for most non-classical approaches is higher-order vagueness: borderline cases of borderline cases and so on. I argue in the book that they do not treat it in a satisfactory way. By contrast, epistemicism can easily handle higher-order vagueness, which it interprets as limits on our knowledge about limits on our knowledge. The margin for error principle predicts exactly such higher-order limits on knowledge. Classical logic is standardly used in logic, mathematics and science, where it has proved immensely successful. Philosophers who reject it rarely confront the difficulties that such a revision would make for science. In English there is a saying “A bad workman blames his tools”. 
Classical logic and semantics are amongst the philosophical workman's tools.
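The "half-true contradiction" TW mentions is easy to see in the standard fuzzy truth tables (a minimal sketch of my own, assuming the usual min/max semantics for fuzzy connectives):

```python
# Standard fuzzy semantics: conjunction is min, negation is 1 - v.
def f_and(u, v):
    return min(u, v)

def f_not(v):
    return 1.0 - v

v_p = 0.5  # a borderline sentence, valued half-true
print(f_and(v_p, f_not(v_p)))  # 0.5: "p and not-p" comes out half-true

# Classically, a contradiction is definitely false whatever p's value:
for p in (True, False):
    print(p and not p)  # False both times
```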
CB: As Dummett says, bivalence presupposes some kind of realism: it is the state of the world that makes some sentences true and others false. But the intuitionists and Dummett claim that truth is some kind of epistemic concept concerned with provability or knowability; it is only when we are able (at least in principle) to prove or know the truth-values of a sentence that we can say it is either true or false. What is your comment about such claims?
TW: I completely reject the claim that truth is any kind of epistemic concept. There is no contradiction whatsoever in the hypothesis that there are truths about our world that nobody could ever be in a position to know. The fundamental nature of truth and falsity is captured in Aristotle's principle that to say of what is that it is not, or of what is not that it is, is false, while to say of what is that it is, or of what is not that it is not, is true. For instance, the proposition that it is raining is true if and only if it is raining. That equivalence is stated in terms none of which seems to be epistemic. The burden of proof is on those who claim that truth is an epistemic concept: after all, on their own view, if their view is true then it should be provable in some sense. When one analyses attempts to argue for an epistemic conception of truth, they turn out to be very weak. Intuitionistic mathematics, developed by Brouwer as an alternative to classical mathematics on the basis of his rejection of bivalence, has remained utterly marginal within mathematics. Attempts to extend the intuitionistic programme to the natural sciences have had no serious success.
CB: It is said that your epistemic view of vagueness was at first viewed with suspicion by many people, but has become a relatively mainstream view, at least quite an influential one in research on vagueness and the sorites paradox. Is that true? By the way, what standards or criteria should we follow to evaluate a philosophical work?
TW: Before I published Vagueness, epistemicism was generally regarded as too crazy to be taken seriously, even though a few people had defended less developed versions of it. My initial hope for the book was that it would force people to take the view seriously. It has certainly had that effect: epistemicism is now normally listed as one of the main theories of vagueness. In view of its simple, classical nature, many younger philosophers regard it as the default theory of vagueness, the one to be rejected only on the basis of strong arguments. In general terms, the standards to be followed in evaluating a philosophical work are not so different from evaluating a work in some other science. Are there valid arguments for it from premises that we can independently evaluate as true? Are there valid arguments from it to conclusions that we can independently evaluate as false? What explanatory power does it have? Does it have the intrinsic virtues of simplicity and elegance?
CB: I'd also like to know why metaphor has become such a hot topic in contemporary philosophy of language. Could you explain that to me?
TW: Metaphor constitutes an intriguing challenge to a central idea in contemporary semantics and philosophy of language, compositionality, according to which the meaning of a complex linguistic expression is built out of the meanings of its constituent words (as I mentioned in talking about Davidson). That is the standard explanation of humans' ability to understand new sentences which they have never encountered before (for instance, in reading this interview), provided that they are constructed out of previously encountered words. Metaphor doesn't work like that. If I tell you that an actor is a peacock, you can probably work out what I have in mind, even though it isn't simply the result of combining the meanings my words already had. Some people have postulated a special kind of metaphorical meaning to explain the phenomenon. Perhaps the most important single work on the topic is an article by Donald Davidson, whom you were asking about before, in which he argues that there is no such thing as metaphorical meaning; there are just the literal meanings of the expressions, and the effects they cause in hearers. Although his view may be too simple, it provides a beautifully clear and powerful basis for further discussion of the topic.
CB: Also in 2000, you published your third book, Knowledge and Its Limits (Oxford). It seems that this book has been warmly welcomed and highly praised in English-speaking philosophical circles. I have read comments such as that it is “the best book in epistemology to come out since 1975” and that “It sets the agenda for epistemology for the next decade and beyond.” My question is similar to my previous ones: could you summarize the most important ideas presented in this book?
TW: The slogan of the book is “Knowledge first”. It advocates taking the notion of knowledge as a starting-point on the basis of which to understand other cognitive phenomena, such as belief, rather than vice versa. Knowledge is a mental state in its own right. Belief is a more general mental state that aspires to the condition of knowledge. Successful belief constitutes knowledge; belief that doesn't constitute knowledge is defective. The notion of knowledge can't be analysed in more basic terms. It is also used to explain the nature of justification, evidence and assertion. The view is a radical form of externalism, because it implies that our most basic cognitive states are constituted by relations to our external environment. For instance, you know that the cat sat on the mat only if the cat really did sit on the mat.
CB: As far as I know, there is a deeply rooted tradition going back to Plato's Theaetetus according to which knowledge implies truth. Quine advocates an empirical and holistic theory about “the web of belief”, and also insists on a quite radical fallibilism; he does not even accept knowledge as a legitimate concept, claiming that there is no such thing as knowledge, only true belief warranted to varying degrees. Although I will not go that far, I have some sympathy with him. I strongly doubt the thesis that knowledge implies truth, which seems to me to go against the common sense concept of knowledge. What is your opinion about this topic? What similarities and differences are there between your conception of knowledge and Quine's?
TW: Surely the common sense view is that knowledge does entail truth. If someone claims to know that Beijing is in Japan, his claim is just false. He may think that he knows that Beijing is in Japan, but he is wrong; he doesn't really know that Beijing is in Japan, for Beijing is not in Japan. He is ignorant of his own ignorance just as he is ignorant of geography. I don't think Quine would disagree with that. Even radical fallibilists can agree that knowledge entails truth, since we are fallible about whether we know. Even extreme sceptics agree that knowledge entails truth, and often assume as much in their arguments; such sceptics are radical fallibilists. Actually, it is quite difficult to give an accurate, precise definition of “fallibilism”, but it is at least clear that the claim that knowledge entails truth is perfectly compatible with the claim that there is no general type of subject matter about which we are infallible. I accept both claims. If you are happy with the concept of true belief warranted to varying degrees, then you could define “knowledge” as true belief warranted to a high degree, and knowledge in that sense would automatically entail truth. The view I defend in Knowledge and Its Limits is that the concept of knowledge is at least as basic as the concepts of belief and warrant. Quine's distaste for the concept of knowledge was based on unwarranted reductionist assumptions.
CB: What are the opposite doctrines with which you intend to fight in this book? How do you object to them? Could you outline your main arguments against them?
TW: I'm opposing a long tradition in epistemology that takes belief as more basic than knowledge and tries to analyse knowledge in terms of belief, truth and other factors. I'm also arguing against related doctrines of internalism, which give epistemological primacy to purely internal states of the subject as the starting-point of epistemology. Another target is scepticism, the arguments for which presuppose some sort of internalism even when they appear not to. One argument against the attempt to analyse knowledge in terms of belief is that literally hundreds of attempts have been made so far and all of them have failed; either there are counterexamples to the analysis or it is circular. The view I defend explains why all such attempts must fail. I also argue that the motivation for internalism has been the assumption that there is a core of mental states that are luminous in the sense that whenever one is in them, one is in a position to know that one is in them. For instance, internalists tend to think that the state of being in pain is luminous, in that whenever one is in pain, one is in a position to know that one is in pain. I argue that the only luminous states are trivial, in the sense that one is always or never in them. For instance, pain is a non-trivial state, because one is in pain at some times and not at others. Therefore being in pain is a non-luminous state: one can be in pain without being in a position to know that one is in pain. Thus the motivation for internalism is unsound.
CB: The argument against luminosity seems to have an important function in your epistemology. Could you clarify this argument again for us? Why do you claim that epistemology cannot be fully operationalized? Could you argue for that briefly?
TW: The anti-luminosity argument considers a process by which slow changes lead you into or out of the mental state in question. For instance, an excruciating pain very gradually diminishes until you are completely comfortable. I use a version of the margin for error principle to show that such a gradual transition can occur only if the state is not luminous; for instance, at some intermediate stage in the process, near the boundary between being in pain and not being in pain, there must come a point at which you are in pain but are not in a position to know that you are in pain. Thus the arguments of Knowledge and Its Limits are closely related to the epistemology of Vagueness. However, I show that one can accept the anti-luminosity argument whether or not one accepts epistemicism about vagueness. As for “operationalizing” epistemology, by that I mean turning it into the evaluation of rules for acquiring, retaining and rejecting beliefs that are operational in the sense that whenever one obeys the rule one is in a position to know that one is obeying it. Of course that would make obeying the rule a luminous state, and it is not a trivial state. But, by the anti-luminosity argument, only trivial states are luminous. Thus there are no states of the sort that operationalized epistemology requires. Consequently, epistemology can't be operationalized.
CB: You identify a person's total evidence with what the person knows, in symbols, E = K. Such a concept of evidence seems to be quite counter-intuitive. Is this kind of evidence still objective or subjective? Could it still serve us to reach truth?
TW: I regard E = K as a very natural, common sense view of evidence. It does not make your evidence independent of your personal situation, for what you know depends on your personal situation – for instance, on what events you happened to turn your head and notice yesterday. But equally it does not make evidence purely subjective, for since knowledge requires truth, E = K ensures that all evidence consists of truths (although not all truths are evidence). Thus evidence can serve us to reach truths. The probability of truths we don't know can be evaluated on the basis of the truths we do know. Of course we are not always in a position to know whether something constitutes part of our evidence, since we are not always in a position to know whether we know it, but the anti-luminosity argument shows that that is not an objection to E = K, since whatever evidence is, the state of having a specified proposition as part of one's evidence will not be luminous. Therefore, one can have that proposition as part of one's evidence without being in a position to know that it is part of one's evidence. We must simply learn to live with that fact.
CB: Does your new conception of knowledge have any implications for the development of epistemic logic? What are your comments about the current situation of epistemic logic? Could you predict the prospects of epistemic logic?
TW: On the new conception of knowledge, we should reject two of the axioms of epistemic logic known as “positive introspection” and “negative introspection”, except in drastically idealized situations. Positive introspection says that if you know something, you know that you know it. Negative introspection says that if you don't know something, you know that you don't know it. The founder of formal epistemic logic, Jaakko Hintikka, already gave decisive reasons for rejecting negative introspection, but he held on to positive introspection. I argue in Knowledge and Its Limits that positive introspection fails, by a special case of the anti-luminosity argument. Much of the recent development of epistemic logic has been in computer science and theoretical economics, both of which are concerned with common knowledge, where everybody knows something, and everybody knows that everybody knows it, and so on. Epistemic logic turns out to be a good setting for the study of common knowledge. The discussion there tends to be technically sophisticated but philosophically naïve. I'm trying to encourage epistemologists to make more use of epistemic logic to model problems in which they are interested. The technicians have much to learn from philosophers and vice versa, but it is hard to overcome the barriers to communication that arise from differences in training and goals.
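The failure of positive introspection can be illustrated with a tiny Kripke model (my own construction, not one from the book): knowledge is modelled as truth at every accessible world, and positive introspection fails as soon as accessibility is not transitive.

```python
# Worlds 1, 2, 3; accessibility R is reflexive but not transitive
# (1 sees 2, and 2 sees 3, but 1 does not see 3).
R = {1: {1, 2}, 2: {2, 3}, 3: {3}}
p = {1, 2}  # worlds where the proposition p is true

def K(prop, w):
    """The agent knows prop at w iff prop holds at every world accessible from w."""
    return all(v in prop for v in R[w])

Kp = {w for w in R if K(p, w)}  # worlds where the agent knows p
print(K(p, 1))   # True: p holds at worlds 1 and 2
print(K(Kp, 1))  # False: at world 2 the agent doesn't know p (world 3 is a not-p world)
```

So at world 1 the agent knows p without knowing that she knows p; in modal terms, the positive introspection axiom corresponds to transitivity of accessibility, which this model lacks.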
CB: You claim that the gap between truth and knowability is pervasive; there are unknowable truths. Could you summarize your arguments for them here? Could you also outline your views about the paradox of the surprise examination and the paradox of knowability?
TW: Epistemicism about vagueness, the anti-luminosity argument and the rejection of positive and negative introspection all involve truths that cannot be known, at least in the relevant circumstances. My treatment of the surprise examination paradox is similar. An example is a teacher who informs her pupils that on just one day in the year they will be given an examination, and that they will not know on the morning of the examination that it will be that day. The argument is that it cannot be on the last day, since that morning they will know that it is the only day left for the examination. Therefore they can rule out an examination on the last day in advance. Therefore the day before the last is the last possibility. But then they can rule out an examination on that day too, by an analogous argument. By continuing the argument in advance for each day of the year, the pupils can “prove” that there cannot be such an examination. But clearly, if the teacher is trustworthy the pupils can know in advance that there will be a surprise examination. The solution is to see that even if one knows something today, it does not follow that one knows today that one will still know it tomorrow. This is simply to reject a diachronic version of the positive introspection principle. In Knowledge and Its Limits I also defend against recent criticisms a very different sort of argument for the existence of unknowable truths. The argument was devised by Alonzo Church but first published by Frederic Fitch. It goes like this. There are unknown truths. For instance, either the number of books in my office on 1 January 2008 was odd or it was even. Since I did not count then, and too much has changed since, nobody will ever know which it was. Thus either it is an always unknown truth that the number of books was odd or it is an always unknown truth that the number of books was even. 
We can allow that those truths, although always unknown, are not unknowable, since on 1 January 2008 somebody could have known one of them by counting the books in my room. However, if it is an always unknown truth that the number of books was odd, then it is an unknowable truth that it is an always unknown truth that the number of books was odd, for if anybody knew that it was an always unknown truth that the number of books was odd, they would thereby know that the number of books was odd, in which case it would not be always unknown that the number of books was odd, in which case they would not after all know that it was an always unknown truth that the number of books was odd (since knowledge requires truth; the whole argument uses reductio ad absurdum). Similarly, if it is an always unknown truth that the number of books was even, then it is an unknowable truth that it is an always unknown truth that the number of books was even. Thus, either way, there is an unknowable truth. Anti-realists often call the argument the paradox of knowability, because they don't like the conclusion. In my view it is not a paradox, just a surprisingly neat argument from true premises to a true conclusion.
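The core of the Church–Fitch reasoning can be put schematically (a sketch abstracting from the temporal qualifications in TW's version; here $p$ is any proposition, $Kp$ abbreviates "it is at some time known by someone that $p$", and $\Diamond$ expresses possibility):

```latex
\begin{align*}
1.\quad & K(p \land \neg Kp)               && \text{assumption, for reductio}\\
2.\quad & Kp \land K\neg Kp                && \text{from 1: } K \text{ distributes over } \land\\
3.\quad & Kp \land \neg Kp                 && \text{from 2: knowledge entails truth}\\
4.\quad & \neg K(p \land \neg Kp)          && \text{from 1--3, by reductio}\\
5.\quad & \neg \Diamond K(p \land \neg Kp) && \text{since 1--4 hold as a matter of logic alone}
\end{align*}
```

So whenever $p \land \neg Kp$ is true – whenever there is an unknown truth, such as the proposition about the number of books – line 5 says that this very fact is itself unknowable.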
CB: I'm afraid that your epistemicism about vagueness and your claim that there are unknowable truths will imply some kinds of agnosticism, which means that we are unable in principle to know some parts or some aspects of the world, e.g., in borderline cases. What do you think? Is agnosticism acceptable or not in epistemology? If so, how do we draw the boundary between the knowable and the unknowable?
TW: My views do entail a limited agnosticism, according to which we must accept that there are some truths we can't know. However, there are also many truths we can know – even about whether there is a god. The very same epistemological principles explain ignorance in some cases and the possibility of knowledge in others. I see no objection to such agnosticism, provided that it does not turn into scepticism. In some very clear cases we know that we know things. The failure of positive introspection just means that when we know, we can't always know that we know; it does not mean that when we know, we can't ever know that we know. Similarly, the failure of negative introspection just means that when we don't know, we can't always know that we don't know; it does not mean that when we don't know, we can't ever know that we don't know. Arguments of the sorts I have been explaining give us considerable knowledge about the location of the boundary between knowability and unknowability, but they also show that we cannot have complete knowledge of its location. That's what life is like.
CB: Recently I read your new book, The Philosophy of Philosophy (Blackwell, 2007). I think it is quite important, original and also provocative. But I also find it not so easy to read and understand, because its topics and ideas are quite new, some of its chapters quite technical, and its arguments quite complicated. Could you summarize the most important ideas presented in this book?
TW: One main aim of the book is to defend the legitimacy of the traditional armchair method of philosophical reasoning – most philosophers don't do experiments like those of the natural sciences – but without doing so by restricting the subject matter of philosophy to our own words or concepts. I regard philosophy as part of our overall attempt to understand the world in general and our place in it in particular. Thus the question arises: how can armchair methods give us knowledge of the world outside our armchairs? The example of mathematics is encouraging, because it uses armchair methods but still gives us valuable knowledge of the world. I argue that the armchair methods of philosophy are developments out of quite ordinary methods of gaining knowledge of the world. For instance, the thought experiments that give us knowledge about metaphysical possibility and metaphysical necessity are really just special cases of our ordinary ability to evaluate counterfactual conditionals (such as “If only ten people had come to the party, it would have been a failure”) by a disciplined use of the imagination. It is wrong to think of the imagination as primarily a realm of capricious fantasy; it plays a significant cognitive role. Naturally, logic also plays an important role in armchair reasoning. In defending armchair methods, I don't mean that the results of scientific experiments are never relevant to philosophical questions, just that their relevance in principle does not undermine the use of armchair methods too. After all, even though the results of experiments are in principle relevant to mathematical questions too, an armchair methodology is still usually the most efficient way of deciding a mathematical question. For some philosophical questions – for instance, about the philosophy of perception or the philosophy of space and time – experimental results are important. For others – for instance, about the relation between possibility and existence – they do not help.
CB: You assert that the “linguistic turn” and the “conceptual turn” are past and should be left behind, because they are wrong. Could you explain these claims more clearly? For example, what is the linguistic turn? What is the conceptual turn? Why are they wrong? For what reasons? Especially, could you further explain the aboutness of language and thought?
TW: The linguistic turn was one of the most important features of twentieth-century philosophy. It was the movement, in both analytic and non-analytic traditions, by which language became the central focus of philosophy, either as its subject matter – the idea that philosophy studies the nature and structure of our actual language or of artificial formal languages – or as its method – the idea that philosophizing consists largely of applying the competence one has as a speaker of such a language. Philosophers such as Ludwig Wittgenstein, Rudolf Carnap and J. L. Austin constitute diverse paradigms of the linguistic turn on the analytic side; Derrida might be regarded as an example on the non-analytic side. The conceptual turn is similar but substitutes thought for language and mental concepts for words in the central role. In his later work, Peter Strawson represented the conceptual turn as much as the linguistic turn, and those like Gareth Evans who advocated the priority of the analysis of thought over the analysis of language in philosophy went further. Both the linguistic and conceptual turns can be interpreted as defences of the armchair methodology of philosophy by restricting the ambitions of philosophy to what we still have in the armchair: our own language and thought. The linguistic turn and the conceptual turn are closely related because a very pertinent feature of both words and concepts is their intentionality, their aboutness, in the sense of their capacity to refer. For instance, both the word “water” and the concept water refer to the substance water. One motivation for writing the book was that it has become increasingly clear that much recent analytic philosophy doesn't fit either turn. For instance, philosophers of physics do not primarily study the words “space” and “time” or the concepts space and time, or even what words or concepts should replace them; they primarily study space and time themselves. 
Nor do they primarily rely on their competence simply as speakers of a language or possessors of the concepts in their methodology. They rely on the results both of thought experiments and of physicists' experiments. I argue through a detailed case study that there is a similar shift of focus away from words and concepts even in parts of philosophy where physical experiments do not play such a role.
CB: I'd like to know what heritage the last century of Anglo-American philosophy left us. What advantages and disadvantages has analytic philosophy had? What can we learn from it, whether good or bad? Would you still like to call yourself an analytic philosopher, since you are a philosopher and at the same time a logician?
TW: I'm happy to call myself an analytic philosopher, because it is obvious that I belong to that tradition. But I don't regard the label as very informative, since there is so much variety within the tradition. For instance, Wittgenstein and Quine both count as analytic philosophers, even though they are very different from each other. Of course, not all analytic philosophers are Anglo-American and not all Anglo-American philosophers are analytic. Several giants of early analytic philosophy, such as Gottlob Frege and Carnap, were German. Some Anglo-American philosophers are post-modernists rather than analytic philosophers (fortunately, not too many). One of the most important developments in recent decades has been the rapid growth of analytic philosophy outside English-speaking countries beyond those (such as Sweden) in which it has a long and rich history. For example, there are now many analytic philosophers in China. I see no tension between being a philosopher and being a logician. Many leaders of analytic philosophy have been logicians. Frege, Russell, Quine and Kripke are obvious examples. Their logic is just the more formal part of their philosophy. At one time many analytic philosophers regarded the linguistic turn as the central achievement of analytic philosophy. As I've said, I regard it as a mistake. Nevertheless, when one looks back over the past century, one sees that analytic philosophy over that period has left a huge positive heritage, and not one that requires us to scale down the ambitions of philosophy in the way the linguistic turn and the conceptual turn did. The achievement is too huge and varied to summarize properly, but I'll single out a few of its most important methodological aspects. 
First, the refinement of a straightforward style of philosophical reasoning in plain ordinary language with clearly stated theses, explicitly articulated arguments, carefully presented evidence, common sense counterexamples and imaginative thought experiments, in a spirit that encourages the reasoned discussion of disagreements. Second, the application of the semantics of natural languages to reflect more deeply and systematically on the semantic structure of philosophical theses, and thereby to evaluate the validity of informal philosophical arguments more accurately (this is an instrumental role for semantics, not the primary role envisaged in the linguistic turn). Third, the use of formal languages and logics to provide more precise versions of philosophical theses and to elicit their consequences, sometimes under judiciously idealized assumptions. Fourth, the integration of scientific knowledge – for instance, from mathematics, physics, biology and psychology – into philosophical reflection on the relevant domains. None of these trends originated with analytic philosophy – one can find early versions of all of them in Aristotle – but they have all been taken significantly further in analytic philosophy than ever before. Of course, we also have a huge legacy of more specific knowledge from the past hundred years of analytic philosophy, not just of logic and semantics but also of the variety of theoretical options available in respect of various philosophical problems and the difficulties those options face.
CB: You argue that philosophy is no exception among the other sciences in its subject matter, methodology, evaluation criteria, etc. You fiercely criticize philosophical exceptionalism, and argue that it is false. From your arguments I can detect some shadows of Quine's views, e.g., his conception that philosophy is continuous with science, that philosophy also belongs to the totality of our knowledge of the world, etc. Could you explain the similarity and difference between your position and Quine's about the relation between philosophy and the sciences?
TW: You are right that my view of philosophy is similar to Quine's in regarding philosophy as continuous with, or even overlapping, other branches of human knowledge. The main difference is that Quine privileges natural science, especially physics, over all the other branches of human knowledge. He tends to reject anything that isn't reducible to fundamental physics. I'm much less reductionist than Quine. For instance, he thinks that ultimately the concepts of belief and desire are not scientifically respectable, because they can't be expressed in the language of physics. But the language of physics was never intended to be a universal language. It may be the best language for the questions of physics, and physicists are the best people to ask about the answers to those questions, but not all good questions are questions of physics; other words are needed for other questions. If physics does not entail that there are things of a certain kind (such as musical symphonies) it does not follow that physics entails that there are not things of that kind. I take our knowledge of history and mathematics just as seriously as I take our knowledge of physics. Philosophy too has some limited autonomy: it can't ignore considerations from physics, but it must also pay due respect to armchair reasoning.
CB: You seem to divide analyticity into three or four categories: metaphysical, epistemological, and modal or Fregean analyticities. You also argue that analyticity is substantial, not insubstantial; analytic truths and synthetic truths are not true in radically different ways. Could you explain this more clearly?
TW: According to one form of the linguistic or conceptual turn, the starting point for philosophy consists of analytic or conceptual truths. To assess such claims we must ask what “analytic” means in them. Roughly speaking, a sentence is metaphysically analytic if it is true simply in virtue of its meaning. Modal analyticity is one way of clarifying metaphysical analyticity; a sentence is modal-analytic if having that meaning necessitates being true. I allow that some truths, such as truths of logic and mathematics, are modally-analytic, but deny that this shows that they express something epistemologically or metaphysically obvious or insubstantial, for it is not given that necessary truths themselves are in any way obvious. A sentence is epistemologically analytic if assenting to it is a necessary condition of understanding it. I argue that no sentence is epistemologically analytic, because a competent speaker can always dissent on theoretical grounds; even if those grounds are mistaken, a theoretical error does not amount to linguistic incompetence. A sentence is Frege-analytic if it is synonymous with a logical truth. I allow that some truths, such as truths of logic, are Frege-analytic, but again deny that this shows that they express something epistemologically or metaphysically obvious or insubstantial, for it is not given that logical truths are in any way obvious. In general, ideas of analytic or conceptual truth do not illuminate the epistemology of philosophy in the way that proponents of the linguistic or conceptual turn had hoped. They do not deflate or explain philosophical knowledge. To argue that supposedly analytic and supposedly synthetic truths are true in the same way, I show that their truth is handled in a uniform way by logic and semantics, the two disciplines most concerned with the concept of truth.
CB: I'd like to know your view of the nature of logical truth: What is a logical truth? Are logical truths a priori and necessary? In what sense? Is logic revisable? If so, how to revise it? Do you agree with revisionism or fallibilism in philosophy of logic? Why?
TW: In my view, Alfred Tarski gave the best account of logical truth. It is one that has proved immensely fruitful in the development of logic as a discipline over the past seventy years. First we divide the basic vocabulary of the language into the “logical” and the “non-logical”. The “logical” vocabulary typically consists of words such as “is”, “not”, “and”, “or”, “all”, “some”, that is, words whose meaning is structural, in a sense that can be made more precise; we may also include more specific words such as “know” in epistemic logic if we are interested in their structural features. Then a sentence is a logical truth if and only if it is true under every interpretation that keeps the meaning of the logical words fixed but may vary the interpretation of the other words. For instance, “If grass is green then grass is green” is a logical truth because it is true no matter what “grass” and “green” refer to, given the fixed meaning of “if”. “Grass is green” itself is a truth but not a logical truth, since it is false under the interpretation on which “grass” refers to water and “green” to dry things. The fact that this definition is free of epistemological or modal elements is an advantage, because it makes the concept of logical truth pure and simple, as a basic conceptual instrument should be. The inclusion of such epistemological or modal elements in the definition of logical truth would have severely obstructed the development of modern logic. It is not automatic that logical truths in Tarski's sense are a priori or necessary. Some people don't like that consequence but I see nothing wrong with it. Logic is in some sense concerned with formal or structural matters; why should it be taken for granted that the answers to such questions are a priori or necessary? 
When we restrict ourselves to the most elementary branches of logic, such as propositional logic, we can argue on more specific grounds that all logical truths in Tarski's sense are a priori, in the sense of provable, and metaphysically necessary. For less elementary branches of logic, the picture changes. For instance, second-order logic – which allows us to generalize into predicate position as well as into name position – is essentially incomplete; it is impossible to have a formal system of proofs in which all and only the logical truths of second-order logic are provable. Thus one cannot expect all logical truths of second-order logic to be knowable a priori. In modal logic – the logic of possibility and necessity – one can include a special operator “actually”, analogous to “now” in tense logic, and doing so results in logical truths that don't express necessary truths. An example is “It is raining if and only if it is actually raining”, which is a logical truth because it is true no matter how “raining” is interpreted; but it is not a necessary truth because it could have rained even if it is not raining in the actual world. Tarski's conception of logical truth nicely fits fallibilism about logic, since it makes it possible in principle to think that a sentence is a logical truth when in fact it isn't. In practice we know what the logical truths are in elementary logic, just as we know what the arithmetical truths are in elementary arithmetic. In less well understood branches of logic, we may well be in error about whether some particular sentence is a logical truth, and be forced to revise our opinion, without any change in what the sentence means. Such changes would not be radically dissimilar to changes of theory in other sciences.
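Tarski's definition can be illustrated for the special case of propositional logic with a brute-force check over interpretations. This is a hypothetical sketch, not TW's own formalism: the function names are my own, and "interpretation" is simplified to a truth-value assignment to the non-logical (atomic) parts, with the meanings of the logical words held fixed.

```python
from itertools import product

def is_logical_truth(formula, atoms):
    """A formula is a logical truth iff it comes out true under every
    interpretation (here: truth-value assignment) of its atomic parts."""
    return all(formula(dict(zip(atoms, values)))
               for values in product([True, False], repeat=len(atoms)))

# "If grass is green then grass is green": true however the atomic
# sentence "grass is green" is interpreted, given the fixed meaning of "if".
conditional = lambda v: (not v["grass_is_green"]) or v["grass_is_green"]

# "Grass is green" alone: a truth, but not a logical truth, since it is
# false under some interpretation of its non-logical vocabulary.
plain = lambda v: v["grass_is_green"]

print(is_logical_truth(conditional, ["grass_is_green"]))  # True
print(is_logical_truth(plain, ["grass_is_green"]))        # False
```

The epistemological and modal silence of the definition shows up in the code: nothing in it mentions proof, apriority or necessity – only truth under interpretations.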
CB: By the way, what do you think about Quine's paper “Two Dogmas of Empiricism” and its influence in analytic philosophy? Moreover, what is your general evaluation of Quine's philosophy, especially his naturalized epistemology?
TW: Quine's criticism of the analytic-synthetic distinction in “Two Dogmas” has been immensely influential, but his arguments against the distinction look less and less compelling. Although he showed that attempts to define it involve other semantic concepts, he didn't show what is wrong with semantic concepts – such as synonymy – in general. After all, they are used in semantics as a branch of linguistic science. His objection to them seems to be that they are too unclear because they can't be reduced to purely behavioural terms, but few philosophers these days would accept his implicit reductionist or behaviourist premise. Nevertheless, what surprised me in writing on analyticity was to see how well Quine's conclusions about the analytic-synthetic distinction stand up even if one rejects his scepticism about meaning. No such distinction can do the work that philosophers such as Carnap expected; it can't trivialize the epistemology or metaphysics of a bunch of truths called “analytic”. There are indications in Quine's own writings of that alternative, less ideological critique of an analytic-synthetic distinction. Other aspects of Quine's work seem very dated, for instance his behaviourism: fifty years ago Noam Chomsky demolished the pretensions of behaviourism to be the scientific approach to mind. Quine's reductionism looks far too narrow and preconceived. But some of his comments on logic still come as a breath of fresh air. As for Quine's naturalized epistemology: I agree with him that the acquisition of knowledge is a process within the natural world, but that point by itself has comparatively few methodological consequences, if one doesn't combine it with his extreme reductionism. For instance, it does not force one to try to translate the words “knowledge” and “belief” into the language of fundamental physics. That language probably can't express any interesting epistemological questions at all. 
However, one can see Quine's naturalized epistemology as an extreme manifestation of a more general shift in epistemology, away from a Cartesian internalist approach, on which the starting-point is restricted to what the subject knows. In studying someone's epistemic situation, we as theorists can use all our knowledge of the world, even if the person we are studying doesn't share that knowledge. For instance, we can use knowledge gained from experiments on the psychology of perception. This more externalist approach to epistemology has been one of the most important developments in the period since 1960. One sees it in epistemologists such as Alvin Goldman, Gilbert Harman, Robert Nozick and Hilary Kornblith and in Knowledge and Its Limits, and it is at least part of what Quine meant by naturalized epistemology. I use it in The Philosophy of Philosophy too, for instance when I appeal to the results of experiments on the psychology of reasoning in discussing the epistemology of logic.
CB: You argue that like all other sciences philosophy also needs thought experiments. You also said, “Philosophy will benefit from an attitude that looks more actively for opportunities for model-building.” Could you explain how to apply thought experiments and model-building methodology in philosophy? Please give some example(s).
TW: A famous example of a thought experiment is Bertrand Russell's imaginary example of looking for a moment at a stopped clock that by luck shows the correct time. The person acquires a true belief, even in some sense a justified true belief, about what time it is, but the belief does not constitute knowledge. It follows that neither true belief nor even justified true belief is sufficient for knowledge. In imagining the example and judging that the person involved would not know the time one is performing a simple thought experiment. One learns something about what would be the case in a specific possible situation, from which we can infer something about the properties involved in the thought experiment – in this case, about the nature of knowledge. Epistemologists and other philosophers use such thought experiments continually. Models are different from thought experiments. Typically, models use simplifications and idealizations in order to achieve a formal description of a possibility whose consequences can then be elicited by formal reasoning rather than by imagination-based judgement. For instance, we can set up a model in epistemic logic as follows. The “possible worlds” are the numbers 0, 1 and 2. A “proposition” is a subset of the set {0, 1, 2}. A proposition is “true” at a world if and only if the world is a member of the proposition. A relation called “accessibility” holds between worlds if and only if the numerical difference between them is at most 1. The proposition that one knows a proposition X contains a given world if and only if every world accessible from that one is a member of X. Thus if X is the proposition {0, 1}, the proposition that one knows X is just {0}. For the worlds accessible from 0 are 0 itself and 1, both of which belong to {0, 1}, whereas world 2 is accessible from worlds 1 and 2 and does not belong to {0, 1}. 
One can then conclude that the proposition that one knows that one knows X is the proposition that one knows {0}, which is the empty set {}, since there is no world from which only world 0 is accessible. Thus at world 0 one knows X without knowing that one knows X. In other words, we have constructed a formal counterexample to the positive introspection axiom. The argument that the axiom fails in the model is purely mathematical. That is an advantage over a thought experiment. Another advantage is that the formal consistency of the scenario is clear from the way in which it was specified. A disadvantage is that the model builds in the unrealistic, idealized assumption that one knows all the logical consequences of what one knows. One has to argue that the idealization is irrelevant to the point we are currently interested in, positive introspection, and that the example is sufficiently realistic in more relevant respects. Those sorts of issue arise for models in natural science too and can often be resolved by good judgement without destroying the value of the model. Such a methodology is applicable to philosophy too.
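The model just described can be written down directly. This is an illustrative sketch – the function names are my own – but the worlds, the accessibility relation and the knowledge operator follow TW's specification exactly:

```python
# Worlds are the numbers 0, 1, 2; a proposition is a set of worlds.
WORLDS = {0, 1, 2}

def accessible(w):
    """Worlds accessible from w: those differing from w by at most 1."""
    return {v for v in WORLDS if abs(w - v) <= 1}

def K(prop):
    """The proposition that one knows `prop`: the set of worlds from
    which every accessible world is a member of `prop`."""
    return {w for w in WORLDS if accessible(w) <= prop}

X = {0, 1}
print(K(X))     # prints {0}: at world 0 one knows X
print(K(K(X)))  # prints set(): at no world does one know that one knows X
```

The check that positive introspection fails – that 0 belongs to K(X) but not to K(K(X)) – is here a purely mechanical computation, which is exactly the advantage of a model over a thought experiment.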
CB: You argue that philosophy needs and uses evidence as all intellectual disciplines do, and that evidence consists of facts or true propositions. I want to know, in your opinion, what on earth is a fact? Is a fact objective or subjective? How do we individuate facts, in other words, how do we count them? E.g., “We are talking about your new book”, “We are sitting in armchairs”, and “Our tongues are making sounds” – are these different versions of the same fact or different facts? How do we discriminate between them?
TW: The discussion of evidence in The Philosophy of Philosophy is one of the places in which I'm using the epistemology of Knowledge and Its Limits, emphasizing that we are not always in a position to know whether something is part of our evidence. The equation E = K applies to philosophy too. As for the notion of a fact, I don't appeal to a highly metaphysical conception of facts. It suffices for my purposes to treat facts as true propositions. Since truth is an objective matter, so is being a fact. If we are talking about my new book in the normal way while sitting in armchairs then the proposition that we are talking about my new book, the proposition that we are sitting in armchairs and the proposition that our tongues are making sounds are all true, so they are all facts. They are three distinct propositions because none of them is necessarily equivalent to any of the others. We could talk about my new book without sitting in armchairs; we could sit in armchairs without our tongues making sounds; our tongues could make sounds without our talking about my new book. Thus they are three distinct facts. Although more questions can be asked about the individuation of facts, the epistemology of The Philosophy of Philosophy is compatible with any reasonable answer to them.
CB: You claim that experience has at least two different functions: evidential and enabling. Past experience can mould and regulate our imagination and our habits and abilities of judgement, so all knowledge, including even our mathematical intuition, has some distant connection with past experience. The use of the categories of the a priori and a posteriori to characterize our knowledge produces little insight; in fact, it is wrong. Could you explain further? Could I call you an empiricist in epistemology?
TW: In playing an evidential role, experience supplies us with new evidence: for instance, the fact that a man in a red shirt is walking past is added to my evidence when I see that a man in a red shirt is walking past. In playing an enabling role, experience enables us to acquire new concepts without supplying us with new evidence: for instance, we may acquire the concept of red through experience of the application of the word “red” by others to visibly red things. On the standard way of drawing a distinction between “a priori” and “a posteriori” knowledge, a priori knowledge is knowledge in which experience plays no evidential role but may play an enabling role. A posteriori knowledge is knowledge in which experience plays an evidential role. For instance, on that conception, one can know a priori that all red things are coloured, even though one needed experience to acquire the concept of red, because that is a purely enabling role. One can only know a posteriori that a man in a red shirt is walking past. The problem I raise for this distinction is that experience often plays a role in armchair knowledge that is neither purely enabling nor strictly evidential. In doing thought experiments properly we use our skill in applying the relevant concepts offline, in the imagination – for instance, in the case of the stopped clock, our skill in applying the concept of knowledge. We are not really experiencing someone looking at a stopped clock. Experience is playing a less than evidential role here. But even if the concept has an innate basis, our skill in applying it – which exceeds what is required merely to possess the concept – was developed through online experience in perception of actual cases of knowledge and ignorance or related phenomena and actual uses of the word “know”. That is a more than enabling role for experience. 
I argue in The Philosophy of Philosophy that however one tries to fit the a priori/a posteriori distinction to such examples, it does not work in the intended way, and obscures underlying similarities and differences between cases. I don't think of myself as an empiricist, but rather as someone trying to find a middle way between empiricism and rationalism. Empiricists emphasize a posteriori knowledge; rationalists emphasize a priori knowledge. I'm emphasizing knowledge that doesn't fit comfortably into either category. Another respect in which I'm not particularly empiricist is that I believe on the basis of evidence from linguistics and psychology that there is a large innate element in human cognition.
CB: Could you briefly explain the role of intuition in philosophy? What is your opinion about philosophy of mind, especially the studies on intentionality in contemporary philosophy?
TW: Some philosophers talk as though intuition is a separate type of cognition, one that we supposedly need in order to do thought experiments, for instance. When I came to analyse such supposed examples of intuition, I realized that what was going on was less strange and distinctive than that. For instance, in a thought experiment one makes a judgement whose content is a counterfactual conditional about what would be the case if certain conditions obtained: “If those conditions obtained, he wouldn't know the time”. I reject the concept of intuition as misleading. You are right that these issues are related to the philosophy of mind. Few philosophers of mind use the concept of intuition. Parts of Knowledge and Its Limits concern the philosophy of mind, where I discuss how an agent's knowledge and other environment-involving mental states can causally explain why they acted as they did. Just as I take reference to objects in the external environment as a central case of intentionality, so I take knowledge of states of affairs in the external environment as a central case of thought. It is an externalist, success-oriented philosophy of mind, in which unsuccessful cognition (such as false beliefs and empty names) is understood in relation to successful cognition, malfunctioning in relation to functioning. By contrast, internalists take as basic mental states that are neutral between success and failure, such as belief (which may be true or false) and apparent reference. From a methodological point of view, the philosophy of mind raises fascinating questions. In the 1980s many people thought that it was taking over from the philosophy of language in the driving-seat of philosophy, in some cases because they thought that the conceptual turn was more fundamental than the linguistic turn, in other cases because they thought that breakthroughs in experimental psychology and neuroscience would transform philosophy. Philosophy of mind since then hasn't really lived up to that promise. 
Indeed, no one branch of philosophy is clearly central to the subject, even if some are more peripheral than others: instead, like most other academic disciplines, philosophy has become more specialized. Experimental work has certainly contributed to our philosophical understanding of the mind, and will continue to do so, but the idea that we could just read our philosophy of mind off the experimentalists' conclusions was always naïve.
CB: You said, “Logic is a branch of both philosophy and mathematics; it cannot be separated into mutually exclusive contributions from mathematics and philosophy.” I agree with you. Recently you finished a new paper, “Logic and philosophy in the 21st century”. Could you clarify the relationship between logic and philosophy in the 21st century? What can logic do for philosophy, and what can philosophy do for logic in this century?
TW: Logic is necessary but not sufficient for solving philosophical problems. To make progress in philosophy we need to use long chains of reasoning in areas where it is very hard to distinguish valid from invalid reasoning. Logic helps us to do that more reliably. For instance, the development of quantified modal logic enables us to reason far more securely than before about possibility, necessity and existence. Secondly, logic plays a central role in the sort of model-building activity that I was talking about before – I gave an example from epistemic logic. Thirdly, logic contributes specific results to many branches of philosophy, not only to the philosophy of mathematics. For instance, one of the most active and fruitful areas of research on truth concerns solutions to the paradox of the liar, who says “What I am now saying is not true”– is what he is saying true? Most of that work requires difficult techniques from mathematical logic. The logic of action is becoming increasingly important too, with applications both to the philosophy of action and to computer science. In the reverse direction, philosophy will supply new problems for logicians to try to solve. Just as modal logic and epistemic logic emerged from philosophical concerns but then underwent autonomous technical development, the same thing will happen in other areas. Logicians are on the look-out for new problems, which philosophers can sometimes see because they are looking from new, unexpected angles. A possible example is the logic of completely unrestricted generality, which concerns the logic of generalizations about absolutely everything. The motivation is philosophical but many of the problems are tricky technical ones.
CB: You use formal methods extensively in your philosophical research. But you also claim, “it should be clear that informal methods are also essential to philosophy, for formal methods, like all others, easily mislead when applied without good informal judgement.” I agree with you strongly. Could you explain more about these claims?
TW: Most philosophical problems initially present themselves in vague informal terms. To apply formal methods, one must formalize the problem, express it in precise formal terms. One needs good informal judgement to determine whether the formalization captures what is most important about the original problem. For instance, are the simplifications or idealizations made in the process of formalizing harmless, or do they omit the heart of the difficulty? Judging the plausibility of formalized premises is also an informal matter. And so on. There may even be problems to whose solution formal methods have nothing at all to contribute, although I'm a bit sceptical of that.
CB: Could you list ten important academic papers from among all your publications so far? Could you describe the main contribution of each in a single sentence?
TW: I revised many of my best-known articles for incorporation into my books. It will be more interesting for me to describe ten that have not undergone that process. They will help to give a better sense of the variety of my work. My very first published article was “Intuitionism Disproved?” (Analysis, 1982), a short piece in which I showed that the argument of the “paradox” of knowability against the anti-realist principle that all truths are knowable doesn't work by itself against many anti-realists who regard intuitionism as the right logic for anti-realism; of course, I reject the anti-realist principle, but I think more argument against it is needed. In “Converse Relations” (Philosophical Review, 1985) I argued that pairs such as kicking and being kicked by or above and below correspond to the same language-independent feature of reality and explained some of the logical and semantic problems raised by that idea. “First Order Logics for Comparative Similarity” (Notre Dame Journal of Formal Logic, 1988) discusses the logic of sentences such as “Anna is more similar to Betty than Chris is to David”. “Two Incomplete Anti-Realist Modal Epistemic Logics” (Journal of Symbolic Logic, 1990) explores what the logic of an anti-realist theory in which truth is equivalent to the possibility of some specified epistemic status would have to look like in order for the theory not to collapse. “Representing the Knowledge of Turing Machines” (Theory and Decision, 1994) was co-authored with the theoretical economist Hyun Song Shin, who was then a colleague of mine in Oxford; it concerns the way in which epistemic logic can be constrained by the assumption that agents are not more intelligent than computers. “Bare Possibilia” (Erkenntnis, 1998) develops a metaphysical conception that allows one to defend the Barcan formula, a controversial principle about the relation between existence and possibility that greatly simplifies quantified modal logic. 
“On the Structure of Higher-Order Vagueness” (Mind, 1999) takes the analysis of vagueness in my book Vagueness further by showing how borderline cases of borderline cases and so on obey unexpected logical principles. My most cited article is “Knowing How” (Journal of Philosophy, 2001), co-authored with the American philosopher Jason Stanley (whom I first met when he was assigned to be my teaching assistant at MIT); it uses evidence from the semantics of English indirectly to argue that knowing how to do something is a form of knowing that something is the case, contrary to what Gilbert Ryle and many other authors had claimed – someone said that people reacted to the paper as though they had just discovered that Santa Claus does not exist. “Everything” (Philosophical Perspectives, 2003) is a defence of the intelligibility of generalizing simultaneously about absolutely everything whatsoever, against critics who argue that the attempt to generalize without restrictions leads to paradoxes in set theory, such as Russell's paradox of the set of all sets that are not members of themselves (is it a member of itself?); I show how dangerous such paradoxes are, but also how to avoid them at the last minute. “Necessitism, Contingentism and Plural Quantification” (Mind, 2010) takes the argument of “Bare Possibilia” much further, by analysing the same dispute in modal metaphysics and showing how one side is better placed than the other to see what its opponents are getting at. That turns out to involve quite technical work in second-order modal logic.
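[For readers unfamiliar with the Barcan formula mentioned above, its standard statement in quantified modal logic, given here as a textbook formulation rather than as part of the interview, is:

```latex
% Barcan formula (BF): if possibly something satisfies phi,
% then something possibly satisfies phi
\Diamond \exists x\, \varphi \;\rightarrow\; \exists x\, \Diamond \varphi

% Equivalent form stated with the universal quantifier
\forall x\, \Box \varphi \;\rightarrow\; \Box \forall x\, \varphi
```

Roughly: if there could have been something satisfying a condition, then there is something that could have satisfied it. Accepting the principle greatly simplifies the semantics of quantified modal logic, which is part of why Williamson defends it in “Bare Possibilia”.]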
CB: Could you tell me something about your current research? What do you hope to achieve in the near future?
TW: I had to write replies to fifteen articles on Knowledge and Its Limits for Williamson on Knowledge (Oxford University Press, 2009), edited by Patrick Greenough and Duncan Pritchard, and to ten commentaries on The Philosophy of Philosophy, for symposia on that book in various journals. Of course, I'm more interested in doing something new and constructive. I am currently writing a book on the issues about existence and possibility raised in articles of mine such as “Bare Possibilia” and “Necessitism, Contingentism and Plural Quantification”, just mentioned. The Leverhulme Trust gave me a three-year fellowship to write it. Parts of the book will be rather technical, but other parts will be accessible to all analytic metaphysicians. The working title is Ontological Rigidity; it will be published by Oxford University Press. After that – who knows?
CB: As I have mentioned before, although I admit that Kripke is a very great philosopher, I still have some systematic objections to his book Naming and Necessity. As far as I know, you are a strong Kripkean. Could you say something about Kripke and his works?
TW: As a first-year undergraduate, I went to hear Saul Kripke give the prestigious John Locke lectures in Oxford; my tutor had described him as the brilliant young man of American philosophy (Kripke was then in his early thirties). Although I didn't understand everything he said, I was hugely impressed – by his clarity, accurate discrimination, logical power, imagination and common sense, all combined with a touch of humour. Ever since he has been a model to me of how philosophy should be done. He probably influenced me more than any of the Oxford teachers did. I especially liked, and like, his insistence on keeping metaphysical questions separate from epistemological questions, which is the basis of his famous examples of the contingent a priori and the necessary a posteriori. Although some of the details in Naming and Necessity may not be quite right, the picture he presents there strikes me as basically correct – which is not something I'd say about many philosophical books. Of course his early technical work on modal logic has also been of fundamental importance. Even if one disagrees with some aspects of its philosophical motivation (as I do, for instance on the Barcan formula mentioned above), it's still the best starting point for thinking about the metaphysical issues raised by quantified modal logic. His article on truth has inspired much of the leading technical work on truth over the past thirty years. Kripke's later book Wittgenstein on Rules and Private Language was rejected by most followers of Wittgenstein as a misinterpretation of him. Maybe it was, as a matter of historical scholarship, but what they may not have realized is that Kripke was offering Wittgenstein a sort of lifeline. In the 1970s the position was roughly this. Wittgenstein's followers held that Wittgenstein's central contribution to philosophy was his argument against the possibility of a private language. 
The private language argument appeared to rest on a verificationist premise, to the effect that the meaning of sentences describing mental states consisted in their method of verification. Since most philosophers (except for some anti-realists) regarded verificationism as discredited, it was widely agreed that if the private language argument really did rest on a verificationist premise, it was unsound and uninteresting. Anti-Wittgensteinians held that it did indeed rest on a verificationist premise. Wittgensteinians insisted that it didn't, but were never able to explain convincingly how it could work without such a premise. Kripke offered a way of interpreting the argument that explained how it could work without a verificationist premise, which made Wittgenstein's work relevant again for philosophers of a younger generation who weren't impressed by traditional Wittgensteinian defences of the private language argument. Once it was agreed that Kripke's interpretation didn't fit Wittgenstein's intentions, the private language argument lost most of its interest, except to committed Wittgensteinians. In part as a result, Wittgenstein has become increasingly marginal to the development of mainstream analytic philosophy. Unfortunately, much of Kripke's important work remains unpublished, for instance on identity over time and on some epistemological questions, but many of the ideas have been absorbed by the philosophical community from his lectures and samizdat transcripts of them. It is a great pity that he hasn't published more.
CB: There is a family of concepts such as “to be”, “to exist”, “being(s)”, “existence”, “reality”, “actuality”, etc., which is very important in metaphysics. Philosophers whose native tongue is not English sometimes cannot tell the delicate differences between them in philosophical contexts. Could you clarify them for us?
TW: Native English speakers often get confused about the differences between them in philosophical contexts too! Each of those terms is itself used in a variety of different ways. Often, when philosophers make delicate distinctions with them, they are not simply relying on a clear but subtle difference in their usual meanings; rather, they are harnessing them as technical terms for their own theoretical purposes. For instance, if a philosopher says “Some entities have being but not existence”, it is hard to know what distinction is being made until one hears the rest of the philosopher's discourse. In my paper “Bare Possibilia” (mentioned above), I suggest that the word “exist” is ambiguous between a logical reading and a more concrete reading. On the logical reading, “exist” is equivalent to “be something” (in the most general sense of “something”); in this sense, “Everything exists” is obviously true, because it is equivalent to the logical truth “Everything is something”. On the more concrete reading, “exist” is equivalent to something more like “be in space or time”; in this sense, “Everything exists” makes the highly controversial claim that everything is in space or time – arguably, it is a false claim because the empty set is something that is not in space or time. To bring out the difference between “reality” and “actuality” in philosophy, it is helpful to notice that “reality” is typically contrasted with “appearance” or “pretence” whereas “actuality” is typically contrasted with “possibility”. For instance, one would naturally say “If I had eaten much less, I would have weighed much less than I actually do” rather than “If I had eaten less, I would have weighed less than I really do”: the comparison is between actuality and a non-actual possibility, but not one that appears or is pretended to be the case. 
Conversely, if one is watching a love scene in the theatre, it would be natural to say “Those actors are not really in love with each other” rather than “Those actors are not actually in love with each other”, since the focus is on the contrast with the fiction that they are in love rather than with the possibility that they are in love. But in many contexts it matters very little which term one uses; one could say either “Did Macbeth actually murder his predecessor?” or “Did Macbeth really murder his predecessor?” My advice to readers when they see these terms in a philosophical text is not to worry too much about the nuances of English usage, but rather to try to work out what this particular philosopher is doing with the terms.
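[The logical reading of “exist” that Williamson describes can be put in standard first-order notation; the following is a conventional rendering offered as a gloss, not a quotation:

```latex
% On the logical reading, "x exists" just means "x is something":
\mathrm{exists}(x) \;\equiv\; \exists y\,(y = x)

% Hence "Everything exists" becomes:
\forall x\, \exists y\,(y = x)
```

The second formula is a theorem of classical first-order logic with identity, which is why “Everything exists” comes out trivially true on the logical reading; controversy arises only on the concrete reading, where “exist” means something like “be in space or time”.]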
CB: How do you define your own philosophical position? In other words, what are your main philosophical beliefs?
TW: I start with classical logic and the principle that falsity is saying of what is not that it is, while truth is saying of what is that it is. What is true depends little on us humans. We form just a tiny bit of the natural world, with no special privileges. But despite huge areas of ignorance, some accidental, some necessary, we have significant knowledge of the world, from both common sense and more systematic inquiry. I'm suspicious of reductionism; everything is what it is and not another thing. I reject all religious belief as consisting largely of falsehoods humanity should have grown out of but can't. We should regulate our belief by the evidence, however difficult it is to do so, and not give way to wishful thinking. An anti-realist philosopher once described me as a “rottweiler realist”. I'm happy with that description.
CB: In your opinion, what are the most important characteristics of a successful philosopher? What kinds of basic training, abilities and knowledge should he have? When a young scholar intends to undertake a research project in philosophy, what does he have to consider carefully? For example, how should he choose his topic? How should he conduct his research?
TW: It's hard to give general advice to ambitious young philosophers because no one size fits all: good philosophers differ from each other in temperament, skills and other qualities. A way of working that suits one philosopher doesn't suit another. Nevertheless, one can say a few obvious things. Some qualities are needed for success in just about any academic field. Naturally, however clever you are, you won't succeed unless you work hard (a banal virtue, but a surprisingly neglected one in some parts of Europe). You need the boldness and imagination to think new thoughts, and the patience, persistence and accuracy to develop them properly. If you have neither the power of self-criticism nor the modesty to accept criticism from others, your work will never improve. You must be able to recognize when your teacher has made a mistake, but if you are too willing to attribute mistakes you will not learn enough from them. It's difficult to find the middle way. You must learn to write clearly, precisely and unpretentiously. Don't worry about whether what you are saying is deep; worry about whether it's true. I like philosophers who move slowly and carefully, one step at a time, but of course they will not get anywhere unless they have a sense of where they are going. A good basic training in philosophy includes some logic and some history of philosophy, and lots of practice in writing about philosophical problems in your own words, analysing and criticizing alternative solutions and having your analyses and criticisms analysed and criticized in detail by your teachers. Choose a research topic that interests you and on which you feel that you might have something to say; it should be small enough to be manageable (it is much more common to choose a topic that is too large than one that is too small). 
It doesn't matter whether the topic is fashionable – the most fashionable areas are overcrowded – but there should have been some previous work on it to define the starting-point. Contributing to the published discussion of a topic is like joining a conversation. You must make what you say relevant to what the others have just said, and intelligible to them, but it is boring unless you add something of your own.
CB: Professor Timothy Williamson, thank you for agreeing to talk with me.


Source: http://onlinelibrary.wiley.com/doi/10.1111/j.1755-2567.2010.01094.x/full