Tag Archives: Davidson

Anti-physics

“(..) who cannot reveal himself cannot love, who cannot love is the unhappiest of all. But you do this out of sheer obnoxiousness; you train yourself in the art of becoming a riddle to others.” S. Kierkegaard, Either/Or, my translation from Dutch.

Kierkegaard and Nietzsche make common cause in destroying the confusion that casts us as instruments of ultimate truth. They are seen as anti-metaphysical and adopted by many of this very (psycho)physical age – oh, the irony of it! – as evidence that these times, our times, are superior in having overcome that specific Hegelian disease of thinking.

That’s not true. We straightened some wrinkles, only to find that injecting botox has made us look more preposterous than ever. We now believe physics can solve anything and that it is just a matter of time before it resolves the riddle of life. Deep down we believe there is eternal truth; we now also happen to believe it is just a matter of time before we find it. Some even think that if they live long enough they’ll live forever, and they don’t stop to think what a ghastly thought that is.

The question in this evidence-based world is: what counts as evidence?

Is it the increasingly complex models picking out the specific observations designed to falsify them but somehow always winding up verifying something? Or is it the everyday meeting of minds where you make yourself vulnerable to misunderstanding?

Continue reading

Is ‘real’ really real or just no big deal?

“In giving up the dualism of scheme and world, we don’t give up the world, but re-establish unmediated touch with the familiar objects whose antics make our sentences and opinions true or false.”  Donald Davidson, Inquiries into Truth & Interpretation, Clarendon Press, Oxford, 2001, p. 198.

The word ‘real’ is a divider. Just like with God: when your real isn’t my real, that’s enough to create the type of zeal that comes to blows. We constantly show we don’t need the Gods to start a war. That is a fact. It might seem everyday and familiar to sophisticated modern people like us, constantly figuring out what is real and what is a mere figment of our fancy. Nevertheless it is a fact: whenever people think they’re right, ‘really’ right, death is on our doorstep.

So let us examine this little word ‘real’ for what it does to our reality. Let’s see whether it belongs with the familiar family of other infamous four-letter words. To start the inquiry, try to remember the last time you heard somebody say that person X was not a ‘real’ Y. For instance, X was in fact a Muslim but she wasn’t a ‘real’ Muslim in that she did not wear a hijab. Or, X was in fact a liberal but he wasn’t a ‘real’ liberal in that he didn’t verbally come out in support of gay marriage. Or, X was in fact born here but he wasn’t a ‘real’ national because he failed to defend his identity. Or, X was indeed a refugee but she wasn’t a ‘real’ refugee in that she did not adopt our identity. Or, as in my case, I am an atheist but I am not a ‘real’ atheist because I do not think religion is the worst thing that ever happened to the whole wide world. Just like I’m not a ‘real’ autistic because, well, I don’t look like one.

It won’t be too hard to come up with your own examples where something like this was thrown at you or at somebody you liked. So follow me in tracking how the word ‘real’ flies like a boomerang, hitting its utterer smack in the face. At least if we’re lucky enough that it doesn’t hit a very real person in a very real way before it has fully bent back.

Continue reading

The Autinomies of Philosophy

The chance of there being an unconscious typo in the title is about as big as that of Freud not having slipped up. If it appears I am talking in riddles, that is only because you feel that there is something to decipher. One thing is certain: philosophers are weird. So am I. Even if that doesn’t establish anything as far as my being a philosopher goes, you get my drift.

[Image: Rudolf Carnap, “In science there are no depths; there is surface everywhere.”]

Let us wonder a while about the weirdness of philosophers. They have come up with waves and particles, with particulars and universals. Then they calculated and associated to come to one invariable conclusion: neither the one nor the other, or both at the same time but in an at most superficial manner. Philosophers say they despair about this. That is merely a mask they wear to ensure somebody feeds them. If they’re particularly power-hungry they will even exclaim they’ve solved it. Solutions sell; this much they know of real life. It’s one of those regularities that have neither rhyme nor reason.

Without weirdness we would discuss in caves instead of about waves. What is wrong with that? Caves are no place for philosophers. So what’s up with them?

Continue reading

Being of Two Minds: Anomalous Monism

“Anomalous monism resembles materialism in its claim that all events are physical, but rejects the thesis that mental phenomena can be given purely physical explanations.” D. Davidson, Essays on Actions and Events, Clarendon Press, 2001, p. 214.

The lack of clarity in philosophy of mind is a lack of clarity of its terms. That lack of clarity of terms is, in its turn, nothing else than a lack of terms. There was a time when the discussion was about mind/body dualism, whilst most recent scientific writing is, implicitly at least, based on the identity of brain and mind. It’s all a blur, and no matter how many tokens of supervenience or emergence types are exchanged, it remains a blur of bodies, minds and brains. The classical solution to this lack of terms is to index terms, like consciousness1, or to prefix them with an adjective, like ‘basic’ mind or some such. This then gives a temporary definition just good enough to make a local argument without risking entering into holistic arguments. Good for publishing but bad for discussion.

I have always thought that Davidson’s anomalous monism was a basis for getting out of this black hole of terminological unclarity. It has the strength of common sense: there are no extra-natural things, but mental descriptions of natural things aren’t purely physically determined either. The thing is this: anomalous monism of what? Of the mental and the physical, sure, but what about the brain and its mind?

Let me repeat that: what about the brain and ‘its’ mind? That the mind is ‘of’ the brain would not startle many if I had not also italicized it (and – to play it safe – put it in scare quotes too). Well, if the mind is of the brain, I think we don’t have enough anomalousness and still too much monism. Since the mental indeed doesn’t allow itself to be reduced to the physical, this leads to minds1 and minds2 and hence right back into the muddy waters of going mental at, or talking past, each other.

So I made a picture to try to put the mind right back where it belongs: very much outside the brain. So far out that the mind does not have a location at all, which seems to me rather in tune with the anomalousness of the mental.

[Image: twominds]

Here goes the not so short explanation: Continue reading

The self is both made and explored with words

“The self is both made and explored with words; and the best for both are the words spoken in the dialogue of friendship.”
Charles Taylor, Sources of the Self – The Making of the Modern Identity, Harvard University Press, 1989, p. 183.

In reading these pages, I was reminded of the abomination that is the word “paradigm”. Although I am largely sympathetic to Charles Taylor’s project of tracing the origins of self and identity, there is a certain something about it which annoys me. Thinking about it, his pinpointing of pivotal moments in philosophy is the cause of this slight discomfort. In his own words, I think his is the natural way of explaining, as against the more convoluted way which is less prone to be accepted in this scientistic, bottom-up world. Sure, this way serves the purpose of bringing home the point that the naturalistic way we see things is neither eternal nor inescapable. Still, it also exposes us to the risk of marking “paradigm shifts”, showing clear befores and afters side by side and simultaneously expressing a strong valuation that such befores are inferior and the corresponding afters superior. Thinking in “paradigm shifts” has led to the abominable results that we see all around us, marking ins and outs in the most uncharitable of ways.

The quote stresses, I think, not the discrete but the continuous; not the sudden but the emerging; not revolution but evolution. It connects the continuous evolution of language with its essence in friendship. The quote gets it all right. From that very first time that people pointed to the same thing in uttering or gesturing (hence thinking) the word “that”, the mechanism of development has been a mechanism of co-operation (see P. Grice), a mechanism presupposing charity in understanding the other (D. Davidson), and one best seen in one of Quine’s favorite metaphors, that of rebuilding the ship as we are sailing it:

“We are like sailors who on the open sea must reconstruct their ship but are never able to start afresh from the bottom. Where a beam is taken away a new one must at once be put there, and for this the rest of the ship is used as support. In this way, by using the old beams and driftwood the ship can be shaped entirely anew, but only by gradual reconstruction.”
Otto Neurath, from Wikipedia. Continue reading

On Saying “That”

“Perhaps it should come as no surprise to learn that the form of psychological sentences in English apparently evolved in much the way these ruminations suggest. According to the Oxford English Dictionary

The use of that is generally held to have arisen out of the demonstrative pronoun pointing to the clause which it introduces.

(..)” D. Davidson, “On Saying That” in Inquiries into Truth and Interpretation, Oxford University Press, 2001, p. 106.

Charles Taylor is right: the question ‘Why?’ is humanly unavoidable. My question is why this is so. If we know why we cannot but ask ‘Why?’, we have resolved that part of the mystery of life which tends to separate people. I agree with Charles Taylor as well that naturalistic reduction – scientism is the more appropriate label – is a moral hazard, but I think science, under the lead of the human sciences, must address why this is so. The strategy for doing so is the most basic strategy of all science: to trace the phenomenon back to its origins, just as Charles Darwin did.

Well, I think the origin of the word ‘why’ lies in the word ‘that’. The latter word has been the object of intense scrutiny, for instance by Donald Davidson. Its non-verbal equivalent, finger pointing, is a necessary gateway between the realm of things and that of thought. The word ‘that’ presupposes a lot. It presupposes a lot of things, of which at least two need to be complex enough to point to some third thing. These things qua things are the subject matter studied by so-called hard science. Kant called this pure reason. Most have forgotten his critique of it, though. Basically: this hard science presupposes subjects that do the pointing and that have mastered a language consisting of a lot more words than ‘that’.

Indeed, on the other side of the word ‘that’ lies our language, and therefore also we ourselves qua humanity, as studied by human science. These are sciences that do not merely use words but are, strictly speaking, about words. The division between the hard and the human sciences cuts across scientific fields, in particular across psychology, which explains why they are internally so divided. It has become a vulgar truth of scientism that individuals become objective if they just follow the insights of the hard sciences; that the question ‘Why?’ is just a phenomenon, that there is no basic reason why this is so, that the word is the mental equivalent of an appendix that just needs to be removed when it is inflamed. This runs counter to the origins of hard science in multiple ways. Just take ‘that’: it is no small matter that before it can be uttered there need to be at least two beings complex enough to point to some third thing and identify it as the same. The work of George Herbert Mead is a work of hard science and already explains that it takes a community to create anything that can properly be called an individual.

Tracing our origin to the word ‘that’ establishes a before and an after, but only what comes after is capable of examining what came before. People may have little patience nowadays to appreciate a difficult point: still, the hard sciences presuppose the human sciences and not vice versa. They will be as incapable of overtaking them as the tortoise is of overtaking the hare. Scientism is a dangerous, and harshly metaphysical, fallacy that subjects subjects to matter. It is to be fought with the type of science so characteristic of a Ludwig Wittgenstein. That is a fight we will fight insight by insight, for the patient who know that solutions don’t come cut and dried and overnight. Everything I ever wrote is dedicated to this fight.

Another fallacy is the Augustinian one of spiritualism, in which the before and the after of the word ‘that’ are impermeable. The word then stands alone without its body, without grounding, and (speaking as an as yet unconfirmed autist) it leads to the kind of mysticism and skepticism that throws a tantrum every time its self-made ‘why it is so’ gets challenged and changed. It ignores that desire, vital energy in the sense of Bergson, was there long before the word ‘that’ – and that this desire permeates the creation of anything in language as fully as it permeates any living thing. It ignores the fact that human science is science as well: a human activity of reason (and therefore of mathematics) that can explain why a tantrum is thrown. Philosophical hermeneutics as proposed by Gadamer is nothing else than tracing what comes after ‘that’; it is the origin of humanity in the survival of ideas that fit the environment of reason.

Before the word ‘that’ there was only desire and its dynamics of energy shaping matter against entropy; all a mere matter of pure probability, whether physical, thermodynamical or evolutionary. After the word ‘that’ there is a new force of energy in the development of reason that shapes thought, basically around the mathematics of probability. We are still discovering how this is all linked, but one link is a priori necessary (even if it is really hard to synthesize): that we are creatures driven by desire in developing reason and creatures of reason in driving our desires. Neither reason nor desire can rule (in) us – this is why asking ‘Why?’ is so basic, so universal and so common to all of us. What rules in us is judgment (it is Deleuze who explained to me why Kant wrote three critiques). If we realize we all ask the same question for the same reason, we can transcend the specific comprehensive answers that we individually need at any given time. This does not discredit any of these answers as such but opens the playground to develop understanding – to develop our common language, to discover the Rawlsian overlapping consensus we all share. We may be just (im)probable creatures, but as creatures we are necessarily driven to adapt to a common judgment, sharing as we do a common desire and a common reason.

So in this Charles Taylor is most probably wrong: there is a universal moral claim of reason – there is a “basic reason” of morality. We should not seek it in what specifically comes after the ‘that’, in what we point to (we should not seek it in the specific answer we give at a certain point to the question ‘Why?’). We should see it in the fact that we all use ‘that’ in the same way to point to what we believe (this is the universal ‘why it is so’: we all ask that same question ‘Why?’). This may seem opaque but it really isn’t. It’s something we know in everyday life and everyday speech. There is nothing ultimately arbitrary or relative or perspectivist in what we point to with ‘that’. A bird is a bird, even if it may be difficult to be sure we are pointing to the same thing. The same is true for beliefs in a ‘that’-clause: they are not magically lifted out of the realm of logic to do with as we please (not even Humpty Dumpty can do that). We can hold many false beliefs, but once somebody points out to us that the earth is round we simply can no longer also believe it is flat. You cannot hide stupidity in personal opinion, even if it has become the most popular political opinion. There always is a fact in the moral, however difficult and endless it may be to uncover all the facts.

On the Very Idea of a Conceptual Scheme

“It would be wrong to summarize by saying we have shown how communication is possible between people who have different conceptual schemes, a way that works without need of what there cannot be, namely a neutral ground, or a common co-ordinate system. For we have found no intelligible basis on which it can be said that schemes are different. It would be equally wrong to announce the glorious news that all mankind – all speakers of language, at least – share a common scheme and ontology. For if we cannot say that schemes are different, neither can we intelligibly say that they are one.

In giving up dependence on the concept of an uninterpreted reality, something outside of all schemes and science, we do not relinquish the notion of objective truth – quite the contrary. Given the dogma of a dualism of scheme and reality, we get conceptual relativity, and truth relative to a scheme. Without the dogma, this kind of relativity goes by the board. (..)”

D. Davidson, Inquiries into Truth and Interpretation, Clarendon Press, Oxford, 2001, p. 197-198.

[Re-posted from The Old Site, original dd. 19-04-09. Key quote, weak thought, at least very weakly expressed.]

This long quote is one of the rare wormholes (some basic notion of science fiction is assumed to exist in the reader of this) between philosophy of language and ethics.

If true, we have speakers who understand each other at least somewhat and a world against which they can check each other’s understanding. Insofar as speakers don’t understand each other, they are not speakers; they are merely, if that, part of the background against which understanding is possible, part of the world. That is clean and neat. It is not much, but it is not only better than nothing, it is enough to make some quite striking moral observations.

Continue reading