
Tuesday, March 22, 2016

A Moving Picture of Eternity


The liturgical year is a complex matrix of many interlocking cycles. The weekly cycle from Sunday to Sunday turns within two slower processions of observances that move through the year. One cycle is a set of fixed celebrations with set calendar dates (Christmas, for instance, always on December 25; Epiphany, always January 6; various Saints' days, each assigned a calendar date usually associated with their death, or sometimes the transfer of relics or some other event). The other is the so-called "moveable feasts," which occur on different dates in different years. Easter is the most obvious of these, and many moveable feasts have their center of gravity at Easter and move forward or backward through the calendar based on when Easter falls. Thus, Pentecost comes fifty days after Easter, and Clean Monday comes 48 days before it. Obviously, moveable observances (not always feasts, as will be apparent in a moment) will pass nearer or farther from various fixed observances in various years.
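For the programmatically minded, the two cycles are easy to model. A minimal sketch in Python (the names and the small selection of observances are mine, with 2016's Easter taken as given here and computed in the sketch further down); note that "fifty days after" Easter is an inclusive reckoning, hence an offset of 49 calendar days:

    from datetime import date, timedelta

    easter = date(2016, 3, 27)  # taken as given here; see the computus sketch below

    # Fixed observances: the same calendar (month, day) every year.
    fixed = {"Christmas": (12, 25), "Epiphany": (1, 6), "Annunciation": (3, 25)}

    # Moveable observances: offsets in days from Easter. "Fifty days after,"
    # counted inclusively, is an offset of 49; Clean Monday is 48 days before.
    moveable = {"Easter": 0, "Pentecost": 49, "Clean Monday": -48}

    print("Annunciation (fixed)", date(2016, *fixed["Annunciation"]))
    for name, offset in sorted(moveable.items(), key=lambda kv: kv[1]):
        print(name, easter + timedelta(days=offset))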

2016 brings a rare occurrence this coming Friday -- the coinciding of two very solemn observances, one fixed, one moveable: the Feast of the Annunciation, and Good Friday. The Annunciation -- the day the archangel Gabriel announced to the Blessed Virgin Mary that she would conceive the messiah, and she responded "Be it unto me according to your word" -- is traditionally fixed on March the 25th. Good Friday is of course the day of the crucifixion; and not a feast but a fast. The conjunction of these two observances is very infrequent. It happened in 2005, but before that it happened last in 1932, before either of my parents was born; it won't happen again this century.
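The claim is easy to check: Good Friday falls on March 25 exactly when Easter falls on March 27, and the Gregorian date of Easter can be had from the standard anonymous ("Meeus/Jones/Butcher") computus. A minimal sketch, with the search range my own choice:

    from datetime import date, timedelta

    def easter(year):
        """Gregorian Easter by the anonymous (Meeus/Jones/Butcher) computus."""
        a = year % 19
        b, c = divmod(year, 100)
        d, e = divmod(b, 4)
        f = (b + 8) // 25
        g = (b - f + 1) // 3
        h = (19 * a + b - d - g + 15) % 30
        i, k = divmod(c, 4)
        l = (32 + 2 * e + 2 * i - h - k) % 7
        m = (a + 11 * h + 22 * l) // 451
        month, day = divmod(h + l - 7 * m + 114, 31)
        return date(year, month, day + 1)

    # Years when Good Friday (Easter minus two days) is also the Annunciation:
    for year in range(1900, 2200):
        if easter(year) - timedelta(days=2) == date(year, 3, 25):
            print(year)  # prints 1910, 1921, 1932, 2005, 2016, 2157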

Whenever the Annunciation falls during Holy Week, the practice in the Roman rite of late has been to transfer it to the first unimpeded day after the Easter Octave. (The same happens with the feast of St Joseph, March 19.) The Anglican and Lutheran churches follow suit these days, but of course it was not always so, and it was not so in 1608, when the future dean of St Paul's cathedral wrote his poem:

Upon the Annunciation and Passion Falling upon One Day. 1608

Tamely, frail body, abstain today; today
My soul eats twice, Christ hither and away.
She sees Him man, so like God made in this,
That of them both a circle emblem is,
Whose first and last concur; this doubtful day
Of feast or fast, Christ came and went away;
She sees Him nothing twice at once, who’s all;
She sees a Cedar plant itself and fall,
Her Maker put to making, and the head
Of life at once not yet alive yet dead;
She sees at once the virgin mother stay
Reclused at home, public at Golgotha;
Sad and rejoiced she’s seen at once, and seen
At almost fifty and at scarce fifteen;
At once a Son is promised her, and gone;
Gabriel gives Christ to her, He her to John;
Not fully a mother, she’s in orbity,
At once receiver and the legacy;
All this, and all between, this day hath shown,
The abridgement of Christ’s story, which makes one
(As in plain maps, the furthest west is east)
Of the Angels’ Ave and Consummatum est.
How well the Church, God’s court of faculties,
Deals in some times and seldom joining these!
As by the self-fixed Pole we never do
Direct our course, but the next star thereto,
Which shows where the other is and which we say
(Because it strays not far) doth never stray,
So God by His Church, nearest to Him, we know
And stand firm, if we by her motion go;
His Spirit, as His fiery pillar doth
Lead, and His Church, as cloud, to one end both.
This Church, by letting these days join, hath shown
Death and conception in mankind is one:
Or ‘twas in Him the same humility
That He would be a man and leave to be:
Or as creation He had made, as God,
With the last judgment but one period,
His imitating Spouse would join in one
Manhood’s extremes: He shall come, He is gone:
Or as though the least of His pains, deeds, or words,
Would busy a life, she all this day affords;
This treasure then, in gross, my soul uplay,
And in my life retail it every day.


– John Donne
(spelling modernized)
One of the reasons for transferring the Annunciation, of course, is that it is a feast, and the prototype of all feasts is the Eucharist, which is not celebrated on Good Friday (but rather distributed from the sacrament reserved from Maundy Thursday, the night before), and is not eaten at all on Holy Saturday. But in the Byzantine rite, the Annunciation is not transferred, and if it should coincide with either Good Friday or Holy Saturday, the Eucharist is celebrated nonetheless, and the various hourly prayers get rather complicated as well. Such practices may strike one as an instance of the liturgical impulse to complexity and piling rules upon rules -- the sort of thing Agamben refers to in his critique, in The Highest Poverty, of the drive to make all of life into a liturgy. Or it may look like just another relic of what happened before we were clever enough to use the metric system. Wouldn't it all be so much simpler to just use one calendar instead of this bizarre and inconsistent mesh of solar and lunar approximations inherited from over a spread of 4,000 years?

But it is neither a symptom of some delight in minutiae, nor a hangover from uneducated sun worship. There was a long-standing tradition that the crucifixion had transpired on this date. Tertullian, or whoever wrote the Adversus Judaeos attributed to him, writes:
And the suffering of this "extermination" was perfected within the times of the seventy hebdomads, under Tiberius Caesar, in the consulate of Rubellius Geminus and Fufius Geminus, in the month of March, at the times of the Passover, on the eighth day before the calends [i.e., the 1st] of April, on the first day of unleavened bread, on which they slew the lamb at even, just as had been enjoined by Moses.
Probably, though, the simplest demonstration that the crucifixion is traditionally dated to the 25th of March is that this is the date of the commemoration of St. Dismas, the penitent thief, who was crucified alongside Jesus and asked Him, "Remember me when you come into your kingdom." As for the Annunciation, of course it is assigned to March 25. The date follows with perfect logic from the date of Christmas. Just do the math.
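(For those disinclined to do the math themselves: conception to nativity is reckoned at nine months to the day. A throwaway check, mine and not the tradition's:)

    from datetime import date

    # Nine calendar months separate the Annunciation from Christmas.
    christmas = date(2016, 12, 25)
    annunciation = date(2016, 3, 25)
    print((christmas.month - annunciation.month) % 12)  # 9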

By other calculations, March 25 was also traditionally determined to be the date of Adam’s creation and Fall. I did a little digging and found a bit of Medieval Latin poetry here, which begins:
Salva festa dies, quae vulnera nostra coerces,
Angelus est missus, est passus et in cruce Christus,
Est Adam factus, et eodem tempore lapsus.
"Sacred festival day that heals our wounds, on which the Angel is sent, Christ suffers and is crucified, Adam is made and on the same falls," is my stammering (possibly wrong) rendering. The poem goes on to mention a number of other important events which all just happened to occur on this date: the slaying of Abel, the blessing of Melchizedek (on Abram), the sacrifice of Isaac, the beheading of John the Baptist, the rescue of Peter and the slaying of James under Herod (as per Acts 12). I'm unsure whether the author here is drawing on earlier tradition in every case or just piling up events on his own initiative.

Incidentally, and I'll stake three pints that it was not by accident, Tolkien also made the New Year begin on March 25, after the War of the Ring:
'Noon?' said Sam, trying to calculate. 'Noon of what day?'

'The fourteenth of the New Year,' said Gandalf; 'or if you like, the eighth day of April in the Shire reckoning. But in Gondor the New Year will always now begin upon the twenty-fifth of March when Sauron fell, and when you were brought out of the fire to the King.'
(It was also the first day of the legal year in England, until the calendar reform that took effect in 1752.)

OK, all very interesting, you may say (at least to be polite), but... well, so what?

A liturgical calendar is not just a set of rules telling you what to do. It's a body of worship you move through in time; a set of corporate spiritual exercises, undertaken together. It is stable, but fluid. It has a pattern, but it is always shifting. If you enter into it intentionally, it can be a gentle and ongoing curriculum of ever-deepening prayer. Scriptures, and collects, and commemorations are paired with each other one year and then move away from each other, in a gesture like the circling of heavenly bodies or the gradual dance of one of Calder's mobiles. There is no single, memorize-this lesson to it; the constellation is always in motion, and this is a good thing. It's a way of reading the Bible in dialogue with itself, with the ongoing tradition of which it is a part, and with the whole community of the faithful, out of which moments of realization can emerge, sometimes slowly dawning on you, sometimes flashing out in startling clarity. The coincidence of the Annunciation and the Passion is obviously a very resonant and potent alignment, fraught with theological and symbolic associations. You may appreciate some of those when you consider them in the abstract; but it is a different matter to pray them, as part of the discipline you and your whole community have undertaken.

Donne saw some of this, and sensed more, and wrote what he could (which is a good deal more than I could have). The poem is itself worth meditating upon a good deal, and I'll not try to unpack it here. But I want to point out the way the last line opens upon a matter of real import, moving beyond its own occasion. Donne is writing about a single day in the year, and indeed an exceptional day, a conjunction wonderful and rare. The first line underscores this marvel by repeating (and ending with) the word "today." And yet, he concludes:
This treasure then, in gross, my soul uplay,
And in my life retail it every day.
Every day. The correspondences of the liturgical calendar are not magical. There is no special power in the 25th of March itself, by virtue of the square of five or its proximity to the Spring Equinox or anything else. The calendar -- like every set of spiritual practices -- does reveal something, and without entering into it deeply, you can and will miss much; but what it shows is what is always there.

Saturday, March 12, 2016

Philosophy Departments, Nihilism, Psychic Research, and the Continental / Analytic Divide. And a few other things you didn't think were related. Part ii


Last post I ended with a question: could the "new psychology" -- the empirical psychology being forged into a science in the late 19th century and whose departmentalization in universities triggered the responding departmentalization of philosophy -- have attained its final, deadly form in the rise of the neurosciences? When RS Bakker argues that logic is neurons and phenomenology is neurons and that any attempt at higher-level neuronic reflection on these patterns is an illusion, is this anything but a final, scoffing fulfillment of the anxieties Reed claims motivated the founding of philosophy departments in the first place?

This would be the triumph of just what, in the previous post, we saw Socrates lampoon in the Phaedo. "The Good" will have been exhaustively described as "sinews and bones," or neurons and glia, "arranged 'good'-wise." If we go all the way with Bakker, this simply means that philosophy is, and has always been -- not just since 1901 or 1879 -- an enormous act of denialism. But we need not go all this way. In my last post but one, I claimed strongly that it is untrue "that our circumstance has somehow encountered a game-changer in science:"
The void was not discovered by science... I absolutely deny that our dilemma is in any decisive sense completely new and without precedent. Nihilism has always been "at the door," to those who had the skin to sense the chill. If there was ever a real response to it, there remains one now.
Then I remembered that Brassier had anticipated this argument:
‘Nihilism’ in its broadest sense, understood as the predicament in which human life and existence more generally are condemned as ‘meaningless’ (i.e. ‘purposeless’), certainly predates the development of modern science (think of Ecclesiastes). But the emergence of modern science lends it a cognitive import it did not previously enjoy, because where pre-modern nihilism was a consequence of a failure of understanding – “We cannot understand God, therefore there is no meaning available to creatures of limited understanding such as we” – modern nihilism follows from its unprecedented success – “We understand nature better than we did, but this understanding no longer requires the postulate of an underlying meaning”. What has happened in this shift is that intelligibility has become detached from meaning: with modern science, conceptual rationality weans itself from the narrative structures that continue to prevail in theology and theologically inflected metaphysics. This marks a decisive step forward in the slow process through which human rationality has gradually abandoned mythology, which is basically the interpretation of reality in narrative terms. The world has no author and there is no story enciphered in the structure of reality. No narrative is unfolding in nature, certainly not the traditional monotheistic narrative in which the human drama of sin and redemption occupied centre stage, and humanity was a mirror for God.
Quite a while ago, I remarked (following up on a comment by my friend dy0genes) on nihilism as a sort of ghost story, and I have since had many occasions to point out the magnetism between nihilism and the literary genre of horror. And what makes a ghost story more compelling than the preface: this story is true?

Brassier holds that science gives us reasons for nihilism: rather than finding its motive in human insufficiency ("Even if one were to by chance hit upon the truth," as one of the pre-Socratics glumly put it, "one would not know"), contemporary, science-powered nihilism has a robust case for itself grounded in human capacity.

I have no knock-down response to this. But I do disagree with the way Brassier reads the motive for nihilism past. His claim that this was ignorance about God is especially striking. This ignorance has never been in question and remains axiomatic, for the infinite God and the finite (or even differently infinite) mortal human intellect are by definition incommensurable; but the strong voices in the tradition were always those who, while thinking this total incommensurability through, nevertheless found this an insufficient reason for despair, since God could and did reach out from the other side of the unbridgeable ontological gulf. The prophet Isaiah, St Gregory of Nyssa, Ibn Arabi, Simone Weil -- all are insistent that we do not reach God via some rope-ladder of syllogisms, let alone by rolling balls and inclined planes; but they know from experience that God breaks in upon the human condition nonetheless. Now if this assertion about "experience" sends your eyes rolling, there is nothing I can do about it. No argument is stronger than rolled eyes; but that is because rolling your eyes prescinds from argument.

Comes the rejoinder: "Bah!! It's your invocation of some unverifiable 'experience' that prescinds from argument!" But everything hinges on what counts as "verifiable" here. I am not going to rehash the whole "everyone-takes-some-things-on-faith" line, not because I don't think it's true or relevant (it is both) but because it usually doesn't work as a front-line tactic. I'm going to argue, rather, that what counts as evidence is heavily skewed in the official courts.

Jeffrey Kripal argues that you don't need God to blast open the doors of perception; simply taking seriously a precognitive dream would be sufficient to demonstrate that something besides the exchange of sodium and potassium ions is going on. The article in which Kripal made this suggestion (in the Chronicle of Higher Education) drew an intensely dismissive, eye-rolling screed by Jerry Coyne in response. "What about all the false predictions?" Coyne asked, as if no one had ever thought of that. But the point is not whether we make mistakes; of course we do, all the time, and far more mistakes than "direct hits." It's that when some curious event like a precognitive dream or a powerful synchronicity comes upon you, you experience it as meaningful, and while you can rationalize the experience away, this comes at a very high cost. In short (say I), such experiences are striking (but by no means the only) counter-examples to Brassier's claim that "intelligibility has become detached from meaning" -- but they also force a revision of just what is meant by "intelligibility."

Amod Lele first pointed me to this thread of Kripal's research in a comment on this post, and I have since read a good deal of him, finding much to applaud and some things to critique; that will come in some later post, perhaps. Here I want only to underscore Kripal's claim that when weird things happen, they count, and should not be brushed aside or left unacknowledged.
A paranormal event is one in which the world “out there” and the world “in here” manifest themselves as the same world. It is as if the mental and material dimensions of our experience have “split off” the same deeper Ground or One World. The physical world now begins to behave like a story or a series of signs. Hence the common descriptors of these experiences: “It was as if I were a character in a novel” or “It was as if I were caught in a movie.” These sensibilities, I suggest, are very accurate perceptions, because, of course, we all are caught in novels and movies, which we call culture and religion. A paranormal moment is one in which we realize that this is so. What we then do with this realization is up to us. (Kripal interview)
Elsewhere, remarking on this same self-description (‘it was as if I was a character in a novel’, or ‘it was as if I was inside a movie’) Kripal expands on how and why he thinks people are right to describe themselves thus:
I think they are. I think we are too, right now. We’re written, composed by our ancestors.
What shocked me was how many textual allusions people would naturally use to describe a paranormal event. They would talk about puns, jokes, allusions, readings or messages. It’s a textual process going on in the physical environment.

A paranormal event becomes an invitation to re-write the script. That could be on a personal level, or a cultural level for writers and film-makers. Take writers like Philip K. Dick or Whitley Strieber – these are people who create fantasy for a living. They know their spiritual experiences are fantastic. They know they’re being mediated by their imagination. They’re not taking them literally. And yet they would insist that something very real is coming through.
(interview)
This claim of Kripal's that in such moments, "The physical world now begins to behave like a story or a series of signs," is very far-reaching, and it directly conflicts with Brassier's contention above that "no narrative is unfolding in nature," though it does leave a bit vague the precise contours of the referent "nature" and what the character of this "unfolding" would be. It certainly does not demand that there be one "narrative." For myself, of course, I have already said more than once that if there is any sense to the notion of apprehending a "story" to the context-of-all-contexts, it will be a matter not of plot but of theme. This, I take it, is crucial to figuring out how best to get out of the stupid zero-sum game of the analytic/continental divide, which, as I mentioned before, Harman (following Brentano) sees as roughly mirroring the split between the sciences and the arts: slow, collaborative progress vs. the cult of the genius. If the danger is that the Analytic side, in its adulation of science, will simply capitulate to scientism, the answer certainly cannot be to merely laud the artistic model of the Continental side; for the same rot that has corroded science has (differently, but in the same process and perhaps to worse effect) deeply corrupted art.

It would take us even further afield in the loop-the-loop paths these two posts are "following" to pursue the matter, but the same arguments about the pernicious effect of academia have been going on for even longer with regard to poetry and fiction and the visual arts, and perhaps music above all. Attackers of MFA programs complain, sometimes in so many words: you can teach craft, to some extent, but you can't teach vision, and sometimes the teaching ruins the vision that was there. Defenders shrug: American poetry is vibrant, diverse, and thriving, and most of the critics teach at MFA programs, so what gives? The debate in music gets less airplay but is just as serious and just as deep-seated. When, the accusation goes, was the last time a great work of music came out of the academy, anything moving enough, or even catchy enough, to compare with Cole Porter, Stephen Sondheim, Lennon and McCartney, Joni Mitchell, Prince, Radiohead? This sounds anti-elitist, because we think of pop music as being, well, populist in some sense; but it turns out to be just as much a function of the "cult of the genius" as any worship of Haydn or Mahler. It's not a myth we really want to say we believe in anymore, but we do not know what to replace it with.

What would a philosophy look like, though, that rejected the binary of "art" vs "science"? I am unsure, but Kripal provides us with another hint. At the very same time that the psychology and philosophy departments were differentiating themselves from each other, one towering figure was conducting intense research into just such strange phenomena as Kripal insists we ought to "put back on the table." That figure was William James, who straddles philosophy and psychology and is by any account one of the most important American thinkers in both; but whose tireless investigation into what we would call paranormal phenomena has been systematically marginalized by the unofficial archive-keepers, Kripal contends.
James was also a very active psychical researcher... I spent 20 years studying mysticism and never really thought about the paranormal or psychical phenomena.... We all read James, and we all talk about his pragmatism or his comparativism, but nobody ever talks about his psychical research. That really was a revelation to me, and I wanted to address the question of why we’ve taken that William James off the table.... Which William James are we talking about here? (about 5-6 min in.)
Kripal is overt about his resistance to scientism. For my part, while I do not consider scientism the same thing as nihilism (for one thing, it's often shallower), neither do I consider it the same thing as science. I would hope this need not be said, but as Coyne's misreading of Kripal makes clear, it does. I consider the scientific drive for truth, for as accurate as possible an account of the universe, to be a non-negotiable part of the philosophical spirit; the critique of superstition and of fear of thought must be cultivated intentionally (and it is not easy -- for thinking can indeed be frightening, and anyone who sneers at this hasn't done enough thinking yet to know). I consider Scientism, on the other hand -- which I will gloss here as an inflation of methodological naturalism into an ontological presupposition, with attendant casualties elsewhere, including science, and in ethics above all -- as one of the main alibis of nihilism today. Thus it is, not without some mild amusement, that I find myself lending tentative credence to the case that a really good ("true"?) ghost story could be an antidote to nihilism.

Edward Reed notes that
The projects on which James expended the most labor -- his stream of consciousness psychology, his studies of psychic powers, his analysis of religious experience and conversion -- have never been taken up seriously by those who claim to be his heirs in philosophy and psychology throughout the United States. (p 201)
This is somewhat overstated if one includes disciplines like religious studies, but as regards the second item on Reed's list, it surely stands. I don't have a well-thought-out explanation of this neglect, except to suggest that it stems from an obvious consensus-reality discomfort with such phenomena, "alleged" or not. Nor do I have an over-arching moral to draw. It is surely not the case that James' interest in psychic research by itself would provide some watertight response to nihilism. Nihilism is compatible (if it can be said to be "compatible" with anything) with any number of ostensibly paranormal hypotheses and phenomena, including robust ghost sightings, alien encounters, and the ESP that Jerry Coyne hand-waves away. It is important to note that Kripal does not limit his interest in the "paranormal" to these rather lurid examples. His main anecdotes -- and "anecdote" is not a dirty word to him -- are those very private and unverifiable events that steal upon us, or overwhelm us; unsought-for mystical encounters (whether thus labeled or not), like Philip K. Dick's 2-3/74 (which Kripal treats at length in chapter six of Mutants and Mystics), or Nietzsche's Sils Maria realization (which as far as I know he doesn't mention), or his own powerful kundaliniesque experience. What I want to insist upon is not the lab-ready status of such events but the importance of Kripal's claim that in these situations the world seems to be more like a story than like an equation. If, in our modern democratic hearts, despite wanting to lean towards Continental philosophy's concern with the good (and rightly-conducted) life, we don't want to unreservedly embrace the notion of "towering geniuses," the notion of the world itself as "art" seems -- while not a philosophical panacea -- clearly apposite.

"Never trust the artist," says D.H. Lawrence; "Trust the tale."

Thursday, March 10, 2016

Philosophy Departments, Nihilism, Psychic Research, and the Continental / Analytic Divide. And a few other things you didn't think were related. Part i


University philosophy has produced some crucial work, given livelihood to thinkers "important" and garden-variety, and fostered the awakening of generations of students. But we all know it's fucked up; that careerism and the interests of institutions are at best -- and the best is rare -- strange bedfellows with the love of wisdom. Of late this critique got a little bit of publicity in the New York Times' The Stone with an article by Robert Frodeman and Adam Briggle, and then a rejoinder recently by Scott Soames. Frodeman and Briggle write:
Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university.... Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy. This was the act of purification that gave birth to the concept of philosophy most of us know today. As a result, and to a degree rarely acknowledged, the institutional imperative of the university has come to drive the theoretical agenda.
They see a deeper and more pernicious effect of all this:
The implicit democracy of the disciplines ushered in an age of “the moral equivalence of the scientist” to everyone else. The scientist’s privileged role was to provide the morally neutral knowledge needed to achieve our goals, whether good or evil. This put an end to any notion that there was something uplifting about knowledge. The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions. ...Once knowledge and goodness were divorced, scientists could be regarded as experts, but there are no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not the superior character of the scientist. ...
Their conclusion is dire:
Philosophy should never have been purified. Rather than being seen as a problem, “dirty hands” should have been understood as the native condition of philosophic thought — present everywhere, often interstitial, essentially interdisciplinary and transdisciplinary in nature. Philosophy is a mangle. The philosopher’s hands were never clean and were never meant to be. ... Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.
Soames wants to contest all of this -- a far too dreary assessment. He sums up Frodeman and Briggle as having claimed:
that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences. This institutionalization... led [philosophy] to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives.
Soames rejects both contentions.
I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
I wish Soames would address the first part of this argument -- that philosophy is "not separated" from the sciences -- to Stephen Hawking, Lawrence Krauss, or even Bill Nye, or any of the growing cadre of scientists who keep saying silly things like "philosophy is dead." As for the claim that philosophy was never "centrally" about goodness, justice and virtue -- this is to my mind a flabbergasting thing to say. Perhaps we have reached a differend here, but this assertion is, to my mind, a part -- a very large part -- of the problem.

The ever-engaging Brandon Watson at Siris rightly objects that
"Never" is an oddly strong word here -- the claim is certainly false (for instance) of very large portions of ancient philosophy. (Try to imagine a Plato who did not regard goodness, justice, and virtue as the central aim of philosophy. Or what in the world were Hellenistic philosophers mostly talking about if not primarily about "goodness, justice and virtue"?) But you don't have to go back so far. While one can argue about whether it's quite correct to call it "the" central aim, "the study of goodness, justice and virtue" was certainly far more central in the nineteenth century than you ever find it in the twentieth century.
I do not know exactly where Soames, Frodeman, and Briggle fall on the Analytic / Continental spectrum, but it isn't hard to discern the rough outlines of this split in their contentions, or in their curricula vitae. Soames has authored not one but two multi-volume histories of Analytic philosophy; every twentieth-century thinker he names to make his case is arguably from this "tradition," as he calls it, and the easy rapprochement with science which he commends is very much of a piece with Carnap and Quine, Sellars and Armstrong. There are any number of moral philosophers whom one could call Analytic, e.g. Anscombe, Rawls, Midgley (who incidentally is a good example of an anti-scientistic Analytic thinker); nonetheless, Frodeman and Briggle's concern with "how shall we live?" -- and they are both very engaged with this question on (or even under) the ground, so to speak -- is clearly flavored Continental (Frodeman studied under Stanley Rosen and Alphonso Lingis). All of which is meant to suggest that the A/C split may actually have something to do not just with reactions to the "departmentalization" of philosophy, but even perhaps with its origins. We'll come back to this.

Soames thinks he can argue cogently that Frodeman and Briggle have their genealogy wrong, and that "scientific progress" did not
rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools. Sometimes it does so when sciences are born, as with 17th-century physics and 19th-century biology. But it also does so as they mature. As science advances, there is more, not less, for it to do.
But the point is not whether philosophers have found material in the sciences, nor even whether they have contributed to scientific discourse (as has arguably been the case). The question is whether this is a good model for philosophy. Socrates clearly believed it was not, and it is worth quoting Plato at some length here:
When I was young, Cebes, I had an extraordinary passion for that branch of learning which is called natural science. I thought it would be marvelous to know the causes for which each thing comes and ceases and continues to be. And I was always unsettling myself with such questions as these: Do heat and cold, by a sort of fermentation, bring about the organization of animals, as some people say? Is it the blood, or air, or fire by which we think? Or is it none of these, and does the brain furnish the sensations of hearing and sight and smell, and do memory and opinion arise from these, and does knowledge come from memory and opinion in a state of rest? And again I tried to find out how these things perish, and I investigated the phenomena of heaven and earth until finally I made up my mind that I was by nature totally unfitted for this kind of investigation.... Then I heard some one reading, as he said, from a book of Anaxagoras.... I rejoiced to think that I had found in Anaxagoras a teacher of the causes of existence such as I desired, and I imagined that he would tell me first whether the earth is flat or round; and whichever was true, he would proceed to explain the cause and the necessity of this being so, and then he would teach me the nature of the best and show that this was best; and if he said that the earth was in the centre, he would further explain that this position was the best, and I should be satisfied with the explanation given, and not want any other sort of cause. And I thought that I would then go on and ask him about the sun and moon and stars, and that he would explain to me their comparative swiftness, and their returnings and various states, active and passive, and how all of them were for the best.... How grievously was I disappointed! As I proceeded, I found my philosopher altogether forsaking mind or any other principle of order, but having recourse to air, and ether, and water, and other eccentricities. I might compare him to a person who began by maintaining generally that mind is the cause of the actions of Socrates, but who, when he endeavoured to explain the causes of my several actions in detail, went on to show that I sit here because my body is made up of bones and sinews; and the bones, as he would say, are hard and have joints which divide them, and the sinews are elastic, and they cover the bones, which have also a covering or environment of flesh and skin which contains them; and as the bones are lifted at their joints by the contraction or relaxation of the sinews, I am able to bend my limbs, and this is why I am sitting here in a curved posture—that is what he would say, and he would have a similar explanation of my talking to you, which he would attribute to sound, and air, and hearing, and he would assign ten thousand other causes of the same sort, forgetting to mention the true cause, which is, that the Athenians have thought fit to condemn me, and accordingly I have thought it better and more right to remain here and undergo my sentence; for I am inclined to think that these sinews and bones of mine would have gone off long ago to Megara or Boeotia—by the dog they would, if they had been moved only by their own idea of what was best, and if I had not chosen the better and nobler part. (Phaedo 96a-99a)
Watson does not think much of Soames' case, and he musters some evidence in support of the claim that philosophy was aping the sciences, and psychology in particular:
The first clear, definite philosophy departments arose in response to the formation of psychology departments. The first clear, definite philosophy journals, associated with subject matter studied in departments devoted specifically to what was called philosophy, arose in the same way and for the same reason. It is not an accident that one of the first such philosophy journals, formed in 1876, is called Mind.
Watson aptly points out that a bit over a century ago, every academic degree was in philosophy if it was not in medicine, theology, or law. This is, by the way, what Kant's The Conflict of the Faculties is about, and of course, it is why to this day you can get an M.D. as "doctor of medicine" from the university's medical department or a Ph.D. as "doctor of philosophy" from just about any other department you care to name -- though there remain equivalent degrees (ofttimes honorary) in Law and Divinity.

Why, then, did "philosophy" become its own separate academic department? Watson cites the intellectual history traced in Edward Reed's From Soul to Mind, a book tracing the development of psychology. Watson first pointed me to this book in a discussion on philosophical diversity. Chapter 10 in particular lays out some of this history:
Both modern psychology and modern philosophy – as academic disciplines comprising professional scientists or scholars – began to emerge toward the end of the nineteenth century. Psychology in this sense preceded philosophy by about ten years, although it tended to be housed within philosophy departments. Obviously a great deal of jockeying for position, power, prestige, and influence took place. In the United States it was only in the 1890s that philosophers sought to organize specialized journals and started to think about founding a professional society (which did not begin functioning until 1901). In these activities they lagged at least a few years behind the psychologists, and many of the founding documents of ‘strictly philosophical’ institutions explicitly refer to the successes in psychology as one of the reasons for establishing such distinctively philosophical entities. Small wonder that the new professional philosophers latched onto the most provocative antipsychological methodologies available, phenomenology and logic, as defining the activity of members of their emerging discipline. (p 200)
Some may want to push back here by asking: Wait, what about William James? Founder of "Pragmatism," the quintessentially American philosophy, and also author of Principles of Psychology? Even Reed confesses that James is an anomalous figure for his account. James never "took" to the new psychology. He also never approved of the way the "institutional imperative of the university ha[d] come to drive the theoretical agenda" of the humanities, even by his day. (See his "The Ph.D. Octopus," though it is occasioned by an objection to the English department in this case.)

We'll return to James -- in particular, his energetic and now completely neglected enthusiasm for psychic research -- in the next post; for now, there are a couple of things to note about the general picture Reed sketches. The first is historical. The advent of philosophy departments was part and parcel of the aforementioned infamous Continental / Analytic "split" which followed less than a generation later. A few years ago another student of both Lingis and Rosen, Graham Harman, saw the roots of this divide in Brentano's 1894 lecture on "The Four Phases of Philosophy," which you can read in translation in this book, along with some uneven commentary. Brentano writes:
The history of philosophy is a history of scientific efforts, and it is thus similar in some respects to the history of the other sciences. On the other hand, it is different from the latter and seems rather to be analogous to the history of the fine arts. Other sciences, as long as scientists pursue them, show a constant development which may sometimes be interrupted by periods of stagnation. Philosophy, however, like the history of the fine arts, has always had periods of ascending development and, on the other hand, periods of decadence.
This is of course not just a pair of analogies. The investigations of experimenters and theorists like Newton, Boscovich, Faraday, and Maxwell all fell under the rubric of "Natural Philosophy;" even as recently as 1949, Einstein (at least) could be described, in an echo of archaic usage, as "philosopher-scientist" and there was still a chance that this would be understood.

A propos Brentano's characterization, Harman remarks that "the entire analytic/continental rift is foreshadowed and explained in this passage." It plays out on the level of process, arguably, even more than it does on the level of doctrines promulgated. The usual argument is that analytic philosophy tries to apply the hard sciences' methods to philosophical problems, or that it tries to back up and do meta-science. But Harman thinks the problem is, rather, that Analytic philosophy tries to model itself upon what it sees as the history of science, while Continental philosophy is animated by a particular myth of the arts:
The difference between the two currents... isn’t so much one of content as of professional mission and self-understanding. Analytic philosophy is deeply committed to the idea that philosophy is a cumulative enterprise, and that the adding up of small discoveries will lead to a general professional advance.... By contrast, it seems pretty clear that continental philosophy follows the “fine arts” model of the history of philosophy…. The progress of philosophy is made not of cumulative argumentation but by the vision of towering geniuses.
Now, it is easy to see how this analysis resonates with Reed's case, and Watson's revisionary use of the case, that philosophy as academic discipline was responding to the rise of the "new psychology." If Brentano's lecture points to an apparent tension in philosophy, the crisis pressed upon academic philosophy by departmentalization turned this tension into a schizoid structure. Not only would philosophers now decide "what would count as proper philosophy" by virtue of who was in or outside of institutional circles, they would wind up squabbling within the institution as well, until Analytic philosophers would deny that Continental philosophy was philosophy at all, and we got things like Quine signing a letter protesting Derrida's receiving an honorary degree, claiming that "In the eyes of philosophers, and certainly those working in leading departments of philosophy throughout the world, M. Derrida’s work does not meet accepted standards of clarity and rigour."

So much, then, for the historical question.

But a deeper and more unsettling question also arises. If we play this history forward, are we not perhaps living through the endgame of this untenable arrangement? That will be the starting point for Part ii.