Future, Present, & Past:

Speculative~~ Giving itself latitude and leisure to take any premise or inquiry to its furthest associative conclusion.
Critical~~ Ready to apply, to itself and its object, the canons of reason, evidence, style, and ethics, up to their limits.
Traditional~~ At home and at large in the ecosystem of practice and memory that radically nourishes the whole person.

Οὐδεὶς ἄμουσος εἰσίτω

Tuesday, May 24, 2016

Wanting truth: both ways

An article by Galen Strawson in The NY Times' philosophy column maintained that
the hard problem is not what consciousness is, it’s what matter is — what the physical is.
I read this recently in the context of a philosophy discussion group about the philosophy of mind (we are reading John Searle at the moment). Strawson's remark reminded me that in an earlier conversation, one friend had opined, in the midst of a heated back-and-forth about just where consciousness "arises" in the complex web from neuro to social, "I don't see consciousness as a problem." This remark occasioned a total break in conversation for a space of some seconds, until something like an uncomfortable "ohhh-kay, then" broke the silence.

The anecdote nicely illustrates, I think, something about starting points -- not our axioms, but our situated values, with which we embark upon enquiry.

No one in our discussion is an out-and-out eliminativist, alas; if they were, I might be able to understand better. As it is, my attempts to understand eliminativism founder, but here's what I've got so far.

I think eliminativism arises from a kind of fixation on the deflationary experience of explanation, a deflationism run riot. There is often something about a really convincing explanation that makes us mildly disappointed; an "Oh, so that was all." (Cue the song "Is that all there is?") One of the tricks (the good tricks) of the scientific urge is to turn this deflation into an asset, to get us excited about that experience, the rush of delimiting the problem. At least in some cases, this deflationary stance in turn breeds reactionaries whose rallying cry is "But wait, there must be Something More..." It is easy to make a caricature of these latter as chasers after the MacGuffin. ("No such thing!") But the main point isn't about whether there is, or is not, Something More. It's about the animus that wants to pursue it without ever catching it, like King Pellinore after the Questing Beast; or conversely, the animus that wants to track down every last Something More until the very idea of Something More is eradicated -- because that's what it's after, really. One stance gets off on a high which is the glimpse of something ever-slipping beyond the horizon; the other gets off on the deflationary feeling of "That's all it ever was," which it parlays into a rush of "Ha, Got you!" (like when you finally thump the glass down trapping the running spider). (If you just step on them, you will have to substitute your own metaphor.)

Sometimes one wants very badly to cut through the issue of the satisfaction a style of thought is after, and instead to ask and to know, Which One of These (deflationists or something-more-ists) is "right"? This is obviously a clumsy formulation in this case -- there are far too many complicating questions -- but we still have an intuition about the bifurcation true/false, that only one of these terms can pertain. I however do not think the issue ("which is right?") can be separated from the question of the drive that motivates them. I know that I risk in this a sort of psychologistic reduction of philosophy, or even Bulverism (this was C.S. Lewis' joke-name (after its fictitious founder) for a dismissive attitude to any argument, dispensing with the reasons for the argument by "explaining" them instead, in radically ad hominem terms). Although I am (in modern, shorthand terms) a Kantian as regards faith and reason, I emphatically reject any psychologism that reduces truth to the function of "our sort of mind" (whether that "sort" is simian or bourgeois or postmodern or whatever); but I still deny that we have any unmediated access to a pure algebra of rationales hanging suspended in the Space of (Pure) Reasons.

Fichte famously said that philosophy was a function of temperament:
what philosophy a man chooses depends entirely upon what kind of man he is; for a philosophical system is not a piece of dead household furniture, which you may use or not use, but is animated by the soul of the man who has it. (Science of Knowledge, Introduction, sec. V)
Philosophers often recoil from this notion--as if it were offered as a kind of excuse, or a blanket explanation like phrenology. Taken as such it is clearly unacceptable, since it would derail the very idea of dialogue. But there is something to this notion of philosophical temperament, nonetheless. It's very hard to get a handle on, especially in one's own case, and most thinkers prefer to ignore the question entirely, or at best to marginalize it into the human-interest part of philosophy.

One who doesn't ignore it is Thomas Nagel. Here is a lovely and forthright passage from The Last Word:
In speaking of the fear of religion, I don’t mean to refer to the entirely reasonable hostility toward certain established religions and religious institutions, in virtue of their objectionable moral doctrines, social policies, and political influence. Nor am I referring to the association of many religious beliefs with superstition and the acceptance of evident empirical falsehoods. I am talking about something much deeper–namely, the fear of religion itself. I speak from experience, being strongly subject to this fear myself: I want atheism to be true and am made uneasy by the fact that some of the most intelligent and well-informed people I know are religious believers. It isn’t just that I don’t believe in God and, naturally, hope that I’m right in my belief. It’s that I hope there is no God! I don’t want there to be a God; I don’t want the universe to be like that. (p 130)
It is extremely difficult to entwine these approaches, to want truth and to want truth; to desire it both qua truth, and qua desirable. Philosophy is not accomplished by impossibly cutting oneself off from context, including the context within oneself. (Though it feels like that sometimes, and includes any number of impossible efforts.) No matter what sort of explanation appeals to you, ask yourself: why does it appeal? What is this "appeal"? Think enough about this and you will find yourself staring at the question of the nature of the Good.

Friday, April 29, 2016

Publishing. And not.

When Hilary Putnam died recently, a common theme in the many tributes offered was his well-known willingness to change his mind. I can no longer track down the remark and so cannot quote it exactly, but I remember some place where he spoke frankly of these about-faces, strongly commending the capacity to say things in the form: "I once thought.... I now think....".

This came back to me recently while reading R.G. Collingwood's Autobiography. Collingwood is reminiscing about John Cook Wilson, a professor of logic at Oxford:
"I rewrite, on average, one third of my logic lectures every year," said he. "That means I'm constantly changing my mind about every point in the subject. If I published, every book I wrote would betray a change of mind since the last."
For Putnam, it simply meant he was willing to conduct his shifts of position in public. For Cook Wilson, it was (according to Collingwood) a reason not to publish at all. (In fact, Cook Wilson did publish -- mainly articles, many of which were collected in the posthumous Statement and Inference in 1926). There are others who fit these profiles: Socrates in the agora, (supposedly) constantly changing his position, is a model of continual public revision; Wittgenstein, pruning and re-arranging the Investigations until there was no realistic chance of making them public in his lifetime, is at the other extreme. But most thinkers fall somewhere between.

Collingwood remarks that
I already knew that there are two reasons why people refrain from writing books; either they are conscious that they have nothing to say, or they are conscious that they are unable to say it.
For almost the first two decades after I came of age intellectually, I was haunted by the first reason. But I begin to wonder if it won't be the second that really gets me.

Tuesday, March 22, 2016

A Moving Picture of Eternity

The liturgical year is a complex matrix of many interlocking cycles. The weekly cycle from Sunday to Sunday turns within two slower processions of observances that move through the year. One cycle is a set of fixed celebrations with set calendar dates (Christmas, for instance, always on December 25; Epiphany, always January 6; various Saints' days, each assigned a calendar date usually associated with their death, or sometimes the transfer of relics or some other event). The other is the so-called "moveable feasts," which occur on different dates in different years. Easter is the most obvious of these, and many moveable feasts have their center of gravity at Easter and move forward or backward through the calendar based on when Easter falls. Thus, Pentecost comes fifty days after Easter, and Clean Monday comes 48 days before it. Obviously, moveable observances (not always feasts, as will be apparent in a moment) will pass nearer or farther from various fixed observances in various years.
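The arithmetic of the moveable cycle is simple once Easter is fixed: each observance sits at a constant offset in days. Here is a minimal sketch in Python; the observance names and offsets are taken from the paragraph above (Pentecost is the fiftieth day counting Easter itself, hence +49), and the Western Easter date for 2016, March 27, is supplied by hand. Note that Clean Monday properly belongs to the Byzantine reckoning from the Julian Pascha, so its date here is purely illustrative.

```python
from datetime import date, timedelta

# Offsets (in days) of some moveable observances relative to Easter.
OFFSETS = {
    "Clean Monday": -48,   # start of Great Lent, 48 days before Easter
    "Good Friday": -2,
    "Easter": 0,
    "Pentecost": 49,       # the fiftieth day, counting Easter itself
}

def moveable_observances(easter: date) -> dict:
    """Map each observance to its calendar date, given that year's Easter."""
    return {name: easter + timedelta(days=days) for name, days in OFFSETS.items()}

# Western Easter in 2016 fell on March 27.
feasts_2016 = moveable_observances(date(2016, 3, 27))
```

Because every date is a fixed offset, the whole moveable cycle slides as a rigid block wherever Easter lands in a given year, passing nearer or farther from the fixed feasts.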

2016 brings a rare occurrence this coming Friday -- the coinciding of two very solemn observances, one fixed, one moveable: the Feast of the Annunciation, and Good Friday. The Annunciation -- the day the archangel Gabriel announced to the Blessed Virgin Mary that she would conceive the messiah, and she responded "Be it unto me according to your word" -- is traditionally fixed on March the 25th. Good Friday is of course the day of the crucifixion; and not a feast but a fast. The conjunction of these two observances is very infrequent. It happened in 2005, but before that it happened last in 1932, before either of my parents was born; it won't happen again this century.
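The rarity is checkable, since Good Friday falls on March 25 exactly when Easter falls on March 27. A sketch using the anonymous Gregorian computus (the "Meeus/Jones/Butcher" algorithm, which computes the Gregorian Easter date for any year from the lunar and solar corrections built into the calendar):

```python
from datetime import date

def gregorian_easter(year: int) -> date:
    """Gregorian Easter by the anonymous (Meeus/Jones/Butcher) computus."""
    a = year % 19                      # position in the 19-year lunar cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30  # epact-derived quantity
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

# Good Friday coincides with the Annunciation (March 25)
# exactly when Easter falls on March 27.
coincidence_years = [y for y in range(1900, 2101)
                     if gregorian_easter(y) == date(y, 3, 27)]
```

Running this reproduces the post's claim: 1932, 2005, and 2016 appear in the list, and no year after 2016 does before 2100.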

Whenever the Annunciation falls during Holy Week, the practice in the Roman rite of late has been to transfer it to the first unimpeded day after the Easter Octave. (The same happens with the feast of St Joseph, March 19.) The Anglican and Lutheran churches follow suit these days, but of course it was not always so, and it was not so in 1608, when the dean of St Paul's cathedral wrote his poem
Upon the Annunciation and Passion Falling upon One Day

Tamely, frail body, abstain today; today
My soul eats twice, Christ hither and away.
She sees Him man, so like God made in this,
That of them both a circle emblem is,
Whose first and last concur; this doubtful day
Of feast or fast, Christ came and went away;
She sees Him nothing twice at once, who’s all;
She sees a Cedar plant itself and fall,
Her Maker put to making, and the head
Of life at once not yet alive yet dead;
She sees at once the virgin mother stay
Reclused at home, public at Golgotha;
Sad and rejoiced she’s seen at once, and seen
At almost fifty and at scarce fifteen;
At once a Son is promised her, and gone;
Gabriel gives Christ to her, He her to John;
Not fully a mother, she’s in orbity,
At once receiver and the legacy;
All this, and all between, this day hath shown,
The abridgement of Christ’s story, which makes one
(As in plain maps, the furthest west is east)
Of the Angels’ Ave and Consummatum est.
How well the Church, God’s court of faculties,
Deals in some times and seldom joining these!
As by the self-fixed Pole we never do
Direct our course, but the next star thereto,
Which shows where the other is and which we say
(Because it strays not far) doth never stray,
So God by His Church, nearest to Him, we know
And stand firm, if we by her motion go;
His Spirit, as His fiery pillar doth
Lead, and His Church, as cloud, to one end both.
This Church, by letting these days join, hath shown
Death and conception in mankind is one:
Or ‘twas in Him the same humility
That He would be a man and leave to be:
Or as creation He had made, as God,
With the last judgment but one period,
His imitating Spouse would join in one
Manhood’s extremes: He shall come, He is gone:
Or as though the least of His pains, deeds, or words,
Would busy a life, she all this day affords;
This treasure then, in gross, my soul uplay,
And in my life retail it every day.

– John Donne
(spelling modernized)
One of the reasons for transferring the Annunciation, of course, is that it is a feast, and the prototype of all feasts is the Eucharist, which is not celebrated on Good Friday (but rather distributed from the sacrament reserved from Maundy Thursday, the night before), and is not eaten at all on Holy Saturday. But in the Byzantine rite, the Annunciation is not transferred, and if it should coincide with either Good Friday or Holy Saturday, the Eucharist is celebrated nonetheless, and in the various hourly prayers things also get rather complicated. Such practices may strike one as an instance of the liturgical impulse to complexity and piling rules upon rules -- the sort of thing Agamben refers to in his critique, in The Highest Poverty, of the drive to make all of life into a liturgy. Or it may look like just another relic of what happened before we were clever enough to use the metric system. Wouldn't it all be so much simpler to just use one calendar instead of this bizarre and inconsistent mesh of solar and lunar approximations inherited from over a spread of 4,000 years? But it is neither a symptom of some delight in minutiae, nor a hangover from uneducated sun worship. There was a long-standing tradition that the crucifixion had transpired on this date. Tertullian, or whoever wrote the Adversus Judaeos attributed to him, writes:
And the suffering of this "extermination" was perfected within the times of the seventy hebdomads, under Tiberius Caesar, in the consulate of Rubellius Geminus and Fufius Geminus, in the month of March, at the times of the Passover, on the eighth day before the calends [i.e., the 1st] of April, on the first day of unleavened bread, on which they slew the lamb at even, just as had been enjoined by Moses.
Probably, though, the simplest demonstration that the crucifixion is traditionally dated on the 25th of March is that this is the date of the commemoration of St. Dismas, the penitent thief, who was crucified alongside Jesus and asked Him, "Remember me when you come into your kingdom." As for the Annunciation, of course it is assigned to March 25. The date follows with perfect logic from the date of Christmas. Just do the math.
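The math in question is just nine calendar months forward, conception to nativity; a two-line sketch makes the logic explicit:

```python
from datetime import date

def nine_months_on(d: date) -> date:
    """Advance a date by nine calendar months, keeping the day of the month."""
    m0 = d.month - 1 + 9              # zero-based month, plus nine
    return date(d.year + m0 // 12, m0 % 12 + 1, d.day)

# Annunciation (conception), March 25, to Christmas (birth), December 25.
christmas = nine_months_on(date(2016, 3, 25))
```

March 25 plus nine months lands squarely on December 25, which is the whole derivation.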

By other calculations, March 25 was also traditionally determined to be the date of Adam’s creation and Fall. I did a little digging and found a bit of Medieval Latin poetry here, which begins:
Salva festa dies, quae vulnera nostra coerces,
Angelus est missus, est passus et in cruce Christus,
Est Adam factus, et eodem tempore lapsus.
"Sacred festival day that heals our wounds, on which the Angel is sent, Christ suffers and is crucified, Adam is made and on the same falls," is my stammering (possibly wrong) rendering. The poem goes on to mention a number of other important events which all just happened to occur on this date: the slaying of Abel, the blessing of Melchizedek (on Abram), the sacrifice of Isaac, the beheading of John the Baptist, the rescue of Peter and the slaying of James under Herod (as per Acts 12). I'm unsure whether the author here is drawing on earlier tradition in every case or just piling up events on his own initiative.

Incidentally, and I'll stake three pints that it was not by accident, Tolkien also made the New Year begin on March 25, after the War of the Ring:
'Noon?' said Sam, trying to calculate. 'Noon of what day?'

'The fourteenth of the New Year,' said Gandalf; 'or if you like, the eighth day of April in the Shire reckoning. But in Gondor the New Year will always now begin upon the twenty-fifth of March when Sauron fell, and when you were brought out of the fire to the King.
(It was also the first day of the year in England until the calendar reform enacted in 1750 took effect in 1752.)

OK, all very interesting, you may say (at least to be polite), but... well, so what?

A liturgical calendar is not just a set of rules telling you what to do. It's a body of worship you move through in time; a set of corporate spiritual exercises, undertaken together. It is stable, but fluid. It has a pattern, but it is always shifting. If you enter into it intentionally, it can be a gentle and ongoing curriculum of ever-deepening prayer. Scriptures, and collects, and commemorations are paired with each other one year and then move away from each other, in a gesture like the circling of heavenly bodies or the gradual dance of one of Calder's mobiles. There is no single, memorize-this lesson to it; the constellation is always in motion, and this is a good thing. It's a way of reading the Bible in dialogue with itself, with the ongoing tradition of which it is a part, and with the whole community of the faithful, out of which moments of realization can emerge, sometimes slowly dawning on you, sometimes flashing out in startling clarity. The coincidence of the Annunciation and the Passion is obviously a very resonant and potent alignment, fraught with theological and symbolic associations. You may appreciate some of those when you consider them in the abstract; but it is a different matter to pray them, as part of the discipline you and your whole community have undertaken. Donne saw some of this, and sensed more, and wrote what he could (which is a good deal more than I could have). The poem is itself worth meditating upon a good deal, and I'll not try to unpack it here. But I want to point out the way the last line opens upon a matter of real import, moving beyond its own occasion. Donne is writing about a single day in the year, and indeed an exceptional day, a conjunction wonderful and rare. The first line underscores this marvel by repeating (and ending with) the word "today." And yet, he concludes:
This treasure then, in gross, my soul uplay,
And in my life retail it every day.
Every day. The correspondences of the liturgical calendar are not magical. There is no special power in the 25th of March by virtue of the square of five or the proximity to the Spring Equinox or anything else. The calendar -- like every spiritual practice -- does reveal something, and without entering into it deeply, you can and will miss much; but what it shows is what is always there.

Saturday, March 12, 2016

Philosophy Departments, Nihilism, Psychic Research, and the Continental / Analytic Divide. And a few other things you didn't think were related. Part ii

Last post I ended with a question: could the "new psychology" -- the empirical psychology being forged into a science in the late 19th century and whose departmentalization in universities triggered the responding departmentalization of philosophy -- have attained its final, deadly form in the rise of the neurosciences? When RS Bakker argues that logic is neurons and phenomenology is neurons and that any attempt at higher-level neuronic reflection on these patterns is an illusion, is this anything but a final, scoffing fulfillment of the anxieties Reed claims motivated the founding of philosophy departments in the first place?

This would be the triumph of just what, in the previous post, we saw Socrates lampoon in the Phaedo. "The Good" will have been exhaustively described as "sinews and bones," or neurons and glia, "arranged 'good'-wise." If we go all the way with Bakker, this simply means that philosophy is, and has always been -- not just since 1901 or 1879 -- an enormous act of denialism. But we need not go all this way. In my last post but one, I claimed strongly that it is untrue "that our circumstance has somehow encountered a game-changer in science:"
The void was not discovered by science ...I absolutely deny that our dilemma is in any decisive sense completely new and without precedent. Nihilism has always been "at the door," to those who had the skin to sense the chill. If there was ever a real response to it, there remains one now.
Then I remembered that Brassier had anticipated this argument:
‘Nihilism’ in its broadest sense, understood as the predicament in which human life and existence more generally are condemned as ‘meaningless’ (i.e. ‘purposeless’), certainly predates the development of modern science (think of Ecclesiastes). But the emergence of modern science lends it a cognitive import it did not previously enjoy, because where pre-modern nihilism was a consequence of a failure of understanding – “We cannot understand God, therefore there is no meaning available to creatures of limited understanding such as we” – modern nihilism follows from its unprecedented success – “We understand nature better than we did, but this understanding no longer requires the postulate of an underlying meaning”. What has happened in this shift is that intelligibility has become detached from meaning: with modern science, conceptual rationality weans itself from the narrative structures that continue to prevail in theology and theologically inflected metaphysics. This marks a decisive step forward in the slow process through which human rationality has gradually abandoned mythology, which is basically the interpretation of reality in narrative terms. The world has no author and there is no story enciphered in the structure of reality. No narrative is unfolding in nature, certainly not the traditional monotheistic narrative in which the human drama of sin and redemption occupied centre stage, and humanity was a mirror for God.
Quite a bit ago, I remarked (following up on a comment by my friend dy0genes) on Nihilism as a sort of ghost story, and I have since had many occasions to point out the magnetism between nihilism and the literary genre of horror. And what makes a ghost story more compelling than the preface: this story is true ?

Brassier holds that science gives us reasons for nihilism: rather than finding its motive in human insufficiency ("Even if one were to by chance hit upon the truth," as one of the pre-Socratics glumly put it, "one would not know"), contemporary, science-powered nihilism has a robust case for itself grounded in human capacity.

I have no knock-down response to this. But I do disagree with the way Brassier reads the motive for nihilism past. His claim that this was ignorance about God is especially striking. This ignorance has never been in question and remains axiomatic, for the infinite God and the finite (or even differently infinite) mortal human intellect are by definition incommensurable; but the strong voices in the tradition were always those who, while thinking this total incommensurability through, nevertheless found this an insufficient reason for despair, since God could and did reach out from the other side of the unbridgeable ontological gulf. The prophet Isaiah, St Gregory of Nyssa, Ibn Arabi, Simone Weil -- all are insistent that we do not reach God via some rope-ladder of syllogisms, let alone by rolling balls and inclined planes; but they know from experience that God breaks in upon the human condition nonetheless. Now if this assertion about "experience" sends your eyes rolling, there is nothing I can do about it. No argument is stronger than rolled eyes; but that is because rolling your eyes prescinds from argument.

Comes the rejoinder: "Bah!! It's your invocation of some unverifiable 'experience' that prescinds from argument!" But everything hinges on what counts as "verifiable" here. I am not going to rehash the whole "everyone-takes-some-things-on-faith" line, not because I don't think it's true or relevant (it is both) but because it usually doesn't work as a front-line tactic. I'm going to argue, rather, that what counts as evidence is heavily skewed in the official courts.

Jeffrey Kripal argues that you don't need God to blast open the doors of perception; simply taking seriously a precognitive dream would be sufficient to demonstrate that something besides the exchange of sodium and potassium ions is going on. The article in which Kripal made this suggestion (in the Chronicle of Higher Education) drew an intensely dismissive eye-rolling screed by Jerry Coyne in response. "What about all the false predictions?" Coyne asked, as if no one had ever thought of that. But the point is not that we never miss -- of course we do, and far more often than we score "direct hits." It's that when some curious event like a precognitive dream or a powerful synchronicity comes upon you, you experience this as meaningful, and while you can rationalize your experience away, this comes at a very high cost. In short (say I), such experiences are striking (but by no means the only) counter-examples to Brassier's claim that "intelligibility has become detached from meaning" -- but they also force a revision of just what is meant by "intelligibility."

Amod Lele first pointed me to this thread of Kripal's research in a comment on this post, and I have since read a good deal of him and find a lot to applaud and somewhat to critique; that will come in some later post, perhaps. Here I want only to underscore Kripal's claim that when weird things happen, they count and should not be brushed aside or left unacknowledged.
A paranormal event is one in which the world “out there” and the world “in here” manifest themselves as the same world. It is as if the mental and material dimensions of our experience have “split off” the same deeper Ground or One World. The physical world now begins to behave like a story or a series of signs. Hence the common descriptors of these experiences: “It was as if I were a character in a novel” or “It was as if I were caught in a movie.” These sensibilities, I suggest, are very accurate perceptions, because, of course, we all are caught in novels and movies, which we call culture and religion. A paranormal moment is one in which we realize that this is so. What we then do with this realization is up to us. (Kripal interview)
Elsewhere, remarking on this same self-description (‘it was as if I was a character in a novel’, or ‘it was as if I was inside a movie’) Kripal expands on how and why he thinks people are right to describe themselves thus:
I think they are. I think we are too, right now. We’re written, composed by our ancestors.
What shocked me was how many textual allusions people would naturally use to describe a paranormal event. They would talk about puns, jokes, allusions, readings or messages. It’s a textual process going on in the physical environment.

A paranormal event becomes an invitation to re-write the script. That could be on a personal level, or a cultural level for writers and film-makers. Take writers like Philip K. Dick or Whitley Strieber – these are people who create fantasy for a living. They know their spiritual experiences are fantastic. They know they’re being mediated by their imagination. They’re not taking them literally. And yet they would insist that something very real is coming through.
This claim of Kripal's that in such moments, "The physical world now begins to behave like a story or a series of signs," is very far-reaching, and it directly conflicts with Brassier's contention above that "no narrative is unfolding in nature," though it does leave a bit vague the precise contours of the referent "nature" and what the character of this "unfolding" would be. It certainly does not demand that there be one "narrative." For myself, of course, I have already said more than once that if there is any sense to the notion of apprehending a "story" to the context-of-all-contexts, it will be a matter not of plot but of theme. This, I take it, is crucial to figuring out how best to get out of the stupid zero-sum game of the analytic/continental divide, which, as I mentioned before, Harman (following Brentano) sees as roughly mirroring the split between the sciences and the arts: slow, collaborative progress vs. the cult of the genius. If the danger is that the Analytic side, in its adulation of science, will simply capitulate to scientism, the answer certainly cannot be to merely laud the artistic model of the Continental side; for the same rot that has corroded science has (differently, but in the same process and perhaps to worse effect) deeply corrupted art. It would take us even further afield in the loop-the-loop paths these two posts are "following" to pursue the matter, but the same arguments about the pernicious effect of academia have been going on for even longer with regards to poetry and fiction and the visual arts, and perhaps music above all. Attackers of MFA programs complain, sometimes in so many words: you can teach craft, to some extent, but you can't teach vision, and sometimes the teaching ruins the vision that was there. Defenders shrug: American poetry is vibrant, diverse, and thriving, and most of the critics teach at MFA programs, so what gives? The debate in music gets less airplay but is just as serious and just as deep-seated.
When, the accusation goes, was the last time a great work of music came out of the academy, anything moving enough, or even catchy enough, to compare with Cole Porter, Stephen Sondheim, Lennon and McCartney, Joni Mitchell, Prince, Radiohead? This sounds anti-elitist, because we think of pop music as being, well, populist in some sense; but it turns out to be just as much a function of the "cult of the genius" as any worship of Haydn or Mahler. It's not a myth we really want to say we believe in anymore, but we do not know what to replace it with.

What would a philosophy look like, though, that rejected the binary of "art" vs "science"? I am unsure, but Kripal provides us with another hint. At the very same time that the psychology and philosophy departments were differentiating themselves from each other, one towering figure was conducting intense research into just such strange phenomena as Kripal insists we ought to "put back on the table." That figure was William James, who straddles philosophy and psychology and is by any account one of the most important American thinkers in both; but whose tireless investigation into what we would call paranormal phenomena has been systematically marginalized by the unofficial archive-keepers, Kripal contends.
James was also a very active psychical researcher... I spent 20 years studying mysticism and never really thought about the paranormal or psychical phenomena.... We all read James, and we all talked about his pragmatism or his comparativism, but nobody ever talks about his psychical research. That really was a revelation to me, and I wanted to address the question of why we’ve taken that William James off the table.... Which William James are we talking about here? (about 5-6 min in.)
Kripal is overt about his resistance to scientism. For my part, while I do not consider scientism the same thing as nihilism (for one thing, it's often shallower), neither do I consider it the same thing as science. I would hope this need not be said, but as Coyne's misreading of Kripal makes clear, it does. I consider the scientific drive for truth, for as accurate as possible an account of the universe, to be a non-negotiable part of the philosophical spirit; the critique of superstition and of fear of thought must be cultivated intentionally (and it is not easy -- for thinking can indeed be frightening, and anyone who sneers at this hasn't done enough thinking yet to know.) I consider Scientism, on the other hand -- which I will gloss here as an inflation of methodological naturalism into an ontological presupposition, with attendant casualties elsewhere, including science, and in ethics above all -- as one of the main alibis of nihilism today. Thus it is, not without some mild amusement, that I find myself lending tentative credence to the case that a really good ("true"?) ghost story could be an antidote to nihilism.

Edward Reed notes that
The projects on which James expended the most labor -- his stream of consciousness psychology, his studies of psychic powers, his analysis of religious experience and conversion -- have never been taken up seriously by those who claim to be his heirs in philosophy and psychology throughout the United States. (p 201)
This is somewhat overstated if one includes disciplines like religious studies, but as regards the second item on Reed's list, it surely stands. I don't have a well-thought-out explanation of this neglect, except to suggest that it stems from an obvious consensus-reality discomfort with such phenomena, "alleged" or not. Nor do I have an over-arching moral to draw. It is surely not the case that James' interest in psychic research by itself would provide some watertight response to nihilism. Nihilism is compatible (if it can be said to be "compatible" with anything) with any number of ostensibly paranormal hypotheses and phenomena, including robust ghost sightings, alien encounters, and the ESP that Jerry Coyne hand-waves away. It is important to note that Kripal does not limit his interest in the "paranormal" to these rather lurid examples. His main anecdotes -- "anecdote" is not a dirty word to him -- are those very private and unverifiable events that steal upon us, or overwhelm us; unsought-for mystical encounters (whether thus labeled or not), like Philip K. Dick's 2-3/74 (which Kripal treats at length in chapter six of Mutants and Mystics), or Nietzsche's Sils Maria realization (which as far as I know he doesn't mention), or his own powerful kundaliniesque experience. What I want to insist upon is not the lab-ready status of such events but the importance of Kripal's claim that in these situations the world seems to be more like a story than like an equation. If, in our modern democratic hearts, we lean towards Continental philosophy's concern with the good (and rightly-conducted) life but do not want to unreservedly embrace the notion of "towering geniuses," the notion of the world itself as "art" seems -- while not a philosophical panacea -- clearly apposite.

"Never trust the artist," says D.H. Lawrence; "Trust the tale."

Thursday, March 10, 2016

Philosophy Departments, Nihilism, Psychic Research, and the Continental / Analytic Divide. And a few other things you didn't think were related. Part i

University philosophy has produced some crucial work, given livelihood to thinkers "important" and garden-variety, and fostered the awakening of generations of students. But we all know it's fucked up; that careerism and the interests of institutions are at best -- and the best is rare -- strange bedfellows with the love of wisdom. Of late this critique got a little bit of publicity in the New York Times' The Stone with an article by Robert Frodeman and Adam Briggle, and then a rejoinder recently by Scott Soames. Frodeman and Briggle write:
Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university.... Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy. This was the act of purification that gave birth to the concept of philosophy most of us know today. As a result, and to a degree rarely acknowledged, the institutional imperative of the university has come to drive the theoretical agenda.
They see a deeper and more pernicious effect of all this:
The implicit democracy of the disciplines ushered in an age of “the moral equivalence of the scientist” to everyone else. The scientist’s privileged role was to provide the morally neutral knowledge needed to achieve our goals, whether good or evil. This put an end to any notion that there was something uplifting about knowledge. The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions. ...Once knowledge and goodness were divorced, scientists could be regarded as experts, but there are no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not the superior character of the scientist. ...
Their conclusion is dire:
Philosophy should never have been purified. Rather than being seen as a problem, “dirty hands” should have been understood as the native condition of philosophic thought — present everywhere, often interstitial, essentially interdisciplinary and transdisciplinary in nature. Philosophy is a mangle. The philosopher’s hands were never clean and were never meant to be.....Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.
Soames wants to contest all of this -- a far too dreary assessment. He sums up Frodeman and Briggle as having claimed:
that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences. This institutionalization... led [philosophy] to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives.
Soames rejects both contentions.
I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
I wish Soames would address the first part of this argument -- that philosophy is "not separated" from the sciences -- to Stephen Hawking, Lawrence Krauss, or even Bill Nye, or any of the growing cadre of scientists who keep saying silly things like "philosophy is dead." As for the claim that philosophy was never "centrally" about goodness, justice and virtue -- this is to my mind a flabbergasting thing to say. Perhaps we have reached a differend here, but this assertion is, to my mind, a part -- a very large part -- of the problem.

The ever-engaging Brandon Watson at Siris rightly objects that
"Never" is an oddly strong word here -- the claim is certainly false (for instance) of very large portions of ancient philosophy. (Try to imagine a Plato who did not regard goodness, justice, and virtue as the central aim of philosophy. Or what in the world were Hellenistic philosophers mostly talking about if not primarily about "goodness, justice and virtue"?) But you don't have to go back so far. While one can argue about whether it's quite correct to call it "the" central aim, "the study of goodness, justice and virtue" was certainly far more central in the nineteenth century than you ever find it in the twentieth century.
I do not know exactly where Soames, Frodeman, and Briggle fall on the Analytic / Continental spectrum, but it isn't hard to discern the rough outlines of this split in their contentions, or in their curricula vitae. Soames has authored not one but two multi-volume histories of Analytic philosophy; every twentieth-century thinker he names to make his case is arguably from this "tradition," as he calls it, and the easy rapprochement with science which he commends is very much of a piece with Carnap and Quine, Sellars and Armstrong. There are any number of moral philosophers whom one could call Analytic, e.g. Anscombe, Rawls, Midgley (who incidentally is a good example of an anti-scientistic Analytic thinker); nonetheless, Frodeman and Briggle's concern with "how shall we live?" -- and they are both very engaged with this question on (or even under) the ground, so to speak -- is clearly flavored Continental (Frodeman studied under Stanley Rosen and Alphonso Lingis). All of which is meant to suggest that the A/C split may actually have something to do not just with reactions to the "departmentalization" of philosophy, but even perhaps with its origins. We'll come back to this.

Soames thinks he can argue cogently that Frodeman and Briggle have their genealogy wrong, and that "scientific progress" did not
rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools. Sometimes it does so when sciences are born, as with 17th-century physics and 19th-century biology. But it also does so as they mature. As science advances, there is more, not less, for it to do.
But the point is not whether philosophers have found material in the sciences, nor even whether they have contributed to scientific discourse (as has arguably been the case). The question is whether this is a good model for philosophy. Socrates clearly believed it was not, and it is worth quoting Plato at some length here:
When I was young, Cebes, I had an extraordinary passion for that branch of learning which is called natural science. I thought it would be marvelous to know the causes for which each thing comes and ceases and continues to be. And I was always unsettling myself with such questions as these: Do heat and cold, by a sort of fermentation, bring about the organization of animals, as some people say? Is it the blood, or air, or fire by which we think? Or is it none of these, and does the brain furnish the sensations of hearing and sight and smell, and do memory and opinion arise from these, and does knowledge come from memory and opinion in a state of rest? And again I tried to find out how these things perish, and I investigated the phenomena of heaven and earth until finally I made up my mind that I was by nature totally unfitted for this kind of investigation....Then I heard some one reading, as he said, from a book of Anaxagoras.... I rejoiced to think that I had found in Anaxagoras a teacher of the causes of existence such as I desired, and I imagined that he would tell me first whether the earth is flat or round; and whichever was true, he would proceed to explain the cause and the necessity of this being so, and then he would teach me the nature of the best and show that this was best; and if he said that the earth was in the centre, he would further explain that this position was the best, and I should be satisfied with the explanation given, and not want any other sort of cause. And I thought that I would then go on and ask him about the sun and moon and stars, and that he would explain to me their comparative swiftness, and their returnings and various states, active and passive, and how all of them were for the best.... How grievously was I disappointed!
As I proceeded, I found my philosopher altogether forsaking mind or any other principle of order, but having recourse to air, and ether, and water, and other eccentricities. I might compare him to a person who began by maintaining generally that mind is the cause of the actions of Socrates, but who, when he endeavoured to explain the causes of my several actions in detail, went on to show that I sit here because my body is made up of bones and sinews; and the bones, as he would say, are hard and have joints which divide them, and the sinews are elastic, and they cover the bones, which have also a covering or environment of flesh and skin which contains them; and as the bones are lifted at their joints by the contraction or relaxation of the sinews, I am able to bend my limbs, and this is why I am sitting here in a curved posture—that is what he would say, and he would have a similar explanation of my talking to you, which he would attribute to sound, and air, and hearing, and he would assign ten thousand other causes of the same sort, forgetting to mention the true cause, which is, that the Athenians have thought fit to condemn me, and accordingly I have thought it better and more right to remain here and undergo my sentence; for I am inclined to think that these sinews and bones of mine would have gone off long ago to Megara or Boeotia—by the dog they would, if they had been moved only by their own idea of what was best, and if I had not chosen the better and nobler part. (Phaedo 96a-99a)
Watson does not think much of Soames' case, and he musters some evidence in support of the claim that philosophy was aping the sciences, and psychology in particular:
The first clear, definite philosophy departments arose in response to the formation of psychology departments. The first clear, definite philosophy journals, associated with subject matter studied in departments devoted specifically to what was called philosophy, arose in the same way and for the same reason. It is not an accident that one of the first such philosophy journals, formed in 1876, is called Mind.
Watson aptly points out that a bit over a century ago, every academic degree was in philosophy if it was not in medicine, theology, or law. This is, by the way, what Kant's The Conflict of the Faculties is about, and of course, it is why to this day you can get an M.D. as "doctor of medicine" from the university's medical department or a Ph.D as "doctor of philosophy" from just about any other department you care to name -- though there remain equivalent degrees -- ofttimes honorary -- in Law and Divinity.

Why, then, did "philosophy" become its own separate academic department? Watson cites the intellectual history traced in Edward Reed's From Soul to Mind, a work tracing the development of psychology. Watson first pointed me to this book in a discussion on philosophical diversity. Chapter 10 in particular lays out some of this history:
Both modern psychology and modern philosophy – as academic disciplines comprising professional scientists or scholars – began to emerge toward the end of the nineteenth century. Psychology in this sense preceded philosophy by about ten years, although it tended to be housed within philosophy departments. Obviously a great deal of jockeying for position, power, prestige, and influence took place. In the United States it was only in the 1890s that philosophers sought to organize specialized journals and started to think about founding a professional society (which did not begin functioning until 1901). In these activities they lagged at least a few years behind the psychologists, and many of the founding documents of ‘strictly philosophical’ institutions explicitly refer to the successes in psychology as one of the reasons for establishing such distinctively philosophical entities. Small wonder that the new professional philosophers latched onto the most provocative antipsychological methodologies available, phenomenology and logic, as defining the activity of members of their emerging discipline. (p 200)
Some may want to push back here by asking: Wait, what about William James? Founder of "Pragmatism," the quintessentially American philosophy, and also author of Principles of Psychology? Even Reed confesses that James is an anomalous figure for his account. James never "took" to the new psychology. He also never approved of the way the "institutional imperative of the university ha[d] come to drive the theoretical agenda" of the humanities, even by his day. (See his "The Ph.D. Octopus," though it is occasioned by an objection to the English department in this case.)

We'll return to James -- in particular, his energetic and now completely neglected enthusiasm for psychic research -- in the next post; for now, there are a couple of things to note about the general picture Reed sketches. The first is historical. The advent of philosophy departments was part and parcel of the aforementioned infamous Continental / Analytic "split" which followed less than a generation later. A few years ago another student of both Lingis and Rosen, Graham Harman, saw the roots of this divide in Brentano's 1894 lecture on "The Four Phases of Philosophy," which you can read in translation in this book, along with some uneven commentary. Brentano writes:
The history of philosophy is a history of scientific efforts, and it is thus similar in some respects to the history of the other sciences. On the other hand, it is different from the latter and seems rather to be analogous to the history of the fine arts. Other sciences, as long as scientists pursue them, show a constant development which may sometimes be interrupted by periods of stagnation. Philosophy, however, like the history of the fine arts, has always had periods of ascending development and, on the other hand, periods of decadence.
This is of course not just a pair of analogies. The investigations of experimenters and theorists like Newton, Boscovich, Faraday, and Maxwell all fell under the rubric of "Natural Philosophy;" even as recently as 1949, Einstein (at least) could be described, in an echo of archaic usage, as "philosopher-scientist" and there was still a chance that this would be understood.

Apropos Brentano's characterization, Harman remarks that "the entire analytic/continental rift is foreshadowed and explained in this passage." It plays out on the level of process, arguably, even more than it does on the level of doctrines promulgated. The usual argument is that analytic philosophy tries to apply the hard sciences' methods to philosophical problems, or that it tries to back up and do meta-science. But Harman thinks the problem is, rather, that Analytic philosophy tries to model itself upon what it sees as the history of science; while Continental philosophy is animated by a particular myth of the arts:
The difference between the two currents... isn’t so much one of content as of professional mission and self-understanding. Analytic philosophy is deeply committed to the idea that philosophy is a cumulative enterprise, and that the adding up of small discoveries will lead to a general professional advance.... By contrast, it seems pretty clear that continental philosophy follows the “fine arts” model of the history of philosophy…. The progress of philosophy is made not of cumulative argumentation but by the vision of towering geniuses.
Now, it is easy to see how this analysis resonates with Reed's case, and Watson's revisionary use of the case, that philosophy as academic discipline was responding to the rise of the "new psychology." If Brentano's lecture points to an apparent tension in philosophy, the crisis pressed upon academic philosophy by departmentalization turned this tension into a schizoid structure. Not only would philosophers now decide "what would count as proper philosophy" by virtue of who was in or outside of institutional circles, they would wind up squabbling within the institution as well, until Analytic philosophers would deny that Continental philosophy was philosophy at all, and we got things like Quine signing a letter protesting Derrida's receiving an honorary degree, claiming that "In the eyes of philosophers, and certainly those working in leading departments of philosophy throughout the world, M. Derrida’s work does not meet accepted standards of clarity and rigour."

So much, then, for the historical question.

But a deeper and more unsettling question also arises. If we play this history forward, perhaps we are now living through the endgame of this untenable arrangement? That will be the starting point for Part 2.

Thursday, February 4, 2016

"...the intriguing insanity of religion..."

"... you’re one of the few that I’ve come across, probably the only one that is openly available for dialogue, that seems to have faced the void in all its implications, and somehow came through with his faith intact. I suppose I simply want to know how you did it."
This from Kalimere, who in several lengthy comments on a couple of posts from last year has raised in pressing terms the question of how or whether one can face down nihilism consistently and coherently -- especially now.

I think Kalimere's questions deserve a better response than I can offer, mostly because they are a version of the only question worth asking. Caveat lector: I am nobody to give out "spiritual advice," and insofar as my own idiosyncratic "spiritual" (Ha!) biography is of the slightest interest, I would think it would be as an ongoing cautionary tale. Nonetheless, I am going to offer some few reflections because I was asked, which is really what it comes down to.

One of my responses to Kalimere's assertion is almost a scoff: "Intact," indeed. I am strongly tempted to say "only a broken faith is worthy of the name 'faith,'" or some such. I am merely naming this, precisely as a temptation, because I think it is a bit of a cheap and too-easy pseudo-paradox. There are a few exemplars of such faith whom I can think of -- recent ones are Sergio Quinzio, Ivan Illich, Simone Weil, Elie Wiesel -- but I don't want to enumerate a little pantheon or wrap myself in a mantle-by-association; nor do I for a moment demean (though I cannot really relate to) the faith of ordinary believers who would never dream of the temptation of cursing God or suggesting that He has always been a landlord absconditas. Moreover, as I say in the post to which Kalimere is responding, I don't see any meaning to the idea of piety that is not enacted in worship. The worship may be like Job's, but I truly hold that it must be worship, prayer. Mere "theology" as "God-talk" is so much flatus vocis, but if we want to really pray, this means using forms of worship which we find -- which we do not make up ourselves -- for if we make it up ourselves, we run straight into the arms of mere egoism, be it ever so concealed under the abnegation of the ego (and everything else). One's faith may indeed be broken, but talk about "broken faith" runs a dire risk of more-broken-than-thou syndrome. You can't contend against the wiles of the ego with a strategy you make up yourself. You need traditional forms.

But why would one think that traditional forms are anything other than "the rotted names," to use Stevens' line? This brings me to my second response. The void was not discovered by science and it has not been made especially More Void-y by it. The void has always been there. I have learned a great deal from historicism, and it's an important part of my philosophy that there has been what I'll abbreviate as an "evolution of consciousness;" but I absolutely deny that our dilemma is in any decisive sense completely new and without precedent. Nihilism has always been "at the door," to those who had the skin to sense the chill. If there was ever a real response to it, there remains one now. So I do not concede the case (pressingly made by R.S. Bakker, for instance) that our circumstance has somehow encountered a game-changer in science. Science can be carried out in a (perhaps unconsciously) nihilist spirit. And it articulates a thousand-and-one reasons for nihilism if you are already leaning -- the slightest bit, perhaps -- that way. I don't wish to dismiss that lean; I simply dispute that there is any inherent connection between it and science per se.

I take this nihilism very seriously. I am, indeed, an ultimate "optimist," in the sense that I maintain (without being able to really imagine) that "All shall be well, and all manner of thing shall be well." But I am also a proximate "pessimist" -- there is no reason to assert that at any moment before the eschaton, any given thing need be "well" at all. I am, by the way, leaving wholly aside the complicated questions of what sort of philosophy of time and history this involves; but I grant that it may well feel like insanity to "expect" or even hope for such a denouement of reality -- some kind of big "Roll the Credits" at the End of Everything. Obviously I don't grant that it is stupid, or immature, or the craven echo of our ancestors' bad dreams or bad reality; but those who want to say that such eschatology is just another alibi for denialism are welcome to do so -- I don't believe I can persuade them against their will. (I do believe there is a question of will here). On the other hand, those who want to say that eschatology is more accurately a discourse of absolute doom -- that there is an End to All Things a-comin', and it will be cold, and meaningless, like everything that came before, but this time undeniable -- are, whether they know it or not, trading upon a secularized version of categories to which they have no legitimate claim.

But unlikely though it may seem, one can also combine faith with just such a bleak outlook. I drew upon Derrida a good deal for part of the work which drew Kalimere's comments; but probably the writing of Mikhail Bakhtin is more pertinent here. Bakhtin's dialogism was often associated with deconstruction when I was first exposed to this milieu; but as also happened with the work of Michel Serres, gradually people realized that there was a significant difference. Bakhtin's work does indeed strongly claim that "meaning is never foreclosed," and like Derrida there's a strong element of play in it ("carnival," it's often translated), but it's striking that Bakhtin never feels like he flirts with nihilism, though there is plenty of dourness, as might well be the case for a theorist whose works were forged under Stalinism and were often published posthumously (if they were not lost entirely). This won't be a post on Bakhtin, but I want to recall a relevant anecdote: Bakhtin, whose work is surprisingly and esoterically shot through with Orthodox Christianity, was once asked by a friend "whether or not good will eventually triumph." Bakhtin responded sharply: "No, of course not." This, I take it, is a rejection of the terms of the question. Whatever "All Manner of thing shall be well" looks like, it will not look like triumph, even though such categories may offer a rough analogy for now. Perhaps it would be better to say, as Lesslie Newbigin said: "I am neither an optimist nor a pessimist. Jesus Christ is risen from the dead."

In an article wrestling with Christian doctrines of eschatology and creation, David Bentley Hart, while also giving a long litany of the horrors of the world both moral and natural, then writes:
. . . It would be impious, I suppose, to suggest that, in his final judgment of creatures, God will judge himself; but one must hold that by that judgment God truly will disclose himself (which, of course, is to say the same thing, in a more hushed and reverential voice).
Yes, my faith is eschatological. But as Hart points out, this by itself does not suffice. One way of putting my own master-question, the question which in some senses characterizes my entire project, is, What is the right voice for philosophy? The right tone? And before we go to the easy, too-easy answer -- "more than one" -- I'll point out that Hart has already used more than one above, but he gives the last word -- the provisional "last" word, the last word here -- to the reverential. Yes, more than one, and yet, one. Which, when you stop to think about it, is the question of the One and the Many, the question philosophy is always asking.

My post's title comes from Kalimere's comment when he responded to my remark that "the watchword here is paradox;" it reminds me that Socrates tells Phaedrus that the poets are mad, but that "madness is the greatest of gifts, when it comes from the gods." There is another maxim, often thought to have been current in Socrates' day, to wit: "Whom the gods would destroy they first make mad." In fact, in that form the saying is not ancient -- it's by Longfellow -- but there are variants enough, that go back far enough, to justify the supposition that something like this was probably written by Euripides. And so it would seem to be an issue not just of the source (gods or not) of the madness, but of the motive of the gods for the "gift." Bakhtin in Art and Answerability says: "Inspiration that ignores life and is itself ignored by life is not inspiration but a state of possession." There would be, then, madness and madness; perhaps there are gods, and gods. It is hard to know the difference sometimes. One of the lessons I have imbibed from deconstruction is that there is no simple distinction to be made here, at least not by me. It takes someone with far more spiritual discernment than I to tell where one shades off into the other. The only thing I can claim is that such discernment is not a meaningless idea. In short, there is more than one sort of paradox; there is faith that is just a turning-away from reason, and there is faith that is the culmination thereof. I am sure I don't always know the difference; but I am sure there is a difference.

Wednesday, January 27, 2016

From my collection

There's a particular art project I do with students every year. Art Cards are small (2½" x 3½") cards, each of which bears an original work of art -- the ones shown here are in crayon, ink, and pencil. I've been doing this with students for over a decade now and it always amazes me what remarkable results they can attain in such a restricted compass.

Children's art famously has a number of common features wherever it is found, but on a small scale their instincts are often uncannily keen and their accomplishments deeply moving; particularly, for me, their non-representational designs. Many of these works, if executed on a scale of feet or meters instead of inches, would at the very least be comparable to any piece of hotel room art I've seen. Frequently they could hang beside Braque, Kandinsky, Pollock, or Rothko.

Shifting up by a factor of 12 does not, of course, make us graduate from little kid to grown-up. Any good gallery or museum ought to be able to show you some examples from masters of the miniature. But a 2½ x 3½ meter canvas dominates a visual field in a way that a playing-card sized piece does not. It's not the difference between maturity and childhood; it's the difference between outer and inner.

While the students who made these pieces -- all between five and eleven years old at the time -- had doubtless been exposed to such art, the art cards were never a result of an assignment ("look at this Picasso, this Mondrian. What do you notice? Now try something yourself!") All I do is put the materials in front of them and encourage them to make things they like. They take some inspiration from each other, and over the years I've learned a few techniques I pass on to them, but the designs are theirs alone.

Inevitably some students are immediately excited by the project, and others are blasé. I long ago stopped requiring participation in something like this, so it sometimes happens that a student is slow to take it up. But even those who seem uninterested at first often try it, because the initial investment is so small -- after all, it doesn't take much to scrawl with some colored pencil on a playing-card sized bit of card stock.

What makes the project take off, however, is a further dimension. I cannot take credit for it; as far as I know, it was the stroke of genius of Swiss artist Vänçi Stirnemann. The cards are trading cards; they are swapped, one-for-one. This means students rarely stay at that initial level of minimal effort. To build up a stock of tradeable work, they have to devote serious effort -- not hours per card, but enough that they feel they are parting with something that has cost them creative labor, when they are asking for a card they desire made by someone else -- a card that presumably also cost effort and attention, or it wouldn't have caught their eye.

In some ways this use of the art itself as a medium of exchange is an even more revolutionary element than the small and portable size. It jump-starts the circulation of the art and generates the magic which desire and possible attainability can confer upon it. Students experience the strange spell of wanting something just because it is beautiful, and also the delight of having made something that someone else is willing to trade for.

Some of the cards shown here were made this week. Some are over ten years old. I of course have some sentimental investment in them as well -- they are each signed on the back, and I can remember each artist, and sometimes even the occasion of the work -- but I still find that each one summons up a kind of world, an aesthetic unity, which is independent of whatever accidents of biography attach to them for me. Needless to say, I traded for each of them myself.

Tuesday, December 15, 2015

John Bremer 1927-2015

Socrates had something which I would like to have myself and which lies behind my educational work. It is this: that even when he faces death, even when he faces the dissolution of his mortal life, he is nevertheless able to face the situation as if it were an educational opportunity. He responds to it in an educational way, not only for himself, but for his friends. I myself, I suspect, would be scared. I would not only be scared, I would be so scared that I would be more concerned about the possibility of surviving than I would be about the possibility of leaving this world gracefully or in an educational manner. What is it that Socrates had? I would like to indicate in a general way what I think his achievement was because it is at the center of my own thinking now.
John Bremer, A Matrix for Modern Education, p 9 (1973)

I was deeply saddened to learn of the death of John Bremer on November 30.

I did not know Bremer intimately and we never met, but we emailed during the last years of his life. (How I wish I had moved more quickly and interviewed him, as we had discussed.) What he may have been like in person I do not know, but I read his books and treasured his correspondence, and every encounter with his words underscored the slowly dawning impression that he was the real thing: a lover of wisdom.

He was deeply committed to learning and scholarship and he took meticulous time on careful, detail-laden readings of a broad range of cultural texts: from Plato and Homer, to anonymous English ballads and Shakespeare, to the traditional dances of England. At the same time, his close readings were always alive to the real matters at stake. These accounts -- I'm thinking especially of his work on the Meno, the Ion, and above all the Polity (as he insisted on calling the dialogue usually known as the Republic) -- were fine-grained to the level of syllable-counts. He articulated and substantiated a picture of Plato (and by extension, much ancient philosophy) as extraordinarily attentive to questions of literary form. But he was insistent that these analyses only had any real point in the context of the work Plato called us to: learning how to live. When Jay Kennedy lately announced that he had discovered a musical structure to many Platonic dialogues, Bremer gently reminded readers that this claim (along different lines) had been made some twenty-five years previously by Bremer himself, to say nothing of the intersecting work of Ernest McClain in the 1970's. But Bremer's more substantive point was not about academic priority. It was, rather, that the question of whether Kennedy's claims were credible obscured much deeper and more pressing concerns beneath what threatened to be a swarm of mathematical minutiae.
Even if Jay Kennedy and myself have understood something of the mathematical structure of Plato's dialogues, there remains the question that Plato is always asking: How does this effect the way a man should live? Or what is its relation to the Good? If we don't face those questions, we might as well do crossword puzzles.
Bremer never lost sight of these problems and he pursued them doggedly but without airs for his whole life. His career as a radical educator (though he may not have wanted the term "radical," he certainly was one -- in the etymological sense of one who goes to the roots -- compared to the status quo of his day or ours) began with re-educating ex-Hitler Youth in Germany after World War II, and took him through several professorial positions in England; to New York City where he headed a school district; then to Philadelphia in the late 1960's where he helped to found and run the Parkway Program, a "school without walls" in which the city of Philadelphia became the campus for its students; then to British Columbia where he served as Commissioner of Education; to Australia where he founded the Education Supplement for The Australian newspaper; finally to Massachusetts where he founded the Institute of Open Learning, which became Cambridge College.

This curriculum vitae looks, and is, very impressive; but it was not a smooth ride. Bremer and his wife Anne came to the United States together following a case of professional discrimination against Anne; later, Bremer resigned his post in New York in frustration over his incapacity to actually change the school system. He said at the time,
If we wish to improve the education of children in New York City public schools, it is my opinion that this can only be done if we can change the relationship between child and teacher, between child and child, and between child and material. To change these relationships involves the total re-structuring of the New York City public school system.
This far-sighted radicalism could only collide violently with the status quo, and collide it did. This inevitably led to more reversals: the Parkway Program, despite its obvious successes, was essentially re-absorbed back into a traditionally-structured brick-and-mortar model; Bremer was dismissed from his position in British Columbia (the Education Minister said carefully that "we both want to create the finest education system here, but we differ as to the manner in which it is to be achieved"); and Cambridge College, where Bremer was Professor of Humanities from 2005 to 2008, later seemed to him to be a disappointment, having lost its vision and floundered in financial mismanagement. (It is still operating, still accredited, and may yet validate its founder's hopes.)

Bremer never glossed over these setbacks; he simply held to his vision, and his legacy in education is indisputable, though he may be remembered by name only by a few. The Parkway Program, especially, inspired a large number of experiments in education, many of which still hold to their principles against the odds. The genesis of this project was, of course, not idealism; it was money. Bremer had been called in to help with "decentralizing" some of the city's overcrowded schools as a way of wrestling with tremendous budget shortfalls. Bremer saw it as an opportunity to do a great deal more than make ends meet:
Once the confines of classroom and school were removed, it would be possible to re-define, to re-structure, the whole educational process. The freedom and responsibility of the student could become paramount.
Bremer reconfigured the whole administrative apparatus of a public high school; it became a genuinely (and, to some, shockingly) collaborative venture between students and faculty. The school was divided into self-governing units which held weekly "town meetings" where the curriculum was planned and discussed. Students told teachers what they hoped to learn; teachers proposed to students what they needed to know. Age distinctions dwindled. Attendance was not mandatory. No letter-grades were given; they were replaced by individual written evaluations of the students' work. An informal atmosphere prevailed; "Students can smoke in class, call teachers by their first names, and utter four-letter words without inhibition," Time magazine reported. To this day there are students who refer to it as one of the best periods of their lives.

Though the district leaders may have been taken aback by getting more than they bargained for, the opportunity was ripe for such experiments (it was 1968, the same year the Sudbury Valley School was established), and the Parkway Program experienced considerable success, not to mention notoriety. The write-up in Time brought educators flocking to see how it was done. Bremer tried to make sure that no one misunderstood it along the lines of counter-cultural clichés: the Parkway program was not "unstructured;" it was structured differently. "I don't know what an unstructured experience would be," he said, and in any case no learning transpires without structure. The question was: what structures would best support learning?

I've already mentioned Bremer's close attention to Plato. The "structure" he found there was extreme; no one could accuse him of being slapdash. One example: he believed that the Polity was meant to be read in a single day; that if you paid heed to clues in the text you could discern which day of the year it was set on; and that, if you had read it at the relevant (Greek) latitude on that day, the text coinciding with key moments (sunrise, sunset, midnight, noon) would reveal a meta-structural significance.

Or again: why is Apollo, the god of poetry, never named in Plato's Ion, which is devoted to the nature of poetry? Bremer has recourse to a careful reading of Plutarch, extensive music tuning theory, and a painstaking count of the number of syllables in the dialogue to answer this one, which I will leave to the reader to discover in his book Plato's Ion: Philosophy as Performance.

But perhaps even more than in these readings of the ancients, Bremer's attention to structure and how it enabled learning emerged in his love of dance, especially the folk-dance inheritance of England -- dances he believed to be remnants of a pan-European ritual tradition. He knew and taught these folk dances for years, sensing his body and abilities change until he could no longer leap as he once had but feeling that in some ways he was a better dancer as an old man than he had ever been in his prime. I did not know him as a dancer -- everything about this aspect of his life I learned from his writings -- but to me it epitomizes the secret that kept him from false modesty and false seriousness alike:
The music is more important than the steps and figures. Anyway, I should know the tune and be able to prepare my body to move in cooperation with it—that kind of mastery comes with experience, but is not reducible to absolute rules. But knowing as well as possible the tune and the dance steps and figures does not make the dance; they mark off the limits of possibility within which the dance can be created.

This seems most important to me. Within the limits of possibility, the dance is created. It does not pre-exist, nor is it constituted by the figures and not even by the tune—these are its pre-conditions but they are not its essence. The essence, the mystery, is what I, as dancer, create within those limits.
Bremer knew his limits. As an educator one could create certain opportunities, make information available, exemplify and even train expertise. But that is a matter of setting up a structure. There the educator reaches a limit, a limit inherent in the nature of human freedom. "Each person is free to learn for himself, and that freedom cannot be exercised by anyone else," he said -- almost a tautology, one might think, but easily lost. There is, Bremer maintained, a "second kind of education,"
a kind that has almost been forgotten. If the first kind of education is characterized by passivity, by a taking in, by memorization, by submission, then this second kind is characterized by activity, by a generous giving out, and by a creativity which shows, for example, the moral purposes that the acquired knowledge might serve.

But, it may well be asked, how do we carry out this second kind of education? And the answer is, we don’t. It is not something that teachers can do; only learners can do it, and they must do it for themselves. All that the teacher can do is, first, to help the students understand what has happened to them in their prior education and, secondly, to clear away the obstacles and impediments to the freedom of creativity. We do not give students their creative power—nature has done that by giving them what may be called a soul.
Eventually Bremer came to the most basic of limits: time. I do not know just how he faced his death, but his whole life had been bent toward making it an educational opening for himself and others. In a late email he told me about the last words of Socrates, that
'We owe a cock to Asklepios' ... is almost universally misunderstood. Their true meaning I am sure is that they were the customary sacrifice to Asklepios on the birth of a child.
Towards the end, his computer crashed and he lost a large amount of work. Writing to me in some frustration but without a trace of self-pity, he said that he was struggling to re-organize his thoughts, and that perhaps it would be better thus; then he added, wryly, "Horace was right, but I don't have nine years." He had, though, something better -- what Socrates had.

He was referring to the adage from the Epistle On the Art of Poetry:
...if at any time you do write anything, submit it to the hearing of the critic Maecius, and your father's and mine as well; then put the papers away and keep them for nine years. You can always destroy what you have not published, but once you have let your words go they cannot be taken back.
We are fortunate that he wrote what he did. In one of his last emails he had told me,
I only ever thought that I could do two things tolerably well: one was dancing, the other writing. And they seemed not unconnected for I am very conscious of the rhythm in what I write and of the 'figures' of the 'argument'.
A dance cannot be "put away," for it happens in the moment, and is gone. As to the writing that might have happened, it is gone too. What we have is what he published (many examples can be found on his website, which I hope will continue to be maintained and updated): work that is wise, generous, and self-effacing; that turns close analysis -- every "step" -- to the service of the largest and most open questions.

Memory Eternal.

Thursday, December 10, 2015

Grammar of Miracle

A couple of weeks ago, James Chastek at Just Thomism put up an excellent post on miracles, which sparked some further reflections on my part; and the midst of Hanukkah, celebrating the "great miracle" of the oil at the re-dedication of the temple after the Maccabees' revolt*, seems a good time to put them up.

Somewhere in an interview -- I can't recall the source anymore -- Ken Wilber remarked that reincarnation is still one of those topics that you cannot mention without your standing being immediately compromised in academic or professional philosophy circles. There are a number of these forbidden topics, and you can quickly suss out the assumptions of whatever in-crowd is dominant wherever you are by just asking yourself which matters you would feel uncomfortable being caught taking seriously. (The neo-reactionaries like to push the socio-political ones in your face to see standard-issue liberals get uncomfortable.)

Miracle is high on this list. Even in many a seminary or house of worship there are those who squirm at it. It just seems so clearly to be a vestigial meme from an earlier, more credulous era. We are very confident.

Rosenzweig introduces the second section of The Star of Redemption with a meditation on miracles that is (like so much of that indispensable book, really one of the short list of great philosophical works of the last century) still unplumbed. Miracle, says Rosenzweig, is the embarrassment of modern theology, and this embarrassment is a symptom of a decisive break with classical theology, which (he says) was rooted in the idea of miracle. Rosenzweig parallels the decline of theology with a decline of philosophy, both of which had seemed to come to a coinciding triumph in Hegelianism, and both of which were compromised by fatal flaws in Hegel's system.

For Rosenzweig, a miracle is not an inexplicable event, and it need not be "contrary to the laws of nature." It is, however, crucially bound up with prophecy -- a point which marks one of his vital connections to Pascal. When I first read Pascal, I was surprised and mildly put off. I had expected quite a lot more of the moralism along the lines of "All the misfortunes of Man come from his inability to sit quietly in his room alone;" or the proto-existential anxiety in the face of infinite interstellar spaces. Instead, I found huge swathes of text unpacking the fulfillment of Hebrew scripture in the New Testament. I did not know what to make of it. But Rosenzweig's whole project -- which is at least partially glossable as an exploration of what the Biblical inheritance means vis-a-vis the sum of philosophy -- provides one way of understanding: the issue of prophecy and of miracle alike is intra-traditional. For Pascal, the issue is not a proof directed to the pagan philosophers, but to those who already accept the Hebrew scripture as authoritative. To Rosenzweig as well, miracle is a "proof" of revelation -- the miracle par excellence -- and of providence, to be sure, because it exemplifies the way "all things work together" from the moment of Creation; but it is only this to those who already believe. To outsiders, to unbelievers, and in particular to the enemies of belief (those for whom "unbelief" is not neutral and bemused but antagonistic and resentful), the miracle is not experienced as a refutation. The hosts of Pharaoh do not flock to the camp of Israel to learn of Moses, nor do the believers in Baal turn to Elijah in the wilderness after their priests are consumed by fire. The unbeliever is not converted, but merely confounded. And, at least in many circumstances, they turn to "miracles" of their own (e.g., the snakes of Pharaoh's magicians). At best, "miracle" in this sense proves to be, in the Bible, a confrontation of power with greater power. 
But this never validates God; it merely validates -- power.

In the New Testament, this pattern is confirmed -- miracles are frequently beside the point for most people, or illustrate the wrong point, even for those intimately involved. The disciples are sure a ghost is walking toward them on the water; those who ate at the miraculous feeding of the 5,000 are later told they are seeking Jesus not because they saw the signs performed, "but because you had your fill of the loaves;" of ten lepers who are cleansed, only one turns back to give thanks and homage. In short, the Biblical authors do not seem to think that miracles produce any big swing from unbelief to belief.

What, then? This is where Chastek's post on Miracles is so clarifying:
miracles ... [are] not meant to get unbelievers to believe but to get believers to change their beliefs. (emphasis in original)
Chastek's rationale for this point -- that miracles are rare and occur "at transition points in salvation history" -- is important but only really pertinent to those who will grant, at least for the sake of argument, that "salvation history" is a meaningful category. I'm not going to argue that here, though I will note that this is one way in which Chastek anticipates a frequent objection -- to wit, that the Bible somehow "makes us expect" that miracles happen frequently. Not so, he says:
Scripture records two thousand years of narrative history, and not a hundred years of it are great times of miracle. Even that overstates the case since we certainly don’t mean that we find a hundred years of continuous miracles when we add them all up.
Rosenzweig agrees:
The question as to why miracles do not come to pass "today" as they used to "once upon a time" is simply stupid. Miracles never "came to pass" anyway. The atmosphere of the past blights all miracle. The Bible itself explains the miracle of the Red Sea post eventum as something "natural." Every miracle can be explained after the event. Not because the miracle is not a miracle, but because explanation is explanation. Miracles always occur in the present and, at most, in the future. One can implore and experience it, and while the experience is still present, one can feel gratitude. When it no longer seems a thing of the present, all there is left to do is explain. ("A Note on a Poem by Judah ha-Levi" in Franz Rosenzweig: his life and thought, ed. Glatzer, p 289-90)
But what is really important is that the Biblical authors do regard "salvation history" as relevant (and n.b., this "history" is decisively oriented towards the future in a crucial sense), and that this casts real light on the way "miracle" functions for this worldview. Miracles do not aim to change unbelievers into believers, but to make believers believe differently. This is my own, stronger, re-phrasing of Chastek's point -- it isn't just, or primarily, or perhaps at all about the content of the belief, but about what we might call the mode of belief. Not, we may say, the meme, but the meta-meme.

In the comments to the post, a reader asked: well, what about the miracles of the Saints? To this Chastek replied, completely consistently I think: the saints' miracles are a function of the liturgy (I would have said, of the Eucharist) -- and so are an extension of the principle that miracles are "addressed to" believers.

"No, no," I hear someone object -- "the point isn't whether miracles 'mean' such and such; the question is whether they happen at all. For if they don't happen, then they can't very well 'mean' anything, can they?" But this is to miss the point. In fact, and much to some of his admirers' dismay, Meillassoux has (without quite using the terminology) re-opened the issue of the plausibility, or at least possibility, of "miracles" in a certain sense -- not, to be sure, as "exceptions" to a law of nature, but simply as momentary changes in such a law. To say this is certainly to interpret Meillassoux against his own intent, but the point here is not whether I'm reading him correctly; it is that a consistent materialist and non-providential account of "miracles" as "events our current laws of nature do not permit" is certainly possible. For Rosenzweig, the miracle always functions within the context of an understanding of Providence; what Meillassoux offers is an account of "miracle" (of a sort) in the radical absence of providence. Doubtless, this account has a formal ingeniousness to it which makes it an object of interest, if not indeed a kind of perverse fascination. Probably, in fact, many such accounts could be possible, so far as this formal interest is concerned. But so what? What this shows is that the notion of "whether miracles happen" (or can happen) in that sense is not the question. We could even stipulate that they can and do; alternatively, we can prescind entirely from the question of "whether miracles happen" in the sense of the big Cecil B. DeMille special effects, because the question of "whether they happen" is playing a different role for the non-believer than it plays for the believer. The non-believer who asks this way is trying to say: if there "are no miracles," then such-and-such follows -- which implies (disingenuously, though they may not be aware of this disingenuousness) that if "there are miracles," something else follows.
I.e.: a miracle "now" -- a real, bona-fide, nope-we-can't-deny-it-and-we-can't-explain-it miracle -- would prove something; and so, by implication, the absence of a miracle proves something else -- something opposite. What the Biblical account of miracle implies (according to the reading I am offering of Rosenzweig, Chastek, and to some degree Pascal) is that no such thing follows. The calculus does not play out that way. That is not how the grammar of "miracle" in the Bible works; it isn't meant to offer that sort of "proof" at all.
If they hear not Moses and the prophets, neither will they be persuaded, though one rose from the dead.
(Luke 16:31)
But if that grammar refers us to the question of believers and non-believers (a tendentious terminology that has bequeathed us an ambiguous heritage), then the real issue raised here is not the meaning of miracle at all, but the meaning of -- belief itself. It seems to me that the question of how this term functions for the Biblical writers is one of the most difficult and pressing of all.

*It is perhaps worth mentioning that the miracle of the oil lasting eight days does not figure in the narrative of I or II Maccabees. It is referred to only in the Talmud.

Wednesday, November 11, 2015

Philosophy, standard and nonstandard, terminable and interminable

I uphold the option and obligation of philosophy for universal and ubiquitous occasion. Philosophy may indeed "philosophize" anything. The difficulty is that, in order to do this qua philosophy and not qua opinion, it must think Everything.

The mind is an itch. It connects. There is a moment of seeing the thing in itself, and then the mind is off, connecting it to another thing, and another. Green becomes the green of the apple or the green of envy or the green of my true love's eyes or a certain wavelength of the electromagnetic spectrum. The in-itself is an instant.

Socrates and Plato loved the Thing Itself, that Augenblick before the mind had linked it up to something else. In order to recapture that moment of innocence, the mind has to link it to everything else. This is why its task is endless and perfectly hopeless (as Wittgenstein said of our attempts to break out of language), unless by a trick the mind can jump out of its track. The best tricks work by getting the mind to do what it does -- all that linking-up -- as perfectly as it possibly can, and then to see the gap between this admittedly exquisite performance and the transfinite magnitude of the task. If it doesn't drive you to despair, it opens you up to something more than hope. If there is a perennial 'secret' in the Guenonian sense of "secrets passed down through the ages," it is in the tradition's quiver of techniques for making the mind do this -- none of which is guaranteed for life.

The most vociferous opposition to the claim of philosophy to address anything comes currently from Laruelle, who calls this ostensible hubris the "principle of sufficient philosophy" -- a name we can understand, obviously, via its analogy to the principle of sufficient reason: "There is nothing without philosophy," or perhaps, "without philosophizability," to coin a barbarism. Laruelle has set his face against a certain style of philosophical arrogance and power-playing, and this I take to be wholly legitimate. There is a danger in philosophy, well before you get to the real capital-D Dangers like madness or even mere nihilism: the danger of arrogance, of self-congratulation, of being In The Know. Very few students of philosophy have gone mad because of it. Some, perhaps, have found the slippery slide into nihilism made easier by bad philosophy. But many, many have known and savored the delicious superiority of being Above the Herd. They think they are philosophers when they are (barely) exceptions.

I have been reflecting on this of late during my reading of Aristotle's Nicomachean Ethics with a couple of friends. This book is not a "treatise"; it's more like a novel in which the protagonist is your own quest. You are led along from perspective to perspective, always thinking that resolution is around the corner, always being brought up short. The first part, with its search for the virtue, culminates with a famous exposition of megalopsychia, "greatness-of-soul." By all the signs, this should be it, and seems to be the end of our quest, until one stumbles -- for the great-souled man is in a certain way not self-sufficient; he is concerned with honors, with the admiration and respect he receives from his fellows. Aristotle allows even that the megalopsychos "may seem arrogant." Well, one might say the same about the philosopher, yes? Socrates is always going on about how he Doesn't Know, but he's awfully cocky towards his jury, even suggesting that what most befits his situation is that the Athenians put him up at public expense with a stipend for life. "Seems," eh?

Plato warns in his seventh letter that, even if he could, per impossibile, have written a treatise on the real content of his own doctrine, this would not be a good thing to do:
I do not think it a good thing for men that there should be a disquisition, as it is called, on this topic -- except for some few, who are able with a little teaching to find it out for themselves. As for the rest, it would fill some of them quite illogically with a mistaken feeling of contempt, and others with lofty and vain-glorious expectations, as though they had learnt something high and mighty.
Well. And yet, what is philosophy, according to philosophy? The examined life, that without which life is not worth living; the sine qua non of attending to one's soul; the thing most needful. Small wonder if philosophy "seems to be arrogant;" it cannot but risk this narrow passage. Now press this further, beyond philosophy to the gospel: the most dangerous and treacherous of temptations on the way of ascesis is the risk of prelest, of thinking oneself humble and spiritually adept, when this (possibly real) attainment is only a symptom of pride. A danger to which one never knows if one has succumbed. Then the question becomes: how to countenance this without merely slipping into despair, or into indifference -- i.e., despair by another name?

I think the ancients knew very well this risk; they had certain safeguards but also knew that there was a real, inescapable, danger. Indeed, the danger has to be real, because it is the facing of real danger that spurs one on to real humility.

I happened to mention to a friend that I had been reading Laruelle, and he asked me, so what's it all about? I don't consider myself competent to unpack Laruelle for anyone else, but of one thing I have been resolutely confident from the moment I first read him, and I told my friend: "All this Non-philosophy? It's philosophy." This is pretty clear when you press the analogy Laruelle says he is making between non-Euclidean geometry and his non-standard philosophy: both "suspend" certain axioms, but they are still engaged in a similar project; Euclidean geometry now becomes a special-case instance of geometry as a whole, with various other axiom-sets as possible configurations alongside the Euclidean. Well, it turns out that these non-Euclidean possibilities were known before Euclid, as Imre Toth has exhaustively detailed. The fact that "anachronism!" may be one's first instinctive response to such a claim is an index of how deeply ingrained the notion of historicism has become for us. In the same way, Laruelle's non-standard philosophy is simply philosophy qua philosophy. In my language (and, for this instance, Freud's), I would say that philosophy is interminable: it cannot succeed in "thinking everything," it can only either fail to do this, or succeed in failing. On the other hand, what Laruelle opposes is a philosophy that seems to think it could succeed in succeeding. Another way of putting this is that Laruelle is closer to the ancients than to the moderns -- a point that other, better, readers of him have noted before me. Of course, Laruelle's kinship with Neoplatonism is hard to miss (and, no doubt, easy to misconstrue -- I make no claim of understanding it in a way he would endorse). But I do note with some gratification that in an interview (in French) which was given in 2011, but which I have just read -- pretty poorly, I am sure, since my French is weak -- Laruelle confirms my suspicions:
Je veux croire que je suis un philosophe loyal, peut-être trop passionné.
I want to believe I am a loyal philosopher, perhaps too passionate.
It is far from my intention to merely conflate Non-[standard] Philosophy with some abstraction called "Ancient Philosophy," but I do want to suggest that this passion is an integral part of what has been slowly bled out of philosophy by the moderns.

But against this, it would need to be acknowledged that the ancients have a certain "coolness" to them which feels off-putting to us as well. So perhaps Laruelle is also right when he characterizes his own passion as trop.

To leave things here clearly leaves a lot of loose ends. I'll try to address some of them in a further post. But it's already a foregone conclusion that we won't be able to connect everything to everything.*

*Laruelle would say that this attempt is precisely the problem -- it isn't our job to think our way to the Real, but to try to think from the Real. To this, the Biblical paradox says: Yup.