Pico Ultraorientalis

Archive for the ‘History’ Category

In Defense of “Man”

Posted by nouspraktikon on July 15, 2017

Not Even Wrong

Suddenly.

Not suddenly as you or I measure time, but suddenly according to the stately cadences of historical events, we have lost, if not yet our species, at least, and ominously, our name for it.  At some point in the not very distant past, “Man” vanished…not extinguished as an  organism, but as an object of consciousness.  For where there is no name there can be no consciousness, where there is no consciousness there can be no science.  Today there is no longer a science called Anthropology worthy of its name, for the name has been banished.   I don’t mean the entertaining science of bones and basket weaving and many other shining objects which is offered in college curricula as “Anthropology.”  I mean Anthropology in the most specific of species-centered meanings, inquiry into that simple question….”What is…what is…[bleep!].”   It is a question which can scarcely be asked today, let alone answered.

This masking of “Man” strikes me as an important development which deserves an extended and serious discussion.  To that end, some ground rules are necessary, concerning which I have some good news and some bad news.  Here goes both:  Sex will not be mentioned in the course of this article.  I have no interest in whether the reader be sex-crazed or celibate, male or female or anywhere on the spectrum in-between.  I am only interested in whether you think this Anthropological murder mystery is worthy of your time and consideration.

If you concur, then the omission of sex and his/her ugly sibling “gender” is good news indeed, because these things are monumental and, I would argue, intentional, distractions from the difficulties involved in Philosophical Anthropology.  Those bad news bears,  non-adults who think sexuality is the central, nay exclusive, issue in life, can adjourn to their favorite safe space, the Reading Room on Gender, where they can reinforce their own bias among those vast collections of literature which are supplemented daily by our subsidized scholars and their media mimes.

Now to be sure, there are other rabbit paths leading away from the essential inquiry, it’s just that sex and gender are the most obvious, if not the most obnoxious, and hence need to be eliminated first.  However, those other anti-Anthropological rabbit paths, though less celebrated, become increasingly subtle as the core of the problem is approached.  In any subject, the task is hard enough when we have been force-fed the wrong answers…the real difficulties start when we realize that we started off on the wrong foot by asking the wrong questions.  Today, when we encounter the fundamental question of Philosophical Anthropology, to paraphrase the incidentally sexy but essentially humane Erin Brockovich, “..all we have is two wrong feet and damn ugly shoes.”  We don’t know “bleep!”…and the absence of the word doesn’t help.

If we wish to restore that lost science, it will prove necessary to go back and wrap our brains around that simple word “Man” which was once the standard English term for the class of all human beings, much like its French equivalent “l’homme” etc.  Man has long since disappeared out of scholarly, correct and polite language, which means pretty much everywhere, since in casual idiom, if we discount “Man oh man!” and similar oddities, the universalizing nomenclature of Philosophical Anthropology is worse than useless.  After all, you can tell a good joke about Poles, or rabbis, or priests, or homosexuals, or women, and yes, even about “men” qua the male gender, but it’s hard (short of aliens or the envious algorithms of The Matrix) to envision a “Man” joke.  However, while the comedians won’t notice, there might be a few instances where, for the health of civilization, the ability to have a word for the human species could come in handy.  From this, we can derive another important consideration: once “Man” has been abolished, it is unlikely to be missed by the broad masses.  The only people who are likely to be bothered are a few specialists in what it means to be a unique species, and these specialists are generally regarded as an over-serious, isolated and boring bunch.  Likewise, if the word “epidemic” and all synonyms for “epidemic” were outlawed, the only people likely to get in a panic would be epidemiologists.  Everyone else would get along quite splendidly…at least for a while.

To be sure, the abolition of “Man” and the Abolition of Man, as per the essay by C.S. Lewis are not identical.  The latter concerns the weakening of the species, the former concerns the loss of its name.  Indeed, the distinction between signs and things signified is another treasure which must be jealously guarded against the ravages of post-modernity, which is trying to slouch its way back towards a magical worldview.  Be that as it may, we can still surmise that in the defense of something it might prove essential to be able to speak about it.

On the other hand, we have to make especially sure we don’t get lured down another popular rabbit path, a highly respectable path which none the less leads away from the Anthropological core: The path of language.  For example, we could easily lump this abolition of “Man” (the word) together with similar language “correction.”  Pointing out the absurdity of these corrections is the strategy of many conservatives, such as British philosopher Sir Roger Scruton who talks about the way that gender neutrality reforms have “violated the natural cadences of the English language.”   On an esthetic level, there may still be some residual irritation at “people” (or similar substitutes) in lieu of “Man”.  Yet, while this is good Edmund Burke-vintage common sense, it heads off in a trivial and logic-mincing direction, of the kind favored by British analytical philosophers and American word-pundits in the Bill Safire tradition.  It expresses a futile, rearguard hope that inane reforms, like the substitution of his and hers by “hez,” can be reversed by a return to convention, or even mutual rationality.  Rather, the Postmodernist hordes are not likely to be stemmed by a grammar policeman, policewoman, or even policeperson holding up a gloved hand, shouting “Stop!”  It’s not that the “reforms” can’t be exposed as illogical and unappealing, it’s that they are just the tip of the spear carried by acolytes in a far deeper struggle.

Whether or not the war over language is winnable, I maintain that it is the war against Man (as a concept) which is primary, a battle with ideological motives rooted in the hoary past.  Call it a “conspiracy” if you will, keeping in mind that conspiracy is just popular philosophy prosecuted by cadres of minimally educated but highly motivated minions.  The generals in this conspiracy knew that they could not launch a frontal assault on Man (a.k.a. the human race), so they focused their attention on “Man” at first as a concept and then as a word.  The history of this war is better measured by centuries than by decades and has taken many a convoluted turn.  Hence my belief that contemporary Feminism is, at best, a secondary effect.  It is the Amazon battalion thrown into the breach of the citadel after the groundwork had been patiently laid and the initial battlefield secured.  That crucial battlefield was anthropology, and not what one is liable to think of as the field of anthropology, but its philosophical cousin, that key science of all sciences, namely, the “Philosophy of…[bleep!]…”

A good “Man” is wrong to find

One can admit something exists and is important without idolizing it.  There was all too much idolization of the human race after the Renaissance and building up to the Enlightenment, a period bookended by Pico della Mirandola’s On the Dignity of [Bleep!] and Alexander Pope’s Essay on [Bleep!], tomes whose style and economy have rendered them, perhaps mercifully, unreadable today.  In those days, whenever errant scholars ventured too far from the Pauline/Augustinian double anthropology of fall and redemption, it spelled trouble.  However, personal repentance generally put a limit to the damage which could be inflicted before the toxic juice of self-worship became endemic to society.  Mirandola befriended and was converted by Savonarola, that misunderstood Catholic puritan, while at least Pope never became the Pope nor were his verses rendered into binding encyclicals.  Savonarola taught the early humanists the secret of Christian Anthropology, that Man is both sacred and bad.  For his tuition, and other causes, he was burned at the stake.

The last child and virtual apotheosis (that is, one “made into God”) of the early modern period was Voltaire, whose hatred of religion was legendary.  None the less, even Voltaire had too much common sense to think that his animus towards Christianity could be transmuted into a new and living faith.  He noted that “It is easy enough to start a new religion, all you have to do is get yourself crucified and then rise from the dead!”  In recent years, the late Rene Girard has documented Voltaire’s insight with numerous case-studies, illustrating how most human religions originate in scapegoating, death, and subsequent apotheosis.  However the wily Voltaire could see where all this was heading, and limited his disciples to the “cultivation of their gardens,” i.e., the enjoyment of a quiet and restrained sensuality.  We might call this soft-core Humanism, or the humanism of the self.   This early modern Man-ism, which today is probably the most popular (albeit unconscious) religion on the planet, is little more than a recrudescence of old Epicurus, whose famous doctrine Paul once debated on the field of Athenian Mars.  At worst the virtues of this philosophy, such as conviviality, apolitical repose, refined aesthetics etc., are disguised vices, vices centered on feelings.  Think of the stereotypical Country Club Republican of today’s America.  Such people are pathetic, but not in any superficial sense of the word, since the purpose of their life is “pathic”…that is, to have feelings, high quality feelings.

Hard-core Humanism was a novelty of Voltaire’s rival, J. J. Rousseau.  In contrast to the soft doctrine, here the object of action is the ideal of Man, not the feeling-satisfaction of individual human beings.   It was Rousseau who managed to transmute the Enlightenment’s carping animus against Christianity into something resembling a true religion.  As the founder of this new religion, which has variously been termed Modernism, Humanism, Socialism and much else, Rousseau should have found himself subject to the pitiless Law of the Scapegoat.  However he eluded martyrdom, and not just because he died a natural death nineteen years prior to the outbreak of the revolution he had inspired.  Rousseau’s Man differed in important ways from both Christian and Renaissance conceptions, which were predicated on either a personal God, or at any rate, a hierarchy of beings of which the human race was but one link in the chain of existence.  Although initially disguised by Deistic code-words, the new religion lifted up Man as the Head of the Cosmos.  Since this Man was a collective, it was not expedient that any individual anti-Christ need suffer the Law of the Scapegoat.  If there were to be any suffering, it would only be in accord with the tyrant Caligula’s wish for the Roman people, “If only they all had but one neck!”  In principle, the head which lifts itself too high gets chopped off.  Caligula himself  proved  no exception to the rule.

At all events, by the 2nd or 3rd year of the Human Revolution (c. 1793 AD) modern technology had outstripped antiquity, democratizing death and allowing Caligula’s dream to come true.  The guillotine enabled the disciples of Rousseau to liquidate the old political class en masse, and then in a predictable turn of events, those disciples themselves mounted the scaffold, suffering a kind of mechanical crucifixion to the god whom they had lifted up, Man.  It was a collective crucifixion to a collective god, for this “Man” was not the same as in the soft Humanism of Voltaire, which was just a category designating a collection of individuals.  Rather, this post-Rousseau “Man” was, if not quite a concrete organism, at least cohesive enough to have a single will, a doctrine as lethal as it was democratic.

The carnage of the Revolutionary/Napoleonic period was not repeated in Europe until 1914 and thereafter, after which great quantities of men and women again began to be killed as a consequence of political and military action.  Here  we would like to inquire whether this carnage (lit. carnal death) was in some sense related to the death (or life) of an abstraction.  Is there a relation between the death of humans and the death of “Man” as a concept and a word, and if so, is that relation positive or negative?  The example of the French Revolution would seem to caution us against a laudatory Humanism, on the suspicion that the higher the ideal of “Man” is lifted up, the more human individuals are likely to be subjected to political violence.

At this point in the argument, however, such a conclusion would be premature.  The period between the exile of Napoleon and the shooting of Archduke Franz Ferdinand in Bosnia, which saw relative calm in European politics, was conversely the period which witnessed, for good or ill, a wholesale revolution in the popular concept of “Man” under the impact of Evolution, Marxism, and Psycho-analysis.  However none of these epicenters of scientific upheaval were directly concerned with Anthropology, at least Philosophical Anthropology; rather, they were centered on the cognate disciplines of biology, economics, and psychology.

More to the point, none of these revolutionaries set out to solve the problem, “What is… [bleep!]…”   However others took up that now forbidden question, and we should try to pick up their tracks from where they left off in the tumult of 19th century thought.

Philosophical Anthropology: The Conspiracy Thickens

Today if you mention “Illuminism” it is likely to conjure up secret societies, occultism and political skulduggery, critical investigation into which is no doubt important and proper.  However in the literary salons of Europe and America during the 1840s and 1850s Illuminism had a second, though in all probability related, meaning.  It referred to the then-novel research which today’s theologians refer to as the “Higher Criticism.”  If you know about, say, the “Jesus Seminar” then you pretty much know what Illuminism a.k.a. “Higher Criticism” was, except that the contemporary Seminar is pretty much an isolated rehashing of themes which were treated with greater plausibility and seriousness 170 years before.  Those earlier 19th century critics of religion were advancing along the front of a broad intellectual movement which was in the early stages of transiting from spiritualism to materialism.  The cynosure of the movement was Germany in the years following, and in reaction to, the death of philosopher G.W.F. Hegel.  To simplify a very complex way of thinking, many people of that time had accepted Pantheism, the idea that the universe and God are the same thing.  But most people are not very quick on the uptake, and are willing to sign on to a belief system before they grasp all of its correlative implications.

Thus, many a happy Pantheist, circa 1840 AD, was surprised and saddened to learn that their system no longer permitted them to believe in the personal divinity of Jesus, whom they had hoped to retain as a spiritual hedge in spite of their infidel inclinations.  They should have figured this out from reading Hegel, but it took the shock treatment administered by some young, radical, German intellectuals of the time (a.k.a., the Illuminists, Higher Critics etc.) to rub the noses of these au courant ladies and gentlemen in the compost of atheism.  After a halfhearted embrace of Pantheist ambiguity, some among the elite classes of Europe were again courting hard-core, Rousseau-vintage, Humanism, very much along the lines of the original French Revolution of 1789, albeit the European political revolutions of the 40s didn’t amount to much.  This time, humanism broke out with more scientific rigor and less heartfelt enthusiasm; “Man” was made the vehicle of those hopes and dreams which had previously been invested in God.  Moreover, the unprecedented technological progress of the times was conducive to putting faith in human works.

Yet those works, splendid as they might be, begged the question of their creators’ nature.  What was the essence of Man?  Or as we would say today, “What is the essence of….[bleep!]?”  Amazing though it might seem in retrospect, some people of that era actually took the time and pains to ask the Anthropological question.  The man who best serves as archetype of those questioners, actually proposing and discarding several solutions over the course of his life, was the German philosopher Ludwig Feuerbach (1804-1872).  One thing that can be said of Feuerbach, even if we dismiss him as a serial wrong-guesser who justly earned posthumous obscurity, was his persistent and scrupulous engagement with the Anthropological question.  His best remembered quote, “You are what you eat!” might ornament a nutritionist more gloriously than a philosopher.  Yet we must consider that, as a thinker, he was an anvil and not a hammer, pounded left and right by forces which were not just making Modernity but shattering the classical mirror of Man (better known to us as “bleep!”).  Feuerbach’s lifetime bracketed an epochal turn in human self-definition, a turn which Feuerbach didn’t initiate so much as chronicle.

Therefore, meditate on the chronological sketch below and notice how the turn from Anthropology to anti-Anthropology transpired in the space of a specific, species-haunted, generation.  I know this narrative will be easy to dismiss as a curmudgeon’s rant on “the origins of the left,” but only if you visualize the broad movement behind, and independent of, individual intentions will you grasp its Anthropological significance.  In spooky confirmation of a simultaneous and universal (or at least pan-Western) turn of thought, the history of early Positivism could be adduced as a development in synchronicity with Idealism, with the decapitation of Man in this case being conducted by French, and allegedly “conservative,” social scientists from Auguste Comte to Emile Durkheim.  But I rather prefer the bold and brooding history of Anglo-German radicalism.

1804  death of Immanuel  Kant, birth of L. Feuerbach

1806 Hegel publishes his Phenomenology, consciousness posited as the motive force in the history of the world, subjective (individual) consciousness conditioned in a “dialectical” relationship to objective (collective) consciousness.

1818-19 Lectures on the History of Philosophy, S. T. Coleridge introduces German Idealism to the English reading public, slowly Idealism will replace the reigning Scottish “common sense” philosophy in the English speaking world.

1831  death of Hegel

1835 Life of Jesus, by Strauss

1841 The Essence of Christianity by Feuerbach

1845 Karl Marx, Theses on Feuerbach, critical of objectivity and lack of political engagement in speculative Anthropology

1847-48 Revolutions in France and central Europe

1848 The Communist Manifesto

1851 The Great Exhibition in London, popular vindication of applied technology over philosophical and scientific theory

1854 The Essence of Christianity translated into English by George Eliot

1854-56 Crimean War (the only major European war between 1815 and 1914); Florence Nightingale, progressive transfer of humane care from family and church to state

1859 Charles Darwin, the Origin of Species, natural selection adduced as motive force in natural history

1860 Essays and Reviews, English theologians embrace the methods of Higher Criticism

1861-65 American civil war, first modern “total” war

1867 Marx, Capital vol. 1 published

1871 Charles Darwin, the Descent of Man

1872 Death of Feuerbach

Note that at the outset Man was The All-In-All, but at the end of the period, not even the  child of a monkey, rather, a scion of some anonymous animal.

In The Essence of Christianity Feuerbach attempted to equate God with “good.”  In his view all the things which were posited of a Supreme Being were actually virtuous attributes of the human species-being.  Justice, mercy, love, fidelity, etc., were human characteristics, which had been mistakenly projected on to an alienated figment of the collective imagination and deified.  However, and here’s the rub, the human individual had no more ultimate reality than God.  Feuerbach’s Man was not men, or men and women, or even people, but the species as a collective.   Individuals were mortal but the species was immortal.  Man was God, Man was good, and Man would live forever.  At the time it seemed like a grand faith, a devotion to something tangible which might give meaning to the limited and fragile life of individuals.

Feuerbach’s intention was to make a smooth transition from the crypto-Pantheism of Hegel to a less infatuated, more earthy, Humanism.  Yet his critics were more likely to see this continuity with idealism as contamination by unrealistic nonsense.  As thinkers more cunning and sanguinary than Feuerbach were quick to point out, this alleged Human species-being never managed to will anything concrete unanimously; rather, all real history has been the history of antagonistic groups engaged in fratricidal strife.  For the critics, the ultimate meaning of history was far better illustrated by victorious parties dancing on the graves of the defeated than a universally inclusive chorus singing Beethoven’s Ode to Joy.  According to Karl Marx the antagonistic parties were economic classes, and to some extent nations.  Today we would add genders, races, religions, and even sexual orientations.  Under fire from its radical critics, Human species-being quickly melted into the solvent of class analysis.

Small wonder that Marx happily discarded Feuerbach’s anthropology for the naturalism of Darwin, at one point seeking (and being refused) permission to dedicate Capital to the British naturalist.  Darwin’s system was founded on the assumption of conflict and competition, not the deduction of human from divine virtues.  Feuerbach continued to revise his system in the direction of increasingly consistent materialism, but was no longer in the forefront of a generation which had jumped from philosophical speculation to natural science, now that the latter was backed up by the prestige of  rapidly developing technology.

More significantly, the capital which Darwin did not endorse was the capital M in Man.  In classical anthropology Man had been one of the primordial kinds, as in Spirit, Man, Animal, and Mineral.  Naturalists from Aristotle to Buffon had recognized that qua organism, the human body was akin to other mammals, and especially to apes and monkeys.  However in a consistently despiritualized science, the one human species was no longer set apart from the myriad of other animals, but rather fell under the same biological and ethological constraints as any other organism.  This reduction may have deeply bothered Darwin personally, but as a scientist he never really posed the Anthropological question the same way that Feuerbach had done; rather, he was resigned to viewing homo sapiens as a single object within the purview of natural science.  In spite of the title, after The Descent of Man, Man ceased to exist as a problem for natural science.  Or more precisely, from a Darwinian point of view, Man, as a unique aspect of the world, had never existed to begin with.

From Man to “Man”

We began by hinting that the loss of “Man” was a harbinger of the death of our own species.  After some clarification we can now understand that the situation is rather worse than we had initially feared, in that, conceptually, Man was killed off sometime in the middle of the 19th century, while “Man” (the word) actually survived the concept by more than a hundred years.  To maintain clarity, we must remember that there are actually three deaths.  First, the death of the concept, second the death of the word, and third, and yet to happen, the actual species extinction of homo sapiens.  That the third death is yet to happen should not imply that it necessarily will, it is only a hypothesis.  None the less, the three deaths are cognitively related.  In particular, the death of Man (the concept) at the hands of Darwinism, is strongly associated with the putative mortality of the species.  If Man is subject to species extinction, as are all organic taxa according to the laws of natural selection, then Man cannot be considered a primary aspect of the world.  As an analogy, consider the concept of “states of matter” which are generally accepted as uniform, or at least ubiquitous, aspects of nature.  If, say, all liquids could disappear from the cosmos, it would put the schema of “states of matter” in serious doubt.  Something of that nature is what has happened with Man, due to the anti-Anthropological turn circa 1860.

Now, would it be too wicked for me to suggest that while Man is not a “species” in the same sense that Felis domestica is a species, none the less Man bears an uncanny resemblance to the cat, that enigmatic creature of the proverbial nine lives?  Not only did the word “Man” persist far longer than one might have expected, but Anthropology entered a period of great fruition after the death of Darwin.  Here I’m not referring primarily to what people ordinarily think of as “Anthropology”, the post-Darwinian people-within-nature paradigm which covers everything from bones to basket weaving.  Be wary: just as in politics, where the nomenclature for everything gets twisted around to its opposite and we are now forced to call socialists “liberals,” so in similar fashion those post-Darwinian scholars who no longer believe in a human essence are liable to call themselves “Anthropologists.”  In fact, they are mostly anti-Anthropologists who just want to study the secondary attributes and accidental properties associated with human beings.   Granted, there is nothing intrinsically wrong with that, and on the whole these so-called Anthropologists are not a bad lot, being no more consistently anti-Anthropological than the other professionals who have inherited scattered fragments among the human sciences.  If the so-called Anthropologists have any besetting sins, those would be 1) they stole the name away from genuine Anthropology, 2) some sub-schools were virulently anti-cognitive, for example the ethnologist Franz Boas who never saw a theory that he didn’t want to grind down into a powder of facts, 3) others, notably the Structuralists, were hyper-cognitive, and sought to gin up a Theory of Everything, based on some attribute (usually kinship or language) of human thought or behavior.

The anti-Anthropologists who called themselves “Anthropologists” loved “Man” (the word).  After all, it was their schtick, and made a nifty title for textbooks, even textbooks written by sophisticated Darwinians and Marxists who knew that human species-being had gone out of fashion with Feuerbach.  In the meantime, anything on two legs with an opposable thumb would do, and it was all great fun until Feminism put the kibosh on that particular branding.  None the less, so-called  “Anthropology” took the ban on “Man” in stride, since their usage of the term was based on a consistent nominalism, if not on a conscious memory of the anti-Anthropological roots of modern natural science.  Fortunately, due to the exclusion of classical languages, undergraduates could still take “Anthro” and not worry their heads that banned “Man” had never meant just  andro…indeed, that it had meant much more than both andro and gyno put together.

Yet, I wanted to mention the 20th century miracle of Anthropology, not so-called “Anthropology” but genuine Philosophical Anthropology, as it flourished after, and in spite of, the anti-Anthropological turn of the previous generation.  If I thought that Man were a mere species and not an attribute of Created Being, my inclination would be to classify it somewhere within the family Leporidae, as a mammal with a capacity for making unexpected intellectual leaps, and multiplying thoughts faster than other species can reproduce their genes.  To that end, what great broods have been spawned, not just among the anti-Anthropologists, which is only to be expected, but even among genuine Anthropologists during the 20th and even 21st centuries!

Now remember, when I heap praise on the battered remnants of genuine, philosophical, Anthropology, I’m only lauding them for asking the right question, namely: “What is…[bleep!]”  And by now you understand what “bleep!” is and that a Philosophical Anthropologist is one who would know and say that “bleep!”=Man, and that possibly we should even come out and say “Man” when we mean Man.  I am not saying that many, or even any, of these Anthropologists have answered the question correctly, although I think there is an answer, and that some have made a closer approach to the correct solution than others.  Naturally I have my own views, but I would consider anyone a legitimate Anthropologist who asked the question aright.

There are schools of Philosophical Anthropology of every description.  Some are religious, some are frankly atheistic, but even the most starkly atheistic Anthropologists demur from post-Darwinian naturalism in positing something unique and essential about the human race.  In that sense, all Anthropologists, from atheists to Christians, are tendering a kind of “minority report” against the consensus view of modern science and society.  An atheistic, but genuine, Anthropologist might posit that the human race has a unique responsibility to conserve the cosmos and bring it to its best potential.  Countering this, the consensus view would maintain that such an assertion was errant nonsense, an arbitrary projection of human values into the unthinking and unthinkable void.

In a brief treatment, it is impossible to do more than allude to all the speculative “minority reports” which have been filed by Philosophical Anthropologists against the hegemony of post-Darwinian naturalism.  No doubt many of these speculations have been wrong-headed, but they have at least kept a window open to world-views outside the standard narrative.  If I had to pick a representative of the type it would be Max Scheler (German, d. 1928).  Feuerbach’s anthropology began with materialistic idealism and sloped inexorably down to idealistic materialism; Scheler’s thought, however, described a parabola, which at its height sought the divine in Man.   Personality, both Divine and Human, was arguably Scheler’s main concern; however, his reluctance to deal with the limits imposed by a temporal creation, as per the Judeo-Christian scriptures, subordinated individuality to the vague infinity of deep time, a dilemma similar to that encountered by the ancient Gnostics.  Abandoning his initial, and intentionally Christian, viewpoint, Scheler made the alarming discovery that, in precluding a personal God, the amoral instinctual urges of the Cosmos were far stronger than any principle of spiritual form or sentiment.   The intellectual public in Germany and beyond, repelled by such otiose metaphysics, embraced existentialism, a doctrine which gave up on the reality of anything but individuals.  Anthropology once again retreated to the shadows.

In retrospect, Feuerbach and Scheler seem like tragic figures who lifted up Man, in one or another guise, as a god, only to see their systems crushed down by more consistently nihilistic doctrines.  However it is doubtful whether their contemporaries saw the loss of Anthropological hegemony as something to be lamented.  Rather, they were relieved to be unburdened of Man, just as they had greeted the earlier, and logically prior, “death of God” with satisfaction.

The return of Man, and the return of “Man”…which, both or neither?

The operational assumption is that people can get along perfectly well without a conception of their own species occupying a special place in the system of the world.  Underlying this assumption is the more fundamental axiom that the natural science narrative is our default outlook on the world.  After all, it’s “natural,” is it not?

However the “minority report” of Philosophical Anthropology raises the specter of a completely different world, a world in which the unique bearers of the divine image have been persuaded that they are but one of a myriad of animal species.  By this account, the conceptual framework of natural science within which the image bearers were circumscribed was not so much a “discovery” as the imputation of a belief-system.  From this perspective, it is naturalism, not the classical Man-centered cosmology, which is fabulous.  To get the masses of humanity to believe such a deflating fable in the course of a few centuries has been a superbly effective triumph of propaganda.  Although we have some hints as to who has disseminated this propaganda, the question of in whose interest it was disseminated remains enigmatic.

Within the English-speaking world, the banner of the old classical Anthropology (Christian or secular) was “Man.”  The banner was not furled up until long after the cause was lost.  Yet the banner itself was essential, so essential that the high command of anti-Anthropology decided to send in the Amazonian battalion to haul it down under the pretext of the gender wars.  Lost in the confusion of that particular skirmish was the deep import of having a proper name for that key nexus of Creation through which the Divine, ideally, was to communicate its dominion over the visible world.  “People” is more than just an innocent substitute for “Man”, since, being a plural, it serves as a pretext for importing the entire philosophy of nominalism into the human sciences.  Nominalism views entities (you and me and the cat and the carpet) as capable of being grouped into any category which happens to be convenient.   Whose convenience?

It can be safely inferred that this is a view well suited to those who want to abolish the boundaries between species.  Perhaps now the reader can see the relevance of all the preceding esoteric Anthropology, for looming on the event horizon of our world are a thousand crises brought about by relation of the human to the non-human.  Indeed, we are conjuring up new categories of non-humans day by day.  AI and aliens, robots and Chimeras, not to mention all those entities of the natural and spiritual world who are ancient in human lore.  I eagerly await the rebirth of the “dinosaur” from its amber-encased DNA.  Or will it be a dragon?   Names make a difference.

None the less, we proceed without caution, for the night-watch has been relieved of its duties as the evening of human history encroaches.  Isolated voices cry out, “There may be a problem here!” and anxiety is ubiquitous, but few are willing to “get real.”  This is not an accident.  The “real” tools, nay, the “real” weapons with which we might have fought were long ago taken away and beaten, not into plowshares, but into the bars of zoological confinement for what remains of the dignity of Man.  The “real” tools were realistic in a properly philosophical sense, exalting created kinds as the unalterable building blocks from which God created our world.  Such was Man.  Hence the necessity of having a personal name for the species.

Will Man come again?  I think so, but more on the basis of faith than calculation.  In the meantime others look towards a rapidly accelerating future, and begin to realize that “Nature” is hardly a better idol than secular Man, that the sense of “nature-in-itself” is an illusory effect of what psychologists call normalcy bias.  None the less, something is approaching, we know not what.  Intellectuals call it “the end of history” while technologists speak of “the singularity.”  Most just ignore it, but it will come nonetheless.

Suddenly.

Posted in Anthropology, Art, Christianity, Culture & Politics, Esoterism, Evolution, History, Paleoconservativism, Philosophy, Politics, Traditionalism, Uncategorized

From Ike with love: The Age of Deception (1952-2016)

Posted by nouspraktikon on July 5, 2017

Nothing has changed except our point of view, but that counts for something

It is easy to think, as the left continues to overplay its cards, that something significant has occurred, and that our trajectory towards an Orwellian future has accelerated.  On the contrary, the Trump victory has triggered a new gestalt in people’s minds.  By 2017 fairly average people can see what only hardened conspiracy theorists were willing to hypothesize as late as 2015.   Whether or not we are at the beginning of a new era, for good or ill, is a matter of conjecture.  Indisputably, we have taken our leave of a period in political history which will prompt nostalgia in anyone but truth-seekers.  While it was hardly an era of good feelings, it was held up by its laureates as a time of consensus, or at least bi-partisanship.

Rather, it seems better to call our recent past the Age of Deception.  The Great Deception consisted in draping a de facto one party system in the vestments of a two party system.  If you had said this in 1965, or 1975, or 1980, or 1994, or 2001, or perhaps even 2008…most people would have called you an extremist.

However somebody, somebody who thought extremism in the cause of truth was no vice, had already pointed this out as early as 1958.  Sure enough, his opponents, and they were legion, labeled this man a slanderer, effectively burying  his work from the sight of the general public, first using savage opprobrium, subsequently silence, and at last retrospective ridicule.   The man was Robert Welch, and the “book” he wrote, initially penned as a private circular and later published as The Politician, targeted none other than President Dwight Eisenhower as an agent of communism.

Then as now, to the half-informed mind of the general reading public, such an allegation was patently absurd.  Eisenhower was venerated as a war hero on the basis of his direction of the Allied war efforts in Europe.  Now admittedly, there are a number of ways to think about the “heroism” of strategic commanders as opposed to direct combatants, but generally, if the war is won, the public will grant the former a triumph and allow them to retire in luxurious obscurity.  “Ike’s” not-so-obscure military retirement consisted of becoming President of Columbia University.  After that, for reasons most people are rather vague about, he was drafted to become the Republican candidate for another kind of presidency, nominated over Sen. Robert Taft of Ohio, the last champion of the “Old Right.”

After that, we usually go to sleep in our American history class until it is time to wake up for Kennedy.  Indeed, this might be a kind of clue that something is amiss in the standard Eisenhower narrative, like the barking dog who falls strangely silent in the dead of night.  How many books, popular and scholarly, are published each year about JFK in comparison to good old “Ike” (even subtracting those chillers which focus entirely on Kennedy’s murder)?  I doubt that a ratio of a hundred to one would be far off base.  Either America’s political ’50s were incredibly boring, or there is a story which, in the view of some, were best left untold….

A few history mavens might even remember that “We…(presumably all Americans)..like Ike”…because (warning, redundancy!) he was “…a man who’s easy to like.”  And furthermore, as the campaign jingle continued with mesmerizing repetition…”Uncle Joe is worried, ’cause we like Ike!”  Of course, if Mr. Welch was anywhere close to on-target in The Politician, “Uncle Joe” a.k.a. Joseph Stalin had little to be worried about, at least in regard to Dwight Eisenhower.

If you are skeptical that “Ike” could have been a communist front man, then I can sympathize with you.  Frankly, I was skeptical myself…indeed, everybody has a right to be skeptical of startling claims.  On the other hand, if you think that it is disrespectful to raise the issue of presidential duplicity at all, then you are on shaky grounds.  You are on especially shaky grounds if you happen to be one of those people who think that our sitting president was sponsored by (today’s post-communist) Russia.

You see, after 2016 everything has changed.  Whether or not Mr. Welch’s claims regarding “Ike” can be vindicated, at the very least we are now in position to read The Politician as an objective historical account.  The Politician is a strong and scholarly witness of an already forgotten time, one that now can, and should, be approached without bias or malice.

Why Robert Welch didn’t “like Ike”

It is an uncomfortable but inescapable truth that once certain things come to one’s attention it is impossible  to “unsee” them.  There is a shift in perception which renders impossible any  return to “normal” however rosy that mythical past might have been.  For example, a beloved though eccentric uncle can seldom be restored to a family’s unguarded intimacy once he comes under suspicion of pederasty, and rightly so.  Likewise, the image of Eisenhower would be shattered, not so much as war hero, but as the epitome of a stable, normal and normalizing politician, were he to be exposed as a willing agent of communism.  Conversely, just as the suspect uncle would insist on due process, even if he knew himself to be guilty, the upholders of the Eisenhower legacy are apt to clamor for iron clad proof of what, according to mainstream historiography, would be considered an outrageous accusation.

Sadly, for the reputation of Eisenhower and our national narrative, the claims of Mr. Welch are well documented, coherent, detailed, and were compiled by a contemporary who knew the American political class of the 1950s like the back of his hand.  If you wish to keep Eisenhower in your pantheon of heroes, read no further.  If, on the other hand, you would like to see the claims against him substantiated, read The Politician.  Here, I can only provide a brief, albeit damning, sampling drawn from Mr. Welch’s researches.  Therein he documents the following egregious policies which were either authorized or enabled by Eisenhower:

*Even in his role as allied commander, the fountainhead of his public esteem, Eisenhower was allegedly (The Politician provides graphic details) complicit in the nefarious Operation Keelhaul, a rendition program which forcibly repatriated ex-Axis agents collaborating with the American forces to their home countries behind the iron curtain.  This eliminated numerous sources of “worry” for “Uncle Joe.”

*Eisenhower was instrumental, as President of Columbia University, in pushing that already left-leaning institution further in the same direction.  He continued to associate with and hire left-wing and communist front faculty, procuring for them teaching/research endowments.  Again, the allegations in The Politician have been strengthened in the light of subsequent events.  Just ten years after the publication of Welch’s Eisenhower exposé, Columbia University erupted as an epicenter of the spreading “new left” movement of the ’60s.

*At the heart of The Politician’s allegations is “the politician” himself.  Prior to Eisenhower’s nomination as a candidate for president on the Republican ticket, all of his political associations had been with the left-wing of the Democrat party.  This is perhaps the most uncanny aspect of Eisenhower’s career, and the one most germane to the establishment of a faux two-party system beginning in the ’50s.  The only fig leaf concealing this duplicity was the absence of any prior political office holding (Democrat or Republican) by the President-to-be.  Again, historical retrospect adds, if not new facts, new irony to the narrative of The Politician.  Our current presidency is commonly considered exceptional, if not downright illegitimate, on grounds that Mr. Trump held no prior office and was not sufficiently initiated into the mysteries of the political class.  In the light of Eisenhower’s prior example this current “exceptionalism” can only be caviled at by those who either 1) adhere to the dangerous proposition that generalship is a political office, or 2) are willing to admit that such rules can only be broken on the left.

*Once inaugurated President Eisenhower continued the policies of FDR’s New Deal.  Indeed, programs and bureaucracies which existed only in embryo in previous administrations were fleshed out, expanded, and duplicated.  The agricultural sector is typical, and just one of the many that Welch enumerates. Amazingly, farm subsidies swelled to half of farmers’ revenue, a fact of which “Ike” was very proud.  Moreover, unlike FDR and the Democrats of the ’30s, these programs were not justified as “emergency” measures, but were considered a permanent and “normal” restructuring of the relation between the public and the private sector, i.e., de facto socialism.   This was enabled by the collapse of any meaningful two-party opposition due to the alliance between left-wing Democrats and the establishment Republicans who backed Eisenhower.  The monolithic bureaucracy, exemplified by the Department of Health, Education, and Welfare, long resisted by the “Old Right” was institutionalized under the faux two-party consensus.  Hence the public sector actually saw a spurt of growth in terms of employees and expenditure in the transition from Truman to Eisenhower.  Consequently, the national debt rose at a rate several times higher than even the Democrats had been willing to incur.

*As shocking as many of the above allegations might seem, the most controversial aspect of the Eisenhower administration was its acceptance and further entrenchment of the post-WWII National Security State system inaugurated under Harry Truman.  This has to be remembered both in conjunction with, and contrast to, the only quote that most people today are likely to associate with Dwight Eisenhower, namely, his “prescient” warning against the dangers of the “military industrial complex.”  This utterance was prescient only in so far as Eisenhower was speaking prior to the Vietnam debacle, after which such forebodings became commonplace.  To the best of my knowledge Mr. Welch doesn’t reference this quote, which dates from a time subsequent to the initial redaction of The Politician, although not prior to later editions.  However, Mr. Welch frequently draws attention to rhetorical gestures made by Eisenhower through which he exculpated himself from responsibility for his suspect policies by seeming to condemn their inevitable negative consequences.   Thus he might condemn “galloping socialism” while rapidly expanding the public sector.  Seen in this light, we might take Ike’s warning against the “military industrial complex” to heart, while doubting the speaker’s innocence of the very thing he condemned.

Does this “Ancient History” even matter?

The short answer…yes, it does.

You might recall a scene in Star Wars where Luke Skywalker asks Yoda about the future.  Yoda answers, “A strange thing the future, always in motion it is…”  In a sense the past is also in motion, shaped by the interpretation given it by the present.  Yet it would be too great a concession to the irrational forces of our times to say that this was a real, and not an apparent, motion.  The past must be uncovered, not invented…although the temptation to invent myth is strong.

There is always a strong mental resistance to meddling with any society’s pantheon, or in more American terms, we might say, tampering with Mt. Rushmore.  In Mr. Welch’s day, The Politician seemed rude to the point of slander, while today it seems impious.  We might say “only” impious, when actually it’s the primal sin.  Mr. Welch mentioned something nobody was supposed to notice.  That’s impiety.

Or is it?  Note another odd thing about the Eisenhower myth, that there is no such myth!  Somehow or other Eisenhower has eluded both the pantheon and the rogues’ gallery of American history.  If the entire history of the Presidency during the ’50s elicits very little commentary, is that because the whole period was boring?  Hardly.  Rather, might not such a presidency be likened to a constant background noise, or better yet a universal solvent…the purpose of which is to set the standard of normality for “the end of history”?

Today we have come out the other end of “the end of history.”  Not that we really know how things will end, or for that matter continue.  All we know is that, for the first time in a long time the carefully scripted design for the future has suffered a setback.  The planners, whoever and whatever they may be (though from a galaxy far away I think they be not!) are in disarray and many things are back on the table which once were considered “settled.”  This may be a good thing, it may be a dangerous thing, and most likely both, but this is where we seem to be at present.

Consequently, under today’s conditions, reading, and taking seriously, the thesis in Mr. Welch’s The Politician, is no longer an act of impiety.  It is an essential measure of the road which we have traversed through the land of manipulated consensus.  Having finished that journey, we can look back at the trail-head, take stock, and get a new perspective.  However, in contrast to the fantasies of the “progressives” no perspective is better just because it is newer…only if it is truer to realities which transcend perspective itself.  Furthermore, to get at those realities one has to crunch a lot of historical data, and there is a lot of data to crunch, most of it rather unpleasant, in The Politician.

Only those with a deep urge for enlightenment need apply to the task.

Posted in Constitutionalism, Culture & Politics, Economics, History, Media, Paleoconservativism, Politics, Uncategorized

Slouching towards the Post-Legal Society (Introduction: “The Beast”)

Posted by nouspraktikon on June 23, 2017

Cultural Marxism:  From show trials to no trials

If property is the proverbial nine points of the law, it is not surprising that Marxism, its frontal attack on property having stalled out (NB: ideology aside, we all like our “stuff”), would have eventually gotten around to launching a second front against law itself.  The total annihilation of law never succeeded with Communism Classic (Stalin’s version), since the Soviet state needed a judicial apparatus to highlight its superiority to “bourgeois law” …not to mention providing a half-way house on the way to the Gulag.  The nightmare of totalitarianism having been quietly put aside, if not entirely exorcised, we have emerged into the glaring, and presumably lawful, light of the Global Village.  Or have we?

Today, the legal “reforms” of the (allegedly) defunct Soviet state are held to be little more than antiquarian curiosities.  However this does not mean that “bourgeois law,” a.k.a. classic legal principles of the Civil and Common law, has triumphed throughout the world.  Rather, the struggle against law has gone underground, or rather above ground and hidden in plain sight.  It dares not risk exposing itself, and therefore avoids clear opposition to the institution which makes civilization possible: Objective Law.  Since it eschews both thesis and antithesis, running for the dense cover of ambiguity, it must be tracked like a beast…by locating and examining its spoor.  We know not what it is, but like W. B. Yeats, we can at least pose the question…

And what rough beast, its hour come round at last

Slouches towards Bethlehem to be born?

But at least we have a track, where the beast has digested large swaths of civilization’s foliage and left us a species-specific excrement where form has been neatly reduced to matter.   If we can track down the spoor-dropper, perhaps it can be slain.  Or perhaps not.  But at least we may come to know who, or what, our adversary is.

Antinomianism

We must pick up the beast’s trail in the foothills of religion, and especially false religion.  The journeyman tracker will think that we have found the beast itself, and with a gleeful cry of “Antinomianism! Antinomianism!” presume that he has it treed, when in fact it is just a spoor, albeit a very significant find.  Actually the beast has moved on to an entirely different part of the forest, since the “true” false religion of today is not a religion at all, but science, or rather scientism.

However there are enough who still believe in ersatz-Christianity to cloud the contemporary scene with a subtle contempt for law.  This is an Oedipal Christianity in which the God of Law is slain by the Son of Love, a doctrine preached by a vague figure named Jesus something or other.  Scientifically this is supposed to be Yeshua ben Yosef, but it really doesn’t matter, since this ersatz-Christianity has been purified of all but universal truths which all good-natured people ought to be able to agree to.  Among these is that law is mean and should be dispensed with in favor of good will.

Yeats was assuming that the reader of his poem knew that he was talking about the “Antichrist.”  However if we get too hung up on the idea of the Antichrist being an ugly, brutal, beast then we are likely to be deceived.  Granted, there are many cults which like to dress up in spandex costumes, going about sporting horns and tridents.  They may even enjoy frightening middle-class people on Halloween and sundry sabbaths with their clownish antics.  But this is all an exercise in misdirection.  Such cultists may be “anti-Christs” but not the final beast who arrives at the end of history. The real threat to our spiritual well being doesn’t come from avowed nihilists who dance around impersonating a cartoon Satan.

The real threat comes when the world-system (what the Bible calls the “Aeon”) proceeds to abolish law in favor of a “higher morality.”  In today’s virtue-signaling pseudo-saints we see a harbinger of the real Antichrist.  The real Antichrist will not look evil or demonic, in fact the real Antichrist will try to resemble Christ to  whatever extent that might be possible.  After all, Christ did transpose law-abiding to a higher abiding in Him.   Call that a “higher morality” if you will.  However the “higher morality” of the Antichrist will not be based on fear of the Creator, but fear of the creatures.  Specifically, it will involve fear of the Human collective, a fear that will initially manifest itself as virtue-signaling, but in fact will rest upon appeasement of human (and ultimately demonic) lusts.

Having broken through the firewall of law (whether we choose to call such formal restraints law, culture, morality, ethics, or whatever) the direct confluence of collective human lusts and fears will create a Democracy of Desire.  Initially such a state of affairs may not seem ugly to behold.  It may even appear to be morally beautiful.

A beautiful beast.

Posted in Anthropology, Appologetics, Charismata, Christianity, Constitutionalism, History, Law, Paleoconservativism, Philosophy, Politics, Uncategorized

Constitutional Contrary or Conundrum? The Imperial Presidency vs. the Unitary Executive

Posted by nouspraktikon on June 4, 2017

Strong President, Weak President

Setting boundaries and limits to power is the essence of politics in a republic.  No Latin word was ever belabored more than imperium in the era prior to Caesar’s crossing of the Rubicon.  Originally it referred to the “sphere of power” which was exercised by a magistrate, great or small, beyond which the office holder infringed upon the rival authority of some other elected official.  With the atrophy of the Republic, it became a personal noun, the Imperator, the root of our term for a King of Kings, an “Emperor.”  The word, thus transformed, described a person whose “sphere of power” had become the whole world, thus annihilating the use to which its root had once been put, namely, to define and limit power.

Last year I predicted that Donald Trump, if elected President, would not become a fascist dictator, an “Emperor” so to speak.  Rather, the tremendous forces arrayed against him would ensure that the office would be brought to heel to a much greater degree than those who fear an Imperial Presidency are wont to imagine.  None the less, even I have been surprised by the extent of the weakness in the executive.  If we have passed any Rubicon, it seems rather that we have passed over from a concealed, to an open, form of oligarchy.

One way of coming to grips with this non-revolution is to admit from the outset that 1) the Imperial Presidency, and 2) the unitary executive, are contraries, not complements.  If we were to talk about official spheres of power with the fastidiousness of the ancient Romans, we might call the first, the President’s “lateral power” and the second the President’s “upright power.”  Imagine that presidential power is a rectangle of fixed area which loses depth whenever it is stretched horizontally.  I know that is a rather strange image to put in the service of a radical hypothesis, but bear with me.

Why the unitary executive is a great Constitutional doctrine

Generally when we (and by “we” I mean libertarians, conservatives, traditionalists, natural rights advocates, strict constructionists, etc.) hear the word “president” modified by the word “strong,” we go into a fit of moral indignation, if not outright hysteria.  Yes, generally heads of state should be weak, lest they turn into tyrants.  However the American presidency is a unique institution, one which the founders of the Republic intended as a safeguard of liberty, just as much as the legislative and judicial branches.  To begin with, the very notion that the American president is a “head of state” is an extra-Constitutional notion, one which arises from the necessity of adjusting American nomenclature to the standards of diplomacy.  Indeed, since the Congress is our premier branch of government, the Speaker of the House has a fairly good claim to be the federal head of state, on the analogy of parliamentary systems.

Leaving aside the symbolic, and rather silly, issue of heads of state, let’s turn to a more fundamental question which impacts on the idea of the unitary executive.  Each of the branches of the Federal government must conduct its internal affairs in hermetic isolation from the others, while being in constant cooperation as corporate bodies to conduct the governance of these United States.  Naturally, each of the branches will attempt to extend its sphere of authority, or what the Romans called its imperium.

Now the matters which are of concern to each branch are well spelled out in the Constitution, but each of the branches always attempts to grow its authority by multiplying those things by which it exerts authority.   Thus the legislative branch attempts to grow its authority by increasing the volume and complexity of legislation, while the judicial branch attempts to grow its authority through the multiplication of rulings, judgements, and injunctions.  On the other hand, it is primarily the executive branch which attempts to grow its authority through the multiplication of offices.  Sad to note, but the three branches may remain evenly balanced while all of them grow in concert, disrupting the larger balance between governmental and non-governmental institutions in civil society.

Whatever cure there might be for the exponential growth of government in the legislative and judicial spheres, the theory of the unitary executive provides both a unique analysis and possible cure for burgeoning bureaucracy.  How so?

Strictly speaking, in the American republic there can never be more than one government officer at a given time.  The name of this officer is the President of the United States!

Oh yes, if you must quibble, there is also a deputy in case of death or incapacitation, the anomalous Veep.  None the less, two officers is a pretty strict limit for the bureaucracy of a large republic.  It reminds one of the twin consuls of Rome, a historical precedent which was never far from the thoughts of the American founders.  In terms of modern political theory we have arrived at genuine “minarchism”…an ungainly word which has been coined to express the most limited of limited governments.

Of course, for true unity of will and purpose, a person can never really trust anyone else to do his job for him.  Hence the most pristine unitary executive would be one in which the President did all the work of the executive branch personally.  We can imagine a President who, dispensing with the service of a secretary, was able to handle all executive correspondence personally.  (NB: The reason we can imagine it is that we live in a world of word processors, computers, and the internet.)  However other things, such as warfare, might be a bit more tricky, unless our chief magistrate had the strength of the Biblical Samson or a modern-day comic super-hero.

So to be on the realistic side, even our pristine unitary executive would, of necessity, need to contract out for a few staffers.  Hopefully these would all be temporary workers.  After all, the chief magistrate himself is a temporary worker, limited to four, or at the maximum, eight years of employment by the American people.

Now before you dismiss this as nothing more than utopian swamp fever, perhaps we should take a look at the way the doctrine of the unitary executive has played out in the history of the Republic.


The historical roots of a weakening unitary executive

Unfortunately, while the imperial Presidency is the most realistic of real-political realities, the concept of a “unitary executive” is little more than a constitutional doctrine which has had to go hat in hand through the corridors of history in search of application.  To put the theory in its clearest form, the unitary executive is the President himself, who is at once both the only employee of the American people, and also the boss of every federal office holder outside of the Congress and the Judiciary.  The theory seemed most fully incarnate in the reigns of those old soldiers who were able to wield their authority with the same imperious might in the Oval Office as on the battlefield.  One thinks of Andrew Jackson and Teddy Roosevelt.

That was then, and now is now, when Mr. Trump’s executive leadership seems more like an exercise in herding cats.  Yet people with even a tad of historical lore under their skulls recognize that The Donald didn’t suddenly fumble the unitary executive to the horror of his fans and the delight of his detractors.  Common wisdom suggests that the unitary executive began to unravel, at the very latest, in the aftermath of the Watergate (1973) scandals.  Legislation which sought to limit the presidential imperium resulted in severe checks on arbitrary presidential power.  However these reforms failed to check arbitrary governmental power in general, or to stave off the multiplication of executive projects, expenditures and offices.  Rather, by setting up checks and balances within the executive branch of the federal government, they added to the executive bureaucracy.  This went so far that the “special prosecutors,” the plum in the cake of the post-Watergate reforms, threatened to become a “Fourth Branch” of trans-Constitutional governance.

Those who can see beyond the historical horizon of Watergate are more likely to see the first unraveling of the unitary executive in the New Deal, and the multiplication of those “alphabet agencies” such as the TVA and the NRA, each of which was endowed with judicial as well as executive authority.  Yet an earlier starting point is the Progressive era, which saw the rise of the intellectual in the federal administration, a creature who was less likely to be constrained by, or even understood by, whatever folksy president inherited the legacy of those hybrid characters like Wilson who both studied and practiced administration.

Loyalty vs. Merit

However these movements were actually just footnotes to the unitary executive’s original fall from grace, which coincided with the rise of a merit-based civil service.  It was the Pendleton Act of 1883 which consolidated the system of permanently employed government service.  After that there was little reason to think that officers would be loyal to a politician whose term of office was likely to be far shorter than the duration of their careers.  Like all sea changes in the policy of the republic, the effect of this reform was not immediately apparent.  After all, presidents in the late 19th century were just expected to be “weak.”  Think Grover Cleveland.

Today, because we read history from public school textbooks, the pre-reform civil service gets a bad press.  Typically it is referred to as the “spoils system,” which conjures up images (not entirely unsubstantiated) of bribery and largess.  However there is another side to this issue.  We should at least try to be “Mugwumps,” that fanciful word for a person who was willing to consider the merits and demerits of a permanent civil service.  In the interests of fairness, I would like to exercise a bit of Mugwumpery and dub the temporary civil servant system the “Loyalty System.”  After all, the politically appointable (and removable) civil servant would at least have no vested interest in sabotaging the chief executive who, unlike him or herself, was directly chosen through the electoral mechanisms of the Republic.

In certain moods our progressives and our conservatives might even agree that disloyalty is a bad thing, and moreover that presidents should at least have the chance to formulate policy on their own turf before being challenged by either the courts or the legislature.  However there is a libertarian remnant which stubbornly insists that a strong president is a bad president, and indeed that a strong administration is nothing more than a step along the primrose path to empire.

However, as illogical as it may seem, the presidency became “weak” before it became imperial.  After WWI, and as the 20th century wore on, there was a need for an emperor to complement the existence of an empire.  However the discipline of the bureaucracy which manifested itself at this time was not due to the charismatic appeal of those politicians who became, willy-nilly, chief magistrates of the republic.  Rather, it was due to the professional association of those who had a vested interest in the expansion of state power, both internationally and domestically.  Presidential orders were obeyed because presidents of whatever party were (to a greater or lesser extent) aligned with the expansion of a robust administrative state.  In 1952 Sen. Taft of Ohio lost the Republican nomination to General Dwight Eisenhower.  Taft was the last mainstream presidential candidate to seriously challenge the operational premise of expanding state power.  Barry Goldwater and Ron Paul would later mount doomed, albeit educational, campaigns dedicated to challenging that same premise.

Then in 2016 Donald Trump was elected after campaigning on many of the same anti-statist planks that animated Taft, Goldwater, Paul and (very inconsistently) Reagan.  Trump had the good sense to mix his contrarian rhetoric with a dash of jingoist appeal.  So far, the bureaucracy is in somewhat less than full-scale revolt.  But only a very naive observer would be surprised that the doctrine of the unitary executive has been utterly abrogated.

The not-so-deep-state and the demise of the unitary executive

Today, when “deep state” has become a household expression, it is easy to substitute James Bond intrigue for fundamental political analysis.  No doubt there is a great deal of skulduggery going on in high places these days, but the unitary executive would have foundered even without any alienation between the Oval Office and the intelligence services.  It is not just the Praetorian Guard who are in revolt, but the clerks…and there are a lot of clerks.  It is not just a cabal, but the system, a system in which managers are independent of elected policy-makers.  In the EU this system appears in its most naked form.  In the US it still has to make end runs around the remains of a Constitutional Republic.

As Richard Weaver said, “Ideas have consequences!”  One of the great, pure, ideas of the 19th century was civil service reform.  However in creating a permanent state independent of politics, civil service reform ensured that all future reforms would be bound inside the parameters of the managerial state.  The owl of Minerva takes flight at night, and only now do we see the luster of those single-minded individuals whom the progressives have been eager to denounce as dictators-in-waiting: the aristocratic Washington, the Jacobin Jefferson, mean old Andy Jackson, the imperious Polk, and (though they were already compromised by the permanent state) later figures such as Lincoln and Teddy Roosevelt.

Finally, we can at last see the wisdom of the Founders in endowing one third of the federal government with a vestige of monarchy.  At the very worst a monarchy, but never, ever, an empire, since a strong individual, unencumbered by bureaucracy and backed by the people, might indeed succeed in ruling the daily affairs of one nation…but then it would be bedtime.


Posted in Constitution, Constitutionalism, culture, Culture & Politics, Economics, History, Law, Paleoconservativism, Politics, Traditionalism, Uncategorized | Tagged: | 1 Comment »

Dammit Man!

Posted by nouspraktikon on May 18, 2017

A Pickup Placard Peccadillo

Driving along a trunk route of my community I was “shocked, shocked I tell you!” (well, kinda) to see an advertising placard on a pickup blazoned with the unique corporate moniker Dammit Man!  Dammit Man?  Not, mind you, a bumper sticker, but the name of the firm!  Well, context is everything, and from the barrels and tools in the back of Dammit Man’s pickup, it was evident that the cussing commercial was advertising the services of a lawn care and cleaner-upper specialist.  Since my town is full of trees, deciduous and otherwise, there is a huge market for lawn waste removal.

Thus it took me less than two seconds to figure out the reference, which is a mark of good ad copy in itself.  Clearly, Dammit Man! was an unexpurgated expletive prefacing the tacit, but easily guessed proposition: “Dammit man, how did you get that lawn cleaned up so well…and in record time!”  Perhaps a local ordinance needs to be passed prohibiting foul language appearing as part of a corporate logo.  I suspect that most of our churches would line up in support of the motion.

However that really misses the point, both of the joke and the phrase itself.  Somehow I suspect that the Dammit Man, whether or not he can dispatch decaying vegetable matter with the celerity implied by the slogan, is a better Christian than most of us.  He has that seldom mentioned but welcome Christian virtue: Hilarity…or in plain language, a sense of humor.  It is the note of the pilgrim who is both seasoned and sincere on the spiritual path.

(And, incidentally, since I can’t resist a snarky soliloquy, this virtue was notably absent from the recently trounced politician who bore the name!)

On a deeper level, “dammit” is not an obscenity but an imprecation, and since we really don’t want to send our lawn-care specialists to the infernal regions, at least if they have done a good job, the expression in context has to be taken as an effusion of rare praise, not condemnation.  Therefore, since the vendor is praising himself using a curse word, the ultimate intent is humorous, even if the humor doesn’t exclude the likely “damn” goodness of the lawn care specialist.

But of course, real damnation is no laughing matter.  By “damnation” I don’t just mean the final, definitive judgement of sin before the throne of God, I mean condemnation in the broadest sense.  Every atheist dog-and-pony show purports to demonstrate that “damnation” was an innovation foisted on mankind by the God of Abraham, or rather by His amanuensis.

However damnation, in the broad sense of condemnation, is not something which we would have to wait on revelation before we learned of it.  True, when we consider “last things” there are some elements which natural reason could never have guessed, such as the immortality of the resurrected body, either for good or for ill.  That knowledge only comes from revelation, and admittedly it complicates things.  But that is not what we are concerned with at present, however much atheists would like to “put God in the dock.”

Rather, we are talking about what happens when human beings put each other “in the dock,” or, to paraphrase what Voltaire said of God, “If damnation didn’t exist we would have to invent it.”

My contention is that we did.


When Man Damns

Indeed, damnation, rather than being foisted upon the human race by bad religions (Abrahamic or otherwise; indeed, there are Buddhist hells, and nasty ones at that), is an intrinsic category of the human mind.  Just as Adam Smith told us that humanity has an intrinsic propensity to “truck and barter,” likewise there is an “intrinsic propensity to damn” which has been shared by all human beings since the time of that Adam who was the progenitor of Mr. Smith and the rest of us.  For though the human race has no brimstone (or at least it didn’t prior to Hiroshima and Nagasaki) it has the faculty of condemnation in abundance.

Consider that we get our English word “damnation” from Latin.  Now in secular history the most revealing instance of total condemnation is the custom of damnatio memoriae, which was exercised from time to time during Rome’s late republic and empire.  After an unpopular politician or emperor had been removed, either by natural causes or assassination, the Senate, by official decree, would order the erasure of all inscriptions mentioning the tyrant, and the removal of all his statues from public view.  Historians dispute how often and how effectively this rite was observed, but the intention was clear.  The victorious party in the Senate wanted to consign all memory of the condemned emperor to oblivion.

This desire to condemn and erase the past, or at least that part of the past connected with unpopular personalities, was by no means a uniquely Roman obsession.  Rather, contrary to the intentions of the Senate, the abundance of historical records during the Classical period, combined with a human delight in monstrosity, has assured an unwonted immortality to such “damned” creatures as Caligula and Nero.  Among more ancient civilizations, the local equivalent of the damnatio memoriae was more effective.  Until Carter’s discoveries in 1922, Egyptologists had no more than an inkling of King Tut’s existence, since the boy monarch and his heretical Sun-worshiping dynasty had been rubbed out of the historical record by pious defenders of Egypt’s polytheistic faith.  This remarkably effective act of collective forgetfulness endured for three thousand years.  But as the saying goes, truth will out.

However we must go beyond the early civilizations to primordial times if we want to find the origins of damnation.  Was it not Cain who first issued a decree of oblivion against his brother?  He did not just murder, but buried Abel, for we know that “his blood cried out from the ground.”  The mind of Cain, full of wrath, was inconsolable at the offense of Abel.  And what was that offense?  Surely that his brother had been judged acceptable in the eyes of God, while he had been found wanting.  Cain had the choice of repentance…or, or what?  The only way to restore reality to its pre-judgement status was to erase the very idea of Abel as an alternative to Cain.  It wasn’t enough to just terminate Abel’s existence (murder), it was necessary to deny that Abel had ever existed (burial).

If there had been no outside observer, it would have been the perfect crime.  However the Holy Spirit was recording the incident for our benefit.  Cain did have a brother, and though he failed as his “keeper,” neither did he succeed as his “thrower-awayer.”  In this first case, and ever since, it has been hard to make the damnatio memoriae stick.

If there is a God, history is for keeps.

The Rise of the Orwellian Memory Hole

As God-centered world views have been nudged aside by various forms of Humanism, especially the most consistent form of humanism, Marxism, the damnatio memoriae has experienced a modern renaissance.  Instead of statues of Caesar being removed from the Roman forum, the images of Stalin’s rivals were airbrushed out of the picture.  [The original post showed photographs of the same image before and after the airbrushing.]  This process was frequently repeated until only the reigning god, “Uncle Joe” himself, remained.

Marxism is not only the last stage of humanism, but it brings to moral completion the views of time that are implicit in all forms of secularism.  According to this world-view, only the visible world is real, and all unseen worlds are either imaginary or manipulable fictions.  This means that the present always has domination over a past which has disappeared from sight and only exists in archival or artifact form.  Thus the past is worse than non-existent: it is plastic in the hands of the present…liquid, and ripe for liquidation.  To the primordial animus which the human mind harbors towards its rivals, past, present, and future…Marxism has added a theory of history which grants moral superiority to whatever faction has most recently emerged from the struggles of time.  Add to this a penchant for organization and propaganda, and one gets a veritable “science of damnation.”

As Marxism has become the hidden, but hegemonic, ideology of America’s academic and journalistic institutions, this penchant for damning the past, rather than trying to understand it, has ascended to power.  Today, in the world of Photoshop, Stalinist airbrushing seems crude and cartoonish.  But what can be done with the more substantial archives of the past, those made of bronze and stone?  Sadly, we discover that they are scheduled for removal in cities across the nation.

Like the busts of Caesar, the generals of America’s public squares are disappearing, and not just those who fought for the South during the disturbances of 1861-65.  One wonders how long Andrew Jackson, who defended New Orleans against the British, will be left unmolested.  Certainly, Jackson was a flawed man, but none the less a man whom it is important to grapple with in order to understand vast chunks of American history.  He is now high on the list of those scheduled for the damnatio memoriae.

And who shall replace General Jackson?  Dr. King perhaps?  Whoever it might be, it will not be someone who will be able to escape the gnawing criticism of the future.  New values and new demographics will come to the fore, and then the politically correct heroes of today will themselves fall victim to a future damnatio memoriae.  I believe it was Chateaubriand who observed, “Like Father-Time, the revolution devours its own children.”  And what does this devouring consist of but a desire to see the past as totally evil, and the present as justified by its condemnation of the past?  However this is ultimately a suicide pact and a self-imprecation, since time flows onward, and in the Marxist view this flow is not morally neutral but a process of continuous judgement and re-evaluation.

History, thus construed, becomes a pyramid of skulls with a small band of executioners at the top.  From time to time there is a new work shift and the past shift’s executioners become the next band of victims, hence providing more skulls for the pyramid, ever growing in height and volume.  This is as good an illustration as any of the human form of damnation.  It is a damnation which, if not eternal, is at least infinite.  For time has no end.

Except that, in the Christian view, it does end, and the infinite damnation that humanity wished upon itself is eclipsed by eternity.

It makes me sad.  And I wish I had the innocent guile of that bold lawn-cleaner to say,  “Dammit man!  Stop your damn man-damning man!  Just look, listen….and repent!”


Posted in Anthropology, Christianity, culture, Esoterism, History, Philosophy, Politics, Theology, Uncategorized | Leave a Comment »

Captain Obvious calling: What if Myths are just (you guessed it!) myths?

Posted by nouspraktikon on May 3, 2017

From unsophisticated lies to sophisticated rationalizations

I have spent more of my life than I would care to admit trying to unravel the mysteries of myths and mythologies.  The dominant theories among anthropologists, psychologists and other scholars reflect the prevailing assumption that myth holds the key to some deep primitive wisdom with which modern people have lost touch.  Thus for Levi-Strauss, myth reveals the primitive meta-logic of the mind, which is far more socially cohesive than the analytical categories of common sense logic.  Carl Jung goes further in seeing the primal spirituality of all human beings stored in a collective unconscious which from time to time is expressed in mythical terms.

The assumption is that there are truths too deep to be expressed in plain expository language.  But what if myth, far from expressing truths, is actually giving vent to falsehoods?  This is the viewpoint of Rene Girard, who sees in the incoherence of myth a similarity to rationalization.  When the main character of a mythical narrative suddenly turns into a god or a totemic animal, Girard suggests that the hero was the subject of envy and fell victim to murder most foul.  To disguise the crime the survivors in society changed the narrative and promoted the hero from the status of victim to that of god.  Those who notice some similarity to Christ’s passion will not be surprised that Girard is a Christian and was influenced by the gospel narrative in framing his social theory.

One need not concur with all the details of Girard’s anthropology to see the wisdom of applying a forensic approach to myth.  If myths are primitive rationalizations of the great crimes committed in antiquity, this would go a long way to explaining the convoluted and contradictory logic which seems characteristic of all primitive societies.  As Mark Twain once said, “I don’t tell lies because it’s too much work to keep them all straight in my memory.”

From Fall to Falsehood

However the human race seems, on the whole, to have taken liberties with the truth at the price of developing a vast and often incoherent body of narratives which we call mythology.  To say that myths are lies and nothing more than lies, would seem to put the work of generations of anthropologists and folklorists to naught.  Yet this might be a true key to understanding the enigma of the human past.  All myths might be variations on one Big Lie which has been told generation after generation, growing in detail and complexity as each narrator attempted to put more distance between his contemporaries and some Primal Crime of deep antiquity.

In this context, it might be useful to note that the Bible, whatever “genre” we might assign to it, most certainly is not myth.  Even the most superficial acquaintance with scripture shows that its style and method are completely different from those of all the mythological systems which have been passed down through the traditions of the nations.  Indeed, scripture and myth are not just different but opposite, and comparing them is much like looking through a telescope alternately from opposite ends.  Thus, while myths are human attempts at making a theology, the Bible was given us by God as a book of anthropology.  In understanding ourselves, we understand our relationship to God, or lack thereof.

Unlike myths, the Bible reveals to us the Great Crime which broke our fellowship with God.  It tells the truth in straight, unambiguous terms, in terms which would be recognized by any logician, whether or not such a logician accepted the moral of the story.  In contrast, mythology, the Bible’s primitive rival, is forever losing the logical thread of its narrative, much like dreams, which are simply the nocturnal counterpart of the mythological madness told in broad daylight.  When myth is on the witness stand the story is always changing, backtracking, and the names are changed to protect the guilty.

Not so with scripture, which radiates a clarity similar to the last pages in a classical “whodunit.”  Of course, this makes it unpopular with the criminal class, a class which (in regard to the Original Crime) includes the entirety of the human race.  Conversely this explains the popularity of myth which is, in the absence of other virtues…at least highly creative.

Posted in Anthropology, Art, Christian Education, Christianity, culture, Fiction, History, Paleoconservativism, Theology, Uncategorized | Tagged: | Leave a Comment »

From Old-papers to Lie-papers, this is what the media calls “progress”

Posted by nouspraktikon on April 7, 2017

Newspapers never used to contain “news”…but now the situation has been corrected

Decades ago when I heard an old monk exclaim, “These things you call newspapers…they contain nothing new!”  it was more of a self-evident truth than a revelation.  Aristotle, writing 2400 years ago, observed that if you read one book by Thucydides you didn’t need to read another history book for the rest of your life.  A lot of history has been written since then, but the principle still holds, for while the specifics of time and place may bear recording, the human comedy (or perchance tragedy) recapitulates the same old themes in every generation.  As “Rick” (portrayed by Humphrey Bogart) asked Sam the piano player to croon…

It’s still the same old story,

A fight for love and glory,

A case of do or die

Of course if you really want to know the specifics of what was happening in North Africa c. 1942, Thucydides isn’t of much help.  That’s not what Aristotle or the old monk meant.  For “as time goes by” the concretes of time, place, and technology alter, but the human passions which animate the historical drama remain constant.

So I became rather casual in my attitude towards the media, deeming the daily old as soon as it was printed, and even before it redeemed its paper-value as a wrapper for the remains of maritime edibles.  Looked at in that way, there was something quaint about the Old-paper, as it regurgitated the same facts about different people while the generations cycled through their time on Earth.  To epitomize, the Weather section was paradigmatic of all the other sections.  Sun and storm might iterate through the seasons, but one never expected an entirely new form of weather to emerge.

This is not to say that novelty was entirely absent.  There was technological innovation and discovery of remote locations.  However these were like gardens which were expected to grow over time.  If there had been no innovations or discoveries, that would have been a far greater novelty.  Moreover, since it was just the same expansive human nature which motivated the discovery process in accordance with human needs (or curiosity), even the greatest innovations lined up with the same doctrine of human nature.  Yet most importantly, even the greatest changes were reported on as if they were a part of a natural order; they were not…what shall I say…they were not “promulgated.”

However I must now confess that either I was wrong in my assumption that “no news is new news,” or something has changed.  I suspect the latter.  At some point the media moved from reportage to promulgation.  One suspects that deep in the heart of the media complex, people no longer recognize a distinction between journalism and fiction.  Selective reportage, outright suppression of facts, story-crafting, and agenda-fitting have replaced investigation.  The archetypal media man or woman no longer aspires to uncover a great story so much as to become the Great Novelist, rewriting reality according to the inspiration of their genius.  Today the newspaper has at last become a novelty.  Indeed, it has become “poetry” according to the Greek root of our word, i.e., total innovation.

In Journalism and elsewhere, Post-Modernism is past Marxism

How has this odd situation come about?  We are all aware of that confluence of factors which has changed “the news” in the past several decades, from the rise of social media to corporate concentration of the older journalistic outlets.  None the less, I am inclined to count what men and women have in their heads as the salient factor, in accordance with the principle “ideas have consequences.”  Journalists don’t just bloom like lilies of the valley, and before they are recruited into the media complex they must graduate from the academic complex.

If it ever was one, the academic complex is no longer a free marketplace of ideas.  Rather, certain ideologies have gained an ironclad ascendancy on American campuses.  The most general and erudite (were it not elitist to admit) of these ideologies is so-called “post-modernism,” which claims that human minds can have no contact with anything remotely resembling objective reality.  Rather, particular humans spin out their narratives, much like a caterpillar weaving its cocoon around its body.

Taken at face value, this sounds like a formula for toleration and harmony, such as was claimed on behalf of the ancient skeptics and cynics.  Those ancient “know-nothings” professed not to care about social opinion, to the point where whether a person wore clothes or not was a matter of indifference.  Whatever the merits of such skeptical liberty, it is a far cry from the atmosphere which surrounds post-modernism.  As anyone who has contact with modern academics is aware, hypersensitivity and condemnation are the qualities most apparent on university campuses today.

In reality, the hippy-like indifference on the surface of post-modernist thought masks a deeper level of ideological doctrine.  This doctrine is invariably Marxism of one or another ilk, but most especially the cultural Marxism associated with the Frankfurt school or the ideas of Antonio Gramsci.  The idea is not just to create novelty, but to create novelty which is subversive of the present state of affairs.  A new idea or narrative which created greater harmony in society, though superficially compliant with postmodernist thought, is not sufficient.  The new narrative must be destructive of the old narratives.

This is the ideological reason why today’s media not only embrace new perspectives on human nature, but why these new perspectives are designed to create conflict and chaos.  To be sure there are other, simpler, reasons.  The most evident is the standing insight of yellow journalism that disasters sell newspapers, and that while natural disasters can’t be conjured up to order, wars and riots can be.  So today conflict, both domestic and global, is not just reported on, but spawned by the media itself.

The idea that human beings can create their own world ex nihilo is, of course, blasphemous.  But this is an attitude which goes back, behind even the Marxists, at least to Kant and the way modernity defined “culture” in opposition to nature.  Ultimately it goes back to Adam, or whoever that human was who first knowingly spit into God’s eye.  Unfortunately, today’s corporate journalists are not the sort from whom one expects genuine, Godly repentance.  Rather, and unlike wise King Canute, they are apt to stand stubbornly on the shore of their own subjective fancy, until engulfed by an objective tsunami far beyond their reckoning.


Posted in Culture & Politics, History, Media, Philosophy, Theology, Uncategorized | Tagged: , | Leave a Comment »