Pico Ultraorientalis



In Defense of “Man”

Posted by nouspraktikon on July 15, 2017

Not Even Wrong

Suddenly.

Not suddenly as you or I measure time, but suddenly according to the stately cadences of historical events, we have lost, if not yet our species, at least, and ominously, our name for it.  At some point in the not very distant past, “Man” vanished…not extinguished as an  organism, but as an object of consciousness.  For where there is no name there can be no consciousness, where there is no consciousness there can be no science.  Today there is no longer a science called Anthropology worthy of its name, for the name has been banished.   I don’t mean the entertaining science of bones and basket weaving and many other shining objects which is offered in college curricula as “Anthropology.”  I mean Anthropology in the most specific of species-centered meanings, inquiry into that simple question….”What is…what is…[bleep!].”   It is a question which can scarcely be asked today, let alone answered.

This masking of “Man” strikes me as an important development which deserves an extended and serious discussion.  To that end, some ground rules are necessary, concerning which I have some good news and some bad news.  Here they are, both at once:  Sex will not be mentioned in the course of this article.  I have no interest whether the reader be sex-crazed or celibate, male or female or anywhere on the spectrum in-between.  I am only interested in whether you think this Anthropological murder mystery is worthy of your time and consideration.

If you concur, then the omission of sex and his/her ugly sibling “gender” is good news indeed, because these things are monumental and, I would argue, intentional, distractions from the difficulties involved in Philosophical Anthropology.  Those bad news bears,  non-adults who think sexuality is the central, nay exclusive, issue in life, can adjourn to their favorite safe space, the Reading Room on Gender, where they can reinforce their own bias among those vast collections of literature which are supplemented daily by our subsidized scholars and their media mimes.

Now to be sure, there are other rabbit paths leading away from the essential inquiry; it’s just that sex and gender are the most obvious, if not the most obnoxious, and hence need to be eliminated first.  However, those other anti-Anthropological rabbit paths, though less celebrated, become increasingly subtle as the core of the problem is approached.  In any subject, the task is hard enough when we have been force-fed the wrong answers…the real difficulties start when we realize that we started off on the wrong foot by asking the wrong questions.  Today, when we encounter the fundamental question of Philosophical Anthropology, to paraphrase the incidentally sexy but essentially humane Erin Brockovich, “…all we have is two wrong feet and damn ugly shoes.”  We don’t know “bleep!”…and the absence of the word doesn’t help.

If we wish to restore that lost science, it will prove necessary to go back and wrap our brains around that simple word “Man,” which was once the standard English term for the class of all human beings, much like its French equivalent “l’homme,” etc.  Man has long since disappeared from scholarly, correct and polite language, which means pretty much everywhere, since in casual idiom, if we discount “Man oh man!” and similar oddities, the universalizing nomenclature of Philosophical Anthropology is worse than useless.  After all, you can tell a good joke about Poles, or rabbis, or priests, or homosexuals, or women, and yes, even about “men” qua the male gender, but it’s hard (short of aliens or the envious algorithms of The Matrix) to envision a “Man” joke.  However, while the comedians won’t notice, there might be a few instances where, for the health of civilization, the ability to have a word for the human species could come in handy.  From this we can derive another important consideration: once “Man” has been abolished, it is unlikely to be missed by the broad masses.  The only people who are likely to be bothered are a few specialists in what it means to be a unique species, and these specialists are generally regarded as an over-serious, isolated and boring bunch.  Likewise, if the word “epidemic” and all synonyms for “epidemic” were outlawed, the only people likely to get in a panic would be epidemiologists.  Everyone else would get along quite splendidly…at least for a while.

To be sure, the abolition of “Man” and the Abolition of Man, as per the essay by C.S. Lewis, are not identical.  The latter concerns the weakening of the species, the former the loss of its name.  Indeed, the distinction between signs and things signified is another treasure which must be jealously guarded against the ravages of post-modernity, which is trying to slouch its way back towards a magical worldview.  Be that as it may, we can still surmise that in the defense of something it might prove essential to be able to speak about it.

On the other hand, we have to make especially sure we don’t get lured down another popular rabbit path, a highly respectable path which none the less leads away from the Anthropological core: the path of language.  For example, we could easily lump this abolition of “Man” (the word) together with similar language “correction.”  Pointing out the absurdity of these corrections is the strategy of many conservatives, such as the British philosopher Sir Roger Scruton, who talks about the way that gender neutrality reforms have “violated the natural cadences of the English language.”  On an esthetic level, there may still be some residual irritation at “people” (or similar substitutes) in lieu of “Man”.  Yet, while this is good Edmund Burke-vintage common sense, it heads off in a trivial and logic-mincing direction, of the kind favored by British analytical philosophers and American word-pundits in the William Safire tradition.  It expresses a futile, rearguard hope that inane reforms, like the substitution of his and hers by “hez,” can be reversed by a return to convention, or even mutual rationality.  The Postmodernist hordes are not likely to be stemmed by a grammar policeman, policewoman, or even policeperson holding up a gloved hand, shouting “Stop!”  It’s not that the “reforms” can’t be exposed as illogical and unappealing, it’s that they are just the tip of the spear carried by acolytes in a far deeper struggle.

Whatever the prospects of the war over language, I maintain it is the war against Man (as a concept) which is primary, a battle with ideological motives rooted in the hoary past.  Call it a “conspiracy” if you will, keeping in mind that conspiracy is just popular philosophy prosecuted by cadres of minimally educated but highly motivated minions.  The generals in this conspiracy knew that they could not launch a frontal assault on Man (a.k.a. the human race), so they focused their attention on “Man,” at first as a concept and then as a word.  The history of this war is better measured by centuries than by decades and has taken many a convoluted turn.  Hence my belief that contemporary Feminism is, at best, a secondary effect.  It is the Amazon battalion thrown into the breach of the citadel after the groundwork had been patiently laid and the initial battlefield secured.  That crucial battlefield was anthropology, and not what one is liable to think of as the field of anthropology, but its philosophical cousin, that key science of all sciences, namely, the “Philosophy of…[bleep!]…”

A good “Man” is wrong to find

One can admit something exists and is important without idolizing it.  There was all too much idolization of the human race after the Renaissance and building up to the Enlightenment, a period bookended by Pico della Mirandola’s On the Dignity of [Bleep!] and Alexander Pope’s Essay on [Bleep!], tomes whose style and economy have rendered them, perhaps mercifully, unreadable today.  In those days, whenever errant scholars ventured too far from the Pauline/Augustinian double anthropology of fall and redemption, it spelled trouble.  However, personal repentance generally put a limit to the damage which could be inflicted before the toxic juice of self-worship became endemic to society.  Mirandola befriended and was converted by Savonarola, that misunderstood Catholic puritan, while at least Pope never became the Pope, nor were his verses rendered into binding encyclicals.  Savonarola taught the early humanists the secret of Christian Anthropology, that Man is both sacred and bad.  For his tuition, and other causes, he was burned at the stake.

The last child and virtual apotheosis (that is, one “made into God”) of the early modern period was Voltaire, whose hatred of religion was legendary.  None the less, even Voltaire had too much common sense to think that his animus towards Christianity could be transmuted into a new and living faith.  He noted that “It is easy enough to start a new religion, all you have to do is get yourself crucified and then rise from the dead!”  In recent years, the late René Girard documented Voltaire’s insight with numerous case-studies, illustrating how most human religions originate in scapegoating, death, and subsequent apotheosis.  However the wily Voltaire could see where all this was heading, and limited his disciples to the “cultivation of their gardens,” i.e., the enjoyment of a quiet and restrained sensuality.  We might call this soft-core Humanism, or the humanism of the self.  This early modern Man-ism, which today is probably the most popular (albeit unconscious) religion on the planet, is little more than a recrudescence of old Epicurus, whose famous doctrine Paul once debated on the field of Athenian Mars.  At worst the virtues of this philosophy, such as conviviality, apolitical repose, refined aesthetics etc., are disguised vices, vices centered on feelings.  Think of the stereotypical Country Club Republican of today’s America.  Such people are pathetic, but not in any superficial sense of the word, since the purpose of their life is “pathic”…that is, to have feelings, high quality feelings.

Hard-core Humanism was a novelty of Voltaire’s rival, J. J. Rousseau.  In contrast to the soft doctrine, here the object of action is the ideal of Man, not the feeling-satisfaction of individual human beings.   It was Rousseau who managed to transmute the Enlightenment’s carping animus against Christianity into something resembling a true religion.  As the founder of this new religion, which has variously been termed Modernism, Humanism, Socialism and much else, Rousseau should have found himself subject to the pitiless Law of the Scapegoat.  However he eluded martyrdom, and not just because he died a natural death nineteen years prior to the outbreak of the revolution he had inspired.  Rousseau’s Man differed in important ways from both Christian and Renaissance conceptions, which were predicated on either a personal God, or at any rate, a hierarchy of beings of which the human race was but one link in the chain of existence.  Although initially disguised by Deistic code-words, the new religion lifted up Man as the Head of the Cosmos.  Since this Man was a collective, it was not expedient that any individual anti-Christ need suffer the Law of the Scapegoat.  If there were to be any suffering, it would only be in accord with the tyrant Caligula’s wish for the Roman people, “If only they all had but one neck!”  In principle, the head which lifts itself too high gets chopped off.  Caligula himself  proved  no exception to the rule.

At all events, by the 2nd or 3rd year of the Human Revolution (c. 1793 AD) modern technology had outstripped antiquity, democratizing death and allowing Caligula’s dream to come true.  The guillotine enabled the disciples of Rousseau to liquidate the old political class en masse, and then, in a predictable turn of events, those disciples themselves mounted the scaffold, suffering a kind of mechanical crucifixion to the god whom they had lifted up, Man.  It was a collective crucifixion to a collective god, for this “Man” was not the same as in the soft Humanism of Voltaire, which was just a category designating a collection of individuals.  Rather, this post-Rousseau “Man” was, if not quite a concrete organism, at least cohesive enough to have a single will, a doctrine as lethal as it was democratic.

The carnage of the Revolutionary/Napoleonic period was not repeated in Europe until 1914 and thereafter, after which great quantities of men and women again began to be killed as a consequence of political and military action.  Here  we would like to inquire whether this carnage (lit. carnal death) was in some sense related to the death (or life) of an abstraction.  Is there a relation between the death of humans and the death of “Man” as a concept and a word, and if so, is that relation positive or negative?  The example of the French Revolution would seem to caution us against a laudatory Humanism, on the suspicion that the higher the ideal of “Man” is lifted up, the more human individuals are likely to be subjected to political violence.

At this point in the argument, however, such a conclusion would be premature.  The period between the exile of Napoleon and the shooting of Archduke Ferdinand in Bosnia, which saw relative calm in European politics, was conversely the period which witnessed, for good or ill, a wholesale revolution in the popular concept of “Man” under the impact of Evolution, Marxism, and Psycho-analysis.  However none of these epicenters of scientific upheaval was directly concerned with Anthropology, at least Philosophical Anthropology; rather they were centered on the cognate disciplines of biology, economics, and psychology.

More to the point, none of these revolutionaries set out to solve the problem, “What is… [bleep!]…”   However others took up that now forbidden question, and we should try to pick up their tracks from where they left off in the tumult of 19th century thought.

Philosophical Anthropology: The Conspiracy Thickens

Today if you mention “Illuminism” it is likely to conjure up secret societies, occultism and political skulduggery, critical investigation into which is no doubt important and proper.  However in the literary salons of Europe and America during the 1840s and 1850s Illuminism had a second, though in all probability related, meaning.  It referred to the then-novel research which today’s theologians refer to as the “Higher Criticism.”  If you know about, say, the “Jesus Seminar” then you pretty much know what Illuminism a.k.a. “Higher Criticism” was, except that the contemporary Seminar is pretty much an isolated rehashing of themes which were treated with greater plausibility and seriousness 170 years before.  Those earlier 19th century critics of religion were advancing along the front of a broad intellectual movement which was in the early stages of transiting from spiritualism to materialism.  The cynosure of the movement was Germany in the years following, and in reaction to, the death of the philosopher G.W.F. Hegel.  To simplify a very complex way of thinking, many people of that time had accepted Pantheism, the idea that the universe and God are the same thing.  But most people are not very quick on the uptake, and are willing to sign on to a belief system before they grasp all of its correlative implications.

Thus, many a happy Pantheist, circa 1840 AD, was surprised and saddened to learn that their system no longer permitted them to believe in the personal divinity of Jesus, whom they had hoped to retain as a spiritual hedge in spite of their infidel inclinations.  They should have figured this out from reading Hegel, but it took the shock treatment administered by some young, radical, German intellectuals of the time (a.k.a. the Illuminists, Higher Critics etc.) to rub the noses of these au courant ladies and gentlemen in the compost of atheism.  After a halfhearted embrace of Pantheist ambiguity, some among the elite classes of Europe were again courting hard-core, Rousseau-vintage, Humanism, very much along the lines of the original French Revolution of 1789, albeit the European political revolutions of the 40s didn’t amount to much.  This time, humanism broke out with more scientific rigor and less heartfelt enthusiasm: “Man” was made the vehicle of those hopes and dreams which had previously been invested in God.  Moreover, the unprecedented technological progress of the times was conducive to putting faith in human works.

Yet those works, splendid as they might be, left open the question of the nature of their creators.  What was the essence of Man?  Or as we would say today, “What is the essence of….[bleep!]?”  Amazing though it might seem in retrospect, some people of that era actually took the time and pains to ask the Anthropological question.  The man who best serves as archetype of those questioners, actually proposing and discarding several solutions over the course of his life, was the German philosopher Ludwig Feuerbach (1804-1872).  One thing that can be said of Feuerbach, even if we dismiss him as a serial wrong-guesser who justly earned posthumous obscurity, was his persistent and scrupulous engagement with the Anthropological question.  His best remembered quote, “You are what you eat!”, might ornament a nutritionist more gloriously than a philosopher.  Yet we must consider that, as a thinker, he was an anvil and not a hammer, pounded left and right by forces which were not just making Modernity but shattering the classical mirror of Man (better known to us as “bleep!”).  Feuerbach’s lifetime bracketed an epochal turn in human self-definition, a turn which Feuerbach didn’t initiate so much as chronicle.

Therefore, meditate on the chronological sketch below and notice how the turn from Anthropology to anti-Anthropology transpired in the space of a specific, species-haunted, generation.  I know this narrative will be easy to dismiss as a curmudgeon’s rant on “the origins of the left,” but only if you visualize the broad movement behind, and independent of, individual intentions will you grasp its Anthropological significance.  In spooky confirmation of a simultaneous and universal (or at least pan-Western) turn of thought, the history of early Positivism could be adduced as a development in synchronicity with Idealism, though in that case the decapitation of Man was conducted by French, and allegedly “conservative,” social scientists from Auguste Comte to Émile Durkheim.  But I rather prefer the bold and brooding history of Anglo-German radicalism.

1804 death of Immanuel Kant, birth of L. Feuerbach

1806 Hegel publishes his Phenomenology, consciousness posited as the motive force in the history of the world, subjective (individual) consciousness conditioned in a “dialectical” relationship to objective (collective) consciousness.

1818-19 Lectures on the History of Philosophy, S. T. Coleridge introduces German Idealism to the English reading public, slowly Idealism will replace the reigning Scottish “common sense” philosophy in the English speaking world.

1831 death of Hegel

1835 The Life of Jesus, by D. F. Strauss

1841 The Essence of Christianity by Feuerbach

1845 Karl Marx, Theses on Feuerbach, critical of objectivity and lack of political engagement in speculative Anthropology

1847-48 Revolutions in France and central Europe

1848 The Communist Manifesto

1851 The Great Exhibition in London, popular vindication of applied technology over philosophical and scientific theory

1854 The Essence of Christianity translated by George Eliot

1854-56 Crimean War (only major European war between 1815-1914)  Nightingale, progressive transfer of humane care from family and church to state

1859 Charles Darwin, The Origin of Species, natural selection adduced as the motive force in natural history

1860 Essays and Reviews, English theologians embrace the methods of Higher Criticism

1861-65 American Civil War, first modern “total” war

1867 Marx, Capital vol. 1 published

1871 Charles Darwin, The Descent of Man

1872 Death of Feuerbach

Note that at the outset Man was the All-In-All, but at the end of the period not even the child of a monkey; rather, the scion of some anonymous animal.

In The Essence of Christianity Feuerbach attempted to equate God with “good.”  In his view all the things which were posited of a Supreme Being were actually virtuous attributes of the human species-being.  Justice, mercy, love, fidelity, etc., were human characteristics, which had been mistakenly projected on to an alienated figment of the collective imagination and deified.  However, and here’s the rub, the human individual had no more ultimate reality than God.  Feuerbach’s Man was not men, or men and women, or even people, but the species as a collective.   Individuals were mortal but the species was immortal.  Man was God, Man was good, and Man would live forever.  At the time it seemed like a grand faith, a devotion to something tangible which might give meaning to the limited and fragile life of individuals.

Feuerbach’s intention was to make a smooth transition from the crypto-Pantheism of Hegel to a less infatuated, more earthy, Humanism.  Yet his critics were more likely to see this continuity with idealism as contamination by unrealistic nonsense.  As thinkers more cunning and sanguinary than Feuerbach were quick to point out, this alleged Human species-being never managed to will anything concrete and unanimous; rather, all real history has been the history of antagonistic groups engaged in fratricidal strife.  For the critics, the ultimate meaning of history was far better illustrated by victorious parties dancing on the graves of the defeated than by a universally inclusive chorus singing Beethoven’s Ode to Joy.  According to Karl Marx the antagonistic parties were economic classes, and to some extent nations.  Today we would add genders, races, religions, and even sexual orientations.  Under fire from its radical critics, Human species-being quickly melted into the solvent of class analysis.

Small wonder that Marx happily discarded Feuerbach’s anthropology for the naturalism of Darwin, at one point seeking (and being refused) permission to dedicate Capital to the British naturalist.  Darwin’s system was founded on the assumption of conflict and competition, not the deduction of human from divine virtues.  Feuerbach continued to revise his system in the direction of increasingly consistent materialism, but was no longer in the forefront of a generation which had jumped from philosophical speculation to natural science, now that the latter was backed up by the prestige of  rapidly developing technology.

More significantly, the capital which Darwin did not endorse was the capital M in Man.  In classical anthropology Man had been one of the primordial kinds, as in Spirit, Man, Animal, and Mineral.  Naturalists from Aristotle to Buffon had recognized that, qua organism, the human body was akin to other mammals, and especially to apes and monkeys.  However in a consistently despiritualized science, the one human species was no longer set apart from the myriad of other animals, but rather fell under the same biological and ethological constraints as any other organism.  This reduction may have deeply bothered Darwin personally, but as a scientist he never really posed the Anthropological question the same way that Feuerbach had done; rather he was resigned to viewing homo sapiens as a single object within the purview of natural science.  In spite of the title, after The Descent of Man, Man ceased to exist as a problem for natural science.  Or more precisely, from a Darwinian point of view, Man, as a unique aspect of the world, had never existed to begin with.

From Man to “Man”

We began by hinting that the loss of “Man” was a harbinger of the death of our own species.  After some clarification we can now understand that the situation is rather worse than we had initially feared, in that, conceptually, Man was killed off sometime in the middle of the 19th century, while “Man” (the word) actually survived the concept by more than a hundred years.  To maintain clarity, we must remember that there are actually three deaths.  First, the death of the concept; second, the death of the word; and third, and yet to happen, the actual species extinction of homo sapiens.  That the third death is yet to happen should not imply that it necessarily will; it is only a hypothesis.  None the less, the three deaths are cognitively related.  In particular, the death of Man (the concept) at the hands of Darwinism is strongly associated with the putative mortality of the species.  If Man is subject to species extinction, as are all organic taxa according to the laws of natural selection, then Man cannot be considered a primary aspect of the world.  As an analogy, consider the concept of “states of matter,” which are generally accepted as uniform, or at least ubiquitous, aspects of nature.  If, say, all liquids could disappear from the cosmos, it would put the schema of “states of matter” in serious doubt.  Something of that nature is what has happened with Man, due to the anti-Anthropological turn circa 1860.

Now, would it be too wicked for me to suggest that while Man is not a “species” in the same sense that Felis domestica is a species, none the less Man bears an uncanny resemblance to the cat, that enigmatic creature of the proverbial nine lives?  Not only did the word “Man” persist far longer than one might have expected, but Anthropology entered a period of great fruition after the death of Darwin.  Here I’m not referring primarily to what people ordinarily think of as “Anthropology,” the post-Darwinian people-within-nature paradigm which covers everything from bones to basket weaving.  Be wary: just as in politics the nomenclature for everything gets twisted around to its opposite, and we are now forced to call socialists “liberals,” so in similar fashion those post-Darwinian scholars who no longer believe in a human essence are liable to call themselves “Anthropologists.”  In fact, they are mostly anti-Anthropologists who just want to study the secondary attributes and accidental properties associated with human beings.  Granted, there is nothing intrinsically wrong with that, and on the whole these so-called Anthropologists are not a bad lot, being no more consistently anti-Anthropological than the other professionals who have inherited scattered fragments among the human sciences.  If the so-called Anthropologists have any besetting sins, those would be 1) they stole the name away from genuine Anthropology, 2) some sub-schools were virulently anti-cognitive, for example the ethnologist Franz Boas, who never saw a theory that he didn’t want to grind down into a powder of facts, 3) others, notably the Structuralists, were hyper-cognitive, and sought to gin up a Theory of Everything based on some attribute (usually kinship or language) of human thought or behavior.

The anti-Anthropologists who called themselves “Anthropologists” loved “Man” (the word).  After all, it was their schtick, and made a nifty title for textbooks, even textbooks written by sophisticated Darwinians and Marxists who knew that human species-being had gone out of fashion with Feuerbach.  In the meantime, anything on two legs with an opposable thumb would do, and it was all great fun until Feminism put the kibosh on that particular branding.  None the less, so-called  “Anthropology” took the ban on “Man” in stride, since their usage of the term was based on a consistent nominalism, if not on a conscious memory of the anti-Anthropological roots of modern natural science.  Fortunately, due to the exclusion of classical languages, undergraduates could still take “Anthro” and not worry their heads that banned “Man” had never meant just  andro…indeed, that it had meant much more than both andro and gyno put together.

Yet, I wanted to mention the 20th century miracle of Anthropology, not so-called “Anthropology” but genuine Philosophical Anthropology, as it flourished after, and in spite of, the anti-Anthropological turn of the previous generation.  If I thought that Man were a mere species and not an attribute of Created Being, my inclination would be to classify it somewhere within the family Leporidae, as a mammal with a capacity for making unexpected intellectual leaps, and multiplying thoughts faster than other species can reproduce their genes.  To that end, what great broods have been spawned, not just among the anti-Anthropologists, which is only to be expected, but even among genuine Anthropologists during the 20th and even 21st centuries!

Now remember, when I heap praise on the battered remnants of genuine, philosophical, Anthropology, I’m only lauding them for asking the right question, namely: “What is…[bleep!]”  And by now you understand what “bleep!” is and that a Philosophical Anthropologist is one who would know and say that “bleep!”=Man, and that possibly we should even come out and say “Man” when we mean Man.  I am not saying that many, or even any, of these Anthropologists have answered the question correctly, although I think there is an answer, and that some have made a closer approach to the correct solution than others.  Naturally I have my own views, but I would consider anyone a legitimate Anthropologist who asked the question aright.

There are schools of Philosophical Anthropology of every description.  Some are religious, some are frankly atheistic, but even the most starkly atheistic Anthropologists demur from post-Darwinian naturalism in positing something unique and essential about the human race.  In that sense, all Anthropologists, from atheists to Christians, are tendering a kind of “minority report” against the consensus view of modern science and society.  An atheistic, but genuine, Anthropologist might posit that the human race has a unique responsibility to conserve the cosmos and bring it to its best potential.  Countering this, the consensus view would maintain that such an assertion was errant nonsense, an arbitrary projection of human values into the unthinking and unthinkable void.

In a brief treatment, it is impossible to do more than allude to all the speculative “minority reports” which have been filed by Philosophical Anthropologists against the hegemony of post-Darwinian naturalism.  No doubt many of these speculations have been wrong-headed, but they have at least kept a window open to world-views outside the standard narrative.  If I had to pick a representative of the type it would be Max Scheler (German, d. 1928).  Feuerbach’s anthropology began with materialistic idealism and sloped inexorably down to idealistic materialism; Scheler’s thought, by contrast, described a parabola, which at its height sought the divine in Man.  Personality, both Divine and Human, was arguably Scheler’s main concern; however his reluctance to deal with the limits imposed by a temporal creation, as per the Judeo-Christian scriptures, subordinated individuality to the vague infinity of deep time, a dilemma similar to that encountered by the ancient Gnostics.  Abandoning his initial, and intentionally Christian, viewpoint, Scheler made the alarming discovery that, in precluding a personal God, the amoral instinctual urges of the Cosmos were far stronger than any principle of spiritual form or sentiment.  The intellectual public in Germany and beyond, repelled by such otiose metaphysics, embraced existentialism, a doctrine which gave up on the reality of anything but individuals.  Anthropology once again retreated to the shadows.

In retrospect, Feuerbach and Scheler seem like tragic figures who lifted up Man, in one or another guise, as a god, only to see their systems crushed down by more consistently nihilistic doctrines.  However it is doubtful whether their contemporaries saw the loss of Anthropological hegemony as something to be lamented.  Rather, they were relieved to be unburdened of Man, just as they had greeted the earlier, and logically prior, “death of God” with satisfaction.

The return of Man, and the return of “Man”…which, both or neither?

The operational assumption is that people can get along perfectly well without a conception of their own species occupying a special place in the system of the world.  Underlying this assumption is the more fundamental axiom that the natural science narrative is our default outlook on the world.  After all, it's "natural," is it not?

However the "minority report" of Philosophical Anthropology raises the specter of a completely different world, a world in which the unique bearers of the divine image have been persuaded that they are but one of a myriad of animal species.  By this account, the conceptual framework of natural science within which the image bearers were circumscribed was not so much a "discovery" as the imputation of a belief-system.  From this perspective, it is naturalism, not the classical Man-centered cosmology, which is fabulous.  To get the masses of humanity to believe such a deflating fable in the course of a few centuries has been a superbly effective triumph of propaganda.  Although we have some hints as to who has disseminated this propaganda, the question of in whose interest it was disseminated remains enigmatic.

Within the English-speaking world, the banner of the old classical Anthropology (Christian or secular) was "Man."  The banner was not furled up until long after the cause was lost.  Yet the banner itself was essential, so essential that the high command of anti-Anthropology decided to send in the Amazonian battalion to haul it down under the pretext of the gender wars.  Lost in the confusion of that particular skirmish was the deep import of having a proper name for that key nexus of Creation through which the Divine, ideally, was to communicate its dominion over the visible world.  "People" is more than just an innocent substitute for "Man", since, being a plural, it serves as a pretext for importing the entire philosophy of nominalism into the human sciences.  Nominalism views entities (you and me and the cat and the carpet) as capable of being grouped into any category which happens to be convenient.  Whose convenience?

It can be safely inferred that this is a view well suited to those who want to abolish the boundaries between species.  Perhaps now the reader can see the relevance of all the preceding esoteric Anthropology, for looming on the event horizon of our world are a thousand crises brought about by the relation of the human to the non-human.  Indeed, we are conjuring up new categories of non-humans day by day.  AI and aliens, robots and Chimeras, not to mention all those entities of the natural and spiritual world who are ancient in human lore.  I eagerly await the rebirth of the "dinosaur" from its amber-encased DNA.  Or will it be a dragon?  Names make a difference.

None the less, we proceed without caution, for the night-watch has been relieved of its duties as the evening of human history encroaches.  Isolated voices cry out, “There may be a problem here!” and anxiety is ubiquitous, but few are willing to “get real.”  This is not an accident.  The “real” tools, nay, the “real” weapons with which we might have fought were long ago taken away and beaten, not into plowshares, but into the bars of zoological confinement for what remains of the dignity of Man.  The “real” tools were realistic in a properly philosophical sense, exalting created kinds as the unalterable building blocks from which God created our world.  Such was Man.  Hence the necessity of having a personal name for the species.

Will Man come again?  I think so, but more on the basis of faith than calculation.  In the meantime others look towards a rapidly accelerating future, and begin to realize that “Nature” is hardly a better idol than secular Man, that the sense of “nature-in-itself” is an illusory effect of what psychologists call normalcy bias.  None the less, something is approaching, we know not what.  Intellectuals call it “the end of history” while technologists speak of “the singularity.”  Most just ignore it, but it will come nonetheless.

Suddenly.


Posted in Anthropology, Art, Christianity, Culture & Politics, Esoterism, Evolution, History, Paleoconservativism, Philosophy, Politics, Traditionalism, Uncategorized | Leave a Comment »

From Ike with love: The Age of Deception (1952-2016)

Posted by nouspraktikon on July 5, 2017

Nothing has changed except our point of view, but that counts for something

It is easy to think, as the left continues to overplay its cards, that something significant has occurred, and that our trajectory towards an Orwellian future has accelerated.  On the contrary, the Trump victory has triggered a new gestalt in people's minds.  By 2017 fairly average people can see what only hardened conspiracy theorists were willing to hypothesize as late as 2015.  Whether or not we are at the beginning of a new era, for good or ill, is a matter of conjecture.  Indisputably, we have taken our leave of a period in political history which will prompt nostalgia in all but truth-seekers.  While it was hardly an era of good feelings, it was held up by its laureates as a time of consensus, or at least bi-partisanship.

Rather, it seems better to call our recent past the Age of Deception.  The Great Deception consisted in draping a de facto one party system in the vestments of a two party system.  If you had said this in 1965, or 1975, or 1980, or 1994, or 2001, or perhaps even 2008…most people would have called you an extremist.

However somebody, somebody who thought extremism in the cause of truth was no vice, had already pointed this out as early as 1958.  Sure enough, his opponents, and they were legion, labeled this man a slanderer, effectively burying his work from the sight of the general public, first using savage opprobrium, subsequently silence, and at last retrospective ridicule.  The man was Robert Welch, and the "book" he wrote, initially penned as a private circular and later published as The Politician, targeted none other than President Dwight Eisenhower as an agent of communism.

Then as now, to the half-informed mind of the general reading public, such an allegation was patently absurd.  Eisenhower was venerated as a war hero on the basis of his direction of the Allied war efforts in Europe.  Now admittedly, there are a number of ways to think about the "heroism" of strategic commanders as opposed to direct combatants, but generally, if the war is won, the public will grant the former a triumph and allow them to retire in luxurious obscurity.  "Ike's" not-so-obscure military retirement consisted of becoming President of Columbia University.  After that, for reasons most people are rather vague about, he was drafted to become the Republican candidate for another kind of presidency, nominated over Sen. Robert Taft of Ohio, the last champion of the "Old Right."

After that, we usually go to sleep in our American history class until it is time to wake up for Kennedy.  Indeed, this might be a kind of clue that something is amiss in the standard Eisenhower narrative, like the barking dog who falls strangely silent in the dead of night.  How many books, popular and scholarly, are published each year about JFK in comparison to good old “Ike” (even subtracting those chillers which focus entirely on Kennedy’s murder)?  I doubt that a ratio of a hundred to one would be far off base.  Either America’s political ’50s were incredibly boring, or there is a story which, in the view of some, were best left untold….

A few history mavens might even remember that “We…(presumably all Americans)..like Ike”…because (warning, redundancy!) he was “…a man who’s easy to like.”  And furthermore, as the campaign jingle continued with mesmerizing repetition…”Uncle Joe is worried, ’cause we like Ike!”  Of course, if Mr. Welch was anywhere close to on-target in The Politician, “Uncle Joe” a.k.a. Joseph Stalin had little to be worried about, at least in regard to Dwight Eisenhower.

If you are skeptical that “Ike” could have been a communist front man, then I can sympathize with you.  Frankly, I was skeptical myself…indeed, everybody has a right to be skeptical of startling claims.  On the other hand, if you think that it is disrespectful to raise the issue of presidential duplicity at all, then you are on shaky grounds.  You are on especially shaky grounds if you happen to be one of those people who think that our sitting president was sponsored by (today’s post-communist) Russia.

You see, after 2016 everything has changed.  Whether or not Mr. Welch's claims regarding "Ike" can be vindicated, at the very least we are now in a position to read The Politician as an objective historical account.  The Politician is a strong and scholarly witness to an already forgotten time, one that now can, and should, be approached without bias or malice.

Why Robert Welch didn’t “like Ike”

It is an uncomfortable but inescapable truth that once certain things come to one's attention it is impossible to "unsee" them.  There is a shift in perception which renders impossible any return to "normal" however rosy that mythical past might have been.  For example, a beloved though eccentric uncle can seldom be restored to a family's unguarded intimacy once he comes under suspicion of pederasty, and rightly so.  Likewise, the image of Eisenhower would be shattered, not so much as war hero, but as the epitome of a stable, normal and normalizing politician, were he to be exposed as a willing agent of communism.  Conversely, just as the suspect uncle would insist on due process, even if he knew himself to be guilty, the upholders of the Eisenhower legacy are apt to clamor for iron clad proof of what, according to mainstream historiography, would be considered an outrageous accusation.

Sadly, for the reputation of Eisenhower and our national narrative, the claims of Mr. Welch are well documented, coherent, detailed, and were compiled by a contemporary who knew the American political class of the 1950s like the back of his hand.  If you wish to keep Eisenhower in your pantheon of heroes, read no further.  If, on the other hand, you would like to see the claims against him substantiated, read The Politician.  Here, I can only provide a brief, albeit damning, sampling drawn from Mr. Welch’s researches.  Therein he documents the following egregious policies which were either authorized or enabled by Eisenhower:

*Even in his role as allied commander, the fountainhead of his public esteem, Eisenhower was allegedly (The Politician provides graphic details) complicit in the nefarious Operation Keelhaul, a rendition program which forcibly repatriated ex-Axis agents collaborating with the American forces to their home countries behind the iron curtain.  This eliminated numerous sources of “worry” for “Uncle Joe.”

*Eisenhower was instrumental, as President of Columbia University, in pushing that already left-leaning institution further in the same direction.  He continued to associate with and hire left-wing and communist front faculty, procuring for them teaching/research endowments.  Again, the allegations in The Politician have been strengthened in the light of subsequent events.  Just ten years after the publication of Welch's Eisenhower exposé, Columbia University erupted as an epicenter of the spreading "new left" movement of the '60s.

*At the heart of The Politician's allegations is "the politician" himself.  Prior to Eisenhower's nomination as a candidate for president on the Republican ticket, all of his political associations had been with the left-wing of the Democrat party.  This is perhaps the most uncanny aspect of Eisenhower's career, and the one most germane to the establishment of a faux two-party system beginning in the '50s.  The only fig leaf concealing this duplicity was the absence of any prior political office holding (Democrat or Republican) by the President-to-be.  Again, historical retrospect adds, if not new facts, new irony to the narrative of The Politician.  Our current presidency is commonly considered exceptional, if not downright illegitimate, on grounds that Mr. Trump held no prior office and was not sufficiently initiated into the mysteries of the political class.  In the light of Eisenhower's prior example this current "exceptionalism" can only be caviled at by those who either 1) adhere to the dangerous proposition that generalship is a political office, or 2) are willing to admit that such rules can only be broken on the left.

*Once inaugurated, President Eisenhower continued the policies of FDR's New Deal.  Indeed, programs and bureaucracies which existed only in embryo in previous administrations were fleshed out, expanded, and duplicated.  The agricultural sector is typical, and just one of the many that Welch enumerates.  Amazingly, farm subsidies swelled to half of farmers' revenue, a fact of which "Ike" was very proud.  Moreover, unlike FDR and the Democrats of the '30s, these programs were not justified as "emergency" measures, but were considered a permanent and "normal" restructuring of the relation between the public and the private sector, i.e., de facto socialism.  This was enabled by the collapse of any meaningful two-party opposition due to the alliance between left-wing Democrats and the establishment Republicans who backed Eisenhower.  The monolithic bureaucracy, exemplified by the Department of Health, Education, and Welfare, long resisted by the "Old Right," was institutionalized under the faux two-party consensus.  Hence the public sector actually saw a spurt of growth in terms of employees and expenditure in the transition from Truman to Eisenhower.  Consequently, the national debt rose at a rate several times higher than even the Democrats had been willing to incur.

*As shocking as many of the above allegations might seem, the most controversial aspect of the Eisenhower administration was its acceptance and further entrenchment of the post-WWII National Security State system inaugurated under Harry Truman.  This has to be remembered both in conjunction with, and contrast to, the only quote that most people today are likely to associate with Dwight Eisenhower, namely, his "prescient" warning against the dangers of the "military industrial complex."  This utterance was prescient only in so far as Eisenhower was speaking prior to the Vietnam debacle, after which such forebodings became commonplace.  To the best of my knowledge Mr. Welch doesn't reference this quote, which dates from a time subsequent to the initial redaction of The Politician, though prior to its later editions.  However, Mr. Welch frequently draws attention to rhetorical gestures made by Eisenhower through which he exculpated himself from responsibility for his suspect policies by seeming to condemn their inevitable negative consequences.  Thus he might condemn "galloping socialism" while rapidly expanding the public sector.  Seen in this light, we might take Ike's warning against the "military industrial complex" to heart, while doubting the speaker's innocence of the very thing he condemned.

Does this “Ancient History” even matter?

The short answer…yes, it does.

You might recall a scene in Star Wars where Luke Skywalker asks Yoda about the future.  Yoda answers, "A strange thing the future, always in motion it is…"  In a sense the past is also in motion, shaped by the interpretation given it by the present.  Yet it would be too great a concession to the irrational forces of our times to say that this was a real, and not an apparent, motion.  The past must be uncovered, not invented…although the temptation to invent myth is strong.

There is always a strong mental resistance to meddling with any society’s pantheon, or in more American terms, we might say, tampering with Mt. Rushmore.  In Mr. Welch’s day, The Politician seemed rude to the point of slander, while today it seems impious.  We might say “only” impious, when actually it’s the primal sin.  Mr. Welch mentioned something nobody was supposed to notice.  That’s impiety.

Or is it?  Note another odd thing about the Eisenhower myth, that there is no such myth!  Somehow or other Eisenhower has eluded both the pantheon and the rogues' gallery of American history.  If the entire history of the Presidency during the '50s elicits very little commentary, is that because the whole period was boring?  Hardly.  Rather, might not such a presidency be likened to a constant background noise, or better yet a universal solvent…the purpose of which is to set the standard of normality for "the end of history"?

Today we have come out the other end of “the end of history.”  Not that we really know how things will end, or for that matter continue.  All we know is that, for the first time in a long time the carefully scripted design for the future has suffered a setback.  The planners, whoever and whatever they may be (though from a galaxy far away I think they be not!) are in disarray and many things are back on the table which once were considered “settled.”  This may be a good thing, it may be a dangerous thing, and most likely both, but this is where we seem to be at present.

Consequently, under today’s conditions, reading, and taking seriously, the thesis in Mr. Welch’s The Politician, is no longer an act of impiety.  It is an essential measure of the road which we have traversed through the land of manipulated consensus.  Having finished that journey, we can look back at the trail-head, take stock, and get a new perspective.  However, in contrast to the fantasies of the “progressives” no perspective is better just because it is newer…only if it is truer to realities which transcend perspective itself.  Furthermore, to get at those realities one has to crunch a lot of historical data, and there is a lot of data to crunch, most of it rather unpleasant, in The Politician.

Only those with a deep urge for enlightenment need apply to the task.


Posted in Constitutionalism, Culture & Politics, Economics, History, Media, Paleoconservativism, Politics, Uncategorized | Tagged: | Leave a Comment »

Constitutional Contrary or Conundrum? The Imperial Presidency vs. the Unitary Executive

Posted by nouspraktikon on June 4, 2017

Strong President, Weak President

Setting boundaries and limits to power is the essence of politics in a republic.  No Latin word was ever belabored more than imperium in the era prior to Caesar's crossing of the Rubicon.  Originally it referred to the "sphere of power" which was exercised by a magistrate, great or small, beyond which the office holder infringed upon the rival authority of some other elected official.  With the atrophy of the Republic, it became a personal noun, the Imperator, the root of our term for a King of Kings, an "Emperor."  The word, thus transformed, described a person whose "sphere of power" had become the whole world, thus annihilating the use to which its root had once been put, namely, to define and limit power.

Last year I predicted that Donald Trump, if elected President, would not become a fascist dictator, an “Emperor” so to speak.  Rather, the tremendous forces arrayed against him would ensure that the office would be brought to heel to a much greater degree than those who fear an Imperial Presidency are wont to imagine.  None the less, even I have been surprised by the extent of the weakness in the executive.  If we have passed any Rubicon, it seems rather that we have passed over from a concealed, to an open, form of oligarchy.

One way of coming to grips with this non-revolution is to admit from the outset that 1) the Imperial Presidency, and 2) the unitary executive, are contraries, not complements.  If we were to talk about official spheres of power with the fastidiousness of the ancient Romans, we might call the first, the President’s “lateral power” and the second the President’s “upright power.”  Imagine that presidential power is a rectangle of fixed area which loses depth whenever it is stretched horizontally.  I know that is a rather strange image to put in the service of a radical hypothesis, but bear with me.

Why the unitary executive is a great Constitutional doctrine

Generally when we (and by "we" I mean libertarians, conservatives, traditionalists, natural rights advocates, strict constructionists, etc.) hear the word "president" modified by the word "strong" we go into a fit of moral indignation, if not outright hysteria.  Yes, generally heads of state should be weak, lest they turn into tyrants.  However the American presidency is a unique institution, one which the founders of the Republic intended as a safeguard of liberty, just as much as the legislative and judicial branches.  To begin with, the very notion that the American president is a "head of state" is an extra-Constitutional one, arising from the necessity of adjusting American nomenclature to the standards of diplomacy.  Indeed, since the Congress is our premier branch of government, the Speaker of the House has a fairly good claim to be the federal head of state, on the analogy of parliamentary systems.

Leaving aside the symbolic, and rather silly, issue of heads of state, let's turn to a more fundamental question which impacts on the idea of the unitary executive.  Each of the branches of the Federal government must conduct its internal affairs in hermetic isolation from the others, while being in constant cooperation as corporate bodies to conduct the governance of these United States.  Naturally, each of the branches will attempt to extend its sphere of authority, or what the Romans called their imperium.

Now the matters which are of concern to each branch are well spelled out in the Constitution, but each of the branches always attempts to grow its authority by multiplying those things by which it exerts authority.   Thus the legislative branch attempts to grow its authority by increasing the volume and complexity of legislation, while the judicial branch attempts to grow its authority through the multiplication of rulings, judgements, and injunctions.  On the other hand, it is primarily the executive branch which attempts to grow its authority through the multiplication of offices.  Sad to note, but the three branches may remain evenly balanced while all of them grow in concert, disrupting the larger balance between governmental and non-governmental institutions in civil society.

Whatever cure there might be for the exponential growth of government in the legislative and judicial spheres, the theory of the unitary executive provides both a unique analysis and possible cure for burgeoning bureaucracy.  How so?

Strictly speaking, in the American republic there can never be more than one government officer at a given time.  The name of this officer is the President of the United States!

Oh yes, if you must quibble, there is also a deputy in case of death or incapacitation, the anomalous Veep.  None the less, two officers is a pretty strict limit for the bureaucracy of a large republic.  It reminds one of the twin consuls of Rome, a historical precedent which was never far from the thoughts of the American founders.  In terms of modern political theory we have arrived at genuine “minarchism”…an ungainly word which has been coined to express the most limited of limited governments.

Of course, for true unity of will and purpose, a person can never really trust anyone else to do their own job.  Hence the most pristine unitary executive would be one in which the President did all the work of the executive branch personally.  We can imagine a President who, dispensing with the service of a secretary, was able to handle all executive correspondence personally.  (NB: The reason we can imagine it is that we live in a world of word processors, computers, and the internet.)  However other things, such as warfare, might be a bit more tricky, unless our chief magistrate had the strength of the Biblical Samson or a modern-day comic super-hero.

So to be on the realistic side, even our pristine unitary executive would, of necessity, need to contract out for a few staffers.  Hopefully these would all be temporary workers.  After all, the chief magistrate himself is a temporary worker, limited to four, or at the maximum, eight years of employment by the American people.

Now before you dismiss this as nothing more than utopian swamp fever, perhaps we should take a look at the way the doctrine of the unitary executive has played out in the history of the Republic.


The historical roots of a weakening unitary executive

Unfortunately, while the imperial Presidency is the most realistic of real-political realities, the concept of a "unitary executive" is little more than a constitutional doctrine which has had to go hat in hand through the corridors of history in search of application.  To put the theory in its clearest form, the unitary executive is the President himself, who is at once both the only employee of the American people, and also the boss of every federal office holder outside of the Congress and the Judiciary.  The theory seemed most incarnate in the reign of those generals who were able to wield their authority with the same imperious might in the Oval Office as on the battlefield.  One thinks of Andrew Jackson and Teddy Roosevelt.

That was then, and now is now, when Mr. Trump's executive leadership seems more like an exercise in herding cats.  Yet people with even a tad of historical lore under their skulls recognize that The Donald didn't suddenly fumble the unitary executive to the horror of his fans and the delight of his detractors.  Common wisdom suggests that the unitary executive began to unravel, at the very latest, in the aftermath of the Watergate (1973) scandals.  Legislation which sought to limit the presidential imperium resulted in severe checks on arbitrary presidential power.  However these reforms failed to check arbitrary governmental power in general, or to stave off the multiplication of executive projects, expenditures and offices.  Rather, by setting up checks and balances within the executive branch of the federal government, they added to the executive bureaucracy.  This went to the extent that the "special prosecutors" who were the plum in the cake of the post-Watergate reforms threatened to become a "Fourth Branch" of trans-Constitutional governance.

Those who can see beyond the historical horizon of Watergate are more likely to see the first unraveling of the unitary executive in the New Deal, and the multiplication of those "alphabet agencies" such as the ICC, TVA, and NRA, each of which was endowed with judicial as well as executive authority.  Yet an earlier starting point is the Progressive era, which saw the rise of the intellectual in the federal administration, a creature who was less likely to be constrained by, or even understood by, whatever folksy president inherited the legacy of those hybrid characters like Wilson who both studied and practiced administration.

Loyalty vs. Merit

However these movements were actually just footnotes to the unitary executive's original fall from grace, which coincided with the rise of a merit based civil service.  It was the Pendleton Act of 1883 which consolidated the system of permanently employed government service.  After that there was little reason to think that officers would be loyal to a politician whose term of office was likely to be far shorter than the duration of their career.  Like all sea changes in the policy of the republic, the effect of this reform was not immediately apparent.  After all, presidents in the late 19th century were just expected to be "weak."  Think Grover Cleveland.

Today, because we read history from public school textbooks, the pre-reform civil service gets a bad press.  Typically it is referred to as the "spoils system," which conjures up images (not entirely unsubstantiated) of bribery and largess.  However there is another side to this issue.  We should at least try to be "Mugwumps," that fanciful word for a person who was willing to consider the merits and demerits of a permanent civil service.  In the interests of fairness, I would like to exercise a bit of Mugwumpery and dub the temporary civil servant system the "Loyalty System."  After all, the politically appointable (and removable) civil servant would at least have no vested interest in sabotaging the chief executive who, unlike him or herself, was directly chosen through the electoral mechanisms of the Republic.

In certain moods our progressives and our conservatives might even agree that disloyalty is a bad thing and moreover presidents should at least have the chance to formulate policy on their own turf before being challenged by either the courts or the legislature.  However there is a libertarian remnant which stubbornly insists that a strong president is a bad president, and indeed that a strong administration is nothing more than a step along the primrose path to empire.

However, as illogical as it may seem, the presidency became "weak" before it became imperial.  After WWI and as the 20th century wore on, there was a need for an emperor to complement the existence of an empire.  However the discipline of the bureaucracy which manifested itself at this time was not due to the charismatic appeal of those politicians who became, willy-nilly, chief magistrates of the republic.  Rather, it was due to the professional association of those who had a vested interest in the expansion of state power, both internationally and domestically.  Presidential orders were obeyed because presidents of whatever party were (to a greater or lesser extent) aligned with the expansion of a robust administrative state.  In 1952 Sen. Taft of Ohio lost the Republican nomination to General Dwight Eisenhower.  Taft was the last mainstream presidential candidate to seriously challenge the operational premise of expanding state power.  Barry Goldwater and Ron Paul would later mount doomed, albeit educational, campaigns dedicated to challenging that same premise.

Then in 2016 Donald Trump was elected after campaigning on many of the same anti-statist planks that animated Taft, Goldwater, Paul and (very inconsistently) Reagan.  Trump had the good sense to mix his contrarian rhetoric with a dash of jingoist appeal.  So far, the bureaucracy is in somewhat less than full scale revolt.  But only a very naive observer would be surprised that the doctrine of the unitary executive has been utterly abrogated.

The not-so-deep-state and the demise of the unitary executive

Today when "deep state" has become a household expression, it is easy to substitute James Bond intrigue for fundamental political analysis.  No doubt there is a great deal of skulduggery going on in high places these days, but the unitary executive would have foundered even without any alienation between the Oval Office and the intelligence services.  It is not just the Praetorian Guard who are in revolt, but the clerks…and there are a lot of clerks.  It is not just a cabal, but the system, a system in which managers are independent of elected policy-makers.  In the EU this system appears in its most naked form.  In the US it still has to make end runs around the remains of a Constitutional Republic.

As Richard Weaver said, “Ideas have consequences!”  One of the great, pure, ideas of the 19th century was civil service reform.  However in creating a permanent state independent of politics, civil service reform ensured that all future reforms would be bound inside the parameters of the managerial state.  The owl of Minerva takes flight at night, and only now do we see the luster of those single-minded individuals whom the progressives have been eager to denounce as dictators-in-waiting.  The aristocratic Washington, the Jacobin Jefferson, mean old Andy Jackson, the imperious Polk and (though they were already compromised by the permanent state) later figures such as Lincoln and Teddy Roosevelt.

Finally, we can at last see the wisdom of the Founders in endowing one third of the federal government with a vestige of monarchy.  At very worst a monarchy, but never, ever, an empire, since a strong individual, unencumbered by bureaucracy and backed by the people, might indeed succeed in ruling the daily affairs of one nation…but then it would be bedtime.

 

Posted in Constitution, Constitutionalism, culture, Culture & Politics, Economics, History, Law, Paleoconservativism, Politics, Traditionalism, Uncategorized | Tagged: | 1 Comment »

The Trump fizzle….the R3volution that wasn’t (and the one that was)

Posted by nouspraktikon on May 27, 2017

Trump’s non-revolution as an educational device

As of this writing pretty much everything which was promised in the salad days of Mr. Trump’s MAGA tours has been either hung up in pending legislation or put on the back burner.  Nobody, at least nobody who wasn’t born yesterday, really expected Ms. Clinton to go to jail or a physical wall to be built on the Mexican border, even assuming such things were desirable.  However few anticipated that  the President would morph into a double of his worst enemy, a.k.a., Sen. John McCain, which is pretty much what happened on foreign policy.  On the domestic front we now hear that refugee resettlement, something which is very different from voluntary immigration, can be expected to reach record highs.  The politics of blow-back, “invade the world and invite the world” is still as much the order of the day as it would be under any hypothetical Democrat administration.

I still retain a basic gut-level sympathy for Mr. Trump and his family, and a chivalrous disdain for the libelous attacks of the old-line media on their reputation.  None the less, I have lost any sense that a Trump revolution is afoot, unless that means a rebellion of Trump’s subordinates against their titular boss.  In place of a revolution, the most that conservatives and libertarians are likely to glean from this (possibly short-lived) administration is what, in patronizing terms, we refer to as a “learning experience.”  Yes, we are getting “a-lot-a-learning” taught to the tune of something far worse than a hickory stick…a broken heart.

On a deeper level, anyone who thought that a “Revolution” was possible at this stage of American history was deluded.  However, if we spell it R3volution, on the understanding that this is a counter-counter-revolution (and if you see where the “3” comes in you are very clever!) then perhaps we have the basis, if not for hope, at least for a coherent narrative.

Put into schematic form, that would be:

1. The original (libertarian) revolution against state absolutism (a.k.a. the “Spirit of ’76”).

2. The counter-revolution of the administrative state under the pretext of various ideologies (egalitarianism, socialism, scientism).

3. The various attempts at counter-counter-revolution launched against the New Order of the administrative/managerial state, usually labeled with that awkward term “conservatism.”

Basically, we are stuck at item 2, since we live in a historical situation where the administrative state has entrenched itself to the extent that most attempts at push-back fail before they become a credible threat to the New Order.  Mr. Trump’s revolution manqué is only the most recent and glaring example of this process.  Probably the best description of this situation was given in a series of essays written by an ex-editor of the Saturday Evening Post around the mid-point of the 20th century.

The Revolution Was

The man was Garet Garrett, a curmudgeon of the anti-New Deal resistance.  His thesis was that conservatives and moderates didn’t need to fear the advent of socialist revolution…since it had already occurred.  Of course by “revolution” he meant the authoritarian counter-revolution, not the American revolution, let alone any R3volution to restore the ideals of ’76.

Furthermore, Garrett underscored the permanence and near-irreversible nature of the administrative state by articulating three reinforcing spheres in which the state made itself dominant and absolute: the welfare state, the system of international managed trade, and the system of collective security.  All three were solidly in place by the end of the Korean War, and each was covered by an installment in his trilogy of essays: The Revolution Was (1938), Ex America (1951), and Rise of Empire (1952). (Note: the whole trilogy was packaged as The People’s Pottage, 1992.)

Subsequent to Mr. Garrett’s analysis, but implicit in it, we see that so-called conservatives cavil at the welfare state, but accept it as the price of empire, while so-called liberals cavil at the empire, but accept it as the price of the welfare state.  Thus the people, through their representatives in Congress, were never likely to overrule the autonomy of the state bureaucrats, since the policy outcomes were always amenable to one or the other section of the politically active classes.

A New (albeit false) Hope

Garet Garrett pointed out that at no specific point was the system of Constitutional government abrogated.  Rather, the Constitution was simply ignored and a substitute system of norms evolved to face changing contingencies.  Mr. Garrett dubbed this “Revolution within the form,” or in more exact nomenclature “counter-revolution within the form.”

The remedy therefore became opaque, since it was not a question of legislating a new constitution, but of reasserting the salient provisions of the original, but neglected, law.  At the time of FDR the judicial branch occasionally still used its powers to limit the scope of the federal administrative state, a stance which was commonly thought to be the main justification for the doctrine of judicial review.  However, since that time, and especially since the ’60s, the courts have become progressively (pun intended) subversive of the idea of any sphere of authority outside the administrative state.

An alternative to judicial redress was the possibility, however unlikely, that the American people would elect a libertarian president, or at least a kind of anti-FDR who would restore the Republic to its original vitality.  I had occasionally heard such sentiments voiced in libertarian and conservative circles prior to the election of Mr. Trump; however, most people were surprised when the scarcely hoped for became incarnate in the form of a celebrity non-politician.  Or, as it turned out, not.

We are left with what we should have started with: the prospects for political education and its impact on the legislative branch.  We now know that the “Hail Mary” pass to a heroic chief executive doesn’t work.  Why? Because the theory of the unitary executive only works when it is in the interests of the administrative state.  When the chief executive opposes the interests of the managerial class (albeit “his own”), the unitary executive crumbles like a sand castle at high tide.  We are at the high tide of statism.

If there is a silver lining to the present circumstances it is that the legislative branch can still throw a monkey wrench into the works, for good or evil.  In theory, a legislative branch that responded to the long range interests of the people, which is not that of the managerial state, could reverse the (counter-) revolution.  In theory, the right way to the right kind of freedom can be found…if only after exhausting every other way first.

 

Posted in Constitution, Constitutionalism, Culture & Politics, Economics, Libertarianism, Philosophy, Politics, Traditionalism, Uncategorized | Tagged: | Leave a Comment »

How Churchmen are changed into Ducks

Posted by nouspraktikon on May 9, 2017

George Whitfield (1714-1770)

Among the more formidable characters in church history is George Whitfield (sometimes spelled Whitefield but pronounced without the “e”) the preacher who spread a Calvinistic variety of Methodism in colonial America.  You must understand that at the time Methodism was, as the very name indicates, a methodology and not a sect.  It was Whitfield’s aggressive preaching method, not to the taste of some, which had such a tremendous effect on forming the unique spirituality of early America.

His odd looks (he was cross eyed) and forceful rhetoric must have convinced many that Whitfield  was more an angel than a man.  It was related that he could pronounce a word as neutral and exotic as “Mesopotamia” in such a way as to draw tears from his audience.  For some this was sorcery, but for others it was salvation, and the crowds that he was able to gather were a mighty tributary in that powerful river of revival which we call America’s Great Awakening.

Like his rival in preaching the good news, John Wesley, Whitfield was a lifelong clergyman in the Anglican church.  Oddly enough, this evangelist with Tory sympathies earned the esteem of freethinking Benjamin Franklin, and the two struck up a friendship which lasted throughout their mature lives.  None the less, it is hard to imagine Whitfield, who died five years before the outbreak of the American Revolution, throwing in his lot with the founding fathers.  For Whitfield, being an Anglican was not a doctrinal affirmation, and indeed he despised most of what today would be called “Anglican theology.”  For him, membership in the established church was just the normative state of being born into the British branch of Christendom.  In Whitfield’s view, the established church didn’t get you into heaven, but you couldn’t get out of the established church.  A questionable deal, but a deal nobody could refuse in Britain or its colonies.

To Whitfield’s amazement, many of the Americans whom he had converted on matters spiritual in the 1740s were loath to join his church, preferring to form into autonomous assemblies, notably Baptist associations.  Whitfield sighed, in reference to the immersion of his converts, “It seems that my fledglings have become ducks!”  From our modern perspective this seems odd as well: why would someone get evangelized by a preacher from one denomination and then go out and join another denomination?  Why did the Whitfield Christians “become ducks”?

Erastianism

To begin with, “denominations” in our contemporary sense didn’t exist, although there were already a multitude of sects.  What did exist was a passionate clash of opinions over ideological and theological issues which today seem obscure and unimportant.  A key word in these debates was “Erastianism,” which dropped out of our household vocabularies a century and a half ago and has not been missed yet.

However, unless we know how this “Erastianism” could get people hot under the collar (both clerical and lay collars) we won’t understand how churchmen became ducks.  Fortunately there is a term of recent coinage which conveys much the same meaning to modern ears.  Among libertarian, Constitutional, and conservative circles “statism” has become the contemporary opprobrium of choice for what the colonists called “tyranny.”  Today we can define Erastianism as “statism applied to church governance,” or church-statism.  Keeping that in mind, and equipped with a Bible in one hand and the Declaration of Independence in the other, we are well on our way to unraveling the ecclesiastical conundrums of 18th century America.  We know what the outcome was: the rise of the Methodists and Baptists and the decline of the Anglican/Episcopalians.  Was this due to the vagaries of demographics or was there some underlying principle working itself out in the lives of Christian men and women?

Going back to the mid-18th century British America, one must keep in mind that Erastianism was not just a theory but a practice.  Take the colony of North Carolina as an example.  The Church of England was established as a public institution, essentially an arm of the state.  Did this mean that those early Tarheels were enthusiastic Anglicans?  Hardly!  In fact the region was largely unchurched during its early history.  None the less a system of church vestries (lay committees) was established paralleling the civil administration, and all subjects were required to pay taxes to maintain this apparatus.

As in all monarchical church-state systems the organization was pyramidal.  Yet, curiously, within British North America this was a truncated pyramid.  Above the vestries and the occasional parish priest, there were no high church officials.  North Carolina, and all other colonies (mostly outside New England) where Anglicanism was established, reported to the Bishop of London.  This led to a curious ambivalence on the part of the colonials.  Some persons, of an Episcopal persuasion, were eager to have cathedrals and bishops established on American shores.  They blamed the crown for foot-dragging on this issue.

Another, and presumably larger, party was heartily glad that the bishops had not yet arrived.  Their fear was that the crown was scheming to impose a hierarchy on the colonies, a hierarchy which would coerce believers in matters of doctrine and impose heftier church taxes.  This was a major item of contention among the colonists in the run up to the revolution, and the fact that it was not directly mentioned in the Declaration of Independence is, like the dog that doesn’t bark, rather a testimony to the seriousness of the issue than the contrary.  It was, like slavery, one of those issues that divided the Founders at a time when it was crucial to present a united front against the crown.

Voting with their (webbed) feet

Keeping these things in mind, perhaps it is easier to understand why the fruits of the Great Awakening, sparked by the evangelism of Anglican priests, did not redound to the Established Church.  Again, taking North Carolina as our example, there are records of a great increase in the membership of Baptist assemblies, while the Established Church remained largely a bureaucratic skeleton.  Converted by the Spirit (through the preaching of Whitfield, Wesley et al.) the rustic colonists saw no need to perfect their salvation through works, where the “works” in question were attendance on the ceremonies and obligations of the local established parishes.  Moreover, such works were added on top of the “work” of paying the church tax, which (prior to the revolution) was levied regardless of one’s belief: atheist, dissenter, or whatever.

Really, Whitfield ought not to have been surprised, for the Spirit was working through his eccentricities, not his Anglicanism.  The crowds swooned at his uncanny words such as “Mesopotamia”…I know not whether they would swoon at “Mother England.”

We too should cry when we hear the word “Mesopotamia”!

These things are of interest to me since I am persuaded by a kind of Calvinistic Methodism myself, albeit I am only a Calvinist in supposing that all people are sinners, while my Method has little in common with that of the Wesley brothers.  Rather, the method consists in this: that (at least under ceteris paribus conditions, a.k.a. all things being equal) freedom is a good thing and coercion is wrong.

Now today in Christendom (or rather post-Christendom) we are no longer so clearly divided into an Established Church and Dissenters.  However, the same perennial urges resurface under a different guise.  Thus today we have Liberal churches and Conservative churches.  In both these “denominations” there are churches and individuals who seek to become an Establishment.  Both seek to establish a church-state, albeit according to a different view of what the proper function of the state might be.  The liberal churchmen, and churchwomen, want to be the altruistic cheerleaders of the journalistic-academic-welfare-health complex, while the conservatives want the church to be an official apologist for the military-industrial-banking complex.

However there is always a remnant which has been granted the wisdom to understand human folly.  Among the greatest of follies is what has been called “the tyranny of good intentions.”  This is when we try to force something good on someone.  If we try to force Christ on someone we get the Inquisition.  If we try to force “democracy” (a problematic concept in itself!) on a people we get…well, we get something like the contemporary Middle East, a region in constant turmoil where two thousand year old Christian communities are today on the verge of extinction.

It is we, not Whitfield’s auditors, who should weep when we hear that old name for Iraq and its neighbors…”Mesopotamia”!

Yet through the gloom of it all, let’s remember that Jesus loves us.  I’m afraid I may have increased the gloom by throwing a heavy theological tome at your head.  But at least I warned you…

Duck!

 

 

Posted in Appologetics, Charismata, Christian Education, Christianity, Constitution, Constitutionalism, culture, Culture & Politics, Paleoconservativism, Philosophy, Politics, Traditionalism | Tagged: | Leave a Comment »

The Gun You Should Reach For When You Hear the Word “Culture”

Posted by nouspraktikon on April 24, 2017

Why “Culture” is a loaded word which needs to be disarmed

All advocates of a civilized world, and most emphatically all Christians, need to be skeptical every time the word “culture” is mentioned.  Evolution and culture are the two key concepts which have destroyed genuine anthropology, anthropology in the Christian sense of the word.  If today we live in a world where the barbarians are at the gates, it is only because the vital distinction between civilization and barbarism was first erased from the scholarly vocabulary in the name of an ambiguous and relativistic understanding of human nature, an understanding which is encapsulated in the term “culture.”

The word “culture” (an otherwise unobjectionable term) was adopted by secular anthropologists as the label for a mental package deal known as “the culture concept.”  The essence of this concept is that human beings create their own mental reality.  Even humanists are humble enough to realize that human beings do not create their own physical reality.  That sort of thing went out of style with Renaissance magic.  Humanists claim that the universe has arisen through something other than human agency, and since human agency is the only source of rational design they recognize, they conclude that the universe is a result of chance plus vast quantities of time.  This is the celebrated theory of evolution.

There is another sense in which Humanists exhibit a minimal degree of humility.  The culture concept implies that “Man Makes Himself,” to quote a title from V. Gordon Childe, from a day when even left-wing scholars could use masculine pronouns.  However, the culture concept admonishes the would-be Übermensch that human individuals do not make themselves; only groups have the power to shape the mental environment of their members.  Since the culture concept derives ultimately from the thinking of Immanuel Kant, this is an important revision in the theory.  Kant asserted that the human mind creates its own reality, but he was very abstract in his presentation.  He didn’t stress the role of groups in forming their own environments.  This was worked out in the century after Kant by various neo-Kantian scholars and passed down through the educational system in the form of anthropological dogma.

This formula, that 1) evolution makes the physical environment, and 2) culture makes our mental environment, is the one-two punch of all Humanist thought.  It is diametrically opposed to Christian anthropology, which sees the human race as part of creation dependent upon almighty God.  To be sure, in the Christian view the human race occupies a unique role in creation, as the thinking and governing part, just as in Humanism the humans are unique in possessing “culture.”  However there is a world of difference in these two forms of uniqueness.  The first uniqueness is related to something personal outside itself, a condition which renders objective morality possible.  The second uniqueness, the uniqueness of “culture” is purely self-referential.  It cannot be brought to the bar of any moral standard higher than itself.  From the Humanist viewpoint, this isolated uniqueness reflects the principle of human autonomy.  From the Christian viewpoint, it is an illusion resulting from sin.

Culture as the moral ultimate means that culture itself cannot be judged, and implies relativism.  The history of the culture concept is the progress of increasingly consistent forms of relativism.  In the 19th century anthropologists tried to rank cultures on the basis of degrees of civilization, or put negatively, emergence from barbarism.  However, as the relativistic implications of the culture concept were systematized, notably by Franz Boas and his followers, attempts at judging cultures were suppressed.  Today, all judgments of different cultures according to some objective standard outside culture are considered prejudicial.  However, this moral conclusion is the consequence of the supposed impossibility of any objective standard.

When the Nazi Propaganda Minister Goebbels famously (if perhaps apocryphally) exclaimed, “When I hear the word culture I reach for my gun!” he was diametrically opposed to the cultural criticism which we are trying to undertake.  Like Franz Boas, Goebbels was aiming at the idea of “high culture” as opposed to barbarism.  We should translate his words as “when I hear the word ‘civilization’ I reach for my gun.”  Both Nazism and cultural relativism have tried to make it impossible to isolate barbarism as a descriptive category and set it over against civilization.  Of course there were profound moral differences between Boas, the liberal Jew, and Goebbels, the German fascist.  The latter went beyond theory and was determined to normalize barbarism by acting it out in real life.  However, in the long run it has been the gentle scholar who has been more effective in destroying civilization, first as an ideal and then as a reality, among people of good intentions.

Yes, traditions exist

The major opposition to a frontal assault on the culture concept is the contention that culture aptly describes the variety and richness of human traditions found throughout the world.  However this diversity has always been recognized, certainly prior to the academic hegemony of the culture concept.  Some of these traditions were instituted by the Most High God, some are human innovations, and some have been inspired by lesser spirits.  Human innovation is not to be gainsaid, either for good or for evil, and neither is the vast diversity of traditions.

The culture concept adds nothing to our understanding of the richness of human institutions.  However, by insisting on the human origin of our mental world, the culture concept begs one of the most significant questions which can be asked about history: who, or what, instituted institutions?  Its long range effect is to flatten out the mental world into a single, flat plane of human reality.  Cultural Humanists boast of having an “immanent frame” in which they are free to make any judgment they wish about human affairs.  However, “any judgment” ultimately means that no judgment is authoritative, and hence that all are meaningless.  This default to meaninglessness and nihilism is the next to last stage in the decline of cultural relativism.

The final stage occurs when “culture,” having outlived its usefulness in the promotion of nihilism, is reabsorbed by “evolution,” the master-concept which required culture as a temporary supplement and diversion.  When the ideals of humanity have lost their charm, the spiritual descendants of Goebbels will round on the spiritual descendants of Boas, with guns metaphorical or otherwise.

It is to save these people of good intentions, these so-called “Humanists” from the fate which dooms their concepts, their bodies, and their souls (not necessarily in that order) that we must insist on a God beyond culture.

Posted in Anthropology, Appologetics, culture, Culture & Politics, Paleoconservativism, Philosophy, Politics, Theology, Traditionalism, Uncategorized | Leave a Comment »

Dear Michael Savage, here is your prize-winning proof of Human Stupidity (which assumes the existence of God)

Posted by nouspraktikon on April 12, 2017

Dear Michael Savage,

First of all I want to let you know how much I enjoy your program.  After taking a lot of guff and being called a deplorable, you have now dumped the Trump train over Syria.  Just goes to show that, for true-blooded deplorables, it was more than just a “thing” about the orange hair.  Oh well….

So much for WWIII and the other small stuff.  Now getting down to that proof of the existence of God!  As you and I and everyone else know, God exists.  However, there is a certain class of scholars, known as apologists, who go beyond just knowing that God exists to trying to prove that he exists.  God must love these people very much, since he doesn’t blast them out of existence for doing something which is ultimately blasphemous.  I love them too, especially the really complicated ones like Thomas Aquinas and Gottfried Leibniz, whose thoughts are as intellectually challenging as they are useless.  These are the people who attempted a frontal assault on human infidelity and ignorance, which in itself is rather stupid.

The correct procedure is to reverse the question and ask why human beings reject God and all knowledge of His existence and character.  In scholarly circles this method is called “presuppositionalism,” and if left to run amok it will lead to academic disputations as obscure as anything spawned from the pen of Thomas Aquinas.  However, the basic insight is perfectly simple.  We all live in a world which is screaming at us 24 hours a day, seven days a week, “I am God’s creation!”  Yet there are two classes of human beings: those who accept the Creator and their creaturely status, and those who feel that both the universe and they themselves are self-made.

Since both the believers and the God-rejecting people live in the same world, a world in which we are nurtured and have our being, there would not seem to be much ground for metaphysical disputation.  Even rather evil people such as Martin Heidegger have never doubted that existence exists, although that benighted philosopher expressed great surprise that Being had managed to nudge out non-existence in the contest for reality.

No, both classes of human beings inhabit the same life-world, but they think according to different principles.  As scholars would say, they adhere to different epistemological systems.  The believers see themselves as mentally naked in front of God and the world.  For them there is no “problem of knowledge” per se, since the  information we get from our world is abundant and, except in limiting cases, generally reliable.

However, in the case of the non-believer, one must have an epistemology before venturing into the wilds of the universe.  For such people, there is a gap between the ego and reality, a gap which can only be bridged through strenuous philosophical or scientific investigation.  However, this plight of inadequate knowledge is not just an epistemological inconvenience, but rather grounded in the moral attitude of the non-believer him or herself, since before staking any claim to knowledge the non-believer has already declared a state of ego-autonomy.  This declaration of independence has the unfortunate consequence of stranding the ego on a deserted island of his or her own making, from which venturing out into the world of brute fact, governed only by the laws of chance, is a perilous adventure.

Well now, Mr. Savage, even if you accept all that I have written above, it certainly doesn’t present a “proof of the existence of God”…at least in the classic sense.  However, from a forensic point of view, it ought to make us suspicious of the non-believer’s motivation.  Why the insistence on autonomy?  Why the cumbersome epistemological apparatus?  It would almost seem as if there were something or Someone out in the wilds of reality whom the non-believer was afraid of, and for whom this gap between the ego and the Other was improvised.

Indeed, there are grounds for supposing that the gap between the ego and its environment is not a fact of nature, but an improvisation designed to suppress the original confluence between the human mind and God.  This would also explain the general uselessness of “proofs of the existence of God” since these are attempting to employ a metaphysical tool in order to solve a moral problem.  The “proofs” usually only work on people who are already believers.

To conclude, Mr. Savage, I know that this is a rather bleak judgement, and furthermore begs the question, “What is to be done?”  After all it implies that humanity is divided into two non-communicating epistemological camps.  Instead of offering you an inductive or deductive proof of God’s existence, all I have done is explain the irreducible ignorance of a vast segment of humanity.  Or as you would say, the reason why “they are stupid.”

Well, I suppose prayer wouldn’t hurt.

Blessings upon you and yours,

Mark Sunwall

Posted in Anthropology, Christian Education, Christianity, Culture & Politics, Paleoconservativism, Philosophy, Theology, Uncategorized | Tagged: | Leave a Comment »

The Culture Conspiracy: A critical investigation into the destruction of civilization (Introduction)

Posted by nouspraktikon on April 10, 2017

The Culture Conspiracy

This is the first installment of a multi-part series on how the modern “culture concept” has, as a complement to the theory of evolution, demoralized and degraded civilization, or actual “culture” in the original intent of that word.  While it is not intended to be an exhaustive overview of the topic, the investigation will try to hit on all the major aspects of the problem.  Tentatively, it will be organized along the following themes:

  1. The Great Baton Pass
  2. The Measure of Man vs. the Measure of God
  3. From Custom to Culture
  4. Erasing the essential Civilization/Barbarism distinction
  5. From Kant to Hegel: From the individual to the species
  6. From Hegel to Boas: From the species to the people
  7. The Super-organic, the Spiritual, and the Ugly
  8. The Enigma of Innovation
  9. Man Makes Himself Part II: From Custom to Customization
  10. Beyond the Culture Concept

Though each of these contains enough to provide a mini-course in itself, in its present state the work is likely to appear as the outline of a syllabus rather than a detailed treatment of the subject.

Introduction: The Culture Conspiracy

Suppose you were able to travel back in time to the mid-Victorian era.  Just to pick a date, let’s suppose it were 1859, the year in which Darwin published his master work, Origin of Species.  You arrive in London, England and are able to establish communications with a middle-class person, of either sex, and ask them two questions about the future.  First, do you expect technology to improve in the future?  Second, do you expect culture to improve in the future?  If I am not greatly mistaken, the answer of a well-informed Londoner of 1859 would be a resounding “Yes!” to both questions.

Next, through the magic of your time-traveling you offer them a vista of life at the beginning of the twenty-first century.  Now they are able to judge whether their optimistic prophecies have been vindicated.  There is no need to waste time on the answer to the first question.  The mid-Victorian would find the technological wonders of the present to be little less than a magical transformation of the human environment.  Even if the lady or gentleman in question were a Luddite, or like Mr. Butler, apprehensive of “machines” in general, they would be forced to admit that the machines had won the day, whether or not the technical triumph was in the long range interests of the human race.

And what of culture?  If cultural optimism were vindicated in proportion to the Victorian’s technological optimism, what wonderful variations on Moore’s Law might one expect?  In the year 2017 music would be one hundred times more sonorous than Mozart, paintings one hundred times more beautiful than Turner, the law-courts one hundred times more just and expeditious, families one hundred times more peaceful and harmonious, architecture one hundred times more symmetrical and stately, and the religious life of the average man or woman one hundred times more pious.

I am sure everyone understands that such exaggerated expectations would suffer bitter disappointment.  But I would go beyond that and hypothesize that our representative Victorian would judge that much of culture had regressed rather than progressed.  Looking around at a population dressed in t-shirts and jeans, the well-dressed Victorian might assume that he or she (especially she) had landed in a sartorial dark age.  Dress might be the most ubiquitous and offensive sign of cultural degeneration, but further investigation would reveal a myriad of aspects in which 21st century culture had decayed far beyond the lowest level of Victorian expectations.

Art might be cheap and easily accessible but so primitive, cartoon-like or commercial that the Victorian time-traveler would deem it rubbish.  Language (unless our Victorian were a rating in Her Majesty’s Navy) would have become unutterably vulgar.  Human relations would have become broader but shallower, and the family reduced to just one of the many nodes of association provided for the convenience of individuals.  The poor-house and the debtors’ prison would have been abolished, but by the year 2017 debt would have become the primary nexus holding the economy together.  Indeed, from the point of view of a middle-class Victorian, by the year 2017 society itself would have become one giant debtors’ prison.

This is not even to speak of the actual prisons of the 21st century, or the fact that Jack the Ripper (still in the future for 1859) would spawn, like some forensic Adam, a class of registered and unregistered offenders.  Finally our representative Victorian, even if not an enthusiast for the works of Herbert Spencer, might dimly recognize that by the standards of classical liberalism, the 21st century state had itself become a criminal network, engaged in perpetual borrowing and taxation for extensive regulation at home and endless warfare abroad.

Having safely deposited our Victorian time-traveler back in the homely 19th century, and drugged him with the obligatory milk of amnesia so that history won’t be spoiled, a familiar figure enters from stage left to deliver a soliloquy.  This is Mr. Carping Critic, who objects to the whole little drama.  He claims that our experiment is a sham, based on false premises from the start.  He says that the two questions were apples and oranges, and that the “no” verdict to the second question rests on biased judgment.  He says that when we jump from technology to culture we go from the measurable to the intangible, and that we have entered into that shady region of values where nobody’s opinion (even that of a time-traveling Victorian) is more objective than anyone else’s.

From the point of view of Mr. Carping Critic, the Victorian’s view of art is just an outmoded taste, so of course we should expect a negative verdict.  If the growth of the prison population is viewed negatively, it just shows the enduring grip of pastoral romanticism over the advantages of cozy confinement.  And so forth and so on in every department of “culture” since, after all, culture is a matter of values, and as we all know, values change.  The seal of the entire argument is the whole ridiculous subject of clothing, on which our time traveler had nothing better to venture than the opinion of a bigoted prude.

With that coup de grace, Mr. Carping Critic thinks he has stripped the Victorian of her secret!

I cannot refute Mr. Carping Critic on his own grounds, since they are not grounds at all, but the quicksands of a shifting and relativistic doctrine.  However it is a doctrine which has a history and that history can be exposed and criticized.  Indeed, I will go beyond Mr. Carping Critic to criticize the one concept which remains beyond criticism for him, namely “the culture concept.”  Yes, he is right to say that the time-traveling questions were not consistent, for in 1859 the word “culture” hadn’t quite assumed the connotation that we give it today.  Soon that would change, and it would change in such a way that people would no longer be as confident about making statements about objective reality as they had previously.

I think, in contrast to Mr. Carping Critic and his ilk, that objective reality, not just in the natural but the human world, continues to exist, and that an inability to talk about it puts anyone thus incapacitated at a severe disadvantage.  However our inability to talk about human affairs objectively is the end result of a kind of conspiracy, a conspiracy that started long ago and today has come to fruition in a multitude of crises.  In subsequent installments I will unmask this conspiracy… the culture conspiracy.

Posted in Anthropology, Art, Culture & Politics, Esoterism, Paleoconservativism, Theology, Uncategorized | Tagged: , , , , , | Leave a Comment »

From Old-papers to Lie-papers, this is what the media calls “progress”

Posted by nouspraktikon on April 7, 2017

Newspapers never used to contain “news”…but now the situation has been corrected

Decades ago when I heard an old monk exclaim, “These things you call newspapers…they contain nothing new!”  it was more of a self-evident truth than a revelation.  Aristotle, writing 2400 years ago, observed that if you read one book by Thucydides you didn’t need to read another history book for the rest of your life.  A lot of history has been written since then, but the principle still holds, for while the specifics of time and place may bear recording, the human comedy (or perchance tragedy) recapitulates the same old themes in every generation.  As “Rick” (portrayed by Humphrey Bogart) asked Sam the piano player to croon…

It’s still the same old story,

A fight for love and glory,

A case of do or die

Of course if you really want to know the specifics of what was happening in North Africa c. 1942, Thucydides isn’t of much help.  That’s not what Aristotle or the old monk meant.  For “as time goes by” the concretes of time, place, and technology alter, but the human passions which animate the historical drama remain constant.

So I became rather casual in my attitude towards the media, deeming the daily old as soon as it was printed, and even before it redeemed its paper-value as a wrapper for the remains of maritime edibles.  Looked at in that way, there was something quaint about the Old-paper, as it regurgitated the same facts about different people while the generations cycled through their time on Earth.  To epitomize, the Weather section was paradigmatic of all the other sections.  Sun and storm might iterate through the seasons, but one never expected an entirely new form of weather to emerge.

This is not to say that novelty was entirely absent.  There was technological innovation and discovery of remote locations.  However these were like gardens which were expected to grow over time.  If there had been no innovations or discoveries, that would have been a far greater novelty.  Moreover, since it was just the same expansive human nature which motivated the discovery process in accordance with human needs (or curiosity), even the greatest innovations lined up with the same doctrine of human nature.  Yet most importantly, even the greatest changes were reported on as if they were part of a natural order; they were not…what shall I say…they were not “promulgated.”

However I must now confess that either I was wrong in my assumption that “no news is new news,” or something has changed.  I suspect the latter.  At some point the media moved from reportage to promulgation.  One suspects that deep in the heart of the media complex, people no longer recognize a distinction between journalism and fiction.  Selective reportage, outright suppression of facts, story-crafting, and agenda-fitting have replaced investigation.  The archetypal media man or woman no longer aspires to uncover a great story so much as to become the Great Novelist, rewriting reality according to the inspiration of their genius.  Today the newspaper has at last become a novelty.  Indeed, it has become “poetry” according to the Greek root of our word, i.e., total innovation.

In Journalism and elsewhere, Post-Modernism is past Marxism

How has this odd situation come about?  We are all aware of that confluence of factors which has changed “the news” in the past several decades, from the rise of social media to corporate concentration of the older journalistic outlets.  Nonetheless, I am inclined to count what men and women have in their heads as the salient factor, in accordance with the principle “ideas have consequences.”  Journalists don’t just bloom like lilies of the valley, and before they are recruited into the media complex they must graduate from the academic complex.

If it ever was one, the academic complex is no longer a free marketplace of ideas.  Rather, certain ideologies have gained an ironclad ascendancy on American campuses.  The most general and erudite (were it not elitist to admit) of these ideologies is so-called “post-modernism,” which claims that human minds can have no contact with anything remotely resembling objective reality.  Rather, particular humans spin out their narratives, much like a caterpillar weaving its cocoon around its body.

Taken at face value, this sounds like a formula for toleration and harmony, such as was claimed on behalf of the ancient skeptics and cynics.  Those ancient “know-nothings” professed not to care about social opinion, to the point where whether a person wore clothes or not was a matter of indifference.  Whatever the merits of such skeptical liberty, it is a far cry from the atmosphere which surrounds post-modernism.  As anyone who has contact with modern academics is aware, hypersensitivity and condemnation are the qualities most apparent on university campuses today.

In reality, the hippy-like indifference on the surface of post-modernist thought masks a deeper level of ideological doctrine.  This doctrine is invariably Marxism of one ilk or another, but most especially the cultural Marxism associated with the Frankfurt School or the ideas of Antonio Gramsci.  The idea is not just to create novelty, but to create novelty which is subversive of the present state of affairs.  A new idea or narrative which creates greater harmony in society, though superficially compliant with post-modernist thought, is not sufficient.  The new narrative must be destructive of the old narratives.

This is the ideological reason why today’s media not only embrace new perspectives on human nature, but why these new perspectives are designed to create conflict and chaos.  To be sure there are other, simpler, reasons.  The most evident is the standing insight of yellow journalism that disasters sell newspapers, and that while natural disasters can’t be conjured up to order, wars and riots can be.  So today conflict, both domestic and global, is not just reported on, but spawned by the media itself.

The idea that human beings can create their own world ex nihilo is, of course, blasphemous.  But this is an attitude which goes back, behind even the Marxists, at least to Kant and the way modernity defined “culture” in opposition to nature.  Ultimately it goes back to Adam, or whoever that human was who first knowingly spit into God’s eye.  Unfortunately, today’s corporate journalists are not the sort from whom one expects genuine, Godly repentance.  Rather, and unlike wise King Canute, they are apt to stand stubbornly on the shore of their own subjective fancy, until engulfed by an objective tsunami far beyond their reckoning.


Posted in Culture & Politics, History, Media, Philosophy, Theology, Uncategorized | Tagged: , | Leave a Comment »

Africa through the Leftist looking glass

Posted by nouspraktikon on April 4, 2017

Leftist “Afrocentrism” is not Africa-centric at all; rather, it is Negative Euro-centrism

The cardinal, and supposedly indisputable, fact which determined modern Africa’s destiny is what people generally refer to as “The Partition of Africa” as if Africa were a huge cake that was cut into slivers by greedy and importunate dinner guests.  Indeed, there was a conference held in 1885 to ratify the European states’ spheres of influence in Africa, and it set the standard for determining the boundaries, not just of colonial Africa, but the territorial limits of today’s independent states.  Thus this phrase, and the image it evokes, has endured as the beginning of all disquisitions and inquisitions into the matters and morals of modern Africa.

Unfortunately this notion of “partition” fails the reality-test.  Apart from the history of European diplomacy, the “Partition of Africa” has no utility or even meaning.  In order to divide something up, the “something” has to first exist as a unified entity, and (except as a geographical concept) there never was any such thing as “Africa” to divide up.  In contrast, when historians speak of the division of Poland in the 18th century, they are referring to something concrete.  There was indeed a unified historic Polish state which suffered dismemberment at the hands of Prussia, Russia, and Austria.  Poland disappeared, its neighbors were enlarged.

This is not what happened to Africa.  Granted, something very important did happen in and on the continent of Africa during the late 19th century, and it happened (primarily) through the intervention of the European powers.  However, the actual process was precisely the opposite of a partition.  What happened circa 1885 to the various peoples of Africa was a process of forced unification, not forced division.  From the point of view of genuine Afro-centrism, or what might be more objectively called “ethnological realism,” the 1885 event is better described as the (forced) unification of the African territories.

Yet somehow the myth of a division of a non-existent country called “Africa” has persisted in the collective imagination of world history.  The original impetus for this myth was, as everyone might suspect, the ignorance, chauvinism and pride (I abjure the term “racist” but you get the general point) of the European ruling classes at the height of Western world power.  It no doubt flattered them to think that they were able to enforce their will on territories whose indigenous populations had no say in the matter whatsoever.

I won’t be going into the pros and cons of colonialism, a vast subject.  Rather, what I am arguing is the reality or otherwise of a single thought-construct, the “partition” of Africa.  After 1885 Africans found themselves inhabiting much larger political units than they had ever experienced before.  Some aspects of life in these larger units were beneficial, some were degrading, and let the chips fall where they may in each department of evaluation.  However what happened post-1885 was a unification rather than a sundering.  Sundering did occur in isolated instances, as when a boundary was arbitrarily drawn through the middle of a village, or through the grazing territories of a nomadic tribe.  However these were the exceptions which proved the rule.  The rule was that Africans woke up to a new reality, and in this reality they now were thrown into political relations with people whom they had had little contact with previously.  And these other people were not just the Europeans, but, most importantly, other Africans as well.

It is this unification which was the salient reality at the dawn of modern Africa, not sundering.  However, to say that unification was salient is not by any means a value judgement.  The pros and cons of this unification are all arguable, what is not arguable was its reality.  In fact the history of African politics, and of the rest of the world’s attitude towards Africa, largely revolves around the pros and cons of large political units.  Indeed, this is a theme which is hardly unique to Africa.  What is a nation?  What is a state?  What is the relation between these two, and are either of them or both of them good or evil?  This has been a universal theme since at least the times of the American and French revolutions.  However events on the African continent can throw these themes into either sharp relief or obscurity, depending on what kind of moral handle one has on the issues.

My thesis is that the political left has grabbed these issues at the wrong end, and that conventional discourse has slavishly followed the tone set by the left.  It is as if we had a telescopic view of Africa but were looking through the telescope from the wrong end.  This has had disastrous consequences, both for Africans and for everyone else.

Ethnographic realism and Federalism, Negative Euro-centrism and the unitary State

The seemingly abstract discussion above has more than historical relevance.  It is true that much of Africa experiences debilitating social and economic conditions.  Furthermore, it is true that outside agents play a disproportionate role in the affairs of African states.  However it is singularly unhelpful to label these concrete conditions the result of “neo-colonialism” when in fact they are manifestations of the same globalist system which interferes in the affairs of non-African regions.  Due to the weakness of African political systems, organizations such as the IMF, the World Court, and the so-called “peace keeping” UN military play the exaggerated role that they would like to assume throughout the world at large.  The reason why they are unable to play this role universally is that states outside Africa are stronger and less amenable to outside pressure.

And why are African states notoriously weak?  The general consensus is that “tribalism” (variously defined) keeps the political situation of all but the most stable African nations in a state of perpetual turmoil.  This is certainly true; however, people have been analyzing the phenomenon of “tribalism” through the leftist looking-glass for several generations, and still no solution has been found to this problem, if “tribalism” is indeed a problem.  The leftist-Marxist view is that every African nation should have a unitary state, which will then enact economic and social planning to lead its population out of poverty and dependency.  Any groups which stand between the individual and the state are seen as running interference with this program and are deemed reactionary.  Prominent among these groups are tribal, ethnic, and kinship organizations.

Does this sound familiar?  It should, since this has been the left’s prescribed route to utopia throughout the world, not just Africa.  Worldwide, this started at the end of the 18th century, when the Paris Jacobin government abolished the provinces (the “tribes” of France) in favor of direct rule over localities by centrally appointed “prefects.”  (N.B.:  This policy was extended to French West and Central Africa in the 20th century, and was inherited by the Francophone states after independence.)

However in the case of Africa, the left ultimately envisions a continental union.  Hence the Marxian endorsement of the outmoded and Eurocentric notion of a “division” of the African continent circa 1885.  This is bad historiography but shrewd politics, since it gives substance to the myth of an undivided continental polity which should be restored in the future.  In fact what happened was not a division, but a forced unification of vast territories which have now become the nations on the African map.  If there had been no such forced unification there would have been no general problem of “tribalism,” since the forcibly unified tribes would have been nations in themselves.

What has been done has been done, and today’s African political units are, and will remain, multi-ethnic.  This can be either a blessing or a curse.  If we look at it from the left-wing viewpoint, which I am equating with advocacy of political centralization, it interferes with the smooth operation of a unitary state.  However there are alternatives to this viewpoint.

The salient alternative is federalism, or having weak central governments and strong local governments.  The fewer rewards which can be contested at the national level, the less likely it is that various groups, ethnic, religious or otherwise, will have an opportunity to come into conflict.  Thus federalism, in any region, but notably in Africa, is likely to diminish the likelihood of inter-group friction.

Advocates of political centralization generally fail to contest the above premise.  Rather, they claim that strong unitary states are necessary to resist outside pressure, generally framed as “imperialist” or some kindred threat.  However, even here the case for centralized unitary states is dubious.  In fact it is far easier for outside political forces to subvert a single political head than to deal with a multitude of layered political agencies.

Yes, the root problem in Africa is the one factor that the left refuses to blame: excessive political centralization.  Federalism would keep contentious ethnic forces from each other’s throats, and furthermore would minimize, though not eliminate, outside political interference in the affairs of the various nations.  The forced political unifications of 1885 are irrevocable, but their negative effects can be mitigated through decentralization.

Should we be surprised that the political solution for African nations is the same as the political solution for other regions of the world?  After all, the root human condition is the same everywhere.  That is what the left professes to believe.  Why doesn’t it endorse local autonomy and limited government everywhere on Earth?  Perhaps because it has simply adopted its historiography from its alleged imperialist enemies.

“Division of Africa” indeed!  Would that it were so.  We need smaller political units on every continent, so that people can easily trade, cross borders, and be friends.


Posted in Constitution, Constitutionalism, Culture & Politics, Libertarianism, Paleoconservativism, Politics, Traditionalism, Uncategorized | Tagged: , | Leave a Comment »