Pico Ultraorientalis

Just another WordPress.com weblog

Archive for the ‘Paleoconservativism’ Category

A god who failed: William F. Buckley and his “conservative” movement

Posted by nouspraktikon on August 30, 2017

A Fabian Conservatism?

There are two systems operating on this Earth.  According to one, every man, woman, and child strives with carnal jealousy to grasp and hold on to their rights to self and things, and when there is more than enough, the excess is disposed of, given away, or traded on the open market.  That is the better of the two systems.  According to the other system, men and women quest for virtue and renown, seeking to bring the Kingdom of Heaven down to this world, by violence if necessary, and we are further told that at the end of this process a man shall appear who resembles Christ in certain regards.  It is this second system which attracts the best and the brightest.  The late William F. Buckley Jr., 1925-2008, practicing Catholic, family man, nominal patriot and putative spy, Yale graduate, novelist, journalist, polymath and polyglot, yet above all things, “intellectual”, was certainly among the brightest of his generation.  As a general principle, we ought not speak ill of those whom God has loved and endowed with great talents, yet it is incumbent upon anyone who wishes to preserve both truth and memory to render judgement on matters of public record, and especially those actions or omissions which have led the American body politic down its present primrose path.  If we are the proverbial tin can, well then, Mr. Buckley was a chief contender among those who kicked us down the road and into the ditch.  Assuredly, we have every right to inquire into his mind and motives.

My first memory of William F. Buckley is the televised image of two posh, erudite men engaged in a furious altercation over the merits of the Republican presidential nominee in 1964.  The one on the left (from the viewer’s perspective) was a scandalous representative of the liberal avant-garde, an inconsistent and curmudgeonly libertarian/left/democrat, surely an entertaining character if one were to consider him in isolation.  However, he could barely gain a point against the other man, the one on the right (again keeping perspective in mind), who seemed an utter novelty, the Adam of a new race which was awaiting formation, or rather self-formation.  Gore Vidal (stage left) has kept a loyal following of fans and detractors, yet Vidal by himself would never have become an epochal, or a defining, figure of those crisis years.  It was Buckley’s, not Vidal’s, video debut which marked off a new era, not (sadly) of American political thought, but of rhetoric and reality television.

Thus was born, at least in the viewing public’s mind, that oxymoron, the “conservative intellectual.”  The hokum of Dogpatch, an image of the American right as rustic buffoons so carefully crafted by liberal opinion makers, was momentarily shattered by a visible presence.  Since I was a kid, I didn’t know that Buckley had already attained considerable celebrity in literary and journalistic circles, as early as 1951, with the publication of his God and Man at Yale, but now the word had become flesh, visible to millions upon millions of couch dwellers and potato chip eaters.  He spoke, and he spoke well, interspersing his verbal darts with the flick of a serpentine tongue across tightly drawn lips.  Suddenly, the viewers glimpsed a crack of light shining through the deadening conformity of consensus politics.  Was this the chiaroscuro dawn of a new day, or just a hoax?  It was ominous when, in a fit of pique, the new god dropped his smooth mask to coin a notorious neologism.  Vidal, he fulminated, was an “octo-moron!”  In those days of civil discourse you didn’t just go calling someone an eight-fold idiot in front of America’s families…not to mention the lexicographers!

Fast forwarding to the present, and the perspective of the post-Trump, post-civil discourse era, it becomes painfully clear that this erudite “conservatism” has failed.  Someone once observed that Hegel only “died” in 1933, a watershed beyond which many conceded that his “dialectic of history” bore scant resemblance to the logical deductions of some charitable and edifying Deity.  We might likewise reckon that Buckley “died” in 2016, when it became abundantly clear that the chattering of the political class could no longer be confined to a salon discussion constrained by the niceties of an Americanized high tea.  Today we must reluctantly acknowledge that even domestic politics is war, perhaps not quite violent war, but war nonetheless.  But then, shouldn’t we have known that all along?  If we didn’t, it was mainly our own fault, though no thanks to Bill Buckley and others who were only too happy to perpetuate our fond illusions.  Hence, those moderates who have managed to wake up to the situation often discover that they are very late into a long war of attrition conducted by the left, poised on unfavorable terrain, and desperately short of intellectual ammunition.

Not that all possible ideologies which might be denominated as conservative are bankrupt; rather, it is especially the smug, above-the-fray “conservatism” defined by William F. Buckley which circumstances have rendered impotent.  Herein is the real eight-fold idiocy: not that Buckley was able to concoct a new ideology, which he had the brains and the perfect right to do, but that he usurped the nomenclature of a previous movement, the Old Right, and applied it to his novelties.  A guileless Buckley would have decanted his fresh ideological wine into new, or at least newly labeled, wine-skins.  Accordingly, Buckley might have dubbed his concoction “Fabian Conservatism” or some such critter…but he insisted on preserving the illusion of continuity with the anti-New Deal coalition.  Ironically, the moderate Socialists of the early 20th century showed a greater respect for intellectual property rights by relabeling themselves as Fabians, thus permitting the revolutionary Bolsheviks to maintain their identity as “Reds.”

Actually, “Fabian” would have been a far better moniker for whatever Buckley was up to.  For one thing, the progressives, then and now, have never intended to give up a single inch of political gain.  It is always a matter of advance to the front, either slow and Fabian or fast and revolutionary.  In contrast, “conservatism” as it was reinvented by Buckley’s National Review in the 1950s has been much closer to the strategy of Quintus Fabius “the Delayer” (Rome, 3rd century BC)…defining itself as the weaker side and then enlisting for a long, indeed perpetual, retreat.  Today we are experiencing the results of this capitulation.  Buckley, much like Keynes “in the long run”, did not live to see the full consequences of this “Fabian” defeatism: a nation in which the conservative brand as a whole has been discredited, and where only a retrenched populism and leftism remain as the primary engines of our uncivil discourse.

Pied Piper of the Establishment

Was Buckley’s defeatism a matter of principle?  Was it motivated by a Spenglerian ennui in the face of irresistible winds of change?  Or was it something else, something less intellectual but more human: a quest for power and social acceptance by a man with the smarts and social connections to become a celebrity, combined with a secret contempt for moral absolutes?  John F. McManus considers this question in his William F. Buckley Jr.: Pied Piper of the Establishment, a look at the public words and actions of America’s most famous, so to speak, “conservative.”  In this concise and readable work McManus illustrates how virtually every major premise of conservatism was contravened by Mr. Buckley and his associated writers at National Review.  Did Buckley really “delay” the advent of the current unpleasant situation through judicious compromise, such as might merit the title Fabian Conservatism?  Or did he hasten on the day of reckoning by sapping the bulwarks of more authentic brands of resistance?  Mr. McManus doesn’t rush to judgement, but judge he does, by patiently building up a bill of particulars which will strongly incline the reader to embrace the latter hypothesis.  The major, though not the only, charges that McManus itemizes in the antithetical “conservatism” of Mr. Buckley are the following.

  1. Buckley substituted an unidentified “conservatism” for the explicit definition of good government found in the Constitution.
  2. He shielded an unholy alliance between leftists, capitalists, and statists, or what Mr. McManus calls, “the conspiracy” from the public, by denying its existence and targeting its foes.
  3. By accepting membership in the Council on Foreign Relations, he supplied dignity and cover to a key element of this conspiratorial apparatus, or what today might be called the shadow government of the deep state.
  4. He contributed to the undermining of the nation’s morality.
  5. He led Americans away from involvement in the kind of principled activism (a.k.a. any continuation of the anti-war, non-interventionist Old Right conservatism, such as flourished in the Robert Taft era).

If Mr. McManus has been able to give us a comprehensive account of Mr. Buckley, his ideology, friends, and actions, it is because, as a young conservative, he was a Buckleyite himself.  Initially having no alternative to the narrative introduced by National Review, which smeared the remnants of the Old Right, and in particular its revival in the organizational form of the John Birch Society, Mr. McManus was an enthusiastic “Fabian” conservative.  However, the providential arrival of a letter from a total stranger (in those days before the internet, when it was hard to canvass opinions beyond one’s circle or standard journalism) led McManus to question the spin which National Review had put on the distinction between “right-wing” and “conservative.”  Subsequently, McManus did his own investigations, which forced him to completely rethink the ambiguous ideology of William Buckley and embrace a principled philosophy of freedom.  This in turn led to membership and later leadership in his once-scorned but now beloved John Birch Society.

Now, in order to form a just estimate of William Buckley, such as McManus and others have attempted, one has to understand the context of the world into which this new “conservatism” (Buckleyite, Fabian, or just “faux”) emerged.  The Second World War had been a global victory which came at the price of weakening every domestic institution in America other than the state, and the conscience of the Old Right urged a return to something like a peacetime society and economy.  It was well understood, and not just by conservatives, that there was a natural alternation between times of war and times of peace, and that a condition of perpetual war was a recipe for tyranny.  True, there was the very real threat of Communism to be dealt with, but it had to be dealt with in such a way that the very institutions used to fight Communism did not replicate the evil they were designed to overcome.

However, the wisdom of turning America back into a normal society was not so easily put into practice.  The vast wartime tangle of bureaus and red tape (into which many actual “Reds” had insinuated themselves) proved easier to dedicate to new missions than to mothball.  Predictably, the same political party which had given America the New Deal was enthusiastic for the National Security State (activated by legislation passed in 1947), which perpetuated and legitimated all the essential wartime security and military apparatus.

This rapidly consolidating system was rightfully seen by many conservatives as “Orwellian” (a coinage of that era, since 1984 was written in 1948).  Moreover, for objectors the remedy was both obvious and Constitutional, i.e., “Throw the bums out!” and restore a peacetime, laissez-faire economy.  According to the myth of the two party system, that was the expected order of things, with frequent turnarounds in power both affirming the sovereignty of the people and harmonizing extremes of policy.  Around 1952, similar to the Trump election of 2016, enemies of the status quo envisaged that if their party won fair and square the “loyal opposition” would consent to a fundamental reorientation of national policy.  Alas, then as now, the concept of “loyal opposition” proved to be an oxymoron…if not an eight-fold idiocy!  Whatever the hardships and tragedy of the New Deal and the Second World War, the truly sinister development wasn’t triggered until, after a twenty-year hiatus, a Republican administration was finally inaugurated.  To the shock and dismay of genuine conservatives, rather than a return to normality, under Eisenhower the progress towards a managerial welfare/warfare state was affirmed and even accelerated.

It was at this juncture of history that William F. Buckley Jr. appeared in the forums of public life.  Initially, National Review shared the outrage of the Old Right, still smarting from the primary defeat of Taft, at the wholesale adoption of New Deal programs and apparatus by the nominally Republican administration which had replaced Truman.  McManus notes that…

In December 1957 Buckley himself scolded President Eisenhower for his sorry leadership.  During a forum in New York City sponsored by National Review, he excoriated Ike for having allowed the “problem of internal security” to grow “to a state far worse than that under Mr. Truman.”  Insisting that “Mr. Eisenhower must, inevitably, be repudiated,” Buckley lamented that he didn’t expect anything to be done because “Eisenhower does not take stands, except against [Senator Joseph] McCarthy and the Bricker Amendment [a stipulation that treaty law did not supervene US sovereignty].”  His remarks were later published in National Review.

Thus, early on in the editorial career of National Review, a policy line was taken which seemed indistinguishable from the base of the Old Right/Taft Republican movement.  However, as soon as these conservative bona fides were established, Buckley took a new tack, ingratiating himself with the left and center by taking a more establishment approach to the issues, and, most importantly, positioning himself on the acceptable side of the “right-wing extremist” vs. “conservative” divide.  Conveniently, the criteria for judging this distinction were largely devised by Mr. Buckley himself.  An initial omen of this strategy was McCarthy and His Enemies (1954), a book coauthored by Buckley on the anti-Communist investigator, an ostensible defense which contained so many unseemly observations of its subject and his cause that it diminished both.  By the early ’60s it should have been clear that Buckley had done a two-step: 1) appropriate the label “conservative” through his initial appeals to the Old Right, and 2) change the definition of “conservative” by stigmatizing most of the positions traditionally held by the Old Right.

It is important to remember that the Old Right (used here as equivalent to the anti-New Deal coalition) was a laissez-faire, generally anti-war, limited government movement.  It was not “right-wing” in the pejorative sense that subsequent political rhetoric has framed the term.  Significantly, such genuine rightists as existed in the America of the ’30s and ’40s seldom opposed the New Deal in principle.  The segregationist “Dixiecrats” were all aboard FDR’s gravy train, and the scattering of minuscule groups which sought to ape European fascism could only complain that the New Deal was insufficiently centralized, militarized, technocratic, paganized or dictatorial.

The making of a god

However, if one is positioning oneself as the ascending god of public opinion, it is not sufficient, though it may be necessary, to redraw a nation’s ideological cartography.  As McManus repeatedly points out in his criticism of Buckley, which is in fact a criticism of the way conservatives “do politics,” ideology is generally overrated as a ground of human action.  Contrary to whatever Richard Weaver may have intended, it is people, not ideas, who create political  consequences…at least in the short run.  To put it according to the myths of the old pagans, whether one is Oedipus or the King of Alba Longa, one must slay the god of the harvest if one wishes to establish a new religion.  In the case of William F. Buckley Jr., it was not enough to displace, disparage, and assume the mantle of a bloodless abstraction such as “conservatism” or the generic, and geriatric, “Old Right.”  As in days of yore, a living sacrifice was necessary.

Now it so happened that, preceding and shadowing the career of our Ivy League tyro was another man, a very different sort of fellow, a practical businessman and independent researcher, yet one who, in the technical definition of the anthropologist René Girard, might be reckoned as Buckley’s “double.”  That man was Robert Welch, who founded the John Birch Society in 1958.  Whatever the merits of Girard’s theories might be, it is said that in a mimetic universe (that is, a society populated by imitative creatures, which indeed sounds rather familiar) it is impossible for doubles to long coexist.  Buckley and Welch were doubles in the sense that one or the other was destined to become the rallying point of the conservative cause.  One or the other, not both.

To translate from mythic to political terms, an assassination was in order!  Fortunately for Welch, especially considering Buckley’s career in operational intelligence, assassination of character was deemed sufficient.  Welch, having eaten from the tree of the knowledge of good and evil (something Buckley was especially dedicated to preventing among his fledgling “conservatives”), was cast out of the paradise of polite company, and into the valley of wailing and gnashing of teeth.  Except that Welch neither wailed nor gnashed his teeth, but took his public stigma, or what Girard would call his “scapegoating”, with charitable fortitude.

Just as Girard’s mimetic theory would predict, it worked like a charm, this exchange of fates between Welch and Buckley.  McManus quotes Buckley biographer Judis on the potent effects…

Buckley’s attack on the John Birch Society also transformed him as a public figure.  He [Buckley] was no longer the pariah of the McCarthy days.  He was a public representative of the new conservatism that television producers and college deans could invite to appear without provoking an outcry.  Whether intentional or not, Buckley’s attack on the John Birch Society prepared the way for his own celebrity. (McManus p. 153)

[N.B., Pay attention to how “without provoking an outcry” appears, from the vantage of the present, on the forward side of a half-century historical parenthesis!  Intimidation of speech outside of the left’s allowed parameters is not a novelty of the post-Trump era, but has been a frequent academic constraint in both 20th and 21st century America.  Perhaps the intermission of good feeling and toleration was only due to “Fabian” self-censorship on the part of conservatives.]

The scapegoating of Welch and the new ideological cartography mutually reinforced and validated each other.  One doesn’t have to be a Harry Turtledove to imagine an alternative historical scenario, a world in which Welch did the scapegoating and Buckley became the sacrifice.  The major obstacle to the realization of this alternative universe was the basic decency and fair-play of Welch himself, who refused to be drawn into mimetic rivalry with fellow conservatives.  Welch illustrated his own attitude by prefacing his response to the scapegoating with lines from the poet Edwin Markham…

He drew a circle and shut me out–

Heretic, rebel, a thing to flout.

But love and I had the wit to win:

We drew a circle that took him in!  (McManus p.154)

Furthermore, the inverted ideological map of the alternative universe would actually make far more sense, with Welch positioned as the centrist and Buckley as “far right-wing.”  Most people at the mid-point of the 20th century would, setting aside propaganda, have regarded Welch as the solid “bourgeois” and Buckley as the scheming, effete aristocrat.  Indeed, it was this almost French Bourbon air of amorality and private immunity which gave Buckley much of his charm and influence.  And if such quirks of character were not enough to make one suspect that Buckley was far to the “right” of Welch, what about the secret societies, the espionage, the pornography and similar intrigue?  I won’t go into the details here, as McManus documents them extensively in his book.  However, it might be useful to take a synoptic glance at what McManus evidently considers Buckley’s most damning characteristic.

Barking up the Tree of the Knowledge of Good and Evil

To reiterate, Buckley made a sacrifice of Welch, thus becoming a divinity, the god of a new conservative movement formed in his own image.  However, there is a curse attendant on all mortals who pretend to godhood: they must sleeplessly patrol the bounds of their sacred groves against the onslaught of fresh rivals.  We may liken Buckley to the cherub charged with guarding paradise; however, the tree he was set to guard was not that of life, but rather concerned a very specific form of knowledge.

To be sure, Buckley was not against knowledge or intellect, and with the exception of one particular form of knowing, he was pleased to spread abroad all sorts of chatty information and innuendo.  This included exposure of the more outrageous left wing follies, and to this was added his police function as a maintainer of conservative standards of belief and decorum.  In short, he was smart, and he was on a mission to save America from its own stupidity, stupidity and error of such magnitude that it threatened to lose the Cold War and bring Western Civilization to an untimely end.  Nor was he against knowledge in the sense of “carnal knowledge,” and he had a Playboy interview to prove it.  That too was smart, in the sense of currying favor with “the smart set” of the ’60s.

Most significantly, as intellectual-in-chief, Buckley enjoyed the role of contrarian, stimulating all sorts of fascinating conversations by reversing conservative thought on key social and economic issues.  Should Richard Nixon have instituted wage-and-price controls?  Well, why not give it a try?  Contrary to everything which the Austrian school of economics had painstakingly demonstrated, namely that wage-and-price controls would sabotage production and exchange, Buckley felt that one had to be open minded on the topic.  Should the Supreme Court have had authority to determine whether abortion was murder?  Why not?  True, two thousand years of Christian teaching had already provided a clear answer to this question.  However Mr. Buckley, though a Catholic, felt that discussion on the topic needed to be opened up and freed from dogma.  In addition to abortion and price controls, Mr. McManus lists over a dozen “indefensible positions” (pp. 220-229) where Buckley either reversed the conservative stand or introduced moral ambiguity.  And should we have been surprised?  After all, settled doctrines don’t sell magazines or increase the ratings of televised talk shows the way that controversy and factional in-fighting do.

Yet for all his delight in upsetting the apple cart of knowledge, there was one angle which Buckley declared taboo.  With regard to American government policy, and to some extent other institutions of society, all investigation had to take place within the smart/stupid framework.  The alternative framework, the good/evil framework, was strictly out of bounds.  Any policy commentator who suggested that there was a conspiracy in high places actively engaged in undermining America’s best interests was just a dog barking up the tree of forbidden knowledge, and needed to be silenced.  These barking dogs were many, including not just Sen. Joseph McCarthy, Robert Welch, and J. Edgar Hoover, but ironically Buckley himself, together with the staff of National Review, prior to his apotheosis as the god of a new conservatism.  Yet as early as the mid-’50s it was clear that a new paradigm was taking hold.

In August 1956, at about the same time that FBI Director J. Edgar Hoover was warning of a “conspiracy so monstrous” that one “cannot believe that it exists,” Buckley offered his contrary view that America’s problems were occurring “spontaneously, not in compliance with a continuously imposed discipline.”  In effect, he was saying, “Don’t listen to Hoover, the House Committee, or the Senate Subcommittee.  Ignore even my own statement in McCarthy and His Enemies.  The bad that happens to our nation is the result of spontaneous stupidity, not orchestrated design.”  (McManus pp. 128-129)

Apart from questions of historical accuracy, why is this still a big deal?  Of all the trees in the political garden, why does the fruit of this one matter in a unique way?  Let’s pay attention to the observations of Mr. McManus….

Concluding that willful conspirators rather than mere bumbling do-gooders are at the root of such problems stimulates activity because of human nature’s most powerful instinct: self-preservation.  Most who decide that the disastrous transformation of America is the work of deliberate evildoers will do whatever they can to save their country, themselves, and their loved ones.

But those who become convinced that the damage being done results from well-intentioned mistakes will do little except grumble.  Even while witnessing the ongoing destruction, they will shrug their shoulders, continue working to keep their heads above water, and naively expect others in government and elsewhere to eventually see the error of their ways and take corrective action.

Today, as never before, many are willing to impute evil to their governing officials.  Unpleasant as this might be, it at least gives us grounds for reevaluating Buckley’s assessment that stupidity, and not conspiracy, was at the root of America’s ills.  Fewer and fewer people today would concur with this assessment; however, time and energy have been lost through distractions…not the least of these distractions being Buckley’s influence, an influence which both intellectualized and demoralized political discourse on the right.

Postscript on Intellectuals and Pseudo-Intellectuals

It was a balm to the pride of conservatives in the 20th century that thinkers on the left consisted not of actual, but of false or “pseudo”, intellectuals.  In contrast, Mr. Buckley and his cohorts could be trotted out as examples of the genuine article.  To be sure, Buckley and his friends were more erudite, not to mention amiable, than your average Weatherman.  However, in some ultimate sense Mr. Buckley was as “pseudo” as they came, and for a reason that should now be apparent: being a conscientious objector to the war against evil, he whiled away his time in the garden of ideas.

That is not to say that ideas cannot be serious.  However the number of people for whom ideas are central to existence is few indeed.  For Bill Buckley ideas were toys, baubles of the mind which could be entertained as hypotheses, not principles which compelled moral action.  How many of us can say that we deal with ideas in any other way?  Are we all not pseudo-intellectuals to one degree or another?  Perhaps that is our nature, the nature of those of us who are less than gods.  Perhaps it is good to be only a pseudo-intellectual.

Those who truly sought salvation in ideas have nearly vanished from the Earth: Plato, Plotinus, Hypatia of Alexandria, and later, during the Renaissance, Pletho and Pico before his conversion by Savonarola, and perhaps a few others.  William F. Buckley was not among their company, and neither was Jesus of Nazareth.  So in spite of old Bill’s long list of sins, which I have barely touched upon here, this speaks well for his soul, that he was not an intellectual in the absolute sense.  There is always hope.



Posted in Appologetics, Christianity, Constitution, Culture & Politics, Paleoconservativism, Philosophy, Politics, Traditionalism, Uncategorized

In Defense of “Man”

Posted by nouspraktikon on July 15, 2017

Not Even Wrong

Suddenly.

Not suddenly as you or I measure time, but suddenly according to the stately cadences of historical events, we have lost, if not yet our species, at least, and ominously, our name for it.  At some point in the not very distant past, “Man” vanished…not extinguished as an organism, but as an object of consciousness.  For where there is no name there can be no consciousness, and where there is no consciousness there can be no science.  Today there is no longer a science called Anthropology worthy of its name, for the name has been banished.  I don’t mean the entertaining science of bones and basket weaving and many other shining objects which is offered in college curricula as “Anthropology.”  I mean Anthropology in the most specific of species-centered meanings, inquiry into that simple question…“What is…what is…[bleep!].”  It is a question which can scarcely be asked today, let alone answered.

This masking of “Man” strikes me as an important development which deserves an extended and serious discussion.  To that end, some ground rules are necessary, concerning which I have some good news and some bad news.  Here goes, both at once: sex will not be mentioned in the course of this article.  I have no interest in whether the reader be sex-crazed or celibate, male or female or anywhere on the spectrum in-between.  I am only interested in whether you think this Anthropological murder mystery is worthy of your time and consideration.

If you concur, then the omission of sex and his/her ugly sibling “gender” is good news indeed, because these things are monumental and, I would argue, intentional, distractions from the difficulties involved in Philosophical Anthropology.  Those bad news bears,  non-adults who think sexuality is the central, nay exclusive, issue in life, can adjourn to their favorite safe space, the Reading Room on Gender, where they can reinforce their own bias among those vast collections of literature which are supplemented daily by our subsidized scholars and their media mimes.

Now to be sure, there are other rabbit paths leading away from the essential inquiry; it’s just that sex and gender are the most obvious, if not the most obnoxious, and hence need to be eliminated first.  However, those other anti-Anthropological rabbit paths, though less celebrated, become increasingly subtle as the core of the problem is approached.  In any subject, the task is hard enough when we have been force-fed the wrong answers…the real difficulties start when we realize that we started off on the wrong foot by asking the wrong questions.  Today, when we encounter the fundamental question of Philosophical Anthropology, to paraphrase the incidentally sexy but essentially humane Erin Brockovich, “…all we have is two wrong feet and damn ugly shoes.”  We don’t know “[bleep!]”…and the absence of the word doesn’t help.

If we wish to restore that lost science, it will prove necessary to go back and wrap our brains around that simple word “Man”, which was once the standard English term for the class of all human beings, much like its French equivalent “l’homme”, etc.  “Man” has long since disappeared out of scholarly, correct and polite language, which means pretty much everywhere, since in casual idiom, if we discount “Man oh man!” and similar oddities, the universalizing nomenclature of Philosophical Anthropology is worse than useless.  After all, you can tell a good joke about Poles, or rabbis, or priests, or homosexuals, or women, and yes, even about “men” qua the male gender, but it’s hard (short of aliens or the envious algorithms of The Matrix) to envision a “Man” joke.  However, while the comedians won’t notice, there might be a few instances where, for the health of civilization, the ability to have a word for the human species could come in handy.  From this, we can derive another important consideration: once “Man” has been abolished, it is unlikely to be missed by the broad masses.  The only people who are likely to be bothered are a few specialists in what it means to be a unique species, and these specialists are generally regarded as an over-serious, isolated and boring bunch.  Likewise, if the word “epidemic” and all synonyms for “epidemic” were outlawed, the only people likely to get in a panic would be epidemiologists.  Everyone else would get along quite splendidly…at least for a while.

To be sure, the abolition of "Man" and the Abolition of Man, as per the essay by C.S. Lewis, are not identical.  The latter concerns the weakening of the species, the former the loss of its name.  Indeed, the distinction between signs and things signified is another treasure which must be jealously guarded against the ravages of post-modernity, which is trying to slouch its way back towards a magical worldview.  Be that as it may, we can still surmise that in the defense of something it might prove essential to be able to speak about it.

On the other hand, we have to make especially sure we don't get lured down another popular rabbit path, a highly respectable path which nonetheless leads away from the Anthropological core: the path of language.  For example, we could easily lump this abolition of "Man" (the word) together with similar language "corrections."  Pointing out the absurdity of these corrections is the strategy of many conservatives, such as the British philosopher Sir Roger Scruton, who talks about the way that gender-neutrality reforms have "violated the natural cadences of the English language."  On an aesthetic level, there may still be some residual irritation at "people" (or similar substitutes) in lieu of "Man."  Yet, while this is good Edmund Burke-vintage common sense, it heads off in a trivial and logic-mincing direction, of the kind favored by British analytical philosophers and American word-pundits in the William Safire tradition.  It expresses a futile, rearguard hope that inane reforms, like the substitution of his and hers by "hez," can be reversed by a return to convention, or even mutual rationality.  Rather, the Postmodernist hordes are not likely to be stemmed by a grammar policeman, policewoman, or even policeperson holding up a gloved hand, shouting "Stop!"  It's not that the "reforms" can't be exposed as illogical and unappealing; it's that they are just the tip of the spear carried by acolytes in a far deeper struggle.

Whether or not the war over language is winnable, I maintain it is the war against Man (as a concept) which is primary, a battle with ideological motives rooted in the hoary past.  Call it a "conspiracy" if you will, keeping in mind that conspiracy is just popular philosophy prosecuted by cadres of minimally educated but highly motivated minions.  The generals in this conspiracy knew that they could not launch a frontal assault on Man (a.k.a. the human race), so they focused their attention on "Man," at first as a concept and then as a word.  The history of this war is better measured by centuries than by decades, and it has taken many a convoluted turn.  Hence my belief that contemporary Feminism is, at best, a secondary effect.  It is the Amazon battalion thrown into the breach of the citadel after the groundwork had been patiently laid and the initial battlefield secured.  That crucial battlefield was anthropology, and not what one is liable to think of as the field of anthropology, but its philosophical cousin, that key science of all sciences, namely, the "Philosophy of…[bleep!]…"

A good “Man” is wrong to find

One can admit something exists and is important without idolizing it.  There was all too much idolization of the human race after the Renaissance and building up to the Enlightenment, a period bookended by Pico della Mirandola's On the Dignity of [Bleep!] and Alexander Pope's Essay on [Bleep!], tomes whose style and economy have rendered them, perhaps mercifully, unreadable today.  In those days, whenever errant scholars ventured too far from the Pauline/Augustinian double anthropology of fall and redemption, it spelled trouble.  However, personal repentance generally put a limit to the damage which could be inflicted before the toxic juice of self-worship became endemic to society.  Mirandola befriended and was converted by Savonarola, that misunderstood Catholic puritan, while at least Pope never became the Pope, nor were his verses rendered into binding encyclicals.  Savonarola taught the early humanists the secret of Christian Anthropology, that Man is both sacred and bad.  For his tuition, and other causes, he was burned at the stake.

The last child and virtual apotheosis (that is, one "made into God") of the early modern period was Voltaire, whose hatred of religion was legendary.  None the less, even Voltaire had too much common sense to think that his animus towards Christianity could be transmuted into a new and living faith.  He noted that "It is easy enough to start a new religion, all you have to do is get yourself crucified and then rise from the dead!"  In recent years, the late Rene Girard documented Voltaire's insight with numerous case-studies, illustrating how most human religions originate in scapegoating, death, and subsequent apotheosis.  However, the wily Voltaire could see where all this was heading, and limited his disciples to the "cultivation of their gardens," i.e., the enjoyment of a quiet and restrained sensuality.  We might call this soft-core Humanism, or the humanism of the self.  This early modern Man-ism, which today is probably the most popular (albeit unconscious) religion on the planet, is little more than a recrudescence of old Epicurus, whose famous doctrine Paul once debated on the field of Athenian Mars.  At worst the virtues of this philosophy, such as conviviality, apolitical repose, refined aesthetics, etc., are disguised vices, vices centered on feelings.  Think of the stereotypical Country Club Republican of today's America.  Such people are pathetic, but not in any superficial sense of the word, since the purpose of their life is "pathic"…that is, to have feelings, high quality feelings.

Hard-core Humanism was a novelty of Voltaire's rival, J. J. Rousseau.  In contrast to the soft doctrine, here the object of action is the ideal of Man, not the feeling-satisfaction of individual human beings.  It was Rousseau who managed to transmute the Enlightenment's carping animus against Christianity into something resembling a true religion.  As the founder of this new religion, which has variously been termed Modernism, Humanism, Socialism, and much else, Rousseau should have found himself subject to the pitiless Law of the Scapegoat.  However, he eluded martyrdom, and not just because he died a natural death eleven years prior to the outbreak of the revolution he had inspired.  Rousseau's Man differed in important ways from both Christian and Renaissance conceptions, which were predicated on either a personal God or, at any rate, a hierarchy of beings of which the human race was but one link in the chain of existence.  Although initially disguised by Deistic code-words, the new religion lifted up Man as the Head of the Cosmos.  Since this Man was a collective, it was not expedient that any individual anti-Christ need suffer the Law of the Scapegoat.  If there were to be any suffering, it would only be in accord with the tyrant Caligula's wish for the Roman people: "If only they all had but one neck!"  In principle, the head which lifts itself too high gets chopped off.  Caligula himself proved no exception to the rule.

At all events, by the 2nd or 3rd year of the Human Revolution (c. 1793 AD) modern technology had outstripped antiquity, democratizing death and allowing Caligula's dream to come true.  The guillotine enabled the disciples of Rousseau to liquidate the old political class en masse, and then, in a predictable turn of events, those disciples themselves mounted the scaffold, suffering a kind of mechanical crucifixion to the god whom they had lifted up, Man.  It was a collective crucifixion to a collective god, for this "Man" was not the same as in the soft Humanism of Voltaire, which was just a category designating a collection of individuals.  Rather, this post-Rousseau "Man" was, if not quite a concrete organism, at least cohesive enough to have a single will, a doctrine as lethal as it was democratic.

The carnage of the Revolutionary/Napoleonic period was not repeated in Europe until 1914 and thereafter, when great quantities of men and women again began to be killed as a consequence of political and military action.  Here we would like to inquire whether this carnage (lit. carnal death) was in some sense related to the death (or life) of an abstraction.  Is there a relation between the death of humans and the death of "Man" as a concept and a word, and if so, is that relation positive or negative?  The example of the French Revolution would seem to caution us against a laudatory Humanism, on the suspicion that the higher the ideal of "Man" is lifted up, the more human individuals are likely to be subjected to political violence.

At this point in the argument, however, such a conclusion would be premature.  The period between the exile of Napoleon and the shooting of Archduke Ferdinand in Bosnia, which saw relative calm in European politics, was conversely the period which witnessed, for good or ill, a wholesale revolution in the popular concept of "Man" under the impact of Evolution, Marxism, and Psycho-analysis.  However, none of these epicenters of scientific upheaval was directly concerned with Anthropology, at least Philosophical Anthropology; rather, they were centered on the cognate disciplines of biology, economics, and psychology.

More to the point, none of these revolutionaries set out to solve the problem, "What is…[bleep!]…"  However, others took up that now forbidden question, and we should try to pick up their tracks from where they left off in the tumult of 19th century thought.

Philosophical Anthropology: The Conspiracy Thickens

Today if you mention "Illuminism" it is likely to conjure up secret societies, occultism, and political skulduggery, critical investigation into which is no doubt important and proper.  However, in the literary salons of Europe and America during the 1840s and 1850s, Illuminism had a second, though in all probability related, meaning.  It referred to the then-novel research which today's theologians refer to as the "Higher Criticism."  If you know about, say, the "Jesus Seminar," then you pretty much know what Illuminism a.k.a. "Higher Criticism" was, except that the contemporary Seminar is pretty much an isolated rehashing of themes which were treated with greater plausibility and seriousness 170 years before.  Those earlier 19th century critics of religion were advancing along the front of a broad intellectual movement which was in the early stages of transiting from spiritualism to materialism.  The cynosure of the movement was Germany in the years following, and in reaction to, the death of the philosopher G.W.F. Hegel.  To simplify a very complex way of thinking, many people of that time had accepted Pantheism, the idea that the universe and God are the same thing.  But most people are not very quick on the uptake, and are willing to sign on to a belief system before they grasp all of its correlative implications.

Thus, many a happy Pantheist, circa 1840 AD, was surprised and saddened to learn that their system no longer permitted them to believe in the personal divinity of Jesus, whom they had hoped to retain as a spiritual hedge in spite of their infidel inclinations.  They should have figured this out from reading Hegel, but it took the shock treatment administered by some young, radical, German intellectuals of the time (a.k.a. the Illuminists, Higher Critics, etc.) to rub the noses of these au courant ladies and gentlemen in the compost of atheism.  After a halfhearted embrace of Pantheist ambiguity, some among the elite classes of Europe were again courting hard-core, Rousseau-vintage Humanism, very much along the lines of the original French Revolution of 1789, albeit the European political revolutions of the 1840s didn't amount to much.  This time, humanism broke out with more scientific rigor and less heartfelt enthusiasm; "Man" was made the vehicle of those hopes and dreams which had previously been invested in God.  Moreover, the unprecedented technological progress of the times was conducive to putting faith in human works.

Yet those works, splendid as they might be, begged the question of the nature of their creators.  What was the essence of Man?  Or as we would say today, "What is the essence of…[bleep!]?"  Amazing though it might seem in retrospect, some people of that era actually took the time and pains to ask the Anthropological question.  The man who best serves as archetype of those questioners, actually proposing and discarding several solutions over the course of his life, was the German philosopher Ludwig Feuerbach (1804-1872).  One thing that can be said of Feuerbach, even if we dismiss him as a serial wrong-guesser who justly earned posthumous obscurity, was his persistent and scrupulous engagement with the Anthropological question.  His best remembered quote, "You are what you eat!" might ornament a nutritionist more gloriously than a philosopher.  Yet we must consider that, as a thinker, he was an anvil and not a hammer, pounded left and right by forces which were not just making Modernity but shattering the classical mirror of Man (better known to us as "bleep!").  Feuerbach's lifetime bracketed an epochal turn in human self-definition, a turn which Feuerbach didn't initiate so much as chronicle.

Therefore, meditate on the chronological sketch below and notice how the turn from Anthropology to anti-Anthropology transpired in the space of a specific, species-haunted generation.  I know this narrative will be easy to dismiss as a curmudgeon's rant on "the origins of the left," but only if you visualize the broad movement behind, and independent of, individual intentions will you grasp its Anthropological significance.  In spooky confirmation of a simultaneous and universal (or at least pan-Western) turn of thought, the history of early Positivism could be adduced as a development in synchronicity with Idealism, in this case with the decapitation of Man conducted by French, and allegedly "conservative," social scientists from Auguste Comte to Emile Durkheim.  But I rather prefer the bold and brooding history of Anglo-German radicalism.

1804  death of Immanuel  Kant, birth of L. Feuerbach

1807 Hegel publishes his Phenomenology, consciousness posited as the motive force in the history of the world, subjective (individual) consciousness conditioned in a "dialectical" relationship to objective (collective) consciousness.

1818-19 Lectures on the History of Philosophy, S. T. Coleridge introduces German Idealism to the English reading public, slowly Idealism will replace the reigning Scottish “common sense” philosophy in the English speaking world.

1831  death of Hegel

1835 The Life of Jesus, by D.F. Strauss

1841 The Essence of Christianity by Feuerbach

1854 The Essence of Christianity translated by George Eliot

1845 Karl Marx, Theses on Feuerbach, critical of the objectivity and lack of political engagement in speculative Anthropology

1847-48 Revolutions in France and central Europe

1848 The Communist Manifesto

1851 The Great Exhibition in London, popular vindication of applied technology over philosophical and scientific theory

1854-56 Crimean War (the only major European war between 1815 and 1914); Florence Nightingale, and the progressive transfer of humane care from family and church to state

1859 Charles Darwin, The Origin of Species, natural selection adduced as motive force in natural history

1860 Essays and Reviews, English theologians embrace the methods of Higher Criticism

1861-65 American Civil War, first modern "total" war

1867 Marx, Capital vol. 1 published

1871 Charles Darwin, The Descent of Man

1872 Death of Feuerbach

Note that at the outset Man was the All-In-All, but at the end of the period, not even the child of a monkey; rather, the scion of some anonymous animal.

In The Essence of Christianity Feuerbach attempted to equate God with “good.”  In his view all the things which were posited of a Supreme Being were actually virtuous attributes of the human species-being.  Justice, mercy, love, fidelity, etc., were human characteristics, which had been mistakenly projected on to an alienated figment of the collective imagination and deified.  However, and here’s the rub, the human individual had no more ultimate reality than God.  Feuerbach’s Man was not men, or men and women, or even people, but the species as a collective.   Individuals were mortal but the species was immortal.  Man was God, Man was good, and Man would live forever.  At the time it seemed like a grand faith, a devotion to something tangible which might give meaning to the limited and fragile life of individuals.

Feuerbach’s intention was  to make a smooth transition from the crypto-Pantheism of Hegel, to a less infatuated, more earthy, Humanism.  Yet  his critics were were more likely to see this continuity with idealism as contamination by unrealistic nonsense.  As thinkers more cunning and sanguinary than Feuerbach were quick to point out, this alleged Human species-being never managed to will anything concrete and  unanimously, but rather, all real  history has been the history of antagonistic groups engaged in fratricidal strife.  For the critics, the ultimate meaning of history was far better illustrated by victorious parties dancing on the graves of the defeated than a universally inclusive chorus singing Beethoven’s Ode to Joy.  According to Karl Marx the antagonistic parties were economic classes, and to some extent nations.  Today we would add genders, races, religions, and even sexual orientations.  Under fire from its radical critics, Human species-being quickly melted into the solvent of class analysis.

Small wonder that Marx happily discarded Feuerbach’s anthropology for the naturalism of Darwin, at one point seeking (and being refused) permission to dedicate Capital to the British naturalist.  Darwin’s system was founded on the assumption of conflict and competition, not the deduction of human from divine virtues.  Feuerbach continued to revise his system in the direction of increasingly consistent materialism, but was no longer in the forefront of a generation which had jumped from philosophical speculation to natural science, now that the latter was backed up by the prestige of  rapidly developing technology.

More significantly, the capital which Darwin did not endorse was the capital M in Man.  In classical anthropology Man had been one of the primordial kinds, as in Spirit, Man, Animal, and Mineral.  Naturalists from Aristotle to Buffon had recognized that, qua organism, the human body was akin to other mammals, and especially to apes and monkeys.  However, in a consistently despiritualized science, the one human species was no longer set apart from the myriad of other animals, but rather fell under the same biological and ethological constraints as any other organism.  This reduction may have deeply bothered Darwin personally, but as a scientist he never really posed the Anthropological question the same way that Feuerbach had done; rather, he was resigned to viewing homo sapiens as a single object within the purview of natural science.  In spite of the title, after The Descent of Man, Man ceased to exist as a problem for natural science.  Or more precisely, from a Darwinian point of view, Man, as a unique aspect of the world, had never existed to begin with.

From Man to “Man”

We began by hinting that the loss of "Man" was a harbinger of the death of our own species.  After some clarification we can now understand that the situation is rather worse than we had initially feared, in that, conceptually, Man was killed off sometime in the middle of the 19th century, while "Man" (the word) actually survived the concept by more than a hundred years.  To maintain clarity, we must remember that there are actually three deaths.  First, the death of the concept; second, the death of the word; and third, yet to happen, the actual species extinction of homo sapiens.  That the third death is yet to happen should not imply that it necessarily will; it is only a hypothesis.  None the less, the three deaths are cognitively related.  In particular, the death of Man (the concept) at the hands of Darwinism is strongly associated with the putative mortality of the species.  If Man is subject to species extinction, as are all organic taxa according to the laws of natural selection, then Man cannot be considered a primary aspect of the world.  As an analogy, consider the concept of "states of matter," which are generally accepted as uniform, or at least ubiquitous, aspects of nature.  If, say, all liquids could disappear from the cosmos, it would put the schema of "states of matter" in serious doubt.  Something of that nature is what has happened with Man, due to the anti-Anthropological turn circa 1860.

Now, would it be too wicked for me to suggest that while Man is not a "species" in the same sense that Felis domestica is a species, none the less Man bears an uncanny resemblance to the cat, that enigmatic creature of the proverbial nine lives?  Not only did the word "Man" persist far longer than one might have expected, but Anthropology entered a period of great fruition after the death of Darwin.  Here I'm not referring primarily to what people ordinarily think of as "Anthropology," the post-Darwinian people-within-nature paradigm which covers everything from bones to basket weaving.  Be aware that, just as in politics, where the nomenclature for everything gets twisted around to its opposite, and we are now forced to call socialists "liberals," in similar fashion those post-Darwinian scholars who no longer believe in a human essence are liable to call themselves "Anthropologists."  In fact, they are mostly anti-Anthropologists who just want to study the secondary attributes and accidental properties associated with human beings.  Granted, there is nothing intrinsically wrong with that, and on the whole these so-called Anthropologists are not a bad lot, being no more consistently anti-Anthropological than the other professionals who have inherited scattered fragments among the human sciences.  If the so-called Anthropologists have any besetting sins, those would be: 1) they stole the name away from genuine Anthropology; 2) some sub-schools were virulently anti-cognitive, for example the ethnologist Franz Boas, who never saw a theory that he didn't want to grind down into a powder of facts; 3) others, notably the Structuralists, were hyper-cognitive, and sought to gin up a Theory of Everything based on some attribute (usually kinship or language) of human thought or behavior.

The anti-Anthropologists who called themselves “Anthropologists” loved “Man” (the word).  After all, it was their schtick, and made a nifty title for textbooks, even textbooks written by sophisticated Darwinians and Marxists who knew that human species-being had gone out of fashion with Feuerbach.  In the meantime, anything on two legs with an opposable thumb would do, and it was all great fun until Feminism put the kibosh on that particular branding.  None the less, so-called  “Anthropology” took the ban on “Man” in stride, since their usage of the term was based on a consistent nominalism, if not on a conscious memory of the anti-Anthropological roots of modern natural science.  Fortunately, due to the exclusion of classical languages, undergraduates could still take “Anthro” and not worry their heads that banned “Man” had never meant just  andro…indeed, that it had meant much more than both andro and gyno put together.

Yet, I wanted to mention the 20th century miracle of Anthropology, not so-called "Anthropology" but genuine Philosophical Anthropology, as it flourished after, and in spite of, the anti-Anthropological turn of the previous generation.  If I thought that Man were a mere species and not an attribute of Created Being, my inclination would be to classify it somewhere within the family Leporidae, as a mammal with a capacity for making unexpected intellectual leaps, and multiplying thoughts faster than other species can reproduce their genes.  To that end, what great broods have been spawned, not just among the anti-Anthropologists, which is only to be expected, but even among genuine Anthropologists during the 20th and even 21st centuries!

Now remember, when I heap praise on the battered remnants of genuine, philosophical, Anthropology, I’m only lauding them for asking the right question, namely: “What is…[bleep!]”  And by now you understand what “bleep!” is and that a Philosophical Anthropologist is one who would know and say that “bleep!”=Man, and that possibly we should even come out and say “Man” when we mean Man.  I am not saying that many, or even any, of these Anthropologists have answered the question correctly, although I think there is an answer, and that some have made a closer approach to the correct solution than others.  Naturally I have my own views, but I would consider anyone a legitimate Anthropologist who asked the question aright.

There are schools of Philosophical Anthropology of every description.  Some are religious, some are frankly atheistic, but even the most starkly atheistic Anthropologists demur from post-Darwinian naturalism in positing something unique and essential about the human race.  In that sense, all Anthropologists, from atheists to Christians, are tendering a kind of "minority report" against the consensus view of modern science and society.  An atheistic, but genuine, Anthropologist might posit that the human race has a unique responsibility to conserve the cosmos and bring it to its best potential.  Countering this, the consensus view would maintain that such an assertion is errant nonsense, an arbitrary projection of human values into the unthinking and unthinkable void.

In a brief treatment, it is impossible to do more than allude to all the speculative "minority reports" which have been filed by Philosophical Anthropologists against the hegemony of post-Darwinian naturalism.  No doubt many of these speculations have been wrong-headed, but they have at least kept a window open to world-views outside the standard narrative.  If I had to pick a representative of the type it would be Max Scheler (German, d. 1928).  Feuerbach's anthropology began with materialistic idealism and sloped inexorably down to idealistic materialism; Scheler's thought, however, described a parabola, which at its height sought the divine in Man.  Personality, both Divine and Human, was arguably Scheler's main concern; however, his reluctance to deal with the limits imposed by a temporal creation, as per the Judeo-Christian scriptures, subordinated individuality to the vague infinity of deep time, a dilemma similar to that encountered by the ancient Gnostics.  Abandoning his initial, and intentionally Christian, viewpoint, Scheler made the alarming discovery that, in precluding a personal God, the amoral instinctual urges of the Cosmos were far stronger than any principle of spiritual form or sentiment.  The intellectual public in Germany and beyond, repelled by such otiose metaphysics, embraced existentialism, a doctrine which gave up on the reality of anything but individuals.  Anthropology once again retreated to the shadows.

In retrospect, Feuerbach and Scheler seem like tragic figures who lifted up Man, in one or another guise, as a god, only to see their systems crushed down by more consistently nihilistic doctrines.  However, it is doubtful whether their contemporaries saw the loss of Anthropological hegemony as something to be lamented.  Rather, they were relieved to be unburdened of Man, just as they had greeted the earlier, and logically prior, "death of God" with satisfaction.

The return of Man, and the return of “Man”…which, both or neither?

The operational assumption is that people can get along perfectly well without a conception of their own species occupying a special place in the system of the world.  Underlying this assumption is the more fundamental axiom that the natural science narrative is our default outlook on the world.  After all, it's "natural," is it not?

However the “minority report” of Philosophical Anthropology raises the  specter of a completely different world, a world in which the unique bearers of the divine image have been persuaded that they are but one of a myriad of animal species.  By this account, the conceptual framework of natural science within which the image bearers were circumscribed, was not so much a “discovery” as the imputation of a belief-system.  From this perspective, it is naturalism, not the classical Man-centered cosmology, which is fabulous.  To get the masses of humanity to believe such a deflating fable in the course of a few centuries, has been a superbly effective triumph of propaganda.  Although we have some hints as to who has disseminated this propaganda, the question of in whose interest it was disseminated remains enigmatic.

Within the English-speaking world, the banner of the old classical Anthropology (Christian or secular) was "Man."  The banner was not furled up until long after the cause was lost.  Yet the banner itself was essential, so essential that the high command of anti-Anthropology decided to send in the Amazonian battalion to haul it down under the pretext of the gender wars.  Lost in the confusion of that particular skirmish was the deep import of having a proper name for that key nexus of Creation through which the Divine, ideally, was to communicate its dominion over the visible world.  "People" is more than just an innocent substitute for "Man," since, being a plural, it serves as a pretext for importing the entire philosophy of nominalism into the human sciences.  Nominalism views entities (you and me and the cat and the carpet) as capable of being grouped into any category which happens to be convenient.  Whose convenience?

It can be safely inferred that this is a view well suited to those who want to abolish the boundaries between species.  Perhaps now the reader can see the relevance of all the preceding esoteric Anthropology, for looming on the event horizon of our world are a thousand crises brought about by the relation of the human to the non-human.  Indeed, we are conjuring up new categories of non-humans day by day: AI and aliens, robots and Chimeras, not to mention all those entities of the natural and spiritual world who are ancient in human lore.  I eagerly await the rebirth of the "dinosaur" from its amber-encased DNA.  Or will it be a dragon?  Names make a difference.

None the less, we proceed without caution, for the night-watch has been relieved of its duties as the evening of human history encroaches.  Isolated voices cry out, “There may be a problem here!” and anxiety is ubiquitous, but few are willing to “get real.”  This is not an accident.  The “real” tools, nay, the “real” weapons with which we might have fought were long ago taken away and beaten, not into plowshares, but into the bars of zoological confinement for what remains of the dignity of Man.  The “real” tools were realistic in a properly philosophical sense, exalting created kinds as the unalterable building blocks from which God created our world.  Such was Man.  Hence the necessity of having a personal name for the species.

Will Man come again?  I think so, but more on the basis of faith than calculation.  In the meantime others look towards a rapidly accelerating future, and begin to realize that “Nature” is hardly a better idol than secular Man, that the sense of “nature-in-itself” is an illusory effect of what psychologists call normalcy bias.  None the less, something is approaching, we know not what.  Intellectuals call it “the end of history” while technologists speak of “the singularity.”  Most just ignore it, but it will come nonetheless.

Suddenly.


Posted in Anthropology, Art, Christianity, Culture & Politics, Esoterism, Evolution, History, Paleoconservativism, Philosophy, Politics, Traditionalism, Uncategorized | Leave a Comment »

From Ike with love: The Age of Deception (1952-2016)

Posted by nouspraktikon on July 5, 2017

Nothing has changed except our point of view, but that counts for something

It is easy to think, as the left continues to overplay its cards, that something significant has occurred, and that our trajectory towards an Orwellian future has accelerated.  On the contrary, the Trump victory has triggered a new gestalt in people's minds.  By 2017 fairly average people can see what only hardened conspiracy theorists were willing to hypothesize as late as 2015.  Whether or not we are at the beginning of a new era, for good or ill, is a matter of conjecture.  Indisputably, we have taken our leave of a period in political history which will prompt nostalgia in everyone but truth-seekers.  While it was hardly an era of good feelings, it was held up by its laureates as a time of consensus, or at least bi-partisanship.

Rather, it seems better to call our recent past the Age of Deception.  The Great Deception consisted in draping a de facto one party system in the vestments of a two party system.  If you had said this in 1965, or 1975, or 1980, or 1994, or 2001, or perhaps even 2008…most people would have called you an extremist.

However somebody, somebody who thought extremism in the cause of truth was no vice, had already pointed this out as early as 1958.  Sure enough, his opponents, and they were legion, labeled this man a slanderer, effectively burying his work from the sight of the general public, first using savage opprobrium, subsequently silence, and at last retrospective ridicule.  The man was Robert Welch, and the "book" he wrote, initially penned as a private circular and later published as The Politician, targeted none other than President Dwight Eisenhower as an agent of communism.

Then as now, to the half-informed mind of the general reading public, such an allegation was patently absurd.  Eisenhower was venerated as a war hero on the basis of his direction of the Allied war efforts in Europe.  Now admittedly, there are a number of ways to think about the "heroism" of strategic commanders as opposed to direct combatants, but generally, if the war is won, the public will grant the former a triumph and allow them to retire in luxurious obscurity.  "Ike's" not-so-obscure military retirement consisted of becoming President of Columbia University.  After that, for reasons most people are rather vague about, he was drafted to become the Republican candidate for another kind of presidency, nominated over Sen. Robert Taft of Ohio, the last champion of the "Old Right."

After that, we usually go to sleep in our American history class until it is time to wake up for Kennedy.  Indeed, this might be a kind of clue that something is amiss in the standard Eisenhower narrative, like the barking dog who falls strangely silent in the dead of night.  How many books, popular and scholarly, are published each year about JFK in comparison to good old “Ike” (even subtracting those chillers which focus entirely on Kennedy’s murder)?  I doubt that a ratio of a hundred to one would be far off base.  Either America’s political ’50s were incredibly boring, or there is a story which, in the view of some, were best left untold….

A few history mavens might even remember that “We…(presumably all Americans)..like Ike”…because (warning, redundancy!) he was “…a man who’s easy to like.”  And furthermore, as the campaign jingle continued with mesmerizing repetition…”Uncle Joe is worried, ’cause we like Ike!”  Of course, if Mr. Welch was anywhere close to on-target in The Politician, “Uncle Joe” a.k.a. Joseph Stalin had little to be worried about, at least in regard to Dwight Eisenhower.

If you are skeptical that “Ike” could have been a communist front man, then I can sympathize with you.  Frankly, I was skeptical myself…indeed, everybody has a right to be skeptical of startling claims.  On the other hand, if you think that it is disrespectful to raise the issue of presidential duplicity at all, then you are on shaky grounds.  You are on especially shaky grounds if you happen to be one of those people who think that our sitting president was sponsored by (today’s post-communist) Russia.

You see, after 2016 everything has changed.  Whether or not Mr. Welch’s claims regarding “Ike” can be vindicated, at the very least we are now in position to read The Politician as an objective historical account.  The Politician is a strong and scholarly witness of an already forgotten time, one that now can, and should, be approached without bias or malice.

Why Robert Welch didn’t “like Ike”

It is an uncomfortable but inescapable truth that once certain things come to one's attention it is impossible to "unsee" them.  There is a shift in perception which renders impossible any return to "normal," however rosy that mythical past might have been.  For example, a beloved though eccentric uncle can seldom be restored to a family's unguarded intimacy once he comes under suspicion of pederasty, and rightly so.  Likewise, the image of Eisenhower would be shattered, not so much as war hero, but as the epitome of a stable, normal and normalizing politician, were he to be exposed as a willing agent of communism.  Conversely, just as the suspect uncle would insist on due process, even if he knew himself to be guilty, the upholders of the Eisenhower legacy are apt to clamor for ironclad proof of what, according to mainstream historiography, would be considered an outrageous accusation.

Sadly, for the reputation of Eisenhower and our national narrative, the claims of Mr. Welch are well documented, coherent, detailed, and were compiled by a contemporary who knew the American political class of the 1950s like the back of his hand.  If you wish to keep Eisenhower in your pantheon of heroes, read no further.  If, on the other hand, you would like to see the claims against him substantiated, read The Politician.  Here, I can only provide a brief, albeit damning, sampling drawn from Mr. Welch’s researches.  Therein he documents the following egregious policies which were either authorized or enabled by Eisenhower:

*Even in his role as allied commander, the fountainhead of his public esteem, Eisenhower was allegedly (The Politician provides graphic details) complicit in the nefarious Operation Keelhaul, a rendition program which forcibly repatriated ex-Axis agents collaborating with the American forces to their home countries behind the iron curtain.  This eliminated numerous sources of “worry” for “Uncle Joe.”

*Eisenhower was instrumental, as President of Columbia University, in pushing that already left-leaning institution further in the same direction.  He continued to associate with and hire left-wing and communist front faculty, procuring for them teaching/research endowments.  Again, the allegations in The Politician have been strengthened in the light of subsequent events.  Just ten years after the publication of Welch's Eisenhower exposé, Columbia University erupted as an epicenter of the spreading "new left" movement of the '60s.

*At the heart of The Politician’s allegations is “the politician” himself.  Prior to Eisenhower’s nomination as a candidate for president on the Republican ticket, all of his political associations had been with the left-wing of the Democrat party.  This is perhaps the most uncanny aspect of Eisenhower’s career, and the one most germane to the establishment of a faux two-party system beginning in the ’50s.  The only fig leaf concealing this duplicity was the absence of any prior political office holding (Democrat or Republican) by the President-to-be.  Again, historical retrospect adds, if not new facts, new irony to the narrative of The Politician.  Our current presidency is commonly considered exceptional, if not down right illegitimate, on grounds that Mr. Trump held no prior office and was not sufficiently initiated into the mysteries of the political class.  In the light of Eisenhower’s prior example this current “exceptionalism” can only be caviled at by those who either 1) adhere to the dangerous proposition that generalship is a political office, or 2) are willing to admit that such rules can only be broken on the left.

*Once inaugurated, President Eisenhower continued the policies of FDR's New Deal.  Indeed, programs and bureaucracies which existed only in embryo in previous administrations were fleshed out, expanded, and duplicated.  The agricultural sector is typical, and just one of the many that Welch enumerates.  Amazingly, farm subsidies swelled to half of farmers' revenue, a fact of which "Ike" was very proud.  Moreover, unlike FDR and the Democrats of the '30s, these programs were not justified as "emergency" measures, but were considered a permanent and "normal" restructuring of the relation between the public and the private sector, i.e., de facto socialism.  This was enabled by the collapse of any meaningful two-party opposition due to the alliance between left-wing Democrats and the establishment Republicans who backed Eisenhower.  The monolithic bureaucracy, exemplified by the Department of Health, Education, and Welfare, long resisted by the "Old Right," was institutionalized under the faux two-party consensus.  Hence the public sector actually saw a spurt of growth in terms of employees and expenditure in the transition from Truman to Eisenhower.  Consequently, the national debt rose at a rate several times higher than even the Democrats had been willing to incur.

*As shocking as many of the above allegations might seem, the most controversial aspect of the Eisenhower administration was its acceptance and further entrenchment of the post-WWII National Security State system inaugurated under Harry Truman.  This has to be remembered both in conjunction with, and contrast to, the only quote that most people today are likely to associate with Dwight Eisenhower, namely, his "prescient" warning against the dangers of the "military industrial complex."  This utterance was prescient only in so far as Eisenhower was speaking prior to the Vietnam debacle, after which such forebodings became commonplace.  To the best of my knowledge Mr. Welch doesn't reference this quote, which dates from after the initial redaction of The Politician, though before its later editions.  However, Mr. Welch frequently draws attention to rhetorical gestures made by Eisenhower through which he exculpated himself from responsibility for his suspect policies by seeming to condemn their inevitable negative consequences.  Thus he might condemn "galloping socialism" while rapidly expanding the public sector.  Seen in this light, we might take Ike's warning against the "military industrial complex" to heart, while doubting the speaker's innocence of the very thing he condemned.

Does this “Ancient History” even matter?

The short answer…yes, it does.

You might recall a scene in Star Wars where Luke Skywalker asks Yoda about the future.  Yoda answers, "A strange thing the future, always in motion it is…"  In a sense the past is also in motion, shaped by the interpretation given it by the present.  Yet it would be too great a concession to the irrational forces of our times to say that this was a real, and not an apparent, motion.  The past must be uncovered, not invented…although the temptation to invent myth is strong.

There is always a strong mental resistance to meddling with any society’s pantheon, or in more American terms, we might say, tampering with Mt. Rushmore.  In Mr. Welch’s day, The Politician seemed rude to the point of slander, while today it seems impious.  We might say “only” impious, when actually it’s the primal sin.  Mr. Welch mentioned something nobody was supposed to notice.  That’s impiety.

Or is it?  Note another odd thing about the Eisenhower myth, that there is no such myth!  Somehow or other Eisenhower has eluded both the pantheon and the rogue’s gallery of American history.  If the entire history of the Presidency during the ’50s elicits very little commentary, is that because the whole period was boring?  Hardly.  Rather, might not such a presidency be likened to a constant background noise, or better yet a universal solvent…the purpose of which is to set the standard of normality for “the end of history”?

Today we have come out the other end of “the end of history.”  Not that we really know how things will end, or for that matter continue.  All we know is that, for the first time in a long time the carefully scripted design for the future has suffered a setback.  The planners, whoever and whatever they may be (though from a galaxy far away I think they be not!) are in disarray and many things are back on the table which once were considered “settled.”  This may be a good thing, it may be a dangerous thing, and most likely both, but this is where we seem to be at present.

Consequently, under today’s conditions, reading, and taking seriously, the thesis in Mr. Welch’s The Politician, is no longer an act of impiety.  It is an essential measure of the road which we have traversed through the land of manipulated consensus.  Having finished that journey, we can look back at the trail-head, take stock, and get a new perspective.  However, in contrast to the fantasies of the “progressives” no perspective is better just because it is newer…only if it is truer to realities which transcend perspective itself.  Furthermore, to get at those realities one has to crunch a lot of historical data, and there is a lot of data to crunch, most of it rather unpleasant, in The Politician.

Only those with a deep urge for enlightenment need apply to the task.


Posted in Constitutionalism, Culture & Politics, Economics, History, Media, Paleoconservativism, Politics, Uncategorized | Tagged: | Leave a Comment »

Slouching towards the Post-Legal Society (Introduction: “The Beast”)

Posted by nouspraktikon on June 23, 2017

Cultural Marxism:  From show trials to no trials

If property is the proverbial nine points of the law, it is not surprising that Marxism, its frontal attack on property having stalled out (NB: ideology aside, we all like our “stuff”) would have eventually gotten around to launching a second front against law itself.  The total annihilation of law never succeeded with Communism Classic (Stalin’s version), since the Soviet state needed a judicial apparatus to highlight its superiority to “bourgeois law” …not to mention providing a half-way house on the way to the Gulag.  The nightmare of totalitarianism having been quietly put aside, if not entirely exorcised, we have emerged into the glaring, and presumably lawful, light of the Global Village.  Or have we?

Today, the legal "reforms" of the (allegedly) defunct Soviet state are held to be little more than antiquarian curiosities.  However this does not mean that "bourgeois law," a.k.a. classic legal principles of the Civil and Common law, has triumphed throughout the world.  Rather, the struggle against law has gone underground, or rather above ground and hidden in plain sight.  It dares not risk exposing itself, and therefore avoids clear opposition to the institution which makes civilization possible: Objective Law.  Since it eschews both thesis and antithesis, running for the dense cover of ambiguity, it must be tracked like a beast…by locating and examining its spoor.  We know not what it is, but like W. B. Yeats, we can at least pose the question…

And what rough beast, its hour come at last

Slouches towards Bethlehem to be born?

But at least we have a track, where the beast has digested large swaths of civilization's foliage and left us a species-specific excrement where form has been neatly reduced to matter.  If we can track down the spoor-dropper, perhaps it can be slain.  Or perhaps not.  But at least we may come to know who, or what, our adversary is.

Antinomianism

We must pick up the beast’s trail in the foothills of religion, and especially false religion.  The journeyman tracker will think that we have found the beast itself, and with a gleeful cry of “Antinomianism! Antinomianism!” presume that they have him treed, when in fact it is just a spoor, albeit very a significant find.  Actually the beast has moved on to an entirely different part of the forest, since the “true” false region of today is not a religion at all, but science, or rather scientism.

However there are enough who still believe in ersatz-Christianity to cloud the contemporary scene with a subtle contempt for law.  This is an Oedipal Christianity in which the God of Law is slain by the Son of Love, a doctrine preached by a vague figure named Jesus something or other.  Scientifically this is supposed to be Yeshua ben Yosef, but it really doesn’t matter, since this ersatz-Christianity has been purified of all but universal truths which all good natured people ought to be able to agree to.  Among these is that law is mean and should be dispensed with in favor of good will.

Yeats was assuming that the reader of his poem knew that he was talking about the “Antichrist.”  However if we get too hung up on the idea of the Antichrist being an ugly, brutal, beast then we are likely to be deceived.  Granted, there are many cults which like to dress up in spandex costumes, going about sporting horns and tridents.  They may even enjoy frightening middle-class people on Halloween and sundry sabbaths with their clownish antics.  But this is all an exercise in misdirection.  Such cultists may be “anti-Christs” but not the final beast who arrives at the end of history. The real threat to our spiritual well being doesn’t come from avowed nihilists who dance around impersonating a cartoon Satan.

The real threat comes when the world-system (what the Bible calls the “Aeon”) proceeds to abolish law in favor of a “higher morality.”  In today’s virtue-signaling pseudo-saints we see a harbinger of the real Antichrist.  The real Antichrist will not look evil or demonic, in fact the real Antichrist will try to resemble Christ to  whatever extent that might be possible.  After all, Christ did transpose law-abiding to a higher abiding in Him.   Call that a “higher morality” if you will.  However the “higher morality” of the Antichrist will not be based on fear of the Creator, but fear of the creatures.  Specifically, it will involve fear of the Human collective, a fear that will initially manifest itself as virtue-signaling, but in fact will rest upon appeasement of human (and ultimately demonic) lusts.

Having broken through the firewall of law (whether we choose to call such formal restraints law, culture, morality, ethics, or whatever) the direct confluence of collective human lusts and fears will create a Democracy of Desire.  Initially such a state of affairs may not seem ugly to behold.  It may even appear to be morally beautiful.

A beautiful beast.


Posted in Anthropology, Appologetics, Charismata, Christianity, Constitutionalism, History, Law, Paleoconservativism, Philosophy, Politics, Uncategorized | 1 Comment »

Constitutional Contrary or Conundrum? The Imperial Presidency vs. the Unitary Executive

Posted by nouspraktikon on June 4, 2017

Strong President, Weak President

Setting boundaries and limits to power is the essence of politics in a republic.  No Latin word was ever belabored more than imperium in the era prior to Caesar's crossing of the Rubicon.  Originally it referred to the "sphere of power" which was exercised by a magistrate, great or small, beyond which the office holder infringed upon the rival authority of some other elected official.  With the atrophy of the Republic, it became a personal noun, the Imperator, the root of our term for a King of Kings, an "Emperor."  The word, thus transformed, described a person whose "sphere of power" had become the whole world, thus annihilating the use to which its root had once been put, namely, to define and limit power.

Last year I predicted that Donald Trump, if elected President, would not become a fascist dictator, an “Emperor” so to speak.  Rather, the tremendous forces arrayed against him would ensure that the office would be brought to heel to a much greater degree than those who fear an Imperial Presidency are wont to imagine.  None the less, even I have been surprised by the extent of the weakness in the executive.  If we have passed any Rubicon, it seems rather that we have passed over from a concealed, to an open, form of oligarchy.

One way of coming to grips with this non-revolution is to admit from the outset that 1) the Imperial Presidency, and 2) the unitary executive, are contraries, not complements.  If we were to talk about official spheres of power with the fastidiousness of the ancient Romans, we might call the first, the President’s “lateral power” and the second the President’s “upright power.”  Imagine that presidential power is a rectangle of fixed area which loses depth whenever it is stretched horizontally.  I know that is a rather strange image to put in the service of a radical hypothesis, but bear with me.

Why the unitary executive is a great Constitutional doctrine

Generally when we (and by "we" I mean libertarians, conservatives, traditionalists, natural rights advocates, strict constructionists, etc.) hear the word "president" modified by the word "strong" we go into a fit of moral indignation, if not outright hysteria.  Yes, generally heads of state should be weak, lest they turn into tyrants.  However the American presidency is a unique institution, one which the founders of the Republic intended as a safeguard of liberty, just as much as the legislative and judicial branches.  To begin with, the very notion that the American president is a "head of state" is an extra-Constitutional notion, one which arises from the necessity of adjusting American nomenclature to the standards of diplomacy.  Indeed, since the Congress is our premier branch of government, the Speaker of the House has a fairly good claim to be the federal head of state, on the analogy of parliamentary systems.

Leaving aside the symbolic, and rather silly, issue of heads of state, let's turn to a more fundamental question which bears on the idea of the unitary executive.  Each of the branches of the Federal government must conduct its internal affairs in hermetic isolation from the others, while being in constant cooperation as corporate bodies to conduct the governance of these United States.  Naturally, each of the branches will attempt to extend its sphere of authority, or what the Romans called its imperium.

Now the matters which are of concern to each branch are well spelled out in the Constitution, but each of the branches always attempts to grow its authority by multiplying those things by which it exerts authority.   Thus the legislative branch attempts to grow its authority by increasing the volume and complexity of legislation, while the judicial branch attempts to grow its authority through the multiplication of rulings, judgements, and injunctions.  On the other hand, it is primarily the executive branch which attempts to grow its authority through the multiplication of offices.  Sad to note, but the three branches may remain evenly balanced while all of them grow in concert, disrupting the larger balance between governmental and non-governmental institutions in civil society.

Whatever cure there might be for the exponential growth of government in the legislative and judicial spheres, the theory of the unitary executive provides both a unique analysis and possible cure for burgeoning bureaucracy.  How so?

Strictly speaking, in the American republic there can never be more than one government officer at a given time.  The name of this officer is the President of the United States!

Oh yes, if you must quibble, there is also a deputy in case of death or incapacitation, the anomalous Veep.  None the less, two officers is a pretty strict limit for the bureaucracy of a large republic.  It reminds one of the twin consuls of Rome, a historical precedent which was never far from the thoughts of the American founders.  In terms of modern political theory we have arrived at genuine “minarchism”…an ungainly word which has been coined to express the most limited of limited governments.

Of course, for true unity of will and purpose, a person can never really trust anyone else to do their own job.  Hence the most pristine unitary executive would be one in which the President did all the work of the executive branch personally.  We can imagine a President who, dispensing with the service of a secretary, was able to handle all executive correspondence personally.  (NB: The reason we can imagine it is that we live in a world of word processors, computers, and the internet.)  However other things, such as warfare, might be a bit more tricky, unless our chief magistrate had the strength of the Biblical Samson or a modern-day comic super-hero.

So to be on the realistic side, even our pristine unitary executive would, of necessity, need to contract out for a few staffers.  Hopefully these would all be temporary workers.  After all, the chief magistrate himself is a temporary worker, limited to four, or at the maximum, eight years of employment by the American people.

Now before you dismiss this as nothing more than utopian swamp fever, perhaps we should take a look at the way the doctrine of the unitary executive has played out in the history of the Republic.


The historical roots of a weakening unitary executive

Unfortunately, while the imperial Presidency is the most realistic of real-political realities, the concept of a “unitary executive” is little more than a constitutional doctrine which has had to go hat in hand through the corridors of history in search of application.  To put the theory in its clearest form, the unitary executive is the President himself, who is at once both the only employee of the American people, and also the boss of every federal office holder outside of the Congress and the Judiciary.  The theory seemed most incarnate in the reign of those generals who seemed to be able to wield their authority with the same imperious might in the Oval Office as on the battlefield.  One thinks of Andrew Jackson and Teddy Roosevelt.

That was then, and now is now, when Mr. Trump's executive leadership seems more like an exercise in herding cats.  Yet people with even a tad of historical lore under their skulls recognize that The Donald didn't suddenly fumble the unitary executive to the horror of his fans and the delight of his detractors.  Common wisdom suggests that the unitary executive began to unravel, at the very latest, in the aftermath of the Watergate (1973) scandals.  Legislation which sought to limit the presidential imperium resulted in severe checks on arbitrary presidential power.  However these reforms failed to check arbitrary governmental power in general, or to stave off the multiplication of executive projects, expenditures and offices.  Rather, by setting up checks and balances within the executive branch of the federal government, they added to the executive bureaucracy.  This went so far that the "special prosecutors," who were the plum in the cake of the post-Watergate reforms, threatened to become a "Fourth Branch" of trans-Constitutional governance.

Those who can see beyond the historical horizon of Watergate are more likely to see the first unraveling of the unitary executive in the New Deal, and the multiplication of those "alphabet agencies" such as the CCC, TVA, and NRA, each of which was endowed with judicial as well as executive authority.  Yet an earlier starting point is the Progressive era, which saw the rise of the intellectual in the federal administration, a creature who was less likely to be constrained by, or even understood by, whatever folksy president inherited the legacy of those hybrid characters like Wilson who both studied and practiced administration.

Loyalty vs. Merit

However these movements were actually just footnotes to the unitary executive's original fall from grace, which coincided with the rise of a merit based civil service.  It was the Pendleton Act of 1883 which consolidated the system of permanently employed government service.  After that there was little reason to think that officers would be loyal to a politician whose term of office was likely to be far shorter than the duration of their career.  Like all sea changes in the policy of the republic, the effect of this reform was not immediately apparent.  After all, presidents in the late 19th century were just expected to be "weak."  Think Grover Cleveland.

Today, because we read history from public school textbooks, the pre-reform civil service gets a bad press.  Typically it is referred to as the "spoils system," which conjures up images (not entirely unsubstantiated) of bribery and largess.  However there is another side to this issue.  We should at least try to be "Mugwumps," that fanciful word for a person who was willing to consider the merits and demerits of a permanent civil service.  In the interests of fairness, I would like to exercise a bit of Mugwumpery and dub the temporary civil servant system the "Loyalty System."  After all, the politically appointable (and removable) civil servant would at least have no vested interest in sabotaging the chief executive who, unlike him or herself, was directly chosen through the electoral mechanisms of the Republic.

In certain moods our progressives and our conservatives might even agree that disloyalty is a bad thing and moreover presidents should at least have the chance to formulate policy on their own turf before being challenged by either the courts or the legislature.  However there is a libertarian remnant which stubbornly insists that a strong president is a bad president, and indeed that a strong administration is nothing more than a step along the primrose path to empire.

However, as illogical as it may seem, the presidency became "weak" before it became imperial.  After WWI, and as the 20th century wore on, there was a need for an emperor to complement the existence of an empire.  However the discipline of the bureaucracy which manifested itself at this time was not due to the charismatic appeal of those politicians who became, willy-nilly, chief magistrates of the republic.  Rather, it was due to the professional association of those who had a vested interest in the expansion of state power, both internationally and domestically.  Presidential orders were obeyed because presidents of whatever party were (to a greater or lesser extent) aligned with the expansion of a robust administrative state.  In 1952 Sen. Taft of Ohio lost the Republican nomination to General Dwight Eisenhower.  Taft was the last mainstream presidential candidate to seriously challenge the operational premise of expanding state power.  Barry Goldwater and Ron Paul would later mount doomed, albeit educational, campaigns dedicated to challenging that same premise.

Then in 2016 Donald Trump was elected after campaigning on many of the same anti-statist planks that animated Taft, Goldwater, Paul and (very inconsistently) Reagan.  Trump had the good sense to mix his contrarian rhetoric with a dash of jingoist appeal.  So far, the bureaucracy is in somewhat less than full scale revolt.  But only a very naive observer would be surprised that the doctrine of the unitary executive has been utterly abrogated.

The not-so-deep-state and the demise of the unitary executive

Today when "deep state" has become a household expression, it is easy to substitute James Bond intrigue for fundamental political analysis.  No doubt there is a great deal of skulduggery going on in high places these days, but the unitary executive would have foundered without any alienation between the Oval Office and the intelligence services.  It is not just the Praetorian Guard who are in revolt, but the clerks…and there are a lot of clerks.  It is not just a cabal, but the system, a system in which managers are independent of elected policy-makers.  In the EU this system appears in its most naked form.  In the US it still has to make end runs around the remains of a Constitutional Republic.

As Richard Weaver said, "Ideas have consequences!"  One of the great, pure ideas of the 19th century was civil service reform.  However, in creating a permanent state independent of politics, civil service reform ensured that all future reforms would be bound inside the parameters of the managerial state.  The owl of Minerva takes flight at dusk, and only now do we see the luster of those single-minded individuals whom the progressives have been eager to denounce as dictators-in-waiting: the aristocratic Washington, the Jacobin Jefferson, mean old Andy Jackson, the imperious Polk, and (though they were already compromised by the permanent state) later figures such as Lincoln and Teddy Roosevelt.

Finally, we can see the wisdom of the Founders in endowing one third of the federal government with a vestige of monarchy.  At very worst a monarchy, but never, ever, an empire, since a strong individual, unencumbered by bureaucracy and backed by the people, might indeed succeed in ruling the daily affairs of one nation…but then it would be bedtime.

 

Posted in Constitution, Constitutionalism, culture, Culture & Politics, Economics, History, Law, Paleoconservativism, Politics, Traditionalism, Uncategorized | Tagged: | 1 Comment »

How Churchmen are changed into Ducks

Posted by nouspraktikon on May 9, 2017

George Whitfield (1714-1770)

Among the more formidable characters in church history is George Whitfield (sometimes spelled Whitefield but pronounced without the “e”) the preacher who spread a Calvinistic variety of Methodism in colonial America.  You must understand that at the time Methodism was, as the very name indicates, a methodology and not a sect.  It was Whitfield’s aggressive preaching method, not to the taste of some, which had such a tremendous effect on forming the unique spirituality of early America.

His odd looks (he was cross eyed) and forceful rhetoric must have convinced many that Whitfield  was more an angel than a man.  It was related that he could pronounce a word as neutral and exotic as “Mesopotamia” in such a way as to draw tears from his audience.  For some this was sorcery, but for others it was salvation, and the crowds that he was able to gather were a mighty tributary in that powerful river of revival which we call America’s Great Awakening.

Like his rival in preaching the good news, John Wesley, Whitfield was a lifelong clergyman in the Anglican church.  Oddly enough, this evangelist with Tory sympathies earned the esteem of the freethinking Benjamin Franklin, and the two struck up a friendship which lasted throughout their mature lives.  Nonetheless, it is hard to imagine Whitfield, who died five years before the outbreak of the American Revolution, throwing in his lot with the founding fathers.  For Whitfield, being an Anglican was not a doctrinal affirmation, and indeed he despised most of what today would be called "Anglican theology."  For him, membership in the established church was just the normative state of being born into the British branch of Christendom.  In the Whitfieldian view, the established church didn't get you into heaven, but you couldn't get out of the established church.  A questionable deal, but a deal nobody could refuse in Britain or its colonies.

To Whitfield's amazement, many of the Americans whom he had converted on matters spiritual in the 1740s were loath to join his church, preferring to form into autonomous assemblies, notably Baptist associations.  Whitfield sighed, in reference to the immersion of his converts, "It seems that my fledglings have become ducks!"  From our modern perspective this seems odd as well: why would someone be evangelized by a preacher from one denomination and then go out and join another?  Why did the Whitfield Christians "become ducks"?

Erastianism

To begin with, “denominations” in our contemporary sense didn’t exist, although there were already a multitude of sects.  What did exist was a passionate clash of opinions over ideological and theological issues which today seem obscure and unimportant.  A key word in these debates was “Erastianism” which dropped out of our household vocabularies a century and a half ago and has not been missed yet.

However, unless we know how this "Erastianism" could get people hot under the collar (both clerical and lay collars) we won't understand how churchmen became ducks.  Fortunately there is a term of recent coinage which conveys much the same meaning to modern ears.  Among libertarian, Constitutional, and conservative circles "statism" has become the contemporary opprobrium of choice for what the colonists called "tyranny."  Today we can define Erastianism as "statism applied to church governance," or church-statism.  Keeping that in mind, and equipped with a Bible in one hand and the Declaration of Independence in the other, we are well on our way to unraveling the ecclesiastical conundrums of 18th century America.  We know what the outcome was: the rise of the Methodists and Baptists and the decline of the Anglicans/Episcopalians.  Was this due to the vagaries of demographics or was there some underlying principle working itself out in the lives of Christian men and women?

Going back to mid-18th century British America, one must keep in mind that Erastianism was not just a theory but a practice.  Take the colony of North Carolina as an example.  The Church of England was established as a public institution, essentially an arm of the state.  Did this mean that those early Tarheels were enthusiastic Anglicans?  Hardly!  In fact the region was largely unchurched during its early history.  Nonetheless a system of church vestries (lay committees) was established paralleling the civil administration, and all subjects were required to pay taxes to maintain this apparatus.

As in all monarchical church-state systems the organization was pyramidal.  Yet, curiously, within British North America this was a truncated pyramid.  Above the vestries and the occasional parish priest, there were no high church officials.  North Carolina, and all other colonies (mostly outside New England) where Anglicanism was established, reported to the Bishop of London.  This led to a curious ambivalence on the part of the colonials.  Some persons, of an Episcopal persuasion, were eager to have cathedrals and bishops established on American shores.  They blamed the crown for foot-dragging on this issue.

Another, and presumably larger, party was heartily glad that the bishops had not yet arrived.  Their fear was that the crown was scheming to impose a hierarchy on the colonies, a hierarchy which would coerce believers in matters of doctrine and impose heftier church taxes.  This was a major item of contention among the colonists in the run up to the revolution, and the fact that it was not directly mentioned in the Declaration of Independence is, like the dog that doesn’t bark, rather a testimony to the seriousness of the issue than the contrary.  It was, like slavery, one of those issues that divided the Founders at a time when it was crucial to present a united front against the crown.

Voting with their (webbed) feet

Keeping these things in mind, perhaps it is easier to understand why the fruits of the Great Awakening, sparked by the evangelism of Anglican priests, did not redound to the Established Church.  Again, taking North Carolina as our example, there are records of a great increase in the membership of Baptist assemblies, while the Established Church remained largely a bureaucratic skeleton.  Converted by the Spirit (through the preaching of Whitfield, Wesley et al.) the rustic colonists saw no need to perfect their salvation through works, where the "works" in question were attendance on the ceremony and obligations of the local established parishes.  Moreover, such works came on top of the "work" of paying the church tax, which (prior to the revolution) was levied regardless of one's belief: atheist, dissenter, or whatever.

Really, Whitfield ought not to have been surprised, for the Spirit was working through his eccentricities, not his Anglicanism.  The crowds swooned at his uncanny words such as “Mesopotamia”…I know not whether they would swoon at “Mother England.”

We too should cry when we hear the word "Mesopotamia"!

These things are of interest to me since I am persuaded by a kind of Calvinistic Methodism myself.  Albeit I am a Calvinist only in supposing that all people are sinners, while my Method has little in common with that of the Wesley brothers.  Rather, the method consists in this, that (at least under ceteris paribus conditions, a.k.a. all things being equal) freedom is a good thing and coercion is wrong.

Now today in Christendom (or rather post-Christendom) we are no longer so clearly divided into an Established Church and Dissenters.  However the same perennial urges resurface under different guises.  Thus today we have Liberal churches and Conservative churches.  In both of these "denominations" there are churches and individuals who seek to become an Establishment.  Both seek to establish a church-state, albeit according to different views of what the proper function of the state might be.  The liberal churchmen, and churchwomen, want to be the altruistic cheerleaders of the journalistic-academic-welfare-health complex, while the conservatives want the church to be an official apologist for the military-industrial-banking complex.

However there is always a remnant which has been granted the wisdom to understand human folly.  Among the greatest of follies is what has been called “the tyranny of good intentions.”  This is when we try to force something good on someone.  If we try to force Christ on someone we get the Inquisition.  If we try to force “democracy” (a problematic concept in itself!) on a people we get…well, we get something like the contemporary Middle East, a region in constant turmoil where two thousand year old Christian communities are today on the verge of extinction.

It is we, not Whitfield’s auditors, who should weep when we hear that old name for Iraq and its neighbors…”Mesopotamia”!

Yet through the gloom of it all, let’s remember that Jesus loves us.  I’m afraid I may have increased the gloom by throwing a heavy theological tome at your head.  But at least I warned you…

Duck!

 

 

Posted in Appologetics, Charismata, Christian Education, Christianity, Constitution, Constitutionalism, culture, Culture & Politics, Paleoconservativism, Philosophy, Politics, Traditionalism | Tagged: | Leave a Comment »

Captain Obvious calling: What if Myths are just (you guessed it!) myths?

Posted by nouspraktikon on May 3, 2017

From unsophisticated lies to sophisticated rationalizations

I have spent more of my life than I would care to admit trying to unravel the mysteries of myths and mythologies.  The dominant theories among anthropologists, psychologists and other scholars reflect the prevailing assumption that myth holds the key to some deep primitive wisdom with which modern people have lost touch.  Thus for Levi-Strauss, myth reveals the primitive meta-logic of the mind, which is far more socially cohesive than the analytical categories of common sense logic.  Carl Jung goes further in seeing the primal spirituality of all human beings stored in a collective unconscious which from time to time is expressed in mythical terms.

The assumption is that there are truths too deep to be expressed in plain expository language.  But what if myth, far from expressing truths, is actually giving vent to falsehoods?  This is the viewpoint of Rene Girard, who sees in the incoherence of myth a similarity to rationalization.  When the main character of a mythical narrative suddenly turns into a god or a totemic animal, Girard suggests that the hero was the object of envy and fell victim to murder most foul.  To disguise the crime the survivors in society changed the narrative and promoted the hero from the status of victim to god.  Those who notice some similarity to Christ's passion will not be surprised that Girard is a Christian and was influenced by the gospel narrative in framing his social theory.

One need not concur with all the details of Girard's anthropology to see the wisdom of applying a forensic approach to myth.  If myths are primitive rationalizations of the great crimes committed in antiquity, this would go a long way toward explaining the convoluted and contradictory logic which seems characteristic of all primitive societies.  As Mark Twain once said, "I don't tell lies because it's too much work to keep them all straight in my memory."

From Fall to Falsehood

However the human race seems, on the whole, to have taken liberties with the truth at the price of developing a vast and often incoherent body of narratives which we call mythology.  To say that myths are lies and nothing more than lies, would seem to put the work of generations of anthropologists and folklorists to naught.  Yet this might be a true key to understanding the enigma of the human past.  All myths might be variations on one Big Lie which has been told generation after generation, growing in detail and complexity as each narrator attempted to put more distance between his contemporaries and some Primal Crime of deep antiquity.

In this context, it might be useful to note that the Bible, whatever "genre" we might assign to it, most certainly is not myth.  Even the most superficial acquaintance with scripture shows that its style and method is completely different from all the mythological systems which have been passed down through the traditions of the nations.  Indeed, scripture and myth are not just different but opposite, and comparing them is much like looking through a telescope alternately from different ends.  Thus, while myths are human attempts at making a theology, the Bible was given us by God as a book of anthropology.  In understanding ourselves, we understand our relationship to God, or lack thereof.

Unlike myths, the Bible reveals to us the Great Crime which broke our fellowship with God.  It tells the truth in straight, unambiguous terms, in terms which would be recognized by any logician, whether or not such a logician accepted the moral of the story.  In contrast, mythology, the Bible’s primitive rival, is forever losing the logical thread of its narrative, much like dreams, which are simply the nocturnal counterpart of the mythological madness told in broad daylight.  When myth is on the witness stand the story is always changing, backtracking, and the names are changed to protect the guilty.

Not so with scripture, which radiates a clarity similar to the last pages in a classical “whodunit.”  Of course, this makes it unpopular with the criminal class, a class which (in regard to the Original Crime) includes the entirety of the human race.  Conversely this explains the popularity of myth which is, in the absence of other virtues…at least highly creative.

Posted in Anthropology, Art, Christian Education, Christianity, culture, Fiction, History, Paleoconservativism, Theology, Uncategorized | Tagged: | Leave a Comment »

The Gun You Should Reach For When You Hear the Word “Culture”

Posted by nouspraktikon on April 24, 2017

Why “Culture” is a loaded word which needs to be disarmed

All advocates of a civilized world, and most emphatically all Christians, need to be skeptical every time the word “culture” is mentioned.  Evolution and culture are the two key concepts which have destroyed genuine anthropology, anthropology in the Christian sense of the word.  If today we live in a world where the barbarians are at the gates, it is only because the vital distinction between civilization and barbarism was first erased from the scholarly vocabulary in the name of an ambiguous and relativistic understanding of human nature, an understanding which is encapsulated in the term “culture.”

The word "culture" (an otherwise unobjectionable term) was adopted by secular anthropologists as the label for a mental package deal known as "the culture concept."  The essence of this concept is that human beings create their own mental reality.  Even humanists are humble enough to realize that human beings do not create their own physical reality.  That sort of thing went out of style with Renaissance magic.  Humanists claim that the universe has arisen through something other than human agency, and since human agency is the only source of rational design they recognize, they conclude that the universe is a result of chance plus vast quantities of time.  This is the celebrated theory of evolution.

There is another sense in which Humanists exhibit a minimal degree of humility.  The culture concept implies that "Man Makes Himself," to quote a title from V. Gordon Childe, from a day when even left-wing scholars could use masculine pronouns.  However the culture concept admonishes the would-be Übermensch that human individuals do not make themselves; only groups have the power to shape the mental environment of their members.  Since the culture concept derives ultimately from the thinking of Immanuel Kant, this is an important revision in the theory.  Kant asserted that the human mind creates its own reality, but he was very abstract in his presentation.  He didn't stress the role of groups in forming their own environments.  This was worked out in the century after Kant by various neo-Kantian scholars and passed down through the educational system in the form of anthropological dogma.

This formula, that 1) evolution makes the physical environment, and 2) culture makes our mental environment, is the one-two punch of all Humanist thought.  It is diametrically opposed to Christian anthropology, which sees the human race as part of creation dependent upon almighty God.  To be sure, in the Christian view the human race occupies a unique role in creation, as the thinking and governing part, just as in Humanism the humans are unique in possessing “culture.”  However there is a world of difference in these two forms of uniqueness.  The first uniqueness is related to something personal outside itself, a condition which renders objective morality possible.  The second uniqueness, the uniqueness of “culture” is purely self-referential.  It cannot be brought to the bar of any moral standard higher than itself.  From the Humanist viewpoint, this isolated uniqueness reflects the principle of human autonomy.  From the Christian viewpoint, it is an illusion resulting from sin.

Culture as the moral ultimate means that culture itself cannot be judged, and implies relativism.  The history of the culture concept is the progress of increasingly consistent forms of relativism.  In the 19th century anthropologists tried to rank cultures on the basis of degrees of civilization, or, put negatively, emergence from barbarism.  However as the relativistic implications of the culture concept were systematized, notably by Franz Boas and his followers, attempts at judging cultures were suppressed.  Today, all judgments of different cultures according to some objective standard outside culture are considered prejudicial.  However this moral conclusion is the consequence of the supposed impossibility of any objective standard.

When the Nazi German Propaganda Minister Goebbels famously exclaimed, "When I hear the word culture I reach for my gun!" he was diametrically opposed to the cultural criticism which we are trying to undertake.  Like Franz Boas, Goebbels was aiming at the idea of "high culture" as opposed to barbarism.  We should translate his words as "when I hear the word 'civilization' I reach for my gun."  Both Nazism and cultural relativism have tried to make it impossible to isolate barbarism as a descriptive category and set it over against civilization.  Of course there were profound moral differences between Boas, the liberal Jew, and Goebbels, the German fascist.  The latter went beyond theory and was determined to normalize barbarism by acting it out in real life.  However in the long run it has been the gentle scholar who has been more effective in destroying civilization, first as an ideal and then as a reality, among people of good intentions.

Yes, traditions exist

The major opposition to a frontal assault on the culture concept is the contention that culture aptly describes the variety and richness of human traditions found throughout the world.  However this diversity has always been recognized, certainly prior to the academic hegemony of the culture concept.  Some of these traditions were instituted by the Most High God, some are human innovations, and some have been inspired by lesser spirits.  Human innovation is not to be gainsaid, either for good or for evil, and neither is the vast diversity of traditions.

The culture concept adds nothing to our understanding of the richness of human institutions.  However, by insisting on the human origin of our mental world, the culture concept begs one of the most significant questions which can be asked about history: who, or what, instituted institutions?  Its long-range effect is to flatten the mental world into a single flat plane of human reality.  Cultural Humanists boast of having an "immanent frame" in which they are free to make any judgement they wish about human affairs.  However "any judgement" ultimately means that no judgement is authoritative, and hence that all are meaningless.  This default to meaninglessness and nihilism is the next-to-last stage in the decline of cultural relativism.

The final stage occurs when "culture," having outlived its usefulness in the promotion of nihilism, is reabsorbed by "evolution," the master-concept which required culture as a temporary supplement and diversion.  When the ideals of humanity have lost their charm, the spiritual descendants of Goebbels will round on the spiritual descendants of Boas, with guns metaphorical or otherwise.

It is to save these people of good intentions, these so-called “Humanists” from the fate which dooms their concepts, their bodies, and their souls (not necessarily in that order) that we must insist on a God beyond culture.

Posted in Anthropology, Appologetics, culture, Culture & Politics, Paleoconservativism, Philosophy, Politics, Theology, Traditionalism, Uncategorized | Leave a Comment »

Dear Michael Savage, here is your prize-winning proof of Human Stupidity (which assumes the existence of God)

Posted by nouspraktikon on April 12, 2017

Dear Michael Savage,

First of all I want to let you know how much I enjoy your program.  After taking a lot of guff and being called a deplorable, you have now dumped the Trump train over Syria.  Just goes to show that, for true-blooded deplorables, it was more than just a "thing" about the orange hair.  Oh well….

So much for WWIII and the other small stuff.  Now getting down to that proof of the existence of God!  As you and I and everyone else know, God exists.  However there is a certain class of scholars, known as apologists, who go beyond just knowing that God exists to trying to prove that he exists.  God must love these people very much, since he doesn't blast them out of existence for doing something which is ultimately blasphemous.  I love them too, especially the really complicated ones like Thomas Aquinas and Gottfried Leibniz, whose thoughts are as intellectually challenging as they are useless.  These are the people who attempted a frontal assault on human infidelity and ignorance, which in itself is rather stupid.

The correct procedure is to reverse the question and ask why human beings reject God and all knowledge of His existence and character.  In scholarly circles this method is called "presuppositionalism" and if left to run amok it will lead to academic disputations as obscure as anything spawned from the pen of Thomas Aquinas.  However the basic insight is perfectly simple.  We all live in a world which is screaming at us 24 hours a day, seven days a week, "I am God's creation!"  Yet there are two classes of human beings: those who accept the Creator and their creaturely status, and those who feel that both the universe and they themselves are self-made.

Since both the believers and the God-rejecting people live in the same world, a world in which we are nurtured and have our being, there would not seem to be much ground for metaphysical disputation.  Even rather evil people such as Martin Heidegger have never doubted that existence exists, although that benighted philosopher expressed great surprise that Being had managed to nudge out non-existence in the contest for reality.

No, both classes of human beings inhabit the same life-world, but they think according to different principles.  As scholars would say, they adhere to different epistemological systems.  The believers see themselves as mentally naked in front of God and the world.  For them there is no “problem of knowledge” per se, since the  information we get from our world is abundant and, except in limiting cases, generally reliable.

However, in the case of the non-believer, one must have an epistemology before venturing into the wilds of the universe.  For such people, there is a gap between the ego and reality, a gap which can only be bridged through strenuous philosophical or scientific investigation.  However this plight of inadequate knowledge is not just an epistemological inconvenience, but rather is grounded in the moral attitude of the non-believer himself or herself, since before staking any claim to knowledge the non-believer has already declared a state of ego-autonomy.  This declaration of independence has the unfortunate consequence of stranding the ego on a deserted island of his or her own making, from which venturing out into the world of brute fact, governed only by the laws of chance, is a perilous adventure.

Well now, Mr. Savage, even if you accept all that I have written above, it certainly doesn't present a "proof of the existence of God"…at least in the classic sense.  However, from a forensic point of view, it ought to make us suspicious of the non-believer's motivation.  Why the insistence on autonomy?  Why the cumbersome epistemological apparatus?  It would almost seem as if there were something or Someone out in the wilds of reality whom the non-believer was afraid of, and for whom this gap between the ego and the Other was improvised.

Indeed, there are grounds for supposing that the gap between the ego and its environment is not a fact of nature, but an improvisation designed to suppress the original confluence between the human mind and God.  This would also explain the general uselessness of “proofs of the existence of God” since these are attempting to employ a metaphysical tool in order to solve a moral problem.  The “proofs” usually only work on people who are already believers.

To conclude, Mr. Savage, I know that this is a rather bleak judgement, and furthermore begs the question, “What is to be done?”  After all it implies that humanity is divided into two non-communicating epistemological camps.  Instead of offering you an inductive or deductive proof of God’s existence, all I have done is explain the irreducible ignorance of a vast segment of humanity.  Or as you would say, the reason why “they are stupid.”

Well, I suppose prayer wouldn’t hurt.

Blessings upon you and yours,

Mark Sunwall

Posted in Anthropology, Christian Education, Christianity, Culture & Politics, Paleoconservativism, Philosophy, Theology, Uncategorized | Tagged: | Leave a Comment »

The Culture Conspiracy: A critical investigation into the destruction of civilization (Introduction)

Posted by nouspraktikon on April 10, 2017

The Culture Conspiracy

This is the first installment of a multi-part series on how the modern "culture concept" has, as a complement to the theory of evolution, demoralized and degraded civilization, or actual "culture" in the original intent of that word.  While it is not intended to be an exhaustive overview of the topic, the investigation will try to hit on all the major aspects of the problem.  Tentatively, it will be organized along the following themes:

  1. The Great Baton Pass
  2. The Measure of Man vs. the Measure of God
  3. From Custom to Culture
  4. Erasing the essential Civilization/Barbarism distinction
  5. From Kant to Hegel: From the individual to the species
  6. From Hegel to Boas: From the species to the people
  7. The Super-organic, the Spiritual, and the Ugly
  8. The Enigma of Innovation
  9. Man Makes Himself Part II: From Custom to Customization
  10. Beyond the Culture Concept

Though each of these contains enough to provide a mini-course in itself, in its present state the work is likely to appear as the outline of a syllabus rather than a detailed treatment of the subject.

Introduction: The Culture Conspiracy

Suppose you were able to travel back in time to the mid-Victorian era.  Just to pick a date, let's suppose it were 1859, the year in which Darwin published his master work, the Origin of Species.  You arrive in London, England, and are able to establish communication with a middle class person, of either sex, and ask them two questions about the future.  First, do you expect technology to improve in the future?  Second, do you expect culture to improve in the future?  If I am not greatly mistaken, the answer of a well-informed Londoner of 1859 would be a resounding "Yes!" to both questions.

Next, through the magic of your time-traveling, you offer them a vista of life at the beginning of the twenty-first century.  Now they are able to judge whether their optimistic prophecies have been vindicated.  There is no need to waste time on the answer to the first question.  The mid-Victorian would find the technological wonders of the present to be little less than a magical transformation of the human environment.  Even if the lady or gentleman in question were a Luddite, or, like Mr. Butler, apprehensive of "machines" in general, they would be forced to admit that the machines had won the day, whether or not the technical triumph was in the long range interests of the human race.

And what of culture?  If cultural optimism were vindicated in proportion to the Victorian's technological optimism, what wonderful variations on Moore's Law might one expect?  In the year 2017 music would be one hundred times more sonorous than Mozart, paintings one hundred times more beautiful than Turner, the law-courts one hundred times more just and expeditious, families one hundred times more peaceful and harmonious, architecture one hundred times more symmetrical and stately, and the religious life of the average man or woman one hundred times more pious.

I am sure everyone understands that such exaggerated expectations would suffer bitter disappointment.  But I would go beyond that and hypothesize that our representative Victorian would judge that much of culture had regressed rather than progressed.  Looking around at a population dressed in t-shirts and jeans, the well-dressed Victorian might assume that he or she (especially she) had landed in a sartorial dark ages.  Dress might be the most ubiquitous and offensive sign of cultural degeneration, but further investigation would reveal a myriad of aspects in which 21st century culture had decayed far beyond the lowest level of Victorian expectations.

Art might be cheap and easily accessible, but so primitive, cartoon-like or commercial that the Victorian time-traveler would deem it rubbish.  Language (unless our Victorian were a rating in Her Majesty's Navy) would have become unutterably vulgar.  Human relations would have become broader but shallower, and the family reduced to just one of the many nodes of association provided for the convenience of individuals.  The poor-house and the debtors' prison would have been abolished, but by the year 2017 debt would have become the primary nexus holding the economy together.  Indeed, from the point of view of a middle-class Victorian, by the year 2017 society itself would have become one giant debtors' prison.

This is not even to speak of the actual prisons of the 21st century, or of the fact that Jack the Ripper (still in the future for 1859) would spawn, like some forensic Adam, a class of registered and unregistered offenders.  Finally, our representative Victorian, even if not an enthusiast for the works of Herbert Spencer, might dimly recognize that by the standards of classical liberalism the 21st century state had itself become a criminal network, engaged in perpetual borrowing and taxation for extensive regulation at home and endless warfare abroad.

Having safely deposited our Victorian time-traveler back in the homely 19th century, and drugged him with the obligatory milk of amnesia so that history won't be spoiled, we find a familiar figure entering from stage left to deliver a soliloquy.  This is Mr. Carping Critic, who objects to the whole little drama.  He claims that our experiment is a sham, based on false premises from the start.  He says that the two questions were apples and oranges, and that the "no" verdict on the second question rests on biased judgment.  He says that when we jump from technology to culture we go from the measurable to the intangible, and have entered into that shady region of values where nobody's opinion (even that of a time-traveling Victorian) is more objective than anyone else's.

From the point of view of Mr. Carping Critic, the Victorian's view of art is just an outmoded taste, so of course we should expect a negative verdict.  If the growth of the prison population is viewed negatively, it just shows the enduring grip of pastoral romanticism over the advantages of cozy confinement.  And so forth and so on in every department of "culture," since after all, culture is a matter of values, and as we all know, values change.  The seal of the entire argument is the ridiculous subject of clothing, on which our time-traveler had nothing better to venture than the opinion of a bigoted prude.

With that coup de grace, Mr. Carping Critic thinks he has stripped the Victorian of her secret!

I cannot refute Mr. Carping Critic on his own grounds, since they are not grounds at all, but the quicksands of a shifting and relativistic doctrine.  However, it is a doctrine which has a history, and that history can be exposed and criticized.  Indeed, I will go beyond Mr. Carping Critic to criticize the one concept which remains beyond criticism for him, namely "the culture concept."  Yes, he is right to say that the time-traveling questions were not consistent, for in 1859 the word "culture" hadn't quite assumed the connotation that we give it today.  Soon that would change, and it would change in such a way that people would no longer be as confident making statements about objective reality as they previously had been.

I think, in contrast to Mr. Carping Critic and his ilk, that objective reality, not just in the natural world but in the human world, continues to exist, and that an inability to talk about it puts anyone thus incapacitated at a severe disadvantage.  However, our inability to talk about human affairs objectively is the end result of a kind of conspiracy, a conspiracy that started long ago and has today come to fruition in a multitude of crises.  In subsequent installments I will unmask this conspiracy… the culture conspiracy.

Posted in Anthropology, Art, Culture & Politics, Esoterism, Paleoconservativism, Theology, Uncategorized