Tuesday, February 28, 2006

Liberty as a Social Construct

There are two main views of the source of liberty:
  • Liberty is innate in humans.
  • Liberty is a social construct.
I am of the second view: Liberty is not in our genes and does not flow from heaven like manna. Nor is it found by applying John Stuart Mill's misleadingly simple "harm principle."

Liberty requires a consensus about harms and the boundaries of mutual restraint -- the one being the complement of the other. Agreed harms are to be avoided mainly through self-restraint. Societal consensus and mutual restraint must, therefore, go hand in hand.

Looked at in that way, it becomes obvious that liberty is embedded in social norms and preserved through the observance of those norms. There may be societally forbidden acts that, to an outsider, would seem not to cause harm but which, if permitted within a society, would unravel the mutual restraint upon which ordered liberty depends.

The inculcation of mutual restraint depends mainly on the existence of viable families -- families in which the parents are present and at least one of them (traditionally the mother) spends a great deal of time inculcating in children the value of self-restraint (also known as the Golden Rule).

Honesty is a corollary of self-restraint, and is implicit in the Golden Rule. Honesty is essential to liberty because the security of one's livelihood and property depends primarily on voluntary adherence to contracts, formal and informal.

A third familial value essential to liberty is mutual aid -- the practice of mutual assistance and defense. The teaching of mutual aid at home spills over into the community. As I wrote here,
the willingness of humans to come to each other's defense has emotional and practical roots:

1. An individual is most willing to defend those who are emotionally closest to him because of love and empathy. (Obvious examples are the parent who risks life in an effort to save a child, and the soldier who throws himself on a grenade to protect his comrades.)

2. An individual is next most willing to defend those who are geographically closest to him because those persons, in turn, are the individual's nearest allies. (This proposition is illustrated by the Union and the Confederacy in the American Civil War, and by the spirit of "we're all in this together" that prevailed in the U.S. during World War I and World War II. This proposition is related to but does not depend on the notion that patriotism has evolutionary origins.)

3. If an individual is not willing to defend those who are emotionally or geographically closest to him, he cannot count on their willingness to defend him. In fact, he may be able to count on their enmity. (A case in point is Southerners' antagonism toward the North for many decades after the Civil War, which arose from Southerners' resentment toward the "War of Northern Aggression" and Reconstruction.)
What happens to self-restraint, honesty, and mutual aid outside the emotional and social bonds of family, friendship, community, church, and club can be seen quite readily in the ways in which we treat one another when we are nameless or faceless to each other. Thus we become rude (and worse) as drivers, e-mailers, bloggers, spectators, movie-goers, mass-transit commuters, shoppers, diners-out, and so on. That is why, in a society much larger than a clan, we must resort to the empowerment of governmental agencies to enforce mutual restraint, mutual defense, and honesty within the society -- as well as to protect society from external enemies.

But liberty begins at home. Without the civilizing influence of traditional families, friendships, and social organizations, police and courts would be overwhelmed by chaos. Liberty would be an even hollower word than it has already become -- hollow largely because of other governmental units that have come to specialize in imposing harms on the general public in the pursuit of power and in the service of special interests (which enables the pursuit of power). Those harms have been accomplished in large part by the intrusion of government into matters that had been the province of families, voluntary social organizations, and close-knit communities. As Hans-Hermann Hoppe writes in "The Rise and Fall of the City,"
[a]fter the race and the class cards have been played and done their devastating work, the government turns to the sex and gender card, and "racial justice" and "social justice" are complemented by "gender justice." The establishment of a government — a judicial monopoly — not only implies that formerly separated jurisdictions (as within ethnically or racially segregated districts, for instance) are forcibly integrated; it implies at the same time that formerly fully integrated jurisdictions (as within households and families) will be forcibly broken down or even dissolved.

Rather than regarding intra-family or -household matters . . . as no one else's business to be judged and arbitrated within the family by the head of the household or family members, once a judicial monopoly has been established, its agents — the government — also become and will naturally strive to expand their role as judge and arbitrator of last resort in all family matters. To gain popular support for its role the government (besides playing one tribe, race, or social class against another) will likewise promote divisiveness within the family: between the sexes — husbands and wives — and the generations — parents and children. Once again, this will be particularly noticeable in the big cities.

Every form of government welfare — the compulsory wealth or income transfer from "haves" to "have nots" — lowers the value of a person's membership in an extended family-household system as a social system of mutual cooperation and help and assistance. Marriage loses value. For parents the value and importance of a "good" upbringing (education) of their own children is reduced. Correspondingly, for children less value will be attached and less respect paid to their own parents. Owing to the high concentration of welfare recipients, in the big cities family disintegration is already well advanced. In appealing to gender and generation (age) as a source of political support and promoting and enacting sex (gender) and family legislation, invariably the authority of heads of families and households and the "natural" intergenerational hierarchy within families is weakened and the value of a multi-generational family as the basic unit of human society diminished.

Indeed, as should be clear, as soon as the government's law and legislation supersedes family law and legislation (including interfamily arrangements in conjunction with marriages, joint-family offspring, inheritance, etc.), the value and importance of the institution of a family can only be systematically eroded. For what is a family if it cannot even find and provide for its own internal law and order! At the same time, as should be clear as well but has not been sufficiently noted, from the point of view of the government's rulers, their ability to interfere in internal family matters must be regarded as the ultimate prize and the pinnacle of their own power.

To exploit tribal or racial resentments or class envy to one's personal advantage is one thing. It is quite another accomplishment to use the quarrels arising within families to break up the entire — generally harmonious — system of autonomous families: to uproot individuals from their families to isolate and atomize them, thereby increasing the state's power over them. Accordingly, as the government's family policy is implemented, divorce, singledom, single parenting, and illegitimacy, incidents of parent-, spouse-, and child-neglect or -abuse, and the variety and frequency of "nontraditional" lifestyles increase as well. . . .

It is [in the big cities] that the dissolution of families is most advanced, that the greatest concentration of welfare recipients exists, that the process of genetic pauperization has progressed furthest, and that tribal and racial tensions as the outcome of forced integration are most virulent. Rather than centers of civilization, cities have become centers of social disintegration, corruption, brutishness, and crime.

To be sure, history is ultimately determined by ideas, and ideas can, at least in principle, change almost instantly. But in order for ideas to change it is not sufficient for people to see that something is wrong. At least a significant number must also be intelligent enough to recognize what it is that is wrong. That is, they must understand the basic principles upon which society — human cooperation — rests: the very principles explained here. And they must have sufficient will power to act according to this insight.

The state — a judicial monopoly — must be recognized as the source of de-civilization: states do not create law and order; they destroy it. Families and households must be recognized as the source of civilization. It is essential that the heads of families and households reassert their ultimate authority as judge in all internal family affairs. Households must be declared extraterritorial territory, like foreign embassies. Free association and spatial exclusion must be recognized as not bad but good things that facilitate peaceful cooperation between different ethnic and racial groups. Welfare must be recognized as a matter exclusively of families and voluntary charity and state welfare as nothing but the subsidization of irresponsibility.
In sum, liberty is not an abstract ideal. Liberty cannot be sustained without the benefit of widely accepted -- and enforced -- social norms. A society that revolves around norms established within families and close-knit social groups is most likely to serve liberty.

Related posts:

The State of Nature
Some Thoughts about Liberty
The Paradox of Libertarianism

Anti-Western Values, in the West

UPDATED

I came across three excellent posts today. There's "Oncoming" at davidwarrenonline, which includes this:
It is only in retrospect that we understand what happened as the 1930s progressed -- when a spineless political class, eager at any price to preserve a peace that was no longer available, performed endless demeaning acts of appeasement to the Nazis; while the Nazis created additional grievances to extract more.

This is precisely what is happening now, as we are confronted by the Islamist fanatics, whose views and demands are already being parroted by fearful “mainstream” Muslim politicians. We will do anything to preserve a peace that ceased to exist on 9/11. Not one of our prominent politicians dares even to name the enemy.
And there's "The Suicidal Left: Civilizations and their Death Drives" at The American Thinker, in which Vasko Kohlmayer observes:
Deeply averse to the West’s moral code, the Left contemptuously refers to it as bourgeois morality. It denigrates the West’s cultural triumphs, contending they are no more unique than those of other societies. It disparages the West’s past by painting it as nothing more than an amalgamation of oppression, exploitation and all-around ignominy.

Scoffing at the notion of the limited State, the Left rejects the climax of western political tradition. And the Left, of course, despises free market capitalism – the West’s economic foundation – which it claims to be inherently exploitative, unfair or worse.

The Left, however, does not confine itself to mere criticism, but aggressively seeks to transform its anti-Western attitude into reality. Even a cursory glance at some of its successes should give us an idea of just how effective its efforts have been.

Virtually demolishing the West’s traditional morality, the Left has managed to legitimize promiscuity, illegitimacy, abortion and homosexuality. This transformation has reached a point where in many quarters these behaviors are not only considered acceptable but outright commendable.

Through its aggressive atheism, the Left has succeeded in virtually eliminating Christianity from our public arena, and to a large degree from the private sphere as well. This trend has been especially pronounced in Europe where only some seven percent of the population engage in some form of regular religious observance. . . .

The West’s moral decline, the collapse of its religion, economic sluggishness, and the indifference to its own historical and cultural achievements – all this is the Left’s doing. Ominously, it has succeeded in inculcating large segments of the western population with contempt for their own culture and heritage. This is a dire state indeed, for no society that is despised by its own people can go on for very long.

Regardless of its rhetoric or avowed objective, the driving force behind the Left’s every movement is to strike against some aspect of Western society. Environmentalism, for instance, hits at the West’s economic foundation of free-market capitalism. Multiculturalism seeks to unravel its cultural coherence. The gay rights movement strikes at its moral underpinnings, and so on. The Left, of course, will deny the real reason for its actions. But to evaluate the true value of any act we need to look at its effect not the rhetoric behind it. And the effects of the Left’s actions are invariably – in one way or another – destructive to the West.

The Left’s gains have been greatly facilitated by its ingenious modus operandi, which is to cloak its destructive intent in the language of good causes. Civil rights, gender equality, ecological preservation are among some of its favorite ploys. The ostensible caring is not real, for these are not at all what the Left’s efforts are ultimately about.

The West’s greatest threat is neither Islam nor any other external foe. It is its own political Left. All the great ills and woes under which our civilization so agonizingly belabors – and under the weight of which it is slowly sinking – have been either brought on or inflamed by it.
To which I add: The Left's weakening of the West makes the West more vulnerable to militant Islam.

UPDATE: I have just come across a column by Dennis Prager at townhall.com, where Prager has this to say:
For a decade or more, it has been a given on the Left that Israel is to blame for terror committed against Israelis by Palestinian Muslims (Palestinian Christians don't engage in suicide terror). What else are the Palestinians supposed to do? If they had Apache helicopters, the argument goes, they would use them. But they don't, so they use the poor man's nuclear weapon -- suicide terror.

The same argument is given to explain 9-11. Three thousand innocent Americans were incinerated by Islamic terrorists because America has been meddling in the Middle East so long. This was bound to happen. And, anyway, don't we support Israel?

And when Muslim terrorists blew up Madrid trains, killing 191 people and injuring 1,500 others, the Left in Spain and elsewhere blamed Spanish foreign policy. After all, the Spanish government had sent troops into Iraq.

When largely Muslim rioters burned and looted for a month in France, who was blamed? France, of course -- France doesn't know how to assimilate immigrants, and, as the BBC reported on Nov. 5, 2005, "[Interior Minister Nicolas] Sarkozy's much-quoted description of urban vandals as 'rabble' a few days before the riots began is said by many to have already created tension." Calling rabble "rabble" causes them to act like rabble. . . .

[O]ne way to describe the moral divide between conservatives and liberals is whom they blame for acts of evil committed against innocent people, especially when committed by non-whites and non-Westerners. Conservatives blame the perpetrators, and liberals blame either the victims' group or the circumstances. . . .

We don't know who will be the next target of Islamic or other murderers from poor or non-Western or non-white groups. All we can know is that liberal and leftist thought will find reasons to hold the targeted group largely responsible.
Related posts:

Lefty Profs
Apropos Academic Freedom and Western Values
Riots, Culture, and the Final Showdown
Government's Role in Social Decline
Capitalism, Liberty, and Christianity

The Joys of Sole Proprietorship

Glen Whitman, in a post at Agoraphilia, says that
[l]oosely speaking, accounting profit considers only expenditures as costs, whereas economic profit counts both expenditures and forgone income as costs. The classic example is a sole proprietor who works 60 hours/week running his store. On paper, he might appear to be making a large (accounting) profit. But if you subtracted the income he could have received had he taken a job working the same number of hours for someone else, his (economic) profit would be smaller, maybe even negative.
I think Whitman omits an important component of economic income, which is sometimes called "psychic income." The sole proprietor gains a non-pecuniary benefit by working for himself: being his own boss. That's why many persons choose the long hours and greater risks of sole proprietorship over the generally shorter hours and more stable income of salaried employment.

Then, too, there's always the possibility that one's sole proprietorship will be bought out by a larger company for millions of dollars. That's a potential pecuniary benefit that usually isn't available to a salaried employee. But I think that potential benefit is secondary to the psychic income derived from being one's own boss.
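
To make the distinction concrete, here is a minimal sketch in Python. The figures are invented for illustration (the revenue, expenses, forgone salary, and the dollar value placed on psychic income are all assumptions, not data); only the logic matters:

    # Hypothetical figures for a sole proprietor -- illustrative only.
    revenue = 120_000         # annual receipts of the store
    expenditures = 60_000     # rent, inventory, wages paid to others, etc.
    forgone_salary = 70_000   # what 60 hrs/week could earn working for someone else
    psychic_income = 15_000   # assumed dollar value of being one's own boss

    accounting_profit = revenue - expenditures            # counts only expenditures
    economic_profit = accounting_profit - forgone_salary  # Whitman's adjustment
    with_psychic = economic_profit + psychic_income       # my adjustment

    print(accounting_profit)  # 60000: a "large" profit on paper
    print(economic_profit)    # -10000: negative, as Whitman suggests it may be
    print(with_psychic)       # 5000: positive once psychic income is counted

On these made-up numbers, the proprietor looks profitable to his accountant, unprofitable to Whitman's economist, and rationally self-employed once the value of being one's own boss is priced in.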

Monday, February 27, 2006

Something to Ponder

A traveler in the desert who has run out of water comes upon a well that is enclosed by a high chain-link fence, in which there is a locked gate. The fence demarcates the property of the well's owner, who has plenty of water for his own needs and could give some away at no loss to himself. The traveler shouts until he is heard by the owner of the property, who comes to the gate and asks the traveler what he wants. The traveler says that he would like to fill his canteen so that he can continue his journey and not die of thirst before he reaches his destination. The fence and gate are so high that the traveler cannot give his canteen to the property owner by throwing it; the property owner must unlock the gate and, thus, give the traveler an opportunity to force his way in.

The property owner either gives water to the traveler or refuses to give any water to the traveler. The property owner's reasons for giving or refusing water are unknown to us. It is possible, for example, that the property owner is torn between (a) empathy for a human being in distress and (b) a suspicion (based on knowledge and/or experience) that the traveler might try to rob him. It is possible, also, that the property owner is misanthropic, which is why he lives behind a very high fence in the middle of a desert. There may be other explanations for the property owner's decision to give or refuse water to the traveler. All we know is the property owner's decision.

How do you react if the property owner refuses to give water to the traveler?

1. He had to make a judgment. No one is in a position to second-guess that judgment.

2. The property owner doesn't owe water to the traveler. The traveler should have been better prepared for his journey and brought more water. It was happenstance that brought him to a property on which there was a well. What would he have done if a property with a well hadn't been on his route? Think of the kinds of behavior a property owner might encourage and invite if he were to succumb to the blandishments of an imprudent traveler or a criminal masquerading as one.

3. The property owner has a moral duty to aid the traveler, even at some risk to himself (the property owner). But, if the property owner refuses to help the traveler, the consequences of the refusal are on the property owner's conscience. It is no one else's business.

4. There should be a law that requires property owners to give water to travelers, even though such a law: (a) might encourage some travelers to go forth with inadequate supplies of water even though they might not come across a well, and (b) might make it easier for criminals to attack and rob property owners.

I am content with reaction 1. Reaction 2 isn't inconsistent with reaction 1, but I find reaction 2 unnecessarily defensive of the owner's decision. (Reaction 2 may be politically necessary, however, because of reactions 3 and 4.) Reaction 3 substitutes a third party's judgment for that of the owner. And the "moral duty" part of reaction 3 forms the basis for reaction 4, which then translates the third party's judgment into a legal stricture. The legal stricture on voluntary behavior has the usual results: It creates a moral hazard for travelers and has (negative) unintended consequences for property owners.

Sunday, February 26, 2006

Misdiagnosing the Problem

The usually clear-thinking Michael Barone goes astray:
Here is a map showing the location of riots protesting the Danish cartoons. And here's a link to Thomas Barnett's "nonintegrated gap." Notice the similarity? Barnett, as faithful readers of this blog will know, argues that the major task before us in the "functioning core" (North America, much of South America, Europe, India, Japan, and East Asia) is to integrate the "nonintegrated gap" (the Muslim world from the Maghreb to Pakistan, Indonesia, as well as the Philippines and part of Andean Latin America) into the free-market, rule-of-law core. The riots occurring largely in the gap (and in Muslim communities in Europe) are just the latest symptoms of the problem.
How has a problem that's endemic to the cultures of the "nonintegrated gap" become our problem? We don't force their culture (and the resulting ignorance and poverty) on them; they do it to themselves. For more, read this.

Saturday, February 25, 2006

Sunstein and Executive Power

Cass Sunstein endorses unilateral executive action. But Sunstein doesn't mean to endorse George W. Bush's use of executive power to defend America. Sunstein's aim is to justify the resuscitation of Franklin D. Roosevelt's disastrous New Deal.

Related posts:

Sunstein at the Volokh Conspiracy
More from Sunstein
Call Me a Constitutional Lawyer
(Sen)seless Economics
Cass Sunstein's Truly Dangerous Mind
An (Imaginary) Interview with Cass Sunstein
Slippery Sunstein

Monopoly and the General Welfare

UPDATED, 02/26/06

I began an earlier post by illustrating (with a "Jack and Jill" example) how a regime of liberty fosters prosperity. I concluded:
In sum, liberty -- which includes the right to engage in voluntary exchange -- makes both Jack and Jill better off. Moreover, because they are better off they can convert some of their gains from trade into investments that yield even more output in the future. For example, to continue with this homely metaphor, imagine that Jill -- fueled by additional food -- is able to produce the usual amount of butter in less time, giving her time in which to design and build a churn that can produce butter at a faster rate.

Liberty advances the general welfare, which means the general well-being -- not handouts.
You may have noticed that I did not argue that Jack and Jill's well-being can somehow be aggregated into a "social welfare function." As I wrote here, there is no such thing:

Suppose . . . that a faction of US citizens (call it LW) is unhappy because of certain actions being taken to prevent an attack by AQ. The actions that make LW unhappy don't make me unhappy. In fact, they add to my happiness because I despise LW; anything that makes LW unhappy makes me happier. Thus, I'll continue to be happy, despite LW's unhappiness, unless and until (a) LW's unhappiness leads to a political decision to stop defending US against AQ or (b) AQ attacks US successfully.

I could go on, but I think you get the idea. My happiness (or unhappiness) is mine, and yours is yours. The best we can say is that voluntary exchange in free markets, protected by strict enforcement of laws against force and fraud, would make almost everyone happier -- and wealthier. So much wealthier that there'd be plenty of money with which to buy off the free-loaders. But that's another story.

(See also this post.)

In yet another post I defended monopoly:
Where, for instance, is there room in the socialist or regulatory calculus for a rule that allows for unregulated monopoly? Yet such an "undesirable" phenomenon can yield desirable results by creating "exorbitant" profits that invite competition (sometimes from substitutes) and entice innovation. (By "unregulated" I don't mean that a monopoly should be immune from laws against force and fraud, which must apply to all economic actors.)
(There's more here.) And in this post, on the subject of monopoly (scroll to item 19), I opened by saying, "Monopoly (absent force, fraud, or government franchise) beats regulation, every time." This follows:
Regulators live in a dream world. They believe that they can emulate -- and even improve on -- the outcomes that would be produced by competitive markets. And that's precisely where regulation fails: Bureaucratic rules cannot be devised to respond to consumers' preferences and technological opportunities in the same ways that markets respond to those things. The main purpose of regulation (as even most regulators would admit) is to impose preferred outcomes, regardless of the immense (but mostly hidden) cost of regulation.

There should be a place of honor in regulatory hell for those who pursue "monopolists," even though the only true monopolies are run by governments or exist with the connivance of governments (think of courts and cable franchises, for example). The opponents of "monopoly" really believe that success is bad. Those who agitate for antitrust actions against successful companies -- branding them "monopolistic" -- are stuck in a zero-sum view of the economic universe (see No. 13), in which "winners" must be balanced by "losers." Antitrusters forget (if they ever knew) that (1) successful companies become successful by satisfying consumers; (2) consumers wouldn't buy the damned stuff if they didn't think it was worth the price; (3) "immense" profits invite competition (direct and indirect), which benefits consumers; and (4) the kind of innovation and risk-taking that (sometimes) leads to wealth for a few also benefits the many by fueling economic growth.

. . . What about those "immense" profits? They don't just disappear into thin air. Monopoly profits ("rent" in economists' jargon) have to go somewhere, and so they do: into consumption, investment (which fuels economic growth), and taxes (which should make liberals happy). It's just a question of who gets the money.

But isn't output restricted, thus making people generally worse off? That may be what you learned in Econ 101, but that's based on a static model which assumes that there's a choice between monopoly and competition. I must expand on some of the points I made in the original portion of this commandment:
  • Monopoly (except when it's gained by force, fraud, or government license) usually is a transitory state of affairs resulting from invention, innovation, and/or entrepreneurial skill.
  • Transitory? Why? Because monopoly profits invite competition -- if not directly, then from substitutes.
  • Transitory monopolies arise as part of economic growth. Therefore, such monopolies exist as a "bonus" alongside competitive markets, not as alternatives to them.
  • The prospect of monopoly profits entices more invention, innovation, and entrepreneurship, which fuels more economic growth.
I will now try to braid these strands into a rope fit for hanging trust-busters. Returning to Jack and Jill, suppose this:
  • Jack can make 1 loaf of bread a day; he does not know how to make butter.
  • Jill and June each can make 1 pound of butter a day; neither knows how to make bread.
  • Jack is a "natural monopolist" in bread; Jill and June must bid against each other to buy Jack's bread, and must compete with each other in selling butter to Jack.
Now, the question for Jack, Jill, and June is this: At what rate should they exchange bread and butter? Contrary to the anti-trust mentality, there is no right answer to that question. The answer depends on Jack, Jill, and June's respective preferences for bread and butter, and on their respective negotiating skills. But of one thing we can be certain: Jack, Jill, and June will strike bargains that make all of them better off than they would be in the absence of trade -- if they are left alone by government.

How can that be so if Jack is a "monopolist"? Doesn't he have an "unfair" advantage in his dealings with Jill and June? No. Consider:
  • Suppose that Jack and Jill (and June, as well) prefer bread and butter in the ratio 1/3 loaf: 2/3 pound. Before June enters the picture, Jack and Jill cannot both have their preferred combination of bread and butter. Jack would like to trade 1/3 loaf to Jill in return for 2/3 pound, and Jill would like to trade 1/3 pound to Jack in return for 1/3 loaf. Neither trade will fully satisfy both parties, so they must strike a compromise that leaves both of them better off than if they did not trade, but not as well off as they would like to be (with respect to the ratio of bread and butter).
  • When June enters the picture, all parties can have their preferred combination of bread and butter, and they will (if no one interferes). Jack, as the only breadmaker, is not in a superior position with respect to Jill and June. He makes something that they want, but they also make something that he wants. (A "natural monopolist" does not live by bread alone.) And so, Jack must bargain with Jill and June in order to maximize his satisfaction.
  • If Jack tries to bargain with Jill and June by withholding all of his bread, he must accept the fact that he will not have any butter to put on it. If Jack tries to charge Jill and/or June more for his bread than they are willing to pay, they simply will not pay it.
  • What's likely to happen if Jack withholds bread or demands more than Jill and June are willing to pay? Jill and June will learn to make bread for themselves, as if Jack doesn't exist. They will have some bread with their butter; Jack will have no butter with his bread. Who's worse off now, Jack?
  • If Jack persists in his holdout, he isn't guilty of a crime; he's merely guilty of choosing to accept a reduced standard of living. That's a personal preference, not a crime. The same result (for Jill and June) would obtain if Jack were dead. No crime there. Jack is guilty of a "crime" only if he prevents Jill and June from making bread, or if he forces them to give him butter on terms they wouldn't accept voluntarily.
(UPDATE: Have I stacked the deck by the example I used? No. See the addendum, below.)

The only kind of monopoly that harms consumers is a legal monopoly, one that is operated or regulated by government. Such a monopoly isn't harmful per se; it's harmful because the government's operation or regulation of the monopoly ensures that it cannot and will not respond to price signals. A natural monopolist (like Jack the breadmaker) must bargain with his customers, and must be alert to the possibility that his customers will turn to substitutes and near-substitutes if he doesn't bargain with them. But when government operates and regulates whole sectors of the economy (e.g., telecommunications and health care), price signals are practically meaningless -- there is no bargaining -- and substitutes are hard to come by (near-substitutes will be regulated, of course).

The only real monopoly, then, is one that is operated or regulated by government. It is that kind of monopoly -- not Microsoft or Wal-Mart (for example) -- which ought to be broken up or fenced in by the trust-busters.
__________
ADDENDUM: I contrived a special case to illustrate the general point that "the consumer always prefers more to less." That is, the introduction of June, who brings additional butter to the table (so to speak), gives everyone the option of enjoying more butter. No one can be worse off than before, and at least one person can be better off. But suppose that instead of the outcome I sketched above (Jack, Jill, and June each end up with 1/3 loaf of bread and 2/3 pound of butter) June is willing to give Jack 2/3 of her pound of butter in exchange for 1/3 loaf of bread, a deal with which Jill chooses not to compete. The result: Jack (who starts with 1 loaf of bread) has 2/3 loaf of bread and 2/3 pound of butter (extra bread for a rainy day), Jill (who starts with 1 pound of butter) has no bread and 1 pound of butter, and June (who starts with 1 pound of butter) has 1/3 loaf of bread and 1/3 pound of butter (a light eater). Jack and June are, by definition, happy with the outcome of the trade (June wouldn't make the deal with Jack if that weren't the case), but Jill is unhappy because she'd prefer to have 1/3 loaf of bread and 2/3 pound of butter, as before. Three points:

1. The new result has everything to do with differences of taste between Jill and June -- and nothing to do with Jack's so-called monopoly power.

2. To say that Jill has somehow become a "victim" of Jack's so-called monopoly power is to give a privileged position to Jill. Why are Jill's preferences any more important than Jack's or June's? There's no way to assign values to Jack, Jill, and June's satisfaction, let alone to sum such values and determine that "society" is somehow better or worse off if Jill doesn't get her way.

3. If June's entry into the butter-bartering market leaves Jill hankering for bread, Jill might be able to earn some bread by making jam instead of butter. She would then have something to offer both Jack and June. In other words, the price system is sending a signal to Jill; it's up to her to interpret and act on that signal. That signal would not be sent if a "benevolent" government, acting at the behest of Jill's political action committee, were to step in and dictate the terms of trade between Jack, Jill, and June (according to Jill's preferences). The result would be to make Jill better off while making Jack and June worse off. That's what happens when interest groups are able to use the power of government to satisfy their preferences.
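
For readers who want to check the arithmetic of both outcomes -- the main example's and the addendum's -- here is a minimal bookkeeping sketch in Python. The fractions come straight from the text above; the code merely confirms that trade reshuffles the original endowments without creating or destroying anything:

    # Endowments: Jack holds 1 loaf of bread; Jill and June hold 1 lb of butter each.
    # Each allocation is (loaves of bread, pounds of butter).
    outcome_a = {"Jack": (1/3, 2/3), "Jill": (1/3, 2/3), "June": (1/3, 2/3)}  # main example
    outcome_b = {"Jack": (2/3, 2/3), "Jill": (0.0, 1.0), "June": (1/3, 1/3)}  # addendum

    def check(outcome):
        # Voluntary exchange only redistributes goods: totals must equal the endowments.
        total_bread = sum(bread for bread, _ in outcome.values())
        total_butter = sum(butter for _, butter in outcome.values())
        assert abs(total_bread - 1.0) < 1e-9, "bread doesn't add up"
        assert abs(total_butter - 2.0) < 1e-9, "butter doesn't add up"

    check(outcome_a)
    check(outcome_b)
    print("Both outcomes conserve the 1 loaf and 2 pounds the parties started with.")

The point of the exercise: nothing in either outcome requires Jack to have taken anything from anyone. The difference between the two outcomes is entirely a matter of who traded what to whom, voluntarily.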

More Evidence for Combinatorial Recreation

Read this, then this.

Friday, February 24, 2006

Starving the Beast, Updated

Out-of-control spending is a hot topic of conversation in the blogosphere, especially among those who are disappointed in President Bush's failure to curb the federal government's appetite. Bill Niskanen and Peter Van Doren, former colleagues of mine at Cato Institute, published a paper a few years ago (which no longer seems to be available on the web), in which they said this:
For nearly three decades, many conservatives and libertarians have argued that reducing federal tax rates, in addition to increasing long-term economic growth, would reduce the growth of federal spending by "starving the beast." This position has recently been endorsed, for example, by Nobel laureates Milton Friedman and Gary Becker in separate Wall Street Journal columns in 2003.
It seems to me that the notion of starving the beast is really an outgrowth of an older, simpler notion that could have been called "strangle the beast." The notion was (and still is, in some quarters) that the intrusive civilian agencies of the federal government, which have grown rampantly since the 1930s, ought to be slashed, if not abolished. There's no need for fancy tricks like cutting taxes first: just grab the beast by the budget and choke it. There's more than money at stake, of course -- there's liberty and economic growth. (I have shown here the extent to which the beast of government has strangled economic growth.)

Anyway, Niskanen and Van Doren argue that the "starve the beast" strategy has failed, which is true, but I have serious reservations about their analysis. Their figure of merit is spending as a share of GDP. But it's the absolute, real size of the beast's budget that matters. Bigger is bigger -- and bigger agencies can cause more mischief than smaller ones. So, my figure of merit is real growth in nondefense spending.

What about defense spending, which Niskanen and Van Doren lump with nondefense spending in their analysis? Real nondefense spending has risen almost without interruption since 1932, with the only significant exception coming in 1940-5, when World War II cured the Depression and drastically changed our spending priorities. Real defense spending, on the other hand, has risen and fallen several times since 1932, in response to exogenous factors, namely, the need to fight hot wars and win a cold one. Niskanen and Van Doren glibly dismissed the essentially exogenous nature of defense spending by saying
that the prospect for a major war has been substantially higher under a unified government. American participation in every war in which the ground combat lasted more than a few days -- from the war of 1812 to the current war in Iraq -- was initiated by a unified government. One general reason is that each party in a divided government has the opportunity to block the most divisive measures proposed by the other party.
First, defense outlays increased markedly through most of Reagan's presidency, even though a major war was never imminent. The buildup served a strategy that led to the eventual downfall of the USSR. Reagan, by the way, lived with divided government throughout his presidency. Second, wars are usually (not always, but usually) broadly popular when they begin. Can you imagine a Republican Congress trying to block a declaration of war after the Japanese had bombed Pearl Harbor? Can you imagine a Democrat Congress trying to block Bush II's foray into Afghanistan after 9/11? For that matter, can you imagine a Democrat-controlled Congress blocking Bush I's Gulf War Resolution? Well, Congress was then in the hands of Democrats, and it nevertheless authorized the Gulf War. Niskanen and Van Doren seem to dismiss this counter-example because the ground war lasted only 100 hours. But we fielded a massive force for the Gulf War (it was no Grenada), and we certainly didn't expect the ground war to end so quickly.

As I was saying, domestic spending is the beast to be strangled. (I'm putting aside here the "sacred beasts" that are financed by transfer payments: Social Security, Medicare, etc.) How has the domestic beast fared over the past 70-odd years? Quite well, thank you. It fared best from 1933 through 1969, but it hasn't done badly since 1969.

The beast -- a creature of the New Deal -- grew four-fold from 1932 through 1940. Preparations for war, and war itself, brought an end to the Great Depression and stifled nondefense spending: It actually dropped by more than 50 percent (in real terms) from 1940 through 1945.

After World War II, Truman and the Democrats in control of Congress were still under the spell of their Depression-inspired belief in the efficacy of big government and counter-cyclical fiscal policy. The post-war recession helped their cause, because most Americans feared a return of the Great Depression, which was still a vivid memory. Real nondefense spending increased by 180 percent during the Truman years.

The excesses of the Truman years caused a backlash against "big government" that the popular Eisenhower was able to exploit, to a degree, in spite of divided government. Real domestic spending went up by only 9 percent during Ike's presidency.

The last burst of the New Deal came in the emotional aftermath of Kennedy's assassination and Lyndon Johnson's subsequent landslide victory in the election of 1964. Real nondefense spending in the Kennedy-Johnson years rose by 56 percent. The decades-long war over domestic spending really ended with the enactment of LBJ's Great Society. The big spenders won that war -- big time. Real nondefense spending grew at an annual rate of 5.9 percent from 1932 through 1969.

Real nondefense spending has continued to grow since 1969, but at the lower rate of 2.5 percent per annum. What has changed is that nondefense spending has grown more steadily than it did from 1932 to 1969. Each administration since 1969 (aided and abetted by Congress, of course) has increased nondefense spending by following an implicit formula. That formula has two parts. First, there is the steady increase that is required to feed the beast that came to maturity with the Great Society. Second, there is countercyclical spending which is triggered by recessions and unemployment. As a result, there is a very strong -- almost perfect -- relationship between real nondefense spending and the unemployment rate for the years 1969 through 2005. Using a linear regression with six pairs of observations, one pair for each administration, I find that the growth of real nondefense spending over the course of an administration (expressed as an end-of-term to start-of-term ratio) is a linear function of the corresponding change in the unemployment rate (also expressed as a ratio). Specifically:
S = 1.0277 + 0.11346U

where S = real nondefense spending at end of a presidency/real nondefense spending at beginning of a presidency

U = unemployment rate at end of a presidency/unemployment rate at beginning of a presidency.

The adjusted R-squared for the regression is .979. The t-stats are 112.17 for the constant term and 15.26 for U.
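
For anyone who wants to replicate the method, here is a minimal sketch in Python. Note well: the (U, S) pairs below are placeholders, not the actual ratios computed from the spending and unemployment data described above, which I haven't reproduced here.

    import numpy as np

    # Placeholder ratios, one per administration from Nixon-Ford through Bush II.
    # Substitute the actual end-of-term/start-of-term ratios to reproduce the result.
    U = np.array([1.50, 0.95, 0.75, 1.35, 0.70, 1.40])  # unemployment-rate ratios (hypothetical)
    S = np.array([1.19, 1.13, 1.11, 1.18, 1.11, 1.19])  # real nondefense spending ratios (hypothetical)

    # Ordinary least squares fit of S = a + b*U.
    b, a = np.polyfit(U, S, 1)
    print(f"S = {a:.4f} + {b:.5f} U")

    # Actual-to-estimated ratios -- the comparison reported below and in the footnote.
    estimated = a + b * U
    print(np.round(S / estimated, 2))

With the real data, the fit yields the equation given above, and the last line produces the actual-to-estimated ratio for each administration.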
What about divided government, of which Niskanen and Van Doren are so fond? Divided government certainly hampered the ability of Republican administrations (Nixon-Ford, Reagan, Bush I, and Bush II) to strangle the beast, had they wanted to. But it's not clear that they wanted to very badly. Nixon was, above all, a pragmatist. Moreover, he was preoccupied by foreign affairs (including the extrication of the U.S. from Vietnam), and then by Watergate. Ford was only a caretaker president, and too "nice" into the bargain. Reagan talked a good game, but he had to swallow increases in nondefense spending as the price of his defense buildup. Bush I simply lacked the will and the power to strangle the beast. Bush II may have had the power (at one time), but he spent it on support for his foreign policy and in an effort to buy votes for the GOP.

Bureaucratic politics (rather than party politics) is the key to the steady growth of nondefense spending. It's hard to strangle a domestic agency once it has been established. Most domestic agencies have vocal and influential constituencies, in Congress and amongst the populace. Then there are the presidential appointees who run the bureaucracies. Even Republican appointees usually come to feel "ownership" of the bureaucracies they're tapped to lead. Real nondefense spending therefore has risen steadily from the Great Society baseline, fluctuating slightly in countercyclical response to recessions and unemployment.

Having said all that, how do the presidents from Nixon through Bush II stack up? In spite of all the blather about Bush II's big-spending ways, there's not a dime's worth of difference among the post-Great Society administrations -- Democrat or Republican. Using the above regression equation, I estimated the expected growth of real nondefense spending for each administration. I then used that estimate to compute an actual-to-estimated ratio (how much nondefense spending actually rose divided by how much it "should" have risen, according to the equation).* The results:
Nixon-Ford -- 1.00
Carter -- 1.00
Reagan -- 1.00
Bush I -- 1.00
Clinton -- 1.01
Bush II -- 0.99
The lesson is clear: Tax cuts won't starve the beast -- Friedman, Becker, and other eminent economists to the contrary. Tax increases, on the other hand, would only stimulate the beast's appetite. The best way to cut spending is . . . to cut spending.

In any event, the truly vicious beast isn't federal nondefense spending, it's state and local spending. Spending by state and local governments in the United States is five times as large as the federal government's nondefense spending. Real spending by state and local governments increased by a multiple of 11 from 1945 to 2005. The population of the United States merely doubled in that same period. Thus the average American's real tax bill for state and local government is more than five times larger today than it was in 1945.
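
The per-capita claim in that last sentence is just the ratio of the two multiples; a two-line sketch (using the rounded figures quoted above) makes it explicit:

    spending_multiple = 11    # growth of real state/local spending, 1945-2005
    population_multiple = 2   # approximate growth of U.S. population, same period
    print(spending_multiple / population_multiple)  # 5.5 -- "more than five times"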

It's evident that not enough of the loot has been spent on courts and police. No, our modern, "relevant" state and local governments have seen fit to waste our money on such things as free bike trails for yuppies, free concerts that mainly attract people who can afford to pay for their own entertainment, all kinds of health services, housing subsidies, support for the "arts," public access channels on cable TV, grandiose edifices in which our state and local governments hatch and oversee their grandiose schemes, and much, much, more.

Then there are those public schools . . .

The good news about state and local spending is that its real rate of growth has dropped in the past few years. The bad news is that the slowdown coincided with a recession and period of slow economic recovery. The good news is that state and local spending is a beast with thousands of necks, and each of them can be throttled at the state and local level, given the will to do so.
__________
* By the standards of 1969-2005, here's how earlier administrations stack up:
Hoover -- 0.68
Roosevelt -- 3.29 (through 1940)
Truman -- 2.30
Eisenhower -- 0.85
Kennedy-Johnson -- 1.43
There used to be a real difference between Republicans and Democrats. Now there isn't. That doesn't make Democrats any better; it just confirms my version of the old adage: The pursuit of power corrupts.

Thursday, February 23, 2006

Useful Additions to My Lexicon

Boundary violation

Genetic fallacy

Apropos Academic Freedom and Western Values

My recent post about "Lefty Profs" sparked a post by Joe Miller at Bellum et Mores. There, Joe has penned "A Defense of the Loony Left," which is really a witty defense of academic freedom, not of the loony left.

This post adds to what I have said in "Lefty Profs" and "A Politically Incorrect Democrat" (which is about Larry Summers's decision to step down as president of Harvard). An unspoken but very real motivation for those posts is the danger that the so-called loony left poses to the very freedoms we enjoy because of Western culture. Apropos that theme, Keith Windschuttle has posted a long essay at thesydneyline entitled "The Adversary Culture: The perverse anti-Westernism of the cultural elite." Along the way, Windschuttle observes:

Cultural relativism claims there are no absolute standards for assessing human culture. Hence all cultures should be regarded as equal, though different. It comes in two varieties: soft and hard.

The soft version now prevails in aesthetics. Take a university course in literary criticism or art theory and you will now find traditional standards no longer apply. Italian opera can no longer be regarded as superior to Chinese opera. The theatre of Shakespeare was not better than that of Kabuki, only different.

The hard version comes from the social sciences and from cultural studies. Cultural practices from which most Westerners instinctively shrink are now accorded their own integrity, lest the culture that produced them be demeaned.

There are absolute standards for assessing human culture. Here's mine: A culture that respects life, fosters liberty, and protects the pursuit of happiness will -- among other things -- yield economic well-being greater than that of a culture which does not respect life, foster liberty, or protect the pursuit of happiness.

Windschuttle concludes:

The concepts of free enquiry and free expression and the right to criticise entrenched beliefs are things we take so much for granted they are almost part of the air we breathe. We need to recognise them as distinctly Western phenomena. They were never produced by Confucian or Hindu culture. Under Islam, the idea of objective inquiry had a brief life in the fourteenth century but was never heard of again. In the twentieth century, the first thing that every single communist government in the world did was suppress it.

But without this concept, the world would not be as it is today. There would have been no Copernicus, Galileo, Newton or Darwin. All of these thinkers profoundly offended the conventional wisdom of their day, and at great personal risk, in some cases to their lives but in all cases to their reputations and careers. But because they inherited a culture that valued free inquiry and free expression, it gave them the strength to continue.

Today, we live in an age of barbarism and decadence. There are barbarians outside the walls who want to destroy us and there is a decadent culture within. We are only getting what we deserve. The relentless critique of the West which has engaged our academic left and cultural elite since the 1960s has emboldened our adversaries and at the same time sapped our will to resist.

The consequences of this adversary culture are all around us. The way to oppose it, however, is less clear. The survival of the Western principles of free inquiry and free expression now depend entirely on whether we have the intelligence to understand their true value and the will to face down their enemies.
My counsel isn't to round up the loony left and ship it off to Afghanistan, salutary as that might be for the loony left and the rest of us. No, my counsel is that those of us who value the best of Western culture must vigilantly defend it against the depredations of the loony left. That is why I speak out.

(Thanks to Political Correctness Watch for the link to Windschuttle's essay.)

Wednesday, February 22, 2006

Another Voice Against the New Paternalism

Glen Whitman of Agoraphilia weighs in:
If we think of a person as consisting of multiple selves—the present self who wishes to indulge in transient pleasures versus the future self who wishes to be healthy—then arguably the present self’s choices can force externalities on the future self. Those within-person externalities have been dubbed “internalities.” And just as we might impose a pollution tax on a factory to control the externality problem, we might impose a sin tax on items like cigarettes, alcohol, and fatty foods to control the internality problem.

The concept of internalities, although not yet a part of mainstream economics, is gaining attention. It is one among many novel economic models recently deployed by a new generation of paternalists. Paternalistic arguments advocate forcing or manipulating individuals to change their behavior for their own good, as distinct from the good of others. At one time paternalists argued that adults, like children, don’t really know what’s best for them. Some preferences, they argued, such as those for unhealthy food or casual sex, are just wrong. But such arguments hold little sway in a free society, where most people believe they should be able to pursue their own values and preferences even if others don’t share them. So the “new” paternalists have wisely chosen not to question people’s preferences directly; instead, they argue that internalities (and other sources of error in decisionmaking) can lead people to make decisions that are unwise even according to their own values and preferences.

In short, the old paternalism said, “We know what’s best for you, and we’ll make you do it.” The new paternalism says, “You know what’s best for you, and we’ll make you do it.”. . .

First, the new paternalism blithely assumes that, when your present self can impose costs on your future self, the outcome is necessarily bad. But preventing harm to the future self might involve even greater harm to the present self. There’s no valid reason to assume, when there is an inconsistency between present and future interests, that the latter must trump the former.

Second, the new paternalism ignores the fact that harms can be avoided in multiple ways. Restricting present behavior is one way to reduce future harms, but that doesn’t make it the best way. The future self might be capable of mitigating the harm at lower cost by other means.

Third, the new paternalism neglects the possibility of internal bargains and private solutions. All of us face self-control problems from time to time. But we also find ways to solve, or at least mitigate, those problems. We make deals with ourselves. We reward ourselves for good behavior and punish ourselves for bad. We make promises and resolutions, and we advertise them to our friends and families. We make commitments to change our own behavior. Internality theorists point to these behaviors as evidence that the internality problem exists. But they are actually evidence of the internality problem being solved, at least to some degree.

People are not perfect, so we should not expect real people’s actions to mimic those of perfectly rational and perfectly consistent beings. Mistakes will occur; self-control problems will persist. But paternalist solutions will solve them no better than personal solutions. What is really at stake is how self-control problems will be addressed—through private, voluntary means or through the force of government.

The new paternalists would have us believe that benevolent government can—through taxes, subsidies, restrictions on the availability of products, and so on—make us happier according to our own preferences. But even if we place little or no value on freedom of choice for its own sake, the paternalists’ recommendations simply don’t follow. Public officials lack the information and incentives necessary to craft paternalist policies that will help the people who most need help, while not harming those who don’t need the help or who need help of a different kind. Individuals, on the other hand, have every reason to understand their own needs and find suitable means of solving their own problems.
That's just what I've been saying:

The Rationality Fallacy
Libertarian Paternalism
A Libertarian Paternalist's Dream World
The Short Answer to Libertarian Paternalism
Second-Guessing, Paternalism, Parentalism, and Choice
Another Thought about Libertarian Paternalism
Back-Door Paternalism

More about Preemptive War

Go to Bellum et Mores, start with "War, the Constitution, and the UN II: Return of the Cosmopolitans" (posted February 19), and be sure to read all the comments (I'm there). Then scroll up to read what Joe Miller's students have to say about preemption. (Posts on other subjects are interspersed.)

Combinatorial Recreation

What's that? It's the term Einstein used to describe how a complex problem often is solved subconsciously while a person is engaged in a "mindless" diversion, or sleeping. I was reminded of the phenomenon by this post at FuturePundit.

Rating the Presidents, Again

Almost two years ago I commented on ratings of the presidents that were published at OpinionJournal. My own ratings were implicit in my comments, but I didn't finish the job and produce a top-to-bottom list of presidents. David N. Mayer (MayerBlog) has done so, and done it with brilliance -- here. The post is quite long, as Mayer's posts usually are, and every bit as rewarding. You should read the whole thing, but I cannot resist the urge to give you a preview.

Mayer notes that his "rating system differs from others in deemphasizing 'leadership' per se and instead emphasizing fidelity to the Constitution." By that standard his "Great" presidents are Washington, Jefferson and Lincoln. His "Failures" comprise, in descending order, Theodore Roosevelt and Woodrow Wilson (tie), Franklin Roosevelt, Lyndon Johnson, Richard Nixon, and (last and least) Bill Clinton. I find no fault with Mayer's rating scheme or his results.

Mayer's designation of Lincoln as a "Great" will rankle many libertarians, especially the anarcho-capitalists who hang around the Ludwig von Mises Institute and LewRockwell.com. (My latest disparagement of their anarcho-romanticism is here. See also this piece about slavery.) Mayer says of Lincoln:
[H]e does not warrant the severe criticism that certain libertarian scholars have given him, calling him “tyrant” or “dictator” and erroneously claiming that the modern regulatory/welfare state began with the Civil War. Rather, I maintain, Lincoln did indeed save not only the Union but also the Constitution itself, from the most formidable internal threat it has ever (yet) faced.
In the end, what matters most is whether a president preserves liberty, and even advances it. How he does it is less important than whether or not he does it.

Finally, happy 274th birthday to George Washington.


[Image: portrait of George Washington. Source: Wikipedia.]

Tuesday, February 21, 2006

Lefty Profs

Orin Kerr's post about "Radicals in Higher Education" at The Volokh Conspiracy has drawn 138 comments (and still counting). Here's the post:
Last week, Sean Hannity expressed the following concern on Hannity & Colmes:
Kids are indoctrinated. They’re a captive audience. What can be done to remove these professors with these radical ideas from campus?
Michael Bérubé responds here.
Professor Bérubé also responds with a comment, in which he replies to some of the early comments and offers a link to his lengthy defense of academic freedom. (Which I may bother to eviscerate someday.) But the real issue isn't academic freedom; it's the one-sided political tilt that prevails in the academy.

My own comment:
Professor Bérubé protests too much. I have no time for Sean Hannity, but the essence (if not the tone) of Hannity's question deserves a thoughtful reply. The usual appeal to academic freedom is no more than an effort to deflect attention from the intellectual bankruptcy of leftist academic cant. I have not noticed that Americans are better off for having been subjected to such cant. It took me a few decades to outgrow my own "indoctrination" at the hands of the mostly left-leaning faculty at a State-supported university. And I suspect that my alma mater was far less to the left when I went there in the Dark Ages of the late 1950s and early 1960s than it is today. As for the bias evident in Professor Bérubé's own port-side emissions, I had this to say a while back about a piece Bérubé wrote for The Nation:
Michael Bérubé [is] a professional academic who is evidently bereft of experience in the real world. His qualifications for writing about affirmative action? He teaches undergraduate courses in American and African-American literature, and graduate courses in literature and cultural studies. He is also co-director of the Disability Studies Program, housed in the Rock Ethics Institute at Penn State.

Writing from the ivory tower for the like-minded readers of The Nation ("And Justice for All"), Bérubé waxes enthusiastic about the benefits of affirmative action, which -- to his mind -- "is a matter of distributive justice." Bérubé, in other words, subscribes to "the doctrine that a decision is just or right if all parties receive what they need or deserve." Who should decide what we need or deserve? Why, unqualified academics like Bérubé, of course. Fie on economic freedom! Fie on academic excellence! If Bérubé and his ilk think that a certain class of people deserve special treatment, regardless of their qualifications as workers or students, far be it from the mere consumers of the goods and services of those present and future workers to object. Let consumers eat inferior cake.

Bérubé opines that "advocates of affirmative action have three arguments at their disposal." One of those arguments is that
diversity in the classroom or the workplace is not only a positive good in itself but conducive to greater social goods (a more capable global workforce and a more cosmopolitan environment in which people engage with others of different backgrounds and beliefs).
Perhaps Bérubé knows the meaning of "capable global workforce." If he does, he might have shared it with his readers. As for a workplace that offers a "cosmopolitan environment" and engagement "with others of different backgrounds and beliefs," I say: where's the beef? As a consumer, I want value for my money. What in the hell does diversity -- as defined by Bérubé -- have to do with delivering value? Perhaps that's one reason U.S. jobs are outsourced. (I have nothing against that, but it shouldn't happen because of inefficiency brought about by affirmative action.) Those who seek a cosmopolitan environment and engagement with others of different backgrounds and beliefs can have all of it they want -- on their own time -- just by hanging out in the right (or wrong) places.

Although Bérubé seems blind to the economic cost of affirmative action, he is willing to admit that the practice has some shortcomings:
Affirmative action in college admissions has been problematic, sometimes rewarding well-to-do immigrants over poor African-American applicants--except that all the other alternatives, like offering admission to the top 10 or 20 percent of high school graduates in a state, seem to be even worse, admitting badly underprepared kids from the top tiers of impoverished urban and rural schools while keeping out talented students who don't make their school's talented tenth. In the workplace, affirmative action has been checkered by fraud and confounded by the indeterminacy of racial identities--and yet it's so popular as to constitute business as usual for American big business, as evidenced by the sixty-eight Fortune 500 corporations, twenty-nine former high-ranking military leaders and twenty-eight broadcast media companies and organizations that filed amicus briefs in support of the University of Michigan's affirmative action programs in the recent Supreme Court cases of Gratz v. Bollinger and Grutter v. Bollinger (2003).

Stop right there, professor. Affirmative action is "popular" because it's the law and it's also a politically correct position that boards of directors, senior corporate managers, government officials, and military leaders can take at no obvious cost to themselves. Further, those so-called leaders are sheltered from the adverse consequences of affirmative action on the profitability and effectiveness of their institutions by imperfect competition in the private sector and bureaucratic imperatives in the government sector.

As I wrote in "Race, Intelligence, and Affirmative Action," here's how affirmative action really operates in the workplace:
If a black person seems to have something like the minimum qualifications for a job, and if the black person's work record and interviews aren't off-putting, the black person is likely to be hired or promoted ahead of equally or better-qualified whites. Why?
* Pressure from government affirmative-action offices, which focus on percentages of minorities hired and promoted, not on the qualifications of applicants for hiring and promotion.

* The ability of those affirmative-action offices to put government agencies and private employers through the pain and expense of extensive audits, backed by the threat of adverse reports to higher ups (in the case of government agencies) and fines and the loss of contracts (in the case of private employers).

* The ever-present threat of complaints to the EEOC (or its local counterpart) by rejected minority candidates for hiring and promotion. Those complaints can then be followed by costly litigation, settlements, and court judgments.

* Boards of directors and senior managers who (a) fear the adverse publicity that can accompany employment-related litigation and (b) push for special treatment of minorities because they think it's "the right thing to do."

* Managers down the line who learn to go along and practice just enough reverse discrimination to keep affirmative-action offices and upper management happy.
I reject Bérubé's counsel about academic freedom as utterly as I reject his counsel about affirmative action. Academic freedom seems to be fine for leftists as long as they hold the academy in thrall. More parents would send their children to schools that aren't dominated by leftists if (a) there were enough such schools and (b) the parents could afford to do so. But the left's grip on the academy seems to be as secure as the grip of the labor unions on the American auto industry -- and you can see what has happened to the auto industry as a result.

As I wrote here,
The larger marketplace of ideas counteracts much of what comes out of universities -- in particular the idiocy that emanates from the so-called liberal arts and social sciences. But that's no reason to continue wasting taxpayers' money on ethnic studies, gender studies, and other such claptrap. State legislatures can and should tell State-funded universities to spend less on liberal arts and social sciences and spend more on the teaching of real knowledge: math, physics, chemistry, engineering, and the like. That strikes me as a reasonable and defensible stance.

It isn't necessary for State legislatures to attack particular individuals who profess left-wing blather. All the legislatures have to do is insist that State-funded schools spend taxpayers' money wisely, by focusing on those disciplines that advance the sum of human knowledge. Isn't that what universities are supposed to do?
For another view, let us consult Katherine Ernst's City Journal review of David Horowitz's The Professors: The 101 Most Dangerous Academics in America. Some choice bits:

The Professors profiles scores of unrepentant Marxists, terrorist-sympathizers (the number of profs expressing utter hatred for the US and Israel is astounding), and the just plain nutty working in today’s American academe. . . . The hostility to the free society, venomous racism—it’s open season on whites and Jews, apparently—and total disregard for objectivity of these far-left-wing ideologues add up to a travesty of the idea of higher education.

These academics—whose radicalism is widespread in today’s university—are “dangerous” not because they hold such beliefs, Horowitz argues, but because they replace scholarship and the transmission of knowledge with classroom activism and the ideological subjugation of paying students. . . . Horowitz is clear: everyone “has a perspective and therefore a bias.” Academics, however, have an obligation “not to impose their biases on students as though they were scientific facts.” Academe’s left-wing establishment—which first conquered its turf during the sixties countercultural movement—is so sure of its intellectual supremacy over conservative dolts and their military-industrial-complex buddies in the White House and corporate America, that it believes it’s obligated to spread the left-wing gospel to unsuspecting students. They need to save the world from the war-mongering criminal class running the country, after all!

Stories of indoctrination run through the book, from the education instructor who required her students to screen Fahrenheit 9/11 a week before the 2004 presidential election, to the criminology professor whose final exam asked students to “Make the case that George Bush is a war criminal.” (The prof later claimed the request was to “Make the argument that the military action of the U.S. attacking Iraq was criminal,” but he had conveniently destroyed all his copies of the original exam.) Once again, the academics’ own words do the loudest talking. Saint Xavier University’s Peter Kirstein: “Teaching is . . . NOT a dispassionate, neutral pursuit of the ‘truth.’ It is advocacy and interpretation.” . . .

Faux-intellectual academic fields like “Peace Studies” are now the latest fad gobbling up university capital. Basically, they’re advocacy platforms for college credit. “Why, if the Joneses want to spend $40,000 for Bobby to study ‘Marxist Perspectives on Fema-Chicana Lit,’ by all means, let them,” some might respond. Yet as The Professors warns, the craziness has inexorably spread to fields that once held sacrosanct the pursuit of objective knowledge. Members of Horowitz’s 101 teach economics, history, and English Literature, among other standard subjects.

Many of The Professors’ profiles offer outrages matching those of Ward Churchill, the infamous 9/11-victims-were-Nazis prof. The lunacy that was Professor Churchill, it’s worth remembering, enjoyed adoration for decades within academe until the public caught on. It may be wishful thinking, but if Horowitz’s book reaches enough hands, there could be some long-overdue collegiate shake-ups this year.

Let us hope so. "Academic freedom" is not a license to waste the money of taxpayers, parents, and students on propagandizing. Academics -- like politicians -- aren't owed a living, in spite of their apparent belief to the contrary. It isn't a violation of "academic freedom" or freedom of speech to say "The junk you teach is worthless, and besides that you don't teach, you preach. Begone!"

Related posts: Academic Freedom and Freedom of Speech (a collection of links)

QandO Saved Me the Trouble . . .

. . . of debunking a story at the website of the Ludwig von Mises Institute, in which Yumi Kim asserts that anarchy is working well in Somalia; for example:
Somalia has done very well for itself in the 15 years since its government was eliminated. The future of peace and prosperity there depends in part on keeping [a government] from forming.
But, as QandO's Jon Henke notes,
it's too bad that stories like this have to come out on the same day that Yumi Kim tells us of this wonderful experiment in anarcho-capitalism...
Thousands of people have fled the northern and northeastern suburbs of the Somali capital, Mogadishu, since clashes between militia groups started over the weekend, a top city official said. . . .
[T]he Mises Institute story ends with the claim that efforts "to construct a formal government" inspire only "fear and loathing in Mogadishu and the rest of the country". But that's directly contradicted by evidence on the ground.
There's plenty more. Read Jon's entire post.

The Mises story is another example of "Anarcho-Libertarian 'Stretching'."

Other related posts:

Defense, Anarcho-Capitalist Style
But Wouldn't Warlords Take Over?
My View of Warlordism, Seconded
The Fatal Naïveté of Anarcho-Libertarianism

A Politically Incorrect Democrat

UPDATE: Read this relevant post at The American Thinker, and this one at RedState.org.

Larry Summers, late of the Clinton administration, will relinquish the presidency of Harvard in the face of a pending (and second) vote of no-confidence by his faculty. Why?

Mr Summers’s brusque manner and characteristically aggressive form of questioning had turned some on the faculty against him. Resentment built into a furore last year when the president – a Harvard-trained economist – gave a speech suggesting that “issues of intrinsic aptitude” might be responsible for the dearth of women in science and engineering positions at top universities.

His comments angered some faculty members, culminating in a vote of no confidence in his leadership last March, which was passed by a 218-185 margin....

Harvey Mansfield, a professor of government at Harvard, said he thought the attacks on Mr Summers had their root in political differences. “My worry is that the feminist left and its sympathisers will take over Harvard, and I fear that the university will fall under the influence of a minority,” he said.

Tsk. Tsk. Mustn't have any "aggressive" questioning of faculty, eh? (That would be a breach of current academic etiquette. The faculty is god-like and not to be challenged in its superior knowledge of how things should be.) Mustn't say politically incorrect things, eh? (That would be another breach of current academic etiquette, in which certain subjects are beyond debate -- beyond "academic freedom" -- lest certain parties take offense.)

Presumably, Prof. Mansfield has tenure, and a very thick skin.

A Legal Stratagem for Pot Smokers

The U.S. Supreme Court, in Gonzales v. Raich (June 6, 2005), said in effect that the federal government can regulate the production of marijuana in any amount under the Controlled Substances Act, citing the Commerce Clause as authority. The Court later ruled, in Gonzales v. Oregon (January 17, 2006), that the federal government cannot rely on the Controlled Substances Act or the Commerce Clause to interfere with Oregon's legalization of physician-assisted suicide. The Court has now decided, in Gonzales v. O Centro Espirita (February 21, 2006), that the federal government may not rely on the Controlled Substances Act and the Commerce Clause to bar the use of a hallucinogenic drug by a religious sect.

The lesson for pot smokers is clear: You must find a way to use marijuana in committing suicide (in Oregon) or you must join a religious sect of long standing that uses pot in its ceremonies.

Monday, February 20, 2006

European Hypocrisy

A statement and question from Alex Tabarrok at Marginal Revolution:
David Irving, the British historian, was sentenced in Austria today to three years in jail for denying the holocaust in two speeches he gave in 1989. I have little sympathy for Irving but support the right to free speech. How can we in the West take a principled stand against radical Muslims who riot and kill to protest depictions of Muhammad when we jail those who attack our sacred beliefs?
"We" in America are not responsible for the actions of our European "allies." It is evident (not only from the Irving case) that most of Europe (especially "Old Europe") wants to defend life, but not liberty and property.

Analysis Paralysis Is Universal

Spengler observes that "The West will attack Iran, but only when such an attack will do the least good and the most harm."

I worked for a CEO who knew that he would have to fire a goodly number of employees because of a funding cut. And everyone in the company knew it, as well. By acting quickly in response to the funding cut, the CEO could have reduced the number of firings and relieved the minds of those who worried needlessly that they would be fired. But the CEO couldn't bring himself to act quickly, and so he put off the firings for several months. The result: more employees fired, a prolonged period of reduced productivity during the months of delay, and a less functional company after the firings (because the firings disproportionately affected the support staff).

Delaying the inevitable usually makes matters worse.

Here's why.

(Thanks to American Digest for the link to the Spengler piece.)

Do Future Generations Pay for Deficits?

Tyler Cowen at Marginal Revolution asks and (sort of) answers the question. Here's my take:

1. Government spending, however it is financed, commandeers resources that could have been used to produce goods and services.

2. Some of those goods and services might have gone into current consumption, others into growth-producing capital investments.

3. The financing of government spending through taxes and borrowing determines precisely who forfeits their claims on the production of goods and services and, therefore, how much and what kinds of private consumption and investment are forgone because of government spending.

4. It is safe to say that government spending reduces economic growth to the extent that it reduces private-sector investment. (Read this post for a debunking of the notion that government spending on R&D is more productive than private spending on R&D.)

5. Given the difficulty of determining the incidence of government spending on investment (as opposed to consumption), the marginal effect of government spending can be approximated by the real, long-term rate of growth of GDP. That rate -- which reflects the growth-producing effects of investment spending on total output -- was 3.8 percent for the period 1790-2004. (Derived from estimates of real GDP available here.)

6. The real, long-term growth rate undoubtedly is lower than it would otherwise have been, because of government regulations and other growth-inhibiting activities of government. That is to say, government inhibits growth not only by commandeering resources from the private sector but also by dictating how the private sector may conduct its business.

7. Until the onset of the regulatory-welfare state around 1906 (explained here), real GDP had been growing at a rate of about 4.6 percent. Since the onset of the regulatory-welfare state, real GDP has grown at a rate of about 3.3 percent. (Derived from estimates of real GDP available here.)

8. In sum, the regulatory-welfare state has robbed Americans of untold trillions of dollars' worth of consumption and wealth. I once estimated the current GDP gap to be about $8 trillion; that is, real GDP in 2004 was $10.7 trillion (year 2000 dollars), but could have been $18.7 trillion were it not for the regulatory-welfare state. Considering the apparent effect of the regulatory-welfare state on the rate of economic growth, the actual GDP gap is probably much greater than $8 trillion. (A back-of-the-envelope sketch of the compounding arithmetic follows this list.)
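
For readers who want to see the compounding at work, here is a minimal sketch in Python. It uses only the figures cited above -- the 2004 GDP level and the two long-term growth rates -- and it assumes, for illustration only, that the pre-1906 growth rate would simply have persisted from 1906 through 2004. It is a sketch of the arithmetic, not a reconstruction of my original estimate.

```python
# Back-of-the-envelope sketch of the GDP-gap arithmetic in points 5-8.
# Inputs are the figures cited in the text; the counterfactual assumes
# the pre-1906 growth rate had persisted from 1906 through 2004.

years = 2004 - 1906      # 98 years of the regulatory-welfare state
actual_2004 = 10.7       # real GDP in 2004, trillions of year-2000 dollars
g_since_1906 = 0.033     # observed long-term real growth rate since 1906
g_before_1906 = 0.046    # long-term real growth rate before 1906

# Re-run 1906-2004 at the earlier growth rate by scaling the actual
# 2004 level by the ratio of the two compounding factors.
counterfactual_2004 = (
    actual_2004 * ((1 + g_before_1906) / (1 + g_since_1906)) ** years
)

print(f"Counterfactual 2004 GDP: ${counterfactual_2004:.1f} trillion")
print(f"Implied GDP gap:         ${counterfactual_2004 - actual_2004:.1f} trillion")
# Roughly $36 trillion and $26 trillion, respectively -- far larger than
# the $8 trillion estimate cited in point 8.
```

The point of the sketch is simply that a difference of little more than one percentage point in the growth rate, compounded over a century, implies an enormous difference in the level of output.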

The answer to the question about who pays for deficits is this: All generations pay for government spending, however it is financed. And the cost just keeps piling up. It's not the deficits that matter -- future generations inherit the bonds as well as the interest payments -- it's the spending that matters.

Other related posts:

Curing Debt Hysteria in One Easy Lesson
The Real Meaning of the National Debt
Debt Hysteria, Revisited
Why Government Spending Is Inherently Inflationary
A Simple Fallacy
Ten Commandments of Economics
Professor Buchanan Makes a Slight Mistake
More Commandments of Economics
Productivity Growth and Tax Cuts
Risk and Regulation
Liberty, General Welfare, and the State

Those Hard-to-Find Items

You say you can't buy a left-handed buggy whip anywhere? Need a button hook for those shoes you've had since 1900? Having a hard time finding replacement blades for your Schick Injector Razor? Just move to the Commonwealth of Massachusetts. If a store in Massachusetts doesn't stock something you'd like to buy, the Commonwealth's bureaucrats will set things straight.

Saturday, February 18, 2006

Anarcho-Authoritarianism

I picked up the term from Ed Driscoll, who points to a Weekly Standard review by Fred Siegel of a biography of H.L. Mencken. Siegel explains anarcho-authoritarianism, taking Mencken as an exemplar of it:
Part of the reason it's so hard to make sense of Mencken is that he was, paradoxically, an anarcho-authoritarian. He agreed with the American Civil Liberties Union on the importance of free speech. But while that organization, under the influence of principled men such as Felix Frankfurter, argued for such freedoms on the grounds that "a marketplace of ideas" (to use Justice Holmes's term) was the best method of arriving at the truth, Mencken supported it [free speech] in order to shield superior men like himself from being hobbled by the little people. For the same reason, Mencken was a near anarchist when it came to America, but an authoritarian when it came to the iron rule of the Kaiser and General Ludendorff. We are more familiar with anarcho-Stalinists such as William Kunstler, who had a parallel attitude toward the United States and the Soviet empire, but it was Mencken who blazed the trail down which Kunstler and his ilk would travel. [Emphasis added by me.]
In other words, for Mencken and his ilk, liberty is a personal convenience, not a general principle. Mencken showed his true colors when he wrote disdainfully of the "booboisie" (boob + bourgeoisie). Mencken was a closeted statist who compensated for his frustrated ambitions by ridiculing those whom he could not dominate. A different kind of compensatory rhetoric is to be found these days mainly on what we call (inaccurately) the Left. As I wrote recently, Leftists
have become apocalyptic in their outlook: the environment will kill us, our food is poisonous, defense is a military-industrial plot, we're running out of oil, we can't defeat terrorism, etc., etc., etc. . . .

The emphasis on social restraints [in order to avert the apocalypse] means social engineering writ large. [The Leftist] wants a society that operates according to his strictures. But society refuses to cooperate, and so he conjures historically and scientifically invalid explanations for the behavior of man and nature. By doing so he is able to convince himself and his fellow travelers that the socialist vision is the correct one. He and his ilk cannot satisfy their power-lust in the real world, so they retaliate by imagining a theoretical world of doom. It is as if they walk around under a thought balloon which reads "Take that!"
Mencken, the closeted statist, settled for ridicule. Today's not-so-closeted statists cannot be content with ridicule; they must instead consign the objects of their derision to an imaginary hell.