During the past couple of weeks, the U.S. housing market has had a record number of home foreclosures, a rise in delinquency rates on mortgage loans, and further declines in housing starts. Default rates are especially high on subprime mortgage loans, which are loans to borrowers with poor credit histories and low or erratic earnings. The greatly increased availability of loans to borrowers with bad credit was fueled mainly by five years of low interest rates. Many lenders turned to what had been a neglected subprime loan market in their search for higher returns. The rapid appreciation in housing prices, in good part itself the result of low interest rates, also gave lenders confidence that they could recoup the value of their loans in the event of defaults and foreclosures. In addition, new ways of packaging mortgages and of combining them with other assets that reduced the overall risk of portfolios containing subprime loans lowered the risk of lending in the subprime market.
Members of Congress and others have called for much stricter lending standards for these loans, and for sharper controls over the interest rates that can be charged. But subprime loans have made home ownership possible for groups that cannot get mortgages in the prime lending market. These recent criticisms of subprime loans (they were not much criticized while the housing market boomed) are reminiscent of the attacks in the 1980s and 1990s on the market for "junk", or low-grade, bonds. New and untested companies often do not have enough collateral to satisfy the stringent criteria of commercial banks. The development of the junk bond market enabled such startups, with few tangible assets, to raise money from nonbank investors outside the banking system by issuing bonds that paid much higher interest rates than those on loans to prime companies, the higher rates compensating for the greater risk. Although junk bonds too were said, often by bankers, the competing source of loans, to encourage undue risk-taking, such bonds have survived and even thrived, and not only in the United States. Moreover, some of the companies, such as CNN and MCI, that financed their early development with junk bonds have become very successful.
The same considerations apply to families with bad credit histories due to low and uncertain earnings, poor resource management, and other factors. Mortgages would not be available to these families if lenders could get only the same interest rates and other loan conditions that they get from prime borrowers. Like the new companies with limited collateral that issued junk bonds to survive in a competitive business environment, thousands of families realized the dream of owning a home through the market for subprime loans.
While the default rate on subprime housing loans is high compared to the past, and the higher rate of defaults has forced about 20 subprime lenders either to close, to seek buyers, or to raise additional financing, delinquencies and defaults on these loans remain the exception rather than the rule. The fraction of subprime mortgage loans entering foreclosure in the first quarter of this year jumped to a five-year high of 2.4 per cent, from about 2 per cent in the last quarter of 2006. This is a large percentage rise in the default rate, but so far at least the vast majority of subprime loans are not in default and are being repaid.
The default rate on prime loans also jumped, to 0.25 per cent, high for this type of loan but only about one-tenth the default rate on subprime loans; and it rose by much less than the subprime rate did. It is not surprising that lower quality housing loans would be more likely to go into default during periods of rising interest rates and a slowdown in the housing market. Lower quality loans of all types are always more vulnerable to slowdowns in the markets that generated them.
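For readers who want to check the arithmetic, the comparisons in the two preceding paragraphs reduce to two simple ratios. A minimal sketch, using only the rates quoted above:

```python
# Verify the relative changes quoted in the text (rates in per cent).
subprime_q4_2006 = 2.0    # subprime loans entering foreclosure, Q4 2006 ("about 2 per cent")
subprime_q1_2007 = 2.4    # Q1 of this year: a five-year high
prime_q1_2007 = 0.25      # prime loans, same quarter

# Relative rise in the subprime foreclosure rate: (2.4 - 2.0) / 2.0 = 0.20
relative_rise = (subprime_q1_2007 - subprime_q4_2006) / subprime_q4_2006
print(f"Relative rise in subprime foreclosure rate: {relative_rise:.0%}")

# Prime rate as a fraction of the subprime rate: 0.25 / 2.4, roughly one-tenth
ratio = prime_q1_2007 / subprime_q1_2007
print(f"Prime rate as a fraction of subprime rate: {ratio:.3f}")
```

The jump from 2 to 2.4 per cent is thus a 20 per cent relative rise in the foreclosure rate, even though the absolute level remains low.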
Although many African American and other poor families became homeowners for the first time due to the development of the subprime loan market for housing, critics claim that many of these families were duped by misleading presentations of lenders into taking out short-duration variable-interest loans, loans with low down payments, or loans that were simply beyond their capacities to pay. No doubt overly eager or unscrupulous lenders did sometimes misrepresent the difficulty of making payments to borrowers with little experience in financing home ownership. However, intentionally misleading presentations to families who were clearly unqualified to take on home ownership were the exception rather than the norm.
The reason for my belief is not confidence in the morality of all lenders in the subprime market, but rather that delinquencies and especially defaults on these loans hurt lenders as well as borrowers. As defaults have risen, and the increase in housing prices has slowed dramatically (in many areas prices have been falling), it has become increasingly difficult to recoup the amounts loaned by repossessing houses and selling them off. Moreover, in some states repossession of homes is difficult after owners declare bankruptcy. The fact that the majority of the companies that specialized in lending in the subprime market have gotten into serious financial difficulties, and many have closed, indicates that lenders as well as borrowers were badly damaged by the collapse of the subprime market. Both lenders and borrowers would have been hurt much more by the rise in interest rates and the end of the housing boom had the American economy not continued to have low unemployment and growth in real GDP.
The subprime-mortgage imbroglio is just the latest chapter in an age-old concern with the charging of interest, especially to individuals. Medieval Christianity forbade the charging of interest on the ground that it was unnatural for money to increase (as by lending $100 at a 10 percent interest rate so that at the end of the year the $100 has grown to $110), because, unlike pregnancy, there was no mechanism by which an inanimate object such as money could reproduce itself. Behind this superstition undoubtedly lay a hostility to commercial society, which persists today in some quarters of the Muslim world; Islam forbids charging interest, although substitutes are tolerated. The concern with lending has persisted into modernity even in Western societies. Usury laws, which set a ceiling on interest rates, and the Truth in Lending Act, which requires detailed disclosure of annualized interest rates in consumer loans, are examples of this concern.
The relaxation of usury laws--a natural concomitant of the spread of free-market ideas in American society--allowed lenders to offer loans at very high interest rates to borrowers with poor credit ratings. Payday loans, which charge astronomical interest rates to persons who need money to tide them over till their next paycheck, and subprime mortgage loans, sometimes at annual rates 4 or 5 percent higher than mortgage loans to borrowers who have good credit, were consequences of the relaxation.
I agree with Becker that credit is no different from any other commodity. For government to place a ceiling on its price prevents people who would be willing to pay a higher price from buying the commodity, and thus prevents a mutually beneficial, and therefore value-maximizing, transaction. The argument for the ceiling is that people who have a poor credit record have demonstrated their incompetence to borrow and so should, for their own good, be prevented from borrowing more. That is not a compelling argument, apart from any general objections to government paternalism that one may have. A person may have a poor credit record, yet know that he can pay a high interest rate and that he will be better off despite the cost. As Becker notes, although the rate of default on subprime mortgage loans is high, the vast majority of those loans are repaid. For many people they are the only route to home ownership, which is greatly valued by the owners but has also been thought (perhaps dubiously) to have social value; that at any rate is the rationale for the tax deductibility of mortgage interest.
I do think that there is reason to think that the subprime mortgage market is imperfect, though not reason enough to warrant government interference with that market. The subprime mortgage lenders have engaged in aggressive marketing that may have deflected borrowers from shopping for better terms in the prime market. There are of course many gullible consumers and many people who have difficulty understanding the cumulative costs of high interest. There are also many people who like to speculate or otherwise gamble without a good appreciation of the odds. Perhaps there is even something of a "bubble" aspect to the subprime market. When housing prices were rising, borrowing to buy a house even at a high interest rate (interest rates generally were low until very recently, but high to subprime borrowers) was a leveraged investment, both on the borrowing side and on the lending side. The borrowers expected to repay the high interest out of the rapid appreciation in the value of the house, and the lenders expected to be cushioned against the consequences of a high rate of defaults by those same rising prices: if they had to foreclose, the house would be worth enough more than the mortgage to enable the lender to recoup. A bubble arises not because people fail to perceive that an asset is overvalued, but because they think the perception is not widespread and therefore the asset will maintain or increase its market value. No one wants to sell an asset while its price is still rising, but if enough people think that way the price may rise to a point at which a slight perturbation in the market may cause a crash. Given the riskiness of subprime mortgage loans, a modest decline in housing prices or rise in interest rates (many subprime mortgages were at floating rather than fixed rates) could precipitate enough unexpected defaults to create distress not only among subprime borrowers but also among the lenders. Apparently that is what has happened.
Although the result is not a happy one, I do not perceive adequate grounds for government intervention. Proposals for limiting subprime loans have the quality of closing the barn door after the horses have escaped. The subprime "crash" has presumably educated both borrowers and lenders in the riskiness of the market, and if subprime lending persists it will not be because of ignorance of the risk. Of course if subprime lenders have resorted or are resorting to fraud in inducing such loans, they should be punished, but for that no new laws are required.
Women's Economic Role--Posner's Response to Comments
There were some excellent comments on my post of a couple of weeks ago, to which I have been slow in responding. One commenter pointed out that a possible reason for colleges to favor male applicants is that there is greater variance in performance among men than among women, and so, in the words of the commenter, "colleges, especially the good ones, tend to be risk-takers in admission, since it is disproportionately valuable for them to get the very top students." Another comment points out that a college may want to admit a certain minimum number of men in order to provide more dating opportunities for women. Maybe there is a tipping phenomenon at work as well--if there are too few men, male applications drop because men don't want to be thought to be attending a "women's college."
A number of comments expressed puzzlement with the proposition that the higher average grades of women could signify discrimination in favor of men rather than of women. The puzzlement is understandable because of a typo in the third line of my post, for which I apologize: "men" should be "women." The easiest way to understand the point is to imagine that the average woman's grade point average is an A and the average man's a D. Then it would be evident that the college was discriminating in favor of men, because it was admitting D men in preference to A or B women (I say "or B" to allow for the possibility that the college has admitted all its A applicants).
Another comment pointed out that if there is discrimination in the job market, women will have a stronger incentive than men to get good grades in order to improve their job-market prospects. Anti-Semitism has been thought a factor in pushing Jews to excel in their studies.
Here is a puzzle: effectiveness in senior leadership positions in government does not seem to be well correlated with intelligence. Washington was a better President than Jefferson, though less able intellectually. Franklin Roosevelt, Harry Truman, Dwight Eisenhower, and Ronald Reagan were not as bright as Herbert Hoover, Richard Nixon, Jimmy Carter, or Bill Clinton. Lincoln, a brilliant lawyer, is an exception; Theodore Roosevelt perhaps another; and doubtless there are others. But overall the correlation between intelligence and effectiveness in the Presidency may actually be negative. Even more striking are the failures of Kennedy and Johnson's national security team in Vietnam and George W. Bush's national security team in Iraq. McNamara and his whiz kids (such as Daniel Ellsberg, Harold Brown, and Alain Enthoven), the Bundys, Walt Rostow, George Ball--these were extremely able people, many of them (like McNamara and McGeorge Bundy) truly brilliant. And Bush assembled an outstanding national security team--Cheney, Rumsfeld, Powell, Wolfowitz, Rice, and Tenet (appointed by Clinton but held over by Bush). Two members of the team--Cheney and Rumsfeld--were former secretaries of defense! And Powell was a former chairman of the Joint Chiefs of Staff.
It could just be bad luck, but I think not. Economists distinguish between general and specific human capital, the first created by IQ and education and the second by training and experience in a particular job. A person who has a large amount of general human capital is likely to find a job in which that capital, augmented by on-the-job training and experience, is highly productive. The resulting success will make him an attractive candidate for a high-level government job. The high-level jobs are generally filled by lateral entrants from quite different jobs, rather than by civil servants. Some of these high-level jobs are technical; an example is the chairmanship of the Federal Reserve Board. Such jobs are relatively easy to fill with persons who can be predicted with reasonable confidence to do a good job.
But there is a tendency to exaggerate the versatility of the combined general-specific human capital that a lateral entrant brings to a high-level government job of a managerial or advisory rather than technical character. There are several characteristics of such a job that actually militate against the prospects for the success of an extremely intelligent person. First, these are "ensemble" jobs in the sense that many different skills or aptitudes are necessary to successful performance; if one of these, such as intelligence, is very highly developed, a person may neglect the others.
Second, it may not be possible to use step-by-step, logical reasoning to solve the problems laid at the feet of the occupant of a job like secretary of defense or secretary of state or national security adviser. Such questions as what to do in Vietnam or what to do in Iraq do not lend themselves to rigorous analysis because there is not enough information to analyze. Intelligence is not designed for coping with situations that are not merely complex but profoundly uncertain. Having great information-processing skills is not worth a lot if you have no reliable information.
Third, leaders or managers should be more intelligent than their followers or subordinates, but not too much more intelligent. If they are too much more intelligent, they will have difficulty assessing the capacities and limitations of their underlings and they will be tempted to substitute their intelligence for their underlings' knowledge. Analysis and knowledge are, to an extent, substitutes. You can multiply two numbers rapidly if you have good computational skills or if, though your computational skills are mediocre, you have memorized the multiplication table. Knowledge in government resides in civil servants, and they tend on average to be less intelligent (also of course less powerful) than brilliant laterals. So the latter are tempted to think that they can make decisions with minimal assistance from the civil servants.
The temptation is reinforced by a failure to distinguish between intuition and step-by-step reasoning. Cognitive psychologists explain that the human unconscious contains more information than we can access at a conscious level. As Herbert Simon (an economist and psychologist) explained, conscious attention is a severely limited faculty and must be carefully rationed. Through intuition, however, we can access the larger repository of unconscious information. Hence we speak of a person as having "experience" or "good judgment" or "common sense," as distinguished from being brilliant in the sense of being quick or having a good (conscious) memory. So now imagine a confrontation between a brilliant person who has no knowledge about Vietnam or Iraq, and a career State Department officer who has spent his whole career working on conditions in one of those countries, who knows the language, has lived there, and is steeped in the country's history, culture, and politics. Suppose he offers some advice to the brilliant senior official, and the latter asks him to explain and justify the advice. He may be unable to do so because he may be drawing on a repository of information below the conscious level. The brilliant official may be irritated at his inability to extract much more than a conclusion from the expert.
What is required at the top levels of government is not brilliance, but managerial skill, which is a different thing, and includes knowing when to defer to the superior knowledge of a more experienced but less mentally agile subordinate. Moreover, so specialized is management as a job that success in managing a business may not translate at all into success in managing a government agency. The firm-specific human capital that a person acquired in a career of management in a business firm may have no value for the management of a government agency, or for that matter a university, a private foundation, or an international organization. Indeed, an experienced manager of a firm may falter and have to be fired if a change in the firm's environment requires a different type of management skill.
A striking example of the specialized character of leadership human capital is Larry Summers. A truly brilliant person and successful secretary of the treasury, he failed as president of Harvard University though he seemed to many people (myself included) to be an outstanding choice. I have the highest personal and professional regard for Summers and blame the failure of his presidency not on him but on the Harvard faculty of arts and sciences. But the fact is that he failed, because he was not able to port his very considerable suite of intellectual and managerial assets to the management of an organization critically different from the Treasury Department.
Economists have been emphasizing in recent years that while the cognitive abilities of individuals certainly raise their education and earnings, many non-cognitive skills are often more significant. These skills range from simple factors, like finishing one's work on time, to more complicated ones, like good judgment in making decisions or effectiveness at using the talents of subordinates. Posner argues convincingly that non-cognitive talents may be of greater importance in determining success at top-level government leadership positions than analytical brilliance and other cognitive skills.
He provides several explanations for the mixed success of cognitively able persons in important government positions, including limited governmental experience (although that does not apply, for example, to either Donald Rumsfeld or Richard Cheney), a reluctance to rely on the experience and knowledge of underlings, and the difficulty of using systematic analysis to evaluate the uncertainties in major government decisions. The limited role of top analytical skills might explain why voters, as opposed to intellectuals, typically do not weight heavily the "IQ" of presidential candidates in choosing whom to vote for. The modest value of exceptional analytical skills should also imply that presidents would not place major emphasis on these skills when choosing their top cabinet officers and other high-level appointees. Although, as Posner indicates, some presidents have appointed brilliant men who failed at major positions, on the whole brilliance is not the most important characteristic that presidents use in choosing their top appointees.
Of course, government leadership positions are not unique in requiring a much more varied set of talents than cognitive analysis. Success at top business and academic administrative positions also depends on complicated mixtures of different talents. Cognitive brilliance is often not essential, and sometimes is even a handicap, in determining success at these positions as well. Many of the most successful business leaders have not been brilliant at systematic analysis, and some cognitively highly able persons have failed miserably. Posner mentions Lawrence Summers, who was highly successful both as an academic teacher and researcher and as a U.S. Treasury official, but had major troubles as president of Harvard. Another example from the academy is George W. Beadle, a Nobel Prize winning biologist who was a rather mediocre president of the University of Chicago.
To be sure, that many persons with exceptional analytical abilities fail at top leadership positions in large organizations may largely reflect the fact that failure, or at least mediocrity, is more common than success among heads of large organizations, whether it be government, business, or academic institutions. I am confident of that claim with respect to universities, the organizations I know best, where inspired leadership has not been common. A major reason for this must surely be the great difficulty in predicting how men or women would perform when they get promoted within an organization, or when they move in a lateral way from one organization to another.
The skills needed to succeed as provost of a university, for example, involve an ability to deal effectively with professors, to evaluate recommendations for professorial promotions and outside appointments, and to handle related faculty matters. Many provosts use success in that position to become candidates for university presidencies, but the talents required to succeed as president are quite different. Presidents have to raise money, deal with businessmen, foundations, and legislatures, appoint deans, and make other basic administrative and organizational decisions. How well someone performed as provost gives some, but limited, insight into how well they would perform at the different tasks required of a president. This is even truer when they become president of a university different from the one where they were provost.
With Hillary Clinton a very serious candidate for the U.S. Presidency, Angela Merkel the Chancellor of Germany, and Segolene Royal having come close to the presidency of France, women have clearly arrived as political leaders in Europe and the United States. More to the point of this essay, the increasing role of women in political life is a reflection of the general educational and employment advance of women in many countries.
Consider first education. Men in the United States who were born around 1930 were far more likely than women born at that time to attend college, whereas among those born 40 years later, about 10-15 percent more of the women than the men went to college. Over twice as many men as women graduated from a four-year college in the earlier cohort, while women in the later cohort were considerably more likely than men to graduate. Put differently, whereas in earlier cohorts women were much more likely to drop out of college, this pattern has sharply reversed, so that male students now are more likely to drop out. As a result of these trends, somewhere between 55 and 60 per cent of all students in American colleges are women.
The same general trends in the educational achievements of men and women are found in other countries with advanced economies. Nor is the trend restricted to advanced economies. An article a few weeks ago in the New York Times indicated that female college students far outnumber male students in the moderately poor Muslim country of Algeria, and many more of the judges and lawyers there are female. Even in the fundamentalist country of Iran, women now apparently outnumber men at universities, although shortly after the Iranian revolution in 1979, attempts were made to discourage women from getting a higher education.
The subjects studied by women in high school and college are also converging to those studied by men. According to data presented by Goldin, Katz, and Kuziemko (see "The Homecoming of American College Women: The Reversal of the College Gender Gap," Journal of Economic Perspectives, Fall 2006, for these data and some of the other data used in my discussion), girls are about as likely as boys to take physics and math courses in American high schools, and girls are more likely to take chemistry courses. Girls have better grades on average at all levels of education, while the dispersion in grades and performance is greater for male students. This means that male students are much more heavily represented at the extremes of the school performance distribution: at very low as well as very high levels of performance.
The propensity of women to go to college exceeds that of men in part because the financial gains from a college education, compared to stopping education after high school, have been higher for women than for men. According to calculations by my colleague Kevin M. Murphy, in 1990 college-educated women had average hourly earnings about 65 per cent higher than those of female high school graduates, while the difference for men was only about 58 per cent. The financial gains to both men and women from attending college increased by a lot from the mid-1970s on, although after 1990 they increased more for men.
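The college premium cited above is simply the proportional gap between the average hourly earnings of college graduates and high school graduates. The sketch below illustrates the calculation; the dollar wage figures are invented placeholders, chosen only so that the premiums match the 65 and 58 per cent figures quoted from Murphy's calculations:

```python
# Illustrative sketch of the college wage premium described in the text.
# The hourly wages below are hypothetical; only the resulting 65% / 58%
# premiums come from the post (Kevin M. Murphy's calculations for 1990).

def college_premium(college_hourly, hs_hourly):
    """Proportional earnings advantage of college over high school graduates."""
    return college_hourly / hs_hourly - 1

# Hypothetical hourly wages picked to reproduce the quoted premiums.
women = college_premium(16.50, 10.00)   # a 65 per cent premium
men = college_premium(23.70, 15.00)     # a 58 per cent premium

print(f"Women's college premium: {women:.0%}")
print(f"Men's college premium:   {men:.0%}")
```

The larger premium for women is the sense in which the financial gain from college has been higher for women than for men.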
During the past 60 years in all economically advanced nations, and in most developing countries as well, women began to work much more in the economy, and they acquired significantly more schooling, partly because birth rates declined sharply. As a result, women now have considerably more time that is free of household responsibilities. The American and other advanced economies also shifted away from manufacturing and toward services, where women have always been more likely to find employment. Discrimination in admissions to medical, law, engineering, and some other professional schools also declined, perhaps mainly under the pressure of the growing number of women who wanted to enter these programs. About half the students at medical and law schools in the United States are female, and their enrollments in MBA programs and engineering schools are also increasing rapidly.
A larger fraction of employed women are now working full time than 50 years ago. For example, about two-thirds of women who graduated from college in recent years work full time, compared to about one-third a few decades ago. The greater education of women and their greater commitment to the labor force helped raise the annual earnings of women relative to men. Some estimates indicate that wives earn more than their husbands in over 30 per cent of families in which both work, and the fraction of families in which the wife is the main breadwinner has been growing at a brisk pace.
Yet, women still on average earn less than men, and women are much less represented in the top deciles of the overall distribution of earnings. The next couple of decades should see a narrowing of both these gaps, but will they be eliminated? If, as is likely, women will continue to take time off from work to care for young children, and to miss work when their children get sick or need other special attention, that would continue to reduce both their average earnings relative to men, and their representation in the top of the earnings distribution.
To be sure, the greater educational attainment of women, and their better performance at school, would tend to raise their average hourly earnings above those of men. Their better education and school performance would work against their household responsibilities in determining the earnings of women relative to men. Still, even if the average hourly earnings of women reached parity with or surpassed those of men, it is unlikely, even without discrimination against women, that they would be as well represented as men at the top of the earnings distribution. For while combining household with market activities hurts average earnings, it is an especially strong hindrance to having enough time to make the supreme commitment to work that is usually necessary to achieve great financial success.
I have very little to add to Becker's excellent discussion. One remaining puzzle is why women have better college grades than men. One possibility is that colleges discriminate against women in admissions. For if colleges admitted blindly on the basis of academic prowess, they would keep admitting women until male and female grades were equal at the margin. The average grades of women might still be higher than those of men, but this would be surprising unless most of the students in the applicant pool were women.
Discrimination against women in admission to college would not be irrational if male alumni are expected to be on average more generous donors, either because of higher average earnings or because, as Becker notes, men are likely to dominate the upper tail of the income distribution; alumni in the upper tail are likely to be disproportionately generous donors.
Another possibility, unrelated to current sex discrimination, but perhaps to historical discrimination against women, is "legacy" admissions. If alumni children are favored by college admissions officers (largely for financial reasons--admitting alumni children increases expected donations by alumni), and the alumni parents are disproportionately male because men used to go to college in higher numbers than women, this could explain why males are being admitted who are expected to be poorer students than women who could have been admitted in their place. However, given that alumni are likely to have an equal number of male and female children, this explanation would work only if alumni prefer their sons to be admitted to the same school.
Still another possible explanation for the higher average grades of female than male students is that men get as much out of college as women do even when male grades are lower, because there is more to college than academic performance and the "more" may be more valuable on average to men than to women. Male sports and other male social activities in college may build teamwork, and networks, that create more valuable human capital for men than these activities would do for women, perhaps because men will have greater participation in the labor market, where teamwork and connections are vital assets. On this view (proposed by Asher Meir in correspondence), male students substitute nonacademic for academic college activities, resulting in lower average grades that are, however, offset by the social human capital that they acquire from engaging in the nonacademic activities.
Whether the wage gap between men and women will continue to narrow because the ratio of male to female college students will continue to fall seems to me speculative. The ratio may not fall at all if colleges see advantages in the current ratio, though this would leave unexplained why it has fallen as far as it has already. If the ratio does not continue to fall, I do not see what would drive female wages up relative to male wages. Rising prosperity may actually induce many women to substitute household for market work, because diminishing marginal utility of money income, combined with higher income tax rates at higher incomes, would tend to make untaxed household income more attractive.
The public is upset by the casualties that our soldiers are suffering in the Iraq war, and it might seem that their upset would cause no puzzlement even to an economist. But there is an economic puzzle. It is this. Ours is an all-volunteer military. No one is forced to join. Everyone who does join realizes that he may find himself in a combat zone. This is an expected cost of military employment and in a competitive labor market will be reflected in the wage. That is, the wage rate in a competitive labor market will compensate a worker for any risks that the particular employment can be expected to create--a proposition that goes back to Adam Smith. If the risk materializes, the employee has no cause to complain, provided it was the risk that he understood the job involved or should have understood it involved when he signed up for it, because he was compensated in advance. Yet that is not how the public views our military casualties. That is the economic puzzle which I address.
What is not puzzling is why the families and friends of a killed or injured soldier grieve. Ex ante compensation for a loss does not wipe out the loss, even if it is a purely financial loss. It just provides the inducement to bear the risk of incurring the loss. One's spouse might consent to one's working at a very dangerous job, yet still grieve when one was killed at the job.
Nor is it a puzzle why, as in the recent search for the three American soldiers captured by the enemy in Iraq, immense resources are devoted to rescuing soldiers, rather than writing them off as having consented ex ante to their plight. The compensating wage for bearing risk varies, obviously, with the risk, and the risk in turn depends on efforts that are and will be made to minimize the risk, including body armor, rescue, medical treatment, and so forth. Knowing that one's fellow soldiers do not just abandon one when the cost of rescue would be disproportionate to any tactical value of the rescue reduces the wage that a volunteer army has to pay to attract soldiers of the quality it wants.
But the question remains how to explain the upset that the public feels at our mounting casualties in the Iraq war. Is it just shock at seeing photographs of dead and badly injured Americans? In fact such photographs are rarely shown. Or is it perhaps that the risk of death and injury is greater than our soldiers had reason to expect when they signed up? Were this the concern, one would expect sympathy to be withheld from soldiers killed or injured who signed up within the last two years, for by two years ago it was clear that a great many recruits would be fighting in Iraq before the war ended. The case of soldiers who joined the military before the September 11, 2001, terrorist attacks, which indicated that the United States could expect to be involved in more military operations than previously anticipated, might be thought different. But most of those soldiers completed their military obligations, and so could have resigned without penalty, years ago. The situation of those who "re-upped" is no different from that of recent recruits.
Could there be a paternalistic concern--that recruits are not calculating the risk of death or injury accurately and as a result are not receiving an adequately compensatory wage differential over a safe job? This is unlikely. One reason is that a great, and probably unobtainable, amount of information would be required in order to calculate that differential. The risk of death or injury in combat is an example of what statisticians describe as "uncertainty" rather than "risk," reserving the latter term for situations in which a numerical probability can be estimated. The incidence and length of wars, the probability of serving in a combat zone and for how long, and the amount and severity of the fighting in that zone are all imponderables. The resulting uncertainty argues for an alternative to building ex ante compensation into the soldier's wage when he is hired. Hence the practice of paying combat pay as a bonus to the soldier's ordinary wage. At present, soldiers serving in combat zones, mainly Iraq and Afghanistan, receive $225 a month as combat pay on top of their regular wage. The $7,000 bonus paid Marines who agree to be deployed to a combat zone for seven months is a similar response to the difficulty of fixing conventional ex ante compensation.
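A back-of-the-envelope comparison of the two combat-pay figures cited above (the $225 monthly combat pay and the $7,000 seven-month deployment bonus) illustrates how much larger the lump-sum bonus is on a per-month basis; this is illustrative arithmetic only, using no figures beyond those in the text.

```python
# Figures cited in the text above; illustrative arithmetic only.
MONTHLY_COMBAT_PAY = 225     # dollars per month on top of the regular wage
MARINE_BONUS = 7_000         # one-time bonus for a seven-month deployment
DEPLOYMENT_MONTHS = 7

# Spread the bonus evenly over the deployment to compare like with like.
bonus_per_month = MARINE_BONUS / DEPLOYMENT_MONTHS
ratio = bonus_per_month / MONTHLY_COMBAT_PAY

print(bonus_per_month)  # 1000.0 dollars per month
print(round(ratio, 1))  # 4.4 -- roughly four and a half times ordinary combat pay
```

The comparison is rough, since the Marine bonus compensates a known deployment while ordinary combat pay accrues only while one is actually in a combat zone, but it shows how much more is paid when the exposure to risk is certain.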
A further complication is illuminated by the economic concept of monopsony. The term refers to a situation in which there is no competition on the buying side of the market, as distinct from no competition on the selling side (monopoly). In a monopsonized market sellers receive less than they would in a competitive market because of their lack of alternatives. Persons who join the military to obtain or exercise technical skills have civilian alternatives, so the military has to compete with civilian employers for the services of such persons. But if you want to be a combat soldier, there is only one possible employer (if you are an American) and that is the U.S. government. So the government can pay a low wage to persons desiring that employment--in fact it seems that it can pay a lower wage than it does to its military technicians (adjusting for the value of the technical training that the latter receive) even though the latter are less exposed to combat risks.
I suspect that the main reason for public distress at U.S. military casualties is altruism, which is stronger in a family setting but extends to strangers as well, as in charitable giving. Most people are grateful to those who protect them, even if the protectors are well compensated. But what of those Americans who believe that our involvement in Iraq is a mistake and that our soldiers, or at least most of them, should be withdrawn? Most of the critics of the war realize that the soldiers are trying to protect us, even if the soldiers are mistaken in believing that they are doing so. If anything, critics feel sorrier for the troops than supporters of the war, because they think that the casualties represent sheer loss, so that the soldiers are deluded as well as endangered.
Posner raises an important issue: why do Americans (and persons of other nationalities) grieve so much when American military personnel (or the military personnel of their own nations) are killed during military actions, even when those killed volunteered for military service? In addition to the reason he stresses, the altruism of Americans toward their military personnel, I believe two other factors are important.
Although the pay required to attract volunteers rose after casualties began to mount in Iraq, it did not rise by much. Yet even small increases in the probability of losing one's life command high compensation when young persons are asked to take on the risks found in dangerous civilian occupations. Applied directly to military risks, these estimates suggest that if the Iraq war increased the chances of dying for a typical new member of the American military by one percentage point per year of service, this would require an increase in pay of about $3,000 per year of service for each person in the military. The amount would be considerably higher for those who knew they would be posted to combat in Iraq, and higher across the board if one percentage point understates the true risk.
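The compensating-differential figures above can be checked with simple arithmetic: the required pay increase equals the added risk times the implied value of a statistical life, so the figures used here (a $3,000 annual increase for a one-percentage-point rise in annual risk) imply a value of $300,000. This is a sketch of the arithmetic, not an estimate of its own.

```python
# Back-of-the-envelope check of the compensating-differential figures above.
# Assumptions come straight from the text: a $3,000 increase in annual pay
# compensates a 0.01 rise in the annual probability of death.
risk_increase = 0.01   # added annual probability of dying
required_pay = 3_000   # required pay increase, dollars per year of service

implied_value_of_life = required_pay / risk_increase
print(implied_value_of_life)  # 300000.0

# The differential scales linearly with the risk, so someone facing twice
# the added risk would require twice the pay increase.
def differential(risk, value_of_life=implied_value_of_life):
    """Compensating pay increase for a given added annual risk of death."""
    return risk * value_of_life

print(differential(0.02))  # 6000.0
```

The linearity is why the text notes that the required amount "would be considerably higher" both for those certain to see combat and if one percentage point understates the true risk.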
The actual increases that have been required to attract volunteers have been much lower than $3,000 per person serving in the military. This suggests that the young men and women who have volunteered are attracted by reasons other than the higher compensation paid for bearing these military risks. One compelling reason would be patriotism, and a resulting desire to serve their country. Americans feel considerable indebtedness to military personnel who lose their lives in combat when their enlistments were motivated by such non-financial considerations despite the risk of joining the armed forces during wartime. This indebtedness to those who were killed after volunteering at least in part for patriotic reasons would explain the considerable concern and regret over those who die while serving in combat zones. The same concern extends to policemen killed in the line of duty, because many of them too are assumed to serve out of an interest in protecting the public from criminals.
This reflection on the motives for serving shows up in the difference between attitudes toward the usual volunteers for military service and attitudes toward "mercenaries." A mercenary is assumed to serve mainly for monetary reasons rather than out of patriotism. For that reason, mercenaries' deaths cause less concern and mourning on the part of the civilian populations they are protecting. To be sure, some volunteers may not have strong patriotic motivation, but they too gain sympathy, since it is impossible to distinguish them from the genuinely patriotic members of the military.
A second explanation for the great concern about those killed in Iraq is that volunteers enlist under the implicit expectation that the military will take appropriate steps to protect those serving as effectively as possible. There is a widespread perception that the war has been fought with inadequate understanding of the enemy and insufficient protection of American personnel serving in combat-related positions. That would mean the country has let its military personnel down. This belief about inadequate protection of military enlistees has produced guilt, and the "altruism" that Posner refers to, toward those serving and dying in Iraq while fighting a war that has not been conducted very well.