Saturday, August 14, 2010

did agriculture fundamentally alter human sexuality?

gizmodo | • Before humans settled down into civilization, we were small bands of hunter-gatherers who had no notion of sexual monogamy. Within our relatively small tribes, most humans had multiple partners, primarily from within the tribal group, although occasionally we'd have a dalliance with a stranger to keep the DNA pool zesty. Children had multiple social "fathers", jealousy was nearly nonexistent, and relatively easy access to calories kept us fit, happy, and satisfied well into our 70s and 80s—provided we managed to get past the perils of high mortality rates expected from a wild environment and primitive medicine.

• Upon the discovery of agriculture, nomadic wandering was no longer possible—someone has to stick around to water the crops—so the ideas of property and inheritance became sadly useful. Domesticated food could become scarce, unlike the effectively endless bounty of hunter-gathering (ignoring the occasional climate-torqued famine or run of bad luck), so hoarding became necessary to ensure calories even in lean times. It's a lot of work to farm, so it became important to ensure that you weren't wasting your precious grains on someone else's offspring, especially if it meant your own kid was getting short shrift. Hence monogamy, marriage, and the unfortunate concept of partners as property, manifested in agrarian societies as a tendency to view women as chattel.

• Our genes, still tuned toward sexual novelty, cause us to really hate being monogamous, but societal pressures—including centralized codified religion—force men and women into an arrangement that brings with it just as many problems as it solves. Men cheat, women wither in sexual shackles (or, you know, cheat), wars erupt over resources or sexual exclusivity, cats and dogs almost start sleeping together except they're afraid the neighbors might find out—Old Testament, real wrath of God-type stuff.

While that glosses over so much good stuff from Sex at Dawn—our sexual similarities to our closest relatives, the bonobos; the dismantling of the idea that most animals are monogamous; humans' absolutely scandalous appetite for sex and our correspondingly massive genitals—I hope it's a fair summation of the part that's relevant to my point (which is coming, I swear!): Agriculture fundamentally altered human sexuality.

britain reels as austerity cuts begin

NYTimes | Last month, the British government abolished the U.K. Film Council, the Health Protection Agency and dozens of other groups that regulate, advise and distribute money in the arts, health care, industry and other areas.

It seemed shockingly abrupt, a mass execution without appeal. But it was just a tiny taste of what was to come.

Like a shipwrecked sailor on a starvation diet, the new British coalition government is preparing to shrink down to its bare bones as it cuts expenditures by $130 billion over the next five years and drastically scales back its responsibilities. The result, said the Institute for Fiscal Studies, a research group, will be “the longest, deepest sustained period of cuts to public services spending” since World War II.

Until recently, the cuts were just election talking points, early warnings of a new age of austerity. But now the pain has begun. And as the government begins its abrupt retrenchment, the implications, complications and confusions in the process are beginning to emerge.

hmmm.....,

USAToday | At a time when workers' pay and benefits have stagnated, federal employees' average compensation has grown to more than double what private sector workers earn, a USA TODAY analysis finds.

Federal workers have been awarded bigger average pay and benefit increases than private employees for nine years in a row. The compensation gap between federal and private workers has doubled in the past decade.

Federal civil servants earned average pay and benefits of $123,049 in 2009 while private workers made $61,051 in total compensation, according to the Bureau of Economic Analysis. The data are the latest available.

The federal compensation advantage has grown from $30,415 in 2000 to $61,998 last year.

Public employee unions say the compensation gap reflects the increasingly high level of skill and education required for most federal jobs and the government contracting out lower-paid jobs to the private sector in recent years.

"The data are not useful for a direct public-private pay comparison," says Colleen Kelley, president of the National Treasury Employees Union.

Chris Edwards, a budget analyst at the libertarian Cato Institute, thinks otherwise. "Can't we now all agree that federal workers are overpaid and do something about it?" he asks.

Friday, August 13, 2010

gop's four deformations of the apocalypse

NYTimes | The first of these started when the Nixon administration defaulted on American obligations under the 1944 Bretton Woods agreement to balance our accounts with the world. Now, since we have lived beyond our means as a nation for nearly 40 years, our cumulative current-account deficit — the combined shortfall on our trade in goods, services and income — has reached nearly $8 trillion. That’s borrowed prosperity on an epic scale.

It is also an outcome that Milton Friedman said could never happen when, in 1971, he persuaded President Nixon to unleash on the world paper dollars no longer redeemable in gold or other fixed monetary reserves. Just let the free market set currency exchange rates, he said, and trade deficits will self-correct.

The second unhappy change in the American economy has been the extraordinary growth of our public debt. In 1970 it was just 40 percent of gross domestic product, or about $425 billion. When it reaches $18 trillion, it will be 40 times greater than in 1970. This debt explosion has resulted not from big spending by the Democrats, but instead from the Republican Party’s embrace, about three decades ago, of the insidious doctrine that deficits don’t matter if they result from tax cuts.

The third ominous change in the American economy has been the vast, unproductive expansion of our financial sector. Here, Republicans have been oblivious to the grave danger of flooding financial markets with freely printed money and, at the same time, removing traditional restrictions on leverage and speculation. As a result, the combined assets of conventional banks and the so-called shadow banking system (including investment banks and finance companies) grew from a mere $500 billion in 1970 to $30 trillion by September 2008.

The fourth destructive change has been the hollowing out of the larger American economy. Having lived beyond our means for decades by borrowing heavily from abroad, we have steadily sent jobs and production offshore. In the past decade, the number of high-value jobs in goods production and in service categories like trade, transportation, information technology and the professions has shrunk by 12 percent, to 68 million from 77 million. The only reason we have not experienced a severe reduction in nonfarm payrolls since 2000 is that there has been a gain in low-paying, often part-time positions in places like bars, hotels and nursing homes.

David Stockman was President Ronald Reagan's director of the Office of Management and Budget.

Thursday, August 12, 2010

still more advanced civilization that will survive..,


the results of keeping drugs illegal

gilbertgrace | The law did have an effect on me. However, a lot of people did dabble during the 60s and 70s - now we have a hepatitis-C epidemic, thanks to the fact that those who dabbled in heroin had to do it under cover, sharing injecting equipment. Our society now faces the fact that 250,000 people have hepatitis-C, 20% to 30% of whom will end up with cirrhosis and need liver transplants. So our keeping the drug illegal in the 70s, all based on good thinking, led to consequences we just had no idea about but are now having to deal with.

Keeping drugs illegal gives government something to focus on as they fight the war on drugs rather than the war on dismay, despair, isolation and fear which has driven the drug use in the first place. This approach gives work to Customs agents, Federal Police and others, resulting in great news stories, such as large quantities of drugs being shown in people's underwear or in condoms which they hold up and say, "I wonder where they inserted this?". People think that is wonderful and get a chuckle out of it, but these drugs kill people. The people who are bringing them into the country are making millions of dollars at the expense of young people's lives. Despite the war on drugs, we are now seeing more heroin back in Sydney than ten years ago, thanks to the war on drugs and the war on terrorism which has allowed Afghanistan to now start producing more heroin than ever before.

The Stateline program on 4 February said it all, I think, sadly and innocently in some ways. We saw the story of a man who had committed suicide on the front lawn of some young people in western Sydney because these young, unemployed, under-engaged, drug-using teenagers had just heckled him and heckled him to the point of his killing himself - an extraordinary tale. Cannabis and other drug use were blamed to a significant degree for this outcome, whilst the issues of poor parenting and lack of work or social support systems were addressed far less clearly. The current system completely failed that man. If the kids who caused him to take his life were charged with his death and sent into the penal system, is there much hope that they would be rehabilitated? My answer is "No, there is not a lot of hope that they would come out of it better people." Our system failed everybody in that story - yet it was presented as a very intense, thoughtful look at drugs in our society.

Our current system criminalises the drug use that makes life bearable for some; it hardens the minds and hearts of those who do end up in the penal system; it ignores the bleedingly obvious societal factors which lead to dysfunctional drug use in the first place; and it allows this system to run beneath the surface of the law, out of reach of the police for much of the time, making millions of dollars and ending hundreds of tragic lives.

the neuronal basis of civilization?


Wednesday, August 11, 2010

thermogene collision


Video - A vision of how our genetic imperatives will clash with Peak Oil and net energy descent

greatest depression geography

thumper die-off narrative....,


matt simmons (april 7, 1943 – august 8, 2010)

EVWorld | A visionary or a gadfly, Simmons helped alert the world to peak oil.

News of Matt Simmons's untimely death came to me this morning in an email link to the August 9, 2010 Reuters news story, Oil guru Matthew Simmons dies in Maine.

I had had the pleasure of talking with Matt on a number of occasions over the years, even persuading him to join EV World's nascent Editorial Advisory Board, on the stipulation, he informed me, that I not ask his advice.

He first came to my attention back in 2004, when he was interviewed by Julian Darley on the topic of Peak Oil. I would interview him about his newly released book, "Twilight in the Desert," in the summer of 2005.

I can still vividly recall his describing to me his wading through a three-foot-high stack of Saudi oil field engineering reports, the consensus of which convinced him that Saudi Arabia's oil fields were in serious trouble, a conclusion that was not only the underlying premise of his book, but also earned him the reputation of an oil and gas industry gadfly, as well as a favorite cable news guest as oil prices briefly crested -- propelled mainly by greedy speculators -- at nearly $150 a barrel in the summer of 2008. A year earlier he had correctly predicted that the price would climb to over $100.

We finally met in person in the fall of 2005 at the first Association for the Study of Peak Oil conference in Denver, and then again in Boston in 2006. At the Denver conference, he took a moment to thank me for helping publicize Twilight.

Over time, Matt became increasingly concerned about the imminence of Peak Oil and the serious impact it is likely to have on our culture. The founder of Simmons & Co. International, which today is one of the largest investment banking companies specializing in the energy industry, he also founded The Ocean Energy Institute, the mission of which is to promote research and development of offshore wind resources. In June, he announced he would retire from the investment banking business, but not before creating yet another sensation by alleging that the BP oil gusher is a "sideshow." The real problem, he contended, is a huge seafloor fissure 5-7 miles away from the original blown wellhead, releasing, he estimated, 120,000 barrels of oil a day. Little word has since appeared about that potential problem.

Matt, age 67, was discovered dead in the hot tub of his Maine summer home. The local coroner listed the cause of death as "accidental drowning" complicated by heart disease. He leaves his wife and five daughters, and a lot of grateful people whom he helped wake up to our challenging energy future, one he didn't live to see, but one I am sure he hoped he'd helped make a little less turbulent by speaking out. Fist tap Dale.

Tuesday, August 10, 2010

colonization from without and from within

howtosavetheworld | Colonization is a loaded word, depending on whether you are the colonizer or the colonized. Throughout the history of our civilization, colonizers (imperialists, conquistadors, missionaries and, most recently, globalization corporatists) have asserted that colonized people were “savages” who needed external rule imposed on them “for their own good”. It matters little whether such assertions were honest, well-intentioned and misguided, or blatant excuses for theft, murder and oppression. The whole world is now substantially a single homogeneous colony, a single culture imposed and enforced by political and media propaganda, economic coercion, and of course, brute force.

The word “colonize” is from the Latin (whose speakers were accomplished at it) meaning “to inhabit, settle, farm and cultivate”. This definition carries no pretense of doing anything for the benefit of the “colonized” peoples. It just means taking over the land and resources, with or without violence and displacement. The words “culture” and “cultivate” also referred strictly to farming activities until, a mere two centuries ago, their meaning was expanded to include the intellectual, political, economic and social activities of civilization.

Such is the malleability of the human mind and conscience, that colonization occurs, to a greater or lesser extent, at four different levels, and the fact that the more interior forms of colonization are less obvious and often sub-conscious merely makes them, and their effect, more insidious. The four levels, depicted in the chart above, are, reading from the outside-in:

1. External colonization — where people from one land move into and colonize another land (e.g. various recent invasions of Afghanistan; NAFTA)
2. Internal colonization — where a dominant culture undermines and exterminates another culture within the same area (e.g. the ongoing brutality that the dominant European culture subjects indigenous peoples to, worldwide)
3. Self-colonization — where a group of people undermines and exterminates diversity within their own culture (e.g. McCarthyism, groupthink and hazing)
4. Personal colonization — where an individual molds her/himself to better fit in with her/his group and/or culture. Fist tap Dale.

separating the mind from essence..,

from Gurdjieff's "Views from the Real World," pp. 148-150

As long as a man does not separate himself from himself he can achieve nothing, and no one can help him.

To govern oneself is a very difficult thing--it is a problem for the future; it requires much power and demands much work. But this first thing, to separate oneself from oneself, does not require much strength, it only needs desire, serious desire, the desire of a grown-up man. If a man cannot do it, it shows that he lacks the desire of a grown-up man. Consequently it proves that there is nothing for him here. What we do here can only be a doing suitable for grown-up men.

Our mind, our thinking, has nothing in common with us, with our essence--no connection, no dependence. Our mind lives by itself and our essence lives by itself. When we say "to separate oneself from oneself" it means that the mind should stand apart from the essence. Our weak essence can change at any moment, for it is dependent on many influences: on food, on our surroundings, on time, on the weather, and on a multitude of other causes. But the mind depends on very few influences and so, with a little effort, it can be kept in the desired direction. Every weak man can give the desired direction to his mind. But he has no power over his essence; great power is required to give direction to essence and keep essence to it. (Body and essence are the same devil.)...

Speaking of the mind I know that each of you has enough strength, each of you can have the power and capacity to act not as he now acts....

I repeat, every grown-up man can achieve this; everyone who has a serious desire can do it. But no one tries....

In order to understand better what I mean, I shall give you an example: now, in a calm state, not reacting to anything or anyone, I decide to set myself the task of establishing a good relationship with Mr. B., because I need him for business purposes and can do what I wish only with his help. But I dislike Mr. B. for he is a very disagreeable man. He understands nothing. He is a blockhead. He is vile, anything you like. I am so made that these traits affect me. Even if he merely looks at me, I become irritated. If he talks nonsense, I am beside myself. I am only a man, so I am weak and cannot persuade myself that I need not be annoyed--I shall go on being annoyed.

Yet I can control myself, depending on how serious my desire is to gain the end I wish to gain through him. If I keep to this purpose, to this desire, I shall be able to do so. No matter how annoyed I may be, this state of wishing will be in my mind. No matter how furious, how beside myself I am, in a corner of my mind I shall still remember the task I set myself. My mind is unable to restrain me from anything, unable to make me feel this or that toward him, but it is able to remember. I say to myself: "You need him, so don't be cross or rude to him." It could even happen that I would curse him, or hit him, but my mind would continue to pluck at me, reminding me that I should not do so. But the mind is powerless to do anything.

This is precisely what anyone who has a serious desire not to identify himself with his essence can do. This is what is meant by "separating the mind from the essence."

And what happens when the mind becomes merely a function? If I am annoyed, if I lose my temper, I shall think, or rather "it" will think, in accordance with this annoyance, and I shall see everything in the light of the annoyance. To hell with it!

And so I say that with a serious man--a simple, ordinary man without any extraordinary powers, but a grown-up man--whatever he decides, whatever problem he has set himself, that problem will always remain in his head. Even if he cannot achieve it in practice, he will always keep it in his mind. Even if he is influenced by other considerations, his mind will not forget the problem he has set himself. He has a duty to perform and, if he is honest, he will strive to perform it, because he is a grown-up man.

No one can help him in this remembering, in this separation of oneself from oneself. A man must do it for himself. Only then, from the moment a man has this separation, can another man help him....

The only difference between a child and a grown-up man is in the mind. All the weaknesses are there, beginning with hunger, with sensitivity, with naiveté; there is no difference. The same things are in a child and in a grown-up man: love, hate, everything. Functions are the same, receptivity is the same, equally they react, equally they are given to imaginary fears. In short there is no difference. The only difference is in the mind: we have more material, more logic than a child.

on essence..,

LiveScience | Our personalities stay pretty much the same throughout our lives, from our early childhood years to after we're over the hill, according to a new study.

The results show personality traits observed in children as young as first graders are a strong predictor of adult behavior.

"We remain recognizably the same person," said study author Christopher Nave, a doctoral candidate at the University of California, Riverside. "This speaks to the importance of understanding personality because it does follow us wherever we go across time and contexts."

The study will be published in an upcoming issue of the journal Social Psychological and Personality Science.

Tracking personalities

Using data from a 1960s study of approximately 2,400 ethnically diverse schoolchildren (grades 1 – 6) in Hawaii, researchers compared teacher personality ratings of the students with videotaped interviews of 144 of those individuals 40 years later.

They examined four personality attributes — talkativeness (called verbal fluency), adaptability (cope well with new situations), impulsiveness and self-minimizing behavior (essentially being humble to the point of minimizing one's importance). Fist tap Nana.

Monday, August 09, 2010

plenty of advanced civilization that will survive

LATimes | Here in this medieval city in eastern Ethiopia, the humans and the hyenas are living in peace.

The truce began two centuries ago (or so the story goes) during a time of great famine.

There was drought in the hills where the wildlife roamed, and hungry hyenas had sneaked into Harar and eaten people.

Distressed, the town's Muslim saints convened a meeting on a nearby mountaintop. There, they devised a solution: The people would feed the hyenas porridge if the hyenas would stop their attacks.

The plan worked, and a strange, symbiotic relationship was born.

City leaders went on to create holes in the sand-colored stone walls that surround Harar to give the hyenas nightly access to the town's garbage. And in the 1960s, a farmer started feeding hyenas scraps of meat (goat, donkey, sometimes camel) to keep them away from his livestock.

That farmer was the first hyena man. Today the title belongs to Youseff Mume Saleh. Fist tap Big Don.

new hypothesis for human evolution and nature

Physorg | It's no secret to any dog-lover or cat-lover that humans have a special connection with animals. But in a new journal article and forthcoming book, paleoanthropologist Pat Shipman of Penn State University argues that this human-animal connection goes well beyond simple affection. Shipman proposes that the interdependency of ancestral humans with other animal species -- "the animal connection" -- played a crucial and beneficial role in human evolution over the last 2.6 million years.

"Establishing an intimate connection to other animals is unique and universal to our species," said Shipman, a professor of biological anthropology. Her paper describing the new hypothesis for human evolution based on the tendency to nurture members of other species will be published in the August 2010 issue of the journal Current Anthropology.

In addition to describing her theory in the scientific paper, Shipman has authored a book for the general public, now in press with W. W. Norton, titled The Animal Connection. "No other mammal routinely adopts other species in the wild -- no gazelles take in baby cheetahs, no mountain lions raise baby deer," Shipman said. "Every mouthful you feed to another species is one that your own children do not eat. On the face of it, caring for another species is maladaptive, so why do we humans do this?"

Shipman suggests that the animal connection was prompted by the invention of stone tools 2.6-million years ago. "Having sharp tools transformed wimpy human ancestors into effective predators who left many cut marks on the fossilized bones of their prey," Shipman said. Becoming a predator also put our ancestors into direct competition with other carnivores for carcasses and prey. As Shipman explains, the human ancestors who learned to observe and understand the behavior of potential prey obtained more meat. "Those who also focused on the behavior of potential competitors reaped a double evolutionary advantage for natural selection," she said.

india asks, should food be a right for the poor?

NYTimes | Inside the drab district hospital, where dogs patter down the corridors, sniffing for food, Ratan Bhuria’s children are curled together in the malnutrition ward, hovering at the edge of starvation. His daughter, Nani, is 4 and weighs 20 pounds. His son, Jogdiya, is 2 and weighs only eight.

Landless and illiterate, drowned by debt, Mr. Bhuria and his ailing children have staggered into the hospital ward after falling through India’s social safety net. They should receive subsidized government food and cooking fuel. They do not. The older children should be enrolled in school and receiving a free daily lunch. They are not. And they are hardly alone: India’s eight poorest states have more people in poverty — an estimated 421 million — than Africa’s 26 poorest nations, one study recently reported.

For the governing Indian National Congress Party, which has staked its political fortunes on appealing to the poor, this persistent inability to make government work for people like Mr. Bhuria has set off an ideological debate over a question that once would have been unthinkable in India: Should the country begin to unshackle the poor from the inefficient, decades-old government food distribution system and try something radical, like simply giving out food coupons, or cash?

The rethinking is being prodded by a potentially sweeping proposal that has divided the Congress Party. Its president, Sonia Gandhi, is pushing to create a constitutional right to food and expand the existing entitlement so that every Indian family would qualify for a monthly 77-pound bag of grain, sugar and kerosene. Such entitlements have helped the Congress Party win votes, especially in rural areas.

To Ms. Gandhi and many left-leaning social allies, making food a legal right would give people like Mr. Bhuria a tool to demand benefits that rightfully belong to them. Many economists and market advocates within the Congress Party agree that the poor need better tools to receive their benefits but believe the existing delivery system needs to be dismantled, not expanded; they argue that handing out vouchers equivalent to the bag of grain would liberate the poor from an unwieldy government apparatus and let them buy what they please, where they please.

Sunday, August 08, 2010

climate killing golf courses?!?!?!?!?!

WSJ | The sustained record-breaking heat across much of the U.S. this summer, combined with high humidity and occasional heavy rain, is killing the greens on many golf courses. A handful of high-profile courses have already had to close, and if the heat continues, others are likely to follow. Golfers themselves deserve part of the blame for insisting that putting surfaces be mown short and fast even in weather conditions in which such practices are almost certain to ruin them.

Huntingdon Valley Country Club outside Philadelphia, which dates from 1897, shut two of its three nines two weeks ago because of serious turf disease caused by the hot, wet weather. The Philadelphia area in July had 17 days of 90-degree-plus weather, six more than average, mixed with flooding thunderstorms of up to 4 inches.

The U.S. Golf Association last week issued a special "turf-loss advisory" to courses in the Mid-Atlantic states, urgently advising greenkeepers to institute "defensive maintenance and management programs" until the weather crisis ends. Most of the danger is to greens planted in creeping bentgrass and annual bluegrass (also known as poa annua).

"Physiologically, these are cool-season grasses that do very well when the air temperature is 60 to 75 degrees," said Clark Throssell, director of research for the Golf Course Superintendents Association of America. "They can cope with a few days of 90-degree weather every summer, but when that kind of heat lasts for days at a time, they have extreme difficulty."

Temperatures for weather reports are measured in the shade, but greens baking in the midday sun can reach 120 or 130 degrees. When grass spends too much time in soil that hot, it starts to thin out, turn yellow and wither. Most bentgrass strains will collapse entirely with prolonged exposure to 106-degree soil. The grass doesn't go dormant—it dies.

the HUGE cost of public pensions

NYTimes | There’s a class war coming to the world of government pensions.

The haves are retirees who were once state or municipal workers. Their seemingly guaranteed and ever-escalating monthly pension benefits are breaking budgets nationwide.

The have-nots are taxpayers who don’t have generous pensions. Their 401(k)s or individual retirement accounts have taken a real beating in recent years and are not guaranteed. And soon, many of those people will be paying higher taxes or getting fewer state services as their states put more money aside to cover those pension checks.

At stake is at least $1 trillion. That’s trillion, with a “t,” as in titanic and terrifying.

The figure comes from a study by the Pew Center on the States that came out in February. Pew estimated a $1 trillion gap as of fiscal 2008 between what states had promised workers in the way of retiree pension, health care and other benefits and the money they currently had to pay for it all. And some economists say that Pew is too conservative and the problem is two or three times as large.

So a question of extraordinary financial, political, legal and moral complexity emerges, something that every one of us will be taking into town meetings and voting booths for years to come: Given how wrong past pension projections were, who should pay to fill the 13-figure financing gap?

paring down...,

NYTimes | Faced with the steepest and longest decline in tax collections on record, state, county and city governments have resorted to major life-changing cuts in core services like education, transportation and public safety that, not too long ago, would have been unthinkable. And services in many areas could get worse before they get better.

The length of the downturn means that many places have used up all their budget gimmicks, cut services, raised taxes, spent their stimulus money — and remained in the hole. Even with Congress set to approve extra stimulus aid, some analysts say states are still facing huge shortfalls.

Cities and states are notorious for crying wolf around budget time, and for issuing dire warnings about draconian cuts that never seem to materialize. But the Great Recession has been different. Around the country, there have already been drastic cuts in core services like education, transportation and public safety, and there are likely to be more before the downturn ends. The cuts that have disrupted lives in Hawaii, Georgia and Colorado may be extreme, but they reflect the kinds of cuts being made nationwide, disrupting the lives of millions of people in ways large and small.

jobless and staying that way...,

NYTimes | Americans have almost always taken growth for granted. Recessions kick in, financial crises erupt, yet these events have generally been thought of as the exception, a temporary departure from an otherwise steady upward progression.

But as expectations for the recovery diminish daily and joblessness shows no sign of easing — as the jobs report on Friday showed — a different view is taking hold. And with it come implications for policymaking.

The “new normal,” as it has come to be called on Wall Street, in academia and on CNBC, envisions an economy in which growth is too slow to bring down the unemployment rate, while the government is forced to intervene ever more forcefully in a struggling private sector. Stocks and bonds yield paltry returns, with better opportunities available for investors overseas.

If that sounds like the last three years, it should. Bill Gross and Mohamed El-Erian, who run the world’s largest bond fund, Pimco, and coined the phrase in this context, think the new normal has already begun and will last at least another three to five years.

The new normal challenges the optimism that’s been at the root of American success for decades, if not centuries. And if it is here, the new normal could force Democrats and Republicans to rethink their traditional approach to unemployment and other social problems.

as an american I did not believe....,

NYTimes | “This economy is absolutely appalling,” said Mary Moore, 39, who has been applying for jobs as an administrative assistant in Norfolk, Va., since she lost her job at a publishing company in May 2009. Ms. Moore, who can collect unemployment benefits for a few more months, is struggling to pay her $525-a-month rent and health care insurance, which recently nearly tripled to $379 a month.

“As an American I did not believe we would see times such as this,” she said.

With the departure of thousands of temporary Census workers and thousands more let go by state and local governments, businesses could not rescue the American labor market in July.

Over all, the nation lost 131,000 jobs last month, according to the Department of Labor, which also said that June was far weaker than previously indicated.

Private employers added 71,000 jobs last month, but those gains were overtaken by the 143,000 positions cut as the Census wound down. The private-sector gain is also about half the number that economists say is needed simply to accommodate population growth, so the tepid job increases cannot begin to plug the hole created by the loss of more than eight million jobs during the recession. The unemployment rate, in fact, remained stuck at 9.5 percent in July.
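Put as back-of-the-envelope arithmetic (a sketch using only the figures quoted above; the implied loss in non-Census government jobs is an inference from those figures, not a number stated in the article), the July numbers fit together like this:

```python
# Reconciling the July 2010 payroll figures quoted above.
total_change = -131_000   # net change in nonfarm payrolls reported by the Labor Department
private_gain = 71_000     # jobs added by private employers
census_cut = -143_000     # temporary Census positions eliminated as the count wound down

# Whatever is left over must be government job changes outside the Census:
other_government = total_change - private_gain - census_cut
print(other_government)   # -59000, i.e. roughly 59,000 further government jobs lost
```

The private gain and the Census cut alone net to −72,000, so the reported −131,000 only balances if state, local and other federal payrolls shrank by about 59,000 as well — which is the "drain on the government side" the PNC economist describes below.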

“The private sector is still hobbled,” said Robert A. Dye, senior economist at PNC Financial Services Group in Pittsburgh, “and certainly is not nearly strong enough to overcome the drain on the government side.”

Government figures released last week confirmed that the American economy slowed in the spring, and the latest jobs numbers suggested that the weakness continued into the early summer. With economists and politicians fervently arguing over whether the economy is poised for liftoff or stalled on the runway, Friday’s jobs report did little to end the debate.

Some economists are talking about the risk of a “double dip” recession, and the political stakes for the Obama administration are rising as the midterm elections tick closer.

Saturday, August 07, 2010

yeah.., I said it!!!

xinhua | The head of China's largest domestic rating agency denied criticism from its western counterparts that it practices populism, while reaffirming the agency's principles of independence, objectivity and fairness.

The Dagong Global Credit Rating Co. Ltd bases its ratings on long-term research into the credit economy and on the rules and experience of rating standards, providing impartial rating information in the post-crisis era rather than swinging toward domestic interests or so-called "populism", Guan Jianzhong, chairman of Dagong Global Credit Rating, told Xinhua in an exclusive interview on Wednesday.

In Tuesday's interview with the Financial Times, Harold "Terry" McGraw III, chairman and chief executive of the U.S.-based McGraw-Hill Companies, which owns Standard & Poor's, suggested that the Chinese rating agency follows a "populist mood" and lacks transparency in publishing its policies and procedures and in setting out its assumptions and criteria.

Guan said it is irresponsible for the western rating firm to label a new-born international rating agency as "populist" instead of carrying out self-criticism of its own highly politicized rating standards.

"Standard & Poor's failed to identify the debtor nations' currency depreciation, which infringed on the interests of the creditor nations, as a sovereign debt default. Such practice is a fundamental cause of the instability of the international credit system," said Guan.

Guan also rejected reports that he suggested the government should have more control in credit rating decisions.

"It's sheer absurdity. I've never made such a suggestion," he said.

"Dagong has been maintaining its independent, impartial and fair position; however, the independence of some U.S. rating firms needs to be questioned due to the close relationship between their shareholders and their clients," said Guan, adding that billionaire investor Warren Buffett is the largest shareholder in Moody's.

oh, and your credit bad.....,

FT | The head of China’s largest credit rating agency has slammed his western counterparts for causing the global financial crisis and said that as the world’s largest creditor nation China should have a bigger say in how governments and their debt are rated.

“The western rating agencies are politicised and highly ideological and they do not adhere to objective standards,” Guan Jianzhong, chairman of Dagong Global Credit Rating, told the Financial Times in an interview. “China is the biggest creditor nation in the world and with the rise and national rejuvenation of China we should have our say in how the credit risks of states are judged.”

On the corporate side, Mr Guan argues Moody’s Investors Service, Standard & Poor’s and Fitch Ratings – the three companies that dominate the global credit rating industry – have become too close to the clients they are supposed to be objectively assessing.

He specifically criticised the practice of “rating shopping” by companies who offer their business to the agency that provides the most favourable rating.

In the aftermath of the financial crisis, "rating shopping" has been one of the key complaints from western regulators, who have heavily criticised the big three agencies for handing top ratings to mortgage-linked securities that turned toxic when the US housing market collapsed in 2007.

“The financial crisis was caused because rating agencies didn’t properly disclose risk and this brought the entire US financial system to the verge of collapse, causing huge damage to the US and its strategic interests,” Mr Guan said.

Recently, the rating agencies have been criticised for being too slow to downgrade some of the heavily indebted peripheral eurozone economies, most notably Spain, which still holds triple A ratings from Moody’s.

Friday, August 06, 2010

inner pessimism and powerlessness


Video - classic excerpt from Watermelon Man

WSJ | Do our political leaders have any sense of what people are feeling deep down? They don't act as if they do. I think their detachment from how normal people think is more dangerous and disturbing than it has been in the past. I started noticing in the 1980s the growing gulf between the country's thought leaders, as they're called—the political and media class, the universities—and those living what for lack of a better word we'll call normal lives on the ground in America. The two groups were agitated by different things, concerned about different things, had different focuses, different world views.

But I've never seen the gap wider than it is now. I think it is a chasm. In Washington they don't seem to be looking around and thinking, Hmmm, this nation is in trouble, it needs help. They're thinking something else. I'm not sure they understand the American Dream itself needs a boost, needs encouragement and protection. They don't seem to know or have a sense of the mood of the country.

And so they make their moves, manipulate this issue and that, and keep things at a high boil. And this at a time when people are already in about as much hot water as they can take.

To take just one example from the past 10 days, the federal government continues its standoff with the state of Arizona over how to handle illegal immigration. The point of view of our thought leaders is, in general, that borders that are essentially open are good, or not so bad. The point of view of those on the ground who are anxious about our nation's future, however, is different, more like: "We live in a welfare state and we've just expanded health care. Unemployment's up. Could we sort of calm down, stop illegal immigration, and absorb what we've got?" No is, in essence, the answer.

An irony here is that if we stopped the illegal flow and removed the sense of emergency it generates, comprehensive reform would, in time, follow. Because we're not going to send the estimated 10 million to 15 million illegals already here back. We're not going to put sobbing children on a million buses. That would not be in our nature. (Do our leaders even know what's in our nature?) As years passed, those here would be absorbed, and everyone in the country would come to see the benefit of integrating them fully into the tax system. So it's ironic that our leaders don't do what in the end would get them what they say they want, which is comprehensive reform.

When the adults of a great nation feel long-term pessimism, it only makes matters worse when those in authority take actions that reveal their detachment from the concerns—even from the essential nature—of their fellow citizens. And it makes those citizens feel powerless.

Inner pessimism and powerlessness: That is a dangerous combination.

Thursday, August 05, 2010

culture war in america

NYTimes | The events fanned a long-standing disagreement between much of the high school faculty and the administration of Hunter College over the use of a single, teacher-written test for admission to the school, which has grades 7 through 12. Faculty committees have recommended broadening the admissions process to include criteria like interviews, observations or portfolios of student work, in part to increase minority enrollment and blunt the impact of the professional test preparation undertaken by many prospective students.

Eliminating the test, which has remained essentially unchanged for decades, is not on the table, said John Rose, the dean for diversity at Hunter College. The test, he said, is an integral part of the success of the school, which has a stellar college admissions profile — about 25 percent of graduates are admitted to Ivy League schools — and outstanding alumni like Ms. Kagan and Ruby Dee.

“Parents, faculty members and alumni feel very strongly that the test is very valuable in terms of preserving the kind of specialness and uniqueness that the school has,” Mr. Rose said.

As has happened at other prestigious city high schools that use only a test for admission, the black and Hispanic population at Hunter has fallen in recent years. In 1995, the entering seventh-grade class was 12 percent black and 6 percent Hispanic, according to state data. This past year, it was 3 percent black and 1 percent Hispanic; the balance was 47 percent Asian and 41 percent white, with the other 8 percent of students identifying themselves as multiracial. The public school system as a whole is 70 percent black and Hispanic.

When Justin Hudson, 18, stood up in his purple robes to address his classmates in the auditorium of Hunter College, those numbers were on his mind. He opened his remarks by praising the school and explaining how appreciative he was to have made it to that moment.

Then he shocked his audience. “More than anything else, I feel guilty,” Mr. Hudson, who is black and Hispanic, told his 183 fellow graduates. “I don’t deserve any of this. And neither do you.”

They had been labeled “gifted,” he told them, based on a test they passed “due to luck and circumstance.” Beneficiaries of advantages, they were disproportionately from middle-class Asian and white neighborhoods known for good schools and the prevalence of tutoring.

“If you truly believe that the demographics of Hunter represent the distribution of intelligence in this city,” he said, “then you must believe that the Upper West Side, Bayside and Flushing are intrinsically more intelligent than the South Bronx, Bedford-Stuyvesant and Washington Heights. And I refuse to accept that.”

The entire faculty gave him a standing ovation, as did about half the students. The principal, Eileen Coppola, who had quietly submitted her formal resignation in mid-June but had not yet informed the faculty, praised him, saying, “That was a very good and a very brave speech to make,” Mr. Hudson recalled. But Jennifer J. Raab, Hunter College’s president and herself a Hunter High alumna, looked uncomfortable on the stage and did not join in the ovation, faculty members and students said.

american culture war


Video - MSNBC: Gay Marriage Is Inevitable (Cenk's Take)

NYTimes | The decision, though an instant landmark in American legal history, is more than that. It also is a stirring and eloquently reasoned denunciation of all forms of irrational discrimination, the latest link in a chain of pathbreaking decisions that permitted interracial marriages and decriminalized gay sex between consenting adults.

As the case heads toward appeals at the circuit level and probably the Supreme Court, Judge Walker’s opinion will provide a firm legal foundation that will be difficult for appellate judges to assail.

The case was brought by two gay couples who said California’s Proposition 8, which passed in 2008 with 52 percent of the vote, discriminated against them by prohibiting same-sex marriage and relegating them to domestic partnerships. The judge easily dismissed the idea that discrimination is permissible if a majority of voters approve it; the referendum’s outcome was “irrelevant,” he said, quoting a 1943 case, because “fundamental rights may not be submitted to a vote.”

He then dismantled, brick by crumbling brick, the weak case made by supporters of Proposition 8 and laid out the facts presented in testimony. The two witnesses called by the supporters (the state having bowed out of the case) had no credibility, he said, and presented no evidence that same-sex marriage harmed society or the institution of marriage.

Same-sex couples are identical to opposite-sex couples in their ability to form successful marital unions and raise children, he said. Though procreation is not a necessary goal of marriage, children of same-sex couples will benefit from the stability provided by marriage, as will the state and society. Domestic partnerships confer a second-class status. The discrimination inherent in that second-class status is harmful to gay men and lesbians. These findings of fact will be highly significant as the case winds its way through years of appeals.

Video - Michael Savage on California gay marriage ban being overturned

Wednesday, August 04, 2010

solar thermal electrochemical photo (STEP) carbon capture

ACS | The first experimental evidence of a new solar process, combining electronic and chemical pathways, to isolate CO2 (carbon capture) is presented. This solar thermal electrochemical photo (STEP) process is a synergy of solid-state and solar thermal processes, and is fundamentally capable of converting more solar energy than photovoltaic or solar thermal processes alone. Here, CO2 is captured in a single step using a 750−950 °C electrolysis cell powered by a full-spectrum solar simulator. The process uses the full spectrum: solar thermal energy decreases the energy required for carbon capture, while visible sunlight generates electronic charge to drive the electrolysis. CO2 can be captured at 34% to over 50% solar energy efficiency (depending on the level of solar heat inclusion), either as solid carbon for storage or as carbon monoxide, a feedstock to synthesize (with STEP-generated hydrogen) solar diesel fuel, synthetic jet fuel, or chemicals.
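For orientation, the two capture products named in the abstract correspond to these net electrolysis reactions (an informal sketch; the actual molten-carbonate cell chemistry involves intermediate carbonate species not shown here):

```latex
% Net STEP electrolysis reactions (informal sketch)
\mathrm{CO_2 \longrightarrow C_{(solid)} + O_2}
  % lower-temperature pathway: carbon deposited as a storable solid
\mathrm{CO_2 \longrightarrow CO + \tfrac{1}{2}\,O_2}
  % higher-temperature pathway: CO recovered as a fuel/chemical feedstock
```

Both reactions are strongly endothermic, which is why supplying part of the required energy as solar heat, rather than electricity alone, raises the overall solar efficiency.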

conservative shakedown crews

WaPo | Days after the Deepwater Horizon oil rig sank in the Gulf of Mexico, a conservative nonprofit group called the Institute for Energy Research asked BP to contribute $100,000 for a media campaign it was launching in defense of the oil industry.

Although BP took a pass, the group's advocacy arm went ahead with a campaign -- only instead of defending BP, it vilified the company as a "safety outlier" in an otherwise safe industry. The campaign's Web site features dozens of images of the burning rig, oil-smeared birds and other environmental devastation from the spill.

"BP is a victim of its own carelessness," the group's president, Thomas Pyle, wrote as part of the campaign's kickoff in early July. "The rest of us should not be."

To backers of BP who were familiar with the discussions and spoke on the condition of anonymity, it seemed an awful lot like a shakedown. The initial proposal contained no criticism of the British oil giant or its handling of the spill. A BP spokesman declined to comment.

But Pyle, previously an oil-industry lobbyist and an aide to former congressman and Texas Republican Tom DeLay, said the anti-BP message was part of a separate campaign and was not intended as retaliation. "A lot of people were trying to lump the industry together as one cohesive unit," Pyle said in an interview. "Our point was to not judge the whole industry by one incident and one actor."

The case illustrates the murky world of advocacy-for-hire in Washington, where ideological groups wage stealth messaging campaigns with little disclosure of their funding or possible motives. Such arrangements rarely come to light since most advocacy groups are organized as nonprofits that do not have to disclose details about their donors.

Tuesday, August 03, 2010

tracing oil reserves to their tiny origins

NYTimes | In 1913, as the automobile zoomed into American life, The Outing Magazine gave its readers a bit of background on what fueled the new motorcars in “The Story of Gasoline.” After a brief vignette describing the death of “old Colonel Stegosaurus Ugulatus,” the article explained that “yesterday you poured the remains of the dinosaur from a measuring-can — which, let us hope, held five gallons, full measure — into your gasoline tank.”

The idea that oil came from the terrible lizards that children love to learn about endured for many decades. The Sinclair Oil Company featured a dinosaur in its logo and in its advertisements, and outfitted its gas stations with giant replicas that bore long necks and tails. The publicity gave the term “fossil fuels” new resonance.

But the emphasis turned out to be wrong.

Today, a principal tenet of geology is that a vast majority of the world’s oil arose not from lumbering beasts on land but from tiny organisms at sea. It holds that blizzards of microscopic life fell into the sunless depths over the ages, producing thick sediments that the planet’s inner heat eventually cooked into oil. It is estimated that 95 percent or more of global oil traces its genesis to the sea.

“It’s the dominant theory,” said David A. Ross, scientist emeritus at the Woods Hole Oceanographic Institution on Cape Cod. The idea, he added, has been verified as geologists have roamed the globe over the decades and repeatedly found that beds of marine sediments are “a good predictor” of where to discover oil.

The theory also explains offshore drilling — why there is oil in many seabeds, why it is more often near shore than in the abyss, and why, despite the Deepwater Horizon disaster in the Gulf of Mexico, which killed 11 crewmen and caused the worst offshore oil spill in American history, oil experts say offshore drilling may increase, rather than cease.

cosmic rays not background after all?

LiveScience | A puzzling pattern in the cosmic rays bombarding Earth from space has been discovered by an experiment buried deep under the ice of Antarctica.

Cosmic rays are highly energetic particles streaming in from space that are thought to originate in the distant remnants of dead stars.

But it turns out these particles are not arriving uniformly from all directions. The new study detected an overabundance of cosmic rays coming from one part of the sky, and a lack of cosmic rays coming from another.

This odd pattern was detected by the IceCube Neutrino Observatory, an experiment still under construction that is actually intended to detect other exotic particles called neutrinos. In fact, scientists have gone out of their way to try to block out all signals from cosmic rays in order to search for the highly elusive neutrinos, which are much harder to find.

Yet in sifting through their cosmic-ray data to try to separate it from possible neutrino signals, the researchers noticed the intriguing pattern. Fist tap Nana.

Monday, August 02, 2010

human race will be extinct within 100 years

DailyMail | As the scientist who helped eradicate smallpox, he certainly knows a thing or two about extinction.

And now Professor Frank Fenner, emeritus professor of microbiology at the Australian National University, has predicted that the human race will be extinct within the next 100 years.

He has claimed that the human race will be unable to survive a population explosion and 'unbridled consumption'.

Fenner told The Australian newspaper that 'homo sapiens will become extinct, perhaps within 100 years.'

'A lot of other animals will, too,' he added.

'It's an irreversible situation. I think it's too late. I try not to express that because people are trying to do something, but they keep putting it off.'

Since humans entered an unofficial scientific period known as the Anthropocene - the time since industrialisation - we have had an effect on the planet that rivals any ice age or comet impact, he said.

Fenner, 95, has won awards for his work in helping eradicate the variola virus that causes smallpox and has written or co-written 22 books.

He announced the eradication of the disease to the World Health Assembly in 1980 and it is still regarded as one of the World Health Organisation's greatest achievements.

He was also heavily involved in helping to control Australia's myxomatosis problem in rabbits.

Last year official UN figures estimated that the world’s population is currently 6.8 billion. It is predicted to exceed seven billion by the end of 2011.

Fenner blames the onset of climate change for the human race’s imminent demise.

He said: 'We'll undergo the same fate as the people on Easter Island.

'Climate change is just at the very beginning. But we're seeing remarkable changes in the weather already.'

'The Aborigines showed that without science and the production of carbon dioxide and global warming, they could survive for 40,000 or 50,000 years.

‘But the world can't. The human species is likely to go the same way as many of the species that we've seen disappear.'

the limits of the coded world

NYTimes | To my mind the philosopher who gave the most complete answer to this question was Immanuel Kant. In Kant’s view, the main mistake philosophers before him had made when considering how humans could have accurate knowledge of the world was to forget the necessary difference between our knowledge and the actual subject of that knowledge. At first glance, this may not seem like a very easy thing to forget; for example, what our eyes tell us about a rainbow and what that rainbow actually is are quite different things. Kant argued that our failure to grasp this difference was further reaching and had greater consequences than anyone could have thought.

The belief that our empirical exploration of the world and of the human brain could ever eradicate human freedom is an error.

Taking again the example of the rainbow, Kant would argue that while most people would grant the difference between the range of colors our eyes perceive and the refraction of light that causes this optical phenomenon, they would still maintain that more careful observation could indeed bring one to know the rainbow as it is in itself, apart from its sensible manifestation. This commonplace understanding, he argued, was at the root of our tendency to fall profoundly into error, not only about the nature of the world, but about what we were justified in believing about ourselves, God, and our duty to others.

The problem was that while our senses can only ever bring us verifiable knowledge about how the world appears in time and space, our reason always strives to know more than appearances can show it. This tendency of reason to always know more is and was a good thing. It is why humankind is always curious, always progressing to greater and greater knowledge and accomplishments. But if not tempered by a respect for its limits and an understanding of its innate tendencies to overreach, reason can lead us into error and fanaticism.

Let’s return to the example of the experiment predicting the monkeys’ decisions. What the experiment tells us is nothing other than that the monkeys’ decision making process moves through the brain, and that our technology allows us to get a reading of that activity faster than the monkeys’ brain can put it into action. From that relatively simple outcome, we can now see what an unjustified series of rather major conundrums we had drawn. And the reason we drew them was because we unquestioningly translated something unknowable — the stretch of time including the future of the monkeys’ as-yet-undecided and unperformed actions — into a neat scene that just needed to be decoded in order to be experienced. We treated the future as if it had already happened and hence as a series of events that could be read and narrated.

From a Kantian perspective, with this simple act we allowed reason to override its boundaries, and as a result we fell into error. The error we fell into was, specifically, to believe that our empirical exploration of the world and of the human brain could ever eradicate human freedom.

This, then, is why, as “irresistible” as their logic might appear, none of the versions of Galen Strawson’s “Basic Argument” for determinism, which he outlined in The Stone last week, have any relevance for human freedom or responsibility. According to this logic, responsibility must be illusory, because in order to be responsible at any given time an agent must also be responsible for how he or she became how he or she is at that time, which initiates an infinite regress, because at no point can an individual be responsible for all the genetic and cultural forces that have produced him or her as he or she is. But this logic is nothing other than a philosophical version of the code of codes; it assumes that the sum history of forces determining an individual exists as a kind of potentially legible catalog.

The point to stress, however, is that this catalog is not even legible in theory, for to be known it assumes a kind of knower unconstrained by time and space, a knower who could be present from every possible perspective at every possible deciding moment in an agent’s history and prehistory. Such a knower, of course, could only be something along the lines of what the monotheistic traditions call God. But as Kant made clear, it makes no sense to think in terms of ethics, or responsibility, or freedom when talking about God; to make ethical choices, to be responsible for them, to be free to choose poorly, all of these require precisely the kind of being who is constrained by the minimal opacity that defines our kind of knowing.

As much as we owe the nature of our current existence to the evolutionary forces Darwin first discovered, or to the cultures we grow up in, or to the chemical states affecting our brain processes at any given moment, none of this impacts on our freedom. I am free because neither science nor religion can ever tell me, with certainty, what my future will be and what I should do about it. The dictum from Sartre that Strawson quoted thus gets it exactly right: I am condemned to freedom. I am not free because I can make choices, but because I must make them, all the time, even when I think I have no choice to make.

your move: the limits of free will

NYTimes | You may have heard of determinism, the theory that absolutely everything that happens is causally determined to happen exactly as it does by what has already gone before — right back to the beginning of the universe. You may also believe that determinism is true. (You may also know, contrary to popular opinion, that current science gives us no more reason to think that determinism is false than that determinism is true.) In that case, standing on the steps of the store, it may cross your mind that in five minutes’ time you’ll be able to look back on the situation you’re in now and say truly, of what you will by then have done, “Well, it was determined that I should do that.” But even if you do fervently believe this, it doesn’t seem to be able to touch your sense that you’re absolutely morally responsible for what you do next.

The case of the Oxfam box, which I have used before to illustrate this problem, is relatively dramatic, but choices of this type are common. They occur frequently in our everyday lives, and they seem to prove beyond a doubt that we are free and ultimately morally responsible for what we do. There is, however, an argument, which I call the Basic Argument, which appears to show that we can never be ultimately morally responsible for our actions. According to the Basic Argument, it makes no difference whether determinism is true or false. We can’t be ultimately morally responsible either way.

The argument goes like this.

(1) You do what you do — in the circumstances in which you find yourself — because of the way you then are.

(2) So if you’re going to be ultimately responsible for what you do, you’re going to have to be ultimately responsible for the way you are — at least in certain mental respects.

(3) But you can’t be ultimately responsible for the way you are in any respect at all.

(4) So you can’t be ultimately responsible for what you do.

The key move is (3). Why can’t you be ultimately responsible for the way you are in any respect at all? In answer, consider an expanded version of the argument.

(a) It’s undeniable that the way you are initially is a result of your genetic inheritance and early experience.

(b) It’s undeniable that these are things for which you can’t be held to be in any way responsible (morally or otherwise).

(c) But you can’t at any later stage of life hope to acquire true or ultimate moral responsibility for the way you are by trying to change the way you already are as a result of genetic inheritance and previous experience.

(d) Why not? Because both the particular ways in which you try to change yourself, and the amount of success you have when trying to change yourself, will be determined by how you already are as a result of your genetic inheritance and previous experience.

(e) And any further changes that you may become able to bring about after you have brought about certain initial changes will in turn be determined, via the initial changes, by your genetic inheritance and previous experience.

There may be all sorts of other factors affecting and changing you. Determinism may be false: some changes in the way you are may come about as a result of the influence of indeterministic or random factors. But you obviously can’t be responsible for the effects of any random factors, so they can’t help you to become ultimately morally responsible for how you are.
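The regress at the heart of steps (a) through (e) can be rendered schematically (an informal sketch, where R(x) abbreviates "you are ultimately responsible for x", A is an action, and M_t is your mental makeup at time t):

```latex
% Informal schema of the Basic Argument's regress
R(A) \Rightarrow R(M_t)         % (1)-(2): responsibility for the act requires
                                %          responsibility for how you then are
R(M_t) \Rightarrow R(M_{t-1})   % (c)-(e): efforts at self-change are themselves
                                %          driven by how you already are
\vdots
R(M_1) \Rightarrow R(M_0)       % the chain bottoms out in your initial state
\neg R(M_0)                     % (a)-(b): genes and early experience are not up to you
\therefore\ \neg R(A)           % (4): so ultimate responsibility never gets started
```

The schema makes visible why, as the argument claims, the truth or falsity of determinism is beside the point: each implication holds whether the links in the chain are deterministic or merely causal-plus-random.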

Sunday, August 01, 2010

warsocialism's epic failure

HuffPo | Throughout much of the twentieth century, great powers had vied with one another to create new, or more effective, instruments of coercion. Military innovation assumed many forms. Most obviously, there were the weapons: dreadnoughts and aircraft carriers, rockets and missiles, poison gas, and atomic bombs -- the list is a long one. In their effort to gain an edge, however, nations devoted equal attention to other factors: doctrine and organization, training systems and mobilization schemes, intelligence collection and war plans.

All of this furious activity, whether undertaken by France or Great Britain, Russia or Germany, Japan or the United States, derived from a common belief in the plausibility of victory. Expressed in simplest terms, the Western military tradition could be reduced to this proposition: war remains a viable instrument of statecraft, the accoutrements of modernity serving, if anything, to enhance its utility.

Grand Illusions

That was theory. Reality, above all the two world wars of the last century, told a decidedly different story. Armed conflict in the industrial age reached new heights of lethality and destructiveness. Once begun, wars devoured everything, inflicting staggering material, psychological, and moral damage. Pain vastly exceeded gain. In that regard, the war of 1914-1918 became emblematic: even the winners ended up losers. When fighting eventually stopped, the victors were left not to celebrate but to mourn. As a consequence, well before Fukuyama penned his essay, faith in war’s problem-solving capacity had begun to erode. As early as 1945, among several great powers -- thanks to war, now great in name only -- that faith disappeared altogether.

Among nations classified as liberal democracies, only two resisted this trend. One was the United States, the sole major belligerent to emerge from the Second World War stronger, richer, and more confident. The second was Israel, created as a direct consequence of the horrors unleashed by that cataclysm. By the 1950s, both countries subscribed to this common conviction: national security (and, arguably, national survival) demanded unambiguous military superiority. In the lexicon of American and Israeli politics, “peace” was a codeword. The essential prerequisite for peace was for any and all adversaries, real or potential, to accept a condition of permanent inferiority. In this regard, the two nations -- not yet intimate allies -- stood apart from the rest of the Western world.

So even as they professed their devotion to peace, civilian and military elites in the United States and Israel prepared obsessively for war. They saw no contradiction between rhetoric and reality.

Yet belief in the efficacy of military power almost inevitably breeds the temptation to put that power to work. “Peace through strength” easily enough becomes “peace through war.” Israel succumbed to this temptation in 1967. For Israelis, the Six Day War proved a turning point. Plucky David defeated, and then became, Goliath. Even as the United States was flailing about in Vietnam, Israel had evidently succeeded in definitively mastering war.

A quarter-century later, U.S. forces seemingly caught up. In 1991, Operation Desert Storm, George H.W. Bush's war against Iraqi dictator Saddam Hussein, showed that American troops, like Israeli soldiers, knew how to win quickly, cheaply, and humanely. Generals like H. Norman Schwarzkopf persuaded themselves that their brief desert campaign against Iraq had replicated -- even eclipsed -- the battlefield exploits of such famous Israeli warriors as Moshe Dayan and Yitzhak Rabin. Vietnam faded into irrelevance.

For both Israel and the United States, however, appearances proved deceptive. Apart from fostering grand illusions, the splendid wars of 1967 and 1991 decided little. In both cases, victory turned out to be more apparent than real. Worse, triumphalism fostered massive future miscalculation.