Wednesday, August 01, 2012

the elite perversion of scholarship

truthdig | Fraternities, sororities and football, along with other outsized athletic programs, have decimated most major American universities. Scholarship, inquiry, self-criticism, moral autonomy and a search for artistic and esoteric forms of expression—in short, the world of ethics, creativity and ideas—are shouted down by the drunken chants of fans in huge stadiums, the pathetic demands of rich alumni for national championships, and the elitism, racism and rigid definition of gender roles of Greek organizations. These hypermasculine systems perpetuate a culture of conformity and intolerance. They have inverted the traditional values of scholarship to turn four years of college into a mindless quest for collective euphoria and athletic dominance.

There is probably no more inhospitable place to be an intellectual, or a person of color or a member of the LGBT community, than on the campuses of the Big Ten Conference colleges, although the poison of this bizarre American obsession has infected innumerable schools. These environments are distinctly corporate. To get ahead one must get along. The student is implicitly told his or her self-worth and fulfillment are found in crowds, in mass emotions, rather than individual transcendence. Those who do not pay deference to the celebration of force, wealth and power become freaks. It is a war on knowledge in the name of knowledge.

“Knowledge,” as C. Wright Mills wrote in “The Power Elite,” “is no longer widely felt as an ideal; it is seen as an instrument. In a society of power and wealth, knowledge is valued as an instrument of power and wealth, and also, of course, as an ornament in conversation.”

There are few university presidents or faculty members willing to fight back. Most presidents are overcompensated fundraisers licking the boots of every millionaire who arrives on campus. They are like court eunuchs. They cater to the demands of the hedge fund managers and financial speculators on their trustee boards, half of whom should be in jail, and most of whom revel in this collective self-worship. And they do not cross the football coach, who not only earns more than they do but has much more power on the campus.

Tuesday, July 31, 2012

DHS Presents: Surviving a Crazy Baldhead Eruption


charles murray deeply worried about his people and their ways...,



WSJ | Mitt Romney's résumé at Bain should be a slam dunk. He has been a successful capitalist, and capitalism is the best thing that has ever happened to the material condition of the human race. From the dawn of history until the 18th century, every society in the world was impoverished, with only the thinnest film of wealth on top. Then came capitalism and the Industrial Revolution. Everywhere that capitalism subsequently took hold, national wealth began to increase and poverty began to fall. Everywhere that capitalism didn't take hold, people remained impoverished. Everywhere that capitalism has been rejected since then, poverty has increased.

Capitalism has lifted the world out of poverty because it gives people a chance to get rich by creating value and reaping the rewards. Who better to be president of the greatest of all capitalist nations than a man who got rich by being a brilliant capitalist?

Yet it hasn't worked out that way for Mr. Romney. "Capitalist" has become an accusation. The creative destruction that is at the heart of a growing economy is now seen as evil. Americans increasingly appear to accept the mind-set that kept the world in poverty for millennia: If you've gotten rich, it is because you made someone else poorer.

What happened to turn the mood of the country so far from our historic celebration of economic success?

Two important changes in objective conditions have contributed to this change in mood. One is the rise of collusive capitalism. Part of that phenomenon involves crony capitalism, whereby the people on top take care of each other at shareholder expense (search on "golden parachutes").

Another change in objective conditions has been the emergence of great fortunes made quickly in the financial markets. It has always been easy for Americans to applaud people who get rich by creating products and services that people want to buy. That is why Thomas Edison and Henry Ford were American heroes a century ago, and Steve Jobs was one when he died last year.

When great wealth is generated instead by making smart buy and sell decisions in the markets, it smacks of inside knowledge, arcane financial instruments, opportunities that aren't accessible to ordinary people, and hocus-pocus. The good that these rich people have done in the process of getting rich is obscure. The benefits of more efficient allocation of capital are huge, but they are really, really hard to explain simply and persuasively. It looks to a large proportion of the public as if we've got some fabulously wealthy people who haven't done anything to deserve their wealth.

The objective changes in capitalism as it is practiced plausibly account for much of the hostility toward capitalism.

Monday, July 30, 2012

cia manages the drug trade...,

aljazeera | The US Central Intelligence Agency and other international security forces "don't fight drug traffickers", a spokesman for the Chihuahua state government in northern Mexico has told Al Jazeera, instead "they try to manage the drug trade".

Allegations about official complicity in the drug business are nothing new when they come from activists, professors, campaigners or even former officials. For an official spokesman for the authorities in one of Mexico's most violent states - one which directly borders Texas - to go on the record with such accusations, however, is unique.

"It's like pest control companies, they only control," Guillermo Terrazas Villanueva, the Chihuahua spokesman, told Al Jazeera last month at his office in Juarez. "If you finish off the pests, you are out of a job. If they finish the drug business, they finish their jobs."

Villanueva is not a high-ranking official, and his views do not represent Mexico's foreign policy establishment. Other, more senior officials in Chihuahua State, including the mayor of Juarez, dismissed the claims as "baloney".

"I think the CIA and DEA [US Drug Enforcement Agency] are on the same side as us in fighting drug gangs," Hector Murguia, the mayor of Juarez, told Al Jazeera during an interview inside his SUV. "We have excellent collaboration with the US."

Under the Merida Initiative, the US Congress has approved more than $1.4bn in drug war aid for Mexico, providing attack helicopters, weapons and training for police and judges.

More than 55,000 people have died in drug related violence in Mexico since December 2006. Privately, residents and officials across Mexico's political spectrum often blame the lethal cocktail of US drug consumption and the flow of high-powered weapons smuggled south of the border for causing much of the carnage. Fist tap Arnach.

forget LIBORgate - oil market manipulation is much worse...,

zerohedge | The global community all of a sudden seems preoccupied with market manipulation, even though the authorities knew it was a problem for over five years with Libor rate fixing. It is high time authorities looked at the crude oil market, which has been manipulated for the last decade; all the sophisticated participants know it is rigged, or artificially higher than the fundamentals of the economy dictate. Consumers are paying an easy $35 per barrel over what they would otherwise dole out for a barrel of oil if fund managers didn't use the benchmark futures contracts as their own personal ATMs.

Just a month ago WTI crude was $78 a barrel; today it is $93. Do you think the fundamentals changed one bit to merit this price swing? Nope! Supply levels are at record highs around the world. Is it Iran? Please!! It is all about the money flows; nobody takes delivery anymore. Assets have become one big correlated risk trade. Risk On, Risk Off. If the Dow is up a hundred, you can bet crude is up at least a dollar! It has nothing to do with fundamentals, inventory levels, supply disruptions, etc. It is all about fund flows.

How this affects the average Joe: if Wall Street is having a good day, i.e., fund flows are going in, then the average Joe is having a bad day and paying more for gas. Yes, it is that simple. A good day for Wall Street is a bad day for consumers at the pump these days, as capital flows into one big asset trade: Risk On!

It should be separate, in that equities respond to stock valuations and energy responds to the market conditions of supply and demand. But that isn't the case in the investing world today; it is all about capital flows in and out of assets. The economy could be doing really poorly, oil inventories can be extremely high, the economic data very bleak, but oil will go up and consumers will pay more at the pump just because some fund manager pours capital into a futures contract. The fund managers' goals are in direct opposition to those of the consumers who actually use the product. Fund flows, and not supply and demand, ultimately carry the day in the energy markets, and that needs to change!

The key is equities: crude oil (both Brent and WTI) is essentially an equity for fund managers to trade in and out of, and they make a fortune in these instruments. When I refer to fund managers, this includes hedge funds, oil majors, pension funds, investment banks, etc. This is part of the reason that the price of oil can vary so much within a three-month span. WTI can literally be $110 one month and $80 the next because of pure fund flows going in or coming out of the futures contracts.

The volatility really is where they make their money; they have deep pockets, and they make a fortune moving crude oil around like a puppet on a string. If you think in terms of each dollar price move in the commodity being equal to $1,000 per futures contract, and the size that these players employ on a monthly and quarterly basis, you start to see the value of buying thousands and thousands of futures contracts and capitalizing on these huge moves in the commodity.
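To make that arithmetic concrete: a WTI futures contract covers 1,000 barrels, so each $1-per-barrel move is indeed worth $1,000 per contract. A minimal sketch of the payoff math, with a purely hypothetical position size (the article names none):

```python
# Sketch of futures position P&L under the article's premise that a
# $1/barrel move equals $1,000 per contract (a WTI contract covers
# 1,000 barrels). The 10,000-contract position is a made-up illustration.

CONTRACT_SIZE_BBL = 1_000  # barrels per WTI futures contract

def position_pnl(contracts: int, entry_price: float, exit_price: float) -> float:
    """Dollar profit or loss on a long futures position."""
    return contracts * CONTRACT_SIZE_BBL * (exit_price - entry_price)

# The article's one-month move: WTI from $78 to $93 a barrel.
print(position_pnl(10_000, 78.0, 93.0))  # 150000000.0, i.e. $150m
```

On that assumption, riding the $78-to-$93 swing with 10,000 contracts nets $150m, which is why flows of this size can swamp the fundamentals.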

Sunday, July 29, 2012

does the insula help elite athletes better anticipate their body's upcoming feelings, improving their physical reactions?



scientificamerican | All elite athletes train hard, possess great skills and stay mentally sharp during competition. But what separates a gold medalist from an equally dedicated athlete who comes in 10th place? A small structure deep in the brain may give winners an extra edge.

Recent studies indicate that the brain's insular cortex may help a sprinter drive his body forward just a little more efficiently than his competitors. This region may prepare a boxer to better fend off a punch his opponent is beginning to throw as well as assist a diver as she calculates her spinning body's position so she hits the water with barely a splash. The insula, as it is commonly called, may help a marksman retain a sharp focus on the bull's-eye as his finger pulls back on the trigger and help a basketball player at the free-throw line block out the distracting screams and arm-waving of fans seated behind the backboard.

The insula does all this by anticipating an athlete's future feelings, according to a new theory. Researchers at the OptiBrain Center, a consortium based at the University of California, San Diego, and the Naval Health Research Center, suggest that an athlete possesses a hyper-attuned insula that can generate strikingly accurate predictions of how the body will feel in the next moment. That model of the body's future condition instructs other brain areas to initiate actions that are more tailored to coming demands than those of also-rans and couch potatoes.

This heightened awareness could allow Olympians to activate their muscles more resourcefully to swim faster, run farther and leap higher than mere mortals. In experiments published in 2012, brain scans of elite athletes appeared to differ most dramatically from ordinary subjects in the functioning of their insulas. Emerging evidence now also suggests that this brain area can be trained using a meditation technique called mindfulness—good news for Olympians and weekend warriors alike.

genomic study of african hunter-gatherers elucidates human variation and ancient interbreeding

sciencedaily | Human diversity in Africa is greater than any place else on Earth. Differing food sources, geographies, diseases and climates offered many targets for natural selection to exert powerful forces on Africans to change and adapt to their local environments. The individuals who adapted best were the most likely to reproduce and pass on their genomes to the generations who followed.

That history of inheritance is written in the DNA of modern Africans, but it takes some investigative work to interpret. In a report to be featured on the cover of the Aug. 3 issue of the journal Cell, University of Pennsylvania geneticists and their colleagues analyze the fully sequenced genomes of 15 Africans belonging to three different hunter-gatherer groups and decipher some of what these genetic codes have to say about human diversity and evolution.

The study, led by Sarah Tishkoff, a Penn Integrates Knowledge Professor with appointments in the School of Arts and Sciences' biology department and the Perelman School of Medicine's genetics department, tells several stories.

It identifies several million previously unknown genetic mutations in humans. It finds evidence that the direct ancestors of modern humans may have interbred with members of an unknown ancestral group of hominins. It suggests that different groups evolved distinctly in order to reap nutrition from local foods and defend against infectious disease. And it identifies new candidate genes that likely play a major role in making Pygmies short in stature.

"Our analysis sheds light on human evolution, because the individuals we sampled are descended from groups that may have been ancestral to all other modern humans," Tishkoff said. "A message we're seeing is that even though all the individuals we sampled are hunter-gatherers, natural selection has acted differently in these different groups."

Joining Tishkoff in the work from Penn was first author Joseph Lachance as well as Clara Elbers, Bart Ferwerda and Timothy Rebbeck. Their collaborators include Benjamin Vernot, Wenqing Fu and Joshua Akey of the University of Washington; Alain Froment of France's Musée de L'Homme; Jean-Marie Bodo of Cameroon's Ministère de la Recherche Scientifique et de l'Innovation; Godfrey Lema and Thomas B. Nyambo of Tanzania's Muhimbili University College of Health Sciences; and Kun Zhang of the University of California at San Diego.

The researchers sequenced the genomes of five men from each of three hunter-gatherer groups: the Hadza and the Sandawe of Tanzania and the Western Pygmies of Cameroon. The three differ greatly from one another in appearance, in language, in the environments they occupy and in cultural practices, though the Hadza and the Sandawe live just 200 kilometers apart.

"We purposefully picked three of the most diverse hunter-gatherer groups," Tishkoff said, "because they have not been very well represented in other genome sequencing projects, which tend to focus on majority populations in Africa. This is a unique and important dataset."

Saturday, July 28, 2012

since the last famine, have as many girls been butchered in ethiopia as jewish women who died in the holocaust?

independent | Last Thursday week, with famine approaching yet again, I wondered about the wisdom of forking out yet more aid to Ethiopia. Since the great famine of the mid-1980s, Ethiopia's population has soared from 33.5 million to 78 million.

Now, I do not write civil service reports for the United Nations: I write a newspaper column, and I was deliberately strong in my use of language -- as indeed I had been when writing reports from Ethiopia at the height of that terrible Famine.

I was sure that my column would arouse some hostility: my concerns were intensified when I saw the headline: "Africa has given the world nothing but AIDS." Which was not quite what I said -- the missing "almost" goes a long way; and anyway, my article was about aid, not AIDS.

Since dear old Ireland can often enough resemble Lynch Mob Central on PC issues, I braced myself for the worst: and sure enough, in poured the emails. Three hundred on the first day, soon reaching over 800: but, amazingly, 90pc+ were in my support, and mostly from baffled, decent and worried people. The minority who attacked me were risibly predictable, expressing themselves with a vindictive and uninquiring moral superiority. (Why do so many of those who purport to love mankind actually hate people so?)

We did more in Ethiopia a quarter of a century ago than just rescue children from terrible death through starvation: we also saved an evil, misogynistic and dysfunctional social system. Presuming that half the existing population (say, 17 million) of the mid-1980s is now dead through non-famine causes, the total added population from that time is some 60 million, around half of them female.

That is, Ethiopia has effectively gained the entire population of the United Kingdom since the famine. But at least 80pc of Ethiopian girls are circumcised, meaning that no less than 24 million girls suffered this fate, usually without anaesthetics or antiseptic. The UN estimates that 12pc of girls die through septicaemia, spinal convulsions, trauma and blood-loss after circumcision, which probably means that around three million little Ethiopian girls have been butchered since the famine -- roughly the same as the number of Jewish women who died in the Holocaust.
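Taking the column's own figures at face value (they are its estimates, not verified here), its arithmetic chain runs as follows:

```latex
% The column's chain of estimates, stated explicitly
\begin{align*}
\text{girls added since the famine} &\approx \tfrac{1}{2} \times 60 \text{ million} = 30 \text{ million}\\
\text{circumcised, at 80\%} &\approx 0.80 \times 30 \text{ million} = 24 \text{ million}\\
\text{deaths, at 12\% of those} &\approx 0.12 \times 24 \text{ million} \approx 2.9 \text{ million}
\end{align*}
```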

2 UC Davis neurosurgeons accused of experimental surgery are banned from human research


sacbee | A prominent UC Davis neurosurgeon was banned from performing medical research on humans after he and an underling were accused of experimenting on dying brain cancer patients without university permission, The Bee has learned.

Dr. J. Paul Muizelaar, who earns more than $800,000 a year as chairman of the department of neurological surgery, was ordered last fall to "immediately cease and desist" from any research involving human subjects, according to documents obtained by The Bee.

Also banned was the colleague, Dr. Rudolph J. Schrot, an assistant professor and neurosurgeon who has worked under Muizelaar the past 13 years.

The university has admitted to the federal government that the surgeons' actions amounted to "serious and continuing noncompliance" with federal regulations.

Documents show the surgeons got the consent of three terminally ill patients with malignant brain tumors to introduce bacteria into their open head wounds, under the theory that postoperative infections might prolong their lives. Two of the patients developed sepsis and died, the university later determined.

The actions – described by two prominent bioethicists as "astonishing," and a "major penalty" for the school – threaten both the doctors' professional careers and the university's reputation and federal-funding status.

"This is really distressing" said Patricia Backlar, an Oregon bioethicist who served on former President Bill Clinton's national bioethics advisory commission.

"UC Davis is a very respectable school, but even the best places have trouble," Backlar said. " … These men have put that school in jeopardy."

Research on both humans and animals is tightly controlled in the United States and, according to federal regulations and university policy, must undergo a rigorous approval process to ensure that subjects are protected.

did a female burro commit suicide?

psychologytoday | Every now and again someone asks me if nonhuman animals (animals) commit suicide. This past weekend I gave two lectures as part of the Collegiate Peaks Forum Series in Buena Vista, Colorado and after I talked about grief and mourning in a wide variety of species (see also) someone in the audience asked me this question. My answer was that there are some good observations of animals seemingly taking their own lives in situations when one might expect a human to take his or her own life. For example, it's been suggested that whales intentionally beach themselves to end their lives, highly stressed elephants step on their trunks or jump over a cliff to end prolonged pain (see also), and cats stressed out by earthquakes kill themselves.

Opinions vary from "yes they do" to "perhaps they do" to "no they don't" (see also). Some say animals don't have the same concept of death that we have and don't know that their lives will end when they do something to stop breathing.

Burro suicide?

After one of my talks in Buena Vista one of the women in the audience, Cathy Manning, told me a very simple but compelling story about a burro who seemed to kill herself. Cathy knew a female burro who gave birth to a baby with a harelip. The infant couldn't be revived and Cathy watched the mother walk into a lake and drown. It's known that various equines including horses and donkeys grieve the loss of others (see also), so I didn't find this story to be inconsistent with what is known about these highly emotional beings.

I think it's too early to make any definite statements about whether animals commit suicide, but this does not mean they don't grieve and mourn the loss of family and friends. What they're thinking when they're deeply saddened by another animal's death isn't clear, but it's obvious that a wide variety of animals suffer the loss of family and friends. Cathy's story made me rethink the question of whether animals commit suicide, and I hope this brief story opens the door for some good discussion about this intriguing possibility. As some of my colleagues and I have stressed, we must pay attention to stories and hope they will stimulate more research in a given area.

Friday, July 27, 2012

supernatural: meetings with the ancient teachers of mankind



grahamhancock | My intention at the outset was to write a book exploring the mystery of human origins. There are many gaps in the fossil record between about 7 million years ago (the date of our supposed last common ancestor with chimpanzees) and the emergence of the first civilisations recognised by historians around 5000 years ago. My thought was that if I probed these gaps diligently enough something might emerge – some insight, some scrap of previously neglected information – that might shed light on the great puzzles of the human predicament. Why, alone amongst animal species, have we developed culture and religion, beliefs in life after death, beliefs in non-physical beings such as spirits, demons and angels, elaborate mythologies, the ability to create and to appreciate art, the ability to use and manipulate symbols, consciousness of ourselves and of our place in the scheme of things? Did these abstract, even “spiritual”, qualities develop slowly, over millions of years, or were they switched on suddenly, like lights in a darkened room?

To cut a long story short, what I discovered is that during most of the first 7 million years of human evolution there is no evidence at all for the existence of symbolic abilities amongst our ancestors. No matter how intensively we probe what is known about the fossil record, or speculate about what is not yet known about it, all that we see evidence for throughout this period is a dull and stultifying copying and recopying of essentially the same patterns of behaviour and essentially the same “kits” of crude stone tools, without change or innovation, for periods of hundreds of thousands, even millions of years. When a change is introduced (in tool shape for example) it then sets a new standard to be copied and recopied without innovation for a further immense period until the next change is finally adopted. In the process, glacially slow, we also see the gradual development of human anatomy in the direction of the modern form: the brain-pan enlarges, brow ridges reduce in size, overall anatomy becomes more gracile – and so on and so forth.

By 196,000 years ago, and on some accounts considerably earlier, humans had achieved “full anatomical modernity”. This means that they were in every way physically indistinguishable from the people of today and, crucially, that they possessed the same large, complex brains as we do. The most striking mystery, however, is that their behaviour continued to lag behind their acquisition of modern neurology and appearance. They showed no sign of possessing a culture, or supernatural beliefs, or self-consciousness, or any interest in symbols. Indeed there was nothing about them that we could instantly identify with “us”. Dr Frank Brown, whose discovery of 196,000-year-old anatomically-modern human skeletons in Ethiopia was published in Nature on 17 February 2005, points out that they are 35,000 years older than the previous “oldest” modern human remains known to archaeologists:

“This is significant because the cultural aspects of humanity in most cases appear much later in the record, which would mean 150,000 years of Homo sapiens without cultural stuff…”

Brown’s colleague, John Fleagle of Stony Brook University in New York State, also comments on the same problem:

“There is a huge debate regarding the first appearance of modern aspects of behaviour… As modern human anatomy is documented at earlier and earlier sites, it becomes evident that there was a great time gap between the appearance of the modern skeleton and ‘modern behaviour’.”

For Ian Tattersall of the American Museum of Natural History the problem posed by this gap – and what happened to our ancestors during it – is “the question of questions in palaeoanthropology”. His colleague Professor David Lewis-Williams of the Rock Art Research Institute at South Africa’s Witwatersrand University describes the same problem as “the greatest riddle of archaeology – how we became human and in the process began to make art and to practice what we call religion.”

I quickly realized that this was the mystery, and the period, I wanted to investigate. Not that endless, unimaginative cultural desert from 7 million years ago down to just 40,000 years ago when our ancestors hobbled slowly through their long and boring apprenticeship, but the period of brilliant and burning symbolic light that followed soon afterwards when the first of the great cave art of southwest Europe appeared – already perfect and fully formed – between 35,000 and 30,000 years ago.

A most remarkable theory exists to explain the special characteristics of these amazing and haunting early works of art, and to explain why identical characteristics are also found in prehistoric art from many other parts of the world and in art produced by the shamans of surviving tribal cultures today. The theory was originally elaborated by Professor David Lewis-Williams, and is now supported by a majority of archaeologists and anthropologists. In brief, it proposes that the reason for the similarities linking all these different systems of art, produced by different, unrelated cultures at different and widely-separated periods of history, is that in every case the shaman-artists responsible for them had previously experienced altered states of consciousness in which they had seen vivid hallucinations, and in every case their endeavour in making the art was to memorialise on the walls of rock shelters and caves the ephemeral images that they had seen in their visions. According to this theory the different bodies of art have so many similarities because we all share the same neurology, and thus share many of the same experiences and visions in altered states of consciousness.

artist puts together collection of self-portraits while on drugs

boombotix | Bryan Lewis Saunders is a performance artist / poet who has been drawing self-portraits every day since 1995.

Literally.

He has over 8,000 of these self-portraits at home. As you could imagine, it can get a little boring doing the same thing over and over for a while.

I mean, he’s not a terribly interesting looking guy: balding, wears glasses, has a beard. Nothing too special (no offense Bryan).

To spice things up, Saunders decided to do a series of self-portraits while on various forms of drugs. And not that easy stuff. We’re talking everything from mushrooms to crystal meth, Adderall, and bottles of cough syrup.

This guy didn’t just do the project: he went hard at it.

When you’re looking at each of the portraits, note what the drugs are doing to his motor skills and how he tries to compensate while also trying to communicate what he’s experiencing via the drug. Fist tap Dale.

Thursday, July 26, 2012

social identification, not obedience, motivates unspeakable acts

sciencedaily | What makes soldiers abuse prisoners? How could Nazi officials condemn thousands of Jews to gas chamber deaths? What's going on when underlings help cover up a financial swindle? For years, researchers have tried to identify the factors that drive people to commit cruel and brutal acts and perhaps no one has contributed more to this knowledge than psychological scientist Stanley Milgram.

Just over 50 years ago, Milgram embarked on what were to become some of the most famous studies in psychology. In these studies, which ostensibly examined the effects of punishment on learning, participants were assigned the role of "teacher" and were required to administer shocks to a "learner" that increased in intensity each time the learner gave an incorrect answer. As Milgram famously found, participants were willing to deliver supposedly lethal shocks to a stranger, just because they were asked to do so.

Researchers have offered many possible explanations for the participants' behavior and the take-home conclusion that seems to have emerged is that people cannot help but obey the orders of those in authority, even when those orders go to the extremes.

This obedience explanation, however, fails to account for a very important aspect of the studies: why, and under what conditions, people did not obey the experimenter.

In a new article published in Perspectives on Psychological Science, a journal of the Association for Psychological Science, researchers Stephen Reicher of the University of St. Andrews and Alexander Haslam and Joanne Smith of the University of Exeter propose a new way of looking at Milgram's findings.

The researchers hypothesized that, rather than obedience to authority, the participants' behavior might be better explained by their patterns of social identification. They surmised that conditions that encouraged identification with the experimenter (and, by extension, the scientific community) led participants to follow the experimenters' orders, while conditions that encouraged identification with the learner (and the general community) led participants to defy the experimenters' orders.

As the researchers explain, this suggests that participants' willingness to engage in destructive behavior is "a reflection not of simple obedience, but of active identification with the experimenter and his mission."

closer to a food crisis than most people realise...,

guardian | In the early spring this year, US farmers were on their way to planting some 96m acres in corn, the most in 75 years. A warm early spring got the crop off to a great start. Analysts were predicting the largest corn harvest on record.

The United States is the leading producer and exporter of corn, the world's feedgrain. At home, corn accounts for four-fifths of the US grain harvest. Internationally, the US corn crop exceeds China's rice and wheat harvests combined. Among the big three grains – corn, wheat, and rice – corn is now the leader, with production well above that of wheat and nearly double that of rice.

The corn plant is as sensitive as it is productive. Thirsty and fast-growing, it is vulnerable to both extreme heat and drought. At elevated temperatures, the corn plant, which is normally so productive, goes into thermal shock.

As spring turned into summer, the thermometer began to rise across the corn belt. In St Louis, Missouri, in the southern corn belt, the temperature in late June and early July climbed to 100F or higher 10 days in a row. For the past several weeks, the corn belt has been blanketed with dehydrating heat.

Weekly drought maps published by the University of Nebraska show the drought-stricken area spreading across more and more of the country until, by mid-July, it engulfed virtually the entire corn belt. Soil moisture readings in the corn belt are now among the lowest ever recorded.

While temperature, rainfall, and drought serve as indirect indicators of crop growing conditions, each week the US Department of Agriculture releases a report on the actual state of the corn crop. This year the early reports were promising. On 21 May, 77% of the US corn crop was rated as good to excellent. The following week the share of the crop in this category dropped to 72%. Over the next eight weeks, it dropped to 26%, one of the lowest ratings on record. The other 74% is rated very poor to fair. And the crop is still deteriorating.

Over a span of weeks, we have seen how the more extreme weather events that come with climate change can affect food security. Since the beginning of June, corn prices have increased by nearly one half, reaching an all-time high on 19 July.

Although the world was hoping for a good US harvest to replenish dangerously low grain stocks, this is no longer on the cards. World carryover stocks of grain will fall further at the end of this crop year, making the food situation even more precarious. Food prices, already elevated, will follow the price of corn upward, quite possibly to record highs.

Not only is the current food situation deteriorating, but so is the global food system itself.

drought and the economy...,



forbes | The effects of the vast drought afflicting America’s farm belt are rippling across the economy. Major companies apparently feeling the heat from rising crop prices include McDonald’s, Smithfield Foods and Archer Daniels Midland, which processes agricultural commodities.

More than half of the nation’s pasture and rangeland is now plagued by drought – the largest natural disaster area in U.S. history. And with corn prices soaring as crops wither, other sectors are nervously watching the weather forecasts and assessing potential impacts on their business.

The Climate Connection

But perhaps the most sobering implication of this agricultural crisis is what it heralds for the long-term health of our economy.

Unlike the reaction to the recent searing heat wave, the mainstream media has largely ignored a possible climate connection to America’s worst drought since 1956. While this particular drought could turn out to be due to several factors (such as a second winter of La Niña), we know that in a warmer world the afflicted region will increasingly look as it does today.

The U.S. Global Change Research Program, for example, has projected more frequent and severe droughts across much of the United States. Its forecast for the Great Plains region, 70 percent of which is farmland, is dire: increasing temperatures and evaporation rates and more sustained drought, further stressing already overstrained water resources.

Wednesday, July 25, 2012

strange fruit: biology interpreted in the service of dominant social interests...,

kenanmalik | How did society become racialized? This is the most complex of the questions Malik tackles.

For most of human history, the concept of race simply did not exist, at least in the way we think of race today. Malik turns to Ivan Hannaford and his exhaustive study Race: The History of an Idea in the West to demonstrate this historical contrast. While the Greeks classified peoples of the world by skin color, they rejected a racial worldview in favor of a political and civic one, Hannaford asserts. For the Greeks, the key social distinction was between citizens and 'barbarians'. Even in the Middle Ages, Hannaford emphasizes, the main issue with regard to strangers was, 'Do they possess a rule of law', 'Do they act like us?' What defined a person was his or her relationship to law and to faith, not biology or history.

Hannaford’s conclusion is not cited by Malik, but is worth noting. He emphasizes that racial and political thought are two opposed approaches to social organization. He goes further to characterize political thinking as 'inherently and logically resistant to the idea of race as we understand it.' He states that race is 'inimical to Western civilization in the strict sense of the word', and that ethnicity is an idea introduced in modern times that gained importance only in proportion to the decline in political thought (emphasis in the original). Both writers concur that the word 'race' may have been in use for a long time, but its modern meaning has not. As man’s social organization has evolved, the imputed content of 'race' has taken on very different significance, a point often not understood.

Like Hannaford, Malik provides a survey of the development of racial categorization, tracing the role of various schools of thought from romanticism to positivism and postmodernism, as well as a whole range of thinkers from the German philosopher Johann Gottfried von Herder through the founder of cultural anthropology, Franz Boas.

However, Malik takes strong issue with Hannaford, and many postmodernists, when they blame both the Enlightenment in general and its adherents among more modern scientists such as Carl Linnaeus and Charles Darwin for creating and perpetuating racism through taxonomy. Malik does not dispute the rise of such trends as 'scientific racism', as developed by Johann Friedrich Blumenbach, but he emphasizes that the Enlightenment’s attitude toward human difference was permeated by the revolutionary ideas of social equality and the perfectibility of man. The predominant view of that revolutionary period was that human variation, physical or cultural, represented differences not in kind but in degree.

Darwin and the majority of the scientists of his age embodied this spirit. In fact, the fundamental philosophical orientation of racial theory - which assumes the fixity of characteristics - ran entirely counter to natural selection, as Malik notes. For the misnamed 'social Darwinists', struggle eliminated the impure specimens of the race to perpetuate the ideal type. Darwin, on the other hand, dismissed the idea of an ideal type of a species as nonsense.

However, when the Enlightenment’s ideals of liberty, equality and fraternity were not realized following the French Revolution, when social inequality continued and worsened despite the developments of science, the tendency developed to explain poverty and other social ills in racial terms, as though they were somehow natural.

As for the common theory that racism, at least in the New World, evolved directly from slavery, Malik notes, 'As a biological theory, 19th century racial thought was shaped less by the attempts of a reactionary slave-owning class to justify privileges than by the growing pessimism among liberals about the possibilities of equality and social progress.' C. Vann Woodward argues similarly in his groundbreaking book The Strange Career of Jim Crow in which he points to the loss of support for Radical Reconstruction by Northern liberals as a decisive factor in the rise of Jim Crow segregation.

Malik also states that in Victorian England 'race' was considered a description of social distinctions rather than skin color. With social degradation developing alongside intensified exploitation in British industry, the existence of classes began to be interpreted as hereditary.

Malik concludes that race did not cause inequality, but that the persistence and growth of inequality provided the basis for the growth of racial thinking. This profound point, well worth emphasizing, is at the center of his prior volume, The Meaning of Race. This truth needs to be firmly grounded in historical analysis, and both Malik’s and Hannaford’s summaries tend to give heavy weight to the views of a long series of intellectuals without fully connecting this history of ideas with the social relations and the class struggle. At times, this line of argument conflates the naïve fears of those at the bottom of society with deliberate state policy decisions at the top.

In The Meaning of Race, Malik says that the preoccupation with race at the turn of the 20th century reflected the concern for social stability, the fear of working class unrest, the growth of national rivalries and the emergence of imperialism. Unfortunately, he does not return to this point in Strange Fruit.

While there are many complex intellectual strands that influence the rise of ideas, at bottom they reflect the movement of social forces. At critical historical junctures, certain ideas are “selected,” or found to express the interests of social forces, particularly those of the dominant class. 'The ruling ideas of each age have ever been the ideas of its ruling class', said Karl Marx in the Communist Manifesto.

As a book emphasizing the implications of racial thought for science, Strange Fruit lays less emphasis on this relationship between the rise of ideologies and class forces than Malik’s prior work. Nevertheless, the growth of racism historically did not reflect the state of biology. It was the reverse - biology was often interpreted in the service of prevailing social interests, a point he himself refers to.

subconsciously we echo the speech of superiors...,



physorg | Want to know who holds the power? Just listen carefully, preferably with a little help from a computer. Research at Cornell shows that people speaking to someone of perceived superior status often unconsciously echo the linguistic style of that person. The effect is usually not noticed by humans but shows up in a computer analysis of large amounts of text. The linguistic clues were found in discussions in which the outcome matters to the speaker.

The rule seems to apply across many domains of life. The researchers found it in Internet discussion and in arguments before the Supreme Court. In the latter case, the analysis also offers clues to which justices may favor one side or the other in a case.

Graduate student and lead author Cristian Danescu-Niculescu-Mizil presented the research at the World Wide Web Conference April 16-20 in Lyon, France. Co-authors are Jon Kleinberg '93, the Tisch University Professor of computer science; Lillian Lee '93, professor of computer science; and Yahoo! researcher Bo Pang, Ph.D. '06.

While studies of this kind commonly involve people in small groups, the researchers were able to find subtle effects because they worked with very large collections of text -- 240,000 conversations among Wikipedia editors and 50,389 verbal exchanges from 204 cases argued before the Supreme Court.

In conversation with someone more powerful, the analysis shows, a speaker tends to coordinate with the other person's use of "function words": articles, auxiliary verbs, conjunctions, frequently used adverbs ("very," "just," "often"), pronouns, prepositions and quantifiers ("all," "some," "many"). This means that the effects are independent of the topic and would show up even in text that has been censored to hide or disguise the subject matter, the researchers say.
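As a rough illustration of the idea (a minimal sketch, not the authors' published measure; the word lists and the simple probability-difference score below are assumptions for illustration):

```python
# Toy estimate of linguistic coordination on function words: how much more
# likely a reply is to use a function-word category when the preceding
# utterance used it, versus the reply's baseline rate. The word lists are
# tiny illustrative stand-ins for a real function-word lexicon.

FUNCTION_WORDS = {
    "articles": {"a", "an", "the"},
    "quantifiers": {"all", "some", "many"},
    "pronouns": {"i", "you", "he", "she", "we", "they"},
}

def uses(category: str, utterance: str) -> bool:
    return any(tok in FUNCTION_WORDS[category] for tok in utterance.lower().split())

def coordination(category: str, exchanges: list[tuple[str, str]]) -> float:
    """exchanges: (prompt, reply) pairs from one speaker replying to another.
    Positive scores mean the replier echoes the prompter's usage."""
    triggered = [uses(category, reply) for prompt, reply in exchanges if uses(category, prompt)]
    baseline = [uses(category, reply) for _, reply in exchanges]
    if not triggered or not baseline:
        return 0.0
    return sum(triggered) / len(triggered) - sum(baseline) / len(baseline)
```

Because a score like this looks only at closed-class words, it is indifferent to topic, which is why the same measurement can be run on Wikipedia talk pages and Supreme Court transcripts alike.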

As a further test, the researchers trained a computer to measure language coordination on Wikipedia and then fed it text from Supreme Court arguments and vice versa. They got the same results either way, confirming that the effect is independent of the situation.

On Wikipedia talk pages, where writers and editors discuss their articles, status is clearly defined, with some editors identified as "admins," who have more authority over what goes into an article. The researchers found that when editors were promoted to an admin position, others coordinated language to them more after the promotion. In turn, the newly promoted administrators coordinated their language less with the rank and file, but usually only after about two months of adjustment to their new status.

In the Supreme Court, as expected, lawyers coordinate their speech to justices. But, say the researchers, there is another factor besides formal status that confers power: dependence. Speakers coordinate their speech with those who can do something for them. Lawyers generally go before the court with an idea of which justices will be opposed to their cause, and the analysis showed that lawyers coordinated their speech more with justices who opposed them (as confirmed by the final vote). At the same time, opposing justices coordinated less with those lawyers.

As a sidelight, the analysis showed that female lawyers coordinated their speech with justices more than male lawyers, and justices coordinated less with female lawyers. The researchers caution that this result may be influenced by other gender differences in communication style.

The researchers see applications of this type of analysis in studying the sociology of online groups, which previously has focused on structural features such as who talks to whom. "It is exciting to contemplate extending the range of social properties that can be analyzed via text," they concluded.

texas gop opposes teaching of critical thinking skills....,



WaPo | I thought I’d heard enough about the Texas Republican Party’s platform that rejects the teaching of critical thinking skills until I heard Stephen Colbert’s take on it.

I wrote about this recently here, quoting from the platform:

Knowledge-Based Education – We oppose the teaching of Higher Order Thinking Skills (HOTS) (values clarification), critical thinking skills and similar programs that are simply a relabeling of Outcome-Based Education (OBE) (mastery learning) which focus on behavior modification and have the purpose of challenging the student’s fixed beliefs and undermining parental authority.

After this was ridiculed, Texas GOP Communications Director Chris Elam told TPM.com that it was a mistake and that opposition to “critical thinking” wasn’t supposed to be part of the platform. Since a party convention approved the platform, it can’t just be dropped, he said. Sure thing.

Colbert returned to “The Colbert Report” from vacation this week and couldn’t resist taking a hilarious shot at this as part of a piece that is described on the show’s website like this: “The minds of young people are being poisoned by knowledge, but thankfully Texas is the Large Hadron Collider of denying science.”

Tuesday, July 24, 2012

BD's in the hizzle f'shizzle m'nizzles....,



about that PRR....,

ISAR | According to Philippe Rushton, the "equalitarian fiction," a "scientific hoax" that races are genetically equal in cognitive ability, underlies the "politically correct" objections to his research on racial differences. He maintains there is a taboo against race unequaled by the Inquisition. I show that while Rushton has been publicly harassed, he has had continuous opportunities to present his findings in diverse, widely available, respectable journals and no general suppression within academic psychology is evident. Similarly, Henry Garrett and his associates in the IAAEE, dedicated to preserving segregation and preventing "race suicide," disseminated their ideas widely, although Garrett complained of the "equalitarian fiction" in 1961. Examination of the intertwined history of Mankind Quarterly, German Rassenhygiene, far right politics, Henry Garrett, and Roger Pearson suggests that some cries of "political correctness" must be viewed with great caution.

Rushton's discussion of the "equalitarian dogma" suggests that brave, politically neutral scientists resisted the attempts of powerful left-wing forces to control their work. However, when the history of postwar racial difference research is examined, the picture is one of a relatively powerful set of well-funded people, most of whom believed in the basic tenets of early 20th century eugenics (7) and were strongly opposed to both integration and intermarriage, fearing "race suicide." They used every scientific and public communication channel available to convince their colleagues and the public of their position. Far from suffering academic censorship, they had access to prestigious scientific journals and meetings, gave court and government testimony, and distributed pamphlets. Their "controversial" work received attention in every textbook. All retained their tenured positions, sometimes funded by the taxes of the very people they declared to be, on average, biologically inferior. They suffered protests and attacks in the popular press, and some deplorable assaults by protesters, with no serious injuries. Their research was often subjected to special scrutiny, and some were asked not to accept money from the Pioneer Fund. None were expelled from the American Psychological Association. Comparison of these events to the Inquisition, Stalin, and Hitler is inappropriate, to say the least.

The continued criticism and concern over Rushton's work naturally flow from the view that his theory is one of racial superiority, albeit one in which Asian groups come out ahead of others. But Rushton (1996) explicitly disavows the terms "inferior" and "superior." The readers must judge whether Table 1, in which blacks are said to have, on average, smaller brains, lower intelligence, lower cultural achievements, higher aggressiveness, lower law-abidingness, lower marital stability and less sexual restraint than whites, and the differences are attributed partially to heredity, implies that they are "inferior." Readers must also judge whether Rushton's (e.g., 1995a) r vs. K theory in which the climate of Africa is said to have selected for high birth rates and low parental care suggests the "inferiority" of blacks. No one can doubt the uses that will be made of Rushton's research by such groups as David Duke's National Association for the Advancement of White People, whose newsletter advertised IAAEE's publications and Mankind Quarterly, alongside the Protocols of the Learned Elders of Zion (see Tucker 1994, for an extensive discussion of the use of racial research by the far right).

Rushton explicitly disavows any policy implications of his research. In this sense, he cannot be considered a eugenicist, since eugenics always involved social policy. However, Rushton simultaneously argues in this journal that "if all people were treated the same, most average race differences would not disappear" (p. 3), a statement which in no way follows from his research and might be thought to carry policy implications for welfare, compensatory education, and employment equity. In contrast to Rushton's cautious approach, Henry Garrett, Roger Pearson, T. Travis Osborne and especially Freiherr von Verschuer, quoted at the outset of this paper, embraced and campaigned for the implementation of policy based on race difference research.

Philippe Rushton cannot be held responsible for the work of these men, and shares no "guilt by association." But those who maintain that a scientific theory cannot incite people to murder should review the history of scientific racism, the history of German Rassenhygiene, and the contemporary use of racial theory in Bosnia (see Kohn, 1995). Those who maintain that the data of racial research are "politically neutral" and "value-free" should understand the political commitments of those who conducted and promoted much of this research. Those who wish to promote open, honest discussion should contemplate the meaning of a book on worldwide race differences (Rushton, 1995a) in which "apartheid," "poverty," "colonialism," "slavery," and "segregation" do not appear in the index. Only then can an informed judgment about "political correctness" and racial research be made.

on human self-domestication, psychiatry, and eugenics



peh-med | Abstract: The hypothesis that anatomically modern homo sapiens could have undergone changes akin to those observed in domesticated animals has been contemplated in the biological sciences for at least 150 years. The idea had already plagued philosophers such as Rousseau, who considered the civilisation of man as going against human nature, and eventually "sparked over" to the medical sciences in the late 19th and early 20th century. At that time, human "self-domestication" appealed to psychiatry, because it served as a causal explanation for the alleged degeneration of the "erbgut" (genetic material) of entire populations and the presumed increase of mental disorders.

Consequently, Social Darwinists emphasised preventing procreation by people of "lower genetic value" and positively selecting favourable traits in others. Both tendencies culminated in euthanasia and breeding programs ("Lebensborn") during the Nazi regime in Germany. Whether or not domestication actually plays a role in some anatomical changes since the late Pleistocene period is, from a biological standpoint, contentious, and the currently resurrected debate depends, in part, on the definitional criteria applied.

However, the example of human self-domestication may illustrate that scientific ideas, especially when dealing with human biology, are prone to misuse, particularly if "is" is confused with "ought", i.e., if moral principles are deduced from biological facts. Although such naturalistic fallacies appear to be banned, modern genetics may, at least in theory, pose similar ethical problems to medicine, including psychiatry. In times during which studies into the genetics of psychiatric disorders are scientifically more valued than studies into environmental causation of disorders (which is currently the case), the prospects of genetic therapy may be tempting to alter the human genome in patients, probably at costs that no-one can foresee.

In the case of "self-domestication", it is proposed that human characteristics resembling domesticated traits in animals should be labelled "domestication-like", or better, objectively described as genuine adaptations to sedentism.

tame theory: did bonobos domesticate themselves?

scientificamerican | Time and again humans have domesticated wild animals, producing tame individuals with softer appearances and more docile temperaments, such as dogs and guinea pigs. But a new study suggests that one of our primate cousins—the African ape known as the bonobo—did something similar without human involvement. It domesticated itself.

Anthropologist Brian Hare of Duke University's Institute for Brain Sciences noticed that the bonobo looks like a domestic version of its closest living relative, the chimpanzee. The bonobo is less aggressive than the chimp, with a smaller skull and shorter canine teeth. And it spends more time playing and having sex. These traits are very similar to those that separate domestic animals from their wild ancestors. They are all part of a constellation of characteristics known as the domestication syndrome.

The similarities between bonobos and domesticated species dawned on Hare during a large departmental dinner, where he listened to Harvard University anthropologist Richard Wrangham hold forth on bonobos. "He was talking about how bonobos are an evolutionary puzzle," Hare recalls. "'They have all these weird traits relative to chimps and we have no idea how to explain them,'" Wrangham had noted. "I said, 'Oh that's like the silver foxes!' Richard turned around and said, 'What silver foxes?'"

The foxes that Hare mentioned were the legacy of Russian geneticist Dmitri Belyaev. In the 1950s Belyaev started raising wild silver foxes in captivity and breeding those that were least aggressive toward their human handlers. Within just 20 generations, he had created the fox equivalent of our domestic pooches. Instead of snarling when humans approached, they wagged their tails. At the same time, their ears became floppier, tails curlier and skulls smaller.

Belyaev's experiments showed that if you select for nicer animals, the other parts of the domestication syndrome follow suit. Hare thinks that a similar process happened in bonobos, albeit without human intervention.

Monday, July 23, 2012

the lucifer principle



howardbloom | Over a hundred years ago, Matthias Schleiden, the German botanist, was pondering the recently discovered fact that beings as simple as water fleas and as complex as human beings are made up of individual cells. Each of those cells has all the apparatus necessary to lead a life of its own. It is walled off in its own mini-world by the surrounding hedge of a membrane, carries its own metabolic power plants, and seems quite capable of going about its own business, ruggedly declaring its independence. Yet the individual cells, in pursuing their own goals, cooperate to create an entity much larger than themselves. Schleiden declared that each cell has an individual existence, and that the life of an organism comes from the way in which the cells work together.

In 1858, pathologist Rudolf Virchow took Schleiden's observation a step further. He declared that "the composition of the major organism, the so-called individual, must be likened to a kind of social arrangement or society, in which a number of separate existencies are dependent upon one another, in such a way, however, that each element possesses its own peculiar activity and carries out its own task by its own powers." A creature like you and me, said Virchow, is actually a society of separate cells.

The reasoning also works in reverse--a society acts like an organism. Half a century after Virchow, entomologist William Morton Wheeler was observing the lives of ants. No ant is an island. Wheeler saw the tiny beasts maintaining constant contact, greeting each other as they passed on their walkways, swapping bits of regurgitated food, adopting social roles that ranged from warrior or royal handmaiden to garbage handler and file clerk. (Yes, at the heart of many ant colonies is a room to which all incoming workers bring their discoveries. Seated at the chamber's center is a staff of insect bureaucrats who examine the new find, determine where it is needed in the colony, and send it off to the queen's chamber if it is a prized morsel, to the nursery if it is ordinary nourishment, to the construction crews if it would make good mortar, or to the garbage heap kept just outside the nest.)

Viewed from the human perspective, the activities of the individual ants seemed to matter far less than the behavior of the colony as a whole. In fact, the colony acted as if it were an independent creature, feeding itself, expelling its wastes, defending itself, and looking out for its future. Wheeler was the man who dubbed a group of individuals collectively acting like one beast a superorganism.

The term superorganism slid into obscurity until it was revived by Sloan-Kettering head Lewis Thomas in his influential 1974 book The Lives of a Cell. Superorganisms exist even on the very lowest rungs of the evolutionary ladder. Slime molds begin as seemingly independent amoeba: microscopic living blobs that race about on the moist surface of a decaying tree or rotting leaf, cheerfully oblivious to each other when times are good. They feast gaily for days on bacteria and other delicacies, attending to nothing but their own selfish appetites. But when the food runs out, famine descends upon the slime mold world. Suddenly the formerly flippant amoeba lose their sense of boisterous individualism. They rush toward each other as if in a panic, sticking together for all they're worth.

Gradually, the clump of huddled microbeasts grows to something you can see quite clearly with the naked eye. It looks like a slimy plant. And that plant--a tightly-packed mass of former freedom-lovers--executes an emergency public works project. Like half-time marchers forming a pattern, some of the amoeba line up to form a stalk that pokes itself high into the passing currents of air. Then the creatures at the head cooperate to manufacture spores. And those seeds of life drift off into the breeze.

If the spores land on a heap of rotting grass or slab of decomposing bark, they quickly multiply, filling the slippery refuge with a horde of newly-birthed amoeba. Like their parents, the little things race off to the far corners of their new home in a cheerful hunt for dinner. They never stop to think that they may be part of a community whose corporate life is as critical as their own. They are unaware that someday they, like their parents, will have to cluster with their fellows in a desperate cooperative measure on which the future of their children will depend.

the original colonists: the social conquest of earth

NYTimes | This is not a humble book. Edward O. Wilson wants to answer the questions Paul Gauguin used as the title of one of his most famous paintings: “Where do we come from? What are we? Where are we going?” At the start, Wilson notes that religion is no help at all — “mythmaking could never discover the origin and meaning of humanity” — and contemporary philosophy is also irrelevant, having “long ago abandoned the foundational questions about human existence.” The proper approach to answering these deep questions is the application of the methods of science, including archaeology, neuroscience and evolutionary biology. Also, we should study insects.

Insects? Wilson, now 82 and an emeritus professor in the department of organismic and evolutionary biology at Harvard, has long been a leading scholar on ants, having won one of his two Pulitzer Prizes for the 1990 book on the topic that he wrote with Bert Hölldobler. But he is better known for his work on humans. His “Sociobiology: The New Synthesis,” a landmark attempt to use evolutionary theory to explain human behavior, was published in 1975. Those were strange times, and Wilson was smeared as a racist and fascist, attacked by some of his Harvard colleagues and doused with water at the podium of a major scientific conference. But Wilson’s days as a pariah are long over. An evolutionary approach to psychology is now mainstream, and Wilson is broadly respected for his scientific accomplishments, his environmental activism, and the scope and productivity of his work, which includes an autobiography and a best-selling novel, “Anthill.”

In “The Social Conquest of Earth,” he explores the strange kinship between humans and some insects. Wilson calculates that all humans alive today could be stacked, log-style, into a cube about a mile on each side, easily hidden in the Grand Canyon. And all the ants on earth would fit into a cube of similar size. More important, humans and certain insects are the planet’s “eusocial” species — the only species that form communities that contain multiple generations and where, as part of a division of labor, community members sometimes perform altruistic acts for the benefit of others.
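
Wilson's cube is easy to sanity-check with round numbers. The figures below are generic assumptions (roughly 7 billion people, about 60 kg each, body density near that of water), not Wilson's own inputs.

```python
# Back-of-envelope check of the "cube hidden in the Grand Canyon" claim.
# Assumptions (not Wilson's figures): ~7 billion people, ~60 kg each,
# body density roughly that of water.
PEOPLE = 7e9
KG_PER_PERSON = 60.0
DENSITY_KG_PER_M3 = 1000.0
METERS_PER_MILE = 1609.34

volume_m3 = PEOPLE * KG_PER_PERSON / DENSITY_KG_PER_M3
side_miles = volume_m3 ** (1 / 3) / METERS_PER_MILE
print(f"total volume: {volume_m3:.2e} m^3")
print(f"cube side:    {side_miles:.2f} miles")  # ~0.5 mile, comfortably under a mile
```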

Wilson’s examples of insect eusociality are dazzling. The army ants of Africa march in columns of up to a million or more, devouring small animals that get in their way. Weaver ants “form chains of their own bodies in order to pull leaves and twigs together to create the walls of shelters. Others weave silk drawn from the spinnerets of their larvae to hold the walls in place.” Leafcutter ants “cut fragments from leaves, flowers and twigs, carry them to their nests and chew the material into a mulch, which they fertilize with their own feces. On this rich material, they grow their principal food, a fungus belonging to a species found nowhere else in nature. Their gardening is organized as an assembly line, with the material passed from one specialized caste to the next.”

There are obvious parallels with human practices like war and agriculture, but Wilson is also sensitive to the differences. The social insects evolved more than 100 million years ago; their accomplishments come from “small brains and pure instinct”; and their lengthy evolution has led them to become vital elements of the biosphere. In contrast, Homo sapiens evolved quite recently; we have language and culture; and the consequences of our relatively sudden domination have been mixed, to put it mildly: “The rest of the living world could not coevolve fast enough to accommodate the onslaught of a spectacular conqueror that seemed to come from nowhere, and it began to crumble from the pressure.”

outsourcing punishment to god: beliefs in divine control reduce earthly punishment


Royal Society | The sanctioning of norm-transgressors is a necessary—though often costly—task for maintaining a well-functioning society. Prior to effective and reliable secular institutions for punishment, large-scale societies depended on individuals engaging in ‘altruistic punishment’—bearing the costs of punishment individually, for the benefit of society. Evolutionary approaches to religion suggest that beliefs in powerful, moralizing Gods, who can distribute rewards and punishments, emerged as a way to augment earthly punishment in large societies that could not effectively monitor norm violations. In five studies, we investigate whether such beliefs in God can replace people's motivation to engage in altruistic punishment, and their support for state-sponsored punishment. Results show that, although religiosity generally predicts higher levels of punishment, the specific belief in powerful, intervening Gods reduces altruistic punishment and support for state-sponsored punishment. Moreover, these effects are specifically owing to differences in people's perceptions that humans are responsible for punishing wrongdoers.

Sunday, July 22, 2012

alcohol is THE Gateway drug - why it's available on every corner in some neighborhoods....,

Wiley | BACKGROUND: The Gateway Drug Theory suggests that licit drugs, such as tobacco and alcohol, serve as a “gateway” toward the use of other, illicit drugs. However, there remains some discrepancy regarding which drug—alcohol, tobacco, or even marijuana—serves as the initial “gateway” drug subsequently leading to the use of illicit drugs such as cocaine and heroin. The purpose of this investigation was to determine which drug (alcohol, tobacco, or marijuana) was the actual “gateway” drug leading to additional substance use among a nationally representative sample of high school seniors.

METHODS: This investigation conducted a secondary analysis of the 2008 Monitoring the Future 12th-grade data. Initiation into alcohol, tobacco, and other drug use was analyzed using a Guttman scale. Coefficients of reliability and scalability were calculated to evaluate scale fit. Subsequent cross tabulations and chi-square tests for independence were conducted to better understand the relationship between the identified gateway drug and the use of other substances.
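
For readers unfamiliar with Guttman scaling, the sketch below shows the basic computation on made-up responses (it is not the paper's code or the Monitoring the Future data): order the substances by the hypothesized sequence, compare each student's row to the ideal cumulative pattern, and summarize the fit with the coefficient of reproducibility.

```python
import numpy as np

# Made-up responses, not the Monitoring the Future data. Columns follow
# the hypothesized order: alcohol, tobacco, marijuana, other illicit.
# 1 = has ever used the substance.
responses = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 0],   # scale "error": marijuana use without tobacco
])

# Ideal Guttman pattern: a student who used k substances used the first k.
totals = responses.sum(axis=1)
ideal = (np.arange(responses.shape[1]) < totals[:, None]).astype(int)
errors = int((responses != ideal).sum())

# Coefficient of reproducibility; 0.90 or above conventionally supports the scale.
reproducibility = 1 - errors / responses.size
print(f"errors: {errors}, reproducibility: {reproducibility:.3f}")
```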

RESULTS: Results from the Guttman scale indicated that alcohol represented the “gateway” drug, leading to the use of tobacco, marijuana, and other illicit substances. Moreover, students who used alcohol exhibited a significantly greater likelihood of using both licit and illicit drugs.

CONCLUSION: The findings from this investigation indicate that alcohol should receive primary attention in school-based substance abuse prevention programming, as delaying or preventing alcohol use could in turn reduce the use of other substances. Therefore, it seems prudent for school and public health officials to focus prevention efforts, policies, and monies on addressing adolescent alcohol use. Fist tap Dale.

Saturday, July 21, 2012

get high for free

slate | It continues to be totally off the radar of prominent politicians, but polls indicate that large and growing numbers of Americans are open to the idea of legalizing marijuana. Gallup broke ground last fall with the first-ever poll showing 50 percent of respondents nationwide wanting to legalize, and a more precisely worded poll from Rasmussen in May had 56 percent in favor of “legalizing marijuana and regulating it in a similar manner to the way alcohol and tobacco cigarettes are regulated today.” Thus far those polls are outliers, and most surveys show more voter skepticism than that. But since elderly voters are more pot-phobic than the young, support for legalization is likely to increase over time, and it will surely work its way onto the national agenda sooner or later.

There’s been relatively little analysis of what a legal marijuana industry might look like. One key but little-appreciated fact, according to persuasive research by Jonathan Caulkins, Angela Hawken, Beau Kilmer, and Mark Kleiman in their new book Marijuana Legalization: What Everyone Needs To Know, is that legal pot would be amazingly cheap. In fact, midgrade stuff would be so cheap that it might make sense for businesses to give it away like ketchup packets or bar nuts.

Conventional thinking about pot pricing is often dominated by people’s experience buying weed in legal or quasi-legal settings such as a Dutch “coffee shop” or a California medical marijuana dispensary. But this is badly misleading. Neither California nor the Netherlands permits growing or wholesale distribution of marijuana as a legal matter. If pot were fully legal, its growth, distribution, and marketing would work entirely differently.

chromosomal evidence of your species' near demise...,

slate | How did having 46 chromosomes then spread worldwide? It’s possible that having two fewer chromosomes than everyone else gave Guy and Doll’s family a whopping evolutionary advantage, allowing them to out-compete the 48-chromosome sluggards. But probably not. More likely, they happened to be living at a point when the human race nearly got wiped out.

Take your pick for the cause of our near-extinction—ice ages, plagues, Indonesian gigavolcanoes. But humans have far less genetic diversity than most other species, and the most reasonable explanation for this is a genetic bottleneck: a severe reduction in the population of humans in the past, perhaps multiple times. One study suggested that our population, worldwide, might have dropped as low as 40 adults. (The world record for fitting people in a phone booth is 25.) That’s an outlandishly pessimistic guess even among disaster scientists, but it’s common to find estimates of a few thousand adults, below what some minor league baseball teams draw. Consider that these humans might not have been united in one place, but scattered into small, isolated pockets around Africa, and things look even shakier for our future. Had the Endangered Species Act existed way back when, human beings might have been the equivalent of pandas and condors.
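
The link between a bottleneck and today's thin diversity follows from a standard population-genetics identity: in a population of N breeding adults, expected heterozygosity shrinks by a factor of (1 - 1/(2N)) every generation. A small sketch with illustrative numbers (not estimates from the study) shows how hard that bites at N = 40.

```python
# Expected heterozygosity after t generations at constant size N:
#   H_t = H_0 * (1 - 1/(2N))**t
# Illustrative numbers only, not estimates from the study.
def heterozygosity(h0: float, n_adults: int, generations: int) -> float:
    return h0 * (1 - 1 / (2 * n_adults)) ** generations

for n in (40, 2_000, 1_000_000):
    h = heterozygosity(1.0, n, generations=20)
    print(f"N = {n:>9,}: {h:6.1%} of diversity left after 20 generations")
```

Twenty generations at 40 adults erases over a fifth of the variation that a large population would keep essentially intact.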

But however alarming, bottlenecks and near-extinctions aren’t necessarily all bad. With less competition around, a beneficial brain-boosting gene, say, could have an easier time spreading. And those who slip through the bottleneck become big Darwinian winners, because whatever genes those dumb-lucky survivors have can spread far and wide. Guy and Doll were likely two of those survivors, and they bequeathed to the rest of us our unique arrangement of 46 chromosomes.

Thursday, July 19, 2012

study finds voters overwhelmingly want big defense spending cuts

nationaljournal | Americans of all stripes have had enough of massive Pentagon budgets and want significant cuts in defense spending, according to new survey data released on Monday.

In Republican and Democratic districts across the country, 74 percent and 80 percent of voters, respectively, said they want less defense spending, the study found.

On average, voters indicated that they wanted a budget for fiscal 2013 that would be nearly 20 percent less than current defense spending.

With $645 billion enacted for total defense spending this year, the average voter’s preferred budget for next year, an 18 percent cut, would translate into a $116 billion savings—money lawmakers trying to balance the budget could sorely use.
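
The arithmetic checks out, assuming the 18 percent figure is applied to the $645 billion enacted total:

```python
# Checking the article's numbers: an 18 percent cut to $645 billion.
current = 645e9
savings = current * 0.18
print(f"${savings / 1e9:.0f} billion")  # 116, matching the quoted $116 billion
```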

The study’s results will disappoint defense hawks and industry officials fighting against spending cuts, especially those claiming to protect defense jobs in their districts.

Voters participated in an April survey that first presented and explained competing arguments for higher or lower defense spending. Some results of that survey have previously been released. But the authors dived into a deeper examination of party differences and found no statistically significant separation of attitudes on defense spending cuts between Republicans and Democrats. Moreover, voters in districts with or without defense industry jobs overwhelmingly said they want the federal government to spend less.

“The idea that Americans would want to keep total defense spending up so as to preserve local jobs is not supported by the data,” said Steven Kull, director of the Program for Public Consultation, which conducted the survey with the Stimson Center and the Center for Public Integrity, a nonprofit investigative journalism group.

“It is the largest rigging of prices in the history of the world, by many orders of magnitude.”


Wednesday, July 18, 2012

global fight for natural resources 'has only just begun'



guardian | The global battle for natural resources – from food and water to energy and precious metals – is only beginning, and will intensify to proportions that could mean enormous upheavals for every country, leading academics and business figures told a conference in Oxford on Thursday.

Sir David King, former chief scientific adviser to the UK government, who convened the two-day Resource 2012 conference, told the Guardian: "We are nowhere near realising the full impact of this yet. We have seen the first indications – rising food prices, pressure on water supplies, a land grab by some countries for mining rights and fertile agricultural land, and rising prices for energy and for key resources [such as] metals. But we need to do far more to deal with these problems before they become even more acute, and we are not doing enough yet."

Countries that are not prepared for this rapid change will soon – perhaps irrevocably – lose out, with serious damage to their economies and way of life, the conference was told.

Amartya Sen, a Nobel prize-winning economist, said that the free market would not necessarily provide the best solution to sharing out the world's resources. Governments would need to step in, he said, to ensure that people had access to the basics of life, and that the interests of businesses and the financial markets did not win out over more fundamental human needs.

As an academic, Sen has played a key role in showing that how resources, particularly food, are distributed can have a greater impact on famine and surplus than the actual amount of resources available.

David Nabarro, special representative for food security and nutrition at the United Nations, defended the outcomes of last month's Rio+20 conference – a global summit that was intended to address resource issues and other environmental problems, including pollution, climate change and the loss of biodiversity, all of which are likely to have knock-on effects that will exacerbate resource shortages.

Many observers criticised the governments represented at Rio+20 for failing to adopt any clear targets and initiatives on key environmental problems, saying it was a wasted opportunity.

But Nabarro said there had been important successes – that governments had agreed to strive for the elimination of hunger and more sustainable agriculture, including an emphasis on small farmers, improvements in nutrition (in both developed and developing countries), and cutting the harmful waste of resources that is currently plaguing economies.
