Thursday, July 24, 2014

how america changed the meaning of war


tomdispatch |  Then came the attack of September 11th. Like the starting gun of a race that no one knew he was to run, this explosion set the pack of nations off in a single direction -- toward the trenches. Although the attack was unaccompanied by any claim of authorship or statement of political goals, the evidence almost immediately pointed to al-Qaeda, the radical Islamist, terrorist network, which, though stateless, was headquartered in Afghanistan and enjoyed the protection of its fundamentalist Islamic government. In a tape that was soon shown around the world, the group’s leader, Osama bin Laden, was seen at dinner with his confederates in Afghanistan, rejoicing in the slaughter.

Historically, nations have responded to terrorist threats and attacks with a combination of police action and political negotiation, while military action has played only a minor role. Voices were raised in the United States calling for a global cooperative effort of this kind to combat al-Qaeda. President Bush opted instead for a policy that the United States alone among nations could have conceivably undertaken: global military action not only against al-Qaeda but against any regime in the world that supported international terrorism.

The president announced to Congress that he would "make no distinction between the terrorists who commit these acts and those who harbor them." By calling the campaign a "war," the administration summoned into action the immense, technically revolutionized, post-Cold War American military machine, which had lacked any clear enemy for over a decade. And by identifying the target as generic "terrorism," rather than as al-Qaeda or any other group or list of groups, the administration licensed military operations anywhere in the world.

In the ensuing months, the Bush administration continued to expand the aims and means of the war. The overthrow of governments -- "regime change" -- was established as a means for advancing the new policies. The president divided regimes into two categories -- those "with us" and those "against us." Vice President Cheney estimated that al-Qaeda was active in 60 countries. The first regime to be targeted was of course al-Qaeda’s host, the government of Afghanistan, which was overthrown in a remarkably swift military operation conducted almost entirely from the air and without American casualties.

Next, the administration proclaimed an additional war goal -- preventing the proliferation of weapons of mass destruction. In his State of the Union speech in January 2002, the president announced that "the United States of America will not permit the world’s most dangerous regimes to threaten us with the world’s most destructive weapons." He went on to name as an "axis of evil" Iraq, Iran, and North Korea -- three regimes seeking to build or already possessing weapons of mass destruction. To stop them, he stated, the Cold War policy of deterrence would not be enough -- "preemptive" military action would be required, and preemption, the administration soon specified, could include the use of nuclear weapons.

Beginning in the summer of 2002, the government intensified its preparations for a war to overthrow the regime of Saddam Hussein in Iraq, and in the fall, the president demanded and received a resolution from the Security Council of the United Nations requiring Iraq to accept the return of U.N. inspectors to search for weapons of mass destruction or facilities for building them. Lists of other candidates for "regime change" began to surface in the press.

somewhere in new mexico before the end of time...,


What if you discovered that everything you'd ever been taught about the world around you, and particularly about your country, was false? With environmental problems escalating and climate change already making an impact, can societies collapse? What are the alternatives for avoiding collapse? And what kind of world can you expect if the ecology does collapse under human stresses?

Here, some of the premier thinkers often referred to as "doomers" talk about climate change and the impacts of an industrial system on earth systems. Is it already too late?

the right to exclude others (property rights) is the foundational american religious principle


thenation |  The austerity agenda as it plays out on the ground in American cities is often so relentless in demanding cuts in public services that it is easy to imagine that it cannot be upended. And that goes double for Detroit, where Michigan Governor Rick Snyder has given his appointed “emergency manager”—rather than local elected officials—control over critical decisions regarding city operations.

But that does not mean that austerity always wins.

Last week, protests by Detroiters and allies from across the country focused local, national and international attention on the Detroit Water and Sewerage Department’s program of shutting off water service for thousands of low-income families that have fallen behind in paying their bills. On Friday, religious leaders and community activists were arrested after blocking trucks operated by the private contractor responsible for the shutoffs. At the same time, a mass march filled the streets of downtown Detroit with protesters arguing that the most vulnerable citizens of a city hard hit by deindustrialization ought not be further harmed by the loss of a basic necessity that the United Nations deems a human right.

Wednesday, July 23, 2014

never mind the rhetoric - the property "right" is the might to exclude others...,


theecologist |  Never mind the 'war on terror' rhetoric, writes Nafeez Ahmed. The purpose of Israel's escalating assault on Gaza is to control the Territory's 1.4 trillion cubic feet of gas - and so keep Palestine poor and weak, gain massive export revenues, and avert its own domestic energy crisis.

Israel's defence minister is on record confirming that military plans to uproot Hamas are about securing control of Gaza's gas reserves.

The conquest of Gaza is accelerating. Israel has now launched its ground invasion, bringing the Palestinian death toll to 260, 80% of them civilians.

A further 1,500 have been wounded and 1,300 Palestinian homes destroyed. Israel's goal, purportedly, is to "restore quiet" by ending Hamas rocket attacks on Israel.

Last Tuesday, Israeli defence minister and former Israeli Defence Force (IDF) chief of staff Moshe Ya'alon announced that Operation Protective Edge marks the beginning of a protracted assault on Hamas.

The operation "won't end in just a few days", he said, adding that "we are preparing to expand the operation by all means standing at our disposal so as to continue striking Hamas."

The price will be very heavy ... yes, $4 billion!
The following morning, he went on: "We continue with strikes that draw a very heavy price from Hamas. We are destroying weapons, terror infrastructures, command and control systems, Hamas institutions, regime buildings, the houses of terrorists, and killing terrorists of various ranks of command ...

"The campaign against Hamas will expand in the coming days, and the price the organization will pay will be very heavy."

But in 2007, a year before Operation Cast Lead, Ya'alon's concerns focused on the 1.4 trillion cubic feet of natural gas discovered in 2000 off the Gaza coast, valued at $4 billion.

children exposed to religion have difficulty distinguishing fact from fiction...,


rawstory |  A study published in the July issue of Cognitive Science determined that children who are not exposed to religious stories are better able to tell that characters in “fantastical stories” are fictional — whereas children raised in a religious environment even “approach unfamiliar, fantastical stories flexibly.”

In “Judgments About Fact and Fiction by Children From Religious and Nonreligious Backgrounds,” Kathleen Corriveau, Eva Chen, and Paul Harris demonstrate that children typically have a “sensitivity to the implausible or magical elements in a narrative,” and can determine whether the characters in the narrative are real or fictional by references to fantastical elements within the narrative, such as “invisible sails” or “a sword that protects you from danger every time.”

However, children raised in households in which religious narratives are frequently encountered do not treat those narratives with the same skepticism. One might have predicted that these children would “think of them as akin to fairy tales,” judge “the events described in them as implausible or magical and conclude that the protagonists in such narratives are only pretend.”

And yet, “this prediction is likely to be wrong,” because “with appropriate testimony from adults” in religious households, children “will conceive of the protagonist in such narratives as a real person — even if the narrative includes impossible events.”

The researchers took 66 children between the ages of five and six and asked them questions about stories — some of which were drawn from fairy tales, others from the Old Testament — in order to determine whether the children believed the characters in them were real or fictional. 

“Children with exposure to religion — via church attendance, parochial schooling, or both — judged [characters in religious stories] to be real,” the authors wrote. “By contrast, children with no such exposure judged them to be pretend,” just as they had the characters in fairy tales. But children with exposure to religion judged many characters in fantastical, but not explicitly religious stories, to also be real — the equivalent of being incapable of differentiating between Mark Twain’s character Tom Sawyer and an account of George Washington’s life.

This conclusion contradicts previous studies in which children were said to be “born believers,” i.e., that they possessed “a natural credulity toward extraordinary beings with superhuman powers. Indeed, secular children responded to religious stories in much the same way as they responded to fantastical stories — they judged the protagonist to be pretend.”

The researchers also determined that “religious teaching, especially exposure to miracle stories, leads children to a more generic receptivity toward the impossible, that is, a more wide-ranging acceptance that the impossible can happen in defiance of ordinary causal relations.”

Nassim Taleb: Two Myths About Rivalry, Scarcity, Competition, and Cooperation


asymptosis |  I’d condense my thinking on the subject as follows:

1) People mistake rivalry for scarcity. If one tribe excludes all the others from a water source, forcing them to do its will to get water, there’s obviously scarcity, right? Wrong.

Don’t get me started on the sacralization of (largely inherited) “property rights,” ownership — the right to exclude others.

2) They don’t understand that competition’s only virtue is increasing and improving cooperation. Cooperation — non-kin altruism, eusociality, etc. — is the thing that got us to the top of the food chain. Cooperation is what wins the battle against scarcity.

Competition fetishists think that competition is always good because it sometimes improves cooperation, even though it frequently does the exact opposite.

Think: trade wars. Or just…wars.

Tuesday, July 22, 2014

how can moral science exist?


richarddawkins |  The solution lies not in some remarkable discovery or genius breakthrough of logical formula – much to the pity of my book sales, and the desires of my publisher – but rather in thinking around the problem. One need not show morality to be a system of naturally occurring and deducible facts, as in the physical or social sciences, in order for moral science to be advocated. If one needed to do this, morality could not be shown to be rational at all. Instead one simply needs to show that a rational theory of morality is possible, justifiable, and more rationally defensible than the other moral theories on the table. So not just better than theological accounts, but also less assumptive than what many rights theorists or utilitarian thinkers have come up with. The theory would then also have to be assumptive only to the degree that science is (i.e., assumptive only about the self-proving worth of rationality).

It would have to use the scientific method to develop a transparent set of social agreements about basic moral principles – whatever we agree those most basic of moral principles to be – instead of relying on the assumption of natural moral facts (as there are no such things). To the non-philosopher, this translates as reducing the moral principles we wish our societies to be guided by to the most basic sets they can possibly be – however we wish this to look – and then using reason and science to build consistent moral rules, and to make consistent moral decisions based on these most basic of principles. For example, we might look at our current principles about murder and violent crime and reduce them to a basic principle that suffering and death should be avoided wherever possible. From there we would judge whether our laws were rationally consistent with what we had socially agreed.

Albeit a very different type of science, moral science can exist in a socially created space like this without contravening the rules of rationality, all the while allowing the most important of humanity’s problems to be exposed to the fruits of the scientific method. Indeed, most areas of politics and morality need not be thought of as subjective at all once moral science is on the table, unless the problem is wholly without reason or evidence on either side. This doesn’t mean opponents of rationality will suddenly drop their beliefs and join us, but it does provide a consistent framework that stops people from having to turn to religion or other methods in order to form moral beliefs. We shouldn’t underestimate the secular advantage this would have for future generations.

Moral science is important: it’s more rational than what we currently have, i.e., a system where we just slightly amend historically decided ideas when we really have to. But more than this, it’s important because it gives us a chance to rationally judge moral issues – no longer having to allow for dangerous and often irrational subjective differences. What’s more, it allows for the whole method to be scientific in attitude; not allowing for certainty where there is none, and helping to do away with as much potential for uncompromising aggression as possible.

the social brain and the myth of empathy


cambridge |  Neuroscience research has created multiple versions of the human brain. The “social brain” is one version and it is the subject of this paper. Most image-based research in the field of social neuroscience is task-driven: the brain is asked to respond to a cognitive (perceptual) stimulus. The tasks are derived from theories, operational models, and back-stories now circulating in social neuroscience. The social brain comes with a distinctive back-story, an evolutionary history organized around three interconnected themes: mind-reading, empathy, and the emergence of self-consciousness. This paper focuses on how empathy has been incorporated into the social brain and redefined via parallel research streams employing a shared imaging technology. The concluding section describes how these developments can be understood as signaling the emergence of a new version of human nature and the unconscious. My argument is not that empathy in the social brain is a myth, but rather that it is served by a myth consonant with the canons of science.

What we've previously noted about this topic.

the neuroethology of friendship


wiley |  Friendship pervades the human social landscape. These bonds are so important that disrupting them leads to health problems, and difficulties forming or maintaining friendships attend neuropsychiatric disorders like autism and depression. Other animals also have friends, suggesting that friendship is not solely a human invention but is instead an evolved trait. A neuroethological approach applies behavioral, neurobiological, and molecular techniques to explain friendship with reference to its underlying mechanisms, development, evolutionary origins, and biological function. Recent studies implicate a shared suite of neural circuits and neuromodulatory pathways in the formation, maintenance, and manipulation of friendships across humans and other animals. Health consequences and reproductive advantages in mammals additionally suggest that friendship has adaptive benefits. We argue that understanding the neuroethology of friendship in humans and other animals brings us closer to knowing fully what it means to be human.

the conscientiousness of kidspeak?


newyorker |  If, for instance, a fourteen-year-old girl says, “So we, like, um, went to the pizza place, but the, uh, you know—the guy?—said, like, no, so we were, like, O.K., so we, uh, decided that we’d go to, like, a coffee shop, but, uh, Colette can’t—she has, like, a gluten thing. You know what I mean? So that’s, like, why we came home, and, um, you know, would you, like, make us eggs?” To a sensitized listener, who recognizes the meaning of the circumlocutions, the nuanced space between language and event, the sentence really means: “So we tried, as it were, to go and enjoy a pizza, but the, so to speak, maître d’ of the establishment claimed—a statement that we were in no social position to dispute—that there was, so to speak, ‘no room for us at the inn.’ And then Colette insisted—and far be it from me either to contest or endorse her self-diagnosis—that she could not eat wheat-based food, so, knowing full well that it is likely to be irksome and ill-timed, could you feed us with scrambled eggs?” The point of the “like”s and other tics is to supply the information that there is a lot more information not being offered, and that the whole thing is held at a certain circumspect remove. It didn’t happen exactly this way, and, of course, one might quibble with a detail here or there, but this is the gist of what happened. Each “like” is a Jamesian “as it were.”

It turns out that three sociolinguists at the University of Texas at Austin have been studying these things systematically. The paper they produced, published in the Journal of Language and Social Psychology, has the beautiful title “Um … Who Like Says You Know: Filler Word Use as a Function of Age, Gender and Personality.” The study they conducted “aimed to investigate how the frequency of filled pauses and discourse markers used in the English language varies with two basic demographic variables (gender and age) and personality traits.” The researchers explain that, to do this, they “focused on three common discourse markers … (I mean, you know, and like) and two filled pauses (uh and um).”

They recorded and transcribed interviews with the speakers, noted how often the speakers used so-called “discourse markers,” and concluded that these markers are, indeed, used most frequently by women and girls. More important, the study also shows that the use of the discourse markers is particularly common among speakers who score on a personality test as “conscientious”—“people who are more thoughtful and aware of themselves and their surroundings.” Discourse markers, far from being opaque, automatic, or zombie-like, show that the speaker has “a desire to share or rephrase opinions to recipients.” In other words, those “like”s are being used to register that what’s being narrated may not be utterly faithful to each detail—that it may not be, as a fourteen-year-old might say, “literally” true—but that it is essentially true, and, what’s more, that an innate sense of conscientiousness and empathy with the listener forbids the speaker from pretending to a more closely tuned accuracy than she in fact possesses.
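
The study’s core measurement is simple enough to sketch. The snippet below is a hypothetical illustration of that kind of counting, not the researchers’ actual pipeline: it tallies the three discourse markers and two filled pauses named in the paper, normalized by transcript length. (One caveat the real study must handle through careful transcription: a naive matcher can’t tell discourse “like” from the verb “like.”)

```python
import re
from collections import Counter

# Hypothetical sketch (not the authors' code): count the discourse
# markers and filled pauses named in the paper, per word of transcript.
DISCOURSE_MARKERS = ["i mean", "you know", "like"]
FILLED_PAUSES = ["uh", "um"]

def filler_rates(transcript: str) -> dict:
    text = transcript.lower()
    words = re.findall(r"[a-z']+", text)
    counts = Counter()
    for marker in DISCOURSE_MARKERS + FILLED_PAUSES:
        # \b word boundaries keep "like" from matching inside "likely"
        counts[marker] = len(re.findall(rf"\b{re.escape(marker)}\b", text))
    return {m: c / len(words) for m, c in counts.items()}

print(filler_rates("So we, like, um, went to the pizza place, you know?"))
```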

Monday, July 21, 2014

ancient earthenworks pre-date amazonian rainforest

ancient-origins |  A new study published in the journal Proceedings of the National Academy of Sciences has revealed that a series of mysterious lines and geometric shapes carved into the Amazonian landscape were created thousands of years ago, before the rainforest even existed, according to a report in Discovery News. The purpose of the massive earthworks and who created them remains unknown, and scientists are beginning to realise just how much there still is to learn about the prehistoric cultures of the Amazon and life before the arrival of Europeans.

The unusual earthworks, which include square, straight, and ring-like ditches, were first uncovered in 1999, after large areas of pristine forest were cleared for cattle grazing. Since then, hundreds of the earthen foundations have been found in a region more than 150 miles across, covering northern Bolivia and Brazil’s Amazonas state.

The ditches were sculpted from the clay-rich soils of the Amazon and are typically around 30 feet wide and 10 feet deep, with walls about 3 feet high. However, the largest ring ditches found so far are an incredible 1,000 feet in diameter. The purpose of the ditches remains a complete mystery. The fact that many of them are clustered on a 200-metre-high plateau suggests they may have been used for defence; however, others have suggested they were used for drainage or for channelling water, since most were placed near spring water sources. A team of researchers who published a paper in the journal Antiquity in 2010 argued that the layout of the ditches is highly symbolic, suggesting a ceremonial and religious function.

Until now, it was believed that the earthworks dated back to around 200 AD. However, the latest study has revealed that they are, in fact, much older. Study author John Francis Carson, a postdoctoral researcher at the University of Reading in the United Kingdom, explained that sediment cores had been taken from two lakes near the major earthwork sites.  These sediment cores hold ancient pollen grains and charcoal from long-ago fires, and can reveal information about the climate and ecosystem that existed when the sediment was laid down as far back as 6,000 years ago.

The results revealed that the oldest sediments did not come from a rainforest ecosystem at all. Rather, they showed that the landscape, before about 2,000 to 3,000 years ago, looked more like the savannahs of Africa than today’s lush rainforest.

the final century of civilization?


tdf |  Climate Change Could Bring Catastrophe in Next Century. It's an idea that most of us would rather not face - that within the next century, life as we know it could come to an end. Our civilization could crumble, leaving only traces of modern human existence behind.

It seems outlandish, extreme - even impossible. But according to cutting edge scientific research, it is a very real possibility. And unless we make drastic changes now, it could very well happen. Experts have a stark warning: that unless we change course, the perfect storm of population growth, dwindling resources and climate change has the potential to converge in the next century with catastrophic results.

In order to plan for the worst, we must anticipate it. In that spirit, guided by some of the world's experts, ABC News' "Earth 2100," hosted by Bob Woodruff, will journey through the next century and explore what might be our worst-case scenario. But no one can predict the future, so how do we address the possibilities that lie ahead? Our solution is Lucy, a fictional character devised by the producers at ABC to guide us through the twists and turns of what the next 100 years could look like. It is through her eyes and experiences that we can truly imagine the experts' worst-case scenario -- and be inspired to make changes for the better.

Sunday, July 20, 2014

superintelligence


mashable |  Humans are currently the most intelligent beings on the planet — the result of a long history of evolutionary pressure and adaptation. But could we some day design and build machines that surpass the human intellect?

This is the concept of superintelligence, a growing area of research that aims to improve understanding of what such machines might be like, how they might come to exist, and what they would mean for humanity's future.

Oxford philosopher Nick Bostrom's recent book Superintelligence: Paths, Dangers, Strategies discusses a variety of technological paths that could reach superintelligent artificial intelligence (AI), from mathematical approaches to the digital emulation of human brain tissue.

And although it sounds like science fiction, a group of experts, including Stephen Hawking, wrote an article on the topic noting that "There is no physical law precluding particles from being organised in ways that perform even more advanced computations than the arrangements of particles in human brains."

Brain as computer 
The idea that the brain performs "computation" is widespread in cognitive science and AI, since the brain deals in information, converting a pattern of input nerve signals to output nerve signals.

Another well-accepted theory is that physics is Turing-computable: that whatever goes on in a particular volume of space, including the space occupied by human brains, could be simulated by a Turing machine, a kind of idealized information processor. Physical computers perform these same information-processing tasks, though they aren't yet at the level of Turing's hypothetical device.
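
For a concrete picture of the device behind that claim, a Turing machine is nothing more than a finite rule table that reads and writes one cell of an unbounded tape. A minimal sketch (purely illustrative, not from the article) — this one increments a binary number:

```python
# Minimal Turing machine simulator: a finite rule table reads one tape
# cell, writes a symbol, moves left or right, and changes state.
def run_turing_machine(tape, rules, state="start", pos=0):
    cells = dict(enumerate(tape))           # sparse tape; "_" means blank
    while state != "halt":
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Rule table for binary increment: scan right past the digits,
# then carry leftward (1 -> 0 with carry; 0 or blank -> 1, halt).
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}
print(run_turing_machine("1011", rules))    # 1011 + 1 -> "1100"
```

The Turing-computability claim is that, given enough tape and time, a rule table like this could in principle reproduce any physical process — brains included.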

These two ideas come together to give us the conclusion that intelligence itself is the result of physical computation. And, as Hawking and colleagues go on to argue, there is no reason to believe that the brain is the most intelligent possible computer.

In fact, the brain is limited by many factors, from its physical composition to its evolutionary past. Brains were not selected exclusively to be smart, but to maximize human reproductive fitness generally. Brains are not only tuned to the tasks of the hunter-gatherer, but also designed to fit through the human birth canal; supercomputing clusters and data centers have no such constraints.

Synthetic hardware has a number of advantages over the human brain in both speed and scale, but the software is what creates the intelligence. How could we possibly get smarter-than-human software?

evolutionary forecasting


simonsfoundation |   Michael Doebeli, a mathematical biologist at the University of British Columbia, wondered how E. coli would evolve if it had two kinds of food instead of just one. In the mid-2000s, he ran an experiment in which he provided glucose — the sole staple of Lenski’s experiment — and another compound E. coli can grow on, known as acetate.

Doebeli chose the two compounds because he knew that E. coli treats them very differently. When given a choice between the two, it will devour all the glucose before switching on the molecular machinery for feeding on acetate. That’s because glucose is a better source of energy. Feeding on acetate, by contrast, E. coli can only grow slowly.

Something remarkable happened in Doebeli’s experiment — and it happened over and over again. The bacteria split into two kinds, each specialized for a different way of feeding. One population became better adapted to growing on glucose. These glucose-specialists fed on the sugar until it ran out and then slowly switched over to feeding on acetate. The other population became acetate-specialists; they evolved to switch over to feeding on acetate even before the glucose supply ran out and could grow fairly quickly on acetate.

When two different kinds of organisms are competing for the same food, it’s common for one to outcompete the other. But in Doebeli’s experiment, the two kinds of bacteria developed a stable coexistence. That’s because both strategies, while good, are not perfect. The glucose-specialists start out growing quickly, but once the glucose runs out, they slow down drastically. The acetate-specialists, on the other hand, don’t get as much benefit from the glucose. But they’re able to grow faster than their rivals once the glucose runs out.
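
The stabilizing mechanism here is negative frequency dependence: whichever specialist becomes common finds its preferred resource split among more competitors, handing the advantage to the rarer type. Below is a toy replicator model of that logic — a sketch with made-up illustrative numbers, not Doebeli’s actual analysis:

```python
# Toy model: each growth cycle yields a glucose pool (G) and an acetate
# pool (A); each pool is split among cells according to how well their
# strain exploits that resource. All rates are illustrative, not measured.
G, A = 1.0, 0.6                 # per-cycle yields of the two resource pools
glu_spec = (1.0, 0.2)           # (efficiency on glucose, efficiency on acetate)
ace_spec = (0.5, 1.0)

x = 0.99                        # initial frequency of glucose specialists
for cycle in range(51):
    # total exploitation pressure on each pool, weighted by strain frequency
    glu_pressure = x * glu_spec[0] + (1 - x) * ace_spec[0]
    ace_pressure = x * glu_spec[1] + (1 - x) * ace_spec[1]
    # per-capita payoff: your efficiency's share of each pool
    w_glu = glu_spec[0] / glu_pressure * G + glu_spec[1] / ace_pressure * A
    w_ace = ace_spec[0] / glu_pressure * G + ace_spec[1] / ace_pressure * A
    # replicator update: frequency grows in proportion to relative fitness
    x = x * w_glu / (x * w_glu + (1 - x) * w_ace)
    if cycle % 10 == 0:
        print(f"cycle {cycle:2d}: glucose-specialist frequency = {x:.3f}")
```

Started from 99% glucose specialists (or from 1%), the frequency converges to the same interior equilibrium rather than fixing at 0 or 1 — the qualitative signature of the stable coexistence Doebeli saw in flask after flask.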

Doebeli’s bacteria echoed the evolution of lizards in the Caribbean. Each time the lizards arrived on an island, they diversified into many of the same forms, each with its own set of adaptations. Doebeli’s bacteria diversified as well — and did so in flask after flask.

To get a deeper understanding of this predictable evolution, Doebeli and his postdoctoral researcher, Matthew Herron, sequenced the genomes of some of the bacteria from these experiments. In three separate populations they discovered that the bacteria had evolved in remarkable parallel. In every case, many of the same genes had mutated.

Although Doebeli’s experiments are more complex than Lenski’s, they’re still simple compared with what E. coli encounters in real life. E. coli is a resident of the gut, where it feeds on dozens of compounds, where it coexists with hundreds of other species, where it must survive changing levels of oxygen and pH, and where it must negotiate an uneasy truce with our immune system. Even if E. coli’s evolution might be predictable in a flask of glucose and acetate, it would be difficult to predict how the bacteria would evolve in the jungle of our digestive system.

bacteria that subsist on electricity


newscientist |   Unlike any other life on Earth, these extraordinary bacteria use energy in its purest form – they eat and breathe electrons – and they are everywhere

STICK an electrode in the ground, pump electrons down it, and they will come: living cells that eat electricity. We have known bacteria to survive on a variety of energy sources, but none as weird as this. Think of Frankenstein's monster, brought to life by galvanic energy, except these "electric bacteria" are very real and are popping up all over the place.

Unlike any other living thing on Earth, electric bacteria use energy in its purest form – naked electricity in the shape of electrons harvested from rocks and metals. We already knew about two types, Shewanella and Geobacter. Now, biologists are showing that they can entice many more out of rocks and marine mud by tempting them with a bit of electrical juice. Experiments growing bacteria on battery electrodes demonstrate that these novel, mind-boggling forms of life are essentially eating and excreting electricity.

That should not come as a complete surprise, says Kenneth Nealson at the University of Southern California, Los Angeles. We know that life, when you boil it right down, is a flow of electrons: "You eat sugars that have excess electrons, and you breathe in oxygen that willingly takes them." Our cells break down the sugars, and the electrons flow through them in a complex set of chemical reactions until they are passed on to electron-hungry oxygen.

In the process, cells make ATP, a molecule that acts as an energy storage unit for almost all living things. Moving electrons around is a key part of making ATP. "Life's very clever," says Nealson. "It figures out how to suck electrons out of everything we eat and keep them under control." In most living things, the body packages the electrons up into molecules that can safely carry them through the cells until they are dumped on to oxygen.

"That's the way we make all our energy and it's the same for every organism on this planet," says Nealson. "Electrons must flow in order for energy to be gained. This is why when someone suffocates another person they are dead within minutes. You have stopped the supply of oxygen, so the electrons can no longer flow."

The discovery of electric bacteria shows that some very basic forms of life can do away with sugary middlemen and handle the energy in its purest form – electrons, harvested from the surface of minerals. "It is truly foreign, you know," says Nealson. "In a sense, alien."

Saturday, July 19, 2014

why not try and breed the kwisatz haderach?

126-year-old Jose Aguinelo dos Santos - world's oldest living person

westhunter |  The Dark Lords of the IRS have proclaimed that West Hunter Incorporated has Federal tax-exempt status. Contributions, including various forms of real property,  are deductible.  For details, write gcochran9@comcast.net.

West Hunter’s purpose is the advancement of education and science in anthropology and evolution.  That means this blog, scientific and popular articles, books, talks, and research projects.  Depending on resources, possible projects might include a search for  effective nootropics (possibly inspired by some of the Ashkenazi mutations),  cloning  a super-Neanderthal, or breeding the Kwisatz Haderach.

Robert Heinlein, in Methuselah’s Children, imagined the Howard Foundation.  Founded by Ira Howard, who made a pile in the California Gold Rush but died young (of old age!) and childless,  the foundation bribed people with unusually long-lived ancestors into marrying people with similar backgrounds, thus selectively breeding for longevity. This was written in 1941, when many people still knew that such things were possible.

You can imagine a similar foundation that breeds for intelligence: Cyril Kornbluth did, as background for The Marching Morons.  But today, there’s no such thing. Any attempt would be denounced, even if utterly non-coercive and completely successful.

Nobody’s thinking about the long run, the big issues.  Well, hardly anybody.

wade in the nytimes: adventures in very recent evolution



NYTimes |  Ten thousand years ago, people in southern China began to cultivate rice and quickly made an all-too-tempting discovery — the cereal could be fermented into alcoholic liquors. Carousing and drunkenness must have started to pose a serious threat to survival because a variant gene that protects against alcohol became almost universal among southern Chinese and spread throughout the rest of China in the wake of rice cultivation. 

The variant gene rapidly degrades alcohol to a chemical that is not intoxicating but makes people flush, leaving many people of Asian descent a legacy of turning red in the face when they drink alcohol. 

The spread of the new gene, described in January by Bing Su of the Chinese Academy of Sciences, is just one instance of recent human evolution and in particular of a specific population’s changing genetically in response to local conditions. 

Scientists from the Beijing Genomics Institute last month discovered another striking instance of human genetic change. Among Tibetans, they found, a set of genes evolved to cope with low oxygen levels as recently as 3,000 years ago. This, if confirmed, would be the most recent known instance of human evolution. 

Many have assumed that humans ceased to evolve in the distant past, perhaps when people first learned to protect themselves against cold, famine and other harsh agents of natural selection. But in the last few years, biologists peering into the human genome sequences now available from around the world have found increasing evidence of natural selection at work in the last few thousand years, leading many to assume that human evolution is still in progress. 

“I don’t think there is any reason to suppose that the rate has slowed down or decreased,” says Mark Stoneking, a population geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

the 10,000 year explosion?


wikipedia |  Cochran and Harpending put forward the idea that the development of agriculture has caused an enormous increase in the rate of human evolution, including numerous evolutionary adaptations to the different challenges and lifestyles that resulted. Moreover, they argue that these adaptations have varied across different human populations, depending on factors such as when the various groups developed agriculture, and the extent to which they mixed genetically with other population groups.

Such changes, they argue, include not just well-known physical and biological adaptations such as skin colour, disease resistance, and lactose tolerance, but also personality and cognitive adaptations that are starting to emerge from genetic research. These may include tendencies towards (for example) reduced physical endurance, enhanced long-term planning, or increased docility, all of which may have been counter-productive in hunter-gatherer societies, but became favoured adaptations in a world of agriculture and its resulting trade, governments and urbanization. These adaptations are even more important in the modern world, and have helped shape today's nation states. The authors speculate that the scientific and Industrial Revolutions came about in part due to genetic changes in Europe over the past millennium, the absence of which had limited the progress of science in Ancient Greece. The authors suggest we would expect to see fewer adaptive changes among the Amerindians and sub-Saharan Africans, who have farmed for the shortest times and were genetically isolated from older civilizations by geographical barriers. In groups that had remained foragers, such as the Australian Aborigines, there would presumably be no such adaptations at all. This may explain why Indigenous Australians and many Native Americans have characteristic health problems when exposed to modern Western diets. Similarly, Amerindians, Aboriginals, and Polynesians had experienced very little infectious disease; they had not evolved immunities as many Old World dwellers did, and were decimated upon contact with the wider world.

Friday, July 18, 2014

let's talk about dopamine hegemony...,


pnas |  The D4 dopamine receptor (DRD4) locus may be a model system for understanding the relationship between genetic variation and human cultural diversity. It has been the subject of intense interest in psychiatry, because bearers of one variant are at increased risk for attention deficit hyperactivity disorder (ADHD) (1). A survey of world frequencies of DRD4 alleles has shown striking differences among populations (2), with population differences greater than those of most neutral markers. In this issue of PNAS Ding et al. (3) provide a detailed molecular portrait of world diversity at the DRD4 locus. They show that the allele associated with ADHD has increased a lot in frequency within the last few thousands to tens of thousands of years, although it has probably been present in our ancestors for hundreds of thousands or even millions of years.

Thursday, July 17, 2014

our polity is our way of life...,


boredpanda | We are all aware of the global pollution problem, but hardly anyone realizes just how much trash we produce daily. Gregg Segal, a photographer from California, aims to show this problem through powerful imagery, photographing people lying in their weekly load of trash. His ongoing project, cleverly called “7 Days of Garbage”, tries to portray people from different social backgrounds to reach the largest audience possible.

Segal decided to photograph the participants in front of naturalistic backgrounds to show that the garbage we produce affects those environments directly. “Obviously, the series is guiding people toward a confrontation with the excess that’s part of their lives. I’m hoping they recognize a lot of the garbage they produce is unnecessary”, he said to Slate.

Some of the participants were ashamed of how much garbage they produced weekly, so they edited their garbage bags. Others showed everything just the way it was, resulting in nasty and very strong images, which you can see here.  Fist tap Kurman.

When Big Heads Collide...,

thinkingman  |   Have you ever heard of the Olmecs? They’re the earliest known civilization in Mesoamerica. Not much is known about them, ...