Showing posts with label dopamine. Show all posts

Wednesday, January 10, 2018

Money As Tool, Money As Drug: The Biological Psychology of a Strong Incentive


nih.gov |  Why are people interested in money? Specifically, what could be the biological basis for the extraordinary incentive and reinforcing power of money, which seems to be unique to the human species? We identify two ways in which a commodity which is of no biological significance in itself can become a strong motivator. The first is if it is used as a tool, and by a metaphorical extension this is often applied to money: it is used instrumentally, in order to obtain biologically relevant incentives. Second, substances can be strong motivators because they imitate the action of natural incentives but do not produce the fitness gains for which those incentives are instinctively sought. The classic examples of this process are psychoactive drugs, but we argue that the drug concept can also be extended metaphorically to provide an account of money motivation. From a review of theoretical and empirical literature about money, we conclude that (i) there are a number of phenomena that cannot be accounted for by a pure Tool Theory of money motivation; (ii) supplementing Tool Theory with a Drug Theory enables the anomalous phenomena to be explained; and (iii) the human instincts that, according to a Drug Theory, money parasitizes include trading (derived from reciprocal altruism) and object play.

Friday, March 11, 2016

the creepy, inevitable and inescapable definition of virtual reality...,

WaPo |  When cookie giant Oreo wanted to promote its latest flavors, its marketing heads decided to spice up its traditional TV ads with something not just new, but otherworldly: a virtual-reality-style fly-through of a whimsical, violet-skied fantasyland, where cream filling flows like a river and cookie pieces rocket past the viewer's head.
The 360-degree “Wonder Vault” animation allowed viewers to look around this world by turning their smartphone, moving their mouse on a screen or gazing through a virtual-reality headset. And many did: In the minute-long sugary utopia’s two weeks of existence, it has enticed nearly 3 million YouTube viewers — about as big as the 12-to-34-year-old audience for “The Big Bang Theory,” the most-watched sitcom on TV.
“Look at the Cinnamon Bun world: There are cinnamon buns, but there are also ice skaters. It evokes that sort of emotional connection,” said Elise Burditt, brand manager for Oreo North America. “It’s all about taking people inside this world we’ve created ... and back to that feeling of being a kid again.”
As VR technology has rapidly grown more vivid, affordable and widespread, its artists and fans have championed the dramatic ways it could change movies, news, video games, on-the-job training and the creative arts. But many newcomers will take their first virtual steps via a more quintessentially American medium — advertising. And companies now are investing heavily in a race to shape those worlds to their design.

the inner-trainment industry...,


timtyler |  The entertainment industry wastes billions of dollars a year on films, games, pornography and escapism.

As such it is like a cancerous growth on humanity, sapping our collective resources and strength.

These funds typically do not produce anything worthwhile. They do not feed anyone. No housing or shelter is provided. The world does not wind up better irrigated as a result. No more useful elements or minerals come into circulation. Scientific knowledge is not advanced.

It is not just the funds that are wasted. Precious natural resources are needlessly depleted as well. Human time and effort - which could usefully be spent in other areas - are also used up. Both the consumers and the producers are affected.

All that is produced as a result of all this expenditure is entertainment.

What is entertainment?

Entertainment is a type of stimulation designed to trigger drug-like states of euphoria.

Upon receipt of certain kinds of sensory input, the human brain produces drug-like compounds associated with positive behavioural reinforcement.

Various types of entertainment cause different types of stimulation. Comedy activates the nucleus accumbens - a brain area known to be involved in the rewarding feelings that follow monetary gain or the use of some addictive drugs. The shock-relief cycle that horror movies repeatedly put the viewer through works as another type of drug-based conditioning - based on endorphins. Action adventure games are fuelled by adrenaline. Pornography works on the brain's sexual reward centres - and so on.

The result of all this drug-related stimulation is a high level of fantasy addiction in the population.

Addicts tend to become couch potatoes, often with various other associated pathologies: eye strain, back problems, malnutrition, RSI - and so on.

Some exposure to storytelling and fantasies may be beneficial - since it allows humans to gain exposure to the experiences of others quickly and in relative safety. This explains why humans are attracted to this sort of thing in the first place. However, today's fantasies often go beyond what is healthy and beneficial. They typically represent a super-stimulus, designed to encourage a rapid response and subsequent addiction.

We see the same thing with sugars. Some sugar is useful - so humans are genetically programmed to eat it. However, in the modern environment, food is plentiful and there is a huge food-marketing industry - and the result is an obesity epidemic. This wastes billions of dollars in unwanted food production and healthcare bills, and is a complete and unmitigated managerial disaster.

Similarly, some exposure to fantasies is beneficial. It is when a whole marketing industry is pumping consumers to consume fantasies at the maximum possible rate - in order to satisfy its own selfish goals - that problems of over-production and over-consumption arise.

Friday, November 20, 2015

syraq's speed freaks, jihad junkies, and captagon cartels...,


foreignpolicy |  In a dank garage in a poor neighborhood in south Beirut, young men are hard at work. Industrial equipment hums in the background as they put on their surgical masks and form assembly lines, unpacking boxes of caffeine and quinine, in powder and liquid form. They have turned the garage into a makeshift illegal drug factory, where they produce the Middle East’s most popular illicit drug: an amphetamine called Captagon.

For at least a decade, the multimillion-dollar Captagon trade has been a fixture of the Middle East’s black markets. It involves everyone from gangs, to Hezbollah, to members of the Saudi royal family. On Oct. 26, Lebanese police arrested Saudi prince Abdel Mohsen Bin Walid Bin Abdulaziz at Beirut’s Rafik Hariri International Airport for allegedly trying to smuggle 40 suitcases full of Captagon (along with some cocaine) to Riyadh aboard a private jet.

Recent years have seen the global trade in illegal Captagon skyrocket, as authorities across the region have observed a major spike in police seizures of the drug. Local law enforcement, Interpol, and the U.N. Office on Drugs and Crime (UNODC) all agree on the catalyst: the conflict in Syria. Captagon now links addicts in the Gulf to Syrian drug lords and to brigades fighting Syrian President Bashar al-Assad, who are funded by the profits, and, after years of fighting, are now hooked on the product.

Captagon began as a pharmaceutical-grade amphetamine. Patented by a German pharmaceutical firm in the 1960s, it was used by doctors to treat a range of disorders, from narcolepsy to depression. But the drug fell out of favor in the 1970s, when the U.S. Food and Drug Administration deemed it too addictive to justify its use, with the World Health Organization following suit and recommending a worldwide ban in the 1980s.

This is where the free market history of Captagon ends and the hazier black market story — one told by drug lords, smugglers, and law enforcement — begins.

Friday, September 04, 2015

mommy toldjah that playing around with your own PISS and SHIT doesn't end well...,


vanderbilt |  In the popular mind, mass extinctions are associated with catastrophic events, like giant meteorite impacts and volcanic super-eruptions.

But the world’s first known mass extinction, which took place about 540 million years ago, now appears to have had a more subtle cause: evolution itself.

“People have been slow to recognize that biological organisms can also drive mass extinction,” said Simon Darroch, assistant professor of earth and environmental sciences at Vanderbilt University. 

“But our comparative study of several communities of Ediacarans, the world’s first multicellular organisms, strongly supports the hypothesis that it was the appearance of complex animals capable of altering their environments, which we define as ‘ecosystem engineers,’ that resulted in the Ediacaran’s disappearance.”

The study is described in the paper “Biotic replacement and mass extinction of the Ediacara biota” published Sept. 2 in the journal Proceedings of the Royal Society B.

“There is a powerful analogy between the Earth’s first mass extinction and what is happening today,” Darroch observed. “The end-Ediacaran extinction shows that the evolution of new behaviors can fundamentally change the entire planet, and we are the most powerful ‘ecosystem engineers’ ever known.”

The earliest life on Earth consisted of microbes – various types of single-celled microorganisms. They ruled the Earth for more than 3 billion years. Then some of these microorganisms discovered how to capture the energy in sunlight. The photosynthetic process that they developed had a toxic byproduct: oxygen. Oxygen was poisonous to most microbes that had evolved in an oxygen-free environment, making it the world’s first pollutant.


Sunday, March 29, 2015

indicative of the way a lot of people in power behave?


nypost |  In 2012, Forbes magazine ranked Brunei the fifth-richest nation in the world. Yet there is little fun to be had: Alcohol is banned and there is virtually no nightlife or culture.

“I’m trying to think of a place that’s duller,” Australian writer Charles James told Fortune in 1999. “Maybe a British village in midwinter.”

In one way, the brothers adhere to Islamic law: As prescribed, each has several wives and families. But everything else they do is in defiance of the Koran and the law they’ve just imposed.

“It’s a radical double standard,” says Jillian Lauren, who wrote about her life as a member of Jefri’s harem in her memoir, “Some Girls.” “They have more money than anyone else. I know that they both have been married and divorced multiple times. It’s really hypocritical.”

“With their money, they could have cured diseases,” an adviser to Jefri told Fortune. “But they have little interest in the rest of humanity.”

Another described Jefri and his brother as incredibly dim. “They don’t have a lot of thoughts,” he said. “If you were a fly on the wall and heard their conversations, they’d take you to Bellevue.”

A third brother, Mohamed, was reported to loathe his brothers’ wantonness and profligacy. But when the Sultan tasked him with rebuilding the economy that he and Jefri had so badly damaged, he took more than $2 billion for himself and was promptly fired.

Saturday, March 28, 2015

peak casinos


newyorker |  In the summer of 2010, New Jersey Governor Chris Christie travelled by helicopter to Atlantic City for what the local media described as a historic press conference. The news out of the city had been growing steadily worse, and by the time of Christie’s appearance it was clear that, nearly four decades after it had legalized gambling in an attempt to avoid economic ruin, Atlantic City was back where it had started. Standing in front of Boardwalk Hall, next to the mayor and members of the city council, Christie declared, “Atlantic City is dying.” The city, once known as the World’s Playground, had become unclean and unsafe. The number of visitors had fallen, and casino revenues were plummeting. Christie then announced a plan to return Atlantic City to its rightful place as the East Coast’s premier entertainment destination. There would be a sparkling new tourist district, with more conventions, restaurants, retail outlets, and non-gambling attractions. Also in development were bold new marketing plans and nonstop air routes to deliver fresh gamblers. Atlantic City, the Governor promised, would become “Las Vegas East.”

Four years later, Christie’s plan has failed. Four of Atlantic City’s twelve casinos have gone out of business this year, including Revel, an estimated $2.3-billion jewel that opened just two years ago; another, the Trump Taj Mahal, has announced that it could close within weeks. An estimated eight thousand jobs have already been lost, and thousands more seem likely to follow. Since Christie’s 2010 press conference, the assessed value of all the property in the city has declined by nearly half.

While it would be easy to conclude that Atlantic City’s demise is the predictable result of decades of well-documented greed, corruption, and incompetent leadership, the city is in fact one of the first casualties of a nationwide casino arms race. Eager for new jobs and new revenues that don’t require raising taxes, states from coast to coast have turned to gambling: in 1978, only Nevada and New Jersey had commercial casinos; today, twenty-four states do. Atlantic City once had the densely populated Northeast all to itself, but now nearly every state in the region is home to casinos. And with both New York and Massachusetts poised to open massive new gambling resorts, the competition for the fixed number of gamblers there will only get tougher. “It’s a war,” Father Richard McGowan, a professor of management at Boston College who studies the gambling industry, said. “It’s remarkable to me how the states are fighting each other for gambling revenue.”

Friday, March 06, 2015

is there such a thing as dietary racism?


prisonplanet |  An article featured in the left-leaning news outlet Mother Jones this week declares the act of eating three meals a day to be racist.

In a piece entitled, “Why You Should Stop Eating Breakfast, Lunch, and Dinner,” writer Kiera Butler asserts that strict adherence to mealtimes is not only “anti-science,” but “racist” as well.

“When European settlers got to America, they also imported their meal habits,” Butler says. “They observed that the eating schedule of the native tribes was less rigid—the volume and timing of their eating varied with the seasons.”

“Sometimes, when food was scarce, they fasted. The Europeans took this as ‘evidence that natives were uncivilized…’ So fascinated were Europeans with tribes’ eating patterns… that they actually watched Native Americans eat ‘as a form of entertainment.’”

Butler’s article goes on to chronicle the rising prevalence of meal schedules and their dominance in modern Western culture, insinuating that the tradition’s white European roots make the very practice inherently racist.

“Dogmatic adherence to mealtimes is anti-science, racist, and might actually be making you sick,” Butler writes.

While such absurd claims are often praised by hordes of “social justice warriors” scouring the depths of the Internet, commenters on the article were quick to reject the daft declaration.

“Add ‘eating’ to the list of ‘everything is racist…’” the article’s top comment states.

“I never realized that oatmeal was racist. I feel so ashamed!” another joked.

The obsession some have with labeling everything as racist is so pervasive that focusing merely on the topic of food can yield countless similar stories.  Fist tap Big Don.

Wednesday, December 31, 2014

visiting scholar constructive feedback sums up...,


None of this is surprising.

Here is where I stand today, after observing the last 3 US Presidents and how the news media and "THE TEAM" of operatives work upon the two "Dung Producing Party Animals":
  • AMERICAN HEGEMONIC SUPREMACY goes unchallenged, NOT (just) because there are no equal and opposite opposing forces in the world to "check it" but MOSTLY because of the domestic political "proxy battles" which go on - which functionally limit protest or animate it - with regard to WHO is in power.
  • I keep focused on international news so I don't get caught up any longer in this scheme.
  • YES I supported the Iraq War because I was caught up in the "Fight The Critics" scheme.
  • Today I see clearly that THE US GOVERNMENT REMAINS CONSISTENT IN ITS ACTIONS - while the American people divide themselves into factions.
Today the US "economy" is said to be strong.

Look at the international press and note that nearly every other nation is suffering and on the brink of recession (especially the "Oil Producing States", which are being forced to make massive government spending cuts/layoffs).

The USA has the power of the "US Federal Reserve System", which used a $3.5 trillion credit card to produce FAKE MONEY to keep the US economy going - all the while the IMF/World Bank forced "austerity measures" on other nations that don't have the fiscal power to print fake money without suffering from INFLATIONARY FORCES.

The only way this smoke-and-mirrors scheme is going to be addressed - so that 330 million Americans don't get isolated from the travails of the billions of other people in the world - is to force "IMF-like" controls upon domestic US fiscal policy, to reflect the $18 trillion in debt that the USA is able to abstract.

This is not an "Anti-American" stance. It is an acknowledgment that other nations without the same power are caught up in the wake that American and European policies produce. The worst thing they are doing is allowing their citizens to become NET CONSUMERS, further deleveraging their ability to achieve "Self-Determination".

Military Power / Fiscal Power / Foreign Policy Sanctions / Intelligence Agency Insurgency / Cultural Colonialism / Consumerism - are different sides of the very same cube.

Thursday, November 13, 2014

pope francis must be depressed or sum'n, steady on that radical late-MLK type isht...,


Time |  "Responsibility for the poor and the marginalized must therefore be an essential element of any political decision." Pope Francis warned heads of state attending the annual G20 meeting in Australia about the effects of “unbridled consumerism” and called on them to take concrete steps to alleviate unemployment.

In a letter addressed to Australian Prime Minister Tony Abbott, who is chairing this year’s G20 Leaders’ Summit, which begins Sunday, the Pontiff called for its participants to consider that “many lives are at stake.”

“It would indeed be regrettable if such discussions were to remain purely on the level of declarations of principle,” Pope Francis wrote in the letter.

Pope Francis, who has made a habit of addressing the leaders of the G20 meetings, has often raised his concerns with the global economy. Last year, in a lengthy report airing the views of the Vatican, he criticized the “idolatry of money” and denounced the unfettered free market as the “new tyranny.”

In the letter published Tuesday, he said that, like attacks on human rights in the Middle East, abuses in the financial system are among the “forms of aggression that are less evident but equally real and serious.”

“Responsibility for the poor and the marginalized must therefore be an essential element of any political decision, whether on the national or the international level,” he wrote.

knowing what the system is helps you know who your allies are...,


HuffPo |  It was, after all, the Catholic bishops who created the "right-to-life" movement in the first place, back when most Americans weren't even paying attention to the abortion issue, as I detail in my book Good Catholics: The Battle over Abortion in the Catholic Church. In the mid-1960s, abortion wasn't a major political issue. It was regulated by the states, most of which banned it except to save a woman's life. But public health officials, doctors and some legislators began pushing to make abortion more widely available because some 1 million illegal procedures were being performed every year. The gynecological wards of many cities' hospitals were filled with women suffering from botched procedures -- some 10,000 in New York City alone in 1967 -- and only women who were rich or well-connected could get legal abortions, even in cases of rape or fetal deformity.
But the Catholic bishops, who considered sexual morality their special purview, decided to make preventing any liberalization of abortion law the main cause of their newly formed National Conference of Catholic Bishops. When California considered a bill to liberalize abortion access, the Dioceses of Los Angeles hired the same political consulting firm that got Ronald Reagan elected governor of California to beat back the bill. The bishops' consulting firm created the first grassroots "right-to-life" group to lobby against the bill. 

After that, the NCCB hired a political consultant to create right-to-life groups around the country. The bishops provided the financial and administrative support to get some of the earliest and most influential anti-abortion groups, including those in New York, Pennsylvania and Michigan, off the ground, while obscuring their involvement in the campaign against abortion, which they feared would reawaken old fears of the Vatican trying to impose its doctrine on American society. They created and funded the National Right to Life Committee, which would go on to be the most influential anti-abortion organization for 30 years, to coordinate the activities of the local anti-abortion groups.
Most of these early groups were heavily Catholic. But as more Evangelical Christians became interested in the issue, they became concerned that the bishops' control of the NRLC would dilute the effectiveness of the pro-life movement because it would be seen as a tool of the Catholic Church. At a heated board meeting just before the Roe v. Wade decision, they wrested control of the organization from the bishops' conference, obscuring the Catholic roots of the organization and the anti-abortion movement.

Having lost their grassroots lobby just when they needed it most, the bishops tried another tack. In 1975, they released the Pastoral Plan for Pro-Life Activities, which said abortion was the number one issue for Catholics, and laid out a plan to organize Catholics politically to support candidates who backed a constitutional amendment to ban abortion. The move politicized the issue in a presidential election cycle in which both Jimmy Carter and Gerald Ford thought they needed the Catholic vote to win. Both candidates went to the bishops seeking their blessing as the press watched breathlessly. Not surprisingly, the bishops gave what was widely viewed as their endorsement to Ford because of his support of an anti-abortion amendment.

So by the mid-1970s, the bishops had created the anti-abortion movement out of whole cloth and become the first to politicize the issue in a presidential election (even though they failed to throw the election to their preferred candidate). Four years later, when Republican strategist Paul Weyrich was looking for an issue to unite socially conservative voters into a new Republican electoral coalition to replace the fading New Deal coalition, he decided abortion was the perfect wedge issue, both because it tapped into conservative dissatisfaction with the new, socially liberal culture and because it could potentially separate Catholic voters from the Democratic Party. Weyrich rebranded the bishops' right-to-life movement the "pro-family" movement, teamed up with direct mail wizard Richard Viguerie and televangelists Jerry Falwell and Pat Robertson to form the Moral Majority, and the culture wars were officially born.

destruction of the system of dopamine hegemony is the end: everything else is merely conversation...,


medialens |  If Julian Assange was initially perceived by many as a controversial but respected, even heroic, figure challenging power, the corporate media worked hard to change that perception in the summer of 2012. After Assange requested political asylum in the Ecuadorian Embassy in London, the faux-feminists and corporate leftists of the 'quality' liberal press waged war on his reputation.
This comment from the Guardian's Deborah Orr summed up the press zeitgeist:
'It's hard to believe that, until fairly recently, Julian Assange was hailed not just as a radical thinker, but as a radical achiever, too.'
A sentiment echoed by Christina Patterson of the Independent:
'Quite a feat to move from Messiah to Monty Python, but good old Julian Assange seems to have managed it.'
The Guardian's Suzanne Moore expressed what many implied:
'He really is the most massive turd.'
The attacks did more than just criticise Assange; they presented him as a ridiculous, shameful figure. Readers were to understand that he was now completely and permanently discredited.

We are all, to some extent, herd animals. When we witness an individual being subjected to relentless mockery of this kind from just about everyone across the media 'spectrum', it becomes a real challenge to continue taking that person seriously, let alone to continue supporting them. We know that doing so risks attracting the same abuse.

Below, we will see how many of the same corporate journalists are now directing a comparable campaign of abuse at Russell Brand in response to the publication of his book, 'Revolution'. The impact is perhaps indicated by the mild trepidation one of us experienced in tweeting this very reasonable comment from the book:
'Today humanity faces a stark choice: save the planet and ditch capitalism, or save capitalism and ditch the planet.' (p.345)

can it be that it was all so simple?

Thursday, September 04, 2014

rule of law: spending more on guns than nuns is recipe for being done...,


mic |  Tear gas. Armored transports. Military-grade weaponry. These are the images burned into our minds in the wake of the chaos in Ferguson earlier this month. 

Just one of the many disturbing revelations coming out of Ferguson is the militarization of local police departments across the U.S.

This statistic captures the trend: Despite a global recession that crippled city finances, the total spending on police per American increased by 28% between 2001 and 2010, according to figures from the Bureau of Justice Statistics. And that's the increase after taking inflation into account. 

The story gets more interesting when examining police spending at the city level. The map below shows how much it costs, per person, to support police departments in various cities.
Are people in cities that spend more on police safer? No. This is clear from the interactive chart below, which ranks cities by their violent crime and property crime rates. Violent crimes are defined as murder, non-negligent manslaughter, forcible rape, robbery, or aggravated assault. Property crimes are burglary, larceny, or motor vehicle theft.

Detroit and St. Louis top the violent crime ranking, and both spend more on policing per person than most major cities.  Fist tap Dale.

Monday, August 11, 2014

in a consumer society, there are two kinds of slaves: the prisoners of addiction, and the prisoners of envy...,


monbiot |  To be at peace with a troubled world: this is not a reasonable aim. It can be achieved only through a disavowal of what surrounds you. To be at peace with yourself within a troubled world: that, by contrast, is an honourable aspiration. This column is for those who feel at odds with life. It calls on you not to be ashamed.

I was prompted to write it by a remarkable book, just published in English, by a Belgian professor of psychoanalysis, Paul Verhaeghe(1). What About Me?: The Struggle for Identity in a Market-Based Society is one of those books that, by making connections between apparently distinct phenomena, permits sudden new insights into what is happening to us and why.

We are social animals, Verhaeghe argues, and our identity is shaped by the norms and values we absorb from other people. Every society defines and shapes its own normality – and its own abnormality – according to dominant narratives, and seeks either to make people comply or to exclude them if they don’t.

Today the dominant narrative is that of market fundamentalism, widely known in Europe as neoliberalism. The story it tells is that the market can resolve almost all social, economic and political problems. The less the state regulates and taxes us, the better off we will be. Public services should be privatised, public spending should be cut and business should be freed from social control. In countries such as the UK and the US, this story has shaped our norms and values for around 35 years: since Thatcher and Reagan came to power(2). It’s rapidly colonising the rest of the world.

Verhaeghe points out that neoliberalism draws on the ancient Greek idea that our ethics are innate (and governed by a state of nature it calls the market) and on the Christian idea that humankind is inherently selfish and acquisitive. Rather than seeking to suppress these characteristics, neoliberalism celebrates them: it claims that unrestricted competition, driven by self-interest, leads to innovation and economic growth, enhancing the welfare of all.

At the heart of this story is the notion of merit. Untrammelled competition rewards people who have talent, who work hard and who innovate. It breaks down hierarchies and creates a world of opportunity and mobility. The reality is rather different. Even at the beginning of the process, when markets are first deregulated, we do not start with equal opportunities. Some people are a long way down the track before the starting gun is fired. This is how the Russian oligarchs managed to acquire such wealth when the Soviet Union broke up. They weren’t, on the whole, the most talented, hard-working or innovative people, but those with the fewest scruples, the most thugs and the best contacts, often in the KGB.

Even when outcomes are based on talent and hard work, they don’t stay that way for long. Once the first generation of liberated entrepreneurs has made its money, the initial meritocracy is replaced by a new elite, which insulates its children from competition by inheritance and the best education money can buy. Where market fundamentalism has been most fiercely applied – in countries like the US and UK – social mobility has greatly declined(3).

If neoliberalism were anything other than a self-serving con, whose gurus and think tanks were financed from the beginning by some of the richest people on earth (the American tycoons Coors, Olin, Scaife, Pew and others)(4), its apostles would have demanded, as a precondition for a society based on merit, that no one should start life with the unfair advantage of inherited wealth or economically-determined education. But they never believed in their own doctrine. Enterprise, as a result, quickly gave way to rent.

All this is ignored, and success or failure in the market economy are ascribed solely to the efforts of the individual. The rich are the new righteous, the poor are the new deviants, who have failed both economically and morally, and are now classified as social parasites.

Saturday, July 19, 2014

the 10,000 year explosion?


wikipedia |  Cochran and Harpending put forward the idea that the development of agriculture has caused an enormous increase in the rate of human evolution, including numerous evolutionary adaptations to the different challenges and lifestyles that resulted. Moreover, they argue that these adaptations have varied across different human populations, depending on factors such as when the various groups developed agriculture, and the extent to which they mixed genetically with other population groups.[2]

Such changes, they argue, include not just well-known physical and biological adaptations such as skin colour, disease resistance, and lactose tolerance, but also personality and cognitive adaptations that are starting to emerge from genetic research. These may include tendencies towards (for example) reduced physical endurance, enhanced long-term planning, or increased docility, all of which may have been counter-productive in hunter-gatherer societies, but became favoured adaptations in a world of agriculture and its resulting trade, governments and urbanization. These adaptations are even more important in the modern world, and have helped shape today's nation states. The authors speculate that the scientific and Industrial Revolutions came about in part due to genetic changes in Europe over the past millennium, the absence of which had limited the progress of science in Ancient Greece. The authors suggest we would expect to see fewer adaptive changes among the Amerindians and sub-Saharan Africans, who have farmed for the shortest times and were genetically isolated from older civilizations by geographical barriers. In groups that had remained foragers, such as the Australian Aborigines, there would presumably be no such adaptations at all. This may explain why Indigenous Australians and many Native Americans have characteristic health problems when exposed to modern Western diets. Similarly, Amerindians, Aboriginals, and Polynesians, for example, had experienced very little infectious disease. They had not evolved immunities as many Old World dwellers had, and were decimated upon contact with the wider world.

Friday, July 18, 2014

let's talk about dopamine hegemony...,


pnas |  The D4 dopamine receptor (DRD4) locus may be a model system for understanding the relationship between genetic variation and human cultural diversity. It has been the subject of intense interest in psychiatry, because bearers of one variant are at increased risk for attention deficit hyperactivity disorder (ADHD) (1). A survey of world frequencies of DRD4 alleles has shown striking differences among populations (2), with population differences greater than those of most neutral markers. In this issue of PNAS Ding et al. (3) provide a detailed molecular portrait of world diversity at the DRD4 locus. They show that the allele associated with ADHD has increased markedly in frequency within the last few thousands to tens of thousands of years, although it has probably been present in our ancestors for hundreds of thousands or even millions of years.

Wednesday, July 02, 2014

wrecked RECD (really existing capitalist democracy)


alternet |  “Capitalism” is a term now commonly used to describe systems in which there are no capitalists: for example, the worker-owned Mondragon conglomerate in the Basque region of Spain, or the worker-owned enterprises expanding in northern Ohio, often with conservative support – both are discussed in important work by the scholar Gar Alperovitz. 

Some might even use the term “capitalism” to refer to the industrial democracy advocated by John Dewey, America’s leading social philosopher, in the late 19th century and early 20th century. Dewey called for workers to be “masters of their own industrial fate” and for all institutions to be brought under public control, including the means of production, exchange, publicity, transportation and communication. Short of this, Dewey argued, politics will remain “the shadow cast on society by big business.” 

The truncated democracy that Dewey condemned has been left in tatters in recent years. Now control of government is narrowly concentrated at the peak of the income scale, while the large majority “down below” has been virtually disenfranchised. The current political-economic system is a form of plutocracy, diverging sharply from democracy, if by that concept we mean political arrangements in which policy is significantly influenced by the public will. 

There have been serious debates over the years about whether capitalism is compatible with democracy. If we keep to really existing capitalist democracy – RECD for short – the question is effectively answered: They are radically incompatible. 

It seems to me unlikely that civilization can survive RECD and the sharply attenuated democracy that goes along with it. But could functioning democracy make a difference? 

Let’s keep to the most critical immediate problem that civilization faces: environmental catastrophe. Policies and public attitudes diverge sharply, as is often the case under RECD. The nature of the gap is examined in several articles in the current issue of Daedalus, the journal of the American Academy of Arts and Sciences.

Sunday, June 29, 2014

the dopamine hypothesis...,


frontiersin |  Dopamine is an inhibitory neurotransmitter involved in the pathology of schizophrenia. The revised dopamine hypothesis states that dopamine abnormalities in the mesolimbic and prefrontal brain regions exist in schizophrenia. However, recent research has indicated that glutamate, GABA, acetylcholine, and serotonin alterations are also involved in the pathology of schizophrenia. This review provides an in-depth analysis of dopamine in animal models of schizophrenia and also focuses on dopamine and cognition. Furthermore, this review provides not only an overview of dopamine receptors and the antipsychotic effects of treatments targeting them but also an outline of dopamine and its interaction with other neurochemical models of schizophrenia. The roles of dopamine in the evolution of the human brain and human mental abilities, which are affected in schizophrenia patients, are also discussed.

Thursday, June 26, 2014

hegemony recklessly eyeballing weed, genes, the spectrum....,


kingscollegelondon | The researchers found that people genetically pre-disposed to schizophrenia were more likely to use cannabis, and use it in greater quantities than those who did not possess schizophrenia risk genes. 

Power says: “We know that cannabis increases the risk of schizophrenia. Our study certainly does not rule this out, but it suggests that there is likely to be an association in the other direction as well – that a pre-disposition to schizophrenia also increases your likelihood of cannabis use.” 

“Our study highlights the complex interactions between genes and environments when we talk about cannabis as a risk factor for schizophrenia. Certain environmental risks, such as cannabis use, may be more likely given an individual’s innate behaviour and personality, itself influenced by their genetic make-up.  This is an important finding to consider when calculating the economic and health impact of cannabis.” 

Additional funding was provided by the National Institutes of Health, the Australian National Health and Medical Research Council, the Australian Research Council, the GenomEUtwin Project, the Centre for Research Excellence on Suicide Prevention in Australia, the National Institute for Health Research Biomedical Research Centre (NIHR BRC) at the South London and Maudsley NHS Foundation Trust, and King's College London and the Netherlands Organization for Health Research and Development.

Paper reference: Power, R. et al. ‘Genetic predisposition to schizophrenia associated with increased use of cannabis’ published in Molecular Psychiatry. 

UCLA And The LAPD Allow Violent Counter-Protestors To Attack A Pro-Palestinian Encampment

LATimes |   University administrators canceled classes at UCLA on Wednesday, hours after violence broke out at a pro-Palestinian encampment...