Wednesday, October 16, 2013

entheogens and the development of culture



psypressuk | Originally published in 2013 ‘Entheogens and the Development of Culture: The Anthropology and Neurobiology of Ecstatic Experience’ is a collection of essays edited by John A. Rush. Rush has previously authored the books ‘The Mushroom in Christian Art: The Identity of Jesus in the Development of Christianity’, ‘Failed God: Fractured Myth in a Fragile World’ and ‘The Twelve Gates: A Spiritual Passage through the Egyptian Books of the Dead’. This work has been published by North Atlantic Books.
 
As a collection, Entheogens and the Development of Culture: The Anthropology and Neurobiology of Ecstatic Experience proposes that psychoactive substances have been key components in the development of both human culture and the human brain. The fourteen essays included in the collection are written by researchers from across various disciplines, including anthropology, mycology, classics, cultural history, psychology and biology. While, academically, the writers often appear to be coming from altogether different theoretical places, with a myriad of intentions laced within their essays, they do share the common goal of examining the role of psychoactive substances in the history of human culture. And, as such, the collection provides an interesting argument when taken in its totality.

The question regarding the entheogenic effect on the development of the human brain, while bolstered to some degree by the cultural chapters, is largely formulated in Michael Winkelman’s essay Altered Consciousness and Drugs in Human Evolution. Winkelman holds the position that our brains have evolved alongside, and partly as a result of, certain plants and altered states, by way of the serotonergic and dopaminergic systems that can be stimulated by exogenous neurotransmitters—such as those found in Psilocybe mushrooms. He writes:

“The role of drugs in the evolution of human consciousness must be understood in relationship to effects on the serotonergic system and its roles in overall brain functioning. The alterations of consciousness enhance paleomammalian brain functions and their coordination and integration with the entire brain. Enhanced serotonergic mechanisms contributed to experiences of altered consciousness in humans, embodied in visionary experiences” (Rush 45)

So, the theory goes, the evolution of human consciousness has been, in part, mediated by the exogenous neurotransmitters that humans have sought out and consumed, humans thereby taking a hand in their own evolution. Taking the theory at face value, for the moment, Winkelman postulates that, “this expanded associational area improved the brain’s capacity to interface with a variety of other neural mechanisms, including those involved in learning, problem-solving, and memory function” (ibid.). Here, therefore, is the window into culture. From these improved brain functions, art, society and, indeed, organization generally, could develop. However, as we shall see, the remainder of the essays are less about the role of entheogens in generating the capability for culture-creation in humans and more about the role of entheogens within culture itself. Indeed, if entheogens created the capacity for culture, culture itself embarked on a process of reintegrating entheogens from the newly evolved perspective.

Entheogen discourse is primarily driven by historical analysis, particularly of the religious use of mushrooms within human culture, and while this is also very true of this collection, a number of other substances are discussed, which are worth mentioning first. Chris Bennett and Neil McQueen, both having written extensively on drugs and the Bible, offer a chapter entitled Cannabis and the Hebrew Bible, which makes use of Sula Benet’s identification of kaneh bosm—an anointing oil used as an initiatory rite—with cannabis.

stallman: how much surveillance can democracy withstand?


wired | The current level of general surveillance in society is incompatible with human rights. To recover our freedom and restore democracy, we must reduce surveillance to the point where it is possible for whistleblowers of all kinds to talk with journalists without being spotted. To do this reliably, we must reduce the surveillance capacity of the systems we use.

Using free/libre software, as I’ve advocated for 30 years, is the first step in taking control of our digital lives. We can’t trust non-free software; the NSA uses and even creates security weaknesses in non-free software so as to invade our own computers and routers. Free software gives us control of our own computers, but that won’t protect our privacy once we set foot on the internet.

Bipartisan legislation to “curtail the domestic surveillance powers” in the U.S. is being drawn up, but it relies on limiting the government’s use of our virtual dossiers. That won’t suffice to protect whistleblowers if “catching the whistleblower” is grounds for access sufficient to identify him or her. We need to go further.

Thanks to Edward Snowden’s disclosures, we know that the current level of general surveillance in society is incompatible with human rights. The repeated harassment and prosecution of dissidents, sources, and journalists provides confirmation. We need to reduce the level of general surveillance, but how far? Where exactly is the maximum tolerable level of surveillance, beyond which it becomes oppressive? That happens when surveillance interferes with the functioning of democracy: when whistleblowers (such as Snowden) are likely to be caught.

If whistleblowers don’t dare reveal crimes and lies, we lose the last shred of effective control over our government and institutions. That’s why surveillance that enables the state to find out who has talked with a reporter is too much surveillance — too much for democracy to endure.

An unnamed U.S. government official ominously told journalists in 2011 that the U.S. would not subpoena reporters because “We know who you’re talking to.” Sometimes journalists’ phone call records are subpoenaed to find this out, but Snowden has shown us that in effect they subpoena all the phone call records of everyone in the U.S., all the time.

Opposition and dissident activities need to keep secrets from states that are willing to play dirty tricks on them. The ACLU has demonstrated the U.S. government’s systematic practice of infiltrating peaceful dissident groups on the pretext that there might be terrorists among them. The point at which surveillance is too much is the point at which the state can find who spoke to a known journalist or a known dissident.

Tuesday, October 15, 2013

why western political and financial elites absolutely hate vladimir putin...,


ICH | The election of Barack Obama to the White House truly was a momentous historical event.  Not only because a majority White population had elected a Black man to the highest office in the country (this was really mainly an expression of despair and of a deep yearning for change), but because after one of the most effective PR campaigns in history, the vast majority of Americans and many, if not most, people abroad really, truly believed that Obama would make some deep, meaningful changes.  The disillusionment with Obama was as great as the hopes millions had in him.  I personally feel that history will remember Obama not only as one of the worst Presidents in history but also, more importantly, as the last chance for the "system" to reform itself.  That chance was missed.  And while some, in utter disgust, described Obama as "Bush light", I think that his Presidency can be better described as "more of the same, only worse".

Having said that, there is something which, to my absolute amazement, Obama's election did achieve: the removal of (most, but not all) Neocons from (most, but not all) key positions of power and a re-orientation of (most, but not all) of US foreign policy in a more traditional "USA first" line, usually supported by the "old Anglo" interests.  Sure, the Neocons are still firmly in control of Congress and the US corporate media, but the Executive Branch is, at least for the time being, back under Anglo control (this is, of course, a generalization: Dick Cheney was neither Jewish nor Zionist, while Henry Kissinger can hardly be described as an "Anglo").  And even though Bibi Netanyahu got more standing ovations in Congress (29) than any US President, the attack on Iran he wanted so badly did not happen.  Instead, Hillary and Petraeus got kicked out, and Chuck Hagel and John Kerry got in.  That is hardly "change we can believe in", but at least this shows that the Likud is not controlling the White House any more.

Of course, this is far from over.  If anything, the current game of chicken being played between the White House and Congress over the budget, with its inherent risk of a US default, shows that this conflict is far from settled.

The current real power matrix in the USA and Russia

We have shown that there are two unofficial parties in Russia which are locked in a deadly conflict for power, the "Eurasian Sovereignists" and the "Atlantic Integrationists".  There are also two unofficial parties in the USA which are locked in a deadly conflict for power: the Neocons and the "old Anglo imperialists".  I would argue that, at least for the time being, the "Eurasian Sovereignists" and the "old Anglos" have prevailed over their internal competitors, but that the Russian "Eurasian Sovereignists" are in a far stronger position than the American "old Anglos".   There are two main reasons for that:

1)  Russia has already had its economic collapse and default and
2)  a majority of Russians fully support President Putin and his "Eurasian Sovereignist" policies.

 In contrast, the USA is on the brink of an economic collapse and the 1% clique which is running the USA is absolutely hated and despised by most Americans.

After the immense and, really, heart-breaking disillusionment with Obama, more and more Americans are becoming convinced that changing the puppet in the White House is meaningless and that what the US really needs is regime change.

The USSR and the USA - back to the future?

It is quite amazing for those who remember the Soviet Union of the late 1980s how much the US under Obama has become similar to the USSR under Brezhnev: internally it is characterized by a general sense of disgust and alienation among the people, triggered by the undeniable stagnation of a system rotten to its very core. A bloated military and police state with uniforms everywhere, while more and more people live in abject poverty.  A public propaganda machine which, like in Orwell's 1984, constantly boasts of successes everywhere while everybody knows that these are all lies.  Externally, the US is hopelessly overstretched and either hated or mocked abroad.  Just as in the Soviet days, the US leaders are clearly afraid of their own people, so they protect themselves with an immense and costly global network of spies and propagandists who are terrified of dissent and who see the main enemy in their own people.

Add to that a political system which, far from co-opting the best of its citizens, deeply alienates them while promoting the most immoral and corrupt ones into positions of power.  A booming prison-industrial complex and a military-industrial complex which the country simply cannot afford to maintain.  A crumbling public infrastructure combined with a totally dysfunctional health care system in which only the wealthy and well-connected can get good treatment.  And above it all, a terminally sclerotic public discourse, full of ideological clichés and completely disconnected from reality.

I will never forget the words of a Pakistani Ambassador to the UN Conference on Disarmament in Geneva in 1992 who, addressing an assembly of smug western diplomats, said the following words: "you seem to believe that you won the Cold War, but did you ever consider the possibility that what has really happened is that the internal contradictions of communism caught up with communism before the internal contradictions of capitalism could catch up with capitalism?!".  Needless to say, these prophetic words were greeted by a stunned silence and soon forgotten.  But the man was, I believe, absolutely right: capitalism has now reached a crisis as deep as the one affecting the Soviet Union in the late 1980s and there is zero chance to reform or otherwise change it.  Regime change is the only possible outcome.

stop playing!


independent | American lawmakers risk causing a “massive disruption the world over” that could tip the global economy into another recession if politics gets in the way of raising the country’s debt ceiling and the ongoing government shutdown remains unresolved, Christine Lagarde, the head of the International Monetary Fund, warned on Sunday as the US Senate became the focus of talks to end the budgetary deadlock in Washington.

The stark assessment by Ms Lagarde, a former French Finance Minister, came after news that talks between the Republican Speaker of the House of Representatives, John Boehner, and President Barack Obama had broken down, putting the onus on the Senate leadership to craft a bipartisan pact to avert what experts predict would be financial catastrophe.

The US government will hit the congressionally-mandated ceiling on how much money it can borrow to fund its commitments by 17 October. If by then the $16.7 trillion (£10.4trn) limit is not raised by the legislature, the US would be forced to walk down a road usually associated with weaker economies: dishonouring its spending commitments and defaulting on its debts, an outcome that Ms Lagarde said could shatter the fragile economic recovery under way in the US and around the world.

“If there is that degree of disruption, that lack of certainty, that lack of trust in the US signature, it would mean massive disruption the world over, and we would be at risk of tipping yet again into a recession,” she told NBC.

The IMF chief also poured cold water on suggestions by some within the Republican camp, including the Kentucky Senator Rand Paul, that the government need not default if the ceiling is not raised. Mr Paul told CNN that “not raising the debt ceiling means you have to balance your budget. It doesn’t mean you have to default.”

But Ms Lagarde said there was no room to get around the limit and what it meant. “When you are the largest economy in the world, when you are the safe haven in all circumstances, as has been the case, you can’t go into that creative accounting business,” Ms Lagarde said.  Fist tap Dale.

why does the world bank richly fund those quite capable of funding themselves?


WaPo | The bank’s 30-year engagement with China began soon after the United States restored diplomatic ties with the communist nation in 1979. It is often trumpeted by bank officials as one of their more important success stories. China’s rapid development has been central to the drop in extreme poverty around the world. Bank officials, including Kim, say that they are uniquely positioned to help China with ongoing problems such as pollution and climate change — and that China’s success on those fronts is important to the world.

But as China has become more powerful and sophisticated, the bank’s role there has occupied an increasingly gray area. 

Is the World Bank there because China needs help? Or because loans to China provide a hefty profit that pays the bank’s salaries and administrative costs? Does helping China reduce its carbon use justify bank support for programs that may have damaged industries in other countries? Are Chinese contractors simply more willing than most to work in difficult parts of the world for a cheaper price?

China is the bank’s third-largest borrower, with nearly $56 billion in loans since 1980, and 107 programs underway. In the early years, that included support from the International Development Association (IDA), the branch of the bank that provides low-interest loans or grants to the poorest countries, and whose lending is subsidized by contributions from wealthier nations, including the United States.

That cut-rate lending stopped more than a decade ago when China passed the income threshold set by the bank for IDA borrowers. But some in the United States argue that China — a nuclear power with a space program and $3 trillion in cash reserves — should not get any help from an organization whose money and energy could be better directed at countries with less ability to help themselves.

Monday, October 14, 2013

what if you're more afraid of the all-seeing eye?


medialens | In our May 13 media alert we highlighted how the state, and a compliant media, relentlessly raise fears of the 'shadows and threats' that supposedly assail us. We make no apology for again citing the American writer H. L. Mencken:

'The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.'

In that alert, we pointed to an edition of BBC Newsnight that was devoted to UK 'defence' spending and policy. The BBC's Gavin Esler introduced and presented the programme from the perspective of government; namely, that:

'National security is the first duty of government. We will remain a first-rate military power.'

Reflecting, and indeed boosting, state priorities is the default mode of BBC News. Last Tuesday, the flagship News at Ten on BBC1 demonstrated this perfectly when celebrity news presenter Fiona Bruce, who also has The Queen's Palaces and The Antiques Roadshow on her CV, began with the ominous words:

'A warning from MI5: Britain's security is threatened on more fronts, in more ways than ever before.'

Bruce continued:

'recent leaks about the extent of Britain's global surveillance is damaging efforts to stop attacks on the UK. Despite MI5's warnings, some critics say the public has a right to know if it's being spied on.'

Bruce then introduced BBC security correspondent Gordon Corera who was standing besuited outside MI5 headquarters, ready to repeat the secret service's key messages in a simulacrum of journalistic authority. He began on the approved note:

'Yes, the job of people here at MI5 is to keep the country safe from national security threats, particularly terrorist attacks.'

As ever, the professed upholding of BBC 'impartiality' translates in practice to providing the propaganda version of reality. After all, as Mencken observed, a major state function is to convince the public that the government is protecting it from threats. It would not be responsible BBC journalism to recognise that government policies put British people at risk by, for instance, launching illegal wars of aggression likely to lead to blowback – a genuine risk well understood by the state and, indeed, with the kind of horrific consequences seen in the London 7/7 bombings in 2005. As John Pilger noted recently:

'British governments are repeatedly warned, not least by the parliamentary intelligence and security committee, that foreign adventures beckon retaliation at home.' 

Corera then went on to convey the propaganda message from Andrew Parker, director-general of MI5, the UK's domestic counterintelligence and security agency. Parker had given a Whitehall speech to a 'closed audience' on how the '[security] threats had changed and how the organisation was trying to cope with them.' While neither Edward Snowden nor WikiLeaks were mentioned by name, they were implicitly the target of Parker's criticisms that revelations about surveillance were 'potentially a gift to terrorists, making it easier for them to strike at Britain.' No responsible journalist would let this pass without challenge.

we return to our regularly scheduled grousing....,


guardian | The Law Society is considering issuing new guidance to solicitors across England and Wales amid growing concern that the government's mass online surveillance operations are undermining their ability to take legal cases against the state.

Lawyers representing people who make serious complaints against the police, army or security services fear the industrial-scale collection of email and phone messages revealed by the Guardian over the past four months is threatening the confidential relationship between them and their clients, jeopardising a crucial plank of the criminal justice system.

"These are absolutely fundamental issues," said Shamik Dutta, from Bhatt Murphy lawyers in London. "The NSA revelations are having a chilling effect on the way a crucial part of the justice system operates. Individuals who are making serious allegations of wrongdoing against the state are becoming increasingly concerned about whether the information they share with their lawyers will remain confidential."

Files leaked by the whistleblower Edward Snowden show the British eavesdropping centre, GCHQ, and its American counterpart, the National Security Agency, have developed capabilities to undertake mass surveillance of the web and mobile phone networks. This is done by trawling the servers of internet companies and collecting raw data from the undersea cables that carry web traffic.

Two of the programmes, Prism and Tempora, can sweep up vast amounts of private data, which is shared between the two countries. The Guardian recently revealed how GCHQ and the NSA have also successfully cracked much of the online encryption relied upon by hundreds of millions of businesses and individuals to protect their privacy.

The Law Society expressed concern about the possible impact on lawyers' day-to-day work. "The ability of clients to consult and receive advice from lawyers with certainty of absolute confidentiality is fundamental to the rule of law and the values of our democracy," it said in a statement. "The Law Society would regard it as very serious indeed if privileged correspondence were to be intercepted. We are looking at the issue and considering whether there needs to be advice given to the profession."

A spokesperson for the Bar Council said the "enormous developments in the cyber world over the past decade" meant it was right to question whether the proper safeguards were in place. "The public need to be reassured that powers to intercept communications are properly exercised. We have long had concerns that the existing legislation fails to respect the right of individuals to consult privately with a lawyer. Any increase in the scope of the authorities' data gathering activities must be matched by the establishment of real protection for this fundamental aspect of the administration of justice."

Eric King, head of research at Privacy International, said there were real fears in the legal profession about confidentiality being breached by the security services following the NSA revelations. "We are astonishingly concerned about privileged communications being swept up as part of the mass surveillance programmes we have learned about over the past few months," he said.

"When you are talking about litigation involving major cases that challenge significant government policy, the intelligence that could be gained and the change in strategy that could be deployed by obtaining an idea of what the claimants were seeking to challenge, it could easily tip the balance decisively in what are critical cases."

Sunday, October 13, 2013

ordinary thinking a prisoner of the temporal consensus...,


a new model of the universe | The fourth dimension for us lies in the world of celestial bodies and in the world of molecules.

The fifth dimension lies in the moments of life eternally remaining where they are, and in the repetition of life itself, taken as a whole.

Life in itself is time for man. For man there is not and cannot be any other time outside the time of his life. Man is his life. His life is his time.

The way of measuring time, for all, by means of such phenomena as the apparent or real movement of the sun or the moon, is comprehensible as being convenient for practical purposes. But it is generally forgotten that this is only a formal time accepted by common agreement. Absolute time for man is his life. There can be no other time outside this time.

If I die to-day, to-morrow will not exist for me. But, as has been said before, all theories of the future life, of existence after death, of reincarnation, etc., contain one obvious mistake. They are all based on the usual understanding of time, that is, on the idea that to-morrow will exist after death. In reality it is just in this that life differs from death. Man dies because his time ends. There can be no to-morrow after death.

But all usual conceptions of the " future life " require the existence of " to-morrow ". What future life can there be, if it suddenly appears that there is no future, no " tomorrow", no time, no "after"? Spiritualists, theosophists, theologians and others who know everything about the future life, may find themselves in a very strange situation if the fact is realised that no "after" exists.

What then is possible? And what may the meaning of life as a circle be?

I have pointed out that the very curvature of the line of time implies the presence in it of yet another dimension, namely, the fifth dimension, or eternity. And if in the usual understanding the fourth dimension is extension of time, what can the fifth dimension, or eternity, be?

atwill and his thesis dissected...,


freethoughtblogs/carrier | Atwill is the one dude I get asked about most often.[*] And now apparently even Dawkins is tweeting about Atwill, thanks to his upcoming venture into England later this month to sell his weird Roman Conspiracy variety of Jesus mythicism. To get the gist you can check out his PR puff piece. Thomas Verenna has already written a deconstruction of that. Notably even Acharya S (D.M. Murdock) doesn’t buy Atwill’s thesis, declaring that she does “not concur with Atwill’s Josephus/Flavian thesis” and that “the Flavians, including Josephus, did not compose the canonical gospels as we have them.” Robert Price has similarly soundly debunked his book, even after strongly wanting to like it.

Atwill is best known as the author of Caesar’s Messiah (subtitle: “The Roman Conspiracy to Invent Jesus,” Roman meaning the Roman imperial family…yeah). In this Atwill argues “Jesus [is] the invention of a Roman emperor” and that the entire (?) New Testament was written by “the first-century historian Flavius Josephus” who left clues to his scheme by littering secret hidden coded “parallels” in his book The Jewish War. Atwill claims to prove “the Romans directed the writing of both” the JW and the NT, in order “to offer a vision of a ‘peaceful Messiah’ who would serve as an alternative to the revolutionary leaders who were rocking first-century Israel and threatening Rome,” and also (apparently) as a laughing joke on the Jews (Atwill variously admits or denies he argues the latter, but it became clear in our correspondence, which I will reproduce below…it’s weird because making fun of the Jews kind of contradicts the supposedly serious aim of persuading the Jews, yet Atwill seems to want the imperial goal to have simultaneously been both).

Notice his theory entails a massive and weirdly erudite conspiracy of truly bizarre scope and pedigree, to achieve a truly Quixotic aim that hardly makes sense coming from any half-intelligent elite of the era (even after adjusting for the Flynn effect), all to posit that the entire Christian religion was created by the Romans (and then immediately opposed by it?), who somehow got hundreds of Jews (?) to abandon their religion and join a cult that simply appeared suddenly without explanation on the Palestinian (?) book market without endorsement.

I honestly shouldn’t have to explain why this is absurd. But I’ll hit some highlights. Then I’ll reveal the reasons why I think Atwill is a total crank, and his work should be ignored, indeed everywhere warned against as among the worst of mythicism, not representative of any serious argument that Jesus didn’t exist. And that’s coming from me, someone who believes Jesus didn’t exist.

Historically, Atwill’s thesis is more or less a retooled version of the old Pisonian Conspiracy Theory, by which is not meant the actual Pisonian conspiracy (to assassinate Nero), but a wildly fictitious one in which the Piso family invented Christianity (and fabricated all its documents) through its contacts with the Flavian family, and thence Josephus (who was indeed adopted into that family after tricking his officer corps into committing suicide and then surrendering to the Romans during the War…oh, and conveniently declaring Vespasian the Messiah).

This pseudo-historical nonsense is over a century old by now, first having been proposed (so far as I know) by Bruno Bauer in Christ and the Caesars in 1877 (Christus und Caesaren). It has been revamped a dozen times since. Atwill is simply the latest iteration (or almost–there is a bonkers Rabbi still going around with an even wilder version). Atwill’s is very much like Bible Code crankery, where he looks for all kinds of multiple comparisons fallacies and sees conspiracies in all of them, rather than the inevitable coincidences (or often outright non-correspondences) that they really are. Everything confirms his thesis, because nothing could ever fail to. Classic nonfalsifiability. He just cherry picks and interprets anything to fit, any way he wants.

Saturday, October 12, 2013

the great library at Alexandria was destroyed by budget cuts, not fire...,


io9 | Though it seems fitting that the destruction of so mythic an institution as the Great Library of Alexandria must have required some cataclysmic event . . . in reality, the fortunes of the Great Library waxed and waned with those of Alexandria itself. Much of its downfall was gradual, often bureaucratic, and by comparison to our cultural imaginings, somewhat petty. For example, the Roman Emperor Marcus Aurelius Antoninus suspended the revenues of the Mouseion, abolishing the members’ stipends and expelling all foreign scholars. Alexandria was also the site of numerous persecutions and military actions, which, though few were reported to have done any great harm to the Mouseion or the Serapeum, could not help but have damaged them. At the very least, what institution could hope to attract and keep scholars of the first eminence when its city was continually the site of battle and strife?

What's interesting here is Phillips' emphasis on how the decline of the library rested as much on its reputation as a learning center as it did on the number of books in its collection. What made the Museum and its daughter branch great were its scholars. And when the Emperor abolished their stipends, and forbade foreign scholars from coming to the library, he effectively shut down operations. Those scrolls and books were nothing without people to care for them, study them, and share what they learned far and wide.

The last historical references to the library's contents meeting their final end come in stories about the events of 639 CE, when Arab troops under the rule of Caliph Omar conquered Alexandria.

Luciano Canfora has written one of the most complete histories of the library, based on primary source material — documents written by people who knew and worked in the library. In The Vanished Library, he describes what the library at Alexandria had been reduced to by the time of its ultimate destruction in 639:
The Serapeum had been destroyed in the attack on the pagan temples in 391. The last famous figure associated with the Museum had been Theon, father of the celebrated Hypatia who studied geometry and musicology and whom the Christians, convinced in their ignorance that she was a heretic, barbarously murdered in 415 . . . Naturally, the city's books had changed, too; and not only in their content. The delicate scrolls of old had gone. Their last remnants had been cast out as refuse or buried in the sand, and they had been replaced by more substantial parchment, elegantly made and bound into thick codices - and crawling with errors, for Greek was increasingly a forgotten language. The texts now consisted chiefly of patristic writings, Acts of Councils, and 'sacred literature' in general.
This was not Ptolemy's great collection, nor was it the center of scholarship in what was then the modern world. It was a broken-down remnant of its former self, neglected for centuries. The collection was mostly stocked with materials that reflected what Judeo-Christian bureaucrats would have considered important; these materials did not reflect the Greek ideal of universal knowledge that had birthed the library in the first place.

In the end, it was only this diminished version of the library that was burned on the orders of Caliph Omar when Emir Amrou Ibn el-Ass took the city.

Friday, October 11, 2013

the entheogen theory of religion and ego death


egodeath | Vertical, Timeless Determinism -  In late antiquity, consciousness was centered around the doctrine and mystic-state experience of the pre-setness of future thoughts and occurrences. The central thematic concern of religions in the Hellenistic era was Heimarmene (Martin 1987), which means fatedness, Necessity, or timeless cosmic determinism. Modern thought considers some related issues, though only in a single cognitive state. For example, Philosophical Metaphysics investigates the related issues of tenseless time, fatedness, agent movement through space and time, and controller agents (Oaklander & Smith 1995). 

The future is unchangeable and pre-set because of the static relation of control to the time dimension, and because it is largely an illusion that a person is a continuant agent who exercises power while moving through time.

Modern science introduces clockwork determinism and thereby reduces the person to an automaton; in reaction, Copenhagenist quantum mechanics aims to provide an emancipating alternative to the hidden-variables determinism of Einstein and Bohm. However, modern conceptions of determinism and causality are limited to intellectual speculation based in the ordinary cognitive state, so they habitually tend to envision time as a sequential flow.
Transcending Determinism Requires Two Jumps - Determinism is both a praised goal and a disparaged trap to escape, due to determinism-awareness being the intermediate but not final goal of religious mental transformation. Valentinian Gnosticism affirmed cosmic determinism but also transcended it, and formulated two contrasting schemes of thinking about moral culpability (Pagels 1992). 

Simplified 2-stage initiation themes actually reflect a 3-stage progression that is centered around determinism. Mystic metaphor both endorses and disparages the realization of determinism, because determinism is only an intermediate destination on the path to salvific regeneration. The first demon or stage of egoic delusion to be cast out is the assumption of simple independent self-command and freewill. The second demon to be overcome is the mental model of cosmic determinism or fatedness, a model which is rationally coherent but raises the practical problem of control instability.

plundering the planet


cassandralegacy | Ladies and gentlemen, it is a pleasure to be here and my task today is to tell you about something that stands at the basis of everything we do: mineral resources. It is the subject of a book that is the result of a research program sponsored by the Club of Rome and that has involved me and 16 co-authors.

For the time being we have only the German version; we are working on the English one, but that will take some time - a few months. In any case, the title should be clear to you even if you don't speak German, and you can notice that we say "The Plundered Planet", not "The Improved Planet" or "The Developed Planet". No, this is the concept: plundering. We have been acting with mineral resources as if we were pirates looting a captured galleon: grabbing everything we can, as fast as we can.
 
Now, of course, there is a problem with the idea of plundering planet Earth. It is how long we can go on plundering. At the basic level, it is a question of common sense: we know that once we have burned oil, it is gone. We know that after we have dispersed copper in tiny bits all over we can't recover it any more. We know that diamonds are forever, perhaps, but also that once we have taken them out of a mine, then there are no more diamonds in there. Mineral resources are not infinite. 
 
So, there is this nagging question: how long can we go on mining? It is a question that started being asked in the 19th century and the answer is both easy and difficult. It is easy to say "not forever," but it is difficult to say for how long, exactly. So, what form will depletion take? How is it going to be felt in the economy? And, since we see ourselves as very smart, can we find some trick to avoid, or at least delay, the problem?
 
The first study that attempted to quantify these questions was a report that was sponsored by the Club of Rome back in 1972. You have surely heard about it: here is the cover of that book.
 
 
Now, you have probably also heard that this study was "wrong," that is, that it had made wrong predictions, that it was based on bad data and flawed models, and similar accusations. That was the result of a wave of criticism, a true tsunami I'd say, that engulfed the book and its authors after the study was published. The authors were accused of being not just wrong, but part of a global conspiracy aimed at enslaving humankind and exterminating the colored races (I am not kidding, that was said several times).

However, if there was such a harsh reaction to the book, it was also because it went to the core of some of the basic assumptions of our society, of our deeply held belief that, somehow, not only is growth always good, but that we can keep growing forever. But the book said that it wasn't possible. And it didn't say just that; it said that the limits to growth were to appear in a time span that was not of centuries, but just of decades. Below, you can see the main results of the 1972 study, the run that was called the "base case" (or "standard run"). The calculations were redone in 2004, finding similar results.

Thursday, October 10, 2013

roman christianity a psy-op?


covertmessiah | The origin of the Christian religion has been a subject steeped in mystery for nearly 2000 years. Who was Jesus? Is he an historical character? Who wrote the Gospels? Why are they written in Greek? Why did they have a pro-Roman and anti-semitic perspective? Why was the religion headquartered in Rome? Caesar’s Messiah: The Roman Conspiracy to Invent Jesus is a documentary based on the best‐selling religious studies book by Joseph Atwill. Atwill is one of a number of scholars today from all around the world, who are questioning the historic facts behind these mysterious origins of Christianity. When examining the actual history of this era, many of the answers provided by the Church do not hold up to rigorous scrutiny. No doubt, Christianity has done a lot of good for the world, but a lot of bad has come from its most dogmatic believers, who create wars, hatred, and other harm under the disguise of religion. In studying how Christianity emerged, the seven controversial Bible scholars featured in this film agree that it was used as a political tool to control the masses of the day, and is still being used this way today. For example, support for the wars in the Middle East is preached to Evangelical Christians as a way to speed up the coming of the End of Days. Maybe we need to expand the possible answers about how Christianity originated, and deeper questions need to be asked. Maybe we need to examine what political motives were behind the formation of the Christian religion?

The documentary begins with a brief history of the political and religious climate of Judea in the first century CE – the era during which Christianity emerged. Judea was occupied by the Roman Empire, which required the Jews to worship Caesar as a god. The Jews found this blasphemous, and they waged constant rebellions against the Empire. Their religious scriptures prophesied that a militaristic warrior Messiah would defeat the Romans and lead the Jews to liberation. A string of Messiahs presented themselves to lead the people in war against Rome, only to be defeated and crucified – a customary Roman punishment for insurgents of the day. However, the Roman government was growing weaker from over a century of increasingly corrupt rule by the Julio-Claudian dynasty — the last emperor of this lineage being Nero, who was bankrupting the Empire with his self-indulgence. In their greatest victory, the messianic Jews finally succeeded in burning Rome and driving the Romans out of Judea. This caused Nero to call upon his best military men, the Flavians – Vespasian and his son Titus — to crush the rebellion for good. The Flavians succeeded not only in destroying the Jewish towns of Galilee and their temple in Jerusalem, but after Nero was deposed and committed suicide, they seized the throne through a military coup and took over rule of the Roman Empire itself. Under the Flavians, the Empire flourished, and many great monuments were built, including the famous Coliseum. In order to pacify the Jewish rebellion, they captured and burned all the Jews’ scriptures. It is around this time that a new literature emerged with the story of a very different Jewish Messiah – one who preached “give to Caesar what is Caesar’s”, “turn the other cheek”, and “love your enemy”.

Kenneth Humphreys on the historicity of Jesus: "[It's] a dilemma for those who believe in him. Because on the one hand he supposedly overturned the world, it turned the world upside-down and triggered off this massive movement, but on the other hand he leaves no trace in historical record."
The second half of the documentary focuses on the documents the Flavians left behind which prove their authorship of the Gospels. The Bible scholars deconstruct the Gospels and the character Jesus, showing that they are based on archetypes found in the ancient pagan mystery schools and in earlier Jewish literature. Much of the teachings of Christianity are traced back to the writings of Philo of Alexandria — who was combining Jewish scripture with Greek pagan beliefs — and Stoicism, a philosophy promoted by the Flavians. When the Flavians seized control of the Roman Empire, they needed to legitimise their rule, so they had their Jewish court historian Josephus (originally Yosef ben Matityahu who adopted the name Titus Flavius Josephus) create a large body of work which became the only official history we have of the Jewish-Roman War. Fist tap Dale.

how new religions are made

religiondispatches | What inspired you to write Chosen People? What sparked your interest?
When I was in college I was interested in the similarities between Jewish and Black nationalisms, and began to learn Jewish and African American histories at Stanford University with Clayborne Carson, George Fredrickson, Sylvia Wynter, Mark Mancall, Arnie Eisen, and Tudor Parfitt.

A chance encounter led me to visit the Original Hebrew Israelites of Dimona, Israel, and the experience was so powerful that I set out to study the antecedents of Black Israelite movements. At that time, Shlomo Levy, a Columbia University graduate student who was himself the son of one of the leading figures of the New York Israelite community, had begun to work with the Schomburg Center of the New York Public Library to collect papers from a dozen or so Black Israelite synagogues. I wrote an honors thesis on a small part of that collection, and then returned in graduate school to use the rest.

Working towards my doctorate at UCLA I was fortunate enough to study Black Atlantic religions with Donald Cosentino, and African American and West Indian histories with Brenda Stevenson, Gary Nash, and Bobby Hill. I was also inspired by seminars I took with Carlo Ginzburg, Peg Jacob, Lynn Hunt, Henry Yu, and others. I wanted to thickly describe African American Judaism from microhistorical, Black Atlantic, and African American Studies perspectives.

The question of "authenticity" that had dominated the accounts of so many white Jews was of little interest to me. What had gone missing in the limited literature on the topics was an attempt to tell the story of Black Israelites as an instance of African American history (in the hemispheric sense, including the West Indies), and an attempt to write Black Israelites into the larger stories of American religion and of Black Atlantic religions.

You describe a variety of fascinating (and largely unknown) figures in American religious history. Which one of them fascinated you the most?
Prophet William Saunders Crowdy stars in two chapters and is a largely unknown but remarkable figure who deserves to be on a postage stamp for the impact he had on U.S. religion and culture. But without a doubt, I was most fascinated by Rabbi Wentworth Arthur Matthew. That is because I had access to sources at the Schomburg and in newspapers over half a century that allowed me to clearly see Rabbi Matthew's religious evolution, and his polycultural bricolage of his own Israelite tradition combining Holiness-based Israelite churches, Judaism, conjuring, West Indian festivals, Central European occult practices, and freemasonry. Although Matthew tried his best to hide this religious bricolage, his papers offer a rare opportunity to see how new religions are made.

Is there anything you had to leave out?
Tons. I came to see Black Israelites as being very closely related to Black Muslims. Not only was there overlap between the groups' memberships, but it was not uncommon for groups to blend elements of both Judaism and Islam in the 1920s, as in the 1970s.

I think African American adoption of both religions are variants of Black thought about "the East," and deserve to be thought of as Black forms of Orientalism—not in a pejorative sense, but in an affirmative and romantic sense. So at one point the book was at least twice as long, before I decided that the Islam/Orientalism piece needed to be a book of its own.

Even then, the Black Israelites book continued for five more chapters concerning interactions and race relations between white and Black Jews during the Civil Rights and Black Power eras. Thankfully, Oxford University Press' readers reined me in, and I was left with the much more compact, and much more readable text as it stands today, which focuses on the period from the nineteenth century to the 1930s.

about those europeans who invaded and currently occupy palestine...,

physorg | Professor Martin Richards, of the Archaeogenetics Research Group at the University of Huddersfield, has published a paper uncovering new information about how Ashkenazi Jewish men moved into Europe from the Middle East, and their marriage practices with European women. 

The origins of Ashkenazi Jews – that is, Jews with recent ancestry in central and Eastern Europe – are a long-standing controversy. It is usually assumed that their ancestors migrated into Europe from Palestine in the first century AD, after the destruction of the Second Temple by the Romans, with some intermarriage with Europeans later on. But some have argued that they have a mainly European ancestry, and arose by conversion to Judaism of indigenous Europeans, especially in Italy. Others have even argued that they were largely assimilated in the North Caucasus during the time of the Khazar Empire, whose rulers turned to Judaism around the tenth century AD.

Archaeogenetics can help to resolve this dispute. Y-chromosome studies have shown that the male line of descent does indeed seem to trace back to the Middle East. But the female line, which can be illuminated by studies of mitochondrial DNA, has until now proved more difficult to interpret. This would be especially intriguing because Judaism has been inherited maternally for about 2000 years.

We have settled this issue by looking at large numbers of whole mitochondrial genomes – sequencing the full 16,568 bases of the molecule – in many people from across Europe, the Caucasus and the Middle East. We have found that, in the vast majority of cases, Ashkenazi lineages are most closely related to southern and western European lineages – and that these lineages have been present in Europe for many thousands of years.

This means that, even though Jewish men may indeed have migrated into Europe from Palestine around 2000 years ago, they brought few or no wives with them. They seem to have married with European women, firstly along the Mediterranean, especially in Italy, and later (but probably to a lesser extent) in western and central Europe. This suggests that, in the early years of the Diaspora, Judaism took in many converts from amongst the European population, but they were mainly recruited from amongst women. Thus, on the female line of descent, the Ashkenazim primarily trace their ancestry neither to Palestine nor to Khazaria, but to southern and western Europe.

More information: You can read more about the work of the Archeogenetics Research group at: www.hud.ac.uk/research/researchcentres/targ/

Wednesday, October 09, 2013

Brain Research through Advancing Innovative Neurotechnologies (BRAIN)


TechnologyReview | On a beautiful April morning, chemist Paul Weiss is darting across the campus of the University of California, Los Angeles, in red-framed Wayfarer sunglasses and a suit. He’s on his way to make himself an espresso, but even with a caffeine deficit he’s tough to keep up with. Weiss put the coffee machine in his students’ office instead of his own, to create more opportunities to check in with them and run into colleagues.

Weiss, who heads the California Nanosystems Institute, a state-sponsored research hub for all things nano, is a specialist in developing new ways of probing single molecules like neurotransmitters and those that make up the active layer in solar panels. However, with caffeine in his system, what he wants to talk about is not chemistry but community. For Weiss, 53, chemistry is a social science. "It’s about making a connection," he says. To be able to do something useful, he says, you have to connect to other people within and outside your field, know what problems other fields like neuroscience or energy will be facing in 10 years, and start building the necessary tools today.

As far as he’s concerned, one of the most important goals for the next decade is to understand the human brain. To meet that challenge, biologists need help from chemists, physicists, engineers, and other toolmakers like him, he says. The brain has nearly 100 billion neurons networked together by an estimated 100 trillion electrical and chemical connections. How all these interactions combine to enable us to walk, talk, learn, form memories, create—and how things go wrong in diseases like Parkinson’s—is pretty much a mystery. Weiss hopes to create new tools for probing the nanoscale chemical and electrical activity of thousands to millions of neurons at once. "If we want to understand what a memory is, how we learn—this is where we think the sweet spot is," he says.

For years, Weiss has been recruiting researchers from apparently distant fields to work on the problem—helping organize meetings of scientists to talk about it, trying to bridge the gap between neuroscientists and physical scientists. This organizing work has now borne fruit. In April, President Obama requested $100 million in federal funding for the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. Private research institutions are also chipping in. The Kavli Foundation, a nonprofit in Oxnard, California, has pledged $40 million over the next 10 years. "His profound understanding of the whole field of nanotechnology made a huge difference," says the Kavli Foundation’s vice president of science programs, Miyoung Chun, who helped coördinate the project that became the BRAIN Initiative.

Researchers working in relative isolation have already made progress on developing tools for studying the brain, including arrays of nanoscale electrodes for probing neurons and computer programs for analyzing the onslaught of data these kinds of measurements are expected to generate. By working together, Weiss believes, researchers from different fields can now accelerate advances by developing common, widely available tools.

behavioral genetics is pseudo-science

ScientificAmerican | Last spring, I kicked up a kerfuffle by proposing that research on race and intelligence, given its potential for exacerbating discrimination, should be banned. Now Nature has expanded this debate with “Taboo Genetics.” The article “looks at four controversial areas of behavioral genetics”—intelligence, race, violence and sexuality—”to find out why each field has been a flashpoint, and whether there are sound scientific reasons for pursuing such studies.”

The essay provides a solid overview, including input from both defenders of behavioral genetics and critics. The author, Erika Check Hayden, quotes me saying that research on race and intelligence too often bolsters “racist ideas about the inferiority of certain groups, which plays into racist policies.”

I only wish that Hayden had repeated my broader complaint against behavioral genetics, which attempts to explain human behavior in genetic terms. The field, which I’ve been following since the late 1980s, has a horrendous track record. My concerns about the potential for abuse of behavioral genetics are directly related to its history of widely publicized, erroneous claims.

I like to call behavioral genetics “gene whiz science,” because “advances” so often conform to the same pattern. Researchers, or gene-whizzers, announce: There’s a gene that makes you gay! That makes you super-smart! That makes you believe in God! That makes you vote for Barney Frank! The media and the public collectively exclaim, “Gee whiz!”

Follow-up studies that fail to corroborate the initial claim receive little or no attention, leaving the public with the mistaken impression that the initial report was accurate—and, more broadly, that genes determine who we are.

Over the past 25 years or so, gene-whizzers have discovered “genes for” high IQ, gambling, attention-deficit disorder, obsessive-compulsive disorder, bipolar disorder, schizophrenia, autism, dyslexia, alcoholism, heroin addiction, extroversion, introversion, anxiety, anorexia nervosa, seasonal affective disorder, violent aggression—and so on. So far, not one of these claims has been consistently confirmed by follow-up studies.

These failures should not be surprising, because all these complex traits and disorders are almost certainly caused by many different genes interacting with many different environmental factors. Moreover, the methodology of behavioral geneticists is highly susceptible to false positives. Researchers select a group of people who share a trait and then start searching for a gene that occurs not universally and exclusively but simply more often in this group than in a control group. If you look at enough genes, you will almost inevitably find one that meets these criteria simply through chance. Those who insist that these random correlations are significant have succumbed to the Texas Sharpshooter Fallacy.
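To make the multiple-comparisons point concrete, here is a minimal simulation sketch (my illustration, not Horgan's; the gene count, sample sizes and variant frequency are arbitrary assumptions): even when no gene has any real association with the trait, scanning thousands of candidates at the conventional p < 0.05 threshold still produces hundreds of "significant" hits by chance alone.

```python
# Illustrative only: simulated "gene hunting" with zero real effects.
import math
import random

random.seed(42)

N_GENES = 10_000    # hypothetical number of candidate genes scanned
N_PER_GROUP = 100   # people in the trait group and in the control group
BASE_RATE = 0.30    # each variant occurs in ~30% of people, trait or not
ALPHA = 0.05        # conventional significance threshold

def carriers(n: int) -> int:
    """How many of n people carry the variant (purely random, no link to the trait)."""
    return sum(random.random() < BASE_RATE for _ in range(n))

def two_proportion_p_value(x1: int, x2: int, n: int) -> float:
    """Two-sided p-value from a pooled two-proportion z-test."""
    p1, p2 = x1 / n, x2 / n
    pooled = (x1 + x2) / (2 * n)
    se = math.sqrt(pooled * (1 - pooled) * 2 / n)
    if se == 0:
        return 1.0
    z = abs(p1 - p2) / se
    return 1 - math.erf(z / math.sqrt(2))  # equals 2 * (1 - Phi(z))

false_positives = sum(
    two_proportion_p_value(carriers(N_PER_GROUP), carriers(N_PER_GROUP), N_PER_GROUP) < ALPHA
    for _ in range(N_GENES)
)
print(f"'Significant' genes found despite zero real effects: {false_positives} of {N_GENES}")
# Expect roughly ALPHA * N_GENES, i.e. about 500 spurious hits.
```

Any one of those spurious hits, reported without a replication study, reads exactly like the "gene for" findings listed above.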

To get a sense of just how shoddy behavioral genetics is, check out my posts on the “liberal gene,” “gay gene” and “God gene” (the latter two “discovered” by Dean Hamer, whose record as a gene-whizzer is especially abysmal); and on the MAOA-L gene, also known as the “warrior gene.” Also see this post, where I challenge defenders of behavioral genetics to cite a single example of a solid, replicated finding.

Ever since I first hammered behavioral genetics in my 1993 Scientific American article “Eugenics Revisited,” critics have faulted me for treating the field so harshly. But over the last 20 years, the field has performed even more poorly than I expected. At this point, I don’t know why anyone takes gene-whiz science seriously.

taboo "genetics"


Nature | Growing up in the college town of Ames, Iowa, during the 1970s, Stephen Hsu was surrounded by the precocious sons and daughters of professors. Around 2010, after years of work as a theoretical physicist at the University of Oregon in Eugene, Hsu thought that DNA-sequencing technology might finally have advanced enough to help to explain what made those kids so smart. He was hardly the first to consider the genetics of intelligence, but with the help of the Chinese sequencing powerhouse BGI in Shenzhen, he planned one of the largest studies of its kind, aiming to sequence DNA from 2,000 people, most of whom had IQs of more than 150.

He hadn't really considered how negative the public reaction might be until one of the study's participants, New York University psychologist Geoffrey Miller, made some inflammatory remarks to the press. Miller predicted that once the project turned up intelligence genes, the Chinese might begin testing embryos to find the most desirable ones. One article painted the venture as a state-endorsed experiment, selecting for genius kids, and Hsu and his colleagues soon found that their project, which had barely begun, was the target of fierce criticism.

There were scientific qualms over the value of Hsu's work (see Nature 497, 297–299; 2013). As with other controversial fields of behavioural genetics, the influence of heredity on intelligence probably acts through myriad genes that each exert only a tiny effect, and these are difficult to find in small studies. But that was only part of the reason for the outrage. For decades, scientists have trodden carefully in certain areas of genetic study for social or political reasons.
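A rough back-of-the-envelope calculation shows why small studies struggle here (the effect size, significance threshold, and power target below are illustrative assumptions, not numbers from the article). If a single variant explains only a few hundredths of a percent of the variance in a trait, the standard normal approximation for detecting a correlation of that size implies samples in the hundreds of thousands:

# Approximate sample size needed to detect a variant with a tiny effect,
# using the usual normal approximation for testing a small correlation.
from scipy.stats import norm

def n_required(var_explained, alpha=5e-8, power=0.80):
    # Correlation implied by the fraction of trait variance explained.
    r = var_explained ** 0.5
    z_alpha = norm.isf(alpha / 2)          # critical value, two-sided test
    z_beta = norm.isf(1 - power)           # quantile for the desired power
    return ((z_alpha + z_beta) / r) ** 2   # approximation valid for small r

# A variant explaining 0.02% of variance, at genome-wide significance (5e-8):
print(round(n_required(0.0002)))                 # roughly 200,000 people
# The same variant at a casual p < 0.05 cutoff still needs tens of thousands:
print(round(n_required(0.0002, alpha=0.05)))

Against numbers like these, a study of 2,000 people, however carefully selected, is simply underpowered for effects of that size, which is the scientific (rather than political) core of the qualms mentioned above.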

At the root of this caution is the widespread but antiquated idea that genetics is destiny — that someone's genes can accurately predict complex behaviours and traits regardless of their environment. The public and many scientists have continued to misinterpret modern findings on the basis of this — fearing that the work will lead to a new age of eugenics, preemptive imprisonment and discrimination against already marginalized groups.

“People can take science and assume it is far more determinative than it is — and, by making that assumption, make choices that we will come to regret as a society,” says Nita Farahany, a philosopher and lawyer at Duke University School of Law in Durham, North Carolina.

But trying to forestall such poor choices by drawing red lines around certain areas subverts science, says Christopher Chabris of Union College in Schenectady, New York. Funding for research in some areas dries up and researchers are dissuaded from entering promising fields. “Any time there's a taboo or norm against studying something for anything other than good scientific reasons, it distorts researchers' priorities and can harm the understanding of related topics,” he says. “It's not just that we've ripped this page out of the book of science; it causes mistakes and distortions to appear in other areas as well.”

Here, Nature looks at four controversial areas of behavioural genetics to find out why each field has been a flashpoint, and whether there are sound scientific reasons for pursuing such studies.

Tuesday, October 08, 2013

narcissism of minor differences: profound behavioral reinforcement for status-seeking?

NYTimes | Turning a blind eye. Giving someone the cold shoulder. Looking down on people. Seeing right through them.

These metaphors for condescending or dismissive behavior are more than just descriptive. They suggest, to a surprisingly accurate extent, the social distance between those with greater power and those with less — a distance that goes beyond the realm of interpersonal interactions and may exacerbate the soaring inequality in the United States.

A growing body of recent research shows that people with the most social power pay scant attention to those with little such power. This tuning out has been observed, for instance, with strangers in a mere five-minute get-acquainted session, where the more powerful person shows fewer signals of paying attention, like nodding or laughing. Higher-status people are also more likely to express disregard, through facial expressions, and are more likely to take over the conversation and interrupt or look past the other speaker.

Bringing the micropolitics of interpersonal attention to the understanding of social power, researchers are suggesting, has implications for public policy.

Of course, in any society, social power is relative; any of us may be higher or lower in a given interaction, and the research shows the effect still prevails. Though the more powerful pay less attention to us than we do to them, in other situations we are relatively higher on the totem pole of status — and we, too, tend to pay less attention to those a rung or two down.

A prerequisite to empathy is simply paying attention to the person in pain. In 2008, social psychologists from the University of Amsterdam and the University of California, Berkeley, studied pairs of strangers telling one another about difficulties they had been through, like a divorce or the death of a loved one. The researchers found that the power differential expressed itself in a playing down of suffering: the more powerful were less compassionate toward the hardships described by the less powerful.

Dacher Keltner, a professor of psychology at Berkeley, and Michael W. Kraus, an assistant professor of psychology at the University of Illinois, Urbana-Champaign, have done much of the research on social power and the attention deficit.

Mr. Keltner suggests that, in general, we focus the most on those we value most. While the wealthy can hire help, those with few material assets are more likely to value their social assets: like the neighbor who will keep an eye on your child from the time she gets home from school until the time you get home from work. The financial difference ends up creating a behavioral difference. Poor people are better attuned to interpersonal relations — with those of the same strata, and the more powerful — than the rich are, because they have to be.

While Mr. Keltner’s research finds that the poor, compared with the wealthy, have keenly attuned interpersonal attention in all directions, in general, those with the most power in society seem to pay particularly little attention to those with the least power. To be sure, high-status people do attend to those of equal rank — but not as well as those of lower status do.

This has profound implications for societal behavior and government policy. Tuning in to the needs and feelings of another person is a prerequisite to empathy, which in turn can lead to understanding, concern and, if the circumstances are right, compassionate action.

In politics, readily dismissing inconvenient people can easily extend to dismissing inconvenient truths about them. The insistence by some House Republicans in Congress on cutting financing for food stamps and impeding the implementation of Obamacare, which would allow patients, including those with pre-existing health conditions, to obtain and pay for insurance coverage, may stem in part from the empathy gap. As political scientists have noted, redistricting and gerrymandering have led to the creation of more and more safe districts, in which elected officials don’t even have to encounter many voters from the rival party, much less empathize with them.

shake and bake baby!!!


HuffPo | The multimillion-dollar superlab of "Breaking Bad" may be gone, but thousands of meth labs around the country remain. Midwestern states tend to see the most incidents involving meth labs, and Missouri outranks all others with 1,825 busts and seizures in 2012, according to a Government Accountability Office analysis of Drug Enforcement Administration data.

Moreover, an increasingly popular crude cooking method known as "shake and bake" has put meth production in addicts' hands, eliminating the need for an RV or even chemistry know-how.

It takes about 15 minutes to "shake and bake" a batch of meth in a plastic bottle using ingredients you may already have lying around the house. Sometimes the bottle explodes, badly burning the often uninsured meth cook and anyone else in the line of fire.

Meth use cost the U.S. economy around $23.4 billion in 2005, according to a RAND Corporation study. While incidents involving meth labs have tapered somewhat in recent years, thanks to the rise of "shake and bake," hospitals have noticed an uptick in meth burn cases. It costs around $230,000 to treat a meth lab burn victim, Mother Jones reported. The most common age of these victims: under 4 years old.

Oregon and Mississippi have figured out how to curb these accidents by making the key meth ingredient, pseudoephedrine, prescription-only. Other states keep the common cold medicine behind the counter under a 2006 federal law, but when Oregon and Mississippi implemented prescription legislation, meth lab incidents immediately plummeted. Dozens of other states have tried to follow their lead, but the pharmaceutical industry isn't having it.

Sen. Ron Wyden (D-Ore.) wanted to make Oregon's success story a national reality, announcing legislation in 2010 for federal prescription regulation of pseudoephedrine. But according to Mother Jones, he never introduced the bill in Congress, in part because of "heavy industry spending." Fist tap Dale.
