Saturday, March 24, 2012

california cities scrambling to avert insolvency...,

cnbc | When the city of Stockton, Calif., announced last month it would skip some bond payments and enter talks with its creditors, the municipal debt world shuddered. If Stockton were to file for bankruptcy protection, it would be the largest U.S. city ever to do so. Other troubled Californian municipalities might be tempted to follow suit. Predictions of mass defaults on municipal bonds might start to look a little more realistic.
Lincoln, another California city squeezed by the downturn, saw its revenue shrink when the housing market went bust. City leaders responded by tapping reserves and slashing spending on public safety, which accounts for most of the budget. The city reduced its police force to 20 officers from 40 and closed two of its three firehouses. Increased contributions by city employees to their retirement accounts also helped.

Bankruptcy isn't an option, said Mayor Spencer Short, adding that more cuts are coming: "I'm looking at the possibility of introducing a budget that's beyond bare bones."

Antioch, a city of 100,000 on the eastern fringe of the San Francisco Bay area, is another municipality clobbered by the housing slump. Unlike in Vallejo and Stockton, though, relations between Antioch's city leaders and labor unions were not contentious, allowing quick action to cut costs when the scale of its financial troubles became clear, City Manager Jim Jakel said.

Jakel ticked off Antioch's moves to keep its books balanced: a hiring freeze; furloughs; employees waiving pay increases; and a city workforce reduced to 245 from 401 through attrition and layoffs.

Costa Mesa, a city of 110,000 south of Los Angeles, has slashed its payroll from 611 to 450. It is selling its police helicopters and has hired a neighboring city to provide air patrols. It is also pursuing a controversial effort to convert from a general law city to a charter city, which would give City Hall more power to outsource work, said Councilman Jim Righeimer.

Friday, March 23, 2012

the white savior industrial complex

TheAtlantic | A week and a half ago, I watched the Kony2012 video. Afterward, I wrote a brief seven-part response, which I posted in sequence on my Twitter account.
These tweets were retweeted, forwarded, and widely shared by readers. They migrated beyond Twitter to blogs, Tumblr, Facebook, and other sites; I'm told they generated fierce arguments. As the days went by, the tweets were reproduced in their entirety on the websites of the Atlantic and the New York Times, and they showed up on German, Spanish, and Portuguese sites. A friend emailed to tell me that the fourth tweet, which cheekily name-checks Oprah, was mentioned on Fox television.

These sentences of mine, written without much premeditation, had touched a nerve. I heard back from many people who were grateful to have read them. I heard back from many others who were disappointed or furious. Many people, too many to count, called me a racist. One person likened me to the Mau Mau. The Atlantic writer who'd reproduced them, while agreeing with my broader points, described the language in which they were expressed as "resentment."

This weekend, I listened to a radio interview given by the Pulitzer Prize-winning journalist Nicholas Kristof. Kristof is best known for his regular column in the New York Times in which he often gives accounts of his activism or that of other Westerners. When I saw the Kony 2012 video, I found it tonally similar to Kristof's approach, and that was why I mentioned him in the first of my seven tweets. Fist tap Dale.

lessons from the 1970s..,

aljazeera | General accounts of radicalism tend to concentrate on the 1960s. They also place a particular emphasis on those elements that later proved useful to marketers. We constantly hear about the students' revolt against the staid morals of earlier generations - the music and film industries have retold that story ever since. But we hear less about the joint efforts of American students and civil rights campaigners to create an alternative to the Republican-Democrat duarchy. Apple was happy to use clips of Martin Luther King in its adverts, but the mainstream has little time for his opposition to the war in Vietnam and his demands for economic and racial justice.

The high watermark for popular engagement in politics comes some time later, in that unloved decade, the 1970s. In Britain a series of strikes against Ted Heath's inept administration eventually forced an election in 1974, which the government lost. In the same year Nixon resigned the presidency as the Watergate scandal metastasised. In 1976, the bicentenary of the Declaration of Independence, the Americans elected a President promising change and democratic renewal. The popular movements that first emerged in the 1960s seemed on the brink of transforming both America and Britain.

That's certainly how it seemed to those who ran politics and the economy. Scandinavian social democracy haunted the waking nightmares of American businessmen. One of them worried that "if we don't take action now we will see our own demise". Another complained in a revealing metaphor that "we are like the head of a household, and the public sector is like our wife and child. They can only consume what we produce".

Democracy crisis
The Trilateral Commission is an obscure talking shop whose current members include the (unelected) Prime Ministers of Italy and Greece. In 1975 it was sufficiently worried to put together a book entitled The Crisis of Democracy.

One of its authors, Samuel Huntington, identified the core of this crisis, from the perspective of the governing elite. In the 1960s and 1970s, "previously passive or unorganised groups in the population… embarked on concerted efforts to establish their claims to opportunities, positions, rewards and privileges, which they had not considered themselves entitled to before". Blacks, women, and "clerical, technical and professional employees in public and private bureaucracies" - that is, the vast majority of the population - were seeking the kinds of power that the wealthy and their trusted servants considered theirs by right. Crucially, they were seeking control of the state through the electoral process.

Democracy could only work, explained Huntington, if there was "some measure of apathy and non-involvement on the part of some individuals and groups". The so-called crisis of democracy would only end when the majority gave up their ambitions to secure an effectual say in politics and hence economics. It was time for the powerful to assert "the claims of expertise, experience and special talents" over and above the claims of democracy. Fist tap Arnach.

monopolizing the first draft of history

medialens | Journalists are supposed to tell the truth without fear or favour. In reality, as even the editor of the Independent acknowledges, MPs and reporters are ‘a giant club’.

Together, politics and media combine to provide an astonishingly consistent form of reality management controlling public perception of conflicts in places like Afghanistan, Iraq, Libya and Syria. Alastair Crooke, founder and director of Conflicts Forum, notes how the public is force-fed a ‘simplistic victims-and-aggressor meme, which demands only the toppling of the aggressor’.

The bias is spectacular, outrageous, but universal, and so appears simply to mirror reality. Ahmad Barqawi, a Jordanian freelance columnist and writer based in Amman, said it well:

‘I remember during the “Libyan Revolution”, the tally of casualties resulting from Gaddafi’s crackdown on protesters was being reported by the mainstream media with such a “dramatic” fervor that it hardly left the public with a moment to at least second-guess the ensuing avalanche of unverifiable information and erratic inflow of “eye witnesses’ accounts”.

‘Yet the minute NATO forces militarily intervened and started bombing the country into smithereens, the ceremonial practice of body count on our TV screens suddenly stopped; instead, reporting of Libyan casualties (of whom there were thousands thanks only to the now infamous UNSC resolution 1973) turned into a seemingly endless cycle of technical, daily updates of areas captured by NATO-backed “rebel forces”, then lost back to Gaddafi’s military, and again recaptured by the rebels in their creeping territorial advances towards Tripoli…

‘How is it that the media’s concern for human rights did not extend to the victims of NATO bombing campaigns in the Libyan cities of Tripoli and Sirte? How come the international community’s drive to protect the lives of Libyan civilians in Benghazi lost steam the minute NATO stepped in and actually increased the number of casualties ten-fold?’

It is a remarkable phenomenon – global media attention flitting instantaneously, like a flock of starlings, from one focus desired by state power to another focus also desired by state power.

But the bias goes far beyond even this example. The media’s basic stance in reporting events in Libya and Syria has been one of intense moral outrage. The level of political-media condemnation is such that media consumers are often persuaded to view rational, informed dissent as apologetics for mass murder. Crooke writes:

‘Those with the temerity to get in the way of “this narrative” by arguing that external intervention would be disastrous, are roundly condemned as complicit in President Assad's crimes against humanity.’ They are confronted by the ‘unanswerable riposte of dead babies - literally’.

Thursday, March 22, 2012

sure sign that mexico is officially considered a failed state...,



NNSA | Last night, the Rachel Maddow Show on MSNBC broke the news of NNSA’s latest achievement – removing all remaining weapons-usable material from Mexico. Through a trilateral agreement, the US, Mexico and Canada worked to convert Mexican research reactors to use low enriched uranium, removed all remaining spent and fresh HEU and provided Mexico with LEU to continue reactor operations. This achievement is a key deliverable from the 2010 Nuclear Security Summit in Washington, D.C., and a crucial step in achieving the President’s nuclear security agenda to “secure vulnerable nuclear material worldwide” within four years.

well..., no fukushima daiichi in mexico city...,



csmonitor | I have lived in Mexico City for six years and never worried much about earthquakes. But now I have a baby. And as all parents will understand, earthquakes have now joined the list of things, like airplane turbulence and speeding taxis, that I care desperately about.

So when the unusually long and strong earthquake shook this city just after noon local time, as I was typing away at a local Starbucks where I often work, I slammed my laptop shut and ran home as fast as I could (losing a power cord and mouse along the way).

The streets were packed with people who had evacuated, looking up at the high-rises around us, wondering whether there was damage and whether the buildings would hold. As I looked up and ran, I kept thinking not about what lay in my own path, but that if the buildings around me were standing firm, mine probably was too.

Everyone was fine at home, my sweet baby outside with her caretaker and the rest of our neighbors. But it was the biggest earthquake I have felt since living here. It measured 7.4 according to the US Geological Survey, which initially put it at 7.9, and its center was in Guerrero state. On Twitter, President Felipe Calderon said there appeared to be no serious damage. "The health system is operating normally, except for some broken glass and other minor damage," he wrote.

The quake shook central and southern Mexico, with damage including a fallen bridge and swaying office towers in Mexico City. Some 60 homes are reported damaged near the epicenter of the quake, and there are currently no reported deaths, according to the Associated Press.

I am writing this “reporters on the job” from outside my house right now on the sidewalk. For the first time in about a half dozen temblors that have prompted us to evacuate the house, we do have damage. Our walls are cracked, as it appears that our apartment and that of our neighbor slammed into one another. (Their windows are blown out.) Right now we are waiting for the authorities to assess whether there is structural damage.

It's certainly not dramatic, but we do have shards of glass across our entrance and plaster across the front hallway. As I sit here, I can see the facing of our front wall blown off, and it immediately brings me back to the horrors of Haiti's earthquake in January 2010, my first and only firsthand experience with a disaster that devastating.

Wednesday, March 21, 2012

autonomy, mastery, purpose...,



Fist tap Dale.

good thing he ain't have no skittles and iced tea...,





abcnews | Russell’s 30-minute documentary has racked up more than 83 million views on YouTube in two weeks. Supporters signed an online pledge through Invisible Children, a charity co-founded by Russell, to bring Kony to justice. But critics say the film oversimplifies a complex problem and promotes “slacktivism,” action that does little to fix the problem.

“He got a lot of criticism for this, and he may not be as internally resilient as someone needs to be to weather a storm of criticism like that,” said Alan Hilfer, chief psychologist at Maimonides Medical Center in New York City. “It might have been such a tremendous injury to his ego that he just sort of fell apart.”

Hilfer has not seen Russell, nor does he know his medical history. But he said worry and self-doubt can lead to sleep problems and poor self-care, which can trigger irrational behavior such as Russell's.

“Sometimes we see this in response to people not taking good care of themselves when they’re under a great deal of stress and pressure,” Hilfer said. “They become overwhelmed and anxious and it interferes with their ability to sleep. Without treatment, it can cause disorientation and mental confusion.”

Hilfer said severe dehydration can also cause strange behavior.

“Dehydration can absolutely cause all the signs of mental confusion he seemed to be experiencing,” he said, adding that severe dehydration would most likely be brought on by illness, but could also result from poor self-care.

A bad reaction to a medication, possibly a prescription sleep aid, might also be to blame, Hilfer said.

very satisfying to watch..,


Tuesday, March 20, 2012

meanwhile, cognitive elites further augment their electronic medical records

TechnologyReview | Back in 2000, when Larry Smarr left his job as head of a celebrated supercomputer center in Illinois to start a new institute at the University of California, San Diego, and the University of California, Irvine, he rarely paid attention to his bathroom scale. He regularly drank Coke, added sugar to his coffee, and enjoyed Big Mac Combo Meals with his kids at McDonald's. Exercise consisted of an occasional hike or a ride on a stationary bike. "In Illinois they said, 'We know what's going to happen when you go out to California. You're going to start eating organic food and get a blonde trainer and get a hot tub,' " recalls Smarr, who laughed off the predictions. "Of course, I did all three."

Smarr, who directs the California Institute for Telecommunications and Information Technology in La Jolla, dropped from 205 to 184 pounds and is now a fit 63-year-old. But his transformation transcends his regular exercise program and carefully managed diet: he has become a poster man for the medical strategy of the future. Over the past decade, he has gathered as much data as he can about his body and then used that information to improve his health. And he has accomplished something that few people at the forefront of the "quantified self" movement have had the opportunity to do: he helped diagnose the emergence of a chronic disease in his body.

Like many "self-quanters," Smarr wears a Fitbit to count his every step, a Zeo to track his sleep patterns, and a Polar WearLink that lets him regulate his maximum heart rate during exercise. He paid 23andMe to analyze his DNA for disease susceptibility. He regularly uses a service provided by Your Future Health to have blood and stool samples analyzed for biochemicals that most interest him. But a critical skill separates Smarr from the growing pack of digitized patients who show up at the doctor's office with megabytes of their own biofluctuations: he has an extraordinary ability to fish signal from noise in complex data sets.

On top of his pioneering computer science work—he advocated for the adoption of ARPAnet, an early version of the Internet, and students at his University of Illinois center developed Mosaic, the first widely used browser—Smarr spent 25 years as an astrophysicist focused on relativity theory. That gave him the expertise to chart several of his biomarkers over time and then overlay the longitudinal graphs to monitor everything from the immune status of his gut and blood to the function of his heart and the thickness of his arteries. His meticulously collected and organized data helped doctors discover that he has Crohn's, an inflammatory bowel disease.
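
A minimal sketch of the kind of longitudinal overlay Smarr describes, with synthetic data standing in for his records (the marker names, numbers, and the 2-sigma threshold are illustrative assumptions, not anything from the article): put each biomarker on its own personal baseline by converting readings to z-scores, then flag sustained excursions that merit a closer look.

```python
# Sketch of overlaying longitudinal biomarker series and flagging anomalies.
# Data is synthetic; marker names and thresholds are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)

markers = {
    # baseline noise, plus a slow upward drift in an inflammation marker
    "CRP":     1.0 + 0.1 * rng.standard_normal(365) + 0.004 * days,
    "glucose": 90.0 + 5.0 * rng.standard_normal(365),
}

for name, series in markers.items():
    baseline = series[:90]                  # first 90 days = personal baseline
    z = (series - baseline.mean()) / baseline.std()
    flagged = days[np.abs(z) > 2.0]         # days beyond 2 SD of baseline
    first = f", first at day {flagged[0]}" if flagged.size else ""
    print(f"{name:8s}: {flagged.size:3d} flagged days{first}")
```

The drifting marker accumulates flags through the back half of the year while the stable one only trips the threshold at the chance rate; separating slow drift from noise in exactly this way is the skill the article credits Smarr with.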

a post antibiotic era means an end to modern medicine as you know it



care2 | What would the world look like if an injury from a minor infection could kill you? Where bacterial illnesses like strep had no treatment? Where the risk of infection made it too dangerous for simple, routine surgeries such as hip replacements? Where the risk of infection would be great enough to render chemotherapy useless?

According to Margaret Chan, the director general of the World Health Organization, this could soon be reality. At a meeting with infectious disease experts in Copenhagen, she stated simply that every antibiotic in the arsenal of modern medicine may soon become useless due to the rise of antibiotic-resistant diseases. The Independent quoted her explaining the ramifications:

“A post-antibiotic era means, in effect, an end to modern medicine as we know it. Things as common as strep throat or a child’s scratched knee could once again kill.”

She continued: “Antimicrobial resistance is on the rise in Europe, and elsewhere in the world. We are losing our first-line antimicrobials.

“Replacement treatments are more costly, more toxic, need much longer durations of treatment, and may require treatment in intensive care units.

“For patients infected with some drug-resistant pathogens, mortality has been shown to increase by around 50 per cent.

“Some sophisticated interventions, like hip replacements, organ transplants, cancer chemotherapy, and care of preterm infants, would become far more difficult or even too dangerous to undertake.”

Around the world, more and more pathogens are spreading which don’t respond to any known antibiotic drugs. In India, there has been a recent outbreak of drug-resistant TB. And in the US, the CDC warns that a new strain of gonorrhea is on the rise – and it is resistant to most forms of antibiotics. The agency warns that it’s only a matter of time before we start seeing outbreaks of untreatable STIs. (And the fact that sex education in the US rarely warns teens how to adequately protect themselves from STIs probably won’t help.)

So why is this happening? There are a couple of troubling reasons – the first that Chan points to is the heavy use of antibiotics in livestock. In the US, a full 80% of the country’s antibiotics go to farm animals, not human beings. And the FDA has done little to discourage this. The only solution here is to go vegetarian/vegan, or start paying more for organic (not “natural”) meat, eggs, and dairy that have never been exposed to antibiotics.

The other reason is just depressing: there’s no money to be made, apparently, in developing new antibiotics.

drug-resistant "white plague" lurks among rich and poor

Reuters | On New Year's Eve 2004, after months of losing weight and suffering fevers, night sweats and shortness of breath, student Anna Watterson was taken into hospital coughing up blood.

It was strange to be diagnosed with tuberculosis (TB) - an ancient disease associated with poverty - especially since Watterson was a well-off trainee lawyer living in the affluent British capital of London. Yet it was also a relief, she says, finally to know what had been making her ill for so long.

But when Watterson's infection refused to yield to the three-pronged antibiotic attack doctors prescribed to fight it, her relief turned to dread.

After six weeks of taking pills that had no effect, Watterson was told she had multi-drug resistant TB, or MDR-TB, and faced months in an isolation ward on a regimen of injected drugs that left her nauseous, bruised and unable to go out in the sun.

"My friends were really shocked," Watterson said. "Most of them had only heard of TB from reading Victorian novels."

Tuberculosis is often seen in the wealthy West as a disease of bygone eras - evoking impoverished 18th or 19th century women and children dying slowly of a disease then commonly known as "consumption" or the "white plague".

But rapidly rising rates of drug-resistant TB in some of the wealthiest cities in the world, as well as across Africa and Asia, are again making history.

London has been dubbed the "tuberculosis capital of Europe", and a startling recent study documenting new cases of so-called "totally drug resistant" TB in India suggests the modern-day tale of this disease could get a lot worse.

"We can't afford this genie to get out of the bag. Because once it has, I don't know how we'll control TB," said Ruth McNerney, an expert on tuberculosis at the London School of Hygiene and Tropical Medicine.

Monday, March 19, 2012

peacetime martial law?



beforeitsnews | This Executive Order was posted on the WhiteHouse.gov web site on Friday, March 16, 2012, under the name National Defense Resources Preparedness. In a nutshell, it's the blueprint for Peacetime Martial Law and it gives the president the power to take just about anything deemed necessary for "National Defense", whatever they decide that is. It's peacetime, because as the title of the order says, it's for "Preparedness". A copy of the entire order follows the end of this story.

Under this order, the heads of the cabinet-level departments of Agriculture, Energy, Health and Human Services, Transportation, Defense and Commerce can take food, livestock, fertilizer, farm equipment, all forms of energy, water resources, all forms of civil transportation (meaning any vehicles, boats, planes), and any other materials, including construction materials, from wherever they are available. This is probably why the government has been visiting farms with GPS devices, so they know exactly where to go when they turn this one on.

Specifically, the government is allowed to allocate materials, services, and facilities as deemed necessary or appropriate. They decide what necessary or appropriate means.

UPDATE: BIN reader Kent Welton writes: This allows for the giving away of USA assets and subsidies to private companies: "(b) provide for the modification or expansion of privately owned facilities, including the modification or improvement of production processes, when taking actions under sections 301, 302, or 303 of the Act, 50 U.S.C. App. 2091, 2092, 2093; and (c) sell or otherwise transfer equipment owned by the Federal Government and installed under section 303(e) of the Act, 50 U.S.C. App. 2093(e), to the owners of such plants, factories, or other industrial facilities."

What happens if the government decides it needs all these things to be prepared, even if there is no war? You likely won't be able to walk into a store to purchase virtually anything because it will all be requisitioned, "rationed" and controlled by the government. Construction materials, food like meat, butter and sugar, anything imported, parts, tires and fuel for vehicles, clothing, etc. will likely become unobtainable, or at least very scarce. How many things are even made here in the USA any more?

forget the money, follow the "sacredness"...,

NYTimes | Self-interest, political scientists have found, is a surprisingly weak predictor of people’s views on specific issues. Parents of children in public school are not more supportive of government aid to schools than other citizens. People without health insurance are not more likely to favor government-provided health insurance than are people who are fully insured.

Despite what you might have learned in Economics 101, people aren’t always selfish. In politics, they’re more often groupish. When people feel that a group they value — be it racial, religious, regional or ideological — is under attack, they rally to its defense, even at some cost to themselves. We evolved to be tribal, and politics is a competition among coalitions of tribes.

The key to understanding tribal behavior is not money, it’s sacredness. The great trick that humans developed at some point in the last few hundred thousand years is the ability to circle around a tree, rock, ancestor, flag, book or god, and then treat that thing as sacred. People who worship the same idol can trust one another, work as a team and prevail over less cohesive groups. So if you want to understand politics, and especially our divisive culture wars, you must follow the sacredness.

A good way to follow the sacredness is to listen to the stories that each tribe tells about itself and the larger nation. The Notre Dame sociologist Christian Smith once summarized the moral narrative told by the American left like this: “Once upon a time, the vast majority” of people suffered in societies that were “unjust, unhealthy, repressive and oppressive.” These societies were “reprehensible because of their deep-rooted inequality, exploitation and irrational traditionalism — all of which made life very unfair, unpleasant and short. But the noble human aspiration for autonomy, equality and prosperity struggled mightily against the forces of misery and oppression and eventually succeeded in establishing modern, liberal, democratic, capitalist, welfare societies.” Despite our progress, “there is much work to be done to dismantle the powerful vestiges of inequality, exploitation and repression.” This struggle, as Smith put it, “is the one mission truly worth dedicating one’s life to achieving.”

This is a heroic liberation narrative. For the American left, African-Americans, women and other victimized groups are the sacred objects at the center of the story. As liberals circle around these groups, they bond together and gain a sense of righteous common purpose.

Contrast that narrative with one that Ronald Reagan developed in the 1970s and ’80s for conservatism. The clinical psychologist Drew Westen summarized the Reagan narrative like this: “Once upon a time, America was a shining beacon. Then liberals came along and erected an enormous federal bureaucracy that handcuffed the invisible hand of the free market. They subverted our traditional American values and opposed God and faith at every step of the way.” For example, “instead of requiring that people work for a living, they siphoned money from hard-working Americans and gave it to Cadillac-driving drug addicts and welfare queens.” Instead of the “traditional American values of family, fidelity and personal responsibility, they preached promiscuity, premarital sex and the gay lifestyle” and instead of “projecting strength to those who would do evil around the world, they cut military budgets, disrespected our soldiers in uniform and burned our flag.” In response, “Americans decided to take their country back from those who sought to undermine it.”

This, too, is a heroic narrative, but it’s a heroism of defense. In this narrative it’s God and country that are sacred — hence the importance in conservative iconography of the Bible, the flag, the military and the founding fathers. But the subtext in this narrative is about moral order. For social conservatives, religion and the traditional family are so important in part because they foster self-control, create moral order and fend off chaos. (Think of Rick Santorum’s comment that birth control is bad because it’s “a license to do things in the sexual realm that is counter to how things are supposed to be.”) Liberals are the devil in this narrative because they want to destroy or subvert all sources of moral order.

Actually, there’s a second subtext in the Reagan narrative in which liberty is the sacred object. Circling around liberty would seem, on its face, to be more consistent with liberalism and its many liberation movements than with social conservatism. But here’s where narrative analysis really helps. Part of Reagan’s political genius was that he told a single story about America that rallied libertarians and social conservatives, who are otherwise strange bedfellows. He did this by presenting liberal activist government as the single devil that is eternally bent on destroying two different sets of sacred values — economic liberty and moral order. Only if all nonliberals unite into a coalition of tribes can this devil be defeated.

do you believe in magic?

questioneverything | These days I try to tune out what is going on in the Republican primary race because, to be blunt, every candidate is a joke and everything I've heard any of them say to date has been ludicrous. Especially about energy and the economy. More than that, I am heartbroken that so many citizens in this country can actually buy into any of this drivel. To be fair, however, the Democrats and the current administration haven't got it much better. They seem to be a little more realistic when it comes to understanding the oil/gasoline price problems, but they still don't seem to get the underlying dynamic that is driving us all off the cliff. Right now they are working hard to put the best spin on the recent economic data that seems to show the economy in recovery, even if slowly. But spin is all it is. As long as oil hovers north of $100 per barrel, we are all going to be adapting downward for a long time to come.

For example, there is a lot of hot air circulating in the left political arena and in the MSM about jobs starting to show improvement. And it does appear to be true that the raw number of jobs, across many sectors, is either increasing or declining more slowly. What they don't tell you is that the average wage rates for these new jobs are much lower than what the old jobs (in the same sector) had paid. People aren't complaining. They are just happy to have a job. What the newly hired, as well as most Americans in the low and middle classes, are doing is cutting back on non-essentials. The recent run-up in gas prices on the coasts is aggravating this. GDP growth remains sluggish even while the stock markets seem to be soaring. The left wants everyone to believe that the economy is recovering as we plunge into the political season. But, in fact, it is only adjusting. People are lowering expectations and adapting to a lower overall cash flow.

Meanwhile the underlying true cause of this contraction dynamic goes unrecognized. As the world shifts from traditional crude oil liquids to the kind of gunk we get out of Alberta's tar sands, net energy per capita continues its downward spiral. Peak conventional oil is being compensated for in volume by bringing on more non-conventional, energy-expensive supplies just to keep up appearances. Usable energy, the kind that does useful economic work, is the basis for the economy. Purchasing power relies on having enough energy to produce real goods and useful services, and the amount of useful energy derived from non-conventional (like deep water) oil cannot replace what we had from conventional. No feasible amount of biofuels will make up the difference either.

But what about natural gas? The word on the street is that we have enough NG for 100 years at present use rates. The President said so. The NG companies say so. The investment bankers say so. The MSM says so. All we have to do is convert everything to NG and away we go!

There is a little-known fact about NG, especially the kind you can only get out of the ground by horizontal drilling and hydro-fracturing shale rock. First, the difference between technically recoverable and financially feasible recovery is significant. The MSM (and everybody else) likes to quote reserve estimates based on the former. They choose that much higher number because they believe the technology to make it economic is just around the corner. Ask any of these advocates about the latter and they will look at you cross-eyed; to them, technically recoverable is the number that counts. What they also do not know is the production dynamics of non-conventional wells. It is true that these wells, when they produce (which isn't even close to 100% of those drilled), produce at a much greater initial rate than conventional wells. A fair amount of the hype about NG comes from this observation. But it is never followed by the fact that these same wells have a much faster decline rate. In fact, the decline rates of such wells are so fast that the total volume of NG actually recovered is much less than the initial burst would predict by conventional-well standards. In other words, there is a big fanfare of production followed by wimpy results. I would bet the NG companies will not be publishing that to their investors.
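
To see how much the fast decline matters, here is a back-of-the-envelope sketch in Python. All well parameters are invented for illustration (they are not field data); the point is only the shape of the arithmetic: a shale-type well with four times the initial rate but a steep hyperbolic decline recovers far less over twenty years than its opening burst would suggest if you extrapolated it like a conventional well.

```python
# Illustrative decline-curve arithmetic. All parameters are made up;
# they show the shape of the argument, not any actual field's data.
import numpy as np

t = np.linspace(0, 20, 2001)  # years

# Conventional well: modest initial rate, slow ~5%/yr exponential decline
q_conv = 1.0 * np.exp(-0.05 * t)

# Shale-type well: 4x the initial rate, steep Arps hyperbolic decline
# q(t) = qi / (1 + b*Di*t)^(1/b)
qi, Di, b = 4.0, 0.7, 1.2
q_shale = qi / (1.0 + b * Di * t) ** (1.0 / b)

# Naive forecast: assume the shale well's opening burst fades as
# gently as a conventional well's
q_naive = qi * np.exp(-0.05 * t)

for label, q in [("conventional", q_conv),
                 ("shale", q_shale),
                 ("naive forecast", q_naive)]:
    print(f"20-yr cumulative, {label:14s}: {np.trapz(q, t):5.1f} units")
```

With these made-up numbers the shale well's twenty-year cumulative output comes to roughly a third of what the naive extrapolation from its initial rate promises, which is the "big fanfare followed by wimpy results" pattern in miniature.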

The net energy per capita of all forms of fossil fuels is in decline. Even coal is costing more to get to the power plants. And if emission requirements gain any teeth (not really likely) then the costs of producing electricity with coal will climb and it will NEVER go down again.

The economy can grow as long as net energy is growing. There are only a few ways that can happen. If the total volume of raw energy (fossil fuels) is growing rapidly, then the total net will also grow. Or, if someone were to figure out how to reverse the decline in energy return on energy invested (EROI), that would boost the net return from any amount of raw energy extracted. As that might entail finding a loophole in the Second Law of Thermodynamics, and as no one who knows anything about the physics expects that to happen, that avenue is probably not going to work out. There is a third way we might experience growth of net energy on an individual level: if the population stopped growing and, in fact, declined. Well, peak oil (and the same phenomenon covering NG and coal) pretty much puts the kibosh on the first way. The second would require an act of God. And the third is biologically impossible; we are still animals. So, since net energy is destined to decline, so is the economy. Simple physics.
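
The arithmetic behind that claim fits in a few lines. This is a toy sketch with invented numbers (the EROI, gross-energy, and population figures below are illustrative assumptions, not measurements): net energy is roughly gross energy times (1 - 1/EROI), so a falling EROI shrinks the net take even when gross extraction holds steady, and a growing population shrinks each person's share further.

```python
# Toy net-energy-per-capita arithmetic. All figures are illustrative
# assumptions, not real-world measurements.

def net_per_capita_gj(gross_ej: float, eroi: float, pop_billions: float) -> float:
    """Net energy per person in GJ, given gross energy in EJ (1 EJ = 1e9 GJ)."""
    net_ej = gross_ej * (1.0 - 1.0 / eroi)  # what's left after energy invested
    return net_ej * 1e9 / (pop_billions * 1e9)

scenarios = [
    # (year, gross EJ/yr, average EROI, population in billions)
    (1990, 350.0, 30.0, 5.3),
    (2010, 450.0, 15.0, 6.9),
    (2030, 450.0,  8.0, 8.3),  # gross flat, EROI still falling
]

for year, gross, eroi, pop in scenarios:
    print(f"{year}: ~{net_per_capita_gj(gross, eroi, pop):4.1f} GJ net per person")
```

Even with gross extraction rising and then holding flat, the per-person net take falls across the three rows; that is the "simple physics" the paragraph points at.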

Sunday, March 18, 2012

competing visions of a computer-controlled future

Spiegel | Federico Faggin has lived in the United States for more than 40 years, but he's still living la dolce vita in classic Italian style in his magnificent house on the edge of Silicon Valley. The elderly Faggin answers the phone with a loud "pronto" and serves wine and antipasti to guests. Everything about him is authentic. The only artificial thing in Faggin's world is what he calls his "baby." It has 16 feet -- eight on each side -- and sits wrapped in cotton in a cigarette case.

About four decades ago, Faggin was one of the first employees at Intel when he and his team developed the world's first mass-produced microprocessor, the component that would become the heart of the modern era. Computer systems are ubiquitous today. They control everything, from mobile phones to Airbus aircraft to nuclear power plants. Faggin's tiny creation made new industries possible, and he has played a key role in the progress of the last few decades. But even the man who triggered this massive revolution is slowly beginning to question its consequences.

"We are experiencing the dawn of a new age," Faggin says. "Companies like Google and Facebook are nothing but a series of microprocessors, while man is becoming a marginal figure."

The Worrying Speed of Progress
This week, when German Chancellor Angela Merkel and Google chairman Eric Schmidt opened CeBIT -- the digital industry's most important annual trade fair -- in the northern German city of Hanover, there was a lot of talk of the mobile Internet once again, of "cloud computing," of "consumer electronics" and of "connected products." The overarching motto of this convention is "Trust" -- in the safety of technology, in progress and in the pace at which progress unfolds.

This effort to build trust seems more necessary than ever, now that those who place their confidence in progress are being joined by skeptics who also see something dangerous about the rapid pace of development.

In his book "The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future," American computer scientist Martin Ford paints a grim picture. He argues that the power of computers is growing so quickly that they will be capable of operating with absolutely no human involvement at some point in the future. Ford believes that 75-percent unemployment is a possibility before the end of the century.

"Economic progress ultimately signifies the ability to produce things at a lower financial cost and with less labor than in the past," says Polish sociologist Zygmunt Bauman. As a result, he says, increasing effectiveness goes hand in hand with rising unemployment, and the unemployed merely become "human waste."

Likewise, in their book "Race Against the Machine," Erik Brynjolfsson and Andrew McAfee, both scholars at the Massachusetts Institute of Technology (MIT), argue that, for the first time in its history, technological progress is creating more jobs for computers than for people.

software-defined networking

TechnologyReview | Yet today, even with seemingly cost-effective cloud services available from the likes of Amazon, most companies still choose to operate their own computing resources—whether for corporate e-mail or financial trading—as if they were homeowners relying on generators for electricity. One reason they resist cloud computing, Casado says, is that network architecture is too decentralized to reconfigure easily, which leaves the cloud insecure and unreliable. Cloud computing providers tend to run entire data centers on one shared network. If, for example, Coke and Pepsi both entrusted their computer systems to one of today's public cloud services, they might share a network connection, even though their data stores would be carefully kept separate. That could pose a security risk: a hacker who accessed one company's data could see the other's. It would also mean that a busy day for Coke would cause Pepsi's data transfers to slow down.

All of that changes when Nicira's software is installed on the servers in a data center. The software blocks the applications or programs running on the servers from interacting with the surrounding network hardware. A virtual network then takes over to do what a computer network needs to do: it provides a set of connections for the applications to route data through. Nicira's virtual network doesn't really exist, but it's indistinguishable from one made up of physical routers and switches.

To describe the power this gives to cloud administrators, Casado uses a Hollywood reference. "We actually give them the Matrix," he says. The movie's Matrix manipulated the brains of humans floating in tanks to provide the sensation that they were walking, talking, and living in a world that didn't exist. Nicira's version pulls a similar trick on the programs that reside on a server inside a data center, whether they are running a website or a phone app. In practice, this means that administrators can swiftly reprogram the virtual network to offer each application a private connection to the rest of the Internet. That keeps data more secure, and Coke's data crunch would affect Coke alone. It also lets the cloud provider set up automatic controls that compensate for events like sudden spikes in demand.
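
The tenant-isolation idea can be sketched in a few lines. This toy model is not Nicira's API (all class and method names here are invented): the overlay keys every destination lookup on the tenant as well as the virtual address, so two tenants can reuse the same private IPs on the same physical hosts and still never see each other's traffic.

```python
# Toy model of tenant isolation in a network-virtualization overlay.
# All names are hypothetical; this illustrates the concept, not Nicira's API.

class Overlay:
    def __init__(self):
        # (tenant, virtual_ip) -> physical host; tenants share the fabric
        self.locations: dict[tuple[str, str], str] = {}

    def attach(self, tenant: str, virtual_ip: str, phys_host: str) -> None:
        self.locations[(tenant, virtual_ip)] = phys_host

    def send(self, tenant: str, src_ip: str, dst_ip: str, payload: str) -> str:
        # Every packet is tagged with its tenant; lookups never cross
        # tenant boundaries, even on shared hardware.
        host = self.locations.get((tenant, dst_ip))
        if host is None:
            return f"DROP: {dst_ip} is not in {tenant}'s virtual network"
        return f"DELIVER via {host}: [{tenant}] {src_ip} -> {dst_ip}: {payload}"

net = Overlay()
net.attach("coke",  "10.0.0.5", "rack1-host7")  # same virtual IP...
net.attach("pepsi", "10.0.0.5", "rack1-host7")  # ...same physical host

print(net.send("coke",  "10.0.0.9", "10.0.0.5", "sales query"))  # delivered
print(net.send("pepsi", "10.0.0.9", "10.0.0.8", "web request"))  # dropped
```

Per-tenant rate limits and failover rules hang off the same indirection: because the mapping table is software, an administrator (or an automated controller) can repartition it on the fly, which is the "Matrix" trick Casado describes.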

Ben Horowitz, a partner in the investment firm Andreessen Horowitz, says he and his partner Marc Andreessen, a cofounder of Netscape, quickly realized that Nicira was delivering something long overdue in computing. "The total lack of innovation in networking compared to operating systems or storage had been bothering us for a while," he says. "It was holding back the industry." After meeting Casado, Horowitz invested in Nicira and joined its board. He saw in Nicira echoes of VMware, a company that helped set off the cloud computing boom and has a market capitalization of $40 billion. VMware's software creates virtual computers inside a server, boosting the efficiency of data centers and driving down the cost of servers. Nicira's software promises a similar instant upgrade to what a data center can do, by removing the efficiency bottleneck imposed by networks.

système D

[Infographic: Black Market, created by BusinessDegree.net]

doomsday has its day in the sun...,

NYTimes | Television has long been full of “Americans” (“American Restoration,” “American Chopper,” “American Hoggers”) and “Extremes” (“Extreme Marksmen,” “Extreme Makeover,” “Extreme Couponing”) and “Tops” (“Top Gear,” “Top Chef,” “Top Shot”). In recent weeks, though, an interloper has staked a claim: “Doomsday.”

Last month the National Geographic Channel introduced “Doomsday Preppers,” a Tuesday-night reality series about people who are stockpiling, arming and otherwise preparing for some kind of apocalypse. Last week it was the Discovery Channel’s turn. Its new “Doomsday Bunkers,” on Wednesday nights, is about Deep Earth Bunker, a company that builds underground getaways for the types of people seen in “Doomsday Preppers.”

Watch either show for a short while and, unless you’re a prepper yourself, you might be moderately amused at the absurd excess on display and at what an easy target the prepper worldview is for ridicule. Watch a bit longer, though, and amusement may give way to annoyance at how offensively anti-life these shows are, full of contempt for humankind.

“Doomsday Preppers” introduces an array of end-of-civilization types who at first seem surprisingly varied. These preppers live all over the country, in rural areas, suburbs and cities. Each has a different reason for turning a perfectly adequate home into a canned-food warehouse or building an escape hideaway (or bug-out location, to use the prepper term) in the mountains. One expects the North and South Poles to swap places, one a global economic collapse, one “an electromagnetic pulse that will disable the transportation system of the United States.”

But the people on this show and the customers of Deep Earth Bunker are more alike than diverse. Who knows how representative these shows are of the prepper universe, but the people they feature are disproportionately white. They can’t speak for long without employing that cliché involving excrement and a fan. And whatever their religious beliefs might be, something “Preppers” doesn’t generally explore, most of them put their real faith in firearms.

“Preppers” and “Bunkers” are both full of footage of people firing or lovingly cradling their weaponry, which in many cases is frighteningly extensive. (You really don’t want the guy in last week’s “Preppers” living next door; in addition to a house full of ammunition, he has stockpiled 50 gallons of gasoline, an unsettling combination.) One notable exception was Kathy Harrison, a New England woman profiled on a recent “Preppers.”

“It’s easy to feel a little left out of the prepper community if you live in New England and if you’re not fairly right wing and conservative politically,” she said in the segment. “But I just don’t spend my time worrying about stockpiling guns and ammunition, because our security comes not from stockpiling weapons but from having a community that respects each other, supports each other, and we have each other’s backs.”

A noble sentiment. But the unmistakable impression left by these programs is that what these folks want most of all is not to protect their families — the standard explanation for why they’re doing what they’re doing — or even the dubious pleasure of being able to say to the rest of us, “See, I told you the world was going to end.” What they want is a license to open fire.

Saturday, March 17, 2012

are you humans meant to have language and music?



Discover | What do ironing and hang-gliding have in common? Not much really, except that we weren’t designed to do either of them. And that goes for a million other modern-civilization things we regularly do but are not “supposed” to do. We’re fish out of water, living in radically unnatural environments and behaving ridiculously for a great ape. So, if one were interested in figuring out which things are fundamentally part of what it is to be human, then those million crazy things we do these days would not be on the list.

But what would be on the list?

At the top of the list of things we do that we’re supposed to be doing, and that are at the core of what it is to be human rather than some other sort of animal, are language and music. Language is the pinnacle of usefulness, and was key to our domination of the Earth (and the Moon). And music is arguably the pinnacle of the arts. Language and music are fantastically complex, and we’re brilliantly capable at absorbing them, and from a young age. That’s how we know we’re meant to be doing them, i.e., how we know we evolved brains for engaging in language and music.

But what if this gets language and music all wrong? What if we’re not, in fact, meant to have language and music? What if our endless yapping and music-filled hours each day are deeply unnatural behaviors for our species? (What if the parents in Footloose were right?!)

I believe that language and music are, indeed, not part of our core—that we never evolved by natural selection to engage in them. The reason we have such a head for language and music is not that we evolved for them, but, rather, that language and music evolved—culturally evolved over millennia—for us. Our brains aren’t shaped for these pinnacles of humankind. Rather, these pinnacles of humankind are shaped to be good for our brains.

But how on Earth can one argue for such a view? If language and music have shaped themselves to be good for non-linguistic and amusical brains, then what would their shapes have to be?

They’d have to possess the auditory structure of…nature. That is, we have auditory systems which have evolved to be brilliantly capable at processing the sounds from nature, and language and music would need to mimic those sorts of sounds in order to harness—to “nature-harness,” as I call it—our brain.

And language and music do nature-harness, a case I make in my third book, Harnessed: How Language and Music Mimicked Nature and Transformed Ape to Man (Benbella, 2011). The two most important classes of auditory stimuli for humans are (i) events among objects (most commonly solid objects), and (ii) events among humans (i.e., human behavior). And, in my research I have shown that the signature sounds in these two auditory domains drive the sounds we humans use in (i) speech and (ii) music, respectively. Fist tap Dale.