kcstar.com | The number of impoverished people in America’s suburbs surged 64 percent in the past decade, creating for the first time a landscape in which the suburban poor outnumber the urban poor, a new report shows.
An extensive study by the Brookings Institution found that poverty is growing in the suburbs at more than twice the pace that it’s growing in urban centers. The collapse of the housing market and the subsequent foreclosure crisis were cited as aggravating a problem that was developing before recession struck in the late 2000s.
By 2011, the suburban poor in the nation’s major metropolitan areas outnumbered those living in urban centers by nearly 3 million, according to Confronting Suburban Poverty in America, a book to be released today by Brookings’ Metropolitan Policy Program.
The study placed the number of suburban poor at 16.4 million in 2011, up from about 10 million in 2000.
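The headline growth figure can be checked against the study's own rounded totals; a minimal sketch using the approximate counts quoted above:

```python
# Sanity-check of the reported 64 percent surge, using the article's
# rounded figures (about 10 million suburban poor in 2000, 16.4 million in 2011).
poor_2000 = 10.0e6
poor_2011 = 16.4e6

growth = (poor_2011 - poor_2000) / poor_2000 * 100
print(f"suburban poverty growth, 2000-2011: {growth:.0f}%")  # 64%
```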
Around Kansas City, patterns of poverty have been quietly shifting for some time. But the economic downturn and job losses brought suburban poverty out of the shadows, said Karen Wulfkuhle, executive director of United Community Services of Johnson County.
“In the last three or four years, we’ve seen a growing understanding and recognition of suburban poverty,” she said. “It’s hitting people who have been here (in Johnson County) all their lives.”
More than 12 percent of Johnson County children 5 years old or younger lived below the poverty line in 2011. That figure was just 4.5 percent in 2008, Wulfkuhle said.
“Poverty isn’t a static thing,” she added. “People don’t stay on one side of the (poverty) line or the other. They move back and forth.”
More than 23,000 pupils in the county’s public schools qualified for free or reduced-price lunches in the 2012-13 school year — triple the number from a decade ago.
In suburban Platte County, the number of persons receiving food stamps climbed 11 percent between 2009 and the end of last year. Cass County saw a 15 percent jump in that time.
The Brookings study attributed part of the shifting poverty patterns to overall population growth in the nation’s suburbs, where much of the housing stock is more than 50 years old.
The authors said the trends demand new approaches in social-welfare efforts, which currently emphasize “place-based” programs to help neighborhoods with large concentrations of poor residents. Suburban poverty, by contrast, tends to be diffuse and spread across fragmented communities.
“Poverty is touching more people and places than before, challenging outdated notions of where poverty is and who it affects,” said co-author Elizabeth Kneebone.
mercurynews | The
middle-class American dream, which resided for more than half a century
in leafy suburban enclaves such as Mountain View, Lafayette and Antioch
-- homogeneous bulwarks built by GI loans and fortified by white flight
-- has given way to an alarming rise in suburban poverty over the past
decade, according to a study by the Brookings Institution scheduled for
release Monday.
While the poor are with us everywhere in greater
numbers than ever before, the authors of "Confronting Suburban Poverty
in America" conclude that the Bay Area's two largest metropolitan areas
have experienced the spread of this scourge in starkly different ways.
The
percentage of people living in poverty in the suburbs rose 56.1 percent
in the San Francisco-Oakland-Fremont metro area from 2000 to 2011,
compared to 64 percent nationwide. The San Jose-Sunnyvale-Santa Clara
metropolitan region surged 53.1 percent. But Silicon Valley experienced a
corresponding rise (49 percent) among its urban poor, while in San
Francisco, inner-city poverty increased by only 18.4 percent.
"We
have a way of dealing with poverty in America that is about five decades
old," says Alan Berube, the book's co-author -- along with Elizabeth
Kneebone -- and Brookings Metropolitan Policy Program Senior Fellow.
"And it's built for where poverty was then: primarily in inner cities.
The way the programs are structured and delivered really doesn't compute
for a lot of suburbia and the increasing number of low-income people who are living there."
Unknown consequences
Suburban
life, which became a fixture of American postwar mythologizing, reached
its apex with the release of "E.T. the Extra-Terrestrial," Steven
Spielberg's 1982 film about life in cookie-cutter tract housing.
According to the Brookings study, however, poverty during the past
decade grew twice as fast in the suburbs as cities. By 2011, 3 million
more poor people lived in suburbs of the nation's major metropolitan
areas than in its big cities.
The book actually opens with a
description of the suburbs of East Contra Costa County -- places such as
Oakley, Antioch and Brentwood -- where the number of people living
below the poverty line grew by more than 70 percent in the past decade.
Berube says the high cost of living in San Francisco simply pushed the
urban poor who bus restaurant tables and drive cabs into a kind of
blight flight.
"Try living somewhere in the city of San Francisco
on $20,000 a year for a family of four," Berube says. "A lot of families
saw an opportunity to live in a safer community, and in a better
housing unit, way out in East Contra Costa County. It was a very
rational decision in response to a shrinking supply of affordable
housing, but I don't think we thought about what would be the
consequences for those families when they got there."
abqjournal | The Albuquerque metropolitan area ranks eighth in the country for
suburban poverty, according to a new book published by the Brookings
Institution.
Albuquerque’s eighth place comes from a 17 percent suburban poverty
rate, which trails the list-topping Texas metropolitan areas of El
Paso and McAllen, whose suburban poverty rates are 36.4 percent and 35.4
percent, respectively.
The metropolitan areas with the lowest suburban poverty rates are Des
Moines, Iowa, with 5.7 percent, the Bridgeport-Stamford, Conn., area
with 5.9 percent and Baltimore, with 6.7 percent.
The book, “Confronting Suburban Poverty in America,”
published today, compares the top 100 metropolitan areas’ city and
suburban poverty numbers.
Yet, unlike many other cities, Albuquerque’s suburbs have a lower poverty rate than Albuquerque itself.
Poverty is generally defined as “not earning enough money to
meet one’s basic needs,” according to Kim Posich, executive director of
the New Mexico Center on Law and Poverty. Approximately 22 percent of
Albuquerque residents and between 21.5 and 23 percent of New Mexico
residents live in poverty, which the federal government calculates as a
four-person family living on about $22,500 a year, he added.
In 1970, nationwide, 7.4 million city dwellers and 6.4
million suburbanites lived in poverty. By 2011, the number of poor
suburbanites exceeded the number of poor city dwellers. In 2011, 12.8
million people in cities and 15.3 million people in suburbs lived in
poverty.
For the Albuquerque metropolitan area — which the U.S.
census estimates to include 901,000 people in Sandoval, Valencia,
Torrance and Bernalillo counties — numbers trended in the opposite
direction. In 1970, 34,116 Albuquerqueans and 34,784 suburbanites lived
in poverty, making the split just about even. But 41 years later,
106,397 Albuquerqueans were living in poverty, compared with 74,688
suburbanites.
Albuquerque’s data bucked national trends in the first half
of that four-decade period because the city kept gobbling up
geographical portions of unincorporated land, says Alan Berube, a senior
Brookings fellow who co-wrote the 143-page book with Brookings
colleague Elizabeth Kneebone over a two-year period.
“Parts (of Albuquerque) that were in the suburbs in 1970 are
actually part of the city today, because the city has absorbed those
communities … so that has added to its population, and it (has) added to
its poor population, too,” Berube said.
nature | Thinking about a professor just before you take an intelligence test
makes you perform better than if you think about football hooligans. Or
does it? An influential theory that certain behaviour can be modified by
unconscious cues is under serious attack.
A paper published in PLoS ONE last week (ref. 1)
reports that nine different experiments failed to replicate this
example of ‘intelligence priming’, first described in 1998 (ref. 2) by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks.
David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE,
is among sceptical scientists calling for Dijksterhuis to design a
detailed experimental protocol to be carried out in different
laboratories to pin down the effect. Dijksterhuis has rejected the
request, saying that he “stands by the general effect” and blames the
failure to replicate on “poor experiments”.
An acrimonious e-mail debate on the subject has been
dividing psychologists, who are already jittery about other recent
exposures of irreproducible results (see Nature 485, 298–300; 2012).
“It’s about more than just replicating results from one paper,” says
Shanks, who circulated a draft of his study in October; the failed
replications call into question the underpinnings of
‘unconscious-thought theory’.
Dijksterhuis published that theory in 2006 (ref. 3).
It fleshed out more general, long-held claims about a ‘smart
unconscious’ that had been proposed over the past couple of decades —
exemplified in writer Malcolm Gladwell’s best-selling book Blink (Penguin,
2005). The theory holds that behaviour can be influenced, or ‘primed’,
by thoughts or motives triggered unconsciously — in the case of
intelligence priming, by the stereotype of a clever professor or a
stupid hooligan. Most psychologists accept that such priming can occur
consciously, but many, including Shanks, are unconvinced by claims of
unconscious effects.
In their paper, Shanks and his colleagues tried to obtain
an intelligence-priming effect, following protocols in Dijksterhuis’s
papers or refining them to amplify any theoretical effect (for example,
by using a test of analytical thinking instead of general knowledge).
They also repeated intelligence-priming studies from independent labs.
They failed to find any of the described priming effects in their
experiments.
The e-mail debate that Shanks joined was kicked off last
September, when Daniel Kahneman, a Nobel-prizewinning psychologist from
Princeton University in New Jersey who thinks that unconscious social
priming is likely to be real, circulated an open letter warning of a
“train wreck looming” (see Nature http://doi.org/mdr; 2012)
because of a growing number of failures to replicate results. Social
psychology “is now the poster child for doubts about the integrity of
psychological research”, he told psychologists, “and it is your
responsibility” to deal with it.
NYTimes | One summer night in 2011, a tall, 40-something professor named Diederik
Stapel stepped out of his elegant brick house in the Dutch city of
Tilburg to visit a friend around the corner. It was close to midnight,
but his colleague Marcel Zeelenberg had called and texted Stapel that
evening to say that he wanted to see him about an urgent matter. The two
had known each other since the early ’90s, when they were Ph.D.
students at the University of Amsterdam; now both were psychologists at
Tilburg University. In 2010, Stapel became dean of the university’s
School of Social and Behavioral Sciences and Zeelenberg head of the
social psychology department. Stapel and his wife, Marcelle, had
supported Zeelenberg through a difficult divorce a few years earlier. As
he approached Zeelenberg’s door, Stapel wondered if his colleague was
having problems with his new girlfriend.
Zeelenberg, a stocky man with a shaved head, led Stapel into his living
room. “What’s up?” Stapel asked, settling onto a couch. Two graduate
students had made an accusation, Zeelenberg explained. His eyes began to
fill with tears. “They suspect you have been committing research
fraud.”
Stapel was an academic star in the Netherlands and abroad, the author of
several well-regarded studies on human attitudes and behavior. That
spring, he published a widely publicized study in Science about an
experiment done at the Utrecht train station showing that a trash-filled
environment tended to bring out racist tendencies in individuals. And
just days earlier, he received more media attention for a study
indicating that eating meat made people selfish and less social.
His enemies were targeting him because of changes he initiated as dean,
Stapel replied, quoting a Dutch proverb about high trees catching a lot
of wind. When Zeelenberg challenged him with specifics — to explain why
certain facts and figures he reported in different studies appeared to
be identical — Stapel promised to be more careful in the future. As
Zeelenberg pressed him, Stapel grew increasingly agitated.
Finally, Zeelenberg said: “I have to ask you if you’re faking data.”
That weekend, Zeelenberg relayed the allegations to the university
rector, a law professor named Philip Eijlander, who often played tennis
with Stapel. After a brief meeting on Sunday, Eijlander invited Stapel
to come by his house on Tuesday morning. Sitting in Eijlander’s living
room, Stapel mounted what Eijlander described to me as a spirited
defense, highlighting his work as dean and characterizing his research
methods as unusual. The conversation lasted about five hours. Then
Eijlander politely escorted Stapel to the door but made it plain that he
was not convinced of Stapel’s innocence.
blicunion | Fortunately, the Swiss National Advisory Commission on Biomedical Ethics (NEK, President: Otfried Höffe) critically commented on the use of the ADHD drug Ritalin in its opinion of 22 November 2011 titled Human enhancement by means of pharmacological agents: The consumption of pharmacological agents altered the child’s behavior without any contribution on his or her part.
That amounted to interference in the child’s freedom and personal rights, because pharmacological agents induced behavioral changes but failed to teach the child how to achieve these behavioral changes independently. The child was thus deprived of an essential learning experience in acting autonomously and empathetically, which “considerably curtails children’s freedom and impairs their personality development”, the NEK criticized.
The alarmed critics of the Ritalin disaster are now getting support from an entirely different side. In its cover story of 2 February 2012, the German weekly Der Spiegel quoted the American psychiatrist Leon Eisenberg, born in 1922 as the son of Russian Jewish immigrants, who was the “scientific father of ADHD” and who said at the age of 87, seven months before his death, in his last interview: “ADHD is a prime example of a fictitious disease.”
For some 40 years now, since 1968, Leon Eisenberg’s “disease” has haunted the diagnostic and statistical manuals, first as “hyperkinetic reaction of childhood”, now as “ADHD”. The use of ADHD medications in Germany rose in only eighteen years from 34 kg (in 1993) to a record of no less than 1,760 kg (in 2011), a roughly 51-fold increase in consumption. In the United States, every tenth ten-year-old boy already swallows an ADHD medication on a daily basis, and the trend is rising.
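The "51-fold" figure follows directly from the two quantities quoted; a quick check:

```python
# Checking the "51-fold" claim from the article's own numbers.
kg_1993 = 34      # kg of ADHD medication used in Germany in 1993
kg_2011 = 1760    # kg in 2011
fold = kg_2011 / kg_1993
print(f"increase: {fold:.1f}-fold")  # ~51.8, which the article rounds down to "51-fold"
```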
Given the proven repertoire of Edward Bernays, the father of propaganda, who drew on his uncle’s psychoanalysis to sell the First World War to his own people and to bend science, and faith in science, toward increasing industry profits, it is worth asking on whose behalf the “scientific father of ADHD” conducted his science. His career was remarkably steep, and his “fictitious disease” produced the best sales increases. After all, he served on the “Committee for DSM V and ICD XII, American Psychiatric Association” from 2006 to 2009, and he received “the Ruane Prize for Child and Adolescent Psychiatry Research. He has been a leader in child psychiatry for more than 40 years through his work in pharmacological trials, research, teaching, and social policy and for his theories of autism and social medicine”.
And, after all, Eisenberg was a member of the “Organizing Committee for Women and Medicine Conference, Bahamas, November 29 – December 3, 2006, Josiah Macy Foundation (2006)”. The Josiah Macy Foundation organized conferences with intelligence agents of the OSS, later the CIA, such as Gregory Bateson and Heinz von Foerster, during and long after World War II. Have such groups marketed the ADHD diagnosis in the service of the pharmaceutical market, tailor-made for it with a great deal of propaganda and public relations? This is the question the American psychologist Lisa Cosgrove and others investigated in their study Financial Ties between DSM-IV Panel Members and the Pharmaceutical Industry (ref. 7). They found that “Of the 170 DSM panel members 95 (56%) had one or more financial associations with companies in the pharmaceutical industry. One hundred percent of the members of the panels on ‘Mood Disorders’ and ‘Schizophrenia and Other Psychotic Disorders’ had financial ties to drug companies. The connections are especially strong in those diagnostic areas where drugs are the first line of treatment for mental disorders.” In the next edition of the manual, the situation is unchanged: “Of the 137 DSM-V panel members who have posted disclosure statements, 56% have reported industry ties – no improvement over the percent of DSM-IV members.” “The very vocabulary of psychiatry is now defined at all levels by the pharmaceutical industry,” said Dr. Irwin Savodnik, an assistant clinical professor of psychiatry at the University of California, Los Angeles.
cell | We propose that human intelligence is composed of multiple independent components.
Each behavioral component is associated with a distinct functional brain network.
The higher-order “g” factor is an artifact of tasks recruiting multiple networks.
The components of intelligence dissociate when correlated with demographic variables.
What makes one person more intellectually able than another? Can the entire distribution of human intelligence be accounted for by just one general factor? Is intelligence supported by a single neural system? Here, we provide a perspective on human intelligence that takes into account how general abilities or “factors” reflect the functional organization of the brain. By comparing factor models of individual differences in performance with factor models of brain functional organization, we demonstrate that different components of intelligence have their analogs in distinct brain networks. Using simulations based on neuroimaging data, we show that the higher-order factor “g” is accounted for by cognitive tasks co-recruiting multiple networks. Finally, we confirm the independence of these components of intelligence by dissociating them using questionnaire variables. We propose that intelligence is an emergent property of anatomically distinct cognitive systems, each of which has its own capacity.
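The central claim, that a higher-order "g" can emerge from tasks co-recruiting independent components, can be illustrated with a toy simulation. All loadings and noise levels below are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000  # simulated participants

# Two INDEPENDENT "components" of ability (hypothetical, e.g. memory, reasoning).
comp = rng.standard_normal((n, 2))

# Each of four tasks recruits BOTH components to some degree
# (loadings are assumptions chosen for the sketch).
loadings = np.array([[0.8, 0.3],
                     [0.6, 0.5],
                     [0.3, 0.8],
                     [0.5, 0.6]])
scores = comp @ loadings.T + 0.5 * rng.standard_normal((n, 4))

# All pairwise task correlations come out positive (a "positive manifold"),
# so the first principal component looks like a general factor "g",
# even though the underlying abilities are independent by construction.
corr = np.corrcoef(scores, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]
print("first eigenvalue share:", eigvals[0] / eigvals.sum())
```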
thescientist | When a group of genetically identical mice lived in the same complex
enclosure for 3 months, individuals that explored the environment more
broadly grew more new neurons than less adventurous mice, according to a
study published today (May 9) in Science.
This link between exploratory behavior and adult neurogenesis shows
that brain plasticity can be shaped by experience and suggests that the
process may promote individuality, even among genetically identical
organisms.
“This is a clear and quantitative demonstration that individual
differences in behavior can be reflected in individual differences in
brain plasticity,” said Fred Gage
of the Salk Institute for Biological Studies in La Jolla, California,
who was not involved in the study. “I don’t know of another clear example
of that . . . and it tells me that there is a tighter relationship
between [individual] experiences and neurogenesis than we had previously
thought.”
Scientists have often tried to tackle the question of how individual
differences in behavior and personality develop in terms of the
interactions between genes and environment. “But there is next to
nothing [known] about the neurobiological mechanisms underlying
individuality,” said Gerd Kempermann of the German Center for Neurodegenerative Diseases in Dresden.
One logical way to study this phenomenon is to look at brain
plasticity, or how the brain’s structure and function change over time.
Plasticity is hard to study, however, because it mostly takes place at
the synaptic level, so Kempermann and his colleagues decided to look at
the growth of new neurons in the adult hippocampus, which can easily be
quantified. Earlier studies have demonstrated that activity—both
physical and cognitive—increases adult neurogenesis in groups of
genetically identical mice, but there were differences between
individuals in the amount of neuron growth.
To understand why, Kempermann and his colleagues housed 40 genetically
identical female inbred mice in a complex 5-square-meter, 5-level
enclosure filled with all kinds of objects designed to encourage
activity and exploration. The mice were tagged with radio-frequency
identification (RFID) transponders, and 20 antennas placed around the
enclosure tracked their every movement. After 3 months, the researchers
assessed adult neurogenesis in the mice by counting proliferating
precursor cells, which had been labeled before the study began.
The researchers found that individual differences in exploratory
behavior correlated with individual differences in the numbers of new
neurons generated. “To our knowledge, it’s the first example of a direct
link between individual behavior and individual brain plasticity,”
Kempermann said.
Gage cautions about pinning all the differences on the environment,
however. Although the mice in the study were genetically identical, he
said, they were not behaviorally identical to begin with: clearly some
variation occurs at a very early stage that makes them more or less
prone to explore. “It’s incorrect to think of it that the environment
caused the difference between the mice,” he said. “The difference was
already there, and the environment amplified that difference. My own
personal bias is that there are likely genetic events that happened at
germline, or somatic events over time,” that set the stage for these
subtle behavioral differences that are subsequently amplified.
NYPost | They are 1 percenters who are 100 percent despicable. Some
wealthy Manhattan moms have figured out a way to cut the long lines at
Disney World — by hiring disabled people to pose as family members so
they and their kids can jump to the front, The Post has learned.
The “black-market Disney guides” run $130 an hour, or $1,040 for an eight-hour day. “My
daughter waited one minute to get on ‘It’s a Small World’ — the other
kids had to wait 2 1/2 hours,” crowed one mom, who hired a disabled
guide through Dream Tours Florida.
“You can’t go to Disney without a tour concierge,’’ she sniffed. “This is how the 1 percent does Disney.”
The
woman said she hired a Dream Tours guide to escort her, her husband and
their 1-year-old son and 5-year-old daughter through the park in a
motorized scooter with a “handicapped” sign on it. The group was sent
straight to an auxiliary entrance at the front of each attraction.
Disney allows each guest who needs a wheelchair or motorized scooter to bring up to six guests to a “more convenient entrance.”
The
Florida entertainment mecca warns that there “may be a waiting period
before boarding.” But the consensus among upper-crust moms who have used
the illicit handicap tactic is that the trick is well worth the cost.
Not only is their “black-market tour guide” more efficient than Disney World’s VIP Tours, it’s cheaper, too.
Disney’s official VIP Tours service offers a guide and fast passes for $310 to $380 per hour.
Passing
around the rogue guide service’s phone number recently became a
shameless ritual among Manhattan’s private-school set during spring
break. The service asks who referred you before they even take your
call.
“It’s insider knowledge that very few have and share
carefully,” said social anthropologist Dr. Wednesday Martin, who caught
wind of the underground network while doing research for her upcoming
book “Primates of Park Avenue.”
“Who wants a speed pass when you can use your black-market handicapped guide to circumvent the lines altogether?” she said.
“So when you’re doing it, you’re affirming that you are one of the privileged insiders who has and shares this information.”
abc.net.au | Lying, cheating and other forms of Machiavellian
skulduggery seem to be the inevitable evolutionary consequences of
living in co-operative communities, suggest UK scientists.
Instead of viewing deception and co-operation as polar opposites, Luke McNally from Trinity College Dublin and Andrew Jackson from the University of Edinburgh say we might do better to think of them as two sides of the same evolutionary coin.
"Deception is an inherent component of our complex social lives, and
it's likely impossible to separate the good from the bad; the darkest
parts of our psychology evolved as a result of the most virtuous," says
McNally.
First, they use game theory to show that the evolution of co-operation creates pressures that favour the evolution of deception.
In their scenario, individuals have three options: to always cheat
and not help others; to reciprocate the help that others offer; or cheat
and try to conceal this cheating by deceiving others.
"When reciprocal co-operators interact with honest cheaters, they
spot their cheating and stop co-operating with them," McNally explains.
"However, as deceivers are better at hiding their cheating, reciprocal co-operators find it harder to spot their cheating.
"This means that the deceivers are able to gain co-operation without
having to co-operate themselves, allowing deception to evolve."
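The mechanism McNally describes, in which deceivers gain unreciprocated help because they are detected less often, can be sketched as a toy simulation. The benefit, cost, and detection probabilities below are invented for illustration and are not taken from the paper:

```python
import random

B = 3.0                  # benefit of receiving help (assumed value)
P_SPOT_CHEAT = 0.9       # reciprocators usually spot an honest cheater (assumed)
P_SPOT_DECEIVE = 0.3     # deceivers are harder to catch (assumed)
ROUNDS = 10_000

def payoff_vs_reciprocator(p_spotted: float) -> float:
    """Average payoff of a non-cooperator facing a reciprocator.

    Each round the reciprocator helps unless it has spotted the cheating,
    so the cheater gains B whenever it goes undetected, at no cost.
    """
    total = 0.0
    for _ in range(ROUNDS):
        if random.random() > p_spotted:  # cheating goes unnoticed this round
            total += B                   # receives help for free
    return total / ROUNDS

random.seed(1)
cheater = payoff_vs_reciprocator(P_SPOT_CHEAT)
deceiver = payoff_vs_reciprocator(P_SPOT_DECEIVE)
print(f"honest cheater: {cheater:.2f}  deceiver: {deceiver:.2f}")
# The deceiver's lower detection rate yields a higher average payoff,
# which is the selection pressure favouring deception in the model.
```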
The researchers back up this theory with real-world evidence gathered
from studies of deception in 24 different primate species.
They show deceptive behaviour is more common in species that co-operate more.
"Our comparative analysis shows the more co-operation a species
engages in the more it engages in deception, which is what our model
predicts," McNally says.
theatlanticwire | Richwine says his passion for outlining the case for racial inferiority is rooted in his love of data not racism. At a 2008 panel,
Richwine ranked races by IQ: "Decades of psychometric testing has
indicated that at least in America, you have Jews with the highest
average IQ, usually followed by East Asians, then you have non-Jewish
whites, Hispanics, and then blacks." Now, he tells York, he's not sorry
for those comments. "I don't apologize for any of the things that I
said," he says. But he does wish he'd put an asterisk on the entire
sentence so it doesn't sound like he's endorsing the idea that some
ethnic groups are just biologically destined to be less intelligent than
others. He would have noted that "there is a nuance that goes along
with that: the extent to which IQ scores actually reflect intelligence,
the fact that it reflects averages and there is a lot of overlap in any
population, and that IQ scores say absolutely nothing about the causes
of the differences -- environmental, genetic, or some combination of
those things.”
Richwine's argument that he is not a racist because he does not think of himself as a racist is not very persuasive, although it is common.
But even more problematic is that Richwine also admits to York that
he's not very good at spotting racism. In 2010, for example, he wrote
two articles for the white nationalist site Alternative Right. One of his articles made
the argument that because "U.S.-born Hispanics are much more likely to be
incarcerated than foreign-born Hispanics," this "implies that Hispanic
crime will become more of a problem as time goes on, not less."
That fits well with the editorial agenda of founder Richard B. Spencer, a
former editor of The American Conservative, who has a history of saying things like, "There are races who, on average, are going to be superior." People like blogger E.D. Kain have dubbed the site
"ugly white nationalism." Richwine said he didn't think anything was
problematic, telling York, "I thought it would be like a
paleo-conservative website. I had seen that [former National Review writer] John Derbyshire had also published something there." Derbyshire was let go by National Review because
he wrote an essay about how he tells his kids to avoid groups of black
people but to have one black friend to inoculate against charges of
racism.
That was in 2012 — and Derbyshire had been writing racist things for years. As I argued at the time,
he "effectively demonstrates, year after year, exactly how racist you
can be and still get published by people who consider themselves
intellectuals." That line has since moved, which Richwine apparently
noticed too late.
fastcoexist | I run a for-profit business that delivers products and services to
customers earning less than $6 a day in West Africa. When I tell people
this, I frequently encounter disbelief or concern. The three most common
responses I hear are:
Surely you can’t make money working with people who are so poor?
Don’t you feel like you are taking advantage of these people by making money from them?
Wouldn’t charity do a better job of meeting their needs?
While these questions are well-intentioned, I initially found them
upsetting because they go far beyond a healthy skepticism about my
business model. They made me doubt whether I should be working with poor
consumers at all.
While I stayed the course, I fear that many will simply choose a
simpler path of building a startup in developed markets. The absolute
worst thing that can happen for the poorest people on Earth is that the
next generation of superstar entrepreneurs ends up in Silicon Valley
making iPhone Apps, rather than trying to address the problems of the 4
billion people who need them the most.
So next time you overhear one of these questions, do the world’s poor a favor and shoot it down. Here’s how:
karlnorth | The Interdependence. Economic
activity at phantom carrying capacity depletes resources at a rate that
causes rising resource costs and decreasing profit margins in the
production of real wealth. The investor class therefore turns
increasingly to the production of credit as a source of profits. Credit
unsupported by the production of real wealth is stealing from the
future: it is phantom wealth. It also creates inflation, which is
stealing from the purchasing power of income in the present. Protected
from the masses by the illusion of democracy, government facilitates the
unlimited production of credit and the continued overshoot of real
carrying capacity. This causes inflation and permanently rising costs of
raw materials. To divert public attention from the resultant declining
living standard of the laboring classes, government dispenses rigged
statistics and fake news of continued growth to project the illusion of
economic health. The whole interdependent phantom stage of the
capitalist system has an extremely limited life before it collapses into
chaos.
deadspin | You
may have heard that the highest-paid employee in each state is usually
the football coach at the largest state school. This is actually a gross
mischaracterization: Sometimes it is the basketball coach.
Based
on data drawn from media reports and state salary databases, the ranks
of the highest-paid active public employees include 27 football coaches,
13 basketball coaches, one hockey coach, and 10 dorks who aren't even
in charge of a team.
So are my hard-earned tax dollars paying these coaches?
Probably
not. The bulk of this coaching money—especially at the big football
schools—is paid out of the revenue that the teams generate.
So what's the problem then? These guys make tons of money for their schools; shouldn't they be paid accordingly?
There are at least three problems.
Coaches
don't generate revenue on their own; you could make the exact same case
for the student-athletes who actually play the game and score the
points and fracture their legs.
It can be tough to attribute this revenue directly to the performance of the head coach. In 2011-2012, Mack Brown was paid $5 million to lead a mediocre 8-5 Texas team to the Holiday Bowl. The team still generated $103.8 million in revenue, the most in college football. You don't have to pay someone $5 million to make college football profitable in Texas.
This revenue rarely makes its way back to the general funds of these universities. Looking at data from 2011-2012,
athletic departments at 99 major schools lost an average of $5 million
once you take out revenue generated from "student fees" and "university
subsidies." If you take out "contributions and donations"—some of which
might have gone to the universities had they not been lavished on the
athletic departments—this drops to an average loss of $17 million, with
just one school (Army) in the black. All this football/basketball
revenue is sucked up by coach and AD salaries, by administrative and facility costs, and by the athletic department's non-revenue-generating sports; it's not like it's going to microscopes and Bunsen burners.
yahoo-finance | The U.S. is home to some of the greatest colleges and
universities in the world. But with the student debt load at more than
$1 trillion and youth unemployment elevated, that's only one part of the
story when assessing the value of a college education.
Former Secretary of Education William Bennett, author of Is College Worth It?,
sat down with The Daily Ticker on the sidelines of the Milken
Institute's 2013 Global Conference to talk about whether college is
worth it.
“We have about 21 million people in higher education, and about half
the people who start four year colleges don’t finish,” Bennett tells The
Daily Ticker. “Those who do finish, who graduated in 2011 - half were
either unemployed or radically underemployed and in debt.”
The average student loan balance for a 25-year-old is $20,326,
according to the Federal Reserve Bank of New York. Student debt is the
second-largest source of U.S. household debt, after only mortgages.
Bennett assessed the “return on investment” for the 3,500 colleges and
universities in the country. He found that returns were positive for
only 150 institutions. The top 10 schools ranked by Bennett as having
the best "ROI" are below (for the full list he used, click here, and for the latest figures, click here):
fortune | Being jobless is an awful thing for anyone no matter where they live.
But it's especially unnerving for young people just starting their
careers. A lot has been written about the topic lately, but two new
reports show the employment picture likely won't get any better for
young people living in the world's richest countries. And in many ways,
America's young people today have it worse than even parts of
debt-troubled Europe.
The findings come as thousands graduate from college this month.
Graduates may have hung up their hard-earned diplomas today, but for
many it will be a huge struggle to find the jobs they studied hard for.
Across the world, joblessness among 15- to
24-year-olds is estimated at 12.6%, close to its crisis peak, according
to the International Labor Organization.
The problem is most pronounced in a few parts of the world, including
developed economies, such as the United States and parts of Europe.
In 2012, the rate of youth joblessness in the richest countries rose to a
decades-long high of 18.1%, according to the ILO, which doesn't expect
the rate to drop below 17% before 2016.
REGARDING THE BUENA VISTA SCHOOL DISTRICT BUDGET CRISIS
Buena
Vista School District and its community of parents and stakeholders have
a long tradition of pride and excellence. We pride ourselves on the
caring and committed staff with which we are blessed and consider it our
highest calling to be entrusted with the care and education of the
community’s children.
Recent reductions in state school aid,
combined with a severe drop in enrollment, have created a situation where
the District has not been able to get small enough fast enough. Adding
to this problem is the fact that the District must return to the state
funds related to the Wolverine Secure Treatment Center which it
continued to receive after the program severed ties with the District in
2012. The District brought its receipt of these funds to the attention
of the State during a meeting with state officials to discuss a draft of
its deficit elimination plan in February. All of this came into focus
when the State did not transmit the District’s April state school aid.
Upon
noting that state school aid was not received in April as planned, the
District made inquiry of the State and was told that state school aid
for April, May and June would be withheld to recoup the funds that were
mistakenly sent to the District. We remain in contact with officials at
the State, the Intermediate School District and our surrounding
districts. We have been told by State officials that a prerequisite to
continuing dialogue is the District’s completion of a satisfactory
deficit elimination plan. We are and have been working diligently to
meet this requirement, and appreciate the technical assistance that
State officials have provided regarding the deficit elimination plan. Fist tap Dale.
guardian | For all Raine's rigour, his discipline of "neurocriminology"
remains tarnished, for some, by association with 19th-century
phrenology, the belief that criminal behaviour stemmed from defective
brain organisation as evidenced in the shape of the skull. The idea was
first proposed by the infamous Franz Joseph Gall, who claimed to have
identified over- or underdeveloped brain "organs" that gave rise to
specific character: the organ of destructiveness, of covetousness and so
on, which were recognisable to the phrenologist by bumps on the head.
Phrenology was widely influential in criminal law in both the United
States and Europe in the middle of the 1800s, and often used to support
crude racial and class-based stereotypes of criminal behaviour.
The
divisive thinking was developed further in 1876 by Cesare Lombroso, an
Italian surgeon, after he conducted a postmortem on a serial murderer
and rapist. Lombroso discovered a hollow part of the killer's brain,
where the cerebellum would be, from which he proposed that violent
criminals were throwbacks to less evolved human types, again
identifiable by ape-like physical characteristics. The political
manipulation of such hypotheses in the eugenics movement eventually saw
them wholly outlawed and discredited.
As one result, after the
second world war, crime became attributable to economic and political
factors, or psychological disturbances, but not to biology. Prompted by advances in genetics
and neuroscience, however, that consensus is increasingly fragile, and
the implications of those scientific advances for law – and for concepts
such as culpability and responsibility – are only now being tested.
Raine
is by no means alone in this argument, though his highly readable book
serves as an invaluable primer to both the science and the ethical
concerns. As the polymath David Eagleman,
director of neuroscience and law at Baylor College in Texas, recently
pointed out, knowledge in this area has advanced to the point where it
is perverse to be in denial. What are we to do, for example, Eagleman
asked, with the fact that "if you are a carrier of one particular set of
genes, the probability that you will commit a violent crime is four
times as high as it would be if you lacked those genes. You're three
times as likely to commit a robbery, five times as likely to commit
aggravated assault, eight times as likely to be arrested for murder and
13 times as likely to be arrested for a sexual offence. The overwhelming
majority of prisoners carry these genes; 98.1% of death row inmates do…
Can we honestly say that the carriers of those genes have exactly the
same range of choices in their behaviour as those who do not possess
them? And if they do not, should they be judged and punished by the same
standard?"
Raine's work is full of this kind of statistic and
this kind of question. (One of his more startling findings is the
extraordinarily high level of psychopathic markers among employees of a
temping agency he studied, which came as no surprise to him.
"Psychopaths can't settle, they need to move around, look for new
stimulation," he says.) He draws on a number of studies that show the
links between brain development, in particular – and brain injury and
impairment by extension – and criminal violence. Already legal defence
teams, particularly in the US, are using brain scans and neuroscience as
mitigating evidence in the trials of violent criminals and sex
offenders. In this sense, Raine believes a proper public debate on the
implications of his science is long overdue.
Raine was in part
drawn to his discipline by his own background. In the course of scanning
his murderers, Raine also examined his own PET profile and found,
somewhat to his alarm, that the structure of his brain seemed to share
more characteristics with the psychopathic murderers than with the
control group.
He laughs quickly when I ask how that discovery
felt. "When you have a brain scan that looks like a serial killer's it
does give you pause," he says. And there were other factors: he has
always had a markedly low heart rate (which his research has shown to be
a truer indicator of a capacity for violence than, say, smoking is as a
cause of lung cancer). He was plagued by cracked lips as a child,
evidence of riboflavin deficiency (another marker); he was born at home;
he was a blue baby, all factors in the kind of developmental
difficulties that might set his own researcher's alarm bells ringing.
"So,"
he says, "I was on the spectrum. And in fact I did have some issues. I
was taken to hospital aged five to have my stomach pumped because I had
drunk a lot of alcohol. From age nine to 11 I was pretty antisocial, in a
gang, smoking, letting car tyres down, setting fire to mailboxes, and
fighting a lot, even though I was quite small. But at that age I burnt
out of that somehow. At 11, I changed schools, got more interested in
studying and really became a different sort of kid. Still, when I was
graduating and thinking 'what shall I research?', I looked back on the
essays I'd written and one of the best was on the biology of
psychopaths; I was fascinated by that, partly, I think, because I had
always wondered about that early behaviour in myself."
As Raine
began to explore the subject more, he began to look at the reasons he
became a researcher of violent criminality, rather than a violent
criminal. (Recent studies suggest his biology might equally have
propelled him towards other careers – bomb disposal expert, corporate
executive or journalist – that tend to attract individuals with those
"psychopathic" traits.) Despite his unusual brain structure, he didn't
have the low IQ that is often apparent in killers, or any cognitive
dysfunction. Still, as he worked for four years interviewing people in
prison, a lot of the time he was thinking: what stopped me being on
their side of the bars?
Raine's biography, then, was a good
corrective to the seductive idea that our biology is our fate and that a
brain scan can tell us who we are. Fist tap Big Don.
Four years ago, long before he’d join the Heritage Foundation, before
Marco Rubio was even in the Senate, Jason Richwine armed a time bomb. A
three-member panel at Harvard’s John F. Kennedy School of Government accepted Richwine’s thesis,
titled “IQ and Immigration Policy.” In it, Richwine provided
statistical evidence that Hispanic immigrants, even after several
generations, had lower IQs than non-Hispanic whites. Immigration
reformers were fools if they didn’t grapple with that.
"Visceral opposition to IQ selection can sometimes generate
sensationalistic claims—for example, that this is an attempt to revive
social Darwinism, eugenics, racism, etc,” wrote Richwine. “Nothing of
that sort is true. … an IQ selection system could utilize individual
intelligence test scores without any resort to generalizations.”
This week, Heritage released a damning estimate
of the immigration bill, co-authored by Richwine. The new study was all
about cost, totally eliding the IQ issues that Richwine had mastered,
but it didn’t matter after Washington Post reporter Dylan Matthews found the dissertation. Heritage hurried
to denounce it—“its findings in no way reflect the positions of The
Heritage Foundation”—and Richwine has ducked any more questions from the
press.
His friends and advisers saw this coming. Immigration reform’s
political enemies know—and can’t stand—that racial theorists are
cheering them on from the cheap seats. They know that the left wants to
exploit that—why else do so many cameras sprout up whenever Minutemen
appear on the border, or when Pat Buchanan comes out of post-post-post
retirement to write another book about the “death of the West?”
Academics aren’t so concerned with the politics. But they know all
too well the risks that come with research connecting IQ and race. At
the start of his dissertation, Richwine thanked his three
advisers—George Borjas, Christopher Jencks, and Richard Zeckhauser—for
being so helpful and so bold. Borjas “helped me navigate the minefield
of early graduate school,” he wrote. “Richard Zeckhauser, never someone
to shy away from controversial ideas, immediately embraced my work.”
Yet they don’t embrace everything Richwine’s done since. “Jason’s
empirical work was careful,” Zeckhauser told me over email. “Moreover,
my view is that none of his advisors would have accepted his thesis had
he thought that his empirical work was tilted or in error. However,
Richwine was too eager to extrapolate his empirical results to
inferences for policy.”
Borjas’ own work on immigration and inequality has led to a few two-minutes-hate moments in the press. He wasn’t entirely convinced by Richwine, either.
“I have never worked on anything even remotely related to IQ, so
don't really know what to think about the relation between IQ,
immigration, etc,” Borjas told me in an email. “In fact, as I know I
told Jason early on since I've long believed this, I don't find the IQ
academic work all that interesting. Economic outcomes and IQ are only
weakly related, and IQ only measures one kind of ability. I've been
lucky to have met many high-IQ people in academia who are total losers,
and many smart, but not super-smart people, who are incredibly
successful because of persistence, motivation, etc. So I just think
that, on the whole, the focus on IQ is a bit misguided.”
But Richwine had been fascinated by it, and for a very long time, in
an environment that never discouraged it. Anyone who works in Washington
and wants to explore the dark arts of race and IQ research is in the
right place. The city’s a bit like a college campus, where investigating
“taboo” topics is rewarded, especially on the right. A liberal squeals
“racism,” and they hear the political correctness cops (most often, the
Southern Poverty Law Center) reporting a thinkcrime. Fist tap Big Don.
guardian | "I never did anything for money. I never set money as a goal. It was a result." So says Bob Diamond,
formerly the chief executive of Barclays. In doing so Diamond lays
waste to the justification that his bank and others (and their
innumerable apologists in government and the media) have advanced for
surreal levels of remuneration – to incentivise hard work and talent.
Prestige, power, a sense of purpose: for them, these are incentives
enough.
Others of his class – Bernie Ecclestone and Jeroen van der
Veer (the former chief executive of Shell), for example – say the same.
The capture by the executive class of so much wealth performs no useful
function. What the very rich appear to value is relative income. If
executives were all paid 5% of current levels, the competition between
them (a questionable virtue anyway) would be no less fierce. As
the immensely rich HL Hunt commented several decades ago: "Money is just a way of keeping score."
The desire for advancement along this scale appears to be insatiable. In March Forbes magazine published an article about Prince Alwaleed,
who, like other Saudi princes, doubtless owes his fortune to nothing
more than hard work and enterprise. According to one of the prince's
former employees, the Forbes magazine global rich list "is how he wants
the world to judge his success or his stature".
The result is "a
quarter-century of intermittent lobbying, cajoling and threatening when
it comes to his net worth listing". In 2006, the researcher responsible
for calculating his wealth writes, "when Forbes estimated that the
prince was actually worth $7 billion less than he said he was, he called
me at home the day after the list was released, sounding nearly in
tears. 'What do you want?' he pleaded, offering up his private banker in
Switzerland. 'Tell me what you need.'"
Never mind that he has his
own 747, in which he sits on a throne during flights. Never mind that
his "main palace" has 420 rooms. Never mind that he possesses his own
private amusement park and zoo – and, he claims, $700m worth of jewels.
Never mind that he's the richest man in the Arab world,
valued by Forbes at $20bn, and has watched his wealth increase by
$2bn in the past year. None of this is enough. There is no place of
arrival, no happy landing, even in a private jumbo jet. The politics of
envy are never keener than among the very rich.
guardian | Beijing's building boom has already spawned a wealth of novelty forms, with a stadium in the shape of a bird's nest, a theatre nicknamed the egg, and a TV headquarters that has been likened to a giant pair of underpants. But the official People's Daily newspaper might have trumped them all with its new office building, which appears to be modelled on a colossal phallus.
Photos of the scaffold-shrouded shaft have been circulating on Weibo,
the Chinese micro-blogging site, to the authorities' dismay, with
censors working overtime to remove the offending images. "It seems the
People's Daily is going to rise up, there's hope for the Chinese dream,"
commented one user. "Of course the national mouthpiece should be imposing," added another.
The 150m-tall tower, located in the city's eastern business district, appropriately near OMA's pants-shaped CCTV headquarters, is the work of architect Zhou Qi, a professor at Jiangsu's Southeast University.
"Our way of expression is kind of extreme," Zhou told the Modern Express
newspaper, "different from the culture of moderation that Chinese
people are accustomed to." He explained the design was inspired not by
part of his anatomy, but by the traditional Chinese philosophy of "round
sky and square earth" – the tower tapers from a square base to a
cylindrical top. He claimed that the elongated spherical form was
designed to recall the Chinese character for "people" from above. The
fact it might look like a male member from below was clearly a secondary
concern.
bnarchives | This paper outlines the contours of a new research agenda for the
analysis of food price crises. By weaving together a detailed
quantitative examination of changes in corporate profit shares with a
qualitative appraisal of the restructuring in business control over the
organisation of society and nature, the paper points to the rapid
ascendance of a new power configuration in the global political economy
of food: the Agro-Trader nexus. The agribusiness and grain trader firms
that belong to the Agro-Trader nexus have not been mere 'price takers';
instead, they have actively contributed to the inflationary restructuring
of the world food system by championing and facilitating the rapid
expansion of the first-generation biofuels sector. As a key driver of
agricultural commodity price rises, the biofuels boom has raised the
Agro-Trader nexus’s differential profits and it has at the same time
deepened global hunger. These findings suggest that food price inflation
is a mechanism of redistribution.
Just how bad things are can be determined through analysis of 2010 Census data.
The average black person lives in a neighborhood that is 45 percent
black. Without segregation, his neighborhood would be only 13 percent
black, according to professors John Logan and Brian Stults at Brown and Florida State.
Logan and Stults evaluated segregation in major cities with a dissimilarity
index, which identifies the percentage of one group that would have to
move to a different neighborhood to eliminate segregation. A score above 60 on the dissimilarity index is considered extreme.
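As a rough illustration (this is a sketch of the standard two-group formula, not Logan and Stults' actual code), the dissimilarity index can be computed from tract-level population counts of two groups: for each tract, take the absolute difference between that tract's share of each group's citywide total, sum across tracts, and halve it.

```python
def dissimilarity_index(group_a, group_b):
    """Two-group dissimilarity index on a 0-100 scale.

    group_a, group_b: per-tract population counts for each group.
    The result is the percentage of one group that would have to move
    between tracts to match the other group's distribution.
    """
    total_a, total_b = sum(group_a), sum(group_b)
    d = 0.5 * sum(abs(a / total_a - b / total_b)
                  for a, b in zip(group_a, group_b))
    return 100 * d

# Fully segregated: each group confined to its own tract
print(dissimilarity_index([100, 0], [0, 100]))  # 100.0
# Evenly mixed: identical distributions, no one needs to move
print(dissimilarity_index([50, 50], [50, 50]))  # 0.0
```

On this scale the "above 60 is extreme" threshold means more than 60 percent of one group would need to relocate to even out the two distributions.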
In the following slides, we have ranked the most segregated cities in ascending order. They are illustrated with maps of cities by race created by Eric Fischer and publicly available on Flickr. The red dots show white people, blue is black, orange is Hispanic, green is Asian, and yellow is other.
newyorker | When we all finished filing our tax returns last
week, there was a little something missing: two trillion dollars. That’s
how much money Americans may have made in the past year that didn’t get
reported to the I.R.S., according to a recent study by the economist
Edgar Feige, who’s been investigating the so-called underground, or
gray, economy for thirty-five years. It’s a huge number: if the
government managed to collect taxes on all that income, the deficit
would be trivial. This unreported income is being earned, for the most
part, not by drug dealers or Mob bosses but by tens of millions of
people with run-of-the-mill jobs—nannies, barbers, Web-site designers,
and construction workers—who are getting paid off the books. Ordinary
Americans have gone underground, and, as the recovery continues to limp
along, they seem to be doing it more and more.
Measuring an
unreported economy is obviously tricky. But look closely and you can see
the traces of a booming informal economy everywhere. As Feige said to
me, “The best footprint left in the sand by this economy that doesn’t
want to be observed is the use of cash.” His studies show that, while
economists talk about the advent of a cashless society, Americans still
hold an enormous amount of cold, hard cash—as much as seven hundred and
fifty billion dollars. The percentage of Americans who don’t use banks
is surprisingly high, and on the rise. Off-the-books activity also helps
explain a mystery about the current economy: even though the percentage
of Americans officially working has dropped dramatically, and even
though household income is still well below what it was in 2007,
personal consumption is higher than it was before the recession, and
retail sales have been growing briskly (despite a dip in March). Bernard
Baumohl, an economist at the Economic Outlook Group, estimates that,
based on historical patterns, current retail sales are actually what
you’d expect if the unemployment rate were around five or six per cent,
rather than the 7.6 per cent we’re stuck with. The difference, he
argues, probably reflects workers migrating into the shadow economy.
“It’s typical that during recessions people work on the side while
collecting unemployment,” Baumohl told me. “But the severity of the
recession and the profound weakness of this recovery may mean that a lot
more people have entered the underground economy, and have had to stay
there longer.”
The increasing importance of the gray economy
isn’t only a reaction to the downturn: studies suggest that the sector
has been growing steadily over the years. In 1992, the I.R.S. estimated
that the government was losing $80 billion a year in income-tax revenue.
Its estimate for 2006 was $385 billion—almost five times as much (and
still an underestimate, according to Feige’s numbers). The U.S. is
certainly a long way from, say, Greece, where tax evasion is a national
sport and the shadow economy accounts for twenty-seven per cent of
G.D.P. But the forces pushing people to work off the books are powerful.
Feige points to the growing distrust of government as one important
factor. The desire to avoid licensing regulations, which force people to
jump through elaborate hoops just to get a job, is another. Most
important, perhaps, are changes in the way we work. As Baumohl put it,
“For businesses, the calculus of hiring has fundamentally changed.”
Companies have got used to bringing people on as needed and then
dropping them when the job is over, and they save on benefits and
payroll taxes by treating even full-time employees as independent
contractors. Casual employment often becomes under-the-table work; the
arrangement has become a way of life in the construction industry. In a
recent California survey of three hundred thousand contractors,
two-thirds said they had no direct employees, meaning that they did not
need to pay workers’-compensation insurance or payroll taxes. In other
words, for lots of people off-the-books work is the only job available.
theatlantic | Elevated and lasting unemployment is an awful thing, anywhere, and
for anyone. But it is awful in a special way for young people, cutting
them off from networks and starting salaries at the moment they need to
forge connections and begin to cobble together a career.
A new study from the International Labor Organization
takes a global tour of youth joblessness and finds that what's gone up
won't come down in the next five years. The youth unemployment rate*
among the richest countries is projected to flat-line, rather than fall,
before 2018. As a result, the global Millennial generation could be
uniquely scarred by the economic downturn. Research by Lisa Kahn has shown that people graduating into a recession have typically faced a lifetime of lower wages.
As Ritchie King from Quartz shows in the graph to the left, it's now "harder for a teenager or
young adult to find a job in developed economies than in Sub-Saharan
Africa."
Lurking under the rise of youth unemployment among the richest countries is an even scarier trend -- the rise of long-term
youth unemployment. Long-term unemployment isn't just a difference in
length; it's a difference in kind, because the more time you spend out
of a company, the less likely you are to be hired back into one. In many European countries, particularly Spain,
the increase in unemployment has come almost exclusively from people
being out of work longer than two years. In advanced economies,
"long-term unemployment has arrived as an unexpected tax on the current
generation of youth," the ILO writes. About half of Europe's unemployed
youth have been out of work for more than six months, according to 2011
data.
American audiences are probably most interested in how
our Millennial generation compares to young people around the world. So,
from table B1 at the end of the paper, I picked a few OECD countries and graphed the last eight years of youth unemployment.
rt | Israel used "a new type of weapon", a senior official at the Syrian
military facility that came under attack from the Israeli Air Force told
RT.
“When the explosion happened it felt like an earthquake,”
said the source, who was present near the attack site on the
outskirts of Damascus on Sunday morning.
“Then a giant golden mushroom of fire appeared. This tells us
that Israel used depleted uranium shells.”
Depleted uranium is a by-product of the uranium enrichment
process that creates nuclear weapons, and was first used by the US
in the Gulf conflict of 1991. Unlike the radioactive materials used
in nuclear weapons, depleted uranium is not valued for its
explosiveness, but for its toughness – it is 2.5 times as dense as
steel – which allows it to penetrate heavy protection.
Countries using depleted uranium weapons insist that the
material is toxic, but not dangerously radioactive, as long as it
remains outside the body.
The source also claims the attack – if it managed to hit the
objects it targeted – served more of a political than a military
purpose.
“Several civilian factories and buildings were destroyed. The
target was just an ordinary weapons warehouse. The bombing is an
ultimatum to us – it had no strategic motivation.”
Western intelligence sources told the media that the strikes
targeted transfers of weapons from the Lebanese Hezbollah movement,
which is sympathetic to the government of Syrian President Bashar
Assad.
The official who spoke to RT denies this.
“There was no valuable equipment at the site. It was all
removed after a previous attack on the facility. The military
losses from this are negligible.”
guardian | The celebrated physicist Stephen Hawking became embroiled in a deepening furore today over his decision to boycott a prestigious conference in Israel in protest over the state's occupation of Palestine.
Hawking,
a world-renowned scientist and bestselling author who has had motor
neurone disease for 50 years, cancelled his appearance at the
high-profile Presidential Conference, which is personally sponsored by
Israel's president, Shimon Peres, after a barrage of appeals from
Palestinian academics.
The move, denounced by prominent Israelis
and welcomed by pro-Palestinian campaigners, entangled Cambridge
University – Hawking's academic base since 1975 – which initially
claimed the scientist's withdrawal was on medical grounds, before
conceding a political motivation.
The university's volte-face came
after the Guardian presented it with the text of a letter sent from
Hawking to the organisers of the high-profile conference in Jerusalem,
clearly stating that he was withdrawing from the conference in order to
respect the call for a boycott by Palestinian academics.
The full
text of the letter, dated 3 May, said: "I accepted the invitation to the
Presidential Conference with the intention that this would not only
allow me to express my opinion on the prospects for a peace settlement
but also because it would allow me to lecture on the West Bank. However,
I have received a number of emails from Palestinian academics. They are
unanimous that I should respect the boycott. In view of this, I must
withdraw from the conference. Had I attended, I would have stated my
opinion that the policy of the present Israeli government is likely to
lead to disaster."
Hawking's decision to throw his weight behind
the academic boycott of Israel met with an angry response from the
organisers of the Presidential Conference, an annual event hosted by
Israeli president Shimon Peres.
"The academic boycott against
Israel is in our view outrageous and improper, certainly for someone for
whom the spirit of liberty lies at the basis of his human and academic
mission," said conference chairman Israel Maimon. "Israel is a democracy
in which all individuals are free to express their opinions, whatever
they may be. The imposition of a boycott is incompatible with open,
democratic dialogue."
Daniel Taub, the Israeli ambassador to
London, said: "It is a great shame that Professor Hawking has withdrawn
from the president's conference … Rather than caving in to pressure from
political extremists, active participation in such events is a far more
constructive way to promote progress and peace."
The Wolf
Foundation, which awarded Hawking the Wolf prize in physics in 1988,
said it was "sad to learn that someone of Professor Hawking's standing
chose to capitulate to irrelevant pressures and will refrain from
visiting Israel".
But Palestinians welcomed Hawking's decision.
"Palestinians deeply appreciate Stephen Hawking's support for an
academic boycott of Israel," said Omar Barghouti, a founding member of
the Boycott, Divestment and Sanctions movement. "We think this will
rekindle the kind of interest among international academics in academic
boycotts that was present in the struggle against apartheid in South
Africa."
guardian | Professor Stephen Hawking is backing the academic boycott of Israel
by pulling out of a conference hosted by Israeli president Shimon Peres
in Jerusalem as a protest at Israel's treatment of Palestinians.
Hawking,
71, the world-renowned theoretical physicist and former Lucasian
Professor of Mathematics at the University of Cambridge, had accepted an
invitation to headline the fifth annual president's conference, Facing
Tomorrow, in June, which features major international personalities,
attracts thousands of participants and this year will celebrate Peres's
90th birthday.
Hawking is in very poor health, but last week he
wrote a brief letter to the Israeli president to say he had changed his
mind. He has not announced his decision publicly, but a statement
published by the British Committee for the Universities of Palestine
with Hawking's approval described it as "his independent decision to
respect the boycott, based upon his knowledge of Palestine, and on the
unanimous advice of his own academic contacts there".
Hawking's
decision marks another victory in the campaign for boycott, divestment
and sanctions targeting Israeli academic institutions.
In April
the Teachers' Union of Ireland became the first lecturers' association
in Europe to call for an academic boycott of Israel, and in the United
States members of the Association for Asian American Studies voted to
support a boycott, the first national academic group to do so.
In
the four weeks since Hawking's participation in the Jerusalem event was
announced, he has been bombarded with messages from Britain and abroad
as part of an intense campaign by boycott supporters trying to persuade
him to change his mind. In the end, Hawking told friends, he decided to
follow the advice of Palestinian colleagues who unanimously agreed that
he should not attend.
Hawking's decision met with abusive
responses on Facebook, with many commentators focusing on his physical
condition, and some accusing him of antisemitism.
By participating
in the boycott, Hawking joins a small but growing list of British
personalities who have turned down invitations to visit Israel,
including Elvis Costello, Roger Waters, Brian Eno, Annie Lennox and Mike
Leigh.