Saturday, May 25, 2013

First Principle - Always Bet on Black - Redux

No race that has anything to contribute to the markets of the world is long in any degree ostracized.

--Booker T. Washington

Having been invited to discuss solutions, I aim to summarize some of the experiences that have given rise to what I call an "open source business model". I should be able to accomplish this in three installments. (originally posted 12/20/07 - reposted 10/13/10 - one'mo gin today)
1. The power of cultural production.

2. The power of open source culture.

3. The power of creative collaboration.

So a few years ago, I'm experimenting with Content Management Systems (CMS) (the database-driven backend to interactive websites) in the context of an image sharing technology. I read an article in the Kansas City Star about a comics creator in Kansas City, Kansas who was experimenting with CD-Romics, or comics on compact disc. So I call this brother Anthony Jappa and invite him to meet and discuss ways in which we might collaborate.

Comics are extremely visual, the cover art is a major selling draw, and all-in-all comics would make perfect grist for the image sharing mill. Furthermore, I used to have a serious love jones for comic books. Long story short, though our initial collaboration did not yield a successful business model, we've been friends and collaborators ever since. (digitizing covers for collectors proved excessively labor intensive, we couldn't get folks to do the work themselves, and eBay was already in full effect) And our continuing collaboration has produced a proven and successful methodology for infecting young minds with the urge to create, collaborate, and commercialize. I keep the original development site we constructed as a memento of our original scope and objectives - because where we've gone is so far beyond what this effort set out to achieve.

Back to the matter at hand. Brother Jappa is a true master in the field of creative cultural production. Largely self-taught, he has reached the stage in his career where he writes books and creates animated shorts. He earns his living doing what he loves to do. He is also one of the greatest natural teachers I have ever had the privilege of meeting. He can teach you not only the discipline itself, but everything there is to know about comics creation as a cultural production business, as well. It's in that latter domain that he has taught me a helluva lot about the business's history, its power, and the changes in the nature of the thing that have resulted from monopolistic control of comics creation, publication, and distribution. That monopolistic control is the business reality against which he's had to contend for over a dozen years.

His innovative flirtation with CD-Romics was a preliminary effort to use technology to get around some of those distribution control barriers. When I met him, I was vaguely aware that the comics game had changed substantially since I was a kid, but I had no idea concerning the specific nature of the changes that had taken place. See, I was a hardcore comic book collector when I was a boy. Back in the day, a dollar would get you 4 comics and an afternoon of mind-expanding escape. Moreover, you could buy comics in any convenience store, many grocery stores, and almost all pharmacies; they were everywhere newsstands were to be found. Comics were not only the cheapest and best entertainment available to me - they became the core of my very first business, as well. By the age of 12, I had become a collector/dealer of comics. By the time I was 20, my collection was extensive enough that I was able to pay for my first two years of college tuition and expenses through the sale of much of that collection.

As anybody who has shared this addiction knows, comics are no longer ubiquitous and a dollar won't come anywhere near buying you even a single comic book. Jappa explained to me how the Marvel/DC/Diamond publication and distribution monopolies had fundamentally altered the landscape of comic book availability and, further, how the dominant business practices in the industry have made it exceedingly difficult for independent comics creators to break into this field of cultural production and entertainment in any meaningful way. For his business to succeed - and there's no question that his work is world-class - Jappa's had to hustle and grind like a madman, getting breaks here and there with dealers, working the convention circuit, and doing everything he can to figure out alternative pathways by which he can achieve mass distribution and proliferation of his work product.

Because my active interest in comics had fallen off after I sold the bulk of my collection, it had been years since I had paid comics any heed at all. However, I had noticed that comics were by no means as available as they once had been. Everything that Jappa taught me about the comic publication and distribution industry was a smaller scale instantiation of what Norman Kelley had written a few years earlier about the music industry.

Today, a great deal of public attention is misdirected toward issues of content in popular cultural production. Content is merely a symptom; it is not the root cause of the present malaise in popular cultural production. Control of production and distribution is the cause of the current malaise. There are worlds upon worlds of original, wholesome, uplifting, enlightening, and entertaining cultural production that never get published or put into widespread distribution. Folks lose their minds debating high/low distinctions of culture, and anybody with an opinion is qualified to enter the fray. However, real practitioners and specialists in the business know that most of the actual barriers to the market reduce to business and interpersonal nuts and bolts. These include issues like:

• how artists are recruited,
• how contracts are structured for maximum profits for record firms,
• how much firms spend on the production of an artist's work,
• whether artists make their living solely by selling units or doing performances (a situation similar to that of blues musicians),
• how musicians lose the copyright to their music,
• the lack of royalty payments, and
• the Big Six publication and distribution monopoly.

On a smaller scale, the comics industry is identical.

Very likely, there never will be a substitute for the paper comic that is the equivalent of the CD substitute for vinyl record albums, or of the rapid and destabilizing emergence of MP3s as a substitute for CDs. But you never know. Brother Jappa has twelve years invested in the game and he's still hustling, grinding, and innovating in order to make that breakthrough. I'm going to support him with technology any and every way I know how.

Back to the matter at hand, the power of cultural production - one of the other fundamentals I learned from Jappa is the universality of the basic storyboard as the basis for all complex narrative storytelling. If you look into the basement of any movie that has ever been made, what you'll find is that comics are the creative and preproduction grist for that mill, i.e., storyboards that lay out the visual as well as the narrative flow that will be converted to the screen. The same goes for complex games and video games. The comics creation discipline is a fundamental prerequisite to any complex narrative cultural production that takes place in our society. It is a linchpin of complex cultural production.

Readers of the assault know full well the power of images to manipulate and control the subliminal consciousness. As a singular historian and master of this game, Jappa has taught me a great deal about the intent and use of images across the history of comics - on both conscious and subconscious levels - to convey specific messages to the audience these comics are created to address.

Comics creation is an immensely powerful and fundamental narrative discipline that is imperative for us to master and control. There are no barriers to entry on the creation side, and we're working overtime to figure out methods for short-circuiting and overcoming the market barriers posed by the dominant distribution monopoly. Technology is key to overcoming those barriers. The last really powerful aspect of this game is that both boys and girls enjoy and gravitate toward various aspects of the creative work involved with it. Children love to create, they love to do cultural production work, and they love to learn all the ins-and-outs of the technique involved with doing it well. It's a carrot with which to draw children into an oasis of creation, collaboration, disciplined development of technique, and, last but not least, the potential commercial rewards of harnessing their imaginations and plying their creative work products in the marketplace.

Matter of fact, it's the initial sugarcoating I've found effective for drawing children into routine use of and increasing familiarity with open source technologies. We make a concerted effort to use only open source software in our digital production efforts. I'll explain this more fully in the next installment. It's important because not only do we want our kids to be creative producers who ply the work product of their imaginations in the market, we also want them to be creative technologists capable of directly controlling and modifying the tools used to do the fun stuff.

Work with open source tools and technologies conduces to exposure to the wide world of open source culture, as well - and that's a culture that is fundamentally all about surmounting control and distribution barriers. Our solutions will emerge from those small, dedicated crews that learn how to surmount current control and distribution barriers and collaborate with like-minded others to proliferate that knowhow far and wide...,

observational learning...,



wikipedia | Observational learning is the learning that occurs through observing the behavior of other people. Albert Bandura, who is best known for the classic Bobo doll experiment, discovered this basic form of learning in 1986. Bandura stressed the importance of observational learning because it allowed children especially, to acquire new responses through observing others' behavior. This form of learning does not need reinforcement to occur; instead, a model is required. A social model can be a parent, sibling, friend, or teacher, but particularly in childhood a model is someone of authority or higher status. A social model is significantly important in observational learning because it allows one to cognitively process behavior, encode what is observed, and store it in memory for later imitation. While the model may not be intentionally trying to instill any particular behavior, many behaviors that one observes, remembers and imitates are actions that models display. A child may learn to swear, smack, smoke, and deem other inappropriate behavior acceptable through poor modeling. Bandura claims that children continually learn desirable and undesirable behavior through observational learning. Observational learning suggests that an individual's environment, cognition, and behavior all integrate and ultimately determine how one functions.[1] Through observational learning, behaviors of an individual can spread across a culture through a process known as diffusion chain, which basically occurs when an individual first learns a behavior by observing another individual and that individual serves as a model through whom other individuals will learn the behavior and so on so forth.[2]

Culture and environment also play a role in whether observational learning will be the dominant learning style in a person or community. In some cultures, children are expected to actively participate in their communities and are therefore exposed to different trades and roles on a daily basis.[3] This exposure allows children to observe and learn the different skills and practices that are valued in their communities.[4] In communities where children's primary mode of learning is through observation, the children are rarely separated from adult activities. This incorporation into the adult world at an early age allows children to use observational learning skills in multiple spheres of life. Culturally, they learn that their participation and contributions are valued in their communities. This teaches children that it is their duty as members of the community to observe contributions being made in order to gradually become involved and participate further in the community.[5]

coopetition...,


wikipedia | Coopetition or Co-opetition (sometimes spelled "coopertition" or "co-opertition") is a neologism coined to describe cooperative competition. Coopetition is a portmanteau of cooperation and competition.

Basic principles of co-opetitive structures have been described in game theory, a scientific field that received more attention with the book Theory of Games and Economic Behavior in 1944 and the works of John Forbes Nash on non-cooperative games. It is also applied in the fields of political science and economics, and even universally [works of V. Frank Asaro, J.D.: Universal Co-opetition, 2011, and The Tortoise Shell Code, novel, 2012].

Coopetition occurs when companies interact with partial congruence of interests. They cooperate with each other to reach higher value creation than they could create without interacting, while struggling against each other to achieve competitive advantage.

Often coopetition takes place when companies that are in the same market work together in the exploration of knowledge and research of new products, at the same time that they compete for market share of their products and in the exploitation of the knowledge created. In this case, the interactions occur simultaneously and at different levels in the value chain. This is the case of the arrangement between PSA Peugeot Citroën and Toyota to share components for a new city car - simultaneously sold as the Peugeot 107, the Toyota Aygo, and the Citroën C1 - where the companies save money on shared costs while remaining fiercely competitive in other areas. Several advantages can be foreseen, such as cost reductions, resource complementarity, and technological transfer. Some difficulties also exist, such as distribution of control, equity in risk, complementary needs, and trust. Coopetitive interaction is not limited to two companies; several partnerships among competitors are possible.
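A concrete, minimal illustration of the game-theory framing above, sketched in Python. The scenario and payoff numbers are invented for illustration (they are not drawn from the Peugeot/Toyota case); the point is only that mutual cooperation on shared components can be a stable equilibrium between firms that remain rivals on market share.

payoffs = {  # (firm_a_choice, firm_b_choice) -> (firm_a_profit, firm_b_profit)
    ("share", "share"): (8, 8),        # a shared platform lowers both firms' costs
    ("share", "go_alone"): (3, 6),     # the lone sharer bears the setup costs
    ("go_alone", "share"): (6, 3),
    ("go_alone", "go_alone"): (5, 5),  # duplicated R&D spending for both
}

def best_response(opponent_choice, player_index):
    # Return the choice that maximizes this player's payoff, given the rival's choice.
    def payoff(choice):
        pair = (choice, opponent_choice) if player_index == 0 else (opponent_choice, choice)
        return payoffs[pair][player_index]
    return max(["share", "go_alone"], key=payoff)

# A profile is a Nash equilibrium if each firm's choice is a best response to the other's.
for a in ["share", "go_alone"]:
    for b in ["share", "go_alone"]:
        if best_response(b, 0) == a and best_response(a, 1) == b:
            print("equilibrium:", (a, b), "payoffs:", payoffs[(a, b)])

# Both ("share", "share") and ("go_alone", "go_alone") come out as equilibria here:
# cooperating on components is stable and better for both, yet the firms still
# compete head-to-head once the cars reach the market.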

Friday, May 24, 2013

psychiatry and drug companies as bad as teachers and the academy....,



sciencemag | The new edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) is being released this month amid great controversy, with many in the field questioning whether psychiatry’s main diagnostic guide needs an overhaul, or to be completely abandoned. Some argue that it’s time to start from scratch and create a new system for diagnosing mental illness based on biological data. Others say that the manual turns too many aspects of normal life, such as grief, into medical conditions and that it’s time to look more closely at the social dimensions of psychiatric disorders, such as our relationships with friends and family.

MOOC - Maximizing Outreach to Outsider Communities?


emory | When Coursera first began partnering with top universities to bring MOOCs (massive open online courses) to a worldwide audience, the enrollment numbers created a shockwave.

Suddenly, tens of thousands of students were signing up to take a single online class, recalls Kimbi Hagen, one of Emory's early pioneers in the free, not-for-credit online experiment.

Now that Hagen, who is assistant professor in the department of behavioral sciences and health education at Rollins School of Public Health and assistant director of Emory's Center for AIDS Research, has just completed teaching one of Emory's first three MOOCs through Coursera, she realizes those enrollment numbers don't tell the whole story.

Of the 18,600 students from 174 countries who initially enrolled in her nine-week Coursera class on AIDS, some 10,601 actively participated, keeping up with online discussion forums, essays and quizzes. Untold numbers also signed up to simply audit the course material.

But through the personal stories that began filtering back, Hagen realized that her course had a far greater reach than she expected.

The class drew a range of participants, from health professionals and educators to college students and the curious.

One student, who had adopted four HIV-positive children, took the course to "learn to be the best parent and support person possible." A high school teacher, alarmed at the number of HIV-positive students at her school, sought "the right information" to share with sexually active adolescents. Another never had the courage to reveal his HIV-positive status to family and co-workers before taking the class.

All told, it was a vibrant, engaged community eager to discuss what they were learning, through online forums and beyond.

"There were many situations where people were gathering to watch (the online course), be it a village in Nigeria or an athletic team here in the U.S.," Hagen recalls.

In fact, it wasn't unusual to hear about efforts to gather an entire village, Peace Corps team or hospital staff to share and discuss her video, says Hagen, who jokes that MOOC could just as easily stand for "Maximizing Outreach to Outsider Communities."

Hagen recalls a Muslim student living in an Islamic country (she prefers to protect the location) who "would watch the videos and go from village to village to share with other women what she'd learned."

Going into the Coursera experiment, Hagen had no idea of its full potential. But observing students embrace the topic and become educators themselves, dispersing their knowledge to others -- for a teacher, she says, it doesn't get much better.

"This is easily one of the most significant things I've ever done in my entire life," Hagen says.

"And it's absolutely what the Rollins School of Public Health exists to do, what public health is really all about." 

Thursday, May 23, 2013

MOOCs still about Delivery of Instruction rather than Student Learning - why MOOLOs are coming....,

Talk is about MOOCs - it will shift to MOOLOs...,

hakesedstuff | ABSTRACT: Joshua Kim in his Inside Higher Ed report “ ‘Laptop U’ Misses the Real Story” at http://bit.ly/11b4tNd correctly pointed out two problems in Nathan Heller’s otherwise exemplary New Yorker article “Laptop U” at http://nyr.kr/10MmItb (paraphrasing):

1. “The online and blended education world, really the higher ed world where most of us spend our days, fails to make any appearance.”

2. “If in fact the real story is the rise of blended and online learning, then [that story] will go completely untold if MOOCs are the sole focus.”

In my opinion, two other problems are that “Laptop U”:

3. Fails to emphasize the fact that MOOCs, like most Higher Ed institutions, concentrate on DELIVERY OF INSTRUCTION rather than STUDENT LEARNING to the detriment of their effectiveness - - see “From Teaching to Learning: A New Paradigm for Undergraduate Education” [Barr and Tagg (1995)] at http://bit.ly/8XGJPc.

4. Ignores the failure of MOOC providers to gauge the effectiveness of their courses by pre-to-postcourse measurement of student learning gains utilizing “Concept Inventories” http://bit.ly/dARkDY. As I pointed out in “Is Higher Education Running AMOOC?” [Hake (2013)] at http://yhoo.it/12nPMZB, such assessment would probably demonstrate that MOOCs are actually MOORFAPs (Massive Open Online Repetitions of FAiled Pedagogy). There would then be some incentive to transform MOOCs into MOOLOs (Massive Open Online Learning Opportunities).
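For readers unfamiliar with how such pre-to-post measurement is usually summarized, here is a minimal Python sketch of the normalized gain statistic commonly reported with concept inventories, <g> = (post% - pre%) / (100% - pre%). The class averages below are invented for illustration; they are not results from any actual MOOC or study cited here.

def normalized_gain(pre_pct, post_pct):
    # Fraction of the possible improvement that the class actually achieved.
    if pre_pct >= 100:
        raise ValueError("pre-test average is already at ceiling")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages (percent correct) on the same concept inventory.
courses = {
    "delivery-of-instruction course": (42.0, 54.0),
    "interactive MOOLO-style course": (42.0, 75.0),
}

for name, (pre, post) in courses.items():
    print(f"{name}: pre={pre:.0f}%, post={post:.0f}%, <g>={normalized_gain(pre, post):.2f}")

# In the physics education literature, traditional lecture courses typically land
# near <g> of about 0.2, while interactive-engagement courses average roughly double
# that - the kind of gap the abstract expects MOOC assessment would reveal.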

though he swears by only four, this is the notion I believe umbra is on about...,


wikipedia | A concept inventory is a criterion-referenced test designed to evaluate whether a student has an accurate working knowledge of a specific set of concepts. To ensure interpretability, it is common to have multiple items that address a single idea. Typically, concept inventories are organized as multiple-choice tests in order to ensure that they are scored in a reproducible manner, a feature that also facilitates administration in large classes. Unlike a typical, teacher-made multiple-choice test, questions and response choices on concept inventories are the subject of extensive research. The aims of the research include ascertaining (a) the range of what individuals think a particular question is asking and (b) the most common responses to the questions. Concept inventories are evaluated to ensure test reliability and validity. In its final form, each question includes one correct answer and several distractors. The distractors are incorrect answers that are usually (but not always) based on students' commonly held misconceptions.[1]

Ideally, a score on a criterion-referenced test reflects the amount of content knowledge a student has mastered. Criterion-referenced tests differ from norm-referenced tests in that (in theory) the former is not used to compare an individual's score to the scores of the group. Ordinarily, the purpose of a criterion-referenced test is to ascertain whether a student mastered a predetermined amount of content knowledge; upon obtaining a test score that is at or above a cutoff score, the student can move on to study a body of content knowledge that follows next in a learning sequence. In general, item difficulty values ranging between 30% and 70% are best able to provide information about student understanding.

Distractors are often based on ideas commonly held by students, as determined by years of research on misconceptions. Test developers often research student misconceptions by examining students' responses to open-ended essay questions and conducting "think-aloud" interviews with students. The distractors chosen by students help researchers understand student thinking and give instructors insights into students' prior knowledge (and, sometimes, firmly held beliefs). This foundation in research underlies instrument construction and design, and plays a role in helping educators obtain clues about students' ideas, scientific misconceptions, and didaskalogenic, that is, teacher-induced confusions and conceptual lacunae that interfere with learning.
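As a small sketch of the two statistics described above - item difficulty (the proportion of students answering an item correctly) and the popularity of each distractor - here is a Python example. The response matrix and answer key are invented for illustration.

from collections import Counter

# Each row is one student's answers to a 3-item multiple-choice inventory.
responses = [
    ["A", "C", "B"],
    ["A", "D", "B"],
    ["B", "C", "B"],
    ["A", "C", "C"],
    ["A", "C", "B"],
]
answer_key = ["A", "C", "B"]

for item, correct in enumerate(answer_key):
    counts = Counter(row[item] for row in responses)
    difficulty = counts[correct] / len(responses)   # proportion answering correctly
    distractors = {choice: n for choice, n in counts.items() if choice != correct}
    flag = "" if 0.30 <= difficulty <= 0.70 else " (outside the 30-70% band)"
    print(f"item {item + 1}: difficulty={difficulty:.2f}{flag}, distractor counts={distractors}")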

Wednesday, May 22, 2013

eusociality, mooc intelligence sorting, geolocation and climate change....,



livescience | Earlier this month, a group of policy and legal experts from around the world met at an event co-hosted by the Centre for Spatial Law and Policy and Harvard University's Center for Geographic Analysis to examine the challenges related to our ever-evolving location-enabled society. It was a truly fascinating event with eye-opening presentations on smart transportation systems, tweet-mapping and Google Glass.

As experts openly debated the good and bad of the current Wild West era of geospatial technologies, it became clear that its current and sometimes lawless advancement is influencing trends in more traditional, related areas, such as Earth observations and environmental information.

Consider the following: Last week, Climate Central posted a report that found that "Six months after [Superstorm] Sandy, data from the eight hardest hit states shows that 11 billion gallons of untreated and partially treated sewage flowed into rivers, bays, canals, and in some cases city streets, largely as a result of record storm-surge flooding that swamped the region's major sewage treatment facilities." About the same time, Space Daily published a story on how development banks are using Earth observations to better monitor and track projects and investment globally. The BBC and NPR, in turn, reported that digitized Nimbus 1 satellite data from 1964 clarified the extent of ice cover in the Antarctic at that time, confirming the theory that sea ice is shrinking.

Those very different stories have much in common. They all illustrate the importance of geospatial technologies in better identifying, understanding and managing changing environmental conditions.

But, as we look at the changing planet and try to determine how best to respond or adapt to its uncertainty, we can be certain that:
  • People want and need environmental information like never before;
  • Demand coupled with new technologies and resources will enable access and application of that data and information like never before; and
  • With personal, economic, and national security interests driving the use of that information, new policy and legal issues will arise like never before.
Some of those issues are the changing roles of the public and private sector, calls for more open data and information policies, and the demand for environmental information.

deficit in nation's aquifers accelerating...,

usgs | A new U.S. Geological Survey study documents that the Nation's aquifers are being drawn down at an accelerating rate.
 
Groundwater Depletion in the United States (1900-2008) comprehensively evaluates long-term cumulative depletion volumes in 40 separate aquifers (distinct underground water storage areas) in the United States, bringing together reliable information from previous references and from new analyses.

"Groundwater is one of the Nation's most important natural resources. It provides drinking water in both rural and urban communities. It supports irrigation and industry, sustains the flow of streams and rivers, and maintains ecosystems," said Suzette Kimball, acting USGS Director. "Because groundwater systems typically respond slowly to human actions, a long-term perspective is vital to manage this valuable resource in sustainable ways."

To outline the scale of groundwater depletion across the country, here are two startling facts drawn from the study's wealth of statistics. First, from 1900 to 2008, the Nation's aquifers, the natural stocks of water found under the land, decreased (were depleted) by more than twice the volume of water found in Lake Erie. Second, groundwater depletion in the U.S. in the years 2000-2008 can explain more than 2 percent of the observed global sea-level rise during that period.  

Since 1950, the use of groundwater resources for agricultural, industrial, and municipal purposes has greatly expanded in the United States. When groundwater is withdrawn from subsurface storage faster than it is recharged by precipitation or other water sources, the result is groundwater depletion. The depletion of groundwater has many negative consequences, including land subsidence, reduced well yields, and diminished spring and stream flows.

While the rate of groundwater depletion across the country has increased markedly since about 1950, the maximum rates have occurred during the most recent period of the study (2000–2008), when the depletion rate averaged almost 25 cubic kilometers per year. For comparison, 9.2 cubic kilometers per year is the historical average calculated over the 1900–2008 timespan of the study.
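A quick back-of-envelope check of the figures in this release, sketched in Python. The Lake Erie volume, world-ocean area, and observed sea-level-rise rate are approximate reference values supplied here; they are not taken from the USGS text.

LAKE_ERIE_KM3 = 484.0           # approximate volume of Lake Erie
OCEAN_AREA_KM2 = 3.61e8         # approximate area of the world ocean
OBSERVED_SLR_MM_PER_YR = 3.0    # approximate observed sea-level rise in the 2000s

depletion_rate_km3_per_yr = 25.0    # national rate quoted for 2000-2008

# Spread that annual volume over the ocean surface and compare to observed rise.
rise_mm_per_yr = depletion_rate_km3_per_yr / OCEAN_AREA_KM2 * 1e6   # km -> mm
print(f"sea-level equivalent: {rise_mm_per_yr:.3f} mm/yr, "
      f"about {rise_mm_per_yr / OBSERVED_SLR_MM_PER_YR:.0%} of the observed rise")

# The High Plains figure quoted below (10.2 cubic kilometers per year) is indeed
# roughly 2 percent of Lake Erie's volume.
print(f"10.2 km^3 is {10.2 / LAKE_ERIE_KM3:.1%} of Lake Erie")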

One of the best known and most investigated aquifers in the U.S. is the High Plains (or Ogallala) aquifer. It underlies more than 170,000 square miles of the Nation's midsection and represents the principal source of water for irrigation and drinking in this major agricultural area. Substantial pumping of the High Plains aquifer for irrigation since the 1940s has resulted in large water-table declines that exceed 160 feet in places.

The study shows that, since 2000, depletion of the High Plains aquifer appears to be continuing at a high rate. The depletion during the last 8 years of record (2001–2008, inclusive) is about 32 percent of the cumulative depletion in this aquifer during the entire 20th century. The annual rate of depletion during this recent period averaged about 10.2 cubic kilometers, roughly 2 percent of the volume of water in Lake Erie. Fist tap Dale.

dry-land agriculture tough but not apocalyptic...,



NYTimes | Forty-nine years ago, Ashley Yost’s grandfather sank a well deep into a half-mile square of rich Kansas farmland. He struck an artery of water so prodigious that he could pump 1,600 gallons to the surface every minute. Last year, Mr. Yost was coaxing just 300 gallons from the earth, and pumping up sand in order to do it. By harvest time, the grit had robbed him of $20,000 worth of pumps and any hope of returning to the bumper harvests of years past. 

“That’s prime land,” he said not long ago, gesturing from his pickup at the stubby remains of last year’s crop. “I’ve raised 294 bushels of corn an acre there before, with water and the Lord’s help.” Now, he said, “it’s over.” 

The land, known as Section 35, sits atop the High Plains Aquifer, a waterlogged jumble of sand, clay and gravel that begins beneath Wyoming and South Dakota and stretches clear to the Texas Panhandle. The aquifer’s northern reaches still hold enough water in many places to last hundreds of years. But as one heads south, it is increasingly tapped out, drained by ever more intensive farming and, lately, by drought.

Vast stretches of Texas farmland lying over the aquifer no longer support irrigation. In west-central Kansas, up to a fifth of the irrigated farmland along a 100-mile swath of the aquifer has already gone dry. In many other places, there no longer is enough water to supply farmers’ peak needs during Kansas’ scorching summers.

And when the groundwater runs out, it is gone for good. Refilling the aquifer would require hundreds, if not thousands, of years of rains. 

This is in many ways a slow-motion crisis — decades in the making, imminent for some, years or decades away for others, hitting one farm but leaving an adjacent one untouched. But across the rolling plains and tarmac-flat farmland near the Kansas-Colorado border, the effects of depletion are evident everywhere. Highway bridges span arid stream beds. Most of the creeks and rivers that once veined the land have dried up as 60 years of pumping have pulled groundwater levels down by scores and even hundreds of feet.

On some farms, big center-pivot irrigators — the spindly rigs that create the emerald circles of cropland familiar to anyone flying over the region — now are watering only a half-circle. On others, they sit idle altogether.

Two years of extreme drought, during which farmers relied almost completely on groundwater, have brought the seriousness of the problem home. In 2011 and 2012, the Kansas Geological Survey reports, the average water level in the state’s portion of the aquifer dropped 4.25 feet — nearly a third of the total decline since 1996. 

And that is merely the average. “I know my staff went out and re-measured a couple of wells because they couldn’t believe it,” said Lane Letourneau, a manager at the State Agriculture Department’s water resources division. “There was a 30-foot decline.” 

Kansas agriculture will survive the slow draining of the aquifer — even now, less than a fifth of the state’s farmland is irrigated in any given year — but the economic impact nevertheless will be outsized. In the last federal agriculture census of Kansas, in 2007, an average acre of irrigated land produced nearly twice as many bushels of corn, two-thirds more soybeans and three-fifths more wheat than did dry land.

Farmers will take a hit as well. Raising crops without irrigation is far cheaper, but yields are far lower. Drought is a constant threat: the last two dry-land harvests were all but wiped out by poor rains.

In the end, most farmers will adapt to farming without water, said Bill Golden, an agriculture economist at Kansas State University. “The revenue losses are there,” he said. “But they’re not as tremendously significant as one might think.”

weather getting worse and our ability to forecast not keeping up..,



livescience | Whether fiscal, political or global, we are living in an environment of change. Unfortunately, although our natural environment is changing drastically, our national response to deal with it is not.

During last Thursday's House Appropriations Subcommittee hearing on the fiscal year 2014 budget, Chairman Frank Wolf and Ranking Member Chaka Fattah cautioned those present that the nation's fiscal situation simply will not allow for new funding or the expansion of programs. As I sat there listening in full agreement, I couldn't help but wonder why there haven't been more solutions put forward to improve current investments in numerous areas related to commerce, justice and science. Surely, this is a problem we — the most technologically advanced nation in the world — can fix.

Our environmental information capability is a good example. Extreme weather and climatic events have had tremendous social and economic impacts on the nation. Numerous respected institutions, such as the National Research Council and the Government Accountability Office (GAO), have repeatedly called attention to the decline of U.S. Earth-monitoring capabilities such as vital weather satellites. Yet, we have not seen any change in how that investment is made or managed.

Just two weeks ago, GAO added weather satellites to its high-risk list, citing concerns over a potential gap in weather satellite coverage of 17 to 53 months beginning in 2014. As reported broadly through the media these last few weeks, our nation has now fallen behind Europe in weather forecast modeling. The Reinsurance Association of America estimates the insured value of U.S. coasts at $9 trillion, yet the country has only a small, emerging, operational ocean-observing capability. Despite more than 60 percent of the continental U.S. experiencing drought last summer, our national drought monitoring and forecasting capabilities continue to face funding challenges.

Finally, while more and more national security experts identify climate change as a major threat, the country has yet to establish an operational long-term forecasting capability. Our nation's annual investment in that area is estimated at $3 billion, spread across 17 federal agencies. Considering the following statistics from Munich Reinsurance's U.S. Natural Catastrophe Update for 2012, shouldn't we be asking whether this amount, and how it is being invested, is adequate to protect America's future?

Tuesday, May 21, 2013

is higher education running AMOOC?

hakesedstuff | ABSTRACT: My discussion-list post “Evaluating the Effectiveness of College” at http://yhoo.it/16cJ7HO concerned the failure of U.S. higher education to emphasize student learning rather than the delivery of instruction [Barr and Tagg (1995)] at http://bit.ly/8XGJPc. In response, a correspondent asked me “Is There Some Hope In Coursera’s Pedagogical Foundations?”

Despite the serious cracks detected in all but one of Coursera’s five pedagogical foundation stones, I don’t think Coursera is necessarily doomed to pedagogic collapse. Instead I think there may actually be some hope IF its MOOCs are evaluated by measurement of pre-to-post-course student learning gains using Concept Inventories http://bit.ly/dARkDY. If the physics education reform effort is any guide, then (a) such assessment will demonstrate that MOOCs are actually MOORFAPs (Massive Open Online Repetitions of FAiled Pedagogy), and (b) there will be some incentive to transform MOOCs into MOOLOs (Massive Open Online Learning Opportunities).

But even if MOOCs fail to become MOOLOs there still may be some hope since, as Keith Devlin (2013) points out at http://bit.ly/14440kt, MOOCs have the potential to uncover individuals world-wide who have the talent to learn from MOORFAPs, in the same way that most current professional physicists were able to learn physics from FAPs (Failed Academic Pedagogy).

For those who may wish to dig deeper into the MOOC milieu I recommend Nathan Heller’s (2013) scholarly “LAPTOP U: Has the future of college moved online?” at http://nyr.kr/10MmItb.

why open online education is flourishing...,



Fist tap Dale.

Monday, May 20, 2013

a new book release from the people who brought you the Obamamandian Candidate....,

brookings | On May 20, the Metropolitan Policy Program at Brookings will host an event marking the release of Confronting Suburban Poverty in America, co-authored by Elizabeth Kneebone and Alan Berube. They, along with some of the nation’s leading anti-poverty experts, including Luis Ubiñas, president of the Ford Foundation, and Bill Shore, founder and CEO of Share our Strength, will join leading local innovators from across the country to discuss a new metropolitan opportunity agenda for addressing suburban poverty, how federal and state policymakers can deploy limited resources to address a growing challenge, and why building on local solutions holds great promise.

Synopsis:
It has been nearly a half century since President Lyndon Johnson declared his War on Poverty, setting in motion development of America’s modern safety net. Back in the 1960s, tackling poverty “in place” meant focusing resources in the inner city and in isolated rural areas. The suburbs were home to middle- and upper-class families—affluent commuters and homeowners who did not want to raise kids in the city. But the America of 2012 is a very different place. Poverty is no longer just an urban or rural problem but increasingly a suburban one as well.

In Confronting Suburban Poverty in America, Elizabeth Kneebone and Alan Berube take on the new reality of metropolitan poverty and opportunity in America. For decades, suburbs added poor residents at a faster pace than cities, so that suburbia is now home to more poor residents than central cities, composing over a third of the nation’s total poor population. Unfortunately, the antipoverty infrastructure built over the past several decades does not fit this rapidly changing geography. The solution no longer fits the problem. Kneebone and Berube explain the source and impact of these important developments; moreover, they present innovative ideas on addressing them.

The spread of suburban poverty has many causes, including job sprawl, shifts in affordable housing, population dynamics, immigration, and a struggling economy. As the authors explain in Confronting Suburban Poverty in America, it raises a number of daunting challenges, such as the need for more (and better) transportation options, services, and financial resources. But necessity also produces opportunity—in this case, the opportunity to rethink and modernize services, structures, and procedures so that they better reflect and address new demands. This book embraces that opportunity.

The authors put forward a series of workable recommendations for public, private, and nonprofit leaders seeking to modernize poverty alleviation and community development strategies and connect residents with economic opportunity. They describe and evaluate ongoing efforts in metro areas where local leaders are learning how to do more with less and adjusting their approaches to address the metropolitan scale of poverty—for example, collaborating across sectors and jurisdictions, using data and technology in innovative ways, and integrating services and service delivery. Kneebone and Berube combine clear prose, original thinking, and illustrative graphics in Confronting Suburban Poverty in America to paint a new picture of poverty in America as well as the best ways to combat it.

the poor ye shall always have with you, in the burbs....,


kcstar.com | The number of impoverished people in America’s suburbs surged 64 percent in the past decade, creating for the first time a landscape in which the suburban poor outnumber the urban poor, a new report shows.

An extensive study by the Brookings Institution found that poverty is growing in the suburbs at more than twice the pace that it’s growing in urban centers. The collapse of the housing market and the subsequent foreclosure crisis were cited as aggravating a problem that was developing before recession struck in the late 2000s.

By 2011, the suburban poor in the nation’s major metropolitan areas outnumbered those living in urban centers by nearly 3 million, according to Confronting Suburban Poverty in America, a book to be released today by Brookings’ Metropolitan Policy Program.

The study placed the number of suburban poor at 16.4 million in 2011, up from about 10 million in 2000.

Around Kansas City, patterns of poverty have been quietly shifting for some time. But the economic downturn and job losses brought suburban poverty out of the shadows, said Karen Wulfkuhle, executive director of United Community Services of Johnson County.

“In the last three or four years, we’ve seen a growing understanding and recognition of suburban poverty,” she said. “It’s hitting people who have been here (in Johnson County) all their lives.”

More than 12 percent of Johnson County children 5 years old or younger lived below the poverty line in 2011. That figure was just 4.5 percent in 2008, Wulfkuhle said.

“Poverty isn’t a static thing,” she added. “People don’t stay on one side of the (poverty) line or the other. They move back and forth.”

More than 23,000 pupils in the county’s public schools qualified for free or reduced-price lunches in the 2012-13 school year — triple the number from a decade ago.

In suburban Platte County, the number of persons receiving food stamps climbed 11 percent between 2009 and the end of last year. Cass County saw a 15 percent jump in that time.

The Brookings study attributed part of the shifting poverty patterns to overall population growth in the nation’s suburbs, where much of the housing stock is more than 50 years old.

The authors said the trends demand new approaches in social-welfare efforts, which currently emphasize “place-based” programs to help neighborhoods with large concentrations of poor residents. Suburban poverty, by contrast, tends to be diffuse and spread across fragmented communities.

“Poverty is touching more people and places than before, challenging outdated notions of where poverty is and who it affects,” said co-author Elizabeth Kneebone.

squeezed out of the affluent urban core...,


mercurynews | The middle-class American dream, which resided for more than half a century in leafy suburban enclaves such as Mountain View, Lafayette and Antioch -- homogeneous bulwarks built by GI loans and fortified by white flight -- has given way to an alarming rise in suburban poverty over the past decade, according to a study by the Brookings Institution scheduled for release Monday.


While the poor are with us everywhere in greater numbers than ever before, the authors of "Confronting Suburban Poverty in America" conclude that the Bay Area's two largest metropolitan areas have experienced the spread of this scourge in starkly different ways.

The percentage of people living in poverty in the suburbs rose 56.1 percent in the San Francisco-Oakland-Fremont metro area from 2000 to 2011, compared to 64 percent nationwide. The San Jose-Sunnyvale-Santa Clara metropolitan region surged 53.1 percent. But Silicon Valley experienced a corresponding rise (49 percent) among its urban poor, while in San Francisco, inner-city poverty increased by only 18.4 percent.

"We have a way of dealing with poverty in America that is about five decades old," says Alan Berube, the book's co-author -- along with Elizabeth Kneebone -- and Brookings Metropolitan Policy Program Senior Fellow. "And it's built for where poverty was then: primarily in inner cities. The way the programs are structured and delivered really doesn't compute for a lot of suburbia and the increasing number of low-income people who are living there."

Unknown consequences
Suburban life, which became a fixture of American postwar mythologizing, reached its apex with the release of “E.T. the Extra-Terrestrial,” Steven Spielberg's 1982 film about life in cookie-cutter tract housing. According to the Brookings study, however, poverty during the past decade grew twice as fast in the suburbs as in cities. By 2011, 3 million more poor people lived in suburbs of the nation's major metropolitan areas than in its big cities.

The book actually opens with a description of the suburbs of East Contra Costa County -- places such as Oakley, Antioch and Brentwood -- where the number of people living below the poverty line grew by more than 70 percent in the past decade. Berube says the high cost of living in San Francisco simply pushed the urban poor who bus restaurant tables and drive cabs into a kind of blight flight.

"Try living somewhere in the city of San Francisco on $20,000 a year for a family of four," Berube says. "A lot of families saw an opportunity to live in a safer community, and in a better housing unit, way out in East Contra Costa County. It was a very rational decision in response to a shrinking supply of affordable housing, but I don't think we thought about what would be the consequences for those families when they got there."

nationwide establishment full-court press to get back on agenda....,

abqjournal | The Albuquerque metropolitan area ranks eighth in the country for suburban poverty, according to a new book published by the Brookings Institution.

Albuquerque’s eighth place comes from a 17 percent suburban poverty rate, which falls behind the list-topping Texas metropolitan areas of El Paso and McAllen, with suburban poverty rates of 36.4 percent and 35.4 percent.

The metropolitan areas with the lowest suburban poverty rates are Des Moines, Iowa, with 5.7 percent, the Bridgeport-Stamford, Conn., area with 5.9 percent and Baltimore, with 6.7 percent.

The book, “Confronting Suburban Poverty in America,” published today, compares the top 100 metropolitan areas’ city and suburban poverty numbers.

Yet, unlike many other cities, Albuquerque’s suburbs have a lower poverty rate than Albuquerque itself.

Poverty is generally defined as “not earning enough money to meet one’s basic needs,” according to Kim Posich, executive director of the New Mexico Center on Law and Poverty. Approximately 22 percent of Albuquerque residents and between 21.5 and 23 percent of New Mexico residents live in poverty, which the federal government calculates as a four-person family living on about $22,500 a year, he added.

In 1970, nationwide, 7.4 million city dwellers and 6.4 million suburbanites lived in poverty. By 2011, the number of poor suburbanites exceeded the number of poor city dwellers. In 2011, 12.8 million people in cities and 15.3 million people in suburbs lived in poverty.

For the Albuquerque metropolitan area — which the U.S. census estimates to include 901,000 people in Sandoval, Valencia, Torrance and Bernalillo counties — numbers trended in the opposite direction. In 1970, 34,116 Albuquerqueans and 34,784 suburbanites lived in poverty, making the split just about even. But 41 years later, 106,397 Albuquerqueans were living in poverty, compared with 74,688 suburbanites.

Albuquerque’s data bucked national trends in the first half of that four-decade period because the city kept gobbling up geographical portions of unincorporated land, says Alan Berube, a senior Brookings fellow who co-wrote the 143-page book with Brookings colleague Elizabeth Kneebone over a two-year period.

“Parts (of Albuquerque) that were in the suburbs in 1970 are actually part of the city today, because the city has absorbed those communities … so that has added to its population, and it (has) added to its poor population, too,” Berube said.

Sunday, May 19, 2013

social psychology catching a right proper shit-hammering...,

interestingly the New Yorker does not concur....,
nature | Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack.

A paper published in PLoS ONE last week (ref. 1) reports that nine different experiments failed to replicate this example of ‘intelligence priming’, first described in 1998 (ref. 2) by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks.

David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he “stands by the general effect” and blames the failure to replicate on “poor experiments”.

An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). “It’s about more than just replicating results from one paper,” says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of ‘unconscious-thought theory’.

Dijksterhuis published that theory in 2006 (ref. 3). It fleshed out more general, long-held claims about a ‘smart unconscious’ that had been proposed over the past couple of decades — exemplified in writer Malcolm Gladwell’s best-selling book Blink (Penguin, 2005). The theory holds that behaviour can be influenced, or ‘primed’, by thoughts or motives triggered unconsciously — in the case of intelligence priming, by the stereotype of a clever professor or a stupid hooligan. Most psychologists accept that such priming can occur consciously, but many, including Shanks, are unconvinced by claims of unconscious effects.

In their paper, Shanks and his colleagues tried to obtain an intelligence-priming effect, following protocols in Dijksterhuis’s papers or refining them to amplify any theoretical effect (for example, by using a test of analytical thinking instead of general knowledge). They also repeated intelligence-priming studies from independent labs. They failed to find any of the described priming effects in their experiments.

The e-mail debate that Shanks joined was kicked off last September, when Daniel Kahneman, a Nobel-prizewinning psychologist from Princeton University in New Jersey who thinks that unconscious social priming is likely to be real, circulated an open letter warning of a “train wreck looming” (see Nature http://doi.org/mdr; 2012) because of a growing number of failures to replicate results. Social psychology “is now the poster child for doubts about the integrity of psychological research”, he told psychologists, “and it is your responsibility” to deal with it.

more making stuff up and getting busted for it....,



NYTimes | One summer night in 2011, a tall, 40-something professor named Diederik Stapel stepped out of his elegant brick house in the Dutch city of Tilburg to visit a friend around the corner. It was close to midnight, but his colleague Marcel Zeelenberg had called and texted Stapel that evening to say that he wanted to see him about an urgent matter. The two had known each other since the early ’90s, when they were Ph.D. students at the University of Amsterdam; now both were psychologists at Tilburg University. In 2010, Stapel became dean of the university’s School of Social and Behavioral Sciences and Zeelenberg head of the social psychology department. Stapel and his wife, Marcelle, had supported Zeelenberg through a difficult divorce a few years earlier. As he approached Zeelenberg’s door, Stapel wondered if his colleague was having problems with his new girlfriend. 

Zeelenberg, a stocky man with a shaved head, led Stapel into his living room. “What’s up?” Stapel asked, settling onto a couch. Two graduate students had made an accusation, Zeelenberg explained. His eyes began to fill with tears. “They suspect you have been committing research fraud.” 

Stapel was an academic star in the Netherlands and abroad, the author of several well-regarded studies on human attitudes and behavior. That spring, he published a widely publicized study in Science about an experiment done at the Utrecht train station showing that a trash-filled environment tended to bring out racist tendencies in individuals. And just days earlier, he received more media attention for a study indicating that eating meat made people selfish and less social. 

His enemies were targeting him because of changes he initiated as dean, Stapel replied, quoting a Dutch proverb about high trees catching a lot of wind. When Zeelenberg challenged him with specifics — to explain why certain facts and figures he reported in different studies appeared to be identical — Stapel promised to be more careful in the future. As Zeelenberg pressed him, Stapel grew increasingly agitated.

Finally, Zeelenberg said: “I have to ask you if you’re faking data.”

“No, that’s ridiculous,” Stapel replied. “Of course not.” 

That weekend, Zeelenberg relayed the allegations to the university rector, a law professor named Philip Eijlander, who often played tennis with Stapel. After a brief meeting on Sunday, Eijlander invited Stapel to come by his house on Tuesday morning. Sitting in Eijlander’s living room, Stapel mounted what Eijlander described to me as a spirited defense, highlighting his work as dean and characterizing his research methods as unusual. The conversation lasted about five hours. Then Eijlander politely escorted Stapel to the door but made it plain that he was not convinced of Stapel’s innocence.

Saturday, May 18, 2013

ADHD is a fictitious disease - and - the vocabulary of psychiatry is now defined at all levels by the pharmaceutical industry



blicunion | Fortunately, the Swiss National Advisory Commission on Biomedical Ethics (NEK, President: Otfried Höffe) critically commented on the use of the ADHD drug Ritalin in its opinion of 22 November 2011 titled Human enhancement by means of pharmacological agents: The consumption of pharmacological agents altered the child’s behavior without any contribution on his or her part.

That amounted to interference in the child’s freedom and personal rights, because pharmacological agents induced behavioral changes but failed to educate the child on how to achieve these behavioral changes independently. The child was thus deprived of an essential learning experience to act autonomously and emphatically which “considerably curtails children’s freedom and impairs their personality development”, the NEK criticized.

The alarmed critics of the Ritalin disaster are now getting support from an entirely different side. In its cover story of 2 February 2012, the German weekly Der Spiegel quoted the American psychiatrist Leon Eisenberg, born in 1922 as the son of Russian Jewish immigrants, who was the “scientific father of ADHD” and who said at the age of 87, seven months before his death, in his last interview: “ADHD is a prime example of a fictitious disease.”

Since 1968, however – some 40 years now – Leon Eisenberg’s “disease” has haunted the diagnostic and statistical manuals, first as “hyperkinetic reaction of childhood”, now called “ADHD”. The use of ADHD medications in Germany rose in only eighteen years from 34 kg (in 1993) to a record of no less than 1760 kg (in 2011) – which is a 51-fold increase in sales! In the United States, every tenth boy among ten-year-olds already swallows an ADHD medication on a daily basis, and the trend is still rising.

Given the proven repertoire of Edward Bernays, the father of propaganda – selling the First World War to his own people with the help of his uncle’s psychoanalysis, and distorting science and the faith in science to increase industry profits – why not investigate on whose behalf the “scientific father of ADHD” conducted his science? His career was remarkably steep, and his “fictitious disease” led to the best sales increases. And after all, he served on the “Committee for DSM V and ICD XII, American Psychiatric Association” from 2006 to 2009. After all, Leon Eisenberg received “the Ruane Prize for Child and Adolescent Psychiatry Research. He has been a leader in child psychiatry for more than 40 years through his work in pharmacological trials, research, teaching, and social policy and for his theories of autism and social medicine”.

And after all, Eisenberg was a member of the “Organizing Committee for Women and Medicine Conference, Bahamas, November 29 – December 3, 2006, Josiah Macy Foundation (2006)”. The Josiah Macy Foundation organized conferences with intelligence agents of the OSS, later CIA, such as Gregory Bateson and Heinz von Foerster, during and long after World War II. Have such groups marketed the diagnosis of ADHD in the service of the pharmaceutical market, tailor-made for it with a lot of propaganda and public relations? It is this issue that the American psychologist Lisa Cosgrove and others investigated in their study Financial Ties between DSM-IV Panel Members and the Pharmaceutical Industry. They found that “Of the 170 DSM panel members 95 (56%) had one or more financial associations with companies in the pharmaceutical industry. One hundred percent of the members of the panels on ‘Mood Disorders’ and ‘Schizophrenia and Other Psychotic Disorders’ had financial ties to drug companies. The connections are especially strong in those diagnostic areas where drugs are the first line of treatment for mental disorders.” In the next edition of the manual, the situation is unchanged. “Of the 137 DSM-V panel members who have posted disclosure statements, 56% have reported industry ties – no improvement over the percent of DSM-IV members.” “The very vocabulary of psychiatry is now defined at all levels by the pharmaceutical industry,” said Dr Irwin Savodnik, an assistant clinical professor of psychiatry at the University of California at Los Angeles.

Friday, May 17, 2013

I.Q. as a measure of intelligence is a myth...,



  • We propose that human intelligence is composed of multiple independent components
  • Each behavioral component is associated with a distinct functional brain network
  • The higher-order “g” factor is an artifact of tasks recruiting multiple networks
  • The components of intelligence dissociate when correlated with demographic variables
cell | What makes one person more intellectually able than another? Can the entire distribution of human intelligence be accounted for by just one general factor? Is intelligence supported by a single neural system? Here, we provide a perspective on human intelligence that takes into account how general abilities or “factors” reflect the functional organization of the brain. By comparing factor models of individual differences in performance with factor models of brain functional organization, we demonstrate that different components of intelligence have their analogs in distinct brain networks. Using simulations based on neuroimaging data, we show that the higher-order factor “g” is accounted for by cognitive tasks co-recruiting multiple networks. Finally, we confirm the independence of these components of intelligence by dissociating them using questionnaire variables. We propose that intelligence is an emergent property of anatomically distinct cognitive systems, each of which has its own capacity.
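
To make the paper's central claim concrete, here is a minimal simulation sketch in Python (not the authors' code; the number of components, the loadings, and the noise level are invented for illustration). Each simulated person has several statistically independent capacities, each task co-recruits more than one of them, and a single dominant “g”-like factor nonetheless appears in the task correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_components, n_tasks = 2000, 3, 12

# Independent capacities per person (standing in for distinct brain networks).
capacities = rng.normal(size=(n_people, n_components))

# Each task co-recruits every component with a random positive loading.
loadings = rng.uniform(0.2, 1.0, size=(n_tasks, n_components))

# Task score = weighted sum of recruited capacities + measurement noise.
scores = capacities @ loadings.T + rng.normal(scale=0.5, size=(n_people, n_tasks))

# Factor-analyze the task correlation matrix.
corr = np.corrcoef(scores, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]
print("share of variance on the first factor:", round(eigvals[0] / eigvals.sum(), 2))
```

Because every task loads positively on several independent components, all task scores correlate positively and the first factor soaks up most of the variance, mimicking “g” without any unitary intelligence underneath it.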

next to nothing is known about the neurobiological mechanisms underlying individuality



thescientist | When a group of genetically identical mice lived in the same complex enclosure for 3 months, individuals that explored the environment more broadly grew more new neurons than less adventurous mice, according to a study published today (May 9) in Science. This link between exploratory behavior and adult neurogenesis shows that brain plasticity can be shaped by experience and suggests that the process may promote individuality, even among genetically identical organisms.

“This is a clear and quantitative demonstration that individual differences in behavior can be reflected in individual differences in brain plasticity,” said Fred Gage of the Salk Institute for Biological Studies in La Jolla, California, who was not involved in the study. “I don’t know of another clear example of that . . . and it tells me that there is a tighter relationship between [individual] experiences and neurogenesis than we had previously thought.”

Scientists have often tried to tackle the question of how individual differences in behavior and personality develop in terms of the interactions between genes and environment. “But there is next to nothing [known] about the neurobiological mechanisms underlying individuality,” said Gerd Kempermann of the German Center for Neurodegenerative Diseases in Dresden.

One logical way to study this phenomenon is to look at brain plasticity, or how the brain’s structure and function change over time. Plasticity is hard to study, however, because it mostly takes place at the synaptic level, so Kempermann and his colleagues decided to look at the growth of new neurons in the adult hippocampus, which can easily be quantified. Earlier studies have demonstrated that activity—both physical and cognitive—increases adult neurogenesis in groups of genetically identical mice, but there were differences between individuals in the amount of neuron growth.

To understand why, Kempermann and his colleagues housed 40 genetically identical female inbred mice in a complex 5-square-meter, 5-level enclosure filled with all kinds of objects designed to encourage activity and exploration. The mice were tagged with radio-frequency identification (RFID) transponders, and 20 antennas placed around the enclosure tracked their every movement. After 3 months, the researchers assessed adult neurogenesis in the mice by counting proliferating precursor cells, which had been labeled before the study began.

The researchers found that individual differences in exploratory behavior correlated with individual differences in the numbers of new neurons generated. “To our knowledge, it’s the first example of a direct link between individual behavior and individual brain plasticity,” Kempermann said.
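
As a rough illustration of the kind of analysis this implies (not the study's data or code; every number below is invented), the key step is simply correlating each mouse's exploration score, derived from the antenna detections, with its count of labeled new neurons:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical per-mouse measurements for 40 mice, mirroring the study's sample
# size: an exploration score and a count of labeled proliferating precursor cells.
n_mice = 40
exploration = rng.uniform(0, 1, n_mice)
new_neurons = 2000 + 3000 * exploration + rng.normal(scale=400, size=n_mice)

r, p = stats.pearsonr(exploration, new_neurons)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```

A positive, significant correlation in data like these is what “a direct link between individual behavior and individual brain plasticity” looks like numerically; the point of using genetically identical mice in a shared enclosure is that the remaining differences cannot be pinned on genes or gross environment.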

Gage cautions about pinning all the differences on the environment, however. Although the mice in the study were genetically identical, he said, they were not behaviorally identical to begin with: clearly some variation occurs at a very early stage that makes them more or less prone to explore. “It’s incorrect to think of it that the environment caused the difference between the mice,” he said. “The difference was already there, and the environment amplified that difference. My own personal bias is that there are likely genetic events that happened at germline, or somatic events over time,” that set the stage for these subtle behavioral differences that are subsequently amplified.

Thursday, May 16, 2013

primates of park avenue...,


NYPost | They are 1 percenters who are 100 percent despicable. Some wealthy Manhattan moms have figured out a way to cut the long lines at Disney World — by hiring disabled people to pose as family members so they and their kids can jump to the front, The Post has learned. 

 The “black-market Disney guides” run $130 an hour, or $1,040 for an eight-hour day. “My daughter waited one minute to get on ‘It’s a Small World’ — the other kids had to wait 2 1/2 hours,” crowed one mom, who hired a disabled guide through Dream Tours Florida.

“You can’t go to Disney without a tour concierge,” she sniffed. “This is how the 1 percent does Disney.”

The woman said she hired a Dream Tours guide to escort her, her husband and their 1-year-old son and 5-year-old daughter through the park in a motorized scooter with a “handicapped” sign on it. The group was sent straight to an auxiliary entrance at the front of each attraction.

Disney allows each guest who needs a wheelchair or motorized scooter to bring up to six guests to a “more convenient entrance.” 

The Florida entertainment mecca warns that there “may be a waiting period before boarding.” But the consensus among upper-crust moms who have used the illicit handicap tactic is that the trick is well worth the cost.

Not only is their “black-market tour guide” more efficient than Disney World’s VIP Tours, it’s cheaper, too. Disney Tours offers a VIP guide and fast passes for $310 to $380 per hour.

Passing around the rogue guide service’s phone number recently became a shameless ritual among Manhattan’s private-school set during spring break. The service asks who referred you before they even take your call.

“It’s insider knowledge that very few have and share carefully,” said social anthropologist Dr. Wednesday Martin, who caught wind of the underground network while doing research for her upcoming book “Primates of Park Avenue.”

“Who wants a speed pass when you can use your black-market handicapped guide to circumvent the lines all together?” she said.

“So when you’re doing it, you’re affirming that you are one of the privileged insiders who has and shares this information.”

cheating and co-operating...,



abc.net.au | Lying, cheating and other forms of Machiavellian skulduggery seem to be the inevitable evolutionary consequences of living in co-operative communities, suggest UK scientists.

Instead of viewing deception and co-operation as polar opposites, Luke McNally from Trinity College Dublin and Andrew Jackson from the University of Edinburgh say we might do better to think of them as two sides of the same evolutionary coin.

"Deception is an inherent component of our complex social lives, and it's likely impossible to separate the good from the bad; the darkest parts of our psychology evolved as a result of the most virtuous," says McNally.

The researchers lay out their evidence in the latest issue of the Proceedings of the Royal Society B.

First, they use game theory to show that the evolution of co-operation creates pressures that favour the evolution of deception.

In their scenario, individuals have three options: to always cheat and not help others; to reciprocate the help that others offer; or to cheat and try to conceal this cheating by deceiving others.

"When reciprocal co-operators interact with honest cheaters, they spot their cheating and stop co-operating with them," McNally explains.

"However, as deceivers are better at hiding their cheating, reciprocal co-operators find it harder to spot their cheating.

"This means that the deceivers are able to gain co-operation without having to co-operate themselves, allowing deception to evolve."

The researchers back up this theory with real-world evidence gathered from studies of deception in 24 different primate species.

They show deceptive behaviour is more common in species that co-operate more.

"Our comparative analysis shows the more co-operation a species engages in the more it engages in deception, which is what our model predicts," McNally says.

Wednesday, May 15, 2013

POTUS 2016 - remember where you heard it first!


richwine says "he's no racist, and has a tough time spotting it, too"...,


theatlanticwire | Richwine says his passion for outlining the case for racial inferiority is rooted in his love of data, not racism. At a 2008 panel, Richwine ranked races by IQ: "Decades of psychometric testing has indicated that at least in America, you have Jews with the highest average IQ, usually followed by East Asians, then you have non-Jewish whites, Hispanics, and then blacks." Now, he tells York, he's not sorry for those comments. "I don't apologize for any of the things that I said," he says. But he does wish he'd put an asterisk on the entire sentence so it doesn't sound like he's endorsing the idea that some ethnic groups are just biologically destined to be less intelligent than others. He would have noted that "there is a nuance that goes along with that: the extent to which IQ scores actually reflect intelligence, the fact that it reflects averages and there is a lot of overlap in any population, and that IQ scores say absolutely nothing about the causes of the differences -- environmental, genetic, or some combination of those things."

Richwine's argument that he is not a racist because he does not think of himself as a racist is not very persuasive, although it is common. But even more problematic is that Richwine also admits to York that he's not very good at spotting racism. In 2010, for example, he wrote two articles for the white nationalist site Alternative Right. One of his articles argued that because "U.S.-born Hispanics are much more likely to be incarcerated than foreign-born Hispanics," this "implies that Hispanic crime will become more of a problem as time goes on, not less." That fits well with the editorial agenda of founder Richard B. Spencer, a former editor of The American Conservative, who has a history of saying things like, "There are races who, on average, are going to be superior." People like blogger E.D. Kain have dubbed the site "ugly white nationalism." Richwine said he didn't think anything was problematic, telling York, "I thought it would be like a paleo-conservative website. I had seen that [former National Review writer] John Derbyshire had also published something there." Derbyshire was fired from National Review because he wrote an essay about how he tells his kids to avoid groups of black people but to have one black friend to inoculate against charges of racism.

That was in 2012 — and Derbyshire had been writing racist things for years. As I argued at the time, he "effectively demonstrates, year after year, exactly how racist you can be and still get published by people who consider themselves intellectuals." That line has since moved, which Richwine apparently noticed too late.

how misinformed ideas about profit are holding back the world's poor?


fastcoexist | I run a for-profit business that delivers products and services to customers earning less than $6 a day in West Africa. When I tell people this, I frequently encounter disbelief or concern. The three most common responses I hear are:  
  • Surely you can’t make money working with people who are so poor?
  • Don’t you feel like you are taking advantage of these people by making money from them?
  • Wouldn’t charity do a better job of meeting their needs?
While these questions are well-intentioned, I initially found them upsetting because they go far beyond a healthy skepticism about my business model. They made me doubt whether I should be working with poor consumers at all.

While I stayed the course, I fear that many will simply choose a simpler path of building a startup in developed markets. The absolute worst thing that can happen for the poorest people on Earth is that the next generation of superstar entrepreneurs ends up in Silicon Valley making iPhone Apps, rather than trying to address the problems of the 4 billion people who need them the most.

So next time you overhear one of these questions, do the world’s poor a favor and shoot it down. Here’s how:

phantom financial wealth, phantom carrying capacity and phantom democratic power...,

karlnorth | The Interdependence. Economic activity at phantom carrying capacity depletes resources at a rate that causes rising resource costs and decreasing profit margins in the production of real wealth. The investor class therefore turns increasingly to the production of credit as a source of profits. Credit unsupported by the production of real wealth is stealing from the future: it is phantom wealth. It also creates inflation, which is stealing from the purchasing power of income in the present. Protected from the masses by the illusion of democracy, government facilitates the unlimited production of credit and the continued overshoot of real carrying capacity. This causes inflation and permanently rising costs of raw materials. To divert public attention from the resultant declining living standard of the laboring classes, government dispenses rigged statistics and fake news of continued growth to project the illusion of economic health. The whole interdependent phantom stage of the capitalist system has an extremely limited life before it collapses into chaos.
