medium | The next five years will see the international market for ‘riot control systems’ boom to a value of more than $5 billion at an annual growth rate of 5%, according to a new report by a global business intelligence firm.
The report forecasts a dramatic rise in civil unrest across the world, including in North America and Europe, driven by an increase in Ferguson-style incidents and “extremist attacks.”
The Middle East, North Africa and Asia-Pacific regions will also experience a persistent rise in conflicts.
This increasing trend in instability promises billions of dollars of profits for global defence firms, concludes the report, published last month by Infiniti Research Ltd., a market intelligence firm whose clients include Fortune 500 companies.
“Protests, riots, and demonstrations are major issues faced by the law enforcement agencies across the world,” said Abhay Singh, a lead defence technology analyst at the firm. “In addition the increase in incidents of civil wars in countries such as Syria, Iraq, Lebanon, and Egypt along with an increase in the global defence budget will generate demand for riot control systems.”
Europe, the Middle East and Africa will be the largest market, collectively experiencing a growth rate of over 5% and exceeding $2 billion by 2020. Under the subheading ‘EMEA: increase in extremist attacks to boost growth’, the report, priced at over $2,000, explains:
“Over the past years, Europe witnessed an increase in extremist attacks, which has raised concerns among the law enforcement and defense industries to equip themselves with modern equipment and protect civilians from external threats. In 2015, the Paris attacks and the killing of journalists in France are some of the examples of growing terrorism in Europe.”
The combination of intensifying conflict, terrorism, and civil unrest will lead to rocketing demand for riot control systems over the next 5 years “led by Germany, Russia, France, Poland, Saudi Arabia, Turkey, the UAE, Iran, and South African countries.”
rollingstone | In March, the commander in chief of the
War on Drugs stood in front of a crowd of policymakers, advocates and
recovering addicts to declare that America has been doing it wrong.
Speaking at the National Prescription Drug Abuse and Heroin Summit in
Atlanta – focused on an overdose epidemic now killing some 30,000
Americans a year – President Barack Obama declared, "For too long we
have viewed the problem of drug abuse ... through the lens of the
criminal justice system," creating grave costs: "We end up with jails
full of folks who can't function when they get out. We end up with
people's lives being shattered."
Touting a plan to increase drug-treatment spending by more than $1
billion – the capstone to the administration's effort to double the
federal drug-treatment budget – Obama insisted, "This is a
straightforward proposition: How do we save lives once people are
addicted, so that they have a chance to recover? It doesn't do us much
good to talk about recovery after folks are dead."
Obama's speech underscored tactical and rhetorical shifts in the
prosecution of the War on Drugs – the first durable course corrections
in this failed 45-year war. The administration has enshrined three
crucial policy reforms. First, health insurers must now cover drug
treatment as a requirement of Obamacare. Second, draconian drug
sentences have been scaled back, helping to reduce the number of federal
drug prisoners by more than 15 percent. Third, over the screams of
prohibitionists in its ranks, the White House is allowing marijuana's
march out of the black market, with legalization expected to reach
California and beyond in November.
The administration's change in rhetoric has been even more sweeping:
Responding to opioid deaths, Obama appointed a new drug czar, Michael
Botticelli, who previously ran point on drug treatment in Massachusetts.
Botticelli has condemned the "failed policies and failed practices" of
past drug czars, and refers not to heroin "junkies" or "addicts" but to
Americans with "opioid-abuse disorders."
medicalxpress | Jessica and Darren McIntosh were
too busy to see me when I arrived at their house one Sunday morning.
When I returned later, I learned what they'd been busy with: arguing
with a family member, also an addict, about a single pill of
prescription painkiller she'd lost, and injecting meth to get by in its
absence. Jessica, 30, and Darren, 24, were children when they started
using drugs. Darren smoked his first joint when he was 12 and quickly
moved on to snorting pills. "By the time I was 13, I was a full-blown
pill addict, and I have been ever since," he said. By age 14, he'd quit
school. When I asked where his caregivers were when he started using
drugs, he laughed. "They're the ones that was giving them to me," he
alleged. "They're pill addicts, too."
Darren was 13
when he started taking pills, which he claims were given to him by an
adult relative. "He used to feed them to me," Darren said. On fishing
trips, they'd get high together. Jessica and Darren have never known a
life of family dinners, board games and summer vacations. "This right
here is normal to us," Darren told me. He sat in a burgundy recliner,
scratching at his arms and pulling the leg rest up and down. Their house
was in better shape than many others I'd seen, but nothing in it was
theirs. Their bedrooms were bare. The kind of multigenerational drug use
he was describing was not uncommon in their town, Austin, in southern
Indiana. It's a tiny place, covering just two and a half square miles of
the sliver of land that comprises Scott County. An incredible
proportion of its population of 4,100 – up to an estimated 500 people
– are shooting up. It was here, starting in December 2014, that the
single largest HIV outbreak in US history took place. Austin went from
having no more than three cases per year to 180 in 2015, a prevalence
rate close to that seen in sub-Saharan Africa.
Exactly how this appalling human crisis happened here, in this
particular town, has not been fully explained. I'd arrived in Scott
County a week previously to find Austin not exactly desolate. Main
Street had a few open businesses, including two pharmacies and a
used-goods store, owned by a local police sergeant. The business with
the briskest trade was the gas station, which sold $1 burritos and egg
rolls. In the streets either side of it, though, modest ranch houses
were interspersed among shacks and mobile homes. Some lawns were
well-tended, but many more were not. On some streets, every other house
had a warning sign: 'No Trespassing', 'Private Property', 'Keep Out'.
Sheets served as window curtains. Many houses were boarded up. Others
had porches filled with junk – washing machines, furniture, toys, stacks
of old magazines. There were no sidewalks. Teenage and twenty-something
girls walked the streets selling sex. I watched a young girl in a puffy
silver coat get into a car with a grey-haired man. I met a father who
always coordinates with his neighbour to make sure their children travel
together, even between their homes, which are a block apart. Driving
around for days, knocking on doors looking for drug users who would speak with me, was intimidating. I've never felt more scared than I did in Austin.
The mystery of Austin is only deepened by a visit to the neighbouring
town of Scottsburg, the county seat, eight miles south. It's just a bit
bigger than Austin, with a population of about 6,600, but it's vastly
different. A coffee shop named Jeeves served sandwiches and tall slices
of homemade pie, which you could eat while sitting in giant, cushiony
chairs in front of a fireplace. A shop next door sold artisanal soap and
jam. The town square had a war memorial and was decorated for
Christmas. The library was populated. The sidewalks had people and the
streets had traffic. There were drugs in Scottsburg, but the town did
not reek of addiction. The people didn't look gaunt and drug-addled. No
one I asked could explain why these two towns were so different, and no
one could explain what had happened to Austin. But a new theory of
public health might yet hold the answer. Known as syndemics, it may also
be the one thing that can rescue Austin and its people.
The term syndemics was coined by Merrill Singer, a medical
anthropologist at the University of Connecticut. Singer was working with
injecting drug users in Hartford in the 1990s in an effort to find a
public health model for preventing HIV among these individuals. As he
chronicled the presence of not only HIV but also tuberculosis and
hepatitis C among the hundreds of drug users he interviewed, Singer
began wondering how those diseases interacted to the detriment of the
person. He called this clustering of conditions a 'syndemic', a word
intended to encapsulate the synergistic intertwining of certain
problems. Describing HIV and hepatitis C as concurrent implies they are
separable and independent. But Singer's work with the Hartford drug
users suggested that such separation was impossible. The diseases
couldn't be properly understood in isolation. They were not individual
problems, but connected.
Singer quickly realised that syndemics was not just about the
clustering of physical illnesses; it also encompassed nonbiological
conditions like poverty, drug abuse, and other social, economic and
political factors known to accompany poor health.
CNN | Hepatitis C-related deaths reached an all-time high in 2014,
the Centers for Disease Control and Prevention announced Wednesday,
surpassing total combined deaths from 60 other infectious diseases
including HIV, pneumococcal disease and tuberculosis. The increase
occurred despite recent advances in medications that can cure most
infections within three months.
"Not
everyone is getting tested and diagnosed, people don't get referred to
care as fully as they should, and then they are not being placed on
treatment," said Dr. John Ward, director of CDC's division of viral
hepatitis.
At
the same time, surveillance data analyzed by the CDC shows an alarming
uptick in new cases of hepatitis C, mainly among those with a history of
using injectable drugs. From 2010 to 2014, new cases of hepatitis C
infection more than doubled. Because hepatitis C has few noticeable
symptoms, said Ward, the 2,194 cases reported in 2014 are likely only
the tip of the iceberg.
"Due to
limited screening and underreporting, we estimate the number of new
infections is closer to 30,000 per year," Ward said. "So both deaths and
new infections are on the rise."
"These
statistics represent the two battles that we are fighting. We must act
now to diagnose and treat hidden infections before they become deadly,
and to prevent new infections."
WaPo | The Labour Party has since suspended the offending councilors, but the comments have sparked fierce debates about anti-Semitism in Britain and look set to affect local elections taking place there Thursday.
A study into anti-Semitism by Tel Aviv University’s Kantor Center for the Study of Contemporary European Jewry, published Wednesday, noted that although violent anti-Semitic incidents worldwide decreased in 2015 compared with previous years, Europe’s Jews are growing increasingly concerned about their future.
The research noted that “the number of verbal and visual anti-Semitic expressions, mainly on social media, turned more threatening and insulting” and that anti-Semitic language against Israel as a Jewish state often infiltrates the mainstream.
In Europe, researchers found that Jewish communities and individuals feel threatened by the radicalization of Muslim citizens and the influx of refugees. There are also concerns that the mass migration will strengthen right-wing nationalist parties.
observer | DNC chair Debbie Wasserman Schultz is emblematic of the role big money plays in politics and, for the future of the Democratic party, it is vital for her to be replaced. Ms. Wasserman Schultz’s career has been in jeopardy since the beginning of the Democratic primaries, as a wave of resentful backlash over corrupt party politics has linked her to everything that is wrong with establishment practices. Her poor leadership and lack of impartiality as chair of the Democratic National Committee have disenfranchised millions of Democrats around the country, and have inspired thousands of progressive Independents to support Senator Bernie Sanders for president—no matter what.
An essential step in reuniting the Democratic party after the divisive presidential primaries will be to replace Debbie Wasserman Schultz with a new DNC chair who can be trusted to remain impartial. What the party needs most at this critical moment is a leader who will reinstitute the ban on federal lobbyists and super PACs buying off the DNC and its members. The ability for Democrats to unite as one party is obstructed not only by the polarity between Mr. Sanders and Ms. Clinton, but in large part by Ms. Wasserman Schultz—who has favored Ms. Clinton and other candidates who court corporate and wealthy donors rather than their constituents.
Ms. Wasserman Schultz has little interest in growing the Democratic party, and is content with maintaining the status quo to ensure she and the candidates she sympathizes with remain in office. In a recent interview on MSNBC, Ms. Wasserman Schultz voiced her support for closing off all Democratic primaries from anyone not registered as a Democrat.
“I believe that the party’s nominee should be chosen—this is Debbie Wasserman Schultz’s opinion—that the party’s nominee should be chosen by members of the party,” the DNC Chair said in an interview with MSNBC Live, according to the Washington Examiner.
petras.lahaine | From our discussion it is clear that there is a profound disparity between the stellar academic achievements of Israel-First officials in the US government and the disastrous consequences of their public policies in office.
The ethno-chauvinist claim of unique ‘merit’ to explain the overwhelming success of American Jews in public office and in other influential spheres is based on a superficial reputational analysis, bolstered by degrees from prestigious universities. But this reliance on reputation has not held up in terms of performance - the successful resolution of concrete problems and issues. Failures and disasters are not just ‘overlooked’; they are rewarded.
After examining the performance of top officials in foreign policy, we find that their ‘assumptions’ (often blatant manipulations and misrepresentations) about Iraq were completely wrong; their pursuit of war was disastrous and criminal; their ‘occupation blueprint’ led to prolonged conflict and the rise of terrorism; their pretext for war was a fabrication derived from their close ties to Israeli intelligence, in opposition to the findings of US intelligence. Their sanctions policy toward Iran has cost the US economy many billions, while their pro-Israel policy cost the US Treasury (and taxpayers) over $110 billion over the last 30 years. Their one-sided ‘Israel-First’ policy has sabotaged any ‘two-state’ resolution of the Palestinian-Israeli conflict and has left millions of Palestinians in abject misery. Meanwhile, the disproportionate number of high officials who have been accused of giving secret US documents to Israel (Wolfowitz, Feith, Indyk, Pollard, etc.) exposes what really constitutes the badge of “merit” in this critical area of US security policy.
The gulf between academic credentials and actual performance extends to economic policy. Neo-liberal policies favoring Wall Street speculators were adopted by such strategic policymakers as Alan Greenspan, Ben Bernanke and Lawrence Summers. Their ‘leadership’ rendered the country vulnerable to the biggest economic crash since the Great Depression with millions of Americans losing employment and homes. Despite their role in creating the conditions for the crisis, their ’solution’ compounded the disaster by transferring over a trillion dollars from the US Treasury to the investment banks, as a taxpayer-funded bailout of Wall Street. Under their economic leadership, class inequalities have deepened; the financial elite has grown many times richer. Meanwhile, wars in the Middle East have drained the US Treasury of funds, which should have been used to serve the social needs of Americans and finance an economic recovery program through massive domestic investments and repair of our collapsing infrastructure.
The trade policies under the leadership of this ‘meritocratic’ elite - formerly called the ‘Chosen People’ - have been an unmitigated disaster for the majority of industrial workers, resulting in huge trade deficits and a shift into deskilled, low-paid service employment - with profound implications for future generations of American workers. It is no longer a secret that an entire generation of working-class Americans has descended into poverty with no prospects of escape - except through narcotics and other degradation. On the ‘flip side’ of the ‘winners and losers’, US finance capital has expanded overseas, with acquisition and merger fees enriching the 0.1% and the meritocratic officials happily rotating from their Washington offices to Wall Street and back again.
If economic performance were measured in terms of sustained growth, balanced budgets, reduced inequality and the creation of stable, well-paying jobs, the economic elite (despite their self-promoted merits) have been absolute failures.
However, if we adopt the alternative criteria for success, their performance looks pretty impressive: they bailed out their banking colleagues, implemented destructive ‘free’ trade agreements, and opened up overseas investment opportunities with higher rates of profit than might be made from investing in the domestic economy.
If we evaluate foreign policy ‘performance’ in terms of US political, economic and military interests, their policies have been costly in lives, financial losses and military defeats for the nation as a whole. They rate ’summa cum lousy’.
However, if we consider their foreign policies in the alternative terms of Israel’s political, economic and military interests, they regain their ‘summa cum laudes’! They have been well rewarded for their services: The war against Iraq destroyed an opponent of Israel’s ethnic cleansing of Palestine. The systematic destruction of Iraqi civil society and the state has eliminated any possibility of Iraq recovering as a modern secular, multi-ethnic, multi-confessional state. Here, Israel made a major advance toward unopposed regional military dominance without losing a soldier or spending a shekel! The Iran sanctions authored and pushed by Levey and Cohen served to undermine another regional foe of Israeli land grabs in the West Bank, even if they cost the US hundreds of billions in lost profits, markets and oil investments.
By re-setting the criteria for these officials, it is clear that their true academic ‘merit’ correlates with their successful policies on behalf of the state of Israel, regardless of how mediocre their performances have been for the United States as a state, nation and people. All this might raise questions about the nature of higher education and how performance is evaluated in terms of the larger spheres of the US economy, state and military.
independent | Today’s shock leak of the text of the Transatlantic Trade and Investment Partnership
(TTIP) marks the beginning of the end for the hated EU-US trade deal,
and a key moment in the Brexit debate. The unelected negotiators have
kept the talks going until now by means of a fanatical level of secrecy,
with threats of criminal prosecution for anyone divulging the treaty’s
contents.
Now, for the first time, the people of Europe can see for themselves
what the European Commission has been doing under cover of darkness -
and it is not pretty.
The leaked TTIP documents, published by Greenpeace this morning, run
to 248 pages and cover 13 of the 17 chapters where the final agreement
has begun to take shape. The texts include highly controversial subjects
such as EU food safety standards, already known to be at risk from
TTIP, as well as details of specific threats such as the US plan to end
Europe’s ban on genetically modified foods.
The documents show that US corporations will be granted unprecedented
powers over any new public health or safety regulations to be
introduced in future. If any European government does dare to bring in
laws to raise social or environmental standards, TTIP will grant US
investors the right to sue for loss of profits in their own corporate
court system that is unavailable to domestic firms, governments or
anyone else.
To all those who said that we were scaremongering and that the EU
would never allow this to happen, we were right and you were wrong.
theguardian | Two
Caribbean islands are at a crossroads in their relationship with the
US. One is plagued by corruption and debt, and dotted with crumbling
homes, abandoned by families for the imperial power nearby. The other is
Cuba.
Who won the cold war again?
Within 24 hours on Sunday, Puerto Rico’s governor, Alejandro García Padilla, announced that the American territory would default on nearly $370m
of debt, after years of failure to put the island’s finances – or its
relationship with the US – in order. The next morning a cruise ship full
of tourists set sail for Havana,
bringing American dreams, dollars and capitalist sense into Cuba’s
future. Once seen as parallel case studies in cold war politics, the
islands have seemingly switched roles.
For half a century, the US dominated Puerto Rico and Cuba after
wrenching them away from Spain, but by the 1950s the islands parted
ways. Cubans threw off a US-backed dictator, found new patrons in the
Soviet Union and embraced communism. What nationalist fervor Puerto Rico
had was quashed, and the colony stayed bound to US-controlled
capitalism as a “free associated state”.
“When the cold war was going on they were like showcases for the
world to see which system actually works,” said Harry Franqui-Rivera, a
researcher at the Center for Puerto Rican Studies at Hunter College. “A
successful Cuba made the United States look bad and if Puerto Rico failed it would make the United States look worse.”
Twenty-five years after the fall of the Soviet Union, US textbooks usually say capitalism won and communism lost, and on a historic mission to Cuba last month Barack Obama said as much: “I have come here to bury the last remnant of the cold war in the Americas.”
But experts and activists say the cold war had a murky end, at least
in the Caribbean, and that the future for Puerto Rico and Cuba remains
far from certain. The day of Obama’s keynote speech in Havana, the mayor of San Juan tweeted:
“Obama spoke of opening bonds of collaboration with the neighboring
island of Cuba while he makes bonds of repression and control in Puerto
Rico.”
panampost | Despair and violence is taking over Venezuela. The economic crisis sweeping the nation means people have to withstand widespread shortages of staple products, medicine, and food.
So when the Maduro administration began rationing electricity this week, leaving entire cities in the dark for up to 4 hours every day, discontent gave way to social unrest.
On April 26, people took to the streets in three Venezuelan states, looting stores to find food.
Maracaibo, in the western state of Zulia, is the epicenter of thefts: on Tuesday alone, Venezuelans raided pharmacies, shopping malls, supermarkets, and even trucks with food in seven different areas of the city.
Although at least nine people were arrested, and 2,000 security officers were deployed in the state, Zulia’s Secretary of Government Giovanny Villalobos asked citizens not to leave their homes. “There are violent people out there that can harm you,” he warned.
In Caracas, the Venezuelan capital, citizens reported looting in at least three areas of the city. Twitter users reported that thefts occurred throughout the night in the industrial zone of La California, Campo Rico, and Buena Vista.
They assured that several locals were robbed and that there were people on the street shouting “we are hungry!”
The same happened in Carabobo, a state in central Venezuela. Through Twitter, a journalist from Valencia reported the looting of a deli.
The crime took place on Tuesday evening amid a wave of protests against prolonged power rationing and outages in multiple parts of the country.
Food for 15 Days
Supermarket employees from Valencia told the PanAm Post that, besides no longer receiving the same amount of food as before, they must deal with angry Venezuelans who come to the stores only to find out there’s little to buy.
Purchases in supermarkets are rationed through a fingerprint system that does not allow Venezuelans to acquire the same regulated food for two weeks.
miamiherald | A recent International Monetary Fund report projecting that Venezuela will reach a 720 percent inflation rate this year — the highest in the world — has drawn a lot of media attention, but what I heard from a senior IMF economist this week was even more dramatic.
Robert K. Rennhack, deputy director of the IMF’s Western Hemisphere department, told me in an interview that Venezuela is on a path to hyperinflation — the stage where the economy reaches total chaos — and could reach a “total collapse of the economic system” in 12 to 18 months if there are no changes in economic policies.
“Inflation in Venezuela probably entered on a hyperinflationary path in 2015,” Rennhack says. He told me that he expects Venezuela’s inflation to reach 2,200 percent in 2017, and that it could balloon very fast to 13,000 percent a year, the stage that most academics define as full-blown hyperinflation.
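To put those rates in perspective, here is a back-of-the-envelope sketch (illustrative arithmetic only, not from the article) converting the annual figures cited above into equivalent compound monthly rates and price-doubling times:

```python
# Illustrative arithmetic only: convert the annual inflation figures cited
# in the article into equivalent compound monthly rates and doubling times.
import math

def monthly_rate(annual_pct):
    """Compound monthly inflation rate equivalent to a given annual rate."""
    return ((1 + annual_pct / 100) ** (1 / 12) - 1) * 100

def doubling_time_months(annual_pct):
    """Months for the price level to double at a constant annual rate."""
    return 12 * math.log(2) / math.log(1 + annual_pct / 100)

for annual in (720, 2_200, 13_000):  # figures quoted by the IMF economist
    print(f"{annual:>6}%/yr -> {monthly_rate(annual):5.1f}%/month, "
          f"prices double every {doubling_time_months(annual):.1f} months")
```

The 13,000 percent figure works out to roughly 50 percent per month, the threshold many economists use to mark full-blown hyperinflation, with prices doubling in well under two months.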
Although he didn’t get into that, no Latin American government in recent memory has been able to survive a hyper-inflationary crisis. When you reach five-digit inflation rates, governments either make a dramatic political u-turn, or they fall.
“Hyper-inflation means that the currency has lost its value, people have to go to the stores with bags full of money, and prices rise almost by the hour,” Rennhack said. “What we saw in previous cases of hyperinflation in Latin America is that there was a political consensus that policies had to change.”
Asked how he reached his 12-18 month projection for hyper-inflation in Venezuela, Rennhack said his team of economists looked at previous episodes of hyper-inflation in Bolivia (1982-1984), Argentina (1989-1990) and Brazil (1989-1990). From those experiences, they concluded that Venezuela is on a similar path as these countries were between 12 and 18 months before their hyper-inflationary crises.
President Nicolas Maduro’s term ends in 2019, although the opposition MUD coalition is considering launching a referendum to demand early elections.
Many Venezuelans believe the country will explode much sooner than in 12 to 18 months. Prices go up daily, supermarket shelves are near empty, there are growing electricity shortages — Maduro has declared every Friday in April and May a non-working holiday in order to save energy — and crime statistics are skyrocketing.
ourfiniteworld | There are many who believe that the use of energy is critical to the
growth of the economy. In fact, I am among these people. The thing that
is not as apparent is that growth in energy consumption is dependent on the growth of debt.
Both energy and debt have characteristics that are close to “magic,”
with respect to the growth of the economy. Economic growth can only take
place when growing debt (or a very close substitute, such as company
stock) is available to enable the use of energy products.
The reason why debt is important is because energy products enable
the creation of many kinds of capital goods, and these goods are often
bought with debt. Commercial examples would include metal tools,
factories, refineries, pipelines, electricity generation plants,
electricity transmission lines, schools, hospitals, roads, gold coins,
and commercial vehicles. Consumers also benefit because energy products
allow the production of houses and apartments, automobiles, busses, and
passenger trains. In a sense, the creation of these capital goods is one
form of “energy profit” that is obtained from the consumption of
energy.
The reason debt is needed is because while energy products can indeed
produce a large “energy profit,” this energy profit is spread over many
years in the future. In order to actually be able to obtain the
benefit of this energy profit in a timeframe where the economy can use
it, the financial system needs to “bring forward” some or all of the
energy profit to an earlier timeframe. It is only when businesses can do
this that they have money to pay workers. This time-shifting
also allows businesses to earn a financial profit themselves.
Governments indirectly benefit as well, because they can then tax the
higher wages of workers and businesses, so that governmental services
can be provided, including paved roads and good schools.
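As a rough sketch of the “bring forward” mechanism described above (the figures and the function below are illustrative assumptions, not the author's), debt lets a lender advance today the discounted value of an energy profit that will only arrive over future decades:

```python
# Hypothetical illustration of time-shifting an energy profit with debt.
# All figures are made up; only the present-value arithmetic is standard.

def present_value(cash_flows, rate):
    """Discount a list of future annual cash flows back to today."""
    return sum(cf / (1 + rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

# Suppose a pipeline yields a $10/year "energy profit" for 20 years.
future_profit = [10.0] * 20
borrow_rate = 0.05  # assumed cost of borrowing

advance = present_value(future_profit, borrow_rate)
print(f"Profit spread over 20 years: ${sum(future_profit):.0f}")
print(f"Amount debt can bring forward today: ${advance:.0f}")
# The roughly $75 difference is the interest paid for having the money now,
# when it can fund wages, tools and other capital goods.
```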
utopiathecollapse | Puerto Rico’s debt crisis moved into a
more perilous phase for residents, lawmakers and bondholders Monday
after the Government Development Bank failed to repay almost $400
million. The missed principal payment, the largest so far by the island,
is widely viewed on Wall Street as foreshadowing additional defaults
this summer, when more than $2 billion in bills are due.
Together with the spread of the Zika
virus, the risk of cascading defaults is putting new urgency on
bipartisan negotiations in Washington over legislation granting the U.S.
territory new powers to restructure more than $70 billion in debt. The
Centers for Disease Control and Prevention reported last week the first
U.S. death related to the mosquito-borne Zika virus—a Puerto Rican man
in his 70s who died in late February.
In a letter to Congress, Treasury
Secretary Jacob Lew warned on Monday that a U.S. “taxpayer-funded
bailout may become the only legislative course available” if the
proposed restructuring legislation isn’t approved. The island’s debt is
held by mutual funds, hedge funds, bond insurers and individual
investors, who were attracted in part by tax benefits and high yields.
The default Monday casts serious doubt on the commonwealth’s ability to
make other future payments, which “means that other defaults are very
likely on other Puerto Rico credits,” said Paul Mansour, head of the
municipal credit research group at investment management firm Conning.
Monday’s developments are the latest
sign that a long-running economic crisis has reached an acute stage,
embroiling financial markets and Congress. Benchmark Puerto Rican bond
prices fell to near record lows Monday, with some investors paying less
than 65 cents on the dollar for general obligation bonds maturing in
2035, an unusually low price.
WaPo | Marijuana’s strict scheduling emerges
from the cultural and racial antipathy felt by Richard Nixon, the activist
president who signed the Controlled Substances Act into law. Nixon’s
aides suggested the war on marijuana was racially motivated, and Oval Office tapes highlight his contempt for the counterculture movement as well as racial minorities.
The
tapes also make it clear that Nixon wanted to link marijuana use and
its negative effects to two groups who he held in contempt: African
Americans and hippies. Nixon even appointed a commission to look into
the ills of marijuana — the Shafer Commission. When the group issued its report entitled, “Marihuana: A Signal of Misunderstanding,” which explained that marijuana was not as dangerous or addictive as it had widely been perceived, Nixon called his handpicked chairman, former Republican Pennsylvania governor Ray Shafer, into the Oval Office to be chastised.
So,
while marijuana’s placement in Schedule I was not a result of deep
scientific expertise, that does not mean that the CSA is to blame for
the continuing policy problems. The CSA has avenues to correct error or
compensate for new information or data. Rescheduling is one such remedy.
Under administrative rescheduling, the attorney general asks the Drug
Enforcement Administration and the Food and Drug Administration to
examine whether a substance is properly scheduled. The attorney general
takes those recommendations and ultimately makes a determination. If a
substance is determined to be improperly scheduled, a rulemaking process
commences that ultimately reschedules the substance.
That
the rescheduling process exists means that the architects of the CSA
understood the need for legal flexibility and thought that avenues for
revision should be built into the law. However, due to cultural biases
and stigma that have been cemented into society, science and
bureaucracy, those avenues have largely failed marijuana. The nearly
century-long institutional effort by the U.S. government to paint
marijuana as anathema to society, in all forms and under all
circumstances, has been devastatingly successful.
The federal government set up a DEA-mandated monopoly through the National Institute on Drug Abuse
for the growth of research grade marijuana — not for all Schedule I
drugs, just marijuana. For decades, the supply from that monopoly was
often insufficient to meet clinical researchers’ needs. Until recently,
all marijuana research proposals needed to go through an additional,
unique review by the Public Health Service that added a bureaucratic
layer, hindering research.
WaPo | Congress and President Obama are
under pressure to reschedule marijuana. While rescheduling makes sense,
it doesn’t solve the state/federal conflict over marijuana (de-scheduling would be better).
But more important, it wouldn’t fix the broken scheduling system.
Ideally, marijuana reform should be part of a broader bill rewriting the
Controlled Substances Act.
The
Controlled Substances Act created a five-category scheduling system for
most legal and illegal drugs (although alcohol and tobacco were notably
omitted). Depending on what category a drug is in, the drug is either
subject to varying degrees of regulation and control (Schedules II
through V) — or completely prohibited, otherwise unregulated and left to
criminals to manufacture and distribute (Schedule I). The scheduling
of various drugs was decided largely by Congress and absent a scientific
process — with some strange results.
For
instance, while methamphetamine and cocaine are Schedule II drugs,
making them available for medical use, marijuana is scheduled alongside
PCP and heroin as a Schedule I drug, which prohibits any medical use.
Making matters worse, the CSA gives law enforcement — not scientists or
health officials — the final say on how new drugs should be scheduled
and whether or not old drugs should be rescheduled. Unsurprisingly, law
enforcement blocks reform.
Starting in 1972, the Drug Enforcement Administration obstructed a formal request to reschedule marijuana for 16 years. After being forced by the courts to make a decision, the agency held two years of hearings. The DEA chief administrative law judge
who held the hearings and considered the issue concluded that marijuana
in its natural form is “one of the safest therapeutically active
substances known to man” and should be made available for medical use.
Similar hearings on MDMA, a.k.a. ecstasy, concluded that it also has
important medical uses. In both cases, the DEA overruled its
administrative law judge and kept the drugs in Schedule I, unavailable
for medical use.
WaPo | Bertha Madras is a professor of
psychobiology at McLean Hospital and Harvard Medical School, with a
research focus on how drugs affect the brain. She is former deputy
director for demand reduction in the White House Office of National Drug
Control Policy.
Data from 2015 indicate that 30 percent of current cannabis users harbor a use disorder — more Americans are dependent on cannabis
than on any other illicit drug. Yet marijuana advocates have
relentlessly pressured the federal government to shift marijuana from
Schedule I — the most restrictive category of drug — to another schedule
or to de-schedule it completely. Their rationale? “States have already
approved medical marijuana”; “rescheduling will open the floodgates for
research”; and “many people claim that marijuana alone alleviates their
symptoms.”
Yet unlike
drugs approved by the Food and Drug Administration, “dispensary
marijuana” has no quality control, no standardized composition or dosage
for specific medical conditions. It has no prescribing information and
no high-quality studies of effectiveness or long-term safety. While the
FDA is not averse to approving cannabinoids as medicines and has
approved two cannabinoid medications, the decision to keep marijuana in
Schedule I was reaffirmed in a 2015 federal court ruling. That ruling was correct.
NYTimes | It should come as no surprise that the vast majority of heroin users
have used marijuana (and many other drugs) not only long before they
used heroin but while they are using heroin. Like nearly all people with
substance abuse problems, most heroin users initiated their drug use
early in their teens, usually beginning with alcohol and marijuana. There is ample evidence that early initiation of drug use primes the brain
for enhanced later responses to other drugs. These facts underscore the
need for effective prevention to reduce adolescent use of alcohol,
tobacco and marijuana in order to turn back the heroin and opioid
epidemic and to reduce the burdens of addiction in this country.
Marijuana use is positively correlated with alcohol use and cigarette
use, as well as illegal drugs like cocaine and methamphetamine. This
does not mean that everyone who uses marijuana will transition to using
heroin or other drugs, but it does mean that people who use marijuana
also consume more, not less, legal and illegal drugs than do people who do not use marijuana.
The legalization of marijuana increases availability of the drug and
acceptability of its use. This is bad for public health and safety not
only because marijuana use increases the risk of heroin use.
A better drug policy is one that actively discourages marijuana use
as well as other recreational drug use, especially for youth.
foreignpolicy | To evangelists of asteroid mining, the heavens are not just a
frontier but a vast and resource-rich place teeming with opportunity.
According to NASA, there are potentially 100,000 near-Earth objects —
including asteroids and comets — in the neighborhood of our planet. Some
of these NEOs, as they’re called, are small. Others are substantial and
potentially packed full of water and various important minerals, such
as nickel, cobalt, and iron. One day, advocates believe, those objects
will be tapped by variations on the equipment used in the coal mines of
Kentucky or in the diamond mines of Africa. And for immense gain:
According to industry experts, the contents of a single asteroid could
be worth trillions of dollars.
Kfir pitched me on the long-term plan. First, a fleet of satellites
will be dispatched to outer space, fitted with probes that can measure
the quality and quantity of water and minerals in nearby asteroids and
comets. Later, armed with that information, mining companies like DSI
will send out vessels to mechanically remove and refine the material
extracted. In some cases, the take will be returned to Earth. But most
of the time, it will be processed in space — for instance, to produce
rocket fuel — and stored in container vessels that will serve as the
equivalent of gas stations for outbound spacecraft.
This possibility isn’t so unrealistic, Kfir said. Consider the recent
and seismic growth of the space industry, he suggested, as we climbed
the stairs to DSI’s second-floor suite. Every year, the private
spaceflight sector grows larger, and every year the goals become
grander. Jeff Bezos, founder of Amazon and the space exploration company
Blue Origin, has spoken of the day “when millions of people are living
and working in space”; Elon Musk’s SpaceX is expected to reveal a Mars
colonization plan this year.
“But how are they going to sustain this new space
economy?” Kfir asked rhetorically. He nudged open DSI’s office door.
“Easy: by mining asteroids.” Bezos, Musk, and the other billionaires who
plan to be cruising around space in the near future won’t be able to do
so without celestial pit stops.
In his book, Asteroid Mining 101: Wealth for the New Space Economy,
John S. Lewis, professor emeritus of Cosmochemistry and Planetary
Atmospheres at the University of Arizona’s Lunar and Planetary
Laboratory and DSI’s chief scientist, envisions a future where “ever
more remote and ever more massive reservoirs of resources” take
astronauts farther and farther from our planet. “First to the Near Earth
Asteroids and the moons of Mars, then to the asteroid belt, then
to…[the] Trojan asteroids and the outer moons of Jupiter, then to the
Saturn system and the Centaurs,” and so on, to infinity.
thebulletin | Director of National Intelligence
James R. Clapper sent shock waves through the national security and
biotechnology communities with his assertion, in his Worldwide Threat Assessment testimony
to the Senate Armed Services Committee in February, that genome editing
had become a global danger. He went so far as to include it in the
report’s weapons of mass destruction section, alongside threats from
North Korea, China’s nuclear modernization, and chemical weapons in
Syria and Iraq. The new technology, he said, could open the door to
“potentially harmful biological agents or products,” with “far-reaching
economic and national security implications.”
So what has warranted this warning, and what can be done to mitigate the threat?
Other
editing techniques have been around for more than a decade, but they are
laborious, less accurate, and quite expensive. Before that,
traditional methods required generations to see results. While some
techniques can recognize longer DNA sequences and have better specificity than Crispr, they are costly ($5,000 for each order versus $30 for Crispr) and difficult to engineer, sometimes requiring several tries to identify a sequence that works. Hence the rise of Crispr, which, along with Crispr associated proteins (Cas), provides a precise way
to target, snip, and insert exact pieces of a genome. (The Crispr-Cas9
protein has received the most attention in this recent discussion, yet
other enzymatic proteins such as the Crispr-Cpf1 use a different type of “scissors” and might be just as effective.)
The
benefits of such technology are obvious. Because preferred traits can
rapidly enter a species, test animals like mice can be designed more
efficiently for biomedical experiments, mosquitoes can be engineered so
they cannot reproduce (and therefore cannot spread malaria), plants can
be developed for drought resistance and higher yields, and diseases can
be eliminated by deactivating the responsible genes in a given host. The
bioengineer and author Robert Carlson notes that genome editing shows great promise for next-generation plastics, agricultural products, bioremediation organisms, carbon-neutral fuels, novel enzymes, and better vaccines.
scientificamerican | It happens hundreds of times a day: We press snooze on the alarm
clock, we pick a shirt out of the closet, we reach for a beer in the
fridge. In each case, we conceive of ourselves as free agents,
consciously guiding our bodies in purposeful ways. But what does science
have to say about the true source of this experience?
In a classic paper published almost 20 years ago,
the psychologists Dan Wegner and Thalia Wheatley made a revolutionary
proposal: The experience of intentionally willing an action, they
suggested, is often nothing more than a post hoc causal inference that
our thoughts caused some behavior. The feeling itself, however, plays no
causal role in producing that behavior. This could sometimes lead us to
think we made a choice when we actually didn’t or think we made a different choice than we actually did.
But there’s a mystery here. Suppose, as Wegner and Wheatley propose,
that we observe ourselves (unconsciously) perform some action, like
picking out a box of cereal in the grocery store, and then only
afterwards come to infer that we did this intentionally. If this is the
true sequence of events, how could we be deceived into believing that we
had intentionally made our choice before the consequences of
this action were observed? This explanation for how we think of our
agency would seem to require supernatural backwards causation, with our
experience of conscious will being both a product and an apparent cause
of behavior.
In a study just published in Psychological Science, Paul Bloom
and I explore a radical—but non-magical—solution to this puzzle.
Perhaps in the very moments that we experience a choice, our minds are
rewriting history, fooling us into thinking that this choice—that was
actually completed after its consequences were subconsciously
perceived—was a choice that we had made all along.