
Friday, August 03, 2012

about the special relationship...,



archeologynewsnetwork | The genetic changes underlying the evolution of new species are still poorly understood. For instance, we know little about critical changes that have happened during human evolution. Genetic studies in domestic animals can shed light on this process due to the rapid evolution they have undergone over the last 10,000 years. A new study published today describes how a complex genomic rearrangement causes a fascinating phenotype in chickens.

In the study published in PLoS Genetics researchers at Uppsala University, Swedish University of Agricultural Sciences, North Carolina State University and National Chung-Hsing University have investigated the genetic basis of fibromelanosis, a breed characteristic of the Chinese Silkie chicken. This trait involves a massive expansion of pigment cells that not only makes the skin and comb black but also causes black internal organs. Chickens similar in appearance to the Silkie were described by Marco Polo when he visited China in the 13th century and Silkie chickens have a long history in Chinese cuisine and traditional Chinese medicine.

archeologynewsnetwork | The domestication of chickens has given rise to rapid and extensive changes in genome function. A research team at Linköping University in Sweden has established that the changes are heritable, even though they do not alter the DNA sequence itself.

Humans first kept Red Junglefowl as livestock about 8,000 years ago. Evolutionarily speaking, the sudden emergence of an enormous variety of domestic fowl of different colours, shapes and sizes has occurred in record time. The traditional Darwinian explanation is that over thousands of years, people have bred for traits that arose through random, spontaneous mutations in the chickens' genes.

Linköping zoologists, with Daniel Nätt and Per Jensen at the forefront, demonstrate in their study that so-called epigenetic factors play a greater role than previously thought. The study was published in the high-ranking journal BMC Genomics.

archeologynewsnetwork | Dr Alice Storey, an archaeologist at the University of New England, is tracing the global migration routes of domestic chickens back through thousands of years towards their origins in the jungles of South-east Asia.

In doing so, Dr Storey is pioneering the use of DNA from ancient chicken bones recovered from well-dated archaeological sites around the world. This is enabling her to add a fourth dimension – that of time – to an emerging “map” of chicken dispersal. One of the ultimate goals of such research is identifying the original Asian centres of jungle fowl domestication.

“All of our domestic chickens are descended from a few hens that I like to think of as the ‘great, great grandmothers’ of the chicken world,” Dr Storey said.

Biological, linguistic, historical and archaeological data have all contributed to an understanding that chickens accompanied human movements from their Asian homeland west through the Middle East to Europe and Africa, and east through the islands of South-east Asia and the Pacific.

Dr Storey’s analysis of ancient DNA is disentangling complications in this broad picture caused by interactions later than the original dispersal. “Only ancient DNA provides a unit of analysis with the chronological control necessary to reconstruct and disentangle the signals of initial dispersals from those of later interactions,” she said. Hers are the first published reports on the use of ancient DNA in this context.

A paper by Dr Storey and her colleagues, titled “Global dispersal of chickens in prehistory using ancient mitochondrial DNA signatures”, is published today in the online scientific journal PLoS ONE.

Sunday, May 10, 2015

dna printing

NPR |  Here's something that might sound strange: There are companies now that print and sell DNA.
This trend — which uses the term "print" in the sense of making a bunch of copies speedily — is making particular stretches of DNA much cheaper and easier to obtain than ever before. That excites many scientists who are keen to use these tailored strings of genetic instructions to do all sorts of things, ranging from finding new medical treatments to genetically engineering better crops.

"So much good can be done," says Austen Heinz, CEO of Cambrian Genomics in San Francisco, one of the companies selling these stretches of DNA.

But some of the ways Heinz and others talk about the possible uses of the technology also worry some people who are keeping tabs on the trend.

"I have significant concerns," says Marcy Darnovsky, who directs the Center for Genetics and Society, a genetics watchdog group.

A number of companies have been taking advantage of several recent advances in technology to produce DNA quickly and cheaply. Heinz says his company has made the process even cheaper.
"Everyone else that makes DNA, makes DNA incorrectly and then tries to fix it," Heinz says. "We don't fix it. We just see what's good, what's bad and then we use the correct pieces."

Monday, February 21, 2011

ephaptic coupling

Cordis | Researchers long believed that neurons in the brain communicated only through physical connections known as synapses. However, EU-funded neuroscientists have uncovered strong evidence that neurons also communicate with each other through weak electric fields, a finding that could help us understand how biophysics gives rise to cognition.

The study, published in the journal Nature Neuroscience, was funded in part by the EUSYNAPSE ('From molecules to networks: understanding synaptic physiology and pathology in the brain through mouse models') project, which received EUR 8 million under the 'Life sciences, genomics and biotechnology for health' Thematic area of the EU's Sixth Framework Programme (FP6).

Lead author Dr Costas Anastassiou, a postdoctoral scholar at the California Institute of Technology (Caltech) in the US, and his colleagues explain how the brain is an intricate network of individual nerve cells, or neurons, that use electrical and chemical signals to communicate with one another.

Every time an electrical impulse races down the branch of a neuron, a tiny electric field surrounds that cell. A few neurons are like individuals talking to each other and having small conversations. But when they all fire together, it's like the roar of a crowd at a sports game.

That 'roar' is the summation of all the tiny electric fields created by organised neural activity in the brain. While it has long been recognised that the brain generates weak electrical fields in addition to the electrical activity of firing nerve cells, these fields were considered epiphenomena - superfluous side effects.

Little was known about these weak fields because they are usually too weak to measure at the level of individual neurons, whose dimensions are measured in millionths of a metre (microns). Therefore, the researchers decided to determine whether these weak fields have any effect on neurons.

Experimentally, measuring such weak fields emanating from or affecting a small number of brain cells was no easy task. Extremely small electrodes were positioned in close proximity to a cluster of rat neurons to look for 'local field potentials', the electric fields generated by neuron activity. The result? They were successful in measuring fields as weak as one millivolt (one thousandth of a volt).

Commenting on the results, Dr Anastassiou says: 'Because it had been so hard to position that many electrodes within such a small volume of brain tissue, the findings of our research are truly novel. Nobody had been able to attain this level of spatial and temporal resolution.'

What they found was surprising. 'We observed that fields as weak as one millivolt per millimetre robustly alter the firing of individual neurons, and increase the so-called "spike-field coherence" - the synchronicity with which neurons fire with relationship to the field,' he says.
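The "spike-field coherence" Anastassiou describes can be given a concrete, if simplified, form. Below is a minimal sketch on synthetic data, assuming a standard phase-locking-value measure rather than the study's actual analysis pipeline: it asks how consistently spikes line up with a particular phase of the field oscillation.

```python
# Minimal sketch of spike-field coherence as a phase-locking value (PLV).
# Synthetic data; not the Caltech group's actual pipeline.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                            # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)           # 10 s of simulated recording
lfp = np.sin(2 * np.pi * 8 * t)        # idealized 8 Hz local field potential

# Simulate spikes biased toward the peaks of the field oscillation
rng = np.random.default_rng(0)
spike_idx = np.flatnonzero((lfp > 0.9) & (rng.random(t.size) < 0.05))

phase = np.angle(hilbert(lfp))         # instantaneous phase of the field
spike_phases = phase[spike_idx]

# PLV: 1.0 means spikes always fire at the same field phase, 0 means no locking
plv = np.abs(np.mean(np.exp(1j * spike_phases)))
print(f"spike-field phase-locking value: {plv:.2f}")
```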

Saturday, July 19, 2014

wade in the nytimes: adventures in very recent evolution



NYTimes |  Ten thousand years ago, people in southern China began to cultivate rice and quickly made an all-too-tempting discovery — the cereal could be fermented into alcoholic liquors. Carousing and drunkenness must have started to pose a serious threat to survival because a variant gene that protects against alcohol became almost universal among southern Chinese and spread throughout the rest of China in the wake of rice cultivation. 

The variant gene rapidly degrades alcohol to a chemical that is not intoxicating but makes people flush, leaving many people of Asian descent a legacy of turning red in the face when they drink alcohol. 

The spread of the new gene, described in January by Bing Su of the Chinese Academy of Sciences, is just one instance of recent human evolution and in particular of a specific population’s changing genetically in response to local conditions. 

Scientists from the Beijing Genomics Institute last month discovered another striking instance of human genetic change. Among Tibetans, they found, a set of genes evolved to cope with low oxygen levels as recently as 3,000 years ago. This, if confirmed, would be the most recent known instance of human evolution. 

Many have assumed that humans ceased to evolve in the distant past, perhaps when people first learned to protect themselves against cold, famine and other harsh agents of natural selection. But in the last few years, biologists peering into the human genome sequences now available from around the world have found increasing evidence of natural selection at work in the last few thousand years, leading many to assume that human evolution is still in progress. 

“I don’t think there is any reason to suppose that the rate has slowed down or decreased,” says Mark Stoneking, a population geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Saturday, May 22, 2010

a short course on synthetic genomics

Edge | On July 24, 2009, a small group of scientists, entrepreneurs, cultural impresarios and journalists that included architects of some of the leading transformative companies of our time (Microsoft, Google, Facebook, PayPal), arrived at the Andaz Hotel on Sunset Boulevard in West Hollywood, to be offered a glimpse, guided by George Church and Craig Venter, of a future far stranger than Mr. Huxley had been able to imagine in 1948.

In this future — whose underpinnings, as Drs. Church and Venter demonstrated, are here already — life as we know it is transformed not by the error catastrophe of radiation damage to our genetic processes, but by the far greater upheaval caused by discovering how to read genetic sequences directly into computers, where the code can be replicated exactly, manipulated freely, and translated back into living organisms by writing the other way. "We can program these cells as if they were an extension of the computer," George Church announced, and proceeded to explain just how much progress has already been made.

Wednesday, November 03, 2010

genetic engineering of space travelers


Video - Twilight Zone Episode Third From the Sun

LiveScience | NASA's human spaceflight program might take some giant leaps forward if the agency embraces genetic engineering techniques more fully, according to genomics pioneer J. Craig Venter.

The biologist, who established the J. Craig Venter Institute that created the world's first synthetic organism earlier this year, told a crowd here Saturday (Oct. 30) that human space exploration could benefit from more genetic screening and genetic engineering. Such efforts could help better identify individuals most suited for long space missions, as well as make space travel safer and more efficient, he said.

"I think this could change the shape of what NASA does, if you make the commitment to do it," said Venter, who led a team that decoded the human genome a decade ago. Venter spoke to a group of scientists and engineers who gathered at NASA's Ames Research Center for two different meetings: a synthetic biology workshop put on by NASA, and Space Manufacturing 14: Critical Technologies for Space Settlement, organized by the nonprofit Space Studies Institute.

Astronauts with the right (genetic) stuff

Genetics techniques could come in extremely handy during NASA's astronaut selection process, Venter said. The space agency could screen candidates for certain genes that help make good spaceflyers — once those genes are identified, he added.

Genes that encode robust bone regeneration, for example, would be a plus, helping astronauts on long spaceflights battle the bone loss that is typically a major side effect of living in microgravity. Also a plus for any prospective astronaut: genes that code for rapid repair of DNA, which can be damaged by the high radiation levels in space.

Genetic screening would be a natural extension of what NASA already does — it would just add a level of precision, according to Venter.

"NASA's been doing genetic selection for a long time," he said. "You just don't call it that."

Last summer, the agency chose just nine astronaut candidates — out of a pool of 3,500 — for its rigorous astronaut training program based on a series of established spaceflight requirements and in-depth interviews.

A new microbiome
At some point down the road, NASA could also take advantage of genetic engineering techniques to make long space journeys more efficient and easier on astronauts, Venter said.

As an example, he cited the human microbiome, the teeming mass of microbes that live on and inside every one of us. Every human body hosts about 100 trillion microbes — meaning the bugs outnumber our own cells by a factor of at least 10 to one.

While humans only have about 20,000 genes, our microbiome boasts a collective 10 million or so, Venter said. These microbes provide a lot of services, from helping us digest our food to keeping our immune system's inflammation response from going overboard.

With some tailoring, the microbiome could help us out even more, according to Venter.

"Why not come up with a synthetic microbiome?" he asked.

Theoretically, scientists could engineer gut microbes that help astronauts take up nutrients more efficiently. A synthetic microbiome could also eliminate some pathogens, such as certain bacteria that can cause dental disease. Other tweaks could improve astronauts' living conditions, and perhaps their ability to get along with each other in close quarters.

Body odor is primarily caused by microbes, Venter said. A synthetic microbiome could get rid of the offenders, as well as many gut microbes responsible for excessive sulfur or methane production. Fist tap Nana.

Friday, October 28, 2016

Computational Genomics F'Real...,


WSJ |  A QUICK RIDDLE: WHAT DO 100 works of classic literature, a seed database from the nonprofit Crop Trust and the Universal Declaration of Human Rights have in common? All of them were recently converted from bits of digital data to strands of synthetic DNA. In addition to these weighty files, researchers at Microsoft and the University of Washington converted a high-definition music video of “This Too Shall Pass” by the alternative rock band OK Go. The video is an homage to Rube Goldberg-like contraptions, which bear more than a passing resemblance to the labyrinthine process of transforming data into the genetic instructions that shape all living things.

This recent data-to-DNA conversion, completed in July, totaled 200 megabytes—which would barely register on a 16-gigabyte iPhone. It’s not a huge amount of information, but it bested the previous DNA storage record, set by scientists at Harvard University, by a factor of about 10. To achieve this, researchers concocted a convoluted process to encode the data, store it in synthetic DNA and then use DNA sequencing machines to retrieve and, finally, decode the data. The result? The exact same files they began with.

Which raises the question: Why bother?

“We are seeing this explosion in the amount of data that needs to be stored,” says Karin Strauss, the principal Microsoft researcher on the project. “To continue storing this information, we need radical new approaches.” In an age of gargantuan, power-sucking data centers, the space-saving potential of data stored in DNA is staggering. “You can archive all the data on the internet in a shoebox,” says Luis Ceze, an associate professor of computer science and engineering at the University of Washington.
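To make the conversion concrete, here is a toy sketch of the bits-to-bases step, assuming a naive two-bits-per-nucleotide code. The Microsoft/University of Washington system layers addressing, redundancy and error correction on top of something like this; none of that is shown.

```python
# Toy bits-to-DNA round trip using a naive 2-bits-per-base code.
ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {base: bits for bits, base in ENCODE.items()}

def to_dna(data: bytes) -> str:
    """Map every pair of bits to one nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand: str) -> bytes:
    """Invert the mapping: nucleotides back to bits, bits back to bytes."""
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

payload = b"OK Go"
strand = to_dna(payload)
assert from_dna(strand) == payload    # the exact same file comes back
print(strand)                         # "CATTCAGT...", 20 bases for 5 bytes
```

Real schemes avoid long runs of the same base and add redundancy so that synthesis and sequencing errors can be corrected on readback.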

Thursday, November 17, 2011

thinking outside the genome...,

The Scientist | Not so long ago, the mention of any word with the two syllables “-ō-mics” tacked on the end was usually followed immediately with some response akin to, “Huh?” Today, we’ve gotten to the point where almost no biological phenomenon can escape “omics-ization,” and within the next 25 years, omics will be the biggest, if not the only, game in town. Why? Because we are about to undergo what experts call a phase shift, where a technology drives a fundamental change not just in what is known, but, more importantly, in how we think of ourselves. Put another way: omics is destined to change our patterns of living in ways that only technological revolutions can deliver.

Other technologies have already proven to have similarly deep effects on human culture. Consider the impact of the Internet on commerce, or the influence of GPS systems on travel and navigation. The reach of these technologies extended well beyond the information they generated. They redefined society.

In the last half century, the technology in genomics has provided us with a set of approaches initially as underappreciated as computers were in the early 1970s. “Exotic,” “finicky,” and “geeky” were terms used for mainframe computers that couldn’t even talk with each other. The same transformative technological advances that have turned computers into must-have personal accessories are inevitable for the nascent field of omics. Here are four ways in which omics will reshape the human experience.

Saturday, June 09, 2018

Genetics in the Madhouse: The Unknown History of Human Heredity


nature  |  Who founded genetics? The line-up usually numbers four. William Bateson and Wilhelm Johannsen coined the terms genetics and gene, respectively, at the turn of the twentieth century. In 1910, Thomas Hunt Morgan began showing genetics at work in fruit flies (see E. Callaway Nature 516, 169; 2014). The runaway favourite is generally Gregor Mendel, who, in the mid-nineteenth century, crossbred pea plants to discover the basic rules of heredity.

Bosh, says historian Theodore Porter. These works are not the fount of genetics, but a rill distracting us from a much darker source: the statistical study of heredity in asylums for people with mental illnesses in late-eighteenth- and early-nineteenth-century Britain, wider Europe and the United States. There, “amid the moans, stench, and unruly despair of mostly hidden places where data were recorded, combined, and grouped into tables and graphs”, the first systematic theory of mental illness as hereditary emerged.

For more than 200 years, Porter argues in Genetics in the Madhouse, we have failed to recognize this wellspring of genetics — and thus to fully understand this discipline, which still dominates many individual and societal responses to mental illness and diversity.

The study of heredity emerged, Porter argues, not as a science drawn to statistics, but as an international endeavour to mine data for associations to explain mental illness. Few recall most of the discipline's early leaders, such as the French psychiatrist, or 'alienist', Étienne Esquirol, and the physician John Thurnam, who made the York Retreat in England a "model of statistical recording". Better-known figures, such as statistician Karl Pearson and zoologist Charles Davenport — both ardent eugenicists — come later.

Inevitably, study methods changed over time. The early handwritten correlation tables and pedigrees of patients gave way to more elaborate statistical tools, genetic theory and today’s massive gene-association studies. Yet the imperatives and assumptions of that scattered early network of alienists remain intact in the big-data genomics of precision medicine, asserts Porter. And whether applied in 1820 or 2018, this approach too readily elevates biology over culture and statistics over context — and opens the door to eugenics.

Wednesday, December 22, 2010

ideas of the microbiome and the virome...,

Sciencemag | Humans have been doing battle with bacteria since the 1800s, thwarting disease with antibiotics, vaccines, and good hygiene, with mixed success. But in 2000, Nobel laureate Joshua Lederberg called for an end to the “We good; they evil” thinking that has fueled our war against microbes. “We should think of each host and its parasites as a superorganism with the respective genomes yoked into a chimera of sorts,” he wrote in Science in 2000.

His comments were prescient. This past decade has seen a shift in how we see the microbes and viruses in and on our bodies. There is increasing acceptance that they are us, and for good reason. Nine in 10 of the cells in the body are microbial. In the gut alone, as many as 1000 species bring to the body 100 times as many genes as our own DNA carries. A few microbes make us sick, but most are commensal and just call the human body home. Collectively, they are known as the human microbiome. Likewise, some viruses take up residence in the body, creating a virome whose influence on health and disease is just beginning to be studied.

Their genes and ours make up a metagenome that keeps the body functioning. This past decade we've begun to see how microbial genes affect how much energy we absorb from our foods and how microbes and viruses help to prime the immune system. Viewing the human and its microbial and viral components as intimately intertwined has broad implications. As one immunologist put it, such a shift “is not dissimilar philosophically from the recognition that the Earth is not the center of the solar system.”

This appreciation has dawned gradually, as part of a growing recognition of the key role microbes play in the world. Microbiologists sequencing DNA from soil, seawater, and other environments have discovered vast numbers of previously undetected species. Other genomics research has brought to light incredible intimacies between microbes and their hosts—such as a bacterium called Buchnera and the aphids inside which it lives. A study in 2000 found that each organism has what the other lacks, creating a metabolic interdependency.

One of the first inklings that microbiologists were missing out on the body's microbial world came in 1999, when David Relman of Stanford University in Palo Alto, California, and colleagues found that previous studies of bacteria cultured from human gums had seriously undercounted the diversity there. Turning to samples taken from the gut and from stools, the researchers identified 395 types of bacteria, two-thirds of them new to science.

In 2006, Steven Gill of the University at Buffalo in New York and colleagues did a metagenomics study of the gut, analyzing all the genes they could find in the 78 million bases sequenced. They found metabolic genes that complemented the human genome, including ones that break down dietary fiber, amino acids, or drugs, and others that produce methane or vitamins. This and a more comprehensive survey in 2010 by Jun Wang of BGI-Shenzhen in China and colleagues provided support for the concept of the microbe-human superorganism, with a vast genetic repertoire. Now, large-scale studies have surveyed the microflora in the gut, skin, mouth, nose, and female urogenital tract. The Human Microbiome Project has sequenced 500 relevant microbial genomes out of a planned 3000.

Some of these microbes may play important roles in metabolic processes. In 2004, a team led by Jeffrey Gordon of Washington University School of Medicine in St. Louis, Missouri, found that germ-free mice gained weight after they were supplied with gut bacteria—evidence that these bacteria helped the body harvest more energy from digested foods. Later studies showed that both obese mice and obese people harbored fewer Bacteroidetes bacteria than their normal-weight counterparts.

The microbiome is also proving critical in many aspects of health. The immune system needs it to develop properly. What's more, to protect themselves inside the body, commensal bacteria can interact with immune cell receptors or even induce the production of certain immune system cells. One abundant gut bacterium, Faecalibacterium prausnitzii, proved to have anti-inflammatory properties, and its abundance seems to help protect against the recurrence of Crohn's disease. Likewise, Sarkis Mazmanian of the California Institute of Technology in Pasadena showed that the human symbiont Bacteroides fragilis kept mice from getting colitis. And inserting bacteria isolated from healthy guts restored the microbial communities, curing chronic diarrhea in a patient infected with Clostridium difficile.

Herbert Virgin of Washington University School of Medicine finds a similar role for the virome. In mice, his team found that dormant herpesviruses revved up the immune system just enough to make the mice less susceptible to certain bacterial infections.

The ideas of a microbiome and a virome didn't even exist a decade ago. But now researchers have reason to hope they may one day manipulate the body's viral and microbial inhabitants to improve health and fight sickness.

Friday, December 09, 2016

Like Genomics - Reality is Computational


edgarlowen | A computational model is by far the most reasonable and fruitful approach to reality. The computational model of Universal Reality is both internally consistent and consistent with science and the scientific method. This may initially seem counterintuitive, but there are all sorts of convincing reasons supporting it.

There is overwhelming evidence that everything in the universe is its information or data only and that the observable universe is a computational system:

1. To be comprehensible, which it self-evidently is, reality must be a logically consistent structure. To be logical and to continually happen it must be computable. To be computable it must consist of data because only data is computable. Therefore the content of the observable universe must consist only of programs computing data.

2. The laws of science which best describe reality are themselves logico-mathematical information forms. Why would the equations of science be the best description of reality if reality itself didn’t also consist of similar information structures? This explains the so-called “unreasonable effectiveness of mathematics” in describing the universe (Wigner, 1960).

3. By recognizing that reality is a logico-mathematical structure the laws of nature immediately assume their natural place as an intrinsic part of reality. No longer do they somehow stand outside a physical world while mysteriously controlling it. A physical model of the universe is unable to explain where the laws of nature reside or what their status is (Penrose, 2005).

4. Physical mechanisms to produce effects become unnecessary in a purely computational world. It’s enough to have a consistent logico-mathematical program that computes them in accordance with experimental evidence.

5. When everything that mind adds to our perception of reality is recognized and subtracted all that remains of reality is a computational data structure. This is explained in detail below and can actually be confirmed by carefully analyzed direct experience.

6. We know that our internal simulation of reality exists as neurochemical data in the circuits of our brain. Yet this world appears perfectly real to us. If our cognitive model of reality consists only of data and seems completely real then it’s reasonable to assume that the actual external world could also consist only of data. How else could it be so effectively modeled as data in our brains if it weren’t data itself?

7. This view of reality is tightly consistent with the other insights of Universal Reality, which are cross-consistent with modern science. Total consistency across maximum scope is the test of validity, truth and knowledge (Owen, 2016).

8. This view of reality leads to simple elegant solutions of many of the perennial problems of science and the nature of reality and leads directly to many new insights. Specifically it leads to a clear understanding of the nature of consciousness and also enables a new understanding of spacetime that conceptually unifies quantum theory and general relativity and resolves the paradoxical nature of the quantum world (Owen, 2016).

9. These insights complete the progress of science itself in reducing everything to data by revealing how both mass-energy and spacetime, the last remaining bastions of physicality, can be reduced to data as explained in Universal Reality (Owen, 2016).

10. Viewing the universe as running programs computing its data changes nothing about the universe which continues exactly as before. It merely completes the finer and finer analysis of all things including us into their most elemental units. It’s simply a new way of looking at what already exists in which even the elementary particles themselves consist entirely of data while everything around us remains the same. Reality remained exactly the same when everything was reduced to its elementary particles, and it continues to remain the same when those particles are further reduced to their data.

Sunday, August 02, 2015

meticulously planned parenthood WILL NOT be taken slowly because tards are scared of it...,


SA |  The official policy of the American Society of Reproductive Medicine is as follows: “Whereas preimplantation sex selection is appropriate to avoid the birth of children with genetic disorders, it is not acceptable when used solely for nonmedical reasons.” Yet in a 2006 survey of 186 U.S. fertility clinics, 58 allowed parents to choose sex as a matter of preference. And that was seven years ago. More recent statistics are scarce, but fertility experts confirm that sex selection is more prevalent now than ever.

“A lot of U.S. clinics offer non-medical sex selection,” says Jeffrey Steinberg, director of The Fertility Institutes, which has branches in Los Angeles, New York and Guadalajara, Mexico. “We do it every single day. We did three this morning.”

In 2009 Steinberg announced that he would soon give parents the option to choose their child’s skin color, hair color and eye color in addition to sex. He based this claim on studies in which scientists at deCode Genetics in Iceland suggested they could identify the skin, hair and eye color of a Scandinavian by looking at his or her DNA. “It's time for everyone to pull their heads out of the sand,” Steinberg proclaimed to the BBC at the time. Many fertility specialists were outraged. Mark Hughes, a pioneer of pre-implantation genetic diagnosis, told the San Diego Union-Tribune that the whole idea was absurd, and the Wall Street Journal quoted him as saying that “no legitimate lab would get into it and, if they did, they'd be ostracized.” Likewise, Kari Stefansson, chief executive of deCode, did not mince words with the WSJ: “I vehemently oppose the use of these discoveries for tailor-making children,” he said. Fertility Institutes even received a call from the Vatican urging its staff to think more carefully. Steinberg withdrew his proposal.

But that does not mean he and other likeminded clinicians and entrepreneurs have forgotten about the possibility of parents molding their children before birth. “I’m still very much in favor of using genetics for all it can offer us,” Steinberg says, “but I learned a lesson: you really have to take things very, very slowly, because science is scary to a lot of people.” Most recently, a minor furor erupted over a patent awarded to the personal genomics company 23andMe. The patent in question, issued on September 24th, describes a method of “gamete donor selection based on genetic calculations." 23andMe would first sequence the DNA of a man or woman who wants a baby as well as the DNA of several potential sperm or egg donors. Then, the company would calculate which pairing of hopeful parent and donor would most likely result in a child with various traits.

Illustrations in the patent depict drop-down menus with choices like: “I prefer a child with Low Risk of Colorectal Cancer;” “High Probability of Green Eyes;” “100% Likely Sprinter;” and “Longest Expected Life Span” or “Least Expected Life Cost of Health Care.” All the choices are presented as probabilities because, in most cases, the technique 23andMe describes could not guarantee that a child will or will not have a certain trait. Their calculations would be based on an analysis of two adults’ genomes using DNA derived from blood or saliva, which does reflect the genes inside those adults’ sperm and eggs. Every adult cell in the human body has two copies of every gene in that person’s genome; in contrast, sperm and eggs have only one copy of each gene and which copy is assigned to which gamete is randomly determined. Consequently, every gamete ends up with a unique set of genes. Scientists have no way of sequencing the DNA inside an individual sperm or egg without destroying it.
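Why the patent can promise only probabilities follows from basic Mendelian arithmetic, sketched below with a hypothetical two-allele locus. The letters and trait are made up for illustration; this is not 23andMe's method.

```python
# Toy Mendelian calculation: each gamete carries one randomly chosen copy
# from each adult, so offspring genotypes are odds, not guarantees.
from itertools import product
from collections import Counter

parent = ("G", "b")   # hypothetical parent genotype at an eye-color locus
donor = ("b", "b")    # hypothetical donor genotype

# All equally likely pairings of one copy from each adult
outcomes = Counter("".join(sorted(pair)) for pair in product(parent, donor))
total = sum(outcomes.values())
for genotype, n in outcomes.items():
    print(f"P(child is {genotype}) = {n}/{total}")
# With these inputs: P(Gb) = 2/4 and P(bb) = 2/4, hence
# "High Probability of Green Eyes", never a certainty.
```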

“When we originally introduced the tool and filed the patent there was some thinking the feature could have applications for fertility clinics. But we’ve never pursued the idea, and have no plans to do so,” 23andMe spokeswoman Catherine Afarian said in a prepared statement. Nevertheless, doctors using PGD can already—or will soon be able to—accomplish at least some of what 23andMe proposes and give parents a few of the choices the Freemans made about their second son.

Saturday, March 09, 2013

inside china's bio-google

technologyreview | BGI-Shenzhen, once known as the Beijing Genomics Institute, has burst from relative obscurity to become the world’s most prolific sequencer of human, plant, and animal DNA. In 2010, with the aid of a $1.58 billion line of credit from China Development Bank, BGI purchased 128 state-of-the-art DNA sequencing machines for about $500,000 apiece. It now owns 156 sequencers from several manufacturers and accounts for some 10 to 20 percent of all DNA data produced globally. So far, it claims to have completely sequenced some 50,000 human genomes—far more than any other group.

BGI’s sheer size has already put Chinese gene research on the map. Those same economies of scale could also become an advantage as comprehensive gene readouts become part of everyday medicine. The cost of DNA sequencing is falling fast. In a few years, it’s likely that millions of people will want to know what their genes predict about their health. BGI might be the one to tell them.

The institute hasn’t only initiated a series of grandly conceived science projects. (In January, it announced it had determined the DNA sequence of not one but 90 varieties of chickpeas.) It’s also pioneered a research-for-hire business to decode human genomes in bulk, taking orders from the world’s top drug companies and universities. Last year, BGI even started to install satellite labs inside foreign research centers and staff them with Chinese technicians.

BGI’s rise is regarded with curiosity and some trepidation, not just because of the organization’s size but also because of its opportunistic business approach (it has a center for pig cloning, dabbles in stem-cell research, and runs a diagnostics lab). The institute employs 4,000 people, as many as a midsize university—1,000 in its bioinformatics division alone. Like Zhao, most are young—the average age is 27—and some sleep in company dormitories. The average salary is $1,500 a month.

Ten years ago, the international Human Genome Project was finishing up the first copy of the human genetic code at a cost of $3 billion. Thanks to a series of clever innovations, the cost to read out the DNA in a person’s genome has since fallen to just a few thousand dollars. Yet that has only created new challenges: how to store, analyze, and make sense of the data. According to BGI, its machines generate six terabytes of data each day.

Zhang Yong, 33, a BGI senior researcher, predicts that within the next decade the cost of sequencing a human genome will fall to just $200 or $300 and BGI will become a force in assembling a global “bio-Google”—it will help “organize all the world’s biological information and make it universally accessible and useful.”

Thursday, February 09, 2023

The Application Of Machine Learning To Evidence Based Medicine

 
What if, bear with me now, what if the phase 3 clinical trials for mRNA therapeutics conducted on billions of unsuspecting, hoodwinked and bamboozled humans were a new kind of research done to yield a new depth and breadth of clinical data, exceptionally useful toward breaking up logjams in clinical terminology as well as experimental sample size? Vaxxed vs. unvaxxed is the subject of long-term gubmint surveillance now. To what end?

Nature  | Recently, advances in wearable technologies, data science and machine learning have begun to transform evidence-based medicine, offering a tantalizing glimpse into a future of next-generation ‘deep’ medicine. Despite stunning advances in basic science and technology, clinical translations in major areas of medicine are lagging. While the COVID-19 pandemic exposed inherent systemic limitations of the clinical trial landscape, it also spurred some positive changes, including new trial designs and a shift toward a more patient-centric and intuitive evidence-generation system. In this Perspective, I share my heuristic vision of the future of clinical trials and evidence-based medicine.

Main

The last 30 years have witnessed breathtaking, unparalleled advancements in scientific research—from a better understanding of the pathophysiology of basic disease processes and unraveling the cellular machinery at atomic resolution to developing therapies that alter the course and outcome of diseases in all areas of medicine. Moreover, exponential gains in genomics, immunology, proteomics, metabolomics, gut microbiomes, epigenetics and virology in parallel with big data science, computational biology and artificial intelligence (AI) have propelled these advances. In addition, the dawn of CRISPR–Cas9 technologies has opened a tantalizing array of opportunities in personalized medicine.

Despite these advances, their rapid translation from bench to bedside is lagging in most areas of medicine and clinical research remains outpaced. The drug development and clinical trial landscape continues to be expensive for all stakeholders, with a very high failure rate. In particular, the attrition rate for early-stage developmental therapeutics is quite high, as more than two-thirds of compounds succumb in the ‘valley of death’ between bench and bedside [1,2]. To bring a drug successfully through all phases of drug development into the clinic costs an estimated 1.5–2.5 billion dollars [3,4]. This, combined with the inherent inefficiencies and deficiencies that plague the healthcare system, is leading to a crisis in clinical research. Therefore, innovative strategies are needed to engage patients and generate the necessary evidence to propel new advances into the clinic, so that they may improve public health. To achieve this, traditional clinical research models should make way for avant-garde ideas and trial designs.

Before the COVID-19 pandemic, the conduct of clinical research had remained almost unchanged for 30 years and some of the trial conduct norms and rules, although archaic, were unquestioned. The pandemic exposed many of the inherent systemic limitations in the conduct of trials [5] and forced the clinical trial research enterprise to reevaluate all processes—it has therefore disrupted, catalyzed and accelerated innovation in this domain [6,7]. The lessons learned should help researchers to design and implement next-generation ‘patient-centric’ clinical trials.

Chronic diseases continue to impact millions of lives and cause major financial strain to society [8], but research is hampered by the fact that most of the data reside in data silos. The subspecialization of the clinical profession has led to silos within and among specialties; every major disease area seems to work completely independently. However, the best clinical care is provided in a multidisciplinary manner with all relevant information available and accessible. Better clinical research should harness the knowledge gained from each of the specialties to achieve a collaborative model enabling multidisciplinary, high-quality care and continued innovation in medicine. Because many disciplines in medicine view the same diseases differently—for example, infectious disease specialists view COVID-19 as a viral disease while cardiology experts view it as an inflammatory one—cross-discipline approaches will need to respect the approaches of other disciplines. Although a single model may not be appropriate for all diseases, cross-disciplinary collaboration will make the system more efficient to generate the best evidence.

Over the next decade, the application of machine learning, deep neural networks and multimodal biomedical AI is poised to reinvigorate clinical research from all angles, including drug discovery, image interpretation, streamlining electronic health records, improving workflow and, over time, advancing public health (Fig. 1). In addition, innovations in wearables, sensor technology and Internet of Medical Things (IoMT) architectures offer many opportunities (and challenges) to acquire data [9]. In this Perspective, I share my heuristic vision of the future of clinical trials and evidence generation and deliberate on the main areas that need improvement in the domains of clinical trial design, clinical trial conduct and evidence generation.

Fig. 1: Timeline of drug development from the present to the future. The figure represents the timeline from drug discovery to first-in-human phase 1 trials and ultimately FDA approval. Phase 4 studies occur after FDA approval and can go on for several years. There is an urgent need to reinvigorate clinical trials through drug discovery, interpreting imaging, streamlining electronic health records, and improving workflow, over time advancing public health. AI can aid in many of these aspects in all stages of drug development. DNN, deep neural network; EHR, electronic health records; IoMT, internet of medical things; ML, machine learning.

Clinical trial design

Trial design is one of the most important steps in clinical research—better protocol designs lead to better clinical trial conduct and faster ‘go/no-go’ decisions. Moreover, losses from poorly designed, failed trials are not only financial but also societal.

Challenges with randomized controlled trials

Randomized controlled trials (RCTs) have been the gold standard for evidence generation across all areas of medicine, as they allow unbiased estimates of treatment effect without confounders. Ideally, every medical treatment or intervention should be tested via a well-powered and well-controlled RCT. However, conducting RCTs is not always feasible owing to challenges in generating evidence in a timely manner, cost, designs based on narrow populations that preclude generalizability, ethical barriers and the time taken to conduct these trials. By the time they are completed and published, RCTs become quickly outdated and, in some cases, irrelevant to the current context. In the field of cardiology alone, 30,000 RCTs have not been completed owing to recruitment challenges [10]. Moreover, trials are being designed in isolation and within silos, with many clinical questions remaining unanswered. Thus, traditional trial design paradigms must adapt to contemporary rapid advances in genomics, immunology and precision medicine [11].

Monday, November 05, 2007

Don't Take it Personal...,

In the last several months a potential new tool for diabetes prevention has come to market. A test developed by the Icelandic genomics company deCode Genetics and marketed to consumers by San Francisco-based DNA Direct determines whether people carry copies of a genetic variation that can greatly increase the risk of developing type 2 diabetes. Diabetes is the result of a complex mix of genetic and environmental factors. But recent genomic studies have identified several genetic variations that contribute heavily to the disease. The one that exerts by far the biggest influence occurs in a gene called TCF7L2, which was discovered by scientists at deCode in 2005; almost 20 percent of people with type 2 diabetes carry two copies of the high-risk version of the gene. These people are thought to secrete less insulin, a crucial hormone that signals cells to store glucose for energy. A single copy of the variation somewhat increases the risk of contracting the disease, and two copies double the risk, regardless of other risk factors.
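How such a consumer test might translate a genotype into a risk figure can be sketched in a few lines. Only the "two copies double the risk" relationship comes from the article; the baseline risk and the single-copy multiplier below are invented placeholders.

```python
# Illustrative only: maps copies of the TCF7L2 high-risk variant to risk.
# The 2.0 multiplier for two copies is from the article; the baseline
# risk and the 1.4 single-copy multiplier are invented placeholders.
RELATIVE_RISK = {0: 1.0, 1: 1.4, 2: 2.0}

def estimated_risk(copies: int, baseline: float = 0.08) -> float:
    """Return a hypothetical absolute risk of type 2 diabetes."""
    return baseline * RELATIVE_RISK[copies]

for copies in range(3):
    print(f"{copies} copies: {estimated_risk(copies):.0%} estimated risk")
```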

The Genetic Information Non-Discrimination Act, which would prohibit discrimination in employment or insurance on genetic grounds, is still lounging in the Senate, despite presidential support and a whopping margin (420-3) in the House of Representatives.

Do you really imagine that there's anything about you that "our" governing aristocrats consider inviolable?

Remember, what, it's all about...,

Saturday, November 24, 2007

Stem Cell Breakthrough

So we're doing the family chitchat over dinner yesterday evening, and my daughter is on about how much more opulent her lifestyle will be than the one she currently enjoys with her mother and me. This child has been versed in the realities, so it's a humbling parental reality check to witness all those carefully sown facts disintegrate into nothingness in the face of the peer and media information onslaught (consensus reality).

Anyway, not wanting to harsh her Thanksgiving mellow with my own somber outlook on what's around the signpost up ahead, I ask her if she's on top of this week's big news in the life sciences. Thankfully, both children are very interested in biology, and I've made a point of trying to steer them in the direction of computational biology, genomics, and so forth, because this area has an exceptional concentration of new institutional investment, and because they have easy access and extensive potential exposure to folks working in this fundamentally important area:
To help explain this, we turn to Kenneth Miller, a cell biologist and professor at Brown University. He also serves as an adviser to the NewsHour's Science Unit.

Well, Ken, let's start with the science here. What does it actually mean to reprogram cells?

KENNETH MILLER, Cell Biologist: Well, what it means to reprogram cells builds upon essentially a trick. And it's a trick that our own reproductive cells pull off when a sperm and egg unite to form an embryo.

The cells in an adult body -- skin cells, muscle cells, nerve cells -- are sort of at dead ends. In other words, that skin cell is going to remain a skin cell; that muscle cell is going to remain a muscle cell.

But our reproductive cells have the ability to go back to stage one, form a single-celled embryo, and then grow into every one of the tissues and cells in the body. That reprogramming is something that happens with us normally between each generation.

What developmental biologists have longed to understand is how that reprogramming takes place. And what this development means today is that we are a little bit closer to understanding how to switch on the reprogramming, take one of our adult cells, trick it into thinking it's part of an embryo, and hopefully get that cell to develop into cells that we really need to repair or to heal the body.

JEFFREY BROWN: And this work came out of studies that were done on mice, right? We talked about it on the program when that was done. So what's the advance here?

KENNETH MILLER: Well, the advance here, on one hand, the advance isn't much. In other words, you could minimize it. You could say, back in June, three laboratories reported that it was possible to pull this feat off, of taking an ordinary adult cell, sticking a few extra genes in it, and reprogramming it to become an embryonic stem cell, and that was done in one species, mice.

The development today is now it's been done in another species. And you might say, "Big deal." But that other species happens to be human beings, human cells. And now it's getting close to having direct application in hospitals and in laboratories.
You never know. Maybe we are on the cusp of the singularity, and Moore's Law applied to the quantum computational apparatus of life itself will unlock a bounty and a utopian rather than dystopian future for this child of mine dreaming of opulence....

Thursday, July 15, 2021

The Delta Variant Narrative Eloquently Makes The Case For A Malfeasant Elite Agenda

Data show that:
Experimental mRNA Therapeutic Jabs REDUCE: symptoms, hospital admissions, and fatalities.

Data also show that:
The mRNA jabbed STILL become: infected; symptomatic; infectious; even super spreaders.

So what exactly does the planned vaccination passport aim to achieve other than a backdoor around lockdown protections and to create a two tiered political economy with no genuine biosecurity utility - only seething interpersonal resentment?

Polio vaccine gives a rough approximation of “sterilizing immunity.” You can fight off any infections by the polio virus for pretty much the rest of your life.

The state-of-the-art Coronavirus ‘mRNA Therapeutics’ are showing a steep drop-off in effectiveness after six to eight months. Even with that, vaccinated individuals can and do catch Covid and spread Covid.

Have you ever heard or read of someone who has had a polio vaccination giving it to anyone else afterward?

Even when people are willing to get the coronavirus ‘mRNA jab,’ most of the working class in this country cannot afford the opportunity costs of getting the ‘jab.’ No health ‘care,’ no paid or even unpaid time off to recover from the after-effects many experience from having had the shots, etc.

For some as yet unspecified reasons, the American Health Establishment has massively bungled its handling of the Coronavirus Pandemic. Even now, the ‘vaccinated’ are told that they do not need to mask in public, even when the evidence says otherwise; even when they can be the disease vectors that ignite new ‘hot spots’ of infection - simply by not staying masked in public.

I am highly suspicious of the nature of the “bungling” in evidence in Britain and America.

I suspect now that a decision was made to let the disease run wild so as to “cull the herd.” Having survived an infection last year, I've become emotionally detached from the demographic sub-populations targeted for Covid 'culling.'

Some years ago, I would more openly profess my Malthusian predilections. However, now that I can plainly see a cull being effected by Elites via nullification of the Social Contract - I'm much less sanguine about the prospects for population control and much clearer about the utility of plague for human livestock management. 

NYTimes |  Those who have been inoculated against the coronavirus have little to worry about. Reports of infections with the Delta variant among fully immunized people in Israel may have alarmed people, but virtually all of the available data indicate that the vaccines are powerfully protective against severe illness, hospitalization and death from all existing variants of the coronavirus.

Even a single dose of vaccines that require two shots seems to prevent the most severe symptoms, although it is a flimsier barrier against symptomatic illness — making it an urgent priority to give people second doses in places like Britain that opted to prioritize first doses.

“When you have populations of unvaccinated individuals, then the vaccines really can’t do their jobs,” said Stacia Wyman, an expert in computational genomics at the University of California, Berkeley. “And that’s where Delta is really a concern.”

Britain’s experience with the Delta variant has highlighted the importance not just of vaccination, but the strategy underlying it. The country ordered inoculations strictly by age, starting with the oldest and carving out few exceptions for younger essential workers, outside of the medical profession.

Thursday, January 16, 2014

Jablonski taking a sledgehammer to race - MUCH more impressed with this woman than I am with myself...,


Edge | Race has always been a vague and slippery concept. In the mid-eighteenth century, European naturalists such as Linnaeus, Comte de Buffon, and Johannes Blumenbach described geographic groupings of humans who differed in appearance. The philosophers David Hume and Immanuel Kant both were fascinated by human physical diversity. In their opinions, extremes of heat, cold, or sunlight extinguished human potential. Writing in 1748, Hume contended that, "there was never a civilized nation of any complexion other than white."

Kant felt similarly. He was preoccupied with questions of human diversity throughout his career, and wrote at length on the subject in a series of essays beginning in 1775. Kant was the first to name and define the geographic groupings of humans as races (in German, Rassen). Kant's races were characterized by physical distinctions of skin color, hair form, cranial shape, and other anatomical features and by their capacity for morality, self-improvement, and civilization. Kant's four races were arranged hierarchically, with only the European race, in his estimation, being capable of self-improvement.

Why did the scientific racism of Hume and Kant prevail in the face of the logical and thoughtful opposition of von Herder and others? During his lifetime, Kant was recognized as a great philosopher, and his status rose as copies of his major philosophical works were distributed and read widely in the nineteenth century. Some of Kant's supporters agreed with his racist views, some apologized for them, or—most commonly—many just ignored them.

The other reason that racist views triumphed over anti-racism in the late eighteenth and nineteenth centuries was that racism was, economically speaking, good for the transatlantic slave trade, which had become the overriding engine of European economic growth. The slave trade was bolstered by ideologies that diminished or denied the humanity of non-Europeans, especially Africans. Such views were augmented by newer biblical interpretations popular at the time that depicted Africans as destined for servitude. Skin color, as the most noticeable racial characteristic, became associated with a nebulous assemblage of opinions and hearsay about the inherent natures of the different races. Skin color stood for morality, character, and the capacity for civilization; it had become a meme.

The nineteenth and early twentieth centuries saw the rise of "race science." The biological reality of races was confirmed by new types of scientific evidence amassed by new types of scientists, notably anthropologists and geneticists. This era witnessed the birth of eugenics and its offspring, the concept of racial purity. The rise of Social Darwinism further reinforced the notion that the superiority of the white race was part of the natural order. The fact that all people are products of complex genetic mixtures resulting from migration and intermingling over thousands of years was not admitted by the racial scientists, nor by the scores of eugenicists who campaigned on both sides of the Atlantic for the improvement of racial quality.

The mid-twentieth century witnessed the continued proliferation of scientific treatises on race. By the 1960s, however, two factors contributed to the demise of the concept of biological races. One of these was the increased rate of study of the physical and genetic diversity of human groups all over the world by large numbers of scientists. The second factor was the increasing influence of the civil rights movement in the United States and elsewhere. Before long, influential scientists denounced studies of race and races because races themselves could not be scientifically defined. Where scientists looked for sharp boundaries between groups, none could be found.

Despite major shifts in scientific thinking, the sibling concepts of human races and a color-based hierarchy of races remained firmly established in mainstream culture through the mid-twentieth century. The resulting racial stereotypes were potent and persistent, especially in the United States and South Africa, where subjugation and exploitation of dark-skinned labor had been the cornerstone of economic growth.

After its "scientific" demise, race remained as a name and concept, but gradually came to stand for something quite different. Today many people identify with the concept of being a member of one or another racial group, regardless of what science may say about the nature of race. The shared experiences of race create powerful social bonds. For many people, including many scholars, races have ceased to be biological categories and have become social groupings. The concept of race became a more confusing mélange of social categories of class and ethnicity. So race isn't "just" a social construction, it is the real product of shared experience, and people choose to identify themselves by race.

Clinicians continue to map observed patterns of health and disease onto old racial concepts such as "White", "Black" or "African American", "Asian," etc. Even after it has been shown that many diseases (adult-onset diabetes, alcoholism, high blood pressure, to name a few) show apparent racial patterns because people share similar environmental conditions, groupings by race are maintained. The use of racial self-categorization in epidemiological studies is defended and even encouraged. In most cases, race in medical studies is confounded with health disparities due to class, ethnic differences in social practices, and attitudes, all of which become meaningless when sufficient variables are taken into account.

Race's latest makeover arises from genomics and mostly within biomedical contexts. The sanctified position of medical science in the popular consciousness gives the race concept renewed esteem. Racial realists marshal genomic evidence to support the hard biological reality of racial difference, while racial skeptics see no racial patterns. What is clear is that people are seeing what they want to see. They are constructing studies to provide the outcomes they expect. In 2012, Catherine Bliss argued cogently that race today is best considered a belief system that "produces consistencies in perception and practice at a particular social and historical moment".

Race has a hold on history, but it no longer has a place in science. The sheer instability and potential for misinterpretation render race useless as a scientific concept. Inventing new vocabularies of human diversity and inequity won't be easy, but is necessary.

Sunday, May 17, 2015

first computational genomics, now computational ethology...,


academia |  Abstract: For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

 Keywords: Minimalism; Labeling effects; cognome; animal cognition; formal language theory; language evolution
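To make the abstract's central distinction concrete, here is a toy rendering in Python (my illustration, not the paper's formalism or notation): Merge simply combines two syntactic objects into an unordered set, while Label additionally projects one member as the head that names the result, the step the paper argues is the human-specific novelty.

```python
# Toy illustration of Merge vs. Label; the set-based encoding is a
# simplification for exposition, not the paper's formal apparatus.

def merge(a, b):
    """Combine two syntactic objects into an unordered set; no head chosen."""
    return frozenset({a, b})

def label(a, b, head):
    """Merge plus the putatively human-specific step: project a head."""
    assert head in (a, b), "the label must come from one of the merged items"
    return (head, merge(a, b))

# Merge alone gives a headless set; Label turns it into a headed phrase.
print(merge("ate", "apples"))              # frozenset({'ate', 'apples'})
print(label("ate", "apples", head="ate"))  # ('ate', frozenset(...)), a verb phrase
```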

Thursday, November 08, 2007

Individual Genetic Variation 5X Greater Than Thought

The First Diploid Sequence of an Individual Human: The highly accurate sequence suggests that our genetic code is five times as variable as we thought

SOURCE: "The Diploid Genome Sequence of an Individual Human," Samuel Levy et al., PLoS Biology 5: e254

RESULTS: Genomics pioneer Craig Venter and his colleagues have generated a highly accurate sequence of Venter's genome, one that includes the DNA sequences inherited from both his mother and his father.

WHY IT MATTERS: The genome sequence generated by the Human Genome Project, the massive, distributed effort to sequence human DNA that was completed in 2003, was a milestone in the history of biology. But the DNA sequence produced by the project represented just one set of chromosomes (every human has two sets, one inherited from each parent), and it drew on DNA samples from many individuals. As a result, it didn't reflect some of the variability between individuals. Venter's diploid genome suggests that genetic variation between individuals is approximately 0.5 percent, not the 0.1 percent that earlier sequencing projects suggested.

METHODS: In the new study, researchers used a method of gene sequencing called Sanger sequencing. The method is more expensive than newer approaches, but it generates longer strings of DNA that are easier to assemble into a complete genome.

NEXT STEPS: Venter and his colleagues plan to add phenotypic information, such as medical records and physical characteristics, to the database housing his genome. This will allow scientists to begin analyzing an individual's genomic information in the context of his or her actual traits.
