Most people in the world share 2-4% of their DNA with Neanderthals, while a few also inherited genes from Denisovans, a study confirms.
Denisovan
DNA lives on only in Pacific island dwellers, while Neanderthal genes
are more widespread, researchers report in the journal Science.
Meanwhile, some parts of our genetic code show little trace of our extinct cousins.
They include hundreds of genes involved in brain development and language.
"These
are big, truly interesting regions," said co-researcher Dr Joshua Akey,
an expert on human evolutionary genetics at the University of Washington School of Medicine in the US.
"It will be a long, hard slog to fully
understand the genetic differences between humans, Denisovans and
Neanderthals in these regions and the traits they influence."
Siberia cave
Studies
of nuclear DNA (the instructions to build a human) are particularly
useful in the case of Denisovans, which are largely missing from the
fossil record.
The prehistoric species was discovered less than a
decade ago through genetic analysis of a finger bone unearthed in a cave
in northern Siberia.
Substantial
amounts of Denisovan DNA have been detected in the genomes of only a
handful of modern-day human populations so far.
"The
genes that we found of Denisovans are only in this one part of the
world [Oceania] that's very far away from that Siberian cave," Dr Akey
told BBC News.
Where the ancestors of modern humans might have had physical contact with Denisovans is a matter of debate, he added.
Denisovans
may have encountered early humans somewhere in South East Asia and,
eventually, some of their descendants arrived on the islands north of
Australia.
Meanwhile, humans repeatedly ran into Neanderthals as they spread across Eurasia.
"We
still carry a little bit of their DNA today," said Dr Akey. "Even
though these groups are extinct their DNA lives on in modern humans."
Genetic ancestry
The
research was carried out by several scientists, including Svante Paabo
of the Department of Evolutionary Genetics at the Max-Planck-Institute
for Evolutionary Anthropology.
They found that all non-African populations inherited about 1.5-4% of their genomes from Neanderthals.
However,
Melanesians were the only population that also had significant
Denisovan genetic ancestry, representing between 1.9% and 3.4% of their
genome.
"I think that people (and Neanderthals and Denisovans)
liked to wander," said Benjamin Vernot of the University of Washington,
who led the project.
"And yes, studies like this can help us track where they wandered."
Less than a year ago, when a group of leading
researchers was calling for a moratorium on the use of a revolutionary
technology, Chinese researchers shocked the world
by using it to genetically modify human embryos. The worry was that
unfettered access to the technology might enable such embryos to become
fully grown humans, who would then pass on mutations to all their
offspring. The risk of unintended consequences seemed too great.
Now a different group of Chinese researchers have
again wielded the technology to genetically modify human embryos. This
time, however, the reaction from some scientists is just an annoyed
shrug. Clearly, a lot has happened in the past year for perceptions to change so drastically.
The technology in question is called CRISPR, and
it allows researchers to make genetic modifications with greater
precision than ever before. In 2015, Chinese researchers used CRISPR to
target genes responsible for a blood disorder called β-thalassaemia.
They were able to replace the defective gene in only 28 out of 71 embryos. Worse still, the technique left a slew of unintended changes in other parts of the genome.
In the latest attempt, researchers at Guangzhou
Medical University have gone a step further. Instead of trying to correct
mutations that could cause disease, they used CRISPR technology to
insert a genetic mutation which might offer resistance against HIV.
The researchers targeted the CCR5 gene, which is responsible for producing a protein that HIV uses to latch on to, enter, and infect a human immune cell. If the CCR5
gene were mutated, the logic goes, the HIV virus would not be able to
infect—and thus the mutation would confer resistance to the disease.
The researchers report in the Journal of Assisted Reproduction and Genetics
that they successfully inserted the mutated gene in four out of 26 embryos. And even in the successful cases, not all copies of the CCR5 gene were modified; in other cases, unintended mutations were introduced elsewhere in the genome.
The experiment had been approved by a local ethics committee, which ensured that the study followed Chinese government guidelines.
All the experimental human embryos were “non-viable,” which means they
would have been unable to become fully grown humans. Such abnormal
embryos are an inevitable part of in-vitro fertilization therapy, where
sometimes two sperm cells insert their DNA into a single egg.
“The results are both comforting and disturbing,”
says Peter Donovan of the University of California at Irvine. “The good
news is that the technique worked for this group in the same way that
it did for the first group… an important part of the scientific process
showing it wasn’t a fluke the first time. The salutary lesson is that
there is still much to be learned about gene editing in human embryos
before it is ready for prime time.”
The rate of failure has made some bioethicists and scientists question the motives
of Chinese researchers who continue to test CRISPR in human embryos.
They argue that, while CRISPR offers greater precision, it still isn’t
ready for testing in human embryos. Others, like Donovan, maintain that studies using donated human embryos will give us the most understanding.
Despite the divided opinion, there is definitely a
change in perception. The first study reporting the genetic
modification of human embryos resulted in a summit
held in November between the science academies of China, the US and the
UK. After days of deliberation, the world’s leading geneticists agreed
that, while no CRISPR-modified embryos should become full human beings,
research using human embryos can continue.
The Chinese group that did the latest work
insists that their “proof-of-concept” may provide solutions for improving
human health. They write, “Despite the significant scientific and
ethical issues involved, however, we believe that it is necessary to
keep developing and improving the technologies for precise genetic
modifications in humans.”
http://qz.com/
2015 was the year it became OK to genetically engineer babies
In April of this year, researchers in China published the results of an experiment in modifying the DNA of human embryos.
Though the embryos they worked on were damaged ones that could not have grown into living babies, the work sent a tremor through the scientific establishment.
Just a month earlier, a group of leading geneticists had called for a moratorium
on gene-editing in embryos. Since any genetic changes would get passed
on to future generations, they argued, the risk of unintended
consequences was too great. And in November, at a summit in Washington
DC, scientists from across the world agreed that, while research should continue, it’s still too risky to allow any altered embryos to grow into full human beings.
And, yet, when historians of science look back
decades from now, they may well mark 2015 as the year genetically
engineering humans became acceptable. That’s because, while the world
was paying attention to the gene-editing summit, a more momentous
decision had been made just a month earlier in the UK. There, a
governmental body got ready to hand out licenses for creating a
particular kind of genetically engineered human—using a technique the US
tried and then banned 13 years ago.
This technique won’t create the fabled “designer
babies” just yet. But the changes made to an embryo will be hereditary,
and thus alter the genetic makeup of all the offspring to follow. The
story of how we got here, and what will come next, is why 2015 will be
remembered as a turning-point.
Just getting better
Our ability to do some form of genetic engineering goes back 40,000 years. Selective breeding created a tamer and more likable version of a wolf, the common ancestor of all today’s dogs.
Our desire to design better humans is also old. In Plato’s Republic, Socrates calls for a state-run program to get the best citizens to mate so that the population could be improved.
By the 19th century, the ideology of eugenics—a word not invented by Plato, but coined much later
from the Greek for “good breeding”—had taken such a hold that countries
were passing laws for such programs. Before World War II, 30 states in
the US had passed some form of eugenics laws that mandated sexual sterilization of those deemed unfit (typically the mentally ill).
Only after the horror of Hitler’s genocide did
the world recoil from eugenics. Most geneticists never returned to the
idea that biological intervention would build a better society than
social intervention. As Nathaniel Comfort, a professor of history of
medicine, writes in Aeon,
eugenics survived only in the form of “preventive medicine for genetic
diseases”—such as screening people for them and, occasionally, treating
them with gene therapy.
Not everyone stopped trying. Taking inspiration from Aldous Huxley’s Brave New World, eugenicist Robert Graham created the Repository for Germinal Choice, a sperm bank for the super-intelligent. The bank existed from 1980 to 1999 and had some 19 high-IQ donors, including at least one Nobel laureate (William Shockley).
The resulting “Genius Babies”—some 200 of
them—are no different from normal people. One of those conceived through
Graham’s sperm bank told CNN, “There’s only so much you can control when it comes to genetics. It all has to do with what you give to your family.”
Beyond our control
That comment defines the limits of science today. In the 1970s, we finally understood how to tweak the genes of microbes, plants and animals to achieve certain traits. But in humans, with the exception of a few things
such as color blindness or tasting certain foods, “designer baby”
traits, such as greater intelligence, taller stature, stronger muscles,
or better memory, are controlled by hundreds of genes, each of which
also performs many other critical functions. Tools that can deal with
such complexity are still a long way off.
For now, then, the only foreseeable use for
gene-editing is to prevent disease. And the goal is to make genetic
tools good enough to do that without any unintended consequences.
Since 1989, thousands of people have received
experimental gene therapies. Typically these involve the use of a
“vector”—a biological vehicle, such as a virus, that can deliver the
correct copy of a faulty gene to the specific cells in the body affected
by the genetic disorder, such as cancer cells or faulty cells in the
eye.
While most of these treatments have been safe,
only a few have been effective. China approved the world’s first gene
therapy in 2003, to treat certain kinds of cancer. Europe approved its first in 2012, for a rare inherited disorder affecting the pancreas. The US is likely to get its first gene therapy approved in 2016, to treat a form of blindness.
None of these therapies, however, have used the latest advance in gene-editing: CRISPR-Cas9,
a highly precise copy-and-paste tool that allows for the removal and
replacement of individual genes. Since its development in 2012, it has
become an instant favorite among scientists.
CRISPR-Cas9’s immediate potential lies in curing
single-gene disorders in embryos. Changes made at that stage would
affect every cell in the body and could cure many diseases. We know some 4,000 such disorders,
and, though each is rare, put together they could change the lives of
millions in the next generation, and keep many more free from those
diseases for all the generations to follow.
However, the Chinese study earlier this year
showed that we might need something even more precise than CRISPR-Cas9
currently is. Only in one-third of the 86 embryos was the faulty gene
erased as predicted, and even in those cases, CRISPR-Cas9 had also
modified things it wasn’t meant to—the unintended consequences
scientists worry about.
There is, however, another form of genetic
engineering of human embryos. This is the one that the US tried and then
banned, and that the British government recently opened licensing
applications for. And what’s probably more important for the future of
the debate is how Britain decided the technology works and is safe.
Three-parent child
Alana Saarinen was born in the US with three
biological parents. Two of them contribute more than 99% of her genetic
material and the third provides the rest. She is one of only 30 or so people in the world who grew from a genetically engineered embryo into a healthy adult.
Sharon and Paul Saarinen had attempted in-vitro
fertilization (IVF) four times, but without success. A likely reason was
that Sharon’s egg cells had faulty mitochondria. These are like
biological batteries within a cell—they play an essential role in
converting your food into the energy that powers your body. Uniquely,
they also have their own DNA (though it’s only some 37 of the 20,000 or so genes that make up the human genome).
During reproduction, when an egg cell fuses with a
sperm cell, creating the first cell of an embryo, it’s only the DNA in
its nucleus that is a mix of both parents’ DNA. Mitochondria and their
DNA are passed on directly from mother to offspring. Because Sharon
Saarinen’s mitochondria were faulty, she basically needed a mitochondria
transplant in order to conceive.
That was where the third parent came in. A donor
provided an egg cell, whose nucleus was removed and healthy mitochondria
along with other bits of the cell were transferred to Sharon’s egg. The
egg cell was then mixed with Paul’s sperm cells in a normal IVF
procedure, and the resulting embryo would become Alana Saarinen.
This technique won’t only help women like Sharon
conceive. In a lot of cases with faulty mitochondria, pregnancies
proceed normally, but the child then turns out to have one of several
mitochondrial diseases, which can lead to all sorts of problems, from poor growth to autism to diabetes. One in every 5,000 children suffers from one of them.
Mitochondrial replacement therapy like the
Saarinens had is currently the only known way of preventing
mitochondrial diseases. But since they conceived Alana in 2000, only
some 30 or so children have been born using the technique. In 2002, the
US Food and Drug Administration (FDA) banned its use. Apart from ethical
concerns about scientists “playing God,” there was a scientific worry
too. We had never attempted to edit the “germ line”—the DNA that is
transferred from one generation to another—and the risks were unknown.
(About 10% of the pregnancies that resulted from this treatment had
complications, but it wasn’t clear whether the procedure was to blame.)
Selfish genes
The FDA ban meant that US women with faulty mitochondria were left with difficult choices. They could choose not to have children, or undergo IVF and pick
the fertilized embryo with the fewest defective mitochondria—taking a
gamble on whether their child would develop mitochondrial disease.
But nearly a decade later the UK government’s
Human Fertilisation and Embryology Authority (HFEA) took up the case. In
2012, after taking a detailed look at the results of studies on animals
and humans, it deemed that mitochondrial replacement therapy was “not
unsafe”—meaning that the benefits of curing mitochondrial disease would
outweigh the risks of the procedure.
The interesting thing was what the HFEA did next. In Sept. 2012, it launched a public consultation,
creating a website that explained both the risks and benefits, and
holding public events to do the same. Then it conducted a survey and
asked people to send in their comments online. After the public had
shown broad support for the therapy—and despite stiff opposition
from scholarly groups and religious groups alike—the HFEA spent two
years taking the necessary steps to get the regulations discussed in
parliament. In February, MPs agreed to allow the use of the therapy
under strict guidelines. In October, the process for handing out
licenses began.
We already use genetic engineering to create
climate-resistant crops and drug-producing bacteria. Now one of the
world’s most scientifically advanced countries—and, fittingly, the
birthplace of IVF—has agreed that genetically modified humans, too, are
sometimes not just OK, but desirable. This is what makes 2015 an
historic year.
Based on past progress, it is likely that genetic
enhancements to humans will become a reality step by step. Just like
mitochondrial replacement therapy, they will first appear for a very
narrow purpose, such as curing single-gene disorders, and then, likely
over many decades, we might reach the stage of creating those fabled
designer babies.
That gives us enough time to deliberate the
implications of each step. When our decisions will affect generations of
humans to come, it is important we use that time well. The process that
HFEA designed to win public and political support is a model worth
emulating. If each step were to get the same scrutiny that mitochondrial
replacement therapy got, genetically modified humans could become as
normal as genetically modified crops and bacteria are today—and, barring
the occasional controversy, as widely accepted.
Corrected Dec. 23: An
earlier version of this post incorrectly said that Sharon Saarinen’s
nucleus was implanted in a donor’s egg. It also said that HFEA began
handing out licenses in October, but in fact it then began the process
of handing them out.
The pros and cons of genetically engineering your children
From time to time, science troubles philosophers
with difficult ethical questions. But none has been as difficult as
considering permanently altering the genetic code of future generations.
At a meeting that began on Dec. 1 in Washington DC, the world’s leading
gene-editing experts met with ethicists, lawyers, and interested
members of the public to decide whether it should be done.
Gene-editing tools have existed since 1975, when a
meeting of a similar kind was held to discuss the future of genetic
technology. But recent developments have made the technology safe enough
to consider turning science fiction into reality. In fact, in April, Chinese researchers announced that
they had conducted experiments to remove genes of an inheritable
disease in human embryos (embryos that were alive but damaged, so they
could not have become babies).
So the stakes are high. By eliminating “bad”
genes from sperm and egg cells—called the “germline”—these tools have
the potential to permanently wipe out diseases caused by single
mutations in genes, such as cystic fibrosis, Huntington’s disease, or
Tay-Sachs.
At the same time, there is huge uncertainty about what could go wrong if seemingly troubling genes are eliminated.
One of the key researchers in the field is
Jennifer Doudna at the University of California, Berkeley. She has been
touted for a Nobel Prize for the development of CRISPR-Cas9, a highly
precise copy-paste genetic tool. In the build-up to the meeting, Doudna
made her concerns clear in Nature:
“Human-germline editing for the
purposes of creating genome-modified humans should not proceed at this
time, partly because of the unknown social consequences, but also
because the technology and our knowledge of the human genome are simply
not ready to do so safely.”
Her sentiments were echoed in a report
released before the meeting by the Center for Genetics and Society.
They believe that research in genetic tools must advance, but only
through therapy for adults (where genetic modifications are targeted at
some cells in the body but not passed on to kids, such as in curing a form of inherited blindness). The report continues:
“But using the same techniques to
modify embryos in order to make permanent, irreversible changes to
future generations and to our common genetic heritage—the human
germline, as it is known—is far more problematic.”
Consider sickle-cell anemia, an occasionally
fatal genetic disorder. Its genes, though clearly harmful, have
persisted and spread because, while having two copies of the sickle-cell
gene causes anemia, having just one copy happens to provide protection
against malaria, one of the most deadly diseases in human history. Had
we not known about their benefits, eliminating sickle-cell genes would
have proved to be a bad idea.
More importantly, there is a worry that once you
allow for designer babies you go down a slippery slope. Emily Smith
Beitiks, disability researcher at the University of California, San
Francisco, said recently:
“These proposed applications raise
social justice questions and put us at risk of reviving
eugenics—controlled breeding to increase the occurrence of ‘desirable’
heritable characteristics. Who gets to decide what diversity looks like
and who is valued?”
But the history of science shows that it is hard
to keep such a cat in the bag. Once developed, technologies have a way
of finding their way into the hands of those who desire to use them.
That worries George Church, a geneticist at Harvard Medical School, who
has been a strong voice in this debate since the beginning. In Nature, he writes:
“Banning human-germline editing
could put a damper on the best medical research and instead drive the
practice underground to black markets and uncontrolled medical tourism,
which are fraught with much greater risk and misapplication.”
And many believe that the risks of gene-editing
are not that high anyway. Nathaniel Comfort, a historian of medicine at
Johns Hopkins University in Baltimore, writes in Aeon:
“The dishes do not come à la carte. If you believe that made-to-order babies are possible, you oversimplify how genes work.”
That is because abilities, such as intelligence,
height, or personality traits, involve thousands of genes. So there may
be some things that you cannot genetically enhance much, and certainly
not safely. And even knowingly changing the human genome is not as big a
deal as some make it out to be, Church notes:
“Offspring do not consent to their
parents’ intentional exposure to mutagenic sources that alter the germ
line, including chemotherapy, high altitude, and alcohol—nor to
decisions that reduce the prospects for future generations, such as
misdirected economic investment and environmental mismanagement.”
The meeting ended on Dec. 3, and the committee of organizers—10 scientists and two bioethicists—came to a conclusion
on the debate. They believe that the promises of germline editing are
too great to scupper future developments. They endorse that research
should continue in non-human embryos and “if, in the process of
research, early human embryos … undergo gene editing, the modified cells
should not be used to establish a pregnancy.” That is because the
committee believes that we neither know enough about safety issues to
allow any clinical application, nor enough about how society will
respond to the use of this technology in humans.
And, yet, perhaps the last word on the debate should go to a woman in the audience at the meeting. Her child died at only six days old, after torturous seizures caused by a genetic ailment.
She implored the research community, “If you have the skills and the knowledge to eliminate these diseases, then freakin’ do it!”
Scientists have synthesized a 'minimal genome' with only genes necessary for life
By Chelsea Harvey
A pioneering accomplishment in the
field of genetic research could help scientists gain new insights into
the very definition of life. The new research, published Thursday in the journal Science,
describes the synthetic creation of a “minimal genome” — a cell
containing only the genes absolutely required to keep itself alive.
With just 473 genes, it’s a smaller genome than that of any living, dividing cell found in nature, and it may provide important insights into the
fundamental genetic requirements for life.
The idea of designing and studying a “minimal genome” is a concept
that’s fascinated scientists for decades. In fact, unlocking the secrets
of the genome has been a preoccupation of genetic researchers since the
first genome sequencing was performed on a bacterium in 1995 — the
event that ultimately led to this week’s breakthrough, according to the
new study’s authors.
“This
is a study that had its origins a little over 20 years ago in 1995,
when this institute sequenced the very first genome in history,
Haemophilus influenzae,” said the new paper’s senior author J. Craig
Venter, founder of the J. Craig Venter Institute, which specializes in
genomic research, during a Wednesday teleconference.
Later that same year, the institute also sequenced the genome of a
second type of bacteria, Mycoplasma genitalium. These breakthroughs
allowed for the first genomic comparisons between two different species,
Venter said.
Venter is most famous for his role as a leader of the team that first sequenced the human genome in 2000.
“[My colleagues] and myself were discussing the philosophy of these
differences in the genomes and decided the only way to answer basic
questions about life would be to get to a minimal genome, and probably
the only way to do that would be by trying to synthesize a genome,”
Venter said.
“And that started our 20-year quest to do this.”
The reason that researchers must synthesize, or essentially design
their own, minimal genome is that just about every living organism we
know of contains more genes than are actually necessary for its basic
survival. Even the simplest bacteria contain extra, nonessential genes
that are related to their growth, development and ability to react to their
environment, but that aren’t technically required to keep the cell
alive.
So in order to get down to a truly minimal genome, scientists must
take an existing genetic sequence and pare it down themselves, cutting
away all the nonessential genes until they end up with only the ones
that are absolutely essential.
They
do this by creating synthetic genomes — genomes that are designed and
chemically built from the ground up using our existing knowledge of an
organism’s genetic information.
Along the way, scientists can add or delete genetic information as
they see fit. It’s the same basic principle that’s used in genetic
engineering research. But in the case of a minimal genome, the goal is
to slice off as much unnecessary genetic information as possible without
changing or adding anything else to the organism’s genome.
And that’s just what Venter and his colleagues set out to do.
DNA minimalism
They started with the genome of a
type of bacteria known as Mycoplasma mycoides, a parasite normally found
in cows and goats. In 2010, the group succeeded in building the complete M. mycoides genome from scratch and transplanting it into another cell.
This time around, they used a variety of methods to whittle the genome down before transplanting it.
To start, the researchers divided the bacterium’s genome into eight
different segments that could be individually altered and tested — just
to make the experiments a little more manageable. They then applied a
handful of techniques to peel away the nonessential genes.
They call this their “design-build-test” approach.
First, they applied their basic knowledge of genetics and
biochemistry to infer which genes might be safe to remove — but this
technique did not produce viable cells.
The researchers then conducted a series of experiments in which they
inserted bits of foreign genetic information — called transposons — into
the genome in order to disrupt the functions of certain genes and
figure out which ones the cell could do without. This process helped
them whittle down the genome until no more genes could be removed.
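The whittling loop described above can be sketched as a toy simulation. Everything here is invented for illustration — the gene names, the viability rule, and the idea of testing knockouts as function calls rather than in living cells — so this is a sketch of the logic, not the authors' actual pipeline:

```python
# Toy sketch of the "disrupt one gene at a time and keep tolerated removals" loop.
# Gene names and the viability rule are invented for illustration;
# the real study tested transposon-disrupted genomes in living cells.

ESSENTIAL = {"dnaA", "rpoB", "ftsZ"}   # removing any of these kills the cell
QUASI = {"tufA"}                       # cell survives, but grows too slowly

def viable(genome):
    """Alive: every essential gene is intact."""
    return ESSENTIAL <= genome

def practical(genome):
    """Alive AND growing fast enough to work with in the lab."""
    return viable(genome) and QUASI <= genome

def whittle(genome):
    """Disrupt one gene at a time; keep each removal the cell tolerates."""
    changed = True
    while changed:
        changed = False
        for gene in sorted(genome):
            trial = genome - {gene}    # simulate a transposon hitting this gene
            if practical(trial):       # cell still viable and fast-growing?
                genome = trial         # then the gene was dispensable
                changed = True
    return genome

start = {"dnaA", "rpoB", "ftsZ", "tufA", "lacZ", "motA", "cheY"}
print(sorted(whittle(start)))  # ['dnaA', 'ftsZ', 'rpoB', 'tufA']
```

In the real experiments each "trial" was a laboratory test of a disrupted genome rather than a function call, but the stopping condition is the same: repeat until no further removal leaves a workable cell.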
Along the way, the researchers were able to divide the bacterium’s
genes into three major categories: essential, nonessential and
quasi-essential, meaning they weren’t absolutely required for life but
were necessary to help the cell grow at a healthy pace.
Venter and his colleagues also discovered that the genome contained a
number of redundant genes — pairs of genes that performed the same
function in the cell. These genes made the whittling process a little
confusing at first — if one of the redundant genes was removed (but not
the other), the cell would continue functioning, tricking the
researchers into believing it was a nonessential gene.
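Why redundancy fooled single-gene knockouts can be shown with a similarly hypothetical sketch (again, the gene names and viability rule are invented for illustration):

```python
# Toy illustration of how a redundant gene pair masks essentiality.
# Gene names and the viability rule are invented for illustration.

REDUNDANT_PAIR = ("geneA", "geneB")   # either copy alone keeps the cell alive

def viable(genome):
    """The cell needs at least one gene supplying the shared function."""
    return any(gene in genome for gene in REDUNDANT_PAIR)

genome = {"geneA", "geneB"}

# Single-gene knockouts: each gene looks dispensable on its own...
print(viable(genome - {"geneA"}))            # True
print(viable(genome - {"geneB"}))            # True

# ...but acting on both "nonessential" labels at once kills the cell.
print(viable(genome - {"geneA", "geneB"}))   # False
```

Only a pairwise test exposes the trap, which is why classifying such genes took so much trial and error.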
A great deal of trial and error was required in order for the researchers to classify all the genes.
Finally, though, they reached a point where no more genes could be removed without killing the cell.
The result is the smallest genome ever recorded in a self-replicating
— that means alive and able to divide — cell. It contains just 473
genes, all of which are either directly required to keep the cell alive
or to enable it to grow and divide fast enough to be practical for the
researchers’ experiments.
Interestingly,
about a third of the resulting genome consists of genes with unknown
biological functions. Most of the known essential genes perform
functions related to expressing genes, passing down genetic information
from one generation to the next, or performing essential functions in
the cell’s membrane and cytosol, so the scientists predict that the
unknown genes will have similar jobs — we just don’t know what yet.
“One of the great findings but also the great caveats of this work is
that it allowed them to discover how much we don’t know, even about the
core sections of the genome,” said Adam Arkin, director of the Synthetic Biology Institute at the University of California, Berkeley, in a statement.
That said, Venter also noted that the concept of a minimal cell is context-dependent.
The specific genes that an organism requires to survive — even an
organism as simple as a bacterial cell — depend on what kind of
environment the cell is living in and what kinds of nutrients are
available to it.
And, of course, one species’ minimal genome would likely differ significantly from that of another species.
With that in mind, exploring different forms of minimal genomes could have important industrial applications, said Daniel Gibson, another of the study’s authors and another scientist at the J. Craig Venter Institute, during the same teleconference.
Because these cells are so simple, devote all their energy to
essential functions and are subject to very few genetic mutations, they
are “straightforward to engineer” and could provide helpful insights
into more complex types of biosynthesis in the future, he said.
Still, there’s plenty of work left to be done before the study of minimal genomes may yield practical applications.
“The major limitation is that this is the beginning of a very long road,” said Sriram Kosuri, an assistant professor of biochemistry at UCLA, in a statement.
“It's not as if this new minimal genome will automatically lead to
either fundamental insights or industrial applications immediately. That
said, they've created a self-replicating biological organism that might
be a better starting point for such scientific and engineering goals
than continuing to study natural systems."
Roche says it plans to study role of microbiome in cancer
Vedanta expects more drug companies to enter the field
Top
scientists at Roche Holding AG and AstraZeneca Plc are sizing up
potential allies in the fight against cancer: the trillions of bacteria
that live in the human body.
"Five years ago, if you had asked me
about bacteria in your gut playing an important role in your systemic
immune response, I probably would have laughed it off," Daniel Chen,
head of cancer immunotherapy research at Roche’s Genentech division,
said in a phone interview. "Most of us immunologists now believe that
there really is an important interaction there."
Two recent
studies published in the journal Science have intrigued Chen and others
who are developing medicines called immunotherapies that stimulate the
body’s ability to fight tumors.
In
November, University of Chicago researchers wrote that giving mice
Bifidobacterium, which normally resides in the gastrointestinal tract,
was as effective
as an immunotherapy in controlling the growth of skin cancer. Combining
the two practically eliminated tumor growth. In the second study,
scientists in France found that some bacterial species activated a response to immunotherapy, which didn’t occur without the microbes.
Human Microbiome
That’s increased drugmakers’ interest in the human microbiome
-- the universe of roughly 100 trillion good and bad bacteria, fungi
and viruses that live on and inside the body. Roche is already
undertaking basic research in the field and plans to investigate the
microbiome’s potential for cancer treatment, Chen said.
"Certainly,
we are already scanning the space for interesting opportunities as the
science continues to emerge," he said. "We are very interested in
testing these in a controlled setting."
Some experienced investors
are skeptical and see the possibility of an approved product for cancer
to be at least five years away.
"To therapeutically influence the microbiome long-term in humans is a big hurdle," said Sander van Deventer,
managing partner at venture-capital firm Forbion Capital Partners. "The
microbiome is very stubborn. Everything we’ve done so far has only had a
temporary effect."
Nestle’s Investment
Earlier in his career, van Deventer chaired the department of gastroenterology and hepatology at the Academic Medical Center in Amsterdam, the first clinic in the world to perform fecal transplants, using good bacteria to fight the hospital infection Clostridium difficile. Forbion hasn’t yet invested in any microbiome biotechs, "but we’re looking at all of them all the time," he said.
Efforts are under way to turn bacteria into
regulated pharmaceutical products to treat illnesses of the gut, where
the microbes reside.
Nestle SA last January invested $65 million in Cambridge, Massachusetts-based Seres Therapeutics Inc., which is developing a treatment for Clostridium difficile, an infection of the digestive system. That follows early efforts to harness the microbiome’s benefits, which spawned probiotic foods and supplements as well as transplants of healthy bacteria.
The promise in cancer
will draw more large drugmakers into exploring the human microbiome,
said Bernat Olle, chief executive officer of Vedanta Biosciences, a Boston-based startup.
Treatment Potential
"That’s
the sense we get based on how we’re being approached by new pharma
groups and how serious they seem to be about wanting to enter the
field,” Olle said in a phone interview. Vedanta last year announced a
license agreement with Johnson & Johnson on its experimental microbiome drug for inflammatory bowel disease.
Another startup, 4D Pharma Plc, in November said
it had discovered a bacterium that produces a response comparable to
that of an immunotherapy in animal tests for breast and lung cancers.
The London-listed company plans to start trials in patients by the end
of this year. To support research in autoimmune and neurological
diseases, in addition to cancer, the company has raised over 100 million
pounds ($140 million) from investors over the last two years, CEO
Duncan Peyton said in a phone interview.
French biotech Enterome
is taking a different approach: developing treatments based on bacterial
secretions. Enterome plans to close a private financing round of about
15 million euros this month, according to CEO Pierre Belichard. More
news may be on the way.
‘Active Discussions’
"We are in active discussions with the usual suspects in the immunotherapy space," Belichard said in an interview in London.
Those
active in the field include a wide range of pharma companies including
AstraZeneca, Roche, Bristol-Myers Squibb Co., and Merck & Co.
"Personally,
I think it’s a fascinating area," Susan Galbraith, head of oncology
research at AstraZeneca, said in an interview in London.
Studies
have shown that immunotherapies have varying degrees of success even in
genetically identical mice, and the Science study from Chicago suggests
that the diversity of the microbiome may help explain that variability,
Galbraith said. AstraZeneca isn’t conducting its own research in the
area and would prefer to wait to see evidence in human trials before
getting involved, she said.
The sheer number of bacteria, some of which could actually switch off an immune response, and the question of how many bacteria are needed, make it a complex area of research, Roche’s Chen said. It’s possible that the same bacteria could induce both harmful and helpful responses, depending on the patient, he said.
Still, "it’s one of the most interesting developments we’ve seen in science over the last several years," he said.
The oldest ever human nuclear DNA to be reconstructed and sequenced
reveals Neanderthals in the making – and the need for a possible rewrite
of our own origins.
The 430,000-year-old DNA comes from mysterious early human fossils found in Spain’s Sima de los Huesos, or “pit of bones”.
The fossils look like they come from ancestors of the Neanderthals, which evolved some 100,000 years later. But a 2013 study found that their mitochondrial DNA is more similar to that of Denisovans (see video, below), who also lived later and thousands of kilometres away, in southern Siberia.
So who were the Sima people – and how are they related to us?
To find out, a team led by Matthias Meyer
at the Max Planck Institute for Evolutionary Anthropology in Leipzig,
Germany, pieced together parts of the hominin’s nuclear DNA from samples
taken from a tooth and a thigh bone.
The results suggest they are more closely related to ancestors of
Neanderthals than those of Denisovans – meaning the two groups must have
diverged by 430,000 years ago. This is much earlier than the
geneticists had expected.
It also alters our own timeline. We know that Denisovans and
Neanderthals shared a common ancestor that had split from our modern
human lineage. In light of the new nuclear DNA evidence, Meyer’s team
suggests this split might have happened as early as 765,000 years ago.
Previous DNA studies had dated this split to just 315,000 to 540,000 years ago, says Katerina Harvati-Papatheodorou at the University of Tubingen in Germany.
But a date of 765,000 years ago actually brings the DNA evidence more in line
with some recent fossil interpretations that also suggest an older
divergence between modern humans and the ancestor of the Neanderthals
and Denisovans.
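Divergence dates like these are estimated by counting sequence differences between lineages and assuming a mutation rate. As a back-of-envelope sketch only (the divergence and rate values below are invented for illustration, not the figures used in the study):

```python
# Back-of-envelope divergence dating (all numbers invented for
# illustration). Observed per-site divergence d accumulates along
# both lineages, each mutating at rate mu per site per year, so:
#     t = d / (2 * mu)

def divergence_time_years(per_site_divergence, mu_per_site_per_year):
    """Years since two lineages split, assuming a constant molecular clock."""
    return per_site_divergence / (2 * mu_per_site_per_year)

# Hypothetical inputs: 0.00077 differences per site and a rate of
# 5e-10 mutations per site per year.
t = divergence_time_years(0.00077, 5e-10)
print(f"{t:,.0f} years")  # → 770,000 years
```

Real estimates also have to account for ancestral population size and rate uncertainty, which is why published dates come with wide ranges like 315,000 to 540,000 years.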
“I am very happy to see that ideas about the divergence based on
ancient DNA and on anatomical studies of the fossil record seem to be
converging,” says Aida Gómez-Robles at George Washington University in Washington DC, who was involved in the fossil research.
Tree redrawn?
But if such an ancient split is correct, we might have to redraw parts of our evolutionary tree.
Conventional thinking is that modern humans, Neanderthals and Denisovans all evolved from an ancient hominin called Homo heidelbergensis.
However, H. heidelbergensis didn’t evolve until 700,000 years ago – potentially 65,000 years after the split between modern humans and the Neanderthals and Denisovans.
Instead, another, obscure species called Homo antecessor might now be in the frame as our common ancestor.
This species first appeared more than a million years ago – and its face is very similar to that of modern humans, says Chris Stringer at the Natural History Museum in London.
Further puzzles
“Research must now refocus on fossils from 400,000 to 800,000 years
ago to determine which ones might actually lie on the respective
ancestral lineages of Neanderthals, Denisovans and modern humans,” he
says.
Another puzzle remains. The study confirmed a previous finding that
the mitochondrial DNA of the Sima hominin is more similar to Denisovans
than to Neanderthals – but no one knows why.
Perhaps there was another unidentified lineage of hominins in Eurasia
that interbred with the ancestors of both – but not with the particular
group of hominins that evolved into the Neanderthals.
Or, Meyer says, perhaps such mitochondrial DNA was typical of early Neanderthals and Denisovans, and it was only later that Neanderthals acquired different mitochondrial DNA from an African population of “proto-Homo sapiens”.
Journal reference: Nature, DOI: 10.1038/nature17405
Neanderthal diet: Only 20 percent vegetarian
Researchers have long debated the precise
diet of early humans, but the latest study is the first to nail down
precise percentages.
Fossil analysis suggests Neanderthals ate a diet of 80 percent meat.
TUBINGEN, Germany, March 14 (UPI) --
Neanderthals were apparently too busy hunting and scavenging to pay much
attention to Michael Pollan's dietary advice: eat mostly plants.
New isotopic analysis suggests Neanderthals ate mostly meat. As detailed in a new study published in the journal Quaternary International, the Neanderthal diet consisted of 80 percent meat and 20 percent plants.
Researchers in Germany measured isotope concentrations of collagen in
Neanderthal fossils and compared them to the isotopic signatures of
animal bones found nearby. In doing so, scientists were able to compare
and contrast the diets of early humans and their mammalian neighbors,
including mammoths, horses, reindeer, bison, hyenas, bears, lions and others.
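The underlying logic of such isotope work can be illustrated with a standard two-endmember mixing model. This is a simplified sketch, not the study’s actual calculation, and every δ15N value and the trophic-enrichment factor in it are invented for illustration:

```python
# Toy two-endmember isotope mixing model (illustrative only; the
# delta-15N values and trophic enrichment below are invented, not
# measurements from the study).

def meat_fraction(consumer_d15n, plant_d15n, prey_d15n, enrichment=4.0):
    """Estimate the fraction of dietary protein derived from meat.

    Collagen delta-15N sits roughly one trophic-level enrichment
    above the diet, so subtract it before solving the linear
    mixing equation:
        diet_d15n = f * prey_d15n + (1 - f) * plant_d15n
    """
    diet_d15n = consumer_d15n - enrichment
    return (diet_d15n - plant_d15n) / (prey_d15n - plant_d15n)

# Hypothetical signatures: plants at 2 permil, herbivore prey at
# 6 permil, Neanderthal collagen at 9.2 permil.
f = meat_fraction(9.2, 2.0, 6.0)
print(f"estimated meat fraction: {f:.0%}")  # → 80%
```

Comparing the Neanderthal values against the co-located animal bones, as the researchers did, anchors the endmembers to the same local ecosystem.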
"Previously, it was assumed that the Neanderthals utilized the same
food sources as their animal neighbors," lead researcher Herve
Bocherens, a professor at the University of Tubingen's Senckenberg
Center for Human Evolution and Palaeoenvironment, said in a news release.
"However, our results show that all predators occupy a very specific
niche, preferring smaller prey as a rule, such as reindeer, wild horses
or steppe bison, while the Neanderthals primarily specialized on the
large plant-eaters such as mammoths and woolly rhinoceroses," Bocherens
explained.
All of the Neanderthal and animal bones, dated between 45,000 and
40,000 years old, were collected from two excavation sites in Belgium.
Bocherens and his colleagues are hopeful their research will shed light on the Neanderthals' extinction some 40,000 years ago.
"We are accumulating more and more evidence that diet was not a
decisive factor in why the Neanderthals had to make room for modern
humans," he said.
Humans Interbred With Hominins on Multiple Occasions, Study Finds
The
ancestors of modern humans interbred with Neanderthals and another
extinct line of humans known as the Denisovans at least four times in
the course of prehistory, according to an analysis of global genomes
published on Thursday in the journal Science.
The interbreeding may have given modern humans genes that bolstered immunity to pathogens, the authors concluded.
“This
is yet another genetic nail in the coffin of our over-simplistic models
of human evolution,” said Carles Lalueza-Fox, a research scientist at
the Institute of Evolutionary Biology in Barcelona who was not involved
in the study.
The
new study expands on a series of findings in recent years showing that
the ancestors of modern humans once shared the planet with a surprising
number of near relatives — lineages like the Neanderthals and Denisovans
that became extinct tens of thousands of years ago.
Before
disappearing, however, they interbred with our forebears on at least
several occasions, and today we carry DNA from these encounters.
Later studies showed that the forebears of modern humans first encountered Neanderthals after expanding out of Africa more than 50,000 years ago.
But
the Neanderthals were not the only extinct humans that our own
ancestors found. A finger bone discovered in a Siberian cave, called
Denisova, yielded DNA from yet another group of humans.
Research
later indicated that all three groups — modern humans, Neanderthals and
Denisovans — shared a common ancestor who lived roughly 600,000 years
ago. And, perhaps no surprise, some ancestors of modern humans also
interbred with Denisovans.
Some
of their DNA has survived in people in Melanesia, a region of the
Pacific that includes New Guinea and the islands around it.
Those
initial discoveries left major questions unanswered, such as how often
our ancestors interbred with Neanderthals and Denisovans. Scientists
have developed new ways to study the DNA of living people to tackle
these mysteries.
Joshua
M. Akey, a geneticist at the University of Washington, and his
colleagues analyzed a database of 1,488 genomes from people around the
world. The scientists added 35 genomes from people in New Britain and
other Melanesian islands in an effort to learn more about Denisovans in
particular.
The
researchers found that all the non-Africans in their study had
Neanderthal DNA, while the Africans had very little or none. That
finding supported previous studies.
But
when Dr. Akey and his colleagues compared DNA from modern Europeans,
East Asians and Melanesians, they found that each population carried its
own distinctive mix of Neanderthal genes.
The first encounter happened when the common ancestor of all non-Africans interbred with Neanderthals.
The
second occurred among the ancestors of East Asians and Europeans, after
the ancestors of Melanesians split off. Later, the ancestors of East
Asians — but not Europeans — interbred a third time with Neanderthals.
Earlier
studies had hinted at the possibility that the forebears of modern
humans had multiple encounters with Neanderthals, but until now hard
data was lacking.
“A
lot of people have been arguing for that, but now they’re really
providing the evidence for it,” said Rasmus Nielsen, a geneticist at the
University of California, Berkeley, who was not involved in the new
study.
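Real introgression scans model haplotype structure, allele frequencies and demographic history, but the core idea behind comparing modern genomes with an archaic reference can be caricatured in a few lines. All genotypes below are invented toy data, not anything from the study:

```python
# Naive illustration of archaic allele matching (invented genotypes;
# real introgression scans model haplotypes and demography).

def archaic_match_fraction(sample, archaic, modern_ref):
    """Fraction of informative sites where the sample carries the
    archaic allele, counting only positions where the archaic genome
    differs from a modern African reference."""
    informative = 0
    matches = 0
    for s, a, m in zip(sample, archaic, modern_ref):
        if a != m:          # site distinguishes archaic from modern
            informative += 1
            if s == a:      # sample carries the archaic variant
                matches += 1
    return matches / informative if informative else 0.0

# Toy 0/1-coded alleles at ten sites:
neanderthal = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
african_ref = [0, 0, 0, 1, 0, 0, 0, 1, 0, 1]
european    = [1, 0, 0, 1, 0, 0, 0, 1, 1, 1]

print(archaic_match_fraction(european, neanderthal, african_ref))  # → 0.4
```

Running this kind of comparison separately for Europeans, East Asians and Melanesians is, in spirit, how population-specific mixes of archaic ancestry are detected.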
Oh how the mighty dinosaurs
have fallen. It’s a bit sad that the descendants of the magnificent
creatures who once ruled the Earth have stubby wings and are the most
commonly consumed meat in America (yes, I’m talking about the chicken).
Such is the circle of life. And now, in an attempt to restore a bit of
the glory of dinosaurs (or just create a truly bizarre looking animal),
scientists have genetically modified chickens to give them dinosaur
legs. Because science.
Interestingly enough, because of the close genetic relationship the
modern day chicken shares with the prehistoric giant, the researchers
involved with the wacky task simply had to silence a gene that chickens
typically express. No gene insertion or further manipulation — just a
(highly complex) flip of a switch.
The precise gene suppressed by the Chilean scientists, headed by João Botelho at Universidad de Chile, is called Indian hedgehog. This gene is crucial to the development of the chicken’s bones, and when turned off, apparently allows the birds to develop a bone structure that looks just like the lower leg of a raptor. Chicken on top, dinosaur on the bottom.
This is by no means the first time that Botelho or other scientists
have engineered a bird to go back to its more magnificent origins.
Botelho also managed to undo the backward-facing perching toe common in
birds to produce a front-facing toe — much like what dinosaurs had. And
at Yale, a chicken was given a dinosaur-esque snout when its gene
expression was altered at the embryo stage.
This sort of work is taking place across the country, and indeed, across the world, says Jack Horner, a famous paleontologist whose expertise was consulted in each and every one of the Jurassic Park films.
At his lab at Montana State University, scientists are working to “genetically alter a chicken egg to produce a more prehistoric version of the animal, complete with velociraptor-shaped head, arms, clawed hands and long tail,” the Post Register reports. But don’t worry, researchers say that we won’t be plunged into a real-life version of the movies anytime soon.
“The experiments are focused on single traits, to test specific hypotheses,” says Alexander Vargas, who heads the lab in which Botelho works. “Not only do we know a great deal about bird development, but also about the dinosaur-bird transition, which is well-documented by the fossil record. This leads naturally to hypotheses on the evolution of development, that can be explored in the lab.”
Just call it scientific curiosity, and enjoy the strange but wonderful results that have come out of it … thus far.
They probably hid from feathered dinosaurs, only to end up stuck in redwood sap.
A new collection of 12 lizards preserved in amber dates back to middle of the Cretaceous period – when dinosaurs such as the massive Argentinosaurus were still around – and may include the ancestors of geckos and chameleons.
The specimens come from Myanmar’s Kachin state and are thought to
have lived in tropical forest. Each is embedded in Burmese amber, which
previous studies dated to about 100 million years old. Previously, we
knew of only a few fragments of amber lizards from the time of the dinosaurs – when modern lizard groups first evolved, according to genetic analyses.
The lizards, discovered in private amber collections on loan to the
American Museum of Natural History and Harvard University, are
immaculate and unusually diverse. As such they suggest that major lizard
groups were already established at that time. The specimens will now go on display at the Houston Museum of Natural Science.
“One of them is perhaps the best fossil gecko that is known in the world,” says Juan Daza of Sam Houston State University in Texas, whose team revealed the finds, and then used CT scans to study them. It was so detailed the team initially thought it looked like a modern animal.
But it wasn’t recent. “We started looking at the characteristics we
describe in modern species, and none of those match,” Daza says. The adhesive toe pads are already present in these ancient specimens, suggesting the gecko’s climbing lifestyle evolved much earlier than thought.
Another specimen has its tongue stuck out. With a narrow, extended tip, it matches no snake or lizard tongue ever found.
One small lizard is trapped
next to a scorpion-like animal and a millipede. That proximity, plus
the fact that modern lizards in tropical forests hunt arthropods,
suggest these animals preyed on them, Daza says.
That particular lizard is doubly interesting. Its bone structure
resembles that of a newborn chameleon, although it is about four times
the age of the oldest chameleon-like fossils previously known.
It even has a weak jaw, which wouldn’t be good for biting prey – possible evidence that the modern chameleon’s method of grabbing prey with a projectile tongue is really an old adaptation, Daza says. The find may also challenge the current view that chameleons originated in Africa.
The new specimens are beautiful and very exciting, says Michael Caldwell
of the University of Alberta in Edmonton, Canada. “We really have had
little to no previous fossil record detailing that part of the family
tree of lizards,” he says.
But closer anatomical studies are now needed to determine where each
lizard is best classified – especially the putative chameleon, he adds.
By Brooks Hays
| Feb. 29, 2016 at 10:32 AM
A rendering offers an idea of how large abelisaurs were.
LONDON, Feb. 29 (UPI) -- After
re-examining a fossilized femur bone belonging to an abelisaur specimen,
researchers can say with more certainty how large these fearsome
predators could become.
Based on their analysis, researchers at Imperial College London
believe the femur belonged to an abelisaur weighing nearly two metric
tons and stretching nine meters, or almost 30 feet. Those dimensions
make it one of the largest abelisaurs ever found.
The new research was detailed in the journal PeerJ.
"Smaller abelisaur fossils have been previously found by paleontologists, but this find shows how truly huge these flesh eating predators had become," study co-author Alessandro Chiarenza said in a press release.
"Their appearance may have looked a bit odd as they were probably
covered in feathers with tiny, useless forelimbs, but make no mistake
they were fearsome killers in their time."
Abelisauridae dinosaurs made up for their tiny forelimbs and odd
appearance with massive hindquarters and deadly sharp teeth. They
thrived in what is now northern Africa some 95 million years ago, though
abelisaur fossils have been dated as far back as 170 million years ago
and as recently as 66 million years ago.
The femur was originally found in a Moroccan deposit known as Kem Kem
Beds -- famous for its abundance of predatory dino bones. The site has
confounded researchers who believe it would have been impossible for so
many carnivorous dinosaurs to coexist in such tight quarters.
New analysis suggests the sometimes violent geologic conditions that
created Kem Kem Beds may have also mixed up the strata and chronology of
the fossil record.
Other sites suggest abelisaurs were inland hunters, somewhat separated from their closest cousins, who preferred to hunt fish near lakes and rivers.
"This fossil find, along with the accumulated wealth of previous
studies, is helping to solve the question of whether abelisaurs may have
co-existed with a range of other predators in the same region,"
Chiarenza explained. "Rather than sharing the same environment, which
the jumbled up fossil records may be leading us to believe, we think
these creatures probably lived far away from one another in different
types of environments."
The fossil was not recently unearthed, but had been sitting in a
museum drawer for several decades -- further proof that closeted
collections hide nearly as many secrets as untouched earth.
"While palaeontologists usually venture to remote and inaccessible
locations, like the deserts of Mongolia or the Badlands of Montana,"
added Andrea Cau, study co-author and researcher at the University of
Bologna, "our study shows how museums still play an important role in
preserving specimens of primary scientific value, in which sometimes the
most unexpected surprises can be discovered."