Monday, April 30, 2012

Sperm Viability Greatly Reduced in Offspring of Animals Treated With Common Antibiotic Tetracycline

 
In a paper published April 27 in Nature's open access journal Scientific Reports, researchers at the University of Nevada, Reno report that male pseudoscorpions treated with the antibiotic tetracycline suffer significantly reduced sperm viability and pass this toxic effect on to their untreated sons. They suggest that a similar effect could occur in humans and other species.
"This is the first research to show a transgenerational effect of antibiotics," David Zeh, chair of the Department of Biology in the College of Science, said. "Tetracycline has a significant detrimental effect on male reproductive function and sperm viability of pseudoscorpions - reducing viability by up to 25 percent - and now we know that effect is passed on to the next generation. We didn't see the effect in subsequent generations."
The research involved a three-generation study of the pseudoscorpion, Cordylochernes scorpioides, a small scorpion-like arachnid. To control for genetic influences, in the first generation, brothers and sisters from each of 21 broods were either treated with weekly doses of tetracycline from birth to adulthood or were reared as untreated controls. Subsequent generations were not treated with tetracycline. The antibiotic had no effect on male or female body size, sperm number or female reproduction, they found.
In the article, lead author and assistant biology professor Jeanne Zeh surmises that tetracycline may induce epigenetic changes in male reproductive tissues that are then passed to sons -- changes that do not alter the DNA sequence itself but rather the way genes are expressed.
The broad-spectrum antibiotic tetracycline is commonly used in animal production, antimicrobial therapy, and for curing arthropods infected with bacterial endosymbionts such as Wolbachia. Despite more than six decades of therapeutic and agricultural use that has resulted in the evolution of widespread bacterial resistance, tetracycline is still commonly used as an additive in animal feed and as an accessible antimicrobial therapy in developing countries.
The research involved University of Nevada, Reno undergraduate and graduate students and was part of a project, funded by the National Science Foundation, which is investigating factors contributing to low male fertility.

Journal Reference:
  1. Jeanne A. Zeh, Melvin M. Bonilla, Angelica J. Adrian, Sophia Mesfin, David W. Zeh. From father to son: transgenerational effect of tetracycline on sperm viability. Scientific Reports, 2012; 2 DOI: 10.1038/srep00375
Courtesy: ScienceDaily


Saturday, April 28, 2012

Families That Eat Together May Be the Healthiest, New Evidence Confirms

"Come and get it!" A phrase historically proclaiming that the communal meal is ready, is heard all too infrequently among contemporary American households, especially as children get older. Indeed, over 40% of the typical American food budget is spent on eating out, with family meals often being relegated to holidays and special occasions. Aside from negative effects on the family budget, eating out has been shown to be generally associated with poor food choices and bad health. Of particular interest to public health experts is growing scientific evidence that fewer family meals may translate to increased obesity risk and poor nutritional status, especially among children.

But getting this message out to busy parents in a way that will convince them to spend more time at the dining room table with their children is problematic at best.
To both summarize what is known about this timely topic and create a model that might be used to educate parents and other caregivers as to the importance of family mealtimes, researchers at Rutgers recently evaluated results from 68 previously published scientific reports considering the association between family mealtime and children's health. They specifically looked at how frequency or atmosphere of family meals was related to consumption of both healthy foods (e.g., fruits and vegetables) and those considered less desirable (e.g., soft drinks). The researchers also evaluated if scientific evidence actually supports the idea that more frequent family meals can lead to decreased obesity.
Their review of the literature revealed numerous benefits to children associated with having frequent family meals, including increased intake of fruits, vegetables, fiber, calcium-rich foods, and vitamins. In addition, the more a family ate together the less children consumed dietary components thought to be harmful to health. Although the researchers found only a weak link between family meals and obesity risk, children in families with frequent family meals tended to have lower body mass index (BMI, kg/m2) than those who enjoyed fewer family meals.
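BMI, mentioned above, is simply weight in kilograms divided by height in metres squared (kg/m²). As a purely illustrative sketch with invented numbers (not the study's data), the kind of group comparison the review describes might look like this:

```python
# Illustrative only: BMI (kg/m^2) for hypothetical children, comparing
# a frequent-family-meal group with an infrequent one. All values invented.

def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# Hypothetical (weight_kg, height_m) pairs for two groups of children
frequent_meals = [(32.0, 1.35), (30.5, 1.32), (34.0, 1.40)]
infrequent_meals = [(38.0, 1.34), (36.5, 1.31), (40.0, 1.38)]

def mean_bmi(group):
    return sum(bmi(w, h) for w, h in group) / len(group)

print(f"frequent:   mean BMI = {mean_bmi(frequent_meals):.1f}")
print(f"infrequent: mean BMI = {mean_bmi(infrequent_meals):.1f}")
```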
The research team was also able to create a simple conceptual image that condensed their findings in a user-friendly fashion, and hope to test the effectiveness of this graphic with parents and other caregivers in the near future. According to the scientists, "Images like this one will be a helpful method to demonstrate the benefits identified in scientific literature to parents in a concise, non-biased method. Often parents will hear tidbits about family meal benefits here and there, but we hope that something like this may be useful to provide information from a reliable source."
Clearly, the scientific literature represents a vast store of valuable information that could help families make better decisions about food choices. However, many people do not have the time, inclination, or expertise needed to access, filter, and interpret these scientific reports. Instead, they must often rely on media "headlines" that focus on a single study or, worse, do not accurately report the research that has been conducted. The authors of this new report hope that their "synthesis of the literature of the links between family meals and child health outcomes and creation of a parent-friendly image that visually summarized these findings will lead to interventions that benefit a wide range of children."

Story Source:
The above story is reprinted from materials provided by the Federation of American Societies for Experimental Biology (FASEB), via Newswise.

Courtesy: ScienceDaily


Thursday, April 26, 2012

New Genes Contributing to Autism Discovered; Genetic Links Between Neurodevelopment and Psychiatric Disorders


 A new approach to investigating hard-to-find chromosomal abnormalities has identified 33 genes associated with autism and related disorders, 22 for the first time. Several of these genes also appear to be altered in different ways in individuals with psychiatric disorders such as schizophrenia, symptoms of which may begin in adolescence or adulthood. Results of the study by a multi-institutional research team will appear in the April 27 issue of Cell and have been released online.

"By sequencing the genomes of a group of children with neurodevelopmental abnormalities, including autism, who were also known to have abnormal chromosomes, we identified the precise points where the DNA strands are disrupted and segments exchanged within or between chromosomes. As a result, we were able to discover a series of genes that have a strong individual impact on these disorders," says James Gusella, PhD, director of the Massachusetts General Hospital Center for Human Genetic Research (MGH CHGR) and senior author of the Cell paper. "We also found that many of these genes play a role in diverse clinical situations -- from severe intellectual disability to adult-onset schizophrenia -- leading to the conclusion that these genes are very sensitive to even subtle perturbations."
Physicians evaluating children with neurodevelopmental abnormalities often order tests to examine their chromosomes, but while these tests can detect significant abnormalities in chromosomal structure, they typically cannot identify a specific gene as being disrupted. Structural variants known as balanced chromosome abnormalities (BCAs) -- in which DNA segments are moved into different locations in the same chromosome or exchanged with segments in other chromosomes, leaving the overall size of the chromosomes unchanged -- are known to be significantly more common in individuals with autism spectrum disorders than in a control population. Several years ago Gusella and Cynthia Morton, PhD, of Brigham and Women's Hospital initiated the Developmental Genome Anatomy Project to identify developmentally important genes by investigating BCAs, but the task of identifying specific chromosome breakpoints has been slow and laborious.
To get a clearer view of the potential impact of BCAs on autism, the research team took advantage of a new approach developed by Michael Talkowski, PhD, of the MGH CHGR, lead author of the Cell paper, which allows the sequencing of an individual's entire genome in a way that detects the breakpoints of BCAs. The whole procedure can be accomplished in less than two weeks rather than the many months previously required. Screening the genomes of 38 individuals diagnosed with autism or other neurodevelopmental disorders found chromosomal breakpoints and rearrangements in non-protein-coding regions that disrupted 33 genes, only 11 of which previously had been suspected in these disorders.
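The published sequencing pipeline is far more sophisticated, but the core idea of localizing a translocation breakpoint from discordantly mapped read pairs can be sketched in toy form. All chromosome names and read coordinates below are invented:

```python
# Highly simplified sketch (not the published pipeline): in paired-end
# sequencing, the two reads of a pair should map close together on the
# same chromosome. Pairs whose mates map to different chromosomes hint
# at a translocation breakpoint; clustering such pairs localizes it.
from collections import defaultdict

# (chrom1, pos1, chrom2, pos2) for hypothetical read pairs
read_pairs = [
    ("chr2", 100_200, "chr2", 100_550),   # concordant
    ("chr2", 305_100, "chr7", 88_020),    # discordant
    ("chr2", 305_400, "chr7", 88_310),    # discordant, same junction
    ("chr2", 305_250, "chr7", 88_150),    # discordant, same junction
    ("chr5", 10_000, "chr5", 10_400),     # concordant
]

clusters = defaultdict(list)
for c1, p1, c2, p2 in read_pairs:
    if c1 != c2:                          # mates on different chromosomes
        clusters[(c1, c2)].append((p1, p2))

# Require several supporting pairs before calling a candidate breakpoint
for (c1, c2), hits in clusters.items():
    if len(hits) >= 3:
        b1 = sum(p for p, _ in hits) // len(hits)
        b2 = sum(p for _, p in hits) // len(hits)
        print(f"candidate breakpoint: {c1}:{b1} <-> {c2}:{b2}")
```

Real callers must also handle intrachromosomal rearrangements, sequencing error, and repetitive regions, which is part of why the task was historically so laborious.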
As they compiled their results, the researchers were struck by how many of the BCA-disrupted genes they identified had been associated with psychiatric disorders in previous studies. To test their observation, they examined data from the largest genome-wide association study in schizophrenia to date -- in collaboration with Mark Daly, PhD, also of the MGH CHGR who led that study -- and found that a significant number of the BCA-disrupted genes identified in the current study were associated with schizophrenia when altered by more subtle variants that are common in the population.
"The theory that schizophrenia is a neurodevelopmental disorder has long been hypothesized, but we are just now beginning to uncover specific portions of the genetic underpinnings that may support that theory," says Talkowski. "We also found that different gene variations -- deletion, duplication or inactivation -- can result in very similar effects, while two similar changes at the same site might have very different neurodevelopmental manifestations. We suspected that the genetic causes of autism and other neurodevelopmental abnormalities are complex and likely to involve many genes, and our data support this."
Adds Gusella, who is the Bullard Professor of Neurogenetics at Harvard Medical School, "Our results suggest that many genes and pathways are important to normal brain development and that perturbation of some can lead to a great variety of developmental or psychiatric conditions, warranting extensive further study. We're hoping to investigate how these gene disruptions alter other genes and pathways and how prevalent these rearrangements are in the general population. This is a first step in what will be a long journey toward understanding genes underlying the pathophysiology of neurodevelopmental and psychiatric disorders and developing new clinical treatments."
Researchers from 15 institutions in three countries -- including Massachusetts General Hospital, the Broad Institute, Brigham and Women's Hospital and Harvard Medical School -- collaborated with Talkowski, Gusella, Morton and Daly on the investigation. Support for the study includes grants from the National Institutes of Health, the Simons Foundation Autism Research Initiative and Autism Speaks.

Journal Reference:
  1. Michael E. Talkowski, Jill A. Rosenfeld, Ian Blumenthal, Vamsee Pillalamarri, Colby Chiang, Adrian Heilbut, Carl Ernst, Carrie Hanscom, Elizabeth Rossin, Amelia M. Lindgren, Shahrin Pereira, Douglas Ruderfer, Andrew Kirby, Stephan Ripke, David J. Harris, Ji-Hyun Lee, Kyungsoo Ha, Hyung-Goo Kim, Benjamin D. Solomon, Andrea L. Gropman, Diane Lucente, Katherine Sims, Toshiro K. Ohsumi, Mark L. Borowsky, Stephanie Loranger, Bradley Quade, Kasper Lage, Judith Miles, Bai-Lin Wu, Yiping Shen, Benjamin Neale, Lisa G. Shaffer, Mark J. Daly, Cynthia C. Morton, James F. Gusella. Sequencing Chromosomal Abnormalities Reveals Neurodevelopmental Loci that Confer Risk across Diagnostic Boundaries. Cell, 2012; DOI: 10.1016/j.cell.2012.03.028
Courtesy: ScienceDaily


Tuesday, April 24, 2012

Bacteria Evolved Way to Safeguard Crucial Genetic Material



Just as banks store away only the most valuable possessions in the most secure safes, cells prioritise which genes they guard most closely, researchers at the European Molecular Biology Laboratory's European Bioinformatics Institute (EMBL-EBI) have found. The study, just published online in Nature, shows that bacteria have evolved a mechanism that protects important genes from random mutation, effectively reducing the risk of self-destruction. The findings answer a question that has been under debate for half a century and provide insights into how disease-causing mutations arise and pathogens evolve.

"We discovered that there must be a molecular mechanism that preferentially protects certain areas of the genome over others," says Nicholas Luscombe, who led the research at EMBL-EBI. "If we can identify the proteins involved and uncover how this works, we will be even closer to understanding how mutations that lead to diseases like cancer can be prevented."
Mutations are the reason each of us is unique. These changes to our genetic material are at the root of variation between individuals, and between cells within individuals. But they also have a darker side. If a mutation affects an important gene -- for example, rendering a tumour-suppressing gene useless -- it can have disastrous consequences. Nevertheless, protecting all genes from mutation would use up too many of the cell's resources, just as holding all deposits in maximum-security safes would be prohibitively expensive. Iñigo Martincorena, a PhD student in Luscombe's lab, has now found that cells evolved a 'risk management' strategy to address this issue.
Looking at 120,000 tiny genetic mutations called single nucleotide polymorphisms (SNPs) in 34 strains of the bacterium E. coli, the scientists were able to quantify how random the mutation rate was in different areas of the bacterial genomes. Their results showed that key genes mutate at a much lower rate than the rest of the genetic material, which decreases the risk of such genes suffering a detrimental mutation. "We were struck by how variable the mutation rate appears to be along the genome," says Martincorena. "Our observations suggest these bacteria have evolved a clever mechanism to control the rate of evolution in crucial areas of the genome."
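As a rough illustration of the kind of comparison described (not the authors' actual population-genetics analysis), one can contrast SNP density between "key" genes and the rest of the genome. All gene coordinates and SNP positions here are invented:

```python
# Toy sketch: compare SNP density (SNPs per kilobase) between "key"
# and other gene regions to see variation in mutation rate.
# All coordinates and SNP positions are invented.

genes = {
    # name: (start, end, is_key_gene)
    "geneA": (0, 1000, True),
    "geneB": (1000, 2000, False),
    "geneC": (2000, 3000, True),
    "geneD": (3000, 4000, False),
}

# Hypothetical SNP positions along the genome
snps = [150, 980, 1020, 1100, 1450, 1800, 2500, 3050, 3300, 3600, 3900]

def snp_density(start, end, positions):
    """SNPs per kilobase within [start, end)."""
    count = sum(start <= p < end for p in positions)
    return 1000.0 * count / (end - start)

key = [snp_density(s, e, snps) for s, e, k in genes.values() if k]
other = [snp_density(s, e, snps) for s, e, k in genes.values() if not k]

print("mean density, key genes:  ", sum(key) / len(key))
print("mean density, other genes:", sum(other) / len(other))
```

In this invented example the key genes show the lower density, mimicking the pattern the researchers report; the real analysis additionally had to separate mutation rate from the filtering effect of natural selection.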
Using population genetics techniques, the researchers were able to disentangle the effects of mutation rate and natural selection on mutations, settling a long-standing debate in the field. Scientists have long thought that the chances of a mutation occurring were independent of its value to an organism. Once the mutation had occurred, it would undergo natural selection, spreading through the population or being eliminated depending on how beneficial or detrimental the genetic change proved to be.
"For many years in evolution there has been an assumption that mutations occur randomly, and that selection 'cleans them up'," explains Martincorena. "But what we see here suggests that genomes have developed mechanisms to avoid mutations in regions that are more valuable than others."
Observations from studies of cancer genomes suggest that similar mechanisms may be involved in the development of cancers, so Luscombe and colleagues would now like to investigate exactly how this risk-managing gene protection works at a molecular level, and what role it may play in tumour cells.

Journal Reference:
  1. Iñigo Martincorena, Aswin S. N. Seshasayee, Nicholas M. Luscombe. Evidence of non-random mutation rates suggests an evolutionary risk management strategy. Nature, 2012; DOI: 10.1038/nature10995
Courtesy: ScienceDaily


Friday, April 20, 2012

New Study Examines Risks and Benefits of the First Line Treatment for Diabetes

Although the drug metformin is considered the gold standard in the management of type 2 diabetes, a study by a group of French researchers published in this week's PLoS Medicine suggests that its long-term benefits, weighed against its risks, are not clearly established. This is an important finding given that thousands of people around the world regularly take metformin to help control their blood sugar levels in the belief that it also confers long-lasting health benefits.

For the past 14 years, metformin has been recommended as the first-line treatment for type 2 diabetes after a landmark study (the UK Prospective Diabetes Study) reported that when combined with dietary control measures, metformin reduced death from all causes in overweight people with type 2 diabetes. However, an overlooked finding from this study was that in non-overweight people with type 2 diabetes, metformin may actually increase the risk of death.
In this new analysis, the authors led by Catherine Cornu from the Clinical Investigation Centre, in Lyon, France, analysed the data available from all relevant studies to re-evaluate the balance of the benefits versus the risks of taking metformin for type 2 diabetes.
Using information from 13 randomized controlled trials (which included a total of more than 13,000 patients) the authors found that compared to other drugs, metformin had no effect on the risk of death from all causes or on the risk of death from cardiovascular disease. Furthermore, metformin had no significant effect on the risk of developing cardiovascular conditions such as heart attacks, strokes, and heart failure.
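To show what pooling results across trials involves, here is a minimal fixed-effect (inverse-variance) meta-analysis of log risk ratios. The study estimates below are invented and are not the trial data analysed by the authors; this only illustrates the pooling step:

```python
import math

# Toy fixed-effect (inverse-variance) meta-analysis of log risk ratios.
# (log risk ratio, standard error) for hypothetical trials -- NOT the
# data from the 13 trials analysed in the paper.
studies = [(-0.05, 0.10), (0.02, 0.08), (0.10, 0.15), (-0.01, 0.12)]

# Each study is weighted by the inverse of its variance
weights = [1.0 / se ** 2 for _, se in studies]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

rr = math.exp(pooled_log_rr)
ci_low = math.exp(pooled_log_rr - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_rr + 1.96 * pooled_se)

print(f"pooled RR = {rr:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

A 95% confidence interval that spans a risk ratio of 1.0, as in this invented example, corresponds to "no significant effect" on the outcome, which mirrors the shape of the paper's conclusion about mortality.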
The authors conclude: "We cannot exclude beyond any reasonable doubt that metformin use increases or decreases the risk of all-cause mortality or cardiovascular mortality."
They explain: "The specific efficacy of metformin to prevent death or cardiovascular events has not been proven by current studies. The number and quality of available studies are insufficient."
The authors recommend: "Further studies are needed to clarify this problematic situation. Metformin may not be the best comparator [drug] for evaluating new hypoglycaemic [blood sugar-lowering] drugs. However, it is not clear which comparator [drug] has the most favourable risk/benefit ratio."
It is essential that patients taking metformin who have any concerns do not stop the drug without consulting their doctor, especially as the authors conclude that "Compared with other antidiabetic treatments, metformin may be the one with the least disadvantages. It does not induce hypoglycaemia, weight gain, and heart failure. It is also associated with a reduced rate of mortality among patients with atherothrombosis."

Journal Reference:
  1. Rémy Boussageon, Irène Supper, Theodora Bejan-Angoulvant, Nadir Kellou, Michel Cucherat, Jean-Pierre Boissel, Behrouz Kassai, Alain Moreau, François Gueyffier, Catherine Cornu. Reappraisal of Metformin Efficacy in the Treatment of Type 2 Diabetes: A Meta-Analysis of Randomised Controlled Trials. PLoS Medicine, 2012; 9 (4): e1001204 DOI: 10.1371/journal.pmed.1001204

Courtesy: ScienceDaily


Wednesday, April 18, 2012

The Tamiflu Story Continued: Full Reports from Clinical Trials Should Be Made Publicly Available, Experts Argue

The full clinical study reports of drugs that have been authorized for use in patients should be made publicly available in order to allow independent re-analysis of the benefits and risks of such drugs, according to leading international experts who base their assertions on their experience with Tamiflu (oseltamivir).

Tamiflu is classed by the World Health Organization as an essential drug, and many countries have stockpiled the anti-influenza drug at great expense to taxpayers. But a recent Cochrane review of Tamiflu showed that more than ten thousand pages of regulatory evidence were not sufficient to clarify major discrepancies regarding the effects and mode of action of the drug.
Writing in this week's PLoS Medicine, Peter Doshi from Johns Hopkins University School of Medicine in Baltimore, USA, Tom Jefferson from the Cochrane Collaboration in Rome, Italy, and Chris Del Mar from Bond University in the Gold Coast, Australia say that there are strong ethical arguments for ensuring that all clinical study reports are publicly accessible. In the course of trying to obtain the regulatory evidence, the authors received several explanations from Roche as to why it would not share its data. By publishing and commenting on that correspondence, the authors assert that the results of experiments on humans should be made available, all the more so given the international public health importance of the drug.
They argue: "It is the public who take and pay for approved drugs, and therefore the public should have access to complete information about those drugs. We should also not lose sight of the fact that clinical trials are experiments conducted on humans that carry an assumption of contributing to medical knowledge. Non-disclosure of complete trial results undermines the philanthropy of human participants and sets back the pursuit of knowledge."
However, according to the authors, industry and regulators have historically treated clinical study reports as confidential documents, impeding additional scrutiny by independent researchers.
Using the example of Tamiflu, about which drug companies, drug regulators, and public health bodies such as the World Health Organization and the US Centers for Disease Control and Prevention have made discrepant claims regarding its clinical effects, the authors argue that critical analysis by an independent group such as a Cochrane review group is essential. By recounting the details of an extended correspondence with Tamiflu's manufacturer Roche, the authors argue that the company provided no convincing reason for refusing access to its clinical study reports.
The authors challenge industry either to provide open access to clinical study reports or to publicly defend its current position of keeping randomized controlled trial data secret. They say: "we hope the debate may soon shift from one of whether to release regulatory data to the specifics of doing so. But until these policies go into effect -- and perhaps even after they do -- most drugs on the market will remain those approved in an era in which regulators protected industry's data."
European regulators respond to the Tamiflu recommendations
In a Perspective article accompanying the analysis by Peter Doshi and colleagues in PLoS Medicine, which recommended that full clinical study reports of authorized drugs be made publicly available for independent re-analysis of their benefits and risks, four drug regulators respond (representing the European Medicines Agency, the French Agence Française de Sécurité Sanitaire des Produits de Santé, the UK's Medicines and Healthcare products Regulatory Agency, and the Medicines Evaluation Board in The Netherlands).
The four regulators say: "We consider it neither desirable nor realistic to maintain the status quo of limited availability of regulatory trials data," and suggest what they call a "three pronged approach," which includes establishing rules of engagement to follow the principle of maximum transparency whilst respecting the need to guarantee data privacy and to avert the potential for misuse.
However, they also lay out arguments for why trial data should not be open for all: personal data protection; non-financial competing interests; and the risks of competition.
They conclude: "We welcome debate on these issues, and remain confident that satisfactory solutions can be found to make complete trial data available in a way that will be in the best interest of public health."

Journal References:
  1. Peter Doshi, Tom Jefferson, Chris Del Mar. The Imperative to Share Clinical Study Reports: Recommendations from the Tamiflu Experience. PLoS Medicine, 2012; 9 (4): e1001201 DOI: 10.1371/journal.pmed.1001201
  2. Hans-Georg Eichler, Eric Abadie, Alasdair Breckenridge, Hubert Leufkens, Guido Rasi. Open Clinical Trial Data for All? A View from Regulators. PLoS Medicine, 2012; 9 (4): e1001202 DOI: 10.1371/journal.pmed.1001202
Courtesy: ScienceDaily


Monday, April 16, 2012

How a Single Gene Mutation Leads to Uncontrolled Obesity

Researchers at Georgetown University Medical Center have revealed how a mutation in a single gene is responsible for the inability of neurons to effectively pass along appetite suppressing signals from the body to the right place in the brain. What results is obesity caused by a voracious appetite.

Their study, published March 18th on Nature Medicine's website, suggests there might be a way to stimulate expression of that gene to treat obesity caused by uncontrolled eating.
The research team specifically found that a mutation in the brain-derived neurotrophic factor (Bdnf) gene in mice does not allow brain neurons to effectively pass leptin and insulin chemical signals through the brain. In humans, these hormones, which are released in the body after a person eats, are designed to "tell" the body to stop eating. But if the signals fail to reach correct locations in the hypothalamus, the area in the brain that signals satiety, eating continues.
"This is the first time protein synthesis in dendrites, tree-like extensions of neurons, has been found to be critical for control of weight," says the study's senior investigator, Baoji Xu, Ph.D., an associate professor of pharmacology and physiology at Georgetown.
"This discovery may open up novel strategies to help the brain control body weight," he says.
Xu has long investigated the Bdnf gene. He has found that the gene produces a growth factor that controls communication between neurons.
For example, he has shown that during development, BDNF is important to the formation and maturation of synapses, the structures that permit neurons to send chemical signals between them. The Bdnf gene generates one short transcript and one long transcript. He discovered that when the long-form Bdnf transcript is absent, the growth factor BDNF is only synthesized in the cell body of a neuron but not in its dendrites. The neuron then produces too many immature synapses, resulting in deficits in learning and memory in mice.
Xu also found that the mice with the same Bdnf mutation grew to be severely obese.
Other researchers began to look at the Bdnf gene in humans, and large-scale genome-wide association studies showed Bdnf gene variants are, in fact, linked to obesity.
But, until this study, no one has been able to describe exactly how BDNF controls body weight.
Xu's data shows that both leptin and insulin stimulate synthesis of BDNF in neuronal dendrites in order to move their chemical message from one neuron to another through synapses. The intent is to keep the leptin and insulin chemical signals moving along the neuronal highway to the correct brain locations, where the hormones will turn on a program that suppresses appetite.
"If there is a problem with the Bdnf gene, neurons can't talk to each other, and the leptin and insulin signals are ineffective, and appetite is not modified," Xu says.
Now that scientists know that BDNF regulates the movement of leptin and insulin signals through brain neurons, the question is whether a faulty transmission line can be repaired.
One possible strategy would be to produce additional long-form Bdnf transcript using adeno-associated virus-based gene therapy, Xu says. But although this kind of gene therapy has proven to be safe, it is difficult to deliver across the blood-brain barrier, he adds.
"The better approach might be to find a drug that can stimulate Bdnf expression in the hypothalamus," Xu says. "We have opened the door to both new avenues in basic research and clinical therapies, which is very exciting."

Journal Reference:
  1. Guey-Ying Liao, Juan Ji An, Kusumika Gharami, Emily G Waterhouse, Filip Vanevski, Kevin R Jones, Baoji Xu. Dendritically targeted Bdnf mRNA is essential for energy balance and response to leptin. Nature Medicine, 2012; DOI: 10.1038/nm.2687

 Courtesy: ScienceDaily

Saturday, April 14, 2012

Detecting Breast Cancer's Fingerprint in a Droplet of Blood

One in eight women will be diagnosed with breast cancer during her lifetime. The earlier cancer is detected, the better the chance of successful treatment and long-term survival. However, early diagnosis remains challenging: mammography is cumbersome and costly, and in many cases cancer is detected only at an advanced stage. A team based in the Department of Biomedical Engineering at McGill University's Faculty of Medicine has developed a new microfluidics-based microarray that could one day radically change how and when cancer is diagnosed. Their findings are published in the April issue of the journal Molecular & Cellular Proteomics.

For years, scientists have worked to develop blood tests for cancer based on the presence of the Carcinoembryonic Antigen (CEA), a protein biomarker for cancer identified over 40 years ago by McGill's Dr. Phil Gold. This biomarker, however, is also found in healthy people and its concentration varies from person to person depending on genetic background and lifestyle. As such, it has not been possible to establish a precise cut-off between healthy individuals and those with cancer.

"Attempts have been made to overcome this problem of person-to-person variability by seeking to establish a molecular 'portrait' of a person by measuring both the concentration of multiple proteins in the blood and identifying the signature molecules that, taken together, constitute a characteristic 'fingerprint' of cancer," explains Dr. David Juncker, the team's principal investigator. "However, no reliable set of biomarkers has been found, and no such test is available today. Our goal is to find a way around this."

Dr. Mateu Pla-Roca, the study's first author, along with members of Juncker's team, began by analyzing the most commonly used existing technologies that measure multiple proteins in the blood and developing a model describing their vulnerabilities and limitations. Specifically, they discovered why the number of protein targets that can be measured simultaneously has been limited and why the accuracy and reproducibility of these tests have been so challenging to improve. Armed with a better understanding of these limitations, the team then developed a novel microfluidics-based microarray technology that circumvents these restrictions. Using this new approach, it then became possible to measure as many protein biomarkers as desired while minimizing the possibility of obtaining false results.

Juncker's biomedical engineering group, together with oncology and bioinformatics teams from McGill's Goodman Cancer Research Centre, then measured the profile of 32 proteins in the blood of 11 healthy controls and 17 individuals who had a particular subtype of breast cancer (estrogen receptor-positive). The researchers found that a subset of six of these 32 proteins could be used to establish a fingerprint for this cancer and classify each of the patients and healthy controls as having or not having breast cancer.
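The study's statistics are more involved, but the basic idea of classifying a sample from a six-protein "fingerprint" can be sketched with a nearest-centroid rule: represent each person's panel as a vector and assign new samples to whichever group average they lie closest to. All concentration values below are invented:

```python
import math

# Toy sketch of the classification idea (not the study's method):
# assign a six-protein panel to the nearer group centroid.
# All concentration values are invented.
healthy = [
    [1.0, 2.1, 0.5, 3.0, 1.2, 0.8],
    [1.1, 2.0, 0.6, 2.9, 1.1, 0.9],
    [0.9, 2.2, 0.4, 3.1, 1.3, 0.7],
]
cancer = [
    [2.0, 1.0, 1.5, 1.8, 2.5, 1.9],
    [2.2, 0.9, 1.6, 1.7, 2.4, 2.0],
    [1.9, 1.1, 1.4, 1.9, 2.6, 1.8],
]

def centroid(samples):
    """Per-marker mean across a group's samples."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

c_healthy, c_cancer = centroid(healthy), centroid(cancer)

def classify(panel):
    return ("healthy" if distance(panel, c_healthy) < distance(panel, c_cancer)
            else "cancer")

print(classify([1.0, 2.0, 0.5, 3.0, 1.2, 0.8]))  # -> healthy
print(classify([2.1, 1.0, 1.5, 1.8, 2.5, 1.9]))  # -> cancer
```

In practice the hard part, as the article notes, is finding markers whose combined signature actually separates the groups despite person-to-person variability.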

"While this study needs to be repeated with additional markers and a greater diversity of patients and cancer subsets before such a test can be applied to clinical diagnosis, these results nonetheless underscore the exciting potential of this new technology," said Juncker.

Looking ahead, Juncker and his collaborators have set as their goal the development of a simple test that can be carried out in a physician's office using a droplet of blood, thereby reducing dependence on mammography and minimizing attendant exposure to X-rays, discomfort and cost. His lab is currently developing a hand-held version of the test and is working on improving its sensitivity so as to be able to accurately detect breast cancer and ultimately, many other diseases, at the earliest possible stage.

This study was funded by the Canadian Institutes of Health Research (CIHR); Genome Canada; Génome Québec; the Canada Foundation for Innovation (CFI); the Natural Sciences and Engineering Research Council (NSERC); and the Banque de tissus et de données of the Réseau de recherche sur le cancer (RRCancer) of the Fonds de recherche en santé du Québec (FRSQ).

Journal Reference:
  1. M. Pla-Roca, R. F. Leulmi, S. Tourekhanova, S. Bergeron, V. Laforte, E. Moreau, S. J. C. Gosline, N. Bertos, M. Hallett, M. Park, D. Juncker. Antibody Colocalization Microarray: A Scalable Technology for Multiplex Protein Analysis in Complex Samples. Molecular & Cellular Proteomics, 2011; 11 (4): M111.011460 DOI: 10.1074/mcp.M111.011460

Courtesy: ScienceDaily


Thursday, April 12, 2012

Tiny Hitchhikers Attack Cancer Cells: Gold Nanostars First to Deliver Drug Directly to Cancer Cell Nucleus

Nanotechnology offers powerful new possibilities for targeted cancer therapies, but the design challenges are many. Northwestern University scientists now are the first to develop a simple but specialized nanoparticle that can deliver a drug directly to a cancer cell's nucleus -- an important feature for effective treatment.

They also are the first to directly image at nanoscale dimensions how nanoparticles interact with a cancer cell's nucleus.
"Our drug-loaded gold nanostars are tiny hitchhikers," said Teri W. Odom, who led the study of human cervical and ovarian cancer cells. "They are attracted to a protein on the cancer cell's surface that conveniently shuttles the nanostars to the cell's nucleus. Then, on the nucleus' doorstep, the nanostars release the drug, which continues into the nucleus to do its work."
Odom is the Board of Lady Managers of the Columbian Exposition Professor of Chemistry in the Weinberg College of Arts and Sciences and a professor of materials science and engineering in the McCormick School of Engineering and Applied Science.
Using electron microscopy, Odom and her team found their drug-loaded nanoparticles dramatically change the shape of the cancer cell nucleus. What begins as a nice, smooth ellipsoid becomes an uneven shape with deep folds. They also discovered that this change in shape after drug release was connected to cells dying and the cell population becoming less viable -- both positive outcomes when dealing with cancer cells.
The results are published in the journal ACS Nano.
Since this initial research, the researchers have gone on to study effects of the drug-loaded gold nanostars on 12 other human cancer cell lines. The effect was much the same. "All cancer cells seem to respond similarly," Odom said. "This suggests that the shuttling capabilities of the nucleolin protein for functionalized nanoparticles could be a general strategy for nuclear-targeted drug delivery."
The nanoparticle is simple and cleverly designed. It is made of gold and shaped much like a star, with five to 10 points. (A nanostar is approximately 25 nanometers wide.) The large surface area allows the researchers to load a high concentration of drug molecules onto the nanostar. Less drug would be needed than in current therapeutic approaches using free molecules, because the drug is stabilized on the surface of the nanoparticle.
The drug used in the study is a single-stranded DNA aptamer called AS1411. Approximately 1,000 of these strands are attached to each nanostar's surface.
The DNA aptamer serves two functions: it is attracted to and binds to nucleolin, a protein overexpressed in cancer cells and found on the cell surface (as well as within the cell). And when released from the nanostar, the DNA aptamer also acts as the drug itself.
Bound to the nucleolin, the drug-loaded gold nanostars take advantage of the protein's role as a shuttle within the cell and hitchhike their way to the cell nucleus. The researchers then direct ultrafast pulses of light -- similar to that used in LASIK surgery -- at the cells. The pulsed light cleaves the bond attachments between the gold surface and the thiolated DNA aptamers, which then can enter the nucleus.
In addition to allowing a large amount of drug to be loaded, the nanostar's shape also helps concentrate the light at the points, facilitating drug release in those areas. Drug release from nanoparticles is a difficult problem, Odom said, but with the gold nanostars the release occurs easily.
That the gold nanostar can deliver the drug without needing to pass through the nuclear membrane means the nanoparticle is not required to be a certain size, offering design flexibility. Also, the nanostars are made using a biocompatible synthesis, which is unusual for nanoparticles.
Odom envisions the drug-delivery method, once optimized, could be particularly useful in cases where tumors are fairly close to the skin's surface, such as skin and some breast cancers. (The light source would be external to the body.) Surgeons removing cancerous tumors also might find the gold nanostars useful for eradicating any stray cancer cells in surrounding tissue.
The National Institutes of Health supported the research.

Journal Reference:
  1. Duncan Hieu M. Dam, Jung Heon Lee, Patrick N. Sisco, Dick T. Co, Ming Zhang, Michael R. Wasielewski, Teri W. Odom. Direct Observation of Nanoparticle–Cancer Cell Nucleus Interactions. ACS Nano, 2012; DOI: 10.1021/nn300296p
Courtesy: ScienceDaily


Tuesday, April 10, 2012

Emergence of Artemisinin Resistance On Thai-Myanmar Border Raises Spectre of Untreatable Malaria

Evidence that the most deadly species of malaria parasite, Plasmodium falciparum, is becoming resistant to the front-line treatment for malaria on the border of Thailand and Myanmar was reported in The Lancet on April 5. This increases concern that resistance could now spread to India and then Africa, as resistance to other antimalarial drugs has done before. Eliminating malaria might then prove impossible.

The study coincides with research recently published in Science, in which researchers in Southeast Asia and the USA identify a major region of the malaria parasite genome associated with artemisinin resistance. This region, which includes several potential candidate genes for resistance, may provide researchers with a tool for mapping resistance.
Both studies, funded by the Wellcome Trust and the National Institutes of Health, follow reports in 2009 of the emergence of artemisinin-resistant malaria parasites in western Cambodia, 800km away from the Thailand-Myanmar border where the new cases of resistance have been observed. Resistance to artemisinin makes the drugs less effective and could eventually render them obsolete, putting millions of lives at risk.
According to the World Malaria Report 2011, malaria killed an estimated 655,000 people in 2010, mainly young children and pregnant women. It is caused by parasites that are injected into the bloodstream by infected mosquitoes. Plasmodium falciparum is responsible for nine out of ten deaths from malaria.
The most effective antimalarial drugs are artemisinin and its derivatives, most commonly artesunate. They act more rapidly and have fewer side effects than other antimalarial drugs such as chloroquine and mefloquine, and until recently malaria parasites had shown no resistance to them. Although the drugs can be used on their own as monotherapies, and these can still be obtained, fears over the possible development of resistance led to recommendations that they be used only in combination with one or more other drugs, as artemisinin-based combination therapies (ACTs). These are now recommended by the World Health Organization as the first-line treatment for uncomplicated falciparum malaria in all endemic countries, and they have contributed substantially to the recent decline in malaria cases in most tropical endemic regions.
In the Lancet study, researchers at the Shoklo Malaria Research Unit on the border of Thailand and Myanmar, part of the Wellcome Trust-Mahidol University-Oxford University Tropical Medicine Research Programme, measured the time taken to clear parasites from the bloodstream in 3,202 patients with falciparum malaria treated with oral artesunate-containing regimens over a ten-year period between 2001 and 2010.
Over this period, the average time taken to reduce the number of parasites in the blood by a half -- known as the 'parasite clearance half-life' -- increased from 2.6 hours in 2001 to 3.7 hours in 2010, a clear sign that the drugs were becoming less effective. The proportion of slow-clearing infections -- defined as a half-life of over 6.2 hours -- increased over this same period from six to 200 out of every 1000 infections.
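The half-life quoted here follows from fitting an exponential decay to serial parasite counts: on a log scale the counts fall along a straight line, and the half-life is ln 2 divided by the decay rate. A minimal sketch (the counts below are synthetic, constructed to halve every 3.7 hours; the study itself used a more elaborate clearance-curve estimator):

```python
import math

def clearance_half_life(times_h, counts):
    """Least-squares fit of log(count) vs time; returns the parasite
    clearance half-life in hours as ln(2) / decay rate."""
    logs = [math.log(c) for c in counts]
    n = len(times_h)
    t_bar = sum(times_h) / n
    y_bar = sum(logs) / n
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(times_h, logs))
             / sum((t - t_bar) ** 2 for t in times_h))
    return math.log(2) / -slope

# Illustrative counts falling by half every 3.7 hours (the 2010 average)
times = [0, 6, 12, 18, 24]
counts = [100_000 * 0.5 ** (t / 3.7) for t in times]
print(round(clearance_half_life(times, counts), 1))  # → 3.7
```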
By examining the genetic make-up of the parasites, the researchers were able to provide compelling evidence that the decline in the parasite clearance rates was due to genetic changes in the parasites which had made them resistant to the drugs.
This finding is supported by the evidence reported in Science, in which the same researchers, together with an international team led by scientists at the Texas Biomedical Research Institute in San Antonio, identified a region on chromosome 13 of the P. falciparum genome that shows a strong association with slow parasite clearance rates. Whilst the actual mechanism involved is not clear, the region contains several candidate genes that may confer artemisinin resistance on the parasite.
Professor François Nosten, Director of the Shoklo Malaria Research Unit, said: "We have now seen the emergence of malaria resistant to our best drugs, and these resistant parasites are not confined to western Cambodia. This is very worrying indeed and suggests that we are in a race against time to control malaria in these regions before drug resistance worsens and spreads further. The effect of that happening could be devastating. Malaria already kills hundreds of thousands of people a year -- if our drugs become ineffective, this figure will rise dramatically."
Professor Nick White, Chairman of the Wellcome Trust's South-East Asia Major Overseas Programmes and Chair of the WorldWide Antimalarial Resistance Network (WWARN), added: "Initially we hoped we might prevent this serious problem spreading by trying to eliminate all P. falciparum from western Cambodia. Whilst this could still be beneficial, this new study suggests that containing the spread of resistance is going to be even more challenging and difficult than we had first feared."
Dr Tim Anderson from the Texas Biomedical Research Institute, who led the genetics studies in both papers, commented: "Mapping the geographical spread of resistance can be particularly challenging using existing clinical and parasitology tools. If we can identify the genetic determinants of artemisinin resistance, we should be able to confirm potential cases of resistance more rapidly. This could be critically important for limiting further spread of resistance.
"We know that the genome region identified harbours a number of potential genes to explore further to see which ones drive artemisinin resistance. If we can pinpoint the precise gene or genes, we can begin to understand how resistance occurs."
The Wellcome Trust-Mahidol University-Oxford Tropical Medicine Research Programme is one of the Wellcome Trust's major overseas programmes, working to achieve the Trust's strategic priorities, which include combating infectious diseases.
Dr Jimmy Whitworth, Head of International Activities at the Wellcome Trust, said: "These two studies highlight the importance of being vigilant against the emergence of drug resistance. Researchers will need to monitor these outbreaks and follow them closely to make sure they are not spreading. Preventing the spread of artemisinin resistance to other regions is imperative, but as we can see here, it is going to be increasingly difficult. It will require the full force of the scientific and clinical communities, working together with health policymakers."

Journal References:
  1. Aung Pyae Phyo, Standwell Nkhoma, Kasia Stepniewska, Elizabeth A Ashley, Shalini Nair, Rose McGready, Carit ler Moo, Salma Al-Saai, Arjen M Dondorp, Khin Maung Lwin, Pratap Singhasivanon, Nicholas PJ Day, Nicholas J White, Tim JC Anderson, François Nosten. Emergence of artemisinin-resistant malaria on the western border of Thailand: a longitudinal study. The Lancet, 2012; DOI: 10.1016/S0140-6736(12)60484-X
  2. I. H. Cheeseman, B. A. Miller, S. Nair, S. Nkhoma, A. Tan, J. C. Tan, S. Al Saai, A. P. Phyo, C. L. Moo, K. M. Lwin, R. McGready, E. Ashley, M. Imwong, K. Stepniewska, P. Yi, A. M. Dondorp, M. Mayxay, P. N. Newton, N. J. White, F. Nosten, M. T. Ferdig, T. J. C. Anderson. A Major Genome Region Underlying Artemisinin Resistance in Malaria. Science, 2012; 336 (6077): 79 DOI: 10.1126/science.1215966

Courtesy: ScienceDaily


Saturday, April 7, 2012

How Stress Influences Disease: Study Reveals Inflammation as the Culprit

Stress wreaks havoc on the mind and body. For example, psychological stress is associated with greater risk for depression, heart disease and infectious diseases. But, until now, it has not been clear exactly how stress influences disease and health.

A research team led by Carnegie Mellon University's Sheldon Cohen has found that chronic psychological stress is associated with the body losing its ability to regulate the inflammatory response. Published in the Proceedings of the National Academy of Sciences, the research shows for the first time that the effects of psychological stress on the body's ability to regulate inflammation can promote the development and progression of disease.
"Inflammation is partly regulated by the hormone cortisol and when cortisol is not allowed to serve this function, inflammation can get out of control," said Cohen, the Robert E. Doherty Professor of Psychology within CMU's Dietrich College of Humanities and Social Sciences.
Cohen argued that prolonged stress alters the effectiveness of cortisol to regulate the inflammatory response because it decreases tissue sensitivity to the hormone. Specifically, immune cells become insensitive to cortisol's regulatory effect. In turn, runaway inflammation is thought to promote the development and progression of many diseases.
Cohen, whose groundbreaking early work showed that people suffering from psychological stress are more susceptible to developing common colds, used the common cold as the model for testing his theory. With the common cold, symptoms are not caused by the virus -- they are instead a "side effect" of the inflammatory response that is triggered as part of the body's effort to fight infection. The greater the body's inflammatory response to the virus, the greater is the likelihood of experiencing the symptoms of a cold.
In Cohen's first study, 276 healthy adults completed an intensive stress interview and were then exposed to a virus that causes the common cold and monitored in quarantine for five days for signs of infection and illness. Here, Cohen found that experiencing a prolonged stressful event was associated with the inability of immune cells to respond to hormonal signals that normally regulate inflammation. In turn, those with the inability to regulate the inflammatory response were more likely to develop colds when exposed to the virus.
In the second study, 79 healthy participants were assessed for their ability to regulate the inflammatory response and then exposed to a cold virus and monitored for the production of pro-inflammatory cytokines, the chemical messengers that trigger inflammation. He found that those who were less able to regulate the inflammatory response as assessed before being exposed to the virus produced more of these inflammation-inducing chemical messengers when they were infected.
"The immune system's ability to regulate inflammation predicts who will develop a cold, but more importantly it provides an explanation of how stress can promote disease," Cohen said. "When under stress, cells of the immune system are unable to respond to hormonal control and consequently produce levels of inflammation that promote disease. Because inflammation plays a role in many diseases, such as cardiovascular disease, asthma and autoimmune disorders, this model suggests why stress impacts them as well."
He added, "Knowing this is important for identifying which diseases may be influenced by stress and for preventing disease in chronically stressed people."
In addition to Cohen, the research team included CMU's Denise Janicki-Deverts, research psychologist; Children's Hospital of Pittsburgh's William J. Doyle; University of British Columbia's Gregory E. Miller; University of Pittsburgh School of Medicine's Bruce S. Rabin and Ellen Frank; and the University of Virginia Health Sciences Center's Ronald B. Turner.
The National Center for Complementary and Alternative Medicine, National Institute of Mental Health, National Heart, Lung and Blood Institute and the MacArthur Foundation Research Network on Socioeconomic Status and Health funded this research.

Journal Reference:
  1. Sheldon Cohen, Denise Janicki-Deverts, William J. Doyle, Gregory E. Miller, Ellen Frank, Bruce S. Rabin, and Ronald B. Turner. Chronic stress, glucocorticoid receptor resistance, inflammation, and disease risk. PNAS, April 2, 2012 DOI: 10.1073/pnas.1118355109
Courtesy: ScienceDaily


Thursday, April 5, 2012

Fertilizer Use Responsible for Increase in Nitrous Oxide in Atmosphere


University of California, Berkeley, chemists have found a smoking gun proving that increased fertilizer use over the past 50 years is responsible for a dramatic rise in atmospheric nitrous oxide, which is a major greenhouse gas contributing to global climate change.

Climate scientists have assumed that the cause of the increased nitrous oxide was nitrogen-based fertilizer, which stimulates microbes in the soil to convert nitrogen to nitrous oxide at a faster rate than normal.
The new study, reported in the April issue of the journal Nature Geoscience, uses nitrogen isotope data to identify the unmistakable fingerprint of fertilizer use in archived air samples from Antarctica and Tasmania.
"Our study is the first to show empirically from the data at hand alone that the nitrogen isotope ratio in the atmosphere and how it has changed over time is a fingerprint of fertilizer use," said study leader Kristie Boering, a UC Berkeley professor of chemistry and of earth and planetary science.
"We are not vilifying fertilizer. We can't just stop using fertilizer," she added. "But we hope this study will contribute to changes in fertilizer use and agricultural practices that will help to mitigate the release of nitrous oxide into the atmosphere."
Since 1750, nitrous oxide levels have risen 20 percent -- from below 270 parts per billion (ppb) to more than 320 ppb. Nitrous oxide (N2O) is the third most important greenhouse gas after carbon dioxide and methane, trapping heat and contributing to global warming. It also destroys stratospheric ozone, which protects the planet from harmful ultraviolet rays.
Not surprisingly, a steep ramp-up in atmospheric nitrous oxide coincided with the green revolution of the 1960s, when inexpensive, synthetic fertilizer and other developments boosted food production worldwide, feeding a burgeoning global population.
Tracking the origin of nitrous oxide in the atmosphere, however, is difficult because a molecule from a fertilized field looks identical to one from a natural forest or the ocean if you only measure total concentration. But a quirk of microbial metabolism affects the isotope ratio of the nitrogen the N2O microbes give off, producing a telltale fingerprint that can be detected with sensitive techniques.
Archived air from Cape Grim
Boering and her colleagues, including former UC Berkeley graduate students Sunyoung Park and Phillip Croteau, obtained air samples from Antarctic ice, called firn air, dating from 1940 to 2005, and from an atmospheric monitoring station at Cape Grim, Tasmania, which has archived air back to 1978.
Law Dome, Antarctica. Bubbles inside ice cores from this region provide historical air samples going back to 1940.
Analysis of N2O levels in the Cape Grim air samples revealed a seasonal cycle, which was already known. But isotopic measurements made with a very sensitive isotope ratio mass spectrometer also displayed a seasonal cycle, which had not been observed before. At Cape Grim, the isotopes show that the seasonal cycle is due both to the circulation of air returning from the stratosphere, where N2O is destroyed after an average lifetime of 120 years, and to seasonal changes in the ocean, most likely upwelling that releases more N2O at some times of year than at others.
"The fact that the isotopic composition of N2O shows a coherent signal in space and time is exciting, because now you have a way to differentiate agricultural N2O from natural ocean N2O from Amazon forest emissions from N2O returning from the stratosphere," Boering said. "In addition, you also now have a way to check whether your international neighbors are abiding by agreements they've made to mitigate N2O emissions. It is a tool that, ultimately, we can use to verify whether N2O emissions by agriculture or biofuel production are in line with what they say they are."
Changes in fertilizer use can reduce N2O emissions
Limiting nitrous oxide emissions could be part of a first step toward reducing all greenhouse gases and lessening global warming, Boering said, especially since immediately reducing global carbon dioxide emissions is proving difficult from a political standpoint. In particular, reducing nitrous oxide emissions can initially offset more than its fair share of greenhouse gas emissions overall, since N2O traps heat at a different wavelength than CO2 and clogs a "window" that allows Earth to cool off independent of CO2 levels.
"On a pound for pound basis, it is really worthwhile to figure how to limit our emissions of N2O and methane," she said. "Limiting N2O emissions can buy us a little more time in figuring out how to reduce CO2 emissions."
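The "pound for pound" comparison is usually expressed through a global warming potential (GWP), which converts a mass of a gas into the mass of CO2 with the same warming effect over a chosen horizon. As a rough illustration, assuming a 100-year GWP for N2O of about 300 (the value is an assumption for this sketch; published assessments place it roughly between 265 and 310):

```python
# Rough CO2-equivalent conversion using an assumed 100-year GWP for N2O.
GWP_N2O = 300

def co2_equivalent(tonnes_n2o, gwp=GWP_N2O):
    """Tonnes of CO2 with the same 100-year warming effect."""
    return tonnes_n2o * gwp

# Avoiding 1 tonne of N2O is comparable to avoiding ~300 tonnes of CO2.
print(co2_equivalent(1))  # → 300
```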
One approach, for example, is to time fertilizer application to avoid rain, because wet and happy soil microbes can produce sudden bursts of nitrous oxide. Changes in the way fields are tilled, when they are fertilized and how much is used can reduce N2O production.
Boering's studies, which involve analyzing the isotopic fingerprints of nitrous oxide from different sources, could help farmers determine which strategies are most effective. It could also help assess the potential negative impacts of growing crops for biofuels, since some feedstocks may require fertilizer that will generate N2O that offsets their carbon neutrality.
"This new evidence of the budget of nitrous oxide allows us to better predict its future changes -- and therefore its impacts on climate and stratospheric ozone depletion -- for different scenarios of fertilizer use in support of rising populations and increased production for bio-energy," said coauthor David Etheridge of the Centre for Australian Weather and Climate Research in Aspendale, Victoria.
Boering's colleagues include D. M. Etheridge, P. J. Fraser, P. B. Krummel, R. L. Langenfelds, L. P. Steele and C. M. Trudinger of the Centre for Australian Weather and Climate Research; D. Ferretti of the National Institute of Water and Atmospheric Research in Wellington, New Zealand; K-R. Kim of the School of Earth and Environmental Sciences at Seoul National University in Korea; and T. D. van Ommen of the Australian Antarctic Division in Tasmania. Park is now at Seoul National University, while Croteau is at Aerodyne Research, Inc., in Billerica, Mass.
The work was supported by UC Berkeley's Atmospheric Sciences Center, NASA's Upper Atmosphere Research Program, the Camille Dreyfus Teacher-Scholar Award, the Brain 21 Korea Program, a Korean government research grant through Seoul National University, and the Australian government's Cooperative Research Centres Programme.
Finding the fingerprint of fertilized microbes
Boering was able to trace the source of N2O because bacteria in a nitrogen-rich environment, such as a freshly fertilized field, prefer to use nitrogen-14 (14N), the most common isotope, instead of nitrogen-15 (15N).
"Microbes on a spa weekend can afford to discriminate against nitrogen-15, so the fingerprint of N2O from a fertilized field is a greater proportion of nitrogen-14," Boering said. "Our study is the first to show empirically from the data at hand alone that the nitrogen isotope ratio in the atmosphere and how it has changed over time is a fingerprint of fertilizer use."
Just as telling is the isotope ratio of the central nitrogen atom in the N-N-O molecule. By measuring the nitrogen isotope ratio overall and the isotope ratio at the central nitrogen atom, and contrasting these with the oxygen-18/oxygen-16 isotope ratio, which has not changed over the past 65 years, the researchers were able to paint a consistent picture pointing to fertilizer as the major source of increased atmospheric N2O.
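Isotope measurements like these are conventionally reported in delta notation: the per-mil deviation of a sample's isotope ratio from a reference standard. A minimal sketch (the air-N2 reference ratio below is the commonly used standard value; the sample ratio is invented for illustration):

```python
# 15N/14N of atmospheric N2, the conventional reference for delta-15N.
R_AIR = 0.0036765

def delta15N(r_sample, r_standard=R_AIR):
    """delta-15N in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1) * 1000.0

# Fertilizer-fed microbes preferentially use 14N, so the N2O they emit
# has a ratio below the air standard and a negative delta-15N.
print(round(delta15N(0.99 * R_AIR), 2))  # → -10.0
```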
The isotope ratios also revealed that fertilizer use has caused a shift in the way soil microbes produce N2O. The relative output of bacteria that produce N2O by nitrification grew from 13 to 23 percent worldwide, while the relative output of bacteria that produce N2O by denitrification -- typically in the absence of oxygen -- dropped from 87 to 77 percent. Although the numbers themselves are uncertain, these are the first numerical estimates of these global trends over time, made possible by the unique archived air dataset of this study.

Journal Reference:
  1. S. Park, P. Croteau, K. A. Boering, D. M. Etheridge, D. Ferretti, P. J. Fraser, K-R. Kim, P. B. Krummel, R. L. Langenfelds, T. D. van Ommen, L. P. Steele, C. M. Trudinger. Trends and seasonal cycles in the isotopic composition of nitrous oxide since 1940. Nature Geoscience, 2012; 5 (4): 261 DOI: 10.1038/ngeo1421
Courtesy: ScienceDaily


Tuesday, April 3, 2012

Cancer Stem Cell Vaccine in Development Shows Antitumor Effect

Scientists may have discovered a new paradigm for immunotherapy against cancer by priming antibodies and T cells with cancer stem cells, according to a study published in Cancer Research, a journal of the American Association for Cancer Research.

"This is a major breakthrough in immunotherapy research because we were able to use purified cancer stem cells to generate a vaccine, which strengthened the potency of antibodies and T cells that selectively targeted cancer stem cells," said Qiao Li, Ph.D., a research assistant professor in the department of surgery at the University of Michigan.
Cancer stem cells are tumor cells that remain present, and ultimately resistant, after chemotherapy or radiation treatment. Scientists disagree on whether these cells have unique properties, but those who support the uniqueness idea have argued that these cells regenerate the tumors that lead to relapse.
Despite the similar name, cancer stem cells are distinct from embryonic stem cells, and the two avenues of research are separate.
For the current study, Li and colleagues extracted cancer stem cells from two immunocompetent mouse models and used them to prepare the vaccine.
"We found that these enriched cancer stem cells were immunogenic and far more effective as an antigen source compared with the unselected tumor cells normally used in previous immunotherapy trials," said Li. "The mechanistic investigations found that when antibodies were primed with cancer stem cells, they were capable of targeting cancer stem cells and conferring antitumor immunity."
The researchers also found that cytotoxic T lymphocytes harvested from cancer stem cell-vaccinated hosts were capable of killing cancer stem cells in vitro.

Journal Reference:
  1. N. Ning, Q. Pan, F. Zheng, S. Teitz-Tennenbaum, M. Egenti, J. Yet, M. Li, C. Ginestier, M. S. Wicha, J. S. Moyer, M. E. P. Prince, Y. Xu, X.-L. Zhang, S. Huang, A. E. Chang, Q. Li. Cancer Stem Cell Vaccination Confers Significant Antitumor Immunity. Cancer Research, 2012; 72 (7): 1853 DOI: 10.1158/0008-5472.CAN-11-1400
Courtesy: ScienceDaily