To the stars and beyond: Assessing the impact of the $100 Genome

Author: Natasha Davie, 05/16/13

We live in a time of extraordinary medical advances. So far in 2013, we’ve seen the successful transplantation of a bioengineered kidney into rats, an infant reportedly cured of HIV using anti-retroviral drugs already on the market, and the discovery of a protein with the potential to ‘reverse aging’ in the heart.

These ‘small steps’ forward are indicative of a far greater ‘giant leap’ for health care. I recently attended a talk by Canadian Oxfordite Professor Sir John Bell, and was struck by an example he used to illustrate progress in medicine – taxonomy. The way we classify diseases has radically changed over the last 50 years. From describing someone as having “cancer” to “metastasized colorectal carcinoma” to “metastasized colorectal carcinoma with MLH1 silencing” to “metastasized colorectal carcinoma with MLH1 silencing and KRAS mutation,” our understanding of disease pathology and how to treat it has changed astronomically. But, more importantly, this meteoric rise in understanding is already being used to improve patient outcomes.

Using such information, we’re now able to predict who will respond to certain types of chemotherapy and identify optimal treatment regimens based on an individual’s genetic code. Furthermore, huge advances in sequencing technology mean that genomics is already being applied in a meaningful way, and with no sign of deceleration among the companies developing these technologies, the once distant dream of a $1000 genome has been obliterated (see Figure 1), as we instead begin to consider the widespread implications of the $100 genome.
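To make this concrete: in metastatic colorectal cancer, KRAS-mutant tumours generally do not respond to anti-EGFR antibody therapies such as cetuximab, so wild-type status is checked before treatment. A deliberately simplified Python sketch of that screening logic, using invented patient records:

```python
# Simplified illustration of genotype-guided treatment selection.
# KRAS-mutant colorectal tumours generally do not respond to anti-EGFR
# therapy, so only KRAS wild-type patients are flagged as candidates.
# All patient data below is invented for illustration.

patients = [
    {"id": "P001", "diagnosis": "metastasized colorectal carcinoma", "kras": "wild-type"},
    {"id": "P002", "diagnosis": "metastasized colorectal carcinoma", "kras": "G12D mutation"},
    {"id": "P003", "diagnosis": "metastasized colorectal carcinoma", "kras": "wild-type"},
]

def eligible_for_anti_egfr(patient):
    """Flag patients whose tumour genotype suggests they may respond."""
    return patient["kras"] == "wild-type"

eligible = [p["id"] for p in patients if eligible_for_anti_egfr(p)]
print(eligible)  # ['P001', 'P003']
```

In practice the decision involves far more than a single gene, but the principle is the same: the genetic code narrows the therapeutic options before a drug is ever prescribed.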

Figure 1: How the cost of sequencing has changed over a 30-year period (source: George Church via Chemical & Engineering News)

At this modest cost, significantly less than the current $20,000 deposit for a ride with Virgin Galactic, any of us could walk into our local sequencing centre and walk out with a printout of our own genetic make-up. Right now, with that information, we can perform health assessments, determine suitable therapeutic approaches and estimate the likelihood of certain diseases (see Angelina Jolie’s recent decision to get a preventative double mastectomy after finding out she had the BRCA1 mutation and an 87% chance of developing breast cancer). But it becomes really interesting when we pull all of this information together with a patient’s medical history. Deciphering the genetic code, and finding ways to process the huge volumes of data generated, are undoubtedly the next ‘giant leap’.

The potential gain for bioinformatics companies is enormous. In fact, much of the data required already exists. In the UK, the National Health Service houses over 60 years of detailed medical records for every patient it has ever treated. Yet trudging through this data and extracting useful information – whilst respecting patient confidentiality – is challenging at best, and something the UK government has struggled with. Furthermore, the data generated by complex genome-wide association studies (GWAS), a technique commonly used to find links between genes and diseases across a substantial population, could overwhelm the majority of servers.
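A rough back-of-envelope calculation shows why this data can overwhelm ordinary infrastructure. The figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope estimate of GWAS data volume.
# All figures are rough assumptions chosen for illustration.

genotyped_variants = 1_000_000   # SNPs per participant on a typical genotyping array
bytes_per_genotype = 1           # one byte per genotype call, ignoring compression
participants = 100_000           # a large 2013-era GWAS cohort

total_bytes = genotyped_variants * bytes_per_genotype * participants
print(f"{total_bytes / 1e9:.0f} GB of genotype calls alone")

# Raw sequencing output dwarfs the genotype calls: a single 30x
# whole-genome run yields on the order of 100 GB of reads, so raw
# whole-genome data for the same cohort would reach petabyte scale.
raw_per_genome_gb = 100
print(f"{raw_per_genome_gb * participants / 1e6:.0f} PB of raw reads for the cohort")
```

Even the modest end of this estimate sits well beyond what a typical hospital server was provisioned for, before any analysis has begun.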

And yet we are finding ways around this. Fields that end in ‘-omics’ are making significant progress and realizing the benefit of collaboration, enabling us to consider differences in protein expression, lipid expression and metabolites across patient groups. This evolving health care paradigm, where we have an increased understanding of not only the disease, but also the individual patient, will be key in the commercialization of novel therapeutics, particularly those with regenerative capacity.

Regenerative medicine products will be expensive by definition, and demonstrating the efficacy needed to justify this cost will undoubtedly stem from increased understanding of the underlying mechanisms. Cells and tissues are complex, and deciphering the genetic differences between responders and non-responders will be key in unlocking their true therapeutic potential – and ultimately curing disease.

In the future, the key will be to integrate health care and research systems, where every willing patient is a research patient, and every bit of data generated contributes another piece to our understanding of the genetic puzzle. Combine this with patient stratification and adaptive licensing (topics to be explored at a later time) to enable earlier access to new therapeutics, and what we get is a dynamic health care system, where each patient is optimally treated in their own mini clinical trial. And, in my opinion, that is a celestial goal worth reaching for.

Natasha Davie

Natasha Davie is part of the Centre for Accelerating Medical Innovations at Oxford University, where she is pursuing a doctorate in Clinical Laboratory Sciences. She has been involved in regenerative medicine since 2002, when she worked with the London Regenerative Medicine Network on numerous projects analysing cell therapy translation, and gaining expertise in clinical trials, regulation, manufacture and commercialization. She completed her Masters in Biochemical Engineering at University College London in conjunction with the Harvard Stem Cell Institute and Harvard Medical School. Follow Natasha on Twitter @natashadavie
