

Genus envy

Author: 
Ethan O. Perlstein

In 1997, a breakthrough was made in rare/orphan disease research. An evolutionarily conserved gene called NPC1 was shown to be responsible for Niemann-Pick disease type C, a degenerative lysosomal storage disorder that affects 1 in 150,000 people on Earth, half of whom manifest symptoms as children. The discovery of NPC1 should have unleashed a torrent of follow-up studies in simple model organisms like yeast, worms and flies, all of which have an ancestral version of NPC1. Instead, what followed was a trickle, with clunky rodent models getting all the basic research attention. Is that partly why 16 years later we still don’t have a cure for NPC?

It was once axiomatic to say that model organisms illuminate cellular bits that have been conserved by evolution over the eons. Despite this overwhelming evidence of commonality, the biomedical establishment operates with a mindset of human exceptionalism. According to this mindset: 1) any organism simpler than a mouse or a rat is not relevant to drug discovery; 2) technological advances in the in vitro culture and genetic manipulation of human cells obviate the need for non-human models. I believe this view is both conceptually flawed and economically inefficient. The basic understanding we so desperately need to cure NPC and the thousands of rare/orphan diseases like it will only come from painting meticulous physiological portraits of human disease on a canvas of simple model organisms, starting with our far-removed unicellular cousins.

Here I present Saccharomyces cerevisiae, which goes by several aliases: budding yeast; brewer’s yeast; baker’s yeast. As you can tell from the monikers, we and yeast go way back. Thousands of years ago the lucky bastard who first stumbled upon a natural fermentation put brew and brew together, and our fates have been entwined ever since. The use of fungi as model organisms in experimental biology dates back to the 1930s and 1940s, to the seminal “one-gene, one-enzyme” auxotrophy studies of George Beadle and Edward Tatum on the bread mold Neurospora. The genome of S. cerevisiae (hereafter yeast) weighs in at 12 Mb, or megabases, and boasts around 5,000 genes. Depending on how the calculation is done, 20% – 30% of yeast genes have a statistically significant match to a human gene at the DNA level. For scale, the human genome is 3000 Mb, or roughly 250 times larger than the yeast genome, and features ~20,000 genes. Yet most biomedical researchers appear to treat that 20% – 30% as though it were 1%, or ignore it altogether. Have they simply forgotten the literature, or is it the hex of human exceptionalism?

It’s not as though that conserved bloc of genes is chopped liver in terms of cellular functions. Obviously included in this tally are enzymes involved in central metabolism, e.g., glycolysis, the breakdown of the sugar glucose into chemical energy. But non-metabolism genes and the proteins they encode are also part of the mix. There’s actin and tubulin, two proteins that comprise the dynamic scaffolding, or cytoskeleton, of cells; histone, a protein that wraps DNA double helices in a regulatory embrace; clathrin, the triskelion-shaped protein that forms Bucky Ball-like coats around the small membrane sacs called vesicles. And it’s not just the pipes and drywall that are shared. Even complex enzymes like kinases are conserved from yeast to humans, including one of my favorites, TOR (Target Of Rapamycin), an ancient nutrient sensor.

The full force of evolutionary conservation is nowhere more persuasively felt than in gene-replacement experiments. If DNA sequence alignment indicates that two genes are related in organisms separated by over a billion years of evolution, how do we know that this sequence similarity translates into functional interchangeability? Swap the modern version for the ancient one, and see if the cell or organism behaves normally. It’s a concept from genetics called complementation. It must have been in those heady days that the expression “the awesome power of yeast genetics” was born. Once I got a taste for yeast in my first-year graduate school laboratory rotations, there was no turning back. In my graduate and postdoctoral research over the last decade, I’ve been trying to connect basic discoveries made in yeast to human diseases, and now my focus is rare/orphan disease.

Studying yeast alone is not going to cure NPC, but if you take evolutionary conservation at face value, the awesome power of yeast genetics is a modest down payment on a cure.

Time for a Rare Disease Moon Shot

Author: 
Ethan O. Perlstein

Why is it so difficult to translate biological discoveries into effective drugs for diseases that are triggered by a single faulty gene?

Once upon a time we stamped out smallpox. We went to the Moon and back in a decade, then sequenced the human genome. We've made giant leaps before. The holdup doesn't appear to be lack of talent or money. Right now we're experiencing a generational glut of underemployed or downsized professionally-trained life scientists – postdocalypse, anyone? The largest pharmaceutical companies are sitting on tax-sheltered cash reserves totaling several NIH annual budgets. (Note: 1 NIH annual budget = $29B).

Some say it’s just really, really hard to bring a drug to market, and there’s certainly truth to that. But it’s not time-travel hard. Drug failures seem to arise from misaligned incentives and scientific blind spots rather than conceptual impenetrability. On the one hand, I blame Big Pharma for rampant targetophilia, or drug action reductionism. Blockbuster magic bullets, e.g., Gleevec, were never going to be economically sustainable, and this groupthink has contributed to declining productivity in drug development, i.e., Eroom’s Law. On the other hand, academia's Tenure Games are no picnic. The “publish or perish” ethos thwarts rapid dissemination and sustained cross-pollination of discoveries made in models of human disease. Even projects that lead to a glorious Cell paper can be abandoned when a postdoc leaves the lab, left to languish for years behind a paywall, or snuffed out much earlier at the proposal stage by risk-averse study sections.

I’m not the only person bypassing pharmageddon and postdocalypse to become an independent scientist, even if I'm one of the most vocal about it. And I hope that I’m not the only person interested in applying my scientific curiosity toward a long-neglected challenge that deserves its own Moon Shot: curing all 7,000 rare diseases. Think of each rare disease as a basic science puzzle waiting to be solved. And think of all the zeal and resourcefulness of rare disease advocates, particularly rare disease parents, waiting to be harnessed by a collaborative critical mass of professional life scientists. We know these puzzles are solvable. For example, Vertex Pharmaceuticals’ highly profitable and successful cystic fibrosis (CF) drug Kalydeco is a monumental pharmacological achievement. But it took 23 years to develop this treatment from the moment the cystic fibrosis gene was discovered, it cost hundreds of millions of dollars with critical seed investment from the Cystic Fibrosis Foundation, and now people with a drug-responsive form of CF must pay $200,000 to $400,000 per year for the right to live.

This summer, I'll be prototyping a cheaper, modular, evolutionarily informed drug repurposing approach to rare disease research that seeks to compress the time between rare disease diagnosis and rare disease drug discovery, at a fraction of the $50-$100M that I’ve been told is the minimum cost threshold for serious drug discovery. In plain terms, I want to identify off-the-shelf candidate treatments for $1M or less, and do it in less time than it takes to review a government grant.

A rare disease Moon Shot starts with the rare diseases about which we know the most, and the lessons learned from initial forays will be applied to increasingly challenging cases, e.g., undiagnosed rare diseases. For the subset of diagnosed rare diseases that have been studied extensively in model organisms, like the lysosomal storage disorders, I think preclinical leads could be generated rapidly and then validated in patient-derived cells, based on knowledge that is languishing behind paywalls. Public investments in model organism research have "de-risked" the earliest stages of rare disease drug discovery. The take-home message is: rare diseases don’t discriminate evolutionarily!

 

 

Indie science

Author: 
Ethan O. Perlstein

Transitioning to scientific independence from the tribal world of government-sponsored biomedical research is not easy. The stars have to align in your personal life. For example, having a supportive spouse who will quit their job and move across the country with you. Alas, searing curiosity alone cannot pay the bills. You’ll need a day job, or better yet a complementary, part-time consulting gig that leaves you enough time to do professional-grade science. Even after all that, you’ll still need a place to do the experiments – and of course a way to pay for it.

When I began my search for lab space in earnest at the end of April, I didn’t think I’d find a turnkey option that was not only yeast-friendly but also within biking distance of my Oakland abode. I dutifully put out the word on my lab website, made lots of cold calls, and reversed course at several cul-de-sacs. Luckily, it all paid off: yesterday I signed a short-term lease on a space in Berkeley!

This quest for a bench began with leads that I had rounded up before I even touched down in the Bay Area. Now keep in mind that I don’t need an entire lab, just enough bench space to conduct genetic and chemical screens with yeast cells. I don’t need core facilities or expensive, esoteric instruments. The experiments I’ve planned require consumables and reagents that are standard fare in any modern molecular biology lab; experiments that I can’t do for lack of expertise or equipment I’ll outsource on a fee-for-service basis. I also don’t need employees, as I’ll be doing most of the experiments myself; nor a grant, as I’ll be self-funding this initial foray. (Fear not: a post dedicated to funding is in the works.)

At one extreme is the biohacker/DIY bio option, e.g., BioCurious. At the other extreme is the biotech incubator option, e.g., QB3. Neither is the best fit for me right now. I admire the hacker ethos, but BioCurious primarily serves hobbyists and fledgling scientists as opposed to professionally trained scientists. QB3 primarily incubates life science startups of up to 5 team members. Right now Perlstein Lab is just yours truly, and I want to retain this flexibility as I continue to seek out partners and patrons for a rare disease Moon Shot.

So I kept sleuthing, and received feedback from unexpected sources. For example, Kevin Lustig, the founder of Assay Depot, noticed my query from a LinkedIn share and suggested that I cold call local contract research organizations (CROs) to find out if any of them would sublease space to me. I reached out to a dozen or so CROs that seemed to fit my specs, but these leads all came up empty. Then one day I got an email from Pia Abola, whom I’d never met. She used to work at an independent institute affiliated with UC Berkeley, and she said that, based on my description, her old lab might be a good fit.

So that very day I sent an email to the director of this institute, which is called the VTT/MSI Molecular Sciences Institute (hereafter just MSI). As I wrote above, it’s a turnkey lab, which means it is fully stocked with the exact common equipment that I need for yeast work. I’d have to pay a premium for turnkey, more than double what QB3 charges for a comparable, albeit completely empty, bench. In spite of the higher rent, another factor weighed in my decision: time. At QB3 I’d have to sign a one-year lease, and communal equipment would be hit or miss, whereas at MSI I could start with a short-term lease. Besides, the $900 – $1,000 per month that QB3 charges startups at, say, the East Bay Innovation Center would quickly balloon with the cost of basic equipment like pipettors, which I’d be responsible for purchasing.

With a tentative lease agreement in hand, the next step in the process proved to be the most challenging – finding liability insurance. It’s been a wild goose chase. I know I’m at the leading edge because underwriters at popular insurance companies, e.g., Farmers or State Farm, won’t cover independent scientists. Practically speaking, I squandered my first leads by not knowing exactly how to describe myself or my situation. After each decline, though, insurance brokers offered parting sage advice on what to say and what not to say the next time.

If you’re considering going down the independent scientist path, know that when dealing with insurance brokers you should refer to yourself as a one-person startup or a sole proprietor. Tell the broker you’re looking for commercial liability coverage for “premises and ops.” In terms of liability caps, MSI, and probably any other laboratory landlord, wants $1MM. As a result, my monthly premium will be higher than a typical renter’s insurance premium.

So…what projects will I be working on? I gave a preview of some of the experiments I’ll be doing in the form of an open proposal in early May. I’m still integrating all the helpful comments, which are spread out on my lab website, Facebook and Twitter. A new post will be the first order of business next week, after I complete safety training, etc. Ever in the spirit of Open Science, I will be broadcasting my scientific method live, trying my best to follow in the footsteps of open lab notebook pioneers like Rosie Redfield, Anthony Salvagno, and an old grad school friend and newly minted assistant professor Greg Lang.

As an early adopter of the independent scientist model, I don’t pretend to have all the answers or a crystal ball. But I take inspiration from the independent scientists who tinkered and explored outside universities for centuries, long before the Academia-Pharma Complex began to crowd them out after 1945. More and more unabsorbed academic trainees will go indie rather than take a second postdoc. Same goes for all the downsized Pharma scientists.

Gene Patents No More

Author: 
Eric Schuur

As most of you know at this point, on June 13, 2013 the Supreme Court of the United States ruled, essentially, that native DNA sequences are not patentable subject matter.  The question ended up with the Supreme Court precisely because there are good arguments on both sides and, as you would expect, there was a lot of highly charged rhetoric exchanged leading up to the decision.  I’m not going to settle those questions here, but I did think it was worth a few moments musing about the less technical aspects of patent issues.

I have tended to side with those who believe that gene sequences, at least the ones that exist in the body, should not be patented because they are a product of nature—it just feels like giving too much away to me.  However, as one of the founders and early investors in Myriad Genetics shared in The Wall Street Journal, without the monopoly guaranteed by a patent, investors would not have anted up to launch a company to develop the genetic tests, certainly not at that high-risk time (circa 1991) when it was anything but certain that these tests would be worth anything.  Others have argued that the monopoly on the gene sequences that Myriad (and others, with respect to thousands of other genes) have enjoyed has impeded progress in understanding genetic function and utility.

As with most persistent debates, both points of view are probably true in part.  From the perspective of the pro-patent camp, a large shift in how diagnostic products are developed was necessary to bring tests such as these to market.  A huge promise of the human genome project was to provide for health care based on the sequence of an individual’s genome—personalized medicine.  But it was a promise made, lo these many long years ago, when we really had no idea if it would actually work.  To get to market, we would need to generate a large amount of experience with these sophisticated new tests, the operating characteristics of which were largely unknown.  Would genetic prediction of cancer susceptibility actually work?  Would patients and doctors actually find the test useful?  This is much different from measuring the number of white blood cells in a blood sample.

To get physicians and patients to order the tests, data on the validity of the tests was needed.  That required a lot of free testing to generate the data on the relationship between the gene variations and cancer incidence.  Myriad Genetics, as well as Genomic Health and other vendors of gene-based tests, have invested heavily in clinical validation of their tests in order to convince patients and physicians of their value.  Investors paid for much of this and patent monopolies were their reward.

On the anti-patent side of things, I wonder if Myriad Genetics has shot itself in the foot by jealously guarding its monopoly and appearing to be greedy (even after the Supreme Court decision, Myriad continues to aggressively pursue its perceived monopoly; see this article).  Stanford University and UCSF famously structured their genetic engineering patents to allow broad-based licensing, winning high levels of praise for licensing savvy and social conscientiousness.  Their patents did not meet the raucous challenges faced by Myriad’s.  In contrast, articles such as the one mentioned above and this article decry the excessive costs imposed by companies for not only diagnostics, but therapeutics as well.

Now I haven’t sat down and pored over Myriad’s (or anyone else’s) financial statements to ascertain if they really need to charge $4,000 per test to recoup their investment and earn a reasonable rate of return for their investors.  But what is clear is that that $4,000 number is very high compared to what the world is used to paying for diagnostic tests.  It probably would have helped Myriad if they had been more transparent about why they needed to charge that amount, given that their soon-to-be competitors are planning to charge under $1,000 in some cases.

Since our government grants patent monopolies for the betterment of society in general, I wonder if it might be prudent for companies and other patent holders to consider the public reaction to how they handle the right to charge what the market will bear.  If the public perception is that a patent owner’s behavior is not in the best interest of society, they may be sacrificing goodwill, which ultimately, in closely watched cases such as this one, might tip the balance one way or the other.

Follow me on Twitter: @erschuur

Biotech Journalist Shares Photos, Trip to Top of North America

Author: 
Theral Timpson

Luke Timmerman is the National Biotech Editor for Xconomy. His weekly BioBeat columns on Mondays are a favorite among industry veterans.

Last month Luke took three weeks to climb Denali (Mt. McKinley) in Alaska. I was blown away by the photo essay he posted on his return. Having climbed a few mountains, including Mt. Whitney here in California, I know a bit about what it takes.

At 20,300 feet, Denali is the highest peak in North America and presents climbers with some formidable obstacles. For one, its rise from base to peak is greater than that of any other mountain entirely above sea level. And being in Alaska, the mountain is farther from the equator than any of the other big climbs around the globe. This means that the barometric pressure is lower; air is harder to come by.

Luke's pictures are stunningly beautiful. Jagged peaks topping immense glaciers. Exhausted, ice-tipped, bearded faces and smiles. That Luke worked his way up to this with rigorous daily training and then made it to the summit--what to say?   I'm inspired.

More stories from the trip arranged as lessons on leadership.

"Cloud First" Insist IT Experts at 5th Annual Cloud Slam Conference

Author: 
Theral Timpson

Thomas Barton is an IT engineer at Novartis whose job is to link things together.  When Barton wanted to upgrade the company’s middleware (the software that connects one piece of software to another), he was encouraged by a colleague to turn to “the cloud.”

In fact, folks in the IT industry are more and more promoting a “cloud first” strategy. Just starting a new business?  Don’t buy your own software.  Start with the cloud.  Upgrading software for your existing business?  Step into the cloud.  These are folks from Microsoft, Intel, Dell and other IT giants, and their mantra is clear:  Waste no more time or money.  Go to the cloud first.  The message rang loud and clear at this week’s Cloud Slam conference in Santa Clara, which focused on cloud computing in healthcare and the life sciences.

Stats First

Let’s start with some statistics and projections made by keynote speaker, Mark Weiner of Microsoft:

  • Data will grow by 4400% over the next 10 years
  • By 2015, there will be 2x more smartphones than people
  • By the end of 2016, 60% of healthcare organizations will be taking data into the cloud
  • Only 16% of healthcare organizations know where archived data resides
  • There are 600 million imaging studies done per year
  • By 2016, there will be one exabyte worth of medical imaging

With all the data coming down the pike, software engineers say there is no way around it.  We must go virtual.

The Growth/Security Conflict

The issue of whether to use the cloud is much like the “open science” conflict I’ve discussed in previous blogs.  One must use the cloud to grow in a cost-effective and efficient way.  Yet to maintain the highest security for sensitive data, one is told not to use the cloud at all.

“There’s no way our data is going in the cloud,” is a common line industry veterans have heard over the past few years.  Yet, business by business, industry by industry, conference by conference IT engineers are pushing back. 

The security issue is the main concern and understandably took up the bulk of the time for panel discussions throughout the Cloud Slam conference that were populated with leaders from Microsoft, Dell Boomi, Intel, and other software giants.  

So how do these experts address the concern over security and privacy?  Healthcare is a heavily regulated industry.  

“The cloud can provide better security,” said Microsoft’s Hector Rodriguez.  “In fact, we already live in the cloud.”

Microsoft’s answer to the Amazon cloud is Windows Azure, touted as “a cloud for modern business.”  Hector went on to explain that without a comprehensive data strategy, employees of healthcare organizations resort to “workaround” methods, such as email and social media.  Any data shared in this fashion is not secure.  Using one system hosted on a cloud platform where data integrity is maintained according to the needs of the organization can thereby improve security.

The cloud is also better for disaster recovery.  During natural disasters, such as the recent Hurricane Sandy in the northeast, millions of unbudgeted dollars are spent recovering data centers.  The cloud is not limited by local conditions.  When organizations use the cloud, data backup can be done within hours.

So are concerns over security overblown?  Panelists at the conference urged healthcare organizations to “push your vendors.”  Specialized cloud platforms can be HIPAA compliant and can be made to meet the demands that regulators such as the FDA place on clinical trial data.  In fact, it was pointed out, an FDA cloud workgroup has just begun meeting.

An important trend to help businesses take advantage of the cloud is to consider a “hybrid cloud”: using a combination of platforms, perhaps one internal and one virtual, or two separate virtual systems.

The Novartis Story

Thomas Barton of Novartis began with one project.  Their middleware was just taking too long and costing too much.  Over the past seven months Barton and his team have changed from enterprise software hosted on sixteen servers internally to eight servers in the cloud hosted by Dell Boomi.  Though Barton didn’t share the exact cost savings from this transfer, he did say that the company was now saving 30% for this application.  

The Novartis case is an example of a hybrid system.  They are converting one project at a time and are still able to keep all their various software connected.

Barton says skeptics in the company asked, “oh no, where does my data go?”  He replies that it doesn’t go anywhere: the integration is configured and coded in the cloud, but the data is still local.
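As a purely conceptual sketch of that hybrid pattern (my own simplification in Python, not Novartis's or Dell Boomi's actual setup), the integration logic can live in the cloud as configuration and code while the records it transforms never leave the local systems:

```python
# Conceptual sketch of the hybrid pattern Barton describes: the integration
# logic is defined and hosted in the cloud, but the records it operates on
# stay in the local systems. Everything below is illustrative; it is not
# Novartis's or Dell Boomi's actual code or configuration.

# --- Lives in the cloud: a declarative field mapping, containing no records ---
FIELD_MAPPING = {
    "patient_id": "subject_identifier",
    "visit_date": "encounter_date",
}

def transform(record: dict, mapping: dict) -> dict:
    """Apply the cloud-defined mapping to a single record."""
    return {target: record[source] for source, target in mapping.items()}

# --- Runs on-premise: reads from and writes to local systems only ---
def sync_locally(source_records: list[dict]) -> list[dict]:
    return [transform(r, FIELD_MAPPING) for r in source_records]

local_source = [{"patient_id": "A-001", "visit_date": "2013-06-01"}]
print(sync_locally(local_source))
# [{'subject_identifier': 'A-001', 'encounter_date': '2013-06-01'}]
```

The point is architectural: what moves to the cloud is configuration and code, not the regulated data itself.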

Novartis deals with sensitive, regulated data, and according to Barton, “everything is validated and qualified.”  Disaster recovery has been vastly improved.  And, Barton says he’ll see payback within the first year.

There were more success stories.  Jason Stowe is the founder and CEO of Cycle Computing and devotes his time to life science customers.  He’s convinced that cloud computing has the power to advance science in a dramatic way.

“Researchers can ask bigger questions with new compute technologies,” Stowe said in a breakout session.  

In one project, Stowe was able to turn “39 years of science into 11 hours” with his cloud platform.  Going to the cloud instead of using traditional in-house servers reduced the cost of the project from $44 million to $4,372.

With a genomics project, Stowe’s company did what would normally be 115 years of compute hours in one week for $19,555.

I’m not sure how Jason calculated his numbers, but they’re getting him into some big doors.  Cycle Computing serves most of big pharma and some large genomic projects.
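For a rough sense of scale, here is my own back-of-envelope sketch of how a claim like that could pencil out; the per-core-hour price and the core counts are my illustrative assumptions, not Cycle Computing's figures:

```python
# My own back-of-envelope check of claims like "115 years of compute in one
# week for $19,555." The implied price per core-hour and the core counts are
# illustrative assumptions, not Cycle Computing's actual numbers.

HOURS_PER_YEAR = 365 * 24  # 8,760

def core_hours(years_of_compute: float) -> float:
    """Convert 'years of compute' into single-core hours."""
    return years_of_compute * HOURS_PER_YEAR

def implied_price_per_core_hour(years_of_compute: float, total_cost: float) -> float:
    """What the quoted job cost implies was paid per core-hour."""
    return total_cost / core_hours(years_of_compute)

years, cost = 115, 19_555
print(f"{core_hours(years):,.0f} core-hours")                         # ~1,007,400
print(f"${implied_price_per_core_hour(years, cost):.3f}/core-hour")   # ~$0.019
# Finishing ~1M core-hours within one week (168 hours) implies roughly
# 1,007,400 / 168 ≈ 6,000 cores running in parallel.
print(f"~{core_hours(years) / 168:,.0f} cores for one week")
```

At roughly two cents per assumed core-hour, a one-week burst across ~6,000 rented cores lands in the ballpark Stowe quotes, the kind of scale a single project could never justify buying outright.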

Making the Jump

So why don’t more life science companies turn to the cloud?  After hearing stories like those of Novartis and Cycle Computing, I expected to see more companies from our industry at the conference.  

David Houlding, a speaker and panelist from Intel Healthcare (Intel has a healthcare division?), speculated on the resistance to the cloud.

“It’s been a disservice to the industry that the cloud has been framed as new and risky,” he said.  “Actually, what we’re seeing is just the evolution of the data center.”   Houlding insists that the cloud is “not an all or nothing journey.”  

Other panelists discussed resistance within the ranks of an organization’s IT department.  When a move to the cloud is considered, it raises the prospect of lost jobs.  Panel members pointed out that IT departments are understaffed overall, and that engineers who have specialized in on-site data centers can go on to join privacy teams focused on theft protection and other security issues.

“Even though an organization may outsource to the cloud, it is still the one accountable to its customers and regulators,” said Houlding.  “IT folks will become auditors.”

The conference was very commercially focused, being sponsored by the same big software giants that supplied the speakers.  As expected, there was not much pushback or skepticism.

The comments of Intel’s Houlding about the disservice of calling the cloud something new and risky make sense on the one hand.  The life science industry can be especially slow to adopt new ways that are perceived as risky.  In fact, I see this as our industry’s core conflict.  We have crossed new borders before, and must continue to do so, to grow and improve.  Yet we must be careful not to add more risk.

Sometimes a paradigm shift is necessary.  A big jump in conception.  Sometimes it is helpful to say that we need to think in new ways.  Jason Stowe gives a provocative challenge to investigators, and to all of us in the industry.  Can we come up with bigger questions?  Can we think bigger with better compute power?

The question about the cloud, it seems, is not whether to use it, but when.

Supreme Court Invalidates Myriad's BRCA Gene Patents, Allows for cDNA Patents

Author: 
Theral Timpson

The Myriad gene patent case reached its conclusion today when the U.S. Supreme Court handed down a decision that will resonate throughout the life science industry for years to come.

In a rare unanimous decision, the high court ruled that Myriad's BRCA gene patents are invalid.

Justice Clarence Thomas wrote for the court: “A naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated, but cDNA is patent eligible because it is not naturally occurring . . . Had Myriad created an innovative method of manipulating genes while searching for the BRCA1 and BRCA2 genes, it could possibly have sought a method patent. But the processes used by Myriad to isolate DNA were well understood by geneticists at the time of Myriad’s patents” and are not at issue in this case, the decision went on. “Similarly, this case does not involve patents on new applications of knowledge about the BRCA1 and BRCA2 genes.”

However, the court left the door open for patenting cDNA.

“cDNA retains the naturally occurring exons of DNA, but it is distinct from the DNA from which it was derived. As a result, cDNA is not a 'product of nature' and is patent eligible,” continued the decision.

At issue for the life science industry, particularly the booming area of diagnostics, is whether, without patent protection, diagnostics companies such as Myriad will be less likely to develop innovative tests.  Where would the drug industry be, for example, without patents for such therapeutics as Viagra and Lipitor?

The decision will be greeted with different responses around the industry.  Those in academic research positions are generally against gene patents, while those in industry, such as today's guest Mark Trusheim, hope for IP protection for their innovations.

This morning on Twitter scientists were attempting to parse the decision and determine just where the justices had drawn the line.

Ewan Birney, leader of the ENCODE project, appeared confident that the court struck a good balance. "Plenty of real estate each side of the boundary - both public goods side and private innovation," he tweeted.

Scientist Leonid Kruglyak was less enthusiastic about the cDNA part of the decision. He tweeted: "Thomas: 'lab technician unquestionably creates something new when cDNA is made.' // So do I when I make a sandwich."

The Wall Street Journal article this morning asserts that "the court in recent years has sought to constrict the scope of patent protections, concerned that patents were being issued too easily and so broadly as to squelch competition and impede innovation. Justice Elena Kagan at one point alluded to such concerns, describing the Patent and Trademark Office as 'patent-happy.'"

Over at GEN, an article points out that in a blog response, Myriad is seeking to "shift focus away from the patentability of the genes." Myriad noted that under the Affordable Care Act, its BRACAnalysis test is considered preventative, and insurance is required to cover 100% of the cost "for many women. We offer financial assistance to uninsured patients with the greatest need."

The results of today's decision will be far-reaching and hotly debated for some time to come.

"Perhaps there's a silver lining to the ruling, as this will free the industry up to develop broader, algorithmic based tests," says Charles Matthews of Boston Healthcare in an upcoming program.  

We'll be following the decision and its fallout closely.

In ASCO Speech, Hamburg Hints of Change to LDT Regulation

Author: 
Theral Timpson

Is the FDA going to go through with the much-anticipated crackdown on LDTs?

In a post at Forbes, science and medicine reporter Matthew Herper quotes FDA Commissioner Hamburg as follows:

“Advanced diagnostics such as these are the cornerstone of personalized medicine, and their development can only foreshadow the many advances on the horizon. This is truly an exciting time in the history of cancer therapies and their companion diagnostics,” Hamburg said. “Unfortunately, not all complex diagnostics used in cancer diagnosis or treatment have been developed to perform at the same demonstrated standards.

“There is a category of diagnostics called laboratory-developed tests which are produced in and offered by laboratories for use in their own facilities. LDTs are currently marketed without FDA premarket review to determine whether they are safe and effective – whether they are accurate and clinically valid. And that can be a problem.”

Herper took these excerpts from a prepared speech Hamburg gave at the American Society of Clinical Oncology meeting in Chicago this week.

As we've covered in our series, Commercializing Diagnostics, there exists an uneven playing field for diagnostics providers in the U.S.  Roche and Qiagen have been leaders in securing FDA approval for their diagnostic tests, only to see clinical labs using non-regulated copies of those tests.  There has been extensive debate about whether LDTs, or laboratory-developed tests, performed in a CLIA-certified lab should require FDA approval as well.  Some, such as representatives from Roche and Qiagen, argue that the non-regulated tests are done to lower standards.  Others say that basic quality standards can be maintained by CLIA, and that the unnecessary burden of regulation will hamper innovation.

“Historically, FDA exercised enforcement discretion – that is, it did not generally enforce applicable regulatory requirements for these devices, because they were relatively simple, low-risk tests performed on a few patients being evaluated by physicians at the same facility as the lab,” Hamburg says. “But LDT’s have become more sophisticated and complex. Results from these tests are rapidly becoming a staple of medical decision-making, particularly for cancer.”

Hamburg's speech this week indicates that the FDA is moving toward a more level playing field.  How far will the FDA go?  Many experts have been expecting the FDA to pursue a stratified rather than a blanket approach, guiding their limited resources toward the most complex, clinically important tests first.

The move could have a big impact on the diagnostics industry.  Many companies, such as CardioDx, which we featured last year, have bypassed the FDA and gone on to win reimbursement from the major payers.

There is also uncertainty about the IP value of diagnostics due to the ongoing Myriad Genetics gene patent case. Will diagnostic tests be patentable?  In an upcoming interview at Mendelspod, Mark Trusheim, a Special Government Employee for the FDA's Office of the Commissioner, talks about ways other than regulation that diagnostics companies can build the value of their tests.

The diagnostics industry is taking off as we unravel biology and develop new tools, such as next-generation sequencers and improved algorithms for finding ever more meaningful biomarkers.  Meanwhile, regulation and patent issues continue to keep the young industry in flux as it attempts to find solid footing.

UCSC Up To More than Bioinformatics

Author: 
Theral Timpson

UC Santa Cruz is well known in our field for their part in the Human Genome Project.  Led by David Haussler, the bioinformatics group there released the first working draft of the human genome sequence on the web, leading shortly thereafter to the UCSC Genome Browser, an essential open resource for biomedical science.  This was followed last year by the launch of the Cancer Genomics Hub (CGHub), a large-scale data repository for the National Cancer Institute.

But perhaps less well known is that Haussler and his colleagues at the Center for Biomolecular Science and Engineering also run the Institute for the Biology of Stem Cells.  The Institute is one of the major facilities funded by CIRM (California Institute for Regenerative Medicine) and will benefit from the recently announced award of $36 million to attract six world-class scientists to California.  One of these, Richard Gregory of Harvard, will be moving to the Institute at UCSC.  

Camilla Forsberg, Co-Director, UCSC Institute for the Biology of Stem Cells

Last week, we attended the first-ever Stem Cells and Aging Symposium at UCSC, a two-day meeting put on by the Institute at UCSC’s University Center and devoted to connecting stem cell research with research on aging.  We attended, first, because the campus is twenty minutes from our Mendelspod office and we hope to feature more of the researchers and their work on the program.  (Stay tuned for an upcoming interview with David Haussler.)  But also because both stem cell research and aging research are really taking off.  Look for an upcoming series on both topics.

“We’ve seen huge interest in stem cells recently,” Camilla Forsberg, Co-Director of the Institute, told me at one of the breaks.   

The Institute has for several years held a training program for ten students, she went on, explaining how the meeting came about.  Each year the students would get together and present their work at Research Review Day.  Last year, Camilla and her colleagues decided to go bigger with this year’s review and turned it into the two-day symposium.

The increased interest in stem cell research is due to the breakthroughs in the field, she says, and also because stem cell research is offering a much better understanding of basic biology itself.

The meeting was keynoted by Judith Campisi, a well known researcher from the Buck Institute for Research on Aging, and featured expert stem cell scientists presenting their work.   Session topics were:

  • Aging and the stem cell niche
  • Insights on aging from the embryo
  • Aging of tissue-specific stem cells
  • Epigenetics and aging

After the sessions, several hours were devoted to poster presentations in various rooms.

“This is an impressive lineup of speakers for a small conference,” said Amy Ralston, a biology professor at UCSC.  

Big Science and Stem Cell Research

I spoke with Amy and an HHMI investigator, Judith Kimble, after a lunch panel devoted to “building collaborations for stem cell research in aging.”    Judith says that big science collaborations are becoming more important. 

“Science is changing from small science to big science.  Thirty years ago, one hundred percent of research was small science.  Now thirty percent or so is big science,”  said Judith.  

Our audience at Mendelspod will know that this has been an important topic for us.  Both Judith and Amy are fans of ENCODE, a $400 million project funded by the NHGRI to identify all functional elements in the human genome sequence, and saw it as the logical next step after the Human Genome Project.  

I mentioned the arguments made here at Mendelspod by Dan Graur and Michael Eisen that big science is a waste of money which threatens the system of independent biological research that has been going on for the last fifty years.  

“If you remember,” responded Judith, “no one wanted to put money into the Human Genome Project in the beginning.  And now everyone uses the data.”

Judith said she favored the idea of big science collaborations for stem cell research, such as those that were mentioned at the lunch panel, as long as the funding “didn’t damage R01s,” the mainstay of the grant system.

“The Stem Cell Institute at UCSC is capitalizing on the strengths the university already has in genomics,” concluded co-director Camilla Forsberg.    In addition to world class computational biology, the university also boasts an RNA Center.  Richard Gregory, the newcomer from Boston funded by CIRM, is an RNA biologist. 

“And this week’s symposium, which we hope will continue next year, is a great opportunity for training our students and offering researchers from here and other centers a place to show their work,” she said.

The meeting was sponsored by the Ellison Foundation, a major funder of aging-related research, and was supported by a list of commercial sponsors as well. The meeting website says that “this is the only major meeting on stem cells and aging in the United States this year.”

I wonder how long they’ll be able to boast that line.

 

Big Data Takes the Stage at Stanford

Author: 
Theral Timpson

We're currently developing a series on big data here at Mendelspod.  So we jumped at the chance to attend the first 'Big Data in BioMedicine Conference' put on at Stanford in conjunction with the University of Oxford.  The conference gave a great overview of the topic, reaching not only into all that omics data, but health IT and public health as well.  

Stanford is a fitting place for the topic. It's the workplace for what I call the Big Three: Atul Butte, Mike Snyder, and Russ Altman (all of whom have joined us at Mendelspod).  Luminaries in the field from UCSC and Berkeley joined as well, including David Haussler of Human Genome Project fame and Steve Brenner.

Watching some of the conference on livestream (video will be posted soon at bigdata.stanford.edu), I made sure to make it in person to the panel 'Public Health and Big Data Policies'.  I wanted to hear a talk by Forbes columnist and GNS Healthcare CEO, Colin Hill.

Colin has been on my radar, as several guests at Mendelspod have suggested him for an interview.  I have seen his face in dozens of emails, as he co-chairs O’Reilly Media’s Strata Rx Healthcare Big Data Conferences.  He began his presentation with a quote by John Wanamaker, thought by some to be the father of modern marketing (not a character on Mad Men):

“Half the money I spend on advertising is wasted.  But I don’t know which half.”

Based in Kendall Square in Cambridge, MA, (another hub for big data and bio innovation), GNS Healthcare is using big data analytics to help in everything from drug development to patient diagnosis.

“Just as Pay Per Click solved Wanamaker’s dilemma, we are showing what is working for whom in healthcare.  We have the data,” Colin said in a steady, confident presentation.

The holy grail for GNS and others in the field is to update the standard of care with individualized treatment algorithms.  For example, GNS uses three-dimensional modeling to analyze patient characteristics and predict drug effectiveness.

Another example is their predictive test for “metabolic syndrome.”  This is a condition based on a combination of a patient’s baseline health measures that, when occurring together, increase the risk of developing cardiovascular disease and diabetes.  If you are elevated in at least three of five basic areas (blood sugar, blood pressure, body mass index, cholesterol, and triglycerides), you are considered to have metabolic syndrome.  Colin pointed out that 4% of Americans are diagnosed, but that it’s estimated that 25% of us have it.  According to Hill, GNS’ test has a predictive power for the syndrome of 88%.
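That three-of-five checklist is simple enough to sketch in a few lines of Python; the thresholds below are my own illustrative numbers, not GNS's criteria, and a checklist is obviously not their predictive model:

```python
# Minimal sketch of the three-of-five metabolic syndrome rule described above.
# The thresholds are my own illustrative numbers, not GNS Healthcare's criteria.

THRESHOLDS = {
    "fasting_glucose_mg_dl": 100,    # blood sugar
    "systolic_bp_mm_hg": 130,        # blood pressure
    "bmi": 30,                       # body mass index
    "total_cholesterol_mg_dl": 200,  # cholesterol
    "triglycerides_mg_dl": 150,      # triglycerides
}

def has_metabolic_syndrome(patient: dict) -> bool:
    """Flag a patient who meets or exceeds the cutoff in at least 3 of the 5 areas."""
    elevated = sum(patient[name] >= cutoff for name, cutoff in THRESHOLDS.items())
    return elevated >= 3

example = {
    "fasting_glucose_mg_dl": 112,
    "systolic_bp_mm_hg": 138,
    "bmi": 27,
    "total_cholesterol_mg_dl": 215,
    "triglycerides_mg_dl": 120,
}
print(has_metabolic_syndrome(example))  # True: glucose, blood pressure, cholesterol
```

The interesting part of GNS’s work is presumably predicting who will cross these thresholds before they do, which is where the big data modeling comes in.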

As Colin spoke I began to get the big data big picture and consider all the algorithms and tests that companies like GNS could run.  Ways of improving drug development, clinical trials, omic profiling such as that done by Mike Snyder and 23andMe, drug effectiveness and toxicity--you name it.  And how are doctors going to keep up with all this information?  It’s impossible.  And how much of this will be regulated by the FDA?

It’s obvious that private commercial enterprises such as GNS will be playing a major role in the future of healthcare.  I’m beginning to understand what Eric Topol is talking about by “homo digitus” and the patient as consumer.  Hill says that in today’s world of so much data, standard of care has become outdated.  

“Let’s not waste the data,” he said.  “Big data can solve knowledge blind spots and create as much impact as the ACA (Obamacare).”

I was eager to hear from Yael Garten, a data scientist at LinkedIn.  Is this social media giant up to something cool for healthcare?  Yael trained as a bioinformatician at Stanford, a protégé of Russ Altman, before joining LinkedIn.  It turns out Yael didn’t really have anything practical to offer yet.  Just fancy graphics of big omics data side by side with some LinkedIn data, and yes, they looked similar.  But so what?  You can see a connector map like that on the back of your Delta Airlines napkin.

She continued with some statistics and trends.  There’s a big rise in postings for data scientists at LinkedIn.  Fifty percent of email is now attended to on mobile devices.  Desktop usage graphs differently through the day than tablet usage.  Data is being democratized.  So how is LinkedIn using this data to improve health outcomes?  The connection wasn’t made.

The final two presentations on the panel came from Dennis Wall of Harvard and from two scientists at the NIH.

Wall has been working on new diagnostic tools for autism, which he said has become an epidemic.  I had no idea that the number of autistic persons in the US was increasing at such an alarming rate.  It’s gone from one in 110 people to one in fifty!  Wall showed (graphically, of course) that many affected by autism are in rural areas and do not have access to qualified caretakers.  What Wall and his team have done is create a program that can look at amateur video footage and diagnose with a high degree of accuracy whether the individuals shown are autistic.

Peter Lyster is the program director for NIH’s BICB, or biomedical informatics and computational biology program.  He had the important task, along with a colleague whose name I didn’t catch, of communicating the strong commitment the NIH is making to informatics.  About $1 billion of the NIH’s $30 billion budget goes to the sector, and Peter announced a new large initiative of $100 million in grants.  

The presentations were followed by a lively Q & A with the four speakers.  My favorite question came from John Hornberger, an economist and member of an ASCO ethics panel.  He wanted to know from Colin what considerations GNS has made for the ethics of big data.  How are they monitoring the quality of their tests?  Do they look to the FDA for oversight?  The question was aimed at Colin, but applied to all the panelists and all of us at the conference.  Is it right for LinkedIn to take data gathered from all our accounts and postings and sell it to healthcare companies?  What are the dangers of diagnosing autism not in person by a physician, but by a computer using a two-minute video clip?  Is there enough oversight of the natural market forces which use our data in new ways?

Colin’s response was somewhat cavalier as would be expected from an entrepreneur.  He pointed out that GNS is currently looking at 100 million lives through various studies.  That it’s happening now.  That the standard of care is outdated because it is not data driven and people need help.

At the break I met Hornberger to secure an interview for our upcoming series to discuss ethics and big data.

“This is a new industry, a new conference,” he acknowledged.  “We don’t want to throw out the baby by regulating too early.  But there is a discussion to be had.”  

In addition to Hornberger and Hill, Atul Butte, the conference ringmaster and the face of all things bioinformatics, agreed to come on again as part of our own big data series.  And David Haussler from UCSC will be joining us for a discussion about what will be the winning utility platform for genomics data.

For a first stab, the conference was superbly organized, with a glamorous stage setting at the Li Ka Shing building at Stanford.  (The Li Ka Shing Foundation was the major underwriter for the conference.)  Not surprisingly, big data folks are big tweeters.  You can read much more about the event at #bigdatamed on Twitter.


