Blog


The Academia-Pharma Complex

Author: 
Ethan O. Perlstein

I provocatively call the nexus of government research and regulatory agencies, university biology departments and medical schools, and drug companies the Academia-Pharma Complex. This vast public-private partnership financed by US taxpayers to develop drugs is on an unsustainable path and desperately needs Open Science. Reform begins with a diagnosis of what ails us. Many roads lead to the Bayh-Dole Act of 1980, long ago in the pre-Internet Age. Bayh-Dole grants patent rights to non-government entities for inventions resulting from publicly funded research. These non-government entities include universities.

As described in a trenchant analysis in The Economist from 2005, the primary legislative intent behind Bayh-Dole was to spur (and simplify) the commercialization of publicly funded research, which prior to Bayh-Dole was stagnating inside numerous disparate federal agencies engaged in R&D efforts. Once in the private sector, discoveries would be forged into products, in this case new FDA-approved drugs. In a nutshell, here’s how it works. Professors in biology departments spend NIH-disbursed grant money on project proposals that have been positively evaluated by academic review committees. The study of biological processes invariably yields patentable results. In those instances, Professor John or Jane Q. Smith makes a beeline for the university technology transfer office, which has the Herculean task of shepherding patents into the promised land via licensing agreements that generate revenue streams to the university.

However, in practice a deluge of public and private funds and research scientists flows into the drug discovery pipeline, but fewer and fewer drops trickle out the other end. Although I found examples of Bayh-Dole boosters, a balanced scholarly review of the law published in 2006 spelled out the flaws of a closed approach:

“By vesting such comprehensive discretion and flexibility in patenting and licensing with individual institutions, the Bayh-Dole Act provided the nation and the world with a large-scale experiment in how public institutions manage public assets as private goods. The outcomes have been positive on nearly all counts, but the Act inadvertently created a misalignment between the private interests of university technology transfer offices and public interests that benefit the innovation system at large or that enable access to IP for humanitarian purposes.”

I dug around and found more data-driven support for opening up biomedical research. First, consider the now (in)famous graph of Pharma productivity, which shows the number of new drugs approved per billion US$ of R&D spending decreasing over time. It's the opposite of the efficiency gains in computing power dubbed Moore’s Law, hence the flipped moniker Eroom’s Law. Sure, we had a bumper crop of new drug approvals in 2012. Leading the way was the groundbreaker Kalydeco, a new drug approved for the treatment of some cases of cystic fibrosis, the poster child of rare/orphan diseases. But new first-in-class drugs for many devastating diseases, e.g., psychiatric and neurological diseases, are nowhere in sight. Not coincidentally, the success of Kalydeco depended on open collaboration between academics, industry, disease advocacy groups, not-for-profit foundations and patients.

Second, consider the age of scientific independence, i.e., age at first R01 award. The R01 is the bread-and-butter grant for tenure-track and tenured professors in Academia, providing hundreds of thousands of dollars per lab in public funding over multi-year increments. In 1980, the average age of professors receiving their first R01 money was 36. By 2011, the last year for which we have data, the average age had climbed to 42, a plateau it reached at the end of the booming late 1990s, when NIH’s budget was doubled to around $30 billion. It’s hovered there ever since, and, making matters worse, sequestration has driven funding success rates to historic lows.

A prolonged and unnecessary apprenticeship exacerbates the distorting effects that Bayh-Dole has on research choices, and it encourages academics to engorge grant proposals with preliminary results and skimp on truly daring, basic research aims. It’s also a serious misallocation of human capital whose enormous potential would be unleashed in an open system. According to recent stats, fewer than 20% of people who enter the NIH-funded graduate training pipeline emerge on the other end with a tenured professorship. The downward trend was apparent even in the early 90s, when the ratio was closer to 50/50.

Think about this for a second. The very capable and creative people who run the gauntlet of graduate school, one or more postdocs, AND an assistant professor search committee are a highly selected bunch. Even in so-called good years, R01 rejection rates were around 70%, and, believe it or not, decades ago ~40% of NIH grant proposals were funded. We’re squandering so much talent when we ask people who’ve endured a decade of intense, specialized training in exchange for low wages, and who have established scientific excellence as researchers, to hang out for an extra decade at precisely the time when many of them are starting families, usually after delaying parenthood.

Of course, I anticipate challenges from establishment thinking as well as organizational resistance. But the first step to recovery is calling a spade a spade. I expect secrecy from Pharma, but not from Academia, which I think abuses the freedom to pursue knowledge for its own sake on the public’s dime. At the same time, I expect more innovation to originate within Pharma and not simply be hastily imported from Academia with taxpayer subsidies.

Pioneer in the Virtual World Announces Three New Life Science Conferences

Author: 
Theral Timpson

My first LinkedIn invite was sent by a young businessman named Greg Cruikshank.  He was listed as the founder of Labroots.

Not knowing a thing at the time about social media, I wanted to understand more about this site that expected me to type in my resume and share it with the whole world.  And who was Greg Cruikshank and Labroots?

Starting in the industry as a salesman for several of the bigger companies, Greg turned entrepreneur and has been pioneering the power of the internet and web 2.0 to disrupt and protect the life sciences.  Labroots was one of the first social media sites in our industry, and their foray into virtual conferences under the name BioConference Live has been extremely successful.

“This is our fifth year, and we’re growing rapidly,” Don Cruikshank told me on the phone.  He’s Greg’s father and a partner of Labroots.

When I heard from Greg after accepting his LinkedIn request, he was asking me to join his first BioConference Live (BCL) virtual conference.  Book a virtual booth, he urged, and do lead generation without ever leaving your desk or office.

I was the marketing director for a lab consumables company at the time, and we regularly had booths at the bigger trade shows.  I didn’t book a booth that year to the first BCL conference.  But I attended.  And picked up a great deal of helpful information.  Listening to other science marketers.  Seeing the companies in the exhibit hall, and going into the chatrooms.  I was hooked.

BCL conferences are keynoted by great speakers--the same ones we hear at other conferences.  And shows are sponsored by many of the top companies in the industry, including Roche, Siemens, and Thermo.  

But the real big plus is that you can “attend” the event from the comfort of your own workstation, and if you miss any of the talks, they’re available in the archives for months.  You’re not out a week from your own schedule making travel plans, booking a trip, flying, checking into hotels and the whole bit.  Of course when you’re ready for a trip, by all means take it.  I attend many conferences in person, but I still take advantage of this two-day conference.  For those who don’t want to or cannot travel, it’s perfect.  You can walk through the “Expo Hall” and talk with the lead marketers of dozens of life science companies in personal or group chats.

This year BCL is expanding, dividing the generally titled Life Science Conference it has run for four years into three new conferences.  (A disclosure is in order here.  I’m on the advisory board for BioConference Live as of this past year.  And why?  Because I believe in the direction of this conference, and I want to witness firsthand the transformation of the traditional conference.)

The first of the new conferences, Genetics and Genomics, is taking place August 21-22.    The lineup of speakers includes George Church of Harvard, Drew Endy and Mike Snyder of Stanford, Paul Billings of Life Tech, and Charles Cantor of the blossoming  Sequenom.  

The second will be Cancer Research Discovery and Therapeutics and is set for October of this year.  And the third is a Neuroscience conference next March. 

And they are all free.

That such great content is becoming this easy to access--part of life in 2013.  Does it cheapen the content?  Does one have to attend in person to have it count? 

Don Cruikshank points to our shrinking budgets.

“We began back during the ’08 recession and the virtual conference became popular.  There are folks who want to collaborate, but can’t afford the travel.”  He adds, “It is not our intent to replace the physical trade show.”

The company also relaunched their social media platform, www.labroots.com, this year.  A longer list of resources in the left-side menu, including reviews, jobs, and a webinar service, is making it more attractive for life scientists to maintain yet another profile.

Register at www.bioconferencelive.com/1-genetics-and-genomics.html

Genus envy

Author: 
Ethan O. Perlstein

In 1997, a breakthrough was made in rare/orphan disease research. An evolutionarily conserved gene called NPC1 was shown to be responsible for Niemann-Pick disease type C, a degenerative lysosomal storage disorder that affects 1 in 150,000 people on Earth, half of whom manifest symptoms as children. The discovery of NPC1 should have unleashed a torrent of follow-up studies in simple model organisms like yeast, worms and flies, all of which have an ancestral version of NPC1. Instead, what followed was a trickle, with clunky rodent models getting all the basic research attention. Is that partly why, 16 years later, we still don’t have a cure for NPC?

It was once axiomatic to say that model organisms illuminate cellular bits that have been conserved by evolution over the eons. Despite the overwhelming evidence of commonality, the biomedical establishment operates with a mindset of human exceptionalism. According to this mindset: 1) any organism simpler than a mouse or a rat is not relevant to drug discovery; 2) technological advances in human cell in vitro culture and genetic manipulation obviate the need for non-human models. I believe this view is both conceptually flawed and economically inefficient. The basic understanding we so desperately need to cure NPC and the thousands of rare/orphan diseases like it will only come from painting meticulous physiological portraits of human disease on a canvas of simple model organisms, starting with our far-removed unicellular cousins.

Here I present Saccharomyces cerevisiae, which goes by several aliases: budding yeast; brewer’s yeast; baker’s yeast. As you can tell from the monikers, we and yeast go way back. Thousands of years ago the lucky bastard who first stumbled upon a natural fermentation put brew and brew together, and our fates have been entwined since. The use of fungi as model organisms in experimental biology dates back to the 1930s and 1940s and the seminal “one-gene, one-enzyme” auxotrophy studies of George Beadle and Edward Tatum on the bread mold Neurospora. The genome of S. cerevisiae (hereafter yeast) weighs in at 12 Mb, or megabases, and boasts around 5,000 genes. Depending on how the calculation is done, 20% – 30% of yeast genes have a statistically significant match to a human gene at the DNA level. For scale, the human genome is 3,000 Mb, roughly 250 times larger than the yeast genome, and features ~20,000 genes. Yet most biomedical researchers appear to treat that 20% – 30% as though it were 1%, or ignore it altogether. Have they simply forgotten the literature, or is it the hex of human exceptionalism?

It’s not as though that conserved bloc of genes is chopped liver in terms of cellular functions. Obviously included in this tally are enzymes involved in central metabolism, e.g., glycolysis, the breakdown of the sugar glucose into chemical energy. But non-metabolism genes and the proteins they encode are also part of the mix. There’s actin and tubulin, two proteins that comprise the dynamic scaffolding, or cytoskeleton, of cells; histones, the proteins that wrap DNA double helices in a regulatory embrace; clathrin, the triskelion-shaped protein that forms buckyball-like coats around the small membrane sacs called vesicles. And it’s not just the pipes and drywall that are shared. Even complex enzymes like kinases are conserved from yeast to humans, including one of my favorites, TOR (Target Of Rapamycin), an ancient nutrient sensor.

The full force of evolutionary conservation is nowhere more persuasively felt than in gene-replacement experiments. If DNA sequence alignment indicates that two genes are related in organisms separated by over a billion years of evolution, how do we know that this DNA sequence similarity translates into functional interchangeability? Swap the human version in for the yeast one, and see if the cell or organism behaves normally. It’s a concept from genetics called complementation. It must have been in those heady early days of such experiments that the expression “the awesome power of yeast genetics” was born. Once I got a taste for yeast in my first-year graduate school laboratory rotations, there was no turning back. In my graduate and postdoctoral research over the last decade, I’ve been trying to connect basic discoveries made in yeast to human diseases, and now my focus is rare/orphan disease.

Studying yeast alone is not going to cure NPC, but if you take evolutionary conservation at face value, the awesome power of yeast genetics is a modest down payment on a cure.

Time for a Rare Disease Moon Shot

Author: 
Ethan O. Perlstein

Why is it so difficult to translate biological discoveries into effective drugs for diseases that are triggered by a single faulty gene?

Once upon a time we stamped out smallpox. We went to the Moon and back in a decade, then sequenced the human genome. We've made giant leaps before. The holdup doesn't appear to be lack of talent or money. Right now we're experiencing a generational glut of underemployed or downsized professionally-trained life scientists – postdocalypse, anyone? The largest pharmaceutical companies are sitting on tax-sheltered cash reserves totaling several NIH annual budgets. (Note: 1 NIH annual budget = $29B).

Some say it’s just really, really hard to bring a drug to market, and there’s certainly truth to that. But it’s not time-travel hard. Drug failures seem to arise from misaligned incentives and scientific blind spots rather than conceptual impenetrability. On the one hand, I blame Big Pharma for rampant targetophilia, or drug-action reductionism. Blockbuster magic bullets, e.g., Gleevec, were never going to be economically sustainable, and this groupthink has contributed to declining productivity in drug development, i.e., Eroom’s Law. On the other hand, academia's Tenure Games are no picnic. The “publish or perish” ethos thwarts rapid dissemination and sustained cross-pollination of human disease-model discoveries. Even projects that lead to a glorious Cell paper can be abandoned when a postdoc leaves the lab, left to languish for years behind a paywall, or snuffed out much earlier at the proposal stage by risk-averse study sections.

I’m not the only person bypassing pharmageddon and postdocalypse to become an independent scientist, even if I'm one of the most vocal about it. And I hope that I’m not the only person interested in applying my scientific curiosity toward a long-neglected challenge that deserves its own Moon Shot: curing all 7,000 rare diseases. Think of each rare disease as a basic science puzzle waiting to be solved. And think of all the zeal and resourcefulness of rare disease advocates, particularly rare disease parents, waiting to be harnessed by a collaborative critical mass of professional life scientists. We know these puzzles are solvable. For example, Vertex Pharmaceuticals’ highly profitable and successful cystic fibrosis (CF) drug Kalydeco is a monumental pharmacological achievement. But developing this treatment took 23 years from the moment the cystic fibrosis gene was discovered and cost hundreds of millions of dollars, with critical seed investment from the Cystic Fibrosis Foundation, and now people with a drug-responsive form of CF must pay $200,000 to $400,000 per year for the right to live.

This summer, I'll be prototyping a cheaper, modular, evolutionarily informed drug repurposing approach to rare disease research that seeks to compress the time between rare disease diagnosis and rare disease drug discovery at a fraction of the $50-$100M that I’ve been told is the minimum cost threshold for serious drug discovery. In plain terms, I want to identify off-the-shelf candidate treatments for a million bucks or less, and do it in less than the time it takes to review a government grant.

A rare disease Moon Shot starts with the rare diseases about which we know the most, and the lessons learned from initial forays will be applied to increasingly challenging cases, e.g., undiagnosed rare diseases. I think that for the subset of diagnosed rare diseases that have been studied extensively in model organisms, like the lysosomal storage disorders, preclinical leads could be generated rapidly and then validated in patient-derived cells based on knowledge that is languishing behind paywalls. Public investment in model organism research has "de-risked" the earliest stages of rare disease drug discovery. The take-home message is: rare diseases don’t discriminate evolutionarily!

 

 

Indie science

Author: 
Ethan O. Perlstein

Transitioning to scientific independence from the tribal world of government-sponsored biomedical research is not easy. The stars have to align in your personal life. For example, having a supportive spouse who will quit their job and move across the country with you. Alas, searing curiosity alone cannot pay the bills. You’ll need a day job, or better yet a complementary, part-time consulting gig that leaves you enough time to do professional-grade science. Even after all that, you’ll still need a place to do the experiments – and of course a way to pay for it.

When I began my search for lab space in earnest at the end of April, I didn’t think I’d find a turnkey option that was not only yeast-friendly but also in biking distance of my Oakland abode. I dutifully put out the word on my lab website, made lots of cold calls, and reversed course at several cul-de-sacs. Luckily, it all paid off: yesterday I signed a short-term lease on a space in Berkeley!

This quest for a bench began with leads that I had rounded up before I even touched down in the Bay Area. Now keep in mind that I don’t need an entire lab, just enough bench space to conduct genetic and chemical screens with yeast cells. I don’t need core facilities or expensive, esoteric instruments. The experiments I’ve planned require consumables and reagents that are standard fare in any modern molecular biology lab; experiments that I can’t do for lack of expertise or equipment I’ll outsource on a fee-for-service basis. I also don’t need employees, as I’ll be doing most of the experiments myself; nor a grant, as I’ll be self-funding this initial foray. (Fear not: a post dedicated to funding is in the works.)

At one extreme is the biohacker/DIY bio option, e.g., BioCurious. At the other extreme is the biotech incubator option, e.g., QB3. Neither is the best fit for me right now. I admire the hacker ethos, but BioCurious primarily serves hobbyists and fledgling scientists as opposed to professionally trained scientists. QB3 primarily incubates life science startups of up to 5 team members. Right now Perlstein Lab is just yours truly, and I want to retain this flexibility as I continue to seek out partners and patrons for a rare disease Moon Shot.

So I kept sleuthing, and received feedback from unexpected sources. For example, Kevin Lustig, the founder of Assay Depot, noticed my query from a LinkedIn share and suggested that I cold call local contract research organizations (CROs) to find out if any of them would sublease space to me. I reached out to a dozen or so CROs that seemed to fit my specs but these leads all came up empty. Then one day I got an email from Pia Abola, whom I’d never met. She used to work at an independent institute affiliated with UC Berkeley. She said based on my description her old lab might be a good fit.

So that very day I sent an email to the director of this institute, which is called VTT/MSI Molecular Sciences Institute (hereafter just MSI). As I wrote above, it’s a turnkey lab, which means fully stocked with exactly the common equipment that I need for yeast work. I’d have to pay a premium for turnkey, more than double what QB3 charges for a comparable albeit completely empty bench. In spite of the higher rent, another factor weighed in my decision: time. At QB3 I’d have to sign a one-year lease, and communal equipment would be hit or miss, whereas at MSI I could start with a short-term lease. Besides, the $900 – $1,000 per month that QB3 charges startups at, say, the East Bay Innovation Center would quickly balloon with the cost of basic equipment like pipettors, which I’d be responsible for purchasing.

With a tentative lease agreement in hand, the next step in the process proved to be the most challenging – finding liability insurance. It’s been a wild goose chase. I know I’m at the leading edge because underwriters at popular insurance companies, e.g. Farmers or State Farm, won’t cover independent scientists. Practically speaking, I squandered my first leads not knowing exactly how to describe myself or my situation. After each decline though, insurance brokers offered parting sage advice on what to say and what not to say the next time.

If you’re considering going down the independent scientist path, know that when dealing with insurance brokers you should refer to yourself as a one-person startup or a sole proprietor. Tell the broker you’re looking for commercial liability for “premises and ops.” In terms of liability caps, MSI, and probably any other laboratory landlord, wants $1MM. As a result, my monthly premium will be higher than typical renter’s insurance.

So…what projects will I be working on? I gave a preview of some of the experiments I’ll be doing in the form of an open proposal in early May. I’m still integrating all the helpful comments, which are spread out on my lab website, Facebook and Twitter. A new post will be the first order of business next week, after I complete safety training, etc. Ever in the spirit of Open Science, I will be broadcasting my scientific method live, trying my best to follow in the footsteps of open lab notebook pioneers like Rosie Redfield, Anthony Salvagno, and an old grad school friend and newly minted assistant professor Greg Lang.

As an early adopter of the independent scientist model, I don’t pretend to have all the answers or a crystal ball. But I take inspiration from the independent scientists who tinkered and explored outside universities for centuries, long before the Academia-Pharma Complex began to crowd them out after 1945. More and more unabsorbed academic trainees will go indie over a second postdoc. Same goes for all the downsized Pharma scientists.

Gene Patents No More

Author: 
Eric Schuur

As most of you know at this point, on June 13, 2013, the Supreme Court of the United States ruled essentially that native DNA sequences are not patentable subject matter.  The question ended up with the Supreme Court precisely because there are good arguments on both sides and, as you would expect, there was a lot of highly charged rhetoric exchanged leading up to the decision.  I’m not going to settle those questions here, but I did think it was worth a few moments musing about the less technical aspects of the patent issues.

I have tended to side with those who believe that gene sequences, at least the ones that exist in the body, should not be patented because they are a product of nature—it just feels like giving too much away to me.  However, as one of the founders and early investors in Myriad Genetics shared in The Wall Street Journal, without the monopoly guaranteed by a patent, investors would not have anted up to launch a company to develop the genetic tests, certainly not at that high-risk time (circa 1991) when it was far from certain whether these tests would be worth anything.  Others have argued that the monopoly on the gene sequences that Myriad (and others, with respect to thousands of other genes) have enjoyed has impeded progress in understanding genetic function and utility.

As with most persistent debates, both points of view are probably true in part.  From the perspective of the pro-patent camp, a large shift in how diagnostic products are developed was necessary to bring tests such as these to market.  A huge promise of the human genome project was to provide for health care based on the sequence of an individual’s genome—personalized medicine.  But it was a promise made, lo these many long years ago, when we really had no idea if it would actually work.  To get to market, we would need to generate a large amount of experience with these sophisticated new tests, the operating characteristics of which were largely unknown.  Would genetic prediction of cancer susceptibility actually work?  Would patients and doctors actually find the test useful?  Much different than measuring the number of white blood cells in a blood sample.

To get physicians and patients to order the tests, data on the validity of the tests was needed.  That required a lot of free testing to generate the data on the relationship between the gene variations and cancer incidence.  Myriad Genetics, as well as Genomic Health and other vendors of gene-based tests, have invested heavily in clinical validation of their tests in order to convince patients and physicians of their value.  Investors paid for much of this and patent monopolies were their reward.

On the anti-patent side of things, I wonder if Myriad Genetics has shot itself in the foot by jealously guarding its monopoly and appearing to be greedy (even after the Supreme Court decision, Myriad continues to aggressively pursue its perceived monopoly, see this article).  Stanford University and UCSF famously structured their genetic engineering patents to allow broad-based licensing, winning high levels of praise for licensing savvy and social conscientiousness.  Their patents did not meet the raucous challenges faced by Myriad.  In contrast, articles such as the one mentioned above and this article decry the excessive costs imposed by companies for not only diagnostics, but therapeutics as well.

Now I haven’t sat down and pored over Myriad’s (or anyone else’s) financial statements to ascertain if they really need to charge $4,000 per test to recoup their investment and earn a reasonable rate of return for their investors.  But what is clear is that $4,000 is very high compared to what the world is used to paying for diagnostic tests.  It probably would have helped Myriad if they had been more transparent about why they needed to charge that amount, given that their soon-to-be competitors are planning to charge under $1,000 in some cases.
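
To make that concrete, here is a rough, purely illustrative payback calculation in Python.  Every input below is a made-up, hypothetical figure (I have not seen Myriad’s actual development costs, per-test costs, or test volumes); the only point is to show how sensitive the payback period is to the price per test.

    # Purely illustrative payback arithmetic -- all inputs are hypothetical,
    # not Myriad's actual financials.
    def years_to_recoup(upfront_investment, price_per_test, cost_per_test, tests_per_year):
        margin_per_test = price_per_test - cost_per_test
        return upfront_investment / (margin_per_test * tests_per_year)

    # Hypothetical: $500M sunk into development and clinical validation,
    # $500 to run each test, 100,000 tests per year.
    print(years_to_recoup(500e6, 4000, 500, 100_000))  # ~1.4 years at a $4,000 price
    print(years_to_recoup(500e6, 1000, 500, 100_000))  # ~10 years at a $1,000 price

Under those invented numbers, a $4,000 price pays back the sunk investment in a year or two, while a sub-$1,000 price stretches payback out to a decade.  Whether Myriad’s real figures look anything like this is exactly the question its financial statements would answer.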

Since our government grants patent monopolies for the betterment of our society in general, I wonder if it might be prudent for companies and other patent holders to consider public reaction to how they handle the right to charge what the market will bear.  If the public perception is that a patent owner’s behavior is not in the best interest of society, the owner may be sacrificing goodwill, which ultimately, in closely watched cases such as this one, might tip the balance one way or the other.

Follow me on Twitter: @erschuur

Biotech Journalist Shares Photos, Trip to Top of North America

Author: 
Theral Timpson

Luke Timmerman is the National Biotech Editor for Xconomy. His weekly BioBeat columns on Mondays are a favorite among industry veterans.

Last month Luke took three weeks to climb Denali (Mt. McKinley) in Alaska. I was blown away by the photo essay he posted on his return. Having climbed a few mountains, including Mt. Whitney here in California, I know a bit about what it takes.

At roughly 20,300 feet, Denali is the highest peak in North America and presents climbers with some formidable obstacles. For one, the rise from base to peak is greater than that of any other mountain entirely above sea level. And being in Alaska, the mountain is farther from the equator than any of the other big climbs around the globe, which means the barometric pressure is lower; air is harder to come by.

Luke's pictures are stunningly beautiful. Jagged peaks topping immense glaciers. Exhausted, ice-tipped, bearded faces and smiles. That Luke worked his way up to this with rigorous daily training and then made it to the summit--what to say?   I'm inspired.

More stories from the trip arranged as lessons on leadership.

"Cloud First" Insist IT Experts at 5th Annual Cloud Slam Conference

Author: 
Theral Timpson

Thomas Barton is an IT engineer at Novartis whose job is to link things together.   When Barton wanted to upgrade the company’s middleware--software that connects one piece of software to another--he was encouraged by a colleague to turn to “the cloud.”

In fact, folks in the IT industry are more and more promoting a “cloud first” strategy. Just starting a new business?  Don’t buy your own software.  Start with the cloud.  Upgrading software for your existing business?  Step into the cloud.  These are folks from Microsoft, Intel, Dell and other IT giants, and their mantra is clear:  Waste no more time or money.  Go to the cloud first.  The message rang loud and clear at the Cloud Slam conference in Santa Clara this week, which focused on cloud computing in healthcare and the life sciences.

Stats First

Let’s start with some statistics and projections made by keynote speaker Mark Weiner of Microsoft:

  • Data will grow by 4400% over the next 10 years
  • By 2015, there will be 2x more smartphones than people
  • By the end of 2016, 60% of healthcare organizations will be taking data into the cloud
  • Only 16% of healthcare organizations know where archived data resides
  • There are 600 million imaging studies done per year
  • By 2016, there will be one exabyte worth of medical imaging

With all the data coming down the pike, software engineers say there is no way around it.  We must go virtual.

The Growth/Security Conflict

The issue of whether to use the cloud is much like the “open science” conflict I’ve discussed in previous blogs.  To grow in a cost-effective and efficient way, one must use the cloud.  Yet to maintain the highest security for sensitive data, the thinking goes, one must not use the cloud.

“There’s no way our data is going in the cloud,” is a common line industry veterans have heard over the past few years.  Yet, business by business, industry by industry, conference by conference IT engineers are pushing back. 

Security is the main concern, and it understandably took up the bulk of the panel discussions throughout the Cloud Slam conference, which were populated with leaders from Microsoft, Dell Boomi, Intel, and other software giants.

So how do these experts address the concern over security and privacy?  Healthcare is a heavily regulated industry.  

“The cloud can provide better security,” said Microsoft’s Hector Rodriguez.  “In fact, we already live in the cloud.”

Microsoft’s answer to the Amazon cloud is Windows Azure, touted as “a cloud for modern business.”  Hector went on to explain that without a comprehensive data strategy, employees of  healthcare organizations resort to “work around” methods, such as email and social media.  Any data shared in this fashion is not secure.  Using one system hosted in a cloud platform where data integrity is maintained according to the needs of the organization can thereby improve security.

The cloud is also better for disaster recovery.  During natural disasters, such as the recent Sandy storm in the northeast, millions of dollars that were never budgeted are spent recovering data centers.  The cloud is not limited by local conditions.  When organizations use the cloud, data backup and recovery can be done within hours.

So are concerns over security overblown?  Panelists at the conference urged healthcare organizations to “push your vendors.”  Specialized cloud platforms can be HIPAA compliant and can be made to work with the demands that regulators such as the FDA make on clinical trial data.  In fact, it was pointed out, an FDA cloud workgroup has just begun meeting.

An important trend helping businesses take advantage of the cloud is the “hybrid cloud”: using a combination of platforms, perhaps one internal and one virtual, or two separate virtual systems.

The Novartis Story

Thomas Barton of Novartis began with one project.  Their middleware was just taking too long and costing too much.  Over the past seven months Barton and his team have changed from enterprise software hosted on sixteen servers internally to eight servers in the cloud hosted by Dell Boomi.  Though Barton didn’t share the exact cost savings from this transfer, he did say that the company was now saving 30% for this application.  

The Novartis case is an example of a hybrid system.  They are converting one project at a time and are still able to keep all their various software connected.

Barton says skeptics in the company asked, “oh no, where does my data go?”  He replies that it doesn’t go anywhere: the middleware is configured and coded in the cloud, but the data is still local.

Novartis deals with sensitive, regulated data, and according to Barton, “everything is validated and qualified.”  Disaster recovery has been vastly improved.  And, Barton says he’ll see payback within the first year.

There were more success stories.  Jason Stowe is the founder and CEO of Cycle Computing and devotes his time to life science customers.  He’s convinced that cloud computing has the power to advance science in a dramatic way.

“Researchers can ask bigger questions with new compute technologies,” Stowe said in a breakout session.  

In one project, Stowe was able to turn “39 years of science into 11 hours” with his cloud platform.  Going to the cloud versus using traditional in house servers, reduced the cost of the project from $44 million to $4,372.  

With a genomics project, Stowe’s company did what would normally be 115 years of compute hours in one week for $19,555.

I’m not sure how Jason calculated his numbers, but they’re getting him into some big doors.  Cycle Computing serves most of big pharma and some large genomic projects.
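
For what it’s worth, here’s my own back-of-the-envelope check on those figures, assuming “years of science” means single-core compute time (my assumption, not necessarily how Stowe counts it):

    # Rough sanity check of the Cycle Computing figures -- my own assumptions,
    # treating "years of science" as single-core compute time.
    HOURS_PER_YEAR = 365 * 24

    def implied_scale(compute_years, wall_clock_hours, dollars):
        core_hours = compute_years * HOURS_PER_YEAR
        cores = core_hours / wall_clock_hours   # cores running in parallel
        price = dollars / core_hours            # implied cost per core-hour
        return round(cores), round(price, 3)

    print(implied_scale(39, 11, 4372))        # ~31,000 cores at ~$0.013 per core-hour
    print(implied_scale(115, 7 * 24, 19555))  # ~6,000 cores at ~$0.019 per core-hour

A penny or two per core-hour across tens of thousands of cores spun up on demand is at least in the ballpark of what cloud spot pricing made possible, which presumably is how such eye-popping numbers come about.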

Making the Jump

So why don’t more life science companies turn to the cloud?  After hearing stories like those of Novartis and Cycle Computing, I expected to see more companies from our industry at the conference.  

David Houlding, a speaker and panelist from Intel Healthcare (Intel has a healthcare division?), speculated on the resistance to the cloud.

“It’s been a disservice to the industry that the cloud has been framed as new and risky,” he said.  “Actually, what we’re seeing is just the evolution of the data center.”   Houlding insists that the cloud is “not an all or nothing journey.”  

Other panelists discussed resistance within the ranks of an organization’s IT department.  When the idea of moving to the cloud comes up, it can raise the prospect of lost jobs.  Panel members pointed out that, overall, IT departments are understaffed, and that engineers who have specialized in on-site data centers can go on to join privacy teams focused on theft protection and other security issues.

“Even though an organization may outsource to the cloud, it is still the one accountable to its customers and regulators,” said Houlding.  “IT folks will become auditors.”

The conference was very commercially focused, sponsored by the same big software giants that supplied the speakers.  As expected, there was not much pushback or skepticism.

The comments of Intel’s Houlding about the disservice of calling the cloud something new and risky make sense on the one hand.  The life science industry can be especially slow to adopt new ways that are perceived as risky.  In fact, I see this as our industry’s core conflict.  We have crossed, and must continue to cross, new borders to grow and improve.  Yet we must be careful not to add more risk.

Sometimes a paradigm shift is necessary.  A big jump in conception.  Sometimes it is helpful to say that we need to think in new ways.  Jason Stowe gives a provocative challenge to  investigators, and all of us in the industry.  Can we come up with bigger questions, can we think bigger with better compute power?

The question about the cloud, it seems, is not if, but when.

Supreme Court Invalidates Myriad's BRCA Gene Patents, Allows for cDNA Patents

Author: 
Theral Timpson

The Myriad gene patent case reached its final point today when the U.S. Supreme Court handed down a decision that will resonate throughout the life science industry for years to come.

In a rare unanimous decision, the high court ruled that Myriad's BRCA gene patents are invalid.

Justice Clarence Thomas wrote for the court: “A naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated, but cDNA is patent eligible because it is not naturally occurring . . . Had Myriad created an innovative method of manipulating genes while searching for the BRCA1 and BRCA2 genes, it could possibly have sought a method patent. But the processes used by Myriad to isolate DNA were well understood by geneticists at the time of Myriad’s patents” and are not at issue in this case, the decision went on. “Similarly, this case does not involve patents on new applications of knowledge about the BRCA1 and BRCA2 genes.”

However, the court left the door open for patenting cDNA.

“cDNA retains the naturally occurring exons of DNA, but it is distinct from the DNA from which it was derived. As a result, cDNA is not a 'product of nature' and is patent eligible,” continued the decision.

At issue for the life science industry, particularly the booming area of diagnostics, is whether, without patent protection, diagnostics companies such as Myriad will be less likely to develop innovative tests.  Where would the drug industry be, for example, without patents for such therapeutics as Viagra and Lipitor?

The decision will be greeted with different responses around the industry.  Those in academic research positions are generally against gene patents, while those in industry, such as today's guest, Mark Trusheim, are hoping for IP protection for their innovations.

This morning on Twitter scientists were attempting to parse the decision and determine just where the justices had drawn the line.

Ewan Birney, leader of the ENCODE project, appeared confident that the court struck a good balance. "Plenty of real estate each side of the boundary - both public goods side and private innovation," he tweeted.

Scientist Leonid Kruglyak was less enthusiastic about the cDNA part of the decision. He tweeted: "Thomas: 'lab technician unquestionably creates something new when cDNA is made.' // So do I when I make a sandwich."

The Wall Street Journal article this morning asserts that "the court in recent years has sought to constrict the scope of patent protections, concerned that patents were being issued too easily and so broadly as to squelch competition and impede innovation. Justice Elena Kagan at one point alluded to such concerns, describing the Patent and Trademark Office as "patent-happy.""

Over at GEN, an article points out that in a blog response, Myriad is seeking to "shift focus away from the patentability of the genes. Myriad noted that under the Affordable Care Act, the BRCAnalysis test is considered preventative, and insurance is required to cover 100% of the cost “for many women. We offer financial assistance to uninsured patients with the greatest need.”"

The results of today's decision will be far reaching and hotly debated for some time to come.   

"Perhaps there's a silver lining to the ruling, as this will free the industry up to develop broader, algorithmic based tests," says Charles Matthews of Boston Healthcare in an upcoming program.  

We'll be following the decision and its fallout closely.

In ASCO Speech, Hamburg Hints of Change to LDT Regulation

Author: 
Theral Timpson

Is the FDA going to go through with the much anticipated crackdown on LDTs?

In a post at Forbes, science and medicine reporter Matthew Herper quotes FDA Commissioner Hamburg with the following:

“Advanced diagnostics such as these are the cornerstone of personalized medicine, and their development can only foreshadow the many advances on the horizon. This is truly an exciting time in the history of cancer therapies and their companion diagnostics,” Hamburg said. “Unfortunately, not all complex diagnostics used in cancer diagnosis or treatment have been developed to perform at the same demonstrated standards.

“There is a category of diagnostics called laboratory-developed tests which are produced in and offered by laboratories for use in their own facilities. LDTs are currently marketed without FDA premarket review to determine whether they are safe and effective – whether they are accurate and clinically valid. And that can be a problem.”

Herper took these excerpts from a prepared speech Hamburg gave at the American Society for Clinical Oncology in Chicago this week.

As we've covered in our series, Commercializing Diagnostics, there exists an uneven playing field for diagnostics providers in the U.S.   Roche and Qiagen have been leaders in securing FDA approval for their diagnostic tests, only to see clinical labs using non-regulated copies of their tests.  There has been extensive debate about whether LDTs, or laboratory developed tests, done in a CLIA-certified lab should require FDA approval as well.  Some, such as representatives from Roche and Qiagen, argue that the non-regulated tests are done at lower standards.  Others say that basic quality standards can be maintained by CLIA, and that the unnecessary burden of regulation will hamper innovation.

“Historically, FDA exercised enforcement discretion – that is, it did not generally enforce applicable regulatory requirements for these devices, because they were relatively simple, low-risk tests performed on a few patients being evaluated by physicians at the same facility as the lab,” Hamburg says. “But LDT’s have become more sophisticated and complex. Results from these tests are rapidly becoming a staple of medical decision-making, particularly for cancer.”

Hamburg's speech this week indicates that the FDA is moving toward a more level playing field.  How far will the FDA go?  Many experts have been expecting the FDA to pursue a stratified rather than a blanket approach, guiding their limited resources toward the most complex, clinically important tests first.

The move could have a big impact on the diagnostics industry.  Many companies, such as CardioDx, which we featured last year, have bypassed the FDA and gone on to win reimbursement from the major payers.

There is also uncertainty about the IP value of diagnostics due to the ongoing Myriad Genetics gene patent case. Will diagnostic tests be patentable?  In an upcoming interview at Mendelspod, Mark Trusheim, a Special Government Employee for the FDA's Office of the Commissioner, talks about ways, other than regulation, that diagnostics companies can build the value of their tests.

The diagnostics industry is taking off as we unravel biology and develop new tools, such as next generation sequencers and improved algorithms for finding ever more meaningful biomarkers.  And regulation and patent issues continue to keep the new industry in flux as it attempts to find solid footing.


