Saturday, December 15, 2007
In the early 1970s, physicians were finally forced to abandon their belief that, given the vast array of effective antimicrobial agents, virtually all bacterial infections were treatable. Their optimism was shaken by the emergence of resistance to multiple antibiotics among such pathogens as Staphylococcus aureus, Streptococcus pneumoniae, Pseudomonas aeruginosa, and Mycobacterium tuberculosis. The evolution of increasingly antimicrobial-resistant bacterial species stems from a multitude of factors that includes the widespread and sometimes inappropriate use of antimicrobials, the extensive use of these agents as growth enhancers in animal feed, and, with the increase in regional and international travel, the relative ease with which antimicrobial-resistant bacteria cross geographic barriers.
Staphylococcus aureus is perhaps the pathogen of greatest concern. S. aureus is a gram-positive bacterium that colonizes the skin and is present in about 25–30% of healthy people. The species has high intrinsic virulence and acquires antibiotic resistance either by gene mutation or by horizontal transfer from another bacterium. S. aureus is a primary cause of lower respiratory tract and surgical site infections, and is also the leading cause of hospital-acquired bacteremia, pneumonia, and cardiovascular infections.
As rapidly as new antibiotics are introduced, staphylococci have developed efficient mechanisms to neutralize them. Resistance to penicillin appeared soon after it was introduced into clinical practice in the 1940s. The effect was initially confined to a small number of hospitalized patients, but resistance spread as use of penicillin increased, first to other hospitals and then into the community. By the late 1960s, >80% of community- and hospital-acquired S. aureus isolates were resistant to penicillin.
The evolution of resistance, which first emerges in hospitals and then spreads to the community, is an established pattern that recurs with each new wave of antimicrobial resistance. Methicillin, introduced in 1961, was the first of the semisynthetic penicillinase-resistant penicillins. Its introduction was rapidly followed by reports of methicillin-resistant isolates. Recent data suggest that the evolution and spread of methicillin-resistant S. aureus (MRSA) is following a wavelike emergence pattern similar to that of penicillin resistance: first detected in hospitals in the 1960s, methicillin resistance is now increasingly recognized in the community.
Glycopeptide antibiotics are used as a last resort.
Vancomycin is a glycopeptide antibiotic originally developed by Eli Lilly for penicillin-resistant staphylococci and fast-tracked by the FDA in 1958. Vancomycin and other subsequently developed glycopeptide antibiotics have never been used as first-line treatment for S. aureus infections, largely because of the development of methicillin and related analogs, and because they must be administered intravenously. However, with steadily rising MRSA cases, the use of vancomycin and other glycopeptides as a last resort against these resistant infections has become increasingly widespread. Not surprisingly, then, the first reports of vancomycin-intermediate S. aureus (VISA) began to come from around the globe in 1996. Since then, a number of fully resistant (VRSA) strains have been reported. Furthermore, these strains tend to be multidrug resistant (including to teicoplanin, another glycopeptide) against a large number of currently available antibiotics, compromising treatment options. At the moment, as there are no formal treatment recommendations, identified strains of VISA or VRSA must be submitted to laboratory screening to determine a potential antibiotic regimen.
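As an illustration of the kind of screening logic involved, the sketch below classifies an isolate by its vancomycin minimum inhibitory concentration (MIC). The breakpoints are the CLSI values as revised in 2006 (susceptible ≤ 2 µg/mL, intermediate 4–8 µg/mL, resistant ≥ 16 µg/mL); the function itself is purely hypothetical and not part of any clinical protocol.

```python
def classify_vancomycin_mic(mic_ug_per_ml: float) -> str:
    """Classify an S. aureus isolate by its vancomycin MIC (ug/mL).

    Breakpoints follow the CLSI values as revised in 2006:
    susceptible <= 2, intermediate 4-8 (VISA), resistant >= 16 (VRSA).
    """
    if mic_ug_per_ml <= 2:
        return "VSSA (susceptible)"
    elif mic_ug_per_ml <= 8:
        return "VISA (intermediate)"
    else:
        return "VRSA (resistant)"

print(classify_vancomycin_mic(1))   # VSSA (susceptible)
print(classify_vancomycin_mic(32))  # VRSA (resistant)
```

In practice MIC values come from doubling-dilution series (1, 2, 4, 8, 16… µg/mL), which is why the thresholds land on powers of two.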
So that brings us up to speed on the state of the perpetual war between humans and microbes. This is certainly a hot topic right now, given recent news reports of MRSA outbreaks in the community. It is important to remember that before the discovery and widespread use of penicillin, bacterial infections were by far the biggest cause of mortality in the early 20th century, and microbes like S. aureus are constantly using the power of Darwinian evolution to find a way to reclaim that title. Furthermore, antibiotics have largely been overlooked by the bigger pharmaceutical firms, despite the volumes of increased molecular understanding of these organisms coming out of academia. This is changing quickly, however, so next time we'll look at the competitors who are developing and marketing novel antibiotics, and cover Targanta's number one candidate, oritavancin, and how it measures up to the science. Stay tuned.
Wednesday, December 12, 2007
Tuesday, November 13, 2007
Biolex Therapeutics is a clinical-stage biopharmaceutical company that filed for an IPO in August 2007. I wrote the following article shortly thereafter, expecting an imminent pricing. Interestingly, aside from some phase II results on their lead therapeutic candidate, there hasn't been much in the way of information concerning either the company or the expected IPO date. Furthermore, the Biolex website has been curiously "under construction" since shortly after their S-1 filing. Regardless of whether Biolex plans to stay private, is talking to potential acquirers, or is just dragging their feet and waiting for better market timing, I decided to post my research anyway because I had a good time learning the story.
The intriguing aspect of Biolex is that they employ a proprietary and very novel protein expression system (the LEX system), enabling the production of biologic candidates otherwise difficult to make through traditional commercial means. The LEX system utilizes the aquatic plant Lemna, known commonly as duckweed, as an expression host to produce these difficult proteins.
The Duckweeds: A Valuable Plant for Biomanufacturing?
- Lemna can be proliferated in an aqueous medium cheaply and clonally, with doubling times of 20–24 hours.
- The plants can tolerate a broad pH range and a number of organic buffers and protein-stabilizing compounds (MES, MOPS, EDTA, PVP).
- Duckweed cells can be processed easily as they contain no lignin (woody material), and can be homogenized readily by commercially available methods.
- Lemna species, among others, are amenable to genetic manipulation; reliable and relatively expedient methods can generate transgenic lines in 6 weeks or less.
- Lemna cells are eukaryotic, and have glycosylation machinery.
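To put the doubling times above in perspective, here is a back-of-the-envelope sketch of idealized exponential growth (a simplification; real cultures eventually saturate, and the figures are illustrative only):

```python
def biomass_after(initial_g: float, hours: float, doubling_time_h: float = 24.0) -> float:
    """Idealized exponential growth: biomass doubles every `doubling_time_h` hours."""
    return initial_g * 2 ** (hours / doubling_time_h)

# At a 24 h doubling time, 1 g of starting material becomes 128 g in one week.
print(biomass_after(1.0, 7 * 24))  # 128.0
```

At the faster end of the quoted range (20 h doublings), the same week yields roughly 2^8.4 ≈ 340-fold growth, which is what makes clonal aqueous propagation so cheap to scale.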
It is also important to mention that plants in general share some advantages as organisms for biopharmaceutical production. The most obvious is that transfer of human viruses and other contaminants is effectively impossible, either from the outside environment to the culture or from the culture to the biologic itself. This is not only beneficial from a cGMP perspective, but it also cuts orders of magnitude off the cost of production. Plant-based expression systems are also easily scalable and potentially require far lower capital and operating costs than even existing bacterial systems. Finally, not only are plants and mosses able to glycosylate their proteins, there has been initial success with actual secretion of the therapeutic protein of interest into the media, either from the root system (rhizosecretion) or directly from moss protoplasts, eliminating the need for homogenization and greatly simplifying the purification procedure.
The major challenges facing the use of plants to produce biopharmaceuticals are largely twofold. The first is that there are major structural differences between plant and mammalian N-linked glycans. Proteins glycosylated by the endogenous plant machinery therefore elicit an immunogenic response in humans when administered parenterally. This problem has been overcome somewhat in transgenic tobacco expression systems by engineering the plants to produce human galactosyltransferase. Alternatively, it has been addressed in the moss expression system Physcomitrella patens by knocking out the genes encoding the plant-specific sugar transferases (developed commercially by Greenovation). Biolex themselves followed a similar concept, using RNAi to inhibit two undesirable endogenous Lemna sugar transferases, giving rise to a single species of non-immunogenic glycosylated antibody that performed better in vitro than those produced in CHO cells.
The second challenge facing those in the business of producing biologics in plant-based systems is public opinion. In 2002, the ProdiGene incident, in which corn genetically modified to produce trypsin accidentally cross-pollinated a nearby field in Iowa, caused a public outcry and eventually sank the company. This, and numerous other debacles concerning GM food crops, have fueled the public concerns to which the demise of several biotech business models built on plant-made products (PMPs) can be attributed. Because of this, the current front-runners of biotech PMP manufacturing are sidestepping this hot-button issue by using highly contained, non-food plants such as tobacco, moss, or duckweed. In this manner, Biolex has chosen a less controversial organism which is propagated under easily controlled, highly contained conditions.
So far so good. Biolex has a proprietary method of making their therapeutic proteins, which, as has been mentioned elsewhere on this blog, is a very good strategy indeed for potential acquisition or competition. I'll leave the analysis here for the time being, and in the case of an IPO pricing, I'll continue on with Biolex's lead therapeutics, patent positioning, partnerships and financials.
Thursday, October 4, 2007
Traditional small molecule pharmaceuticals are synthesized via highly reproducible chemical processes. The resultant compound is patented based on its atomic structure rather than its manufacturing process because of the fidelity of these reactions and also because determination of the purity and composition of such small molecules is routine.
In contrast, protein therapeutics are on average 100–1000 times larger than small molecules and orders of magnitude more complex. They are produced in recombinant organisms, usually bacterial, yeast, or mammalian cells, and the protein is purified to homogeneity via biochemical methods. These methods are often lengthy protocols, any or all of which can be proprietary. In addition, the use of living organisms as miniature factories results in an inevitable heterogeneity in the manufactured therapeutic. The final product can be affected by minute alterations in protocol, and stringent attention must be paid to the needs and behavior of the co-opted organism.
The bottom line is that not only are protein therapeutics themselves overwhelmingly more complicated than traditional pharmaceuticals, the degree of randomness and consequent difficulty involved in their production is many orders of magnitude higher than in the analogous small-molecule synthetic chemistry manufacturing methods.
Given the complexity and heterogeneity described above, it's no surprise that in the early years of biologics the philosophy that "process defines product" governed regulatory actions. The result was that the FDA required a biotechnology company to obtain both a Product License Application (PLA) and an Establishment License Application (ELA), in addition to performing pivotal phase 3 trials using the same facility used for final commercial production. Fortunately for the biotech industry, this was replaced with a single Biologics License Application (BLA) by the FDA Modernization Act (FDAMA) of 1997, and companies were also permitted to change, or more importantly, outsource their manufacturing process so long as the resultant therapeutic was shown to be comparable.
Early biologics such as Amgen's recombinant erythropoietin and Genentech's human growth hormone suffered from supply deficits even as they proved to be commercial successes. However, it was Immunex's (acquired by Amgen) Enbrel, released in 1998 as a treatment for rheumatoid arthritis, that would become the prime example of a biologics manufacturing shortage. Demand rapidly outstripped supply, leading to patient waiting lists and shortfalls that continued until the end of 2002. This opened the door for competing biologics like Centocor's (acquired by J&J) Remicade, which did have enough manufacturing capacity, and may ultimately have cost Enbrel's makers more than $200 million in lost revenue.
These events gave birth to a boom in biologic Contract Manufacturing Organizations (CMOs) almost overnight. Building a new biopharmaceutical plant costs hundreds of millions of dollars and can take up to 5 years, so many biotechnology companies are reluctant to make that kind of investment to support a therapeutic candidate still in clinical trials. Business vacuums don't exist for long, and in only a couple of years the concern over biopharmaceutical manufacturing capacity had died down considerably. CMOs are here to stay, however, and their growth outlook is generally accepted as favorable. Big pharma and larger biotechnology companies are building their own facilities for biologic manufacture, or even adopting a shared-capacity strategy. Meanwhile, smaller biotechnology firms, especially those with only a few candidate therapeutics, will continue to rely on CMOs' facilities and expertise as they focus resources on product development and establishing proof of concept in clinical trials.
Just as CMOs are considered by the biotechnology industry to be the bright side of biologic manufacturing, their close relatives, generic biologics manufacturers, represent the dark side. These off-patent versions of protein therapeutics, dubbed biogenerics, biosimilars, or follow-on biologics, are already a reality in Europe and elsewhere, and they are on the horizon in the United States. This is the current hot topic of debate surrounding biologics, and although arguments can be made on either side as to how long it will take for US legislation to pass or for developing nations to bring their cGMP facilities up to speed, most will agree that biogenerics and outsourcing are only a matter of time.
With respect to the biogenerics market, I consider the business of evaluating either domestic legislative or foreign compliance risks particularly volatile. The above retrospective does, however, firmly illustrate the overwhelming market pressures in the biotechnology industry as a whole, not only to discover new protein therapeutics and biologics, but also to produce them on an industrial scale in an increasingly cost-effective manner. This thesis is restated in a recent article written by the senior VP of technical operations at Wyeth.
To get a glimpse of what the future may hold for the biopharmaceutical industry, one need only look back to the transformation that took place in semiconductor manufacturing. Similar to the biotechnology industry, the technology for semiconductor manufacturing was initially highly specialized and expensive. Competitive pressures and the need for large-scale production required the construction of large plants, at costs that were prohibitive for most companies in the industry. The investment in such large plants compromised the ability to respond rapidly to new technological advances. To better respond to markets and compete with lower-cost operations in Asia, semiconductor companies began to form consortia to share capacity and hire contract manufacturers. As in the biotechnology industry today, shared capacity in semiconductor manufacturing was only possible through the standardization of processes and technology. Technology standardization became more firmly established as the small number of companies holding the dominant intellectual property required for the design and manufacture of state-of-the-art semiconductors became the industry leaders.
Although there are many obvious differences between producing protein therapeutics and microprocessors, the most notable is that protein production is not easily standardized; each protein product will have a customized process to some degree. Yet I believe the comparison to be apt in terms of market opportunity. Just as Intel benefits whether Microsoft or Google becomes popular, developers of technologies which can successfully and generally reduce the cost of protein production should stand to gain handsomely regardless of any of the above market uncertainties.
The bottom line is that those technologies which can facilitate the production of biologics more economically will equate to a competitive edge in an industry beset on all sides by an imperative to reduce manufacturing costs.
In this manner I also hope to be able to predict the long-term valuations of companies supporting the biomanufacturing industry. Applied to medium- and large-cap stocks, I would expect Invitrogen (IVGN) to post better-than-expected earnings tomorrow. Likewise, I expect Thermo Fisher (TMO), GE Healthcare (GE), and BD Biosciences (BDX) to continue to outperform in their life sciences divisions for the foreseeable future. Pall (PLL) and Millipore (MIL) have had nasty tax problems and poor Q2 performance, respectively, but the above thesis predicts that their product technologies will continue to show strong value and also present a long-term investment opportunity. Internationally, Cobra Biomanufacturing (LSE:CBF.L) and Sartorius AG (XETRA:SRT.DE), among others, meet the criteria outlined above.
Looking forward, the above investment thesis will be one of the factors directing my micro- and small-cap research. The endeavor has turned out to be challenging, especially because many of the technologies are still at the venture capital stage. Furthermore, those technologies that do meet the above criteria are often rapidly bought by larger-cap companies. I am very doubtful that the cost of biologic production will diminish overnight with a single technology. This thesis will therefore continue to be an integral theme in my biologics industry research, and will be revisited for each potential biopharmaceutical product candidate.
Disclosure: I am long shares of TMO
Monday, September 24, 2007
Anyone interested in the biotechnology or pharmaceutical sector is sure to understand the ongoing hype surrounding biologics. Bristol-Myers Squibb's recent purchase of Adnexus for $430 million is the most recent example. Last month Pfizer announced that it is breaking ground on a $50 million biologics facility and simultaneously paid out $30 million to use Xoma's (XOMA) bacterial cell expression technology, aiming to have 20 percent of its pipeline product portfolio in this sector by 2009. Merck shelled out $400 million for GlycoFi last year, in an attempt to catch up with the big pharma early adopters of biologics (e.g. Roche, J&J, GSK, AstraZeneca). Needless to say, examples of vigorous froth in the biologics arena are easy to come by.
For an in-depth understanding of the state of the pharmaceutical corporations, I highly recommend the recent report published by PricewaterhouseCoopers: Pharma 2020: The Vision. In short, R&D expenditures have risen steadily while the number of new molecular entities (NMEs) approved has slumped. Only a minority of pharma companies earn a substantial income from new products, and a majority of the leading firms stand to lose anywhere up to 40% of total revenue to patent expiries. Contrasted with the growing national and global opportunities for the healthcare and medicine industries, the bottom line amounts to big pharma shifting into crisis mode.
The above scenario, currently playing out in slow motion, serves as a kind of vindication for the biotechnology field, which since its beginnings in the early 80s has generally employed a comparatively more nimble, scientifically driven approach to medicine. The business model of innovating biomolecule therapeutics and treating smaller markets of unmet medical need was originally dismissed by big pharma in favor of a small-molecule blockbuster approach. In the past few years, however, as niche-market drug discovery has become increasingly necessary, unmet-disease therapeutics have proven valuable both monetarily and strategically. In addition, biomolecules have become blockbusters themselves, accounting for nearly one quarter of this year's total pharmaceutical market sales growth. This apparent role reversal, and the strong market projections for protein therapeutics, are no doubt the reason behind the recent wave of biotechnology mergers and acquisitions.
Unfortunately, the biotechnology drugmakers will have little time to rest on their laurels. Until now, these companies have not had to worry about competition from generic manufacturers, for a number of reasons. Simply put, proteins are larger and more complex by orders of magnitude, and producing these biotherapeutics requires living organisms and detailed purification protocols. Likewise, proving that two biological macromolecules are identical in structure and efficacy without performing actual clinical trials is no small feat. Ultimately, the Hatch-Waxman Act, which regulates generic pharmaceuticals, breaks down for biologics. In a few months, however, this may all change: new regulatory legislation is currently before Congress, and generics firms are eagerly anticipating the go-ahead for biologic follow-ons in the US market.
Ok, that pretty much brings us up to speed with the state of things. In part II, I'd like to use this background knowledge of the protein therapeutics industry to evaluate opportunities in the technology and in the market.
Monday, September 17, 2007
Background on Steve Lombardi :
Helicos Executive Vice President and Chief Operating Officer
Mr. Lombardi joined Helicos in June 2006. He has over 27 years of commercial biotechnology experience, as a researcher and in various business management and executive positions. Prior to joining Helicos, he was Senior Vice President at Affymetrix, serving in executive positions in Corporate Development, Product Development and Research and Corporate Marketing. Before Affymetrix, Mr. Lombardi worked for 16 years at Applied Biosystems in various business roles, first as a marketing manager and later as a senior executive.
From 1989 to 1998, Mr. Lombardi led the formation of the company's DNA sequencing and genetic analysis business, the products of which formed the technological basis of the worldwide Human Genome Project. He was also involved in the formation of Celera within the broader Applera corporate structure. Prior to joining Applied Biosystems, Mr. Lombardi spent 8+ years as a nucleic acids chemist focused on the development of novel approaches to DNA synthesis. He earned his BA in Biology from Merrimack College.
SL: As a startup doing the first generation of a disruptive technology like single molecule sequencing, there’s a huge amount of IP that a company like us can accumulate, and as such we want to hold that very close to our vest, both for patenting reasons and to avoid giving competitors a sense of what we’re really trying to do. So you’re absolutely right in what you’ve said, but we’re just in the process of going commercial and you’ll start to see more and more from us. We just went live with our new website, which is still a bit shallow in content, but you’ll see a lot more from us as we begin to roll out the commercial launch.
AW: Ok, that’s great, and like I say, I’m sure this has got to be a very active and exciting time for all you guys involved with Helicos. It definitely seems to be ramping up, as you’ve just described.
SL: Yeah it’s a fun time.
AW: So, as you may or may not know, I’m actually in the structural biology department here at NYU, and so I have a vested interest in single molecule technologies and research. Many of us structural biologists try to keep up to date with single molecule technology, and that’s part of the driving reason why I wrote my review. So let me just jump into some questions I had first and get those out of the way.
AW: So I wrote my review of the technology behind the HeliScope using Dr Quake’s original PNAS paper along with the patents and all the other publicly available information but there are still a number of questions that I iterated in my article concerning the tSMS technology. Do you mind if I try to clarify some of those technical aspects of the HeliScope with you?
SL: Not at all and that’s one of the things that I wanted to do was to help clarify that for you.
AW: In the PNAS paper, the Quake group was able to achieve the resolution and sensitivity needed by combining TIRM and FRET. Does the HeliScope still use both of these methods?
SL: The HeliScope still uses what’s called TIRF, “total internal reflection fluorescence”, but we’ve moved away from FRET. We found we didn’t need that modality; by using the right dyes we are able to get signal to noise with the TIRF system and single dyes. What we’ve learned is that with TIRF and enough light, and we put a lot of laser flux into the system, we can get substantial signal. However, the whole key here is controlling noise. First of all, TIRF optically reduces the noise substantially. Additionally, one of our founding scientists is Tim Harris, who was at
AW: Ok that was going to be my next question, so does it require excess washing steps to remove nonspecifically bound nucleotides?
SL: Oh yes, a majority of our cycle time is washing. We have a spec for fluorescent background, but consider the density we’re talking about. We have one DNA molecule per square micron. That means high density with regard to information content, but it’s very low density at the molecular level. One of the neat things about this technology is that, because detecting single molecules is not the problem, at every step in the process we are detecting the emission of a single fluorescent dye. So during 30 cycles of single base addition, whether at step 1 or step 120, the signal to noise is constant, because we’re always measuring a binary event: is the dye present or not? The huge advantage over the amplified, or think of them as “ensemble”, technologies is that even though you lose yield in a single molecule approach, you literally lose the signal in an ensemble approach. In the ensemble approach, you may start with, for instance, one million molecules, but then over time you get degradation of the signal; you get fewer and fewer molecules reporting.
AW: That’s really amazing, by also cutting out FRET you guys have really done away, I’m sure, with a lot of logistical issues of using it.
SL: Absolutely. The other thing we don’t have to deal with are the phasing issues. You get a lot of signal in an ensemble approach, but you also generate signal coming from the phased fluorophores, the nucleotides at n-1, n-2, n-3… that are generating noise. Many cycle chemistries like DNA synthesis or protein sequencing run into these phasing and yield problems. We are measuring binary events, starting at one molecule per square micron but having 3 billion strands down on our system; even a 10% yield of molecules reporting out to readable, alignable lengths is still 300 million strands at 25 bases, or 7.5 billion bases per run.
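Mr. Lombardi’s arithmetic here can be sanity-checked directly; a quick sketch using the figures as quoted (note the final total is in sequenced bases):

```python
strands_on_flowcell = 3_000_000_000  # strands loaded on the system
usable_fraction = 0.10               # 10% report out to readable, alignable lengths
read_length = 25                     # bases per usable strand

usable_strands = strands_on_flowcell * usable_fraction
total_bases = usable_strands * read_length

print(f"{usable_strands:,.0f} usable strands")  # 300,000,000 usable strands
print(f"{total_bases:,.0f} bases per run")      # 7,500,000,000 bases per run
```

The point of the calculation is that a single-molecule system can tolerate a 90% loss of reporting molecules and still produce genome-scale output, because each surviving strand reports independently.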
AW: That really has cleared a lot up for me. So let me move on and ask you another question. How does the HeliScope deal with long homopolymer regions?
SL: So you accurately described what we announced at
AW: Yes I read about that, you can cap it but then there are other issues with uncapping.
SL: Bingo, so what we have done is to create what’s called a virtual terminator that adds no time to the cycle and kinetically inhibits the addition of the second base.
AW: So that’s what I thought. After reading the patent, it seemed like it was largely done through kinetic means, which has a lot of benefits, especially because even if an incorporation is not made, it doesn’t necessarily matter for the HeliScope system.
SL: Exactly, because we can take advantage of the fact that each strand grows asynchronously. You can’t have any asynchrony in an ensemble approach, so you have to drive all your kinetics to 100%. We don’t have that problem, so we can take advantage of kinetics with these virtual terminators to adjust the incorporation of the first base and the second base such that, in the cycle time that we use, we have virtually no second base addition. We are still doing research as to how far we can go with the homopolymer regions, and I can’t say where we are right now, but we’re making very good progress toward performance that’s really exciting.
AW: I’m sure, it’s good to know that at least some of my assumptions were correct.
SL: I want to give somebody credit here, and it’s Bill Efcavitch, who is our senior VP of R&D. I’ve worked with Bill for 21 years and known him for 27. I was a nucleic acid chemist in my previous life, and was an early customer of ABI in their DNA synthesis business. Bill came from Marv Caruthers’ lab at
AW: Yeah, I think that’s really in keeping with what seems like the quality of the people at Helicos working there and running the show. I wrote in my article that the whole company seems chock-full of pretty remarkable people.
SL: It’s a pretty hard thing not to join this company, I moved back here from
AW: That says enough.
AW: Let me move on here; you really tied up some loose ends for me in terms of the technology. I’d like to continue by asking you about the performance of the HeliScope system. Is there any way you can give us some updated statistics?
SL: What we’ve been saying in the public domain is that we’ve had two production prototype HeliScopes running since around the first of the year; we call them mules, because they are the place where we do all our testing. We’re also doing all of our systems integration, so one is always running and generating results while we’re putting new parts on the other for integration testing. During the IPO period we brought these prototypes up, and were confident enough with the performance we saw from them right after the IPO that we began the build of the commercial HeliScope. The instruments we will be shipping are on the factory floor today, and will be put through a very rigorous verification and validation program. People have been asking us why we haven’t been doing a beta, which is a program you tend to do with existing technologies, or if you’re doing the next version of something already in production. My experience, and Bill’s, with ABI for 25 years, is that with something as new as this, what we want to do is sort of a beta on the factory floor. Where a company normally builds, tests, ships and then installs, knowing that the tests, either at the factory or at the install, correlate to customer needs, in our current case we’re still figuring out what those parameters are. Along with this testing, we are bringing in-house customers who are going to be part of the CTS program. We have our service people learning about the instrument by helping to build it. We’ve also got field application scientists who are learning how the instrument works and who’ll support the instrument in the field. Our goal is to ship product by the end of the year, when we will send units to those customers and confirm performance with the same tests that were done on the manufacturing floor, but now in their production labs.
AW: So you’d say it’s safe to say that any numbers that have been bandied about concerning throughput at this point are still in the rumor phase.
SL: Well, we’ve quoted 25MB/hour for sequencing applications and 90MB/hour for our gene expression application, but those are ballpark numbers. We haven’t set commercial specs yet because we are still in the process of doing the validation and verification; I can’t give you definitive numbers, but we would not have started the commercial build had we not seen performance off the prototypes that gave us confidence we could get near those specs.
AW: Ok, and with those numbers we’re talking about, obviously accuracy is paramount.
SL: Absolutely, so the difference between the 25 and the 90 is getting to accuracy. For gene expression applications, where you can use tags, you can use compressed sequence space and deal with errors in an orderly manner, so we can run the instrument at 90MB/hour. For sequencing applications, where you don’t have that control, what we can do is again take advantage of the single molecule approach and do something that we call multi-pass sequencing. This single molecule advantage comes from the nature of biochemical errors in single-molecule mode, and what we’re finding is that those errors are stochastic. For example, let’s say that I have the same base represented one hundred times on one hundred different molecules at different positions on the flowcell, generated that way because we use a prep process that stochastically fragments a genome. The error rate at that base is the aggregate of those signals. Let’s say we do a run and create a table of errors. What we can then do with single molecules is really neat; after the run, we can literally melt off the strand that we synthesized and resequence the same templates all over again, so that we have two independent measurements. By comparing the tables of each run, we find that the errors are chance events, so that you get the same error rate in the aggregate, but the odds that the same base on the same strand will generate an error in both runs is infinitesimally small. What you can do then is take the product of the error rates of the two runs, effectively squaring the error, to increase accuracy. So what we’re doing is again taking advantage of single molecules, and of the fact that we’re starting with 3 billion strands, to do a dual-pass run. We literally do two sequencing runs to get accuracies that are good enough.
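The claim that independent stochastic errors multiply out can be checked with a quick simulation; the 2% per-pass error rate below is an illustrative assumption of mine, not a Helicos figure.

```python
# Monte Carlo sketch of dual-pass single-molecule sequencing accuracy.
# Assumption (illustrative, not a Helicos spec): each pass miscalls a base
# independently with probability p; a consensus error requires BOTH passes
# to fail at the same base.
import random

random.seed(0)
p = 0.02          # assumed single-pass error rate per base
n = 1_000_000     # bases simulated

both_wrong = sum(1 for _ in range(n)
                 if random.random() < p and random.random() < p)

single_pass_rate = p
dual_pass_rate = both_wrong / n   # empirically close to p**2

print(f"single-pass error rate: {single_pass_rate:.4%}")
print(f"dual-pass error rate  : {dual_pass_rate:.4%}  (theory: {p**2:.4%})")
```

Strictly, p squared is an upper bound on the miscall rate, since a wrong consensus also needs both passes to err the same way; either way the gain is roughly two orders of magnitude.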
SL: Now, the reason we need to do this dual pass is that if we look at the errors that occur, the insertion and substitution error rates are infinitesimally small, but what look to be deletion errors are high. These deletion errors are in fact dark base additions; some proportion of the molecules in the virtual terminator formulation are dark. The base that’s added isn’t detected through signal, and we’re pretty sure that we know what the problem is. We think we can solve this through process development, because as you know we’re still dealing with research pilot-grade manufacturing. We think that as we move to full production, we will develop a highly repeatable manufacturing process for these molecules and will be able to control quality to get that dark base addition down to a point where single-pass accuracies will be sufficient. The result will be that we’re all of a sudden at 90MB/hour for single-pass sequencing, and we think we’re currently going to be industry best at 25MB/hour.
AW: Yeah, 90MB/hour at acceptable accuracy is, I think, two to four times better than the other industry leaders.
AW: And the other key thing about the issue of sensitivity is that the error rate on base one and the error rate on base 30 are exactly the same. If you look at the error rate of the ensemble methods, it increases as a function of read length.
AW: In academia at least the highest fidelity polymerase misincorporates one in every million bases.
SL: There was a report in GenomeWeb last year, which we corroborated, that we have a relationship with Floyd Romesberg at Scripps on protein evolution, so one of the things that we are looking at is new polymerases via protein evolution.
AW: That’s a pretty exciting area of research as well.
SL: Another thing we’ve been able to do is attract an unbelievable Scientific Advisory Board. Most companies would be happy to have one of the people that we’ve got. There are experts across so many different disciplines interested enough in our technology to be involved in the SAB, and this gives the whole team a huge advantage because they can utilize people like Steve Chu who himself won a Nobel prize for single molecule work.
AW: Yeah, he gave a talk here not too long ago; "optical tweezers," mind-blowing stuff.
SL: The guy is just amazing, and we’re starting to round out the SAB on the application side. We have Victor Velculescu from Johns Hopkins, and we have Eugene Myers, a bioinformatician previously at Celera and now at Janelia Farm.
AW: I couldn’t agree more concerning the SAB of Helicos, I actually wrote about how impressive it was.
AW: Let me just finish up with a couple of quick questions, what about cost of the HeliScope system, any ballpark figure?
SL: We haven’t set a price yet; it will be more expensive than anything out there, but what we’ve been saying is that there are two key reasons why. One is our system configuration; we provide with the HeliScope a not-inexpensive image processing tower that in near-real-time converts the raw data to base calls, reliable bases, so that at the end of the run you have the ability to just port that data to your bioinformatics engines. Some of the other technologies end their runs with raw data and require the customer to provide the compute facility to do the image processing, so it’s an apples-and-oranges comparison with the HeliScope, because the tower is not an insignificant part of the cost; we’re talking about terabytes of data that need to be processed and stored. The second reason is more strategic; this is classic Bill Efcavitch, and I think a testament to the smarts of the board. They looked very closely at this market at the outset and concluded that this market is a long-term play and that $1000 genome performance is going to be the real inflection point in the marketplace. So what Bill did was to design an instrument with headroom in imaging capacity. 3 billion strands is today’s density, but we have licensed technology from Steve Quake where we can potentially increase that density by a factor of four. That’s 12 billion strands that we would be able to look at, and the instrument has this imaging capacity built into it. Think of it as headroom; an instrument that will get people to the $1000 genome. From a marketing perspective, we believe we can look at a customer and say this instrument will get you there. How you will improve performance and decrease cost is by buying the next kit, which will have in it the better flow cells, the more dense flow cells, the better chemistry cycles that use less reagent and have more stability to them.
Through that process we really believe that in the future we can give people $1000 genome performance. That’s important not only for sequencing but for things like digital gene expression, genomic signature sequencing, and applications like methylation analysis. To me, the really exciting thing about this from a marketing guy’s perspective is that any time Bill makes an improvement in the assay, every customer’s application benefits from it, because the assay is agnostic to the application.
AW: Yeah I think it’s elegant business planning to build a higher quality and expandable machine rather than something that may be outdated in 5 years.
SL: It gives the customer confidence that an instrument will last, because it’s not a cheap investment, and it also gives us the ability not to have to turn around and make a continued investment in engineering; we will continue to do maintenance engineering and make the thing better and cheaper, but we don’t have to build another instrument for a while. So what we’ve done with our R&D expenses, and you saw this right after the IPO, is that we’ve started a CSO office to do genomic collaborations; we’ve hired Patrice Milos from Pfizer. She was their head of pharmacogenomics and executive director of molecular profiling across Pfizer development. She’s building a world-class genomics team to do collaborations with customers. So Bill focuses on making the assay cheaper, and Patrice will continue the effort to work with customers to build all the applications and publish good science with them on the HeliScope.
AW: Right, publishing good science is always key.
AW: That’s actually really exciting; an expandable machine is much more than I expected. One last question, which is more of a business question. From a medical standpoint, human resequencing is presumed to hold much larger therapeutic interest than de novo sequencing in terms of market share. Is Helicos specifically focusing on this type of application, or on broader de novo and resequencing efforts?
SL: The whole focus of this company from day one was not to be a replacement for ABI sequencers. The goal of this company was to enable, through the price performance and the simplicity of the workflow that you’ve described, an ability to let people ask new and more important questions of the genome. So we believe that we can create value by growing the market, not just replacing the current sequencing marketplace, and in doing so, medical resequencing is a huge opportunity, but it’s not the only opportunity. There are people incredibly interested in whole genome methylation studies and in making quantitative measurements of the genome, not just quantitative measurements of the transcriptome; applications like copy number variation, ChIP sequencing, and measuring the amount of a transcription factor binding to DNA. These measurements are all very important, and we fundamentally believe we’ve got the best technology when you get to quantitative applications, because we do so little to the sample in prepping that we don’t perturb what is digitally in that cell. We’ve found tremendous interest from people beyond the genome centers; they’re mostly in the academic health centers and doing translational research. The fact that you can do single molecule measurements without having to build a genome center’s worth of infrastructure resonates when they realize that a patient’s DNA could be sitting on the HeliScope and being measured. We’re getting tremendous interest and finding market segments within this life science area that no other technology can get to. A lot of these competing companies are saying “we’re going to find every ABI sequencer and replace it.”
That’s not what we want to do, we want to add value to the marketplace because there are places where there is lots of money, lots of samples, and lots of known genome annotations but still we think there will be new genome annotation and people will do genomics and genetics in new ways which will grow the market. And that’s our whole idea.
AW: That was a great response and in your response I came up with some thoughts of how the single molecule technology could be applied to epigenetics and such. I couldn’t agree more that it’s a really exciting prospect.
SL: Cancer stem cell research is a hot area; people are figuring out how to purify those cells, but quantitative measurements of them, to understand the functional genomics of how they regulate themselves, are hugely important, and we’ve got people looking to collaborate with us; they look at single molecule measurements and the ability to do these digital experiments on sequence and quantitation and say “this could be the answer.”
AW: That’s really great to hear that. That’s it for all the questions I have, are there any other important facts or events that you’d like interested parties to be aware of?
SL: I just wanted to clarify the sensitivity issues with you, I wanted to reemphasize the issues around homopolymer stuff and then I wanted to explain the accuracy a little further. Those were the key questions that you asked me, I didn’t have anything beyond that given the scope of what you’ve written about us to date. But I hope that you’ll be interested in us and help keep the community aware of who we are and what we are doing.
Thursday, August 16, 2007
Helicos Part I
Helicos Part II
Raised in IPO: $48.6 million
Q2 Burn Rate Ending June 30, 2007: $8 million
6-Month Burn Rate Ending June 30, 2007: $16.4 million
($18.1 million repaid in stock conversion)
Yearly Burn Rate 2006: $21.3 million
Total Assets June 30, 2007: $68 million
The take-home message from both the S-1 and the recent 10-K is that Helicos is burning money at an increasing rate. This, however, is to be expected from a company less than one year from its initial product launch, and from an investor's viewpoint nothing in the publicly available financials throws up any red flags. I am also assuming that the $18 million payout for preferred stock conversion pertains to a number of venture investors getting a return out of the IPO proceeds. Also keep in mind that the $48 million raised during the IPO was far less than the original $80 million that management was hoping for. Naturally, anyone looking to invest in HLCS in the quarter following the launch of the Heliscope will have to take a much harder look at the financials, but for our purposes I think it's safe to say that the numbers seem to be on track, relatively speaking.
Helicos: to Buy or Not?
What a ride this analysis has been. To recap, everyone agrees that next generation sequencing is going to change the world of healthcare as we know it. Naturally therefore, the field of NGS is highly competitive, and the players include some of the biggest and baddest, as well as a lot more little guys in the wings hoping either to get a foothold in the market, or to get bought out for some of those ridiculous sums we talked about earlier. Any way that you look at it this field is a frothing pool of speculation.
Enter Helicos, a technology startup intellectually founded at Caltech by Stephen Quake, an HHMI investigator and now head of the Stanford Bioengineering Department. It is backed by Flagship Ventures who, if you aren't already aware, are a very well connected firm out of Cambridge; in fact, Stanley Lapidus left Flagship to become the CEO of Helicos. In addition, the Helicos scientific advisory board reads like a who's who of the sequencing and bioengineering fields. These include Leroy Hood, developer of the automated Sanger sequencing instrumental to the human genome project; Steven Chu, director of LBL and Nobel laureate; John Quackenbush, previously of TIGR; and Eugene Myers, co-developer of BLAST, to name only a few.
On the surface, the concept of sequencing single DNA molecules is enticing to any biotechnical investor for a multitude of reasons, not least of which is that the words "single molecule" are white hot at the moment in both academia and industry, not to mention that the whole concept smacks of "nanotechnology," another term on the tips of everybody's tongues. I will readily admit that "true single molecule sequencing" was precisely what piqued my interest in researching Helicos in the first place.
Under the surface, the technology seems feasible; in other words, I have no doubt that it actually works. How well it works, on the other hand, is another matter. Helicos is keeping entirely mum about what I believe to be their Achilles' heel ... accuracy. Now that we are armed with an in-depth understanding of the fundamentals of the technologies, it becomes plain to see that the error rate of any single molecule system is going to be higher than that of methods which use a PCR step to make millions of identical copies, essentially increasing the detectable signal millions-fold. Furthermore, accuracy is paramount in this field because the major market, that of human resequencing for disease or genetic anomalies, demands 99.9% accuracy for reasons which need no explanation. It also goes without saying that 1000MB per day is of no use if the accuracy is 97% and you have to do sequences in triplicate. It is perfectly possible that the Heliscope does provide accuracy comparable to the other NGS systems; however, as an investor I would be more comforted to see Helicos release this information alongside their 1000MB/day claims.
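The triplicate remark can be made concrete with a little arithmetic; the 3% per-read error rate and the simple majority-vote model below are my illustrative assumptions, counting any base where at least two of three reads err as a failed call.

```python
# Back-of-envelope (my arithmetic, not any company's spec): even sequencing
# in triplicate and taking a majority vote, a 97%-accurate read barely
# approaches the 99.9% bar demanded by medical resequencing.
from math import comb

p = 0.03  # assumed per-base error rate of a single read

# P(failed call) ~= P(at least 2 of 3 reads err at that base)
consensus_error = comb(3, 2) * p**2 * (1 - p) + p**3

print(f"single read accuracy : {1 - p:.3%}")
print(f"triplicate consensus : {1 - consensus_error:.3%}")  # ~99.7%, still < 99.9%
```

So raw accuracy matters even when you are willing to burn 3x the throughput on redundancy.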
So Helicos is talking a big game but holding their cards close. Adding to these mixed signals, the company is absolutely stacked with some of the biggest names in the field. They have also been awarded a handsome grant from the NHGRI, which further inspires confidence. Yet, with so much at stake in the battle for supremacy in this truly revolutionary field, the skeptic in me remembers the last time genomic hype was at its peak, and some sort of academic coup is not entirely out of the question. Then again, an academic coup of this caliber would have a very good chance of pulling it off with, say, the backing of some very big-name institutions.
The entire speculative business of evaluating biotechnical IPOs notwithstanding, I am inclined to jump on the bandwagon and buy myself a small stake in HLCS. Should Helicos not commence shipments of an acceptable Heliscope "next generation sequencing system" that lives up to expectations by, say, the first quarter of 2008, I will indeed have egg on my face. However, in that I will at least be in some very distinguished company.
Disclosure: I am long shares of HLCS
Wednesday, July 25, 2007
Helicos Part I
Having familiarized ourselves with the NGS market, we are better positioned to take a look at Helicos and evaluate their recent IPO.
In 2003, Professor Stephen Quake, then at the California Institute of Technology, published a paper in the Proceedings of the National Academy of Sciences demonstrating that sequence information could be obtained from a single strand of DNA without amplification. Quake's group was able to overcome the hurdle of resolving individual bases by using DNA polymerase in combination with fluorescent nucleotides to image sequential incorporations during synthesis. This paper demonstrated a breakthrough application of single molecule theory in the field of DNA sequencing.
Dr. Quake subsequently met with Noubar Afeyan and Stanley Lapidus, then CEO and Partner at Flagship Ventures, respectively, and they agreed to found a company to develop and commercialize the single molecule sequencing technology. Professor Eric Lander, Director of the Broad Institute, played some sort of advisory role during the founding stages of the company. The company was incorporated in May 2003 and was renamed Helicos BioSciences in November of the same year.
Besides launching an IPO, highlights of Helicos since its inception include receiving a $2 million grant from the NHGRI, assembling a rather strategic management team, and running a prototype collaboration with, among others, Dr. Leroy Hood of the Institute for Systems Biology. Finally, during the 2Q investor conference, Mr. Lapidus stated that the company is on track for a product launch later in 2007.
Price: n/a
MB/run: n/a
MB/hour: ~100MB
MB/day: ~1000MB
Read length: ~25bp
Raw base accuracy: n/a
The length of time between this blog entry and my last is largely due to the difficulties I encountered researching the exact processes that the Heliscope employs to enable its "true Single Molecule Sequencing" (tSMS) technology. Needless to say, Helicos, despite the media fanfare, is playing their precise technology pretty close to the vest.
Using the original PNAS article, available patents, this chapter, (written by the two first authors on the PNAS paper for the Ohio U physics department) and some rumors, I will piece together what exactly we know about the Heliscope technology. Where applicable, I will add my suspicions to the best of my knowledge to fill in the gaps.
The Heliscope process in a nutshell is to shear the DNA and polyadenylate the fragments. These fragments, which also incorporate a dye molecule, are then attached randomly (via poly-Ts) to the proprietary flowcell surface. Initial attachment locations are recorded via the dye molecule, which is illuminated by laser excitation and imaged by a CCD camera connected to a microscope. After removal of the dye, DNA polymerase and a dye-labeled nucleotide flow in and are then washed out. If the particular nucleotide-dye is complementary to the next base of a given fragment, it becomes incorporated into the growing strand, and the camera again marks the location of the fluorescence upon subsequent laser excitation. The dye molecule is then removed and washed away, and sequencing proceeds by repeatedly cycling through the four different nucleotides.
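The cycle just described can be sketched as a toy simulation; everything here (idealized chemistry, exactly one incorporation per matching flow, no dark bases or errors, an A-C-G-T flow order) is my simplification rather than the actual Heliscope protocol.

```python
# Simplified sketch of the single-molecule sequencing-by-synthesis cycle
# described above (idealized: no errors, no dark bases).
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def sequence_flowcell(templates, n_cycles=12):
    """templates: dict mapping flowcell position -> template strand."""
    progress = {pos: 0 for pos in templates}   # bases synthesized so far
    reads = {pos: "" for pos in templates}
    for cycle in range(n_cycles):
        nucleotide = "ACGT"[cycle % 4]         # flow one dye-labeled dNTP
        for pos, template in templates.items():
            i = progress[pos]
            # incorporation only if this nucleotide pairs with the next base
            if i < len(template) and COMPLEMENT[template[i]] == nucleotide:
                reads[pos] += nucleotide       # CCD records fluorescence here
                progress[pos] += 1
        # (dye cleaved and washed away before the next nucleotide flows in)
    return reads

reads = sequence_flowcell({(3, 7): "TGCA", (5, 2): "GACT"})
print(reads)  # {(3, 7): 'ACGT', (5, 2): 'CTGA'}
```

Note how strands at different positions progress independently, and a strand simply sits out any cycle whose nucleotide does not match; that is also why real cycle counts exceed read lengths.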
The benefits of sequencing single molecules of DNA are improved throughput and reduced cost. Compared to the other techniques discussed in my last entry, the Heliscope workflow is considerably less complicated, leading to shorter run times... there is no PCR, no beads, no microtiter plates, etc. Of course this also equates to fewer reagents used and, conceivably, a lower price per run.
However, the major challenge facing single molecule sequencing is sensitivity. Allow yourself to imagine the difference in signal intensity between a single incorporated nucleotide on the Heliscope surface and the analogous millions of incorporated fluorescent molecules on the Illumina system's flowcell surface. Exploring how the Heliscope allegedly attains sensitivity some six orders of magnitude beyond its competitors is therefore essential to our assessment of the company.
In the field of single molecule imaging, the largest challenge is that of increasing both the resolution and sensitivity of a given signal past the limitations of the detecting instrument. This effort primarily comes down to increasing the signal-to-noise ratio, and given that the efficiency of the fluorophores is already maximal, that is best accomplished by reducing the noise in the system. The Heliscope, apparently based on the method developed in Dr. Quake's laboratory, accomplishes this in two major ways. First, it uses a method called Total Internal Reflection Microscopy (TIRM), whereby only the fluorophores within ~150nm of the flowcell surface are illuminated. This leads to a dramatic reduction of the noise from the bulk fluids. In addition, this method increases the theoretical speed of readout, as no scanning is involved. However, while TIRM reduces noise from objects in solution far from the surface, it does not reduce noise from surface-bound impurities. Dr. Quake's group overcame this second challenge, eliminating signal from non-specifically bound surface dye molecules, using a method known as Fluorescence Resonance Energy Transfer (FRET). In this method not one but two fluorescent dyes are used. The requirement is that one dye (Cy3), termed the donor, has an emission spectrum that overlaps with the absorption spectrum of a second dye (Cy5), termed the acceptor. Thus when the donor molecule is within proximity of the acceptor, usually less than 10nm, and is excited at its specific excitation wavelength, it transfers this energy to the acceptor dye, which in turn becomes excited and emits a photon of lower energy. Consider it a kind of baton pass between the donor and the acceptor in a molecular relay race. The PNAS paper describes FRET being used by placing the donor molecule (Cy3) on the existing DNA strand and the acceptor (Cy5) on the incorporated nucleotide.
In this manner, only the polymerase-incorporated nucleotide-Cy5 molecules emit a signal, while the non-specifically bound surface nucleotide-Cy5s remain dark because they are not within 10nm of the attached DNA containing a Cy3 donor. This combination of TIRM with FRET provides an unparalleled increase in the signal-to-noise ratio of single molecule detection and was indeed a groundbreaking application in DNA sequencing. Presumably this is the technology which makes the Heliscope possible, but how has Helicos improved, if at all, upon it?
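The "within ~10nm" rule follows from how steeply FRET efficiency decays with distance; the standard sixth-power formula makes the point, with the ~5.5nm Förster radius for a Cy3/Cy5 pair used here as an assumed, commonly quoted value rather than a measured one.

```python
# FRET efficiency falls off with the sixth power of donor-acceptor distance:
#   E = 1 / (1 + (r / R0)**6)
# R0 is the 50%-efficiency (Förster) distance; ~5.5 nm for Cy3/Cy5 is a
# commonly quoted figure, taken here as an assumption.
def fret_efficiency(r_nm, r0_nm=5.5):
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

for r in (2, 5.5, 10, 20):
    print(f"r = {r:4} nm  ->  E = {fret_efficiency(r):.4f}")
```

At 2nm the transfer is nearly complete, while at 10nm it is already down to a few percent; a stray acceptor with no donor anywhere near it effectively never lights up.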
Three fundamental questions remain which I will endeavor to answer in turn. How does the Heliscope deal with long stretches of repeat nucleotides? Does the Heliscope still use FRET and where exactly is the donor? How is the dye molecule "removed" after each incorporation cycle?
To begin, an initial issue with this method of DNA sequencing was accurately sequencing large numbers of repeated nucleotides, so-called homopolymer regions. The problem can be seen if one imagines that, during a cycle of a given nucleotide, say dGTP, the instrument would need to detect 1, 2, 3 or more simultaneous incorporations if the template has a string of cytosines. As the signal-to-noise problems explained above concerned simply detecting the presence of a fluorescent molecule, it can be understood that detecting the difference between one and two incorporations is possible, but higher orders are out of the question. After a period of uncertainty, it seems that Helicos has solved this issue. The press release of Feb 9th, 2007 states: "The proprietary nucleotide analogs contained in these unique formulations control accurate base-by-base extension through chemical means." I assume that they refer to their patent issued on Jan 30, 2007. This patent describes using a dye-conjugated nucleotide in conjunction with DNA polymerase kinetics in such a way that one or two (but a statistically insignificant number of additional) nucleotides are incorporated per relatively short reaction cycle.
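A toy model makes the patent's effect concrete: if the chemistry caps incorporation per cycle, a homopolymer run becomes a countable series of single signals instead of one bright flash of ambiguous magnitude. The hard cap below is my simplification of the kinetic control the patent describes.

```python
# Toy illustration (mine, not Helicos's chemistry) of homopolymer reading.
def read_homopolymer(template, cap):
    """Repeated flows of G over a template C-run; returns the number of
    bases incorporated in each flow until the run is fully synthesized.
    cap=None models uncontrolled extension; cap=1 models a chemically
    limited, one-base-per-cycle nucleotide."""
    remaining = len(template)
    counts = []
    while remaining > 0:
        n = remaining if cap is None else min(cap, remaining)
        counts.append(n)
        remaining -= n
    return counts

print(read_homopolymer("CCCCC", cap=None))  # [5]  one flash; run length ambiguous
print(read_homopolymer("CCCCC", cap=1))     # [1, 1, 1, 1, 1]  five countable signals
```

The trade-off is visible too: counting the run base by base costs one cycle per repeated base, which is part of why controlled-extension chemistries run slower.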
Does the Heliscope use FRET? I ask because the "technology" page on the website shows only one dye molecule. The short answer is that I don't know for sure, but I'm pretty sure. The original PNAS paper was published in 2003, which wasn't that long ago, and all the single molecule people I know are still using FRET for its unsurpassed resolution. Furthermore, page 12 of this presentation (by the chairman of research core facilities and technology at the Mayo Clinic) shows FRET as part of the process, with the donor attached to the end of the polyadenylated tail. This makes additional sense in light of the rumored ~25bp read length: at 3.4nm per helical turn and 10.5 bases per turn, about 30bp fall within the 10nm range of FRET acceptor absorbance. It should be noted that there has been discussion in the original paper, the patents, and elsewhere of putting the donor molecule on the DNA polymerase itself, an area of research that I have no doubt Helicos is actively pursuing for its later-generation sequencers.
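That geometry is easy to verify from the B-DNA figures quoted above:

```python
# Checking the geometry argument: with B-DNA's ~3.4 nm helical pitch over
# ~10.5 bases, how many bases fit within FRET range of a donor on the tail?
pitch_nm = 3.4
bases_per_turn = 10.5
fret_range_nm = 10.0

rise_per_base = pitch_nm / bases_per_turn        # ~0.324 nm per base
bases_in_range = fret_range_nm / rise_per_base   # ~31 bases

print(f"rise per base : {rise_per_base:.3f} nm")
print(f"bases in range: {bases_in_range:.1f}")
```

About 31 bases, which squares nicely with a ~25bp read length plus some margin.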
Finally, is the acceptor molecule removed, and if so, how? On this topic I am unable to find any definitive information. Traditionally, the acceptor molecule is photobleached after detection using specific laser illumination at its absorbance. This leaves the donor molecule relatively unharmed and capable of donating to another fresh acceptor. One drawback to this technique, however, is that the acceptor molecule is not actually removed, and therefore successive incorporations are compromised via steric interactions. In addition, successive photobleaching of the acceptor molecules will eventually also bleach the donor. Perhaps Helicos has developed a more robust donor and an uncompromising acceptor; in this, anything is possible. Alternatively, the single molecule literature contains many examples of cleavable dyes, either chemically or photocleavable. That the dye is in fact cleaved off and removed seems to be indicated on the Helicos website, but I am not confident about the specific details of those slides. Certainly cleaving and removing the dye is the preferred method in this instance, and if Helicos does not yet employ it, I am sure it is another development for future machines.
Phew! I hope you are still with me after all that. There is no doubt that it has been a challenge to disentangle the methodology from the hype regarding Helicos. However, the benefit is that we get to make an assessment based on quite a bit of science, long before S&P even touches it. At this point I still have some misgivings regarding whether or not the Heliscope will live up to expectations, but we will address these issues next time, as well as go over the financials.
Disclosure: I am long shares of HLCS
Monday, July 9, 2007
Helicos Biosciences part I (a.k.a Next Generation Sequencing overview)
Helicos Biosciences Corporation aims to commercialize a novel platform for “next generation” genomics. Their first product, the Heliscope, is designed to enable ultra-high-throughput genetic analysis based on the direct sequencing of single molecules of DNA and RNA.
Before we begin an in-depth look at Helicos, some background in sequencing is necessary. To begin, I am going to assume that the reader knows what sequencing is. More importantly, we can agree that as the price of personal genomics drops to a reasonable cost, the impact on human biology and modern medicine will be profound. In essence, the hype surrounding the sequencing of the human genome is finally coming to fruition as the cost of individual sequencing approaches the $1000 mark. In addition, for those who may have been napping, I will recap the sequencing news from the last 12 months. Next generation sequencing companies are rapidly being snatched up by larger firms. In May 2006, Agencourt Personal Genomics was bought for $120 million by Applied Biosystems; then in November, Illumina purchased Solexa for a whopping $600 million; and finally, Roche paid $155 million for 454 Life Sciences in March of 2007. What the three acquired companies have in common is that they employ technologies known as "next generation" sequencing. In other words, they have developed technologies that vastly increase the speed and breadth of DNA sequencing over the traditional Sanger method.
Next Generation Sequencing (NGS)
Sanger sequencing is the gold standard for very large projects; unfortunately, it requires a large infrastructure. The current state-of-the-art, the Applied Biosystems 3730xl Genetic Analyzer, has an average read length of 1,000bp and can generate a maximum of 2.1Mbp (2,100,000 bases) of sequence per day. This machine is priced at ~$400,000, and the estimated cost of sequencing a human genome using the 3730xl is $24M.
Most importantly, sequencing a human genome on this machine with six-fold coverage (~18Gbp) would take on the order of two decades. Because of this, large-scale sequencing efforts have been carried out by genome centers, which employ many machines running in parallel.
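As a sanity check on those figures (assuming, generously, that the machine runs at its 2.1Mbp/day maximum every single day):

```python
# Back-of-envelope check of the capillary-sequencer numbers above.
genome_bp = 3_000_000_000   # one human genome
coverage = 6                # six-fold
mbp_per_day = 2.1           # ABI 3730xl maximum daily throughput

total_mbp = genome_bp * coverage / 1e6   # 18,000 Mbp (~18 Gbp)
days = total_mbp / mbp_per_day

print(f"total sequence      : {total_mbp:,.0f} Mbp")
print(f"time on one machine : {days:,.0f} days (~{days / 365:.0f} years)")
```

Roughly 8,600 machine-days, i.e. over two decades on a single instrument, which is exactly why genome centers run rooms full of them in parallel.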
The goal of developing Next Generation Sequencing is to produce technologies that can deliver a complete human genome in a reasonable time-frame on a single sequencing machine, first for $100,000 and eventually for $1,000. Let's have a look at the three top currently marketed products and methods.
Illumina/Solexa 1G genome analyzer
Solexa was the first out of the gate to provide a technology capable of generating a $100,000 genome with their 1G analyzer, which began shipping in the second quarter of 2006. Having just celebrated their 75th order placement, Illumina sits firmly in front with the first-mover advantage in the NGS market.
Run Length: ~2-3 days
Read Length: ~25bp
Raw Base Accuracy: 99.99%
Genomic DNA is first sheared into small fragments, and adapters are ligated onto both ends of each fragment. The DNA is then added to the flow cell, whereby the ends bind to the proprietary surface on the inside of the channels. The free adapters of these fragments form bridges to complementary nearby attached primers. Next, during a process known as solid-phase amplification, the fragments are thermocycled in the presence of nucleotides and polymerase, and the bridges become double-stranded. After denaturation, repeated cycles of this process give rise to random, dense clusters each containing millions of copies of a homogeneous DNA fragment.
Solexa sequencing uses four proprietary, differently colored, fluorescently labeled modified nucleotides to sequence the above clusters on the flow cell surface. Clusters are first referenced via laser excitation of the fluorophores, and their individual locations are captured via a CCD detector. In subsequent cycles, progressive bases are sequenced: nucleotides are added, the entire slide is excited and scanned for individual colored incorporations, and the fluorescent label is then removed so the cycle can repeat for the next base. In this manner each cluster on the slide is sequenced in a massively parallel fashion.
Applied Biosystems SOLiD (Supported Oligo Ligation Detection)
Applied Biosystems, the current leader in sequencing, now has its own product on the market: the SOLiD system, ABI's NGS platform. Shipping of initial units began in June 2007.
Price: $600,000, which includes the instrument, a computing cluster, a high-capacity data storage centre and ancillary equipment for upfront sample preparation.
Throughput: ~500 Mb/day for mate-pair samples (genome resequencing)
Read Length: up to 35bp
Raw Base Accuracy: 99.94%
The technology works by first amplifying DNA fragments onto polystyrene beads using a water-in-oil emulsion polymerase chain reaction (PCR). When the emulsion is broken, the beads float to the top of the sample and are placed on an array. Sequencing primers are then added along with a mixture of four differently fluorescently labelled oligo probes. The probes are eight bases long and hybridise so that the fluorescent signal reports which of the four bases (A, T, C or G) sits at the fifth position in the sequence. A ligase, rather than the polymerase used in standard Sanger sequencing, seals the probe onto the growing strand; after washing and reading the fluorescence signal, the probe is cleaved between the fifth and sixth bases, removing the fluorescent dye from the strand of amplified DNA.
The whole process is repeated with a different sequencing primer, offset by one base, until all of the intervening positions in the sequence have been interrogated. The process allows the simultaneous reading of millions of DNA fragments in a 'massively parallel' manner. This 'sequencing-by-ligation' technique also allows the use of probes that encode two bases rather than one, so mismatched signals flag errors, leading to increased base-calling accuracy.
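The two-base encoding is worth sketching. In SOLiD's published dibase scheme, each adjacent pair of bases maps to one of four colors, and a compact way to express that mapping is an XOR over base indices (the indexing A=0, C=1, G=2, T=3 is a conventional choice; note that colors alone are ambiguous, so decoding needs a known first base, which on the platform comes from the adapter):

```python
# Sketch of SOLiD-style two-base ("color-space") encoding. Identical base
# pairs (AA, CC, GG, TT) map to color 0; the full 16-pair-to-4-color table
# is reproduced here by XOR-ing the two base indices.
IDX = {"A": 0, "C": 1, "G": 2, "T": 3}
BASES = "ACGT"

def encode(seq):
    """Base sequence -> one color per overlapping base pair."""
    return [IDX[a] ^ IDX[b] for a, b in zip(seq, seq[1:])]

def decode(first_base, colors):
    """Decoding is only possible given a known first base."""
    seq = first_base
    for c in colors:
        seq += BASES[IDX[seq[-1]] ^ c]
    return seq

colors = encode("ATGGC")
print(colors)               # [3, 1, 0, 3]
print(decode("A", colors))  # ATGGC
```

Because every base participates in two overlapping color calls, a single sequencing error produces an inconsistent pair of colors and can be flagged, whereas a true SNP changes two adjacent colors in a consistent way.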
Roche/454 Life Sciences Genome Sequencer FLX
Roche Diagnostics began distribution of the GS FLX in November 2007. The FLX system incorporates a number of technological advances over the original GS 20 launched by 454 Life Sciences in October 2005.
Price: ~$500,000. (Academic price?)
Read Length: ~250 bases/read (depending on the organism)
Raw Base Accuracy: 99.5%
Like the 1G analyzer, the GS FLX uses sequencing-by-synthesis (rather than ligation), specifically a method known as pyrosequencing. DNA fragments 300-500 bp in length are ligated to two short adaptors, which provide primers for both amplification and sequencing of the fragment, as well as a biotin tag that immobilises it onto a streptavidin-coated bead.
A subsequent emulsion PCR step gives rise to beads carrying millions of copies of each DNA fragment. These beads are then deposited into the wells of 454's PicoTiterPlate device by centrifugation. The wells are 44 µm across and therefore fit only one bead apiece. In addition, each well has an optical fibre attached to its base; together these fibres form an array leading to a CCD camera.
The fluidics sub-system pumps nucleotides in, one species at a time, in a fixed order. During each nucleotide flow all of the beads are sequenced in parallel, with the polymerase extending the sequencing strand only if the flowed nucleotide is complementary to the template strand. Incorporation of one (or more) nucleotides releases pyrophosphate, which drives a sulfurylase and luciferase reaction cascade producing a light signal that is recorded by the instrument's camera.
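This flow-by-flow readout can be sketched as a toy "flowgram": the signal at each flow is proportional to how many bases are incorporated, which is how pyrosequencing reads a whole homopolymer run in a single flow. For simplicity the sketch matches the flowed base directly against the read sequence rather than modelling the complementary template, and the TACG flow order is an assumption:

```python
# Toy flowgram: nucleotides are flowed in a fixed repeating order; each flow's
# light signal is proportional to the number of bases incorporated (the
# homopolymer run length at the current position). Simplification: we compare
# the flowed base against the read sequence itself, not the complement.
def flowgram(read, flow_order="TACG", n_flows=8):
    """Return a list of (nucleotide, signal) pairs, one per flow."""
    pos, signals = 0, []
    for i in range(n_flows):
        nt = flow_order[i % len(flow_order)]
        run = 0
        while pos < len(read) and read[pos] == nt:
            run += 1
            pos += 1
        signals.append((nt, run))
    return signals

# A TT homopolymer gives a double-intensity signal in a single T flow
print(flowgram("TTACG"))
```

The need to infer run length from signal intensity is also the platform's known weak spot: distinguishing, say, a 7-base from an 8-base homopolymer by light intensity alone is error-prone.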
I have been wanting to take a hard look at the next-generation sequencing market for a little while now, and after the initial background research needed for Helicos I am very glad that I put in the effort to research it thoroughly. The implications of this technology are undoubtedly going to be groundbreaking, from all areas of research right through to the clinic, and as an investor and scientist I believe it will pay to know who the players in the market are.
In summation, the sequencing market is extremely hot and very competitive. However, there may be room for multiple instruments with different advantages depending on the application. For many applications, such as sequencing of everyday plasmids, the above technologies' read lengths are unacceptable (assembling 30 contigs to check the sequence of a 700 bp gene is just not convenient). So I don't expect the trusty 3730XL to disappear anytime soon.
Next time I'll get into Helicos and see what technology they have to throw into this frothing mix, for better or for worse.
Disclosure: I am long shares of HLCS