To: tuck who wrote (244)4/10/2001 2:56:56 PM
From: D. K. G.   of 539
 
Is a Human Proteome Project Next?
the-scientist.com
Large-scale efforts appear likely, though the field lacks the clear goals and speed of the Human Genome Project
By Douglas Steinberg
Graphic: Leza Berardone

 
Three dozen scientists, officials, and executives from academia, government, and business are speaking this week at a conference in McLean, Va., titled "Human Proteome Project: Genes Were Easy." This event, which is expected to draw at least 400 other participants, is the first sizable public meeting devoted to the possibility and advisability of a proteome project, according to organizer Chris Spivey, a conference director at Cambridge Healthtech Institute in Newton Upper Falls, Mass.

The Virginia conference is also hosting the first meeting of the Human Proteome Organisation (HUPO). One of its founders, Ian Humphery-Smith, a professor at the University of Utrecht in the Netherlands, describes HUPO as an international effort to bring commercial and academic groups together to study the output of all human genes. Professing certainty that proteomics is "an ocean in which many can swim and possibly even help one another stay afloat," Humphery-Smith, who is also chief operating officer of the Dutch company Glaucus Proteomics, warns in an E-mail that the alternatives to cooperation are sabotage and self-interested competition.

Whence arises the sudden ferment in proteomics? Joshua LaBaer, director of the two-year-old Institute of Proteomics at Harvard Medical School and a member of HUPO's inaugural council, observes that genomics conferences have recently become "a bit of a scramble to figure out what the next step is, now that the genome is largely completed." Interviews with leading players in both proteomics and genomics indicate that this nagging "what next?" feeling is also taking hold in proteomics, as scientists ask: Is proteomics, in whole or in part, suitable for large-scale efforts, given its diffuse goals and lack of high-throughput tools? If so, who should undertake these efforts? And is the biomedical research community really prepared for another bout of Big Science?

Francis S. Collins, director of the National Human Genome Research Institute (NHGRI), offers some historical perspective on the situation. Once the Human Genome Project (HGP) had begun (several years after its conception in 1985 [1]), "It was six years before a pilot effort to sequence the human genome got under way and nine years before it really scaled up to full production," he notes. "Proteomics is sort of back there in that earlier phase."

One Project or Many?

Defined as the characterization of all proteins, proteomics encompasses nearly all of biology, some observers say. They often couple this blunt description with a more nuanced one. Proteomics is "a style of doing biology," remarks Brian T. Chait, head of the mass spectrometry and gaseous ion chemistry lab at Rockefeller University. "Whereas in the past one might have spent a huge amount of time on a single protein, the idea here is to gather much more data much more quickly." Ruedi Aebersold, a professor at the Institute for Systems Biology in Seattle, describes proteomics as a "change in mindset" involving "a more global type of [protein] analysis."

Some histories of proteomics trace the field back to 1975, when the invention of two-dimensional gel electrophoresis allowed protein biochemists to cut isolated spots out of a gel and sequence their amino acids. Other chronicles start in the early to mid-1990s with the gradual appearance of protein identification methods that used nascent DNA and protein databases to analyze mass spectra of peptide fragments. By all accounts, however, the completion of HGP within the past year has signaled a new era in proteomics by giving researchers a handle on all human protein data.

A commonly expressed opinion is that a single Human Proteome Project can never match HGP's success. Eric S. Lander, director of the Whitehead Center for Genome Research in Cambridge, Mass., notes that biologists simply don't know how to characterize the proteome "from end to end, nailing every protein. The tools are not ready. And it's not clear that [such a project] makes sense." He contrasts proteomics to HGP where "there is a certain fixed number of base pairs--about three billion--and we were going to get them all. And so it had a beginning and an end to it."

Lander and Collins, however, can envision circumscribed proteomics projects that emulate certain HGP features: its scaling-up procedures, its insistence on early data release, and its consortia, which shared ideas and avoided duplicated effort. But duplication isn't necessarily bad, argued Denis F. Hochstrasser in a panel discussion at the New York Academy of Sciences on January 31. "When people do similar things to try to get better, that's when you get faster," said Hochstrasser, a professor at the University of Geneva and a director of Geneva Proteomics Inc. "If you try to regulate everything and say, 'OK, this group here in the [United States] does this, this group in Europe does this'--I don't think it would be as efficient."

Limited projects might focus on only one area of proteomics. To determine the functions of proteins, subdisciplines of proteomics study the structures, locations, or interactions of target molecules. Individual investigators with large labs, such as Stanley Fields at the University of Washington, are already identifying thousands of protein-protein interactions in yeast [2]. But an organized network of labs might be advisable for analyzing higher organisms. Structural genomics has already spawned a host of consortia as well as a $150 million Protein Structure Initiative at the National Institute of General Medical Sciences (NIGMS).

The initiative's director, John C. Norvell, says that two factors made structural genomics suitable for a large-scale effort. The first was the field's success during the past decade, thanks to improvements in equipment and techniques. "The total time for determining a structure has gone from years to months, and now sometimes it's just days," he notes. The second consideration was the choices researchers made when they had total discretion to pick their own projects. They opted to study proteins with interesting biological or medical properties and ignored proteins whose sole distinction was that their structures could illuminate the structures of many similar proteins. "We decided an organized effort was thus needed," recounts Norvell.

This rationale for NIGMS's Protein Structure Initiative suggests a broader rule. "Any time when one is looking at, in effect, a cataloguing situation or a situation where one wants to establish an inventory of molecules or specific features of molecules, then I think a large-scale effort is immediately suitable," says Aebersold. "Questions aimed at the dynamics of a biological system--how things change, how things turn over, how things are regulated--are, in my view, much less amenable to a large-scale effort."

Public, Private, or Both?
Who will undertake massive proteomics projects? The conventional wisdom is that companies have taken a commanding lead in this field. But that's difficult to prove, and the fact remains that government and academia are already heavily committed to proteomics. Besides NIGMS's Protein Structure Initiative, the National Institutes of Health "support a vast number of functional studies of proteins, both at the level of individual proteins and genomewide studies," according to David Eisenberg, head of the UCLA-DOE Laboratory of Structural Biology and Molecular Medicine at the University of California, Los Angeles.

Collins discloses that NHGRI's new program to establish Centers of Excellence in Genomic Science is considering a number of applications focused on proteomics. Susan E. Old, a health scientist administrator at the National Heart, Lung, and Blood Institute, notes that three of the institute's 11 programs for genomic applications, funded last fall, contain proteomics projects.

Government involvement on a larger scale has been hampered, in part, because "there's nobody out there advocating for it," says John R. Yates III, an associate professor of cell biology at Scripps Research Institute who develops proteomics techniques. Why aren't scientists assuming the kind of role taken by James D. Watson in promoting HGP? Yates responds that "the number of people who were in proteomics early in the United States"--and thus who might be expected to spearhead a big effort--"is actually fairly small." (Much proteomics activity has taken place in Europe.)

Yet even a determined advocate won't guarantee government involvement if politics intrudes. Around 1980, when N. Leigh Anderson was at Argonne National Laboratory, he tried to get authorities to fund a proteome project called the Human Protein Index. (He now acknowledges that the proposal was too advanced for its time.) "Ronald Reagan was elected, and Alan Cranston [D.-Calif.], who was providing a lot of the guidance and push behind the project, was no longer the majority whip in the Senate," Anderson recalls. "At that point, [the project] didn't make any further progress." Anderson's take-home message: "You have to be pretty sophisticated to be able to mesh scientific discoveries with political cycles, but it's not beyond the bounds of possibility."

Anderson, currently chief scientific officer of Large Scale Biology Corp. of Vacaville, Calif., predicts that large-scale proteome analysis "will be undertaken primarily by private entities"--and should be. Such analysis, he says, "requires a number of methodologies for which the major automated systems are just too big, expensive, and complicated to be sold to most academic institutions. And the efforts required to do certain things in proteomics may be larger than are going to fit well in most academic institutions." As for government labs, Anderson argues that they should dominate only "when there is a very strong vision of what should be done. Think about the Manhattan Project and NASA, the classic examples of Big Science. Those are cases where national prestige and security were really on the line, and the objective was initially absolutely concrete."

Should the government's HGP consortium be added to that list? "Some people would argue it didn't work as well as it might have because ultimately the product of it was a hybrid between a cottage industry and what had been envisioned originally as more centrally managed," says Paul Gilman, director of policy planning at Rockville, Md.-based Celera Genomics Group, which mounted a rival genome project. "So it was not a centrally managed effort but a centrally administered effort that was disseminated out to a number of different universities. The cottages were bigger, but it was still a cottage effort. If Celera demonstrated anything, [it was] that a centrally managed effort does take on certain economies of effort and money."

Some observers tout the advantages of public-private collaborations. Referring to a consortium identifying single-nucleotide polymorphisms and another sequencing the mouse genome, Collins remarks: "I would very much hope to see similar kinds of partnerships developed in proteomics because it is pretty basic information, which one could call pre-competitive. It's the kind of thing everybody wants to have access to, doesn't want to have to pay for, [and] wants to be sure is done right."

Harvard's Institute of Proteomics, now supported mainly by the university, is seeking both NIH and industry funds to finance its growing repository of expression-ready human genes, according to director LaBaer. "We think one could build this project along the lines of a consortium," he says. "Which is to say that some significant fraction of it would be paid for by public money, but also a significant fraction would be paid for by industry money. And the hope is that we can convince industry that this is important enough and should be available to enough people that they ought to help pay for it."

Anderson himself points to collaborations in which his company analyzed protein samples supplied by the National Cancer Institute as a "perfectly plausible model" [3]. And Celera, criticized for being uncooperative during HGP, might be more flexible this time around. "We recognize that to do [proteomics] justice, you need a very broad set of capabilities and areas of expertise," says Gilman. "Some of those you can create in-house; some of them you can acquire; some of them you can collaborate; some of them you just cooperate. So we're looking to the full range, whether it's from the private sector or the public sector."

Ready for Big Science?
While proteomics might soon be ripe for large-scale projects, it's not clear whether the biomedical research community is similarly ready. "Academia hasn't fully embraced the idea of large biological science yet," asserts LaBaer. "Biology will always have a contingent of cottage industry--small labs that focus on very specific questions," he adds. "And at least for now, there's a very strong element in biology that wants that. In fact, I would say the majority of biologists prefer that."

Lander, who oversaw one of HGP's sequencing centers, also recognizes this individualistic spirit. "I think most scientists are such that, if they can get to Mars by themselves, they'd love to," he says. "And they'd prefer that. But between getting to Mars together with some great colleagues or not getting to Mars at all, I think most scientists are psychologically quite prepared to do the former because they really care about the goals."

Lander, however, seems less concerned about biologists' psyches than about institutional obstacles to the success of future Big Science projects. "Tenure processes for junior faculty look at what [faculty members] do alone, as opposed to in collaboration," he observes. "And we don't have career paths for senior professional scientists who aren't on the tenure track at a university. We're not rewarding these multidisciplinary professionals with career paths the same way, for example, that they are in electronics and physics."

As the successor to genomics, proteomics won't pause while such deficiencies are studied, discussed, and possibly remedied. Consider what's happening at The Institute for Genomic Research (TIGR), the independent nonprofit organization in Rockville, Md., that played an important early role in the Human Genome Project. From DNA sequencing, TIGR expanded into microarray expression analysis. Now it is applying for NIH grants to fund its first proteomics studies, which will focus on virulence factors in bacteria. Its rationale: The mRNA levels revealed by microarrays don't always correlate with protein levels. William C. Nierman, TIGR's vice president for research, says that the institute must break into proteomics "or we're going to be left in the dust." With sentiments like that circulating among life scientists, can large-scale proteomics be that far in the future?

Douglas Steinberg (dougste@attglobal.net) is a freelance writer in New York.

References
1. L. Roberts, "Controversial from the start," Science, 291:1182-8, Feb. 16, 2001.