By the 1880s, it was recognized that the virulence of infectious diseases varied with many factors, including the means and duration of exposure, the way in which the germ entered the body, and the physiological status of the host. By the turn of the century, the fundamental question concerning scientists investigating the immune response was: is the mechanism of innate and acquired immunity humoral or cellular?
When Behring was awarded the first Nobel Prize in Medicine for his work on serum therapy, he made a special point of reviewing the history of the dispute between cellular and humoral pathology. He considered antitoxic serum therapy "humoral therapy in the strictest sense of the word." Humoral theory, Behring predicted, would put medicine on the road to a strictly scientific etiological therapy in contrast to traditional, nonspecific, symptomatic remedies. The scientific debate often degenerated into vicious personal attacks, but Joseph Lister, ever the gentleman, delicately referred to this controversial era as the "romantic chapter in pathology."
As serology was transformed into immunology, scientists saw the new discipline rapidly outgrowing its parental disciplines of microbiology and toxicology. Studies of cellular mechanisms of defense seemed to be a relic of a less sophisticated era of biology closely associated with old-fashioned ideas such as those of Élie Metchnikoff (1845–1916). Other immunologists of this period were primarily concerned with serum antibodies and tended to ignore the role played by cells, but Metchnikoff, discoverer of phagocytes (the "eating cells" that devour invading microorganisms) and the process of phagocytosis, was more interested in the defenses of the host than the depredations of the pathogen. While most scientists argued that specific chemical entities in the blood defended the body from bacteria and toxins, Metchnikoff followed his own idiosyncratic hypotheses concerning evolution, inflammation, immunity, senility, and phagocytosis. When he shared the 1908 Nobel Prize for Physiology or Medicine with Paul Ehrlich, Metchnikoff was praised as the first scientist to establish an experimental approach to the fundamental question of immunity, that is: how does the organism overcome disease-causing microbes?
Through personal experience Metchnikoff knew how little physicians could do for victims of infectious diseases. His first wife had been so weakened by consumption that she had to be carried to their wedding. When she died five years later, Metchnikoff tried to end his own life by swallowing a large dose of opium. With his second wife close to death from typhoid fever, Metchnikoff inoculated himself with the spirochetes thought to cause relapsing fever so that his death would be of service to science. Fortunately, the excitement generated by the discovery of phagocytosis rescued Metchnikoff from the depression that had driven him to attempted bacteriological suicide. From 1888 on, the Pasteur Institute provided a refuge in which Metchnikoff could pursue research problems that were creative and original to the point of eccentricity. Primarily a zoologist, influenced as much by Charles Darwin as by Pasteur or Koch, Metchnikoff developed theories of inflammation and immunity that grew out of his evolutionary vision of comparative pathology.
Studies of inflammation that began with starfish larvae led Metchnikoff to the conclusion that phagocytosis was a biological phenomenon of fundamental importance. While observing the interaction between phagocytes and bacteria, Metchnikoff discovered that phagocytosis was greatly enhanced in animals previously exposed to the same kind of bacteria. He therefore concluded that mobile phagocytes were the primary agents of inflammation and immunity. In his Nobel Prize lecture, he expressed hope that people would see his work as an example of the "practical value of pure research." Inspired by Metchnikoff's "phagocyte theory," some surgeons attempted to rush white corpuscles to the rescue by introducing various substances into the abdominal cavity or under the skin. Another follower of Metchnikoff's theories systematically applied cupping-glasses and rubber ligatures around the site of abscesses and similar infections. The localized edema produced by these procedures was supposed to attract an army of protective phagocytes.
Confident that science would eventually free human beings from the threat of disease, Metchnikoff applied his theory of the phagocyte to the specter of senility. Reflecting on the principles of comparative pathology, he concluded that phagocytes were primarily responsible for the signs and symptoms of senility. From gray hair and baldness to weakness of bone, muscle, and brain, Metchnikoff saw the telltale footprints of myriads of motile cells "adrift in the tissues of the aged." Noxious influences, such as bacterial toxins and the products of intestinal putrefaction, allegedly triggered the transformation of friendly phagocytes into fearsome foes. Even though Metchnikoff believed that phagocytes caused senility, he warned that destroying these misguided cells would not prolong life, because the body would then be left defenseless in the struggle against pathogenic microbes.
After comparing the life spans of various animals, Metchnikoff concluded that the organs of digestion determined length of life. Specifically, the problem resided in the large intestine, where microbial mischief produced "fermentations and putrefactions harmful to the organism." Stopping just short of a call for prophylactic removal of this "useless organ," Metchnikoff suggested that disinfecting the digestive tract might lengthen life. Unfortunately, traditional purges and enemas seemed to harm the intestines more than the microbes. Since acids could preserve animal and vegetable foods, Metchnikoff concluded that lactic fermentation might prevent putrefaction within the intestines. In practical terms, his advice could be summarized by the motto: "Eat yogurt and live longer."
Although scientists generally ignored Metchnikoff's theory of the treacherous phagocytes and the useless large intestine, his ideas about the positive and negative activities of phagocytes and the ambiguity of the inflammatory response were remarkably prescient. When the body responds to noxious stimuli, the site of the injury exhibits what the Roman writer Celsus called the cardinal signs of inflammation and becomes red, swollen, warm, and painful. Although the inflammatory response is most noticeable on the skin, it also occurs internally, in response to viral invaders or spoiled food. Thus, although inflammation is the body's normal protective reaction, in many cases inflammation can harm the tissues it is meant to heal. This occurs in diseases like rheumatoid arthritis and multiple sclerosis. In the elderly, the destructive effects of inflammation may be involved in other common chronic diseases, such as arteriosclerosis, diabetes, Alzheimer's disease, osteoporosis, asthma, cirrhosis of the liver, some bowel disorders, psoriasis, meningitis, cystic fibrosis, and cancer. Indeed, some researchers suggest that the use of anti-inflammatory drugs like ibuprofen or naproxen may prevent or delay the development of some of the chronic and debilitating diseases of old age, such as Alzheimer's disease.
Of course the body's failure to mount an effective defense against some pathogens was well known, but the discovery by Charles Robert Richet (1850–1935) and Paul Jules Portier (1866–1962) that the immune system could react to certain antigens with life-threatening hypersensitivity was unexpected. Richet, who won the Nobel Prize in 1913, coined the term "anaphylaxis" to describe this dangerous response. Based on the Greek word phylaxis, meaning protection, anaphylaxis referred to "that state of an organism in which it is rendered hypersensitive, instead of being protected." Violent itching, vomiting, bloody diarrhea, fainting, choking, and convulsions characterized this state of hypersensitivity. In its most severe form, anaphylactic shock could cause death within a few minutes of exposure to the offending antigen. Further investigations proved that just as it was possible to transfer passive immunity, it was also possible to transfer the anaphylactic condition via serum.
Anaphylaxis seemed to be a peculiar exception to the generally beneficial workings of the immune system. Thus, many scientists believed that immunology would provide the key to establishing powerful new approaches to therapeutics. A good example of the optimism characteristic of this early phase of immunology is provided by Sir Almroth Wright (1861–1947), a man who was expected to take the torch from Pasteur and Koch and illuminate new aspects of experimental immunization and medical bacteriology. Wright expected his work in the Inoculation Department at St. Mary's Hospital to bring about a revolution in medicine, but he is generally remembered only as Alexander Fleming's mentor.
A man of broad interests and inflexible opinions, Wright published about 150 books and papers on science, intellectual morality, and ethics. In addition to his scientific articles, Wright often used the British newspapers to vent his opinions on issues ranging from the ignorance of army officials to the campaign for woman suffrage, which he vehemently opposed. While professor of pathology at the Army Medical School in Royal Victoria Hospital, Wright developed sensitive laboratory methods of diagnosing the "army fevers" that killed more soldiers than bullets. Using a diagnostic test based on what he called the agglutination effect (the clumping of microbes in response to serum from patients recovering from a disease), Wright prepared a vaccine that apparently protected monkeys from Malta fever. In the great tradition of scientists who served as their own guinea pigs, Wright injected himself with his vaccine. Unfortunately, Wright was not as lucky as his monkeys.
While recovering from Malta fever, Wright began planning a major study of typhoid fever. During the 1890s, this dreaded disease claimed tens of thousands of lives in the United States and Great Britain. The case fatality rate varied from 10 to 30 percent, but recovery was a slow and unpredictable process. Using himself and his students as guinea pigs, Wright found that heat-killed cultures of typhoid bacilli could be used as vaccines. Sir William Boog Leishman's (1865–1926) study of typhoid cases in the British Army between 1905 and 1909 provided the first significant documentation of the value of antityphoid inoculations. According to Leishman, the death rate of the unvaccinated men was 10 times that of the inoculated group. Nevertheless, at the beginning of World War I, antityphoid inoculations in the British Army were still voluntary.
Openly contemptuous of the "military mentality," Wright was happy to resign from the Army Medical Service when he was offered the position of pathologist at St. Mary's Hospital in 1902. Although he received only a small salary and meager facilities, and was responsible for many tedious and time-consuming duties, he attracted eager disciples and hordes of desperate patients. With the fees charged for vaccine therapy, Wright's Inoculation Department became a flourishing and financially rewarding enterprise. According to Wright, ingestion of microbes by phagocytes required the action of certain substances in the blood that he called "opsonins," from the Greek opsono, meaning "I prepare victuals for." Wright's vaccines were designed to increase the so-called opsonic index of the blood by making pathogenic microbes more attractive and digestible.
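Wright's opsonic index was, in essence, a simple ratio: the average number of bacteria ingested per phagocyte when microbes were incubated with a patient's serum, divided by the same average obtained with serum from healthy controls. As a rough illustration (the function name and the microscope counts below are hypothetical, not Wright's data), the calculation can be sketched as:

```python
from statistics import mean

def opsonic_index(patient_counts, control_counts):
    """Ratio of the mean number of bacteria ingested per phagocyte in
    patient serum to the mean in normal (control) serum. Values well
    below 1.0 were read as a deficiency of opsonins in the patient."""
    return mean(patient_counts) / mean(control_counts)

# Hypothetical counts: bacteria seen inside each of five phagocytes
# after incubation with the two sera.
patient = [2, 3, 1, 2, 2]   # mean 2.0
control = [5, 4, 6, 5, 5]   # mean 5.0

print(round(opsonic_index(patient, control), 2))  # prints 0.4
```

Wright's vaccine therapy was then aimed at driving this index back toward, or above, 1.0.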
Patients suffering from acne, bronchitis, carbuncles, erysipelas, and even leprosy submitted to Wright's experimental inoculations and blood tests. Doubtless many patients eventually recovered from self-limited infections, despite the therapy rather than because of it. Disdainful of statistical evaluations of medical interventions, Wright exhibited great confidence in his methods and warned reactionary physicians that they would be "degraded to the position of a head nurse" as the art of medicine was transformed into a form of applied bacteriology. By the end of World War II, it was obvious that Wright's opsonically calibrated vaccines were no more successful than Metchnikoff's attempts to neutralize the harmful effects of phagocytes with yogurt. Even Wright's admirers were forced to conclude that the vaccines dispensed by his Inoculation Department were generally "valueless to the point of fraudulence." British playwright and social critic George Bernard Shaw (1856–1950) immortalized Almroth Wright's eccentricities in The Doctor's Dilemma, but scientists remembered him as "Sir Almost Wright."

Reviewing what was known about immunology in the 1920s, the eminent physiologist Ernest H. Starling (1866–1927) concluded that the only thing perfectly clear about the immune system was that "immunity, whether innate or acquired, is extremely complex in character." Further studies of the system have added more degrees of complexity and controversies at least as vigorous as those that characterized the conflict between humoral and cellular theory. Immunology is a relatively young field, but its twentieth-century evolution was so dynamic that it ultimately became one of the fundamental disciplines of modern medicine and biology. Discussions of AIDS, cancer, rheumatoid arthritis, metabolic disorders, and other modern plagues are increasingly conducted in the arcane vocabulary of immunobiology.
Modern explanations for the induction of antibodies and their remarkable diversity and specificity can be divided into information, or instructionist, theories and genetic, or selectionist, theories. According to the information theory of antibody synthesis, the antigen dictates the specific structure of the antibody by direct or indirect means. Direct instruction implies that the antigen enters a randomly chosen antibody-producing cell and acts as a template for the production of antibodies with a configuration complementary to that antigen. An indirect template theory suggests that when an antigen enters an antibody-producing cell it modifies the transcription of immunoglobulin genes and, therefore, affects the sequence of the amino acids incorporated into the antibodies produced by that cell and its daughter cells.
A genetic, or selectionist, theory of antibody production assumes that information for the synthesis of all possible configurations of antibodies is contained in the genome and that specific receptors are normally present on immunocompetent cells. Selectionist theories presuppose sufficient natural diversity to provide ample opportunities for accidental affinity between antigen and immunoglobulin-producing cells. In this scenario, the antigen acts as a kind of trigger for antibody synthesis.
One of the first modern theories of antibody formation, Paul Ehrlich's side-chain theory, was an attempt to provide a chemical explanation for the specificity of the antibody response and the nature of toxins, toxoids, and antibodies. According to this theory, antibody-producing cells were studded with "side-chains," that is, groups capable of specifically combining with antigens such as tetanus toxin and diphtheria toxin. When a particular antigen entered the body, it reacted with its special side-chain. In response, the affected cell committed itself to full-scale production of the appropriate side-chain. Excess side-chains became detached and circulated in the body fluids, where they neutralized circulating toxins. Like a key in a lock, the fit between antigen and antibody was remarkably specific, although it was presumably due to accident rather than design.
Karl Landsteiner (1868–1943) argued that Ehrlich's theory was untenable primarily because it presupposed an "unlimited number of physiological substances." However, it was Landsteiner's demonstration that the body is capable of making antibodies against "haptens" (small molecules, or synthetic chemical radicals, that were linked to proteins) that transformed the supposed number of antibodies from unlimited, in the sense of very large, to unlimited, in the sense of almost infinite. The implications of this line of research were so startling that Landsteiner, who won the 1930 Nobel Prize in Medicine for his discovery of the human blood groups, considered his development of the concept of haptens and the chemical approach to immunology a much greater scientific contribution.
It had always been difficult to imagine an antibody-producing cell carrying a large enough array of potentially useful side-chains to cope with naturally occurring antigens. Imagining that evolution had equipped cells with side-chains for the synthetic antigens produced by ingenious chemists was essentially impossible. No significant alternative to the genetic theory emerged, however, until 1930, when Friedrich Breinl (1888–1936) and Felix Haurowitz (1896–1988) proposed the first influential instructionist theory, which they called the "template theory." According to this theory, an antigen enters a lymphocyte and acts as a template for the specific folding of an antibody. Many kinds of objections were offered in response to this hypothesis, but proof that antibodies differ in their amino acid sequence made early versions of this theory untenable. A long line of complex clinical puzzles and methodological challenges culminated in the complete determination of the amino acid sequence of an entire immunoglobulin molecule in 1969 by Gerald M. Edelman (1929–) and his associates. Edelman and Rodney R. Porter (1917–1985) were awarded the Nobel Prize in 1972 in recognition of their work on the biochemical structure of antibodies.
The instructionist theory of antibody production was challenged in 1955 by Niels Kaj Jerne's (1911–1994) "natural-selection theory," which has been described as a revised and modernized form of Ehrlich's classical side-chain theory. Jerne worked at the Danish State Serum Institute before earning a medical degree at Copenhagen. He served as the chief medical officer of the World Health Organization from 1956 to 1962 and director of the Institute of Immunology at Basel from 1969 to 1980. According to Jerne's natural-selection theory, an antigen seeks out a globulin with the appropriate configuration, combines with it, and carries it to the antibody-producing apparatus. Although Jerne introduced his theory in the 1950s, it was not until 1984 that he was awarded the Nobel Prize in Medicine or Physiology "for theories concerning the specificity in development and control of the immune system and the discovery of the principle for production of monoclonal antibodies." In his lecture at the Nobel ceremonies Jerne said, "My concern has always been synthetic ideas, trying to read road-signs leading into the future." Jerne's vision of the natural-selection theory of antibody formation and his complex network theory of the immune response provided the framework for a new phase in the development of cellular immunology. Jerne's early publications served as a challenge to the instructionist theories that had become the dominant paradigm of immunology.
The natural-selection theory implied that the body's innate ability to generate a virtually unlimited number of specific antibodies was independent of exposure to foreign antigens. Normal individuals are born with the genetic capacity to produce a large number of different antibodies, each of which has the ability to interact with a specific foreign antigen. When the immune system encounters a novel antigen, the pre-existing antibody molecule that has the best fit interacts with it, thus stimulating the cells that produce the appropriate antibody. In the 1970s, Jerne elaborated his network theory as an explanation for the regulation of the immune response, essentially through an antibody cascade leading to anti-antibodies, anti-anti-antibodies, and so forth, and the ability of the immune system to balance the network by stimulating or suppressing the production of particular antibodies. This theory provided vital insights into the body's response to infectious diseases, cancers, allergies, and autoimmune disease.
Modified versions of antibody-selection theory solved the primary difficulty of Jerne's original concept by substituting randomly diversified cells for his randomly diversified antibody molecules. That is, cells are subject to selection, not antibodies. In particular, the cell-selection, or clonal-selection, theory, independently proposed by Sir Frank Macfarlane Burnet (1899–1985) and David Talmage (1919–), revolutionized ideas about the nature of the immune system, the mechanism of the immune response, and the genesis of immunologic tolerance. Burnet's clonal-selection theory encompassed both the defense mechanism aspect of the immune system and the prohibition against reaction to "self." During development, "forbidden clones" (cells that could react against self) were presumably eliminated or destroyed. Macfarlane Burnet and Sir Peter Medawar (1915–1987) were awarded the Nobel Prize in 1960 for their work on immunological tolerance.
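The selectionist logic described above can be caricatured in a few lines of code. In this deliberately crude sketch (the string-matching "affinity" measure and all names are invented for illustration, not a model used by Jerne, Burnet, or Talmage), each cell carries one randomly generated receptor; the antigen does not instruct the cell how to build an antibody, it merely selects the best pre-existing fit, and only that cell is expanded into a clone:

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # one-letter amino-acid codes

def affinity(receptor, antigen):
    # Toy "fit": count of matching positions between two short strings.
    return sum(r == a for r, a in zip(receptor, antigen))

random.seed(0)
# Selectionist premise: a randomly diversified repertoire exists
# BEFORE the antigen ever arrives.
repertoire = ["".join(random.choices(ALPHABET, k=6)) for _ in range(1000)]

antigen = "ACDEFG"
# The antigen selects the best available fit from the repertoire...
best = max(repertoire, key=lambda r: affinity(r, antigen))
# ...and that single cell is expanded into a clone of identical producers.
clone = [best] * 500
```

In an instructionist (template) model, by contrast, the antigen itself would shape the receptor; here its only role is to trigger the expansion of something the repertoire already contained.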
When Macfarlane Burnet reviewed the state of immunology in 1967, 10 years after he proposed the clonal-selection theory, he was pleased to report that the field seemed to have "come of age." Unlike Ehrlich and Landsteiner, who had emphasized the importance of a chemical approach to immunology, Burnet placed his emphasis on biological concepts: reproduction, mutation, and selection. By the 1980s, the cell-selection theory had gone beyond general acceptance to the status of "immunological dogma." This transformation was stimulated by the explosive development of experimental cellular immunology. Immunology laboratories were awash with T cells and B cells, helper cells, suppressor cells, killer cells, and fused cells producing monoclonal antibodies. Throughout the 1970s and 1980s, immunologists were awarded Nobel Prizes for remarkable theoretical and practical insights into organ transplant rejection, cancer, autoimmune diseases, and the development of new diagnostic and therapeutic tools of great power and precision. A century of research in immunology since the time of Louis Pasteur had created as many questions as it had answered, but it clearly established the fact that much of the future of medical theory and practice would be an outgrowth of immunobiology.
With cardiovascular disease and cancer replacing the infectious diseases as the leading causes of morbidity and mortality in the wealthy, industrialized nations, immunology seemed to offer the answer to the riddle of health and disease, just as microbiology had provided answers to questions about the infectious diseases. In the 1950s, Macfarlane Burnet expressed his belief that immunology was ready for a new phase of activity that would reach far beyond the previous phase inspired by Paul Ehrlich. Microbiology and chemotherapy had provided a powerful arsenal of magic bullets directed against the infectious diseases. By combining molecular biology and immunology, scientists were attempting to create a new generation of genetically engineered drugs, including so-called smart bombs and poisoned arrows. These new weapons would be designed to target not only old microbial enemies, but also modern epidemic conditions and chronic diseases, such as cardiovascular disease, cancer, Alzheimer's disease, autoimmune disorders, allergies, and organ rejection.
When Cesar Milstein (1927–2002) and Georges Köhler (1946–1995) shared the 1984 Nobel Prize with Jerne, they were specifically honored for the discovery of "the principle for production of monoclonal antibodies." In his Nobel Lecture, Milstein stressed the importance of the fact that the hybridoma technology was an unexpected by-product of basic research that had been conducted to understand the immune system. It was, he said, a clear example of the value of supporting research that might not have an obvious immediate practical application. Monoclonal antibody production was one of the principal driving forces in the creation of the biotechnology industry. It opened the way for the commercial development of new types of drugs and diagnostic tests. Monoclonal antibodies could be equipped with markers and used in the diagnosis of a wide variety of illnesses and the detection of viruses, bacteria, toxins, drugs, antibodies, and other substances.
In 1969, Jerne had predicted that all the interesting problems of immunology would soon be solved and that nothing would remain except the tedious details involved in the management of disease. Such drudgery, he suggested, was not of interest to scientists, but would provide plenty of work for physicians. The innovative hybridoma technique developed by Milstein and Köhler in 1975 falsified that prediction and made it possible to explore many unexpected aspects of the workings of the immune system. Contrary to Jerne's prediction, researchers have not run out of questions to ask about the immune system, nor have there been any complaints that the field has become less exciting.
The characteristic of the immune system that is so important in guarding the body against foreign invaders, that is, the ability to produce an almost unlimited number of different antibodies, represents a problem for scientists trying to understand the system. Immunologists who have struggled with the phenomenon of antibody diversity estimate that a mouse can make millions of different antibodies. The technique developed by Milstein and Köhler has transformed the study of antibody diversity and made it possible to order what Milstein called "antibodies à la carte." The new generation of magic bullets that might be derived from hybridomas could be compared to the derivatives once created from atoxyl and the aniline dyes. Hybridomas are made by fusing mouse myeloma tumor cells with spleen cells derived from a mouse that was previously immunized with the antigen of interest. The hybrid cells produce large quantities of specific antibodies, which are called monoclonal antibodies (Mabs). By combining the techniques of immunology and molecular biology, scientists expect to design new generations of magic bullets. As Sir Almroth Wright predicted, the healer of the future might well be an immunologist.
By 1980, only five years after Köhler and Milstein first published an account of their technique, monoclonal antibodies were well-established tools in many areas of biological research. By 1990, thousands of monoclonal antibodies had been produced and described in the literature. Researchers predicted that monoclonal antibodies might be used as novel vaccines and in the diagnosis and treatment of cancers. In cancer therapy, monoclonal antibodies might function as smart bombs, targeted against cancer cells to provide site-specific delivery of chemotherapeutic drugs. The concept is simple in theory, but difficult to achieve in practice. In part this is due to the fact that, despite new insights into the etiology of cancer, discussions of "cancer" are rather like nineteenth-century debates about the nature of fevers, plagues, pestilences, and infectious diseases. The complex constellation of disorders subsumed by the category commonly called cancer looks quite different to physicians, patients, pathologists, oncologists, and molecular biologists. There is a great gap between understanding the nature of oncogenes (genes that appear to induce malignant changes in normal cells when they are affected by carcinogenic agents), transforming retroviruses (RNA viruses that can transform normal cells into malignant cells), proto-oncogenes, and so forth, and establishing safe and effective means of preventing and treating cancers.
Studies of viral infections and possible links between cancer and viruses led to hope that some endogenous agent might serve as a universal viral antidote and cancer drug. Interferon, a protein that interferes with virus infections, was discovered in the 1950s by researchers studying the growth of influenza virus in chick embryonic cells. Despite early excitement about interferon, the substance was very difficult to isolate and characterize. By 1983, about 20 distinct human interferons had been identified. The interferons were involved in the regulation of the immune system, nerve function, growth regulation, and embryonic development. Experiments in the late 1960s suggested that, at least in mice, interferon inhibited virus-induced leukemias and the growth of transplantable tumors. Interferon's potential role in cancer treatment attracted the attention of the media, patient advocacy groups, and members of Congress.
Preliminary tests of interferon's clinical efficacy against osteogenic sarcoma (a malignant bone cancer) in the early 1970s interested virologist Mathilde Krim (1926–), who launched a crusade to support research on interferon as an antitumor agent. Krim, who had earned her Ph.D. in 1953 at the University of Geneva, Switzerland, joined the Sloan-Kettering Institute for Cancer Research in 1962. From 1981 to 1985, she served as Director of the Institute's Interferon Laboratory. Interferon was initially promoted as a potential "miracle drug," which would presumably be well tolerated because it was a "natural agent." Clinical trials were, however, quite disappointing in terms of effectiveness and safety. Adverse reactions to interferon included fever, chills, fatigue, loss of appetite, decreased white-blood-cell counts, and hair loss. Through further development, however, interferon gained a role in the treatment of certain cancers and viral diseases. In addition to her interferon work, Krim became well known as a health educator and AIDS activist. She was the founder of the AIDS Medical Foundation (1983), which later became the American Foundation for AIDS Research. In 2000, President Bill Clinton awarded the Presidential Medal of Freedom to Krim for her contributions to AIDS education and research.
Ever since 1971, when President Richard M. Nixon (1913–1994) declared war on cancer, oncologists and cancer patients have been caught in cycles of euphoria and despair. Since the 1970s, the phrase "war on cancer" has been used to stimulate spending on research. Yet the total cancer death rate has not significantly declined since the declaration of war. Critics insisted that the war was profoundly misguided in terms of its overly optimistic predictions and its implementation. Moreover, the rhetoric of the cancer crusade often conveyed false and misleading information to the general public. Premature reports of "breakthroughs" and "miracle cures" convinced many people that cancer is essentially a single disease and that sufficient spending would soon result in the discovery of a magic bullet. Scientists point out that the funds and technologies associated with the war on cancer stimulated revolutionary developments in molecular biology and biotechnology. Congress and the public, however, prefer to support mission-oriented research rather than basic scientific investigations.