Misjudging the value of colchicine for gout caused Sydenham much personal discomfort, but his studies of quinine for malaria offered relief from the debilitating intermittent fever that is still worthy of the title "million-murdering Death." Symptoms of malaria include raging thirst, headache, fatigue, and delirium. Patients suffer from bouts of fever and chills that alternate with periods of apparent remission. If we consider the impact of diseases on populations over time, as measured by the greatest harm to the greatest number, malaria has been the most devastating disease in history. Scientists and historians generally agree that malaria has been a significant force in human evolution and in determining the success or failure of settlement patterns and colonial ventures throughout the world. Malaria seems to have achieved its widest distribution in Europe during the seventeenth century, but it was not uncommon there even in the nineteenth century. According to the World Health Organization, malaria attacks about 300 million people a year and causes more than 1 million deaths, about 90 percent of them in Africa. Some authorities say that deaths from malaria may actually number about 2.7 million a year. Malaria was Africa's leading killer until 1999, when it was displaced by AIDS. One of the great accomplishments of seventeenth-century medical science was the discovery that quinine could be used as a specific remedy for malaria. Quinine is the active ingredient in cinchona (also known as Peruvian bark, Jesuits' powder, or Devil's bark), a traditional Peruvian remedy supposedly named after the Countess of Chinchón, wife of the Governor of Peru. The story of the feverish Countess appears to be pure fiction, but, with or without her blessings, the New World remedy spread quickly throughout Europe.
As demand for the wonder-working bark drove its price higher and higher, charlatans amassed great fortunes selling secret remedies containing Peruvian bark or useless imitations that mimicked quinine's bitter taste. By the end of the 1660s, confidence in Peruvian bark had dropped precipitously because many physicians claimed that the drug was responsible for dangerous relapses and sudden deaths.
Careful study convinced Sydenham that the bark was safe and effective; adverse reactions were due to improper use rather than to any evil in the drug itself. Peruvian bark was important not only as a remedy for malaria, but also for its symbolic value in challenging the ancient foundations of pharmacology. Medical dogma called for remedies that were complex and purgative, but Peruvian bark cured malaria without purgation. Orthodox medical theory condemned the new remedy as "irrational" because it was theoretically impossible for healing to occur without the expulsion of morbid matter. Therefore, while the bark seemed to interrupt cycles of intermittent fevers, opponents of Peruvian bark assumed that its use led to the accumulation of dangerous materials within the body. Sydenham argued that experience was more compelling than theory; the drug was safe and effective if dosage, timing, and duration of treatment were carefully regulated. In terms of medical practice and theory, therefore, quinine was as revolutionary as gunpowder had been to the art of warfare. Despite Sydenham's conviction that the bark was harmless, the use of quinine can cause some very unpleasant side effects, including headaches, vomiting, rashes, and deafness. Indeed, some physicians used complaints about ringing in the ears to determine the optimum dosage for each patient. Because few practitioners, or patients, could accept the concept of specificity in diseases and remedies, Peruvian bark was freely prescribed for fevers, colds, flu, seasickness, headache, and hangovers. But quinine is a specific remedy for the specific intermittent fever known as malaria. Its use as a general febrifuge and tonic exposed many people to risks without benefits. Peruvian bark prepared Europe for a new relationship with malaria. For hundreds of years, malaria and other murderous diseases kept Europeans from penetrating the vast African continent.
Thus, quinine became one of the tools that made European exploitation of Africa, and much of Asia, possible. In areas where malaria is highly endemic, slight genetic variations that enhance resistance to the disease may provide a powerful evolutionary advantage. The prevalence of genes for disorders known as sickle cell anemia and thalassemia suggests such an evolutionary pattern. Biologists as well as anthropologists, therefore, have been intrigued by the relationship between genes for abnormal hemoglobins and resistance to malaria. Quinine, the compound responsible for cinchona's effectiveness against malaria, was isolated in 1820. Within 10 years, the purified drug was being produced in large quantities. Until the 1850s, the forests of Peru, Bolivia, and Colombia were the only sources of the bark, but the Dutch and British established cinchona plantations in Indonesia and India. Intensive experimentation led to significant increases in the yield of active alkaloids. By the turn of the century, the Dutch had captured more than 90 percent of the world market. The Dutch monopoly on this vital drug was not broken until the 1940s, with the Japanese conquest of Indonesia and the European development of synthetic antimalarial drugs.

A type of protozoan belonging to the genus Plasmodium causes malaria. The minute parasite has a complex life cycle that includes forms that grow and multiply in blood-sucking mosquitoes and other forms that live in the liver and red blood cells of vertebrate hosts.
The female Anopheles mosquito transmits the parasite from infected individuals to new victims. Four species of the protozoan parasites cause human malaria: Plasmodium vivax, Plasmodium falciparum, Plasmodium malariae, and Plasmodium ovale. All forms of malaria may have serious consequences, but P. falciparum (malignant tertian malaria) is particularly dangerous. Other members of the genus Plasmodium are parasites of various species of birds, reptiles, amphibians, and mammals. Because anopheline mosquitoes prefer to lay their eggs in stagnant waters, malaria typically becomes endemic in marshy areas. The ancient Greeks and Romans noted the connection between malaria and marshes, but the basis of this relationship was not discovered until the end of the nineteenth century.

During the first half of the twentieth century, the conquest of malaria seemed to be a real possibility, but the optimism raised by the anti-malaria campaigns of the 1950s and 1960s ended in the 1970s as the resurgence of malaria became obvious. By the 1980s, the hope that malaria could be eradicated by pesticides and drugs had been abandoned. The increasing prevalence of pesticide-resistant mosquitoes and drug-resistant malarial parasites was only part of the problem; socioeconomic and geopolitical issues were even more significant. Although the global campaign to eradicate malaria that was launched in 1955 has been called a misguided failure, it did provide valuable lessons. Public health workers realized that even though global eradication of malaria was not a realistic goal, sustained control was essential to economic development in areas where the disease remained endemic.
Malaria has continued to flourish because global recessions, large-scale population migrations, political upheavals, and warfare militated against the high levels of financial and administrative support, sophisticated organizational infrastructure, and international cooperation needed to sustain anti-malarial campaigns. To reverse this dismal trend, the World Health Organization established special programs to support research on malaria, schistosomiasis, trypanosomiasis, leishmaniasis, filariasis, and leprosy in areas where these diseases were still endemic. Many anti-malaria initiatives were launched in the 1990s, including Roll Back Malaria (RBM), funded by a consortium of the WHO, World Bank, United Nations Development Program, and United Nations Children's Fund. Because of advances in molecular biology culminating at the end of the twentieth century, parasitology (once known as tropical medicine) became an attractive and challenging area of biomedical research. Basic research on the biology and immunology of malaria raised hopes for the development of anti-malaria vaccines. Certainly, Sydenham would pronounce such "mission oriented" therapeutic research both good and useful, but he might disparage the hyperbole surrounding the sequencing of the genome of the parasite and the mosquito as too far removed from the needs of patients.
Many public health experts would agree with him and object to diverting funds from practical efforts to control malaria into projects that have been called the Star Wars approach to infectious disease. Decoding the genome of Plasmodium falciparum, the most dangerous of the four species of human malaria parasites, took six years. The genome of Anopheles gambiae, the primary vector of the parasite, took about 15 months. In 2002, the genome sequence of A. gambiae was published in Science; the genome sequence of P. falciparum was published in Nature. Understanding the genetic make-up of both the mosquito and the plasmodium might, however, facilitate the development of new drugs, insect repellents, and mosquito traps. Advances in the techniques of molecular biology provided sophisticated insights into the machinations of the malaria plasmodium, but the most intractable obstacles to the development of a malaria vaccine have generally been economic and geopolitical factors. Disputes between the WHO and the biotechnology companies that have the technical competence to manufacture novel vaccines reflect the central problem in tropical medicine: the tension between the poorest nations, which need remedies and vaccines for malaria and other infectious diseases, but lack the resources to produce them, and the developed nations, which could develop such medicines, but do not need them. Given the role malaria has played in history, it would be ironic indeed if the question of whether or not it is possible to develop a malaria vaccine is subverted by the problem of whether it is politically or economically expedient to do so.