Although nutrition is generally regarded as a twentieth-century science, the belief that health and longevity depend on regulating the consumption of food and drink is one of the most ancient and universal principles of medical theory. Foods were generally classified in terms of opposing qualities such as hot or cold, moist or dry, which determined whether particular foods would be strengthening, weakening, purgative, or constipating. These concepts were not seriously challenged until well into the eighteenth century, when new chemical theories sparked an interest in the acidity or alkalinity of foods. By the end of the nineteenth century, these chemical distinctions were giving way to a new physiological concept of the role of food substances in the animal economy.
Since that time, nutrition scientists have lamented that the development of their field has been hampered, not by neglect, but by enormous amounts of misinformation, generated, at least in part, by its uniquely popular appeal. Critics of the modern food industry see diet as a political issue, especially in societies where food deficiencies have been replaced with the problems of excess and confusion about rational dietary guidelines. The modern science of nutrition grew out of efforts to understand and isolate the dietary factors promoting health and preventing disease. Finding the causes of vitamin deficiency diseases was no simpler than unraveling the etiology of infectious diseases. Indeed, both kinds of disorders often seemed to occur in the form of devastating plagues and pestilences. Despite the accumulation of empirical evidence about the relationship between diet and disease, scientists could not unequivocally establish the existence of putative micronutrients without substantial progress in chemistry. Nevertheless, naval surgeon James Lind (1716–1794) and other pioneers of nutritional science proved it was possible to prevent certain diseases by specific changes in diet. Although there are many vitamin deficiency diseases, scurvy is of special interest because the experimental foundations of our understanding of this disease are part of the abiding legacy of the eighteenth century.
Scurvy may be among the most ancient and ubiquitous pestilences, tormenting its victims with rotting of the gums and teeth, deep aches and pains, blackening of the skin, and an overwhelming lassitude. Seeing whole families, monasteries, or armies afflicted with scurvy, ancient writers variously concluded that the disease was contagious, congenital, inherited, transmitted by scorbutic nurses, or generated by malign combinations of diet and atmosphere. Hermann Boerhaave, for example, considered scurvy a very infectious poison. As sailing ships replaced oared galleys and long ocean voyages became possible, the old army sickness became known as the sailors' disease. Nostalgic visions of graceful tall ships notwithstanding, these sailing vessels were more accurately called floating hells. The common sailor could expect accommodations that were dirty, damp, and vermin-infested, and a moldy, monotonous diet of salt pork, indigestible oatmeal, and ship's biscuits. Lord George Anson, to whom James Lind dedicated his Treatise on Scurvy, lost more than half of his men to scurvy during his voyage of circumnavigation in 1741. Deaths of sailors were so common that they were hardly worth noting. As long as one in five ships returned with a cargo of spices, the sponsors of an expedition could make a good profit. Between 1500 and 1800, scurvy killed more sailors than all other diseases and disasters combined. Thus, it is not surprising that naval surgeons were among the first to provide good clinical descriptions of the disease and remedies to prevent and cure it.
Before obtaining his M.D. at the University of Edinburgh in 1748, James Lind served as ship's surgeon on voyages to the West Indies, Guinea, and the Mediterranean. In 1758, Lind was appointed Senior Physician to Haslar Hospital, where he saw hundreds of scorbutic patients. Lind's lesser known contributions to medicine include observations on tropical medicine, a distillation apparatus for making safe drinking water, and a remedy composed of quinine, spirits, and citrus peel that sounds like the quintessential summer restorative, the gin and tonic. A practical man, proud of his experience, but well read and thoughtful, Lind was ready to take exception to the most learned physicians of his day. While his contemporaries deferred to scholars like Hermann Boerhaave, Lind was not equally impressed. After reviewing scholarly writings on scurvy, Lind insisted that theories must stand or fall according to the test of experience. Clearly, Lind saw himself as more original and less gullible than Boerhaave and his disciples, including one who published a book in which scurvy was attributed to sin and the Devil. Scholars who attributed scurvy to a "very infectious poison" could not explain why no officers contracted the disease when it raged with remarkable virulence among common soldiers. Learned physicians felt obliged to ground their ideas about scurvy in theoretical rationalizations derived from classical authors. For the scholar, remedies were only of interest if theory rationalized their action. Similarly, if an idea was theoretically sound, no empirical tests were necessary. For example, according to Boerhaave, the blood serum of patients with scurvy was too thin and acrid, while the material that made up the portion that clotted was too thick and viscid. It was, therefore, the physician's delicate task to thicken and neutralize the acridity of the serum while simultaneously thinning the clot-forming portion of the blood.
Although scurvy took many forms, its characteristic signs were putrid, bleeding gums and blue-black spots on the body. Generally, the first signs of the disease were pale and bloated complexion, listlessness, and fatigue. Eventually, internal hemorrhages caused weakness, lethargy, stiffness and weakness of the knees, swelling of the ankles and legs, chronic sores, putrid ulcers, and breathlessness following any exertion. Advanced cases were marked by coughing and pains in the bones, joints, and chest. Profuse hemorrhages and violent dysenteries reduced the patient to extreme weakness. Sudden death might occur in patients suffering from the breakdown of previously healed ulcers, chest pains, and difficult respiration. During two cruises of 10 and 11 weeks in 1746 and 1747, scurvy attacked the British frigate Salisbury with great virulence after only 4 weeks at sea. Although the captain generously provided the sick with fresh provisions, including mutton broth and meat from his own table, 80 of the 350 crewmen suffered from scurvy. Generally, the sailor's diet consisted of putrid beef, rancid pork, moldy biscuits, and bad water. Probably only a liberal allowance of beer, brandy, and rum could make such food palatable. While greens, fresh vegetables, and ripe fruits were regarded as preservatives against scurvy, Lind could not tell whether these foods were needed to counteract the bad effects of the moist sea air, or to correct the quality of hard, dry ship's rations. One hundred years previously, John Woodall (1570–1643), author of The Surgeon's Mate, or Military and Domestic Medicine (1636), had called attention to the antiscorbutic virtues of lemon juice. Woodall's observations were interesting, but essentially anecdotal.
It was Lind's special genius to test possible antiscorbutics with a controlled dietary experiment. A group of scorbutic sailors were put on a diet of gruel, mutton broth, puddings, boiled biscuits, barley, raisins, rice, currants, and wine. In addition to this basic diet, two of the men were given a quart of cider a day; two were given elixir vitriol (sulfuric acid diluted with water and alcohol); two received rations of vinegar; two were given sea water; two received a combination of garlic, mustard seed, balsam of Peru, gum myrrh, and barley water, well acidulated with tamarinds and cream of tartar; two others were given two oranges and one lemon per day. Within six days one of the sailors given oranges and lemons was fit for duty and the other was strong enough to serve as a nurse. Lind's experiment not only demonstrated that oranges and lemons cured scurvy, it also showed that it was possible to test and compare alleged remedies. Proving that lemons and oranges cured scurvy was easier than convincing the authorities to utilize the information. There was no scientific obstacle to the eradication of sea scurvy, but it was essentially impossible for a naval surgeon to force his so-called superiors to abandon entrenched opinions, sanctioned by "time, custom, and great authorities." The British Admiralty did not adopt Lind's remedy until 1795, when it proposed that lemon juice should be provided after six weeks on standard rations. The British Board of Trade did not require rations of lime juice in the merchant marine until 1865. Lemons did not become part of standard rations in the American Navy until 1812. Even without official blessings, some naval surgeons included a form of lemonade in their medical kit, but supplies of antiscorbutics were generally inadequate and unreliable.
Army doctors ignored or rejected Lind's work and argued that a great many factors, especially a history of "evil habits," along with fatigue, depression, and bad food, could cause scurvy. Apathy and ignorance only partially explain the failure of the medical community to call for the universal adoption of Lind's remedy. Although naval surgeons and sailors were well acquainted with the natural history of scurvy, confusion about the nature of the disease persisted into the twentieth century. Moreover, experience seemed to prove that scurvy had no single cause or cure. One argument against the dietary deprivation theory of scurvy was the observation that cooks were often the first to die of the disease. A certain degree of skepticism is certainly valid in the face of any claim for a cure too wonderful and too simple to be true. Indeed, marvelous health-giving fruits seemed more at home in a utopian fantasy such as Francis Bacon's (1561–1626) New Atlantis, than in a medical treatise. In Bacon's allegory, a wonderful fruit that resembled an orange cured the sickly crew members of a lost British ship that landed in the mythical New Atlantis.
Physicians had heard of many equally miraculous antiscorbutics touted by sailors and explorers. For example, when Jacques Cartier's expedition in search of a northern route through North America was trapped by the ice during the winter in 1536, his crew was attacked by scurvy. Native Americans showed the French how to make a remedy from the bark and leaves of a certain tree. At first, most of the sick refused to try the Indian remedy, but when those who tried it recovered it was soon in great demand. The French had to admit that all the learned doctors of France could not have restored their health and strength as successfully and rapidly as the Indian remedy. Other sailors and doctors ascribed antiscorbutic virtues to high morale, good food, water distilled over powdered scurvy grass, cleanliness, dry clothing, wholesome exercise, sour oranges and lemons, oil of vitriol, and periodic access to fresh country air. Many sailors believed that all complex medical approaches were useless. Instead of medicines, they insisted that being buried in the earth up to the neck cured scurvy. One of the most distinguished and influential of all naval physicians, Sir Gilbert Blane (1749–1834), Physician of the Fleet, and personal physician to Lord Rodney, Commander-in-Chief, was able to implement reforms that naval surgeons had long desired. Sir Gilbert earned the nickname "Chilblain" because his concern for the welfare of the common sailor was so well hidden by his icy demeanor. Throughout history, it had been taken for granted that armies would lose more men by sickness than by the sword, but in the eighteenth century new approaches to vital statistics provided disconcerting evidence of the human and economic toll. As physician to the Fleet, Blane received a monthly report of the prevalence of diseases, mortality, and other matters related to health from every naval surgeon.
In order to improve the condition of Britain's sailors, Blane used these reports to prepare his first treatise on naval hygiene. Later, in his Observations on the Diseases of Seamen, Blane advised the authorities that preserving the health of seamen was not only a matter of basic humanity, but also a form of enlightened self-interest spurred by economic and political necessity.
As warfare and economic ventures required ever-greater numbers of sailors and soldiers, statistical methods proved that the state could not afford to waste its valuable stock of able-bodied men. These common sailors, "the true sinews of war," were essential to the public defense (and offense). As a nation dependent on her navy, Britain had to realize that, even if her officials thought of sailors as a "commodity," economic and political necessities indicated that it was less expensive to maintain life and health than to support invalids and replace the dead. In 1795, Blane became Commissioner of the Board of the Sick and Wounded Sailors, a position that he used to sponsor many much needed reforms. After 1796, the incidence of sea scurvy declined dramatically. The inexorable logic of numbers demonstrated that, in the years before Blane's reforms were instituted, about 1 of every 7 British sailors died, while many others were permanently disabled. At the beginning of the war in America, 1 in 2.4 men became ill and 1 in 42 died. By the end of the Napoleonic wars, these rates had been reduced to 1 in 10.7 sick and 1 in 143 dead. Blane calculated that if the 1779 mortality rate had not been reduced, Britain's entire stock of seamen would have disappeared long before the defeat of Napoleon. By 1815, although fevers, pulmonary inflammation, and dysentery continued to plague British sailors, sea scurvy had been nearly eradicated. Whatever expenses had been incurred in provisioning ships with citrus fruits were clearly offset by lower manpower costs. Thomas Trotter (1760–1832), another Physician to the Fleet, continued the battle for the health of sailors. In addition to dietary reform, Trotter recognized the value of inoculation against smallpox and became an early champion of vaccination.
Indifferent to scholarly theories about scurvy, Trotter simply argued that fresh citrus fruits provided "something to the body" that fortified it against the disease and warned his readers to resist "imaginary facts and fallacious conclusions." Despite lime rations, sporadic outbreaks of scurvy continued to occur at sea, while army surgeons fatalistically accepted scurvy as one of the pestilences of war, along with typhus, typhoid, and dysentery.
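The "1 in N" sickness and mortality figures attributed to Blane's statistical reports can be made more concrete as percentages. The following is a minimal illustrative sketch; the rates are those quoted above, while the helper function is purely a hypothetical convenience for the calculation.

```python
# Convert the "1 in N" sickness and mortality rates quoted in the text
# into percentages, to show the scale of the improvement between the
# start of the American war and the end of the Napoleonic wars.

def one_in_n_to_percent(n: float) -> float:
    """Convert a '1 in n' ratio to a percentage, rounded to one decimal."""
    return round(100.0 / n, 1)

sick_before = one_in_n_to_percent(2.4)   # 1 in 2.4 men became ill
dead_before = one_in_n_to_percent(42)    # 1 in 42 died
sick_after = one_in_n_to_percent(10.7)   # 1 in 10.7 sick after the reforms
dead_after = one_in_n_to_percent(143)    # 1 in 143 dead after the reforms

print(f"Sick: {sick_before}% -> {sick_after}%")  # Sick: 41.7% -> 9.3%
print(f"Dead: {dead_before}% -> {dead_after}%")  # Dead: 2.4% -> 0.7%
```

By this reckoning, sickness fell from roughly two men in five to fewer than one in ten, and mortality by a factor of more than three, which is the arithmetic behind Blane's claim that the unreformed navy would have exhausted Britain's stock of seamen.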
Nevertheless, when a British naval expedition returned from the arctic in 1876 with the news that half the 120 men had suffered from scurvy, and 4 had died, the House of Commons called for an inquiry. Similar scandals caused doubt and confusion among scientists as to the nature of scurvy and antiscorbutics. In the 1870s, physicians were surprised to find scurvy appearing among the children of middle-class families in London's suburbs. Unlike the poor, who relied on potatoes, well-to-do people were likely to feed their children bread and butter and tinned milk (canned, sterilized milk). In this situation we can see how some medical and hygienic advances help solve one problem, but create unforeseen difficulties. Although sterilization of milk helped reduce the problem of infantile diarrheas, as more families switched to tinned milk, infantile scurvy appeared in both rich and poor families. Even today, problems associated with artificial feeding continue to arise and spread throughout the world because the manufacturers of infant formulas promote their products as the modern way to feed the baby.
A new class of adult scorbutics was created in the 1960s as Zen macrobiotic diets became more fashionable and more extreme. Some followers of such dietary regimens consumed nothing but brown rice sprinkled with sesame seeds. Many people living in climates more inhospitable than Europe's avoided scurvy through the ingenious use of plant and animal resources. For example, American Indians made teas and tonics from the needles, sap, or bark of appropriate trees, and the indigenous people of Australia used a green plum for medicinal purposes. Cereals, peas, and beans lack antiscorbutic properties in their dry, dormant state, but during their sprouting stage they are good sources of vitamin C. The nutritional value of bean sprouts has long been appreciated in Asia. Although some groups of Eskimos were able to gather berries and others ate the vegetable material found in the rumen of caribou, for the most part, the Eskimo menu was limited to meat and fish.
Since vitamin C is present at low levels in animal tissues, it is possible to avoid scurvy by consuming fresh meat and whole fish without the niceties of cleaning and cooking. Even though physicians generally agreed that the prevalence and severity of scurvy were related to diet, other factors, such as contagion, climate, and physical condition, were considered equally important. For example, Jean Antoine Villemin (1827–1892) attributed scurvy to a contagious miasma, similar to that which caused epidemic typhus. Fresh vegetables and lemons might have some therapeutic value, but, Villemin argued, that did not mean a deficiency of lemons caused scurvy any more than a deficiency of quinine caused malaria. Russian physicians expressed a similar belief as late as World War I, when they suggested that scurvy was an infectious disease spread by lice. A chemical theory of scurvy reminiscent of Boerhaave's was proposed by Sir Almroth Wright (1861–1947), who argued that scurvy was caused by acid intoxication of the blood. Wright insisted that the direct administration of antiscorbutic chemicals, such as sodium lactate, would restore normal alkalinity to blood more efficiently than lime juice. As scientists attempted to determine the antiscorbutic value of various foods, they found that animal experiments often added to the confusion, because different animal species vary in their vitamin requirements. In 1907, Axel Holst (1860–1931) and Theodor Frölich (1870–1947) discovered an appropriate animal model for the systematic evaluation of antiscorbutics.
Holst, Professor of Hygiene and Bacteriology at the University of Christiania, Oslo, had studied bacteriology in France and Germany. He had also visited the laboratory of the Dutch physician and bacteriologist Christiaan Eijkman (1858–1930) in Indonesia to learn about a disease known as beriberi. Searching for a mammalian model for beriberi, Holst tested the guinea pig. When he noted signs of scurvy, he enlisted the assistance of Frölich, a pediatrician concerned with infantile scurvy. Holst and Frölich demonstrated that scurvy in the guinea pig was induced by diet and cured by diet. If Holst had used the rat as his experimental animal, the story would have been quite different. Although some scientists considered the rat the ideal and universal model for deficiency diseases, unlike guinea pigs and primates, rats are not readily susceptible to scurvy. Beriberi is now known as a nutritional disorder caused by a deficiency of thiamine (vitamin B1). Its symptoms and severity, however, may vary from swelling of the legs, arms, and face, to a gradual loss of sensation that may culminate in paralysis. Eventually, damage to the cardiovascular system and the nerves may lead to severe debility and even death. Although beriberi occurred throughout the world, it was particularly prevalent in Asia. In some Asian nations, beriberi was one of the leading causes of death. The best-known example of the relationship between food preparation methods and beriberi is the milling of rice, which removes the thiamine-containing bran and germ layers. While working in Indonesia, Eijkman realized that chickens could be used as a model experimental system to study this ancient disease. In 1929, Eijkman was awarded the Nobel Prize in Physiology or Medicine for his contributions to the study of vitamin deficiency diseases. Thiamine was chemically characterized and synthesized in the 1930s by Robert R. Williams (1886–1965).
His brother, Roger John Williams (1893–1988), isolated two other important B vitamins, pantothenic acid and folic acid. At the University of Texas, Roger Williams founded and directed the Clayton Foundation Biochemical Institute, where many other vitamins were discovered. Williams thought that pantothenic acid might be helpful in the management of rheumatoid arthritis and other diseases. As early as the eighteenth century, experiments deliberately conducted on human guinea pigs had provided support for Lind's hypothesis. William Stark (1740–1770), who served as his own guinea pig, was probably the first physician to attempt a systematic series of dietary deprivation experiments.
Weakened by a diet of bread and water, to which he had added tiny amounts of various oils, bits of cooked meat, and honey, Stark, with gums swollen and purple, consulted the great Sir John Pringle (1707–1782), founder of modern military medicine. Although Pringle had considerable experience with scurvy, instead of recommending fruits or vegetables, he suggested that Stark reduce his salt intake. Less than nine months after beginning his experiments, Stark was dead. Had his eminent colleagues suggested oranges and lemons instead of venesection, Stark might have recovered and demonstrated the value of Lind's dietary experiments. In 1940, John Crandon, a young American surgeon, served as his own guinea pig in a study of the relationship between vitamin C deficiency and wound healing. Perhaps the most surprising finding in Crandon's experiment was that signs of scurvy did not appear until he had endured about 19 weeks on a restricted diet. In similar experiments conducted in England during World War II, it took several months to provoke signs of scurvy. Presumably, the nutritional status of twentieth-century volunteers was very different from that of the wretched sailors Lind had described. Moreover, Lind believed that exhaustion, hunger, and desperation—factors now subsumed by the term stress—predisposed sailors to scurvy.
Although progress in bacteriology and surgery helped reduce the death toll from battlefield injuries during World War I, dysentery and deficiency diseases rendered some military units totally unfit for any kind of action. Indeed, even after the First World War, doctors still considered it possible that certain foods produced scurvy and that others functioned as antidotes. Laboratory experiments and historical research on the antiscorbutics that had been used by the British navy helped explain many paradoxical reports. While it may be true that "a rose is a rose is a rose," we cannot assume that "a lime is a lime is a lime." During the first half of the nineteenth century, the lime juice used by the British navy usually came from Mediterranean sweet limes or Malta lemons. In the 1860s, the Navy began using West Indian sour limes. Scientists eventually discovered that the antiscorbutic potential of this lime was negligible. Using guinea pigs to test antiscorbutic diets, Harriette Chick (1875–1977) and associates at the Lister Institute carefully measured the antiscorbutic quality of various foods. Researchers proved that not all species of lemons and limes were effective as antiscorbutics; moreover, preserved citrus juices were often totally useless. As a result of such studies, during World War II discussions about provisions for the armed forces focused on how to allow a safety margin against scurvy rather than emergency measures to combat epidemic scurvy. Many researchers were actively pursuing the antiscorbutic factor, but it was the biochemist Albert Szent-Györgyi (1893–1986), who was not actually looking for dietary factors, who discovered it. Although Szent-Györgyi began his career as a doctor, he was more interested in chemistry, histology, physiology, the biochemical mechanism of respiration, and the biological oxidation of carbohydrates. The path that led to Szent-Györgyi's discovery of vitamin C was extremely circuitous.
It began with studies of Addison's disease (chronic adrenocortical insufficiency). In his classic monograph, The Constitutional and Local Effects of Disease of the Suprarenal Capsules (1855), Thomas Addison (1793–1860) described the symptoms of this disorder as "anemia, general languor or debility, remarkable feebleness of the heart's action, irritability of the stomach, and a peculiar change of color in the skin." Weakness, nausea, vomiting, anorexia, and abdominal pains usually preceded changes in pigmentation, but bronzing of the skin was often the first symptom to attract attention. Szent-Györgyi associated the darkening of the skin in Addison's disease with the browning of fruits and vegetables like apples and potatoes. Using this rather tenuous connection, he attempted to isolate the mysterious anti-bronzing factor from fruits that did not undergo browning on withering, such as lemons and oranges. In 1927, Szent-Györgyi isolated a novel substance that he planned to call "ignose," meaning "I don't know," because it could not be chemically identified. When editors refused to publish a paper about ignose, Szent-Györgyi suggested "godnose," but he finally had to settle for "hexuronic acid." Nutritional experiments conducted in collaboration with the American biochemist Joseph L. Svirbely in 1931 demonstrated that hexuronic acid was vitamin C. In keeping with its nutritional role in the prevention of scurvy, hexuronic acid was renamed ascorbic acid. Szent-Györgyi was awarded the 1937 Nobel Prize in Physiology or Medicine for his work on biological oxidation reactions and vitamin C. Ascorbic acid plays an essential role in the final stages of the synthesis of collagen, a protein that serves as a kind of intercellular cement and plays a major structural role in connective tissue.
The role of vitamin C in preventing scurvy was, therefore, clearly established, but the activities ascribed to this vitamin and the appropriate daily dosage for human beings remain controversial. The mystique of vitamin C has continued to evolve since the 1960s, when Irwin Stone, an industrial chemist, made the claim that primates suffer from an inborn error of metabolism that could be corrected by consuming large amounts of vitamin C. Megavitamin therapy, also known as orthomolecular medicine, acquired some eminent spokesmen, such as Roger Williams, the discoverer of pantothenic acid, and the ingenious chemist and two-time Nobel Laureate, Linus Pauling. Vitamin C enthusiasts claimed that the vitamin has antiviral and antibacterial activity, lowers blood cholesterol, cures the common cold, and increases mental alertness, intelligence, and general well-being. Predictably, as AIDS hysteria mounted, reports appeared in newspapers and magazines about victims of AIDS who had been cured by megadoses of vitamin C. Expensive vitamin preparations called HIM (Health and Immunity for Men) were marketed to the "sexually active male" as a means of maximizing the ability of the immune system to fight infections, while allowing the body to maintain "sexual vitality and potency." With so many self-proclaimed authorities promoting megadose vitamin products for mental and physical illness, a hearty dose of skepticism and caution is absolutely necessary. The idea that if a little bit is good, a lot must be better does not fit the facts about vitamins. In large doses, some vitamins may be toxic or teratogenic (causing deformity in the fetus). For example, a report released by the Institute of Medicine in 2001 warned that megadoses of vitamin A, such as those sold in health food stores, can cause severe liver disease, as well as birth defects when they are taken by pregnant women, and excessive doses of vitamin E can cause uncontrolled bleeding.
Vitamin A is, of course, essential for good vision, immune function, and other vital processes. In poor countries, vitamin A deficiency is a major cause of blindness. The vitamin is found in meat, fish, eggs, fruits and vegetables (oranges, carrots, spinach), and in vitamin-fortified breakfast cereals. People who believe that raw foods are better sources of vitamins might be surprised to learn that cooking doubles the body's absorption of vitamin A. In an era of food abundance, dietary and nutritional standards may be more powerfully influenced by political and economic forces than by scientific research. Scientists, nutritionists, and public health experts argue that the food industry has effectively campaigned to confuse the American public and block efforts to provide rational nutritional guidelines. The industry won a major victory in 1994 with the passage of the Dietary Supplement Health and Education Act, which deregulated dietary supplements and exempted such products from FDA oversight.
Based largely on economic factors clad in the rhetoric of freedom of choice, the Dietary Supplement Act broadened the definition of supplements to include herbs, diet products, and essentially any product that could be called a dietary supplement. Manufacturers of dietary supplements, "Techno Foods," or nutraceuticals do not have to prove that their products are essential or specifically beneficial to the body. With the completion of the Human Genome Project, some food supplement producers claimed that the new science of nutritional genomics, or nutrigenomics, could provide diets specifically calibrated to an individual's genetic makeup. Potential customers were given kits for the collection of DNA in order to obtain dietary advice and purchase very expensive customized vitamins and supplements. Many scientists expressed skepticism about such claims. Despite the increased emphasis on nutrition and dietary supplements, specific vitamin deficiency diseases still occur even in wealthy nations. In 2000, physicians were surprised to see an apparent resurgence of nutritional rickets, a disease of infants caused by a deficiency of vitamin D. Without vitamin D, the cartilage of developing bones cannot properly mineralize, but the symptoms of rickets include enlarged heart and organ failure, as well as soft bones and deformed limbs. Physicians and nutritionists generally assumed that rickets had been eradicated, because vitamin D has been added to milk since the 1930s. Moreover, people make their own vitamin D when a precursor molecule in the skin is activated by sunlight. Rickets can, however, occur in infants who have been breastfed and carefully protected from sunlight, in order to prevent skin cancers. Vitamin D deficiency in adults, especially the elderly, can lead to osteomalacia (adult rickets), bone fractures, seizures, or heart failure due to very low blood levels of calcium.
Whether nutritional guidelines are based on clinical observations or laboratory research, the history of scurvy indicates that the well-being of populations is more likely to be affected by the politics than the science of nutrition. The economic and political aspects of nutrition are most apparent in the frequent rediscovery of the trinity of malnutrition, poverty, and disease. Obviously, in the case of vitamin deficiency diseases, preventive measures were available long before any specific dietary factors were discovered. Advances in the science of nutrition proved that certain diseases, such as scurvy, beriberi, and pellagra, were not due to infectious microbial agents and were not, therefore, a direct threat to those with adequate diets. During the late nineteenth century, the threat of infectious diseases and the development of germ theory diverted attention from other kinds of diseases. But today, in wealthy, industrialized nations, the growing burden of chronic disorders has overshadowed the threat of infectious disease. By the 1970s, the United States Congressional Office of Technology Assessment was chastising researchers for neglecting dietary links to cancer, stroke, hypertension, diabetes, and dental disorders. Although there is general agreement about the importance of nutrition for good health, physicians and researchers remain cautious when confronted with claims that the diet–disease connection provides an immediate panacea for the heavy burden generated by chronic degenerative diseases in the wealthy industrialized nations.