In the end we all die: life is a condition with a mortality rate of 100 per cent. Doctors talk of saving lives, but what they really do is defer death. This chapter is about the deaths that medicine has deferred. Deferring death is the main test of medicine’s success – not the only one, admittedly, since doctors also alleviate pain and suffering and cure non-fatal conditions. But it is far easier to measure deferred deaths than improved qualities of life. Modern medicine, it turns out, has been far less successful at deferring death than you would think. The story so far has been straightforward: up until 1865 medicine was almost completely ineffectual where it wasn’t positively harmful. Histories of medicine which treat medicine as if it was in some sense ‘scientific’ and capable of ‘progress’ before the emergence of a practical germ theory of disease have to keep drawing attention away from this fact, even though it is one that almost no one would deny. After 1865 doctors began to tackle diseases with some success. There began to be some real progress in medicine, and this represents the beginning of a new epoch. Recognizing this, it would be easy to conclude that medicine was ‘bad’ until 1865 (when antiseptic surgery began), or 1885 (when the first modern vaccine was discovered), or 1910 (when salvarsan was introduced as the first effective chemical therapy), or 1942 (when the first antibiotic was introduced), and that thereafter it became, in fairly short order, good medicine, life-saving medicine. Certainly between 1865 and 1942 doctors began for the first time to defer deaths in significant numbers, but not in numbers anywhere near large enough to explain the astonishing increase in life expectancy that took place during the same period. Medicine has been taking the credit for something that would have happened anyway. And because there had been a real revolution in life expectancies the impression was created that doctors were rather good at doing what they do.
In fact, when it comes to saving lives, doctors have been surprisingly slow and inefficient. For every Semmelweis, horrified at his failure to transform the practice of his contemporaries, there is a Fleming, oblivious in face of a missed opportunity to save lives. In order to get the achievements of modern medicine in perspective we have to start thinking about life expectancies. What matters is the age at which we die, or (to look at it from another point of view) the proportion of the population that dies each year. If 2 per cent of the population die each year, and if deaths are spread evenly across all ages from birth to 100, then the average life expectancy will be 50. But death does not play fair. It singles out the very young and the very old. In pre-industrial economies something like half those born die by the age of 5; on the other hand a very large proportion of those who survive infancy and early childhood die in their fifties, sixties, and seventies. The result is a life expectancy at birth that rarely rises above 40. The distribution of deaths across ages in early modern England was such that a death rate of 2.5 per cent per annum corresponded roughly with a life expectancy of 40 years. (In a stationary population the death rate and life expectancy are exact reciprocals – 2.5 goes into a hundred forty times – but in a growing population like England’s the correspondence is an empirical one, determined by the distribution of deaths across ages.) The rate first dropped significantly below 2.5 per cent per annum (and life expectancy first rose above 40 years) around 1870, though death rates had intermittently been lower and life expectancies higher in the late sixteenth and early seventeenth centuries. Medicine has always claimed to be able to postpone death, but there is no evidence that it was able to do so for significant numbers of people before 1942. Between 1900 and 2000, life expectancies in Western countries increased from 45 to 75 years, and death rates fell from 2 to 0.5 per cent.
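The link between a crude death rate and life expectancy can be made concrete with a toy stationary-population model. The uniform distribution of deaths and the age cap of 100 are illustrative assumptions, not historical data:

```python
# Toy stationary-population model (illustrative assumptions, not data).
# If age at death is uniform between 0 and 100, the survival curve is
# S(a) = 1 - a/100; life expectancy is the area under S, and in a
# stationary population the crude death rate is its exact reciprocal.

# area under S(a), approximated with one-year steps at age midpoints
life_expectancy = sum(1 - (a + 0.5) / 100 for a in range(100))

# in a stationary population: deaths per year / population = 1 / e0
crude_death_rate = 1 / life_expectancy

print(round(life_expectancy, 6))         # 50.0 years
print(round(100 * crude_death_rate, 6))  # 2.0 per cent per annum

# the same reciprocal rule links 2.5 per cent per annum to 40 years
print(round(1 / 0.025, 6))               # 40.0
```

Because real populations grow and shrink, the observed correspondence between early modern England’s 2.5 per cent death rate and its life expectancy of 40 was only approximate, as the text notes.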
In the course of the last century, death was deferred by thirty years. This is known as the ‘health transition’ or the ‘vital revolution’. Most people assume that this increase in life expectancy is the result of improvements in medicine, but by 1942 life expectancies had already risen by about twenty years. As a result the extent of medicine’s contribution to the health transition is hotly debated. One estimate is that in America modern medicine has increased life expectancy by five years; of those five years, two were gained in the first half of the century, when life expectancy increased by twenty-three years, and three in the second half, when life expectancy increased by seven years. This study implies that Americans owe less than 20 per cent of the increase in life expectancy over the past century to medicine. Another study suggests a figure of 25 per cent for the period 1930 to 1975. A study for the Netherlands proposes that between 4.7 and 18.5 per cent of the increase in life expectancy between 1875 and 1970 was due to direct medical intervention, almost all of it since 1950: in other words, in the region of 12 per cent. The same study estimates that between 1950 and 1980 medical intervention improved the life expectancy of Dutch males by two years and of Dutch females by six years. Thus, according to this research, medical intervention has been the key factor in gains in life expectancy since 1950, but more than three-quarters of the gain in life expectancy took place between 1875 and 1950. I find these figures hard to believe in the light of my own history: a compound fracture of the arm at the age of 8 would in all probability have killed me before the antiseptic revolution, for I would have been fortunate to survive amputation; and then peritonitis from a burst appendix would certainly have killed me at the age of 13 had I been born anywhere without access to modern surgery: the first appendectomy was performed in 1880.
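The shares quoted from these studies follow from simple arithmetic on the year figures given above; the sketch below merely re-derives them (no new data):

```python
# Re-deriving the quoted attribution shares from the figures above.

# US study: 2 + 3 years attributed to medicine, of 23 + 7 years gained
us_share = (2 + 3) / (23 + 7)
print(round(100 * us_share))       # 17 - hence 'less than 20 per cent'

# Dutch study: a range of 4.7-18.5 per cent, whose midpoint is
print(round((4.7 + 18.5) / 2, 1))  # 11.6 - 'in the region of 12 per cent'
```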
But apparently my own experience is far from typical. The simple fact is that few of us owe our lives to modern medicine. In order to understand this puzzle we need to explore the changes in health over the last two hundred years. Because evidence is particularly good for England, and because much of the debate over the effectiveness of modern medicine has been concerned with the interpretation of the English evidence, in what follows I am going to concentrate on England, but nothing important would change if we looked at any other modern industrialized country. England is peculiar only in that industrialization and urbanization took place earlier and more rapidly there than anywhere else. Death rates in cities were higher than in the countryside in every Western country until around 1900, so England’s exceptionally rapid population growth was achieved despite the braking effect of urbanization. Between 1681 and 1831 the population of England and Wales nearly tripled, from 4.93 million to 13.28 million. It would seem reasonable to assume that this population increase was largely due to increased life expectancy – that is, to assume that adults are normally sexually active, and that, without birth control, fertility is largely determined by female life expectancy. In the 1680s life expectancy was 32 years; in the 1820s (despite a century of falling wages) it was 39 years, and it was still about the same fifty years later (a fact partly explained by increased urbanization, which shortened lives); it then started a steady climb to 70 in 1960. Thus the first thing to note is that there was a small but significant gain in life expectancy before the first of the modern revolutions in medicine, the victory of germ theory in 1865, took place. The classic argument that medicine has had almost nothing to do with modern gains in life expectancy is Thomas McKeown’s The Modern Rise of Population (1976).
McKeown’s case depended on a series of tables and graphs that showed the proportion of the population killed by a number of key diseases and the way in which this changed over time. Thus respiratory tuberculosis killed 40 people in every 10,000 a year in 1840 and was responsible for 13 per cent of all deaths; this had fallen to 5 deaths in 10,000 by 1945, and yet there was no effective treatment in England until the introduction of streptomycin in 1947. The BCG vaccine had been available from 1921, but its general introduction was delayed because of doubts about its effectiveness – doubts that continue to the present day. Bronchitis, pneumonia and influenza killed 27 in every 10,000 in 1901 (fully 15 per cent of all deaths, for the death rate had fallen by more than 20 per cent); this had halved by the time the first effective treatment, sulphapyridine, was introduced in 1938. Scarlet fever killed 23 in every 10,000 children in 1865; this had fallen to 5 by 1890, and to 1 by the time prontosil, the first effective treatment, was introduced in 1935. Remarkably, then, modern chemical therapies and antibiotics appear on the scene when the major killers have already ceased to kill; indeed if one looks at the graphs of the death rates, they plunge as fast before the introduction of modern therapies as after. The only possible exception is diphtheria, which killed 9 in every 10,000 children in 1895 when antitoxins were first used in treatment, and the death rate from which had fallen to 3 by 1920, though the role of antitoxins in this decline is a matter of dispute, as a similar decline took place in America before the introduction of antitoxins. Of the major fatal diseases in 1850, only bronchitis, pneumonia and influenza were still killing significant numbers in 1970: 5 in every 10,000, or 11 per cent of all deaths. Disease after disease appears to have lost much of its capacity to kill long before there was anything resembling an effective treatment.
In one case, scarlet fever, the disease itself seems to have declined in virulence. In every other case, either the external environment had become less favourable to the micro-organisms responsible, or human beings had become better at resisting infection. In 1850, 60 per cent of all deaths were caused by micro-organisms; in 1900 it was 50 per cent; in 1970 it was under 15 per cent. (People were now dying of heart disease and cancer rather than pneumonia and tuberculosis.) The germs had been very largely defeated, but the new drugs played only a small part in this triumph. The same picture appears if we turn from disease to childbirth. In England death in childbirth was around 160 mothers in every 10,000 births in 1650; this had fallen to 55 by 1850, a level that continued almost unaltered at least until the introduction of prontosil in 1935. Thereafter the level fell sharply, to close to 1 in 10,000 in the 1980s. This is not how it ought to be. Once Lister had formulated antiseptic principles, deaths in childbirth should have fallen sharply, and indeed did in countries that relied on well-educated midwives. In England, however, busy general practitioners refused to take adequate antiseptic precautions and death rates remained far higher than they should have been. The shape of the curve for deaths in childbirth is very different from that for deaths from infectious diseases, but again there are major gains before 1865, even if the impact of modern medicine was significant and immediate after 1935. Here the major advance prior to antibiotics was the introduction of the obstetrical forceps, at first a secret in the Chamberlen family, but in widespread use after 1730. Apart from obstetrical forceps, were there any other successful interventions to extend life expectancy before the 1860s? One case that calls for consideration is the disappearance of bubonic plague from Western Europe.
Bubonic plague killed large numbers of people between its first European occurrence in 1348 and the mid-seventeenth century. In the 1650s it ceased to attack Italy, in the 1660s England became plague free, and France suffered only a small and final outbreak in Marseilles in the 1720s. Some hold that this fearsome disease (which killed 225,000 people in London between 1570 and 1670) was conquered by quarantine measures, but this claim is impossible to prove. Plague is carried by rat fleas, and is primarily a disease of rats that happens also to infect humans. If its primary means of spreading was from rat to flea to rat, then quarantining humans could only have had a limited effect on the movement of rats and fleas. Early modern doctors believed quarantine would work because they thought the disease was quite exceptional in that it could (at least in epidemic circumstances) be spread directly from human to human, even though they believed it was originally caused by a corruption of the air. Although working with a false theory, they certainly had some success in protecting individual cities from plague some of the time; but whether their measures were capable of eradicating the disease from Western Europe as a whole, or (if it was never endemic) of preventing its periodic reintroduction, is more doubtful. Probably an alternative explanation in terms either of the declining virulence of the disease or of changes in the rat population (plague infects black rats, not brown) is to be preferred. With the possible exception of bubonic plague, there is only one major success story prior to 1865 that needs to be considered: smallpox. The first demonstrably effective intervention against micro-organisms was vaccination, which offered an effective method of prevention. Jenner introduced vaccination against smallpox in 1796.
The effects were extraordinary: there were 12,000 deaths from smallpox in Sweden in 1800 (approximately 15 per cent of all deaths), and just 11 in 1822, although only 40 per cent of the susceptible population had been vaccinated. Vaccination made possible the final elimination of smallpox throughout the globe in 1978. In England, vaccination was preceded by inoculation with live virus, which became fairly widespread from the 1750s. On the other hand, uptake of both inoculation and vaccination was sporadic and patchy compared to those countries such as Sweden and Prussia that made systematic use of Jenner’s vaccine. In the 1680s in London at least 7 per cent of all deaths were attributable to smallpox, and by 1850 this had fallen to 1 per cent. The figure of 7 per cent is almost certainly an underestimate. In London the cause of death was recorded by laypeople; elsewhere, where it was recorded by doctors, smallpox seems to have been responsible for 14 per cent of deaths. This implies that inoculation and vaccination were responsible for something like half the increase in life expectancy in England during the period 1680 to 1850. Let us take Bernoulli’s unduly conservative estimate, that inoculation increased life expectancy by two years, and compare this with the average gain in life expectancy attributable to modern medicine of four years in Holland between 1950 and 1980. If one thinks of the vast investment in research laboratories, hospitals, drug companies, and general practitioners dedicated to increasing life expectancy in the period between 1950 and 1980, it is striking that the result was at best only equivalent to the conquest of smallpox twice over. Fifty years after Jenner’s discovery, John Snow called it both ‘the greatest discovery that has ever been made in the practice of medicine’ and ‘the greatest benefit’ that humankind ‘have probably ever received’. It comes as something of a shock to realize that this may still be true.
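A rough sense of how eliminating smallpox could add a couple of years to life expectancy comes from a toy competing-risks calculation. All the inputs below – a cohort life expectancy of 32, smallpox killing 10 per cent of each cohort at an average age of 5 – are illustrative assumptions in the spirit of the figures above, not historical measurements:

```python
# Toy competing-risks sketch (assumed inputs, not historical data).
e0 = 32.0          # assumed life expectancy at birth, c. 1680
p = 0.10           # assumed fraction of each cohort killed by smallpox
age_victim = 5.0   # assumed average age of smallpox victims

# mean lifespan of those who escaped smallpox, solving
# e0 = (1 - p) * other + p * age_victim
other = (e0 - p * age_victim) / (1 - p)

# gain if every victim instead lived like the rest of the cohort
gain = p * (other - age_victim)

print(round(other, 1))   # 35.0 years
print(round(gain, 1))    # 3.0 years
```

The sketch ignores the fact that some children saved from smallpox would soon have died of other childhood infections, so the true gain would be somewhat smaller – which makes a figure of the order of Bernoulli’s two years entirely plausible.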
Modern vaccination therapies in human beings (based on the identification of the infective agent, which Jenner was unable to make) begin with Pasteur’s vaccination against rabies in 1885. But, apart from diphtheria, there was no major breakthrough against a disease that killed significant numbers in the West prior to the BCG vaccine against tuberculosis (and even the BCG is questionable). The major impact of vaccinations (against polio in 1957, for example, or rubella in 1963) came after, not before, the first antibiotics, and falls in the period 1950–80. So what did cause the long-term doubling of life expectancy in England from 32 in 1680 to 65 in 1942? There is general agreement that medicine was not responsible and that McKeown is therefore right in his central claim, but it is much harder to work out what exactly was responsible. We are in the profoundly unsatisfactory situation of not having anything resembling an adequate understanding of the most important single event in modern history, the revolution in life expectancy. McKeown thought he knew the answer. He argued that the major factor was better resistance to disease, and that the only thing that could have made this possible was better nutrition. His argument has come under repeated and sustained attack, but it remains the best explanation we have. On some things McKeown was certainly wrong. He was wrong, for example, to think that, until contraception, fertility varied less than mortality. We now know that, although there was a modest increase in life expectancy between 1680 and 1820, two-thirds of the English population increase over that period was caused by increased fertility – and birth rates continued to rise until 1870.
Part of this increased fertility resulted from a higher rate of marriage (the proportion of women never marrying dropped from 15 per cent to half of that); part from earlier marriage (the average age at first marriage fell by three years); and part from increased procreation outside marriage (in 1680 less than one tenth of all first births were illegitimate, while in 1820, 25 per cent were). Earlier marriage and a higher proportion marrying could in principle be the result of rising standards of living, which would make it easier for people to afford to start a family, but in fact the fit between rising fertility and rising standards of living is not very good. It seems clear that in the early modern period fertility was kept in check by deliberate abstinence on the part of the unmarried, and that in the course of the eighteenth and nineteenth centuries abstinence became much less popular. In short, people became more sexually active. There is no adequate study of why this might be, but a reasonable guess is that it reflects the decline of the church courts and of other mechanisms, formal and informal, for policing sexual behaviour. The history of population increase before 1870, in England at least, turns out to have more to do with the history of sexual activity (including sex within marriage) than with the history of life expectancy. Smallpox inoculation and vaccination may have been responsible for one-third of the increase in life expectancy, but they can only explain one-ninth of the increase in population. The primary cause of population increase, at least in England, was an increase in sexual activity, a possibility which McKeown never suspected although his subject was ‘the modern rise in population’. Second, McKeown chose to concentrate his attention not just on diseases caused by germs, but on diseases caused by airborne germs.
Of the increase in life expectancy between 1850 and 1970, 40 per cent was due to the declining death rate from these diseases – tuberculosis, bronchitis, pneumonia, influenza, whooping cough, measles, diphtheria, and smallpox. The most important single cause of improved life expectancy was the decline in the death rate from tuberculosis. The consequence of McKeown’s concentration on airborne diseases was that standard public health measures, which had conventionally been assumed to be a major factor in rising life expectancy, suddenly seemed irrelevant. Piped water, sewers, and water-closets are completely irrelevant to the spread of TB. What then were the obstacles to the spread of TB? Once the bacillus had been identified by Koch in 1882 people knew that they were dealing with an airborne germ, and public health campaigns against spitting might well have had some impact on the spread of infection. When I was a child there were still signs on buses in England (and in France) telling passengers not to spit. Perhaps the isolation of sufferers in sanatoria might also have served to protect the uninfected population. Yet arguments like these break down in face of a simple fact: 85 per cent of young people in 1946 had antibodies that showed they had been exposed to TB. Thus the TB germ was still clearly widespread. What seems to have changed is not the proportion of the population being exposed to TB, but the proportion dying as a result of exposure. In fact this seems to be true of disease in general. Three surveys of a friendly society, delightfully named the Odd Fellows, enable us to assess the incidence of sickness in the working-class population in 1847, 1868, and 1895. What we discover is that there was no decline in the rate at which people fell ill. What declined was the proportion of illnesses that resulted in death. McKeown is right: the germs were still there, but the people were better able to survive them.
Third, McKeown’s preoccupation with airborne diseases meant that he paid little attention to the history of sanitation. In the period 1850 to 1900, the reduction of death from water- and food-borne diseases was almost as important as the reduction in death from airborne diseases. London began to introduce sand filtration of the water supply in 1828. Chemical treatment of sewage water was common in the 1860s – it was in part this that gave Lister the idea of antiseptic surgery. The construction of a modern sewage-treatment system in London began in 1858. Generally across England, investment in improvements to water and sewage was highest in the last two decades of the century. Public health measures were clearly crucial in eliminating cholera, which, between appearing in England for the first time in 1831 and for the last in 1866, caused in all some 113,000 deaths. The conquest of cholera has always been an exciting chapter in the history of medicine, but this success needs to be kept in proportion. Cholera itself was responsible for barely more than 0.5 per cent of all deaths, whereas whooping cough, for example, was responsible for 1.5 per cent, or 300,000 deaths in the cholera years. It is particularly striking that while adult deaths from diarrhoea and dysentery fell sharply during the period of public health investment (in the 1890s the overall rate was three-quarters of what it had been forty years earlier, and amongst young adults it was one-tenth of what it had been then), death rates among children actually rose. At the beginning of the twentieth century roughly 3 per cent of English children under the age of 5 died from diarrhoea and dysentery, but the death toll rose to 5 per cent in years when there was a hot summer. By the 1930s, however, the death rate for children was one-tenth what it had been thirty years earlier: some important change had taken place in the intervening period.
Evidently piped water and treated sewage, which were widespread by the end of the nineteenth century, did not significantly reduce children’s exposure to water- and food-borne micro-organisms. Adults seem to have been less exposed than before – even among the very elderly, whom one would expect to be highly susceptible, death rates were one third what they had been. However, they also seem to have become more resistant, hence the steeper decline amongst young adults than the elderly. By contrast, nothing at all happened until the 1910s and 1920s to reduce the exposure of small children. How can we make sense of this rather bizarre pattern? Adults became more resistant to waterborne disease, but were also less exposed to it; while infants and small children were as vulnerable as before. A 1910 study found that diarrhoea (which mainly occurred amongst children) was as common in households that had flush toilets as in those that did not. Modern sanitation was thus quite ineffectual. Why? Because young children still went to the toilet in the street and played amongst faeces. Children were thus far more susceptible to diarrhoea than adults, and adults must then have been primarily exposed to germs through their contact with children. From the 1890s on it became increasingly common for health visitors to pay regular visits to families after the birth of a child (a practice which became virtually universal after the 1907 Notification of Births Act), bringing with them theories of disease transmission which were far from new, but which had previously not been properly disseminated amongst the working classes. It has been argued that it was only in the twentieth century that ‘domestic micro-sanitation came to supplement urban macro-sanitation, resulting in improvements more dramatic than any of those achieved in the nineteenth century’.
By ‘domestic micro-sanitation’ is meant not only cleaning and scrubbing, but using nappies and potty training; washing hands after going to the bathroom; and the killing of flies and the covering of food. It is striking that, where the average age for potty training is now twenty-four months, an expert in 1839 thought that nappies could be abandoned at four months – a strategy which implies a universal acceptance of frequent ‘accidents’. Hygiene had been improving for centuries. In the sixteenth century people ate off slabs of bread (‘trenchers’), in the seventeenth off wooden plates, in the eighteenth off pewter, and in the nineteenth off ceramic. Most people in the nineteenth century seem to have washed their hands and their faces every day. In the seventeenth century heavy woollen clothing (made of broadcloth) was replaced by lighter new draperies, and in the eighteenth and nineteenth centuries wool was replaced by cotton. As a result clothes could be washed much more easily, and the consumption of soap rose from 3.5 pounds per person per year in 1800 to 8 pounds in 1861 (more soap was surely used washing clothes than washing bodies). Already in 1801 William Heberden thought he could show an astonishing reduction in deaths from diarrhoea through the course of the eighteenth century as a result of increased cleanliness and better ventilation. In the late nineteenth century public bath houses were built for the urban working classes: in 1852 Parisians took on average 3.7 public baths each a year; there is no way of counting the baths they may have taken at home. When the Pasteur Institute was built in 1888 it stood close to the factory manufacturing Eau de Javel, or domestic bleach – the case for cleanliness was clear long before the triumph of the germ theory of disease.
By the beginning of the twentieth century most new English houses had running water, flush toilets and baths, but what is clear from the English evidence is that much of this improvement in hygiene had little effect on life expectancies. Children in particular continued to die in the same numbers as before. What had happened was that Heberden had misunderstood his own statistics: deaths from ‘griping in the guts’ had just been reclassified by doctors as deaths from ‘convulsions’. It took a systematic application of the principles of the sanitary reformers to domestic life to conquer infantile diarrhoea; there was no need to wait for germ theory. What needed to be done had been clear since (at least) Edwin Chadwick’s Report on the Sanitary Condition of the Labouring Population of Great Britain (1842), but it took almost a hundred years to transform childrearing practices. If we look for deliberate interventions to reduce disease prior to the 1930s there are some important examples: quarantine against plague, obstetric forceps, vaccination against smallpox, macro- and micro-sanitation. Forceps deliveries and smallpox vaccinations were mainly administered by doctors, and quarantine and sanitation drew extensively on medical theories. But none of these developments can account for the extraordinary increase in life expectancies from the 1870s onwards. Here we have to return to McKeown’s contentious claim that the explanation lies with improved nutrition. At first it seems as though McKeown must be wrong on this crucial question. We now know that England was the first country in Europe to escape periods of high mortality caused by bad harvests. From the early seventeenth century there was enough food, not only enough to prevent people from starving, but enough to prevent people from being so weakened by malnutrition that they succumbed in significant numbers to infections in years of bad harvests and high food prices.
If malnutrition caused, as McKeown argues, high death rates, then surely death rates should be higher in these years of dearth? Since there is virtually no increase of this sort, McKeown would appear to be wrong. Or perhaps not. Adults in England in 1775, as we now know, were 10 cm or 4 inches shorter than they are at present – similar or larger differences are to be found in all Western countries with the exception of America, where the gap (amongst the white population) is smaller. Modern data demonstrate a remarkable correlation between height and life expectancy. Research in Norway shows that a middle-aged man who is 5ft 5in. is 70 per cent more likely to die over a sixteen-year period than a man of the same age who is 6ft tall. A person’s final height crucially depends on two things, apart from their genetic inheritance: nutrition in infancy and in childhood, and exposure to disease while growing. Robert Fogel has argued that, if one compares heights in the US (where meat was already plentiful in the mid-nineteenth century), England, and other Western European countries, one can form the impression that 60 per cent of the difference between modern English heights and those in 1775 is due to improved nutrition, and 40 per cent to reduction in exposure to disease (which is mainly a twentieth-century phenomenon). Moreover, improved nutrition in infancy and childhood, as reflected in increased height, explains almost all the increase in life expectancy before 1875 and 50–75 per cent of the increase after 1875 (when public health measures and modern medicine begin to play a significant role). McKeown’s thesis that increased life expectancy is due to improved nutrition is thus, if one accepts Fogel’s argument, broadly correct, but it requires one simple modification: what is crucial is nutrition in infancy and childhood, and here what matters is not just the number of calories consumed, but also the consumption of protein and vitamins.
Meat consumption was far higher in the US than in Europe: in France in 1870 it was about 40 per cent of what it was in the US at the same date, and as a consequence Americans were on average some 5 cm taller than the French, and significantly longer lived. If Fogel is correct, the fact that the English were rarely so malnourished as to die within weeks or months of a bad harvest is irrelevant; through into the twentieth century they (and particularly the poorest amongst them) were generally sufficiently malnourished during infancy and childhood for their long-term life expectancy to be adversely affected. More recent work suggests that nutrition in the womb may be even more important than nutrition in infancy. Thus at the moment the best explanation for increases in life expectancy between the 1870s and the 1930s is improvement in foetal and childhood nutrition, and improved nutrition continues to be a major factor (though perhaps no longer the major factor) in rising life expectancy down to the present day. How much has modern medicine contributed to the increase in life expectancy? The answer seems to be about 20 per cent, much less than improved nutrition and improved sanitation. From 1865 onwards, doctors have become increasingly good at deferring death, but surprisingly few of us owe our lives to modern medicine. It is easy to adopt a patronizing attitude to those patients who, from 425 BC to 1865, imagined their doctors were doing them good when they were only doing them harm. But we too are credulous. We owe much less to modern medicine than we imagine.