
The impact of anesthesia on the frequency of operations has been a matter of debate, but careful analyses of patterns of surgery in nineteenth-century hospitals indicate a positive correlation between the development of anesthesia and the number and range of surgical operations. In part, the rise in surgical cases was an outgrowth of urbanization and industrialization, but the increase in gynecological surgery, especially ovariotomy, was dramatic; many of these operations were done to treat nonspecific "female complaints" and emotional problems. Those who harbored suspicions that surgeons were driven by a "savage desire for cutting" were convinced that surgeons operated on moribund accident victims not because they expected to save them, but because doctors saw them as "teaching material" or experimental specimens. Sir John Bell (1774–1842), eminent surgeon, physiologist, and neurologist, said that the ideal surgeon had the "brain of an Apollo, the heart of a lion, the eye of an eagle, and the hand of a woman," but his contemporaries were more likely to see the surgeon as an "armed savage."

A striking upsurge in novel operations occurred in the post-anesthetic, pre-antiseptic period, but there is some evidence that the notoriously high rates of post-surgical infections associated with this era had more to do with changing patterns of urbanization, industrialization, poverty, and malnutrition than with anesthesia itself. The deplorable conditions of hospitals, the misery of the typical hospital patient, and the growing evils of poverty and industrialization provide an explanatory framework for the prevalence of hospital infections in the nineteenth century.

Ideally, surgery should be judged in terms of the survival and rehabilitation of the patient, but the drama of the operation tends to overwhelm the mundane details of post-surgical management. In the pre-anesthetic era, the dazzling speed, strength, and daring of the master surgeon were displayed to good advantage in a limited range of operations. The legendary surgeon who amputated a leg at the thigh, along with two fingers of his assistant and both testes of an observer, represented the epitome of this genre of surgery. Better authenticated heroes of this era were men like William Cheselden (1688–1752), who could perform an operation for bladder stones in less than one minute, and James Syme (1799–1870), who amputated at the hip joint in little more than 60 seconds. Surgeons were as obsessed with setting speed records as modern athletes, but their goal was the reduction of the stress, pain, and shock endured by the patient. In this context, surgical anesthesia might be seen as a prerequisite for the standardized antiseptic ritual, because it would have been virtually impossible for the lightning-quick surgeon to carry out such procedures while coping with a screaming, struggling, conscious patient.

When the art of anesthesia had been mastered, the surgeon was no longer damned as the "armed savage," but, in the crowded, filthy wards of the typical nineteenth-century hospital, wound infection was transformed from a sporadic event into an array of epidemic conditions generically referred to as hospitalism. Although surgeons might admit that the patient on the operating table in a hospital was more likely to die than a soldier on the battlefield, the poor prognosis did not inhibit rising interest in surgical intervention. The cause of wound infection was not clearly understood until the elaboration of germ theory, but "uncleanliness" had been a prime suspect since the time of Hippocrates. Hippocratic physicians knew that it was preferable for a wound to heal by first intention, that is, without suppuration (pus formation). Surgeons could only hope that if a wound was washed with wine, vinegar, freshly voided urine, or boiled water, cleansed of foreign objects, and covered with a simple dressing, healing would proceed without complications. Wound infection was, however, such a common occurrence that by the medieval period surgeons had developed elaborate methods to provoke suppuration. The theoretical rationalization for these procedures is known as the doctrine of "laudable pus." According to this offshoot of humoral pathology, recovery from disease or injury required casting off putrid humors from the interior of the body. The appearance of nice creamy white pus in a wound was, therefore, a natural and necessary phase of healing.

Assessing the relationship between changing surgical practice and post-surgical mortality rates in the nineteenth century is complicated by the simultaneous shift to hospital-based medical practice. Crude statistics, however, such as the 74 percent mortality rate among Parisian hospital patients who had undergone amputation at the thigh in the 1870s, seem to speak for themselves. Knowing how often successful operations were followed by fatal infections, doctors were famously among those who refused to submit to the knife. For example, when the great French surgeon, diagnostician, and anatomist Guillaume Dupuytren (1777–1835) faced death, he rejected the possibility of an operation, saying he would rather die by God's hand than by that of the surgeon. The motto so popular with anatomists, medical examiners, and pathologists, "Hic locus est ubi mors gaudet succurrere vitae" (This is the place where death delights to help the living), would certainly not be comforting to a surgeon who found himself in the role of the patient. Respect for the sick was, however, reflected in another Latin maxim often found in hospitals: "Praesente aegroto taceant colloquia, effugiat risus, namque omnia dominatur morbus." (In the presence of the sick, all conversation should cease and laughter should disappear, because disease reigns over all.)

Despite the reputation of hospitals as places where people went to die, perhaps comforted by an atmosphere imbued with compassion and piety, the annual reports of some hospitals suggest a respectable success rate. For example, the 1856 annual report of Philadelphia's Children's Hospital claimed that of 67 children admitted that first year, 41 were discharged as cured and none had died. In contrast, in 1870, when Dr. Abraham Jacobi (1830–1919) publicly revealed the appalling mortality rate at a children's hospital in New York, he was forced to resign. Hospital administrators had refused to institute reforms suggested by Jacobi, one of the founders of American pediatrics. The philanthropists who controlled many hospitals often considered moral guidance more important to the mission of the institution than medical science.

Physicians and surgeons knew all too well that even a pinprick opened a doorway to death. The doctor was no more immune to the danger than his patient; minor wounds incurred during dissections or operations could lead to death from a massive systemic infection known as pathologist's pyemia. With but slight exaggeration, doctors warned that it was safer to submit to surgery in a stable, where veterinary surgery was routinely and successfully performed, than in a hospital. When miasmata generated by ineluctable cosmic influences permeated the hospital, patients in the wards inevitably succumbed to hospital gangrene, erysipelas, puerperal fever, pyemia, and septicemia. Physicians endlessly discussed the nature of these disease entities, but all of these hospital fevers can be subsumed under the term hospitalism. When epidemic fevers were particularly virulent, the only way to prevent the spread of infection was to burn down the hospital.

Ironically, the evolution of the hospital into a center for medical education and research may have been a major factor in the appalling mortality rates of the large teaching hospitals. Changes in the hospital's social role may also have contributed to the pandemic of hospitalism. By the nineteenth century, the reputation of many urban hospitals was so low that no horror story seemed too implausible. Impoverished slum dwellers were convinced that hospital patients were doomed to death and dissection to satisfy the morbid curiosity of doctors. Hospital managers in France were confronted by terrifying rumors of secret dissection rooms where human fat was collected to light the lamps of the Faculty of Medicine.

Descriptions of major hospitals invariably refer to the overcrowding, stench, and filth of the wards. Surgeons complained that nurses were rarely sober enough to work; patients complained that they were being starved to death. Blood, pus, expectorations, excrement, and urine covered hospital floors. Operations were often performed in the center of the ward when a separate operating room was unavailable. The same washbasin, water, and sponge were used to treat a whole row of patients, and the pus-saturated dressings were collected in the common "pus-bucket." On a more positive note, pus-saturated surgical bandages provided the cells that Johann Friedrich Miescher (1844–1895), physician and chemist, used in the research that led to the discovery of nucleic acid. Moreover, the great quantity and diversity of patients provided invaluable clinical experience for young surgeons, physicians, and pathologists. Hospitals began as places of refuge and charity that cared for the sick and comforted the dying. Changing medical theory, training, practice, and intense interest in pathological anatomy, as well as socioeconomic factors, created new roles for this institution. But the hospital remained embedded in a matrix of poverty and charity in which the virtues of economy and efficiency were more important than cleanliness. Philanthropists, administrators, and physicians, as members of the "better classes," expected their "lower class" patients to be conditioned to crowding, discomfort, and filth; excessive cleanliness might even shock and distress such people.

Surgeons began operations without any special preparation, although a brief hand wash was considered appropriate when leaving the dissecting room. During operations, surgeons protected their clothes with an apron or towel, or wore an old coat already covered with blood and pus. Patients were "worked up" for surgery by the removal of their outer clothing and a swish of a well-used sponge. Observers were often invited to probe and examine interesting wounds. After the introduction of anesthesia, the pace of surgery became less frantic, but certainly not leisurely. Habits acquired in the pre-anesthetic era were not easily broken. A surgeon took pride in his ingenious methods for saving time, such as holding a knife in his mouth while operating. Using the same coat for all operations was convenient, because needles, sutures, and instruments could be kept handy in the lapel, buttonhole, and pockets.

It would be wrong to extrapolate from the epidemics of infection that swept through nineteenth-century hospitals to the problem of surgical infection in other ages. Indeed, it has been suggested that fluctuations in hospital mortality rates reflected the level of distress in the community. Famine, scurvy, and disease would certainly affect resistance to infection. This hypothesis is consistent with the observation that veterinary surgery was relatively free of the problem of wound infection, although it was carried out under rather primitive conditions with little concern for asepsis. Hospitalism might, therefore, have been a unique nineteenth-century plague, perhaps caused by the effects of the Industrial Revolution, rather than a reflection of surgical practice from Hippocrates to Lister.
