History of medicine in the United States

The history of medicine in the United States encompasses a variety of approaches to health care, spanning from colonial days to the present. These approaches range from early folk remedies, rooted in various medical systems, to the increasingly standardized and professionalized managed care of modern biomedicine.

Colonial Era

At the time settlers first came to what is now the United States, the predominant medical system was humoral theory, the idea that disease is caused by an imbalance of bodily fluids. Settlers initially believed they should use only medicines that fit this system and were made of "such things only as grow in England, they being most fit for English Bodies," as advised in The English Physitian Enlarged, a medical handbook commonly owned by early settlers. However, faced with new diseases and a scarcity of the plants and herbs typically used to make therapies in England, settlers increasingly turned to local flora and Native American remedies as alternatives to European medicine. Native American medicine typically tied the administration of herbal treatments to rituals and prayer. This inclusion of a different spiritual system was denounced by Europeans, particularly in the Spanish colonies, as part of the religious fervor associated with the Inquisition: any Native American medical knowledge that did not agree with humoral theory was deemed heretical, and tribal healers were condemned as witches. In the English colonies it was more common for settlers to seek medical help from Native American healers, but their knowledge was still looked down upon; because their medical system differed, it was assumed that they did not understand why their own treatments worked.

Disease environment

Mortality was very high for new arrivals and for children in the colonial era. The disease environment was hostile to European settlers, especially in the Southern colonies, where malaria was endemic and killed many new arrivals. Children born in the New World acquired some immunity: they experienced mild recurrent forms of malaria but survived. As an example of the toll on newly arrived able-bodied young men, over one-fourth of the Anglican missionaries died within five years of their arrival in the Carolinas. Mortality was high for infants and small children, especially from diphtheria, yellow fever, and malaria. Most sick people turned to local healers and used folk remedies. Others relied upon minister-physicians, barber-surgeons, apothecaries, midwives, and ministers; a few used colonial physicians trained either in Britain or through an apprenticeship in the colonies. There was little government control, regulation of medical care, or attention to public health. By the 18th century, colonial physicians, following the models in England and Scotland, had introduced modern medicine to the cities, allowing some advances in vaccination, pathology, anatomy and pharmacology.

There was a fundamental difference between the infectious diseases present among the indigenous peoples and those carried by sailors and explorers from Europe and Africa. Some viruses, such as smallpox, have only human hosts and appear never to have occurred in North America before 1492. The indigenous people lacked immunity to such new infections and suffered overwhelming mortality when exposed to smallpox, measles, malaria, tuberculosis and other diseases. Much of this depopulation occurred years before European settlers arrived in a given region, the diseases having spread through earlier contact with traders and trappers.

Medical organization

The city of New Orleans, Louisiana opened two hospitals in the early 1700s. The first was the Royal Hospital, which opened in 1722 as a small military infirmary but grew in importance when the Ursuline Sisters took over its management in 1727 and made it a major public hospital, with a new, larger building completed in 1734. The other was Charity Hospital, staffed by many of the same people but established in 1736 as a supplement to the Royal Hospital, so that the poorer classes, who usually could not afford treatment at the Royal Hospital, had somewhere to go.

In most of the American colonies, medicine was rudimentary for the first few generations, as few upper-class British physicians emigrated. The first medical society was organized in Boston in 1735. In the 18th century, 117 Americans from wealthy families graduated in medicine at Edinburgh, Scotland, but most physicians learned as apprentices in the colonies. In Philadelphia, the Medical College of Philadelphia was founded in 1765 and became affiliated with the University of Pennsylvania in 1791. In New York, the medical department of King's College was established in 1767 and in 1770 awarded the first American M.D. degree.

Smallpox inoculation was introduced between 1716 and 1766, well before it was accepted in Europe. The first American medical textbook appeared in 1775, though physicians had easy access to British textbooks; the first pharmacopoeia followed in 1778. The European populations had historic exposure and partial immunity to smallpox, but the Native American populations did not, and their death rates were high enough that a single epidemic could virtually destroy a small tribe.

Physicians in port cities realized the need to quarantine sick sailors and passengers as soon as they arrived. Pest houses for them were established in Boston (1717), Philadelphia (1742), Charleston (1752) and New York (1757). The first general hospital, Pennsylvania Hospital, was established in Philadelphia in 1752.

19th Century

Illustration from the Thomsonian Botanic Watchman, 1834: a cartoon depicting Thomson (right) saving a patient from contemporary European medical practices, represented by an M.D. and Fellow of the Royal Society of London (left).

Moving into the early 19th century, there was a general move to distinguish America from its former colonial ruler, Britain, and part of this spilled over into medical systems as well. Given that contemporary European medical interventions included blistering, bloodletting, and calomel, there was a push to find less damaging alternatives. Samuel Thomson introduced his own alternative medical system, Thomsonianism, in the early 19th century, and it quickly became extremely popular, especially in New England. While Thomson claimed his medical system was entirely his own, it was largely a repackaging of humoral theory and Native American medicine. Thomsonianism centered on maintaining heat in the body, which Thomson sought to accomplish through various herbal interventions. His most commonly used drug, which he himself referred to as his "number 1" drug, was Indian tobacco (lobelia), a commonly used Native American medicinal herb. Thomson attributed his discovery of the herb and its medicinal properties to his explorative youth, but he also credited an old lady in his village with introducing him to it. Scholars have suggested that this woman was actually Native American, and that Thomson obscured the fact because of the stigma attached to Native Americans at the time.

Civil War

In the American Civil War (1861–65), as was typical of the 19th century, more soldiers died of disease than in battle, and even larger numbers were temporarily incapacitated by wounds, disease and accidents. Conditions were poor in the Confederacy, where doctors and medical supplies were in short supply. The war had a dramatic long-term impact on American medicine, from surgical technique to hospitals to nursing and to research facilities.

The hygiene of the training and field camps was poor, especially at the beginning of the war, when men who had seldom been far from home were brought together for training with thousands of strangers. First came epidemics of the childhood diseases of chicken pox, mumps, whooping cough, and, especially, measles. Operations in the South meant a new and dangerous disease environment, bringing diarrhea, dysentery, typhoid fever, and malaria. Disease vectors were often unknown. The surgeons prescribed coffee, whiskey, and quinine. Harsh weather, bad water, inadequate shelter in winter quarters, poor sanitation within the camps, and dirty camp hospitals took their toll.

This was a common scenario in wars from time immemorial, and conditions faced by the Confederate army were even worse. The Union responded by building army hospitals in every state. What was different in the Union was the emergence of skilled, well-funded medical organizers who took proactive action, especially in the much-enlarged United States Army Medical Department and in the United States Sanitary Commission, a new private agency. Numerous other new agencies also targeted the medical and morale needs of soldiers, including the United States Christian Commission as well as smaller private agencies such as the Women's Central Association of Relief for Sick and Wounded in the Army (WCAR), founded in 1861 by Henry Whitney Bellows and Dorothea Dix. Systematic funding appeals raised public consciousness as well as millions of dollars. Many thousands of volunteers worked in the hospitals and rest homes, most famously the poet Walt Whitman. Frederick Law Olmsted, a famous landscape architect, was the highly efficient executive director of the Sanitary Commission.

States could use their own tax money to support their troops, as Ohio did. Following the unexpected carnage at the Battle of Shiloh in April 1862, the Ohio state government sent three steamboats to the scene as floating hospitals with doctors, nurses and medical supplies. The state fleet expanded to eleven hospital ships, and the state also set up twelve local offices in main transportation nodes to help Ohio soldiers moving back and forth. The U.S. Army learned many lessons, and in 1886 it established the Hospital Corps. The Sanitary Commission collected enormous amounts of statistical data, and opened up the problems of storing information for fast access and mechanically searching for data patterns. The pioneer was John Shaw Billings (1838–1913). A senior surgeon in the war, Billings built the Library of the Surgeon General's Office (now the National Library of Medicine), the centerpiece of modern medical information systems. Billings figured out how to mechanically analyze medical and demographic data by turning it into numbers and punching them onto cardboard cards, as developed by his assistant Herman Hollerith; this was the origin of the computer punch card system that dominated statistical data manipulation until the 1970s.

Modern Medicine

After 1870 the Nightingale model of professional training of nurses was widely copied. Linda Richards (1841–1930) studied in London and became the first professionally trained American nurse. She established nursing training programs in the United States and Japan, and created the first system for keeping individual medical records for hospitalized patients.

After the American Revolution, the United States was slow to adopt advances in European medicine, but it embraced germ theory and science-based practices in the late 1800s as the medical education system changed. Historian Elaine G. Breslaw describes the earlier post-colonial American medical schools as "diploma mills", and credits the large 1889 endowment of Johns Hopkins Hospital with giving it the ability to lead the transition to science-based medicine. Johns Hopkins originated several modern organizational practices, including residency and rounds. In 1910 the Flexner Report, a book-length study of medical education, standardized many aspects of the field by calling for stricter standards based on the scientific approach used at universities such as Johns Hopkins.

World War II

Nursing

As Campbell (1984) shows, the nursing profession was transformed by World War II. Army and Navy nursing was highly attractive, and a larger proportion of nurses volunteered for service than of any other occupation in American society.

The public image of nurses was highly favorable during the war, as exemplified by such Hollywood films as Cry "Havoc", which portrayed selfless nurses as heroes under enemy fire. Some nurses were captured by the Japanese, but in practice they were kept out of harm's way, with the great majority stationed on the home front. The medical services were large operations, with over 600,000 soldiers and ten enlisted men for every nurse. Nearly all the doctors were men; women doctors were allowed only to examine patients from the Women's Army Corps.

Women in Medicine

In the colonial era, women played a major role in healthcare, especially in midwifery and childbirth. Local healers used herbal and folk remedies to treat friends and neighbors. Published housekeeping guides included instructions in medical care and the preparation of common remedies. Nursing was considered a female role. Babies were delivered at home without the services of a physician well into the 20th century, making the midwife a central figure in healthcare.

The professionalization of medicine, starting slowly in the early 19th century, included systematic efforts to minimize the role of untrained, uncertified women and to keep them out of new institutions such as hospitals and medical schools.

The Woman's Medical College of the New York Infirmary. [Announcement, 1868-69].

Doctors

In 1849 Elizabeth Blackwell (1821–1910), an immigrant from England, graduated from Geneva Medical College in New York at the head of her class, becoming the first female doctor in America. In 1857, she, her sister Emily, and their colleague Marie Zakrzewska founded the New York Infirmary for Women and Children, the first American hospital run by women and the first dedicated to serving women and children. Blackwell viewed medicine as a means for social and moral reform, while a younger pioneer, Mary Putnam Jacobi (1842–1906), focused on curing disease. At a deeper level of disagreement, Blackwell felt that women would succeed in medicine because of their humane female values, whereas Jacobi believed that women should participate as the equals of men in all medical specialties. In 1982, nephrologist Leah Lowenstein became the first woman dean of a co-educational medical school upon her appointment at Jefferson Medical College.

Nursing

Nursing became professionalized in the late 19th century, opening a new middle-class career for talented young women of all social backgrounds. The School of Nursing at Detroit's Harper Hospital, begun in 1884, was a national leader. Its graduates worked at the hospital and in other institutions and public health services, served as private-duty nurses, and volunteered for duty at military hospitals during the Spanish–American War and the two world wars.

The major religious denominations were active in establishing hospitals in many cities. Several Catholic orders of nuns specialized in nursing roles. While most lay nurses married and left the profession, or became private-duty nurses in the homes and private hospital rooms of the wealthy, the Catholic sisters had lifetime careers in hospitals. This sustained institutions like St. Vincent's Hospital in New York, where nurses from the Sisters of Charity began their work in 1849; patients of all backgrounds were welcome, but most came from the low-income Catholic population.
