On Independence Day in 1776, one year after Paul Revere made his famous journey on horseback to warn the colonial minutemen that the British were coming, the Declaration of Independence was signed. This piece of parchment heralded what would become a long and bloody war between the newly independent Americans and their not-so-distant relatives across the sea. However, warfare wouldn’t be the only thing killing soldiers; infectious diseases would blindly annihilate both sides, and the devastation wouldn’t stop with the Treaty of Paris in 1783.
While jet lag wasn’t a concept Revolutionary-era British armies would understand for several more generations, they did have to adjust to a different climate and the array of diseases that came with it when they crossed the ocean to reel in the American rebels. Warmer weather, and particularly the steamy climate of the South, drastically increased the risk of disease, making infections like smallpox, yellow fever, dysentery, typhoid fever, dengue fever, and malaria something of an “ally” to the Americans. Historians consider the Lower South the unhealthiest region of British North America during the war, yet British officers decided to try to secure control there. Imagine hundreds of soldiers packed together in a camp, creating the perfect environment for bacteria to grow, viruses to spread, and epidemics to spawn, killing officers and commanders before the American Continental Army had the chance. After all, infectious microbes don’t know the difference between Redcoats and American rebels, between common foot soldiers and high commanders. Did the sweltering weather of the American “fever season” cost the British the war?
A Time When Smallpox Was Not Eradicated
Smallpox is a highly contagious disease caused by a virus called variola. The infection is infamous for its skin rash, and the fever it brought killed about 30 percent of the people it touched. For those left alive, irreversible scarring was visible for all to see, the mark of a smallpox survivor, if they didn’t end up blind.
Near the end of the 18th century, just shy of being useful to Revolutionary soldiers, English physician Edward Jenner made a connection between two similar diseases, cowpox and smallpox, that would lead to the world’s first vaccine against a viral infection. He saw that milkmaids tending cattle would catch cowpox, a mild illness compared to smallpox, and afterward were immune when exposed to smallpox. Fast-forward a few years and several questionable experiments later, and Jenner showed the world how intentionally exposing the body to the milder related disease could prevent the dangerous one. A century and a half later, the Global Smallpox Eradication Program drove smallpox to its last natural case in 1977, and the disease was officially declared eradicated in 1980.
How Do You Treat Bacterial Infections Without Antibiotics? You Don’t
In the fall of 1928, much too late for use during the Revolutionary War, English scientist Alexander Fleming returned from several weeks of vacation, picked up a petri dish in his lab, and saw mold growing on it. He could have chucked the contaminated plate in the trash, but instead he noticed that the mold had created a circle of death in the middle of what he had originally set out to grow on the dish for his studies: Staphylococcus aureus, a species of bacteria. Some chemical released by the mold was killing the bacterial cells, and, Fleming likely reasoned at the time, S. aureus probably wasn’t the only type of bacteria the chemical could destroy. The mold species was eventually identified as Penicillium notatum and, yes, you guessed it, would later be the source of the world’s first antibiotic, penicillin. Thanks to Fleming, people could now be easily cured of bacterial infections that used to kill those afflicted, infections we worry about far less today. His work also gave birth to a whole new science of developing antibiotics tailored to kill specific types of bacteria, such as Streptococcus (strep throat), Shigella (dysentery), and Salmonella typhi (typhoid fever).
Malaria: Then and Now
Unlike some of the diseases afflicting soldiers during the Revolutionary War, malaria still kills hundreds of thousands of people every year, even though the parasitic disease is largely treatable and preventable. In 2017, the burden falls hardest on sub-Saharan Africa and other marginalized populations. In 1776, a time without mosquito nets, bug spray, or antimalarial drugs, the burden was visible everywhere, making malaria the most common illness of the Revolutionary War.
Malaria is spread by Anopheles mosquitoes that carry the disease-causing parasites. Two species, Plasmodium falciparum and Plasmodium vivax, are the most notorious triggers of malaria infections. Following a bite from a mosquito carrying a malaria parasite, the infected person experiences chills, headache, and, as often reported in 1776, fever. As with any fever that grows out of control, malaria can be life threatening without treatment, which was effectively unavailable in the 18th century. Currently, the best treatments for malaria are drugs like chloroquine, quinine, and primaquine, all of which kill the parasite responsible for the infection. However, the Plasmodium parasites themselves weren’t even discovered until 1880, by French army surgeon Charles Louis Alphonse Laveran, who was later awarded the Nobel Prize for the discovery, more than a century after the signing of the Declaration of Independence.
During the Revolutionary War and the early years of American independence, there was so much that scientists and physicians didn’t know about the infectious microorganisms slowly killing people everywhere. Today, although we know far more about disease, much still remains a mystery. Maybe 241 years from now, scientists will be looking back at the diseases of the 21st century in much the same way.