Last time we explored the introduction of smallpox inoculation to England in the early 1700s, when naturally spread smallpox often caused severe scarring or death. Inoculation exposed a person to a tiny dose of the smallpox virus, which usually made them mildly sick and gave them immunity to the disease. Although inoculation offered people a far better chance of survival than natural infection did, it was still risky.
In the 1790s, the English country physician Edward Jenner noticed that milkmaids’ faces were rarely scarred by smallpox. Jenner wondered if cowpox, a mild infection from cows that caused sores on the milkmaids’ hands for a few days, somehow protected them from smallpox.
He decided to test his theory by exposing a young boy to cowpox, then later exposing him to smallpox to see if he’d developed immunity. Jenner’s method seems irresponsible from our perspective today, but the cowpox virus did make people immune to the similar, but far more deadly, smallpox virus.
Latin For Cow
To distinguish his method from the riskier smallpox inoculation, Jenner named his procedure vaccination, after the Latin word for cow, vacca. Cowpox vaccination rapidly gained acceptance. Two centuries later, safer, standardized vaccines brought about the success of the World Health Organization’s Smallpox Eradication Program. The last case of smallpox contracted outside a laboratory occurred in 1977. Today, the smallpox virus no longer exists in nature, only in a handful of research labs.