(Editor’s Note: Below is the 7 part series in full from our friends at Age of Autism. Please feel free to share it to Facebook, Tweet it, etc. Yes it’s a long read but it’s so worth it.)
By Dan Olmsted and Mark Blaxill
1. The Wrong Narrative.
Polio is the iconic epidemic, its conquest one of medicine’s heroic dramas. The narrative is by now familiar: Random, inexplicable outbreaks paralyzed and killed thousands of infants and children and struck raw terror into 20th century parents, triggering a worldwide race to identify the virus and develop a vaccine. Success ushered in the triumphant era of mass vaccination. Now polio’s last hideouts amid the poorest of the poor in Asia and Africa are under relentless siege by, among others, the Bill & Melinda Gates Foundation. Eradication is just a matter of time, and many more illnesses will soon meet the same fate.
But based on our research over the past two years, we believe this narrative is wrong – and wrong for reasons that go beyond mere historical interest. The misunderstanding of polio has warped the public health response to modern illnesses in ways that actually make them harder to prevent, control, and treat.
The reality, we believe, is that the virus itself was just half the epidemic equation — necessary but not sufficient to create The Age of Polio. Outbreaks were not caused solely by poliovirus – the microbe was an ancient and heretofore harmless intestinal bug — but by its interaction with a new toxin, most often innovative pesticides used to treat fruits and vegetables.
This alternative narrative makes better sense of the natural history of polio, and it resolves a number of anomalies that remain to this day. It suggests why poliomyelitis outbreaks emerged, evolved, and exploded the way they did; it probably solves, for the first time, the enduring riddle of why Franklin D. Roosevelt was afflicted 90 years ago this summer on Campobello Island; and it may mean today’s billion-dollar-a-year eradication effort is misguided, if not downright quixotic.
These are large claims. Let us explain.
Polio was a strange illness, never fully understood even by those who devoted their lives to studying and subduing it. It was a summer plague, coming on in late spring and all but vanishing in the fall. Many thought contagion had something to do with water, and Americans kept their children away from swimming pools in droves.
There is a profound distinction between poliovirus – an enterovirus, one that enters through the mouth and takes up residence in the GI tract and bloodstream – and poliomyelitis, the paralytic form of the illness. In the vast majority of cases, the virus causes either a minor illness or an inapparent infection.
But in 1 or 2 of every 100 cases, the virus somehow gets past multiple defenses and into the nervous system, where it finds its way to the anterior horn cells in the front portion of the spinal cord. There, it preferentially attacks the gray-colored motor neurons (polio means gray in Greek) and inflames the spinal cord itself (myelitis, from the Greek myelos, marrow or cord). This interferes with nerve signals to the muscles and can lead to temporary or permanent paralysis of the limbs and the respiratory system. A small number of people who contract poliomyelitis — on the order of 1 percent — die.
The first recorded U.S. outbreak was in 1841 in West Feliciana, Louisiana (10 cases, no deaths). There was a half-century gap until the next cluster, in 1893 in Boston (26 cases, no deaths). Then, in 1894, came what is widely regarded as the first major epidemic, in Rutland and Proctor, Vermont (132 cases, 18 deaths). Thirty more outbreaks – from such seemingly disparate locations as Oceana County, Michigan, and California’s Napa Valley — were reported in the United States through 1909. The worst by far was New York in 1907, with 2,500 cases and a five percent mortality rate, a harbinger of the 1916 epidemic in the Northeast that killed 2,000 in New York City alone.
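As a quick check on these figures, the case-fatality rates implied by the numbers above can be worked out in a few lines (the outbreak figures are exactly those cited in this article; the snippet is illustrative arithmetic only):

```python
# Case-fatality rates implied by the outbreak figures cited in the text.
# (All figures come from the article itself; this is illustrative arithmetic.)
outbreaks = {
    "West Feliciana, LA (1841)": (10, 0),
    "Boston, MA (1893)": (26, 0),
    "Rutland and Proctor, VT (1894)": (132, 18),
    "New York, NY (1907)": (2500, 125),  # 5 percent of 2,500 cases, per the text
}

for place, (cases, deaths) in outbreaks.items():
    rate = 100 * deaths / cases  # deaths as a percentage of reported cases
    print(f"{place}: {cases} cases, {deaths} deaths -> {rate:.1f}% case fatality")
```

Note that the Rutland outbreak's implied rate, roughly 13.6 percent, was far above the 5 percent reported for New York in 1907.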
What is most remarkable about this list is that so few outbreaks of paralytic polio were recorded anywhere in the world before the late 19th century. Poliomyelitis is considered an ancient scourge, but the evidence supporting that belief is quite threadbare. An oft-cited Egyptian drawing depicts a priest with a withered leg that could have stemmed from paralytic polio, but for most of recorded history there were few observations of the sudden-onset fever and paralysis in infants that characterize the disease. The earliest well-documented case of infantile paralysis in an individual is widely considered to be Sir Walter Scott, afflicted as an infant in 1773.
There is little question that the poliovirus was endemic in humans for millennia; there may even have been isolated cases of poliomyelitis for much of that period. Yet the poliovirus did not trigger widespread outbreaks of poliomyelitis. Setting aside for now the 1841 Louisiana outbreak, reported retrospectively, something seems to have happened around 1890 to launch The Age of Polio in the United States. And something else must have changed around the end of World War II to create the large modern epidemics seared into the minds of older Americans, thousands of whom are poliomyelitis survivors and almost all of whom know someone who was afflicted.
While we have not written about polio, we have seen this pattern before. In our book, The Age of Autism – Mercury, Medicine, and a Man-made Epidemic, we argued that something happened in the 1930s to launch The Age of Autism. We proposed it was the commercialization of ethyl mercury compounds for use in pesticides – seed disinfectants and lumber preservatives – and in vaccinations; we offered evidence of those inventions in the family backgrounds of the first autism cases identified in the medical literature, in 1943. Similarly, we proposed that the sharp rise in autism cases beginning around 1990 tracks with the federal government recommending several more mercury-containing shots.
Our attention was drawn to polio during our autism research when a virologist mentioned, in passing, that poliomyelitis could be triggered in some instances by injections. Called “provocation poliomyelitis,” this can happen when a needle stick punctures a nerve in the peripheral nervous system. An active poliovirus infection – typically, in a child exposed to the virus for the first time and not yet immune — can gain access to the nervous system through a process called “retrograde axonal transport,” traveling back to the spinal column and triggering the dreaded paralytic form, poliomyelitis.
Such cases of provocation paralysis, we learned, occurred in Eastern Europe when antibiotics were administered by injection to excess, a practice that led to multiple cases of poliomyelitis. Bulbar polio – of the throat and respiratory system – was recognized as more common after tonsillectomies, again because nerve endings had been exposed. Outbreaks, then, can unquestionably occur as the result of an environmental injury – in these instances, excessive injection or surgery that damaged peripheral nerves – in the presence of poliovirus infection.
We began to look at the poliomyelitis literature and found that another and much more comprehensive environmental theory of the disease had been put forward almost immediately after the early outbreaks, although it never gained mainstream attention. This theory proposed that what is called “polio” is not caused by a virus at all, but by poisoning from pesticides. In this theory, lead arsenate triggered the early clusters, and DDT kicked off the large outbreaks after World War II. (The pesticide theory has been championed in recent years by Jim West and by Janine Roberts.)
That really got our attention. In our research for The Age of Autism, we investigated a paralytic illness we believe resulted from an unrecognized interaction between a toxin and a microbe. Called general paralysis of the insane, or GPI, it was a gruesome and universally fatal outcome in a percentage of people infected years earlier with the syphilis bacterium. We proposed that a manmade mercury compound — ironically used to “treat” syphilis — allowed syphilis to gain entrance to the brain. When penicillin was developed in the 1940s and actually cured syphilis infections, GPI disappeared because one of the two requirements for the illness – the microbe – was destroyed.
We suggested that a number of other illnesses may follow a similar pattern in which microbes and metals interact, including, in some instances, autism. So the idea that an environmental insult – whether a needle stick or surgery or a toxic metals exposure – could be at work in outbreaks of poliomyelitis intrigued us.
But we did not find the claim that polio was simply poisoning by pesticides alone to be persuasive. The strong versions of both the virus theory and the pesticide theory – that it was entirely one or the other – are too simple to explain the pattern of evidence. The strong viral theory can’t explain the sudden emergence of poliomyelitis; the strong pesticide theory can’t explain the sudden protective effect of poliovirus vaccinations. Rather, we propose that poliomyelitis outbreaks are man-made events that result from the synergy of microbe and toxin.
A threshold question – one that requires an answer for our argument to make sense – concerns what scientists call biological plausibility. What is the mechanism by which the virus and a toxin could cause such damage? We’ll look at the particular properties of lead and arsenate shortly, but our fundamental idea is that both the poliovirus and the pesticide enter the body by the same route — they are ingested — and both end up in the stomach. There, the toxin could damage the stomach lining in such a way that the virus gains access to peripheral nerves. This kind of virus-toxin interaction (perhaps with arsenic or lead acting alone as the toxin) took place sporadically before 1890 and increased dramatically, we propose, with the invention of more potent insecticides like lead arsenate. With the advent of DDT, the interaction became even more dangerous, dramatically increasing the number of cases.
The idea that toxins have played any role in poliomyelitis outbreaks is not widely accepted, to say the least. In his Pulitzer Prize-winning 2005 book, Polio: An American Story, David Oshinsky dismisses it in a sentence: By 1952, the peak year of the epidemic, the search for answers had grown so desperate that “a few blamed the dumping of poisons into the environment, especially the pesticide DDT,” he writes.
Yet on the very next page, Oshinsky describes a farm family, frantic about the epidemic sweeping Iowa that awful summer. The parents “tested the well water – it was fine – and used extra DDT to drive away flies.” Still, nine of their 11 children were affected, two of them paralyzed. The family “had done everything they were told to do,” Oshinsky writes, “everything they could. Why had it happened to them?”
Why, indeed? The search for an answer begins in the 1850s in Medford, Massachusetts.
2. A Gypsy Moth Flaps Its Wings.
Etienne Leopold Trouvelot arrived in the United States from France in the late 1850s and settled into his brand-new house at 27 Myrtle Street in Medford, a suburb of Boston. A self-taught scientist who would later become an astronomer affiliated with Harvard, Trouvelot turned his attention first to insects, converting the land adjoining his home into a virtual boarding house for bugs.
“To contain his hordes of larvae he constructed a stupendous barricade to encircle his grounds – a wooden fence eight feet high that encompassed his full five acres of shrubs and small trees,” writes author Robert J. Spear. “Netting was stretched from the perimeter of the fences across the trees and was supported in the middle on posts, making it possible for Trouvelot to walk upright through his specialized insectary.”
A decade later, he acquired a handful of gypsy moths, probably on a trip back to France – there were none in the United States. What happened next can be deduced from the title of Spear’s book, The Great Gypsy Moth War. Inevitably, insects escaped, not least because birds continually pecked their way into what they viewed as a very large diner.
The gypsy moths did not make their presence known outside the “stupendous barricade” for about a decade, but when they did, the results were apocalyptic. Lacking natural predators, they denuded trees – especially fruit trees – in what seemed like a single collective gulp. Then they crawled onward and upward. “Citizens could only stare in disbelief as the dirt streets became carpeted with millions of larvae across Myrtle Street,” writes Spear, “turning its surface black with the bodies of fast-moving caterpillars.”
Horrified residents combed gypsy moth larvae out of their hair, shoveled them off the steps, stomped them underfoot and burned huge clusters in noxious kerosene fires. But humans were simply outmatched. The Hellstrom Chronicle, the 1971 movie that suggested insects would inherit the earth, was coming alive in suburban Boston.
Even professional bug-killers were defenseless against the new arrival: Paris Green and London Purple, two state-of-the-art arsenic compounds that were potent against most pests, didn’t work at all. To some, it appeared that the food supply of the United States was at imminent risk. The state put together a Gypsy Moth Commission with an urgent mandate: Kill the bugs dead.
Fortunately – or so it seemed – a scientist working for the commission quickly found a solution. Adding lead to arsenic proved lethal to the larvae, and the new compound was sprayed on trees in and around Boston starting in 1893. It quickly proved its value against not just gypsy moths but all manner of agricultural pests. In fact, it worked better against codling moths, the source of the proverbial “worm in the apple.”
“In the case of insects which do not readily yield to Paris Green, a different substance, used with great success by the Gypsy Moth Commission, with which it originated, may be applied,” wrote George H. Perkins, state entomologist of Vermont in his annual report for 1893, published in early 1894. “This is arsenate of lead; sodic arsenate 29.93%, lead acetate 70.07%, are mixed in water, from which arsenate of lead is soon formed.”
Something else of note happened in 1893 in the Boston area. Two doctors used to seeing sporadic cases of paralysis in infants became concerned when the small caseload suddenly increased to 23. There had been only six in the same September–November span the year before.
“Is Acute Poliomyelitis Unusually Prevalent This Season?” asked Drs. James J. Putnam and Edward Wyllys Taylor in the Boston Medical and Surgical Journal for November 23, 1893. “It would not have seemed worthwhile to report these few observations had it not been that the number of cases observed at the Massachusetts General Hospital in September and October of this year is decidedly larger than usual,” they wrote. (The comment shows that isolated cases of paralysis were not unusual in Boston, where the Gypsy Moth War had been raging since 1890. It was the number and timing that drew their attention.)
While the doctors noted the time of year, they did not notice that September and October were apple harvesting season. They did ask “other physicians who have seen these interesting cases, or may see them in future, to send brief records.”
The future came quickly. Within seven months, Vermont – where George H. Perkins had given the formula for lead arsenate in his annual report – was hit hard. “The first major polio outbreak to be recognized in the United States did not occur until 1894,” writes well-known vaccine developer Samuel Katz. “It came in Rutland, Vermont, for reasons I need an epidemiologist to explain to me. There had been scattered individual cases prior to that date, but this was the first recorded outbreak resulting in 18 deaths and 32 individuals with residual paralysis among a total of 132 cases.”
The outbreak was described in a classic report by Dr. C.S. Caverly, a Rutland physician and president of the Vermont Board of Health. “During the month of June, 1894, there appeared in a portion of the valley of the Otter Creek, in the State of Vermont, an epidemic of nervous disease, in which the distinctive and most common symptom was paralysis.” Caverly didn’t know what to make of it, but he noted that the cases tended to cluster along Otter Creek and its tributaries.
In 1896, in a follow-up report in the Journal of the American Medical Association, Caverly noted something strange: “During this epidemic and in the same geographical area, an acute nervous disease, paralytic in its nature, affected domestic animals. Horses, dogs and fowls died with these symptoms.”
A horse “died paralyzed in the hind legs,” a fowl was paralyzed in its legs and wings. In the horse, the spinal cord showed “atrophy of the anterior nerve root”; in the fowl, “an acute poliomyelitis of the lumbar portion of the cord …”
After the Boston and Rutland outbreaks, poliomyelitis clusters were reported in the United States almost every year. Earlier, we mentioned the 33 U.S. outbreaks recorded before 1910. Returning to that list, there appears to be an overlooked association with intensive commercial fruit and vegetable growing. After its introduction in 1893, lead arsenate was soon used on apples, apricots, asparagus, avocados, blackberries, blueberries (huckleberries), boysenberries, celery, cherries, citrus (in Florida), cranberries, currants, dewberries, eggplant, gooseberries, grapes, loganberries, mangoes, nectarines, peaches, pears, peppers, plums, quinces, raspberries, strawberries, tomatoes and youngberries.
With that in mind, consider these pre-1910 outbreaks:
– Boston, 1893. The year lead arsenate was first used there or anywhere.
– Rutland, Vermont, 1894. The year the state entomologist gave the formula for making it.
– Cherryfield, Maine, 1896. Commercial blueberry-producing center.
– San Francisco and the Napa Valley, 1896. Home to vineyards and many other crops.
– Dutchess County and Poughkeepsie, N.Y., 1899. Agriculture and especially orchards – the county’s Web site features a stylized apple.
– San Joaquin Valley, California, 1899. The nation’s Salad Bowl.
– San Francisco and vicinity, 1901. A reprise of 1896.
– Galesville, Wisconsin, 1907. Apples again. The Chamber of Commerce logo features an apple, and the annual Apple Affair is held the first Saturday in October. “Orchards from the area set up stands on the square where visitors can purchase apples and apple treats served up by local growers. Apple pie, apple slices, caramel apples, Apple Normandy, Queen’s Apple, apple cider, apple juice — if it’s apple, you’ll find it here.”
– Oceana County, Michigan, 1907. Self-proclaimed Asparagus Capital of the World, the largest asparagus producer in Michigan and one of the largest in the nation. Home to the National Asparagus Festival held the second week of June. (One of us, born in Chicago in June 1952, spent summers in Pentwater, in Oceana County. His parents wanted to get their children away from the risk of polio in the big city.)
These links, we suggest, are far from random, pointing instead to locations where circulating poliovirus strains might combine with the growing use of the new lead arsenate insecticide to disastrous effect. Other locations point more generally to agriculture – Central Illinois in 1905, the states of Iowa and Wisconsin in 1908, rural locations in Pennsylvania in 1907 and 1908 – or to population centers where produce would be shipped – particularly New York City in 1907, a veritable fresh fruit and vegetable market to this day.
So the convergence of time and place in the early natural history of poliomyelitis outbreaks raises intriguing questions. But what can we say about the toxin itself and its relation to features of poliomyelitis? We can say:
Lead and arsenic can kill and paralyze humans.
Whether ingested or inhaled, several metals have long been recognized as hazards in workplaces and mines; in Roman times, work in mercury mines was so lethal that only slaves and prisoners were sent into them. More recently, lead was removed from gasoline and paint because both inhalation and ingestion can stunt the mental development of children; arsenic has been banned from medicine and from agricultural use in the United States; and small amounts of it in apple juice, made from apples grown in China, triggered a national controversy this month.
“That arsenic can produce paralysis was already known in the 14th century,” begins the monograph On Arsenical Paralysis, written in 1893 by S.E. Henschen in Sweden. “Since then many similar cases have been observed; and at present there are more than 150 cases of arsenical paralysis mentioned in the literature.”
The cases, he said, were comparatively few, and most recovered. He described the case of Maja Lisa Blomster, age 49, who in 1883 took “a white tasteless powder” on the advice of a traveler who said it would help her epilepsy. “After that the patient experienced a prickly sensation or slight pains in the inside of the hands and soles of the feet, and when she walked she thought she felt something like needles between her feet and the floor.
“The following morning she noticed on waking that the feet refused to do service. She tried in vain to bend and stretch out the foot joints and on trying to stand and walk the feet turned on the sides.”
Lead is even more toxic to humans. “The ancients were unquestionably aware of the dangerous character of lead and knew that it was poisonous when taken internally,” wrote H.A. Waldron in “Lead Poisoning in the Ancient World.” The most toxic sources appeared to be lead drinking goblets. “And yet” – shades of mistakes and misjudgments to come – “the Romans and Greeks continued to expose themselves to the effects of a metal they knew to be harmful through their food and drink.”
In the seventh century, Paul of Aegina gave the first account of an epidemic of lead colic he described as “having taken its rise in the country of Italy, but raging also in many other regions of the Roman empire, like a pestilential contagion, which in many cases terminates in epilepsy, but in others in paralysis of the extremities. … of the paralytics the most recovered, as their complaint proved a critical metastasis of the cause of the disorder.”
In Lead and Lead Poisoning in Antiquity, Jerome R. Nriagu of Environment Canada writes that “literary classics sparkle with passages on lead poisoning, reflecting, no doubt, the attempts by men of letters to deal with problems of their time.” In The Uncommercial Traveller, Charles Dickens describes a woman who worked in the lead mills because she had no alternative but desperate poverty. “What could she do? Better be ulcerated and paralyzed for eighteen pence a day, while it lasted, than see the children starve.”
In the Book of Minerals, Albertus Magnus wrote in 1262 that “care must also be taken lest it [lead] cause paralysis of the lower limbs, and unconsciousness. This, then, is the nature of lead in its constitution and effects.”
And these, then, were the compounds chemists were mixing beginning in 1893 in Boston. Is it any wonder that Dr. Putnam asked that same year whether acute poliomyelitis was “unusually prevalent this season”?
In fact, just two years before, a speech Putnam gave to the Massachusetts Medical Society was described in a British Medical Journal article titled, “The Injuriousness of Arsenic as a Domestic Poison.” Putnam “points out that paralysis is only the final and gross symptom of a neuritis which may have been going on for a long time.” Arsenic was hard to avoid, he added, because it was still widely used in medicine (a book about the ubiquity of arsenic during the era is titled The Arsenic Century).
Lead arsenate can paralyze and kill animals that ingest it.
In January 1920, Veterinary Times published an article by J.W. Kalkus, head of Veterinary Science at the State College of Washington Agricultural Experiment Station, titled “Orchard Horse Disease.” This revealing piece begins: “The writer recently had an opportunity of making an investigation of a disease which has been causing considerable loss among horses in certain sections of Washington.”
It went by several names, Kalkus reported, among them orchard horse disease; orchard poisoning; alfalfad horses; arsenate of lead poisoning; mold poisoning.
Regardless of the name, the circumstances were the same: “The condition occurs in enzootic form in the irrigated apple orchard districts. … The disease was little known prior to the last three years. … It is now claimed by many that it is practically impossible to keep a horse for any great length of time on an irrigated orchard tract, where orchard-grown hay is fed, without the animal attracting the disease. … Present knowledge indicates this disease is confined to the irrigated apple orchard districts where fruit is grown on a commercial basis, and where it is common practice to use arsenate of lead in spraying fruit trees.”
Alfalfa was grown as a cover crop between orchard trees. Lead arsenate spray was often applied so thickly that it dropped onto the alfalfa, giving it a gray color. While some veterinarians did not believe lead arsenate caused the problems – because it did not exactly mimic what was known of lead and arsenic poisoning – Kalkus seemed in little doubt. One reason is that such problems had already been reported.
Experiments with lead arsenate conducted on pigs, calves, and sheep produced symptoms similar to horse orchard disease, Kalkus wrote. Another researcher fed different amounts of lead arsenate to five cows, “all of which died following symptoms of violent purgation, in some cases followed by paralysis.”
The horses Kalkus describes all became ill with fever, apparent abdominal pain and a cough. Some started to recover, but were then affected by paralysis of the vocal cords. Kalkus quotes a veterinarian who was convinced lead arsenate was the cause: “As lead and arsenic are contained in the spray material it is highly possible that an idiosyncrasy exists, some individuals being susceptible to lead and acquiring the chronic form and others showing an acute form due to arsenic.”
In early polio epidemics, both animals and people suffered from paralysis.
In May 1912, The Medical Times published an article by Jacolyn Van Vliet Manning titled, “The Correlation of Epidemic Paralysis in Animal and Man.”
“A close relationship between paralytic cases in man and animal during epidemics of poliomyelitis has been observed in nine Western states of the United States” as well as in England and Sweden, he wrote, affecting dogs, cats, sheep, hogs and fowl. One case: On May 27, 1911, a boy contracted poliomyelitis. “One week before the boy’s illness a horse belonging to this lad’s father had an attack of what is locally known as ‘poke-neck;’ it is said to have been paralyzed in the neck and forequarters; it fell down in the stable and was unable to rise.”
In Minnesota in 1909, during an epidemic of 1,000 cases in humans, a state epidemiologist reported that a disease “strongly analogous in clinical history and symptoms to the disease in the human” had simultaneously afflicted three colts. Wrote Dr. C.S. Shore: “In my veterinary practice of the past five or six years I have found a disease appearing among one or two year old colts that shows a line of symptoms corresponding closely to anterior poliomyelitis in children. I have had from five to six cases a year during this time, always occurring during the Summer months, and the majority of them during the month of August.”
In fact, as Manning notes, C.S. Caverly made the same observation about the first U.S. epidemic in Vermont in 1894. Something was causing polio-like symptoms in both humans and animals at the same time, in the same place.
The poliovirus may have been a key co-factor in the human cases of poliomyelitis, but the local presence of lead arsenate is also demonstrated by the paralytic cases of animals. These could not have been caused by the virus, which only produces illness in primates. The simultaneous illness of humans and animals is thus a crucial but overlooked clue to causation.
There were concerns lead arsenate caused polio outbreaks.
Astonishingly, lead arsenate was proposed as a cause of polio outbreaks early on. In Massachusetts, where the compound was first used, the State Forester reported in 1912, under a section headed Infantile Paralysis: “In view of the fact that a feeling has been entertained by some people in the State that infantile paralysis has been caused in some instances by arsenate of lead used in spraying for the gypsy and brown-tail moths, the State Forester has caused a rigid investigation to be made in order to determine if there is any foundation upon which to base such fears.”
On the other side of the globe, a New Zealand newspaper reported in 1914: “The oft-expressed opinion that the arsenate of lead spray on fruit is the cause of the prevalence of infantile paralysis will be discussed at the next meeting of the Upper Clutha Fruit-growers’ Association at Bannockburn. The association is taking steps to obtain the result of Government experiments regarding this matter.” (We have not found a follow-up report.)
Toxins as a possible factor in outbreaks of paralysis were dismissed by biased investigators.
Given these connections and concerns, what happened? Or rather, what didn’t happen? Why did such strong clues about the nature of a disease that followed the planting season like clockwork year after year fail to crystallize in the minds of researchers?
The simple answer is that right from the start, the wrong people were in charge of connecting the dots. Regarding the fears of Massachusetts residents, the State Forester – whose job is to look after trees, not people – wrote that “as a result of his research he is firmly convinced that the use of arsenate of lead has in no way been responsible for the existence of the disease [infantile paralysis], and apprehends no danger in the future from its use. Any anxiety concerning the danger from the use of arsenate of lead is entirely unwarranted.”
As for animals: in 1897, A.H. Kirkland, a researcher for the Gypsy Moth Commission in Massachusetts, where lead arsenate originated, conducted an experiment with a single horse fed the pesticide. The animal not only remained “well and hearty” but in “better condition” [italics in original] than before. That finding would be contradicted not only by the Washington state report describing orchard horse disease but by several other veterinary studies.
A 1917 article in the Journal of Economic Entomology – written by and for specialists who make their living controlling pests that threaten profits – also examined the risk to livestock and, by extension, people. “So far as our experiments with guinea pigs may be relied on, the results indicated that five or even ten times the average maximum per apple found in our analyses could not be expected to constitute a dangerous single dose for a human being.”
In the 1962 classic Silent Spring, Rachel Carson addressed this kind of convenient blindness and bias when she wrote about the inability of parties with an economic interest to acknowledge the damage pesticides caused to wildlife.
“The credibility of the witness is of first importance,” she wrote. Compared to a wildlife biologist, “the entomologist, whose specialty is insects, is not so qualified by training, and is not psychologically disposed to look for undesirable side effects of his control program.
“Yet it is the control men in state and federal governments – and of course the chemical manufacturers – who steadfastly deny the facts reported by the biologists and declare they see little evidence of harm to wildlife. Like the priest and the Levite in the biblical story, they choose to pass by on the other side and to see nothing. Even if we charitably explain their denials as due to the shortsightedness of the specialist and the man with an interest this does not mean we must accept them as qualified witnesses.”
And so poliomyelitis spread unchecked through the first two decades of the 20th century before snaring, in 1921, its most famous victim.
3. Making Sense of Campobello.
Despite its name, the town of Cherryfield, Maine, calls itself the Blueberry Capital of the World, and there is no disputing the claim. Ninety-five percent of the world’s commercial blueberries are grown in surrounding Washington County, and Cherryfield is a major processing and shipping center. Machias, the county seat, hosts the annual Blueberry Festival every August. The festival puts on a musical, this year titled “Blueberry Fields Forever,” and a pie-eating contest – blueberry, of course.
The area has a couple of other claims to fame. Washington County hugs the Atlantic Coast where the United States meets Canada, the easternmost point in the United States; the city of Eastport is the first to see the sunrise. The region is known as Down East.
Two miles offshore is the island of Campobello, part of New Brunswick, Canada, where Franklin D. Roosevelt and his family spent summers. It was on his “beloved island” in August, 1921 – ninety years ago – that Roosevelt was afflicted with a paralytic illness diagnosed as poliomyelitis.
But this remote and lightly populated area already had a significant history with polio – one of the first clusters in the United States occurred in Cherryfield a quarter-century earlier, in 1896 (we cited it in our list of pre-1910 outbreaks with links to fruits and vegetables). Seven children were affected, and one died. In all the discussion and theorizing about Roosevelt’s illness over the intervening decades, this convergence has been overlooked.
Roosevelt arrived at Campobello on Sunday afternoon, August 7, on the yacht of a friend who sailed him up from New York City. The previous week, Roosevelt had visited a Boy Scout camporee on Bear Mountain, N.Y., not far from the family’s Hudson River home in Hyde Park.
At the dock, his family was waiting. His children played on the yacht through the adults’ cocktail hour, then were taken home while Franklin and Eleanor stayed for an elegant dinner on the fantailed aft deck, served by uniformed stewards.
Three days later, on Wednesday, August 10, Roosevelt went to bed early in the cranberry-red cottage on Campobello Island, unusually tired and suspecting “a slight case of lumbago” (lower back pain). He had chills during the night, and in the morning one of his legs was weak; the paralysis had begun. By the next night, both legs were paralyzed.
Because of the defining role it played in his life and, inevitably, world history, the days leading up to the attack have been dissected in detail by Roosevelt’s multiple biographers. Most historians believe he contracted the poliovirus on his visit to the Boy Scouts, which would have multiplied the chances of exposure to a youth with an active infection. Alternatively, he could have come down with the virus sometime between the Bear Mountain trip and his departure for Campobello.
After his arrival at Campobello, much has been made of a fall overboard while sailing in the Bay of Fundy; of his typically energetic activities on the day he first felt ill, which included putting out a small forest fire on a nearby island and going for a dip with his children in a freshwater pond near his house. The “paralyzingly cold” water of the Bay of Fundy became an ominous metaphor for what was about to happen but was never a serious biological argument.
Since most victims were infants or children, the fact that Roosevelt was 39 at the time has also gotten attention. In 2003, a study in The Journal of Medical Biography proposed Roosevelt actually had Guillain–Barré syndrome, not poliomyelitis. While interesting, the evidence for such a diagnosis is not strong. Arguing against it is a comment by Elliott Roosevelt, FDR’s young son who was present when his father took ill. He and other children went on a previously planned camping trip – now without their stricken father — because Eleanor wanted to keep them away from the risk of infection.
On that trip, Elliott wrote, “each of us children had some of the same symptoms as Father but in much milder form. We had runny noses, slight temperatures, and, a telltale sign, an odd feeling of stiffness in the neck. These comparatively mild aches and pains got overlooked in the developing crisis which gripped us all.”
Poliomyelitis remains the likeliest diagnosis: the timeline fits with an exposure at the Boy Scout camp. Estimates of the incubation period – typically a week or two, though that can vary considerably in either direction – match the Roosevelt scenario, in which no more than 13 days elapsed between the Bear Mountain visit and the onset of paralysis.
And Roosevelt’s presence in the world’s commercial blueberry capital at harvest time when his illness struck seems remarkable in light of the lead arsenate theory, which already had been proposed more than once in the decade before his illness. (Next year’s Blueberry Festival begins August 14.) Eleanor herself did the family’s grocery marketing in Eastport, and Roosevelt’s love of blueberries and other fresh fruit is well documented. His chef in the White House, Henrietta Nesbitt, wrote that he was “fond of blueberry and other pies.” In the cafeteria at FDR’s presidential library in Hyde Park, the Henrietta Nesbitt Café, the most prominent picture is of the broadly grinning president being served a big piece of pie.
Before a trip to South America, Nesbitt wrote, “I made up a list of his favorite dishes for the ship’s mess, and it was practically a copy of the list Mrs. Roosevelt had made out and had ready for me on my first day at the White House.” That list began with “Roast beef pink juice running” and includes “frozen strawberries, raspberries, and cherries for dessert.” Eleanor Roosevelt’s recipe for Blueberry Pudding has survived.
None of this shows FDR eating a mound of fresh blueberries treated with lead arsenate in August 1921, but it seems more probable than not.
Roosevelt famously survived polio, turning his personal tragedy into “Sunrise at Campobello” and leading America through the Great Depression and World War II, though his health flagged in later years as a result of his battle with polio. Some historians believe he was not up to the task of confronting Stalin at the Yalta conference that shaped the post-War world and led to the descent of the Iron Curtain and the Cold War. Roosevelt died in Warm Springs, Ga., in 1945, the retreat where he had worked valiantly to overcome paralysis and help others do the same.
In death he became a symbol for the suffering of thousands of polio victims and galvanized the search for a vaccine. Sadly, despite the March of Dimes campaign he helped launch to find the cause and cure, polio outbreaks were about to get worse. Much worse.
4. Post-War Epidemics and the Triumph of Vaccination.
Life magazine for August 15, 1949, reflected the booming exuberance of the times. The cover, “How to Dress for Hollywood,” featured a buxom starlet in suitably sultry attire. There were ads for DeSotos and Nashes and Chevys to mobilize families and their growing broods of children; cigarettes like Pall Mall, whose “greater length of traditionally fine, mellow tobaccos serves as a longer, natural filter to screen and cool the smoke on the way to your throat”; toothpastes to brush away smoker’s breath and shine stained teeth, and articles on everything from a new sailboat called the Sunfish to a town in Louisiana that cut its taxes in half by installing slot machines.
But twin specters of death and destruction hung over this bright baby-boomer world – the anxiety over atomic annihilation if the Cold War turned hot, and every parent's most proximate fear for their children, polio.
There were two articles on polio in this August issue. One was titled “Summer season brings epidemics of this uncontrollable disease” and noted that “throughout the nation last week the threat of polio was growing. Starting with some spotty outbreaks during May and June the disease had reached near-epidemic proportions during the sultry drought-ridden month of July. By Aug. 1, 8,300 cases had been reported, a 43% increase over last year. Polio seemed more uncontrollable than ever.”
The peak was still ahead – 1952 would bring 58,000 cases — but the path to prevention had already accelerated faster than any of the cars on display in Life’s pages in 1949. The year before, John Enders’ research group in Boston had cultivated the poliovirus in human tissue, a Nobel-winning breakthrough that cleared a path for Jonas Salk’s vaccine, which followed in 1955. Successful field trials among several hundred thousand children known as Polio Pioneers were announced on April 12, 1955 – the tenth anniversary of FDR’s death. Church bells rang out across the nation.
The jubilation was justified in terms of the vaccine's effect on the poliovirus – by 1961, only 161 cases of poliomyelitis were confirmed in the United States, just 29 more than the first epidemic year of 1894. But with the outbreaks ending, basic research withered. As Life noted, "how polio is spread, how the virus enters the body, they do not know."
In 1949, the same year as the Life article, Drs. Morton S. Biskind and Irving Bieber published “DDT Poisoning – A New Symptom With Neuropsychiatric Manifestations” in the American Journal of Psychotherapy. “By far the most disturbing of all the manifestations are the subjective reactions and the extreme muscular weakness,” they reported.
In subsequent papers and testimony, Biskind linked DDT directly to cases of poliomyelitis – including a Dec. 12, 1950, statement to the Select Committee to Investigate the Use of Chemicals in Food Products, United States House of Representatives. He quoted another doctor that “wherever DDT had been used intensively against polio, not only was there an epidemic of the syndrome I have described but the incidence of polio continued to rise and in fact appeared where it had not been before.
“This is not surprising since it is known that not only can DDT poisoning produce a condition that may easily be mistaken for polio in an epidemic but also being a nerve poison itself, may damage cells in the spinal cord and thus increase the susceptibility to the virus.”
“Facts are stubborn,” Biskind concluded, “and refusal to accept them does not avoid their inexorable effects — the tragic consequences are now upon us.”
The theory was also advanced by Ralph R. Scobey, who in 1952 gave a statement to the same House committee. Titled “The Poison Cause of Poliomyelitis and Obstructions To Its Investigation,” it described associations between harvest seasons, fresh fruit consumption, and polio epidemics.
The next year, Biskind made the link even more explicit: “In the United States the incidence of polio had been increasing prior to 1945 at a fairly constant rate, but its epidemiologic characteristics remained unchanged. Beginning in 1946 the rate of increase more than doubled.” Yet far from looking into a toxic etiology, he said, “virtually the entire apparatus of communication, lay and scientific alike, has been devoted to denying, concealing, suppressing, distorting and attempts to convert into its opposite, the overwhelming evidence. Libel, slander and economic boycott have not been overlooked in this campaign.”
But the idea that the active compounds in pesticides could cause paralysis was hardly farfetched. Pesticides are designed to cause mayhem with the nervous systems of their targets.
Lead arsenate was an inorganic pesticide, DDT an organochlorine compound. Both cause neurons to fire randomly, interfering with the ability of the brain to communicate with the rest of the body and leading to paralysis, spasms and death. DDT’s unintended impact on other living things was recognized after Silent Spring, though the focus then was on wildlife, not humans. That was enough to get both DDT and lead arsenate banned in the United States.
Because DDT required a co-factor – the poliovirus – to trigger outbreaks of poliomyelitis, the effect on humans was missed. Adding to the complexity may be the fact, observed in “horse orchard disease,” that living things react with different levels of sensitivity to toxins.
So DDT, we believe, succeeded lead arsenate not just as the insecticide of choice, but as an even more potent environmental co-factor in polio outbreaks. Understanding the role these toxins played was a significant insight that deserved serious attention; heeding the early warnings about lead arsenate might have ended The Age of Polio almost as soon as it began.
The DDT theory, like the lead arsenate observation, failed because it wrongly dismissed the equally important role of the virus itself. It could not account for the prompt collapse of polio in the U.S. after the vaccine was developed. The vaccine clearly eliminated outbreaks in the United States. Subsequent attempts to show that domestic DDT use waned about the same time, or that polio was reclassified as other illnesses in an elaborate “scam” to hide the vaccine’s ineffectiveness, don’t really stand up against the evidence.
The pesticide theory was an important one, and Biskind pointed to the synergy of toxin and virus when he suggested DDT might damage cells in the spinal cord and “increase the susceptibility to the virus” – though that is not the mechanism we believe was at work. But the virus hunters were not about to be distracted as they closed in on a vaccine that could stop the epidemics in their tracks. This meant, as we shall see, that in areas where the vaccination effort was less successful, co-factors could continue to trigger outbreaks.
Before addressing that, however, there are two more obvious tests to which we need to put our theory. Infantile paralysis occurred before lead arsenate was invented in 1893. How do we explain that? And what about polio outbreaks that have continued in the absence of either lead arsenate or DDT pesticides? Do they fit our new narrative?
5. Before — The Early History of Infantile Paralysis.

Our research on the natural history of autism convinced us that while there may have been a few scattered cases throughout history, the disorder first occurred in appreciable numbers – as clusters and ultimately as an epidemic — only after 1930. Observations at that time about the rarity and novelty of the disorder are far more persuasive than retrospective efforts to claim significant numbers of cases before then.
The same holds true for poliovirus. Nowhere is that clearer than in the 1917 book “Poliomyelitis In All Its Aspects,” by John Ruhrah and Erwin E. Mayer. “It seems to be a disease of comparatively recent origin,” they wrote. “In the history of most diseases there is a gradual shading off into the older writers until the disease is lost in confusion of inaccurate descriptions. Not so with polio.”
They continued: “The disease is so striking in its symptomatology, so devastating in its results, and produces such a deep impression on the popular mind that it does not seem possible that any very considerable epidemics could have happened in the countries in which there were physicians making records of what occurred.”
The same point is driven home by John R. Paul in his standard 1971 text on the disease, A History of Poliomyelitis. "There was no idea in the eighteenth and early nineteenth centuries that poliomyelitis was contagious," wrote Paul, a professor of preventive medicine and epidemiology at Yale who conducted important polio research himself. A half-dozen cases within a half-mile of each other might have escaped notice, but "had there been larger outbreaks in the early or mid-19th century it seems highly unlikely that they would have gone unnoticed."
But just as arsenic and lead caused instances of paralysis before the invention of lead arsenate, there was also an emerging medical literature of poliomyelitis before recurring outbreaks began in 1893. Beginning just before 1800, there’s a history of doctors who took a reasonable cut at identifying the disease. These include, in 1789, Michael Underwood, who used the term “debility of the lower extremities”; and in 1840, Jacob Heine, who is sometimes credited as the first to diagnose poliomyelitis.
Several doctors later used the word "paralysis" in describing a similar condition in infants: West in 1843; Rilliet in 1851; Duchenne in 1864. By 1860, Heine had pinpointed the spinal cord as the source of the paralysis, and in 1872, the great French neurologist Jean-Martin Charcot called it "tephromyelitis anterior acuta parenchymatose." This was a pretty modern description, but his precise nomenclature didn't quite catch hold. In 1874, German doctor Adolf Kussmaul coined the term "poliomyelitis anterior acuta," later shortened to poliomyelitis.
Going further back, plausible descriptions grow sparse but include the crippled Egyptian priest in a stele dated from 1580-1350 B.C. In 460 B.C., Hippocrates wrote about clubfoot, which may have included some cases of infantile paralysis. And around A.D. 200, Galen also discussed clubfoot.
But those were vanishingly rare, and when Underwood described "debility of the lower extremities" in 1789, he clearly thought he was reporting a new phenomenon, just as Ruhrah and Paul asserted with the benefit of much greater epidemiological sophistication. "This disorder either is not noticed by any medical writer within the compass of my reading, or is not so described as to ascertain the disease here intended," Underwood wrote. "It is not a common disorder anywhere, I believe." Over the following century, however, scattered clusters began to enter the medical record:
— In 1835, John Badham’s description of 4 cases in Worksop, England.
— In 1830-36, Charles Bell’s discussion of multiple cases in St. Helena.
— In 1841, George Colmer’s discussion of 8-10 cases in Feliciana, Louisiana.
— In 1868, Bull’s discussion of 14 cases in Odalen, Norway (cited by Leegard, 1914).
— In 1881, Bergenholtz’s description of 13 cases in Umea, Sweden.
— In 1883, a report of 5 cases in Arenzano, Italy (cited by Hull, 1917).
— In 1885, another report of 13 cases in Sainte-Foy L'Argentiere, France (cited by Hull, 1917).
— In 1886, 9 cases in Mondel, Norway (cited by Hull, 1917).
— In 1887, Oskar Medin’s discussion of 44 cases in and around Stockholm. This description uses the term “poliomyelitis anterior acuta,” argues that it is likely to be infectious and gives a pretty clear description of the modern disorder. His piece is considered a classic and polio used to be called “Heine-Medin disease” in honor of his very contemporary description.
Obviously, all these cases were described before the invention of lead arsenate in 1893. So there were undoubtedly other ways to make this pattern occur; a prime suspect would be arsenic in other forms, as well as other toxic metals, since they are well known to cause paralysis in workers and others exposed to them. Interestingly, both Badham and Colmer mention teething, and the latter said it was a likely cause. In fact, “teething paralysis” was sometimes used as a term for infantile paralysis.
Teething powders containing calomel – mercurous chloride – were used beginning around then, and mercury is well-known to cause paralysis. Mercury in medicine was so widespread that it doubtless caused numerous disorders identified as something else, especially by the doctors who prescribed it. (Badham even prescribed calomel in “repeated doses” for his paralyzed patients.)
Teething powders also caused pink disease, a feature of which was sometimes paralysis. In “Pink Disease,” Charles Rocaz reports that “Karl Petren of Lund (Sweden) has suggested that pink disease is due to chronic arsenical intoxication. … Nervous manifestations occur in the form of paresis [paralysis] of the lower limbs with pain, tingling and burning of the hands and feet.”
So a number of outbreaks might be explained by exposures to metals, including earlier pesticides and other products and medicines containing arsenic. The arsenic-containing Paris Green was originally a pigment and was used in wallpaper as early as 1814.
As for the larger Scandinavian clusters of the later 1880s, northern climes – think Campobello Island, Canada – are hospitable to berries, and apples are grown there as part of the traditional cuisine. Keeping "worms" (actually codling moth larvae, not gypsy moths) out of apples had concerned growers long before the war on gypsy moths provided new chemical tools to fight them.
But none of these reached the scale, or occurred with the frequency, of poliomyelitis outbreaks after 1893 and the invention of lead arsenate. This leads to the second test of our theory – once lead arsenate and DDT disappeared from the scene, why did poliomyelitis outbreaks continue, albeit in diminished fashion?
6. After — The Persistence of Polio.
To summarize our theory: Poliovirus is contagious like all viruses, but it is generally a harmless enterovirus. Once introduced into the human body, it can enter the nervous system when nerves are damaged. That damage can occur in many ways: mechanically, through needle puncture or surgery, or, we propose, biochemically, via pesticidal or other toxic exposure. Once the virus enters the nervous system, it becomes dangerous in a way nature never allowed before, spreading via "retrograde axonal transport." The resulting damage can lead to paralysis or death.
Two regions of the world continue to experience polio outbreaks, despite hopes the virus would be eradicated by 2000. This persistence has surprised and confused the experts.
“The eradication campaign has been stalled from about 2002 to 2007 … Why is it so difficult to complete the global eradication of wild poliovirus?” asked Neal Nathanson of the University of Pennsylvania School of Medicine in a 2008 medical review, “The Pathogenesis of Poliomyelitis: What We Don’t Know.” (What we don’t know turns out to be a lot – the paper runs to 50 pages.)
“Currently, there are two epicenters that have resisted virus elimination, one in South Asia (Afghanistan, Pakistan, northern India) and one in West Africa (centered in Nigeria). What explains the persistence of wild polioviruses in these two foci?”
Nathanson cites three possibilities: those are warm climates, so poliovirus doesn't go dormant in the winter as it theoretically did in other countries; the prevalence of other enteroviruses means that the live-virus polio vaccine is not as effective because the other viruses interfere with it; and poor public health infrastructure coupled with fears about vaccination made the achievement of "herd immunity" harder than expected.
If one considers the toxin idea, however, another explanation jumps out, especially in South Asia. Erase national borders for a moment. While outbreaks are small and have waxed and waned over the past decade, the primary sites have been directly south of the Himalayan range in a smiley-face arc that runs west from Nepal and Bangladesh, through the Northern India districts of West Bengal, Bihar and Uttar Pradesh, into Pakistan and Afghanistan.
This also happens to be the area with the worst mass poisoning from arsenic in human history. This is not ancient history – it didn’t even begin until the 1980s. It is a story of the single-minded war against microbes gone badly wrong. What happened is beautifully outlined in an American Scientist article, “No one checked: Natural Arsenic in Wells.”
“The wells that now supply the people’s drinking water are sealed from bacterial contamination; their tight concrete tubes reach down 60 feet or more, past surface contamination,” write Phillip and Phylis Morrison. “The big investment in concrete wells, originally made by UNICEF and the World Bank, has beaten back diarrheal diseases, making a real contribution to the vigor and quality of life of the people here.”
But what no one checked was the possibility of another kind of contamination: the wells tapped into the deeper water table and pulled up arsenic that had been swept down the Himalayan watershed by the Ganges and Indus Rivers, both of which drain both slopes of the mighty range.
And that proved to be a catastrophic failure. “A new calamity as astonishing as it is threatening confronts the country people of the Bengal Basin,” the Morrisons write – a calamity that has continued to spread through India and westward. “The drinking water, though sealed from infection, can hold a chronic dose of invisible, tasteless, odor-free dissolved arsenic. … That trace presence is a public poison.”
In Bangladesh alone, the World Health Organization calls arsenic contamination of drinking water "the largest mass poisoning of a population in history" (an eerie echo of Biskind calling DDT use "the most intensive campaign of mass poisoning in human history"). In focusing solely on microbes, in failing to understand the ecology they were tapping into, public health experts failed in their due diligence – "no one checked," and so they simply missed the risk from the toxin. In a deep and disturbing irony, we believe that this arsenic exposure – born of a sincere but disastrously conceived effort to protect people from dangerous microbes — has also led to the persistence of poliomyelitis.
On May 12, 2010, Bill Gates boarded a boat in the city of Patna, on the Ganges River in the Indian province of Bihar, and traveled 140 miles east to the small village of Guleria. He was there to personally take the Gates Foundation’s fight against polio into the heart of the beast. Bihar is “one of only two Indian States where new cases of polio continue to be reported, according to UNICEF. Uttar Pradesh is the other.”
This is also arsenic country. The districts on both sides of the Ganges, including Patna and Khagara, are among the 16 worst arsenic-affected districts in Bihar. In neighboring Uttar Pradesh, “as many as 20 districts have alarmingly high arsenic content in the groundwater and the state government is at its wits end.”
The consequences include lesions on the hands and feet, intestinal problems, and cancer that can kill. In India, “the spread of arsenic contamination in groundwater seems to be assuming gargantuan proportions,” reported Current Science in 2005. “What is worse is that inhabitants of the affected areas are unaware and the local authorities totally oblivious to this grave problem. It was known that West Bengal (WB) and Bangladesh had high levels of arsenic in the groundwater, but slowly the problem is spreading to other states like Uttar Pradesh. This is confirmed by the reports of All India Institute of Medical Sciences, New Delhi that people living in the Ballia district of UP also have high levels of arsenic in their blood, hair, nails, etc.”
This convergence has totally escaped the polio fighters, just as the arsenic risk from deeper wells escaped the planners (ironically, those planners include the same World Health Organization now combating poliomyelitis there). “Most parts of India are polio free. Of the 35 states and Union Territories, 33 have stopped indigenous polio virus transmission. Only Uttar Pradesh (UP) and Bihar remain endemic for polio virus because of the uniquely challenging conditions like poor environmental sanitation, high population density, high birth rate which make them the most challenging places on earth to eradicate polio.”
But do those factors really make the two areas “unique” in all India? And why does the polio arc sweep across the same arsenic-poisoned swath of neighboring countries?
At the proverbial 30,000-foot level – Himalayan height, as it happens — the theory makes sense. But it also holds up on the ground. District by district, city by city, the dots connect.
— In West Bengal, Howrah District was singled out for high arsenic contamination – and polio. A case of poliovirus reported there in April "has shocked the World Health Organization, UNICEF, Rotary International and the government itself."
— In Pakistan, “Another polio case in Muzaffargarh” coincides with the fact that “arsenic was recently found in Pakistan, in and around Muzaffargarh on the south-western edge of the Punjab.”
— In Afghanistan, half a million people are potentially at risk from arsenic poisoning, and the country is one of four where poliovirus remains endemic.
Why, given the arsenic disaster in Bangladesh, are there no recent polio cases there? It appears the virus has been wiped out. “Concerted efforts to eradicate polio in Bangladesh, resulted in the country being declared polio free in August 2000.”
Arsenic abatement also has been attacked most aggressively in Bangladesh, where the problem first surfaced. “In Bangladesh and West Bengal, at present less people are drinking arsenic contaminated water due to growing awareness and access to arsenic safe water. But no doubt the problem would not have attained such gravity, if it were not ignored for quite a long time. Unfortunately today similar mistakes are being repeated in Bihar, UP, Jharkhand, and Assam where still the villagers are drinking contaminated water. Non recognition of truth continues.”
Indeed it does. Polio outbreaks, we believe, are persisting today for the same reason they arose. South Asia is simply a place where toxic interactions are triggering outbreaks that highlight the presence of the virus, like Luminol bringing out hidden blood spatter at a crime scene.
That would seem to spell trouble for programs guided by the belief that going after polio outbreaks will eradicate the virus – despite the vast resources currently being thrown at the effort.
On January 31, Bill Gates spoke at the Roosevelt House in Manhattan – “Bill Gates Channels Franklin Roosevelt,” as one news service put it — where FDR recuperated after being stricken at Campobello. Gates said his foundation is making polio eradication its top priority because “it is the thing we can do to most improve the human condition.” He set a goal of 2013 but said, “Eradication is not guaranteed. It requires campaigns to give polio vaccine to all children under 5 in poor countries, at a cost of almost $1 billion per year.”
Polio eradication itself is a controversial priority. Bill Gates' own vanity – a heroic effort to make as big an impact on the technological destruction of disease as he did on the technology of computer software – has driven his poliovirus campaign, but many public health advocates grumble that it's a low priority. Millions die every year from preventable diseases ranging from pneumonia to diarrhea.
“As Bill Gates presses forward in a costly mission to eliminate the disease, some eradication experts and bioethicists ask if it’s right to keep trying,” reported The New York Times in February 2011. “As new outbreaks create new setbacks each year, he has given ever more money, not only for research but for the grinding work on the ground.”
“We ought to admit that the best we can achieve is control,” argued Arthur L. Caplan, director of the University of Pennsylvania bioethics center, who had polio as a child. Gates calls his critics “cynics” who are “accepting 100,000 to 200,000 crippled or dead children a year” if polio resurges.
We believe a more cost-effective way to reduce and contain poliomyelitis outbreaks – as well as improve total health outcomes — would be an all-out effort to reduce arsenic contamination and make sure people in South Asia have safe drinking water.
7. “Where was God?” — Lessons learned and lost.
What, then, is the natural history of polio telling us? Beyond the lessons for containing polio outbreaks themselves, we suggest that a single-minded focus on germs – and an unwillingness to explore novel and potentially uncomfortable ideas from outside medical orthodoxy – is an inadequate strategy when it comes to modern diseases.
It’s hard to overstate the impact the polio experience has had on our modern medical culture, starting with the doctors who watched helplessly as its victims fell. J.R. Paul, in his definitive A History of Poliomyelitis, wrote how “the flowering of scientific medicine brought a new point of view, an era of sudden and incredible hope that something might be done after all.”
For a generation of medical professionals born in the heart of this period, the heroic conquest of poliomyelitis was among the most influential narratives that shaped their beliefs about medicine. These beliefs go far beyond science, as Paul suggests. “As the crusade heightened, the world looked on expectantly. … Much as our grandparents had contributed during the nineteenth century to missionary societies, our dimes and dollars went to another ‘religious’ cause, signalized by efforts to stamp out this pestilence and to alleviate the suffering and tragedy it inflicted.”
Paul leaves no doubt as to the hero of this new religious crusade. “[I]n due time, the disease was abruptly scotched by means of vaccination. It was to all intents and purposes finished. The crusade has been described as one of the greatest technical and humanistic triumphs of the age. It was one of those rare achievements which the world greeted as an example of what could be done when science and technology were directed to good use for mankind.”
But the victory over the epidemics of poliomyelitis means our understanding of polio is essentially frozen in amber, circa 1955. Few diseases have been so completely conquered, at least at home, while being so incompletely understood, and that is not a good outcome. In leaving so many important topics on the table – why outbreaks occurred, why the pattern of contagion was so atypical for an infectious disease – scientists allowed some weak ideas to become conventional wisdom and some important ones to be missed.
The prevailing explanation for the rise of poliomyelitis outbreaks is the "hygiene hypothesis," which posits that such simple steps as clean underwear, better sanitation and good housekeeping, along with less exposure to germs like polio in early infancy, meant the effects of the disease became much worse when children were finally exposed. This is not a satisfactory explanation, and it never has been – the epicenter of the 1916 epidemic was placed in possibly the filthiest place in Brooklyn, an Italian immigrant community evocatively called Pigtown. And hygiene certainly doesn't work very well to explain polio's persistence among the world's poorest, where sanitation is bad and public health infrastructure is close to nonexistent. ("One injection stops smallpox," The Times noted in its article on Gates's polio drive, "but in countries with open sewers, children need 10 polio (vaccine) drops up to 10 times.")
Yet those ideas have spread and now are used to explain other ailments that are likely also mostly environmental, such as asthma (hygiene hypothesis: because children no longer tumble around in the barnyard with farm animals, they are less likely to be exposed … etc.).
And the connection of other illnesses to pesticides, and environmental toxins in general, has been slow in dawning, though it is now becoming clear that a range of degenerative and neurological diseases are related to such exposures.
“In a new epidemiological study of Central Valley residents who have been diagnosed with Parkinson’s disease, researchers found that years of exposure to the combination of … two pesticides increased the risk of Parkinson’s by 75 percent,” reports Science Daily.
The Central Valley was also the setting for a study that found “women who live near California farm fields sprayed with organochlorine pesticides may be more likely to give birth to children with autism, according to a study by state health officials,” reported the Los Angeles Times.
"The rate of autism among the children of 29 women who lived near the fields was extremely high, suggesting that exposure to the insecticides in the womb might have played a role." The findings echo those from a 2005 study in Italy showing that "pesticides known as organophosphates could cause neurological changes that lead to autism."
Recall that the San Joaquin Valley, the southern half of the Central Valley of California, was the site of an 1890s outbreak of poliomyelitis, along with nearby San Francisco, Napa Valley, and other agricultural hubs. If mainstream scientists had made this connection between polio and pesticides a century ago – or even after the great epidemics ended in the 1950s – would pesticide use have continued in the same fashion, endangering the great-great-grandchildren of the first polio generation? Church bells might not have rung for this discovery, but the toll on later generations could have been greatly reduced.
Nor has the polio vaccine, for all of its efficacy, been a risk-free remedy. There were accidents and deaths from the beginning – starting with the Cutter incident in the first weeks of the mass vaccine campaign, in which tainted shots paralyzed dozens of children and killed five. There is ongoing debate about whether a cancer-causing monkey virus, SV-40, infected millions of doses of vaccine in the 1960s and may be causing cancers today. There is the theory that mass vaccine trials in Africa in the 1950s gave rise to the AIDS epidemic – an idea that has been dismissed and derided by the medical industry with the same religious disregard for inconvenient truths as we've seen in other man-made epidemics.
And the live virus vaccine now in use in South Asia and Africa indisputably spreads the virus and, in a small percentage of cases, causes poliomyelitis. For that reason alone, vaccination may perpetuate polio in the service of eradicating it. The vaccine strain also can and does mutate. ("Polio spreads fast in Nigeria after rare mutation," reads a 2009 headline.) The only thing better than ending polio epidemics, in short, would have been not causing them in the first place. The real polio narrative is an American tragedy as much as a triumph of scientific medicine.
Yet triumphalism is an ongoing legacy of The Age of Polio. Merely invoking the word today can shut down debate over public health, especially concerns over any aspect of vaccination policy. Asked during the presidential campaign of 2008 whether he favored vaccination choice, Barack Obama responded: “I believe that it will bring back deadly diseases, like polio.”
In a similar vein, a commenter on our blog who identified herself as Kim asked, “What would you like us to do? Let’s stop all immunizations. Guess what will happen? Measles, mumps, rubella, tetanus, polio, influenza will all come back. We will now not only have people scarred from the diseases, but so many people dying. People do not remember when people actually died from these diseases because they have been literally obliterated from the industrial nations.
“I would give just about anything to have a grandmother, but she died from polio when my mother was 17 months old. I have empathy for those with autistic children, but we have gotten so focused on immunizations that we do not look at any other causes. So the next time you hug your child remember my mother who cannot remember any hugs from her mother. Be thankful you have a child to hug.”
In offering our new narrative, we recognize the very real suffering over a very long time. In 1916, the year of the epochal Northeastern epidemic, a New Jersey nurse named Charlotte Talley wrote an article for The American Journal of Nursing with the antiseptic title, “Tracing the Sources and Limiting the Spread of Infantile Paralysis.” But her descriptions were deeply empathetic:
“’Blease, blease, do something,’ pleaded a Polish mother hysterically, clasping her hands in supplication, her mouth quivering. ‘They took my boy to ‘ospital and see,’ showing the bathtub full of soiled clothing, ‘here are all the clothes from the sickness and no water to wash ‘em. Landlady said she get plumber today. She gets no one.’”
The epidemic turned health workers into bystanders to despair.
"A little girl of nine had died of paralysis after a few days of great suffering. She had been a beautiful, bright, lovable child, the pride of the household," Talley wrote. Apparently, despite all her parents' precautions, she had played with a neighbor child who had an inapparent infection and may have been exposed to the virus that way.
“Where was God?” asked Talley. “It is difficult to understand how such things are permitted by Providence to occur. Evidently, human intelligence is expected to work out this serious problem in order to prevent such disasters.”
The suffering of polio’s victims is honored by learning all of its lessons, including the danger of environmental toxins and the perils of ignoring their role in modern disease; the risk of focusing all of our energy on vaccinations as magic bullets, and the fundamental ethical obligation to search for the truth without fear or favor. Only then can we work out the real nature of illnesses that confront us here and now, ranging from autism to Parkinson’s to the persistence of poliomyelitis itself. Only then can we begin to prevent such disasters as The Age of Polio.
Dan Olmsted and Mark Blaxill are co-authors of The Age of Autism – Mercury, Medicine, and a Man-made Epidemic, published in paperback in September by Thomas Dunne Books. Olmsted is Editor, and Blaxill is Editor at Large, of ageofautism.com.
*Article originally appeared at Age of Autism.