Half (51%) of U.S. adults say they are likely to get a COVID-19 vaccine when it becomes available, according to a recent Pew Research Center poll.
This is a more than 20-point drop from the same survey done in May, when 72% of Americans said they definitely or probably would get the vaccination.
Amid the largest pandemic since the 1918 flu, one would think more Americans would agree on the need for a vaccine. After all, billions have been spent on the development of a vaccine, and the public continues to demand more should be done.
But vaccination was a divisive subject in the U.S. well before the so-called “anti-vaxxer” trend, and there are a few possible reasons for this radical shift in only four months.
First, Americans’ uneasiness about vaccination programs goes back to Dr. Edward Jenner’s smallpox vaccination at the end of the 18th century.
It generated public outcry led by anti-vaccination physicians like Dr. Charles Creighton, who claimed the vaccination caused syphilis.
Like many myths, Creighton’s claim contained a kernel of truth: early inoculation programs in England did cause unintended side effects like tuberculosis, tetanus, bloodstream infections and even syphilis.
Of course, modern infectious disease specialists correctly attribute these cases to the era’s lack of quality control and sterile technique.
The side effects were not caused by the vaccine itself but by contamination during vaccine production or administration (for example, the reuse of needles).
In 19th-century America, vaccinations were regulated by individual states, but anti-vaccination societies emerged as early as 1879, claiming the crisis was over and vaccinations were no longer necessary.
The issue came to a head in 1905 with Jacobson v. Massachusetts, in which the Supreme Court upheld the constitutionality of mandatory vaccinations to protect the public health. This paved the way for large-scale vaccination programs.
The anti-vaccination controversy reemerged in 1955 with one of the most significant pharmaceutical disasters in history.
That year, Cutter Laboratories produced a batch of 120,000 doses of polio vaccine that mistakenly contained live polio virus mixed with the intended inactivated virus. As a result, over 40,000 people contracted the disease, dozens were paralyzed and at least five died.
Rumors of contaminated vaccines still circulate, despite all of the advances in technology and procedural safeguards that now govern vaccine production and inoculation.
Many who view vaccinations with suspicion cite Dr. A. J. Wakefield’s investigation, published in The Lancet in 1998 and ultimately retracted in 2010.
He claimed that eight of the 12 children studied developed neurological symptoms (autism) after receiving the measles, mumps and rubella (MMR) vaccination.
While this study has been repeatedly shown to have countless flaws, it is still used by the anti-vaccination community to push back against government-sponsored mandatory inoculation programs.
It would be a mistake to use this recent Pew study to argue that the public has lost faith in the development of an effective COVID-19 vaccine, notwithstanding all of the politics surrounding its production.
While these recent numbers are insightful, they are consistent with similar studies published over the last decade.
During the H1N1 swine flu pandemic of 2009, Pew released a study showing that around 47% of Americans stated they would be willing to receive a possible vaccination. For the 2017-18 flu season, the CDC estimated that around 45% of adults were vaccinated.
So why the abnormally high number in the May Pew survey?
Most likely, that early figure was driven by fear. In May, COVID-19 was new. We did not know a lot. All the public saw were ever-increasing infection and mortality rates.
And as noted above, Americans have an often-irrational mistrust of vaccines. The September numbers reflect this attitude.
All of this raises the question: If a COVID-19 vaccination is available in early 2021, why should we get it?
First, vaccines that are properly tested and produced to the highest standards simply work.
The last recorded case of smallpox was in 1977. While smallpox disappeared from most industrial and Western countries in the 20th century, it still circulated in Africa and Asia. Not until the World Health Organization launched its intensified smallpox eradication program in 1967 was the battle finally won.
We have also seen polio almost eradicated worldwide. Even in the U.S., cases of whooping cough, mumps, diphtheria, hepatitis B and rubella have plummeted. This is linked to our comprehensive immunization programs.
In short, good quality vaccines work. Because modern vaccines are incredibly safe due to high industrial standards, it is generally safer to get a vaccine than to go without – of course, each patient should consult with their physician.
Second, when a large share of the populace is vaccinated against a specific pathogen, it speeds the arrival of herd immunity.
Herd immunity occurs when enough people in a population are immune to an infectious disease, whether through vaccination or through infection and recovery, that the pathogen can no longer find new hosts easily. This in turn reduces the chance the virus spreads rapidly and often causes outbreaks to die out.
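The intuition behind herd immunity can be made concrete with a standard epidemiological approximation (a textbook rule of thumb, not a figure from this article): if each infected person transmits the virus to $R_0$ others on average in a fully susceptible population, then the fraction $p$ of the population that must be immune to stop sustained spread is

\[
p \ge 1 - \frac{1}{R_0}
\]

For example, if $R_0$ were 3, then $p \ge 1 - \tfrac{1}{3} \approx 67\%$: roughly two-thirds of the population would need immunity, from vaccination or prior infection, before each case infects fewer than one new person on average and the outbreak recedes.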
It can be argued that individuals who are healthy enough to receive a vaccination have a moral duty to participate in the process in order to protect the public. This is not so much about taking care of oneself as it is others and future generations.
What we learned from centuries of battling smallpox is that if a virus flourishes anywhere, it can ultimately come back. We are not safe until all of us are safe.
The subject of vaccinations has had a long and rocky history, often due to the science being misunderstood by the public.
Rumors still abound about inappropriate experiments, laboratory accidents or dangerous side effects. Every time we have a new virus and talk about developing a new vaccine, the old debate reemerges.
Yet, for established vaccines, the vast majority of Americans support the programs. Last year, Pew found that 73% of adults affirmed the health benefits of the MMR vaccine and 82% favored school vaccination programs.
As we all continue to move forward in this pandemic, we each need to take a step back and seriously look at the evidence.
What is being written in the best journals? What does my personal doctor have to say? What information are the Centers for Disease Control and Prevention and the National Institutes of Health releasing?
It is based on this information that individuals can decide two things: What is best for my family and me? What is best for my community and the world?
We cannot let rumors and fear of the unknown make our decisions for us.
Senior Staff Chaplain and Clinical Ethicist at the Baptist Health Medical Center in Little Rock, Arkansas.