Two New Papers Are Wrong About Cancer Risk from Nuclear Plants
Poor Research Design and Strong Claims Don’t Mesh Well
In December 2025, researchers led by Yazan Alwadi at Harvard’s T.H. Chan School of Public Health published a paper in Environmental Health that claimed to find that cancer incidence increased for people living closer to nuclear power plants in Massachusetts. Just this past week, the same researchers published an expanded nationwide study claiming a similar result—this time looking at cancer mortality rates, rather than incidence—in Nature Communications.
If these findings were true, the research would support the fringe idea, unconfirmed by past research, that nuclear power is actively harmful to the general population even without a catastrophic failure. Anti-nuclear activists would no longer need to point to the possible risk of meltdowns; they could simply point to increased cancer risks from living close to a plant.
That is an extraordinary claim. But the studies’ design cannot support that claim.
The problem is not that the authors found a statistical pattern. The problem is that their research design cannot determine whether proximity to a nuclear plant is the cause of that pattern. It can only show that cancer rates vary geographically and that cancer detection rates have increased over the past few decades, which we already know.
The two papers make the fundamental mistake of confusing correlation with causation: the Massachusetts study provides no control group, and the national study uses an improperly sampled one. This research is not only wrong, it is fundamentally dangerous. By purporting to show increased cancer risks from safely operating nuclear power plants while providing no real evidence to support the claim, these papers are actually increasing the public health risk associated with the US power sector: they fuel efforts to shutter operating nuclear power plants, they discourage the deployment of new reactors, and they thereby increase the air pollution and public health burden from fossil-fueled electricity generation.
Distance Is Not Dose
To establish that nuclear plants are increasing cancer risk, a study must show more than a map-based association. At a minimum, it must demonstrate:
- A plausible exposure pathway,
- Evidence that meaningful doses are being received,
- A credible comparison showing what would have happened in the absence of that exposure.
Neither study does this.
Nor do they attempt to contend with the broader literature on radiation exposure and cancer risk among the nuclear workforce, or with publicly reported radioactive release data.
Instead, both papers construct a “proximity score” based on distance from nuclear plants, up to 120 km in Massachusetts and 200 km in the national study. Every ZIP code or county inside those radii is treated as exposed, with closer locations receiving higher weights.
But distance from a facility is not a measure of radiation dose. It does not account for wind direction, release rates, shielding, plant operating history, or actual monitored emissions. It does not measure individual exposure. It does not validate whether residents received any incremental dose beyond natural background radiation.
It is a geographic proxy and a very broad one.
When that proxy is applied across large regions where cancer rates, demographics, income, healthcare access, and age structures also vary geographically, statistical associations can emerge even if there is no causal exposure from the plant itself.
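To make concrete what a distance-only proxy does and does not capture, here is a minimal sketch of an inverse-distance proximity score. The 1/d weighting, planar coordinates, and cutoff handling are our assumptions for illustration, not the papers’ exact formula:

```python
import math

def proximity_score(location, plants, cutoff_km=120.0):
    """Toy inverse-distance proximity score (illustrative only).

    Each plant within `cutoff_km` contributes 1/d, and contributions are
    summed, so a location near several plants gets a higher score. This
    is NOT the papers' published formula -- just the general shape of a
    distance-based proxy, which measures geography, not radiation dose.
    """
    score = 0.0
    for plant in plants:
        d = math.dist(location, plant)  # distance in km, planar coordinates
        if 0 < d <= cutoff_km:
            score += 1.0 / d
    return score

# A ZIP code 15 km from two plants outscores one 15 km from a single
# plant, even though neither score says anything about dose received.
plants = [(0.0, 0.0), (30.0, 0.0)]
near_two = proximity_score((15.0, 0.0), plants)
near_one = proximity_score((0.0, 15.0), [plants[0]])
assert near_two > near_one
```

Note that nothing in such a score depends on wind, shielding, release rates, or monitored emissions; it is purely a function of map distance.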
Cancer Incidence in Massachusetts
The first study from Alwadi’s research group, published in December 2025, “investigates the association between residential proximity to nuclear power plants and zip-code level cancer incidence” for the population of Massachusetts. The paper’s abstract makes the strong causal claim that “proximity to plants significantly increased cancer incidence, with risk declining by distance.” But what the paper actually finds is that the aging population of urban and suburban Massachusetts between 2000 and 2018 experienced increases in new cancer cases. What it does not find is that nuclear power plants had anything to do with it. This, at core, stems from shoddy research design that cannot properly assess the question at hand.
A primary issue with the paper is that it fails at proper research identification and specification. A model is considered identified when it is able to separate cause from coincidence by showing what would have happened without the exposure or treatment. In plain English, it’s having a credible way to compare reality to a believable “what if nothing changed?” scenario. Because the paper cannot show what would happen to the populations under scrutiny if there were no nuclear power plants, there is no way to say that nuclear plants, in particular, were the cause of increased cancer incidence. The authors then misspecify their model through incorrect assumptions. Misspecification occurs when a model does not correctly represent the underlying statistical or causal relationships, leading to misleading or biased results. In this case, the authors assumed uniform dispersion and distribution of effects based on proximity, did not consider the nonrandom construction of houses near nuclear power plants, had no mechanism for measuring actual doses received, and did not consider the lags involved in the development of cancer.
The paper’s model differentiates proximity to nuclear power plants and assigns exposure rates based on how far from nuclear plants people lived. But, because the population of Massachusetts is in close proximity to seven nuclear plants—Connecticut Yankee, Indian Point, Millstone, Pilgrim, Seabrook Station, Vermont Yankee, and Yankee Rowe, five of which were operational during the study period—the majority of communities were deemed to be affected by more than one nuclear plant. This is due to the authors selecting a very large 120 km distance from each plant to evaluate. The emergency planning zone for these plants, even during severe accident scenarios, is only 10 miles (16.1 km) for plume exposures. For ingestion pathways, where contaminants could migrate over time, 50 miles (80.5 km) is the standard radius. The paper includes a sensitivity analysis for distance that would not be able to detect a sensitivity because it only evaluates beyond the standard range, from 80 to 150 km.
Some zips were counted as having up to four “exposures,” including the heavily populated zip codes between Boston and Springfield. And because the entirety of Massachusetts is counted as “exposed” to a nuclear plant, the study has no control group, nullifying any possibility of the study being able to show a causal relationship. Even the lower bar of showing correlation cannot be adequately attained because any relationship shown in the results would be self-referential due to every observation being “exposed.”
The paper later claims to run a sensitivity analysis by removing the two zip codes with the highest proximity values, explaining that, because results are consistent across the analyses, there are no concerns. In fact, that consistency should have been a red flag that the design of the proximity factor has a problem. The results are consistent because they depend on analyzing the whole state without a real distance gradient: the parameters changed in the sensitivity analysis, namely the distance, still ensured that the whole state was encompassed by the radii around the plants.
Figure 3 shows that the state is treated even more uniformly when the highest two zip codes are removed from the analysis. It visually makes the case that their sensitivity analysis, if anything, exacerbates the design problem instead of testing the sensitivity of the effect.
When we correlate the proximity factor from the paper against distance from nuclear power plants in Figure 4, it is clear that the south-central part of the state, in the area around Worcester, despite being the absolute farthest from all plants, is being over-weighted by this method.
Further, because the paper’s analysis across time and space fails to find a significant cancer effect, the authors go “model shopping” for a methodology that delivers their preferred result, a strategy known as p-hacking. They chose a cross-sectional model that gives them a positive finding. But the cross-sectional model is weak and can be driven entirely by demographics, socioeconomic status, healthcare access, and other factors, not nuclear power plants.
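A toy simulation (our assumed setup, not the papers’ data) shows how a cross-sectional regression can find a strong proximity “effect” when a shared confounder such as urbanicity drives both plant siting and cancer detection, even though radiation never enters the model at all:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical confounder: urbanicity drives both where plants end up
# (near load centers) and how much cancer gets detected (screening access).
urbanicity = rng.uniform(0, 1, n)
proximity = 0.8 * urbanicity + 0.2 * rng.uniform(0, 1, n)
detected_cancer = 0.8 * urbanicity + 0.2 * rng.uniform(0, 1, n)
# Radiation appears nowhere in the data-generating process.

# A naive cross-sectional regression of cancer on proximity still finds
# a strong positive "effect" -- it is just the shared confounder.
slope, intercept = np.polyfit(proximity, detected_cancer, 1)
r = np.corrcoef(proximity, detected_cancer)[0, 1]
print(f"slope={slope:.2f}, correlation={r:.2f}")
assert slope > 0.5 and r > 0.5
```

Without a comparison group that is genuinely unexposed, nothing in such a regression can distinguish this confounded scenario from a real causal effect.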
The period of analysis (2000-2018) overlaps with a general improvement in the medical profession’s ability to detect cancer. New cancer cases across the country have increased over the past decades. This is due to advances in diagnostic technologies and practices that have allowed doctors to detect cancer earlier and more often than previously. According to data from the Centers for Disease Control and Prevention, the number of new cancer cases nationwide rose by 35.9% between 2000 and 2018. The number of new cases rose each year, from 1.3 million in 2000 to 1.8 million in 2018. Cancer mortality increased from 553,000 to almost 600,000 annually over the same time period.
Massachusetts saw a decrease in cancer rates during the study period, but saw an increase in new cancer cases each year. Per the Massachusetts statewide report on cancer incidence and mortality 2014–2018, “The rise in new cancer cases is mostly due to an aging and growing Massachusetts population.”
Combined with the fact that over the same period, the population of Massachusetts aged dramatically, and elderly populations have higher cancer rates, there is little surprise that the researchers would find increasing cancer incidence in the zip codes in question.
Unable to disentangle these factors from their nuclear power plant proximity metric, the study effectively cannot claim to show any causal relationship. And because cancer incidence, by definition, lags any radiation exposure, the study can’t actually claim that living near a nuclear power plant over their study period had any effect on the cancer incidence during their study period. The study did a sensitivity analysis relative to lagged effects, but once again, the method setup would not detect a sensitivity due to the design of the overall study.
While this study alone is problematic—it asserts a causal relationship that it cannot prove, and provides material for an already fact-challenged anti-nuclear movement—that the researchers spun this Massachusetts-specific study into a national analysis is downright catastrophic.
Nuclear Power Plants Are Not the Cause of Nationwide Cancer Increases
On February 23, Alwadi et al. published their expanded nationwide study in Nature Communications, a premier scientific journal. While the national study had a few differences from their Massachusetts research, both papers suffer from the same fundamental errors.
The national study looks at counties within an even greater distance of 200 km from a nuclear power plant and uses roughly the same model as the Massachusetts paper to test their hypothesis. The authors make some improvements to the model by including proximity to the nuclear power plants 10 years prior to the cancer data and averaging the proximity over 10 years when a nuclear power plant goes offline. But these improvements do not go nearly far enough to make up for the same confusion over cause and correlation.
Just like the Massachusetts study, the national study has no control group. The authors include all of the U.S. counties within 200 km (124.2 miles) of a nuclear power plant that was operational between 1990 and 2018, and assign a proximity score, or a cumulative score if counties are close to multiple power plants. Most nuclear power plants are located in rural areas, but not extremely far from population centers, because power generation tends to be located near loads due to transmission constraints. The proximity score allows counties further away to have less of a treatment effect, but the further away from population centers one gets, the less representative the sample becomes. The counties with lower proximity scores are disproportionately poorer and more rural, with less access to hospitals, treatment centers, and other non-medical life-extending amenities. Because of relative affluence and proximity to urban centers with medical facilities, those with greater proximity to one or more nuclear power plants are likely to live longer lives, and thus have a higher chance of dying of cancer later in life rather than of some other cause earlier in life.
The researchers, to their credit, added controls for median income, distance from hospital, and other important factors. However, this matters little when the treated sample includes the Northeast Corridor, home to one-seventh of the US population, which, due to the proximity of several nuclear power plants, is weighted many times more than any county further away from a nuclear power plant. Even then, every county included in the study is counted as affected, so there is effectively no control group against which to compare outcomes. Causation is out of the question, and any correlation is once again doubtful because it is self-referential and lacks an identified exposure pathway.
One significant difference between the two studies is what they are, in fact, measuring. In the Massachusetts-specific study, the authors look for rates of cancer incidence (cases of cancer). In the national study, they look for increased rates of cancer mortality (deaths caused by cancer). It is unclear why the authors chose to examine a different outcome. Perhaps cancer incidence did not give them a positive or significant result at the national scale, and switching outcomes delivered the result they wanted. That, too, would be a form of p-hacking. Regardless of motive, the switch to cancer mortality creates additional problems for the national study. Mortality is an even noisier proxy for environmental causes of cancer than incidence, since it also incorporates treatment and healthcare disparities. Mortality also has an even greater lag problem: it can take decades for an exposure to lead to cancer, and more years still for cancer to cause death. And because of earlier detection, the years between a cancer diagnosis and mortality are increasing. To make this problem worse, the authors lump all kinds of cancers together, conveniently ignoring variations across cancers—especially in latency period and mortality.
In the national study, the authors are explicit that they cannot establish causality, but they still use a formula for “Attributable Fraction”—the share of cancer mortality attributed to proximity to a nuclear power plant—that assumes a causal relationship between distance from a nuclear power plant and cancer mortality. Attributable fractions are causal quantities. Using them here quietly assumes the conclusion.
Using this Attributable Fraction, they then go on to make the causal claim that they “show that U.S. counties located closer to operational nuclear power plants have higher cancer mortality rates than those farther away.” Their results do no such thing; the resulting indicator is nothing more than a division of relative risks based on bad assumptions and ignores the fact that the authors never measured dose or defined an exposure mechanism based on reality.
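For reference, the textbook exposed-group attributable fraction is AF = (RR - 1) / RR, where RR is the relative risk. The sketch below is the standard epidemiological formula, not the authors’ code, and it makes the point plainly: the quantity is only interpretable if the relative risk is causal.

```python
def attributable_fraction(relative_risk: float) -> float:
    """Textbook exposed-group attributable fraction: AF = (RR - 1) / RR.

    This quantity answers "what fraction of cases among the exposed would
    not have occurred without the exposure?" -- a question that only makes
    sense if the relative risk reflects a genuine causal effect. Feeding
    it a relative risk that merely reflects a confounded correlation
    smuggles in the very causal claim a study is supposed to establish.
    """
    return (relative_risk - 1.0) / relative_risk

# e.g., RR = 1.25 yields AF = 0.2, i.e. "20% of cases attributed to
# exposure" -- but only if that RR is causal, not confounded.
assert abs(attributable_fraction(1.25) - 0.2) < 1e-12
```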
What We Do Know About Nuclear Power Plants, Radiation, and Cancer Incidence
To have a health risk, there must be an exposure to a hazard. The papers assume there is chronic and significant pollution coming from nuclear power plants, citing a different paper that makes the same assertion, but they make no attempt to evaluate that exposure. If there is no exposure, it is equivalent to saying someone has a higher chance of getting black lung despite never having breathed coal dust, or of being in a plane crash despite never having stepped on a plane. This was also a major concern of a peer reviewer, but the issue was not addressed before publication.
Data has shown that the radiation produced by nuclear plants is orders of magnitude lower than the typical exposure from medical imaging or long flights on airplanes. Data availability is not a barrier; data for each nuclear facility is publicly available from the Nuclear Regulatory Commission.
But considering exposure from nuclear power plants would force the authors to confront other sources of radiation. Spatial variation in radon exposure alone dwarfs plant emissions. Routine emissions from operating nuclear power plants are extremely small, usually hundreds to thousands of times lower than natural background radiation and well below regulatory limits. There is far greater variance in the natural background variation that the national sample is exposed to than any excess dose that could conceivably come from nuclear power operations. For people living near plants, measured doses from plant operations are often indistinguishable from zero relative to background radiation, meaning any statistical association between proximity to plants and cancer is far more likely to reflect demographics, detection, or geography rather than radiation exposure from the plant. Given this, it would be surprising to find any measurable correlation, much less one that is statistically significant.
How do we know this? As one of us, alongside colleagues, recently pointed out in an article in the Bulletin of the Atomic Scientists, “large sample studies have tracked the health of millions of workers exposed to routine ionizing radiation and found no definite link to increased risk of cancer at low effective doses.” For example, almost 80 journal articles have been published based on the Department of Energy’s Million Person Study, a major, long-term epidemiological research project designed to understand the health effects of chronic occupational radiation exposure experienced by U.S. workers and military veterans. It is the largest study of its kind in the United States. The Million Person Study is designed to measure long-term health outcomes associated with chronic, low-dose-rate radiation exposures, such as cancer, cardiovascular disease, neurological conditions, and other chronic ailments.
While the radiation exposure of nuclear workers will always be greater than or equal to that received by the surrounding public, most of the closely monitored US nuclear workforce receives no measurable annual dose. When workers are exposed to radiation, the average dose received is only 2 percent of the occupational limit. People do not build their houses on the sites of nuclear power plants, and even if they wanted to, the exclusion perimeter would prevent them from doing so. If operators and workers on-site at nuclear power plants receive an annual dose between zero and one-fiftieth of the occupational limit, how is it possible that residents 5, 10, 25, 50, 120, or 200 kilometers away would receive any measurable dose from the same plant?
Papers Masquerading as Science
The Harvard studies’ results are surprising given the well-established literature and studies that refute the authors’ claims. When results are surprising, we want to know what mechanism is causing them; the authors, however, do not seem to have a good grasp of what drove their results. Even so, good science can simply contribute a new data point and wait for others to determine what it means. Unfortunately, the studies fail to provide even a new or valid data point. The model was not properly identified, was misspecified, is replete with errors, and therefore cannot show any causation.
These studies do show that cancer incidence—or really its detection—and cancer mortality rose from 2000 to 2018, even if the authors don’t recognize that contribution. Unfortunately, they do little else.
Wealth, exercise, better nutrition, and access to healthcare have increased human life expectancy, and with longer lives, the probability of developing cancer increases. Rising cancer numbers reflect a plethora of factors, not simply aging, and assuredly not any one specific factor such as proximity to nuclear power plants.
The debate over nuclear power is too important to be contaminated by bad science.
Environmental Health, the journal that published the first study, indicated that a code sample would be available upon request. We emailed the lead author shortly after the paper was published in December and requested a sample of their code. The author has neither responded to nor acknowledged the request. The code we at the Breakthrough Institute used to make the proximity indicator and radius maps is available here.