This is a Rorschach Test

And the perception may be unconnected to reality

The world at large sometimes has trouble assembling facts into accurate narratives. This is especially true for facts relating to nuclear energy. The problem can extend to elected officeholders, career journalists, and activists who try to influence both groups.

For example, this BBC report sounded straightforward: a worker at the Fukushima nuclear plant after the 2011 tsunami and meltdown had died of lung cancer because of his radiation exposure there.

But within the nuclear industry, the picture was more detailed. The man was in his 50s and had worked at several nuclear plants over decades. He had accumulated radiation dose in various places, and his dose at Fukushima was not particularly high. And he died of a cancer that might or might not have come from radiation exposure; under Japanese law, lung cancer is one of a list of ailments presumed to have been caused by radiation exposure if the patient worked in the nuclear industry. So yes, he had died of cancer, the cancer could have come from radiation, and he’d worked at Fukushima. All that remained was for the BBC to connect the dots, correctly or not.

Think of the facts as a Rorschach test. Hermann Rorschach was a Swiss psychiatrist who developed a series of inkblots, symmetrical from left to right but with no intrinsic meaning. Researchers show them to subjects and ask what they see, and the answer tells more about the viewer than about the object being viewed. (The joke is that the man who is shown the series of inkblots says, “Who is this guy Rorschach, and why is he showing me pictures of my parents arguing?”)

Rorschach died 103 years ago, but if he were living today, maybe we’d call him a kind of media analyst; he measured the differing responses of various individuals to a single set of inputs.

Technically, it isn’t possible to get a failing grade on a Rorschach test, but effectively that’s what the BBC did. The Fukushima worker story shows a phenomenon that exists in many realms of human enterprise but seems especially strong in nuclear. Here is the central rule that describes the processing of nuclear information: people make sense of the facts by fitting them into preconceived notions, and close off all other interpretations. This gets easier when their frame of reference is limited, which is true of most people on this subject.

In public perception, nuclear energy in general, and Fukushima in particular, create an enormous divide. On one side are the people who think that after the giant earthquake and tsunami, the triple meltdown killed hundreds or thousands. On the other side are people who can rationally separate the tsunami’s death toll from the effects of the reactors being wrecked.

In fact, the earthquake and tsunami killed about 20,000 people, but the World Health Organization says that the radiation doses weren’t large enough to cause a discernible change in cancer rates, and that no individual fatal cancers will be attributable to the accident. People died of cancer before Fukushima and have continued to since. Some smoked, some didn’t; some had a family history, some didn’t.

The BBC was hardly alone. Numerous TV news shows had an announcer reading dire pronouncements about Fukushima while the visual was a burning oil refinery. In the spring of 2011, if it was Japanese and on fire, it must be a nuclear plant, right?

There are other cases of nuclear-induced myopia. Consider Norman O. Aamodt and Marjorie Aamodt, organic farmers in Pennsylvania, who campaigned against the restart of Three Mile Island Unit 1 after the accident at Unit 2.

Mrs. Aamodt testified before the Nuclear Regulatory Commission that they had taken blood samples from 29 people in 1994 and 1995 and sent them from Pennsylvania to the Russian Academy of Sciences in Moscow for analysis. She testified that the Russians concluded that the average dose to the people who gave the samples was 100 rems. This was at least 1,000 times larger than the maximum dose that public health officials estimated a member of the public could have received, and even that level was unlikely, because it would have required a person to stand at the fence line for several days. And it was about 17,000 times larger than the average estimated dose.

Beyond the enormous discrepancies in estimates of the dose that someone near the reactor could have received, the Aamodts asserted that the 1979 accident was the cause of numerous cancer cases. But a court found that the Aamodts had identified those cases earlier than a radiation exposure could have caused them. At the NRC, one official suggested privately that such a finding could be helpful to public health: if there was, in fact, an unusually large number of cancers in the area, it suggested a cause other than the accident, one that might be continuing; further research might uncover an ongoing health threat. But the Aamodts, unable to fit the information they thought they had into their desired framework, gave up. Their object, it appeared, wasn’t to protect public health but something narrower and less useful: to protect public health from nuclear energy.

Not all the misunderstandings overestimate the potential hazards of nuclear technology, and some misunderstandings are more consequential than the BBC’s about Fukushima. A worrisome one involved nuclear safety: the Department of Energy did not know what it did not know about a nuclear materials plant it owned, and wouldn’t believe it when it was told.

The problem became public in a scene that could have come from a Netflix drama, except that it happened a decade before Netflix and it was real. It teaches something about why our energy and environment deliberations are so frustrating.

In September 1988, Jerry Hulman, director of the Office of Quality Programs at the department, was testifying before a joint House and Senate hearing when the chairman, Senator John Glenn (D-Ohio), held up a list, created by DuPont, a government contractor, of the worst safety incidents in the almost 40 years that the company had operated the Savannah River Plant, where the department made materials for nuclear weapons.

Hulman, a veteran staffer at the Energy Department and its predecessor, the Atomic Energy Commission, was adamant: these events had been concealed from the government. But a week later, at the site, a DuPont engineer who was a co-author of the report told me that the company always told the government everything. There were two reasons: the government owned the plant, which DuPont ran for a symbolic payment of $1, and nobody in the government would complain, because they wouldn’t understand the report anyway.

The symmetry of differing perceptions was perfect. Senator Glenn and his staff said the list showed an alarming safety problem. But the engineer said it showed precisely the opposite: reportable incidents were becoming less frequent and less severe.

The Energy Department later conceded that it had, in fact, been given the document, but it never found its way to anybody in a position of authority in Washington. And two of the 30 “suppressed” incidents on the list had been featured over the years in reports in the Atlanta Journal-Constitution.

The problem for the Energy Department was that these facts did not fit into its established framework and had to be rejected. And it couldn’t keep track of all the facts. In reality, it lived in an isolation chamber, unaware that it was missing information that was in the public domain.

It calls to mind the oft-repeated (and oft-bastardized) observation of the philosopher and writer George Santayana, that those who cannot remember the past are condemned to repeat it.

But memory isn’t the problem. Santayana’s formulation presumes we understand the past, or, more modestly, the present. Often, we don’t. And it does no good to tell something to someone who isn’t equipped to believe it.

Nuclear energy is prone to be misunderstood for several reasons. One is that it is one of the few technologies explicitly designed with the worst case in mind. Hence nuclear power plants have advance plans for evacuating everyone within ten miles, though no such evacuation has ever been ordered. Battery plants that store energy from intermittent sources are not designed with that in mind, although earlier this month, one such plant near San Francisco forced the closure of a major roadway and the evacuation of 1,200 nearby residents. A nuclear plant that caused such an evacuation would be in deep trouble, but there is no sign that California will stop building battery plants.

Another is the highly unusual way that regulatory agencies treat radiation. It is almost in a category by itself: a threat for which there is no minimum threshold. To draw an analogy: swallowing 350 aspirin tablets of 500 milligrams each will probably kill you. Radiation is regulated on a different methodology, one that assumes that having 350 people each swallow one tablet will probably kill one of them. The analogy isn’t perfect, though, because aspirin doesn’t exist in nature and radiation does, unavoidably. And the regulatory theory, which has a thin scientific basis, isn’t applied uniformly. Airline flight attendants, for example, accrue more dose than almost all power plant workers, because they spend so many hours above the thick atmosphere that shields people on the ground from radiation from space. Their dose can be substantial. But their exposure is unregulated.
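For readers who want that arithmetic spelled out, here is a minimal sketch of the linear, no-threshold logic, using only the illustrative numbers from the aspirin analogy (they are not actual regulatory values):

```python
# A toy illustration of linear, no-threshold (LNT) reasoning, using the
# aspirin analogy above. The numbers are illustrative, not regulatory values.

LETHAL_DOSE_TABLETS = 350  # one person swallowing 350 tablets will probably die

def expected_deaths(people: int, tablets_each: float) -> float:
    """Under a linear, no-threshold assumption, risk is proportional to dose
    with no safe floor, so expected harm depends only on the total
    'collective dose', however thinly it is spread across a population."""
    risk_per_person = tablets_each / LETHAL_DOSE_TABLETS
    return people * risk_per_person

print(expected_deaths(1, 350))   # 1.0 -- one person takes the whole lethal dose
print(expected_deaths(350, 1))   # 1.0 -- same collective dose, spread over 350 people
```

That equivalence, one expected death either way, is the assumption the analogy points at, and it is what critics mean when they say the theory has a thin scientific basis at very low doses.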

We do worry about the health risks of flying, but generally not because of the radiation.

All of this allows everyone, from the BBC to dedicated anti-nuclear activists, to connect the dots irrationally.

The upshot: The irrational will not inherit the earth. They already own it.