Peter Karamoskos
... there is a linear dose-response relationship between exposure to ionizing radiation and the development of solid cancers in humans. It is unlikely that there is a threshold below which cancers are not induced. - National Academy of Sciences, BEIR VII report, 2006.
We need to develop a very firm commitment to the elimination of nuclear power as a source of energy on the earth. - Russell Train, former US Environmental Protection Agency administrator, 1977.
[t]he [economic] failure of the US nuclear power program ranks as the largest managerial disaster in business history, a disaster on a monumental scale. - Forbes, 1985.
Introduction
The public health implications of a resurgence of nuclear power appear to have taken a subordinate position to the economic and global warming arguments that the industry has advanced to justify its expansion. The purpose of this essay is several-fold: to review the scientific evidence for the public health impacts of nuclear power; to assess the occupational hazards faced by workers in the nuclear fuel cycle; and to assess the evidence for nuclear reactor safety and critically challenge its underlying assumptions. This paper will also examine the public health risks of spent fuel from nuclear power reactors. The common thread linking these safety issues is the risk posed to public health by ionising radiation, and in particular the risk of cancer. The nuclear industry and our understanding of radioactive health hazards developed in tandem during the 20th century, but the relationship has always been uneasy and often in conflict. A brief historical narrative of this joint evolution is essential to understanding the context and scope of the health issues at the heart of the debate.
If we are to believe the nuclear industry, nuclear power is both safe and vital to our future, yet over half a century of nuclear power has proven both contentions false. In the last decade, the nuclear power industry has undergone a 'renaissance' of interest and hype, spurred along by the claim that it is vital to combating global warming. The industry has had many false starts, each time failing to live up to its promises. At its inception, it sold itself as providing limitless electricity too cheap to meter. When this was proven false, it attempted to recreate itself as the key to energy security during the oil shocks of the 1970s. But it foundered again: not only was it too expensive, and most electricity generation did not rely on imported oil, but it was so economically unattractive that financing was virtually impossible without heavy tax-payer subsidies and loan guarantees. Throughout this period, public health concerns grew against a backdrop of reactor safety failures and the effects of ionising radiation on surrounding populations, with ten core meltdowns in various nuclear reactors, including several in nuclear power reactors, culminating in the Chernobyl disaster of 1986.
The link between nuclear power and nuclear weapons is critical to understanding the context of the industry's development and its impact on public health and safety. Nuclear power followed the development of nuclear weapons in the USA, in an attempt to garner public support for a technology that had shown how destructive it could be and how much of a threat it posed to humanity. Public tax-payer support was critical to facilitate further weapons development. Nuclear power was the product of the 'Atoms for Peace' program of the 1950s, designed to achieve this end, and it led to the export of nuclear reactor technology, as well as bomb-grade highly enriched uranium as reactor fuel, to many countries. Propelled by hubristic 'more is better' military commanders and civilian nuclear boosters intent on highlighting the 'peaceful atom', the nuclear establishment inadvertently, although not unpredictably, seeded illicit weapons programs around the world.
The original drivers of nuclear power were not a need for electricity, environmental concerns, or the need for energy security, but the political and military imperatives that dominated and spurred its development. In this climate, safety issues were not paramount. How could they be if the science of the human effects of ionising radiation was still in its infancy and the safety of nuclear reactors unknown? If anything, safety concerns posed potential obstacles to the industry's development and needed to be managed, as they were by savvy media men. It was a climate of 'electricity today, and (maybe) safety tomorrow'.
The questions are: what is different now? Is nuclear power now safe?
The history of human health and the safety of nuclear power is inextricably intertwined with the evolving history of the health effects of ionising radiation (IR). Whereas the science underpinning the generation of electricity from nuclear power is well established, our understanding of the health effects of ionising radiation on humans is still evolving. This is not to discount the voluminous research and findings clearly documenting the adverse effects of ionising radiation on human beings. That much is well documented and understood. The uncertainties lie in precisely quantifying the effects of IR, including defining the risks of ever decreasing doses with greater precision. This is the key to understanding the direct adverse health effects of nuclear power on two groups: nuclear industry workers, and populations in the vicinity of nuclear reactors and subject to their radioactive emissions.
Ionising radiation & public health
Ionising radiation arises from many sources; nuclear fission, which powers nuclear reactors, is one. Ionising radiation is postulated to impart its deleterious health effects through two mechanisms: transference of its energy to atoms in biological tissue, which then become electrically charged, leading to the formation of free radicals that damage the cell's genetic blueprint (DNA) and cause genetic mutations; and direct DNA disruption along the track the ionising radiation traverses through the cell's nucleus. The most mutagenic (mutation-causing) of these lesions are double-strand breaks (DSBs), where both strands of the double-helix DNA molecule are simultaneously disrupted, with a high likelihood of resulting mutations. This predisposes to the initiation of cancer when the regulatory mechanisms of the cell fail. Cancer may not appear for 10-40 years (the latency period), although the time can be as short as five years for leukaemia. Ionising radiation is classified as a Group 1 carcinogen by the International Agency for Research on Cancer (IARC) of the World Health Organisation (WHO), the highest classification, reserved for agents whose carcinogenicity is certain.
Two types of IR health effects are recognised. The severity of deterministic effects is directly proportional to the absorbed radiation dose: the higher the dose, the worse, for example, a radiation skin burn becomes. These effects, which include skin damage and blood disorders, have a threshold below which they do not occur, although this may vary between individuals; the threshold is around 100 mSv, at which point blood production begins to be impaired. The sievert (symbol: Sv) is the international unit of effective radiation dose, i.e., 100 mSv is 100 millisieverts.
Stochastic effects are 'probabilistic' in nature: the higher the dose, the greater the chance of them occurring, but once they occur their severity is the same irrespective of the original dose. The main stochastic effect is cancer. The lower the dose of IR, the lower the chance of contracting cancer, but the type and eventual outcome of the cancer are independent of the dose. The high-dose deterministic effects of IR were readily observable soon after the discovery of radioactivity, but the concept of a stochastic effect as a mechanism for the development of cancer took several decades to be understood. The quantification of stochastic effects occupied scientific debate throughout most of the 20th century and is still being played out. The distinction is critical to understanding the health impacts of low-dose radiation, particularly with nuclear power, where radiation doses to workers and the general population are below deterministic levels, and to understanding why there is considerable controversy over its significance.
A brief history of radiation safety & the nuclear industry
The hazards of IR were inadvertently demonstrated by the pioneering researcher Marie Curie, who identified the radioactive element radium three years after Wilhelm Roentgen discovered x-rays, another source of IR. Curie died of aplastic anaemia, a radiation-induced blood disorder, and Roentgen of cancer. Many more workers over the ensuing years experienced skin burns and deep tissue trauma, blood diseases and cancers, most famously the radium dial workers. Carcinogenicity was observed as early as 1902. Still, it was not until 1925 that the first protective limits were suggested for workers (the public had to wait until 1959 before general public limits were enacted in the US, although this occurred earlier in Europe). For three decades, these limits were based on the concept of a 'tolerance dose' which, if not exceeded, would result in no demonstrable harm to the individual, and which implicitly assumed a threshold dose below which radiation effects would be absent. The tolerance dose was derived from the 'minimum erythema dose', the dose producing skin reddening after exposure to IR: it was initially set at 1 per cent of this dose, corresponding to approximately 2 mSv per day, and was halved in 1936 (the current occupational limit is 20 mSv per annum).
After World War II, largely because of genetic concerns related to atmospheric weapons testing, radiation protection dose limits were expressed in terms of a risk-based maximum permissible dose (MPD). This was an arbitrary limit based on the unsubstantiated assumption that any hazards below this level were not significant, and it represented a compromise between safety and pragmatism. In effect, the public bore the burden of proof for demonstrating significant harm below these limits. The concept of stochastic risk was not even considered. In 1946, the National Committee on Radiation Protection (NCRP) reduced the MPD in the USA to 150 mSv per person per annum. These limits only applied to external radiation, i.e., x-rays and gamma rays. They did not apply to internal emitters (radionuclides ingested or inhaled), such as radon and radium, since there was no way of measuring the radiation dose from these sources. The radiation dose from internal emitters was finally determined by the infamous 'radiation experiments', in which subjects were administered plutonium and uranium without their knowledge or informed consent.
The 1927 discovery by Muller of x-ray induced genetic mutations in fruit flies, linear with increasing dose and with no apparent threshold, would later become an important underpinning of radiation protection standards. During an era when business and the medical profession were trumpeting radium as a miracle cure, even adding it to bottled drinking water and chocolate bars, it was easy and convenient to dismiss Muller's findings as irrelevant to humans. This remained the case for at least four decades after his discovery.
Nuclear weapons give rise to nuclear power - peace now, war later?
The Manhattan Project was the code name for the project conducted during World War II to develop a nuclear bomb. It was a collaborative project led by the US with participation by the UK and Canada. In 1942 it achieved the first controlled nuclear chain reaction, in a reactor designed under the leadership of Enrico Fermi, and in July 1945 it developed and detonated the first nuclear explosive device, Trinity.
The atomic bombs detonated over Hiroshima and Nagasaki less than one month after Trinity were a watershed. Radioactivity would no longer have the lustre it once had; it became synonymous with death and destruction, particularly after the Soviet Union also succeeded in acquiring nuclear weapons. Eight years after nuclear weapons were used in war, Dwight Eisenhower set the US government on a new course, intended to show the world that nuclear weapons, radioactivity and radiation were not harbingers of death but in fact benign forces for the betterment of mankind. The 'Atoms for Peace' program was born to convince Americans that the new technologies were full of hope and that nuclear reactors should be developed with tax dollars to generate electricity. The vision was of electricity "too cheap to meter". Eisenhower played on Cold War fear of attack by the Soviet Union while offering the countervailing hope of peaceful nuclear energy. The process of persuasion was controlled by media men, led by Charles Douglas Jackson, an expert in wartime psychological operations. Eisenhower set the stage for the eventual formation of the International Atomic Energy Agency (IAEA), a key step in the government-backed worldwide promotion of civilian nuclear energy.
Domestically, the power utilities were reluctant to embrace a risky and expensive new technology; in 1954, nuclear safety was still riddled with unknowns. Nevertheless, the US provided funding, research reactors, and bomb-grade highly enriched uranium to 42 countries to kick-start interest. It also heavily subsidised domestic utilities and offered them liability protection (the Price-Anderson Act) in the case of a nuclear accident, transferring the risk to the public. To this day, private nuclear energy utilities worldwide rely on liability protection; they would not be generating nuclear power without it.
Inspired by Eisenhower's example, the nuclear establishments in Britain and France misleadingly promoted the 'peaceful' face of the nuclear industry to conceal the true purpose of the early reactors, which was to produce plutonium for weapons. In every country where nuclear power was under development, the public was misled into thinking that the separation between military and civilian purposes was real. This was clearly evident in countries with reprocessing plants, which highlighted (and still do) the duality of the technology. Any country with a reactor, even one specifically designed for electricity generation, had the means to produce plutonium. One could surmise that nuclear power was nothing more than a fig leaf to hide the military intentions of the atomic establishments from the public. Gullible politicians aided and abetted the subterfuge through their ineptness and reluctance to question the nuclear establishments in most countries, which in turn provided phony economics and false book-keeping in order to justify large expenditures.
The regulation (selling) of nuclear power in a climate of increasing public health concern
The Atomic Energy Commission (AEC) was created by the Atomic Energy Act of 1946 as the successor to the wartime Manhattan Project, with the conflicting roles of overseeing the development and testing of nuclear weapons and convincing the public of their safety. In 1954, the Commission's mandate expanded to promoting and regulating the nuclear industries, particularly nuclear power, and to certifying safety. As a practical matter, given its military origins, the AEC was subject to close control by top military commanders. Thus, by a series of accidents, all major sources of ionising radiation fell under the remit of people and institutions with no reason to explore the early knowledge that IR was harmful. The AEC's military commissioners often over-ruled recommendations to decrease the MPDs. The conflicts of interest led to public clamour for change, particularly as the MPDs were constantly subject to change and varied from one institution to another.
Nothing better illustrated the AEC's conflicts, or damaged its credibility more, than its anodyne interpretations of nuclear fallout studies, which stood in contrast to the concerning results seized upon by its critics. More than any other issue, atmospheric nuclear testing brought the health effects of low-level radiation to the mass media. As the fallout controversy intensified, in 1956 the AEC turned to a prestigious non-governmental scientific body, the National Academy of Sciences, to assess the evidence for the effects of nuclear fallout and comment on the health effects of low-level radiation. Its conclusion was that atmospheric nuclear testing at that point in time did not pose a 'significant' hazard. Yet it foresaw the dangers of radiation exposure from nuclear power and called for careful control. More disturbingly, it concluded that exposure to radiation, even in small doses, could cause genetic consequences that would be tragic in individual cases and harmful in the long term for the entire population over generations: "We ought to keep all our expenditures of radiation as low as possible. From the point of view of genetics, they are all bad."
Partly as a result of the fallout controversy and the National Academy of Sciences report, the National Council on Radiation Protection (NCRP) revised its permissible occupational dose down to 50 mSv per annum. The International Commission on Radiological Protection (ICRP), unlike the NCRP at the time, advised radiation dose limits for the general public set at one-tenth of the occupational permissible doses. The NCRP finally concurred in 1959, adopting 5 mSv per annum exclusive of background (natural) radiation (and an average of 1.7 mSv across population groups to minimise heritable genetic effects).
Nuclear power critics on the ascendancy - no safe threshold
The concept of MPDs implicitly acknowledged that there was no threshold below which radiation could be said to cause no harm, only that the magnitude of the harm was not known (although it was assumed to be minimal). The somatic effects of low-dose radiation, mainly cancer, had not been quantified due to a paucity of data. Increasingly influential, though, was the linear extrapolation of somatic effects observed at higher dose levels down to low doses: the 'linear no-threshold' (LNT) dose-response model.
In this environment, the potential health impacts of emissions from nuclear power plants were gaining unprecedented national attention. In the late 1960s and early 1970s, two scientists affiliated with and funded by the AEC were instrumental in putting enormous pressure on the nuclear power industry and regulatory agencies, which led to the introduction of the LNT model as a means of numerically estimating cancer risks. Arthur A. Tamplin, a biophysicist and a group leader in the biomedical division of the AEC-funded Lawrence Livermore Laboratory, and his supervisor, John W. Gofman, a chemist with a medical degree who co-founded the division and was an alumnus of the Manhattan Project, argued that the then MPDs were too high and advocated reducing them by a factor of ten. They examined health studies of the survivors of Hiroshima and Nagasaki and other epidemiological studies, and conducted research on radiation's influence on human chromosomes. On this basis, they argued that if the MPD of 1.7 mSv were applied to the whole population, it would result in 17,000 additional cases of cancer in the US annually. They further argued that it was not obvious that the benefits of more nuclear power outweighed the risks.
In 1972, the Biological Effects of Ionising Radiation (BEIR-I) report of the National Academy of Sciences declared that nuclear plants generated concern because of their growth and widespread distribution, and that the existing limit of 1.7 mSv across populations was "unnecessarily high". The report also found that the somatic risks (cancer) of low-dose radiation were appreciable and that the LNT model was "the only workable approach to numerical estimation of the risk in a population". It also pointed out the need to evaluate the radiation risks of nuclear power. The AEC, NCRP and other expert groups now accepted the LNT model, which assumed that no level of radiation exposure was certifiably safe. Eventually, this led to further reductions in the limits on nuclear plant effluent (gas and liquid) emissions. Ultimately, Tamplin and Gofman's arguments were vindicated, but not before they were forced to leave the Lawrence Livermore Laboratory after a public vilification campaign by the AEC.
It was not until 1974 that the AEC was disbanded and replaced by the Nuclear Regulatory Commission (NRC) and the short-lived Energy Research and Development Administration. A Joint Committee on Atomic Energy oversaw radiation-related issues in all federal agencies, but frequent tension and disagreement prevailed, partly due to a paucity of firm data on which to base MPDs and policy and, more than likely, the political desire to expand the number of nuclear power stations in the US, which by this stage had stalled due to cost blowouts and downward revisions of electricity demand estimates. The importance of radiation to national security, energy policy and environmental health has always made determining the effects of low-dose radiation on health a difficult problem. To establish regulations for the safe use of radiation, federal agencies had to balance the uncertain health consequences of radiation against the government's interest in nuclear weapons and nuclear power.
The US Environmental Protection Agency (EPA), created in 1970, was given a remit that included protecting the population from environmental radiation, including radioactivity, and it almost immediately came into conflict with the AEC (and the AEC's successor, the NRC). The EPA published a report in 1974 on the possible long-term hazards of nuclear power plants, which estimated that radiation releases from nuclear plants between 1970 and 2020 could cause up to 24,000 deaths (including over a period of up to 100 years after the releases took place). The EPA further acknowledged that the radiation burden on the population arose from the entire nuclear fuel cycle, including not only nuclear plant operation but also fuel fabrication, reprocessing and other processes, which enraged the AEC.
Contemporaneously, the oil embargo of the early 1970s shifted the focus of the US government to becoming energy independent, which it determined required a major expansion in nuclear power. The EPA's plans for regulating fuel cycle emissions were thus seen as compromising this aim and were curtailed. Instead, its activities were restricted to overseeing an ambient standard for the amount of environmental radiation from fuel cycle activities, rather than setting standards for individual fuel cycle facilities. Commenting on the public health risks of nuclear power in 1977, shortly after leaving the agency, former EPA administrator Russell Train declared that "We need to develop a very firm commitment to the elimination of nuclear power as a source of energy on the earth".
Over the ensuing several decades, three further BEIR reports were released. In addition, several nuclear accidents and incidents, including Three Mile Island and Chernobyl (discussed later), damaged the credibility of nuclear power plant safety and renewed public concern and scrutiny, which had partly died down during the latter 1970s. In 1986, the NRC reduced its permissible limits for workers to 50 mSv (combined internal and external sources) and the exposure of the general public from nuclear plants to 1 mSv per person. The BEIR V report (1990) concluded that the risks of cancer and leukaemia were three to four times greater than suggested in the BEIR III (1980) report. As a consequence, the ICRP reduced its occupational exposure limit to 20 mSv per annum averaged over five years (within which up to 50 mSv in a single year was permitted). The NRC retained its limits, arguing that the principle of 'as low as reasonably achievable' (ALARA), which the ICRP had introduced in 1977 and under which good radiation protection practice aims to reduce doses as far as achievable, already resulted in occupational doses far below regulatory limits, obviating the need for revision. Australian occupational regulatory limits reflect the ICRP limits, with a limit of 1 mSv per annum for the general public.
The BEIR VII report (2006) defined 'low dose' as less than 100 mSv. Since the previous report in 1990, much new information had come to light reinforcing the earlier heightened assessment of the risks of cancer and leukaemia. The report stated that "there is a linear dose-response relationship between exposure to ionizing radiation and the development of solid cancers in humans. It is unlikely that there is a threshold below which cancers are not induced." It relied on updated data from the Hiroshima and Nagasaki survivors, medical exposure studies, and nuclear workers exposed at low doses and dose rates. Importantly, and contrary to previous assertions that most of the risk estimates were mere extrapolations from very high doses in atomic bomb survivors, more than 60 per cent of exposed survivors received a dose of less than 100 mSv, and 45 per cent less than 50 mSv, well within current cumulative occupational regulatory limits.
It can thus be seen that, throughout the 20th century, progressively decreasing radiation dose regulatory limits mirrored the increasing realisation of the hazards to human health of ionising radiation, even at low doses, to the point where we now recognise there is no safe threshold. Nuclear power was spawned during this era and became indelibly tainted by these realisations.
Nuclear power reactors & cancer
The radioactive burden of nuclear power is not merely from the operation of the power plants. There is an entire nuclear fuel cycle to consider. The potential health impacts of the nuclear fuel cycle concern not only the general public but also nuclear workers. The nuclear fuel cycle includes the mining and milling of uranium ore; fuel fabrication; production of energy in the nuclear reactor; storage or reprocessing of irradiated fuel; and the storage and disposal of radioactive wastes. The doses to which the public is exposed vary widely from one type of installation to another.
The nuclear reactor core containing the nuclear fuel rods, where heat is generated through nuclear fission, is highly radioactive but heavily shielded, so that virtually no ionising radiation reaches the surrounding region from the core itself. Every day, however, nuclear reactors routinely produce radioactive gases and liquids, which are largely captured and stored on-site until their activity decays to a level that permits their release into the environment within regulatory limits. Tritium is the largest of the nuclide emissions by activity from civilian reactors, apart from noble gases in some types of reactors. These effluents account for almost all radioactive emissions from nuclear power plants. The per capita dose to regional populations (within 50 km) surrounding nuclear power plants is 0.0001 mSv per annum (compared to around 2 mSv natural background dose), and up to 0.02 mSv for specific groups up to 1 km from a nuclear reactor. These are very small doses.
The carcinogenicity of ionising radiation is well established. BEIR VII assigns a risk factor of 5 per cent per Sv, or roughly a 1:25,000 chance of contracting cancer per mSv of dose. On this basis alone, the cancer risk from the documented exposure to ionising radiation from nuclear power stations to the regional general population is 1:250,000,000 per person per annum (or 1:1,250,000 for the specific groups within 1 km of the plant). This would equate to about one extra cancer per annum for the whole of the US if the regional population dose were hypothetically generalised to all citizens. Disturbingly, however, epidemiological studies are demonstrating much higher cancer rates in selected groups, with the specific causes yet to be determined.
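The arithmetic behind these figures is straightforward to reproduce. The sketch below applies the essay's rounded coefficient of roughly 1:25,000 per mSv (from BEIR VII's 5 per cent per sievert) to the doses quoted above; the US population figure of 300 million is my assumption for the period discussed, not a figure from the essay.

```python
# A minimal sketch of the LNT arithmetic behind the quoted risk figures.
# The 1:25,000-per-mSv coefficient is the essay's rounding of BEIR VII's
# ~5% per Sv; the US population figure is an assumption for the era.
risk_per_msv = 1 / 25_000          # lifetime cancer risk per mSv of dose

regional_dose = 0.0001             # mSv per annum, population within ~50 km
nearby_dose = 0.02                 # mSv per annum, groups within ~1 km

print(1 / (regional_dose * risk_per_msv))   # ~250,000,000 -> 1:250,000,000
print(1 / (nearby_dose * risk_per_msv))     # ~1,250,000   -> 1:1,250,000

us_population = 300_000_000        # assumed population figure
print(us_population * regional_dose * risk_per_msv)  # ~1.2 extra cancers/yr
```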
Do nuclear power plants cause cancer in local populations?
The role of civilian nuclear power in the induction of cancer, and specifically leukaemia, in the general public has been a major controversy over the last three decades and remains unresolved. Leukaemia is a malignancy of the blood-forming cells and is notable in the context of IR induction for appearing before solid cancers, with a latency of around four years (compared to more than ten years for solid cancers). Although there is little doubt that exposure to radiation increases the risk of developing leukaemia (BEIR VII 2006; Preston et al. 1994; United Nations Scientific Committee on the Effects of Atomic Radiation 2006; IARC 1999), there is disagreement as to whether the amount of exposure received by children living near nuclear sites is sufficient to increase risk.
The first epidemiological study to raise concern of a link was in Great Britain. It addressed an unexpected increase in cases of leukaemia in children aged under ten between 1954 and 1983 at Seascale, three kilometres from the reprocessing plant and other nuclear facilities at Sellafield. Published by the epidemiologist Martin Gardner in 1990, it suggested a connection between the increased incidence of leukaemia and Sellafield. Specifically, the preconceptional radiation exposures of the fathers of 46 leukaemia cases, born in west Cumbria and diagnosed there between 1950 and 1985, were compared with those of 564 controls. An association was found between the exposure and leukaemia (Gardner's hypothesis), but this was dominated by four case fathers with high exposure (>100 mSv). In 1993, a new report by the British Health and Safety Executive found the rate of childhood leukaemia in Seascale was 14 times the national average. Two further studies, examining leukaemia clusters at Dounreay and Aldermaston, could not correlate paternal exposure levels with leukaemia incidence at these nuclear sites. Furthermore, the increased incidence of leukaemia at Seascale was also occurring in children of unexposed fathers, while children born outside Seascale to Sellafield workers did not have an increased incidence of leukaemia. A further study in Canada also failed to demonstrate a link between childhood leukaemia and preconceptional paternal irradiation, or even ambient radiation.
Several studies since 1990 have found mixed results. A congressionally mandated study by the US National Cancer Institute examined the incidence of cancers, including leukaemia, in 107 counties with nuclear facilities within or adjacent to their boundaries, assessing incidence before and after commencement of operation over the period 1950-1984. Each county was compared to three similar 'control' counties. There were 52 commercial nuclear reactors and 10 Department of Energy facilities. The study found no evidence to suggest that the incidence of cancer or leukaemia was higher in the study counties than in the control counties. It did, however, acknowledge shortcomings in its methodology, including not accounting for the potential for 'at risk' populations to be smaller than the county study populations, thus potentially masking underlying increases. Many other studies did confirm increased rates of childhood leukaemia in proximity to nuclear power plants, but they could not confirm a correlation with radiation dose, and therefore the link remained uncertain.
A 2007 meta-analysis of 17 research studies involving 136 nuclear sites in the UK, France, USA, Spain, Japan, Germany and Canada, examining the incidence of and mortality from childhood cancer in relation to proximity to nuclear power plants, confirmed an increased incidence of leukaemia. The significance of this meta-analysis is that it not only stratified the distance from the nuclear plants, albeit in coarse terms, but also stratified the age groups of children, arguing that since the peak susceptibility to childhood leukaemia is under the age of ten, this group should be independently assessed; any broader age grouping could conceal an increase in incidence. In children up to nine years old, leukaemia death rates were 5 to 24 per cent higher, and leukaemia incidence rates 14 to 21 per cent higher.
The most recent of these studies, and also the most compelling, was sponsored by the German government in response to public pressure to examine the issue of childhood leukaemia and nuclear power reactors, and was commissioned by the Federal Office for Radiation Protection (BfS) in 2003. The KiKK case-control study examined all cancers near all 16 nuclear reactor locations in Germany between 1980 and 2003, including 1592 under-fives with cancer and 4735 controls, with 593 under-fives with leukaemia and 1766 controls. The main findings were a 61 per cent increase in all cancers and a 119 per cent increase in leukaemia among young children living within 5 km of German nuclear reactors. These increases were statistically significant and much larger than the cancer increases observed near nuclear facilities in other countries. The study is notable also for measuring the distance of each case from the nuclear reactor so that a distance-risk relationship could be computed, the first study of its kind, previous studies having either grouped all cases or coarsely stratified the distance data. It found not only that risk is greatest closest to the plants but that a small increased risk extends up to 70 km from a nuclear power plant. The authors' initial conclusions discounted the role of radiation in the development of leukaemia, on the grounds that the reported emissions were too low. However, an independent review panel appointed by the BfS criticised this conclusion, arguing that the dose and risk models assumed by the KiKK authors did not necessarily reflect the actual exposures and possible radiation risks, and thus warranted further research before being dismissed. In other words, the panel implied that exposure doses might be higher than are currently being measured.
There is now reasonably strong evidence of a link between proximity to nuclear power plants and childhood leukaemia, though no significant evidence for solid cancers in either children or adults. Clearly further research is warranted, particularly to elucidate the cause of the leukaemia excess. Policy makers need to factor this increasingly strong scientific evidence into their decision-making. Legislators considering introducing or expanding nuclear power should consider these health implications, and nuclear regulators need to revisit their assumptions and consider revising standards at existing nuclear plants.
Occupational risks in nuclear power
The complete nuclear fuel cycle poses health risks at every stage. Of course, hazards exist in every industry, particularly the fossil fuel and general mining industries, where deaths occur not infrequently. But most industries in developed countries have legislated requirements to minimise the risks to their workers, and most responsible industries have a zero-tolerance policy towards workplace deaths. In several Australian states, legislation places a liability on the employer to prevent workplace deaths, and an employer is guilty of a crime if there is demonstrable negligence or culpability in relation to a workplace death where appropriate workplace safety policies and their implementation are lacking. The burden of proof is quite onerous, reversing the legal principle of a presumption of innocence. The repercussions of inadequate workplace safety may not be apparent for decades, as asbestos-related deaths now bring to light the scandalous disregard for employee health and welfare in the asbestos mining industry, despite the medical evidence at the time clearly demonstrating a major health hazard to asbestos workers. The employer's obligation to prevent workplace-related deaths has no time limit, particularly as deaths may occur many years after employment has ceased. The nuclear industry is no exception to this principle. In many ways it may be considered a pre-eminent example of it, due to the established carcinogenicity of IR, the lack of a risk-free threshold, and the long latent period (several decades) before cancer appears. This underscores the importance of miners being given accurate and complete information by their employers concerning radiation-induced cancer risks.
Cancer is a common disease, accounting for 25 per cent of all mortality in the general population. There is therefore much statistical noise obscuring small relative increases in cancer mortality consequent to ionising radiation exposure. In fact, the size of the study population required grows steeply as the radiation dose falls (roughly with the inverse square of the dose, because the number of excess cancer cases is commensurately smaller). In other words, if we are to detect a small increase in cancer risk at low doses, we need very large study populations to achieve statistical significance. Furthermore, given the latency period for radiation-induced cancer, long follow-up periods are required. Occupational studies can therefore be difficult to perform and often have weak statistical power to prove a detriment. It is thus important to note that failure to establish statistical significance does not rule out the existence of a detriment; it may merely indicate that the sample size was not large enough or the follow-up not long enough.
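To illustrate how quickly the required study size grows, the sketch below uses a standard two-group sample size approximation (5 per cent two-sided significance, 80 per cent power), with the baseline 25 per cent cancer mortality and the 5 per cent per sievert excess used in this essay. The design parameters are my illustrative assumptions; the point is the scaling, under which a tenfold reduction in dose demands roughly a hundredfold larger cohort.

```python
def n_per_group(dose_sv, p0=0.25, excess_per_sv=0.05,
                z_alpha=1.96, z_beta=0.84):
    """Approximate cohort size per group needed to detect the excess
    cancer mortality predicted by LNT at a given dose, using a simple
    two-proportion formula (5% two-sided significance, 80% power).
    Illustrative only; real study designs are more sophisticated."""
    p1 = p0 + excess_per_sv * dose_sv        # exposed-group mortality (LNT)
    p_bar = (p0 + p1) / 2
    return (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar) / (p1 - p0) ** 2

for dose_msv in (1000, 100, 10):
    print(dose_msv, "mSv:", round(n_per_group(dose_msv / 1000)))
# ~1,250 per group at 1,000 mSv; ~120,000 at 100 mSv; ~12 million at
# 10 mSv - roughly a hundredfold per tenfold reduction in dose.
```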
Radiation risks to uranium miners
The link between uranium mining and lung cancer has long been established. Certain groups of underground miners in Europe were identified as having increased mortality from respiratory disease as early as the 16th century, though lung cancer was not recognised as the cause until the 19th century. The radioactive gas radon was identified as the causal agent in the 1950s. Studies of underground miners, especially those exposed to high concentrations of radon, have consistently demonstrated the development of lung cancer in both smokers and non-smokers. On this basis, the International Agency for Research on Cancer (IARC) classified radon as a carcinogen in 1988. In 2009, the ICRP stated that radon gas delivers twice the absorbed dose to humans as originally thought and is in the process of reassessing the permissible levels; in the meantime, previous dose estimates to miners need to be approximately doubled to accurately reflect the lung cancer hazard. The Biological Effects of Ionising Radiation VI report (1999) reviewed eleven cohort studies of 60,000 underground miners, with 2600 deaths from lung cancer, eight of the studies involving uranium miners in Europe, North America, Asia and Australia. These found the frequency of lung cancer increasing linearly with the cumulative amount of radon exposure. Smokers had the highest incidence of lung cancer, as would be expected, but the greatest relative increase in lung cancer was noted in non-smokers. The highest percentage increase in lung cancer was noted 5-14 years after exposure and in the youngest miners. Uranium miners are also directly exposed to gamma radiation, and the dose from this adds to that from radon. At the Olympic Dam underground uranium mine, the total dose per miner is approximately 6 mSv per annum, of which 2-4 mSv (allowing for the new ICRP dose coefficients) are due to radon and the balance to gamma radiation.
Most modern uranium mines have air extraction systems and monitor ambient radon concentrations to ensure levels remain low; current radon levels in underground uranium mines are only a fraction of those in the mines of 100 years ago. Miners are now given personal protective equipment (PPE), including masks to filter out radioactive particulate matter. Yet many underground miners find the masks extremely uncomfortable, especially in the hot underground environment they must contend with. It is estimated that up to 50 per cent of underground uranium miners in Australia do not use their masks, drastically increasing their risk of lung cancer while their actual radiation dose is underestimated (since recorded doses are calculated assuming PPE is used).
The Olympic Dam doses mentioned above are typical of modern mine practices. The average miner at Olympic Dam is in his 20s and stays on average five years at the site. A typical calculation using the linear no-threshold model and the latest BEIR VII radiation carcinogenesis risk figures indicates that miners at Olympic Dam have a 1:420 chance of contracting cancer, most likely lung cancer; note that the research shows the risk of developing lung cancer is greater for younger workers. These risks are not insubstantial. Radiation safety and risk principles can be quite complex, and it is debatable whether miners have the training to understand them, or are even informed of the risks in a comprehensive and accurate manner that allows them to make an informed decision about their work.
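A hedged reconstruction of this kind of calculation is sketched below. Using BEIR VII's headline population-average figure of roughly one cancer per 100 people per 100 mSv, five years at the Olympic Dam dose gives a risk of about 1:330, the same order of magnitude as the 1:420 quoted above (which presumably applies BEIR VII's age- and sex-specific coefficients rather than the population average assumed here).

```python
# Sketch of the LNT risk calculation for an Olympic Dam miner. Assumes
# BEIR VII's population-average figure of ~1 cancer per 100 persons per
# 100 mSv; the essay's 1:420 presumably reflects age- and sex-specific
# coefficients, so this reproduces only the order of magnitude.
annual_dose_msv = 6          # total dose per miner per year (essay's figure)
years_on_site = 5            # average tenure at the mine (essay's figure)
lar_per_100msv = 1 / 100     # assumed lifetime attributable risk per 100 mSv

cumulative_msv = annual_dose_msv * years_on_site     # 30 mSv
risk = (cumulative_msv / 100) * lar_per_100msv       # 0.003
print(f"about 1 in {round(1 / risk)}")               # about 1 in 333
```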
Radiation risks in the nuclear fuel cycle
Many studies of mortality, and in some instances cancer incidence, have been conducted over the last 20 years among nuclear industry workers (excluding miners), covering workers in Canada, Finland, France, India, Japan, Russia, Spain, the United Kingdom, and the United States. In general, the exposure in most of these studies was to external radiation (x-rays and gamma rays). Internal contamination (through inhalation, ingestion, skin absorption, or wounds) by tritium, plutonium, uranium, and other radionuclides occurred in some subgroups of workers, but attempts to reconstruct internal doses have been incomplete. Studies of nuclear industry workers are unique in that individual real-time monitoring of exposure with personal dosimeters has been occurring since the 1940s. More than one million workers have been employed in the industry since its beginning, yet studies of individual worker cohorts are limited in their ability to estimate precisely the potentially small risks associated with low levels of exposure. Risk estimates from these studies are variable, ranging from no detectable risk to risks an order of magnitude or more greater than those seen in atomic bomb survivors.
The 15-country study of nuclear industry workers (excluding mining) published in 2005, the largest study of nuclear industry workers ever conducted, arrived at statistically significant conclusions confirming the increased risk of cancer and leukaemia in nuclear industry workers, even at low doses. It analysed the dosimetric records of over 407,000 workers and correlated them with solid cancer and leukaemia mortality, with a total follow-up of 5.2 million person-years. The average cumulative dose was 19.4 mSv, with 90 per cent of workers receiving less than 50 mSv. Recall that these doses are within the current permissible limits (50 mSv in any one year, provided there is no more than 20 mSv per annum averaged over five years, i.e., 100 mSv in total). The results indicated an excess risk for solid cancers of 9.7 per cent per 100 mSv of exposure, and an excess risk for leukaemia of 19 per cent per 100 mSv. The risks were dose-related and consistent with the estimates from the atomic bomb studies. The authors estimated that 1-2 per cent of all nuclear worker deaths were probably radiation-related.
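The study's headline numbers hang together under the LNT model, as the sketch below shows: applying the excess relative risk of 9.7 per cent per 100 mSv to the average cumulative dose of 19.4 mSv yields an attributable fraction of just under 2 per cent of solid-cancer deaths, consistent with the essay's 1-2 per cent estimate. The attributable-fraction formula is a standard epidemiological identity, not something stated in the essay.

```python
# Consistency check on the 15-country study figures (a sketch).
err_per_100msv = 0.097       # excess relative risk for solid cancers
avg_dose_msv = 19.4          # average cumulative dose in the cohort

err = err_per_100msv * (avg_dose_msv / 100)   # ~0.019 excess relative risk
attributable = err / (1 + err)                # standard attributable fraction
print(f"{attributable:.1%} of solid-cancer deaths attributable")  # ~1.8%
```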
Nuclear reactor safety & threats to public health
The public health risks of nuclear reactor accidents are potentially catastrophic. Unlike virtually any other major industrial accident, the impact of a nuclear reactor core accident, and specifically an uncontained meltdown, can span multiple continents through the potential for contamination over vast distances. This can in turn lead to thousands of cancer deaths over the ensuing decades.
Whichever way one looks at nuclear reactors, they are enormous. Their magnitude, scale and complexity put them in an industrial category of their own. A typical nuclear plant sits under approximately four acres of roof alone, with the reactor core enclosed by masses of steel and concrete for protection from the deadly levels of radioactivity. A vast amount of electrical wiring snakes its way throughout the complex. Huge steam-carrying pipes and machinery the length of city blocks are dwarfed by the enormity of the structure. Few people occupy a nuclear plant because it mostly runs itself, with most of the human activity centred on the control room, from which engineers monitor and occasionally inspect systems inside the plant.
Visual inspection is impossible for the most critical and dangerous part of a plant: its core. Control room operators are more akin to pilots flying on instruments. Unable to visually inspect the critical components of the reactor to any substantive extent, they rely on interpretive analysis of the control room gauges to assess whether the reactor is functioning appropriately. If the readings are abnormal, their job is to analyse why, and then synthesise appropriate responses. It is one thing to read gauges; another to correctly analyse their meaning. Different people may interpret the data very differently - with catastrophic results. Skilled engineers with logical, linear thinking patterns can find themselves lacking the critical skills required when that linearity unravels in a crisis. Such was the fate of the Three Mile Island plant in Pennsylvania, which suffered a partial core meltdown in 1979: one malfunction led to another, and then to a series of others, until the core of the reactor itself began to melt, and even the world's most highly trained nuclear engineers did not know how to respond. The accident revealed serious deficiencies in a system that was meant to protect public health and safety.
'Probabilistic Risk Assessment' - lies, damn lies and statistics
'Probabilistic Risk/Safety Assessment' (PRA/PSA) is used to examine how the components of a complex system operate, and attempts to quantify risk and identify what could have the most impact on safety, particularly in the operation of nuclear reactors. Complex computerised modelling is used to assess various scenarios and combinations of events. PRA results are therefore complex and imprecise, giving rise to a spread of results rather than an exact measure of risk. The imprecision and uncertainty in the results arise partly because reality is more complex than any computer model, partly because modellers do not know everything, and partly because of chance. In essence, information on the most serious or catastrophic events is incomplete because they have not occurred often enough to provide statistically useful data. As a consequence, analysts must make estimates (which may often be little more than guesses) of the relevant probabilities, leading to large uncertainties. There therefore continue to be large uncertainties in core melt frequency and in off-site risks (risks arising from events external to the reactor, such as earthquakes). Yet the most catastrophic events can and do occur, even if infrequently, and these are the events that PRA is weakest at predicting, and weaker still at predicting their ultimate economic losses and health impacts on large populations. Additionally, human interactions are extremely important contributors to safety and reliability in nuclear plants. Modelling human behaviour is fraught, yet it can significantly affect the frequency or consequences of an accident sequence. PRAs assume rational actions by humans and cannot model irrational or malign activities, or a cascade of incorrect actions and responses. Most nuclear plant incidents and accidents are due to human error, including the Chernobyl disaster. As summarised by Edward Hagen: There is not now and never will be a "typical" or "average" human being whose performance and reactions to any operating condition, let alone an abnormal operating condition, can be catalogued, qualitatively defined, or quantitatively determined. There are no human robots.
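To make the idea concrete, here is a deliberately minimal Monte Carlo sketch of the PRA approach: a toy fault tree in which core damage requires an initiating event plus the failure of two redundant cooling trains, with each probability itself uncertain. Every parameter is invented for illustration; real PRAs model thousands of interacting failure modes, common-cause failures and human actions that this toy ignores.

```python
import random

# Toy PRA sketch: core damage = initiating event AND both redundant
# cooling trains failing on demand. Each probability is uncertain, so it
# is sampled from a lognormal distribution, and the spread of the
# resulting core damage frequency (CDF) is reported. All parameters are
# invented; common-cause failure and human error are deliberately ignored.
random.seed(1)

def sample_cdf():
    p_initiator = random.lognormvariate(-7, 1.0)   # events per reactor-year
    p_train_a = random.lognormvariate(-3, 0.8)     # failure on demand
    p_train_b = random.lognormvariate(-3, 0.8)
    return p_initiator * p_train_a * p_train_b

samples = sorted(sample_cdf() for _ in range(100_000))
median, p5, p95 = samples[50_000], samples[5_000], samples[95_000]
print(f"median CDF ~{median:.1e}/reactor-year, "
      f"90% band {p5:.1e} to {p95:.1e}")
# The output is a wide band spanning orders of magnitude, not a single
# number - the 'spread of results' referred to above.
```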
Finally, new reactor designs increasingly rely on computer software for their operation. The National Research Council notes that there remains an ongoing "controversy within the software engineering community as to whether an accurate failure probability can be assessed for software or even whether software fails randomly." This has led to inconsistent treatment of software failure modes in PRAs for nuclear plants.
Statistical modelling is able to predict the 'known unknowns', but the complexity of nuclear power reactors and the uncertainties inherent in their operation involve 'unknown unknowns', or what the author and professor of risk engineering at New York University, Nassim Nicholas Taleb, refers to as 'Black Swan events': high-impact, hard-to-predict and rare events. By definition, these are statistical outliers and are not captured by PRA models. In essence, statistical modelling marginalises or even excludes Black Swan events, often with catastrophic consequences.
In any case, most risk assessments are not really risk assessments but merely probability assessments, because actual accident consequences are not evaluated in most cases; they thus cover only half the risk assessment process. Furthermore, the risk assessments rest on several convenient but unrealistic assumptions. For example, the assessments assume nuclear plants always conform to safety requirements, yet each year more than a thousand violations are reported in the USA. Plants are assumed to have no design problems, even though hundreds are reported every year. Ageing of equipment is unrealistically assumed to result in no damage. Reactor pressure vessels are assumed to be fail-proof despite evidence of embrittlement. Risk assessments assume plant workers are far less likely to make mistakes than actual operating experience demonstrates. Finally, the majority of risk assessments are based on core damage and ignore the serious health hazards posed by spent fuel in cooling ponds igniting if water is lost, or by the rupture of a large tank filled with radioactive gases. Researchers at the Brookhaven National Laboratory have estimated that a spent fuel accident could release enough radioactive material to kill tens of thousands of people.
Of course, the irony is that the fundamental tenet underlying the rationale for using PRA is that nuclear reactors are too complex to guarantee absolute safety. It is an admission of the inherent risk in their operation; given this risk, mathematical modelling is employed to try to estimate it. The US Nuclear Regulatory Commission's basic job as mandated by the US Congress (mirrored by most nuclear regulatory organisations around the world) is to ensure only that the plants it licenses and regulates provide "adequate protection" to public health and safety, and that the operation of nuclear plants presents no "undue risk". There is no requirement for absolute protection because, clearly, by its own admission, the nuclear power industry cannot provide it. While one may argue that no industry offers absolute protection of public health, only the nuclear power industry threatens such potentially catastrophic consequences for the public in the case of a core meltdown with failure of containment. That such a level of protection should be expected of it is surely axiomatic.
Consequences of a nuclear accident
Most (nuclear industry) experts consider there to be a 1:10,000 chance of core damage per reactor-year, based on historical data. The MIT study, The Future of Nuclear Power (2003), stated of its global growth scenario, which leads to a tripling of the number of nuclear power reactors to 1200 worldwide: "With regard to implementation of the global growth scenario during the period 2005-2055, both the historical and the PRA data show an unacceptable accident frequency. The expected number of core damage accidents during the scenario with current technology would be 4 [using the PRA estimates]. We believe that the number of accidents expected during this period should be 1 or less, which would be comparable with the safety of the current world LWR [Light Water Reactor] fleet. A larger number poses potential significant public health risks and, as already noted, would destroy public confidence."
The US government has calculated that the lifetime core melt probability across all 104 US commercial reactors is 1 in 5. In 1982, the government's Sandia National Laboratories modelled the effects of a core meltdown and radioactive release at one of the Indian Point nuclear power plants north of New York City. The study estimated 50,000 near-term deaths from acute radiation and 14,000 long-term deaths from cancer. A later study (2004) estimated 44,000 near-term deaths and as many as 518,000 long-term cancer deaths within 50 miles of the plant. Estimates of economic losses indicate $50 to $100 billion in business losses, and as much as $300 billion in human death costs.
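Both of these headline numbers follow from the same Poisson arithmetic, sketched below. The assumptions about average fleet size over the scenario and remaining reactor lifetimes are mine, chosen to show that the quoted figures are reproducible from a core damage frequency of about 1:10,000 per reactor-year.

```python
from math import exp

cdf = 1e-4   # core damage frequency per reactor-year (the 1:10,000 figure)

# MIT global growth scenario, 2005-2055: assuming (my simplification) the
# fleet ramps linearly from ~400 to ~1200 reactors, i.e. ~800 on average.
reactor_years = 800 * 50
print(reactor_years * cdf)          # ~4 expected core damage accidents

# US fleet of 104 reactors: assuming ~20 years of remaining licensed life
# each, the chance of at least one core melt is
print(1 - exp(-104 * 20 * cdf))     # ~0.19, i.e. roughly the quoted 1 in 5
```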
The disaster at the Chernobyl Nuclear Power Plant in Ukraine in 1986 was the worst nuclear accident in history. On 26 April, reactor number four exploded, ironically during a safety test. The ensuing fire and core meltdown exposed the reactor core, resulting in a massive release of radioactive material into the atmosphere, which drifted over large parts of the western Soviet Union, Eastern Europe, Western Europe and Northern Europe. Large areas in Ukraine, Belarus and Russia had to be evacuated, with over 336,000 people resettled. Although no more than around 50 people were initially killed, the International Agency for Research on Cancer (IARC), part of the World Health Organisation, predicts that there will be up to 41,000 excess cancers as a consequence by 2065, of which 16,000 will be fatal.
Terrorism & nuclear power plants
In addition to accidents, a successful terrorist attack on the scale of those carried out on September 11, 2001 could also lead to a major release of radiation. The Nuclear Regulatory Commission (NRC) considers the likelihood of this kind of attack to be small. The NRC also considers nuclear power plants difficult targets, being low-lying structures with the reactor core presenting only a small target. But we should not forget that the probability of the World Trade Center towers collapsing from the impact of civilian aircraft was also considered small before they fell. More reactors mean more targets.
The Design Basis Threat (DBT) of all US nuclear reactors refers to the general characteristics of the adversaries that nuclear reactors and nuclear fuel cycle facilities are meant to defend against. It is a required defensive characteristic of the design, dating from the Cold War era, and no reactor has an aircraft impact as part of its DBT. The last reactor to come online in the USA did so in 1996; therefore no reactor is adequately defended against such a terrorist threat. It is disingenuous for the NRC to surmise, firstly, that the risk of such an event is low. The most that can be reliably stated is that the probability might be low, but we simply do not have the data to make anything more than educated guesses. Secondly, it is equally fallacious for the NRC to claim that the consequence of an aircraft impact is unlikely to be a breach of containment. For example, a sudden shutdown of a nuclear reactor ('scram') in the event of a terrorist attack does not necessarily guarantee that the reactor core will not continue to increase in temperature and melt, particularly if the impact has disabled the emergency cooling systems; if the containment structure has been breached, this could lead to a major release of radioactive contaminants into the atmosphere. Nor does the DBT consider the consequences of an impact on the spent fuel cooling ponds, which may ignite if cooling water is lost and disperse radioactivity into the atmosphere. As a result of the World Trade Center attacks, the DBT of US nuclear reactors was upgraded in 2007 to include various terrorist attacks. Controversially, the NRC did not include aircraft attacks, overruling internal staff who strongly advocated their inclusion. Instead, the NRC insisted, ambiguously, that only new reactors had to be able to withstand an aircraft attack. Had aircraft attacks been included in the upgraded DBT, all existing reactors would have been required to be retrofitted accordingly, which the NRC insisted was not required. All current US reactors are thus vulnerable to commercial aircraft terrorist attacks and will remain so for their operational life, owing to the nuclear regulator's opposition to safety upgrades.
Despite all the evidence to the contrary, the nuclear industry claims nuclear power is safe. If nuclear plants are as safe as their proponents claim, why do utilities need the US Price-Anderson Act, which guarantees utilities protection against 98 per cent of nuclear-accident liability and transfers these risks to the public? All US utilities refused to generate atomic power until the government established this liability limit, and they would refuse to continue without it. Why do utilities, but not tax-paying citizens, need this nuclear-liability protection?
Nuclear waste & public health The average nuclear power reactor produces 300 m³ of low- and intermediate-level waste and some 30 tonnes of high-level solid packed waste per year. Globally, some 12,000 tonnes of spent fuel (high-level waste) are produced every year, a figure that will triple if the so-called nuclear renaissance occurs.
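As a rough consistency check on these figures (a back-of-envelope sketch; the count of roughly 430 operable power reactors circa 2010 is an assumption, not a figure from this paper), the per-reactor output scales to approximately the quoted global total:

    # Back-of-envelope check: per-reactor spent fuel output vs the global figure.
    reactors_worldwide = 430          # assumed operable power reactors, c. 2010
    spent_fuel_per_reactor_t = 30     # tonnes of high-level waste per reactor-year
    global_annual_spent_fuel_t = reactors_worldwide * spent_fuel_per_reactor_t
    print(f"~{global_annual_spent_fuel_t:,} t/yr")  # ~12,900, close to the 12,000 quoted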
As of 2010, there were approximately 350,000 tonnes of nuclear-fuel-derived waste around the world. Currently this is stored as an interim measure, on-site in dry casks at most nuclear power plants or at reprocessing facilities such as La Hague (France). Greatly complicating its management are the very long half-lives of some of the radionuclides present: plutonium-239 (half-life 24,000 years), technetium-99 (212,000 years), caesium-135 (2.3 million years) and iodine-129 (15.7 million years).
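The arithmetic of radioactive decay makes clear why such timescales are unavoidable: the fraction of a radionuclide remaining after t years is (1/2)^(t/half-life). A minimal sketch, using the half-lives above:

    # Fraction of each radionuclide remaining after roughly a quarter of a million years.
    half_lives = {"Pu-239": 24_000, "Tc-99": 212_000,
                  "Cs-135": 2_300_000, "I-129": 15_700_000}
    t_years = 240_000  # ten half-lives of Pu-239
    for nuclide, half_life in half_lives.items():
        remaining = 0.5 ** (t_years / half_life)
        print(f"{nuclide}: {remaining:.1%} remaining")
    # Pu-239 decays to ~0.1%, but I-129 is still ~99% intact after 240,000 years.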
These are highly hazardous to humans and ultimately require isolation from the biosphere for hundreds of thousands to a million years. The aim is to prevent water from reacting with the waste, since this is the main mechanism by which the waste can re-enter the biosphere. The IAEA states that deep geological disposal, using a system of engineered and natural barriers to isolate the waste, is the best method. The principal feature of the geological repository concept is the placement of packaged waste in a stable formation several hundred metres below the surface, with engineered barriers around and/or between the waste packages and the surrounding rock. Yet no deep geological repository is currently in operation, despite the nuclear power industry having existed for over 50 years. With the projected tripling of nuclear power by 2050, a new repository would need to come online every six years somewhere in the world to keep pace, as the sketch below shows. No country currently plans to have a repository in operation before 2020, and all proposals have encountered problems.
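The 'one repository every six years' figure can be checked with simple arithmetic. A hedged sketch: assume each repository holds about 70,000 tonnes (the statutory capacity once set for the proposed Yucca Mountain facility; an assumption here, not a figure from this paper) and divide by the annual waste arising quoted earlier:

    # Sketch: how often a new repository would be needed, under an assumed capacity.
    repository_capacity_t = 70_000   # assumed; Yucca Mountain's statutory limit
    annual_spent_fuel_t = 12_000     # current global rate, quoted earlier
    print(f"One repository every {repository_capacity_t / annual_spent_fuel_t:.1f} years")
    # ~5.8 years at today's rate; a tripled rate would cut this to roughly two years.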
High-level waste (including spent fuel) accounts for only about 2 per cent of nuclear waste by volume but some 95 per cent of its radioactivity, and requires permanent storage in deep geological formations. Given the complexity of the problem and the time periods involved, the ability of a repository to retain radioactivity carries a significant degree of uncertainty. As with assessing the safety of a nuclear reactor, conceptual and statistical models are employed, and similar assumptions, usually based on insufficient or absent data, are made to simulate the behaviour of a repository over a span of time orders of magnitude beyond recorded human history. The process requires the designers of a repository to know what they don't know about chemical and geological processes at a given site over this time. As the US National Research Council summarised: 'Simply stated, a transport model is only as good as the conceptualizations of the properties and processes that govern radionuclide transport on which it is based. If the model does not properly account for the physical, hydrogeochemical, and when appropriate, biological processes and system properties that actually control radionuclide migration in both the near- and far-fields of the repository, then model-derived estimates of radionuclide transport are very likely to have very large -- even orders of magnitude -- systematic errors.'
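To see how orders-of-magnitude systematic errors can arise, consider the standard advection-with-sorption relation on which such transport models rest: a radionuclide retarded by sorption traverses a distance L in time t = L·R·θ/q, where q is the Darcy flux, θ the porosity and R the retardation factor. Since t is directly proportional to R, misjudging sorption by a factor of 100 shifts the predicted transit time by the same factor. A minimal sketch with purely hypothetical site parameters:

    # Transit time of a sorbing radionuclide: t = L * R * porosity / darcy_flux.
    def transit_time_years(distance_m, retardation, porosity, darcy_flux_m_per_yr):
        return distance_m * retardation * porosity / darcy_flux_m_per_yr

    L_m, porosity, flux = 500.0, 0.1, 0.05   # hypothetical site parameters
    print(transit_time_years(L_m, 1000, porosity, flux))  # assumed sorption: 1,000,000 yr
    print(transit_time_years(L_m, 10, porosity, flux))    # actual sorption:     10,000 yr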
Numerous examples demonstrate the failure of such analyses in current nuclear waste management. In July 2008, it was revealed that the German nuclear waste dump at Asse, a former salt mine designed to last several hundred years, had been leaking radioactive brine for two decades and threatened major groundwater contamination. When many of the sites of the US nuclear weapons complex were established, it was believed that their arid climates and thick unsaturated zones would protect groundwater for hundreds to thousands of years. These assumptions have been proven wrong: the estimated contaminant transit time at several of these sites has been revised down to mere decades, underscoring the invalid assumptions in the original modelling. Another example is the discovery of the mobility (leakage and contamination) of radionuclides below the high-level waste tanks at Hanford, Washington.
The timescales involved also exceed the rise and fall of many civilisations, together with their linguistic, cultural and artistic legacies, leaving gaps in our understanding of those civilisations, let alone their physical remains. Information transfer is a key factor: the management system matters more than the media used, and the greatest threat to information transfer is institutional change. External events such as climate change, natural disasters, wars and civilisational collapse could all affect the long-term management of radioactive wastes, but it is the more 'trivial' causes, such as the destruction of archives through paper decay or the disruption of electronic media, that are most likely to lead to problems. Committing to a large increase in the rate of waste generation based only on the potential plausibility of a future waste management option would be to repeat the central error of nuclear power's past. The concept of mined geological repositories dates back to at least 1957, but turning the idea into reality has proven difficult, and a solution to the waste problem remains elusive to this day. Even more disturbing is the prospect that our highly toxic waste will become a liability for future generations.
Conclusion The nuclear power industry is bedevilled by a military pedigree responsible for the worst weapons of mass destruction. The industry's marketing gurus would say it has a 'branding' problem. The industry would also distance itself from this pedigree, claiming an impeccable record of operational safety. The evidence contradicts these claims and underscores the uncertainty that shrouds its estimates of future safety. The 'branding' problem is ironic, since the creation of a nuclear power industry was itself an attempt to rebrand the nuclear weapons industry: to give it legitimacy and cultivate public support by emphasising the perverse dichotomy of preparing for nuclear war while promising the peaceful energy of the atom - a 'peace now, war later' scenario.
The public-relations-driven enthusiasm of politicians and the military presupposed the development and expansion of nuclear power, with safety an afterthought and little tolerance for public health concerns. The history of nuclear power is riven with conflicts of interest, understatement of risks, vilification of critics and masterful spin, the industry perennially adapting itself to solve the next environmental or energy problem lest it be accused of creating it.
So what is different now? We could facetiously, though no less credibly, argue: not much. The nuclear power industry has now put its hand up to solve the environmental problem du jour, climate change, with claims of cost-effectiveness and safety. The rhetoric is redolent of its 1970s mantra of saving us from fossil fuel pollution and establishing energy independence, voiced just before the Three Mile Island and Chernobyl accidents. Ultimately, the industry's disastrous economic incompetence was laid bare in the Forbes exposé: '[t]he [economic] failure of the US nuclear power program ranks as the largest managerial disaster in business history, a disaster on a monumental scale'.
Perhaps the most glaring concern is that the nuclear power industry developed with safety trailing a distant second. The science of radiation protection and of the health effects of ionising radiation was still evolving as civilian nuclear boosters and industry vested interests pushed for further expansion, the motto being 'electricity now, safety later'.
We now have voluminous evidence of the public health risks of low levels of ionising radiation, even within occupational regulatory limits. We also know that there is no 'safe' level of radiation exposure: no threshold below which exposure carries no risk of cancer. Although the measured doses to populations surrounding nuclear power plants are very low, we have strong evidence of a link between increased rates of childhood leukaemia and proximity to nuclear plants. Nuclear power reactors, moreover, operate within a nuclear fuel chain that commences with the mining of uranium and ends with the decommissioning of reactors, with occupational risks at every step. The long-established association between uranium mining and lung cancer, due to radon gas exposure, is unequivocal, and recent evidence suggests radon is twice as hazardous as first thought. There is also increasing evidence of solid cancers in nuclear industry workers throughout the fuel chain, at rates proportional to their radiation dose.
Statistical risk modelling of nuclear reactor safety has been found wanting, prone to so many uncertainties that estimates of likely accident frequencies vary by orders of magnitude. Add the potentially catastrophic consequences of a core meltdown with failure of containment, and the industry's assurances of excellence and safety ring hollow. Perhaps we should stop listening to those assurances and instead infer the industry's true beliefs about the likelihood of a major accident from its actions: utilities refuse to operate unless the liability for a major accident is transferred to tax-payers. Now who really needs protection?
Lastly, the ultimate public health and safety concern is the intergenerational legacy of hundreds of thousands of tonnes of toxic nuclear fuel waste that must be sequestered from the biosphere for hundreds of thousands of years, relying on questionable statistical modelling of deep geological repositories that have yet to be built. Four decades ago, Alvin Weinberg, then director of the US government's Oak Ridge National Laboratory, warned that nuclear waste required society to strike a Faustian bargain: in exchange for current military and energy benefits from atomic power, this generation sells the safety of future generations.
Peter Karamoskos MBBS, FRANZCR is a nuclear radiologist; the Treasurer of the Medical Association for the Prevention of War (MAPW); the Treasurer of the International Campaign to Abolish Nuclear Weapons (ICAN); and the public representative on the Radiation Health Committee of the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA). [This paper is for informational purposes and does not represent an endorsement by the institution.]
Suggested citation Karamoskos, Peter, 'Nuclear power & public health', Evatt Journal, Vol. 10, No. 1, December 2011, <https://evatt.org.au/nuclear-power-public-health>