Introduction

Over the past few decades, scientists have investigated and debated the effects of exposure to low levels of ionizing radiation and the shape of the dose–response curve at these low doses [1–4]. While the effects of high doses are well known, those attributed to low doses present a much greater investigative challenge. Given that many individuals are frequently exposed to low levels of ionizing radiation at work, it has become important to characterize the relevant dose–response curve accurately in order to understand the health effects and risks of repeated exposure. Moreover, the nuclear accident that occurred in Fukushima, Japan in 2011 has led to renewed interest in the field [5]. At present, there are several hypotheses to explain the relationship between low doses of ionizing radiation and the risk of cancer. The first hypothesis assumes a linear response curve, implying that no dose of ionizing radiation, no matter how small, is completely safe and that the cumulative effect of numerous small exposures will eventually resemble that of a single large exposure. The second hypothesis maintains that the dose response follows a threshold-type curve, whereby low doses are actually safe and do not lead to any increased risk of cancer. Finally, the third hypothesis [6, 7] proposes radiation hormesis, which implies that ionizing radiation at low levels may have beneficial effects and promote DNA repair mechanisms [8]. The purpose of this review is to summarize the current state of knowledge in this field and to present important epidemiological studies that help elucidate the applicability and validity of these models.

Risk models associated with low-level radiation

Understanding the effects and risks associated with low-dose exposure has become a priority for our modern society. Diagnostic screening tests, the future of the nuclear industry, and frequent flier risks [9, 10] are commonly discussed issues in the media. The main problem with quantifying and accurately describing the effects of low levels of ionizing radiation is that very large epidemiological studies are required to describe the effects to a useful precision; that is, “to maintain statistical precision and power, the necessary sample size increases approximately as the inverse square of the dose. This relationship reflects a decline in the signal (radiation risk) to noise (natural background risk) ratio as dose decreases” [11], as shown clearly in Fig. 1.

Fig. 1 Size of the cohort required to detect a significant increase in cancer mortality as a function of the radiation dose [11]
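
The practical implication of this inverse-square scaling can be made concrete with a short calculation (a sketch of the quoted relationship only; the reference cohort size used here is a placeholder, not a value taken from Fig. 1 or [11]):

```python
def required_cohort_size(dose_sv: float, ref_dose_sv: float = 1.0,
                         ref_cohort: int = 500) -> float:
    """Scale a reference cohort size by the inverse square of the dose.

    Implements the quoted rule of thumb that the necessary sample size
    grows roughly as 1/dose^2; ref_dose_sv and ref_cohort are assumed
    placeholder values, not figures from [11].
    """
    return ref_cohort * (ref_dose_sv / dose_sv) ** 2

for dose in (1.0, 0.1, 0.01):  # 1 Sv, 100 mSv, 10 mSv
    print(f"{dose * 1000:6.0f} mSv -> ~{required_cohort_size(dose):,.0f} subjects")
```

Under these placeholder assumptions, halving the dose quadruples the required cohort, which is why studies at a few tens of millisieverts quickly require hundreds of thousands of subjects.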

A number of potential models have been proposed for the low-dose region (Fig. 2), including linear, linear–quadratic, supralinear, and hormesis (suggesting a threshold) models [12]. Survivors of the Hiroshima and Nagasaki atomic bombs are an important source of data against which these models can be tested [6, 7].

Fig. 2 Possible dose–response curves describing the excess risk of stochastic health effects at low doses of radiation [12]

Brenner et al. [11] make the case for a linear response function, implying that exposure to a lower dose simply affects fewer cells, which are still subject to the same type of cell damage. They concede that this argument rests on the assumption that the cells do not interact. The linear no-threshold (LNT) model for risk estimation has also been endorsed by the United Nations Scientific Committee on the Effects of Atomic Radiation [13] and by the International Commission on Radiological Protection [14]. The radiation-induced “bystander effect,” an important phenomenon that has been observed and reported in many experimental settings [15–18], refers to the behavior of nonirradiated cells that, as a result of receiving signals from irradiated cells, still display the effects of irradiation owing to the exchange of information via intercellular signaling pathways. This phenomenon may cause the response curve to diverge from linearity at low radiation doses [16–18]. Other effects observed at low doses are increased radioresistance and hypersensitivity, i.e., greater cell death at low doses than would be predicted by extrapolating survival from higher-dose (1–5 Gy) responses. The reason for the discrepancy is that, at very low acute doses, cells do not detect damage efficiently and thus repair mechanisms are not triggered [19]. However, as the dose increases, cells recognize damage more easily and activate repair mechanisms, which leads to radioresistance. As Vaiserman [20] notes, “cancer risk after ordinarily encountered radiation exposure (medical X-rays, natural background radiation, etc.) is much lower than projections based on the LNT model.”

Gilbert et al. [21, 22] conducted a thorough study on the mortality data of nuclear facility employees who were continuously exposed to low doses of ionizing radiation with an average dose of <50 mSv (primarily neutrons). They found “no evidence of a correlation” between radiation exposure and cancer and concluded that “estimates obtained through extrapolation from data at high doses have not seriously underestimated risks at low doses” [22]. Similarly, Schubauer-Berigan et al. [23] conducted a large study of radiation-exposed workers to examine the risk of leukemia. The findings suggested, however, that the quantitative leukemia risk estimates per unit of radiation dose “varied by birth cohort (and) by year of hire,” which eventually made the “identification of the key effect modifiers difficult.” Scott [24], meanwhile, indicated that, at approximately 1 mGy, there would be a “decrease in risk below the spontaneous level.”

In contrast, Bond et al. [25] offered a fresh perspective by carefully defining the concept of a dose (D) and considering the total energy in the irradiated system (ε) as:

$$ \varepsilon = mD \qquad (1) $$

where m is the total mass of the system (e.g., human tissue) that is being irradiated. Their analysis of data from Japanese atomic bomb survivors suggests that a minimum energy (ε₀) rather than a threshold dose is required for excess cancer occurrence; they calculated a threshold value of approximately 3 kJ as a minimum for cancers other than leukemia to occur [24].
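
Because 1 Gy corresponds to 1 J/kg, Eq. (1) is straightforward to evaluate; the following minimal sketch uses a hypothetical mass and dose purely to illustrate the units involved:

```python
def energy_imparted_joules(mass_kg: float, dose_gy: float) -> float:
    """Total energy imparted to the irradiated system, Eq. (1): epsilon = m * D.

    Since 1 Gy = 1 J/kg, multiplying the irradiated mass (kg) by the
    absorbed dose (Gy) gives the deposited energy in joules.
    """
    return mass_kg * dose_gy

# Hypothetical example: a 10 mGy whole-body dose to 70 kg of tissue
print(energy_imparted_joules(70.0, 0.010), "J")  # -> 0.7 J
```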

Land [26] indicated that the excess leukemia risk observed among survivors of the Hiroshima bomb was significantly greater than that observed among Nagasaki survivors exposed to similar estimated dose levels [27–29], which suggests that neutrons may be more effective than gamma rays in causing leukemia [30]. Land further observed that the linear–quadratic model is “preferable” to the linear model, explaining that the presence of a “densely ionizing part of all gamma ray tracks” means that the linear dose coefficient cannot be zero [26].
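
For reference, the linear–quadratic form referred to here is conventionally written as follows (the symbols are those in common use, not necessarily the notation of [26]):

$$ E(D) = \alpha D + \beta D^{2} $$

where E(D) is the excess risk at dose D, the linear coefficient α dominates at low doses, and the quadratic coefficient β dominates at high doses; Land’s argument is, in effect, that α cannot be zero.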

A competing model, the threshold-type response curve, has found support among other researchers [31, 32]. Kathren [32] argues that many factors, such as the latency period and dose rate, need to be accounted for when considering the human response to radiation and proposes a Gompertzian model to depict this response (Fig. 3). The Gompertzian model can be divided into two response functions: one that characterizes the deterministic effects and another that shows the stochastic effects.

Fig. 3 Stylized human radiation dose–response curve as depicted by Kathren [32]

Controversy regarding risk models

Numerous scientists support the view that exposure to low doses of ionizing radiation is actually beneficial, promoting DNA repair and antioxidative capacity and inducing apoptosis in transformed cells [33–37]. Figure 4 illustrates the hormetic relative risk (HRR) model proposed by Scott et al. [35]; the population average relative risk (RR) is plotted against the total absorbed dose (D).

Fig. 4 Schematic representation of the cancer RR as a function of the dose relationships according to the HRR model [35]

As explained in [35], stochastic thresholds for stimulating adaptive-response genes occur in the interval 0–D* (transition zone A). Stochastic thresholds for adaptive-response gene silencing occur in the interval from D** to D*** (transition zone B), and the linear zone is assumed to begin for doses over D***, at which point all gene adaptive responses have been silenced.
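
The zone structure described above can be summarized in a purely schematic sketch; the functional forms and the placeholder thresholds below are illustrative assumptions and are not the parameterization published by Scott et al. [35]:

```python
def schematic_hrr(dose_mgy: float, d1: float = 1.0, d2: float = 100.0,
                  d3: float = 200.0) -> float:
    """Schematic relative risk (RR) versus dose with an HRR-type shape.

    d1, d2, and d3 stand in for D*, D**, and D***; all numbers here are
    placeholders used only to illustrate the zone structure.
    """
    if dose_mgy <= d1:                    # transition zone A: adaptive responses switch on
        return 1.0 - 0.2 * dose_mgy / d1
    if dose_mgy <= d2:                    # hormetic zone: RR stays below 1
        return 0.8
    if dose_mgy <= d3:                    # transition zone B: adaptive responses are silenced
        return 0.8 + 0.2 * (dose_mgy - d2) / (d3 - d2)
    return 1.0 + 0.002 * (dose_mgy - d3)  # linear zone: LNT-like increase above D***

for d in (0.5, 50, 150, 500):
    print(d, round(schematic_hrr(d), 3))
```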

This model seems to be supported by various other experiments. Sakai et al. [33] investigated the effects of low levels of gamma radiation on mice treated with a carcinogenic agent or exposed to high doses of X-rays. The results (Table 1) show that prolonged gamma irradiation at approximately 1.2 mGy/h suppressed the tumorigenic process and increased the immune response of MRL-lpr/lpr mice (an autoimmune-prone strain carrying a defect in the apoptosis-regulating gene Fas that leads to autoimmune disease).

Table 1 Effects of low-dose-rate irradiation on the survival of MRL-lpr mice [33]

Additionally, an earlier experiment [38], in which mice were irradiated at various dose rates for 35 days with 137Cs gamma rays and then injected with methylcholanthrene (MC), concluded that suppression of tumorigenesis with low-dose irradiation occurred at an optimum dose rate of 1 mGy/h (Table 2).

Table 2 Tumor incidence after the injection of MC for mice irradiated under various conditions [38]

Evidence in support of hormesis was also presented by Rithidech et al. [39], who demonstrated gamma ray hormesis during low-dose neutron irradiation. Additionally, Scott and Di Palma [40] conducted a thorough investigation of the effects of natural background radiation and of the radiation used in medical diagnostic procedures and found that the “environmental radiation hormesis associated with radon” and “elevated background radiation” appear to be “preventing many cancer deaths.” Furthermore, they found that diagnostic medical procedures, such as chest X-rays and mammograms, also prevent cancer by “stimulating the removal of precancerous neoplastically transformed cells” and stated that protracted exposure to small X-ray doses has been used to successfully treat non-Hodgkin lymphoma and other cancers [40]. In another study [37], no increased risk of lung cancer from ionizing radiation at lung doses of <1 Gy was observed in exposed never-smokers, suggesting that gamma rays protect against the stochastic effects of alpha radiation and may enhance the apoptosis of chemically transformed cells.

Nevertheless, as Puskin notes in [41], it is unlikely that governmental agencies will shift their policies away from the LNT model unless it is clearly demonstrated that the model greatly overestimates the risk of low radiation doses. Additionally, the Biologic Effects of Ionizing Radiation (BEIR) VII report [42] from the US National Academies, one of the main reports directing radiation policy, concluded that the balance of experimental evidence favors a “simple proportionate relationship at low doses” between radiation dose and cancer risk.

In response to Puskin’s article [41], Cuttler [43] criticized the LNT model by arguing that it is now widely accepted that organisms possess DNA repair mechanisms that may be activated following low doses of radiation and that “although the LNT model is still widely accepted, it does not reflect reality.” The discrepancy between the LNT model and the experimental data was raised again by Tubiana et al. [44], who conducted a review of relevant radiation biology experiments and concluded that the LNT model for low doses of ionizing radiation lacks “scientific justification,” citing the Chernobyl incident as an example in which overestimating the risk may be more dangerous than underestimating it. An earlier report suggested that the lack of evidence of carcinogenic effects for doses below 100 mSv could be attributed to two possible explanations: either the carcinogenic effect is too small to be detected by statistical analysis or there is no effect and a practical threshold exists [45].

Epidemiology of leukemia in relation to low doses of ionizing radiation

Atomic bomb survivors

A number of studies [46–50], conducted in the 1950s, examined the incidence of leukemia among atomic bomb survivors. Preston et al. [49] later analyzed the data for a life span study cohort of 93,696 atomic bomb survivors for the period from 1950 to 1987. They concluded that there was “strong evidence” of radiation-induced risks for all subtypes of leukemia, except adult T cell leukemia. Within the study population, there was also evidence of an increased risk of lymphoma in males but not in females. Later studies of 30,000 children of atomic bomb survivors showed a lack of significant adverse genetic effects [42].

Residents close to a nuclear weapon facility

The risk of leukemia among residents of the Techa River banks, who were exposed to chronic low-dose-rate internal and external radiation from the Mayak nuclear weapons plutonium production facility in Russia, was studied using 83 cases with 47 years of follow-up and 415 controls [51]. Radiation was released both into the Techa River and into the air. The odds ratios per gray of total, external, and internal dose were 4.6 (95 % confidence interval (CI), 1.7–12.3), 7.2 (95 % CI, 1.7–30.0), and 5.4 (95 % CI, 1.1–27.2), respectively. The analysis confirmed an association between prolonged exposure to radiation and the risk of leukemia other than chronic lymphocytic leukemia (non-CLL).
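
Risk coefficients of this kind are typically obtained from logistic regression on individual dose estimates; the sketch below fits an ordinary (unmatched) logistic model to simulated data to show how an odds ratio per gray and its confidence interval arise, and none of the numbers relate to the Techa River study [51]:

```python
import numpy as np
import statsmodels.api as sm

# Simulated, purely hypothetical case-control data: dose in Gy, status 1 = case
rng = np.random.default_rng(0)
dose = rng.gamma(shape=2.0, scale=0.05, size=2000)   # mostly low doses
true_log_or_per_gy = 1.5                             # assumed for the simulation only
p_case = 1.0 / (1.0 + np.exp(-(-2.0 + true_log_or_per_gy * dose)))
status = rng.binomial(1, p_case)

# Fit logistic regression; exponentiating the dose coefficient gives the OR per Gy
fit = sm.Logit(status, sm.add_constant(dose)).fit(disp=0)
or_per_gy = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR per Gy: {or_per_gy:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

The studies cited here use conditional logistic regression with matched controls and individually reconstructed doses, which this unmatched sketch does not attempt to reproduce.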

The incidence of leukemia among these residents was further examined by Krestinina et al. [52], who reported on a cohort of 30,000 exposed individuals who had resided in riverside villages between 1950 and 1960, with follow-up from 1953 to 2005. The mean and median reported bone marrow doses were 0.3 and 0.2 Gy, respectively. A “significant” linear dependence on dose was reported for non-CLL.

Chernobyl cleanup workers

Romanenko et al. [53] conducted a case–control study on the incidence of leukemia among Chernobyl cleanup workers in Ukraine who were subject to fractionated exposure, primarily from gamma radiation. A conditional logistic regression analysis was used to estimate the leukemia risk, and the data were based on the diagnosis of 71 cases during the period from 1986 to 2000, with the mean recorded bone marrow dose reported as 76.4 mGy. A linear dose–response relationship for both CLL and non-CLL was also observed. In a similar study [54], 70 cases (40 leukemia, 20 non-Hodgkin lymphoma, and 10 other blood-related malignancies) were examined, and the median dose to the bone marrow was reported as 13 mGy. The reported excess RR figures were concluded to be “slightly higher but statistically compatible” with the data from atomic bomb victims and other studies conducted on low-dose radiation.

Nuclear power industry workers

Wilkinson and Dreyer [55] studied nuclear workers who were frequently exposed to low doses of ionizing radiation. Their analysis drew on 7 different studies, comprising 83 leukemia deaths, and detected clear evidence of “a modest excess of leukemia” from exposure to doses of <10 mSv.

Zablotska et al. [56] published a study that analyzed mortality after chronic exposure to low-dose ionizing radiation based on a cohort of 45,468 nuclear workers. The mean monitoring duration was 7.4 years, and the mean cumulative equivalent dose was 13.3 mSv. A “monotonic” increase in the RR by dose category was observed for leukemia (excluding CLL). In a similar study, Ritz et al. [57] analyzed the effects among 4,563 nuclear workers monitored for external radiation between 1950 and 1993. The results indicated (through external comparisons) a higher rate of death attributed to leukemia and (through internal comparisons) an increased death rate from hematopoietic and lymphopoietic cancers among workers exposed to more than 200 mSv. This increased mortality was suggested to result from protracted exposure to radiation levels that were acceptable under US government standards. A study by Guerin et al. [58] of 9,815 French nuclear industry workers monitored between 1967 and 2000 for exposure to ionizing radiation (X-rays and gamma rays) showed “no excess compared with the general population” when cancer sites were analyzed. The median cumulative dose for that study was 4.8 mSv; however, the study had several drawbacks, one of which was a “strong” healthy worker effect (HWE). A 15-country collaborative cohort study of 407,391 nuclear workers, conducted to better characterize the cancer risks, found that the risk estimate for leukemia excluding CLL was “not significantly different from zero” [59]. The HWE hypothesis was also used in that study to explain the decreased standardized mortality ratios (SMRs) observed with increased employment duration. However, Fornalski and Dobrzynski [8] reexamined data from the International Agency for Research on Cancer using conventional least squares tests and Bayesian statistical methods and showed that the SMRs are lower in the exposed cohort than in the nonexposed cohort. Further, they noted that a number of important studies showed a similar trend of “substantial decreases in cancer mortality” compared with the nonexposed reference group and consequently dismissed the HWE statistical bias as a plausible explanation for the results.

Cartwright [60] reported an excess of leukemia among radiologists who used early equipment (prior to 1920), both in the UK and the USA, and further noted that several UK nuclear sites had reported excess leukemia cases. However, another study showed a similar excess in areas that had been designated for a nuclear site but never built on, suggesting inconsistencies in these findings. Finally, Cartwright’s review of the effect of background radiation was inconclusive but consistent with other research [46, 61, 62].

Daniels and Schubauer-Berigan [63] conducted a case–control investigation of 23 studies on the relationship between prolonged exposure to ionizing radiation and leukemia. The primary exposure evaluated was the external radiation dose to the bone marrow from high-energy photons, lower-energy photons, neutrons, and tritium. The results of a conditional logistic regression suggested that the “risks among nuclear workers are comparable to those observed in high-dose populations” [63].

Medical and radiological workers

The cancer risk for 27,011 medical diagnostic X-ray workers in China between 1950 and 1995 was examined using observed-to-expected (O/E) comparisons [64]. The study noted that a “significant” cancer risk may be induced by long-term fractionated exposure to low doses of ionizing radiation “when the cumulative dose reaches a certain level.”
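
An observed-to-expected comparison of this sort reduces to a standardized ratio of observed cases to the number expected from reference rates; the sketch below uses hypothetical counts and a standard log-normal approximation for the confidence interval, not figures from [64]:

```python
from math import exp, log, sqrt

def o_to_e_ratio(observed: int, expected: float) -> tuple[float, float, float]:
    """Observed/expected (O/E) ratio with an approximate 95% confidence interval.

    Uses the common log-normal approximation SE[log(O/E)] ~= 1/sqrt(O);
    the counts passed in below are hypothetical, not data from [64].
    """
    ratio = observed / expected
    se_log = 1.0 / sqrt(observed)
    return ratio, exp(log(ratio) - 1.96 * se_log), exp(log(ratio) + 1.96 * se_log)

print(o_to_e_ratio(observed=120, expected=100.0))  # ~ (1.20, 1.00, 1.43)
```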

Scott et al. [65] attempted to characterize the impact of diagnostic computed tomography (CT) scans on the risk of cancer in the US population; the study concluded that CT scans “may reduce rather than increase the lifetime cancer risk” and that sporadic exposure to diagnostic X-rays may reduce the future risk of tumorigenesis among irradiated adults.

Muirhead et al. [66] examined a cohort of 174,541 workers following occupational radiation exposure. The study was primarily focused on doses from X-rays and gamma rays, and the mean lifetime dose was 24.9 mSv. They noted that “raised risks” of leukemia (excluding CLL) had been observed among Japanese atomic bomb survivors, radiotherapy patients, and large groups of radiation workers. The subtype of leukemia showing the strongest association with radiation was chronic myelogenous leukemia.

Risk of childhood cancer

Wakeford and Little [67] attempted to characterize the risk of childhood cancer per unit dose of radiation received in utero, using data derived from the largest case–control study of obstetric X-ray examinations. The authors concluded their study with a “cause and effect” interpretation of the association between childhood cancer and diagnostic X-ray exposure of the fetus. Additionally, they determined that the risk of childhood cancer from acute intrauterine doses of approximately 10 mSv is not zero.

Busby [68] conducted an exhaustive investigation of infant leukemia in the UK, Germany, Greece, and Belarus, where the doses absorbed by the fetuses were 0.02, 0.06, 0.2, and 2 mSv, respectively. Following a study on leukemia in the combined population of 15,466,854 between 1980 and 1990, Busby reported an excess RR of 1.43 and argued in favor of a “biphasic model” due to the “induced repair efficiency.”

A case–control investigation by Davis et al. [69] of the risk of leukemia in children following exposure to radionuclides from the Chernobyl power plant explosion found a “significant increase in leukemia risk with increasing radiation dose to the bone marrow.” The median estimated radiation dose of the participants was <10 mGy. Kaatsch et al. [70] released a report on the incidence of leukemia among German children under 5 years of age living in the inner 5-km zone around nuclear plants; they reported a significant increase in leukemia. However, a review of the study by Zolzer [71] emphasized that the measured radiation doses were insufficient to account for the observed leukemia cases and further explained that similar increases had been observed near “planning sites,” where no nuclear facility had ever been built. Moreover, Laurier et al. [72] indicated that “no similar excesses have been observed in studies from other countries.”

Finally, the BEIR VII report [42] suggested a correlation between the incidence of leukemia and gender, age at exposure, and time since exposure.

Summary

Several models have been suggested to explain the relationship between exposure to low doses of ionizing radiation (<100 mSv) and the risk of leukemia. These models include the LNT model, threshold response model, and hormetic response model. In this review, we have outlined these models and presented the arguments and observations supporting each model. We have also provided a general overview of the critical studies on the epidemiology of leukemia with respect to exposure to low doses of ionizing radiation. A further understanding of the health effects of low doses of ionizing radiation is important for determining government policies concerning the use of radiation for human health, as well as for emergency response policies for postnuclear events.