Chromosome abnormalities induced in peripheral lymphocytes are excellent biological indicators of radiation exposure. Estimation of the absorbed dose from the chromosome aberration yield is considered the most dependable means of biological dosimetry. This brief review discusses several aspects of the use of radiation-induced chromosome abnormalities for assessing the biological effects of radiation: the types of chromosome abnormalities induced by radiation; the yields of chromosome abnormalities in relation to dose, radiation quality and dose rate; dosimetry shortly after exposure and many years later; and the capacity of chromosome abnormalities to serve as indicators of radiation effects at low doses. The abnormalities used for dose estimation are chromosome-type aberrations such as dicentrics, rings and others; among these, dicentrics and rings are regarded as the most suitable because they are easily detected and highly dose-dependent. On the basis of the yield of dicentrics and rings, an absorbed dose as low as about 10 rad of 60Co gamma rays can be estimated. Even many years after exposure, when no other indicators remain, chromosome abnormalities can still serve as fairly reliable indicators.
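Dose estimation from the dicentric yield is conventionally done by fitting a linear-quadratic dose-response curve and inverting it. The following sketch illustrates that inversion; the coefficients are hypothetical placeholders, not values from this review, and in practice they must be calibrated against an in vitro dose-response curve for the radiation quality in question.

```python
import math

# Illustrative linear-quadratic dose response for dicentric yield:
#   Y = c + alpha*D + beta*D^2
# where Y is dicentrics per cell and D is absorbed dose in Gy.
# All three coefficients below are assumed values for illustration only.
C = 0.001      # background yield (dicentrics per cell)
ALPHA = 0.03   # linear coefficient, Gy^-1
BETA = 0.06    # quadratic coefficient, Gy^-2

def estimate_dose(dicentrics: int, cells_scored: int) -> float:
    """Invert the quadratic dose response to estimate absorbed dose (Gy)."""
    y = dicentrics / cells_scored
    # Solve beta*D^2 + alpha*D + (C - y) = 0 for the positive root.
    disc = ALPHA ** 2 - 4.0 * BETA * (C - y)
    if disc < 0:
        raise ValueError("observed yield is below the background model")
    return (-ALPHA + math.sqrt(disc)) / (2.0 * BETA)

# e.g. 30 dicentrics observed in 500 scored metaphases
print(round(estimate_dose(30, 500), 2))
```

A real assay would also attach confidence limits from the Poisson statistics of the aberration count, which this sketch omits.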
The characteristic patterns of leakage photon energy spectra have been measured around research reactors and compared. Although the reactors were of several different types, it was difficult to find significant differences among their leakage photon spectra. The spectra obtained around these reactors did, however, differ from those in the natural environment. In the same spirit as the recommendations of the International Commission on Radiological Protection, it is recommended that dose data be accompanied by additional spectral information whenever the spectrum differs from that of the natural environment. Conceptual systems using a NaI true-spectrum method for environmental surveillance are presented.
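The core idea behind a "true spectrum" method is response-matrix unfolding: the measured NaI pulse-height spectrum m is related to the true photon spectrum t by m = R t, where R is the detector response matrix, and t is recovered by solving that system. The toy numbers below (a 3-bin response matrix and an assumed true spectrum) are purely illustrative, not data from this work.

```python
import numpy as np

# Hypothetical 3-bin detector response matrix: column j gives the
# pulse-height distribution produced by photons in true-energy bin j.
R = np.array([[0.80, 0.10, 0.00],
              [0.15, 0.70, 0.20],
              [0.05, 0.20, 0.80]])

true_spectrum = np.array([100.0, 50.0, 20.0])  # assumed photons per bin
measured = R @ true_spectrum                   # simulated pulse-height data

# Unfold by solving the linear system; least squares is used so the
# same code works for an overdetermined (more channels than bins) case.
unfolded, *_ = np.linalg.lstsq(R, measured, rcond=None)
print(np.round(unfolded, 1))  # recovers the assumed true spectrum
```

Practical unfolding uses many more channels, regularization, and non-negativity constraints, since measured spectra are noisy and R is ill-conditioned.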
Methods of evaluating the factors that relate the concentrations of uranium, radium, thorium and potassium to the exposure rate at ground level are chronologically reviewed from the beginning of this century up to the present. The time dependence of the values of these factors is illustrated in a single figure. The values estimated before 1956 are extremely low compared with recent ones; each value, however, has recently converged toward a constant. As a result of this chronological survey of the conversion factors, the following points are presented as future problems: (1) benchmark calculations using various computational methods are required, (2) direct estimates of the factors by experimental measurements are necessary, and (3) the factors should be evaluated taking the effect of radon emanation into account.
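Conversion factors of this kind are applied by multiplying each radionuclide's soil concentration by its factor and summing the contributions. The sketch below uses the widely quoted UNSCEAR-style coefficients for absorbed dose rate in air 1 m above ground (nGy/h per Bq/kg); treat both the coefficients and the soil concentrations as illustrative rather than values evaluated in this review.

```python
# Dose-rate conversion factors (nGy/h per Bq/kg of dry soil),
# widely quoted UNSCEAR-style values, used here for illustration.
FACTORS = {
    "U-238 series": 0.462,
    "Th-232 series": 0.604,
    "K-40": 0.0417,
}

def dose_rate(concentrations: dict[str, float]) -> float:
    """Absorbed dose rate in air (nGy/h) from soil activity (Bq/kg)."""
    return sum(FACTORS[name] * c for name, c in concentrations.items())

# Typical-order soil concentrations, assumed for illustration.
soil = {"U-238 series": 35.0, "Th-232 series": 30.0, "K-40": 400.0}
print(round(dose_rate(soil), 1))
```

The review's third point, radon emanation, matters here because escape of 222Rn from the soil breaks the secular equilibrium that such uranium-series factors implicitly assume.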