The causes of the different optically stimulated luminescence (OSL) sensitivities of quartz aliquots from different origins were investigated in terms of radioluminescence (RL) during artificial irradiation. All RL spectra of red-thermoluminescence (RTL) quartz grains consist of two broad emission peaks, assignable to a violet region (V-RL, 400 nm) and a red one (R-RL, 630 nm). The OSL sensitivities were affected by the total V-RL intensity accumulated during irradiation to a fixed total dose (20 Gy) at different dose rates. Additionally, the bleaching effect of RL at wavelengths shorter than the OSL stimulation light (470 nm) was confirmed in a separate experiment combining quartz slices with an optical filter. In conclusion, it is suggested that the V-RL emissions appreciably affect the residual or naturally accumulated doses when the OSL/SAR protocol is applied.
The effects of plant growth stage on the bioavailability of Cs and Sr in rhizosphere soil were studied in soybean pot experiments. Soybean seeds were sown in 12 pots and the plants were grown in a greenhouse for 84 d; three pots were kept unplanted. The concentrations of Mg, K, Ca, Sr and Cs in the plants and in the soil solutions were measured at different growth stages. The mass flow of each element from the soil solution to the root surface was calculated from its concentration in the soil solution and the daily transpiration of the soybean plants. The concentrations of the elements in the soil solution decreased as the soybean plants grew, and the decreases in Mg, K, Ca and Sr were larger in the planted pots. The differences in Mg, K, Ca and Sr concentrations between the planted and unplanted pots indicated that active uptake of these elements by the soybean plants caused the drop in their concentrations. In contrast, no obvious difference in Cs concentration was seen between the planted and unplanted pots. The ratio of mass flow to actual uptake of Cs was 1.4 during the vegetative growth stage but increased to 4.2 during the podding stage. This means that the Cs mass flow exceeded what the plants absorbed, so Cs uptake was inhibited near the roots during the podding stage. It is assumed that increased Cs sorption, caused by the decrease in the K concentration in the soil solution, reduced the Cs bioavailability in the rhizosphere soil. The bioavailability of Cs and Sr in the rhizosphere was also examined in a small-scale pot experiment, with the soil-soil solution distribution coefficients (Kd) of Cs and Sr used as an index of their sorption level. The Kd of Cs increased in the rhizosphere soil after cultivation, and a decrease in the bioavailable fraction of soil Cs was also observed; the exchangeable Cs in the rhizosphere soil clearly decreased. On the other hand, no specific rhizosphere effect was observed for Sr bioavailability.
These results showed that plant growth can decrease Cs bioavailability in agricultural soil through enhanced Cs sorption in the rhizosphere, whereas Sr bioavailability was not changed by plant growth.
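The mass-flow calculation described above (concentration in the soil solution times daily transpiration, compared against actual uptake) can be sketched as follows. This is a minimal illustration, not the authors' actual code; all function names, parameter names, and example numbers are assumptions.

```python
def mass_flow(conc_soil_solution, daily_transpiration, days):
    """Cumulative mass flow (e.g. mg) of an element delivered to the root
    surface: its concentration in the soil solution (mg/L) times the daily
    transpiration of the plant (L/d) times the number of days.
    (Illustrative sketch; units and names are assumptions.)"""
    return conc_soil_solution * daily_transpiration * days

def mass_flow_ratio(flow, actual_uptake):
    """Ratio of mass flow to actual plant uptake; a value above 1 means
    the soil solution supplies more of the element than the plant absorbs."""
    return flow / actual_uptake
```

A ratio well above 1, such as the 4.2 reported for Cs in the podding stage, indicates that supply by mass flow exceeded uptake severalfold, consistent with inhibited Cs uptake near the roots.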
A field study was conducted to investigate the absorption of various elements by oats and carrots cultivated in brown forest soil after three years of application of chemical fertilizer and two types of sewage sludge compost mixed with sawdust (SD compost) or rice husk (RH compost). The results obtained in this study are summarized as follows. 1) The application of SD compost significantly increased the concentrations of Mn, Zn, Ag and Ba in oat roots; of Zn and Br in oat shoots; of Cl and Zn in oat ears; of Mg, Sc, Mn, Zn, Br, Ba and La in carrot peel; of Mn, Fe, Co and Zn in the carrot edible portion; and of Na, Sc, Mn, Fe, Co and Sm in carrot shoots. 2) The application of RH compost increased the concentrations of Mn, Zn and Ag in oat roots; of K, Cr, Mn, Zn and Br in oat shoots; of Zn and Br in oat ears; of Mg, Mn and Br in carrot peel; of Cl, Mn, Zn and Br in the carrot edible portion; and of Na, Mn, Zn, Br and Sm in carrot shoots.
In order to supervise the internal exposure of radiation workers more reasonably, a calculation program was originally developed that rigorously estimates the intake of radioactive substances from their air concentration and thereby appropriately evaluates the extent of internal exposure. The advanced evaluation performed by the program is characterized by two features: a derivative function that reflects the gradual decrease in the concentration of radioactive substances due to the ventilation capacity of the laboratory concerned, and careful consideration of the differences between the times at which a radiation worker enters (T1) and leaves the laboratory and the times at which air sampling is initiated (TCA) and terminated. The intake evaluated by the previous method is underestimated when TCA is later than T1 and overestimated when TCA is earlier than T1, and the error is amplified further as the interval between TCA and T1 grows. In both cases, it was found that the difference between the intakes evaluated by the previous and current methods becomes much larger as the effective displacement volume grows or the volume of the laboratory concerned becomes smaller.
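As a hedged illustration of the kind of calculation described (not the actual program), the ventilation-driven decrease in air concentration can be modeled as exponential decay with rate constant k = Q/V, where Q is the effective displacement volume rate and V the laboratory volume, and the intake obtained by integrating the concentration over the worker's occupancy interval. All symbols, signatures, and numbers here are assumptions for illustration.

```python
import math

def intake(c0, Q, V, breathing_rate, t_in, t_out):
    """Intake (Bq) of a worker breathing air whose activity concentration
    decays as C(t) = c0 * exp(-k*t), with k = Q/V (effective displacement
    volume rate over lab volume). t_in and t_out are the worker's entry
    and exit times measured from the start of the concentration decrease.
    Intake = breathing_rate * integral of C(t) from t_in to t_out
           = breathing_rate * c0 * (exp(-k*t_in) - exp(-k*t_out)) / k.
    (Illustrative sketch; all names and units are assumptions.)"""
    k = Q / V
    return breathing_rate * c0 * (math.exp(-k * t_in) - math.exp(-k * t_out)) / k
```

In this model, shifting the worker's entry time t_in relative to the sampling start changes the estimated intake, and the sensitivity grows with k = Q/V, mirroring the abstract's observation that the discrepancy between methods grows with the effective displacement volume and with smaller laboratory volume.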