The Fukushima Daiichi Nuclear Power Plant accident in 2011 released enormous amounts of radionuclides into the atmosphere, which were deposited over a large forested area in the Tohoku and Kanto regions. Few reports, however, have described the initial deposition of radionuclides on forest ecosystems during the main emission period. We investigated the initial radiocesium deposition at various forest sites. During the initial few months (until approximately the end of May, shortly after the accident), radiocesium deposition by bulk precipitation ranged from 4.4 to 42.1 kBq m-2, while that by throughfall ranged from 2.1 to 36.6 kBq m-2. The ratio of radiocesium deposition by throughfall to that by bulk precipitation (DTF/DBP) ranged from 0.13 to 0.66 during the first sampling period (until approximately the end of March, shortly after the accident). In the following sampling periods, the radiocesium input by bulk precipitation decreased rapidly and became undetectable, while the DTF/DBP ratios increased and generally exceeded 1.0, indicating that the forest canopies gradually released the entrapped radiocesium. Atmospheric radiocesium inputs to forest ecosystems were thus strongly influenced by canopy interception and temporary retention.
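The interpretation of the DTF/DBP ratio described above can be expressed as a simple classification. This is a minimal sketch; the function name and the example values are illustrative, not the study's site data:

```python
def canopy_behavior(d_tf_kbq_m2, d_bp_kbq_m2):
    """Classify the forest canopy's net effect in a sampling period from
    radiocesium deposition by throughfall (D_TF) and by bulk precipitation
    (D_BP), both in kBq m-2.

    A ratio below 1.0 indicates net canopy interception of radiocesium;
    a ratio above 1.0 indicates net release of previously entrapped
    radiocesium from the canopy.
    """
    ratio = d_tf_kbq_m2 / d_bp_kbq_m2
    if ratio < 1.0:
        return ratio, "interception"  # canopy retains part of the deposition
    return ratio, "release"           # canopy adds entrapped radiocesium

# Illustrative values only (within the ranges reported in the abstract):
r1, label1 = canopy_behavior(2.1, 4.4)   # first-period-like case, ratio < 1
r2, label2 = canopy_behavior(5.0, 4.0)   # later-period-like case, ratio > 1
```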
As a convenient, easy-to-use technique, time-domain reflectometry (TDR) has become widely used to measure soil water content. Beyond hydrological applications, TDR measurements also serve as ground truth for satellite remote sensing of soil moisture. However, TDR measurements usually exhibit diurnal fluctuations caused by diurnal temperature changes. Although this is a long-standing problem, no general solution exists. The purpose of this study is to develop an algorithm that removes temperature effects from TDR measurements by analyzing their relationship with meteorological variables. From data observed at a Mongolian site, we found that the apparent impact of soil temperature on measured soil water content is nearly proportional to both the soil temperature itself and the soil water content. An algorithm was developed and applied to the Mongolian data set; the temperature effects were effectively removed under both dry and wet conditions.
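The stated proportionality (temperature effect scaling with both soil temperature and water content) admits a first-order correction of the form θ_corr = θ_meas / (1 + k·(T − T_ref)). The sketch below assumes this form; the coefficient `k`, the reference temperature `t_ref`, and the function name are illustrative, not the study's fitted algorithm:

```python
def correct_tdr(theta_meas, soil_temp_c, k=0.002, t_ref_c=20.0):
    """Remove an apparent temperature effect from a TDR soil-water-content
    reading (m3/m3), under a first-order model in which the measurement
    bias is proportional to both soil temperature and water content:

        theta_meas = theta_true * (1 + k * (T - T_ref))

    k (per degC) and t_ref_c are hypothetical values for illustration;
    in practice they would be fitted from the site's meteorological data.
    """
    return theta_meas / (1.0 + k * (soil_temp_c - t_ref_c))

# Example: a reading of 0.204 m3/m3 taken at 30 degC corrects to 0.200 m3/m3.
theta = correct_tdr(0.204, 30.0)  # -> 0.2
```

At the reference temperature the correction is the identity, so diurnal fluctuation is removed only to the extent that it follows the assumed proportionality.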
Recent advances in remote-sensing technology have enabled estimation of surface solar radiation, an important input for land surface models (LSMs). This study investigates the impact of satellite-derived solar radiation on an LSM through sensitivity experiments performed with and without a satellite-derived solar radiation product known as “EXAM”. Using the LSM “SiBUC-SIMRIW”, land surface analyses over Japan were performed at 1-km resolution and compared with flux-tower observations. We demonstrate that using the EXAM solar radiation improves not only the net solar radiation analyses but also the analyses of net long-wave radiation, sensible heat flux, and latent heat flux at four ground observation sites. This suggests that the satellite-derived EXAM solar radiation improves all three main budgets of the land surface simulation: radiation, heat, and water. Because the findings demonstrate consistent improvements, SiBUC-SIMRIW-based land surface analyses can be expected to benefit from EXAM. The sensitivity experiments over Japan show that the change in solar radiation inputs substantially affects the simulated sensible and latent heat fluxes. A relatively large change in surface runoff is evident in heavy-snowfall regions in winter, which could be caused by a shift in the snowmelt period.