Chlorophyll a fluorescence (Chl a F) has recently attracted attention in the field of remote sensing. In this review, I describe the principles of Chl a F and its applications to imaging and remote sensing. The paper is organized as follows: (1) Introduction; (2) Behavior analysis and quantification of biological substances using fluorescent probes; (3) Roles of photosynthetic pigments and Chl a F in the initial photochemical reaction; (4) Imaging of Chl a F induction and parameters; (5) Relation between steady-state Chl a F and Chl content; (6) Chl a F measurements using Fraunhofer lines; and (7) Problems with and future prospects for SIF (solar-induced Chl a F) remote sensing.
We have developed two types of laser-induced fluorescence lidar: an LIFS (laser-induced fluorescence spectrum) lidar and an LIFL (laser-induced fluorescence lifetime) lidar. The LIFS lidar consisted of a pulsed 355-nm laser for excitation of plant leaf pigments, a spectrometer covering the entire range of plant fluorescence from 300 nm to 800 nm, and a gated intensified CCD array for detection. Synchronous detection was applied so that the weak fluorescence could be detected even in daytime. The LIFL lidar was designed to measure the fast lifetime of chlorophyll a fluorescence, on the order of a nanosecond or less. A 40-picosecond 532-nm laser was prepared for short-pulse excitation, and a detection system was specially designed to accommodate the fast lifetime measurement, using MCP-photomultiplier tubes and other fast-rise-time equipment. The rise time of the LIFL lidar was estimated to be 196 ps. For data analysis, a convolution-integration technique was applied to the lidar observation data. Performance tests of both systems were carried out on trees growing naturally outdoors. LIFS lidar observation of zelkova tree leaves 20 m from the lidar showed changes in the fluorescence spectra depending on the growth/senescence period, which yielded growth information about the production, accumulation, and breakdown of pigment molecules inside the leaves. LIFL lidar observation showed that the chlorophyll a fluorescence lifetime of plane tree leaves 30 m from the lidar, at a sunlit position, decreased gradually from morning to noon, reached its minimum at 13:00, and increased in the evening. This suggests that diurnal variation in photosynthetic activity can be monitored. These results will offer a new technique and novel data for vegetation remote sensing.
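The convolution-integration analysis described above can be illustrated with a minimal sketch: the measured decay is modeled as the instrument response function (IRF) convolved with a mono-exponential decay exp(-t/tau), and the lifetime tau is found by least-squares fitting. All numerical values (time step, IRF width, grid range) are illustrative assumptions, not the parameters of the actual LIFL lidar.

```python
import numpy as np

def convolve_decay(irf, tau, dt):
    """Model the observed signal as the IRF convolved with a
    mono-exponential fluorescence decay exp(-t/tau)."""
    t = np.arange(len(irf)) * dt
    decay = np.exp(-t / tau)
    model = np.convolve(irf, decay)[: len(irf)]
    return model / model.max()

def fit_lifetime(signal, irf, dt, taus):
    """Grid-search the lifetime tau whose convolved model best
    reproduces the measured signal (least-squares residual)."""
    signal = signal / signal.max()
    residuals = [np.sum((convolve_decay(irf, tau, dt) - signal) ** 2)
                 for tau in taus]
    return taus[int(np.argmin(residuals))]

# Synthetic example: Gaussian IRF of roughly 196 ps FWHM, and a
# hypothetical true lifetime of 0.5 ns.
dt = 0.01                                        # ns per sample
t = np.arange(0.0, 5.0, dt)
irf = np.exp(-0.5 * ((t - 0.5) / 0.083) ** 2)    # sigma ~ FWHM/2.355
true_tau = 0.5
measured = convolve_decay(irf, true_tau, dt)

taus = np.arange(0.1, 1.5, 0.01)
tau_hat = fit_lifetime(measured, irf, dt, taus)
print(round(tau_hat, 2))                         # recovers ~0.5 ns
```

A real analysis would fit noisy data with a continuous optimizer rather than a grid, but the structure, forward convolution with the IRF followed by residual minimization, is the same.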
The concept of greenhouse environmental control in which environmental factors are adjusted to the crop's physiological status, called the "speaking plant approach" (SPA), has attracted a great deal of attention. The first and most important step in the SPA concept is to obtain physiological information from a living plant and then to judge whether the plant is healthy. Chlorophyll fluorescence (CF) imaging provides information on photosynthetic performance without destruction of, or contact with, the living plant. A robotized CF imaging system that evaluates daily changes in the photosynthetic function of a tomato canopy was developed in our previous studies and came onto the market in 2015. In this report, several use cases of the CF imaging robot in greenhouses are introduced.
Water depth information is extremely important for safe marine navigation and offshore security. Currently, it is possible to acquire bathymetric information effectively with multi-beam echo sounder measurements and airborne laser bathymetry (ALB) scanners. However, Japan is an insular country surrounded by open ocean, and a low-cost technique is needed for the spatial acquisition of shallow-water depths. Experiments with satellite-derived bathymetry using multispectral satellites have been underway since the 1970s; their disadvantage is that they always require a field survey. By contrast, the inversion method using hyperspectral data can simultaneously estimate bottom reflectance, water optical properties, and depths without a field survey. This study aims to research and develop a spatially derived water-depth method using hyperspectral data. Using airborne hyperspectral data and field survey data from Yamada Bay, Iwate Prefecture, we investigate adding a bottom-index algorithm for benthic-type classification to the inversion processing. We then apply the method to airborne hyperspectral data from Aka-jima Island, Okinawa Prefecture, and demonstrate its versatility under different benthic conditions and optical properties. A comparison with field survey data in Yamada Bay revealed a tendency for estimated depths to be deeper than field survey values, causing a slightly higher RMSE of 3.3 m. On the other hand, at Aka-jima Island, the range of estimated depths was in good agreement with field survey data, with a reduced RMSE of 1.3 m. Comparing estimated bathymetry maps with ALB data for Yamada Bay and with an M-7000 digital bathymetric chart for Aka-jima Island, we found that trends matched except in areas affected by macroalgae and shadows of rocky reefs in Yamada Bay, or by cloud shadows at Aka-jima Island.
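The inversion idea can be sketched with a strongly simplified shallow-water reflectance model: observed reflectance is a mixture of the optically deep-water signal and the bottom signal, attenuated exponentially with depth, and depth is recovered by minimizing the spectral misfit. The model form, band values, and attenuation coefficients below are illustrative assumptions; the actual method in the study also retrieves bottom reflectance and water optical properties.

```python
import numpy as np

def shallow_reflectance(r_deep, r_bottom, kd, depth):
    """Simplified shallow-water model: deep-water reflectance plus the
    bottom contribution, attenuated over the two-way path exp(-2*Kd*H)."""
    atten = np.exp(-2.0 * kd * depth)
    return r_deep + (r_bottom - r_deep) * atten

def invert_depth(r_obs, r_deep, r_bottom, kd, depths):
    """Grid-search the depth that minimizes the spectral least-squares
    misfit between modeled and observed reflectance."""
    errs = [np.sum((shallow_reflectance(r_deep, r_bottom, kd, h) - r_obs) ** 2)
            for h in depths]
    return depths[int(np.argmin(errs))]

# Synthetic 4-band example with hypothetical values
r_deep   = np.array([0.020, 0.015, 0.010, 0.005])   # optically deep water
r_bottom = np.array([0.20, 0.25, 0.22, 0.10])       # e.g., bright sand
kd       = np.array([0.08, 0.06, 0.10, 0.30])       # attenuation, per metre
true_h = 6.0
r_obs = shallow_reflectance(r_deep, r_bottom, kd, true_h)

depths = np.arange(0.0, 20.0, 0.1)
print(invert_depth(r_obs, r_deep, r_bottom, kd, depths))  # ~6.0 m
```

In practice the bottom reflectance and water optical properties are unknown and are estimated jointly with depth, which is what makes the hyperspectral inversion free of field-survey calibration.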
In order to establish a method of drone-based remote sensing (drone-RS) for agriculture and environmental measurement, it is important to find a balance between the price and performance of the near-infrared camera mounted on the drone. We aim to establish a low-cost drone-RS system that can be used even for small-scale farming. We initially used a Yubaflex (BIZWORKS) near-infrared camera; however, a RedEdge (MicaSense) camera became available in 2017, making it of interest to compare the two, including whether RedEdge can inherit data acquired with Yubaflex. In this report, we present the results of our comparison of these two types of near-infrared camera from the viewpoint of price and performance, including points to note in image processing. Our conclusions are as follows: (1) Yubaflex-based normalized difference vegetation index (NDVI) values were lower than RedEdge-based NDVI values; however, NDVI values from the two systems were mutually convertible. (2) When monitoring vegetation based on NDVI time series or NDVI images generated from multiple images, processing (using software or a program) was required to convert digital numbers (DNs) to radiance values.
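The processing chain in conclusion (2) can be sketched in a few lines: convert band DNs to radiance with a linear calibration, then compute NDVI from the radiance values. The gain/offset coefficients and the cross-camera regression coefficients below are hypothetical placeholders, not the calibration values of either camera.

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Linear radiometric conversion of digital numbers to radiance.
    gain/offset are hypothetical sensor calibration coefficients."""
    return gain * dn + offset

def ndvi(nir, red):
    """Normalized difference vegetation index from radiance values."""
    return (nir - red) / (nir + red)

def convert_ndvi(ndvi_a, slope=1.1, intercept=0.05):
    """Mutual conversion between the two cameras' NDVI (conclusion (1))
    via a linear fit; slope/intercept here are illustrative only."""
    return slope * ndvi_a + intercept

# Hypothetical DNs and calibration coefficients for one pixel
red_rad = dn_to_radiance(np.array([1200.0]), gain=0.01, offset=0.0)
nir_rad = dn_to_radiance(np.array([3100.0]), gain=0.01, offset=0.0)
value = ndvi(nir_rad, red_rad)
print(value)   # ~0.44
```

Computing NDVI directly from raw DNs skips the radiometric step and is what produces inconsistent values across scenes and cameras, hence the need for the conversion software noted in the report.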