Journal of the Japan Society of Photogrammetry and Remote Sensing
Online ISSN : 1883-9061
Print ISSN : 0285-5844
ISSN-L : 0285-5844
Volume 62, Issue 3
Displaying 1-9 of 9 articles from this issue
Preface
Original Papers
  • Kazuya NAKANO
    2023 Volume 62 Issue 3 Pages 119-126
    Published: 2023
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    Unmanned/unpiloted aerial vehicles (UAVs) equipped with laser scanners have been widely used for various purposes, such as at construction sites and in agriculture, forestry, and disaster management, because they can obtain high-density point clouds with millimeter- to centimeter-scale accuracy. Recently, both low- and high-performance products that use direct georeferencing devices integrated with laser scanners, or laser scanners combined with Simultaneous Localization and Mapping (SLAM) techniques, have become available. Moreover, there have been numerous reports on various applications of UAVs equipped with laser scanners; however, most of these reports only discuss results that treat the UAV as a measuring device. Therefore, to understand the functioning of UAVs equipped with laser scanners, we investigated the accuracy of a survey-grade laser scanner unit from the viewpoint of photogrammetry. We evaluated the performance of point clouds with reflection intensities acquired by an Applanix AP40 GNSS/IMU and a RIEGL VUX-1HA laser scanner at a UAV test site provided by JSPRS. We present the theoretical values obtained from the observation equations together with the results of the accuracy evaluation of the acquired data. (A schematic form of the observation equation is sketched after this entry.)

    Download PDF (2540K)
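
    The abstract above refers to theoretical accuracy values derived from the observation equations of direct georeferencing with a GNSS/IMU and a laser scanner. The following is a minimal Python sketch of the general form of such an observation equation only; the rotation convention, lever arm, boresight angles, and every numeric value are illustrative assumptions, not the paper's calibration or formulation.

    import numpy as np

    def rotation_matrix(roll, pitch, yaw):
        """Rotation from the body (IMU) frame to the mapping frame (z-y-x Euler convention assumed)."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def georeference_point(p_gnss, attitude, x_scanner, lever_arm, boresight):
        """Generic direct-georeferencing observation equation:
           X = X_GNSS + R_IMU(attitude) @ (R_boresight @ x_scanner + lever_arm)"""
        r_imu = rotation_matrix(*attitude)    # platform attitude from the IMU
        r_bore = rotation_matrix(*boresight)  # residual scanner mounting angles
        return p_gnss + r_imu @ (r_bore @ x_scanner + lever_arm)

    # Illustrative values only:
    point = georeference_point(
        p_gnss=np.array([0.0, 0.0, 50.0]),          # trajectory position, ~50 m above ground
        attitude=np.radians([0.5, -0.3, 90.0]),     # roll, pitch, yaw
        x_scanner=np.array([0.0, 20.0, -45.0]),     # one laser return in the scanner frame
        lever_arm=np.array([0.05, 0.0, -0.10]),     # scanner origin offset from the IMU
        boresight=np.radians([0.01, 0.02, -0.01]),  # boresight misalignment
    )
    print(point)

    Propagating sensor noise specifications (range and angle noise, GNSS/IMU accuracy) through an equation of this kind is a common way to obtain theoretical point-accuracy values of the sort the abstract mentions.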
  • Rei SONOBE, Haruyuki SEKI, Atsuhiro IIO, Hideki SHIMAMURA, Kan-ichiro ...
    2023 Volume 62 Issue 3 Pages 127-133
    Published: 2023
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    The potential of multitemporal SAR data acquired by Sentinel-1 has been reported for forest mapping and forest management. However, SAR data are difficult to interpret visually owing to speckle noise and to geometric distortion caused by the distance dependence along the range axis and the characteristics of radar wavelengths. Recently, the advantages of Generative Adversarial Networks (GANs) have been reported for image-to-image translation, and they could also be effective for SAR-to-optical image translation. In the current study, CycleGAN, pix2pix, pix2pixHD, and feature-guided SAR-to-optical image translation (FGGAN) were compared using Sentinel-1 and Sentinel-2 data acquired over forests. As a result, FGGAN was the best technique, achieving structural similarity values of 0.664, 0.708, and 0.725 for the red, green, and blue bands, respectively. (A per-band SSIM computation is sketched after this entry.)

    Download PDF (33632K)
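
    The structural similarity (SSIM) values reported above are per-band scores comparing translated imagery with the reference optical bands. The following is a minimal sketch of such a per-band computation, assuming co-registered floating-point arrays and using scikit-image's structural_similarity; the array shapes and toy data are placeholders, not the study's Sentinel-1/2 data.

    import numpy as np
    from skimage.metrics import structural_similarity

    def per_band_ssim(translated, reference):
        """Return one SSIM value per spectral band.
        translated, reference : float arrays of shape (rows, cols, bands),
        e.g. a GAN output compared with a Sentinel-2 RGB composite."""
        scores = []
        for band in range(reference.shape[-1]):
            ref_band = reference[..., band]
            scores.append(structural_similarity(
                translated[..., band], ref_band,
                data_range=ref_band.max() - ref_band.min()))
        return scores

    # Toy example: random data standing in for co-registered image patches.
    rng = np.random.default_rng(0)
    translated = rng.random((256, 256, 3))
    reference = np.clip(translated + 0.05 * rng.standard_normal((256, 256, 3)), 0.0, 1.0)
    print(per_band_ssim(translated, reference))  # one score per band (R, G, B)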