Journal of the Japan society of photogrammetry and remote sensing
Online ISSN : 1883-9061
Print ISSN : 0285-5844
ISSN-L : 0285-5844
Comparison of SAR-Optical translation frameworks based on Generative Adversarial Networks (GAN)
Rei SONOBE, Haruyuki SEKI, Atsuhiro IIO, Hideki SHIMAMURA, Kan-ichiro MOCHIZUKI, Genya SAITO, Kunihiko YOSHINO
2023 Volume 62 Issue 3 Pages 127-133

Abstract

The potential of multitemporal SAR data acquired by Sentinel-1 has been reported for forest mapping and forest management. However, SAR data are difficult to interpret visually owing to speckle noise and geometric distortions caused by distance dependence along the range axis and by the characteristics of radar signal wavelengths. Recently, the advantages of Generative Adversarial Networks (GANs) have been reported for image-to-image translation, and they could also be effective for SAR-to-optical image translation. In the current study, comparisons among CycleGAN, pix2pix, pix2pixHD and feature-guided SAR-to-optical image translation (FGGAN) were performed using Sentinel-1 and Sentinel-2 data acquired over forests. As a result, FGGAN was the best technique, achieving structural similarity values of 0.664, 0.708 and 0.725 for the red, green and blue bands, respectively.
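The per-band structural similarity (SSIM) scores reported above compare each translated band against the corresponding Sentinel-2 reference band. The paper does not specify its exact SSIM implementation; as a hedged illustration, the sketch below computes a simplified single-window SSIM per spectral band (the standard formulation uses a sliding Gaussian window, e.g. as in scikit-image). The function names `global_ssim` and `per_band_ssim` are hypothetical, not from the paper.

```python
import numpy as np

def global_ssim(x, y, data_range=1.0):
    """Simplified single-window SSIM between two 2-D arrays.

    SSIM = ((2*mu_x*mu_y + C1)(2*cov_xy + C2)) /
           ((mu_x^2 + mu_y^2 + C1)(var_x + var_y + C2))
    """
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx**2 + my**2 + c1) * (vx + vy + c2)
    )

def per_band_ssim(translated, reference, data_range=1.0):
    """SSIM for each spectral band of two (H, W, B) images,
    e.g. the red, green and blue bands of a translated optical image
    against the Sentinel-2 reference."""
    return [
        global_ssim(translated[..., b], reference[..., b], data_range)
        for b in range(reference.shape[-1])
    ]
```

Identical images yield an SSIM of 1.0 per band; noisier translations score lower, which is how the four GAN frameworks can be ranked band by band.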

© 2023 Japan Society of Photogrammetry and Remote Sensing