Six years have passed since the 2011 Great East Japan Earthquake. Many new findings, insights, and suggestions have emerged and been implemented in disaster observation, sensing, simulation, and damage assessment. The challenges for mitigating future catastrophic natural disasters, such as a Tokyo metropolitan earthquake or a Nankai Trough earthquake, are how we share visions of the possible impacts, prepare to reduce the losses and damage, and enhance society’s disaster resilience.
A huge amount of information called “disaster big data” has been obtained, related to the dynamic flow of large numbers of people, vehicles, and goods inside and outside the affected areas. This has dramatically advanced our understanding of how our society responded to these unprecedented catastrophes.
The key question is how we can use big data to establish social systems that respond promptly, sensibly, and effectively to natural disasters, and to withstand such adversities with resilience.
Researchers with various areas of expertise are working together in the collaborative JST CREST project “Establishing the most advanced disaster reduction management system by fusion of real-time disaster simulation and big data assimilation.” The project aims to identify possible disaster scenarios caused by earthquakes and tsunamis that occur and progress in a chained or compound manner, and to create new technologies that guide responses and disaster mitigation measures and encourage society to overcome the disaster.
This special issue, titled “Disaster and Big Data Part 2” and comprising 13 papers, shares the recent progress of the project as the sequel to Part 1, published in March 2016. As the editor of this issue, I would like to express deep gratitude for the insightful comments and suggestions made by the reviewers and the members of the editorial committee.
This paper reports the latest outcomes of the project “Establishing the Advanced Disaster Reduction Management System by Fusion of Real-time Disaster Simulation and Big Data Assimilation,” which started in 2014. Its objectives were achieved: targeting various kinds of earthquake and tsunami damage, fusing large-scale high-resolution numerical simulation with the effective processing and analysis of big data from various observations, and performing data assimilation. The outcomes will be used to create the world’s first real-time simulation and big data analysis platform that can assist in designing preliminary measures and disaster responses based on quantitative data. Case studies of recent disasters were used in this endeavor, and validation was performed. In the future, environments that rapidly provide information on possible damage situations in real time to public agencies, corporations, and citizens facing a catastrophic disaster in Japan will be developed by integrating these studies.
This paper describes a method of extracting the relation between the ground-motion characteristics of each area and a seismic source model, based on ground-motion simulation data output in planar form for many earthquake scenarios, and the construction of a parallel distributed processing system where this method is implemented. The extraction is realized using two-stage clustering. In the first stage, the ground-motion indices and scenario parameters are used as input data to cluster the earthquake scenarios within each evaluation mesh. In the second stage, the meshes are clustered based on the similarity of earthquake-scenario clustering. Because the mesh clusters can be correlated to the geographical space, it is possible to extract the relation between the ground-motion characteristics of each area and the scenario parameters by examining the relation between the mesh clusters and scenario clusters obtained by the two-stage clustering. The results are displayed visually; they are saved as GeoTIFF image files. The system was applied to the long-period ground-motion simulation data for hypothetical megathrust earthquakes in the Nankai Trough. This confirmed that the relation between the extracted ground-motion characteristics of each area and scenario parameters is in agreement with the results of ground-motion simulations.
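The two-stage clustering described above can be illustrated with a minimal sketch. Everything here is synthetic and hypothetical (the data, the cluster counts `k1`/`k2`, and the use of a co-membership signature to compare meshes are illustrative assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stage 1: within each evaluation mesh, cluster earthquake scenarios by
# their ground-motion indices and scenario parameters (synthetic data here).
n_meshes, n_scenarios, n_features, k1 = 6, 40, 3, 4
scenario_features = rng.random((n_meshes, n_scenarios, n_features))

labels_per_mesh = np.empty((n_meshes, n_scenarios), dtype=int)
for m in range(n_meshes):
    labels_per_mesh[m] = KMeans(
        n_clusters=k1, n_init=10, random_state=0
    ).fit_predict(scenario_features[m])

# Stage 2: cluster the meshes by the similarity of their stage-1 results.
# A simple (assumed) similarity signature: each mesh's scenario-cluster
# co-membership matrix, flattened into a vector.
signatures = np.array([
    (labels_per_mesh[m][:, None] == labels_per_mesh[m][None, :]).ravel()
    for m in range(n_meshes)
], dtype=float)

k2 = 2
mesh_clusters = KMeans(n_clusters=k2, n_init=10,
                       random_state=0).fit_predict(signatures)
```

Because `mesh_clusters` assigns a label to every mesh, the labels can be mapped back to geographical space (e.g., written out as a GeoTIFF) to relate area-level ground-motion characteristics to scenario clusters.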
An earthquake (Mw6.2) struck Kumamoto Prefecture, Japan on April 14, 2016. A larger event (Mw7.0) struck the same area 28 hours later, on April 16. The series of earthquakes caused significant damage to buildings and infrastructure. Remote sensing is an effective tool for grasping the damage situation over wide areas after a disaster strikes. In this study, two sets of ALOS-2 PALSAR-2 images taken before and after the earthquake were used to extract the areas with collapsed buildings. Three representative change indices, the co-event coherence, the ratio between the co- and pre-event coherence, and the z-factor combining the difference and correlation coefficients, were adopted to extract the collapsed buildings in the central district of Mashiki Town, the most severely affected area. The results of a building-by-building damage survey in the target area were used to investigate the most suitable threshold value for each index. The extracted results were evaluated by comparing them with the reference data from field surveys. Finally, the most valid factor was applied to the larger affected areas of Kumamoto City and its surroundings.
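The threshold search for one change index can be sketched as follows. This is a synthetic toy, not the paper's procedure: the coherence values, the 30% collapse rate, and the accuracy criterion are all assumptions made for illustration; the index shown is the co-/pre-event coherence ratio mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-building coherence values: collapsed buildings tend to lose
# co-event coherence, while pre-event coherence stays high for all buildings.
n = 200
collapsed = rng.random(n) < 0.3
pre_coh = rng.uniform(0.6, 0.9, n)
co_coh = np.where(collapsed,
                  rng.uniform(0.1, 0.5, n),   # collapsed: low co-event coherence
                  rng.uniform(0.5, 0.9, n))   # intact: coherence preserved
ratio = co_coh / pre_coh  # one of the three change indices

# Scan candidate thresholds and keep the one with the best overall accuracy
# against the (here synthetic) building-by-building survey labels.
best_t, best_acc = None, 0.0
for t in np.linspace(0.2, 1.0, 81):
    predicted_collapsed = ratio < t
    acc = float(np.mean(predicted_collapsed == collapsed))
    if acc > best_acc:
        best_t, best_acc = t, acc
```

In practice the chosen criterion could instead be a kappa coefficient or producer/user accuracy, depending on how over- and under-detection should be weighted.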
Remote sensing technology is effective for identifying the extensive damage caused by tsunami disasters, and many methods have been developed to detect building damage at the building unit scale. Among the available data sources, X-band Synthetic Aperture Radar (SAR) data has a high resolution and is useful for investigating detailed conditions on the Earth’s surface, although its spatial coverage is relatively small. In contrast, L-band SAR data has a lower resolution, leading to difficulties in detecting building damage, although it can cover a broad area. During disasters, it is important to understand the damage across extensive areas in a short time; therefore, it is necessary to develop a method that combines broad coverage with high accuracy. The primary objective of this study is to develop a method to estimate building damage in tsunami-affected areas using L-band SAR (ALOS/PALSAR) data. We developed our method by extending a previously proposed method for X-band SAR (TerraSAR-X) data. This study focused on Sendai City and Watari Town in Miyagi Prefecture, where many houses were washed away during the 2011 Tohoku earthquake and tsunami. We verified that the function we developed performed well in estimating the number of washed-away buildings, corresponding with ground truth data with a Pearson correlation coefficient of 0.97. Verification was conducted in another study area, which yielded a Pearson correlation coefficient of 0.87.
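The kind of validation reported above, fitting a function from a per-cell SAR change score to washed-away building counts and checking the Pearson correlation, can be sketched with synthetic numbers. The grid cells, the change score, and the linear form of the function are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic grid cells: ground-truth counts of washed-away buildings, and a
# per-cell SAR change score that (by construction) tracks the damage.
n_cells = 50
truth = rng.integers(0, 120, n_cells).astype(float)
change_score = 0.01 * truth + rng.normal(0.0, 0.1, n_cells)

# Fit a linear function mapping the change score to an estimated count,
# then evaluate agreement with the ground truth by Pearson correlation.
slope, intercept = np.polyfit(change_score, truth, deg=1)
estimated = slope * change_score + intercept
r = float(np.corrcoef(truth, estimated)[0, 1])
```

With real data, the same correlation check on an independent study area (as the paper does) guards against the function overfitting the region it was calibrated on.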
Earthquake-induced building damage assessment is an indispensable prerequisite for disaster impact assessment, and the increasing availability of high-resolution Synthetic Aperture Radar (SAR) imagery has made it possible to construct damaged-building inventories soon after earthquakes strike. However, the shortage of pre-seismic SAR datasets and the lack of available building footprint data pose challenges for rapid building damage assessment. Taking advantage of recent advances in machine learning algorithms, this study proposes an object-based building damage assessment methodology that uses only post-event SAR imagery. A Random Forest object classification, combined with a simplified approach to the extraction of built-up areas, was developed and tested on two ALOS-2/PALSAR-2 dual-polarimetric SAR images acquired over the affected areas soon after the 2015 Nepal earthquake. A series of texture metrics, as well as the random scattering metric and reflection symmetry metric, were found to significantly enhance classification accuracy, and feature selection had a positive effect on overall performance. Moreover, the proposed Random Forest framework achieved an overall accuracy of 93% with a kappa coefficient of 0.885 when an object scale of 60 × 60 pixels and 15 features were adopted. A comparative experiment with a k-nearest neighbor framework demonstrated that the Random Forest framework is a significant step toward achieving a balanced two-class classification.
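An object-based Random Forest classification of this kind can be sketched with scikit-learn. The data below are synthetic stand-ins (15 random features per object in place of the paper's texture and polarimetric-scattering metrics; the labels are generated, not survey-derived), so only the workflow, not the reported accuracy, is reproduced:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic image objects: 15 features per object (standing in for texture
# and scattering metrics), two classes (e.g., damaged vs. intact built-up).
n_objects, n_features = 400, 15
X = rng.normal(size=(n_objects, n_features))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 0.5, n_objects) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
y_hat = clf.predict(X_te)

acc = accuracy_score(y_te, y_hat)        # overall accuracy
kappa = cohen_kappa_score(y_te, y_hat)   # agreement beyond chance
```

`clf.feature_importances_` then gives a simple basis for the feature selection step the abstract mentions.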
This study reports the results of an analysis of probe data collected in the periods immediately before and after the foreshock in Kumamoto on April 14, 2016. The data were gathered under actual urban traffic conditions, and the traffic activity was evaluated. The study also identifies issues to be addressed in the future, based on this analysis. The analytical results quantitatively show that movement from Fukuoka to central Kumamoto was impacted by the closure of the Kyushu Expressway; they also show that daytime travel times along alternative paths significantly increased, whilst congestion on highways decreased. The results further show that locations such as shelters, supermarkets, and public baths became the beginning and end points of travel, thereby causing deviations from normal congestion patterns.
Frequently occurring natural disasters, such as typhoons and earthquakes, heavily affect human activities in urban areas by causing severe congestion and economic loss. Predicting the delay in the usual commuting activities of individuals following such disasters is crucial for managing urban systems. We propose a novel method that predicts such delays in individuals’ movements across several frequently occurring disasters, using various types of features including commuters’ usual movement patterns, disaster information, and geospatial information about commuters’ locations. Our method predicts the irregularity of commuting activities in metropolitan Tokyo during several typhoons and earthquakes, using Yahoo Japan’s GPS dataset of 1 million users. The results show that the irregularity of individuals’ movements is significantly more predictable than with previous models. We also find that commuters’ usual movement patterns, disaster intensity, and geospatial features, including road density and population density, are the main factors causing commuting delays following disasters.
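A feature-based delay predictor of the kind described can be sketched as a supervised regression. The four features, their generated relationship to delay, and the choice of gradient boosting are illustrative assumptions only; the actual study uses real GPS trajectories and its own model:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Synthetic commuters, one row each. Columns stand in for:
#   0: regularity of the usual commuting pattern
#   1: disaster intensity at the commuter's location
#   2: road density, 3: population density
n = 500
X = rng.uniform(0.0, 1.0, size=(n, 4))

# By construction, delay grows with intensity and population density and
# shrinks with pattern regularity (synthetic target, in minutes).
delay_min = (60 * X[:, 1] + 20 * X[:, 3] - 15 * X[:, 0]
             + rng.normal(0, 5, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, delay_min, test_size=0.25,
                                          random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)  # coefficient of determination on held-out data
```

Inspecting `model.feature_importances_` on such a model is one simple way to recover which features drive the predicted delay, mirroring the factor analysis in the abstract.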
This article describes the development of a comprehensive simulation model that integrates property damage (building collapse, fire spreading, and street blockage) and various activities (rescue, firefighting, and wide-area evacuation) by local residents in the event of a large earthquake. Using this model, we analyze the effects and risks of rescue and firefighting activities carried out by local residents under the assumption that a large earthquake directly hits the Tokyo area. Furthermore, we attempt to gain new insight into situations in which many fatalities are caused by people failing to evacuate, being trapped on streets, and so on.
With the aim of investigating countermeasures for multiple simultaneous fires during large earthquakes, this paper develops a simulation model describing urban damage (such as fire spread and street blockages) together with fire brigade actions, and examines firefighting team strategies for reducing property damage and enhancing their effectiveness.
Natural disasters have inflicted major damage on humankind throughout history. Some of the most severe disasters in Japan and other countries have been earthquakes, which frequently trigger secondary disasters such as tsunamis, landslides, and fires. In fact, in Japan, most casualties in an earthquake are due not to the earthquake itself but to secondary disasters. Therefore, detecting the occurrence of secondary disasters and gathering the results in real time are essential to minimizing the number of casualties. Thus, we concentrated on creating an effective system for quickly propagating information about fires using pedestrians’ mobile phones. (Of these secondary disasters, we considered fires the easiest to detect.) In addition, we establish a configuration-free system for propagating fire alerts by attaching an ad-hoc network device to fire alarms installed in homes. We can thus collect information on fires without configuring any personal location information. We evaluated the effectiveness of our system in the event of a fire. Our results show that we can estimate the location of a fire with an error of less than 20 m, which is sufficiently accurate to locate a fire and obtain an overview of the situation.
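One simple way such a system could estimate a fire's location without configuring alarm positions is from the phones that relay the alert: each reporting phone contributes its own GPS fix, weighted by received signal strength. The estimator below is a hypothetical sketch with made-up coordinates and RSSI values, not the paper's actual localization method:

```python
# Phones that received the fire-alarm broadcast report their own GPS fix
# and the received signal strength (RSSI, in dBm). A stronger signal
# suggests the phone is closer to the alarm.
reports = [
    # (latitude, longitude, rssi_dbm) -- synthetic example values
    (38.2600, 140.8820, -50),
    (38.2603, 140.8826, -60),
    (38.2598, 140.8830, -70),
    (38.2605, 140.8818, -55),
]

def estimate_fire_location(reports):
    """RSSI-weighted centroid of the reporting phones' positions."""
    # Convert RSSI (dBm) to a positive linear weight.
    weights = [10 ** (rssi / 20.0) for _, _, rssi in reports]
    total = sum(weights)
    lat = sum(w * r[0] for w, r in zip(weights, reports)) / total
    lon = sum(w * r[1] for w, r in zip(weights, reports)) / total
    return lat, lon

lat, lon = estimate_fire_location(reports)
```

A weighted centroid is crude but needs no calibration, which fits the configuration-free goal; with a propagation model, the same reports could instead feed a trilateration solver.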
This study aims to compress web news, which is delivered as a big-data source after disasters. In this paper, an article-clustering method that combines conventional clustering with an algorithm for selecting the representative article of each cluster is designed and adopted, and experiments are conducted with human evaluators. The proposed algorithm agrees with the evaluators for 50% of the clustering and for about 30% to 40% of the representative-article selection.
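The cluster-then-select pipeline can be sketched with a standard TF-IDF representation. The toy corpus, the use of KMeans, and choosing the article nearest each centroid as its representative are assumptions for illustration; the paper's own clustering and selection algorithms may differ:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for post-disaster web news articles.
articles = [
    "Evacuation shelters opened in the coastal district after the tsunami warning.",
    "Tsunami warning issued; residents move to evacuation shelters on high ground.",
    "Highway closed after the earthquake damaged an overpass near the interchange.",
    "Earthquake damage closes the expressway; detours set up around the overpass.",
]

tfidf = TfidfVectorizer().fit_transform(articles)
k = 2
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(tfidf)

# Representative selection: the article closest to each cluster centroid.
representatives = {}
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(
        tfidf[members].toarray() - km.cluster_centers_[c], axis=1)
    representatives[c] = int(members[np.argmin(dists)])
```

Emitting only `representatives` compresses the stream from one article per event report to one per cluster, which is the kind of reduction the human evaluation above measures.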
The role of public online information in helping to reduce disaster damage is expected to become increasingly important, since it can be used for decision making about disaster response. This paper discusses the effectiveness and limitations of real-time online information about heavy rainfall, based on an analysis of data on the disaster caused by Typhoons 17 and 18 in 2015 in Miyagi Prefecture, Japan, and on a focus group interview survey with four experts on natural disasters. The interviews showed the following: (1) Landslide alert information is reliable for prediction purposes; however, many people did not monitor it because it was released around midnight. (2) Areas of landslide occurrence and river flooding correspond to areas with heavy cumulative rainfall, yet cumulative rainfall data are not available on the web. (3) The available radar-rainfall data can be used to predict the situation one hour ahead, provided the user has expert knowledge. (4) It is possible to monitor river water levels at many points; however, about half of the observation points have no established “flood danger water level.” (5) Local governments released a great amount of disaster information through social media before flooding occurred on some rivers; however, one must monitor multiple social media accounts, not just the account of one’s hometown.
In analyzing observation data and simulation results, there are frequent demands for comparing multiple datasets on the same subject to detect differences between them: for example, comparing observation data for an object in a certain spatial domain at different times, or comparing spatial simulation data computed with different parameters. This paper therefore proposes a difference operator for spatio-temporal data warehouses, which store temporal and spatial observation data and simulation data. The requirements for the difference operator are summarized, and approaches to implementing them are presented. In addition, the proposed approach is applied to mass-evacuation simulation data for a tsunami disaster, and its effectiveness is verified. Extensions of the difference operator and their applications are also discussed.
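At its core, such a difference operator aligns two snapshots on their spatio-temporal key and subtracts the measure columns. The sketch below uses pandas and a made-up evacuee-count schema (`mesh_id`, `evacuees`) purely to illustrate the idea; the paper's operator is defined inside a data warehouse, not over data frames:

```python
import pandas as pd

# Two snapshots of the same spatial mesh at different times: number of
# evacuees per mesh cell (synthetic values).
t1 = pd.DataFrame({"mesh_id": [1, 2, 3, 4], "evacuees": [120, 80, 0, 45]})
t2 = pd.DataFrame({"mesh_id": [1, 2, 3, 4], "evacuees": [150, 60, 30, 45]})

def difference(a, b, key="mesh_id", measure="evacuees"):
    """Align two snapshots on the spatial key and subtract the measure."""
    merged = a.merge(b, on=key, suffixes=("_a", "_b"))
    merged["diff"] = merged[f"{measure}_b"] - merged[f"{measure}_a"]
    return merged[[key, "diff"]]

delta = difference(t1, t2)
```

The requirements the paper summarizes (e.g., handling cells present in only one snapshot, or differing resolutions) would appear here as the choice of join type and a resampling step before the subtraction.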
Despite the long developmental history of water-related disaster risk indicators, there is still no consensus on, or reliable system for, selecting objective data; no methodological system for choosing and verifying the relevance of water-related disaster risk indicators; and no way of linking the results back to root causes or of addressing their possible impacts on policies or on the actors who instigate change.
Global policy documents such as the Sendai Framework for Disaster Risk Reduction (DRR) 2015–2030 emphasize the urgent need for indicators capable of measuring risk reduction. However, developing and determining risk indicators faces many issues. Most published disaster risk indices do not yet include a basic overview of what data was used and how it was collected, let alone a systematic explanation of why each indicator was included and why others were not. This complicates linking the findings to their potential policy impacts. It also complicates providing clear-cut recommendations for improving resilience, which is a common intent of disaster risk indices.
This study, which focuses on water-related hazards, aims to provide disaster managers with a set of criteria for evaluating existing datasets used in disaster risk indices, index construction methods, and the links back to policy impacts. So far, there has been no comprehensive overview of indicator requirements or scoring systems. Previous studies on indicator evaluation metrics use fewer metrics and have not yet addressed the different tiers of requirements, namely objective indicator data quality, methodological/epistemological aspects of index composition, and, most importantly, policy and actors of change (impact requirements). Further testing of these metrics in local studies can provide the greatly needed scientific justification for indicator selection and can enhance index robustness.
The results aid in developing an evaluation system that addresses issues of data availability and the comparability of commonly used indicator sources, such as those of the World Bank. Once indicators can be scientifically linked to impacts through policy devices, national governments and other actors become more likely to claim ownership of indicator data management. Future studies should expand this evaluation system to other natural hazards and investigate the links between indicators and DRR in order to further validate indicator selection.
In the case of a large-scale disaster that could be called a national crisis, it is difficult to resolve lifeline issues by approaching them from the service provider side alone. Therefore, this study investigated the possibility of using measures taken by citizens, the service recipients, to mitigate the problem. The results indicate that there is a high probability that the functions of common lifeline services can be substituted by effectively utilizing supplies normally kept in stock for daily living, in case relief supplies cannot be delivered. It is important to revise conventional thinking on emergency stocks (i.e., keeping a large stock of supplies for the sole purpose of disaster preparation) and shift to “circulatory reserves.” However, it was found that many households lack cooking tools and an adequate stock of drinking water. This deficiency must be brought to citizens’ attention so that they will make an effort to remedy it.