Social Sensing and Big Data Computing for Rapid Flood Mapping

Rapid flood mapping is crucial for emergency responders to gain better situational awareness during an event. However, traditional approaches normally require months of processing and quality assurance before the final flood extent and water depth are mapped and the losses and damages tallied. For example, the official flood-inundation maps for the 2015 floods in South Carolina were first released by the United States Geological Survey (USGS) on February 22, 2016, four months after the flooding event.

Using the 2015 South Carolina floods as a case study, we developed a novel approach to mapping the flood in near real time by leveraging Twitter data in geospatial analysis. Specifically, we first analyzed the spatiotemporal patterns of flood-related tweets using quantitative methods to better understand how Twitter activity relates to flood phenomena. We then developed a kernel-based flood mapping model to map the flood possibility for the study area based on water height points derived from tweets and stream gauges.

Spatiotemporal Patterns of Flood-related Tweets

By analyzing the number of flood-related tweets and stream gauge heights within the study area, we found that people tend to tweet more about floods as the flood magnitude increases during the event.

The figure below shows the temporal pattern analysis: (a) Number of flood-related tweets and daily maximum gauge height from the gauge station 02169500 during the flood period; (b) Cross-correlation analysis result between the two variables.
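As a minimal sketch of this cross-correlation analysis, the Python snippet below computes the Pearson correlation between daily tweet counts and daily maximum gauge height at a range of lags. The arrays are hypothetical placeholders for aligned daily series, not the study data.

```python
import numpy as np

# Hypothetical aligned daily series: flood-related tweet counts and
# daily maximum gauge height (ft) over the same period.
tweet_counts = np.array([12, 30, 85, 140, 110, 60, 25], dtype=float)
gauge_height = np.array([3.1, 5.4, 9.8, 12.2, 10.5, 7.0, 4.2])

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of x against y shifted by each lag (in days)."""
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            results[lag] = np.corrcoef(x[:lag], y[-lag:])[0, 1]
        elif lag > 0:
            results[lag] = np.corrcoef(x[lag:], y[:-lag])[0, 1]
        else:
            results[lag] = np.corrcoef(x, y)[0, 1]
    return results

for lag, r in lagged_correlation(tweet_counts, gauge_height, 2).items():
    print(f"lag {lag:+d} day(s): r = {r:.2f}")
```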

For the spatial pattern, we found that people closer to the flooded area tend to tweet more about the flood. The figure below illustrates the spatial pattern analysis: (a) Flood-related tweets and the inundated area within the study area; (b) Percentage of flood-related tweets at different distances from the inundated area.
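A minimal sketch of the distance-based tabulation in panel (b): given tweet-to-inundation distances precomputed in a GIS (the values below are hypothetical), bin the tweets into distance bands and report the percentage per band.

```python
import numpy as np

# Hypothetical distances (km) from each flood-related tweet to the
# nearest inundated area, e.g. from a GIS near/distance operation.
tweet_distances_km = np.array([0.2, 0.8, 1.5, 3.0, 4.7, 9.5, 0.1, 2.2, 6.8, 0.5])

bands = [0, 1, 2, 5, 10, np.inf]               # distance bands in km
labels = ["0-1", "1-2", "2-5", "5-10", ">10"]

counts, _ = np.histogram(tweet_distances_km, bins=bands)
for label, count in zip(labels, counts):
    print(f"{label:>5} km: {100 * count / tweet_distances_km.size:5.1f}%")
```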

Kernel-based Flood Mapping Model

The model takes the following data as inputs: water height points (WHPs), flood-related tweets, and a digital elevation model (DEM). The flood-related tweets are used to create a density surface that serves as a weighting factor based on the identified spatial patterns of Twitter activity. The model consists of two steps: 1) generating a Flood Possibility Index (FPI) surface for each WHP using a kernel-based approach that considers distance and elevation, and 2) generating the final FPI map from all of the FPI surfaces.
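The sketch below illustrates one plausible reading of the two steps: a Gaussian distance kernel masked by an elevation constraint produces a surface for each WHP, and the per-WHP surfaces are combined by a cell-wise maximum and weighted by the normalized tweet-density surface. The kernel shape, bandwidth, combination rule, and weighting scheme are illustrative assumptions, not the published model.

```python
import numpy as np

def fpi_surface(dem, xs, ys, whp_row, whp_col, water_level, bandwidth):
    """Step 1: FPI surface for a single WHP from distance and elevation."""
    dist = np.hypot(xs - xs[whp_row, whp_col], ys - ys[whp_row, whp_col])
    distance_kernel = np.exp(-(dist / bandwidth) ** 2)  # Gaussian decay
    below_water = dem <= water_level                    # elevation constraint
    return distance_kernel * below_water

def fpi_map(dem, cell_size, whps, tweet_density, bandwidth=2000.0):
    """Step 2: combine per-WHP surfaces (max) and weight by tweet density."""
    rows, cols = dem.shape
    ys, xs = np.meshgrid(np.arange(rows) * cell_size,
                         np.arange(cols) * cell_size, indexing="ij")
    combined = np.zeros_like(dem, dtype=float)
    for row, col, water_level in whps:
        combined = np.maximum(
            combined, fpi_surface(dem, xs, ys, row, col, water_level, bandwidth))
    density = tweet_density / tweet_density.max()       # normalize to [0, 1]
    return combined * (0.5 + 0.5 * density)             # density as a weight

# Toy example: a 100x100 DEM sloping away from one gauge-derived WHP.
dem = np.add.outer(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
tweets = np.random.default_rng(0).random((100, 100))
fpi = fpi_map(dem, cell_size=30.0, whps=[(0, 0, 5.0)], tweet_density=tweets)
print(fpi.shape, fpi.max())
```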

The following figure shows the cell-by-cell comparison between our model output (the FPI map) and the USGS inundation map, with four categories: matched (flooded: both the FPI map and the USGS map agree a cell was flooded), matched (not flooded: both maps agree a cell was not flooded), overestimated (the FPI map shows a cell as flooded while the USGS map does not), and underestimated (the USGS map shows a cell as flooded while the FPI map does not). The majority of cells (83.4%, blue and light blue) agree between the two maps, indicating that the proposed approach can provide a consistent and comparable estimate of the flood situation in near real time, which is essential for improving situational awareness during a flooding event to support decision making.
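For clarity, here is a minimal sketch of the four-category tally, assuming both maps have been rasterized to boolean grids of identical shape (True = flooded); thresholding the continuous FPI map into flooded/not-flooded is an assumed preprocessing step.

```python
import numpy as np

def compare_maps(fpi_flooded, usgs_flooded):
    """Tally the four agreement categories between two boolean rasters."""
    total = fpi_flooded.size
    categories = {
        "matched (flooded)":     np.sum(fpi_flooded & usgs_flooded),
        "matched (not flooded)": np.sum(~fpi_flooded & ~usgs_flooded),
        "overestimated":         np.sum(fpi_flooded & ~usgs_flooded),
        "underestimated":        np.sum(~fpi_flooded & usgs_flooded),
    }
    return {name: 100 * count / total for name, count in categories.items()}

# Toy example with two random rasters.
rng = np.random.default_rng(1)
a, b = rng.random((200, 200)) > 0.5, rng.random((200, 200)) > 0.5
for name, pct in compare_maps(a, b).items():
    print(f"{name:>22}: {pct:5.1f}%")
```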


This model has also been applied to the Hurricane Harvey flooding in Houston, TX. The map below shows the Harvey social vulnerability and flooding depths in Harris County.

Further studies have been carried out to improve this model by incorporating post-event remote sensing images as an additional data source. Details of the improved model can be found in Huang, Wang, and Li (2018a, 2018b).

CyberSense: A Multi-Sensing System for Rapid Flood Detection and Mapping

This project aims to design and implement a multi-sensing system, named CyberSense, for rapid detection and mapping of evolving flood hazards across the nation. The system will seamlessly integrate social sensing (SS), remote sensing (RS), and in-situ sensing/stream gauges (IS) using advanced big data computing techniques to support rapid flood detection and mapping in a highly automated manner (see figure below).

This evolving multi-sensing design aims to maximize situational awareness from the available data sources in the shortest amount of time at different flood stages. The confidence of the situational awareness information increases as more sensors are fused.

Stage 1. Flood Detection: Detect a flood event and locate/delineate the impacted area. A flood detection algorithm will be developed to continuously monitor the text, hashtags, and associated images of incoming Twitter data together with USGS stream gauge observations. This stage includes two steps. The first is a statistical analysis of Twitter activity that also interprets the Twitter messages; this message analysis serves to verify the alert. The second adds information from USGS stream gauges to further confirm flood conditions in the area. This first stage will be fully automated, and an alert will be pushed out automatically (via web service or dashboard) when conditions exceed a threshold. The initial alert will be issued within about 30 minutes of a flooding event occurring.
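A minimal sketch of this two-step alerting logic, with assumed thresholds and window sizes: an alert fires when the flood-related tweet rate spikes well above its recent baseline and a nearby gauge reading reaches flood stage. The z-score rule and all constants are illustrative placeholders, not the project's detection algorithm.

```python
from statistics import mean, stdev

def should_alert(hourly_tweet_counts, gauge_height_ft, flood_stage_ft,
                 z_threshold=3.0, baseline_hours=24):
    """Return True when both the social and in-situ signals exceed thresholds."""
    baseline = hourly_tweet_counts[-baseline_hours - 1:-1]
    current = hourly_tweet_counts[-1]
    z = (current - mean(baseline)) / (stdev(baseline) or 1.0)
    tweet_spike = z > z_threshold                        # step 1: Twitter activity
    gauge_flooding = gauge_height_ft >= flood_stage_ft   # step 2: gauge check
    return tweet_spike and gauge_flooding

# Toy example: a sudden spike in flood-related tweets plus a gauge above
# flood stage triggers the alert.
counts = [4, 6, 5, 3, 7, 5, 4, 6, 5, 4, 6, 5,
          3, 4, 5, 6, 4, 5, 6, 5, 4, 5, 6, 5, 42]
print(should_alert(counts, gauge_height_ft=14.2, flood_stage_ft=12.0))
```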

Stage 2. Flood Mapping: Generate a flood depth grid for the affected area. This stage includes two steps. The first step is to develop a kernel-based flood mapping model that generates a preliminary flood depth grid for the impacted area based on flood information automatically extracted from stream gauges, hydro-corrected terrain data, Twitter text, and Twitter-based flood photos using advanced big data computing techniques (~2 hours, fully automated). Crowdsourced data, such as voluntarily reported water depths, will also be integrated into the model. The second step is to enhance the preliminary depth grid by integrating remote sensing imagery into the model when the imagery becomes available. A flood reconstruction model will be developed in this step to fuse data from SS, RS, and IS and produce an enhanced depth grid with higher confidence (12-36 hours, semi-automated).
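As a rough illustration of the second step, the sketch below fuses a preliminary depth grid with a remote-sensing water mask under an assumed rule: the imagery overrides the wet/dry decision, cells wet only in the imagery are flagged for depth estimation, and per-cell confidence is higher where the two sources agree. The fusion rule and confidence values are illustrative, not the project's reconstruction model.

```python
import numpy as np

def enhance_depth_grid(prelim_depth, rs_water_mask):
    """Suppress depths the imagery contradicts; flag cells it newly reveals."""
    enhanced = np.where(rs_water_mask, prelim_depth, 0.0)
    newly_wet = rs_water_mask & (prelim_depth <= 0)     # wet in imagery only
    agree = rs_water_mask == (prelim_depth > 0)         # sources concur
    confidence = np.where(agree, 0.9, 0.5)              # assumed values
    return enhanced, newly_wet, confidence

# Toy example: a 2x2 preliminary depth grid (m) and an imagery water mask.
prelim = np.array([[0.0, 0.4], [1.2, 0.0]])
mask = np.array([[True, True], [False, False]])
depth, new_wet, conf = enhance_depth_grid(prelim, mask)
print(depth, new_wet.sum(), conf, sep="\n")
```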

The system will integrate state-of-the-art big data computing platforms (Hadoop and Spark) and artificial intelligence (the Caffe deep learning platform) to accelerate data processing and automate information extraction from multi-sensed big data. We will design the system so that it can be 1) fully deployed on a cloud platform (e.g., Amazon Cloud) or at state agencies, and 2) easily integrated with existing systems through web services and application programming interfaces (APIs). The system will be highly flexible and scalable, delivering products to all levels of emergency response, and it can work in several modes in which information is collected and sent to other offices for further analysis.
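As an illustration of the web-service integration point, the sketch below exposes the latest alert as a small JSON endpoint using only the Python standard library; the endpoint path and payload fields are hypothetical, and a production deployment would sit behind the cloud platform described above.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical payload a downstream emergency-response system could poll.
LATEST_ALERT = {"event": "flood", "stage": 1, "area": "Columbia, SC",
                "issued": "2015-10-04T09:30:00Z", "confidence": 0.82}

class AlertHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/alerts/latest":       # hypothetical endpoint path
            body = json.dumps(LATEST_ALERT).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), AlertHandler).serve_forever()
```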

Publications:

Li Z., Wang C., Emrich C., Guo D. (2017) Leveraging Social Media for Rapid Flood Mapping: A Case Study of the 2015 October Flood in SC, Cartography and Geographic Information Science, doi: 10.1080/15230406.2016.1271356

Wang C., Li Z., Huang X. (2018) Geospatial assessment of flooding dynamics and risks of the October’15 South Carolina Flood, Southeastern Geographer, 58(2), 164-180

Huang X., Wang C., Li Z. (2018a) A Flooding Probability Reconstruction Approach by Enhancing Near Real-Time Imagery with Real-Time Gauges and Tweets, IEEE Transactions on Geoscience and Remote Sensing

Huang X., Wang C., Li Z. (2018b) A Near Real-time Flood Mapping Approach by Integrating Post-event Satellite Imagery and Flood-related Tweets, Annals of GIS
