Constructing a large-scale landslide database across heterogeneous environments using task-specific model updates
Preparation and mitigation efforts for widespread landslide hazards can be aided by a large-scale, well-labeled landslide inventory with high location accuracy. Recent small-scale studies on pixel-wise labeling of potential landslide areas in remotely sensed images using deep learning (DL) showed promise but were based on data from very small, homogeneous regions with unproven model transferability. In this paper we consider a more realistic and practical setting for large-scale heterogeneous landslide data collection and DL-based labeling. In this setting, remotely sensed images are collected sequentially in temporal batches, where each batch focuses on images from a particular ecoregion, but different batches can focus on different ecoregions with distinct landscape characteristics. For such a scenario, we study the following questions: (1) How well do DL models trained in homogeneous regions perform when they are transferred to different ecoregions? (2) Does increasing the spatial coverage of the data improve model performance in a given ecoregion, even when the extra data do not come from that ecoregion? (3) Can a landslide pixel-labeling model be incrementally updated with new data, but without access to the old data and without losing performance on the old data, so that researchers can share models obtained from proprietary datasets? We address these questions by extending the Learning without Forgetting framework, originally developed for incremental training of image classification models, to incremental training of semantic segmentation models (e.g., identifying all landslide pixels in an image). We call the resulting extension Task-Specific Model Updates (TSMU). The TSMU semantic segmentation framework consists of an encoder shared by all ecoregions to capture the similarities between them, and ecoregion-specific decoders to capture the nuances of each ecoregion. This framework is continually updated using a three-stage training procedure for each new addition of an ecoregion, without having to revisit data from old ecoregions and without losing performance on them.
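The shared-encoder/per-ecoregion-decoder structure and the Learning-without-Forgetting-style update can be sketched as follows. This is a minimal illustration, not the authors' implementation: the encoder and decoders are single linear layers, the ecoregion names are invented, and the class `TSMUSketch` and its methods are hypothetical. It shows only the key idea of recording old decoders' outputs on the *new* batch as distillation targets, so old ecoregions' data never need to be revisited.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class TSMUSketch:
    """Hypothetical sketch: a shared encoder with per-ecoregion decoders.

    The actual TSMU model is a deep semantic-segmentation network; here
    both encoder and decoders are single linear maps for illustration.
    """

    def __init__(self, in_dim, feat_dim):
        # Shared encoder weights capture similarities across ecoregions.
        self.W_enc = rng.standard_normal((in_dim, feat_dim)) * 0.1
        # One decoder head per ecoregion captures region-specific nuances.
        self.decoders = {}

    def encode(self, x):
        return relu(x @ self.W_enc)

    def add_ecoregion(self, name, n_classes=2):
        feat_dim = self.W_enc.shape[1]
        self.decoders[name] = rng.standard_normal((feat_dim, n_classes)) * 0.1

    def predict(self, x, ecoregion):
        # Per-pixel class scores from the ecoregion-specific head.
        return self.encode(x) @ self.decoders[ecoregion]

    def distillation_targets(self, x_new):
        # LwF-style step: record the existing heads' outputs on the NEW
        # data; matching these during training preserves old-ecoregion
        # behavior without access to the old ecoregions' images.
        return {name: self.predict(x_new, name) for name in self.decoders}

# Usage: add a first ecoregion, then update for a second one.
model = TSMUSketch(in_dim=8, feat_dim=4)
model.add_ecoregion("ecoregion_a")          # hypothetical region name
x_new = rng.standard_normal((5, 8))         # 5 "pixels" from a new batch
targets = model.distillation_targets(x_new) # frozen targets for old head
model.add_ecoregion("ecoregion_b")          # new head; old data not revisited
```

In a full implementation, a training loop would jointly minimize the segmentation loss on the new ecoregion and a distillation loss against `targets`, which is the mechanism that lets models trained on proprietary data be shared and extended.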
|Publication Subtype||Journal Article|
|Title||Constructing a large-scale landslide database across heterogeneous environments using task-specific model updates|
|Series title||IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing|
|Publisher||Institute of Electrical and Electronics Engineers|
|Contributing office(s)||Geologic Hazards Science Center|