
PUBLICATIONS

Papers, articles and reports released as part of GEM's initiatives to advance science and knowledge sharing. Selected reports and other materials produced by the international consortia on global projects, working groups and regional collaborations can also be found below.

Featured Publications

Development of a global seismic risk model

GEM Strategic Plan and Roadmap to 2030

Improving Post-Disaster Damage Data Collection to Inform Decision-Making Final Report


Publications List


Working together to assess risk from global to local: lessons from the Global Earthquake Model

Type:

Peer-reviewed

Reliable, high-quality risk assessment is the basis for an objective understanding of risk, and is Priority for Action 1 of the Sendai Framework for Disaster Risk Reduction 2015-2030. It is the foundation of decisions and actions that effectively build resilience. Earthquake risk continues to rise, yet reliable data, risk information, and assessment tools are out of reach or under-utilised in many parts of the world. The Global Earthquake Model (GEM Foundation) was created to bridge these critical gaps. Through authentic collaboration across public and private stakeholders, the GEM community supports risk management and awareness by developing and implementing open risk assessment tools and by compiling and generating risk information. GEM influences risk reduction by promoting technology transfer and developing risk assessment capacity. All GEM risk assessment resources are made freely available through its web-based OpenQuake platform. As input to the Sendai Framework, this paper provides an overview of GEM's achievements to date and lessons learnt, emphasizing effective modes of collaboration and capacity development, and presents opportunities and challenges going forward. This paper is from the Global Risk Forum Davos Planet@Risk journal, which is no longer available.

ATLAS 2.0: Ground-shaking intensities at multiple return periods all over the world

Type:

Brochure

ATLAS 2.0 is GEM's hazard data service that allows users to access and interact with the outputs of the GEM Global Mosaic, which were used to generate the Global Seismic Hazard Maps. Available for both public-good and commercial applications, the service provides full sets of hazard curves that describe the intensity of ground shaking for different soil conditions, at multiple return periods, all over the world.
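A hazard curve relates ground-shaking intensity to its annual probability of exceedance, so reading it "at a return period" means interpolating at a probability of 1/T. The sketch below illustrates only that standard relationship; the PGA values and probabilities are hypothetical placeholders, not ATLAS 2.0 data, and the real service's formats and access methods differ.

```python
import numpy as np

# Hypothetical hazard curve: PGA levels (g) vs. annual probability of
# exceedance (PoE). Illustrative values only, not real ATLAS 2.0 output.
pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
annual_poe = np.array([1e-1, 4e-2, 1e-2, 2e-3, 2e-4])

def intensity_at_return_period(levels, poe, return_period_yr):
    """Log-log interpolate the hazard curve at PoE = 1 / return period."""
    target = 1.0 / return_period_yr
    # np.interp needs an increasing x-axis, so reverse the decreasing PoE axis.
    return float(np.exp(np.interp(np.log(target),
                                  np.log(poe[::-1]),
                                  np.log(levels[::-1]))))

# PGA for the common 475-year return period (~10% PoE in 50 years)
pga_475 = intensity_at_return_period(pga, annual_poe, 475)
```

The 475-year value lands between the 100-year (0.2 g) and 500-year (0.4 g) points of this toy curve, as expected from the interpolation.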

Quantify Your Earthquake Risk: Expert Solutions from the GEM Foundation

Type:

Brochure

A quick glance at GEM's commercially available, scientifically robust risk information and flagship products. The brochure also highlights GEM's collaborative public-good projects around the world.

Building a World Resilient to Earthquakes and other Natural Hazards

Type:

Brochure

An overview of the GEM Foundation's history, its collaborative and transparent approach, pioneering scientific tools such as the OpenQuake Engine, and the benefits of supporting GEM.

Ranking and developing ground-motion models for Southeastern Africa

Type:

Peer-reviewed

The southern East African Rift System (EARS) is an early-stage continental rift with a deep seismogenic zone. It is associated with a low-to-moderate seismic hazard, but due to its short and sparse instrumental record, there is a lack of ground-motion studies in the region. Instead, seismic hazard assessments have commonly relied on a combination of active crustal and stable continental ground-motion models (GMMs) from other regions without accounting for the unusual geological setting of this region and evaluating their suitability. Here, we use a newly compiled southern EARS ground-motion database to compare six active crustal GMMs and four stable continental GMMs. We find that the active crustal GMMs tend to underestimate the ground-motion intensities observed, while the stable continental GMMs overestimate them. This is particularly pronounced in the high-frequency intensity measures (>5 Hz). We also use the referenced empirical approach and develop a new region-specific GMM for southern EARS. Both the ranked GMMs and our new GMM result in large residual variabilities, highlighting the need for local geotechnical information to better constrain site conditions.

A Database and Empirical Model for Earthquake Post Loss Amplification

Type:

Peer-reviewed

The impact of destructive earthquakes can exceed the local capacity to cope with disasters and lead to an increase in reconstruction costs. This phenomenon is commonly termed post-loss amplification, and its main causes include the increase in the cost of construction materials and labor due to sudden demand, the need to rebuild to higher standards, and other unexpected costs. We reviewed 70 past earthquakes to identify events where post-loss amplification was observed, and collected a set of seismogenic, socio-economic, geographical, and impact variables for those events. Using this database, we developed two models to predict post-loss amplification: one using a composite indicator that reflects the level of destruction in the region, and another using a parameter that characterizes the frequency of the event. This study indicates increased costs (>10%) for events where the economic losses exceed 1% of the regional gross domestic product, or for events with an estimated return period of at least 10 years. These models can be applied directly to amplify economic losses in earthquake scenarios or in probabilistic seismic risk assessment.
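The 1%-of-regional-GDP threshold reported in the abstract can be read as a simple screening rule for when amplification above 10% becomes likely. The function below is a minimal sketch of that screen only; the function name and example figures are hypothetical, and the paper's actual fitted models use additional variables.

```python
def likely_post_loss_amplification(economic_loss, regional_gdp):
    """Screening heuristic based on the reported threshold: events whose
    economic losses exceed 1% of regional GDP tend to show post-loss
    amplification above 10%. Not the paper's fitted model."""
    return economic_loss / regional_gdp > 0.01

# Hypothetical example: a $2bn loss in a region with $150bn GDP (~1.3% of GDP)
flag = likely_post_loss_amplification(2e9, 150e9)  # True: amplification likely
```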

Earthquake scenarios for building portfolios using artificial neural networks: Part I – Ground motion modelling

Type:

Peer-reviewed

The calculation of the spatial distribution of ground motion is one of the most important steps in earthquake scenarios. The advancements in machine learning algorithms and the release of new ground motion data can enable improvements in the reliability and accuracy of this component. We developed an artificial neural network (ANN) ground motion model using a compiled database from two subsets with measured Vs30 from the Pan-European strong motion database and the NGA-West2 database. The ANN model employs five input parameters: moment magnitude (Mw), hypocentral depth, Joyner–Boore distance (Rjb), shear wave velocity in the top 30 m (Vs30), and faulting type. The outputs of the ANN are the RotD50 horizontal components of common intensity measures used in seismic risk assessment: PGA, PGD, PGV, Arias Intensity, and 5% damped spectral acceleration at 27 periods from 0.01 to 4.0 s. A mixed-effects modelling approach was followed to train the ANN and partition the ground motion variability into between-event and between-site terms. The input parameter scaling relationships were studied and demonstrated physically sound trends of the ground motion with respect to Mw scaling, distance attenuation, and site amplification. The predicted median response spectra for several combinations of Mw and Rjb were compared to common ground motion models for Europe, and the results are discussed. The developed ANN is used in a companion study to calculate the hazard footprints for several historical earthquakes in the Balkan region.

Earthquake scenarios for building portfolios using artificial neural networks: Part II – Damage and loss assessment

Type:

Peer-reviewed

Seismic risk assessment of building portfolios has been traditionally performed using empirical ground motion models, and scalar fragility and vulnerability functions. The advent of machine learning algorithms in earthquake engineering and ground motion modelling has demonstrated promising advantages. The aim of the present study is to explore the benefits of employing artificial neural networks (ANNs) in earthquake scenarios for spatially-distributed building portfolios. To this end, several recent major seismic events in the Balkan region were selected to assess damage and economic losses, considering different modelling approaches. For the assessment of the seismic demand, the ANN developed in the companion study and common ground motion models for Europe were adopted. For the vulnerability component, recent ANN models and existing scalar fragility and vulnerability functions for the Balkans were used. The estimates of all modelling cases were compared against the aggregated damage and economic loss data observed in the aftermath of these events. The findings of this study suggest that overall, the ANNs led to damage and economic loss estimates closer to the observations.

Advancing nearshore and onshore tsunami hazard approximation with machine learning surrogates

Type:

Peer-reviewed

Probabilistic tsunami hazard and risk assessment (PTHA and PTRA) are vital methodologies for computing tsunami risk and prompting measures to mitigate impacts. At large regional scales, their use and scope are currently limited by the computational cost of the numerically intensive simulations behind them, which may be feasible only with advanced computational resources such as high-performance computing (HPC) and may still require reductions in resolution or in the number of scenarios modelled, or the use of simpler approximation schemes. To conduct PTHA and PTRA for large proportions of the coast, we therefore need to develop concepts and algorithms that reduce the number of events simulated and approximate the needed simulation results more efficiently. This case study for a coastal region of Tohoku, Japan, uses a limited number of tsunami simulations from submarine earthquakes along the subduction interface to build a wave propagation and inundation database, and fits these simulation results with a machine learning-based variational encoder-decoder model. The model is used as a surrogate to predict the tsunami waveform at the coast and the maximum inundation depths onshore at the different test sites. The performance of the surrogate models was assessed using 5-fold cross-validation across the simulation events. Further, to understand the model's real-world performance and test its generalisability, we benchmarked it against 5 very different tsunami source models from the literature for historic events and identified its current deficiencies.

A Building Imagery Database for the Calibration of Machine Learning Algorithms

Type:

Peer-reviewed

In recent decades, most efforts to catalog and characterize the built environment for multi-hazard risk assessment have focused on the exploration of census data, cadastral data sets, and local surveys. Typically, these sources of information are not updated regularly and lack sufficient information to characterize the seismic vulnerability of the building stock. Some recent efforts have demonstrated how machine learning algorithms can be used to automatically recognize specific architectural and structural features of buildings. However, such methods require large sets of labeled images to train, verify, and test the algorithms. This article presents a database of 5276 building images from a parish in Lisbon (Alvalade), whose buildings have been classified according to a uniform taxonomy. This database can be used for the testing and calibration of machine learning algorithms, as well as for the direct assessment of earthquake risk in Alvalade. The data are accessible through an open GitHub repository (DOI: 10.5281/zenodo.7625940).