Scientists are tackling the critical challenge of improving natural hazard modelling with a novel approach to assessing infrastructure vulnerability. Abdullah M. Braik and Maria Koliou, both from Texas A&M University, present a two-stage Bayesian framework that uniquely combines physics-based fragility functions with real-time, post-disaster data. This research is significant because it allows for the continuous refinement of regional vulnerability estimates as observations become available, offering a more dynamic and accurate picture of risk than traditional static models. By reformulating fragility estimates and employing a sophisticated updating process, the team demonstrates how to correct biased predictions, spatially propagate information, and ultimately generate more reliable uncertainty-aware exceedance probabilities for real-time disaster response.
Scientists have long recognised that natural hazards continue to challenge communities worldwide, revealing persistent gaps in our ability to anticipate, characterise, and interpret structural damage. Despite advances in engineering design, hazard modelling, and resilience planning, the observed performance of the built environment under extreme loading frequently diverges from model predictions. These discrepancies stem from substantial aleatory uncertainties in hazard intensity and structural capacity, as well as epistemic uncertainties embedded in the assumptions used to represent them. Before a hazard strikes, risk assessments rely on probabilistic vulnerability models derived from physics-based simulations, empirical data from past events, heuristic rules, or hybrid formulations.
These models are indispensable for mitigation and preparedness planning, yet their usefulness is largely confined to the pre-event phase. Once a hazard unfolds, fragility-based predictions remain static and cannot adapt to evolving conditions or observed building performance, causing prediction accuracy to degrade when real conditions diverge from idealised assumptions. Field inspections, post-disaster imagery, aerial surveys, and crowdsourced reports provide direct evidence of building performance across affected regions. However, these data streams are often incomplete, delayed, and heterogeneous in fidelity.
More critically, they rarely enter a statistically principled framework capable of assimilating them and updating prior predictions. As a result, pre-event vulnerability models and post-event observations have progressed along parallel tracks, even though the limitations of one align almost perfectly with the strengths of the other. Without a unified framework to connect them, vulnerability models remain disconnected from the evidence that could refine them, while observational data lacks a coherent prior for rigorous inference. This disconnect represents a missed opportunity to develop dynamic digital twins that operate before and after a hazard, adapt to varying data fidelity and coverage, and provide a principled mathematical foundation for real-time decision support.
Fragility modelling emerged from the evolution of structural reliability theory, which reframed structural safety as a probabilistic event influenced by uncertainty in both loading and resistance. Early foundational work introduced limit-state formulations, reliability indices, and probability-of-failure concepts, enabling structural performance to be quantified probabilistically rather than through fixed safety factors. These ideas permeated modern design practice through load- and resistance-factor methodologies, shifting vulnerability assessment away from deterministic margins toward explicit representations of uncertainty. Within this context, fragility functions emerged as practical tools for expressing the probability that hazard demand exceeds structural capacity, accelerating adoption during the rise of performance-based earthquake engineering.
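For context, the most common concrete form of a fragility function is the lognormal model, in which the probability of reaching or exceeding a damage state is a normal CDF of the log hazard intensity. The minimal sketch below illustrates that standard textbook form with made-up parameters; it is not the Probit-Normal representation developed in the paper.

```python
import numpy as np
from scipy.stats import norm

def lognormal_fragility(im, theta, beta):
    """Probability of reaching or exceeding a damage state at hazard
    intensity `im`, using the classic lognormal fragility form
    P(DS >= ds | IM = im) = Phi((ln(im) - ln(theta)) / beta)."""
    im = np.asarray(im, dtype=float)
    return norm.cdf((np.log(im) - np.log(theta)) / beta)

# Illustrative (made-up) parameters: median capacity theta and logarithmic
# dispersion beta for a hypothetical low-rise wood-frame archetype.
wind_speeds = np.array([30.0, 45.0, 60.0, 75.0])  # m/s
print(lognormal_fragility(wind_speeds, theta=50.0, beta=0.35))
```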
Over the past two decades, fragility modelling has expanded across hazards, building types, and infrastructure systems. In earthquake engineering, empirical, experimental, and simulation-based methods have produced multi-state fragilities for reinforced concrete systems, steel systems, wood systems, and nonstructural components. Incremental dynamic analysis further strengthened the statistical treatment of record-to-record variability and improved the robustness of fragility modelling. Fragility concepts were subsequently extended to other hazards. For hurricanes, fragility functions have been developed for wind effects on low-rise residential structures, flood impacts, and storm-surge or wave-induced damage.
Tornado-specific fragilities have been generated for buildings subjected to highly localised extreme wind fields. Today, fragility functions support a wide range of applications, including regional loss estimation, performance-based engineering, and mitigation planning at the community scale. Yet even the most advanced fragility models remain fundamentally pre-event constructs. Their mathematically structured form positions them naturally as priors for post-event inference, but this capability remains largely unrealised. Rapid and reliable post-disaster damage assessment is essential for emergency response, loss estimation, and early recovery planning.
Traditional assessment relied on ground surveys, windshield inspections, and reconnaissance teams, which delivered detailed building-level information but required substantial time, labour, and physical access to affected areas. While important for long-term model refinement and code calibration, these methods remain insufficient for real-time decision-making immediately after a hazard. Remote sensing has expanded the achievable scale and timeliness of post-disaster assessments. Unmanned Aerial Vehicle (UAV) imagery offers fine spatial resolution for detailed structural evaluation, whereas aerial and satellite imagery provide rapid, large-area coverage essential for regional situational awareness.
In parallel, deep learning has become the dominant approach for automating damage inference from these imagery sources. Beyond computer vision, other post-disaster data streams have emerged. Social sensing and outage reports offer indirect indicators of damage, while crowdsourcing platforms provide rapid, human-generated observations at scale. Although these sources broaden the available evidence, they also amplify challenges of heterogeneity, partial coverage, and variable fidelity. Despite these advances, most post-disaster approaches remain fundamentally data-driven and lack grounding in physics-based vulnerability principles or statistically coherent inference.
Some have explored reinforcement learning to guide autonomous robotics for reconnaissance and search-and-rescue, though these efforts emphasise operational efficiency rather than principled updating of structural performance predictions. Complementary research has incorporated post-disaster observations into broader risk and recovery frameworks. For example, Braik and Koliou combined remote sensing, GIS, and deep learning to automate post-event damage assessment and used the resulting outputs for recovery forecasting. Building on this, Braik and Han used computer-vision-derived damage states to initialise agent-based recovery models and to back-infer tornado wind fields.
Bayesian updating of fragility via Beta-Bernoulli models offers a statistically principled way to close this gap, allowing post-event observations to refine the very fragility functions that served as pre-event priors.
Scientists have developed a novel Bayesian framework that unifies physics-based fragility functions with real-time post-disaster observations, addressing a long-standing gap in natural hazard modelling. The research introduces a method for continuously refining regional vulnerability estimates as new data becomes available, offering a dynamic approach to assessing structural damage. The researchers reformulated physics-informed fragility estimates into a Probit-Normal (PN) representation, effectively capturing both aleatory variability and epistemic uncertainty in an analytically tractable form. Stage 1 of the framework performs local Bayesian updating by moment-matching PN marginals to Beta surrogates, preserving probability shapes and enabling conjugate Beta-Bernoulli updates with soft, multi-fidelity observations.
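To make the Stage 1 mechanics concrete, the minimal sketch below moment-matches a single site's exceedance probability (mean and variance, as a PN-style prior would supply) to a Beta surrogate, then applies a fidelity-weighted conjugate Beta-Bernoulli update with "soft" observations. The pseudo-count weighting scheme and all numbers are illustrative assumptions rather than the paper's exact formulation, and the re-projection back to PN form is omitted.

```python
import numpy as np

def beta_from_moments(mean, var):
    """Moment-match a Beta(a, b) surrogate to an exceedance probability
    with the given mean and variance (var must be < mean*(1-mean))."""
    nu = mean * (1.0 - mean) / var - 1.0
    return mean * nu, (1.0 - mean) * nu

def soft_bernoulli_update(a, b, y, w):
    """Conjugate Beta-Bernoulli update with a 'soft' observation y in [0, 1]
    (e.g. an exceedance score from imagery), down-weighted by a fidelity
    weight w in [0, 1]: the observation contributes w pseudo-counts."""
    return a + w * y, b + w * (1.0 - y)

# Prior exceedance probability implied by a physics-based fragility at one
# site (illustrative numbers): mean 0.30 with epistemic std 0.10.
a, b = beta_from_moments(0.30, 0.10**2)

# Assimilate two observations of different fidelity: a field inspection
# (high fidelity, w=0.9) indicating exceedance, and an aerial-imagery score
# (lower fidelity, w=0.4) suggesting no exceedance.
for y, w in [(1.0, 0.9), (0.0, 0.4)]:
    a, b = soft_bernoulli_update(a, b, y, w)

post_mean = a / (a + b)
post_var = a * b / ((a + b) ** 2 * (a + b + 1.0))
print(post_mean, post_var)  # posterior variance reflects how much evidence arrived
```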
Fidelity weights were meticulously encoded to reflect source reliability, and the resulting Beta posteriors were re-projected into PN form, producing heteroscedastic fragility estimates whose variances reflect data quality and coverage. Measurements confirm that this process effectively corrects biased priors and propagates information spatially, enhancing the accuracy of vulnerability assessments. The team measured the impact of varying tornado widths, sampling strategies, and observation completeness on the framework’s performance, demonstrating its robustness under diverse conditions. Subsequently, Stage 2 assimilates these heteroscedastic observations within a probit-warped Gaussian Process (GP), which propagates information from high-fidelity sites to low-fidelity and unobserved regions through a composite kernel linking space, archetypes, and correlated damage states.
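The Stage 2 step can be sketched as GP regression on the probit (latent) scale with a composite kernel and site-specific noise variances inherited from Stage 1. The kernel forms, hyperparameters, and data below are illustrative assumptions, not the paper's implementation; in particular, the GP is given a zero prior mean here for brevity, whereas in practice the latent mean would be anchored to the probit of the physics-based prior.

```python
import numpy as np
from scipy.stats import norm

def composite_kernel(X1, X2, arch1, arch2, ds1, ds2,
                     length=500.0, arch_corr=0.3, ds_corr=None):
    """Composite kernel linking space, archetypes, and damage states:
    an RBF over coordinates, a similarity factor across building archetypes,
    and a correlation factor across damage states (illustrative forms)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    k_space = np.exp(-0.5 * d2 / length**2)
    k_arch = np.where(arch1[:, None] == arch2[None, :], 1.0, arch_corr)
    k_ds = ds_corr[np.ix_(ds1, ds2)]
    return k_space * k_arch * k_ds

# --- Toy data (all values made up) ----------------------------------------
rng = np.random.default_rng(0)
X_obs = rng.uniform(0, 2000, size=(40, 2))        # observed site coordinates (m)
arch_obs = rng.integers(0, 2, size=40)            # two building archetypes
ds_obs = np.zeros(40, dtype=int)                  # single damage state for brevity
p_obs = np.clip(rng.beta(2, 3, size=40), 1e-3, 1 - 1e-3)  # Stage-1 posterior means
var_obs = rng.uniform(0.05, 0.4, size=40)         # site-specific latent noise variances
ds_corr = np.array([[1.0]])                       # damage-state correlation matrix

# Work on the probit (latent) scale with heteroscedastic observation noise.
z_obs = norm.ppf(p_obs)
K = composite_kernel(X_obs, X_obs, arch_obs, arch_obs, ds_obs, ds_obs, ds_corr=ds_corr)
K_noisy = K + np.diag(var_obs)

# GP prediction at unobserved sites, then back through the probit warp.
X_new = rng.uniform(0, 2000, size=(5, 2))
arch_new = rng.integers(0, 2, size=5)
ds_new = np.zeros(5, dtype=int)
K_star = composite_kernel(X_new, X_obs, arch_new, arch_obs, ds_new, ds_obs, ds_corr=ds_corr)
z_mean = K_star @ np.linalg.solve(K_noisy, z_obs)
print(norm.cdf(z_mean))  # predicted exceedance probabilities at unobserved sites
```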
Tests show that the GP effectively bridges data gaps, providing a comprehensive understanding of damage patterns across affected areas. Results demonstrate the framework’s ability to produce uncertainty-aware exceedance probabilities, supporting real-time situational awareness during and after hazardous events. The application of this framework to the 2011 Joplin tornado revealed significant improvements in damage assessment accuracy, particularly in areas with limited observational data. Researchers fused wind-field priors with computer-vision damage assessments, achieving a more nuanced and reliable understanding of structural vulnerability. Data shows that the method’s ability to integrate multi-fidelity data streams, from detailed inspections to aerial surveys, significantly enhances the precision of damage estimates. The breakthrough delivers a powerful tool for developing dynamic digital twins that adapt to evolving conditions and provide a principled mathematical foundation for real-time decision making.
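As a final illustration of what "uncertainty-aware exceedance probabilities" can mean in practice, the short snippet below takes an assumed latent predictive mean and variance at one site (the kind of output a Stage 2 GP would provide) and pushes Monte Carlo samples through the probit warp to obtain a credible interval on the exceedance probability. The numbers are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Latent (probit-scale) predictive mean and variance at one unobserved site
# (illustrative values, as would come from the Stage-2 GP posterior).
z_mean, z_var = 0.4, 0.25

# Monte Carlo: sample the latent field and map through the probit warp to
# obtain a distribution over the exceedance probability itself.
samples = norm.cdf(norm.rvs(loc=z_mean, scale=np.sqrt(z_var),
                            size=10_000, random_state=0))
lo, hi = np.percentile(samples, [5, 95])
print(f"P(exceedance) ~ {samples.mean():.2f}, 90% credible interval [{lo:.2f}, {hi:.2f}]")
```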
👉 More information
🗞 A Two-Stage Bayesian Framework for Multi-Fidelity Online Updating of Spatial Fragility Fields
🧠 ArXiv: https://arxiv.org/abs/2601.13396
