EISG in Risk Analysis (2018)

Risk Analysis is the official journal of the Society for Risk Analysis and publishes peer-reviewed, original research on both the theory and practice of risk. The application areas are vast. Below are articles with particular relevance to the Engineering and Infrastructure Specialty Group.

December 2018

Improving Hurricane Power Outage Prediction Models Through the Inclusion of Local Environmental Factors
Tropical cyclones can significantly damage the electrical power system, so an accurate spatiotemporal forecast of outages prior to landfall can help utilities to optimize the power restoration process. The purpose of this article is to enhance the predictive accuracy of the Spatially Generalized Hurricane Outage Prediction Model (SGHOPM) developed by Guikema et al. (2014). In this version of the SGHOPM, we introduce a new two‐step prediction procedure and increase the number of predictor variables. The first model step predicts whether or not outages will occur in each location and the second step predicts the number of outages. The SGHOPM environmental variables of Guikema et al. (2014) were limited to the wind characteristics (speed and duration of strong winds) of the tropical cyclones. This version of the model adds elevation, land cover, soil, precipitation, and vegetation characteristics in each location. Our results demonstrate that the use of a new two‐step outage prediction model and the inclusion of these additional environmental variables increase the overall accuracy of the SGHOPM by approximately 17%.

Multivariate Global Sensitivity Analysis Based on Distance Components Decomposition
In this article, a new set of multivariate global sensitivity indices based on distance components decomposition is proposed. The proposed sensitivity indices can be considered an extension of the traditional variance‐based sensitivity indices and the covariance decomposition‐based sensitivity indices, and they have similar forms. The advantage of the proposed sensitivity indices is that they can measure the effects of an input variable on the whole probability distribution of the multivariate model output when the power of the distance, α, satisfies 0 < α < 2. When α = 2, the proposed sensitivity indices are equivalent to the covariance decomposition‐based sensitivity indices. To calculate the proposed sensitivity indices, an efficient Monte Carlo method is proposed, which can also be used to calculate the covariance decomposition‐based sensitivity indices at the same time. The examples demonstrate the reasonableness of the proposed sensitivity indices and the stability of the proposed Monte Carlo method.
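As a point of reference for the variance‐based indices that the article generalizes, a first‐order Sobol index can be estimated with a standard pick‐and‐freeze Monte Carlo scheme. The sketch below uses a toy linear model with a known analytic answer; the model, inputs, and sample size are illustrative and unrelated to the article's own examples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy model Y = X1 + 0.5*X2 with independent standard normal inputs.
# Analytic first-order indices: S1 = 1/1.25 = 0.8 and S2 = 0.25/1.25 = 0.2.
def model(x1, x2):
    return x1 + 0.5 * x2

x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
x1b, x2b = rng.standard_normal(n), rng.standard_normal(n)

y = model(x1, x2)
y_fix1 = model(x1, x2b)   # X1 kept ("frozen"), X2 resampled
y_fix2 = model(x1b, x2)   # X2 kept, X1 resampled

# Pick-and-freeze: Cov(Y, Y_fix_i) / Var(Y) estimates the first-order index.
var_y = y.var()
s1 = (np.mean(y * y_fix1) - y.mean() * y_fix1.mean()) / var_y
s2 = (np.mean(y * y_fix2) - y.mean() * y_fix2.mean()) / var_y
```

The distance‐components indices of the article reduce to this covariance‐style form in the α = 2 limit, which is what makes the comparison meaningful.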

November 2018

Understanding Compound, Interconnected, Interacting, and Cascading Risks: A Holistic Framework
In recent years, there has been a gradual increase in research literature on the challenges of interconnected, compound, interacting, and cascading risks. These concepts are becoming ever more central to the resilience debate. They aggregate elements of climate change adaptation, critical infrastructure protection, and societal resilience in the face of complex, high‐impact events. However, despite the potential of these concepts to link together diverse disciplines, scholars and practitioners need to avoid treating them in a superficial or ambiguous manner. Overlapping uses and definitions could generate confusion and lead to the duplication of research effort. This article gives an overview of the state of the art regarding compound, interconnected, interacting, and cascading risks. It is intended to help build a coherent basis for the implementation of the Sendai Framework for Disaster Risk Reduction (SFDRR). The main objective is to propose a holistic framework that highlights the complementarities of the four kinds of complex risk in a manner that is designed to support the work of researchers and policymakers. This article suggests how compound, interconnected, interacting, and cascading risks could be used, with little or no redundancy, as inputs to new analyses and decisional tools designed to support the implementation of the SFDRR. The findings can be used to improve policy recommendations and support tools for emergency and crisis management, such as scenario building and impact trees, thus contributing to the achievement of a system‐wide approach to resilience.

Pricing Storm Surge Risks in Florida: Implications for Determining Flood Insurance Premiums and Evaluating Mitigation Measures
The National Flood Insurance Program (NFIP) has been criticized for inaccurate flood hazard maps and premiums that are not risk based. We employ granular storm surge data comprising five different event probabilities with associated flood elevations to calculate surge risk‐based premiums for homes in Pensacola, Florida, which we compare with NFIP premiums that are based on flood risk data with only one event probability (1% annual chance floods). We demonstrate how more granular flood risk data used for calculating risk‐based insurance premiums should be part of the NFIP mapping and rate‐setting processes. We then examine three different sea‐level rise (SLR) scenarios specific to Pensacola from the National Oceanic and Atmospheric Administration (NOAA), and assess surge risk‐based premiums out to 2100. We analyze the cost effectiveness of elevating homes to mitigate surge risks when costs of elevation are one lump upfront sum, and when costs are spread over 30 years via low‐interest mitigation loans. Benefits are the avoided future losses from surge risks going out to 2100 with the three different SLR scenarios. Findings show that it is cost effective to elevate high‐value homes with low first‐floor elevations in the most risky surge zones. Spreading costs of elevation with 30‐year loans should be directed at low‐income households to address affordability concerns. Alternative flood mitigation actions, such as wet floodproofing and elevating electrical and plumbing utilities, should be considered in instances where elevation is not cost effective.
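The premium calculation described here amounts to integrating losses over an exceedance‐probability curve built from several event probabilities, rather than pricing off a single 1% event. A minimal sketch of that integration, with entirely hypothetical loss figures for one home:

```python
# Hypothetical exceedance curve for one home: annual exceedance probability
# of each surge event paired with the loss it would cause (USD).
# These numbers are invented for illustration, not taken from the study.
events = [
    (0.10, 5_000),     # 10-year event
    (0.04, 20_000),    # 25-year event
    (0.02, 45_000),    # 50-year event
    (0.01, 80_000),    # 100-year event (the single NFIP design event)
    (0.002, 150_000),  # 500-year event
]

def expected_annual_loss(events):
    """Trapezoidal integration of loss over the exceedance-probability axis."""
    ev = sorted(events, key=lambda e: -e[0])  # most to least frequent
    eal = 0.0
    for (p_hi, loss_lo), (p_lo, loss_hi) in zip(ev, ev[1:]):
        eal += (p_hi - p_lo) * 0.5 * (loss_lo + loss_hi)
    # Tail: assume losses beyond the rarest event stay at its level.
    eal += ev[-1][0] * ev[-1][1]
    return eal

eal = expected_annual_loss(events)
```

A risk‐based premium would start from this expected annual loss plus loadings; note that the multi‐event curve yields a materially different figure than the single 1% event alone (0.01 × 80,000 = 800 here), which is the article's core point about granularity.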

Tornado Damage Mitigation: Homeowner Support for Enhanced Building Codes in Oklahoma
Tornadoes impose enormous costs on society. Relatively simple and inexpensive enhancements to building codes may reduce these costs by 30% or more, but only one city in the United States has adopted these codes. Why is this the case? This analysis addresses this question by examining homeowner support for more stringent building codes in Oklahoma, a conservative state that routinely experiences damaging tornadoes. Survey data show that support for mandatory mitigation policies like building codes is subject to countervailing forces. Push dynamics, including objective risk data, homeowners’ risk perceptions, and damage experience, encourage support for mitigation. Pull dynamics, such as individualistic and conservative worldviews, and skepticism about climate change, generate opposition. At the margin, the pull dynamics appear to exert more force than push dynamics, creating only a weak basis of support that is not strong enough to overcome the status quo bias in a state that is cautious about regulatory measures. The concluding section offers suggestions for changing these dynamics.

A Probabilistic Paradigm for the Parametric Insurance of Natural Hazards
There is a pressing need for simple and reliable risk transfer mechanisms that can pay out quickly after natural disasters, without the delays caused by loss estimation and without the need for long historical claims records. One such approach, known as parametric insurance, pays out when a key hazard variable exceeds a predetermined threshold. However, this approach to catastrophe risk, based on making deterministic binary predictions of loss occurrence, is susceptible to basis risk (a mismatch between payouts and realized losses).

A more defensible approach is to issue probabilistic predictions of loss occurrence, which then allows uncertainty to be properly quantified, communicated, and evaluated. This study proposes a generic probabilistic framework for parametric trigger modeling based on logistic regression, and idealized modeling of potential damage given knowledge of a hazard variable. We also propose various novel methods for evaluating the quality and utility of such predictions as well as more traditional trigger indices.

The methodology is demonstrated by application to flood‐related disasters in Jamaica from 1998 to 2016 using gridded precipitation data as the hazard variable. A hydrologically motivated transformation is proposed for calculating potential damage from daily rainfall data. Despite the simplicity of the approach, the model has substantial skill at predicting the probability of occurrence of loss days as demonstrated by traditional goodness‐of‐fit measures (i.e., pseudo‐R2 of 0.55) as well as probabilistic verification diagnostics such as receiver operating characteristics. Using conceptual models of decisionmaker expenses, we also demonstrate that the system can provide considerable utility to involved parties, e.g., insured parties, insurers, and risk managers.
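A trigger model of the general kind described can be sketched as a plain logistic regression fitted by gradient ascent on the log‐likelihood. The hazard variable, coefficients, and data below are synthetic stand‐ins, not the study's Jamaican rainfall data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the study's setup: a daily hazard variable
# (e.g., transformed rainfall, in mm) and a binary loss-day indicator
# whose true probability rises with the hazard. Illustrative only.
n = 2000
hazard = rng.gamma(shape=2.0, scale=10.0, size=n)
true_p = 1.0 / (1.0 + np.exp(-(0.15 * hazard - 4.0)))
loss_day = rng.binomial(1, true_p)

# Fit P(loss | hazard) = sigmoid(a + b*z) by gradient ascent on the
# log-likelihood, with the hazard standardized for a stable step size.
z = (hazard - hazard.mean()) / hazard.std()
a, b = 0.0, 0.0
lr = 1e-3
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(a + b * z)))
    a += lr * np.sum(loss_day - p)
    b += lr * np.sum((loss_day - p) * z)

def trigger_probability(x):
    """Probabilistic trigger: chance of a loss day given the hazard value."""
    zx = (x - hazard.mean()) / hazard.std()
    return 1.0 / (1.0 + np.exp(-(a + b * zx)))
```

A payout rule can then be stated probabilistically, for example paying when `trigger_probability` of the observed rainfall exceeds a chosen level, which makes the basis risk explicit rather than hidden in a deterministic threshold.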

Resilience of Critical Infrastructures: Review and Analysis of Current Approaches
In crisis situations, systems, organizations, and people must react and deal with events that are inherently unpredictable before they occur: vital societal functions and thus infrastructures must be restored or adapted as quickly as possible. This capacity refers to resilience. Progress concerning its conceptualization has been made but it remains difficult to assess and apply in practice. The results of this article stem from a literature review allowing the analysis of current advances in the development of proposals to improve the management of infrastructure resilience. The article: (i) identifies different dimensions of resilience; (ii) highlights current limits of assessing and controlling resilience; and (iii) proposes several directions for future research that could go beyond the current limits of resilience management, but subject to compliance with a number of constraints. These constraints are taking into account different hazards, cascade effects, and uncertain conditions, dealing with technical, organizational, economical, and human domains, and integrating temporal and spatial aspects.

October 2018

Theoretical Matters: On the Need for Hazard and Disaster Theory Developed Through Interdisciplinary Research and Collaboration
Hazard and disaster research requires a willingness to step outside of traditional disciplinary ontological and epistemological assumptions to both accommodate and integrate different perspectives. Moreover, the complex qualities of hazards and disasters necessitate interdisciplinary approaches to inform theory development that encompasses environmental, human, and infrastructure systems at multiple scales and units of analysis. Unfortunately, truly integrative hazard and disaster theory at a scale broad enough to account for the many systems and processes involved is currently limited. In this article, we argue that robust hazard and disaster theory can only arise from interdisciplinary research and collaboration. We examine challenges to the development of interdisciplinary hazard and disaster theory, and discuss the characteristics of theory necessary for the goal‐oriented nature of research aimed at reducing disaster impact.

Sequential Hazards Resilience of Interdependent Infrastructure System: A Case Study of Greater Toronto Area Energy Infrastructure System
Coupled infrastructure systems and complicated multihazards result in a high level of complexity and make it difficult to assess and improve the infrastructure system resilience. With a case study of the Greater Toronto Area energy system (including electric, gas, and oil transmission networks), an approach to analysis of multihazard resilience of an interdependent infrastructure system is presented in the article. Integrating network theory, spatial and numerical analysis methods, the new approach deals with the complicated multihazard relations and complex infrastructure interdependencies as spatiotemporal impacts on infrastructure systems in order to assess the dynamic system resilience. The results confirm that the effects of sequential hazards on resilience of infrastructure (network) are more complicated than the sum of single hazards. The resilience depends on the magnitude of the hazards, their spatiotemporal relationship and dynamic combined impacts, and infrastructure interdependencies. The article presents a comparison between physical and functional resilience of an electric transmission network, and finds functional resilience is always higher than physical resilience. The multiple hazards resilience evaluation approach is applicable to any type of infrastructure and hazard and it can contribute to the improvement of infrastructure planning, design, and maintenance decision making.

Semiautonomous Vehicle Risk Analysis: A Telematics‐Based Anomaly Detection Approach
The transition to semiautonomous driving is set to considerably reduce road accident rates as human error is progressively removed from the driving task. Concurrently, autonomous capabilities will transform the transportation risk landscape and significantly disrupt the insurance industry. Semiautonomous vehicle (SAV) risks will begin to alternate between human error and technological susceptibilities. The evolving risk landscape will force a departure from traditional risk assessment approaches that rely on historical data to quantify insurable risks. This article investigates the risk structure of SAVs and employs a telematics‐based anomaly detection model to assess split risk profiles. An unsupervised multivariate Gaussian (MVG) based anomaly detection method is used to identify abnormal driving patterns based on accelerometer and GPS sensors of manually driven vehicles. Parameters are inferred for vehicles equipped with semiautonomous capabilities and the resulting split risk profile is determined. The MVG approach allows for the quantification of vehicle risks by the relative frequency and severity of observed anomalies and a location‐based risk analysis is performed for a more comprehensive assessment. This approach contributes to the challenge of quantifying SAV risks and the methods employed here can be applied to evolving data sources pertinent to SAVs. Utilizing the vast amounts of sensor‐generated data will enable insurers to proactively reassess the collective performances of both the artificial driving agent and human driver.
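An unsupervised MVG anomaly detector of the general kind described can be sketched as follows; the telematics features, covariance values, and threshold here are illustrative assumptions, not the article's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "normal driving" telematics features, e.g., longitudinal and
# lateral acceleration in m/s^2. Values are invented for illustration.
normal_trips = rng.multivariate_normal(
    mean=[0.0, 0.0], cov=[[0.20, 0.05], [0.05, 0.10]], size=5000
)

# Fit the multivariate Gaussian to the normal data.
mu = normal_trips.mean(axis=0)
cov = np.cov(normal_trips, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of one observation from the fitted MVG."""
    d = np.asarray(x) - mu
    return float(d @ cov_inv @ d)

# Flagging a point when its Gaussian density falls below epsilon is
# equivalent to thresholding this distance; 13.8 is roughly the 0.999
# quantile of a chi-square distribution with 2 degrees of freedom.
THRESHOLD = 13.8

def is_anomaly(x):
    return mahalanobis_sq(x) > THRESHOLD

harsh_brake = [3.0, -1.5]     # abrupt maneuver, far outside the fitted cloud
gentle_cruise = [0.1, 0.0]
```

Anomaly counts and their severities (the distances themselves) can then be aggregated per vehicle and per location to form the relative risk profiles the article describes.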

A Three‐Part Bayesian Network for Modeling Dwelling Fires and Their Impact upon People and Property
In the United Kingdom, dwelling fires are responsible for the majority of all fire‐related fatalities. The development of these incidents involves the interaction of a multitude of variables that combine in many different ways. Consequently, assessment of dwelling fire risk can be complex, which often results in ambiguity during fire safety planning and decision making. In this article, a three‐part Bayesian network model is proposed to study dwelling fires from ignition through to extinguishment in order to improve confidence in dwelling fire safety assessment. The model incorporates both hard and soft data, delivering posterior probabilities for selected outcomes. Case studies demonstrate how the model functions and provide evidence of its use for planning and accident investigation.

Assessing Transboundary Wildfire Exposure in the Southwestern United States
We assessed transboundary wildfire exposure among federal, state, and private lands and 447 communities in the state of Arizona, southwestern United States. The study quantified the relative magnitude of transboundary (incoming, outgoing) versus nontransboundary (i.e., self‐burning) wildfire exposure based on land tenure or community of the simulated ignition and the resulting fire perimeter. We developed and described several new metrics to quantify and map transboundary exposure. We found that incoming transboundary fire accounted for 37% of the total area burned on large parcels of federal and state lands, whereas 63% of the area burned was burned by ignitions within the parcel. However, substantial parcel‐to‐parcel variation was observed for all land tenures for all metrics. We found that incoming transboundary fire accounted for 66% of the total area burned within communities versus 34% of the area burned by self‐burning ignitions. Of the total area burned within communities, private lands contributed the largest proportion (36.7%), followed by national forests (19.5%), and state lands (15.4%). On average, seven land tenures contributed wildfire to individual communities. Annual wildfire exposure to structures was highest for wildfires ignited on state and national forest land, followed by tribal, private, and BLM. We mapped community firesheds, that is, the area where ignitions can spawn fires that can burn into communities, and estimated that they covered 7.7 million ha, or 26% of the state of Arizona. Our methods address gaps in existing wildfire risk assessments, and their implementation can help reduce fragmentation in governance systems and inefficiencies in risk planning.

September 2018

Tiered Approach to Resilience Assessment
Regulatory agencies have long adopted a three‐tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies.

Exploring the Potential for Multivariate Fragility Representations to Alter Flood Risk Estimates
In flood risk analysis, limitations in the multivariate statistical models adopted to model the hydraulic load have restricted the probability of a defense suffering structural failure to be expressed conditionally on a single hydraulic loading variable. This is an issue at the coastal level where multiple loadings act on defenses with the exact combination of loadings dictating their failure probabilities. Recently, a methodology containing a multivariate statistical model with the flexibility to robustly capture the dependence structure between the individual loadings was used to derive extreme nearshore loading conditions. Its adoption will permit the incorporation of more precise representations of a structure's vulnerability in future analyses. In this article, a fragility representation of a shingle beach, where the failure probability is expressed over a three‐dimensional loading parameter space—water level, wave height, and period—is derived at two localities. Within the approach, a Gaussian copula is used to capture any dependencies between the simplified geometric parameters of a beach's shape. Beach profiles are simulated from the copula and the failure probability, given the hydraulic load, determined by the reformulated Bradbury barrier inertia parameter model. At one site, substantial differences in the annual failure probability distribution are observed between the new and existing approaches. At the other, the beach only becomes vulnerable after a significant reduction of the crest height with its mean annual failure probability close to that presently predicted. It is concluded that further application of multivariate approaches is likely to yield more effective flood risk management.
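The Gaussian copula step, obtaining dependent uniforms by pushing correlated normals through the standard normal CDF and then through each parameter's own marginal, can be sketched as follows; the correlation value and both marginals are illustrative assumptions, not the article's fitted beach‐geometry model:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical dependence between two simplified beach-geometry parameters
# (say, crest height and beach width); rho is an invented value.
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)

# Gaussian copula: map each correlated normal margin through the standard
# normal CDF, yielding dependent Uniform(0, 1) draws...
u = np.vectorize(std_normal_cdf)(z)

# ...then feed each uniform into its own marginal via the inverse CDF.
crest_height = 2.0 + 1.5 * u[:, 0]          # Uniform(2.0, 3.5) metres
beach_width = -20.0 * np.log1p(-u[:, 1])    # Exponential(mean 20) metres
```

Each simulated geometry pair would then be passed to the barrier inertia parameter model to evaluate the failure probability conditional on the hydraulic load, as in the article's workflow.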

August 2018

A Framework to Understand Extreme Space Weather Event Probability
An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well‐being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments.

Security Investment in Contagious Networks
System security is typically interdependent: security risks in one part affect other parts, and threats spread through vulnerable links in the network. Consequently, system risks can be mitigated through investments in the security of the interconnecting links. This article takes an innovative look at the problem of nodes investing in the security of their vulnerable links in a given contagious network, framed as a game‐theoretic model that can be applied to a variety of applications, including information systems. In the proposed game model, each node computes its corresponding risk based on the value of its assets, vulnerabilities, and threats to determine the optimum level of security investment in its external links, respecting its limited budget. Furthermore, direct and indirect nonlinear influences of a node's security investment on the risks of other nodes are considered. The existence and uniqueness of the Nash equilibrium in the proposed game are also proved. Further analysis of the model in a practical case revealed that, by taking advantage of the investment effects of other players, perfectly rational players (i.e., those who use the utility function of the proposed game model) make more cost‐effective decisions than selfish nonrational or semirational players.

DAMS: A Model to Assess Domino Effects by Using Agent‐Based Modeling and Simulation
Historical data analysis shows that escalation accidents, so‐called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent‐based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent‐based modeling technique explains the domino effects from a bottom‐up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher‐level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large‐scale complicated cases.

Resilience Analysis of a Remote Offshore Oil and Gas Facility for a Potential Hydrocarbon Release
Resilience is the capability of a system to adjust its functionality during a disturbance or perturbation. The present work attempts to quantify resilience as a function of reliability, vulnerability, and maintainability. The approach assesses proactive and reactive defense mechanisms along with operational factors to respond to unwanted disturbances and perturbation. This article employs a Bayesian network format to build a resilience model. The application of the model is tested on hydrocarbon‐release scenarios during an offloading operation in a remote and harsh environment. The model identifies requirements for robust recovery and adaptability during an unplanned scenario related to a hydrocarbon release. This study attempts to relate the resilience capacity of a system to the system's absorptive, adaptive, and restorative capacities. These factors influence predisaster and postdisaster strategies that can be mapped to enhance the resilience of the system.

A Comprehensive Risk Analysis of Transportation Networks Affected by Rainfall‐Induced Multihazards
Climate change and its projected natural hazards have an adverse impact on the functionality and operation of transportation infrastructure systems. This study presents a comprehensive framework to analyze the risk to transportation infrastructure networks that are affected by natural hazards. The proposed risk analysis method considers both the failure probability of infrastructure components and the expected infrastructure network efficiency and capacity loss due to component failure. This comprehensive approach facilitates the identification of high‐risk network links in terms of not only their susceptibility to natural hazards but also their overall impact on the network. The Chinese national rail system and its exposure to rainfall‐related multihazards are used as a case study. The importance of various links is comprehensively assessed from the perspectives of topological, efficiency, and capacity criticality. Risk maps of the national railway system are generated, which can guide decisive action regarding investments in preventative and adaptive measures to reduce risk.
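The core idea, ranking links by failure probability times the network‐efficiency loss their failure causes, can be sketched on a toy network; the topology and probabilities below are invented for illustration, and the study's actual efficiency and capacity metrics may differ:

```python
from collections import deque

# Toy rail network as an undirected edge list; names are illustrative.
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D"), ("D", "E")]
nodes = sorted({n for e in edges for n in e})

def shortest_paths(adj, src):
    """BFS hop distances from src; unreachable nodes are simply absent."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def efficiency(edge_list):
    """Network efficiency: mean of 1/d over ordered node pairs (0 if cut off)."""
    adj = {n: [] for n in nodes}
    for a, b in edge_list:
        adj[a].append(b)
        adj[b].append(a)
    total, pairs = 0.0, 0
    for s in nodes:
        dist = shortest_paths(adj, s)
        for t in nodes:
            if t != s:
                total += 1.0 / dist[t] if t in dist else 0.0
                pairs += 1
    return total / pairs

# Link criticality: relative efficiency loss when the link fails.
base = efficiency(edges)
criticality = {
    e: (base - efficiency([x for x in edges if x != e])) / base for e in edges
}

# Risk per link = hazard-driven failure probability x criticality.
failure_prob = {e: 0.05 for e in edges}   # hypothetical baseline exposure
failure_prob[("D", "E")] = 0.20           # say, a flood-prone segment
risk = {e: failure_prob[e] * criticality[e] for e in edges}
```

In this toy example the bridge link ("D", "E") dominates both criticality and risk, illustrating why the framework flags links whose loss isolates parts of the network even when their hazard exposure is only moderately elevated.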

Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems
This article presents the development of an application guide based on feedback and comments from various railway actors on their practices of safety integrity level (SIL) allocation to railway safety‐related functions. The initial generic methodology for SIL allocation has been updated so that it can be applied to railway rolling stock safety‐related functions and thereby resolve the issues raised by applying the SIL concept. The methodology is intended for the various actors dealing with railway SIL allocation problems; its principles are summarized in this article, with a focus on the modifications and clarifications made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in the CSM (common safety method) European regulation. It starts from quantitative safety requirements, particularly tolerable hazard rates (THR), to which THR apportioning rules are applied. On the one hand, the rules reflect classical logical combinations of safety‐related functions preventing hazard occurrence. On the other hand, to take into account technical conditions (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. The SIL allocation process, based on the apportioned and validated THR values, is finally illustrated through the example of “emergency brake” subsystems. Some specific SIL allocation rules are also defined and illustrated.

July 2018

Which Fire to Extinguish First? A Risk‐Informed Approach to Emergency Response in Oil Terminals
The performance of fire protection measures plays a key role in the prevention and mitigation of fire escalation (fire domino effect) in process plants. In addition to passive and active safety measures, the intervention of firefighting teams can have a great impact on fire propagation. In the present study, we have demonstrated an application of dynamic Bayesian network to modeling and safety assessment of fire domino effect in oil terminals while considering the effect of safety measures in place. The results of the developed dynamic Bayesian network—prior and posterior probabilities—have been combined with information theory, in the form of mutual information, to identify optimal firefighting strategies, especially when the number of fire trucks is not sufficient to handle all the vessels in danger.

Induced Earthquakes from Long‐Term Gas Extraction in Groningen, the Netherlands: Statistical Analysis and Prognosis for Acceptable‐Risk Regulation
Recently, growing earthquake activity in the northeastern Netherlands has aroused considerable concern among the 600,000 provincial inhabitants. There, at 3 km deep, the rich Groningen gas field extends over 900 km2 and still contains about 600 of the original 2,800 billion cubic meters (bcm). Particularly after 2001, earthquakes have increased in number, magnitude (M, on the logarithmic Richter scale), and damage to numerous buildings. The man‐made nature of extraction‐induced earthquakes challenges static notions of risk, complicates formal risk assessment, and questions familiar conceptions of acceptable risk. Here, a 26‐year set of 294 earthquakes with M ≥ 1.5 is statistically analyzed in relation to increasing cumulative gas extraction since 1963. Extrapolations from a fast‐rising trend over 2001–2013 indicate that—under “business as usual”—around 2021 some 35 earthquakes with M ≥ 1.5 might occur annually, including four with M ≥ 2.5 (ten‐fold stronger), and one with M ≥ 3.5 every 2.5 years. Given this uneasy prospect, annual gas extraction has been reduced from 54 bcm in 2013 to 24 bcm in 2017. This has significantly reduced earthquake activity, so far. However, when extraction is stabilized at 24 bcm per year for 2017–2021 (or 21.6 bcm, as judicially established in Nov. 2017), the annual number of earthquakes would gradually increase again, with an expected all‐time maximum M ≈ 4.5. Further safety management may best follow distinct stages of seismic risk generation, with moderation of gas extraction and massive (but late and slow) building reinforcement as outstanding strategies. Officially, “acceptable risk” is mainly approached by quantification of risk (e.g., of fatal building collapse) for testing against national safety standards, but actual (local) risk estimation remains problematic. Additionally important are societal cost–benefit analysis, equity considerations, and precautionary restraint. Socially and psychologically, deliberate attempts are made to improve risk communication, reduce public anxiety, and restore people's confidence in responsible experts and policymakers.
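The quoted projections are internally consistent with simple Gutenberg–Richter magnitude–frequency scaling, which makes for a useful back‐of‐envelope check; this is only an illustrative consistency check, not the authors' statistical model:

```python
import math

# Projected "business as usual" annual rates quoted in the abstract (~2021).
n_m15 = 35.0   # earthquakes per year with M >= 1.5
n_m25 = 4.0    # earthquakes per year with M >= 2.5

# Gutenberg-Richter scaling, log10 N(>=M) = a - b*M, implies the rate falls
# by a factor of 10**(-b) per unit of magnitude.
b = math.log10(n_m15 / n_m25)          # ~0.94 per magnitude unit
n_m35 = n_m15 * 10.0 ** (-2.0 * b)     # extrapolated annual rate for M >= 3.5
return_period_m35 = 1.0 / n_m35        # years between M >= 3.5 events
```

The implied return period works out to roughly 2.2 years, the same order as the abstract's "one with M ≥ 3.5 every 2.5 years", so the quoted figures hang together under a standard magnitude–frequency law.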

June 2018

A Probabilistic Analysis of Surface Water Flood Risk in London
Flooding in urban areas during heavy rainfall, often characterized by short duration and high‐intensity events, is known as “surface water flooding.” Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively.

Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent‐based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near‐miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high‐risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in‐depth behavioral and decision rules at the individual and community level.

A Systems‐Based Risk Assessment Framework for Intentional Electromagnetic Interference (IEMI) on Critical Infrastructures: Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man‐made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems‐based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is “replaced” by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst‐case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network.
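The quadruplet that replaces the classical risk triplet can be represented as a small data structure. The field names and the plausibility × consequence ranking rule below are assumptions for illustration, not the article's formalism:

```python
from dataclasses import dataclass

# Sketch of the article's risk quadruplet: probability is replaced by
# resource requirements and plausibility. Field names are assumptions.
@dataclass
class IEMIScenario:
    scenario: str        # attack scenario description
    resources: str       # minimum equipment needed to carry it out
    plausibility: float  # subjective rating in [0, 1]
    consequence: float   # e.g., expected hours of service disruption

def rank(scenarios):
    """Order scenarios by plausibility x consequence (an assumed scoring rule)."""
    return sorted(scenarios, key=lambda s: s.plausibility * s.consequence,
                  reverse=True)

worst = rank([
    IEMIScenario("parked-van HPM source at substation fence",
                 "van-mounted HPM generator", 0.2, 48.0),
    IEMIScenario("briefcase jammer inside control room",
                 "portable jammer", 0.6, 6.0),
])[0]
print(worst.scenario)
```

The most plausible and consequential scenarios surfaced by such a ranking are the ones the article proposes as decision support for countermeasure efforts.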

Dynamic Economic Resilience and Economic Recovery from Disasters: A Quantitative Assessment: This article analyzes the role of dynamic economic resilience in relation to recovery from disasters in general and illustrates its potential to reduce disaster losses in a case study of the Wenchuan earthquake of 2008. We first offer operational definitions of the concept, linking it to policies that promote increased levels and speed of investment in repair and reconstruction. We then develop a dynamic computable general equilibrium (CGE) model that incorporates major features of investment and traces the time‐path of the economy as it recovers with and without dynamic economic resilience. The results indicate that resilience strategies could have significantly reduced GDP losses from the Wenchuan earthquake by 47.4% during 2008–2011 by accelerating the pace of recovery and could have further reduced losses slightly by shortening the recovery by one year. The results can be generalized to conclude that shortening the recovery period is not nearly as effective as increasing reconstruction investment levels and steepening the time‐path of recovery. This is an important distinction that should be made in the typically vague and singular reference to increasing the speed of recovery in many definitions of dynamic resilience.

May 2018

An Emerging New Risk Analysis Science: Foundations and Implications
To solve real‐life problems—such as those related to technology, health, security, or climate change—and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular.

People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning
Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set of tsunami evacuation behavior to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition, combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model, together with a risk recognition class, can quantitatively evaluate the influence of disaster mitigation measures, risk education, and risk information. The results of the risk recognition model show that risk information has the greatest impact on whether people recognize that they are at high risk. The results of the evacuation choice model show that people who are unaware of their risk take longer to evacuate.
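An ordered logit of the kind used for risk recognition maps a latent score (a linear combination of factors such as risk education and information received) to probabilities over ordered recognition levels. The sketch below is the textbook formulation with invented coefficients, not the article's fitted model:

```python
import math

def ordered_logit_probs(x_beta, cutpoints):
    """P(level = k) for an ordered logit with latent score x_beta and
    strictly increasing cutpoints. Textbook form, illustrative only."""
    cdf = lambda z: 1.0 / (1.0 + math.exp(-z))
    # Cumulative probabilities P(level <= k), closed off at 1.0.
    cum = [cdf(c - x_beta) for c in cutpoints] + [1.0]
    # Category probabilities are successive differences of the cumulatives.
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Hypothetical latent score: e.g., effects of risk education + information.
probs = ordered_logit_probs(x_beta=0.8, cutpoints=[-1.0, 0.5, 2.0])
print([round(p, 3) for p in probs])  # P(lowest) ... P(highest recognition)
```

In the article's two‐stage setup, probabilities like these would then feed a latent class model of the evacuation choice itself.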

Public Response to a Near‐Miss Nuclear Accident Scenario Varying in Causal Attributions and Outcome Uncertainty
Many studies have investigated public reactions to nuclear accidents. However, few studies have focused on the more common events in which a serious accident could have happened but did not. This study evaluated public response (emotional, cognitive, and behavioral) over three phases of a near‐miss nuclear accident. Simulating a loss‐of‐coolant accident (LOCA) scenario, we manipulated (1) attribution for the initial cause of the incident (software failure vs. cyber terrorist attack vs. earthquake), (2) attribution for halting the incident (fail‐safe system design vs. an intervention by an individual expert vs. a chance coincidence), and (3) level of uncertainty (certain vs. uncertain) about risk of a future radiation leak after the LOCA is halted. A total of 773 respondents were sampled using a 3 × 3 × 2 between‐subjects design. Results from both MANCOVA and structural equation modeling (SEM) indicate that respondents experienced more negative affect, perceived more risk, and expressed more avoidance behavioral intention when the near‐miss event was initiated by an externally attributed source (e.g., earthquake) compared to an internally attributed source (e.g., software failure). Similarly, respondents also indicated greater negative affect, perceived risk, and avoidance behavioral intentions when the future impact of the near‐miss incident on people and the environment remained uncertain. Results from SEM analyses also suggested that negative affect predicted risk perception, and both predicted avoidance behavior. Affect, risk perception, and avoidance behavior demonstrated high stability (i.e., reliability) from one phase to the next.

Review of Regulatory Emphasis on Transportation Safety in the United States, 2002–2009: Public versus Private Modes
The U.S. Department of Transportation is responsible for implementing new safety improvements and regulations with the goal of ensuring limited funds are distributed to where they can have the greatest impact on safety. In this work, we conduct a study of new regulations and other reactions (such as recalls) to fatal accidents in several different modes of transportation implemented from 2002 to 2009. We find that in the safest modes of commercial aviation and bus transport, the amount of spending on new regulations is high in relation to the number of fatalities compared to the regulatory attention received by less safe modes of general aviation and private automobiles. Additionally, we study two major fatal accident investigations from commercial aviation and two major automotive recalls associated with fatal accidents. We find differences in the cost per expected fatality prevented for these reactions, with the airline accident investigations being more cost effective. Overall, we observe trends in both the automotive and aviation sectors that suggest that public transportation receives more regulatory attention than private transport. We also observe that the types of safety remedies utilized, regulation versus investigation, have varying levels of effectiveness in different transport modes. We suggest that these differences are indicative of increased public demand for safety in modes where a third party may be held responsible, even for those not participating in the transportation. These findings have important implications for the transportation industry, policymakers, and for estimating the public demand for safety in new transport modes.

April 2018

Insurance, Public Assistance, and Household Flood Risk Reduction: A Comparative Study of Austria, England, and Romania
In light of increasing losses from floods, many researchers and policymakers are looking for ways to encourage flood risk reduction among communities, businesses, and households. In this study, we investigate risk‐reduction behavior at the household level in three European Union Member States with fundamentally different insurance and compensation schemes. We try to understand whether and how insurance and public assistance influence private risk‐reduction behavior. Data were collected using a telephone survey (n = 1,849) of household decisionmakers in flood‐prone areas. We show that insurance overall is positively associated with private risk‐reduction behavior. Warranties, premium discounts, and information provision with respect to risk reduction may be an explanation for this positive relationship in the case of structural measures. Public incentives for risk‐reduction measures by means of financial and in‐kind support, and particularly through the provision of information, are also associated with enhanced risk reduction. In this study, public compensation is not negatively associated with private risk‐reduction behavior. This does not disprove such a relationship, but the negative effect may be mitigated by factors related to respondents' capacity to implement measures or social norms that were not included in the analysis. The data suggest that large‐scale flood protection infrastructure creates a sense of security that is associated with a lower level of preparedness. Across the board there is ample room to improve both public and private policies to provide effective incentives for household‐level risk reduction.

Scenario Analysis for the Safety Assessment of Nuclear Waste Repositories: A Critical Review
A major challenge in scenario analysis for the safety assessment of nuclear waste repositories pertains to the comprehensiveness of the set of scenarios selected for assessing the safety of the repository. Motivated by this challenge, we discuss the aspects of scenario analysis relevant to comprehensiveness. Specifically, we note that (1) it is necessary to make it clear why scenarios usually focus on a restricted set of features, events, and processes; (2) there is not yet consensus on the interpretation of comprehensiveness for guiding the generation of scenarios; and (3) there is a need for sound approaches to the treatment of epistemic uncertainties.

Optimal Mission Abort Policy for Systems Operating in a Random Environment
Many real‐world critical systems, e.g., aircraft, manned space flight systems, and submarines, utilize mission aborts to enhance their survivability. Specifically, a mission can be aborted when a certain malfunction condition is met, and a rescue or recovery procedure is then initiated. For systems exposed to external impacts, the malfunctions are often caused by the consequences of these impacts. Traditional system reliability models typically cannot address the possibility of mission aborts. Therefore, in this article, we first develop the corresponding methodology for modeling and evaluation of the mission success probability and survivability of systems experiencing both internal failures and external shocks. We consider a policy in which a mission is aborted and a rescue procedure is activated upon occurrence of the mth shock. We demonstrate the tradeoff between the system survivability and the mission success probability that should be balanced by the proper choice of the decision variable m. A detailed illustrative example of a mission performed by an unmanned aerial vehicle is presented.
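The tradeoff in m can be seen in a toy Monte Carlo simulation: aborting earlier (small m) shortens exposure and raises survivability, while aborting later (large m) gives the mission more chances to finish. All parameters below are invented for illustration; the article's analytical model is not reproduced here:

```python
import random

def simulate(m, lam=0.3, q=0.15, T=10.0, R=2.0, n=20000, seed=1):
    """Toy abort-policy Monte Carlo: mission of length T, shocks arriving as a
    Poisson process with rate lam, each shock fatal with probability q. On the
    m-th survived shock the mission aborts into a rescue phase of length R."""
    rng = random.Random(seed)
    success = survive = 0
    for _ in range(n):
        t, shocks, dead, aborted = 0.0, 0, False, False
        while t < T and not dead and not aborted:
            t += rng.expovariate(lam)        # time of the next shock
            if t >= T:
                break                        # mission completed first
            if rng.random() < q:
                dead = True                  # shock destroys the system
            else:
                shocks += 1
                if shocks >= m:
                    aborted = True           # abort policy triggers
        if aborted and not dead:             # rescue phase: shorter exposure
            tt = 0.0
            while tt < R and not dead:
                tt += rng.expovariate(lam)
                if tt < R and rng.random() < q:
                    dead = True
        if not dead:
            survive += 1
            if not aborted:
                success += 1                 # mission completed without abort
    return success / n, survive / n

for m in (1, 2, 3):
    s, v = simulate(m)
    print(m, round(s, 3), round(v, 3))
```

Sweeping m and comparing the two curves is exactly the kind of balancing exercise the abstract describes.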

March 2018

Flood Risk Management: Exploring the Impacts of the Community Rating System Program on Poverty and Income Inequality
Flooding remains a major problem for the United States, causing numerous deaths and damaging countless properties. To reduce the impact of flooding on communities, the U.S. government established the Community Rating System (CRS) in 1990 to reduce flood damages by incentivizing communities to engage in flood risk management initiatives that surpass those required by the National Flood Insurance Program. In return, communities enjoy discounted flood insurance premiums. Although the CRS raises concerns about the potential for unevenly distributed impacts across different income groups, no study has examined its equity implications. This study therefore investigates the possibility of unintended consequences of the CRS by asking: What is the effect of the CRS on poverty and income inequality? The study estimates four fixed‐effects regression models using a panel data set of neighborhood‐level observations from 1970 to 2010. The results indicate that median incomes are lower in CRS communities, but rise in floodplains. Also, the CRS attracts poor residents, but relocates them away from floodplains. Additionally, the CRS attracts top earners, including in floodplains. Finally, the CRS encourages income inequality overall, but discourages income inequality in floodplains. A better understanding of these unintended consequences can help to improve the design and performance of the CRS and, ultimately, increase community resilience to flood disasters.

Hazard Analysis and Safety Requirements for Small Drone Operations: To What Extent Do Popular Drones Embed Safety?
Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights.

Community‐Driven Hypothesis Testing: A Solution for the Tragedy of the Anticommons
Shared ownership of property and resources is a longstanding challenge throughout history that has been amplified by the increasing development of industrial and postindustrial societies. Where governments, project planners, and commercial developers seek to develop new infrastructure, industrial projects, and various other land‐ and resource‐intensive tasks, veto power shared by various local stakeholders can complicate or halt progress. Risk communication has been used as an attempt to address stakeholder concerns in these contexts, but has demonstrated shortcomings. These coordination failures between project planners and stakeholders can be described as a specific kind of social dilemma that we describe as the “tragedy of the anticommons.” To overcome such dilemmas, we demonstrate how a two‐step process can directly address public mistrust of project planners and public perceptions of limited decision‐making authority. This approach is examined via two separate empirical field experiments in Portugal and Tunisia, where public resistance and anticommons problems threatened to derail emerging industrial projects. In both applications, an intervention was undertaken to address initial public resistance to such projects, in which specific public stakeholders and project sponsors collectively engaged in a hypothesis‐testing process to identify and assess human and environmental health risks associated with proposed industrial facilities. These field experiments indicate that a rigorous attempt to address public mistrust and perceptions of power imbalances and change the pay‐off structure of the given dilemma may help overcome such anticommons problems in specific cases, and may potentially generate enthusiasm and support for such projects by local publics moving forward.

February 2018

Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system‐based for high‐consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward‐looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high‐consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures, in support of risk management decisions based on both past data and anticipated incidents.

Dynamic Blowout Risk Analysis Using Loss Functions
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real‐time risk analysis. The real‐time evolving situation is considered dependent on the changing bottom‐hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout.
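The loss‐function idea can be sketched as a consequence that is zero inside a safe operating window of bottom‐hole pressure and grows with the deviation outside it. The window, scaling, and quadratic form below are assumptions for illustration, not the article's fitted models:

```python
# Sketch of a consequence model expressed as a loss function of bottom-hole
# pressure (bhp): zero inside the safe window, quadratic growth outside.
# Target, tolerance, and scale are invented illustrative values.

def pressure_loss(bhp, target=5000.0, tolerance=250.0, scale=1e-4):
    """Loss (arbitrary monetary units) for bottom-hole pressure in psi."""
    deviation = abs(bhp - target) - tolerance
    return scale * deviation ** 2 if deviation > 0 else 0.0

def operational_risk(p_kick, bhp):
    """Risk = probability of the evolving event x consequence at current bhp."""
    return p_kick * pressure_loss(bhp)

print(pressure_loss(5100.0), round(pressure_loss(5600.0), 2))
```

Updating `p_kick` and `bhp` as drilling progresses is what turns this static product into the real‐time risk estimate the abstract describes.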

Providing Limited Local Electric Service During a Major Grid Outage: A First Assessment Based on Customer Willingness to Pay
While they are rare, widespread blackouts of the bulk power system can result in large costs to individuals and society. If local distribution circuits remain intact, it is possible to use new technologies including smart meters, intelligent switches that can change the topology of distribution circuits, and distributed generation owned by customers and the power company, to provide limited local electric power service. Many utilities are already making investments that would make this possible. We use customers' measured willingness to pay to explore when the incremental investments needed to implement these capabilities would be justified. Under many circumstances, upgrades in advanced distribution systems could be justified for a customer charge of less than a dollar a month (plus the cost of electricity used during outages), and would be less expensive and safer than the proliferation of small portable backup generators. We also discuss issues of social equity, extreme events, and various sources of underlying uncertainty.

Assessing the Cost of Large‐Scale Power Outages to Residential Customers
Residents in developed economies depend heavily on electric services. While distributed resources and a variety of new smart technologies can increase the reliability of that service, adopting them involves costs, necessitating tradeoffs between cost and reliability. An important input to making such tradeoffs is an estimate of the value customers place on reliable electric services. We develop an elicitation framework that helps individuals think systematically about the value they attach to reliable electric service. Our approach employs a detailed and realistic blackout scenario, full or partial (20 A) backup service, questions about willingness to pay (WTP) using a multiple bounded discrete choice method, information regarding inconveniences and economic losses, and checks for bias and consistency. We applied this method to a convenience sample of residents in Allegheny County, Pennsylvania, finding that respondents valued a kWh for backup services they assessed to be high priority more than services that were seen as low priority ($0.75/kWh vs. $0.51/kWh). As more information about the consequences of a blackout was provided, this difference increased ($1.2/kWh vs. $0.35/kWh), and respondents' uncertainty about the backup services decreased (Full: $11 to $9.0, Partial: $13 to $11). There was no evidence that the respondents were anchored by their previous WTP statements, but they demonstrated only weak scope sensitivity. In sum, the consumer surplus associated with providing a partial electric backup service during a blackout may justify the costs of such service, but measurement of that surplus depends on the public having accurate information about blackouts and their consequences.

A Reliability‐Based Capability Approach
This article proposes a rigorous mathematical approach, named a reliability‐based capability approach (RCA), to quantify the societal impact of a hazard. The starting point of the RCA is a capability approach in which capabilities refer to the genuine opportunities open to individuals to achieve valuable doings and beings (such as being mobile and being sheltered) called functionings. Capabilities depend on what individuals have and what they can do with what they have. The article develops probabilistic predictive models that relate the value of each functioning to a set of easily predictable or measurable quantities (regressors) in the aftermath of a hazard. The predicted values of selected functionings for an individual collectively determine the impact of a hazard on his/her state of well‐being. The proposed RCA integrates the predictive models of functionings into a system reliability problem to determine the probability that the state of well‐being is acceptable, tolerable, or intolerable. Importance measures are defined to quantify the contribution of each functioning to the state of well‐being. The information from the importance measures can inform decisions on optimal allocation of limited resources for risk mitigation and management.

January 2018

Sociotechnical Resilience: A Preliminary Concept
This article presents the concept of sociotechnical resilience by employing an interdisciplinary perspective derived from the fields of science and technology studies, human factors, safety science, organizational studies, and systems engineering. Highlighting the hybrid nature of sociotechnical systems, we identify three main constituents that characterize sociotechnical resilience: informational relations, sociomaterial structures, and anticipatory practices. Further, we frame sociotechnical resilience as undergirded by the notion of transformability with an emphasis on intentional activities, focusing on the ability of sociotechnical systems to shift from one form to another in the aftermath of shock and disturbance. We propose that the triad of relations, structures, and practices constitutes the fundamental aspects required to comprehend the resilience of sociotechnical systems during times of crisis.

Development of an Asset Value Map for Disaster Risk Assessment in China by Spatial Disaggregation Using Ancillary Remote Sensing Data
The extent of economic losses due to a natural hazard and disaster depends largely on the spatial distribution of asset values in relation to the hazard intensity distribution within the affected area. Given that statistical data on asset value are collected by administrative units in China, generating spatially explicit asset exposure maps remains a key challenge for rapid postdisaster economic loss assessment. The goal of this study is to introduce a top-down (or downscaling) approach to disaggregate administrative-unit level asset value to grid-cell level. To do so, finding the highly correlated “surrogate” indicators is the key. A combination of three data sets—nighttime light grid, LandScan population grid, and road density grid, is used as ancillary asset density distribution information for spatializing the asset value. As a result, a high spatial resolution asset value map of China for 2015 is generated. The spatial data set contains aggregated economic value at risk at 30 arc-second spatial resolution. Accuracy of the spatial disaggregation reflects redistribution errors introduced by the disaggregation process as well as errors from the original ancillary data sets. The overall accuracy of the results proves to be promising. The example of using the developed disaggregated asset value map in exposure assessment of watersheds demonstrates that the data set offers immense analytical flexibility for overlay analysis according to the hazard extent. This product will help current efforts to analyze spatial characteristics of exposure and to uncover the contributions of both physical and social drivers of natural hazard and disaster across space and time.
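The top‐down disaggregation described above amounts to distributing an administrative unit's total asset value over its grid cells in proportion to a weighted combination of the ancillary layers. The layer values and weights below are invented; the study's actual weighting scheme is not reproduced here:

```python
import numpy as np

# Hypothetical per-grid-cell ancillary layers for one administrative unit.
nightlight = np.array([10.0, 40.0, 0.0, 5.0])
population = np.array([200.0, 800.0, 10.0, 90.0])
road_density = np.array([1.0, 3.0, 0.0, 0.5])

def disaggregate(total_value, layers, weights):
    """Allocate an admin-level total to cells in proportion to a weighted
    sum of ancillary layers (each normalized so weights are comparable)."""
    score = sum(w * (x / x.max()) for w, x in zip(weights, layers))
    return total_value * score / score.sum()

cell_values = disaggregate(1000.0, [nightlight, population, road_density],
                           weights=[0.4, 0.4, 0.2])
print(cell_values.round(1))
```

By construction the cell values sum back to the administrative total, which is the consistency property a disaggregated asset map must preserve; redistribution error then comes from how well the surrogate layers track the true asset distribution.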

Resilience Analysis of Countries under Disasters Based on Multisource Data
Disasters occur almost daily in the world. Because emergencies frequently have no precedent, are highly uncertain, and can be very destructive, improving a country's resilience is an efficient way to reduce risk. In this article, we collected more than 20,000 historical data points from disasters in 207 countries to enable us to calculate the severity of disasters and the danger they pose to countries. In addition, six primary indices (disaster, personal attribute, infrastructure, economics, education, and occupation) including 38 secondary influencing factors are considered in analyzing the resilience of countries. Using these data, we obtained the danger, expected number of deaths, and resilience of all 207 countries. We found that a country covering a large area is more likely to have a low resilience score. Through sensitivity analysis of all secondary indices, we found that population density, frequency of disasters, and GDP are the three most critical factors affecting resilience. Based on broad-spectrum resilience analysis of the different continents, Oceania and South America have the highest resilience, while Asia has the lowest. Over the past 50 years, the resilience of many countries has improved sharply, especially in developing countries. Based on our results, we analyze the comprehensive resilience and provide some optimal suggestions to efficiently improve resilience.

Industrial Safety and Utopia: Insights from the Fukushima Daiichi Accident
Feedback from industrial accidents is provided by various state, or even international, institutions, and lessons learned can be controversial. However, there has been little research into organizational learning at the international level. This article helps to fill the gap through an in-depth review of official reports of the Fukushima Daiichi accident published shortly after the event. We present a new method to analyze the arguments contained in these voluminous documents. Taking an intertextual perspective, the method focuses on the accident narratives, their rationale, and links between “facts,” “causes,” and “recommendations.” The aim is to evaluate how the findings of the various reports are consistent with (or contradict) “institutionalized knowledge,” and identify the social representations that underpin them. We find that although the scientific controversy surrounding the results of the various inquiries reflects different ethical perspectives, they are integrated into the same utopian ideal. The involvement of multiple actors in this controversy raises questions about the public construction of epistemic authority, and we highlight the special status given to the International Atomic Energy Agency in this regard.

Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I–I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I–I characterizing systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term “essential entities” includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS, with two case studies performed with the MITRE Corporation, and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS.

Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks
Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the “do nothing” case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost-beneficial.
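The expected-annual-damage and discounted-benefit calculations this abstract refers to are standard in flood risk appraisal, and can be sketched generically. The following is a minimal illustration, not the article's model: the return periods, damage values, discount rate, and horizon are all hypothetical placeholders.

```python
# Expected annual damage (EAD) as the area under the damage-exceedance curve,
# approximated with the trapezoidal rule over a few discrete return periods.

def expected_annual_damage(return_periods, damages):
    """return_periods in years (e.g. 10, 50, 100, 500); damages: loss per event."""
    # Convert return periods to annual exceedance probabilities.
    probs = [1.0 / t for t in return_periods]
    # Order events from most frequent (highest probability) to rarest.
    pairs = sorted(zip(probs, damages), reverse=True)
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d1 + d2) * (p1 - p2)   # trapezoid between adjacent events
    return ead

def discounted_benefit(annual_risk_reduction, rate, years):
    """Present value of a constant annual risk reduction over a fixed horizon."""
    return sum(annual_risk_reduction / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical "do nothing" vs. protected damage estimates for one asset.
baseline = expected_annual_damage([10, 50, 100, 500], [1e6, 5e6, 8e6, 2e7])
protected = expected_annual_damage([10, 50, 100, 500], [0.0, 1e6, 3e6, 1.2e7])
benefit = discounted_benefit(baseline - protected, rate=0.035, years=50)
```

The business case then compares `benefit` against the up-front and maintenance costs of the protection measure; in practice the hazard curve is far denser than four points and damages carry wide uncertainty bands.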

How to Design Rating Schemes of Risk Matrices: A Sequential Updating Approach
Risk matrices have been widely used as a risk evaluation tool in many fields due to their simplicity and intuitive nature. Designing a rating scheme, i.e., determining the number of ratings used in a risk matrix and assigning different ratings to different cells, is an essential part of risk matrix construction. However, most of the related literature has focused on applying a risk matrix to various fields, instead of researching how to design risk matrices. Based on the analysis of several current rules, we propose a new approach, namely, the sequential updating approach (SUA), to design the rating scheme of a risk matrix in a reliable way. In this article, we propose three principles and a rating algorithm based on these principles. The three principles, namely, adjusted weak consistency, consistent internality, and continuous screening, characterize a good rating scheme. The resulting rating scheme has been proven to be unique. A global rating algorithm is then proposed to create the design that satisfies the three principles. We then explore the performance of the SUA. An illustrative application is first given to explain the feasibility of our approach. The sensitivity analysis shows that our method captures a resolution-reliability tradeoff for decisionmakers in choosing an appropriate rating scheme for a risk matrix. Finally, we compare the designs based on the SUA and Cox's axioms, highlighting the advantages of the SUA.
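The sequential updating approach itself is the article's contribution, but the underlying design problem can be sketched generically: assign ratings to cells of a probability-consequence grid so that ratings never decrease as either dimension increases. The sketch below uses a hypothetical 4×4 matrix with midpoint risk scores and arbitrary score thresholds; it illustrates the monotonicity requirement, not the SUA or its three principles.

```python
# Illustrative rating scheme for an n x n risk matrix: each cell gets a rating
# from score bands over its midpoint probability-times-consequence score.

def build_matrix(n, thresholds):
    """Rate each cell (i, j) by its normalized midpoint score; `thresholds`
    (sorted, in (0, 1)) split the score range into len(thresholds)+1 ratings."""
    matrix = {}
    for i in range(1, n + 1):          # probability category (1 = lowest)
        for j in range(1, n + 1):      # consequence category (1 = lowest)
            score = (i - 0.5) * (j - 0.5) / (n * n)      # midpoint score in (0, 1)
            matrix[(i, j)] = sum(score >= t for t in thresholds) + 1
    return matrix

def monotone(matrix, n):
    """Ratings must not decrease as probability or consequence increases --
    a basic requirement any sensible rating scheme should satisfy."""
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if i < n and matrix[(i, j)] > matrix[(i + 1, j)]:
                return False
            if j < n and matrix[(i, j)] > matrix[(i, j + 1)]:
                return False
    return True

ratings = build_matrix(4, [0.1, 0.3, 0.6])   # hypothetical thresholds
```

Choosing the number of ratings and the band boundaries is exactly where the resolution-reliability tradeoff discussed in the article arises: finer bands discriminate more risks but are harder to assign consistently.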