EISG in Risk Analysis (2017)

Risk Analysis is the official journal of the Society for Risk Analysis and publishes peer-reviewed, original research on both the theory and practice of risk. The application areas are vast. Below are articles with particular relevance to the Engineering and Infrastructure Specialty Group.

December 2017

Geographic Hotspots of Critical National Infrastructure
Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location.
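The hotspot step described above can be sketched in miniature. The fragment below is an illustrative toy, not the study's implementation: the Gaussian kernel, the bandwidth, and the quantile used as a hotspot cutoff are all assumptions made here for the example.

```python
import math

def kde_surface(assets, grid, bandwidth=2.0):
    """Gaussian kernel density estimate of criticality over a 2-D grid.

    assets: list of (x, y, criticality) tuples for individual assets.
    grid:   list of (x, y) evaluation points.
    Returns a criticality-weighted density value for each grid point.
    """
    surface = []
    for gx, gy in grid:
        total = 0.0
        for x, y, crit in assets:
            d2 = (gx - x) ** 2 + (gy - y) ** 2
            total += crit * math.exp(-d2 / (2 * bandwidth ** 2))
        surface.append(total / (2 * math.pi * bandwidth ** 2 * len(assets)))
    return surface

def hotspots(grid, surface, quantile=0.9):
    """Flag grid cells whose density exceeds the given quantile as hotspots."""
    cutoff = sorted(surface)[int(quantile * (len(surface) - 1))]
    return [pt for pt, v in zip(grid, surface) if v >= cutoff]
```

Spatially discrete criticality values are thereby smoothed into a continuous surface from which high-density cells can be flagged.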

Evaluation of the Consequences of a Cistern Truck Accident While Transporting Dangerous Substances through a Tunnel
The transportation of dangerous substances by truck carriers harbors important safety issues in both road and mine tunnels. Even though traffic conditions in road and mine tunnels are different, the potential geometric and hydrodynamic similarities can lead to similar effects from the uncontrolled leakage of the dangerous material. This work was motivated by the design study of the LAGUNA-LBNO (Large Apparatus studying Grand Unification and Neutrino Astrophysics and Long Baseline Neutrino Oscillations) project. The considered neutrino detector requires a huge amount of liquid argon, which must be transported down the tunnel. The present work focuses on the estimation of the most credible incident and the resulting consequences in the case of a truck accident in the tunnel. The approach and tools used in the present work are generic and can be adapted to other similar situations.

Three-Stage Decision-Making Model under Restricted Conditions for Emergency Response to Ships Not under Control
A ship that is not under control (NUC) poses a serious hazard, particularly in confined waters close to shore. The emergency response to NUC ships requires selecting the best risk control options, which is a challenge under restricted conditions (e.g., time limitation, resource constraints, and information asymmetry), particularly in inland waterway transportation. To enable a quick and effective response, this article develops a three-stage decision-making framework for NUC ship handling. The core of this method is (1) to propose feasible options for each involved entity (e.g., maritime safety administration, NUC ship, and ships passing by) under resource constraints in the first stage, (2) to select the most feasible options by comparing the similarity of the new case and existing cases in the second stage, and (3) to make decisions considering the cooperation between the involved organizations by using a developed Bayesian network in the third stage. Consequently, this work provides a useful tool to achieve well-organized management of NUC ships.

November 2017

Spatial Optimization of Future Urban Development with Regards to Climate Risk and Sustainability Objectives
Authors: Daniel Caparros-Midwood, Stuart Barr and Richard Dawson
Abstract: Future development in cities needs to manage increasing populations, climate-related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto-optimal results that provide planners with the best set of trade-off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development.
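The core of any Pareto-optimal search like the one above is the non-dominated sort: a spatial plan survives only if no other plan is at least as good on every objective and strictly better on one. A minimal sketch of that test (for minimization objectives; the genetic-algorithm machinery around it is omitted):

```python
def pareto_front(solutions):
    """Return the non-dominated set for a minimization problem.

    solutions: list of objective tuples; solution b dominates a if b is
    no worse in every objective and strictly better in at least one.
    """
    front = []
    for a in solutions:
        dominated = any(
            all(bi <= ai for ai, bi in zip(a, b))
            and any(bi < ai for ai, bi in zip(a, b))
            for b in solutions
        )
        if not dominated:
            front.append(a)
    return front
```

Each member of the returned front represents a different trade-off between objectives, which is exactly why the framework hands planners a set of plans rather than a single optimum.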

A General Framework for the Assessment of Power System Vulnerability to Malicious Attacks
Authors: R. Piccinelli, G. Sansavini, R. Lucchetti and E. Zio
Abstract: The protection and safe operations of power systems heavily rely on the identification of the causes of damage and service disruption. This article presents a general framework for the assessment of power system vulnerability to malicious attacks. The concept of susceptibility to an attack is employed to quantitatively evaluate the degree of exposure of the system and its components to intentional offensive actions. A scenario with two agents having opposing objectives is proposed, i.e., a defender having multiple alternatives of protection strategies for system elements, and an attacker having multiple alternatives of attack strategies against different combinations of system elements. The defender aims to minimize the system susceptibility to the attack, subject to budget constraints; on the other hand, the attacker aims to maximize the susceptibility. The problem is defined as a zero-sum game between the defender and the attacker. The assumption that the interests of the attacker and the defender are opposite makes it irrelevant whether or not the defender shows the strategy he/she will use. Thus, the approaches “leader–follower game” or “simultaneous game” do not provide differences as far as the results are concerned. The results show an example of such a situation, and the von Neumann theorem is applied to find the (mixed) equilibrium strategies of the attacker and of the defender.
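The article applies the von Neumann theorem to find the mixed equilibrium. As a rough numerical stand-in (not the authors' method), fictitious play also converges to the mixed equilibrium of a zero-sum game: each player repeatedly best-responds to the opponent's empirical mixture of past plays.

```python
def fictitious_play(payoff, iters=20000):
    """Approximate the mixed equilibrium of a zero-sum game by fictitious play.

    payoff[i][j]: payoff to the row player (attacker) when row plays i and
    column (defender) plays j; the column player minimizes it.
    Returns (row_strategy, col_strategy, estimated game value).
    """
    m, n = len(payoff), len(payoff[0])
    row_counts = [1] + [0] * (m - 1)  # arbitrary initial play
    col_counts = [1] + [0] * (n - 1)
    for _ in range(iters):
        # Each player best-responds to the opponent's empirical mixture.
        row_br = max(range(m), key=lambda i: sum(payoff[i][j] * col_counts[j] for j in range(n)))
        col_br = min(range(n), key=lambda j: sum(payoff[i][j] * row_counts[i] for i in range(m)))
        row_counts[row_br] += 1
        col_counts[col_br] += 1
    x = [c / sum(row_counts) for c in row_counts]
    y = [c / sum(col_counts) for c in col_counts]
    value = sum(x[i] * payoff[i][j] * y[j] for i in range(m) for j in range(n))
    return x, y, value
```

On a matching-pennies payoff matrix the empirical frequencies drift toward the 50/50 equilibrium and the estimated value toward zero, mirroring the irrelevance of revealing one's strategy noted in the abstract.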

Evaluating the Cost, Safety, and Proliferation Risks of Small Floating Nuclear Reactors
Authors: Michael J. Ford, Ahmed Abdulla and M. Granger Morgan
Abstract: It is hard to see how our energy system can be decarbonized if the world abandons nuclear power, but equally hard to introduce the technology in nonnuclear energy states. This is especially true in countries with limited technical, institutional, and regulatory capabilities, where safety and proliferation concerns are acute. Given the need to achieve serious emissions mitigation by mid-century, and the multidecadal effort required to develop robust nuclear governance institutions, we must look to other models that might facilitate nuclear plant deployment while mitigating the technology's risks. One such deployment paradigm is the build-own-operate-return model.

Because returning small land-based reactors containing spent fuel is infeasible, we evaluate the cost, safety, and proliferation risks of a system in which small modular reactors are manufactured in a factory, and then deployed to a customer nation on a floating platform. This floating small modular reactor would be owned and operated by a single entity and returned unopened to the developed state for refueling. We developed a decision model that allows for a comparison of floating and land-based alternatives considering key International Atomic Energy Agency plant-siting criteria. Abandoning onsite refueling is beneficial, and floating reactors built in a central facility can potentially reduce the risk of cost overruns and the consequences of accidents. However, if the floating platform must be built to military-grade specifications, then the cost would be much higher than a land-based system. The analysis tool presented is flexible, and can assist planners in determining the scope of risks and uncertainty associated with different deployment options.

Deterrence and Risk Preferences in Sequential Attacker–Defender Games with Continuous Efforts
Authors: Vineet M. Payyappalli, Jun Zhuang and Victor Richmond R. Jose
Abstract: Most attacker–defender games consider players as risk neutral, whereas in reality attackers and defenders may be risk seeking or risk averse. This article studies the impact of players' risk preferences on their equilibrium behavior and its effect on the notion of deterrence. In particular, we study the effects of risk preferences in a single-period, sequential game where a defender has a continuous range of investment levels that could be strategically chosen to potentially deter an attack. This article presents analytic results related to the effect of attacker and defender risk preferences on the optimal defense effort level and their impact on the deterrence level. Numerical illustrations and some discussion of the effect of risk preferences on deterrence and the utility of using such a model are provided, as well as sensitivity analysis of continuous attack investment levels and uncertainty in the defender's beliefs about the attacker's risk preference. A key contribution of this article is the identification of specific scenarios in which the defender using a model that takes into account risk preferences would be better off than a defender using a traditional risk-neutral model. This study provides insights that could be used by policy analysts and decisionmakers involved in investment decisions in security and safety.
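The sequential structure can be illustrated by grid-search backward induction: the defender chooses a continuous effort level anticipating the attacker's best response. The contest success function a/(a+d), the prize value, and the exponential utility used to encode risk preferences are illustrative assumptions for this sketch, not the article's model.

```python
import math

EFFORT_GRID = [i / 10 for i in range(101)]  # continuous efforts, discretized 0.0 .. 10.0

def utility(w, k):
    """Exponential utility: k > 0 risk averse, k < 0 risk seeking, k ~ 0 neutral."""
    return w if abs(k) < 1e-9 else (1 - math.exp(-k * w)) / k

def best_attack(d, k_att, prize=10.0):
    """Attacker's best effort given observed defense d (contest success a/(a+d))."""
    def eu(a):
        p = a / (a + d) if a + d > 0 else 0.0
        return p * utility(prize - a, k_att) + (1 - p) * utility(-a, k_att)
    return max(EFFORT_GRID, key=eu)

def equilibrium_defense(k_att, k_def, prize=10.0):
    """Backward induction: the defender anticipates the attacker's best response."""
    def eu_def(d):
        a = best_attack(d, k_att, prize)
        p = a / (a + d) if a + d > 0 else 0.0
        return p * utility(-prize - d, k_def) + (1 - p) * utility(-d, k_def)
    return max(EFFORT_GRID, key=eu_def)
```

Sweeping the risk-preference parameters k_att and k_def in such a model is one way to explore how equilibrium defense effort, and hence deterrence, shifts away from the risk-neutral baseline.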

October 2017

The Use of Simulation to Reduce the Domain of “Black Swans” with Application to Hurricane Impacts to Power Systems
Authors: Christine L. Berner, Andrea Staid, Roger Flage and Seth D. Guikema
Abstract: Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to its occurrence), unknown knowns (known to some, but not to relevant analysts), or known knowns where the probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans.

In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that if not identified and treated could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to more fully consider them. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts.
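The search idea can be sketched as stretch-sampling: draw hazard conditions from beyond the historical envelope, run them through the impact model, and flag scenarios whose impact exceeds anything previously observed. The toy impact model and the 1.5x stretch factor below are assumptions standing in for a verified and validated model.

```python
import random

def impact_model(wind_speed, landfall_pop):
    """Stand-in for a validated hurricane impact model: customers without power."""
    return landfall_pop * min(1.0, (wind_speed / 250.0) ** 3)

def find_potential_black_swans(historical, n_samples=10000, stretch=1.5, seed=7):
    """Sample hazard conditions beyond the historical envelope and flag
    scenarios whose impact exceeds anything previously observed."""
    rng = random.Random(seed)
    max_wind = max(w for w, _ in historical)
    max_pop = max(p for _, p in historical)
    worst_seen = max(impact_model(w, p) for w, p in historical)
    flagged = []
    for _ in range(n_samples):
        w = rng.uniform(0, stretch * max_wind)  # allow storms beyond the record
        p = rng.uniform(0, stretch * max_pop)
        impact = impact_model(w, p)
        if impact > worst_seen:
            flagged.append((w, p, impact))
    return flagged
```

Every flagged scenario is a previously unexperienced hazard condition made known to the risk manager, which is precisely how simulation shrinks the domain of black swans.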

Assessing Climate Change Impacts on Wildfire Exposure in Mediterranean Areas
Authors: Antonio T. Monteiro, Mark A. Finney, Liliana Del Giudice, Enrico Scoccimarro and Donatella Spano
Abstract: We used simulation modeling to assess potential climate change impacts on wildfire exposure in Italy and Corsica (France). Weather data were obtained from a regional climate model for the period 1981–2070 using the IPCC A1B emissions scenario. Wildfire simulations were performed with the minimum travel time fire spread algorithm using predicted fuel moisture, wind speed, and wind direction to simulate expected changes in weather for three climatic periods (1981–2010, 2011–2040, and 2041–2070). Overall, the wildfire simulations showed very slight changes in flame length, while other outputs such as burn probability and fire size increased significantly in the second future period (2041–2070), especially in the southern portion of the study area. The projected changes in fuel moisture could result in a lengthening of the fire season for the entire study area. This work represents the first application in Europe of a methodology based on high resolution (250 m) landscape wildfire modeling to assess potential impacts of climate changes on wildfire exposure at a national scale. The findings can provide information and support in wildfire management planning and fire risk mitigation activities.
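The minimum travel time idea treats fire spread as a shortest-path problem: fire arrival time at each cell is the minimum cumulative travel time from the ignition point. A bare-bones sketch using Dijkstra's algorithm on a raster of spread rates (a simplification: real MTT implementations also handle wind-driven anisotropy and diagonal travel):

```python
import heapq

def min_travel_time(rates, ignition):
    """Fire arrival times on a grid via Dijkstra's algorithm.

    rates[r][c]: spread rate of cell (r, c); traversal cost is 1 / rate,
    and a rate of 0 marks a non-burnable cell. Returns the arrival-time grid.
    """
    rows, cols = len(rates), len(rates[0])
    INF = float('inf')
    arrival = [[INF] * cols for _ in range(rows)]
    r0, c0 = ignition
    arrival[r0][c0] = 0.0
    heap = [(0.0, r0, c0)]
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > arrival[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and rates[nr][nc] > 0:
                nt = t + 1.0 / rates[nr][nc]
                if nt < arrival[nr][nc]:
                    arrival[nr][nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return arrival
```

Fuel moisture and wind enter such a model through the per-cell spread rates, which is why the projected weather changes translate directly into changed burn probabilities and fire sizes.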

Construction Safety Risk Modeling and Simulation
Authors: Antoine J.-P. Tixier, Matthew R. Hallowell and Balaji Rajagopalan
Abstract: By building on a genetic-inspired attribute-based conceptual framework for safety risk analysis, we propose a novel approach to define, model, and simulate univariate and bivariate construction safety risk at the situational level. Our fully data-driven techniques provide construction practitioners and academicians with an easy and automated way of getting valuable empirical insights from attribute-based data extracted from unstructured textual injury reports. By applying our methodology on a data set of 814 injury reports, we first show the frequency-magnitude distribution of construction safety risk to be very similar to that of many natural phenomena such as precipitation or earthquakes. Motivated by this observation, and drawing on state-of-the-art techniques in hydroclimatology and insurance, we then introduce univariate and bivariate nonparametric stochastic safety risk generators based on kernel density estimators and copulas. These generators enable the user to produce large numbers of synthetic safety risk values faithful to the original data, allowing safety-related decision making under uncertainty to be grounded on extensive empirical evidence. One of the implications of our study is that like natural phenomena, construction safety may benefit from being studied quantitatively by leveraging empirical data rather than strictly being approached through a managerial perspective using subjective data, which is the current industry standard. Finally, a side but interesting finding is that in our data set, attributes related to high energy levels (e.g., machinery, hazardous substance) and to human error (e.g., improper security of tools) emerge as strong risk shapers.
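For the univariate case, sampling from a Gaussian kernel density estimate reduces to a simple two-step draw: pick an observed value at random, then jitter it with Gaussian noise at the bandwidth scale. The sketch below illustrates that generator idea only; the bandwidth, the nonnegativity clamp, and the data are assumptions of the example, and the bivariate copula machinery is omitted.

```python
import random

def kde_risk_generator(observed, bandwidth, n, seed=42):
    """Draw n synthetic safety-risk values from a Gaussian KDE fitted to
    `observed`: choose a data point at random and add Gaussian noise with
    s.d. `bandwidth` (equivalent to sampling the KDE mixture)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        centre = rng.choice(observed)
        samples.append(max(0.0, centre + rng.gauss(0, bandwidth)))  # risk is nonnegative
    return samples
```

Large synthetic samples produced this way stay faithful to the empirical frequency-magnitude distribution while giving decisionmakers far more draws than the original injury reports provide.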

A Blueprint for Full Collective Flood Risk Estimation: Demonstration for European River Flooding (open access)
Authors: Francesco Serinaldi and Chris G. Kilsby
Abstract: Floods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio-temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records preserving spatio-temporal correlation structures of the entire process, and making a more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows for splitting the modeling of spatio-temporal properties by coupling suitable time series models accounting for temporal dynamics, and multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Weser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that the temporal dependence exerts a key role in reproducing interannual persistence, and thus magnitude and frequency of annual proxy flood losses aggregated at a basin-wide scale, while copulas allow the preservation of the spatial dependence of losses at weekly and annual time scales.
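The spatial-dependence half of the framework rests on the basic copula trick: generate correlated standard normals, then push them through the normal CDF to get dependent uniforms that any marginal distribution can be attached to. A minimal sketch for a Gaussian copula with two sites (the dynamic, time-varying part of the authors' framework is not shown):

```python
import math
import random

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def gaussian_copula_pairs(rho, n, seed=1):
    """Sample n pairs of uniforms with Gaussian-copula dependence rho,
    e.g. flow quantiles at two gauging stations in the same basin."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)  # correlated normal
        pairs.append((norm_cdf(z1), norm_cdf(z2)))
    return pairs
```

Each uniform marginal can then be inverted through a site-specific flow distribution, so the dependence structure and the marginals are modeled separately, which is the appeal of the copula approach for collective loss aggregation.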

Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach (open access)
Authors: Toon Haer, W. J. Wouter Botzen, Hans de Moel and Jeroen C. J. H. Aerts
Abstract: Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which is in our case study an overestimation of a factor two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
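The contrast between decision models (1) and (2) can be shown in a toy household-agent rule. The Prelec probability-weighting form and the parameter values below are illustrative assumptions, not the article's calibration, and the Bayesian-updating variant (3) is omitted.

```python
import math

def expected_utility_invests(p_flood, loss, cost, reduction):
    """Rational agent: invest if the expected avoided loss exceeds the cost."""
    return p_flood * loss * reduction > cost

def prospect_theory_invests(p_flood, loss, cost, reduction, gamma=0.69, lam=2.25):
    """Bounded-rational agent: overweights small probabilities via Prelec
    weighting w(p) = exp(-(-ln p)^gamma) and is loss averse (factor lam)."""
    w = math.exp(-((-math.log(p_flood)) ** gamma))
    return w * lam * loss * reduction > cost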
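The contrast between decision models (1) and (2) can be shown in a toy household-agent rule. The Prelec probability-weighting form and the parameter values below are illustrative assumptions, not the article's calibration, and the Bayesian-updating variant (3) is omitted.

```python
import math

def expected_utility_invests(p_flood, loss, cost, reduction):
    """Rational agent: invest if the expected avoided loss exceeds the cost."""
    return p_flood * loss * reduction > cost

def prospect_theory_invests(p_flood, loss, cost, reduction, gamma=0.69, lam=2.25):
    """Bounded-rational agent: overweights small probabilities via Prelec
    weighting w(p) = exp(-(-ln p)^gamma) and is loss averse (factor lam)."""
    w = math.exp(-((-math.log(p_flood)) ** gamma))
    return w * lam * loss * reduction > cost
```

For a low-probability/high-impact flood, the two rules can disagree: the prospect-theory household invests in loss-reducing measures that the expected-utility household rejects, which is how the choice of behavioral model propagates into aggregate risk estimates.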

An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework
Authors: Venu Kandiah, Andrew R. Binder and Emily Z. Berglund
Abstract: Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the “risk publics” model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters—including social groups, relationships, and communication variables, also from survey data—are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.

September 2017

Resilience of Cyber Systems with Over- and Underregulation
Authors: Viktoria Gisladottir, Alexander A. Ganin, Jeffrey M. Keisler, Jeremy Kepner and Igor Linkov
Abstract: Recent cyber attacks provide evidence of increased threats to our critical systems and infrastructure. A common reaction to a new threat is to harden the system by adding new rules and regulations. As federal and state governments request new procedures to follow, each of their organizations implements their own cyber defense strategies. This unintentionally increases time and effort that employees spend on training and policy implementation and decreases the time and latitude to perform critical job functions, thus raising overall levels of stress. People's performance under stress, coupled with an overabundance of information, results in even more vulnerabilities for adversaries to exploit. In this article, we embed a simple regulatory model that accounts for cybersecurity human factors and an organization's regulatory environment in a model of a corporate cyber network under attack. The resulting model demonstrates the effect of under- and overregulation on an organization's resilience with respect to insider threats. Currently, there is a tendency to use ad-hoc approaches to account for human factors rather than to incorporate them into cyber resilience modeling. It is clear that using a systematic approach utilizing behavioral science, which already exists in cyber resilience assessment, would provide a more holistic view for decisionmakers.

Application of Graph Theory to Cost-Effective Fire Protection of Chemical Plants During Domino Effects
Authors: Nima Khakzad, Gabriele Landucci and Genserik Reniers
Abstract: In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost-effective fire protection of chemical plants subject to fire-induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out-closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out-closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire-induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost-effective fire protection strategy.
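The out-closeness score singled out above measures how quickly fire escalation starting at one installation can reach the rest of the plant. A small sketch of that measure for a directed escalation graph, using breadth-first search and the reach-scaled (Wasserman–Faust) normalization as an assumed convention (betweenness, the other measure used, is omitted for brevity):

```python
from collections import deque

def out_closeness(graph, node):
    """Out-closeness of a node in a directed graph {node: [successors]}:
    reciprocal of the average shortest-path distance to reachable nodes,
    scaled by the fraction of other nodes actually reached."""
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for v in graph.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    reached = len(dist) - 1
    if reached == 0:
        return 0.0
    total = sum(dist.values())
    return (reached / (len(graph) - 1)) * (reached / total)
```

An installation with the highest out-closeness can escalate to many units in few steps, which is why it is the prime candidate for active fire protection.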

A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents
Authors: Hongyang Yu, Faisal Khan and Brian Veitch
Abstract: Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare event), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry.

August 2017

Cascading Delay Risk of Airline Workforce Deployments with Crew Pairing and Schedule Optimization
Authors: Sai Ho Chung, Hoi Lam Ma, Hing Kai Chan
Abstract: This article concerns the assignment of buffer time between two connected flights and the number of reserve crews in crew pairing to mitigate flight disruption due to flight arrival delay. Insufficient crew members for a flight will lead to flight disruptions such as delays or cancellations. In reality, most of these disruption cases are due to arrival delays of the previous flights. To tackle this problem, many research studies have examined the assignment method based on the historical flight arrival delay data of the concerned flights. However, flight arrival delays can be triggered by numerous factors. Accordingly, this article proposes a new forecasting approach using a cascade neural network, which considers a massive amount of historical flight arrival and departure data. The approach also incorporates learning ability so that unknown relationships behind the data can be revealed. Based on the expected flight arrival delay, the buffer time can be determined and a new dynamic reserve crew strategy can then be used to determine the required number of reserve crews. Numerical experiments are carried out based on one year of flight data obtained from 112 airports around the world. The results demonstrate that by predicting the flight departure delay as the input for the prediction of the flight arrival delay, the prediction accuracy can be increased. Moreover, by using the new dynamic reserve crew strategy, the total crew cost can be reduced. This significantly benefits airlines in flight schedule stability and cost saving in the current big data era.
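The cascade idea, predicting departure delay first and feeding that prediction into the arrival-delay prediction, can be shown with two chained regressions. Simple one-variable least squares stands in here for the article's cascade neural network; the data and model form are assumptions of the sketch.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def cascaded_arrival_delay(prev_arrival, dep_model, arr_model):
    """Two-stage cascade: previous-leg arrival delay -> predicted departure
    delay -> predicted arrival delay for the connecting flight."""
    a1, b1 = dep_model
    a2, b2 = arr_model
    dep = a1 + b1 * prev_arrival
    return a2 + b2 * dep
```

The predicted arrival delay then drives both the buffer time between connected flights and the number of reserve crews held back under the dynamic strategy.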

Analysis of Traffic Crashes Involving Pedestrians Using Big Data: Investigation of Contributing Factors and Identification of Hotspots
Authors: Kun Xie, Kaan Ozbay, Abdullah Kurkcu, Hong Yang
Abstract: This study aims to explore the potential of using big data in advancing the pedestrian risk analysis including the investigation of contributing factors and the hotspot identification. Massive amounts of data of Manhattan from a variety of sources were collected, integrated, and processed, including taxi trips, subway turnstile counts, traffic volumes, road network, land use, sociodemographic, and social media data. The whole study area was uniformly split into grid cells as the basic geographical units of analysis. The cell-structured framework makes it easy to incorporate rich and diversified data into risk analysis. The cost of each crash, weighted by injury severity, was assigned to the cells based on the relative distance to the crash site using a kernel density function. A tobit model was developed to relate grid-cell-specific contributing factors to crash costs that are left-censored at zero. The potential for safety improvement (PSI) that could be obtained by using the actual crash cost minus the cost of “similar” sites estimated by the tobit model was used as a measure to identify and rank pedestrian crash hotspots. The proposed hotspot identification method takes into account two important factors that are generally ignored, i.e., injury severity and effects of exposure indicators. Big data, on the one hand, enable more precise estimation of the effects of risk factors by providing richer data for modeling, and on the other hand, enable large-scale hotspot identification with higher resolution than conventional methods based on census tracts or traffic analysis zones.
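Once the tobit model has produced an expected crash cost for "similar" cells, the PSI step itself is a simple excess-cost ranking. The fragment below sketches only that ranking (the tobit fit is omitted); the cell identifiers are hypothetical.

```python
def potential_for_safety_improvement(observed, predicted):
    """PSI per grid cell: observed crash cost minus the cost the model
    predicts for 'similar' cells; positive values flag excess risk."""
    return [obs - pred for obs, pred in zip(observed, predicted)]

def rank_hotspots(cell_ids, observed, predicted, top_k=3):
    """Rank cells by PSI, largest excess crash cost first."""
    psi = potential_for_safety_improvement(observed, predicted)
    return sorted(zip(cell_ids, psi), key=lambda t: t[1], reverse=True)[:top_k]
```

Ranking on PSI rather than raw crash counts is what lets the method account for injury severity and exposure simultaneously.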

Robustness Assessment of Urban Road Network with Consideration of Multiple Hazard Events
Authors: Yaoming Zhou, Jiuh-Biing Sheu, Junwei Wang
Abstract: Robustness measures a system's ability to remain insensitive to disturbances. Previous studies assessed the robustness of transportation networks to a single disturbance, without considering multiple simultaneously occurring events. The purpose of this article is to address this problem and propose a new framework to assess the robustness of an urban transportation network. The framework consists of two layers. The upper layer defines the robustness index based on the impact evaluation in different scenarios obtained from the lower layer, whereas the lower layer evaluates the performance of each hypothetical disrupted road network given by the upper layer. The upper layer has two varieties, that is, robustness against random failures and robustness against intentional attacks. This robustness measurement framework is validated by application to a real-world urban road network in Hong Kong. The results show that the robustness of a transport network with consideration of multiple events is quite different from, and more comprehensive than, that with consideration of only a single disruption. We also propose a Monte Carlo method and a heuristic algorithm to handle different scenarios with multiple hazard events, which prove to be quite efficient. This methodology can also be applied to conduct risk analysis of other systems where multiple failures or disruptions exist.
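The two robustness varieties can be sketched on an undirected network with largest-component size as a stand-in for the lower layer's performance model (the article evaluates actual road-network performance; the performance metric here is an assumption of the example).

```python
import random

def largest_component_fraction(adj, removed):
    """Fraction of surviving nodes in the largest connected component of an
    undirected network {node: [neighbors]} after removing a set of nodes."""
    alive = [n for n in adj if n not in removed]
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        seen.add(start)
        stack, comp = [start], 0
        while stack:
            u = stack.pop()
            comp += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, comp)
    return best / len(alive) if alive else 0.0

def robustness_random(adj, n_fail, trials=500, seed=3):
    """Monte Carlo estimate of expected performance under n_fail
    simultaneous random failures."""
    rng = random.Random(seed)
    nodes = list(adj)
    total = 0.0
    for _ in range(trials):
        removed = set(rng.sample(nodes, n_fail))
        total += largest_component_fraction(adj, removed)
    return total / trials

def robustness_attack(adj, n_fail):
    """Performance when the n_fail highest-degree nodes are attacked."""
    targets = sorted(adj, key=lambda n: len(adj[n]), reverse=True)[:n_fail]
    return largest_component_fraction(adj, set(targets))
```

On a hub-and-spoke network the targeted attack collapses connectivity while the random-failure average barely moves, which is the basic reason the two robustness indices diverge.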

A Big Data Analysis Approach for Rail Failure Risk Assessment
Authors: Ali Jamshidi, Shahrzad Faghih-Roohi, Siamak Hajizadeh, Alfredo Núñez, Robert Babuska, Rolf Dollevoet, Zili Li, Bart De Schutter
Abstract: Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use them to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach.

Satellite Data and Machine Learning for Weather Risk Management and Food Security
Authors: Enrico Biffis, Erik Chavez
Abstract: The increase in frequency and severity of extreme weather events poses challenges for the agricultural sector in developing economies and for food security globally. In this article, we demonstrate how machine learning can be used to mine satellite data and identify pixel-level optimal weather indices that can be used to inform the design of risk transfers and the quantification of the benefits of resilient production technology adoption. We implement the model to study maize production in Mozambique, and show how the approach can be used to produce countrywide risk profiles resulting from the aggregation of local, heterogeneous exposures to rainfall precipitation and excess temperature. We then develop a framework to quantify the economic gains from technology adoption by using insurance costs as the relevant metric, where insurance is broadly understood as the transfer of weather-driven crop losses to a dedicated facility. We consider the case of irrigation in detail, estimating a reduction in insurance costs of at least 30%, which is robust to different configurations of the model. The approach offers a robust framework to understand the costs versus benefits of investment in irrigation infrastructure, but could clearly be used to explore in detail the benefits of more advanced input packages, allowing, for example, for different crop varieties, sowing dates, or fertilizers.

A Community Perspective on Resilience Analytics: A Visual Analysis of Community Mood
Authors: Armando López-Cuevas, José Ramírez-Márquez, Gildardo Sanchez-Ante, Kash Barker
Abstract: Social networks are ubiquitous in everyday life. Although commonly analyzed from a perspective of individual interactions, social networks can provide insights about the collective behavior of a community. It has been shown that changes in the mood of social networks can be correlated to economic trends, public demonstrations, and political reactions, among others. In this work, we study community resilience in terms of the mood variations of the community. We have developed a method to characterize the mood steady-state of online social networks and to analyze how this steady-state is affected under certain perturbations or events that affect a community. We applied this method to study community behavior for three real social network situations, with promising results.

Security Events and Vulnerability Data for Cybersecurity Risk Estimation
Authors: Luca Allodi, Fabio Massacci
Abstract: Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of “weaponized” vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach.
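The two-stage attack model lends itself to a small simulation: an attack succeeds only if the attacker's arsenal of weaponized exploits hits both the perimeter and an internal system. The vulnerability universe, the vulnerable sets, and the arsenal sizes below are invented for illustration; the article calibrates such quantities from security operation center and vulnerability data:

```python
import random

# Illustrative vulnerability universe; integer IDs stand in for CVEs.
UNIVERSE = list(range(200))
PERIMETER_VULNS = set(range(0, 15))   # vulns on the Internet-facing system
INTERNAL_VULNS = set(range(10, 30))   # local vulns on the internal target

def p_two_stage_compromise(arsenal_size, trials=20000, seed=1):
    """Probability that an untargeted attacker, whose arsenal of
    'weaponized' exploits is drawn at random from the universe, can
    (1) breach the perimeter and (2) escalate to the internal system."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        arsenal = set(rng.sample(UNIVERSE, arsenal_size))
        if arsenal & PERIMETER_VULNS and arsenal & INTERNAL_VULNS:
            hits += 1
    return hits / trials
```

Varying `arsenal_size` shows how the estimated probability of compromise grows with attacker power, which is the knob the authors tie to the organization's risk appetite.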

July 2017

Resilience Analytics with Application to Power Grid of a Developing Region
Authors: Heimir Thorisson, James H. Lambert, John J. Cardenas, Igor Linkov
Abstract: Infrastructure development of volatile regions is a significant investment by international government and nongovernment organizations, with attendant requirements for risk management. Global development banks may be tasked to manage these investments and provide a channel between donors and borrowers. Moreover, various stakeholders from the private sector, local and international agencies, and the military can be engaged in conception, planning, and implementation of constituent projects. Emergent and future conditions of military conflict, politics, economics, technology, environment, behaviors, institutions, and society that stress infrastructure development are prevalent, and funding mechanisms are vulnerable to fraud, waste, and abuse. This article will apply resilience analytics with scenario-based preferences to identify the stressors that most influence a prioritization of initiatives in the electric power sector of Afghanistan. The resilience in this article is conceived in terms of the degree of disruption of priorities when stressors influence the preferences of stakeholders, and ultimately a prioritization of initiatives. The ancillary results include an understanding of which initiatives contribute most and least across strategic criteria and which criteria have the most impact for the analysis. The article concludes with recommendations for risk monitoring and risk management of the portfolio of stressors through the life cycle and horizon of grid capacity expansion.

A Critical Discussion and Practical Recommendations on Some Issues Relevant to the Nonprobabilistic Treatment of Uncertainty in Engineering Risk Assessment
Authors: Nicola Pedroni, Enrico Zio, Alberto Pasanisi, Mathieu Couplet
Abstract: Models for the assessment of the risk of complex engineering systems are affected by uncertainties due to the randomness of several phenomena involved and the incomplete knowledge about some of the characteristics of the system. The objective of this article is to provide operative guidelines to handle some conceptual and technical issues related to the treatment of uncertainty in risk assessment for engineering practice. In particular, the following issues are addressed: (1) quantitative modeling and representation of uncertainty coherently with the information available on the system of interest; (2) propagation of the uncertainty from the input(s) to the output(s) of the system model; (3) (Bayesian) updating as new information on the system becomes available; and (4) modeling and representation of dependences among the input variables and parameters of the system model. Different approaches and methods are recommended for efficiently tackling each of issues (1)‒(4) above; the tools considered are derived from both classical probability theory as well as alternative, nonfully probabilistic uncertainty representation frameworks (e.g., possibility theory). The recommendations drawn are supported by the results obtained in illustrative applications of literature.

The Role of Behavioral Responses in the Total Economic Consequences of Terrorist Attacks on U.S. Air Travel Targets
Authors: Adam Rose, Misak Avetisyan, Heather Rosoff, William J. Burns, Paul Slovic, Oswin Chan
Abstract: U.S. airports and airliners are prime terrorist targets. Not only do the facilities and equipment represent high-value assets, but the fear and dread that is spread by such attacks can have tremendous effects on the U.S. economy. This article presents the methodology, data, and estimates of the macroeconomic impacts stemming from behavioral responses to a simulated terrorist attack on a U.S. airport and on a domestic airliner. The analysis is based on risk-perception surveys of these two scenarios. The responses relate to reduced demand for airline travel, shifts to other modes, spending on nontravel items, and savings of potential travel expenditures by U.S. resident passengers considering flying domestic routes. We translate these responses to individual spending categories and feed these direct impact results into a computable general equilibrium (CGE) model of the U.S. economy to ascertain the indirect and total impacts on both the airline industry and the economy as a whole. Overall, the estimated impacts on GDP of both types of attacks exceed $10B. We find that the behavioral economic impacts are almost an order of magnitude higher than the ordinary business interruption impacts for the airliner attack and nearly two orders of magnitude higher for the airport attack. The results are robust to sensitivity tests on the travel behavior of U.S. residents in response to terrorism.

Nuclear Waste Management under Approaching Disaster: A Comparison of Decommissioning Strategies for the German Repository Asse II
Authors: Patrick Ilg, Silke Gabbert, Hans-Peter Weikard
Abstract: This article compares different strategies for handling low- and medium-level nuclear waste buried in a retired potassium mine in Germany (Asse II) that faces significant risk of uncontrollable brine intrusion and, hence, long-term groundwater contamination. We survey the policy process that has resulted in the identification of three possible so-called decommissioning options: complete backfilling, relocation of the waste to deeper levels in the mine, and retrieval. The selection of a decommissioning strategy must compare expected investment costs with expected social damage costs (economic, environmental, and health damage costs) caused by flooding and subsequent groundwater contamination. We apply a cost minimization approach that accounts for the uncertainty regarding the stability of the rock formation and the risk of an uncontrollable brine intrusion. Since economic and health impacts stretch out into the far future, we examine the impact of different discounting methods and rates. Due to parameter uncertainty, we conduct a sensitivity analysis concerning key assumptions. We find that retrieval, the option currently preferred by policymakers, has the lowest expected social damage costs for low discount rates. However, this advantage is offset by higher expected investment costs. Considering all costs, backfilling is the best option for all discounting scenarios considered.
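The cost minimization logic, expected investment cost plus discounted expected social damage cost, compared across options and discount rates, can be sketched as follows. All figures and the flat annual damage risk are illustrative placeholders, not the Asse II estimates:

```python
def expected_total_cost(investment, annual_damage_risk, horizon, rate):
    """Investment cost plus expected social damage costs, discounted.
    annual_damage_risk = yearly probability of brine intrusion times
    the damage cost if it occurs (a deliberately crude stand-in)."""
    discounted = sum(annual_damage_risk / (1 + rate) ** t
                     for t in range(1, horizon + 1))
    return investment + discounted

# Illustrative numbers (arbitrary monetary units), invented for the sketch:
STRATEGIES = {
    "backfilling": dict(investment=500, annual_damage_risk=8),
    "relocation":  dict(investment=2000, annual_damage_risk=5),
    "retrieval":   dict(investment=4000, annual_damage_risk=2),
}

for r in (0.0, 0.01, 0.03):
    best = min(STRATEGIES, key=lambda s: expected_total_cost(
        horizon=300, rate=r, **STRATEGIES[s]))
    print(f"rate={r:.2f}: cheapest option = {best}")
```

With these toy numbers the cheapest option is insensitive to the discount rate, which mirrors the structure of the article's finding for backfilling; the real analysis rests on far richer damage and uncertainty modeling.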

June 2017

Enabling Stakeholder Involvement in Coastal Disaster Resilience Planning
Authors: Thomas P. Bostick, Thomas H. Holzer, Shahryar Sarkani
Abstract: Coastal hazards including storm surge, sea-level rise, and cyclone winds continue to have devastating effects on infrastructure systems and communities despite costly investments in risk management. Risk management has generally not been sufficiently focused on coastal resilience, that is, on involving community stakeholders in the process of making their coastline, as a system, more resilient to coastal storms. Without earlier involvement in coastal resilience planning for their communities, stakeholders are left frustrated after disasters occur. The U.S. National Academies has defined resilience as “the ability to prepare and plan for, absorb, recover from, and more successfully adapt to adverse events” (National Research Council). This article introduces a methodology for enabling stakeholder-involved resilience discussions across physical, information, cognitive, and social domains. The methodology addresses the stages of resilience—prepare, absorb, recover, and adapt—and integrates performance assessment with scenario analysis to characterize disruptions of risk-management priorities. The methodology is illustrated through a case study at Mobile Bay, Alabama, USA.

May 2017

Validating Resilience and Vulnerability Indices in the Context of Natural Disasters
Authors: Laura A. Bakkensen, Cate Fox-Lent, Laura K. Read, Igor Linkov
Abstract: Due to persistent and serious threats from natural disasters around the globe, many have turned to resilience and vulnerability research to guide disaster preparation, recovery, and adaptation decisions. In response, scholars and practitioners have put forth a variety of disaster indices, based on quantifiable metrics, to gauge levels of resilience and vulnerability. However, few indices are empirically validated using observed disaster impacts and, as a result, it is often unclear which index should be preferred for each decision at hand. Thus, we compare and empirically validate five of the top U.S. disaster indices, including three resilience indices and two vulnerability indices. We use observed disaster losses, fatalities, and disaster declarations from the southeastern United States to empirically validate each index. We find that disaster indices, though thoughtfully substantiated by literature and theoretically persuasive, are not all created equal. While four of the five indices perform as predicted in explaining damages, only three explain fatalities and only two explain disaster declarations as expected by theory. These results highlight the need for disaster indices to clearly state index objectives and structure underlying metrics to support validation of the results based on these goals. Further, policymakers should use index results carefully when developing regional policy or investing in resilience and vulnerability improvement projects.

Are People Interested in Probabilities of Natural Disasters?
Authors: Julija Michailova, Tadeusz Tyszka, Katarzyna Pfeifer
Abstract: Previous research has demonstrated that in naturalistic risky decisions people tend to have little interest in receiving information about probabilities. The present research asked whether subjects search for and employ probabilistic information in situations that are representative of natural disasters: namely, situations where (1) they have no control over the occurrence of a negative event and (2) there might be huge losses of physical and human capital. Pseudo-realistic scenarios involving risky situations were presented to 116 experimental participants. Based on the active information search paradigm, subjects were given only a basic description of the situation and had to acquire additional information from the experimenter. In addition to the main task, the individual risk aversion of participants was measured. We demonstrate that in pseudo-naturalistic scenarios involving natural disasters people tend to show more interest in probabilities compared to scenarios with generally more controllable risks. Moreover, this interest increases with an increase in the importance of the situation to the decisionmaker. The importance of the situation also has a positive influence on the thoroughness of information search. The experiment detected no connection between individual risk aversion and information search.

April 2017

Emergence of Flood Insurance in Canada: Navigating Institutional Uncertainty
Author: Jason Thistlethwaite
Abstract: Flood insurance has remained unavailable in Canada based on an assessment that it lacks economic viability. In response to Canada’s costliest flood event to date in 2013, the Canadian insurance industry has started to develop a framework to expand existing property insurance to cover flood damage. Research on flood insurance has overlooked why and how insurance systems transition to expand insurance coverage without evidence of economic viability. This article will address this gap through a case study on the emergence of flood insurance in Canada, and the approach to its expansion. Between 2013 and 2016, insurance industry officials representing over 60% of premiums collected in Canada were interviewed. These interviews revealed that flood insurance is being expanded in response to institutional pressure, specifically external stakeholder expectations that the insurance industry will adopt a stronger role in managing flood risk through coverage of flood damage. Further evidence of this finding is explored by assessing the emergence of a unique flood insurance model that involves a risk-adjusted and optional product along with an expansion of government policy supporting flood risk mitigation. This approach attempts to balance industry concerns about economic viability with institutional pressure to reduce flood risk through insurance. This analysis builds on existing research by providing the first scholarly analysis of flood insurance in Canada, important “empirical” teeth to existing conceptual analysis on the availability of flood insurance, and the influence of institutional factors on risk analysis within the insurance sector.

Multidimensional Approach for Tsunami Vulnerability Assessment: Framing the Territorial Impacts in Two Municipalities in Portugal
Authors: Alexandre Oliveira Tavares, José Leandro Barros, Angela Santos
Abstract: This study presents a new multidimensional methodology for tsunami vulnerability assessment that combines the morphological, structural, social, and tax component of vulnerability. This new approach can be distinguished from previous methodologies that focused primarily on the evaluation of potentially affected buildings and did not use tsunami numerical modeling. The methodology was applied to the Figueira da Foz and Vila do Bispo municipalities in Portugal. For each area, the potential tsunami-inundated areas were calculated considering the 1755 Lisbon tsunami, which is the greatest disaster caused by natural hazards that ever occurred in Portugal. Furthermore, the four components of the vulnerability were calculated to obtain a composite vulnerability index. This methodology enables us to differentiate the two areas in their vulnerability, highlighting the characteristics of the territory components. This methodology can be a starting point for the creation of a local assessment framework at the municipal scale related to tsunami risk. In addition, the methodology is an important support for the different local stakeholders.

Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO
Authors: Heidi Kreibich, Anna Botto, Bruno Merz, Kai Schröter
Abstract: Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study.
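The principle behind BT-FLEMO's probabilistic output, that bootstrap-aggregated trees yield a distribution of loss estimates rather than a single number, can be sketched with a one-split "stump" on a single predictor standing in for the full multivariable trees; the depth-loss data below are invented:

```python
import random
from statistics import mean, quantiles

def fit_stump(xs, ys):
    """One-split regression tree ('stump') on a single predictor,
    a minimal stand-in for the multivariable trees in BT-FLEMO."""
    best = None
    for cut in sorted(set(xs))[1:]:
        lo = [y for x, y in zip(xs, ys) if x < cut]
        hi = [y for x, y in zip(xs, ys) if x >= cut]
        sse = (sum((y - mean(lo)) ** 2 for y in lo)
               + sum((y - mean(hi)) ** 2 for y in hi))
        if best is None or sse < best[0]:
            best = (sse, cut, mean(lo), mean(hi))
    _, cut, lo_mean, hi_mean = best
    return lambda x: lo_mean if x < cut else hi_mean

def bagged_loss_distribution(xs, ys, x_new, n_trees=200, seed=0):
    """Bagging: fit each tree on a bootstrap resample; the spread of
    per-tree predictions forms a distribution of the estimated loss."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in xs]
        tree = fit_stump([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(tree(x_new))
    return preds

# Invented data: water depth (m) vs. relative building loss.
depth = [0.2, 0.4, 0.5, 0.9, 1.1, 1.4, 1.8, 2.2, 2.6, 3.0]
loss  = [0.02, 0.03, 0.05, 0.10, 0.14, 0.18, 0.30, 0.34, 0.41, 0.45]
dist = bagged_loss_distribution(depth, loss, x_new=1.5)
qs = quantiles(dist, n=20)
band = (qs[0], qs[-1])  # roughly a 5th-95th percentile loss band
```

The width of `band` is exactly the kind of prediction-uncertainty statement that deterministic depth-damage functions cannot provide.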

Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard
Authors: Rui Liu, Yun Chen, Jianping Wu, Lei Gao, Damian Barrett, Tingbao Xu, Xiaojuan Li, Linyi Li, Chang Huang, Jia Yu
Abstract: Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts in flood risk assessment. This study develops an integrated framework for estimating spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, drainage proximity, and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inferring and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies.
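The entropy weighting step can be sketched directly: indices whose values are more unevenly distributed across cells carry more information (lower entropy) and receive larger weights. The index scores below are invented, and nonnegative values are assumed:

```python
from math import log

def entropy_weights(matrix):
    """Statistics-based entropy weights for the columns (indices) of a
    cells-by-indices score matrix: weight_j is proportional to
    1 - entropy_j, normalized to sum to one."""
    n, m = len(matrix), len(matrix[0])
    entropies = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [v / total for v in col]
        e = -sum(p * log(p) for p in probs if p > 0) / log(n)
        entropies.append(e)
    dsum = sum(1 - e for e in entropies)
    return [(1 - e) / dsum for e in entropies]

# Three grid cells scored on two illustrative indices
# (say, extreme rainfall and slope); the values are made up.
scores = [[0.9, 0.3],
          [0.1, 0.4],
          [0.2, 0.3]]
w = entropy_weights(scores)
```

Here the first index varies strongly across cells and so dominates the weight vector; the weighted scores would then feed the WNB classifier.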

Evacuation from Natural Disasters: A Systematic Review of the Literature
Authors: Rebecca R. Thompson, Dana Rose Garfin, Roxane Cohen Silver
Abstract: Research on evacuation from natural disasters has been published across the peer-reviewed literature among several disparate disciplinary outlets and has suggested a wide variety of predictors of evacuation behavior. We conducted a systematic review to summarize and evaluate the current literature on demographic, storm-related, and psychosocial correlates of natural disaster evacuation behavior. Eighty-three eligible papers utilizing 83 independent samples were identified. Risk perception was a consistent positive predictor of evacuation, as were several demographic indicators, prior evacuation behavior, and having an evacuation plan. The influence of prior experiences, self-efficacy, personality, and links between expected and actual behavior were examined less frequently. Prospective, longitudinal designs are relatively uncommon. Although difficult to conduct in postdisaster settings, more prospective, methodologically rigorous studies would bolster inferences. Results synthesize the current body of literature on evacuation behavior and can help inform the design of more effective predisaster evacuation warnings and procedures.

March 2017

Nonparametric Tree-Based Predictive Modeling of Storm Outages on an Electric Distribution Network
Authors: Jichao He, David W. Wanik, Brian M. Hartman, Emmanouil N. Anagnostou, Marina Astitha, Maria E. B. Frediani
Abstract: This article compares two nonparametric tree-based models, quantile regression forests (QRF) and Bayesian additive regression trees (BART), for predicting storm outages on an electric distribution network in Connecticut, USA. We evaluated point estimates and prediction intervals of outage predictions for both models using high-resolution weather, infrastructure, and land use data for 89 storm events (including hurricanes, blizzards, and thunderstorms). We found that, spatially, BART predicted more accurate point estimates than QRF. However, QRF produced better prediction intervals at high spatial resolutions (2-km grid cells and towns), while BART predictions aggregated to coarser resolutions (divisions and service territory) more effectively. We also found that the predictive accuracy was dependent on the season (e.g., tree-leaf condition, storm characteristics), and that the predictions were most accurate for winter storms. Given the merits of each individual model, we suggest that BART and QRF be implemented together to show the complete picture of a storm's potential impact on the electric distribution network, which would allow a utility to make better decisions about allocating prestorm resources.
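The way a quantile-forest-style model turns an ensemble into a prediction interval can be sketched as follows; the per-tree predictions are random stand-ins, not outputs of fitted QRF or BART models:

```python
import random
from statistics import mean, quantiles

def forest_point_and_interval(tree_predictions):
    """Ensemble mean as point estimate; 5th/95th empirical percentiles
    of the per-tree predictions as a 90% prediction interval."""
    qs = quantiles(tree_predictions, n=100)
    return mean(tree_predictions), (qs[4], qs[94])

# Random stand-ins for per-tree outage predictions at one grid cell;
# a fitted quantile regression forest would supply these from its trees.
rng = random.Random(42)
per_tree = [max(0.0, rng.gauss(12, 4)) for _ in range(500)]
point, (low, high) = forest_point_and_interval(per_tree)
```

Keeping the whole per-tree distribution, rather than only its mean, is what lets QRF report intervals at fine spatial resolution.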

Disasters as Learning Experiences or Disasters as Policy Opportunities? Examining Flood Insurance Purchases after Hurricanes
Author: Caroline Kousky
Abstract: Flood insurance is a critical risk management strategy, contributing to greater resilience of individuals and communities. The occurrence of disasters has been observed to alter risk management choices, including the decision to insure. This has previously been explained by learning and behavioral biases. When it comes to flood insurance, however, federal disaster aid policy could also play a role since recipients of aid are required to maintain insurance. Using a database of flood insurance policies for all states on the Atlantic and Gulf coasts of the United States between 2001 and 2010, this article uses fixed effects models to examine how take-up rates respond to the occurrence of hurricanes and tropical storms, as well as disaster declarations and aid requirements. Being hit by at least one hurricane in the previous year increases net flood insurance purchases by 7.2%. This effect dies out by three years after the storm. A presidential disaster declaration for floods increases take-up rates by 6.7%. When disaster aid grants are made available to households, take-up rates increase by 5%; this accounts for the majority of the increase in policies after the occurrence of a hurricane. When the models are estimated taking into account which policies are required by disaster aid, hurricanes are estimated to lead to only a 1.5% increase in voluntary purchases. This overlooked federal requirement that disaster aid recipients maintain insurance is thus responsible for a majority of postdisaster insurance purchases.

Weighing the Risks of Nuclear Energy and Climate Change: Trust in Different Information Sources, Perceived Risks, and Willingness to Pay for Alternatives to Nuclear Power
Authors: Annukka Vainio, Riikka Paloniemi, Vilja Varho
Abstract: We examined how individuals perceive nuclear energy in the context of climate change mitigation and how their perceptions are associated with trust in different risk information sources. We analyzed the interrelationships between trust, perceived risk of nuclear power, climate change concern, perception of nuclear energy as an acceptable way to mitigate climate change, and willingness to pay (WTP) for alternatives to nuclear power. A nationwide survey (N = 967) collected in Finland was analyzed with structural equation modeling. The associations between trust and perceived risk of nuclear power, climate change concern, and perception of nuclear power as a way to mitigate climate change varied by the type of information source. Political party support and other background variables were associated with trust in different information sources. The effect of trust in information sources on WTP was mediated by perceived risks and benefits. The results will increase our understanding of how individuals perceive nuclear energy as a way to cut CO2 emissions and the role of trust in different information sources in shaping nuclear risk perceptions and energy choices.

February 2017

In Search of Perfect Foresight? Policy Bias, Management of Unknowns, and What Has Changed in Science Policy Since the Tohoku Disaster
Authors: Junko Mochizuki, Nadejda Komendantova
Abstract: The failure to foresee the catastrophic earthquakes, tsunamis, and nuclear accident of 2011 has been perceived by many in Japan as a fundamental shortcoming of modern disaster risk science. Hampered by a variety of cognitive and institutional biases, the conventional disaster risk management planning based on the “known risks” led to the cascading failures of the interlinked disaster risk management (DRM) apparatus. This realization led to a major rethinking in the use of science for policy and the incorporation of lessons learned in the country's new DRM policy. This study reviews publicly available documents on expert committee discussions and scientific articles to identify what continuities and changes have been made in the use of scientific knowledge in Japanese risk management. In general, the prior influence of cognitive bias (e.g., overreliance on documented hazard risks) has been largely recognized, and increased attention is now being paid to the incorporation of less documented but known risks. This has led to upward adjustments in estimated damages from future risks and recognition of the need for further strengthening of DRM policy. At the same time, there remains significant continuity in the way scientific knowledge is perceived to provide sufficient and justifiable grounds for the development and implementation of DRM policy. The emphasis on “evidence-based policy” in earthquake and tsunami risk reduction measures continues, despite the critical reflections of a group of scientists who advocate a major rethinking of the country's science-policy institutions so that they respect the limitations of the current state of science.

January 2017

Flood Catastrophe Model for Designing Optimal Flood Insurance Program: Estimating Location‐Specific Premiums in the Netherlands
Authors: T. Ermolieva, T. Filatova, Y. Ermoliev, M. Obersteiner, K. M. Bruijn, A. Jeuken
Abstract: As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve stability and robustness of the program for floods of various recurrence intervals, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
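The contrast between the two premium rules can be sketched on simulated losses. The loss-generating process and security level below are assumptions for illustration, not the ICRM's GIS-based flood model or its exact quantile-related risk functions:

```python
import random
from statistics import mean

def premiums(loss_scenarios, security_level=0.99):
    """Two premium rules for the same simulated losses:
    - AAL: average annual loss, which understates rare catastrophes;
    - robust: the loss quantile at which the program remains solvent
      in a `security_level` fraction of scenarios."""
    aal = mean(loss_scenarios)
    ranked = sorted(loss_scenarios)
    idx = min(len(ranked) - 1, int(security_level * len(ranked)))
    return aal, ranked[idx]

# Simulated location losses: usually zero, occasionally catastrophic,
# a crude stand-in for the output of a catastrophe flood model.
rng = random.Random(7)
losses = [rng.paretovariate(1.5) * 100 if rng.random() < 0.02 else 0.0
          for _ in range(10000)]
aal, robust = premiums(losses)
```

For heavy-tailed losses the robust premium far exceeds the AAL premium, which is the article's point: pricing to the average event leaves the program insolvent in precisely the scenarios that matter.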

Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents
Authors: S. Wheatley, B. Sovacool, D. Sornette
Abstract: We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 MM 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5–0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the “dragon-king” phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify an impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60–150 years, and (ii) that a Three Mile Island event (or larger) occurs every 10–20 years. Further—even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima—the expected annual cost and its standard error bracket the cost of a new plant. 
This highlights the importance of improvements not only immediately following Fukushima, but also deeper improvements to effectively exclude the possibility of “dragon-king” disasters. Finally, we find that the International Nuclear Event Scale (INES) is inconsistent in terms of both cost and radiation released. To be consistent with cost data, the Chernobyl and Fukushima disasters would need an INES level of between 10 and 11, rather than the maximum of 7.
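As a rough illustration of how a Pareto tail links event cost to return period, the sketch below computes an expected recurrence interval. All inputs (the 1.3 events per year across the fleet, the assumed Fukushima-scale cost, and the midpoint α = 0.55) are illustrative assumptions, not figures taken from the paper:

```python
# Illustrative sketch, not the authors' model. A Pareto tail
# P(X > x) = (x_min / x) ** alpha gives the chance a cost exceeds x;
# the return period is then T = 1 / (rate * P(X > x)).

def pareto_exceedance(x, x_min, alpha):
    """Probability that a Pareto(alpha) cost exceeds x (for x >= x_min)."""
    return (x_min / x) ** alpha

def return_period(cost, x_min, alpha, events_per_year):
    """Expected years between events whose cost exceeds `cost`."""
    return 1.0 / (events_per_year * pareto_exceedance(cost, x_min, alpha))

# Assumed inputs: ~1.3 qualifying events/yr fleet-wide, a 20 MM USD
# threshold, alpha = 0.55, and a hypothetical Fukushima-scale cost
# of 166,000 MM USD chosen purely for illustration.
T = return_period(166_000, x_min=20, alpha=0.55, events_per_year=1.3)
# T comes out on the order of a century, inside the paper's 60-150 year range
```

With a tail this heavy (α < 1), the distribution has no finite mean, which is why the expected annual cost is dominated by the largest events.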

Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi‐Directional Road Tunnel
By: C. Caliendo and M. L. De Guglielmo
Abstract: A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP, the percentage of HGVs, and a failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air, depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. One should therefore be wary of assuming that an itinerary running completely in the open air is preferable for transporting dangerous goods when that itinerary passes through a populated area. The QRA may help decision-makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs.
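For readers unfamiliar with F/N curves, the sketch below shows how one is assembled: for each fatality count N, the curve plots the cumulative annual frequency F of all scenarios causing N or more fatalities. The scenario list is hypothetical and chosen only to illustrate the construction, not data from the study:

```python
# Minimal F/N curve sketch with made-up scenario data (not from the paper).
# Each scenario is (annual frequency, expected fatalities); the F/N curve
# gives, for each N, the total frequency of scenarios with >= N fatalities.

def fn_curve(scenarios):
    """Return sorted (N, F) points, F = cumulative frequency of >= N fatalities."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

# Hypothetical tunnel accident scenarios for illustration:
scenarios = [
    (1e-3, 1),    # small pool fire
    (1e-4, 10),   # large fire, no explosion
    (1e-6, 100),  # explosion / toxic release
]
curve = fn_curve(scenarios)
```

Plotted on log-log axes, the curve is compared against a societal risk criterion line; routes or tunnels whose curve crosses above the criterion call for additional safety measures.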

Failing to Fix What is Found: Risk Accommodation in the Oil and Gas Industry
By: M. R. Stackhouse and R. Stewart
Abstract: The present program of research synthesizes the findings from three studies in line with two goals. First, the present research explores how the oil and gas industry is performing at risk mitigation in terms of finding and fixing errors when they occur. Second, the present research explores what factors in the work environment relate to a risk-accommodating environment. Study 1 presents a descriptive evaluation of high-consequence incidents at 34 oil and gas companies over a 12-month period (N = 873), especially in terms of those companies’ effectiveness at investigating and fixing errors. The analysis found that most investigations were fair in terms of quality (mean = 75.50%), with a smaller proportion that were weak (mean = 11.40%) or strong (mean = 13.24%). Furthermore, most companies took at least one corrective action for high-consequence incidents, but few of these corrective actions were confirmed as having been completed (mean = 13.77%). In fact, most corrective actions were secondary interim administrative controls (e.g., having a safety meeting) rather than fair or strong controls (e.g., training, engineering elimination). Study 2a found that several environmental factors explain 56.41% of the variance in safety, including management's disengagement from safety concerns, finding and fixing errors, safety management system effectiveness, training, employee safety, procedures, and a production-over-safety culture. Qualitative results from Study 2b suggest that a compliance-based culture of adhering to liability concerns, out-group blame, and a production-over-safety orientation may all impede safety effectiveness.