EISG in Risk Analysis (cont'd)

Risk Analysis is the official journal of the Society for Risk Analysis and publishes peer-reviewed, original research on both the theory and practice of risk. Its application areas are vast. Below are articles of particular relevance to the Engineering and Infrastructure Specialty Group.

August 2017

Cascading Delay Risk of Airline Workforce Deployments with Crew Pairing and Schedule Optimization
Authors: Sai Ho Chung, Hoi Lam Ma, Hing Kai Chan
Abstract: This article concerns the assignment of buffer time between two connected flights and the number of reserve crews in crew pairing to mitigate flight disruption due to flight arrival delay. An insufficient number of crew members for a flight will lead to flight disruptions such as delays or cancellations. In reality, most of these disruption cases are due to arrival delays of the previous flights. To tackle this problem, many research studies have examined assignment methods based on the historical flight arrival delay data of the concerned flights. However, flight arrival delays can be triggered by numerous factors. Accordingly, this article proposes a new forecasting approach using a cascade neural network, which considers a massive amount of historical flight arrival and departure data. The approach also incorporates learning ability so that unknown relationships behind the data can be revealed. Based on the expected flight arrival delay, the buffer time can be determined, and a new dynamic reserve crew strategy can then be used to determine the required number of reserve crews. Numerical experiments are carried out based on one year of flight data obtained from 112 airports around the world. The results demonstrate that by predicting the flight departure delay as an input to the prediction of the flight arrival delay, the prediction accuracy can be increased. Moreover, by using the new dynamic reserve crew strategy, the total crew cost can be reduced. In the current big data era, this offers airlines significant benefits in flight schedule stability and cost savings.
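
The cascade idea, predicting the departure delay first and feeding it into the arrival-delay prediction, can be sketched with simple stand-in models. The linear predictors, all coefficients, and the reserve-crew rule below are illustrative assumptions, not the article's fitted cascade neural network.

```python
# Two-stage ("cascade") delay sketch: stage 1 estimates departure delay
# from operational features; stage 2 estimates arrival delay using
# stage 1's output as an input. All coefficients are hypothetical.

def predict_departure_delay(turnaround_min, inbound_delay_min):
    # Illustrative linear model: delay propagates when the turnaround
    # buffer is short and the inbound aircraft is late.
    return max(0.0, 0.8 * inbound_delay_min - 0.2 * turnaround_min + 5.0)

def predict_arrival_delay(departure_delay_min, airborne_buffer_min):
    # Part of the departure delay is absorbed en route by schedule padding.
    return max(0.0, 0.9 * departure_delay_min - airborne_buffer_min)

def required_reserve_crews(expected_arrival_delay_min, buffer_min):
    # Dynamic reserve policy: hold a reserve crew only when the expected
    # arrival delay exceeds the assigned connection buffer time.
    return 1 if expected_arrival_delay_min > buffer_min else 0

dep = predict_departure_delay(turnaround_min=30, inbound_delay_min=40)
arr = predict_arrival_delay(dep, airborne_buffer_min=10)
reserves = required_reserve_crews(arr, buffer_min=15)
print(round(dep, 1), round(arr, 1), reserves)
```

The chaining is the point: the arrival-delay estimate improves because it sees the predicted departure delay rather than only the raw schedule.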

Analysis of Traffic Crashes Involving Pedestrians Using Big Data: Investigation of Contributing Factors and Identification of Hotspots
Authors: Kun Xie, Kaan Ozbay, Abdullah Kurkcu, Hong Yang
Abstract: This study aims to explore the potential of using big data to advance pedestrian risk analysis, including the investigation of contributing factors and the identification of hotspots. Massive amounts of data on Manhattan were collected, integrated, and processed from a variety of sources, including taxi trips, subway turnstile counts, traffic volumes, road network, land use, sociodemographic, and social media data. The whole study area was uniformly split into grid cells as the basic geographical units of analysis. This cell-structured framework makes it easy to incorporate rich and diversified data into risk analysis. The cost of each crash, weighted by injury severity, was assigned to cells based on the relative distance to the crash site using a kernel density function. A tobit model was developed to relate grid-cell-specific contributing factors to crash costs that are left-censored at zero. The potential for safety improvement (PSI), obtained by subtracting the cost of "similar" sites estimated by the tobit model from the actual crash cost, was used as a measure to identify and rank pedestrian crash hotspots. The proposed hotspot identification method takes into account two important factors that are generally ignored: injury severity and the effects of exposure indicators. Big data, on the one hand, enable more precise estimation of the effects of risk factors by providing richer data for modeling and, on the other hand, enable large-scale hotspot identification at higher resolution than conventional methods based on census tracts or traffic analysis zones.
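
The kernel-weighted cost assignment and the PSI measure can be sketched as follows; the Gaussian kernel, bandwidth, grid layout, and all cost figures are illustrative assumptions, not the study's calibrated values.

```python
# Spread a severity-weighted crash cost over nearby grid cells with a
# kernel density function, then compute a PSI-style excess cost.
# Kernel choice, bandwidth, and all numbers are illustrative.
import math

def kernel_weight(distance, bandwidth):
    # Gaussian kernel: weight decays with distance from the crash site.
    return math.exp(-0.5 * (distance / bandwidth) ** 2)

def assign_crash_cost(crash_xy, crash_cost, cell_centers, bandwidth=1.0):
    # Distribute the crash cost over cells in proportion to kernel weight.
    weights = [kernel_weight(math.dist(crash_xy, c), bandwidth)
               for c in cell_centers]
    total = sum(weights)
    return [crash_cost * w / total for w in weights]

def potential_for_safety_improvement(actual_cost, model_cost):
    # PSI: actual crash cost minus the model estimate for "similar" sites.
    return actual_cost - model_cost

cells = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
costs = assign_crash_cost((0.0, 0.0), crash_cost=100.0, cell_centers=cells)
psi = potential_for_safety_improvement(costs[0], 40.0)
```

Cells nearest the crash absorb most of its cost, and a cell whose actual cost exceeds the model estimate for comparable cells gets a positive PSI and ranks as a hotspot.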

Robustness Assessment of Urban Road Network with Consideration of Multiple Hazard Events
Authors: Yaoming Zhou, Jiuh-Biing Sheu, Junwei Wang
Abstract: Robustness measures a system's ability to remain insensitive to disturbances. Previous studies assessed the robustness of transportation networks to a single disturbance, without considering multiple simultaneously occurring events. The purpose of this article is to address this problem and propose a new framework for assessing the robustness of an urban transportation network. The framework consists of two layers. The upper layer defines the robustness index based on the impact evaluation in different scenarios obtained from the lower layer, whereas the lower layer evaluates the performance of each hypothetical disrupted road network given by the upper layer. The upper layer has two varieties, that is, robustness against random failures and robustness against intentional attacks. This robustness measurement framework is validated by application to a real-world urban road network in Hong Kong. The results show that the robustness of a transport network assessed with consideration of multiple events is quite different from, and more comprehensive than, that assessed with consideration of only a single disruption. We also propose a Monte Carlo method and a heuristic algorithm to handle different scenarios with multiple hazard events, both of which prove to be quite efficient. This methodology can also be applied to risk analysis of other systems in which multiple failures or disruptions occur.
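
The two-layer structure can be sketched with a toy network. The largest-connected-component performance metric, the six-node network, and the sampling scheme below are assumptions for illustration, not the article's model of Hong Kong's road network.

```python
# Lower layer: evaluate a disrupted network's performance (here, the
# size of the largest connected component). Upper layer: average the
# relative performance over Monte Carlo samples of multiple
# simultaneous random edge failures. Toy network and metric assumed.
import random

def largest_component(nodes, edges):
    # Union-find over the surviving edges.
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    sizes = {}
    for n in nodes:
        r = find(n)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

def robustness_random_failure(nodes, edges, n_fail, n_samples=2000, seed=0):
    # Robustness index: expected performance after n_fail simultaneous
    # random edge failures, relative to the undisrupted network.
    rng = random.Random(seed)
    base = largest_component(nodes, edges)
    total = 0.0
    for _ in range(n_samples):
        surviving = rng.sample(edges, len(edges) - n_fail)
        total += largest_component(nodes, surviving) / base
    return total / n_samples

nodes = list(range(6))
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]
r1 = robustness_random_failure(nodes, edges, n_fail=1)
r2 = robustness_random_failure(nodes, edges, n_fail=2)
```

Enumerating worst-case edge removals instead of sampling random ones would give the intentional-attack variety of the upper layer.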

A Big Data Analysis Approach for Rail Failure Risk Assessment
Authors: Ali Jamshidi, Shahrzad Faghih-Roohi, Siamak Hajizadeh, Alfredo Núñez, Robert Babuska, Rolf Dollevoet, Zili Li, Bart De Schutter
Abstract: Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could have a considerable impact not only on train delays and maintenance costs, but also on the safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats, which are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use these measurements to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach.

Satellite Data and Machine Learning for Weather Risk Management and Food Security
Authors: Enrico Biffis, Erik Chavez
Abstract: The increase in frequency and severity of extreme weather events poses challenges for the agricultural sector in developing economies and for food security globally. In this article, we demonstrate how machine learning can be used to mine satellite data and identify pixel-level optimal weather indices that can be used to inform the design of risk transfers and the quantification of the benefits of resilient production technology adoption. We implement the model to study maize production in Mozambique, and show how the approach can be used to produce countrywide risk profiles resulting from the aggregation of local, heterogeneous exposures to rainfall precipitation and excess temperature. We then develop a framework to quantify the economic gains from technology adoption by using insurance costs as the relevant metric, where insurance is broadly understood as the transfer of weather-driven crop losses to a dedicated facility. We consider the case of irrigation in detail, estimating a reduction in insurance costs of at least 30%, which is robust to different configurations of the model. The approach offers a robust framework to understand the costs versus benefits of investment in irrigation infrastructure, but could clearly be used to explore in detail the benefits of more advanced input packages, allowing, for example, for different crop varieties, sowing dates, or fertilizers.

A Community Perspective on Resilience Analytics: A Visual Analysis of Community Mood
Authors: Armando López-Cuevas, José Ramírez-Márquez, Gildardo Sanchez-Ante, Kash Barker
Abstract: Social networks are ubiquitous in everyday life. Although commonly analyzed from a perspective of individual interactions, social networks can provide insights about the collective behavior of a community. It has been shown that changes in the mood of social networks can be correlated with economic trends, public demonstrations, and political reactions, among others. In this work, we study community resilience in terms of variations in community mood. We have developed a method to characterize the mood steady-state of online social networks and to analyze how this steady-state is affected under certain perturbations or events that affect a community. We applied this method to study community behavior in three real social network situations, with promising results.

Security Events and Vulnerability Data for Cybersecurity Risk Estimation
Authors: Luca Allodi, Fabio Massacci
Abstract: Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage on the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of “weaponized” vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach.
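
The two-stage attack structure lends itself to a small probabilistic sketch. The per-vulnerability exploit probability, the size of the vulnerability pool, and the arsenal model below are hypothetical stand-ins, not the authors' fitted quantities.

```python
# Two-stage untargeted-attack sketch: the internal system is compromised
# if the attacker both breaches an Internet-facing host and holds a
# "weaponized" exploit for one of the target's local vulnerabilities.
# All probabilities and counts are hypothetical.

def p_breach(n_frontier_vulns, p_exploit_each=0.05):
    # Probability at least one Internet-facing vulnerability is exploited.
    return 1 - (1 - p_exploit_each) ** n_frontier_vulns

def p_escalation(n_local_vulns, attacker_arsenal, vuln_pool=1000):
    # Probability the attacker's arsenal (drawn without replacement from
    # a pool of known vulnerabilities) covers at least one local
    # vulnerability of the target. Arsenal size models attacker power.
    miss = 1.0
    for i in range(attacker_arsenal):
        miss *= (vuln_pool - n_local_vulns - i) / (vuln_pool - i)
    return 1 - miss

def p_attack(n_frontier, n_local, arsenal):
    # Both stages must succeed for the two-stage attack to land.
    return p_breach(n_frontier) * p_escalation(n_local, arsenal)

p = p_attack(n_frontier=3, n_local=10, arsenal=50)
```

Raising the arsenal size raises the escalation probability, which is one way to tune the model to an organization's risk appetite.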

July 2017

Resilience Analytics with Application to Power Grid of a Developing Region
Authors: Heimir Thorisson, James H. Lambert, John J. Cardenas, Igor Linkov
Abstract: Infrastructure development in volatile regions is a significant investment by international government and nongovernment organizations, with attendant requirements for risk management. Global development banks may be tasked to manage these investments and provide a channel between donors and borrowers. Moreover, various stakeholders from the private sector, local and international agencies, and the military can be engaged in the conception, planning, and implementation of constituent projects. Emergent and future conditions of military conflict, politics, economics, technology, environment, behaviors, institutions, and society that stress infrastructure development are prevalent, and funding mechanisms are vulnerable to fraud, waste, and abuse. This article applies resilience analytics with scenario-based preferences to identify the stressors that most influence a prioritization of initiatives in the electric power sector of Afghanistan. Resilience is conceived here in terms of the degree of disruption of priorities when stressors influence the preferences of stakeholders, and ultimately a prioritization of initiatives. Ancillary results include an understanding of which initiatives contribute most and least across strategic criteria and which criteria have the most impact on the analysis. The article concludes with recommendations for risk monitoring and risk management of the portfolio of stressors through the life cycle and horizon of grid capacity expansion.

A Critical Discussion and Practical Recommendations on Some Issues Relevant to the Nonprobabilistic Treatment of Uncertainty in Engineering Risk Assessment
Authors: Nicola Pedroni, Enrico Zio, Alberto Pasanisi, Mathieu Couplet
Abstract: Models for the assessment of the risk of complex engineering systems are affected by uncertainties due to the randomness of several phenomena involved and incomplete knowledge about some characteristics of the system. The objective of this article is to provide operative guidelines for handling some conceptual and technical issues related to the treatment of uncertainty in risk assessment for engineering practice. In particular, the following issues are addressed: (1) quantitative modeling and representation of uncertainty coherent with the information available on the system of interest; (2) propagation of the uncertainty from the input(s) to the output(s) of the system model; (3) (Bayesian) updating as new information on the system becomes available; and (4) modeling and representation of dependences among the input variables and parameters of the system model. Different approaches and methods are recommended for efficiently tackling each of issues (1)‒(4); the tools considered derive from classical probability theory as well as from alternative, nonfully probabilistic frameworks for uncertainty representation (e.g., possibility theory). The recommendations are supported by results obtained in illustrative applications from the literature.

The Role of Behavioral Responses in the Total Economic Consequences of Terrorist Attacks on U.S. Air Travel Targets
Authors: Adam Rose, Misak Avetisyan, Heather Rosoff, William J. Burns, Paul Slovic, Oswin Chan
Abstract: U.S. airports and airliners are prime terrorist targets. Not only do the facilities and equipment represent high-value assets, but the fear and dread that such attacks spread can have tremendous effects on the U.S. economy. This article presents the methodology, data, and estimates of the macroeconomic impacts stemming from behavioral responses to a simulated terrorist attack on a U.S. airport and on a domestic airliner. The analysis is based on risk-perception surveys of these two scenarios. The responses relate to reduced demand for airline travel, shifts to other modes, spending on nontravel items, and savings of potential travel expenditures by U.S. resident passengers considering flying domestic routes. We translate these responses to individual spending categories and feed these direct impact results into a computable general equilibrium (CGE) model of the U.S. economy to ascertain the indirect and total impacts on both the airline industry and the economy as a whole. Overall, the estimated impacts on GDP of both types of attacks exceed $10B. We find that the behavioral economic impacts are almost an order of magnitude higher than the ordinary business interruption impacts for the airliner attack and nearly two orders of magnitude higher for the airport attack. The results are robust to sensitivity tests on the travel behavior of U.S. residents in response to terrorism.

Nuclear Waste Management under Approaching Disaster: A Comparison of Decommissioning Strategies for the German Repository Asse II
Authors: Patrick Ilg, Silke Gabbert, Hans-Peter Weikard
Abstract: This article compares different strategies for handling low- and medium-level nuclear waste buried in a retired potassium mine in Germany (Asse II) that faces a significant risk of uncontrollable brine intrusion and, hence, long-term groundwater contamination. We survey the policy process that has resulted in the identification of three possible so-called decommissioning options: complete backfilling, relocation of the waste to deeper levels in the mine, and retrieval. The selection of a decommissioning strategy must compare expected investment costs with expected social damage costs (economic, environmental, and health damage costs) caused by flooding and subsequent groundwater contamination. We apply a cost minimization approach that accounts for the uncertainty regarding the stability of the rock formation and the risk of an uncontrollable brine intrusion. Since economic and health impacts stretch far into the future, we examine the impact of different discounting methods and rates. Due to parameter uncertainty, we conduct a sensitivity analysis of key assumptions. We find that retrieval, the option currently preferred by policymakers, has the lowest expected social damage costs for low discount rates. However, this advantage is outweighed by higher expected investment costs. Considering all costs, backfilling is the best option for all discounting scenarios considered.
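
The cost comparison can be sketched as expected investment cost plus discounted expected damage cost per option. All monetary figures, probabilities, the horizon, and the discount rate below are invented for illustration and are not the study's data.

```python
# Cost-minimization sketch: each decommissioning option's total cost is
# its expected investment cost plus the discounted stream of expected
# social damage costs. All numbers are hypothetical.

def present_value(annual_damage, failure_prob, horizon_years, rate):
    # Expected damage in each year t, discounted at a constant rate.
    return sum(failure_prob * annual_damage / (1 + rate) ** t
               for t in range(1, horizon_years + 1))

def total_cost(investment, annual_damage, failure_prob, horizon, rate):
    return investment + present_value(annual_damage, failure_prob,
                                      horizon, rate)

options = {  # illustrative costs in billions; probabilities per year
    "backfilling": dict(investment=2.0, annual_damage=1.0, failure_prob=0.02),
    "relocation":  dict(investment=4.0, annual_damage=0.6, failure_prob=0.01),
    "retrieval":   dict(investment=6.0, annual_damage=0.2, failure_prob=0.005),
}
rate, horizon = 0.03, 100
ranked = sorted(options, key=lambda k: total_cost(rate=rate, horizon=horizon,
                                                  **options[k]))
```

With these invented numbers the ranking reproduces the article's qualitative finding: retrieval has the lowest damage cost, but its higher investment cost makes backfilling the overall minimum, and the result can be re-run under different discount rates.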

June 2017

Enabling Stakeholder Involvement in Coastal Disaster Resilience Planning
Authors: Thomas P. Bostick, Thomas H. Holzer, Shahryar Sarkani
Abstract: Coastal hazards including storm surge, sea-level rise, and cyclone winds continue to have devastating effects on infrastructure systems and communities despite costly investments in risk management. Risk management has generally not focused sufficiently on coastal resilience or involved community stakeholders in the process of making their coastline, as a system, more resilient to coastal storms. Without earlier involvement in coastal resilience planning for their community, stakeholders are left frustrated when disasters occur. The U.S. National Academies has defined resilience as “the ability to prepare and plan for, absorb, recover from, and more successfully adapt to adverse events” (National Research Council). This article introduces a methodology for enabling stakeholder-involved resilience discussions across physical, information, cognitive, and social domains. The methodology addresses the stages of resilience (prepare, absorb, recover, and adapt) and integrates performance assessment with scenario analysis to characterize disruptions of risk-management priorities. The methodology is illustrated through a case study at Mobile Bay, Alabama, USA.

May 2017

Validating Resilience and Vulnerability Indices in the Context of Natural Disasters
Authors: Laura A. Bakkensen, Cate Fox-Lent, Laura K. Read, Igor Linkov
Abstract: Due to persistent and serious threats from natural disasters around the globe, many have turned to resilience and vulnerability research to guide disaster preparation, recovery, and adaptation decisions. In response, scholars and practitioners have put forth a variety of disaster indices, based on quantifiable metrics, to gauge levels of resilience and vulnerability. However, few indices are empirically validated using observed disaster impacts and, as a result, it is often unclear which index should be preferred for each decision at hand. Thus, we compare and empirically validate five of the top U.S. disaster indices, including three resilience indices and two vulnerability indices. We use observed disaster losses, fatalities, and disaster declarations from the southeastern United States to empirically validate each index. We find that disaster indices, though thoughtfully substantiated by literature and theoretically persuasive, are not all created equal. While four of the five indices perform as predicted in explaining damages, only three explain fatalities and only two explain disaster declarations as expected by theory. These results highlight the need for disaster indices to clearly state index objectives and structure underlying metrics to support validation of the results based on these goals. Further, policymakers should use index results carefully when developing regional policy or investing in resilience and vulnerability improvement projects.

Are People Interested in Probabilities of Natural Disasters?
Authors: Julija Michailova, Tadeusz Tyszka, Katarzyna Pfeifer
Abstract: Previous research has demonstrated that in naturalistic risky decisions people tend to have little interest in receiving information about probabilities. The present research asked whether subjects search for and employ probabilistic information in situations that are representative of natural disasters: namely, situations where (1) they have no control over the occurrence of a negative event and (2) there might be huge losses of physical and human capital. Pseudo-realistic scenarios involving risky situations were presented to 116 experimental participants. Based on the active information search paradigm, subjects were given only a basic description of the situation and had to acquire additional information from the experimenter. In addition to the main task, the individual risk aversion of participants was measured. We demonstrate that in pseudo-naturalistic scenarios involving natural disasters people tend to show more interest in probabilities compared to scenarios with generally more controllable risks. Moreover, this interest increases with an increase in the importance of the situation to the decisionmaker. The importance of the situation also has a positive influence on the thoroughness of information search. The experiment detected no connection between individual risk aversion and information search.

April 2017

Emergence of Flood Insurance in Canada: Navigating Institutional Uncertainty
Author: Jason Thistlethwaite
Abstract: Flood insurance has remained unavailable in Canada based on an assessment that it lacks economic viability. In response to Canada's costliest flood event to date, in 2013, the Canadian insurance industry has started to develop a framework to expand existing property insurance to cover flood damage. Research on flood insurance has overlooked why and how insurance systems transition to expand coverage without evidence of economic viability. This article addresses this gap through a case study on the emergence of flood insurance in Canada and the approach to its expansion. Between 2013 and 2016, insurance industry officials representing over 60% of premiums collected in Canada were interviewed. These interviews revealed that flood insurance is being expanded in response to institutional pressure, specifically external stakeholder expectations that the insurance industry will adopt a stronger role in managing flood risk through coverage of flood damage. Further evidence of this finding is explored by assessing the emergence of a unique flood insurance model that involves a risk-adjusted and optional product along with an expansion of government policy supporting flood risk mitigation. This approach attempts to balance industry concerns about economic viability with institutional pressure to reduce flood risk through insurance. This analysis builds on existing research by providing the first scholarly analysis of flood insurance in Canada, important “empirical” teeth for existing conceptual analysis of the availability of flood insurance, and insight into the influence of institutional factors on risk analysis within the insurance sector.

Multidimensional Approach for Tsunami Vulnerability Assessment: Framing the Territorial Impacts in Two Municipalities in Portugal
Authors: Alexandre Oliveira Tavares, José Leandro Barros, Angela Santos
Abstract: This study presents a new multidimensional methodology for tsunami vulnerability assessment that combines the morphological, structural, social, and tax component of vulnerability. This new approach can be distinguished from previous methodologies that focused primarily on the evaluation of potentially affected buildings and did not use tsunami numerical modeling. The methodology was applied to the Figueira da Foz and Vila do Bispo municipalities in Portugal. For each area, the potential tsunami-inundated areas were calculated considering the 1755 Lisbon tsunami, which is the greatest disaster caused by natural hazards that ever occurred in Portugal. Furthermore, the four components of the vulnerability were calculated to obtain a composite vulnerability index. This methodology enables us to differentiate the two areas in their vulnerability, highlighting the characteristics of the territory components. This methodology can be a starting point for the creation of a local assessment framework at the municipal scale related to tsunami risk. In addition, the methodology is an important support for the different local stakeholders.

Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO
Authors: Heidi Kreibich, Anna Botto, Bruno Merz, Kai Schröter
Abstract: Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study.

Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard
Authors: Rui Liu, Yun Chen, Jianping Wu, Lei Gao, Damian Barrett, Tingbao Xu, Xiaojuan Li, Linyi Li, Chang Huang, Jia Yu
Abstract: Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), a geographic information system, and remote sensing. The northern part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated against the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naïve Bayes (NB) method for regional flood hazard assessment and can be extended to other likelihood-related environmental hazard studies.
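
The entropy weighting and the weighted naïve Bayes combination can be sketched as follows. The discretized index values, class-conditional tables, and priors are invented for illustration and are not the study's data.

```python
# Weighted naive Bayes (WNB) sketch with entropy-derived index weights:
# indices whose empirical value distribution is less uniform (lower
# entropy) receive more weight, and each class-conditional likelihood
# enters the Bayes product raised to its weight. Data are illustrative.
import math

def entropy_weights(index_columns):
    # 1 - normalized Shannon entropy per index, rescaled to sum to one.
    degrees = []
    for col in index_columns:
        n = len(col)
        probs = [col.count(v) / n for v in set(col)]
        h = -sum(p * math.log(p) for p in probs)
        degrees.append(1.0 - h / math.log(n))
    total = sum(degrees)
    return [d / total for d in degrees]

def wnb_posterior(cell, classes, weights):
    # classes: {label: (prior, [ {value: P(value | label)} per index ])}
    scores = {}
    for label, (prior, likelihoods) in classes.items():
        s = math.log(prior)
        for wgt, table, v in zip(weights, likelihoods, cell):
            s += wgt * math.log(table.get(v, 1e-6))
        scores[label] = s
    z = sum(math.exp(v) for v in scores.values())
    return {k: math.exp(v) / z for k, v in scores.items()}

rain = [2, 2, 1, 0, 2, 2]   # discretized extreme-rainfall class per cell
slope = [1, 0, 1, 0, 1, 0]  # discretized slope class per cell
w = entropy_weights([rain, slope])
classes = {
    "flood":    (0.3, [{0: 0.1, 1: 0.3, 2: 0.6}, {0: 0.7, 1: 0.3}]),
    "no_flood": (0.7, [{0: 0.5, 1: 0.3, 2: 0.2}, {0: 0.3, 1: 0.7}]),
}
post = wnb_posterior((2, 0), classes, w)  # high rainfall, flat cell
```

Applying `wnb_posterior` cell by cell over the grid yields the likelihood surface that the study maps and then validates against MODIS inundation extents.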

Evacuation from Natural Disasters: A Systematic Review of the Literature
Authors: Rebecca R. Thompson, Dana Rose Garfin, Roxane Cohen Silver
Abstract: Research on evacuation from natural disasters has been published across the peer-reviewed literature among several disparate disciplinary outlets and has suggested a wide variety of predictors of evacuation behavior. We conducted a systematic review to summarize and evaluate the current literature on demographic, storm-related, and psychosocial correlates of natural disaster evacuation behavior. Eighty-three eligible papers utilizing 83 independent samples were identified. Risk perception was a consistent positive predictor of evacuation, as were several demographic indicators, prior evacuation behavior, and having an evacuation plan. The influence of prior experiences, self-efficacy, personality, and links between expected and actual behavior were examined less frequently. Prospective, longitudinal designs are relatively uncommon. Although difficult to conduct in postdisaster settings, more prospective, methodologically rigorous studies would bolster inferences. Results synthesize the current body of literature on evacuation behavior and can help inform the design of more effective predisaster evacuation warnings and procedures.

March 2017

Nonparametric Tree-Based Predictive Modeling of Storm Outages on an Electric Distribution Network
Authors: Jichao He, David W. Wanik, Brian M. Hartman, Emmanouil N. Anagnostou, Marina Astitha, Maria E. B. Frediani
Abstract: This article compares two nonparametric tree-based models, quantile regression forests (QRF) and Bayesian additive regression trees (BART), for predicting storm outages on an electric distribution network in Connecticut, USA. We evaluated point estimates and prediction intervals of outage predictions for both models using high-resolution weather, infrastructure, and land use data for 89 storm events (including hurricanes, blizzards, and thunderstorms). We found that BART predicted more accurate point estimates than QRF. However, QRF produced better prediction intervals at high spatial resolutions (2-km grid cells and towns), while BART predictions aggregated more effectively to coarser resolutions (divisions and service territory). We also found that predictive accuracy depended on the season (e.g., tree-leaf condition, storm characteristics) and that predictions were most accurate for winter storms. Given the merits of each model, we suggest that BART and QRF be implemented together to show the complete picture of a storm's potential impact on the electric distribution network, which would allow a utility to make better decisions about allocating prestorm resources.
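
The idea of reading prediction intervals off an ensemble, as QRF does with its trees, can be sketched with a stdlib bootstrap. The bootstrap-mean "members" below stand in for actual regression trees, and the outage history is synthetic.

```python
# Ensemble-quantile sketch in the spirit of QRF: keep every ensemble
# member's prediction and read off empirical quantiles instead of only
# the mean. Bootstrap means stand in for trees; data are synthetic.
import random
import statistics

def bootstrap_ensemble_predict(history, n_members=500, seed=1):
    rng = random.Random(seed)
    preds = []
    for _ in range(n_members):
        sample = [rng.choice(history) for _ in history]
        preds.append(statistics.mean(sample))  # stand-in for one tree
    return preds

def interval(preds, lo=0.05, hi=0.95):
    # Empirical quantile interval over the ensemble's predictions.
    s = sorted(preds)
    return s[int(lo * len(s))], s[int(hi * len(s))]

outage_history = [3, 5, 2, 8, 4, 6, 3, 7, 5, 4]  # outages per storm (synthetic)
preds = bootstrap_ensemble_predict(outage_history)
point = statistics.mean(preds)
low, high = interval(preds)
```

The point estimate supports resource totals, while the interval width tells the utility how much the outage count could plausibly vary for a given storm.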

Disasters as Learning Experiences or Disasters as Policy Opportunities? Examining Flood Insurance Purchases after Hurricanes
Author: Caroline Kousky
Abstract: Flood insurance is a critical risk management strategy, contributing to greater resilience of individuals and communities. The occurrence of disasters has been observed to alter risk management choices, including the decision to insure. This has previously been explained by learning and behavioral biases. When it comes to flood insurance, however, federal disaster aid policy could also play a role, since recipients of aid are required to maintain insurance. Using a database of flood insurance policies for all states on the Atlantic and Gulf coasts of the United States between 2001 and 2010, this article uses fixed effects models to examine how take-up rates respond to the occurrence of hurricanes and tropical storms, as well as disaster declarations and aid requirements. Being hit by at least one hurricane in the previous year increases net flood insurance purchases by 7.2%. This effect dies out by three years after the storm. A presidential disaster declaration for floods increases take-up rates by 6.7%. When disaster aid grants are made available to households, take-up rates increase by 5%; this accounts for the majority of the increase in policies after the occurrence of a hurricane. When the models are estimated taking into account which policies are required by disaster aid, hurricanes are estimated to lead to only a 1.5% increase in voluntary purchases. This overlooked federal requirement that disaster aid recipients maintain insurance is thus responsible for a majority of insurance purchases postdisaster.

Weighing the Risks of Nuclear Energy and Climate Change: Trust in Different Information Sources, Perceived Risks, and Willingness to Pay for Alternatives to Nuclear Power
Authors: Annukka Vainio, Riikka Paloniemi, Vilja Varho
Abstract: We examined how individuals perceive nuclear energy in the context of climate change mitigation and how their perceptions are associated with trust in different risk information sources. We analyzed the interrelationships between trust, perceived risk of nuclear power, climate change concern, perception of nuclear energy as an acceptable way to mitigate climate change, and willingness to pay (WTP) for alternatives to nuclear power. A nationwide survey (N = 967) collected in Finland was analyzed with structural equation modeling. The associations between trust and perceived risk of nuclear power, climate change concern, and perception of nuclear power as a way to mitigate climate change varied by the type of information source. Political party support and other background variables were associated with trust in different information sources. The effect of trust in information sources on WTP was mediated by perceived risks and benefits. The results will increase our understanding of how individuals perceive nuclear energy as a way to cut CO2 emissions and the role of trust in different information sources in shaping nuclear risk perceptions and energy choices.

February 2017

In Search of Perfect Foresight? Policy Bias, Management of Unknowns, and What Has Changed in Science Policy Since the Tohoku Disaster
Authors: Junko Mochizuki and Nadejda Komendantova
Abstract: The failure to foresee the catastrophic earthquakes, tsunamis, and nuclear accident of 2011 has been perceived by many in Japan as a fundamental shortcoming of modern disaster risk science. Hampered by a variety of cognitive and institutional biases, conventional disaster risk management planning based on "known risks" led to the cascading failures of the interlinked disaster risk management (DRM) apparatus. This realization led to a major rethinking of the use of science for policy and the incorporation of lessons learned in the country's new DRM policy. This study reviews publicly available documents on expert committee discussions and scientific articles to identify what continuities and changes have been made in the use of scientific knowledge in Japanese risk management. In general, the prior influence of cognitive bias (e.g., overreliance on documented hazard risks) has been largely recognized, and increased attention is now being paid to the incorporation of less documented but known risks. This has led to upward adjustments in estimated damages from future risks and recognition of the need for further strengthening of DRM policy. At the same time, there remains significant continuity in the way scientific knowledge is perceived to provide sufficient and justifiable grounds for the development and implementation of DRM policy. The emphasis on "evidence-based policy" in earthquake and tsunami risk reduction measures continues, despite the critical reflections of a group of scientists who advocate for a major rethinking of the country's science-policy institutions, respecting the limitations of the current state of science.

January 2017

Flood Catastrophe Model for Designing Optimal Flood Insurance Program: Estimating Location‐Specific Premiums in the Netherlands
Authors: T. Ermolieva, T. Filatova, Y. Ermoliev, M. Obersteiner, K. M. de Bruijn, and A. Jeuken
Abstract: As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program toward floods with various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
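The contrast this abstract draws between average-annual-loss premiums and quantile-based "robust" premiums can be sketched on a toy set of simulated losses. This is an illustrative simplification with invented numbers; the article's ICRM couples premium setting to a GIS flood model and a stochastic optimization, neither of which is reproduced here.

```python
def aal_premium(losses):
    """Average annual loss: the expected loss across simulated flood years,
    the traditional basis for a premium."""
    return sum(losses) / len(losses)

def quantile_premium(losses, q=0.95):
    """Premium pegged to a high quantile of the simulated loss distribution,
    so the program stays solvent under rare but severe flood scenarios
    rather than only under the average year."""
    srt = sorted(losses)
    k = min(int(q * len(srt)), len(srt) - 1)
    return srt[k]

# Hypothetical simulated annual losses for one location:
# mostly small or zero, with one rare catastrophic year.
losses = [0, 0, 0, 0, 1, 1, 2, 2, 5, 100]
```

On this toy distribution the average-based premium is dominated by the single catastrophic year yet still falls far below the 95th-percentile loss, which is the gap between solvency "on average" and solvency under all relevant scenarios that the abstract's point (1) refers to.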

Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents
Authors: S. Wheatley, B. Sovacool, and D. Sornette
Abstract: We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 MM 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, well described by a Pareto distribution with parameter α = 0.5–0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima occurred recently, we are unable to quantify the impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60–150 years, and (ii) a Three Mile Island event (or larger) occurs every 10–20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following Fukushima, but also deeper improvements to effectively exclude the possibility of "dragon-king" disasters. Finally, we find that the International Nuclear Event Scale (INES) is inconsistent in terms of both cost and radiation released. To be consistent with cost data, the Chernobyl and Fukushima disasters would need an INES level of between 10 and 11, rather than the maximum of 7.
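The return-period estimates in this abstract follow from combining an event rate with a Pareto cost tail. A hedged sketch of that arithmetic, using illustrative parameter choices consistent with the abstract's ranges rather than the paper's fitted values:

```python
def pareto_exceedance(x, x_min, alpha):
    """Pareto survival function P(X > x) = (x_min / x)**alpha, for x >= x_min."""
    return (x_min / x) ** alpha

def return_period_years(cost, x_min, alpha, events_per_year):
    """Expected years between events with cost >= `cost`, assuming a
    constant fleet-wide event rate and a Pareto cost tail.
    Illustrative only -- not the paper's fitted model."""
    return 1.0 / (events_per_year * pareto_exceedance(cost, x_min, alpha))

# Hypothetical inputs: tail threshold 20 MM USD, alpha = 0.5 (the heavy end
# of the abstract's 0.5-0.6 range), ~1.3 events per year across the fleet,
# and a Fukushima-scale cost taken as roughly 170,000 MM USD.
rp = return_period_years(170_000, 20, 0.5, 1.3)
```

With these assumed inputs the return period lands around 70 years, inside the abstract's quoted 60–150 year range; the width of that range reflects uncertainty in α and in the event rate.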

Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi‐Directional Road Tunnel
Authors: C. Caliendo and M. L. De Guglielmo
Abstract: A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, expressed as F/N curves, show an increased risk level with an increase in the VHP, in the percentage of HGVs, and with a failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air, depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary before stating that for the transport of dangerous products an itinerary running completely in the open air might be used if the latter passes through a populated area. The QRA may help decision-makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs.
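The F/N curves used to express social risk in this abstract plot, for each fatality count N, the annual frequency F of accidents causing N or more fatalities. A minimal construction from hypothetical scenario data (the frequencies and consequence sizes below are invented, not taken from the article):

```python
def fn_curve(scenarios):
    """Build F/N points from (annual_frequency, fatalities) scenario pairs:
    F(N) is the summed frequency of all scenarios causing at least N
    fatalities, so the curve is non-increasing in N by construction."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

# Hypothetical tunnel accident scenarios: (frequency per year, fatalities)
curve = fn_curve([(1e-3, 1), (1e-4, 10), (1e-6, 100)])
```

Comparing two routes then amounts to overlaying their curves: the route whose curve lies higher at a given N carries more social risk for accidents of that severity, which is how the tunnel and the open-air itinerary can each dominate in different scenario classes.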

Failing to Fix What is Found: Risk Accommodation in the Oil and Gas Industry
Authors: M. R. Stackhouse and R. Stewart
Abstract: The present program of research synthesizes the findings from three studies in line with two goals. First, the present research explores how the oil and gas industry is performing at risk mitigation in terms of finding and fixing errors when they occur. Second, the present research explores what factors in the work environment relate to a risk-accommodating environment. Study 1 presents a descriptive evaluation of high-consequence incidents at 34 oil and gas companies over a 12-month period (N = 873), especially in terms of those companies' effectiveness at investigating and fixing errors. The analysis found that most investigations were fair in terms of quality (mean = 75.50%), with smaller proportions that were weak (mean = 11.40%) or strong (mean = 13.24%). Furthermore, most companies took at least one corrective action for high-consequence incidents, but few of these corrective actions were confirmed as having been completed (mean = 13.77%). In fact, most corrective actions were secondary interim administrative controls (e.g., having a safety meeting) rather than fair or strong controls (e.g., training, engineering elimination). Study 2a found that several environmental factors explain 56.41% of the variance in safety, including management's disengagement from safety concerns, finding and fixing errors, safety management system effectiveness, training, employee safety, procedures, and a production-over-safety culture. Qualitative results from Study 2b suggest that a compliance-based culture of adhering to liability concerns, out-group blame, and a production-over-safety orientation may all impede safety effectiveness.