Risk Analysis: An International Journal

Table of Contents for Risk Analysis. List of articles from both the latest and EarlyView issues.

Dread and Risk Elimination Premium for the Value of a Statistical Life

13 June 2019 - 6:34pm
Abstract

The value of a statistical life (VSL) is a widely used measure of the value of mortality risk reduction. Because VSL should reflect preferences and attitudes to risk, there are reasons to believe that it varies with the type of risk involved. It has been argued that cancer should be considered a “dread disease,” which supports the use of a “cancer premium.” The objective of this study is to investigate the existence of a cancer premium (for pancreatic cancer and multiple myeloma) relative to road traffic accidents, sudden cardiac arrest, and amyotrophic lateral sclerosis (ALS). Data were collected from 500 individuals in the Swedish general population aged 50–74 years using a web‐based questionnaire. Preferences were elicited using the contingent valuation method, and a split‐sample design was applied to test scale sensitivity. VSL differs significantly between contexts, being highest for ALS and lowest for road traffic accidents. A premium (92–113%) for cancer was found relative to road traffic accidents. The premium was higher for the cancer with a shorter time from diagnosis to death. Premiums were also found for sudden cardiac arrest (73%) and ALS (118%) relative to road traffic accidents. Eliminating risk entirely was associated with a premium of around 20%. This study provides additional evidence that a dread premium and a risk elimination premium exist. These factors should be considered when searching for an appropriate value for economic evaluation and health technology assessment.
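
The arithmetic behind such premiums is simple to illustrate. A minimal sketch, assuming purely hypothetical willingness‐to‐pay (WTP) figures rather than the study's data: VSL is computed as mean WTP divided by the absolute risk reduction being valued, and a context premium is the percentage by which a context's VSL exceeds the road traffic baseline.

```python
def vsl(mean_wtp, risk_reduction):
    """VSL = mean willingness to pay / absolute mortality risk reduction."""
    return mean_wtp / risk_reduction

risk_reduction = 1 / 10_000              # risk falls by 1 in 10,000 (hypothetical)
baseline = vsl(3_000, risk_reduction)    # road traffic context (hypothetical WTP)
cancer = vsl(6_000, risk_reduction)      # cancer context (hypothetical WTP)

premium = (cancer / baseline - 1) * 100
print(f"cancer premium: {premium:.0f}%")  # 100% with these inputs
```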

Science for Policy: A Case Study of Scientific Polarization, Values, and the Framing of Risk and Uncertainty

13 June 2019 - 6:31pm
Abstract

It is well documented that more research can lead to hardened positions, particularly when dealing with complex, controversial, and value‐laden issues. This study is an attempt to unveil underlying values in a contemporary debate in which both sides use scientific evidence to support their arguments. We analyze the problem framing, vocabulary, interpretation of evidence, and policy recommendations, with particular attention to the framing of nature and technology. We find clear differences between the two arguments. One side stresses that there is no evidence that the present approach is causing harm to humans or the environment, does not dwell on uncertainties to that end, invokes nature's ability to handle the problem, and indicates distrust in technological solutions. In contrast, the other side focuses on uncertainties, particularly the lack of knowledge about potential environmental effects, and signals trust in technological development and human intervention as the solution. Our study suggests that the two sides' diverging interpretations are tied to their perceptions of nature: vulnerable to human activities versus robust and able to handle human impacts. The two sides also seem to hold diverging views of technology, but there are indications that this divergence is rooted in their perceptions of governance and economy rather than in technology per se. We conclude that there is a need to further investigate how scientific arguments are related to worldviews, to see how (if at all) worldview typologies can help us understand how value‐based judgments are embedded in science advice, and to assess the impact these judgments have on policy preferences.

Modeling the Cost Effectiveness of Fire Protection Resource Allocation in the United States: Models and a 1980–2014 Case Study

13 June 2019 - 6:31pm
Abstract

The estimated cost of fire in the United States is about $329 billion a year, yet the literature offers few tools for measuring the effectiveness of investment or for allocating fire protection resources optimally. This article fills these gaps by creating data‐driven empirical and theoretical models to study the effectiveness of nationwide fire protection investment in reducing economic and human losses. The regression between investment and loss vulnerability shows high R² values (≈0.93). This article also contributes to the literature by modeling strategic (national‐level or state‐level) resource allocation (RA) for fire protection with equity‐efficiency trade‐off considerations, whereas the existing literature focuses on operational‐level RA. This model and its numerical analyses provide techniques and insights to aid the strategic decision‐making process. The results from this model are used to calculate fire risk scores for various geographic regions, which can serve as indicators of fire risk. A case study of federal fire grant allocation is used to validate the optimal RA model and show its utility. The results also identify potential underinvestment and overinvestment in fire protection in certain regions. This article presents scenarios in which the proposed model outperforms the existing RA scheme when the two are compared in terms of the correlation between allocated resources and the actual number of fire incidents. The article thus provides novel insights to policymakers and analysts in fire protection and safety that can help mitigate economic costs and save lives.
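
A hedged sketch of the two modeling steps named above, on synthetic data: step 1 regresses loss vulnerability on investment; step 2 allocates a fixed budget across regions to minimize predicted weighted loss subject to an equity floor. The budget, floor, and regional risk weights are illustrative assumptions, not the article's calibration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Step 1: empirical effectiveness of investment, fit on synthetic data.
invest = rng.uniform(1, 10, 20)                    # per-region investment
loss = 50 - 4 * invest + rng.normal(0, 1, 20)      # loss vulnerability
slope, intercept = np.polyfit(invest, loss, 1)     # OLS fit, slope < 0

# Step 2: allocate budget B across n regions to minimize predicted loss,
# weighted by assumed regional risk scores, with an equity floor per region.
B, n, floor = 60.0, 5, 5.0
risk_weight = np.array([1.0, 1.5, 0.8, 2.0, 1.2])  # illustrative only

def total_loss(x):
    return np.sum(risk_weight * (intercept + slope * x))

res = minimize(total_loss, x0=np.full(n, B / n),
               bounds=[(floor, B)] * n,
               constraints={"type": "eq", "fun": lambda x: x.sum() - B})
print(res.x.round(2))   # most funds flow to the highest-risk region
```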

On the Limits of the Precautionary Principle

13 June 2019 - 6:31pm
Abstract

The precautionary principle (PP) is an influential principle of risk management. It has been widely introduced into environmental legislation, and it plays an important role in most international environmental agreements. Yet, there is little consensus on precisely how to understand and formulate the principle. In this article I prove some impossibility results for two plausible formulations of the PP as a decision‐rule. These results illustrate the difficulty in making the PP consistent with the acceptance of any tradeoffs between catastrophic risks and more ordinary goods. How one interprets these results will, however, depend on one's views and commitments. For instance, those who are convinced that the conditions in the impossibility results are requirements of rationality may see these results as undermining the rationality of the PP. But others may simply take these results to identify a set of purported rationality conditions that defenders of the PP should not accept, or to illustrate types of situations in which the principle should not be applied.
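
A toy illustration, not Stefánsson's formal results: one natural decision‐rule reading of the PP vetoes any option whose catastrophe probability exceeds a small threshold, and such a rule refuses the tradeoff between catastrophic risk and ordinary goods no matter how large the ordinary benefit, where an expected‐utility rule accepts it. All names and numbers below are invented.

```python
def pp_choice(options, threshold=1e-6):
    """Veto any option whose catastrophe probability exceeds the threshold."""
    safe = [o for o in options if o["p_cat"] <= threshold]
    return max(safe, key=lambda o: o["benefit"]) if safe else None

def eu_choice(options, cat_loss=1e9):
    """Expected-utility rule: benefit minus expected catastrophic loss."""
    return max(options, key=lambda o: o["benefit"] - o["p_cat"] * cat_loss)

options = [
    {"name": "A", "p_cat": 1e-5, "benefit": 1e8},  # tiny risk, huge gain
    {"name": "B", "p_cat": 0.0,  "benefit": 0.0},  # fully precautionary
]
print(pp_choice(options)["name"])  # B: the PP vetoes A at any benefit level
print(eu_choice(options)["name"])  # A: expected utility accepts the tradeoff
```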

A CGE Framework for Modeling the Economics of Flooding and Recovery in a Major Urban Area

13 June 2019 - 6:31pm
Abstract

Coastal cities around the world have incurred large costs from major flooding events in recent years, and climate change is predicted to increase the likelihood of flooding through sea level rise and more frequent severe storms. To plan future development and adaptation, cities must know the magnitude of the losses associated with these events and how they can be reduced. Losses are often calculated from insurance claims or from surveys of flood victims, but this largely neglects losses due to the disruption of economic activity. We use a forward‐looking dynamic computable general equilibrium (CGE) model to study how a local economy responds to a flood, focusing on the subsequent recovery/reconstruction. Initial damage is modeled as a shock to the capital stock, and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia, considering a flood scenario that causes total capital damage of $14.6 billion spread across five municipalities. GDP loss relative to a no‐flood scenario is relatively long‐lasting: 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.
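
The paper's model is a full multisector CGE; the sketch below only illustrates the capital‐shock‐and‐rebuilding mechanism with a one‐sector Solow‐style toy economy, in which a flood destroys part of the capital stock and saving gradually rebuilds it. All parameter values are illustrative assumptions, not the paper's calibration.

```python
# One-sector Solow-style toy; all parameters are illustrative assumptions.
A, alpha, delta, s = 5.0, 0.3, 0.05, 0.25   # TFP, capital share, depreciation, saving
K_pre = 100.0                               # chosen so the pre-flood stock is steady state
shock = 14.6                                # capital destroyed by the flood (toy units)

def output(K):
    return A * K ** alpha

K, Y_pre = K_pre - shock, output(K_pre)
for year in range(1, 6):
    K += s * output(K) - delta * K          # investment gradually rebuilds capital
    loss_pct = (1 - output(K) / Y_pre) * 100
    print(f"year {year}: GDP loss {loss_pct:.2f}% relative to no-flood baseline")
```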

Toward an Epidemiology of Safety and Security Risks: An Organizational Vulnerability Assessment in International Airports

13 June 2019 - 6:31pm
Abstract

International airports are complex sociotechnical systems with an intrinsic potential to develop safety and security disruptions. In the absence of appropriate defenses, and when the potential for disruption is neglected, organizational crises can occur and jeopardize aviation services. This investigation examines the ways in which modern international airports can be “authors of their own misfortune” by adopting practices, attitudes, and behaviors that increase their overall level of vulnerability. A sociotechnical perspective, the macroergonomic approach, is applied to detect the potential organizational determinants of vulnerability in airport operations. The case study of international airports developed here draws on qualitative data. Findings from this study highlight that systemic weaknesses frequently reside in areas at the intersection of physical, organizational, and social spaces. Specific pathways of vulnerability can be traced across these areas, involving the following systemic layers: individual, task, tools and technology, environment, and organization. This investigation expands the existing literature on the dynamics of crisis incubation in multiorganization, multistakeholder systems such as international airports and provides practical recommendations to help airport managers detect symptoms of organizational vulnerability early.

A New Methodology for Before–After Safety Assessment Using Survival Analysis and Longitudinal Data

13 June 2019 - 6:31pm
Abstract

The widely used empirical Bayes (EB) and full Bayes (FB) methods for before–after safety assessment are sometimes limited by their extensive data needs from additional reference sites. To address this issue, this study proposes a novel before–after safety evaluation methodology based on survival analysis and longitudinal data as an alternative to the EB/FB methods. A Bayesian survival analysis (SARE) model with a random effect term that addresses unobserved heterogeneity across sites is developed. The proposed survival analysis method is validated through a simulation study before its application. Subsequently, the SARE model is developed in a case study to evaluate the safety effectiveness of a recent red‐light‐running photo enforcement program in New Jersey. As demonstrated in the simulation and the case study, survival analysis can provide valid estimates using data from treated sites only, so its results are not affected by the selection of defective or insufficient reference sites. In addition, the proposed approach can account for the censored observations generated by the transition from the before period to the after period, which has not previously been explored in the literature. Using individual crashes as the units of analysis, survival analysis can incorporate longitudinal covariates such as traffic volume and weather variation, and thus can explicitly account for potential temporal heterogeneity.
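
A simplified sketch of the core idea, assuming an exponential hazard, a single before/after covariate, and right censoring, fit by maximum likelihood; the paper's model is Bayesian and adds a site‐level random effect, and the data here are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 400
after = rng.integers(0, 2, n)               # 1 = observation in the after period
true_rate = np.exp(-1.0 - 0.4 * after)      # true hazard of the next crash
t = rng.exponential(1 / true_rate)          # latent time to next crash
c = rng.exponential(8.0, n)                 # censoring times (e.g., period ends)
time, event = np.minimum(t, c), (t <= c).astype(float)

def negloglik(beta):
    log_h = beta[0] + beta[1] * after       # log hazard with treatment effect
    # censored exponential log likelihood: event*log(h) - h*t
    return -np.sum(event * log_h - np.exp(log_h) * time)

fit = minimize(negloglik, x0=[0.0, 0.0])
print(fit.x)   # beta[1] < 0: longer times between crashes after treatment
```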

A Study on a Sequential One‐Defender‐N‐Attacker Game

13 June 2019 - 6:31pm
Abstract

Governments usually face threats from multiple attackers. In the literature, however, researchers often model the attackers as one monolithic player who chooses whether to attack, how much to invest, and which target to strike, instead of treating multiple attackers as independent agents. This modeling strategy may lead to suboptimal defense investment if the attackers have vastly different interests and preferences and cannot, in theory, be combined into one. In this article, we develop a sequential game with complete information in which one defender explicitly confronts multiple unmergeable attackers. Thorough numerical experiments are conducted using ratio and exponential contest success functions under different scenarios. The results are also contrasted with those of the corresponding single‐attacker model to study the effect of mishandling multiple attackers. The propositions and observations drawn from the numerical experiments provide insights for government decision making grounded in a better understanding of attacker behavior.
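
A hedged sketch of the two contest success functions (CSFs) named above, together with a naive grid‐search best response for a single attacker facing a fixed defensive allocation; the payoff structure and parameters are illustrative assumptions.

```python
import numpy as np

def ratio_csf(x, y, m=1.0):
    """P(attack succeeds) under the ratio form."""
    return x**m / (x**m + y**m) if x + y > 0 else 0.0

def exponential_csf(x, y, k=1.0):
    """P(attack succeeds) under the exponential (logit) form."""
    return np.exp(k * x) / (np.exp(k * x) + np.exp(k * y))

def best_response(defense, value=10.0, csf=ratio_csf):
    """Attacker effort maximizing expected payoff: value * P(success) - effort."""
    efforts = np.linspace(0.0, value, 1001)
    payoffs = [value * csf(x, defense) - x for x in efforts]
    return efforts[int(np.argmax(payoffs))]

for d in (1.0, 3.0, 5.0):
    print(d, best_response(d), best_response(d, csf=exponential_csf))
```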

Risk and Quantification: A Linguistic Study

13 June 2019 - 6:31pm
Abstract

In risk analysis and research, the concept of risk is often understood quantitatively. For example, risk is commonly defined as the probability of an unwanted event or as its probability multiplied by its consequences. This article addresses (1) to what extent and (2) how the noun risk is actually used quantitatively. Uses of the noun risk are analyzed in four linguistic corpora, both Swedish and English (mostly American English). In total, over 16,000 uses of the noun risk are studied in 14 random (n = 500) or complete samples (n ranging from 173 to 5,144) of, for example, news and magazine articles, fiction, and websites of government agencies. In contrast to the widespread definition of risk as a quantity, a main finding is that the noun risk is mostly used nonquantitatively. Furthermore, when it is used quantitatively, the quantification is seldom numerical, relying instead on less precise expressions of quantification, such as high risk and increased risk. The relatively low frequency of quantification in a wide range of language material suggests a quantification bias in many areas of risk theory, that is, an overestimation of the importance of quantification in defining the concept of risk. The findings are also discussed in relation to fuzzy‐trace theory: they confirm, as the theory suggests, that vague representations are prominent in the quantification of risk. The application of fuzzy‐trace theory's terminology to explaining these patterns of language use is discussed.
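
As a toy illustration of the distinction the study draws, the sketch below classifies occurrences of the noun risk as numerically quantified, vaguely quantified, or nonquantitative; the regular expressions are crude stand‐ins for the study's systematic corpus annotation.

```python
import re

VAGUE = re.compile(r"\b(high|higher|increased|low|lower|reduced)\s+risk\b", re.I)
NUMERIC = re.compile(r"(\d+(\.\d+)?\s*(%|percent)\s+risk)|(risk\s+of\s+\d)", re.I)

def classify(sentence):
    """Crude three-way classification of a use of the noun 'risk'."""
    if NUMERIC.search(sentence):
        return "numeric"
    if VAGUE.search(sentence):
        return "vague"
    return "nonquantitative"

samples = [
    "Smokers face a 25% risk of developing the disease.",
    "There is an increased risk of flooding this year.",
    "She took a risk by quitting her job.",
]
for s in samples:
    print(classify(s).ljust(16), "|", s)
```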

GIS‐Based Integration of Social Vulnerability and Level 3 Probabilistic Risk Assessment to Advance Emergency Preparedness, Planning, and Response for Severe Nuclear Power Plant Accidents

13 June 2019 - 6:31pm
Abstract

In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors, and it is therefore not possible to rank the importance of social factors for risk‐informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard‐independent social vulnerability index for the local population; (2) developing a location‐specific representation of the maximum radiological hazard estimated from current Level 3 PRA in a geographic information system (GIS) environment; (3) developing a GIS‐based socio‐technical risk map by combining the social vulnerability index and the location‐specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio‐technical risk. The methodology is applied using results from the 2012 Surry Power Station state‐of‐the‐art reactor consequence analysis. A radiological hazard model is generated with the MELCOR accident consequence code system, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location‐specific SVI themes based on their influence on risk, providing input for EPPR.
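
A minimal sketch of step (3), combining a location‐specific hazard layer with an SVI layer on a common grid; real implementations operate on GIS rasters, and the arrays and values below are synthetic stand‐ins.

```python
import numpy as np

rng = np.random.default_rng(2)
hazard = rng.random((50, 50))   # max radiological hazard per grid cell
svi = rng.random((50, 50))      # social vulnerability index per grid cell

def normalize(layer):
    """Rescale a layer to [0, 1] so the two inputs are comparable."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Socio-technical risk map: cellwise product of normalized layers.
risk = normalize(hazard) * normalize(svi)

idx = np.unravel_index(risk.argmax(), risk.shape)
print("highest-risk cell:", idx, "score:", round(float(risk[idx]), 3))
```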

Are We Adapting to Floods? Evidence from Global Flooding Fatalities

13 June 2019 - 6:31pm
Abstract

There has been growing interest in understanding whether and how people adapt to extreme weather events in a changing climate. This article presents one of the first empirical analyses of adaptation to flooding on a global scale. Using a sample of 97 countries between 1985 and 2010, we investigate the extent and pattern of flood adaptation by estimating the effects of a country's climatological risk, recent flood experiences, and socioeconomic characteristics on its flood‐related fatalities. Our results provide mixed evidence on adaptation: countries facing greater long‐term climatological flooding risks do not necessarily adapt better or suffer fewer fatalities; however, after controlling for cross‐country heterogeneity, we find that more recent flooding shocks have a significant negative effect on fatalities from subsequent floods. These findings suggest short‐term learning dynamics in adaptation and the potential inefficacy of earlier flood control measures, particularly those that promote increased exposure in floodplains. Our findings have important implications for climate adaptation policymaking and climate modeling.
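
A hedged sketch of the kind of panel specification described, on synthetic data: a Poisson regression of fatalities on recent flood experience with country fixed effects absorbing cross‐country heterogeneity. Time‐invariant climatological risk is absorbed by the fixed effects, so only the recent‐experience effect is identified here; the study's actual specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_countries, n_years = 20, 26      # smaller than the study's 97 countries, 1985-2010
df = pd.DataFrame({
    "country": np.repeat(np.arange(n_countries), n_years),
    "recent_floods": rng.poisson(2, n_countries * n_years),
})
# Synthetic data-generating process: country heterogeneity + learning effect.
country_effect = np.repeat(rng.normal(1.0, 0.5, n_countries), n_years)
df["fatalities"] = rng.poisson(np.exp(country_effect - 0.2 * df.recent_floods))

model = smf.poisson("fatalities ~ recent_floods + C(country)", data=df).fit(disp=0)
print(model.params["recent_floods"])   # negative: learning from recent floods
```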

Predictive Modeling and Categorizing Likelihoods of Quarantine Pest Introduction of Imported Propagative Commodities from Different Countries

13 June 2019 - 6:31pm
Abstract

The present study investigates U.S. Department of Agriculture inspection records in the Agricultural Quarantine Activity System database to estimate the probability of quarantine pests on propagative plant materials imported from various countries of origin and to develop a methodology for ranking the risk of country–commodity combinations based on quarantine pest interceptions. Data collected from October 2014 to January 2016 were used to develop the predictive models and for a validation study. A generalized linear model with Bayesian inference and a generalized linear mixed effects model were used to compare the interception rates of quarantine pests across country–commodity combinations. The predictive ability of the generalized linear mixed effects models was greater than that of the generalized linear models. The estimated pest interception probability and its confidence interval for each country–commodity combination were categorized into one of four compliance levels, “High,” “Medium,” “Low,” and “Poor/Unacceptable,” using k‐means clustering analysis. This study presents a risk‐based categorization for each country–commodity combination based on the probability of quarantine pest interceptions and the uncertainty in that assessment.
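
A minimal sketch of the categorization step, assuming synthetic inspection counts and a Jeffreys interval as a simple stand‐in for the study's Bayesian interval estimates; combinations are then grouped into four levels with k‐means on the point estimate and interval width.

```python
import numpy as np
from scipy.stats import beta
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
inspections = rng.integers(50, 2000, 40)                      # per combination
interceptions = rng.binomial(inspections, rng.uniform(0.001, 0.1, 40))

p_hat = interceptions / inspections
# Jeffreys interval: Beta(y + 0.5, n - y + 0.5) quantiles.
lo = beta.ppf(0.025, interceptions + 0.5, inspections - interceptions + 0.5)
hi = beta.ppf(0.975, interceptions + 0.5, inspections - interceptions + 0.5)

features = np.column_stack([p_hat, hi - lo])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Order clusters by mean interception probability: lowest -> "High" compliance.
order = np.argsort([p_hat[labels == k].mean() for k in range(4)])
names = np.array(["High", "Medium", "Low", "Poor/Unacceptable"])
level = {k: names[i] for i, k in enumerate(order)}
print([level[l] for l in labels[:10]])
```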

Machine Learning Methods as a Tool for Predicting Risk of Illness Applying Next‐Generation Sequencing Data

13 June 2019 - 6:31pm
Abstract

Next‐generation sequencing (NGS) data present an untapped potential to improve microbial risk assessment (MRA) through increased specificity and redefinition of the hazard. Most MRA models do not account for differences in survivability and virulence among strains. The potential of machine learning algorithms to predict risk/health burden at the population level from large and complex NGS inputs was explored with Listeria monocytogenes as a case study. The Listeria data consisted of a percentage‐similarity matrix from genome assemblies of 38 strains of clinical origin and 207 strains of food origin. The Basic Local Alignment Search Tool (BLAST) was used to align the assemblies against a database of 136 virulence and stress resistance genes. The outcome variable was frequency of illness, the percentage of reported cases associated with each strain. These frequency data were discretized into seven ordinal outcome categories and used for supervised machine learning and model selection among five ensemble algorithms. There was no significant difference in accuracy between the models, and a support vector machine with a linear kernel was chosen for further inference (accuracy of 89% [95% CI: 68%, 97%]). The virulence genes FAM002725, FAM002728, FAM002729, InlF, InlJ, InlK, IisY, IisD, IisX, IisH, IisB, lmo2026, and FAM003296 were important predictors of a higher frequency of illness. InlF was uniquely truncated in the sequence type 121 strains. The most important risk predictor genes occurred at highest prevalence among strains from ready‐to‐eat, dairy, and composite foods. We foresee that the findings and approaches described here offer the potential to rethink current approaches in MRA.
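
A hedged sketch of the selected classifier: a linear‐kernel support vector machine trained on per‐gene similarity features to predict an ordinal illness‐frequency category. The data below are synthetic stand‐ins for the BLAST percentage‐similarity matrix, not the study's strains.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n_strains, n_genes = 245, 136
X = rng.uniform(0, 100, (n_strains, n_genes))      # % similarity per gene
# Seven ordinal illness-frequency categories, driven by a synthetic signal
# in the first five "genes" so the classifier has something to learn.
y = np.digitize(X[:, :5].mean(axis=1), bins=[35, 42, 47, 53, 58, 65])

clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```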

Comments to Orri Stefánsson's Paper on the Precautionary Principle

13 June 2019 - 6:31pm
Risk Analysis, Volume 39, Issue 6, Page 1223-1224, June 2019.

Comment: The Precautionary Principle and Judgment Aggregation

13 June 2019 - 6:31pm
Risk Analysis, Volume 39, Issue 6, Page 1225-1226, June 2019.

The Call for a Shift from Risk to Resilience: What Does it Mean?

13 June 2019 - 6:31pm
Abstract

In recent years, calls have been made for a shift from risk to resilience. The basic idea is that we need to be prepared when threatening events occur, whether they are anticipated or unforeseen. This article questions what implications this call will have, and should have, for the risk field and risk science. Is the call based on a belief that this field and science should be replaced by resilience analysis and management, or is it more about priorities: should more weight be placed on improving resilience? The article argues that only the latter interpretation is meaningful. Resilience analysis and management is today an integrated part of the risk field and science, and risk analysis in a broad sense is needed to increase relevant knowledge, develop adequate policies, and make the right decisions, balancing different concerns and using our limited resources effectively.

Reply

13 June 2019 - 6:31pm
Risk Analysis, Volume 39, Issue 6, Page 1227-1228, June 2019.

From the Editors

13 June 2019 - 6:31pm
Risk Analysis, Volume 39, Issue 6, Page 1193-1195, June 2019.
