Risk Analysis: An International Journal

Table of Contents for Risk Analysis. List of articles from both the latest and EarlyView issues.

Robustness of Optimal Investment Decisions in Mixed Insurance/Investment Cyber Risk Management

15 October 2019 - 11:02am
Abstract

An integrated risk management strategy, combining insurance and security investments, where the latter help reduce the insurance premium, is investigated to assess whether it can lead to reduced overall security expenses. The optimal investment for this mixed strategy is derived under three insurance policies, covering, respectively, all the losses (total coverage), just those below the limit of maximum liability (partial coverage), and those above a threshold but below the maximum liability (partial coverage with deductibles). Under certain conditions (e.g., low potential loss, or either very low or very high vulnerability), however, the mixed strategy reverts to insurance alone, because investments do not provide an additional benefit. When the mixed strategy is the best choice, the dominant component in the overall security expenses is in most cases the insurance premium. Optimal investment decisions require an accurate estimate of the vulnerability, whereas larger estimation errors may be tolerated for the investment‐effectiveness coefficient.
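The reversion behavior described above can be reproduced with a toy model (a sketch under stated assumptions, not the paper's formulation): assume a Gordon–Loeb class-I breach-probability function and a full-coverage premium proportional to the residual expected loss. All numeric parameters below are hypothetical.

```python
def breach_prob(v, z, alpha=0.5):
    # Gordon-Loeb class-I breach function: security investment z lowers
    # the probability that vulnerability v turns into an actual loss
    return v ** (alpha * z + 1)

def total_expense(z, v, loss=100.0, loading=1.2, alpha=0.5):
    # full-coverage premium assumed proportional to residual expected loss
    premium = loading * loss * breach_prob(v, z, alpha)
    return z + premium

def optimal_investment(v):
    # brute-force grid search over candidate investment levels 0..200
    grid = [i * 0.01 for i in range(20001)]
    return min(grid, key=lambda z: total_expense(z, v))

z_mid = optimal_investment(0.5)    # moderate vulnerability: invest
z_low = optimal_investment(0.001)  # very low vulnerability: insurance alone
```

With these illustrative numbers, a moderate vulnerability yields a positive optimal investment, while a very low vulnerability drives the optimum to zero (insurance alone), matching the reversion described in the abstract.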

Assessing the Benefits and Costs of Homeland Security Research: A Risk‐Informed Methodology with Applications for the U.S. Coast Guard

15 October 2019 - 10:59am
Abstract

This article describes a methodology for risk‐informed benefit–cost analyses of homeland security research products. The methodology is field‐tested with 10 research products developed for the U.S. Coast Guard. Risk‐informed benefit–cost analysis is a tool for risk management that integrates elements of risk analysis, decision analysis, and benefit–cost analysis. The cost analysis methodology includes a full‐cost accounting of research projects, starting with initial fundamental research costs and extending to the costs of implementation of the research products and, where applicable, training, maintenance, and upgrade costs. The benefits analysis methodology is driven by changes in costs and risks leading to five alternative models: cost savings at the same level of security, increased security at the same cost, signal detection improvements, risk reduction by deterrence, and value of information. The U.S. Coast Guard staff selected 10 research projects to test and generalize the methodology. Examples include tools to improve the detection of explosives, reduce the costs of harbor patrols, and provide better predictions of hurricane wind speeds and floods. Benefits models and estimates varied by research project and many input parameters of the benefit estimates were highly uncertain, so risk analysis for sensitivity testing and simulation was important. Aggregating across the 10 research products, we found an overall median net present value of about $385 million, with a range from $54 million (5th percentile) to $877 million (95th percentile). Lessons learned are provided for future applications.
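The aggregation step can be sketched as a Monte Carlo net-present-value calculation (illustrative only: the project costs, benefit distributions, and discount rate below are assumptions, not the study's inputs):

```python
import math
import random

random.seed(42)

def npv(flows, rate=0.07):
    # discount (year, amount) cash flows to present value
    return sum(a / (1 + rate) ** t for t, a in flows)

def simulate_portfolio(n_projects=10, n_draws=5000):
    totals = []
    for _ in range(n_draws):
        total = 0.0
        for _ in range(n_projects):
            # uncertain annual benefit ($M), lognormal spread; hypothetical
            benefit = random.lognormvariate(math.log(8.0), 0.6)
            # up-front R&D + implementation cost, then ten years of benefits
            flows = [(0, -5.0)] + [(t, benefit) for t in range(1, 11)]
            total += npv(flows)
        totals.append(total)
    totals.sort()
    n = len(totals)
    return totals[int(0.05 * n)], totals[n // 2], totals[int(0.95 * n)]

p5, median, p95 = simulate_portfolio()  # 5th percentile, median, 95th percentile
```

Reporting the median with the 5th and 95th percentiles, as the study does, conveys both the central estimate and the spread driven by the uncertain inputs.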

The Aversion to Tampering with Nature (ATN) Scale: Individual Differences in (Dis)comfort with Altering the Natural World

15 October 2019 - 10:58am
Abstract

People differ in their comfort with tampering with the natural world. Although some see altering nature as a sign of human progress, others see it as dangerous or hubristic. Across four studies, we investigate discomfort with tampering with the natural world. To do so, we develop the Aversion to Tampering with Nature (ATN) Scale, a short scale that is the first to directly measure this discomfort. We identify six activities that people believe tamper with nature (geoengineering, genetically modified organisms, pesticides, cloning, gene therapy, and nanoparticles) and show that ATN scores are associated with opposition to these activities. Furthermore, the ATN Scale predicts actual behavior: donations to an anti‐tampering cause. We demonstrate that ATN is related to previously identified constructs including trust in technology, naturalness bias, purity values, disgust sensitivity, aversion to playing God, and environmental beliefs and values. By illuminating who is concerned about tampering with nature and what predicts these beliefs, the ATN Scale provides opportunities to better understand public opposition to technological innovations, consumer preferences for “natural” products, and strategies for science communication.

Optimization of the Aflatoxin Monitoring Costs along the Maize Supply Chain

10 October 2019 - 2:19pm
Abstract

An optimization model was used to gain insight into cost‐effective monitoring plans for aflatoxins along the maize supply chain. The model was based on a typical Dutch maize chain, with maize grown in the Black Sea region, and transported by ship to the Netherlands for use as an ingredient in compound feed for dairy cattle. Six different scenarios, with different aflatoxin concentrations at harvest and possible aflatoxin production during transport, were used. By minimizing the costs and using parameters such as the concentration, the variance of the sampling plan, and the monitoring and replacement costs, the model optimized the control points (CPs; e.g., after harvest, before or after transport by sea ship), the number of batches sampled at the CP, and the number of samples per batch. This optimization approach led to an end‐of‐chain aflatoxin concentration below the predetermined limit. The model showed that, when postharvest aflatoxin production was not possible, it was most cost‐effective to collect samples from all batches and replace contaminated batches directly after the harvest, since the replacement costs were the lowest at the origin of the chain. When there was aflatoxin production during storage, it was most cost‐effective to collect samples and replace contaminated batches after storage and transport, to avoid duplicating monitoring and replacement costs before and after transport. The further along the chain a contaminated batch is detected, the more stakeholders are involved, and the higher the replacement and possible recall costs become.

The Practical Significance of Measurement Error in Pulmonary Function Testing Conducted in Research Settings

10 October 2019 - 2:19pm
Abstract

Conventional spirometry produces measurement error by using repeatability criteria (RC) to discard acceptable data and terminating tests early when RC are met. These practices also implicitly assume that there is no variation across maneuvers within each test. This has implications for air pollution regulations that rely on pulmonary function tests to determine adverse effects or set standards. We perform a Monte Carlo simulation of 20,902 tests of forced expiratory volume in 1 second (FEV1), each with eight maneuvers, for an individual with empirically obtained, plausibly normal pulmonary function. Default coefficients of variation for inter‐ and intratest variability (3% and 6%, respectively) are employed. Measurement error is defined as the difference between results from the conventional protocol and an unconstrained, eight‐maneuver alternative. In the default model, average measurement error is shown to be ∼5%. The minimum difference necessary for statistical significance at p < 0.05 for a before/after comparison is shown to be 16%. Meanwhile, the U.S. Environmental Protection Agency has deemed single‐digit percentage decrements in FEV1 sufficient to justify more stringent national ambient air quality standards. Sensitivity analysis reveals that results are insensitive to intertest variability but highly sensitive to intratest variability. Halving the latter to 3% reduces measurement error by 55%. Increasing it to 9% or 12% increases measurement error by 65% or 125%, respectively. Within‐day FEV1 differences ≤5% among normal subjects are believed to be clinically insignificant. Therefore, many differences reported as statistically significant are likely to be artifactual. Reliable data are needed to estimate intratest variability for the general population, subpopulations of interest, and research samples. Sensitive subpopulations (e.g., chronic obstructive pulmonary disease or COPD patients, asthmatics, children) are likely to have higher intratest variability, making it more difficult to derive valid statistical inferences about differences observed after treatment or exposure.
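The simulation design can be sketched as follows, with a simplified repeatability rule standing in for the full protocol; the true FEV1 value, the repeatability criterion, and the number of draws are assumptions for illustration, not the study's settings:

```python
import random
import statistics

random.seed(0)

TRUE_FEV1 = 4.0   # liters; hypothetical normal subject
INTRA_CV = 0.06   # within-test coefficient of variation (the study's default)

def maneuvers(n=8):
    # n blows within one test session, varying around the true value
    return [random.gauss(TRUE_FEV1, INTRA_CV * TRUE_FEV1) for _ in range(n)]

def conventional(blows, rc=0.150):
    # simplified protocol: stop once the two largest blows so far agree
    # within the repeatability criterion (liters), after at least 3 blows
    for k in range(3, len(blows) + 1):
        best = sorted(blows[:k], reverse=True)
        if best[0] - best[1] <= rc:
            return best[0]
    return max(blows)

errors = []
for _ in range(5000):
    blows = maneuvers()
    unconstrained = max(blows)  # best of all eight maneuvers
    errors.append((unconstrained - conventional(blows)) / unconstrained)

mean_error = statistics.mean(errors)  # positive: early stopping biases results low
```

Because the conventional result is the best of only the first few maneuvers, it can never exceed the eight-maneuver maximum, so the error in this sketch is systematically one-sided, which is the bias the abstract describes.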

Interventions Targeting Deep Tissue Lymph Nodes May Not Effectively Reduce the Risk of Salmonellosis from Ground Pork Consumption: A Quantitative Microbial Risk Assessment

10 October 2019 - 2:19pm
Abstract

The inclusion of deep tissue lymph nodes (DTLNs) or nonvisceral lymph nodes contaminated with Salmonella in wholesale fresh ground pork (WFGP) production may pose risks to public health. To assess the relative contribution of DTLNs to human salmonellosis occurrence associated with ground pork consumption and to investigate potential critical control points in the slaughter‐to‐table continuum for the control of human salmonellosis in the United States, a quantitative microbial risk assessment (QMRA) model was established. The model predicted an average of 45 cases of salmonellosis (95% CI = [19, 71]) per 100,000 Americans annually due to WFGP consumption. Sensitivity analysis of all stochastic input variables showed that cooking temperature was the most influential parameter for reducing salmonellosis cases associated with WFGP meals, followed by storage temperature and Salmonella concentration on contaminated carcass surface before fabrication. The input variables were grouped to represent three main factors along the slaughter‐to‐table chain influencing Salmonella doses ingested via WFGP meals: DTLN‐related factors, factors at processing other than DTLNs, and consumer‐related factors. The evaluation of the impact of each group of factors by second‐order Monte Carlo simulation showed that DTLN‐related factors had the lowest impact on the risk estimate among the three groups of factors. These findings indicate that interventions to reduce Salmonella contamination in DTLNs or to remove DTLNs from WFGP products may be less critical for reducing human infections attributable to ground pork than improving consumers’ cooking habits or interventions of carcass decontamination at processing.

Managing Safety‐Related Disruptions: Evidence from the U.S. Nuclear Power Industry

10 October 2019 - 2:19pm
Abstract

Low‐probability, high‐impact events are difficult to manage. Firms may underinvest in risk assessments for low‐probability, high‐impact events because it is not easy to link the direct and indirect benefits of doing so. Scholarly research on the effectiveness of programs aimed at reducing such events faces the same challenge. In this article, we draw on comprehensive industry‐wide data from the U.S. nuclear power industry to explore the impact of conducting probabilistic risk assessment (PRA) on preventing safety‐related disruptions. We examine this using data from over 25,000 monthly event reports across 101 U.S. nuclear reactors from 1985 to 1998. Using Poisson fixed effects models with time trends, we find that the number of safety‐related disruptions fell by 8% to 27% per month in periods after operators submitted their PRA in response to the Nuclear Regulatory Commission's Generic Letter 88‐20, which required all operators to conduct a PRA. One possible mechanism is that the adoption of PRA may have increased learning rates, lowering the rate of recurring events by 42%. We find that operators that completed their PRA before Generic Letter 88‐20 continued to experience safety improvements during 1990–1995. This suggests that revisiting PRA or conducting it again can be beneficial. Our results suggest that even in an industry as safety‐conscious as nuclear power, a more formal approach to quantifying risk has its benefits.

Understanding Community Resilience from a PRA Perspective Using Binary Decision Diagrams

10 October 2019 - 2:19pm
Abstract

Probabilistic risk assessment (PRA) is a useful tool to assess complex interconnected systems. This article leverages the capabilities of PRA tools developed for industrial and nuclear risk analysis in community resilience evaluations by modeling the food security of a community in terms of its built environment as an integrated system. To this end, we model the performance of Gilroy, CA, a moderate‐size town, with regard to disruptions in its food supply caused by a severe earthquake. The food retailers of Gilroy, along with the electrical power network, water network elements, and bridges are considered as components of a system. Fault and event trees are constructed to model the requirements for continuous food supply to community residents and are analyzed efficiently using binary decision diagrams (BDDs). The study also identifies shortcomings in approximate classical system analysis methods in assessing community resilience. Importance factors are utilized to rank the importance of various factors to the overall risk of food insecurity. Finally, the study considers the impact of various sources of uncertainties in the hazard modeling and performance of infrastructure on food security measures. The methodology can be applicable for any existing critical infrastructure system and has potential extensions to other hazards.
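At its core, the PRA evaluation computes the probability of a top event (food insecurity) from component failure probabilities combined through fault-tree logic. A minimal sketch, using exhaustive enumeration rather than actual BDDs, with a hypothetical tree and hypothetical post-earthquake failure probabilities:

```python
from itertools import product

# hypothetical, independent component failure probabilities after the earthquake
p_fail = {"power": 0.2, "water": 0.1, "bridge": 0.3, "retailer": 0.15}

def food_insecure(state):
    # assumed top-event logic: food access is lost if the retailer fails,
    # if the water network fails, or if both the bridge and power fail
    # (no resupply and no refrigeration)
    return state["retailer"] or state["water"] or (state["bridge"] and state["power"])

def top_event_prob():
    # exact enumeration over all component states; BDDs make this same
    # computation tractable for much larger systems
    comps = list(p_fail)
    total = 0.0
    for bits in product([True, False], repeat=len(comps)):
        state = dict(zip(comps, bits))
        pr = 1.0
        for c, failed in state.items():
            pr *= p_fail[c] if failed else 1 - p_fail[c]
        if food_insecure(state):
            total += pr
    return total

prob = top_event_prob()
```

For these assumed inputs the analytic answer is 1 - 0.85 * 0.9 * (1 - 0.06) = 0.2809, which the enumeration reproduces; importance factors can then be computed by perturbing one component probability at a time.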

A Review of Recent Advances in Benchmark Dose Methodology

10 October 2019 - 2:19pm
Abstract

In this review, recent methodological developments for the benchmark dose (BMD) methodology are summarized. Specifically, we introduce the advances for the main steps in BMD derivation: selecting the procedure for defining a BMD from a predefined benchmark response (BMR), setting a BMR, selecting a dose–response model, and estimating the corresponding BMD lower limit (BMDL). Although the last decade has shown major progress in the development of BMD methodology, there is still room for improvement. Remaining challenges are the implementation of new statistical methods in user‐friendly software and the lack of consensus about how to derive the BMDL.
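The BMD step itself is a model inversion: given a fitted dose–response model and a chosen BMR, find the dose producing that response change. A minimal sketch with an assumed exponential model (the BMDL would additionally require a lower confidence bound, e.g., via profile likelihood or bootstrap):

```python
import math

def bmd_from_model(response, bmr, control, d_hi=100.0):
    # numerically invert a monotone increasing dose-response model: find
    # the dose where the response has risen by the benchmark response (BMR)
    target = control * (1.0 + bmr)
    lo, hi = 0.0, d_hi
    for _ in range(200):  # bisection
        mid = 0.5 * (lo + hi)
        if response(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# illustrative exponential model; parameters assumed fitted elsewhere
a, b = 2.0, 0.03
model = lambda d: a * math.exp(b * d)

bmd = bmd_from_model(model, bmr=0.05, control=a)
# for this particular model the closed form is ln(1 + BMR) / b
```

The numerical inversion matters because most dose–response models used in practice have no closed-form BMD; here the exponential case provides a check.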

Potential Airborne Asbestos Exposure and Risk Associated with the Historical Use of Cosmetic Talcum Powder Products

10 October 2019 - 2:19pm
Abstract

Over time, concerns have been raised regarding the potential for human exposure and risk from asbestos in cosmetic‐talc–containing consumer products. In 1985, the U.S. Food and Drug Administration (FDA) conducted a risk assessment evaluating the potential inhalation asbestos exposure associated with the cosmetic talc consumer use scenario of powdering an infant during diapering, and found that risks were below levels associated with background asbestos exposures and risk. However, given the scope and age of the FDA's assessment, it was unknown whether the agency's conclusions remained relevant to current risk assessment practices, talc application scenarios, and exposure data. This analysis updates the previous FDA assessment by incorporating the current published exposure literature associated with consumer use of talcum powder and using the current U.S. Environmental Protection Agency's (EPA) nonoccupational asbestos risk assessment approach to estimate potential cumulative asbestos exposure and risk for four use scenarios: (1) infant exposure during diapering; (2) adult exposure from infant diapering; (3) adult exposure from face powdering; and (4) adult exposure from body powdering. The estimated range of cumulative asbestos exposure potential for all scenarios (assuming an asbestos content of 0.1%) ranged from 0.0000021 to 0.0096 f/cc‐yr and resulted in risk estimates that were within or below EPA's acceptable target risk levels. Consistent with the original FDA findings, exposure and corresponding health risk in this range were orders of magnitude below upper‐bound estimates of cumulative asbestos exposure and risk at ambient levels, which have not been associated with increased incidence of asbestos‐related disease.
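The cumulative exposure metric (f/cc‐yr) has a simple structure: airborne fiber concentration weighted by the fraction of time exposed, accumulated over the years of use. A minimal sketch with hypothetical inputs, not the study's data:

```python
def cumulative_exposure(conc_fcc, mins_per_event, events_per_year, years):
    # f/cc-yr: concentration times the fraction of each year spent exposed,
    # summed over the exposure duration
    hours_per_year = mins_per_event / 60.0 * events_per_year
    fraction_of_year = hours_per_year / (24.0 * 365.0)
    return conc_fcc * fraction_of_year * years

# hypothetical body-powdering scenario: brief daily use over 40 years,
# with an assumed task-duration airborne concentration of 0.01 f/cc
ce = cumulative_exposure(conc_fcc=0.01, mins_per_event=2.0,
                         events_per_year=365, years=40)
```

Even with daily use over decades, the short per-event duration keeps the time-weighted cumulative value small, which is why the scenario estimates span several orders of magnitude below occupational exposures.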

A Novel Approach to Chemical Mixture Risk Assessment—Linking Data from Population‐Based Epidemiology and Experimental Animal Tests

10 October 2019 - 2:19pm
Abstract

Humans are continuously exposed to suspected or proven endocrine disrupting chemicals (EDCs). Risk management of EDCs presents a major unmet challenge because the available data on adverse health effects are generated by examining one compound at a time, whereas real‐life exposures are to mixtures of chemicals. In this work, we integrate epidemiological and experimental evidence toward a whole mixture strategy for risk assessment. To illustrate, we conduct the following four steps in a case study: (1) identification of single EDCs (“bad actors”)—measured in prenatal blood/urine in the SELMA study—that are associated with a shorter anogenital distance (AGD) in baby boys; (2) definition and construction of a “typical” mixture consisting of the “bad actors” identified in Step 1; (3) experimentally testing this mixture in an in vivo animal model to estimate a dose–response relationship and determine a point of departure (i.e., reference dose [RfD]) associated with an adverse health outcome; and (4) use of a statistical measure of “sufficient similarity” to compare the experimental RfD (from Step 3) to the exposure measured in the human population and generate a “similar mixture risk indicator” (SMRI). The objective of this exercise is to generate a proof of concept for the systematic integration of epidemiological and experimental evidence with mixture risk assessment strategies. Using a whole mixture approach, we found a higher rate of pregnant women at risk (13%) than with more traditional additivity models (3%) or a compound‐by‐compound strategy (1.6%).

Shannon Entropy for Quantifying Uncertainty and Risk in Economic Disparity

10 October 2019 - 2:19pm
Abstract

The rise in economic disparity presents significant risks to global social order and the resilience of local communities. However, existing measurement science for economic disparity (e.g., the Gini coefficient) does not explicitly consider a probability distribution with the information, deficiencies, and uncertainties associated with the underlying income distribution. This article introduces the quantification of Shannon entropy for income inequality across scales, including national‐, subnational‐, and city‐level data. The probabilistic principles of Shannon entropy provide a new interpretation of uncertainty and risk related to economic disparity. Entropy and information‐based conflict rise as world incomes converge. High‐entropy instances can correspond both to happy, prosperous societies and to a socialist–communist social structure. Low entropy signals high‐risk tipping points, enabling anomaly and conflict detection with higher confidence. Finally, spatial–temporal entropy maps for U.S. cities offer a city risk profiling framework. The results show polarization of household incomes within and across Baltimore, Washington, DC, and San Francisco. Entropy produces reliable results at significantly lower computational cost than Gini coefficients.
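The core computation is standard: treat the income histogram as a probability distribution and compute its Shannon entropy. A minimal sketch with hypothetical bracket counts for two stylized cities:

```python
import math

def shannon_entropy(counts):
    # entropy (in bits) of an income histogram: p_i is the share of
    # households in bracket i, and H = -sum p_i * log2(p_i)
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in ps)

# hypothetical household counts per income bracket
uniform_city = [100, 100, 100, 100]   # incomes spread evenly across brackets
polarized_city = [190, 5, 5, 200]     # mass concentrated at the extremes

h_uniform = shannon_entropy(uniform_city)    # maximal: log2(4) = 2 bits
h_polar = shannon_entropy(polarized_city)    # lower: concentrated distribution
```

A perfectly even spread maximizes entropy, while polarization of the kind reported for Baltimore, Washington, DC, and San Francisco pushes entropy down, which is what makes low entropy usable as a risk signal.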

Adoption of Individual Flood Damage Mitigation Measures in New York City: An Extension of Protection Motivation Theory

10 October 2019 - 2:19pm
Abstract

This study offers insights into factors of influence on the implementation of flood damage mitigation measures by more than 1,000 homeowners who live in flood‐prone areas in New York City. Our theoretical basis for explaining flood preparedness decisions is protection motivation theory, which we extend using a variety of other variables that can have an important influence on individual decision making under risk, such as risk attitudes, time preferences, social norms, trust, and local flood risk management policies. Our results in relation to our main hypothesis are as follows. Individuals who live in high flood risk zones take more flood‐proofing measures in their home than individuals in low‐risk zones, which suggests the former group has a high threat appraisal. With regard to coping appraisal variables, we find that a high response efficacy and a high self‐efficacy play an important role in taking flood damage mitigation measures, while perceived response cost does not. In addition, a variety of behavioral characteristics influence individual decisions to flood‐proof homes, such as risk attitudes, time preferences, and private values of being well prepared for flooding. Investments in elevating one's home are mainly influenced by building code regulations and are negatively related to expectations of receiving federal disaster relief. We discuss a variety of policy recommendations to improve individual flood preparedness decisions, including incentives for risk reduction through flood insurance, and communication campaigns focused on coping appraisals and informing people about flood risk they face over long time horizons.

Is Allocation Affected by the Perception of Others' Irresponsible Behavior and by Ambiguity?

10 October 2019 - 2:19pm
Abstract

The article examines how the perception of others' irresponsible behavior and ambiguity regarding probabilities affect allocation among potential beneficiaries. To elicit these views, we conducted a survey where the participants were first asked to make an allocation of a fixed sum of money between a hereditary cancer, where chance plays a central role, and a lifestyle‐related cancer, where individual lifestyle decisions are more important. Our estimation results show that a substantial share of the respondents allocate significantly more to the hereditary cancer. This may indicate that these respondents care about others' irresponsible behavior. Then, we elicited perceptions of cancer hazards in the form of imprecise probabilities and examined the interplay between allocating behavior and risk perceptions. Finally, we investigated the effects of various socioeconomic characteristics, and of awareness of highly publicized cancer cases, on respondents' allocations.

Linking Decision Theory and Quantitative Microbial Risk Assessment: Tradeoffs Between Compliance and Efficacy for Waterborne Disease Interventions

10 October 2019 - 2:19pm
Abstract

Achieving health gains from the U.N. Sustainable Development Goals of universal coverage for water and sanitation will require interventions that can be widely adopted and maintained. Effectiveness—how an intervention performs based on actual use—as opposed to efficacy will therefore be central to evaluations of new and existing interventions. Incomplete compliance—when people do not always use the intervention and are therefore exposed to contamination—is thought to be responsible for the lower‐than‐expected risk reductions observed from water, sanitation, and hygiene interventions based on their efficacy at removing pathogens. We explicitly incorporated decision theory into a quantitative microbial risk assessment model. Specifically, we assume that the usability of household water treatment (HWT) devices (filters and chlorine) decreases as they become more efficacious, due to issues such as taste or flow rates. Simulations were run to examine the tradeoff between device efficacy and usability. For most situations, HWT interventions that trade lower efficacy (i.e., remove fewer pathogens) for higher compliance (i.e., better usability) contribute substantial reductions in diarrheal disease risk compared to devices meeting current World Health Organization efficacy guidelines. Recommendations that take into account both the behavioral and microbiological properties of treatment devices are likely to be more effective at reducing the burden of diarrheal disease than current standards that only consider efficacy.
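The efficacy/compliance tradeoff can be illustrated with a minimal sketch. The exponential dose-response form is standard in QMRA, but the usability penalty, pathogen concentration, and dose-response parameter below are all assumptions chosen for illustration:

```python
import math

def infection_risk(log_removal, compliance, conc=1.0, volume=1.0, r=0.5):
    # expected daily infection risk under partial compliance: with
    # probability `compliance` the water is treated (pathogens reduced by
    # 10**log_removal), otherwise it is drunk untreated; exponential
    # dose-response P(inf) = 1 - exp(-r * dose)
    dose_raw = conc * volume
    dose_treated = dose_raw / 10 ** log_removal
    p = lambda dose: 1 - math.exp(-r * dose)
    return compliance * p(dose_treated) + (1 - compliance) * p(dose_raw)

# assumed usability penalty: higher-efficacy devices are used less often
compliance = lambda lr: max(0.0, 0.95 - 0.1 * lr)

risks = {lr: infection_risk(lr, compliance(lr)) for lr in (1, 2, 3, 4, 6)}
```

Under these assumed numbers, a device with 1 to 2 log removal and high compliance yields a lower expected risk than a 6-log device used only a third of the time: the untreated-exposure term dominates once compliance drops, which is the article's central point.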

Efficacy Foundations for Risk Communication: How People Think About Reducing the Risks of Climate Change

10 October 2019 - 2:19pm
Abstract

Believing action to reduce the risks of climate change is both possible (self‐efficacy) and effective (response efficacy) is essential to motivate and sustain risk mitigation efforts, according to current risk communication theory. Although the public recognizes the dangers of climate change, and is deluged with lists of possible mitigative actions, little is known about public efficacy beliefs in the context of climate change. Prior efficacy studies rely on conflicting constructs and measures of efficacy, and links between efficacy and risk management actions are muddled. As a result, much remains to be learned about how laypersons think about the ease and effectiveness of potential mitigative actions. To bring clarity and inform risk communication and management efforts, we investigate how people think about efficacy in the context of climate change risk management by analyzing unprompted and prompted beliefs from two national surveys (N = 405, N = 1,820). In general, respondents distinguish little between effective and ineffective climate strategies. While many respondents appreciate that reducing fossil fuel use is an effective risk mitigation strategy, overall assessments reflect persistent misconceptions about climate change causes, and uncertainties about the effectiveness of risk mitigation strategies. Our findings suggest targeting climate change risk communication and management strategies to (1) address gaps in people's existing mental models of climate action, (2) leverage existing public understanding of both potentially effective mitigation strategies and the collective action dilemma at the heart of climate change action, and (3) take into account ideologically driven reactions to behavior change and government action framed as climate action.

Risk and the Five Hard Problems of Cybersecurity

10 October 2019 - 2:19pm
Abstract

This perspectives article addresses risk in cyber defense and identifies opportunities to incorporate risk analysis principles into the cybersecurity field. The Science of Security (SoS) initiative at the National Security Agency seeks to further and promote interdisciplinary research in cybersecurity. SoS organizes its research into the Five Hard Problems (5HP): (1) scalability and composability; (2) policy‐governed secure collaboration; (3) security‐metrics–driven evaluation, design, development, and deployment; (4) resilient architectures; and (5) understanding and accounting for human behavior. However, the vast majority of the research sponsored by SoS does not consider risk, and when it does, only implicitly. We therefore identify opportunities for risk analysis in each hard problem and propose approaches to address them. Such collaborations between risk and cybersecurity researchers will enable growth and insight in both fields, as risk analysts may apply existing methodology in a new realm, while the cybersecurity community benefits from accepted practices for describing, quantifying, working with, and mitigating risk.
