Risk Analysis: An International Journal

Table of Contents for Risk Analysis. List of articles from both the latest and EarlyView issues.

Peak Exposures in Epidemiologic Studies and Cancer Risks: Considerations for Regulatory Risk Assessment

5 July 2019 - 2:26pm
Abstract

We review approaches for characterizing “peak” exposures in epidemiologic studies and methods for incorporating peak exposure metrics in dose–response assessments that contribute to risk assessment. The focus is on potential etiologic relations between environmental chemical exposures and cancer risks. We searched the epidemiologic literature on environmental chemicals classified as carcinogens in which cancer risks were described in relation to “peak” exposures. These articles were evaluated to identify some of the challenges associated with defining and describing cancer risks in relation to peak exposures. We found that definitions of peak exposure varied considerably across studies. Of nine chemical agents included in our review of peak exposure, six had epidemiologic data used by the U.S. Environmental Protection Agency (US EPA) in dose–response assessments to derive inhalation unit risk values: benzene, formaldehyde, styrene, trichloroethylene, acrylonitrile, and ethylene oxide. All derived unit risks relied on cumulative exposure for dose–response estimation and none, to our knowledge, considered peak exposure metrics. This is not surprising, given the historical linear no‐threshold default model (generally based on cumulative exposure) used in regulatory risk assessments. Newly proposed US EPA rule language will support fuller consideration of alternative exposure and dose–response metrics. “Peak” exposure has not been consistently defined and has rarely been evaluated in epidemiologic studies of cancer risks. We recommend developing uniform definitions of “peak” exposure to facilitate fuller evaluation of dose response for environmental chemicals and cancer risks, especially where mechanistic understanding indicates that the dose response is unlikely to be linear and that short‐term, high‐intensity exposures increase risk.
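
To make the distinction between metrics concrete, here is a minimal sketch, assuming a hypothetical one-shift exposure series, of how a cumulative metric and two common "peak" metrics (a maximum short-term average and time above a ceiling) can diverge on the same data; all variable names and thresholds are illustrative, not drawn from the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 8-hour shift sampled every minute (ppm); a brief spike
# dominates the peak metrics but barely moves the cumulative total.
conc = rng.lognormal(mean=0.0, sigma=0.5, size=480)
conc[200:205] += 50.0  # short high-intensity excursion

cumulative = conc.sum() / 60.0  # ppm-hours over the shift
peak_15min = np.convolve(conc, np.ones(15) / 15, mode="valid").max()  # max 15-min average
minutes_above_ceiling = int((conc > 25.0).sum())  # time above a hypothetical ceiling

print(f"cumulative exposure : {cumulative:8.2f} ppm-h")
print(f"max 15-min average  : {peak_15min:8.2f} ppm")
print(f"minutes above 25 ppm: {minutes_above_ceiling:8d}")
```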

Risk and Planning in Agriculture: How Planning on Dairy Farms in Ireland Is Affected by Farmers’ Regulatory Focus

5 July 2019 - 2:26pm
Abstract

This article examines how planning on dairy farms is affected by farmers' motivation. It argues that farmers' choice of expansion strategies can be specified in terms of risk decision making and understood as either prevention‐focused or promotion‐focused motivation. This relationship was tested empirically using mediated regression analyses in which promotion/prevention focus was the independent variable and its effect on total milk production, via planned expansion strategies, was estimated. The results indicate that promotion focus among farmers has an indirect effect on farm expansion via planning strategies that incur greater risk to the farm enterprise. Farmers' regulatory focus influences their planning and risk management activities and must be accounted for in the design and implementation of policy and risk management tools in agriculture.
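
As an illustration of the mediated regression design described above, the following sketch runs a Baron–Kenny-style analysis on simulated data; the variable names, effect sizes, and the use of statsmodels are assumptions for demonstration, not the article's actual model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
promotion_focus = rng.normal(size=n)                         # hypothetical scale score
expansion_plan = 0.5 * promotion_focus + rng.normal(size=n)  # mediator
milk_production = 0.7 * expansion_plan + rng.normal(size=n)  # outcome

# Path a: regulatory focus -> planned expansion strategy.
a = sm.OLS(expansion_plan, sm.add_constant(promotion_focus)).fit().params[1]

# Paths b and c': mediator and direct effect on production.
X = sm.add_constant(np.column_stack([expansion_plan, promotion_focus]))
fit = sm.OLS(milk_production, X).fit()
b, c_prime = fit.params[1], fit.params[2]

print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```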

Clinical Capital and the Risk of Maternal Labor and Delivery Complications: Hospital Scheduling, Timing, and Cohort Turnover Effects

5 July 2019 - 2:26pm
Abstract

The establishment of interventions to maximize maternal health requires the identification of modifiable risk factors. Toward the identification of modifiable hospital‐based factors, we analyze over 2 million births from 2005 to 2010 in Texas, employing a series of quasi‐experimental tests involving hourly, daily, and monthly circumstances where medical service quality (or clinical capital) is known to vary exogenously. Motivated by a clinician's choice model, we investigate whether maternal delivery complications (1) vary by work shift, (2) increase by the hours worked within shifts, (3) increase on weekends and holidays when hospitals are typically understaffed, and (4) are higher in July when a new cohort of residents enter teaching hospitals. We find consistent evidence of a sizable statistical relationship between deliveries during nonstandard schedules and negative patient outcomes. Delivery complications are higher during night shifts (OR = 1.21, 95% CI: 1.18–1.25), and on weekends (OR = 1.09, 95% CI: 1.04–1.14) and holidays (OR = 1.29, 95% CI: 1.04–1.60), when hospitals are understaffed and less experienced doctors are more likely to work. Within shifts, we show deterioration of occupational performance per additional hour worked (OR = 1.02, 95% CI: 1.01–1.02). We observe substantial additional risk at teaching hospitals in July (OR = 1.28, 95% CI: 1.14–1.43), reflecting a cohort‐turnover effect. All results are robust to the exclusion of noninduced births and intuitively falsified with analyses of chromosomal disorders. Results from our multiple‐test strategy indicate that hospitals can meaningfully attenuate harm to maternal health through strategic scheduling of staff.
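
For readers unfamiliar with how such odds ratios and confidence intervals are obtained, this is a minimal sketch of estimating an OR for a night-shift indicator via logistic regression on simulated births; the variable names and the true OR of 1.2 are hypothetical, chosen only to mirror the magnitudes reported above.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20_000
night_shift = rng.integers(0, 2, size=n)  # hypothetical exposure indicator

# Simulate complications with a true OR of about 1.2 for night shifts.
logit = -3.0 + np.log(1.2) * night_shift
complication = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = sm.Logit(complication, sm.add_constant(night_shift)).fit(disp=0)
or_hat = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR = {or_hat:.2f}, 95% CI: {ci_low:.2f}-{ci_high:.2f}")
```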

Agent‐Based Recovery Model for Seismic Resilience Evaluation of Electrified Communities

5 July 2019 - 2:26pm
Abstract

In this article, an agent‐based framework to quantify the seismic resilience of an electric power supply system (EPSS) and the community it serves is presented. Within the framework, the loss and restoration of the EPSS power generation and delivery capacity, and of the power demand from the served community, are used to assess the electric power deficit during the damage absorption and recovery processes. Damage to the components of the EPSS and of the community‐built environment is evaluated using seismic fragility functions. The restoration of the community electric power demand is evaluated using seismic recovery functions. The postearthquake EPSS recovery process is modeled using an agent‐based model with two agents, the EPSS Operator and the Community Administrator. The resilience of the EPSS–community system is quantified using direct, EPSS‐related, societal, and community‐related indicators. Parametric studies are carried out to quantify the influence of different seismic hazard scenarios, agent characteristics, and power dispatch strategies on the EPSS–community seismic resilience. The agent‐based modeling framework enables a rational formulation of the postearthquake recovery phase and highlights an interaction between the EPSS and the community in the recovery process that is not quantified in resilience models developed to date. Furthermore, it shows that the resilience of different community sectors can be enhanced by different power dispatch strategies. The proposed agent‐based EPSS–community system resilience quantification framework can be used to develop better community and infrastructure system risk governance policies.
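
A minimal sketch of the two-agent recovery idea, assuming placeholder repair and reconnection rates: the EPSS Operator restores supply while the Community Administrator restores demand, and the deficit between them is integrated as a simple resilience indicator. The actual framework uses seismic fragility and recovery functions rather than these constants.

```python
# Fractions of pre-earthquake capacity/load remaining after the shock
# (hypothetical initial damage states).
supply, demand = 0.4, 0.7
deficit_history = []

for week in range(1, 53):
    # EPSS Operator agent: repairs generation/delivery capacity each week.
    supply = min(1.0, supply + 0.02)
    # Community Administrator agent: restored buildings reconnect, adding load.
    demand = min(1.0, demand + 0.01)
    deficit_history.append(max(0.0, demand - supply))

# A simple resilience indicator: power deficit integrated over the horizon.
print(f"cumulative power deficit: {sum(deficit_history):.2f} (fraction-weeks)")
```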

AbSRiM: An Agent‐Based Security Risk Management Approach for Airport Operations

5 July 2019 - 2:26pm
Abstract

Security risk management is essential for ensuring effective airport operations. This article introduces AbSRiM, a novel agent‐based modeling and simulation approach to perform security risk management for airport operations that uses formal sociotechnical models that include temporal and spatial aspects. The approach contains four main steps: scope selection, agent‐based model definition, risk assessment, and risk mitigation. The approach is based on traditional security risk management methodologies, but uses agent‐based modeling and Monte Carlo simulation at its core. Agent‐based modeling is used to model threat scenarios, and Monte Carlo simulations are then performed with this model to estimate security risks.

The use of the AbSRiM approach is demonstrated with an illustrative case study. This case study includes a threat scenario in which an adversary attacks an airport terminal with an improvised explosive device. The approach provides a promising way to include important elements, such as human aspects and spatiotemporal aspects, in the assessment of risk. More research is still needed to better identify the strengths and weaknesses of the AbSRiM approach in different case studies, but results demonstrate the feasibility of the approach and its potential.
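
A hedged sketch of the risk assessment step: Monte Carlo draws stand in for repeated runs of an agent-based threat scenario, producing an attack success probability and a loss distribution. The detection probability and consequence model below are invented placeholders, not AbSRiM parameters.

```python
import numpy as np

rng = np.random.default_rng(3)
n_runs = 100_000

# Hypothetical per-run draws standing in for the agent-based simulation:
# whether defenders intercept the adversary, and the conditional impact.
detected = rng.random(n_runs) < 0.85  # assumed detection probability
impact = rng.lognormal(mean=2.0, sigma=1.0, size=n_runs)  # assumed consequence scale
loss = np.where(detected, 0.0, impact)

print(f"P(successful attack) ~ {np.mean(~detected):.3f}")
print(f"expected loss        ~ {loss.mean():.2f}")
print(f"95th percentile loss ~ {np.percentile(loss, 95):.2f}")
```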

When Evolution Works Against the Future: Disgust's Contributions to the Acceptance of New Food Technologies

5 July 2019 - 2:26pm
Abstract

New food technologies have high potential to transform the current resource‐consuming food system into a more efficient and sustainable one, but public acceptance of new food technologies is rather low. Such avoidance might be maintained by a deeply preserved risk avoidance system called disgust. In an online survey, participants (N = 313) received information about a variety of new food technology applications (i.e., genetically modified meat/fish, an edible nanotechnology coating film, a nanotechnology food box, artificial meat/milk, and a synthetic food additive). Each new food technology application was rated according to the respondent's willingness to eat (WTE) it (i.e., acceptance) and perceptions of its risk, benefit, and disgust. Furthermore, food disgust sensitivity was measured using the Food Disgust Scale. Overall, WTE was low for both gene‐technology applications and for meat coated with an edible nanotechnology film, and disgust responses toward all three applications were high. In full mediation models, food disgust sensitivity predicted the disgust response toward each new food technology application, which in turn influenced the WTE for that application. Effects of disgust responses on WTE were highest for the synthetic food additive and lowest for the edible nanotechnology coating film compared to the other technologies. Results indicate that direct disgust responses influence acceptance and risk and benefit perceptions of new food technologies. Implications for future research and strategies to increase acceptance of new food technologies are discussed.

Exploring the Conceptual Foundation of Continuity Management in the Context of Societal Safety

5 July 2019 - 2:26pm
Abstract

Public and private actors with critical roles in ensuring societal safety need to work proactively to reduce risks and vulnerabilities. Traditionally, risk management activities have often been performed to ensure the continuous functioning of key societal services. Recently, however, business continuity management (BCM), and its analytical subcomponent business impact assessment (BIA), has been introduced and used more extensively by both the private and public sectors to increase the robustness and resilience of critical infrastructures and societal functions and services. BCM was originally developed in the business sector but has seen broader use during the last decade. Yet BCM/BIA has gained limited attention in the scientific literature, especially when it comes to clarifying and developing its conceptual basis. First, this article examines and discusses the conceptual foundation of BCM concepts, including practical challenges of applying them. Based on recent conceptual developments in the field of risk management, a more developed conceptualization is suggested. Second, the article discusses challenges that arise when applying BCM in the societal safety area and provides recommendations aimed at improving the clarity and quality of applications. Third, the article suggests how to integrate the overlapping approaches of BIA and risk assessment to improve the efficiency and effectiveness of proactive, analytic processes. We hope that the article can stimulate a critical discussion about the key concepts of BCM, their wider use in societal safety, and their connection to other concepts and activities such as risk assessment.

Let's Call it Quits: Break‐Even Effects in the Decision to Stop Taking Risks

5 July 2019 - 2:26pm
Abstract

“Chasing” behavior, whereby individuals, driven by a desire to break even, continue a risky activity (RA) despite incurring large losses, is a commonly observed phenomenon. We examine whether the desire to break even plays a wider role in decisions to stop engaging in financially motivated RA in a naturalistic setting. We test hypotheses, motivated by this research question, using a large data set: 707,152 transactions of 5,379 individual financial market spread traders between September 2004 and April 2013. The results indicate strong effects of changes in wealth around the break‐even point on the decision to cease an RA. An important mediating factor was the individual's historical long‐term performance. Those with a more profitable trading history were less affected by a fall in cash balance below the break‐even point compared to those who had been less profitable. We observe that break‐even points play an important role in the decision of nonpathological risk takers to stop RAs. It is possible, therefore, that these nonpathological cognitive processes, when occurring in extrema, may result in pathological gambling behavior such as “chasing.” Our data set focuses on RAs in financial markets and, consequently, we discuss the implications for institutions and regulators in the effective management of risk taking in markets. We also suggest that there may be a need to consider carefully the nature and role of “break‐even points” associated with a broader range of nonfinancially‐focused risk‐taking activities, such as smoking and substance abuse.
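
The break-even and performance-history effects can be cast, for illustration, as a logistic regression of the stopping decision on a below-break-even indicator and its interaction with past profitability; the simulated coefficients below are hypothetical and merely mimic the direction of the reported findings.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 10_000
below_breakeven = rng.integers(0, 2, size=n)     # cash balance below deposits?
profitable_history = rng.integers(0, 2, size=n)  # hypothetical long-term performance

# Crossing below break-even raises the odds of quitting, less so for
# historically profitable traders (the moderating effect reported above).
logit = -1.0 + 0.8 * below_breakeven - 0.5 * below_breakeven * profitable_history
stop = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack(
    [below_breakeven, profitable_history, below_breakeven * profitable_history]))
fit = sm.Logit(stop, X).fit(disp=0)
print(np.round(np.exp(fit.params), 2))  # odds ratios: const, main effects, interaction
```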

Null Hypothesis Testing ≠ Scientific Inference: A Critique of the Shaky Premise at the Heart of the Science and Values Debate, and a Defense of Value‐Neutral Risk Assessment

5 July 2019 - 2:26pm
Abstract

Many philosophers and statisticians argue that risk assessors are morally obligated to evaluate the probabilities and consequences of methodological error, and to base their decisions of whether to adopt a given parameter value, model, or hypothesis on those considerations. This argument is couched within the rubric of null hypothesis testing, which I suggest is a poor descriptive and normative model for risk assessment. Risk regulation is not primarily concerned with evaluating the probability of data conditional upon the null hypothesis, but rather with measuring risks, estimating the consequences of available courses of action and inaction, formally characterizing uncertainty, and deciding what to do based upon explicit values and decision criteria. In turn, I defend an ideal of value‐neutrality, whereby the core inferential tasks of risk assessment—such as weighing evidence, estimating parameters, and model selection—should be guided by the aim of correspondence to reality. This is not to say that value judgments be damned, but rather that they should be accounted for within a structured approach to decision analysis, rather than embedded within risk assessment in an informal manner.

Effect of Providing the Uncertainty Information About a Tornado Occurrence on the Weather Recipients’ Cognition and Protective Action: Probabilistic Hazard Information Versus Deterministic Warnings

5 July 2019 - 2:26pm
Abstract

Currently, a binary alarm system is used in the United States to issue deterministic warning polygons in case of tornado events. To enhance the effectiveness of the weather information, a likelihood alarm system, which uses a tool called probabilistic hazard information (PHI), is being developed at the National Severe Storms Laboratory to issue probabilistic information about the threat. This study investigates the effects of providing uncertainty information about a tornado occurrence through the PHI's graphical swath on laypeople's concern, fear, and protective action, as compared with providing warning information with the deterministic polygon. Displays of color‐coded swaths and deterministic polygons were shown to subjects. Some displays had a blue background denoting the probability of any tornado formation in the general area. Participants were asked to report their levels of concern, fear, and protective action at randomly chosen locations within each of seven designated levels on each display. Analysis of a three‐stage nested design showed that providing uncertainty information via the PHI appropriately increased recipients' levels of concern, fear, and protective action in highly dangerous scenarios, those with a more than 60% chance of being affected by the threat, as compared with deterministic polygons. The blue background and the color‐coding type did not have a significant effect on people's cognition of, or reaction to, the threat. This study shows that using a likelihood alarm system leads to more conscious decision making by weather information recipients and enhances system safety.

Recalibration of the Grunow–Finke Assessment Tool to Improve Performance in Detecting Unnatural Epidemics

5 July 2019 - 2:26pm
Abstract

Successful identification of unnatural epidemics relies on a sensitive risk assessment tool designed to differentiate between unnatural and natural epidemics. The Grunow–Finke tool (GFT), the most widely used such tool, however, has low sensitivity in this differentiation. We aimed to recalibrate the GFT and improve its performance in detecting unnatural epidemics. The comparator was the original GFT and its application to 11 historical outbreaks, including eight confirmed unnatural outbreaks and three natural outbreaks. Three steps were involved: (i) removing criteria, (ii) changing weighting factors, and (iii) adding and refining criteria. We created a series of alternative models, examining the resulting changes in the estimated likelihood of an unnatural outbreak, until we found a model that correctly identified all the unnatural and natural outbreaks. Finally, the recalibrated GFT was tested and validated with data from an unnatural and a natural outbreak, respectively. A total of 238 models were tested. Through the removal of criteria, increasing or decreasing the weighting factors of other criteria, adding a new criterion titled “special insights,” and setting a new threshold for likelihood, we increased the sensitivity of the GFT from 38% to 100% and retained specificity at 100% in detecting unnatural epidemics. Using test data from an unnatural and a natural outbreak, the recalibrated GFT correctly classified their etiology. The recalibrated GFT could be integrated into routine outbreak investigation by public health institutions and agencies responsible for biosecurity.
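
A GFT-style tool is, at heart, a weighted criteria score compared against a threshold. The sketch below shows that structure, including a "special_insights" criterion like the one added in the recalibration; the specific criteria, weights, points, and threshold are illustrative, not the recalibrated values.

```python
# Illustrative criteria and weights (not the published GFT values).
CRITERIA_WEIGHTS = {"biorisk": 2, "unusual_strain": 3, "special_insights": 3,
                    "epidemic_intensity": 1}

def likelihood_score(findings):
    """Sum of weight x points (0-3 per criterion), normalized to 0-1."""
    max_score = 3 * sum(CRITERIA_WEIGHTS.values())
    raw = sum(CRITERIA_WEIGHTS[k] * findings.get(k, 0) for k in CRITERIA_WEIGHTS)
    return raw / max_score

THRESHOLD = 0.5  # hypothetical cut-off for calling an outbreak "unnatural"

outbreaks = [  # (findings, truly_unnatural) -- toy validation data
    ({"unusual_strain": 3, "special_insights": 2}, True),
    ({"epidemic_intensity": 1}, False),
]
calls = [(likelihood_score(f) >= THRESHOLD) == truth for f, truth in outbreaks]
print(f"correctly classified: {sum(calls)}/{len(calls)}")
```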

Ortwin Renn: Risk Governance Maven

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, Page 1435-1440, July 2019.

From the Editors

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, Page 1433-1434, July 2019.

Issue Information ‐ TOC

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, July 2019.

Health Risk Assessment of Photoresists Used in an Optoelectronic Semiconductor Factory

28 June 2019 - 1:24pm
Abstract

Photoresist materials are indispensable in photolithography, a process used in semiconductor fabrication. The work processes and potential hazards in semiconductor production have raised concerns about adverse health effects. We therefore performed a health risk assessment of occupational exposure to positive photoresists in a single optoelectronic semiconductor factory in Taiwan. Positive photoresists are widely used in the optoelectronic semiconductor industry for photolithography. Occupational exposure was estimated using the Stoffenmanager® model, with Bayesian modeling incorporating the available personal air sampling data. We examined the composition and by‐products of the photoresists according to descriptions published in the literature and patents; the main components assessed were propylene glycol methyl ether acetate (PGMEA), novolac resin, photoactive compound, phenol, cresol, benzene, toluene, and xylene. Reference concentrations for each compound were reassessed and updated where necessary. Calculated hazard quotients were greater than 1 for benzene, phenol, xylene, and PGMEA, indicating the potential for exposures that exceed reference levels. Our health risk assessment suggests that benzene and phenol carry a higher level of risk than is currently acknowledged. Undertaking this form of risk assessment in the workplace design phase could identify compounds of major concern, allow for the early implementation of control measures and monitoring strategies, and thereby reduce the health risks that workers face throughout their careers.
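
The hazard quotient logic is simple enough to show directly: HQ is the ratio of estimated exposure to the reference concentration, and HQ > 1 flags a compound of concern. The concentrations below are made-up placeholders, not measurements from the study.

```python
# Hazard quotient arithmetic: HQ = exposure concentration / reference
# concentration. All values are hypothetical placeholders.
exposures_mg_m3 = {"benzene": 0.05, "phenol": 0.4, "PGMEA": 600.0}
reference_mg_m3 = {"benzene": 0.03, "phenol": 0.2, "PGMEA": 500.0}

for compound, exposure in exposures_mg_m3.items():
    hq = exposure / reference_mg_m3[compound]
    flag = "exceeds reference" if hq > 1 else "below reference"
    print(f"{compound:8s} HQ = {hq:5.2f} ({flag})")
```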

How to Integrate Labor Disruption into an Economic Impact Evaluation Model for Postdisaster Recovery Periods

28 June 2019 - 8:08am
Abstract

Evaluating the economic impacts caused by capital destruction is an effective method for disaster management and prevention, but the magnitude of the economic impact of labor disruption on an economic system remains unclear. This article emphasizes the importance of considering labor disruption when evaluating the economic impact of natural disasters. Drawing on disaster and resilience theory, our model integrates the nonlinear recovery of labor losses and the demand for labor from outside the disaster area into a dynamic evaluation of the economic impact in the postdisaster recovery period. We exemplify this through a case study: the flood disaster that occurred in Wuhan city, China, on July 6, 2016 (the “7.6 Wuhan flood disaster”). The results indicate that (i) the indirect economic impacts of the “7.6 Wuhan flood disaster” would be underestimated by 15.12% if labor disruption were not considered; (ii) the economic impact in the secondary industry caused by insufficient labor accounts for 42.27% of its total impact, while that in the tertiary industry is 36.29%, which can cause enormous losses if both industries suffer shocks; and (iii) the agricultural sector of Wuhan city experiences an increase in output demand of 0.07%, created by the introduction of 50,000 short‐term laborers from outside the disaster area to meet postdisaster reconstruction needs. These results provide evidence for the important role of labor disruption and show that it is a nonnegligible component of postdisaster economic recovery and loss reduction.
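
As a worked example of how such an underestimation figure is computed, the arithmetic below recovers a 15.12% gap from hypothetical with- and without-labor-disruption impact totals; the absolute values are placeholders, not the study's estimates.

```python
# Share of total indirect impact missed when labor disruption is excluded.
impact_with_labor = 100.0  # indirect impact, labor disruption included (placeholder)
impact_without_labor = impact_with_labor * (1 - 0.1512)

underestimate = (impact_with_labor - impact_without_labor) / impact_with_labor
print(f"underestimation if labor disruption ignored: {underestimate:.2%}")
```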

Determinants of Probability Neglect and Risk Attitudes for Disaster Risk: An Online Experimental Study of Flood Insurance Demand among Homeowners

27 June 2019 - 7:00pm
Abstract

Little is known about why individuals place either a high or a very low value on mitigating risks of disaster‐type events, like floods. This study uses panel data methods to explore the psychological factors affecting probability neglect of flood risk relevant to the zero end‐point of the probability weighting function in Prospect Theory, and willingness‐to‐pay for flood insurance. In particular, we focus on explanatory variables of anticipatory and anticipated emotions, as well as the threshold of concern. Moreover, results obtained under real and hypothetical incentives are compared in an experiment with high experimental outcomes. Based on our findings, we suggest several policy recommendations to overcome individual decision processes, which may hinder flood protection efforts.
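
The zero end-point of the probability weighting function referenced above can be made concrete with the standard Tversky–Kahneman (1992) form, shown below; the curvature parameter is their often-cited estimate, and the block is an illustrative sketch rather than this article's specification.

```python
import numpy as np

def tk_weight(p, gamma=0.69):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Small loss probabilities are overweighted -- unless neglected outright,
# i.e., rounded down to the zero end-point, where w(0) = 0 exactly.
for p in [0.0, 0.001, 0.01, 0.1]:
    print(f"p = {p:5.3f} -> w(p) = {tk_weight(p):.3f}")
```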
