WorldWideScience

Sample records for risk analysis computer

  1. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  2. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
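
    The tutorial's worked examples are in MATLAB and R; the fragment below is an independent Python sketch of the same embarrassingly parallel pattern. The loss model inside simulate_once and every parameter value are invented placeholders standing in for the analyst's own simulation, not material from the article.

        # Minimal sketch of an embarrassingly parallel Monte Carlo risk simulation.
        # simulate_once() is a toy stand-in for the analyst's real model.
        import numpy as np
        from multiprocessing import Pool

        def simulate_once(seed):
            """One independent replication: here, a toy annual-loss model."""
            rng = np.random.default_rng(seed)
            n_events = rng.poisson(3.0)                      # random number of loss events
            losses = rng.lognormal(mean=1.0, sigma=0.8, size=n_events)
            return losses.sum()                              # total annual loss

        if __name__ == "__main__":
            n_runs = 100_000
            with Pool() as pool:                             # one worker per available core
                totals = pool.map(simulate_once, range(n_runs), 1_000)  # chunked for low overhead
            totals = np.asarray(totals)
            print("mean loss:", totals.mean())
            print("95th percentile:", np.percentile(totals, 95))

    Because each replication is independent, the work splits cleanly across cores; passing the loop index as the seed keeps the random streams reproducible.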

  3. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
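
    The record does not spell out the linguistic algebra itself; the sketch below only shows the general shape of such a scheme, combining ordinal vulnerability and impact ratings for each threat-target pair through a lookup table. The rating scale, table entries and pair names are invented for illustration and are not the authors' method.

        # Illustrative combination of qualitative scores for threat-target pairs
        # (rating scale, lookup table and pairs are invented, not the authors' algebra).
        COMBINE = {
            ("low", "low"): "low",       ("low", "medium"): "low",        ("low", "high"): "medium",
            ("medium", "low"): "low",    ("medium", "medium"): "medium",  ("medium", "high"): "high",
            ("high", "low"): "medium",   ("high", "medium"): "high",      ("high", "high"): "high",
        }

        pairs = {
            ("flood", "hardware"):            ("medium", "high"),
            ("insider", "documents"):         ("high", "medium"),
            ("dialup intrusion", "software"): ("medium", "medium"),
        }

        for (threat, target), (vulnerability, impact) in pairs.items():
            risk = COMBINE[(vulnerability, impact)]          # qualitative risk for the pair
            print(f"{threat:18s} -> {target:10s} vulnerability={vulnerability:6s} "
                  f"impact={impact:6s} risk={risk}")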

  4. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    Ginevan, M.

    1984-09-01

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables

  5. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises, perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods

  6. ESP and NOAH: computer programs for flood-risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Montague, D.F.; Rooney, J.J.; Fussell, J.B.; Baker, L.S.

    1982-06-01

    This report describes a computer program package that aids in assessing the impact of floods on risk from nuclear power plants. The package consists of two distinct computer programs: ESP and NOAH. The ESP program improves the efficiency of a flood analysis by screening accident sequences and identifying accident sequences that are potentially significant contributors to risk in the event of a flood. Input to ESP includes accident sequences from an existing risk assessment and flood screening criteria. The NOAH program provides detailed qualitative analysis of the plant systems identified by ESP. NOAH performs a qualitative flood simulation of the fault tree

  7. RADTRAN 5: A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air

  8. A computer code for Cohort Analysis of Increased Risks of Death (CAIRD). Technical report

    International Nuclear Information System (INIS)

    Cook, J.R.; Bunger, B.M.; Barrick, M.K.

    1978-06-01

    The most serious health risk confronting individuals exposed to radiation is death from an induced cancer. Since cancers usually do not develop until many years after exposure, other causes of death may intervene and take the lives of those destined to die from cancer. This computer code has been developed to aid risk analysis by calculating the number of premature deaths and the loss of years of life in a hypothetical population after exposure to a given risk situation. The code generates modified life tables and estimates the impact of increased risk through several numerical comparisons with the appropriate reference life tables. One of the code's frequent applications is in estimating the number of radiation-induced deaths that would result from exposing an initial population of 100,000 individuals to an annual radiation dose. For each risk situation analyzed, the computer code generates a summary table which documents the input data and contains the results of the comparisons with reference life tables

  9. RADTRAN 5 - A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1993-01-01

    The RADTRAN 5 computer code has been developed to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI standard FORTRAN 77; the code contains significant advances in the methodology first pioneered with the LINK option of RADTRAN 4. A major application of the LINK methodology is route-specific analysis. Another application is comparisons of attributes along the same route segments. Nonradiological risk factors have been incorporated to allow users to estimate nonradiological fatalities and injuries that might occur during the transportation event(s) being analyzed. These fatalities include prompt accidental fatalities from mechanical causes. Values of these risk factors for the United States have been made available in the code as optional defaults. Several new health effects models have been published in the wake of the Hiroshima-Nagasaki dosimetry reassessment, and this has emphasized the need for flexibility in the RADTRAN approach to health-effects calculations. Therefore, the basic set of health-effects conversion equations in RADTRAN has been made user-definable. All parameter values can be changed by the user, but a complete set of default values is available for both the new International Commission on Radiological Protection model (ICRP Publication 60) and the recent model of the U.S. National Research Council's Committee on the Biological Effects of Ionizing Radiation (BEIR V). The meteorological input data tables have been modified to permit optional entry of maximum downwind distances for each dose isopleth. The expected dose to an individual in each isodose area is also calculated and printed automatically. Examples are given that illustrate the power and flexibility of the RADTRAN 5 computer code. (J.P.N.)
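
    As a rough illustration of the probability-times-consequence bookkeeping that such a code performs over route segments, the sketch below accumulates an incident-free term and an accident term segment by segment. The segment data, dose factor and equations are invented placeholders, not RADTRAN 5 inputs or models.

        # Schematic per-segment transportation risk bookkeeping
        # (all parameter names and values are invented, not RADTRAN 5 data).
        segments = [
            # (length_km, accident_rate_per_km, population_density, severity_consequence)
            (120.0, 3.0e-7,  50.0, 2.0e-2),
            ( 40.0, 8.0e-7, 900.0, 4.0e-1),
        ]

        incident_free_dose_factor = 1.5e-6   # person-Sv per km per (persons/km^2), assumed
        total_incident_free = 0.0
        total_accident_risk = 0.0
        for length, acc_rate, pop_density, consequence in segments:
            total_incident_free += incident_free_dose_factor * length * pop_density
            total_accident_risk += acc_rate * length * consequence   # probability x consequence

        print("incident-free collective dose (person-Sv):", total_incident_free)
        print("accident risk (expected consequence):", total_accident_risk)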

  10. Environmental modeling and health risk analysis (ACTS/RISK)

    National Research Council Canada - National Science Library

    Aral, M. M

    2010-01-01

    ... presents a review of the topics of exposure and health risk analysis. The Analytical Contaminant Transport Analysis System (ACTS) and Health RISK Analysis (RISK) software tools are an integral part of the book and provide computational platforms for all the models discussed herein. The most recent versions of these two softwa...

  11. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  12. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  13. Latent-failure risk estimates for computer control

    Science.gov (United States)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.
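
    The scale of the latent-failure effect can be seen with a standard back-of-the-envelope unavailability argument (not taken from the paper): if a safety circuit fails latently at rate λ and is checked only every T hours, a fault sits undetected for half a test interval on average, so the mean unavailability is roughly q ≈ λT/2 when λT is small. With annual checks (T = 8760 h) and λ = 1e-5 per hour this gives q ≈ 0.04, which is far from negligible, whereas continuous monitoring shrinks the exposure window to the repair time τ, giving q ≈ λτ.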

  14. International Conference on Risk Analysis

    CERN Document Server

    Oliveira, Teresa; Rigas, Alexandros; Gulati, Sneh

    2015-01-01

    This book covers the latest results in the field of risk analysis. Presented topics include probabilistic models in cancer research, models and methods in longevity, epidemiology of cancer risk, engineering reliability and economical risk problems. The contributions of this volume originate from the 5th International Conference on Risk Analysis (ICRA 5). The conference brought together researchers and practitioners working in the field of risk analysis in order to present new theoretical and computational methods with applications in biology, environmental sciences, public health, economics and finance.

  15. Managing the Risks Associated with End-User Computing.

    Science.gov (United States)

    Alavi, Maryam; Weiss, Ira R.

    1986-01-01

    Identifies organizational risks of end-user computing (EUC) associated with different stages of the end-user applications life cycle (analysis, design, implementation). Generic controls are identified that address each of the risks enumerated in a manner that allows EUC management to select those most appropriate to their EUC environment. (5…

  16. Cloud computing assessing the risks

    CERN Document Server

    Carstensen, Jared; Golden, Bernard

    2012-01-01

    Cloud Computing: Assessing the risks answers these questions and many more. Using jargon-free language and relevant examples, analogies and diagrams, it is an up-to-date, clear and comprehensive guide to the security, governance, risk, and compliance elements of Cloud Computing.

  17. RAFT: a computer program for fault tree risk calculations

    International Nuclear Information System (INIS)

    Seybold, G.D.

    1977-11-01

    A description and user instructions are presented for RAFT, a FORTRAN computer code for calculation of a risk measure for fault tree cut sets. RAFT calculates release quantities and a risk measure based on the product of probability and release quantity for cut sets of fault trees modeling the accidental release of radioactive material from a nuclear fuel cycle facility. Cut sets and their probabilities are supplied as input to RAFT from an external fault tree analysis code. Using the total available inventory of radioactive material, along with release fractions for each event in a cut set, the release terms are calculated for each cut set. Each release term is multiplied by the cut set probability to yield the cut set risk measure. RAFT orders the dominant cut sets on the risk measure. The total risk measure of processed cut sets and their fractional contributions are supplied as output. Input options are available to eliminate redundant cut sets, apply threshold values on cut set probability and risk, and control the total number of cut sets output. Hash addressing is used to remove redundant cut sets from the analysis. Computer hardware and software restrictions are given along with a sample problem and cross-reference table of the code. Except for the use of file management utilities, RAFT is written exclusively in FORTRAN language and is operational on a Control Data CYBER 74-18 series computer system. 4 figures
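
    The cut-set arithmetic described above is easy to restate in a few lines. In the sketch below the inventory, release fractions and cut-set probabilities are invented illustrative numbers, not RAFT input data; the code multiplies the inventory by the release fraction of each event, weights by the cut-set probability, and orders the cut sets on the resulting risk measure.

        # Sketch of the cut-set risk bookkeeping described above (illustrative data only).
        inventory = 5.0e3          # total inventory of radioactive material (arbitrary units)

        release_fractions = {"E1": 0.10, "E2": 0.50, "E3": 0.01}   # per basic event
        cut_sets = [
            (("E1", "E2"), 2.0e-4),    # (events in the cut set, cut-set probability)
            (("E3",),      5.0e-3),
            (("E1", "E3"), 1.0e-3),
        ]

        results = []
        for events, prob in cut_sets:
            release = inventory
            for e in events:                      # multiply inventory by each release fraction
                release *= release_fractions[e]
            results.append((events, prob, release, prob * release))

        results.sort(key=lambda r: r[3], reverse=True)     # order on the risk measure
        total_risk = sum(r[3] for r in results)
        for events, prob, release, risk in results:
            print(events, f"risk={risk:.3g}", f"fraction={risk/total_risk:.1%}")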

  18. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on the perturbation but requires the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent increasing interest in reducing this margin makes the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
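
    A minimal sketch of the sampling-based approach is given below: input parameters are drawn from assumed uncertainty distributions and k_eff is re-evaluated for each draw. A cheap algebraic surrogate stands in for the actual Monte Carlo transport calculation; the surrogate, the parameters and their uncertainties are all illustrative assumptions.

        # Sketch of the sampling-based approach: propagate input uncertainties by
        # re-evaluating k_eff for randomly perturbed inputs.
        import numpy as np

        rng = np.random.default_rng(0)

        def keff_surrogate(density, enrichment):
            """Placeholder response; a real study would run the transport code here."""
            return 0.92 + 0.04 * (density - 1.0) + 0.8 * (enrichment - 0.04)

        n_samples = 200
        densities   = rng.normal(1.00, 0.01, n_samples)    # assumed 1% density uncertainty
        enrichments = rng.normal(0.04, 0.0005, n_samples)  # assumed enrichment uncertainty

        keffs = keff_surrogate(densities, enrichments)
        print("mean k_eff:", keffs.mean())
        print("k_eff standard deviation from input uncertainties:", keffs.std(ddof=1))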

  19. Comparative risk analysis

    International Nuclear Information System (INIS)

    Niehaus, F.

    1988-01-01

    In this paper, the risks of various energy systems are discussed considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. The comparative risk analysis has demonstrated that the largest source of risk in every society is from daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (Solar)) for the public has shown that the main sources of risks are coal and oil. The latest comparative risk study of various energy sources was conducted in the USA and revealed that the number of victims from coal is 42 times as many as from nuclear. A study of severe accidents involving hydro dams in the United States has estimated the probability of dam failures at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident in 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program that provides access to different categories of safety-related information. 19 tabs. (author)

  20. RISKAP: a computer code for analysis of increased risk to arbitrary populations

    International Nuclear Information System (INIS)

    Leggett, R.W.

    1986-06-01

    The computer code RISKAP is used to estimate risk to a population exposed to radioactivity. Risk is measured in terms of the expected number of premature deaths resulting from radiogenic cancers, the number of years of life lost as a result of these deaths, and the average number of years of life lost per premature death. Radiation doses are used to compute an annual, age-specific risk of premature cancer death, based on a dose-response function selected by the user. Calculations of premature radiation deaths, deaths from all causes, and the new age distribution of the population are performed for one-year intervals. RISKAP has been designed to accommodate latency and plateau periods that vary with age at exposure and risk functions that vary with age at exposure as well as time after exposure. RISKAP allows the use of a linear, quadratic, or linear-quadratic dose-response function, although the code is structured so that the user may include an exponential factor or substitute any preferred dose-response function
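
    The sketch below shows, in highly simplified form, the kind of year-by-year bookkeeping such a code performs, using a linear-quadratic excess-risk term. It ignores the latency, plateau and age dependence that RISKAP models, and every number in it is an invented placeholder rather than a RISKAP parameter.

        # Toy one-year life-table update with a linear-quadratic dose-response
        # (all values invented; not RISKAP data or equations).
        alpha, beta = 2.0e-2, 5.0e-3         # excess risk per Sv and per Sv^2 (assumed)
        dose = 0.1                           # annual dose to the cohort (Sv, assumed)

        population = 100_000.0
        baseline_death_rate = 0.01           # all-cause annual death probability (assumed)
        excess_risk = alpha * dose + beta * dose**2   # added annual death probability

        expected_premature_deaths = 0.0
        for year in range(50):               # follow the cohort for 50 one-year intervals
            premature = population * excess_risk
            expected_premature_deaths += premature
            population -= population * baseline_death_rate + premature

        print("expected premature deaths over 50 years:", round(expected_premature_deaths))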

  1. Risks and crises for healthcare providers: the impact of cloud computing.

    Science.gov (United States)

    Glasberg, Ronald; Hartmann, Michael; Draheim, Michael; Tamm, Gerrit; Hessel, Franz

    2014-01-01

    We analyze risks and crises for healthcare providers and discuss the impact of cloud computing in such scenarios. The analysis is conducted in a holistic way, taking into account organizational and human aspects, clinical, IT-related, and utilities-related risks as well as incorporating the view of the overall risk management.

  2. Risks and Crises for Healthcare Providers: The Impact of Cloud Computing

    OpenAIRE

    Glasberg, Ronald; Hartmann, Michael; Draheim, Michael; Tamm, Gerrit; Hessel, Franz

    2014-01-01

    We analyze risks and crises for healthcare providers and discuss the impact of cloud computing in such scenarios. The analysis is conducted in a holistic way, taking into account organizational and human aspects, clinical, IT-related, and utilities-related risks as well as incorporating the view of the overall risk management.

  3. Risks and Crises for Healthcare Providers: The Impact of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ronald Glasberg

    2014-01-01

    Full Text Available We analyze risks and crises for healthcare providers and discuss the impact of cloud computing in such scenarios. The analysis is conducted in a holistic way, taking into account organizational and human aspects, clinical, IT-related, and utilities-related risks as well as incorporating the view of the overall risk management.

  4. Computing risk for oil prospects: principles and programs

    International Nuclear Information System (INIS)

    Harbaugh, J.W.; Davis, J.C.; Wendebourg, J.

    1995-01-01

    This volume in the series Computer Methods in the Geosciences examines the challenge of risk assessment, field size distributions, and success, sequence and gambler's ruin. The estimation of the discovery size from the prospect size, outcome probabilities and success ratios, modeling prospects, and mapping properties and uncertainties are reviewed, and discriminating discoveries and dry holes, forecasting cash flow for a prospect, the worth of money, and use of risk analysis tables, decision tables and trees are considered. Appendices cover the installation of the RISK program and user manuals, and two disks are included with the volume. (UK)

  5. TRECII: a computer program for transportation risk assessment

    International Nuclear Information System (INIS)

    Franklin, A.L.

    1980-05-01

    A risk-based fault tree analysis method has been developed at the Pacific Northwest Laboratory (PNL) for analysis of nuclear fuel cycle operations. This methodology was developed for the Department of Energy (DOE) as a risk analysis tool for evaluating high level waste management systems. A computer package consisting of three programs was written at that time to assist in the performance of risk assessment: ACORN (draws fault trees), MFAULT (analyzes fault trees), and RAFT (calculates risk). This methodology evaluates release consequences and estimates the frequency of occurrence of these consequences. This document describes an additional risk calculating code which can be used in conjunction with two of the three codes for transportation risk assessment. TRECII modifies the definition of risk used in RAFT (prob. x release) to accommodate release consequences in terms of fatalities. Throughout this report risk shall be defined as probability times consequences (fatalities are one possible health effect consequence). This methodology has been applied to a variety of energy material transportation systems. Typically the material shipped has been radioactive, although some adaptation to fossil fuels has occurred. The approach is normally applied to truck or train transport systems with some adaptation to pipelines and aircraft. TRECII is designed to be used primarily in conjunction with MFAULT; however, with a moderate amount of effort by the user, it can be implemented independent of the risk analysis package developed at PNL. Code description and user instructions necessary for the implementation of the TRECII program are provided

  6. Risk assessment of computer-controlled safety systems for fusion reactors

    International Nuclear Information System (INIS)

    Fryer, M.O.; Bruske, S.Z.

    1983-01-01

    The complexity of fusion reactor systems and the need to display, analyze, and react promptly to large amounts of information during reactor operation will require a number of safety systems in the fusion facilities to be computer controlled. Computer software, therefore, must be included in the reactor safety analyses. Unfortunately, the science of integrating computer software into safety analyses is in its infancy. Combined plant hardware and computer software systems are often treated by making simple assumptions about software performance. This method is not acceptable for assessing risks in the complex fusion systems, and a new technique for risk assessment of combined plant hardware and computer software systems has been developed. This technique is an extension of the traditional fault tree analysis and uses structured flow charts of the software in a manner analogous to wiring or piping diagrams of hardware. The software logic determines the form of much of the fault trees

  7. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, make the SALP-PC code a powerful tool for risk assessment. (orig.)

  8. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. computer/typewriter use ≥4 vs. computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
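
    For readers unfamiliar with how such pooled estimates are produced, the sketch below implements a generic DerSimonian-Laird random-effects pooling of study-level odds ratios. The three (OR, 95% CI) inputs are invented, and the code is a textbook illustration of the technique, not the analysis performed in the review.

        # Sketch of DerSimonian-Laird random-effects pooling of odds ratios
        # (the three study results below are invented illustrations).
        import math

        studies = [(1.6, 1.1, 2.3), (1.3, 0.9, 1.9), (2.1, 1.2, 3.7)]   # (OR, lower, upper)

        y = [math.log(or_) for or_, lo, hi in studies]                  # log odds ratios
        v = [((math.log(hi) - math.log(lo)) / (2 * 1.96)) ** 2 for _, lo, hi in studies]

        w = [1 / vi for vi in v]                                        # fixed-effect weights
        y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
        tau2 = max(0.0, (q - (len(y) - 1)) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))

        w_re = [1 / (vi + tau2) for vi in v]                            # random-effects weights
        y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        se = math.sqrt(1 / sum(w_re))
        print("pooled OR:", round(math.exp(y_re), 2),
              "95% CI:", round(math.exp(y_re - 1.96 * se), 2), "-",
              round(math.exp(y_re + 1.96 * se), 2))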

  9. Advances in Risk Analysis with Big Data.

    Science.gov (United States)

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  10. Case studies: Risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1987-01-01

    The SOCRATES computer program uses the results of a Probabilistic Risk Assessment (PRA) or a system level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at a plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns with no adverse impacts on risk. Three summaries of case study applications are included to demonstrate the types of results that can be achieved through risk-based evaluation of technical specifications. (orig.)

  11. Measuring attitudes towards nuclear and technological risks (computer programs in SPSS language)

    International Nuclear Information System (INIS)

    Leonin, T.V. Jr.

    1981-04-01

    A number of methodologies have been developed for measuring public attitudes towards nuclear and other technological risks. The Fishbein model, as modified by the IAEA Risk Assessment group, which was found to be applicable to Philippine public attitude measurements, is briefly explained together with two other models which are utilized for comparative correlations. A step-by-step guide on the procedures involved and the calculations required in measuring and analyzing attitudes using these models is likewise described, with special emphasis on the computer processing aspect. The use of the Statistical Package for the Social Sciences (SPSS) in the analysis is also described and a number of computer programs in SPSS for the various statistical calculations required in the analysis are presented. (author)

  12. Modeling issues in nuclear plant fire risk analysis

    International Nuclear Information System (INIS)

    Siu, N.

    1989-01-01

    This paper discusses various issues associated with current models for analyzing the risk due to fires in nuclear power plants. Particular emphasis is placed on the fire growth and suppression models, these being unique to the fire portion of the overall risk analysis. Potentially significant modeling improvements are identified; also discussed are a variety of modeling issues where improvements will help the credibility of the analysis, without necessarily changing the computed risk significantly. The mechanistic modeling of fire initiation is identified as a particularly promising improvement for reducing the uncertainties in the predicted risk. 17 refs., 5 figs. 2 tabs

  13. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  14. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  15. Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing

    Science.gov (United States)

    Wyld, David C.

    Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power, as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks to the risk and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.

  16. Information Risks Analysis in the Cloud Computing System on the basis of Intellectual Technologies

    Directory of Open Access Journals (Sweden)

    Alina Yurievna Sentsova

    2013-02-01

    Full Text Available In this article, the possibility of applying fuzzy cognitive maps to form the sample data sets of an artificial neural network is considered for the estimation of information security risks in a cloud computing system.

  17. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fields.

  18. Analysis and computer program for rupture-risk prediction of abdominal aortic aneurysms

    Directory of Open Access Journals (Sweden)

    Li Zhonghua

    2006-03-01

    Full Text Available Abstract Background Ruptured abdominal aortic aneurysms (AAAs) are the 13th leading cause of death in the United States. While AAA rupture may occur without significant warning, its risk assessment is generally based on critical values of the maximum AAA diameter (>5 cm) and AAA-growth rate (>0.5 cm/year). These criteria may be insufficient for reliable AAA-rupture risk assessment especially when predicting possible rupture of smaller AAAs. Methods Based on clinical evidence, eight biomechanical factors with associated weighting coefficients were determined and summed up in terms of a dimensionless, time-dependent severity parameter, SP(t). The most important factor is the maximum wall stress for which a semi-empirical correlation has been developed. Results The patient-specific SP(t) indicates the risk level of AAA rupture and provides a threshold value when surgical intervention becomes necessary. The severity parameter was validated with four clinical cases and its application is demonstrated for two AAA cases. Conclusion As part of computational AAA-risk assessment and medical management, a patient-specific severity parameter 0
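
    The record describes the severity parameter only qualitatively; a minimal sketch of the weighted-sum construction it implies is given below. The factor names, normalized scores, weights and threshold are invented for illustration and are not the coefficients from the paper.

        # Schematic weighted-sum severity parameter built from normalized factor scores
        # (factor names, scores, weights and threshold are illustrative assumptions).
        factors = {
            # factor: (normalized score in [0, 1], weighting coefficient)
            "maximum wall stress":   (0.8, 0.30),
            "AAA diameter":          (0.6, 0.20),
            "growth rate":           (0.5, 0.15),
            "asymmetry":             (0.4, 0.10),
            "wall stiffness":        (0.3, 0.10),
            "intraluminal thrombus": (0.5, 0.05),
            "hypertension":          (1.0, 0.05),
            "family history":        (0.0, 0.05),
        }

        severity = sum(score * weight for score, weight in factors.values())
        print("severity parameter SP =", round(severity, 2))   # 0 = low risk, 1 = high risk
        threshold = 0.7                                         # assumed intervention threshold
        print("surgical intervention indicated:", severity >= threshold)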

  19. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
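
    The point is easy to reproduce by simulation. In the sketch below (arbitrary rates, no censoring, not the authors' data), treating competing deaths as censored observations makes 1 minus the Kaplan-Meier estimate noticeably larger than the cumulative incidence obtained from the multiple-decrement view.

        # Simulation sketch: with a competing risk, 1 - Kaplan-Meier overestimates
        # the probability of the event of interest; the cumulative incidence does not.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 20_000
        t_reject = rng.exponential(1 / 0.10, n)    # time to chronic rejection (rate 0.10/yr)
        t_death  = rng.exponential(1 / 0.05, n)    # time to death before rejection (0.05/yr)
        time = np.minimum(t_reject, t_death)
        rejected = t_reject < t_death              # event indicator for rejection

        horizon = 10.0                             # estimate probabilities at 10 years

        # Cumulative incidence of rejection by the horizon (no censoring in this toy setup):
        cif = np.mean(rejected & (time <= horizon))

        # Naive Kaplan-Meier treating deaths as censored observations:
        order = np.argsort(time)
        t_sorted, rej_sorted = time[order], rejected[order]
        at_risk = n
        surv = 1.0
        for t, is_rej in zip(t_sorted, rej_sorted):
            if t > horizon:
                break
            if is_rej:
                surv *= 1.0 - 1.0 / at_risk
            at_risk -= 1                           # deaths leave the risk set as "censored"
        print("1 - Kaplan-Meier:", round(1 - surv, 3))
        print("cumulative incidence:", round(cif, 3))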

  20. The application of CFD to hydrogen risk analysis in nuclear power plants

    International Nuclear Information System (INIS)

    Wang Hui; Han Xu; Chang Meng; Wang Xiaofeng; Wang Shuguo; Lu Xinhua; Wu Lin

    2013-01-01

    The status of hydrogen risk analysis methods is systematically summarized in this paper, and the advantages and limits of CFD (Computational Fluid Dynamics) in hydrogen risk analysis are discussed. The international experimental programs on CFD hydrogen risk analysis are introduced. The application of CFD to nuclear power plant (NPP) hydrogen risk analysis is described in detail, taking the EPR and the Ling'ao NPP as examples. On this basis, the development prospects of CFD for hydrogen risk analysis are also summarized. (authors)

  1. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  2. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer-aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive due to increasing requirements for the safety and economy of process control and the remarkable increase in the efficiency of digital electronics. This publication deals with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concepts of diagnosis and therapy control, the modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. Application is given for nuclear power plants. (author)

  3. Noninvasive Computed Tomography-based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial.

    Science.gov (United States)

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias

    2015-09-15

    Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.

  4. Older men who use computers have lower risk of dementia.

    Directory of Open Access Journals (Sweden)

    Osvaldo P Almeida

    Full Text Available OBJECTIVE: To determine if older men who use computers have lower risk of developing dementia. METHODS: Cohort study of 5506 community-dwelling men aged 69 to 87 years followed for up to 8.5 years. Use of computers was measured as daily, weekly, less than weekly and never. Participants also reported their use of email, internet, word processors, games or other computer activities. The primary outcome was the incidence of ICD-10 diagnosis of dementia as recorded by the Western Australian Data Linkage System. RESULTS: 1857/5506 (33.7%) men reported using computers and 347 (6.3%) received a diagnosis of dementia during an average follow up of 6.0 years (range: 6 months to 8.5 years). The hazard ratio (HR) of dementia was lower among computer users than non-users (HR = 0.62, 95%CI = 0.47-0.81), after adjustment for age, educational attainment, size of social network, and presence of depression or of significant clinical morbidity. The HR of dementia appeared to decrease with increasing frequency of computer use: 0.68 (95%CI = 0.41-1.13), 0.61 (95%CI = 0.39-0.94) and 0.59 (95%CI = 0.40-0.87) for less than weekly, at least weekly and daily use. The HR of dementia was 0.66 (95%CI = 0.50-0.86) after the analysis was further adjusted for baseline cognitive function, as measured by the Mini-Mental State Examination. CONCLUSION: Older men who use computers have lower risk of receiving a diagnosis of dementia up to 8.5 years later. Randomised trials are required to determine if the observed associations are causal.

  5. Noninvasive Computed Tomography–based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial

    Science.gov (United States)

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.

    2015-01-01

    Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977

  6. Evidence Report: Risk of Inadequate Human-Computer Interaction

    Science.gov (United States)

    Holden, Kritina; Ezer, Neta; Vos, Gordon

    2013-01-01

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments, and from other safety-critical domains like ground-based power plants, and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes, (2) Information resources and support, (3) Allocation of attention, (4) Cognitive overload, (5) Environmentally induced perceptual changes, (6) Misperception and misinterpretation of displayed information, (7) Spatial disorientation, and (8) Displays and controls.

  7. MARE Risk Analysis Monitor

    International Nuclear Information System (INIS)

    Fuente Prieto, I.; Alonso, P.; Carretero Fernandino, J. A.

    2000-01-01

    The Nuclear Safety Council's requirement that Spanish power plants comply with the requirements of the Maintenance Rule associated with plant risk assessment during power operation, arising from the partial unavailability of systems due to maintenance activities, has led to the need for additional tools to facilitate compliance with said requirements. While the impact on risk produced by individual equipment unavailabilities can easily be evaluated, either qualitatively or quantitatively, the process becomes more complicated when unprogrammed unavailabilities occur simultaneously in various systems, making it necessary to evaluate their functional impact. It is especially complex in the case of support systems that can affect the functionality of multiple systems. In view of the above, a computer application has been developed that is capable of providing the operator with quick answers based on the specific plant model in order to allow fast risk assessment using the information compiled as part of the Probabilistic Safety Analysis. The paper describes the most important characteristics of this application and the basic design requirements of the MARE Risk Monitor. (Author)

  8. Risk-based analysis methods applied to nuclear power plant technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-01-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications
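
    To make the mechanism concrete, the sketch below estimates how a longer surveillance test interval changes risk through the mean unavailability of a periodically tested standby component and a linear sensitivity of core damage frequency to that unavailability. The formula and all parameter values are generic textbook assumptions, not the SOCRATES models or data.

        # Illustrative (not SOCRATES) effect of a longer surveillance test interval (STI)
        # on risk, via the mean unavailability of a periodically tested standby component.
        failure_rate = 1.0e-5      # standby failure rate, per hour (assumed)
        test_downtime = 2.0        # hours unavailable per test (assumed)
        birnbaum = 5.0e-4          # change in core damage frequency per unit unavailability (assumed)

        def mean_unavailability(sti_hours):
            # latent-failure contribution + routine test-downtime contribution
            return failure_rate * sti_hours / 2 + test_downtime / sti_hours

        q_old = mean_unavailability(730.0)     # monthly testing
        q_new = mean_unavailability(2190.0)    # quarterly testing
        print("unavailability: monthly", round(q_old, 4), "quarterly", round(q_new, 4))
        print("approximate change in core damage frequency:",
              f"{birnbaum * (q_new - q_old):.2e}", "per year")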

  9. RACLOUDS - Model for Clouds Risk Analysis in the Information Assets Context

    Directory of Open Access Journals (Sweden)

    SILVA, P. F.

    2016-06-01

    Full Text Available Cloud computing offers benefits in terms of availability and cost, but transfers the responsibility for information security management to the cloud service provider. Thus the consumer loses control over the security of their information and services. This factor has prevented the migration to cloud computing in many businesses. This paper proposes a model in which the cloud consumer can perform risk analysis on providers before and after contracting the service. The proposed model establishes the responsibilities of three actors: Consumer, Provider and Security Labs. The inclusion of the Security Labs actor lends more credibility to the risk analysis, making the results more consistent for the consumer.

  10. Pediatrics patient in computed tomography: risk awareness among medical staff

    International Nuclear Information System (INIS)

    Arandjic, D.; Ciraj-Bjelac, O.; Kosutic, D.; Lazarevic, Dj.

    2009-01-01

    In this paper the results of an investigation of risk awareness in pediatric computed tomography among medical staff are presented. Questionnaires were distributed to seven hospitals; 84 people were enrolled in the investigation. The results showed awareness of the potential risks associated with ionizing radiation in computed tomography. However, there is still widespread underestimation of relative doses and risks in the case of pediatric patients. (author)

  11. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed

  12. Variance computations for functionals of absolute risk estimates.

    Science.gov (United States)

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.

  13. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  14. Aspects of the risk analysis in the process engineering industry

    International Nuclear Information System (INIS)

    Hennings, W.; Madjar, M.; Mock, R.; Reer, B.

    1996-01-01

    This document is the result of a multi-disciplinary working group within a project called 'Risk analysis for chemical plants'. Within the framework of the project, only selected methods and tools of risk analysis, i.e. methodological aspects, could be discussed and developed further. Case examples from the chemical industry are dealt with in order to discuss the application of a computer-assisted quantitative error analysis in this industrial sector. A comprehensive documentation of the data and results used in the examples is also included. figs., tabs., refs

  15. Phased mission analysis of maintained systems: a study in reliability risk analysis

    International Nuclear Information System (INIS)

    Terpstra, K.

    1984-01-01

    The present study develops a general theory that treats the probability of occurrence of each branch of an event tree and correctly takes into account the dependencies between systems, and incorporates within this general theory the solution of the problem of phased mission analysis. The model also covers components that may or may not be repairable, with general lifetime and repair-time distributions, i.e. repairable systems are taken into account in the model. Finally, a computer program is developed that is based on this general theory, i.e. a computer program that can fully perform the probabilistic calculations of a risk analysis and can correctly handle phased mission analysis of repairable systems. The theory is applied to a boiling water reactor accident. (Auth.)

  16. Adversarial risk analysis with incomplete information: a level-k approach.

    Science.gov (United States)

    Rothschild, Casey; McLay, Laura; Guikema, Seth

    2012-07-01

    This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
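
    A minimal sketch of level-k reasoning in a defend-attack setting of the kind described, with hypothetical payoffs and a reveal probability; the level-0 behaviors (a uniformly random attacker and a status-quo defender) are modeling assumptions rather than the article's specification.

```python
# d = 1 means "deploy countermeasure", a = 1 means "attack".
# The attacker observes the defense with probability p_reveal.
p_reveal = 0.6
U_def = {(0, 0): 0.0, (0, 1): -10.0, (1, 0): -1.0, (1, 1): -3.0}
U_att = {(0, 0): 0.0, (0, 1): 8.0, (1, 0): 0.0, (1, 1): -2.0}

def defender_level(k):
    """Level-0 keeps the status quo; level-k best-responds to a level-(k-1)
    attacker, averaging over whether the defense is revealed."""
    if k == 0:
        return 0
    best_d, best_u = 0, float("-inf")
    for d in (0, 1):
        eu = 0.0
        for revealed, w in ((True, p_reveal), (False, 1.0 - p_reveal)):
            p_attack = attacker_level(k - 1, d if revealed else None)
            eu += w * (p_attack * U_def[(d, 1)] + (1 - p_attack) * U_def[(d, 0)])
        if eu > best_u:
            best_d, best_u = d, eu
    return best_d

def attacker_level(k, observed_d):
    """Level-0 attacks with probability 0.5; level-k best-responds to a
    level-(k-1) defender, using the observed defense when it is revealed."""
    if k == 0:
        return 0.5
    d_guess = observed_d if observed_d is not None else defender_level(k - 1)
    return 1.0 if U_att[(d_guess, 1)] > U_att[(d_guess, 0)] else 0.0

print("level-1 defender choice:", defender_level(1))  # responds to a random attacker
print("level-2 defender choice:", defender_level(2))  # anticipates a strategic attacker
```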

  17. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  18. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  19. Computer-based video analysis identifies infants with absence of fidgety movements.

    Science.gov (United States)

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absence of or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with C_SD thresholds chosen for sensitivity of 90% and specificity of 80% gave a 40% referral rate for GMA. Conclusion Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.
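
    A rough sketch of the motion feature described above: the centroid of between-frame pixel differences is tracked over the recording and its spatial variability is summarized in a single number. The exact definition of C_SD used in the study may differ, so the normalization and pooling below should be read as assumptions.

```python
import numpy as np

def center_of_motion_variation(frames):
    """frames: grayscale video as an array of shape (T, H, W)."""
    centroids = []
    for prev, cur in zip(frames[:-1], frames[1:]):
        diff = np.abs(cur.astype(float) - prev.astype(float))
        total = diff.sum()
        if total == 0:
            continue  # no motion between these frames
        ys, xs = np.indices(diff.shape)
        cy = (ys * diff).sum() / total
        cx = (xs * diff).sum() / total
        centroids.append((cx / diff.shape[1], cy / diff.shape[0]))  # normalize to [0, 1]
    centroids = np.array(centroids)
    return centroids.std(axis=0).mean()  # pooled variability of the motion centroid

rng = np.random.default_rng(0)
toy_video = rng.integers(0, 255, size=(50, 64, 64))  # stand-in for a real recording
print("C_SD (toy data):", round(center_of_motion_variation(toy_video), 3))
```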

  20. Computer-Aided Nodule Assessment and Risk Yield Risk Management of Adenocarcinoma: The Future of Imaging?

    Science.gov (United States)

    Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias

    2016-01-01

    Increased clinical use of chest high-resolution computed tomography results in increased identification of lung adenocarcinomas and persistent subsolid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to noninvasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semiquantitative measures to decrease interrater and intrarater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods are currently still suboptimal, require validation, and are not yet clinically applicable. The computer-aided nodule assessment and risk yield software application represents a validated, automated, quantitative, and noninvasive tool for risk stratification of adenocarcinoma lung nodules. Computer-aided nodule assessment and risk yield correlates well with consensus histology and postsurgical patient outcomes, and therefore may help to guide individualized patient management, for example, in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Efficient computation of the joint probability of multiple inherited risk alleles from pedigree data.

    Science.gov (United States)

    Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo

    2018-06-25

    The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows extending peeling-based models to include many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate, we create an extended five-gene version of BRCAPRO, a widely used model for estimating the carrier probabilities of BRCA1 and BRCA2 risk alleles, and assess its computational properties. © 2018 WILEY PERIODICALS, INC.

  2. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  3. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included

  4. RISK LEVEL ANALYSIS ON THE PREVENTIVE EROSION CAPACITY OF BRIDGES

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Deficiency of the Preventive Erosion Capacity (PEC) of a bridge pier is the main factor leading to bridge failures. In this paper, the PEC of bridge piers was analyzed using the stochastic analysis method. Definitions of the reliability and risk level of a bridge pier subjected to water erosion were proposed, and a computational model for erosion depth and risk level was suggested.
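
    A minimal Monte Carlo sketch of the reliability idea in the abstract: failure occurs when the random erosion (scour) depth demanded by a flood exceeds the pier's preventive erosion capacity. The distributions and parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
scour_demand = rng.lognormal(mean=1.0, sigma=0.4, size=n)   # m, random erosion depth
scour_capacity = rng.normal(loc=5.0, scale=0.8, size=n)     # m, preventive erosion capacity
p_failure = np.mean(scour_demand >= scour_capacity)
print(f"estimated risk level  P[demand >= capacity] = {p_failure:.4f}")
print(f"estimated reliability 1 - P[failure]        = {1.0 - p_failure:.4f}")
```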

  5. Identification and ranking of the risk factors of cloud computing in State-Owned organizations

    Directory of Open Access Journals (Sweden)

    Noor Mohammad Yaghoubi

    2015-05-01

    Full Text Available Rapid development of processing and storage technologies and the success of the Internet have made computing resources cheaper, more powerful and more available than before. This technological trend has enabled the realization of a new computing model called cloud computing. Recently, State-Owned organizations have begun to utilize cloud computing architectures, platforms, and applications to deliver services and meet constituents' needs. Despite all of the advantages and opportunities of cloud computing technology, there are many risks that State-Owned organizations need to know about before migrating to a cloud environment. The purpose of this study is to identify and rank the risk factors of cloud computing in State-Owned organizations by making use of IT experts' opinions. First, by reviewing key articles, a comprehensive list of risk factors was extracted and classified into two categories: tangible and intangible. Then, six experts were interviewed about these risks and their classifications, and 10 risks were identified. After that, the risks were ranked with the help of 52 experts using the fuzzy analytic hierarchy process. The results show that the experts identified intangible risks as the most important risks in cloud computing usage by State-Owned organizations. As the results indicate, the "data confidentiality" risk has the highest place among the other risks.
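
    For illustration, the sketch below ranks a few hypothetical cloud risks with the ordinary (crisp) analytic hierarchy process using the geometric-mean method; the study itself used a fuzzy AHP, so this shows only the non-fuzzy core of the idea, and the pairwise comparison matrix is invented.

```python
import numpy as np

risks = ["data confidentiality", "vendor lock-in", "service availability"]
# A[i][j]: how much more important risk i is judged to be than risk j (Saaty scale)
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])
weights = np.prod(A, axis=1) ** (1.0 / A.shape[0])  # row geometric means
weights /= weights.sum()
for risk, w in sorted(zip(risks, weights), key=lambda t: -t[1]):
    print(f"{risk}: priority weight = {w:.3f}")
```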

  6. Computational Fluid Dynamic Analysis of the Left Atrial Appendage to Predict Thrombosis Risk

    Directory of Open Access Journals (Sweden)

    Giorgia Maria Bosi

    2018-04-01

    Full Text Available During Atrial Fibrillation (AF) more than 90% of the left atrial thrombi responsible for thromboembolic events originate in the left atrial appendage (LAA), a complex small sac protruding from the left atrium (LA). Current available treatments to prevent thromboembolic events are oral anticoagulation, surgical LAA exclusion, or percutaneous LAA occlusion. However, the mechanism behind thrombus formation in the LAA is poorly understood. The aim of this work is to analyse the hemodynamic behaviour in four typical LAA morphologies - “Chicken wing”, “Cactus”, “Windsock” and “Cauliflower” - to identify potential relationships between the different shapes and the risk of thrombotic events. Computerised tomography (CT) images from four patients with no LA pathology were segmented to derive the 3D anatomical shape of LAA and LA. Computational Fluid Dynamic (CFD) analyses based on the patient-specific anatomies were carried out imposing both healthy and AF flow conditions. Velocity and shear strain rate (SSR) were analysed for all cases. Residence time in the different LAA regions was estimated with a virtual contrast agent washing out. CFD results indicate that both velocity and SSR decrease along the LAA, from the ostium to the tip, at each instant in the cardiac cycle, thus making the LAA tip more prone to fluid stagnation, and therefore to thrombus formation. Velocity and SSR also decrease from normal to AF conditions. After four cardiac cycles, the lowest washout of contrast agent was observed for the Cauliflower morphology (3.27% of residual contrast in AF), and the highest for the Windsock (0.56% of residual contrast in AF). This suggests that the former is expected to be associated with a higher risk of thrombosis, in agreement with clinical reports in the literature. The presented computational models highlight the major role played by the LAA morphology on the hemodynamics, both in normal and AF conditions, revealing the potential

  7. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  8. Estimation of the failure risk of a maxillary premolar with different crack depths with endodontic treatment by computer-aided design/computer-aided manufacturing ceramic restorations.

    Science.gov (United States)

    Lin, Chun-Li; Chang, Yen-Hsiang; Hsieh, Shih-Kai; Chang, Wen-Jen

    2013-03-01

    This study evaluated the risk of failure for an endodontically treated premolar with cracks of different depths extending toward the pulp chamber, restored by using 3 different computer-aided design/computer-aided manufacturing ceramic restoration configurations. Three 3-dimensional finite element models designed with computer-aided design/computer-aided manufacturing ceramic onlay, endocrown, and conventional crown restorations were constructed to perform simulations. The Weibull function was incorporated with finite element analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stresses on the enamel, dentin, and luting cement were lowest for the endocrown restoration relative to the other 2 restoration methods. Weibull analysis revealed that the overall failure probabilities in a premolar with a shallow crack were 27%, 2%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, in the normal occlusal condition. The corresponding values were 70%, 10%, and 2% for the deeply cracked premolar. This numerical investigation suggests that the endocrown provides sufficient fracture resistance only in a shallowly cracked premolar with endodontic treatment. The conventional crown treatment can immobilize the premolar across the different crack depths with lower failure risk. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
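
    The Weibull step in such an analysis can be sketched as follows: a peak stress taken from the finite element model is converted into a failure probability with a two-parameter Weibull strength distribution. The modulus, characteristic strength, and stresses below are assumptions chosen only to reproduce the qualitative ordering reported above, not the paper's fitted values.

```python
import numpy as np

def weibull_failure_probability(stress_mpa, sigma_0=120.0, m=10.0):
    """Two-parameter Weibull model: P_f = 1 - exp(-(sigma / sigma_0)**m)."""
    return 1.0 - np.exp(-(np.asarray(stress_mpa) / sigma_0) ** m)

for restoration, peak_stress in [("onlay", 110.0), ("endocrown", 75.0), ("crown", 60.0)]:
    print(f"{restoration}: P_f = {weibull_failure_probability(peak_stress):.3f}")
```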

  9. Applied decision analysis and risk evaluation

    International Nuclear Information System (INIS)

    Ferse, W.; Kruber, S.

    1995-01-01

    During 1994 the workgroup 'Applied Decision Analysis and Risk Evaluation' continued the work on the knowledge-based decision support system XUMA-GEFA for the evaluation of the hazard potential of contaminated sites. Additionally, a new research direction was started which aims at the support of a later stage of the treatment of contaminated sites: the clean-up decision. For the support of decisions arising at this stage, the methods of decision analysis will be used. Computational aids for evaluation and decision support were implemented, and a case study at a waste disposal site in Saxony, which turns out to be a danger to the surrounding groundwater resource, was initiated. (orig.)

  10. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
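
    A compact sketch of the loss-distribution approach described above: claim frequency and severity are modeled separately, combined by Monte Carlo, and the expected value and upper percentiles of the simulated annual loss are read off as risk indexes. The Poisson/lognormal choices and all parameters are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 50_000
annual_losses = np.empty(n_years)
for i in range(n_years):
    n_claims = rng.poisson(lam=25)                                    # claims per year
    severities = rng.lognormal(mean=9.0, sigma=1.2, size=n_claims)    # cost per claim
    annual_losses[i] = severities.sum()

print(f"expected annual loss : {annual_losses.mean():,.0f}")
print(f"95th percentile      : {np.percentile(annual_losses, 95):,.0f}")
print(f"99.5th percentile    : {np.percentile(annual_losses, 99.5):,.0f}")
```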

  11. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu

  12. Risk analysis of radioactive waste management systems in Germany

    International Nuclear Information System (INIS)

    Wingender, H.J.

    1978-01-01

    Within the scope of a system study, ''Radioactive wastes in the Federal Republic of Germany,'' performed from 1974 through 1976, the questions of risk assessment were investigated. A risk analysis of a high-level waste (HLW) management system was performed. The results for the HLW tank storage are that the risk expectation value is 700 nJ/kg x RBE (7 x 10^-5 rem) per year for atmospheric release. The discussion of the main contributing accidents shows the possibility of reducing the risk by technical means. A qualitative comparison on the release basis with the results of the WASH-1400 report shows significant differences that can be explained by the different methodologies applied. The risk analysis activities have led to a comprehensive risk assessment project, which was recently started. The project includes research and development tasks concerning nuclide migration and transport to the ecosphere, nuclide mobilization by various mechanisms, methodology problems, data collection, computer code development, as well as risk analyses of waste management facilities. It is intended to round off the project with risk analyses of spent fuel element transport, storage, and reprocessing

  13. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  14. Risk perception and risk management in cloud computing: results from a case study of Swiss companies

    OpenAIRE

    Brender, Nathalie; Markov, Iliya

    2013-01-01

    In today's economic turmoil, the pay-per-use pricing model of cloud computing, its flexibility and scalability and the potential for better security and availability levels are alluring to both SMEs and large enterprises. However, cloud computing is fraught with security risks which need to be carefully evaluated before any engagement in this area. This article elaborates on the most important risks inherent to the cloud such as information security, regulatory compliance, data location, inve...

  15. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution is further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process of using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, and cloud computing services to access publicly hosted datasets, and we describe how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
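
    A minimal sketch of the workflow described above, assuming a hypothetical collection of downscaled daily-maximum-temperature netCDF files (variable "tasmax" in kelvin, coordinates named "lat"/"lon"): xarray opens them with dask-backed chunks, a simple local indicator (annual count of days above 35 °C) is defined lazily, and the result is evaluated only for the grid cell of interest.

```python
import xarray as xr

# Open many files as one lazily evaluated, chunked dataset (file pattern is assumed).
ds = xr.open_mfdataset("downscaled/tasmax_*.nc", chunks={"time": 365})

# Annual count of days with tasmax above 35 C (308.15 K); still lazy at this point.
hot_days = (ds["tasmax"] > 308.15).groupby("time.year").sum("time")

# Pull the nearest grid cell to a site of interest and trigger the computation.
site_series = hot_days.sel(lat=37.8, lon=-122.3, method="nearest").compute()
print(site_series.to_series().head())
```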

  16. A visual interface to computer programs for linkage analysis.

    Science.gov (United States)

    Chapman, C J

    1990-06-01

    This paper describes a visual approach to the input of information about human families into computer data bases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.

  17. Exploring the Key Risk Factors for Application of Cloud Computing in Auditing

    Directory of Open Access Journals (Sweden)

    Kuang-Hua Hu

    2016-08-01

    Full Text Available In the cloud computing information technology environment, cloud computing has advantages such as lower cost, immediate access to hardware resources, lower IT barriers to innovation, and higher scalability, but the financial audit information flow and processing in the cloud system require special considerations by CPA (Certified Public Accountant) firms, for example: system problems, information security and other related issues. Auditing cloud computing applications is a future trend for CPA firms; given that this issue is important to them and that very few studies have investigated it, this study seeks to explore the key risk factors of cloud computing and the related audit considerations. The dimensions/perspectives of cloud computing audit considerations are broad and cover many criteria/factors. These risk factors are becoming increasingly complex and interdependent. If the dimensions can be established, the mutually influential relations among the dimensions and criteria determined, and the current execution performance assessed, a prioritized improvement strategy can be constructed to serve as a reference for CPA firm management decision making, as well as to provide CPA firms with a reference for building cloud computing auditing systems. Empirical results show that the key risk factors to consider when using cloud computing in auditing are, in order of priority for improvement: Operations (D), Automating user provisioning (C), Technology Risk (B) and Protection system (A).

  18. Data security and risk assessment in cloud computing

    Directory of Open Access Journals (Sweden)

    Li Jing

    2018-01-01

    Full Text Available Cloud computing has attracted more and more attention as it reduces the cost of IT infrastructure of organizations. In our country, commercial cloud services, such as Alibaba Cloud, Huawei Cloud, QingCloud, UCloud and so on, are gaining more and more users, especially among small and medium-sized organizations. In the cloud service scenario, programs and data are migrating into the cloud, resulting in a lack of trust between customers and cloud service providers. However, recent studies on cloud computing have mainly focused on the service side, while data security and trust have not been sufficiently studied yet. This paper investigates data security issues from the perspective of the data life cycle, which includes five steps when an organization uses cloud computing. A data management framework is presented, including not only a data classification but also a risk management framework. Concretely, the data are divided into two varieties, business and personal information. Then, four classification levels (high, medium, low, normal) according to the different extent of the potential adverse effect are introduced. With the help of classification, administrators can identify the application or data to implement corresponding security controls. At last, the administrators conduct a risk assessment to alleviate the risk of data security. The trust between customers and cloud service providers will be strengthened in this way.
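
    A small sketch of how the classification step described above might look in practice: each data item is tagged as business or personal, assigned one of the four levels, and mapped to the security controls to apply. The categories, levels, and controls here are illustrative assumptions, not the paper's framework.

```python
# Controls associated with each classification level (illustrative only).
CONTROLS = {
    "high":   ["encrypt at rest and in transit", "customer-held keys", "audit logging"],
    "medium": ["encrypt in transit", "role-based access control"],
    "low":    ["role-based access control"],
    "normal": ["default provider controls"],
}

data_inventory = [
    {"name": "patient records",      "kind": "personal", "level": "high"},
    {"name": "contract archive",     "kind": "business", "level": "medium"},
    {"name": "public product pages", "kind": "business", "level": "normal"},
]

for item in data_inventory:
    controls = "; ".join(CONTROLS[item["level"]])
    print(f"{item['name']} ({item['kind']}, {item['level']}): {controls}")
```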

  19. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up to date presentation of how to understand, define and

  20. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  1. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  2. Risk analysis in oil spill response planning

    International Nuclear Information System (INIS)

    Chernoplekov, A.N.; Alexandrov, A.A.

    2005-01-01

    Tiered response is a basic approach to emergency plans, including oil spill response (OSR). This paper addresses how the huge set of accident scenarios generated by computer during risk assessment can be assigned to the appropriate tier of response. Parameters such as the amount of oil spilled, duration of discharge and types of losses should be provided in OSR scenarios. Examples of applications include offshore installations, subsea or onshore pipelines, and localized onshore facilities. The paper demonstrates how to use risk analysis results for delineating all likely spills into groups that need a specific tier of response. Best world practices and Russian regulatory approaches are outlined and compared. Corresponding algorithms were developed and their application to pipelines is presented. The algorithm combines expert skills and spill trajectory modeling with the net environmental benefit analysis principle into incident-specific emergency response planning. 9 refs., 13 tabs., 2 figs

  3. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only be limited to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.

  4. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African University in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire University. On the Bayesian analysis, thus the third finding, the impact of risk should be predicted along three aspects. These aspects included the human impact (decisions made), the property impact (students and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (University) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.
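
    As a rough illustration of the likelihood-impact relationship discussed above, the sketch below scores and ranks a few hypothetical university risks with a generic likelihood-times-impact calculation; it is not the exact Nicholas model or the study's data.

```python
# Hypothetical register of institutional risks: likelihood in [0, 1], impact on a 1-10 scale.
risks = {
    "student-records system outage": {"likelihood": 0.4, "impact": 8},
    "research-grant funding cut":    {"likelihood": 0.2, "impact": 9},
    "exam-period staff strike":      {"likelihood": 0.1, "impact": 6},
}

ranked = sorted(risks.items(), key=lambda kv: -kv[1]["likelihood"] * kv[1]["impact"])
for name, r in ranked:
    print(f"{name}: expected impact = {r['likelihood'] * r['impact']:.2f}")
```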

  5. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes an analysis and shows its use in analysing strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is furthermore to show strengths, weaknesses, opportunities and threats when using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies in the health care business, the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting the existing data in the database and the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on samples (examples) from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering it himself/herself.

  6. Efficacy of computer technology-based HIV prevention interventions: a meta-analysis.

    Science.gov (United States)

    Noar, Seth M; Black, Hulda G; Pierce, Larson B

    2009-01-02

    To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P < 0.001); additional outcomes examined included number of sexual partners and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have similar efficacy to more traditional human-delivered interventions. Given their low cost to deliver, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.
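
    The aggregation step in such a meta-analysis can be sketched with a fixed-effect, inverse-variance weighting of study effect sizes; the three (d, variance) pairs below are invented for illustration and are not the trials analyzed in the article.

```python
import numpy as np

d = np.array([0.31, 0.18, 0.27])        # per-study standardized mean differences
var = np.array([0.010, 0.015, 0.008])   # per-study sampling variances
w = 1.0 / var                           # inverse-variance weights

d_bar = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"weighted mean d = {d_bar:.3f}, "
      f"95% CI = ({d_bar - 1.96 * se:.3f}, {d_bar + 1.96 * se:.3f})")
```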

  7. The Effects of Variability and Risk in Selection Utility Analysis: An Empirical Comparison.

    Science.gov (United States)

    Rich, Joseph R.; Boudreau, John W.

    1987-01-01

    Investigated utility estimate variability for the selection utility of using the Programmer Aptitude Test to select computer programmers. Comparison of Monte Carlo results to other risk assessment approaches (sensitivity analysis, break-even analysis, algebraic derivation of the distribution) suggests that distribution information provided by Monte…

  8. Cloud computing in pharmaceutical R&D: business risks and mitigations.

    Science.gov (United States)

    Geiger, Karl

    2010-05-01

    Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.

  9. Identification of risk factors of computer information technologies in education

    Directory of Open Access Journals (Sweden)

    Hrebniak M.P.

    2014-03-01

    Full Text Available The basic direction of development of secondary schools and vocational training is computer training of schoolchildren and students, including distance forms of education and widespread usage of world information systems. The purpose of the work is to determine risk factors for schoolchildren and students when using modern information and computer technologies. The results of the research made it possible to establish the dynamics of formation of skills in using computer information technologies in education and the characteristics of mental ability among schoolchildren and students during training in high school. Common risk factors while operating CIT are: intensification and formalization of intellectual activity, adverse ergonomic parameters, unfavorable working posture, and exceedance of hygiene standards for chemical and physical characteristics. The priority preventive directions in applying computer information technology in education are: improvement of optimal visual parameters of activity, rationalization of ergonomic parameters, minimization of the adverse effects of chemical and physical conditions, and rationalization of work and rest schedules.

  10. Utility of high-resolution computed tomography for predicting risk of sputum smear-negative pulmonary tuberculosis

    International Nuclear Information System (INIS)

    Nakanishi, Masanori; Demura, Yoshiki; Ameshima, Shingo; Kosaka, Nobuyuki; Chiba, Yukio; Nishikawa, Satoshi; Itoh, Harumi; Ishizaki, Takeshi

    2010-01-01

    Background: To diagnose sputum smear-negative pulmonary tuberculosis (PTB) is difficult and the ability of high-resolution computed tomography (HRCT) for diagnosing PTB has remained unclear in the sputum smear-negative setting. We retrospectively investigated whether or not this imaging modality can predict risk for sputum smear-negative PTB. Methods: We used HRCT to examine the findings of 116 patients with suspected PTB despite negative sputum smears for acid-fast bacilli (AFB). We investigated their clinical features and HRCT-findings to predict the risk for PTB by multivariate analysis and a combination of HRCT findings by stepwise regression analysis. We then designed provisional HRCT diagnostic criteria based on these results to rank the risk of PTB and blinded observers assessed the validity and reliability of these criteria. Results: A positive tuberculin skin test alone among clinical laboratory findings was significantly associated with an increase of risk of PTB. Multivariate regression analysis showed that large nodules, tree-in-bud appearance, lobular consolidation and the main lesion being located in S1, S2, and S6 were significantly associated with an increased risk of PTB. Stepwise regression analysis showed that coexistence of the above 4 factors was most significantly associated with an increase in the risk for PTB. Ranking of the results using our HRCT diagnostic criteria by blinded observers revealed good utility and agreement for predicting PTB risk. Conclusions: Even in the sputum smear-negative setting, HRCT can predict the risk of PTB with good reproducibility and can select patients having a high probability of PTB.

  12. MetaCompare: A computational pipeline for prioritizing environmental resistome risk.

    Science.gov (United States)

    Oh, Min; Pruden, Amy; Chen, Chaoqi; Heath, Lenwood S; Xia, Kang; Zhang, Liqing

    2018-04-26

    The spread of antibiotic resistance is a growing public health concern. While numerous studies have highlighted the importance of environmental sources and pathways of the spread of antibiotic resistance, a systematic means of comparing and prioritizing risks represented by various environmental compartments is lacking. Here we introduce MetaCompare, a publicly-available tool for ranking 'resistome risk,' which we define as the potential for antibiotic resistance genes (ARGs) to be associated with mobile genetic elements (MGEs) and mobilize to pathogens based on metagenomic data. A computational pipeline was developed in which each ARG is evaluated based on relative abundance, mobility, and presence within a pathogen. This is determined through assembly of shotgun sequencing data and analysis of contigs containing ARGs to determine if they contain sequence similarity to MGEs or human pathogens. Based on the assembled metagenomes, samples are projected into a 3-D hazard space and assigned resistome risk scores. To validate, we tested previously published metagenomic data derived from distinct aquatic environments. Based on unsupervised machine learning, the test samples clustered in the hazard space in a manner consistent with their origin. The derived scores produced a well-resolved ascending resistome risk ranking of: wastewater treatment plant effluent, dairy lagoon, hospital sewage.
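
    A toy sketch of the scoring idea described above: each sample is placed in a three-dimensional hazard space (ARG abundance, ARG-MGE co-occurrence, ARG-pathogen co-occurrence) and summarized with a single score, here simply the Euclidean distance from the origin. The feature values and the distance-based score are assumptions for illustration, not MetaCompare's actual formula.

```python
import numpy as np

# Hypothetical per-sample features: (ARG abundance, ARG-MGE, ARG-pathogen), each scaled to [0, 1].
samples = {
    "hospital_sewage": (0.9, 0.8, 0.7),
    "dairy_lagoon":    (0.6, 0.4, 0.2),
    "wwtp_effluent":   (0.3, 0.2, 0.1),
}

for name, feats in sorted(samples.items(), key=lambda kv: -np.linalg.norm(kv[1])):
    print(f"{name}: resistome risk score = {np.linalg.norm(feats):.2f}")
```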

  13. Computed Tomography Angiography Evaluation of Risk Factors for Unstable Intracranial Aneurysms.

    Science.gov (United States)

    Wang, Guang-Xian; Gong, Ming-Fu; Wen, Li; Liu, Lan-Lan; Yin, Jin-Bo; Duan, Chun-Mei; Zhang, Dong

    2018-03-19

    To evaluate risk factors for instability in intracranial aneurysms (IAs) using computed tomography angiography (CTA). A total of 614 consecutive patients diagnosed with 661 IAs between August 2011 and February 2016 were reviewed. Patients and IAs were divided into stable and unstable groups. Along with clinical characteristics, IA characteristics were evaluated by CTA. Multiple logistic regression analysis was used to identify the independent risk factors associated with unstable IAs. Receiver operating characteristic (ROC) curve analysis was performed on the final model, and optimal thresholds were obtained. Patient age (odds ratio [OR], 0.946), cerebral atherosclerosis (CA; OR, 0.525), and IAs located at the middle cerebral artery (OR, 0.473) or internal carotid artery (OR, 0.512) were negatively correlated with instability, whereas IAs with an irregular shape (OR, 2.157), greater depth (OR, 1.557), or large flow angle (FA; OR, 1.015) were more likely to be unstable. ROC analysis revealed threshold values of age, depth, and FA of 59.5 years, 4.25 mm, and 87.8°, respectively. The stability of IAs is significantly affected by several factors, including patient age and the presence of CA. IA shape and location also have an impact on the stability of IAs. An irregular shape, a greater depth, and a large FA are risk factors for a change in IAs from stable to unstable. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    International Nuclear Information System (INIS)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure
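
    As a concrete illustration of the response-versus-strength formulation above, here is a minimal Monte Carlo sketch; the distributions, the response correlation and the two-component success logic are illustrative assumptions, not data from the workshop.

```python
# Minimal Monte Carlo sketch of the seismic fragility idea: a component fails
# if its peak random response exceeds its random strength, and plant failure is
# a Boolean function of component states. All numbers are illustrative.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Lognormal peak responses of two components, correlated because they see the
# same ground motion; lognormal strengths, independent of the responses.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=n)
response = np.exp(np.log(0.4) + 0.5 * z)                              # median 0.4 g, beta = 0.5
strength = np.exp(np.log(0.9) + 0.3 * rng.standard_normal((n, 2)))    # median 0.9 g, beta = 0.3

comp_fail = response > strength                    # per-component failure indicator

# Example plant logic: the plant fails only if BOTH redundant components fail.
plant_fail = comp_fail[:, 0] & comp_fail[:, 1]
print("component failure probabilities:", comp_fail.mean(axis=0))
print("plant failure probability given the earthquake:", plant_fail.mean())
```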

  15. Risk in Enterprise Cloud Computing: Re-Evaluated

    Science.gov (United States)

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  16. Primary care physicians’ perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study

    Directory of Open Access Journals (Sweden)

    Teja Voruganti

    2015-09-01

    Background: Health risk assessment tools compute an individual’s risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians’ awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Methods: Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. Results: PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical record (EMR) system might allow health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. Conclusions: In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite differences in the particular tools a clinical practice used, there was general appreciation of the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  17. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, if used with powerful software, can enhance engineers' capabilities immensely. This paper focuses on the possibilities that the widespread distribution of personal computers has opened in piping stress analysis, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with the personal views of the author and experiences gained during the development of interactive graphic piping software for personal computers. (orig./GL)

  18. Body mass index and risk of BPH: a meta-analysis.

    Science.gov (United States)

    Wang, S; Mao, Q; Lin, Y; Wu, J; Wang, X; Zheng, X; Xie, L

    2012-09-01

    Epidemiological studies have reported conflicting results relating obesity to BPH. A meta-analysis of cohort and case-control studies was conducted to pool the risk estimates of the association between obesity and BPH. Eligible studies were retrieved by both computer searches and review of references. We analyzed the abstracted data with random effects models to obtain the summary risk estimates. Dose-response meta-analysis was performed for studies reporting categorical risk estimates for a series of exposure levels. A total of 19 studies met the inclusion criteria of the meta-analysis. A positive association with body mass index (BMI) was observed in the combined BPH and lower urinary tract symptoms (LUTS) group (odds ratio = 1.27, 95% confidence interval 1.05-1.53). In subgroup analysis, BMI exhibited a positive dose-response relationship with BPH/LUTS in population-based case-control studies, and a marginal positive association was observed between risk of BPH and increased BMI. However, no association between BPH/LUTS and BMI was observed in other subgroups stratified by study design, geographical region or primary outcome. Overall, the current literature suggests that BMI is associated with an increased risk of BPH. Further efforts should be made to confirm these findings and clarify the underlying biological mechanisms.
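
    For readers unfamiliar with how such pooled estimates are formed, here is a minimal sketch of random-effects pooling on the log odds ratio scale (DerSimonian-Laird); the odds ratios and confidence intervals below are made-up placeholders, not the 19 studies included in the meta-analysis.

```python
# DerSimonian-Laird random-effects pooling of log odds ratios, the kind of
# calculation behind a summary OR like 1.27 (1.05-1.53). Study values are
# illustrative assumptions only.

import numpy as np

# (OR, lower 95% CI, upper 95% CI) for a handful of hypothetical studies
studies = [(1.10, 0.90, 1.35), (1.40, 1.05, 1.87), (0.95, 0.70, 1.29),
           (1.30, 1.02, 1.66), (1.20, 0.88, 1.64)]

log_or = np.log([s[0] for s in studies])
# Standard errors recovered from the CI width on the log scale.
se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
w_fixed = 1 / se**2

# DerSimonian-Laird estimate of the between-study variance tau^2.
q = np.sum(w_fixed * (log_or - np.sum(w_fixed * log_or) / np.sum(w_fixed))**2)
df = len(studies) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1 / (se**2 + tau2)
pooled = np.sum(w_random * log_or) / np.sum(w_random)
pooled_se = np.sqrt(1 / np.sum(w_random))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print("pooled OR %.2f (95%% CI %.2f-%.2f), tau^2 = %.3f" % (np.exp(pooled), ci[0], ci[1], tau2))
```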

  19. Reinforcement of qualitative risk assessment proposals from computer science

    International Nuclear Information System (INIS)

    Friedlhuber, T.; Hibti, M.; Rauzy, A.

    2013-01-01

    During the last decade, much research has been devoted to evaluating concepts and methods of quantitative risk assessment in order to predict hazards more precisely. Nevertheless, the occurrence of new catastrophes like the Indonesian tsunami in 2004, the Deepwater Horizon accident in 2010, or more recently the Fukushima accidents in 2011 raises the question of whether we underestimate some natural limits of quantitative risk assessment or even mistake its significance. Especially in the case of very unlikely events, combined with uncertainty and severe consequences, we might do better to concentrate more on understanding risk than on calculating probability values. In this paper we apply progress made in the field of computer science to tools and modelling concepts used in risk assessment. Regarding computer science, we point out new concepts that may improve the quality of risk models and the process of model engineering. The goal is to reinforce the importance of qualitative risk assessment with the help of sophisticated tools and modelling. Qualitative risk assessment aims to understand risk and therefore reflects the initial idea of risk assessment. Risk understanding requires understanding systems and the relations of their components. It is fundamental to comprehend the meaning of components in fault and event trees, to retrace all applied modifications, and to highlight critical aspects. It is also important how PSA models are visualized, documented and navigated, how results are presented, and how model maintenance, integration and version control are performed. In addition, the conjoint usage of different types of models (for example, PSA models together with event sequence diagrams) can contribute to quality assurance. We present new concepts for various kinds of problems. (author)

  20. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  1. Algorithms for the Computation of Debris Risk

    Science.gov (United States)

    Matney, Mark J.

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of satellites. A number of tools have been developed in NASA’s Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA’s Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper presents an introduction to these algorithms and the assumptions upon which they are based.
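
    The NASA algorithms themselves are not reproduced in this listing; the following is only a generic back-of-the-envelope sketch of how a debris flux is commonly turned into a collision probability, with all numerical values assumed for illustration.

```python
# Generic sketch: expected impacts = flux x cross-sectional area x exposure
# time, and the probability of at least one impact follows from a Poisson
# assumption. Not ORDEM; flux, area and mission length are illustrative.

import math

def collision_probability(flux_per_m2_yr, cross_section_m2, years):
    """Probability of >= 1 impact for a constant flux on a fixed exposed area."""
    expected_hits = flux_per_m2_yr * cross_section_m2 * years
    return 1.0 - math.exp(-expected_hits)

# Illustrative numbers only: 1e-5 impacts per m^2 per year for centimetre-class
# debris, a 20 m^2 spacecraft cross-section, a 10-year mission.
p = collision_probability(1e-5, 20.0, 10.0)
print(f"probability of at least one impact: {p:.4f}")
```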

  2. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681, May 2016, US Army Research Laboratory. Batch Computed Tomography Analysis of Projectiles, by Michael C Golt and Matthew S Bratcher, Weapons and Materials Research. The report provides values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles.

  3. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  4. [Contrast-induced nephropathy in patients at risk of renal failure undergoing computed tomography: systematic review and meta-analysis of randomized controlled trials].

    Science.gov (United States)

    Arana, Estanislao; Catalá-López, Ferrán

    2010-09-11

    We evaluated and quantified, using meta-analysis techniques, the incidence of contrast-induced nephropathy (CIN) in at-risk patients undergoing computed tomography (CT). We conducted a systematic review of randomized controlled clinical trials designed to evaluate the nephrotoxicity of iso-osmolar contrast media (IOCM) compared with low-osmolar contrast media (LOCM). The main electronic databases searched were PubMed/MEDLINE, EMBASE, ISI Web of Knowledge and the Virtual Health Library (BVS-BIREME), as well as abstracts presented at meetings of related scientific societies. Prior to data extraction, definitions of nephrotoxicity and of the at-risk population were established. In addition to the meta-analysis, the global agreement between CIN definitions was evaluated with the Mantel-Haenszel stratified test. Five studies with 716 randomized patients were included. When CIN was defined as an increase in serum creatinine (SCr) of ≥25%, the relative risk (RR) was 0.71 (95% CI: 0.40-1.26), in favor of IOCM; when it was defined as an SCr increase of ≥0.5 mg/dL, the RR was 1.48 (95% CI: 0.37-5.87), favoring LOCM, in the four studies that used this criterion. The Mantel-Haenszel stratified test gave chi-squared = 2.51 (p = 0.8). In patients with renal failure undergoing CT, the risk of CIN is similar with any of the contrast media studied. CIN incidence depends on the chosen criterion and is lower with the definition of an SCr increase of ≥0.5 mg/dL at 24-72 h. No agreement was found between the CIN definitions adopted. Copyright © 2009 Elsevier España, S.L. All rights reserved.

  5. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

    Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented on ISTSS 2016. In general, a metamodel quickly interpolates the consequences of few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...
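
    The authors' full projection array-based design and prediction intervals are not reproduced here; the following is a minimal one-dimensional sketch of the moving least squares idea at the core of such a metamodel, in which a few expensive "complex model" runs are interpolated to arbitrary scenarios. The fire scenarios, kernel width and linear basis are illustrative assumptions.

```python
# Moving least squares in 1-D: fit a locally weighted linear model around each
# query point using a handful of pre-computed "complex model" results.
# All numbers below are illustrative placeholders, not CFD output.

import numpy as np

# Pretend these are consequences (e.g. affected tunnel users) from a few CFD
# runs at different fire heat-release rates (MW).
hrr = np.array([5.0, 20.0, 50.0, 100.0, 200.0])
consequence = np.array([0.1, 0.8, 2.5, 6.0, 15.0])

def mls_predict(x_query, x_data, y_data, bandwidth=40.0):
    """Moving least squares: locally weighted linear fit around x_query."""
    w = np.exp(-((x_data - x_query) / bandwidth) ** 2)   # Gaussian weights
    A = np.column_stack([np.ones_like(x_data), x_data])  # [1, x] basis
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y_data * sw, rcond=None)
    return beta[0] + beta[1] * x_query

for q in (10.0, 75.0, 150.0):
    print(f"HRR {q:5.1f} MW -> interpolated consequence {mls_predict(q, hrr, consequence):5.2f}")
```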

  6. SCDAP: a light water reactor computer code for severe core damage analysis

    International Nuclear Information System (INIS)

    Marino, G.P.; Allison, C.M.; Majumdar, D.

    1982-01-01

    Development of the first code version (MODO) of the Severe Core Damage Analysis Package (SCDAP) computer code is described, and calculations made with SCDAP/MODO are presented. The objective of this computer code development program is to develop a capability for analyzing severe disruption of a light water reactor core, including fuel and cladding liquefaction, flow, and freezing; fission product release; hydrogen generation; quenched-induced fragmentation; coolability of the resulting geometry; and ultimately vessel failure due to vessel-melt interaction. SCDAP will be used to identify the phenomena which control core behavior during a severe accident, to help quantify uncertainties in risk assessment analysis, and to support planning and evaluation of severe fuel damage experiments and data. SCDAP/MODO addresses the behavior of a single fuel bundle. Future versions will be developed with capabilities for core-wide and vessel-melt interaction analysis

  7. ProRisk : risk analysis instrument : developed for William properties

    NARCIS (Netherlands)

    van Doorn, W.H.W.; Egeberg, Ingrid; Hendrickx, Kristoff; Kahramaner, Y.; Masseur, B.; Waijers, Koen; Weglicka, K.A.

    2005-01-01

    This report presents a Risk Analysis Instrument developed for William Properties. Based on the analysis, it appears that the practice of Risk Analysis exists within the organization, yet rather implicit. The Risk Analysis Instrument comes with a package of four components: an activity diagram, a

  8. Algorithms for the Computation of Debris Risks

    Science.gov (United States)

    Matney, Mark

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of non-spherical satellites. A number of tools have been developed in NASA's Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA's Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper will present an introduction to these algorithms and the assumptions upon which they are based.

  9. Computed tomography in children: multicenter cohort study design for the evaluation of cancer risk

    International Nuclear Information System (INIS)

    Krille, L.; Jahnen, A.; Mildenberger, P.; Schneider, K.; Weisser, G.; Zeeb, H.; Blettner, M.

    2011-01-01

    Exposure to ionizing radiation is a known risk factor for cancer, and cancer risk is highest after exposure in childhood. Computed tomography is the major contributor to the average individual radiation exposure. Until now, the association has been addressed only through statistical modeling. We present the first feasible study design on childhood cancer risk after exposure to computed tomography.

  10. Analysis of health impact inputs to the US Department of Energy's risk information system

    Energy Technology Data Exchange (ETDEWEB)

    Droppo, J.G. Jr.; Buck, J.W.; Strenge, D.L.; Siegel, M.R.

    1990-08-01

    The US Department of Energy (DOE) is in the process of completing a survey of environmental problems, referred to as the Environmental Survey, at their facilities across the country. The DOE Risk Information System (RIS) is being used to prioritize these environmental problems identified in the Environmental Survey's findings. This report contains a discussion of site-specific public health risk parameters and the rationale for their inclusion in the RIS. These parameters are based on computed potential impacts obtained with the Multimedia Environmental Pollutant Assessment System (MEPAS). MEPAS is a computer-based methodology for evaluating the potential exposures resulting from multimedia environmental transport of hazardous materials. This report has three related objectives: document the role of MEPAS in the RIS framework, report the results of the analysis of alternative risk parameters that led to the current RIS risk parameters, and describe analysis of uncertainties in the risk-related parameters. 20 refs., 17 figs., 10 tabs.

  11. Observations on risk analysis

    International Nuclear Information System (INIS)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested

  12. Smoking and the risk of a stroke. [Computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Bell, B A [Royal Infirmary, Edinburgh (UK). Dept. of Surgical Neurology]; Ambrose, J [Atkinson Morley's Hospital, London (UK). Dept. of Neurosurgery]

    1982-01-01

    A retrospective study has been undertaken of 236 men and women with a stroke investigated by computed tomography and, where indicated, cerebral angiography. An excess of cigarette smokers has been found where the stroke was the result of ischaemia. Results indicate that continued smoking increases the risk of sustaining cerebral infarction by a factor of 1.9 for men and 2.4 for women. Smoking does not appear to be a risk factor in primary intracerebral haemorrhage, unlike subarachnoid haemorrhage where smokers carry a relative risk approaching four times that of non-smokers.

  13. A New Computationally Frugal Method For Sensitivity Analysis Of Environmental Models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A.; Teuling, R.; Borgonovo, E.; Uijlenhoet, R.

    2013-12-01

    Effective and efficient parameter sensitivity analysis methods are crucial to understand the behaviour of complex environmental models and use of models in risk assessment. This paper proposes a new computationally frugal method for analyzing parameter sensitivity: the Distributed Evaluation of Local Sensitivity Analysis (DELSA). The DELSA method can be considered a hybrid of local and global methods, and focuses explicitly on multiscale evaluation of parameter sensitivity across the parameter space. Results of the DELSA method are compared with the popular global, variance-based Sobol' method and the delta method. We assess the parameter sensitivity of both (1) a simple non-linear reservoir model with only two parameters, and (2) five different "bucket-style" hydrologic models applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both the synthetic and real-world examples, the global Sobol' method and the DELSA method provide similar sensitivities, with the DELSA method providing more detailed insight at much lower computational cost. The ability to understand how sensitivity measures vary through parameter space with modest computational requirements provides exciting new opportunities.
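
    The published DELSA implementation is not reproduced here; the sketch below works in its spirit: local derivatives are computed by finite differences at many points across the parameter space and converted into first-order variance fractions. The toy model and parameter ranges are illustrative assumptions.

```python
# DELSA-style local sensitivity: at many sampled points, compute first-order
# variance fractions S_j = (dy/dp_j)^2 var(p_j) / sum_k (dy/dp_k)^2 var(p_k).
# Model and bounds are placeholders, not one of the paper's hydrologic models.

import numpy as np

def model(p):
    """Toy nonlinear 'reservoir-like' model with two parameters."""
    k, s_max = p
    return s_max * (1.0 - np.exp(-k)) + 0.1 * k * s_max

bounds = np.array([[0.1, 2.0],      # k
                   [10.0, 100.0]])  # s_max
prior_var = (bounds[:, 1] - bounds[:, 0]) ** 2 / 12.0   # uniform-prior variances

rng = np.random.default_rng(1)
samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(200, 2))

def delsa_indices(p, h=1e-4):
    grad = np.empty(len(p))
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h * (bounds[j, 1] - bounds[j, 0])        # central finite difference
        grad[j] = (model(p + dp) - model(p - dp)) / (2 * dp[j])
    contrib = grad**2 * prior_var
    return contrib / contrib.sum()

indices = np.array([delsa_indices(p) for p in samples])
print("median first-order sensitivity of k, s_max:", np.round(np.median(indices, axis=0), 3))
```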

  14. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computer hardware that influences the choice of programming language and of the methods of analysis and compilation. In the first part, the author proposes attempts at a formalization of Chomsky grammar languages. In the second part, he studies analytical grammars, and then studies a compiler, or analytic grammar, for the Fortran language.

  15. Risks, costs and benefits analysis for exhumation of buried radioactive materials at a nuclear fuel fabrication facility

    International Nuclear Information System (INIS)

    Kirk, J.S.; Moore, R.A.; Huston, T.E.

    1996-01-01

    A Risks, Costs and Benefits analysis provides a tool for selecting a cost-effective remedial action alternative. This analysis can help avoid transferring risks to other populations and can objectively measure the benefits of a specific remedial action project. This paper describes the methods and results of a Risks, Costs and Benefits analysis performed at a nuclear fuel fabrication facility. The analysis examined exhuming and transporting radioactive waste to an offsite disposal facility. Risks evaluated for the remedial action project were divided into two categories: risks posed to the worker and risks posed to public health. Risks to workers included exposure to radioactive contaminants during excavation and packaging of waste materials and the use of heavy machinery. Potential public health risks included exposure to radioactive materials during transport from the exhumation site to the disposal facility. Methods included use of site-specific and published data, and existing computer models. Occupational risks were quantified using data from similar onsite remedial action projects. Computer modeling was used to evaluate public health risks from transporting radioactive materials; the consequences or probability of traffic accidents; and radiation exposure to potential inhabitants occupying the site considering various land use scenarios. A costs analysis was based on data obtained from similar onsite remedial action projects. Scenarios used to identify benefits resulting from the remedial action project included (1) an evaluation of reduction in risks to human health; (2) cost reductions associated with the unrestricted release of the property; and (3) benefits identified by evaluating regulatory mandates applicable to decommissioning. This paper will provide an overview of the methods used and a discussion of the results of a Risks, Costs and Benefits analysis for a site-specific remedial action scenario

  16. Computer/Mobile Device Screen Time of Children and Their Eye Care Behavior: The Roles of Risk Perception and Parenting.

    Science.gov (United States)

    Chang, Fong-Ching; Chiu, Chiung-Hui; Chen, Ping-Hung; Miao, Nae-Fang; Chiang, Jeng-Tung; Chuang, Hung-Yi

    2018-03-01

    This study assessed the computer/mobile device screen time and eye care behavior of children and examined the roles of risk perception and parental practices. Data were obtained from a sample of 2,454 child-parent dyads recruited from 30 primary schools in Taipei city and New Taipei city, Taiwan, in 2016. Self-administered questionnaires were collected from students and parents. Fifth-grade students spend more time on new media (computer/smartphone/tablet: 16 hours a week) than on traditional media (television: 10 hours a week). The average daily screen time (3.5 hours) for these children exceeded the American Academy of Pediatrics recommendations (≤2 hours). Multivariate analysis results showed that after controlling for demographic factors, the parents with higher levels of risk perception and parental efficacy were more likely to mediate their child's eye care behavior. Children who reported lower academic performance, who were from non-intact families, reported lower levels of risk perception of mobile device use, had parents who spent more time using computers and mobile devices, and had lower levels of parental mediation were more likely to spend more time using computers and mobile devices; whereas children who reported higher academic performance, higher levels of risk perception, and higher levels of parental mediation were more likely to engage in higher levels of eye care behavior. Risk perception by children and parental practices are associated with the amount of screen time that children regularly engage in and their level of eye care behavior.

  17. Computer-aided reliability and risk assessment

    International Nuclear Information System (INIS)

    Leicht, R.; Wingender, H.J.

    1989-01-01

    Activities in the fields of reliability and risk analyses have led to the development of particular software tools which now are combined in the PC-based integrated CARARA system. The options available in this system cover a wide range of reliability-oriented tasks, like organizing raw failure data in the component/event data bank FDB, performing statistical analysis of those data with the program FDA, managing the resulting parameters in the reliability data bank RDB, and performing fault tree analysis with the fault tree code FTL or evaluating the risk of toxic or radioactive material release with the STAR code. (orig.)

  18. Risk-benefit analysis of 18FDG PET cancer screening

    International Nuclear Information System (INIS)

    Murano, Takeshi; Daisaki, Hiromitsu; Terauchi, Takashi; Iinuma, Takeshi; Tateno, Yukio; Tateishi, Ukihide; Kato, Kazuaki; Inoue, Tomio

    2008-01-01

    18F-fluorodeoxyglucose (18FDG) positron emission tomography (PET) cancer screening is expected to involve a large population of examinees and is intended for healthy people. Therefore, we attempted to determine its benefit/risk ratio by estimating the risk from radiation exposure and the benefit of cancer detection. We used software embodying the method of the International Commission on Radiological Protection (ICRP) to calculate the average loss of life expectancy due to radiation exposure. We calculated the lifesaving person-years of benefit obtained by cancer detection with 18FDG PET screening, and then the benefit/risk ratio using the life-shortening and lifesaving person-years. By age, the benefit/risk ratio exceeded 1 at 35-39 years old for males and 30-34 years old for females; 18FDG PET cancer screening is also effective for examinees older than this. A risk-benefit analysis of 18FDG-PET/computed tomography (CT) cancer screening will be necessary in the future. (author)

  19. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, with a view to risk analysis. Three of the six codes are described, presenting their purpose, input description, calculation methods, and the results obtained with each of them. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cut sets and the point values of the unreliability and unavailability of the system; and STREUSL, for the calculation of the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a rather slow method to obtain the minimal cut sets on the HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt
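
    TREBIL and CRESSEX themselves are not reproduced here; the following is a small top-down (MOCUS-style) sketch of the kind of computation they perform, expanding a fault tree into cut sets and reducing them to the minimal ones. The example tree and basic-event names are illustrative.

```python
# Top-down expansion of a fault tree into minimal cut sets.
# Gates map to ("AND"/"OR", [children]); anything not in `gates` is a basic event.

from itertools import product

gates = {
    "TOP": ("OR",  ["G1", "E3"]),
    "G1":  ("AND", ["E1", "G2"]),
    "G2":  ("OR",  ["E2", "E3"]),
}

def cut_sets(node):
    """Return the list of cut sets (frozensets of basic events) for a node."""
    if node not in gates:                      # basic event
        return [frozenset([node])]
    kind, children = gates[node]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                           # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND: every combination of one cut set per child, merged
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(cut_set_list):
    """Drop any cut set that is a superset of another (minimality reduction)."""
    unique = set(cut_set_list)
    return [cs for cs in unique if not any(other < cs for other in unique)]

for cs in sorted(minimal(cut_sets("TOP")), key=len):
    print(sorted(cs))
# -> ['E3'] and ['E1', 'E2']; the set {E1, E3} is absorbed by the single-event cut set {E3}.
```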

  20. Towards Cloud Computing SLA Risk Management: Issues and Challenges

    OpenAIRE

    Morin, Jean-Henry; Aubert, Jocelyn; Gateau, Benjamin

    2012-01-01

    Cloud Computing has become mainstream technology offering a commoditized approach to software, platform and infrastructure as a service over the Internet on a global scale. This raises important new security issues beyond traditional perimeter based approaches. This paper attempts to identify these issues and their corresponding challenges, proposing to use risk and Service Level Agreement (SLA) management as the basis for a service level framework to improve governance, risk and compliance i...

  1. Unsharpness-risk analysis

    International Nuclear Information System (INIS)

    Preyssl, C.

    1986-01-01

    Safety analysis provides the only tool for evaluation and quantification of rare or hypothetical events leading to system failure. So far, probability theory has been used for the fault- and event-tree methodology. The phenomenon of uncertainty constitutes an important aspect of risk analysis. Uncertainties can be classified as originating from 'randomness' or 'fuzziness'. Probability theory addresses randomness only. The use of 'fuzzy set theory' makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements. 'Human failure' and 'conditionality' can be treated correctly. Only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty as an implicit part of 'fuzzy risk' can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees consideration of the importance attached to the different parts of the risk that exceed or comply with the standard. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of 'fuzzy risk analysis' has to be free of any hypostatization when reducing subjective to objective information. (Author)
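
    A minimal sketch of the min-max combination mentioned above is given below: each basic event carries a possibility of failure in [0, 1], an AND gate propagates the minimum of its inputs and an OR gate the maximum. The example tree and possibility values are illustrative assumptions, not the TRIGA case study.

```python
# Possibilistic fault-tree evaluation with min (AND) and max (OR) combination.
# Event names and possibility values are made up for illustration.

possibility = {"pump_fails": 0.7, "valve_sticks": 0.3, "operator_error": 0.5}

gates = {
    "cooling_lost": ("AND", ["pump_fails", "valve_sticks"]),
    "TOP":          ("OR",  ["cooling_lost", "operator_error"]),
}

def poss(node):
    if node in possibility:                    # basic event
        return possibility[node]
    kind, children = gates[node]
    values = [poss(c) for c in children]
    return min(values) if kind == "AND" else max(values)

print("possibility of top event:", poss("TOP"))   # max(min(0.7, 0.3), 0.5) = 0.5
```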

  2. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    The usage of the computer code MLCOSP (Multiple Correlation and Spectrum), developed for a hybrid computer installed at JAERI, is described. Functions of the hybrid computer and its terminal devices are used ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data, and to keep the analysis in perspective. The features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed as figures, and hardcopies are taken when necessary; messages from the code are shown on the terminal, so man-machine communication is possible; and, further, data can be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)

  3. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge funds are considered one of the portfolio management sectors that has grown fastest over the past decade. Optimal hedge fund management requires appropriate risk metrics. Classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of hedge fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
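
    As a concrete illustration of the distinction between good and bad returns mentioned above, here is a minimal sketch of two common downside-risk quantities; the return series and the 0% monthly target are illustrative assumptions, not the index data used in the paper.

```python
# Downside deviation and a Sortino-style ratio: returns below the investor's
# target contribute to the risk measure, returns above it do not.

import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0.006, 0.02, 120)   # 10 years of synthetic monthly returns
target = 0.0                              # investor's minimum acceptable return

shortfall = np.minimum(returns - target, 0.0)          # only "bad" returns
downside_deviation = np.sqrt(np.mean(shortfall**2))
sortino = (returns.mean() - target) / downside_deviation

print(f"mean return {returns.mean():.4f}, downside deviation {downside_deviation:.4f}")
print(f"Sortino-style ratio: {sortino:.2f}")
```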

  4. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

    Probabilistic risk analysis (PRA) methods have proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five objective reliability optimisation are presented, on the basis of which rational decision making can be explored.

  5. The first step toward diagnosing female genital schistosomiasis by computer image analysis

    DEFF Research Database (Denmark)

    Holmen, Sigve Dhondup; Kleppa, Elisabeth; Lillebø, Kristine

    2015-01-01

    Schistosoma haematobium causes female genital schistosomiasis (FGS), which is a poverty-related disease in sub-Saharan Africa. Furthermore, it is co-endemic with human immunodeficiency virus (HIV), and biopsies from genital lesions may expose the individual to increased risk of HIV infection...... statistics, we estimate that the computer color analysis yields a sensitivity of 80.5% and a specificity of 66.2% for the diagnosis of FGS....

  6. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  7. Risk analysis for CHP decision making within the conditions of an open electricity market

    International Nuclear Information System (INIS)

    Al-Mansour, Fouad; Kozuh, Mitja

    2007-01-01

    Decision making under uncertainty is a difficult task in most areas. Investment decisions for combined heat and power production (CHP) are certainly one of the areas where it is difficult to find an optimal solution since the payback period is several years and parameters change due to different perturbing factors of economic and mostly political nature. CHP is one of the most effective measures for saving primary energy and reduction of greenhouse gas emissions. The implementation of EU directives on the promotion of cogeneration based on useful heat demand in the internal energy market will accelerate CHP installation. The expected number of small CHP installations will be very high in the near future. A quick, reliable and simple tool for economic evaluation of small CHP systems is required. Since evaluation is normally made by sophisticated economic computer models which are rather expensive, a simple point estimate economic model was developed which was later upgraded by risk methodology to give more informative results for better decision making. This paper presents a reliable computer model entitled 'Computer program for economic evaluation analysis of CHP' as a tool for analysis and economic evaluation of small CHP systems with the aim of helping the decision maker. The paper describes two methods for calculation of the sensitivity of the economic results to changes of input parameters and the uncertainty of the results: the classic/static method and the risk method. The computer program uses risk methodology by applying RISK software on an existing conventional economic model. The use of risk methodology for economic evaluation can improve decisions by incorporating all possible information (knowledge), which cannot be done in the conventional economic model due to its limitations. The methodology was tested on the case of a CHP used in a smaller hospital

  8. Efficient computation of exposure profiles for counterparty credit risk

    NARCIS (Netherlands)

    C.S.L. de Graaf (Kees); Q. Feng (Qian); B.D. Kandhai; C.W. Oosterlee (Cornelis)

    2014-01-01

    Three computational techniques for approximation of counterparty exposure for financial derivatives are presented. The exposure can be used to quantify so-called Credit Valuation Adjustment (CVA) and Potential Future Exposure (PFE), which are of utmost importance for modern risk
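
    The paper's own techniques are not reproduced in this listing; the sketch below shows only the plain Monte Carlo baseline that exposure profiles are usually defined against: simulate the underlying, revalue a simple contract at each future date, and take the mean positive value (EE) or a high quantile (PFE). The model and contract parameters are illustrative assumptions.

```python
# Expected Exposure (EE) and Potential Future Exposure (PFE) for a long forward
# under geometric Brownian motion, by brute-force Monte Carlo.

import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, T = 20_000, 50, 5.0
dt = T / n_steps
s0, r, sigma, strike = 100.0, 0.02, 0.25, 100.0

# Geometric Brownian motion paths for the underlying.
z = rng.standard_normal((n_paths, n_steps))
log_s = np.log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
s = np.exp(log_s)

times = dt * np.arange(1, n_steps + 1)
# Value of a long forward at each date: S_t minus the strike discounted from maturity T.
value = s - strike * np.exp(-r * (T - times))

exposure = np.maximum(value, 0.0)             # only positive values are at risk
ee = exposure.mean(axis=0)                    # expected exposure profile
pfe_97_5 = np.quantile(exposure, 0.975, axis=0)
print("EE at 1y / 5y:", round(ee[9], 2), "/", round(ee[-1], 2))
print("97.5% PFE at 1y / 5y:", round(pfe_97_5[9], 2), "/", round(pfe_97_5[-1], 2))
```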

  9. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Using Failure Modes and Effects Analysis (FMEA) to find potential failures in geothermal power plants. • We considered 5 major parts of geothermal power plants for risk analysis. • The Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or decrease the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile and widely demanded form of renewable energy. Since Geothermal Power Plants (GPPs) face various failures, there is a considerable need for a team engineering technique to eliminate or reduce potential failures. Because no specific published record of an FMEA applied to GPPs with common failure modes has been found, this paper considers the utilization of FMEA as a convenient technique for determining, classifying and analyzing common failures in typical GPPs. As a result, an appropriate risk scoring of the occurrence, detection and severity of failure modes is obtained, and the Risk Priority Number (RPN) is computed to detect high-potential failures. To improve the accuracy and the ability to analyze the process, XFMEA software is utilized. Moreover, 5 major parts of a GPP are studied to propose a suitable approach for developing GPPs and increasing reliability by recommending corrective actions for each failure mode.
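
    For readers unfamiliar with the RPN arithmetic underlying such an FMEA, here is a minimal sketch; the failure modes and the 1-10 scores below are illustrative assumptions, not the paper's worksheet.

```python
# Rank failure modes by Risk Priority Number RPN = Severity x Occurrence x Detection.

failure_modes = [
    # (description, severity, occurrence, detection), each scored 1-10
    ("scaling in production well",     7, 8, 4),
    ("turbine blade erosion",          8, 5, 5),
    ("H2S leakage at wellhead",        9, 3, 6),
    ("reinjection pump seal failure",  5, 6, 3),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  (S={s}, O={o}, D={d})  {desc}")
```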

  10. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  11. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  12. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed at the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation toward the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want an insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field [es

  13. Security, Privacy, Threats and Risks in Cloud Computing ― A Vital Review

    OpenAIRE

    Goyal, Sumit

    2016-01-01

    Cloud computing is a multi million dollar business. As more and more enterprises are adopting cloud services for their businesses, threat of security has become a big concern for these enterprises and cloud users. This review describes the latest threats and risks associated with cloud computing and suggests techniques for better privacy and security of data in cloud environment. Threats and risks associated with cloud service models (SaaS, PaaS and IaaS) along with cloud deployment models (p...

  14. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  15. Efficient Computation of Exposure Profiles for Counterparty Credit Risk

    NARCIS (Netherlands)

    de Graaf, C.S.L.; Feng, Q.; Kandhai, D.; Oosterlee, C.W.

    2014-01-01

    Three computational techniques for approximation of counterparty exposure for financial derivatives are presented. The exposure can be used to quantify so-called Credit Valuation Adjustment (CVA) and Potential Future Exposure (PFE), which are of utmost importance for modern risk management in the

  16. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    Science.gov (United States)

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of the pen-and-paper-based observational techniques currently available for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. The review assesses the risk factors covered, and the reliability and validity, of pen-and-paper observational methods associated with computer work; two evaluators carried it out independently. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors addressed by current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they proved reliable and were rated moderate to good. For validity, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and the office environment, at office workstations and in computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment techniques, not all of the existing observational methods have been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  17. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
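
    The paper's exact model is not reproduced here; the sketch below is a generic SIR-style system with the same ingredients the abstract names: computers joining the network, computers leaving it, infection by contact and recovery through antivirus. Parameter values are illustrative assumptions.

```python
# SIR-type computer virus model with node arrival (b) and removal (mu),
# integrated numerically to illustrate the equilibrium/threshold behaviour.

import numpy as np
from scipy.integrate import odeint

def virus_model(y, t, b, mu, beta, gamma):
    s, i, r = y
    n = s + i + r
    ds = b - beta * s * i / n - mu * s          # new computers join susceptible pool
    di = beta * s * i / n - gamma * i - mu * i  # infection minus cleanup and removal
    dr = gamma * i - mu * r                     # antivirus recovery
    return [ds, di, dr]

b, mu, beta, gamma = 2.0, 0.002, 0.3, 0.1       # per-day rates (illustrative)
y0 = [995.0, 5.0, 0.0]                          # susceptible, infected, recovered
t = np.linspace(0.0, 200.0, 400)
s, i, r = odeint(virus_model, y0, t, args=(b, mu, beta, gamma)).T

r0 = beta / (gamma + mu)                        # basic reproduction number for this model
print(f"R0 = {r0:.2f}; infected peak = {i.max():.0f} machines at day {t[i.argmax()]:.0f}")
```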

  18. A background risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where a point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  19. An open-source textbook for teaching climate-related risk analysis using the R computing environment

    Science.gov (United States)

    Applegate, P. J.; Keller, K.

    2015-12-01

    Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.

  20. Risk-based decision analysis for the 200-BP-5 groundwater operable unit. Revision 2

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1996-02-01

    This document presents data from a risk analysis that was performed on three groundwater contaminant plumes within the 200-BP-5 Operable Unit. Hypothetical exposure scenarios were assessed based on current and future plume conditions. For current conditions, hypothetical industrial groundwater scenarios were assumed. The industrial ingestion scenario, which is derived from HSRAM, does not treat the groundwater as a drinking water source, and it should not be inferred from this risk analysis that the DOE is advocating use of this groundwater for direct human ingestion. Risk was calculated at each monitoring well using the observed radionuclide concentrations in groundwater from that well. The calculated values represent total radiological incremental lifetime cancer risk. Computer models were used to analyze the flow and transport of the contaminants of concern.

  1. State-of-the-art report on accident analysis and risk analysis of reprocessing plants in European countries

    International Nuclear Information System (INIS)

    Nomura, Yasushi

    1985-12-01

    This report summarizes information obtained from America, England, France and the FRG concerning methodology, computer codes, fundamental data and calculational models for accident/risk analyses of spent fuel reprocessing plants. As a result, the following points emerge. (1) The system analysis codes developed for reactor plants can be used for reprocessing plants with some code modification. (2) Calculational models and programs have been developed for accident phenomenology analyses in the FRG, but with insufficient data to validate them. (3) The release tree analysis codes developed in the FRG are available to estimate the amount and probability of radioactivity release via off-gas/exhaust air lines in the case of accidents. (4) The computer codes developed in America for reactor-plant environmental transport/safety analyses of released radioactivity can be applied to reprocessing facilities. (author)

  2. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  3. A Computational Framework for Flood Risk Assessment in The Netherlands

    Directory of Open Access Journals (Sweden)

    A.A. Markus

    2010-01-01

    Full Text Available The safety of dikes in The Netherlands, located in the delta of the rivers Rhine, Meuse and Scheldt, has been the subject of debate for more than ten years. The safety (or flood risk) of a particular area may depend on the safety of other areas. This is referred to as effects of river system behaviour on flood risk (quantified as the estimated number of casualties and economic damage). A computational framework was developed to assess these effects. It consists of several components that are loosely coupled via data files and Tcl scripts to manage the individual programs and keep track of the state of the computations. The computations involved are lengthy (days or even weeks on a Linux cluster), which makes the framework currently more suitable for planning and design than for real-time operation. While the framework was constructed ad hoc, it can also be viewed more formally as a tuplespace. Realising this makes it possible to adopt the philosophy for other similar frameworks.

  4. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij

    2012-01-01

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as far as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for any decision related to risk reduction, EIA is traditionally expressed in terms of radiation exposure to the member of the public. However, the models used to estimate the exposure received by the member of the public are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (insufficient knowledge). In a mixed environment of this type, it is essential to assess the uncertainty of a model in order to estimate the bounds of the exposure to the public and to support decisions during a nuclear or radiological emergency. With this in view, a soft computing technique, evidence theory-based assessment of model parameters, is used to compute the risk or exposure to the member of the public. The exposure pathway to the member of the public considered in the aquatic food stream is the drinking of water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory expresses the uncertainty as a lower bound, the belief measure, and an upper bound of exposure, the plausibility measure. In this work EIA is presented using evidence theory. A data fusion technique is used to aggregate the knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as an interval of belief, plausibility
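
    The belief and plausibility bounds mentioned above come from evidence (Dempster-Shafer) theory, in which each piece of evidence assigns a probability mass to an interval of possible values rather than to a point. A minimal Python sketch of how such lower and upper bounds on an exceedance statement could be computed is given below; the focal intervals, masses, and the 6 Bq/L limit are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only (not the authors' code): belief/plausibility bounds
# for the event "contaminant concentration exceeds a limit", given
# interval-valued evidence in the spirit of Dempster-Shafer (evidence) theory.
# The focal intervals and masses below are hypothetical.

def belief_plausibility(focal_elements, event_lo, event_hi=float("inf")):
    """Bel and Pl of the event [event_lo, event_hi] for interval focal elements.

    focal_elements: list of ((lo, hi), mass) pairs; masses must sum to 1.
    Bel counts intervals fully inside the event; Pl counts intervals that
    merely intersect it.
    """
    bel = sum(m for (lo, hi), m in focal_elements
              if lo >= event_lo and hi <= event_hi)
    pl = sum(m for (lo, hi), m in focal_elements
             if hi >= event_lo and lo <= event_hi)
    return bel, pl

# Hypothetical evidence on drinking-water concentration (Bq/L), already
# combined into a single body of evidence.
evidence = [((0.0, 5.0), 0.4), ((3.0, 8.0), 0.35), ((6.0, 12.0), 0.25)]

bel, pl = belief_plausibility(evidence, event_lo=6.0)  # event: concentration > 6 Bq/L
print(f"Belief (lower bound)      : {bel:.2f}")
print(f"Plausibility (upper bound): {pl:.2f}")
```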

  5. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words systems analysis are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  6. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    To make optimal use of the 150 kV ion accelerator facilities and to master the analysis techniques based on the ion accelerator, research and development of low-energy PIXE technology has been carried out. R and D on the hardware of the low-energy PIXE installation at P3TM has been under way since 2000. To support the R and D of the PIXE accelerator facilities in harmony with the R and D of the PIXE hardware, PIXE analysis software must also be developed. The development of the database component of the PIXE analysis software, written in Turbo Pascal, is reported in this paper. The code computes the ionization cross-section, the fluorescence yield, and the stopping power of elements, as well as the energy-dependent X-ray attenuation coefficients. The code is named PIXEDASIS and is part of a larger PIXE analysis package planned for construction in the near future. PIXEDASIS is designed to be interactive: input is taken from the keyboard, and output is shown on the PC monitor and can also be printed. Performance tests show that PIXEDASIS operates well and produces results in agreement with data from other literature. (author)

  7. Development of a fast running accident analysis computer program for use in a simulator

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a reactor safety nuclear computer program can be modified and improved with the aim of producing a very fast running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool to build up the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well tested code is being used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  8. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
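
    The computations described above (simulating data with standardized coefficients, the Sobel test, and a bootstrap of the indirect effect) can be sketched compactly. The paper's own examples are in R; the following is an illustrative Python adaptation with arbitrary effect sizes, not the authors' code.

```python
# Illustrative Python adaptation (the paper's examples are in R): simulate a
# simple mediation model X -> M -> Y with standardized coefficients, then test
# the indirect effect a*b with the Sobel test and a percentile bootstrap.
import numpy as np

rng = np.random.default_rng(0)
n, a_true, b_true, c_prime = 200, 0.4, 0.5, 0.2   # hypothetical effect sizes

x = rng.standard_normal(n)
m = a_true * x + rng.standard_normal(n) * np.sqrt(1 - a_true**2)
y = c_prime * x + b_true * m + rng.standard_normal(n) * 0.8

def ols(X, y):
    """Return coefficients and their standard errors for y = X @ beta + e."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta, np.sqrt(np.diag(cov))

def paths(x, m, y):
    ones = np.ones_like(x)
    (_, a), (_, se_a) = ols(np.column_stack([ones, x]), m)          # M ~ X
    (_, _, b), (_, _, se_b) = ols(np.column_stack([ones, x, m]), y)  # Y ~ X + M
    return a, se_a, b, se_b

a, se_a, b, se_b = paths(x, m, y)
sobel_z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"indirect effect a*b = {a*b:.3f}, Sobel z = {sobel_z:.2f}")

# Percentile bootstrap of the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    ab = paths(x[idx], m[idx], y[idx])
    boot.append(ab[0] * ab[2])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for a*b: [{lo:.3f}, {hi:.3f}]")
```

    With small samples the bootstrap interval is generally preferred over the Sobel test because the sampling distribution of the product a*b is skewed.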

  9. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta) was a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van

  10. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  11. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared

  12. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the three last decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but now debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  13. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  14. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache – CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  15. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    Science.gov (United States)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    information and to generate knowledge in the stakeholders for improving flood risk management. In particular, FloodRisk comprises a set of calculators capable of computing human or economic losses for a collection of assets, caused by a given scenario event, explicitly covering mitigation and adaptation measures (Mancusi et al., 2015). It is important to mention that, despite the fact that some models in the literature incorporate calculator philosophies identical to the ones implemented in the FloodRisk engine, its implementation might vary significantly, owing to the need for a user-friendly and intuitive user interface, the capability of running the calculations on any platform (Windows, Mac, Linux, etc.), and the ability to promote extensibility, efficient testability, and scientific operability. FloodRisk has been designed as an initiative for implementing a standard and harmonized procedure to determine flood impacts. Albano, R.; Mancusi, L.; Sole, A.; Adamowski, J. Collaborative Strategies for Sustainable EU Flood Risk Management: FOSS and Geospatial Tools - Challenges and Opportunities for Operative Risk Analysis. ISPRS Int. J. Geo-Inf. 2015, 4, 2704-2727. Mancusi, L., Albano, R., Sole, A. FloodRisk: a QGIS plugin for flood consequences estimation, In: Geomatics Workbooks n°12 - FOSS4G Europe Como, 2015

  16. Adversarial risk analysis

    CERN Document Server

    Banks, David L; Rios Insua, David

    2015-01-01

    Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...

  17. Head multidetector computed tomography: emergency medicine physicians overestimate the pretest probability and legal risk of significant findings.

    Science.gov (United States)

    Baskerville, Jerry Ray; Herrick, John

    2012-02-01

    This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between pretest probability of a significant finding and pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there were no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The legal risk presumed was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21 patients (15%; 95% confidence interval, ±5.9%) would not have been subjected to MDCT if there were no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk vs the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated pretest probability of a significant finding on head MDCT scans and presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.
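
    As a plausibility check on the figures quoted above (assuming the ±5.9% half-width refers to a normal-approximation 95% confidence interval for a proportion of roughly 15% in a sample of 138), the half-width can be reproduced in a few lines; this is our own check, not a calculation from the paper.

```python
# Quick check (not from the paper): normal-approximation 95% CI half-width for
# a proportion, using the sample size and percentage quoted in the abstract.
import math

n, p = 138, 21 / 138          # 21 of 138 patients ~ 15%
half_width = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"p = {p:.1%}, 95% CI half-width = {half_width:.1%}")   # ~ +/- 6%
```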

  18. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  19. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally the uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, the output of this calculation approach is an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
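
    A minimal sketch of the underlying idea, not the CARA algorithm itself: sample the uncertain inputs (here only the hard-body radius and an isotropic position-uncertainty scale, with hypothetical values) and compute a Pc for each draw, so that the output is a distribution of Pc rather than a single number.

```python
# Minimal sketch (not the CARA algorithm): propagate uncertainty in the
# hard-body radius and in the covariance scale into a distribution of the
# collision probability Pc, rather than reporting a single Pc value.
# Assumes an isotropic 2-D encounter-plane covariance for simplicity.
import numpy as np
from scipy.stats import ncx2

rng = np.random.default_rng(1)
n_samples = 10_000

miss_distance = 500.0                                   # m, hypothetical
hbr = rng.normal(20.0, 3.0, n_samples).clip(min=1.0)    # uncertain hard-body radius (m)
sigma = rng.lognormal(np.log(150.0), 0.3, n_samples)    # uncertain position std (m)

# For an isotropic 2-D Gaussian, (r/sigma)^2 is noncentral chi-square with
# 2 dof and noncentrality (d/sigma)^2, so Pc = P(r < HBR) has a closed form.
pc = ncx2.cdf((hbr / sigma) ** 2, df=2, nc=(miss_distance / sigma) ** 2)

print(f"median Pc = {np.median(pc):.2e}")
print(f"95th percentile Pc = {np.percentile(pc, 95):.2e}")
```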

  20. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  1. WASTE-PRA: a computer package for probabilistic risk assessment of shallow-land burial of low-level radioactive waste

    International Nuclear Information System (INIS)

    Cox, N.D.; Atwood, C.L.

    1985-12-01

    This report is a user's manual for a package of computer programs and data files to be used for probabilistic risk assessment of shallow-land burial of low-level radioactive waste. The nuclide transport pathways modeled are an unsaturated groundwater column, an aquifer, and the atmosphere. An individual or the population receives a dose commitment through shine, inhalation, ingestion, direct exposure, and/or a puncture wound. The methodology of risk assessment is based on the response surface method of uncertainty analysis. The parameters of the model for predicting dose commitment due to a release are treated as statistical variables, in order to compute statistical distributions for various contributions to the dose commitment. The likelihood of a release is similarly treated as a statistical variable. Uncertainty distributions are obtained both for the dose commitment and for the corresponding risk. Plots and printouts are produced to aid in comparing the importance of various release scenarios and in assessing the total risk of a set of scenarios. The entire methodology is illustrated by an example. Information is included on parameter uncertainties, reference site characteristics, and probabilities of release events
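
    The parameter-uncertainty idea described above can be illustrated with a toy Monte Carlo propagation through a greatly simplified groundwater-to-drinking-water dose model; the model structure and every parameter value below are hypothetical and are not taken from WASTE-PRA.

```python
# Illustrative sketch only (not WASTE-PRA itself): treat a few transport/dose
# parameters as statistical variables and propagate them by Monte Carlo to get
# a distribution of dose commitment, then weight by a release probability to
# obtain a risk distribution. The model and parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

inventory = 1.0e9                                                    # Bq released (hypothetical)
travel_time = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)    # years to reach the well
half_life = 30.0                                                     # years (Cs-137-like, assumed)
dilution = rng.lognormal(mean=np.log(1.0e6), sigma=0.7, size=n)      # m3 of diluting water
intake = rng.normal(0.73, 0.1, size=n).clip(min=0.1)                 # m3/year of water drunk
dose_factor = 1.3e-8                                                 # Sv/Bq ingested (hypothetical)

decay = 0.5 ** (travel_time / half_life)
concentration = inventory * decay / dilution        # Bq/m3 at the well
dose = concentration * intake * dose_factor         # Sv in first year of use

p_release = 1.0e-3                                  # per-year release probability (assumed)
risk = p_release * dose

print(f"median dose = {np.median(dose):.2e} Sv, 95th pct = {np.percentile(dose, 95):.2e} Sv")
print(f"median risk-weighted dose = {np.median(risk):.2e} Sv/yr")
```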

  2. Risk assessment on the life in the analysis of UNSCEAR, BEIR and CIPR

    International Nuclear Information System (INIS)

    Hubert, P.

    1990-01-01

    During the last two years, international committees dealing with the effects of ionizing radiation have published a new series of risk estimates. New epidemiological data, in particular in the case of A-bomb survivors whose doses have been reassessed, have been used. Unfortunately, there has been a large range of alternatives, not only in dealing with epidemiological or biological modelling, but also in the actual computation of a life-long risk index which is derived from the epidemiological primary coefficients. As a consequence, published figures are never truly comparable. The identification of all computational alternatives is made here, and the quantification of their consequences is attempted. The purpose is to achieve the control of demographic assumptions (e.g. reference population) and of various conventional assumptions, in order to measure the impact of epidemiological and biological hypotheses, which are felt to be more fundamental. The analysis shows that such impacts are more important than suggested by published tables. The effects of various alternatives obviously compensate one another. Further discussions on the modelling of the biological effects of radiation would greatly benefit from the development of a standard for life-long risk computations. 6 tabs., 4 figs [fr
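
    The life-long risk index discussed above is, in its simplest form, a sum over ages of the excess cancer rate weighted by the probability of surviving to that age, which is exactly where the demographic and conventional assumptions enter. A minimal sketch with hypothetical rates (not the committees' figures) follows.

```python
# Minimal sketch (hypothetical rates, not the committees' figures): a life-long
# excess risk computed from a baseline cancer mortality rate, a survival curve
# and a constant excess relative risk per Sv, illustrating how demographic
# assumptions (the reference population's rates) enter the final index.
import numpy as np

ages = np.arange(0, 100)                      # single-year ages
all_cause = 0.0005 * np.exp(0.085 * ages)     # hypothetical all-cause mortality hazard
cancer = 0.2 * all_cause                      # hypothetical cancer share of mortality
err_per_sv = 0.5                              # excess relative risk per Sv (assumed)
dose = 0.1                                    # Sv received at age 0, latency ignored

survival = np.exp(-np.cumsum(all_cause))      # survival curve from the hazard
excess_rate = cancer * err_per_sv * dose      # added cancer mortality rate at each age
lifetime_excess_risk = np.sum(survival * excess_rate)

print(f"life-long excess cancer risk for {dose} Sv: {lifetime_excess_risk:.4f}")
```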

  3. System Analysis and Risk Assessment (SARA) system

    International Nuclear Information System (INIS)

    Krantz, E.A.; Russell, K.D.; Stewart, H.D.; Van Siclen, V.S.

    1986-01-01

    Utilization of Probabilistic Risk Assessment (PRA) related information in the day-to-day operation of plant systems has, in the past, been impracticable due to the size of the computers needed to run PRA codes. This paper discusses a microcomputer-based database system which can greatly enhance the capability of operators or regulators to incorporate PRA methodologies into their routine decision making. This system is called the System Analysis and Risk Assessment (SARA) system. SARA was developed by EG and G Idaho, Inc. at the Idaho National Engineering Laboratory to facilitate the study of frequency and consequence analyses of accident sequences from a large number of light water reactors (LWRs) in this country. This information is being amassed by several studies sponsored by the United States Nuclear Regulatory Commission (USNRC). To meet the need of portability and accessibility, and to perform the variety of calculations necessary, it was felt that a microcomputer-based system would be most suitable

  4. Risk analysis of nuclear safeguards regulations

    International Nuclear Information System (INIS)

    Al-Ayat, R.A.; Altman, W.D.; Judd, B.R.

    1982-06-01

    The Aggregated Systems Model (ASM), a probabilistic risk analysis tool for nuclear safeguards, was applied to determine the benefits and costs of proposed amendments to NRC regulations governing nuclear material control and accounting systems. The objective of the amendments was to improve the ability to detect insiders attempting to steal large quantities of special nuclear material (SNM). Insider threats range from likely events with minor consequences to unlikely events with catastrophic consequences. Moreover, establishing safeguards regulations is complicated by uncertainties in threats, safeguards performance, and consequences, and by the subjective judgments and difficult trade-offs between risks and safeguards costs. The ASM systematically incorporates these factors in a comprehensive, analytical framework. The ASM was used to evaluate the effectiveness of current safeguards and to quantify the risk of SNM theft. Various modifications designed to meet the objectives of the proposed amendments to reduce that risk were analyzed. Safeguards effectiveness was judged in terms of the probability of detecting and preventing theft, the expected time to detection, and the expected quantity of SNM diverted in a year. Data were gathered in tours and interviews at NRC-licensed facilities. The assessment at each facility was begun by carefully selecting scenarios representing the range of potential insider threats. A team of analysts and facility managers assigned probabilities for detection and prevention events in each scenario. Using the ASM, we computed the measures of system effectiveness and identified cost-effective safeguards modifications that met the objectives of the proposed amendments
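
    The calculation sketched below only illustrates how scenario-level detection probabilities can be aggregated into overall effectiveness measures of the kind listed above; it is not the ASM, and the safeguards layers and probabilities are hypothetical.

```python
# Illustrative sketch only (not the ASM): combine the detection probabilities
# assigned to successive safeguards layers in one theft scenario into overall
# measures of effectiveness. Layer names and probabilities are hypothetical.
layers = [
    ("portal monitor", 0.60),
    ("material balance accounting", 0.80),
    ("daily inventory check", 0.50),
]

p_missed = 1.0
expected_days_to_detection = 0.0
for i, (name, p_detect) in enumerate(layers, start=1):
    # probability the attempt survives all earlier layers but is caught here
    p_caught_here = p_missed * p_detect
    expected_days_to_detection += i * p_caught_here   # assume ~1 day per layer
    p_missed *= 1.0 - p_detect

p_detection = 1.0 - p_missed
print(f"overall probability of detection: {p_detection:.3f}")
print(f"expected time to detection (days, given detection): "
      f"{expected_days_to_detection / p_detection:.2f}")
```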

  5. Evidence-based ergonomics education: Promoting risk factor awareness among office computer workers.

    Science.gov (United States)

    Mani, Karthik; Provident, Ingrid; Eckel, Emily

    2016-01-01

    Work-related musculoskeletal disorders (WMSDs) related to computer work have become a serious public health concern. Literature revealed a positive association between computer use and WMSDs. The purpose of this evidence-based pilot project was to provide a series of evidence-based educational sessions on ergonomics to office computer workers to enhance the awareness of risk factors of WMSDs. Seventeen office computer workers who work for the National Board of Certification in Occupational Therapy volunteered for this project. Each participant completed a baseline and post-intervention ergonomics questionnaire and attended six educational sessions. The Rapid Office Strain Assessment and an ergonomics questionnaire were used for data collection. The post-intervention data revealed that 89% of participants were able to identify a greater number of risk factors and answer more questions correctly in knowledge tests of the ergonomics questionnaire. Pre- and post-intervention comparisons showed changes in work posture and behaviors (taking rest breaks, participating in exercise, adjusting workstation) of participants. The findings have implications for injury prevention in office settings and suggest that ergonomics education may yield positive knowledge and behavioral changes among computer workers.

  6. Computational modeling for irrigated agriculture planning. Part II: risk analysis Modelagem computacional para planejamento em agricultura irrigada: Parte II - Análise de risco

    Directory of Open Access Journals (Sweden)

    João C. F. Borges Júnior

    2008-09-01

    Full Text Available Techniques for evaluating the risks arising from the uncertainties inherent in agricultural activity should accompany planning studies. Risk analysis can be carried out by simulation, using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for running risk simulations on linear programming models, to apply it to a case study, and to test the results against the @RISK program. In the risk analysis it was observed that the mean of the output variable, total net present value (U), was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise faces a considerable risk of water shortage in April, which does not occur for the cropping pattern obtained by minimizing the irrigation requirement in April over the four years. The scenario analysis indicated that the sale price of the passion fruit crop strongly influences the financial performance of the enterprise. The comparative analysis confirmed the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the scenario considered.
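
    The approach can be sketched as follows: fix the cropping plan obtained from the linear programming model and then Monte Carlo the uncertain prices and yields to obtain the distribution of the total net present value U. All crops, areas, prices, and costs below are hypothetical stand-ins, not the case-study data.

```python
# Minimal sketch in the spirit of the paper (hypothetical numbers, not the
# P-RISCO case study): fix the cropping areas obtained from a linear
# programming model, then Monte Carlo the uncertain sale prices and yields to
# obtain the distribution of the total net present value U.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

areas = {"passion fruit": 8.0, "papaya": 5.0}           # ha, from the LP solution (assumed)
yields = {"passion fruit": 20.0, "papaya": 40.0}         # t/ha, mean values (assumed)
costs = {"passion fruit": 9_000.0, "papaya": 11_000.0}   # $/ha/yr (assumed)
price_mean = {"passion fruit": 650.0, "papaya": 350.0}   # $/t (assumed)
rate, years = 0.10, 4

u = np.zeros(n)
for crop, area in areas.items():
    price = rng.triangular(0.6 * price_mean[crop], price_mean[crop],
                           1.2 * price_mean[crop], size=(n, years))
    yield_ = rng.normal(yields[crop], 0.15 * yields[crop], size=(n, years))
    cash_flow = area * (yield_ * price - costs[crop])
    discount = (1.0 + rate) ** -np.arange(1, years + 1)
    u += cash_flow @ discount

print(f"mean U = {u.mean():,.0f},  P(U < 0) = {(u < 0).mean():.2%}")
print(f"5th-95th percentile of U: {np.percentile(u, 5):,.0f} to {np.percentile(u, 95):,.0f}")
```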

  7. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de

  8. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain indications of, and boundary conditions for, computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data to be manipulated. The main tool for integrating CAD subsystems in plant engineering should be a central database, described here by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphics system for manipulating net-like structured data, usable for various subtasks, should be the basis for computer aided plant engineering. (orig.) [de

  9. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  10. Non-Therapeutic Risk Factors for Onset of Tardive Dyskinesia in Schizophrenia : A Meta-Analysis

    NARCIS (Netherlands)

    Tenback, Diederik E.; van Harten, Peter N.; van Os, Jim

    2009-01-01

    A meta-analysis of prospective studies with schizophrenia patients was conducted to examine whether evidence exists for risk factors for the emergence of Tardive Dyskinesia (TD) in schizophrenia. A computer-assisted Medline/PubMed and Embase search was conducted in January 2008 for the years

  11. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  12. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  13. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  14. Television, computer, and video viewing; physical activity; and upper limb fracture risk in children: a population-based case control study.

    Science.gov (United States)

    Ma, Deqiong; Jones, Graeme

    2003-11-01

    The effect of physical activity on upper limb fractures was examined in this population-based case control study with 321 age- and gender-matched pairs. Sports participation increased fracture risk in boys and decreased risk in girls. Television viewing had a deleterious dose-response association with wrist and forearm fractures while light physical activity was protective. The aim of this population-based case control study was to examine the association between television, computer, and video viewing; types and levels of physical activity; and upper limb fractures in children 9-16 years of age. A total of 321 fracture cases and 321 randomly selected individually matched controls were studied. Television, computer, and video viewing and types and levels of physical activity were determined by interview-administered questionnaire. Bone strength was assessed by DXA and metacarpal morphometry. In general, sports participation increased total upper limb fracture risk in boys and decreased risk in girls. Gender-specific risk estimates were significantly different for total, contact, noncontact, and high-risk sports participation as well as four individual sports (soccer, cricket, surfing, and swimming). In multivariate analysis, time spent on television, computer, and video viewing was, in both sexes, positively associated with wrist and forearm fracture risk (OR 1.6/category, 95% CI: 1.1-2.2), whereas days involved in light physical activity participation decreased fracture risk (OR 0.8/category, 95% CI: 0.7-1.0). Sports participation increased hand (OR 1.5/sport, 95% CI: 1.1-2.0) and upper arm (OR 29.8/sport, 95% CI: 1.7-535) fracture risk in boys only and decreased wrist and forearm fracture risk in girls only (OR 0.5/sport, 95% CI: 0.3-0.9). Adjustment for bone density and metacarpal morphometry did not alter these associations. There is gender discordance with regard to sports participation and fracture risk in children, which may reflect different approaches to sport

  15. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
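
    A minimal sketch of the model under study, with hypothetical grouped cohort data rather than the nuclear-worker data: fit rate = lambda0 * (1 + beta * dose) by Poisson maximum likelihood and form the likelihood ratio statistic for H0: beta = 0.

```python
# Illustrative sketch only (not the report's code): fit the linear relative
# risk model  rate = lambda0 * (1 + beta * dose)  to grouped cohort data by
# Poisson maximum likelihood and compute the likelihood ratio statistic for
# H0: beta = 0. Doses, person-years and case counts below are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

dose = np.array([0.0, 0.05, 0.1, 0.2, 0.5])          # Sv, group mean doses
pyr = np.array([8000., 5000., 3000., 1500., 500.])   # person-years
cases = np.array([40, 27, 17, 10, 5])                # observed cases

def neg_loglik(beta, lam0):
    rate = lam0 * (1.0 + beta * dose)
    if np.any(rate <= 0):
        return np.inf
    mu = rate * pyr
    return -np.sum(cases * np.log(mu) - mu)          # Poisson log-likelihood (no factorial)

def profile(beta):
    # For fixed beta, the MLE of lambda0 has a closed form.
    lam0 = cases.sum() / np.sum(pyr * (1.0 + beta * dose))
    return neg_loglik(beta, lam0)

lower = -1.0 / dose.max() + 1e-6                     # constraint: all rates must stay positive
fit = minimize_scalar(profile, bounds=(lower, 50.0), method="bounded")
beta_hat = fit.x
lr_stat = 2.0 * (profile(0.0) - profile(beta_hat))
p_value = chi2.sf(lr_stat, df=1)
print(f"beta_hat = {beta_hat:.2f} per Sv, LR statistic = {lr_stat:.2f}, p = {p_value:.3f}")
```

    As the report cautions, with small case counts and highly skewed dose distributions the chi-square approximation used for the p-value here can be inadequate, particularly for upper confidence limits when the likelihood is maximized at the lower constraint.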

  16. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This 4-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plants. (BP)

  17. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  18. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN for the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment
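
    The response-surface step can be illustrated as follows, with a cheap analytic stand-in playing the role of the expensive fire-growth code (COMPBRN is not reproduced here, and all numbers are hypothetical): fit a quadratic surface to a small design of code runs, then propagate input uncertainty through the surrogate by Monte Carlo.

```python
# Minimal sketch of the response-surface idea (a stand-in analytic function
# plays the role of the expensive fire-growth code; all numbers hypothetical):
# fit a quadratic surface to a small number of "code runs", then propagate
# input uncertainty through the cheap surrogate by Monte Carlo.
import numpy as np

def fire_code(q, v):
    """Stand-in for an expensive code: hot-gas-layer temperature (C) from
    heat release rate q (kW) and a ventilation factor v."""
    return 20.0 + 6.85 * (q**2 / v) ** (1.0 / 3.0)

# Small factorial design of "code runs"
q_pts, v_pts = np.meshgrid(np.linspace(100, 1000, 5), np.linspace(0.5, 3.0, 5))
q_pts, v_pts = q_pts.ravel(), v_pts.ravel()
t_pts = fire_code(q_pts, v_pts)

def features(q, v):                                     # quadratic response surface
    return np.column_stack([np.ones_like(q), q, v, q * v, q**2, v**2])

coef, *_ = np.linalg.lstsq(features(q_pts, v_pts), t_pts, rcond=None)

# Monte Carlo on the surrogate with uncertain inputs
rng = np.random.default_rng(3)
q = rng.lognormal(np.log(400.0), 0.4, 100_000).clip(100, 1000)
v = rng.uniform(0.5, 3.0, 100_000)
t = features(q, v) @ coef

print(f"mean temperature = {t.mean():.0f} C, 95th percentile = {np.percentile(t, 95):.0f} C")
print(f"P(T > 300 C) = {(t > 300).mean():.3f}")
```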

  19. Computational methods in the pricing and risk management of modern financial derivatives

    Science.gov (United States)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and also sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are of common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
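
    Two of the valuation methods named above, the Black-Scholes formula and risk-neutral Monte Carlo, can be illustrated for a European call with arbitrary parameters; this is a textbook sketch, not material from the talk.

```python
# Illustrative sketch of two of the valuation methods named above: the
# Black-Scholes formula for a European call and a Monte Carlo estimate of the
# same price under the risk-neutral measure (parameters are arbitrary).
import numpy as np
from scipy.stats import norm

s0, k, r, sigma, t = 100.0, 105.0, 0.03, 0.2, 1.0

# Closed-form Black-Scholes price
d1 = (np.log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
d2 = d1 - sigma * np.sqrt(t)
bs_price = s0 * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

# Monte Carlo under the risk-neutral measure (geometric Brownian motion)
rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)
s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
mc_price = np.exp(-r * t) * np.maximum(s_t - k, 0.0).mean()

print(f"Black-Scholes: {bs_price:.3f}   Monte Carlo: {mc_price:.3f}")
print(f"delta (dV/dS): {norm.cdf(d1):.3f}")   # one of the 'Greek' ratios
```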

  20. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, S.Y.

    1994-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the US Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities
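
    The Latin Hypercube scheme mentioned above can be sketched in a few lines (this is not RESRAD-PROBABILISTIC code; the input distributions and the toy dose model are hypothetical): stratify each input distribution into N equiprobable bins, draw one value per bin, and randomly pair the columns so that even small samples cover the whole range of every parameter.

```python
# Minimal sketch of the Latin Hypercube idea mentioned above (not RESRAD code):
# stratify each input's distribution into N equiprobable bins, draw one value
# per bin, and randomly pair the columns.
import numpy as np
from scipy.stats import lognorm, uniform

def latin_hypercube(dists, n, rng):
    """Return an (n, k) sample; dists is a list of frozen scipy distributions."""
    k = len(dists)
    u = (rng.random((n, k)) + np.arange(n)[:, None]) / n   # one point per stratum
    for j in range(k):
        u[:, j] = rng.permutation(u[:, j])                  # decouple the columns
    return np.column_stack([d.ppf(u[:, j]) for j, d in enumerate(dists)])

rng = np.random.default_rng(11)
# Hypothetical RESRAD-like inputs: distribution coefficient (L/kg), soil ingestion (g/d)
dists = [lognorm(s=1.0, scale=50.0), uniform(loc=0.05, scale=0.15)]
sample = latin_hypercube(dists, n=100, rng=rng)

# Toy dose model on the sampled parameters (purely illustrative)
kd, ingestion = sample[:, 0], sample[:, 1]
dose = 1.0e-3 * ingestion / (1.0 + 0.1 * kd)
print(f"dose 5th-95th percentile: {np.percentile(dose, 5):.2e} to {np.percentile(dose, 95):.2e}")
```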

  1. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, Shih-Yew

    1995-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the U.S. Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities. (author)

  2. Environmental risk analysis

    International Nuclear Information System (INIS)

    Lima-e-Silva, Pedro Paulo de

    1996-01-01

    Conventional Risk Analysis (RA) usually relates the frequency of an undesired event to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly from the human standpoint, valuing losses of equipment, structures, and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm was developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain its own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting environmental aspects of human activities face great difficulties in drawing up technical specifications and procedures leading to acceptable levels of impact, especially given the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its narrow scope (accident considerations only) and a flawed paradigm (direct human damages only). A paper by the author on the former was already submitted to the 7th International Conference on Environmetrics, held in July 1996 at USP-SP. The present paper discusses extending the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)

  3. Cloud Computing Services: Benefits, Risks and Intellectual Property Issues

    Directory of Open Access Journals (Sweden)

    IONELA BĂLŢĂTESCU

    2014-05-01

    Full Text Available Major software players of the global market, such as Google, Amazon and Microsoft, are developing cloud computing solutions, providing cloud services on demand: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). In the software industry and also in the ICT services market, cloud computing is playing an increasingly important role. Moreover, the expansion of cloud services has indirectly contributed to the development and improvement of other types of services on the market – financial and accounting services, human resources services, educational services etc. – in terms of quality and affordability. Given the fact that cloud computing applications have proved to be more affordable for small and medium enterprises (SMEs), an increasing number of companies in almost all fields of activity have chosen cloud-based solutions, such as Enterprise Resource Planning (ERP) software and Customer Relationship Management (CRM) software. However, cloud computing services also involve some risks concerning privacy, security of data and lack of interoperability between cloud platforms. The patent strategies of certain proprietary software companies have led to a veritable “patent war” and “patent arms race”, endangering the process of standardization in the software industry, especially in cloud computing. Intellectual property (IP) legislation and court rulings in patent litigation are likely to have a significant impact on the development of the cloud computing industry and cloud services.

  4. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
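
    For readers unfamiliar with annealing formulations, the usual first step is to recast the inverse problem as a quadratic unconstrained binary optimization (QUBO). The sketch below poses a tiny binary least-squares inverse problem as a QUBO and solves it by exhaustive enumeration as a classical stand-in for an annealer; the forward model, data, and problem size are made up, and this is not the D-Wave workflow used in the paper.

      # Hedged sketch: a toy binary inverse problem written as a QUBO and solved by
      # brute force (classical stand-in for a quantum annealer).
      import itertools
      import numpy as np

      A = np.array([[1.0, 0.5, 0.0],
                    [0.0, 1.0, 0.5],
                    [0.5, 0.0, 1.0]])      # assumed linear forward model
      x_true = np.array([1, 0, 1])         # "true" binary parameters
      b = A @ x_true                       # synthetic observations

      # ||Ax - b||^2 = x^T (A^T A) x - 2 (A^T b)^T x + const  ->  QUBO matrix Q.
      Q = A.T @ A
      linear = -2 * A.T @ b
      np.fill_diagonal(Q, np.diag(Q) + linear)   # fold linear terms onto the diagonal (x_i^2 = x_i)

      best_x, best_e = None, np.inf
      for bits in itertools.product([0, 1], repeat=3):
          x = np.array(bits)
          e = x @ Q @ x
          if e < best_e:
              best_x, best_e = x, e
      print("recovered parameters:", best_x)     # matches x_true for this toy case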

  5. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will contribute directly to modeling and accuracy improvements throughout the plant system and should result in a significant reduction of the design conservatisms that have been applied to such analyses in the past.

  6. Component of the risk analysis

    International Nuclear Information System (INIS)

    Martinez, I.; Campon, G.

    2013-01-01

    The PowerPoint presentation reviews topics such as risk analysis (Codex), risk management, the manager's preliminary activities, the relationship between government and industry, microbiological hazards, and risk communication.

  7. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: definitions of default; estimation of default probabilities; exposures; recovery rates; pricing; concepts of portfolio dependence; time horizons for risk calculations; quantification of portfolio risk; estimation of risk measures; portfolio analysis and portfolio improvement; evaluation and comparison of credit risk models; analytic portfolio loss distributions. The thesis contributes to the evaluatio...

  8. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Despite some fragmentary references in the literature on qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analysis (QDA). Based on several standard Word operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations about computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  9. Contrast induced nephropathy in patients undergoing intravenous (IV) contrast enhanced computed tomography (CECT) and the relationship with risk factors: a meta-analysis

    NARCIS (Netherlands)

    Moos, Shira I.; van Vemde, David N. H.; Stoker, Jaap; Bipat, Shandra

    2013-01-01

    To summarize the incidence of contrast-induced nephropathy (CIN) and associations between CIN incidence and risk factors in patients undergoing intravenous contrast-enhanced computed tomography (CECT) with low- or iso-osmolar iodinated contrast medium. This review is performed in accordance with the

  10. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the aerospace, power generation, and oil and gas industries. Efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, and the turbine's overall performance depends strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, together with numerical issues such as turbulence model choice, advection model choice, and parallel processing performance, are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  11. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  12. Living PRAs [probabilistic risk analysis] made easier with IRRAS [Integrated Reliability and Risk Analysis System

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1989-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is an integrated PRA software tool that gives the user the ability to create and analyze fault trees and accident sequences using an IBM-compatible microcomputer. This program provides functions that range from graphical fault tree and event tree construction to cut set generation and quantification. IRRAS contains all the capabilities and functions required to create, modify, reduce, and analyze event tree and fault tree models used in the analysis of complex systems and processes. IRRAS uses advanced graphic and analytical techniques to achieve the greatest possible realization of the potential of the microcomputer. When the needs of the user exceed this potential, IRRAS can call upon the power of the mainframe computer. The role of the Idaho National Engineering Laboratory in the IRRAS program is that of software developer and interface to the user community. Version 1.0 of the IRRAS program was released in February 1987 to prove the concept of performing this kind of analysis on microcomputers. This version contained many of the basic features needed for fault tree analysis and was received very well by the PRA community. Since the release of Version 1.0, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version is designated IRRAS 2.0. Version 3.0 will contain all of the features required for efficient event tree and fault tree construction and analysis. 5 refs., 26 figs
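
    Cut set generation and quantification, as mentioned above, typically ends with combining minimal cut sets into a top event probability. The sketch below shows the rare-event approximation and a Fussell-Vesely importance ranking on made-up data; it is not IRRAS code, and the event names and probabilities are assumptions.

      # Hedged sketch (not IRRAS): quantify a fault tree top event from minimal cut
      # sets using the rare-event approximation.
      basic_event_prob = {          # hypothetical basic-event failure probabilities
          "PUMP_A": 1e-3,
          "PUMP_B": 1e-3,
          "VALVE_C": 5e-4,
          "POWER": 1e-4,
      }

      minimal_cut_sets = [          # hypothetical minimal cut sets
          {"PUMP_A", "PUMP_B"},
          {"VALVE_C"},
          {"POWER"},
      ]

      def cut_set_prob(cut_set):
          """Probability of one minimal cut set, assuming independent basic events."""
          p = 1.0
          for event in cut_set:
              p *= basic_event_prob[event]
          return p

      # Rare-event approximation: P(top) ~= sum of minimal cut set probabilities.
      top = sum(cut_set_prob(cs) for cs in minimal_cut_sets)
      print(f"Top event probability ~ {top:.2e}")

      # Fussell-Vesely importance: fraction of the top event risk carried by each event.
      for event in basic_event_prob:
          contrib = sum(cut_set_prob(cs) for cs in minimal_cut_sets if event in cs)
          print(f"{event}: FV importance {contrib / top:.2f}")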

  13. Risk Analysis in Road Tunnels – Most Important Risk Indicators

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Thöns, Sebastian

    2016-01-01

    Methodologies for fire risk analysis in road tunnels consider numerous factors affecting risk (risk indicators) and express the results as risk measures, but only a few comprehensive studies on the effects of risk indicators on risk measures are available. For this reason, this study quantifies the effects and highlights the most important risk indicators, with the aim of supporting further developments in risk analysis. A system model of a road tunnel was therefore developed to determine the risk measures. The system model can be divided into three parts: the fire part, connected to the fire model Fire Dynamics Simulator (FDS); the evacuation part, connected to the evacuation model FDS+Evac; and the frequency part, connected to a model that calculates the frequency of fires. This study shows that the parts of the system model (and their most important risk indicators) affect the risk measures in the following...

  14. Risk analysis in radiosurgery treatments using risk matrices

    International Nuclear Information System (INIS)

    Delgado, J. M.; Sanchez Cayela, C.; Ramirez, M. L.; Perez, A.

    2011-01-01

    The aim of this study is the risk analysis of the stereotactic single-dose radiotherapy process and the evaluation of those initiating events that lead to increased risk, together with possible solutions in the design of barriers.
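
    Risk matrix methods of the kind referred to in the title combine an ordinal frequency scale and an ordinal severity scale into a risk ranking. The following minimal sketch shows one generic way to do this; the scales, thresholds and the example event are assumptions for illustration, not the matrix used in the study.

      # Hedged sketch: a generic 5x5 risk matrix of the kind used in radiotherapy
      # risk analyses; scales and ranking thresholds are assumed for illustration.
      FREQUENCY = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}
      SEVERITY = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

      def risk_rank(frequency, severity):
          score = FREQUENCY[frequency] * SEVERITY[severity]
          if score >= 15:
              return "high risk - barrier required"
          if score >= 8:
              return "medium risk - review barriers"
          return "low risk - acceptable"

      # Hypothetical initiating event: a rare error with a severe outcome.
      print(risk_rank("low", "catastrophic"))   # -> medium risk - review barriers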

  15. Cognitive maps for risk assessment in providing cloud computing data security

    OpenAIRE

    Konrad, U.; Penzina, V.

    2013-01-01

    Cloud Computing (CC) has become a new milestone in the era of information technology. Almost unlimited possibilities for storing information, processing data and creating virtual machines have opened up unique perspectives. However, new technologies bring new threats, risks and serious consequences.

  16. Risk D and D Rapid Prototype: Scenario Documentation and Analysis Tool

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-01-01

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety risk analysis for decontamination and decommissioning projects. The objective of the Decontamination and Decommissioning (D and D) Risk Management Evaluation and Work Sequencing Standardization Project under DOE EM-23 is to recommend or develop practical risk-management tools for the decommissioning of nuclear facilities. PNNL has responsibility under this project for recommending or developing computer-based tools that facilitate the evaluation of risks in order to optimize the sequencing of D and D work. PNNL's approach is to adapt, augment, and integrate existing resources rather than to develop a new suite of tools. Methods for the evaluation of H and S risks associated with work in potentially hazardous environments are well established. Several approaches exist which, collectively, are referred to as process hazard analysis (PHA). A PHA generally involves the systematic identification of accidents, exposures, and other adverse events associated with a given process or work flow. This identification is usually achieved in a brainstorming environment or by other means of eliciting informed opinion. The likelihoods of adverse events (scenarios) and their associated consequence severities are estimated against pre-defined scales, from which risk indices are then calculated. A similar process is encoded in various project risk software products that facilitate the quantification of schedule and cost risks associated with adverse scenarios. However, risk models do not generally capture both project risk and H and S risk. The intent of the project reported here is to produce a tool that facilitates the elicitation, characterization, and documentation of both project risk and H and S risk based on defined sequences of D and D activities. By considering alternative D and D sequences, comparison of the predicted risks can

  17. Systematic review on physician's knowledge about radiation doses and radiation risks of computed tomography

    International Nuclear Information System (INIS)

    Krille, Lucian; Hammer, Gael P.; Merzenich, Hiltrud; Zeeb, Hajo

    2010-01-01

    Background: The frequent use of computed tomography is a major cause of the increasing medical radiation exposure of the general population. Consequently, dose reduction and radiation protection is a topic of scientific and public concern. Aim: We evaluated the available literature on physicians' knowledge regarding radiation doses and risks due to computed tomography. Methods: A systematic review in accordance with the Cochrane and PRISMA statements was performed using eight databases. 3091 references were found. Only primary studies assessing physicians' knowledge about computed tomography were included. Results: 14 relevant articles were identified, all focussing on dose estimations for CT. Overall, the surveys showed moderate to low knowledge among physicians concerning radiation doses and the associated health risks. However, the surveys varied considerably in conduct and quality. For some countries, more than one survey was available. There was no general trend in knowledge in any country except a slight improvement of knowledge on health risks and radiation doses in two consecutive local German surveys. Conclusions: Knowledge gaps concerning radiation doses and associated health risks among physicians are evident from published research. However, knowledge of radiation doses cannot be interpreted as a reliable indicator of good medical practice.

  18. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    Directory of Open Access Journals (Sweden)

    Chunlei Xia

    2018-01-01

    Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has improved substantially in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. Investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present the pioneering work in precise tracking of a group of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxicity analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning are explained with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches in toxicity prediction are presented.

  19. Risk analysis of Odelouca cofferdam

    OpenAIRE

    Pimenta, L.; Caldeira, L.; Maranha das Neves, E.

    2009-01-01

    In this paper we present the risk analysis of the Odelouca cofferdam, using an event tree analysis. The initiating events, failure modes and analysed limit states are discussed based on an influence diagram. The constructed event trees and their interpretation are presented. The obtained risk values are represented in an FN plot superimposed on the acceptability and tolerability risk limits proposed for Portuguese dams. Initially, particular emphasis is placed on the main characteristic...
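
    In an event tree analysis of this kind, each path from the initiating event through the branch points defines a scenario whose frequency is the product of the initiating frequency and the branch probabilities; scenario frequencies and consequences are then plotted on the FN diagram. The sketch below illustrates the arithmetic with made-up numbers, not the Odelouca values.

      # Hedged sketch: quantifying a small event tree; the initiating frequency,
      # branch probabilities and consequence labels are assumed values.
      initiating_frequency = 0.05   # floods per year reaching the cofferdam (assumed)

      # Each scenario is a path through the tree: (branch probabilities, consequence).
      scenarios = {
          "spillway works, no overtopping": ([0.95], "no damage"),
          "spillway fails, dam resists":    ([0.05, 0.7], "minor damage"),
          "spillway fails, dam breaches":   ([0.05, 0.3], "downstream flooding"),
      }

      for name, (branch_probs, consequence) in scenarios.items():
          freq = initiating_frequency
          for p in branch_probs:
              freq *= p
          print(f"{name}: {freq:.4f}/yr -> {consequence}")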

  20. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  1. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  2. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    Science.gov (United States)

    Ahmad, Mushtaq

    Cloud computing is being positioned by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as a fifth utility in which clients will have access for running applications and software projects that need very high processing speed for compute-intensive work and huge data capacity, whether for scientific and engineering research problems or for e-business and data content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of the security risks identified by IT industry experts and cloud clients, and highlights the cloud providers' responses to those risks.

  3. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study, a model audit process is applied to a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision-making purposes.

  4. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of light water reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  5. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  6. Quantitative image analysis of vertebral body architecture - improved diagnosis in osteoporosis based on high-resolution computed tomography

    International Nuclear Information System (INIS)

    Mundinger, A.; Wiesmeier, B.; Dinkel, E.; Helwig, A.; Beck, A.; Schulte Moenting, J.

    1993-01-01

    71 women, 64 of them post-menopausal, were examined by single-energy quantitative computed tomography (SEQCT) and by high-resolution computed tomography (HRCT) scans through the middle of lumbar vertebral bodies. Computer-assisted image analysis of the high-resolution images assessed trabecular morphometry of the vertebral spongiosa texture. Texture parameters differed between women with and without age-reduced bone density, and in the former group also between patients with and without vertebral fractures. Discriminating parameters were the total number, diameter and variance of trabecular and intertrabecular spaces as well as the trabecular surface (p < 0.05). A texture index based on these statistically selected morphometric parameters identified a subgroup of patients suffering from fractures due to abnormal spongiosal architecture but with a bone mineral content not indicative of increased fracture risk. The combination of osteodensitometry and trabecular morphometry improves the diagnosis of osteoporosis and may contribute to the prediction of individual fracture risk. (author)

  7. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    Science.gov (United States)

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
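
    The patented method avoids exhaustive enumeration by working directly with minimal cut sets generated from the connectivity diagram; for a network small enough, however, the definition of all-terminal reliability can be checked by brute force. The sketch below enumerates link failure states for a made-up five-link network; the topology and link reliabilities are assumptions, and this is not the algorithm described in the abstract.

      # Hedged sketch: brute-force all-terminal reliability of a small network by
      # enumerating link failure states; link reliabilities are assumed values.
      import itertools

      nodes = ["A", "B", "C", "D"]
      links = {("A", "B"): 0.99, ("B", "C"): 0.95, ("C", "D"): 0.99,
               ("D", "A"): 0.95, ("A", "C"): 0.90}   # link -> probability the link works

      def connected(up_links):
          """Check whether all nodes are connected given the set of working links."""
          reached, frontier = {nodes[0]}, [nodes[0]]
          while frontier:
              n = frontier.pop()
              for (a, b) in up_links:
                  for u, v in ((a, b), (b, a)):
                      if u == n and v not in reached:
                          reached.add(v)
                          frontier.append(v)
          return reached == set(nodes)

      reliability = 0.0
      for state in itertools.product([True, False], repeat=len(links)):
          prob = 1.0
          up = []
          for (link, p), works in zip(links.items(), state):
              prob *= p if works else 1 - p
              if works:
                  up.append(link)
          if connected(up):
              reliability += prob

      print(f"All-terminal reliability: {reliability:.5f}")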

  8. Generalized indices for radiation risk analysis

    International Nuclear Information System (INIS)

    Bykov, A.A.; Demin, V.F.

    1989-01-01

    A new approach to ensuring nuclear safety began to take shape in the early eighties. The approach, based on probabilistic safety analysis, the principles of acceptable risk, the optimization of safety measures, etc., has required a set of adequate quantitative methods of assessment, safety analysis and risk management to be developed. Methods of radiation risk assessment and analysis hold a prominent place in this set. National and international research and regulatory organizations (ICRP, IAEA, WHO, UNSCEAR, OECD/NEA) have given much attention to the development of the conceptual and methodological basis of those methods. Some resolutions of the National Commission on Radiological Protection (NCRP) and the Problem Commission on Radiation Hygiene of the USSR Ministry of Health should also be noted. Both CBA (cost-benefit analysis) and other methods of radiation risk analysis and safety management use a system of natural and socio-economic indices characterizing the radiation risk or damage. A number of problems are associated with the introduction, justification and use of these indices. For example, the price, a, of radiation damage, or of a collective dose unit, is a noteworthy index. The difficulties in its qualitative and quantitative determination are still an obstacle to a wide application of CBA in radiation risk analysis and management. During the past 10-15 years these problems have been a subject of consideration for many authors. The present paper also considers the issues of the qualitative and quantitative justification of the indices of radiation risk analysis.

  9. Prevention of Computer Worker Health Disturbances Caused by Physical and Physiological Risk

    Directory of Open Access Journals (Sweden)

    Pille Viive

    2016-10-01

    This investigation was carried out within the framework of the Interreg 4A project “Workability and Social Inclusion”, headed by Arcada University of Applied Sciences; Tallinn University of Technology and Rīga Stradiņš University were also involved in the project. A questionnaire based on the Nordic, WAI (Work Ability Index) and Kiva questionnaires was compiled to study psychosocial and physical working conditions at computer-equipped workplaces for 192 workers. The results showed that the computer workers rate their health status as considerably high. They are optimistic that, although the monotonous work with computers will continue, their health status will remain at the same level in the future thanks to steadily improving rehabilitation means. The most affected regions of the body were the right wrist and the neck. The novelty of the study consists in the graphical co-analysis of different groups of questions presented to the workers, which allows the physiological and psychological factors to be assessed together. Rehabilitation means have to be developed, and the possibility for rehabilitation must be made available to the greatest possible number of workers. The workers were divided into two groups: group A, with a length of employment with computers of up to 10 years (inclusive), and group B, having worked with computers for over 10 years. These groups were found to differ in their perception of psychosocial risk factors at the workplace. Group B assessments of psychosocial working conditions were better than those of group A. In group B, employees appeared to be more afraid of losing their jobs and were therefore less demanding about the work atmosphere than those in group A.

  10. Mare Risk Analysis monitor; Monitor de analisis de riesgos mare

    Energy Technology Data Exchange (ETDEWEB)

    Fuente Prieto, I.; Alonso, P.; Carretero Fernandino, J. A. [Empresarios Agrupados, A.I.E., Madrid (Spain)]

    2000-07-01

    The Nuclear Safety Council's requirement that Spanish power plants comply with the requirements of the Maintenance Rule associated with plant risk assessment during power operation, arising from the partial unavailability of systems due to maintenance activities, has led to the need for additional tools to facilitate compliance with said requirements. While the impact on risk produced by individual equipment unavailabilities can easily be evaluated, either qualitatively or quantitatively, the process becomes more complicated when unprogrammed unavailabilities occur simultaneously in various systems, making it necessary to evaluate their functional impact. It is especially complex in the case of support systems that can affect the functionality of multiple systems. In view of the above, a computer application has been developed that is capable of providing the operator with quick answers, based on the specific plant model, in order to allow fast risk assessment using the information compiled as part of the Probabilistic Safety Analysis. The paper describes the most important characteristics of this application and the basic design requirements of the MARE Risk Monitor. (Author)

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
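
    A discrete event simulation of this kind advances a clock from event to event (request arrivals and service completions) rather than in fixed time steps. The sketch below is a minimal single-pool example with assumed arrival and service rates; it is not the model described in the abstract.

      # Hedged sketch: minimal discrete event simulation of service requests competing
      # for a fixed pool of servers; the arrival and service rates are assumed values.
      import heapq
      import random

      random.seed(1)
      SERVERS, ARRIVAL_RATE, SERVICE_RATE, HORIZON = 4, 3.0, 1.0, 10_000.0

      events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
      busy, queue, waits = 0, [], []

      while events:
          now, kind = heapq.heappop(events)
          if now > HORIZON:
              break
          if kind == "arrival":
              heapq.heappush(events, (now + random.expovariate(ARRIVAL_RATE), "arrival"))
              if busy < SERVERS:
                  busy += 1
                  waits.append(0.0)
                  heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "departure"))
              else:
                  queue.append(now)
          else:  # departure
              if queue:
                  waits.append(now - queue.pop(0))
                  heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "departure"))
              else:
                  busy -= 1

      print(f"mean wait {sum(waits) / len(waits):.3f}, requests started ~{len(waits)}")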

  12. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  13. A climatological model for risk computations incorporating site- specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provide an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and specific surface deposition computations. 13 refs., 4 figs., 2 tabs

  14. Risk assessment using Analytical Hierarchy Process - Development and evaluation of a new computer-based tool; Riskvaerdering med Analytical Hierarchy Process - Utveckling och utprovning av ett nytt datorbaserat verktyg

    Energy Technology Data Exchange (ETDEWEB)

    Ritchey, Tom (Swedish Defence Research Agency, Stockholm (Sweden))

    2008-11-15

    Risk analysis concerning the management of contaminated areas involves comparing and evaluating the relationships between ecological, technical, economic and other factors in order to determine a reasonable level of remediation. Risk analysis of this kind is a relatively new phenomenon. In order to develop methodology in this area, the Sustainable Remediation program contributes both to comprehensive risk analysis projects and to projects concentrating on specific aspects of remediation risk analysis. In the project described in this report, the Swedish Defence Research Agency (FOI) was given a grant by the Sustainable Remediation program to apply the Analytic Hierarchy Process (AHP) in order to develop a computer-aided instrument to support remediation risk analysis. AHP is one of several so-called multi-criteria decision support methods. These methods are applied in order to systematically compare and evaluate different solutions or measures when many different goal criteria are involved. Such criteria can be both quantitative and qualitative. The project has resulted in the development of a computer-aided instrument which can be employed to give better structure, consistency and traceability to risk analyses for the remediation of contaminated areas. The project was carried out in two phases with two different working groups. The first phase involved the development of a generic base model for remediation risk analysis, performed by a 'development group'. The second phase entailed the testing of the generic model in a specific, on-going remediation project, performed by a 'test group'. The remediation project in question concerned the decontamination of a closed-down sawmill in Vaeckelsaang, in the Swedish municipality of Tingsryd.
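
    At the core of AHP is a pairwise comparison matrix for the criteria, from which priority weights are derived (commonly via the principal eigenvector) and checked with a consistency ratio. The sketch below illustrates this with an assumed three-criterion matrix; the criteria and judgments are made up and do not come from the report.

      # Hedged sketch: AHP priority weights from a pairwise comparison matrix using
      # the principal-eigenvector method; the comparison values are assumed.
      import numpy as np

      # Criteria: ecological risk, remediation cost, technical feasibility (hypothetical).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)     # consistency index
      cr = ci / 0.58                           # Saaty random index for n = 3 is 0.58
      print("priority weights:", np.round(weights, 3))
      print(f"consistency ratio: {cr:.3f} (commonly accepted if below 0.10)")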

  15. Reliability and validity of risk analysis

    International Nuclear Information System (INIS)

    Aven, Terje; Heide, Bjornar

    2009-01-01

    In this paper we investigate to what extent risk analysis meets the scientific quality requirements of reliability and validity. We distinguish between two types of approaches within risk analysis, relative frequency-based approaches and Bayesian approaches. The former category includes both traditional statistical inference methods and the so-called probability of frequency approach. Depending on the risk analysis approach, the aim of the analysis is different, the results are presented in different ways and consequently the meaning of the concepts reliability and validity are not the same.

  16. Safety analysis, risk assessment, and risk acceptance criteria

    International Nuclear Information System (INIS)

    Jamali, K.

    1997-01-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude

  17. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  18. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)

  19. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for professionals in industry and academia.

  20. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  1. Individual and work-related risk factors for musculoskeletal pain: a cross-sectional study among Estonian computer users.

    Science.gov (United States)

    Oha, Kristel; Animägi, Liina; Pääsuke, Mati; Coggon, David; Merisalu, Eda

    2014-05-28

    Occupational use of computers has increased rapidly over recent decades and has been linked with various musculoskeletal disorders, which are now the most commonly diagnosed occupational diseases in Estonia. The aim of this study was to assess the prevalence of musculoskeletal pain (MSP) by anatomical region during the past 12 months and to investigate its association with personal characteristics and work-related risk factors among Estonian office workers using computers. In a cross-sectional survey, questionnaires were sent to 415 computer users. Data were collected by self-administered questionnaire from 202 computer users at two universities in Estonia. The questionnaire asked about MSP at different anatomical sites and about potential individual and work-related risk factors. Associations with risk factors were assessed by logistic regression. Most respondents (77%) reported MSP in at least one anatomical region during the past 12 months. Most prevalent was pain in the neck (51%), followed by low back pain (42%), wrist/hand pain (35%) and shoulder pain (30%). Older age, right-handedness, not currently smoking, emotional exhaustion, belief that musculoskeletal problems are commonly caused by work, and low job security were the statistically significant risk factors for MSP at different anatomical sites. A high prevalence of MSP in the neck, low back, wrist/arm and shoulder was observed among Estonian computer users. Psychosocial risk factors were broadly consistent with those reported elsewhere. While computer users should be aware of ergonomic techniques that can make their work easier and more comfortable, presenting computer use as a serious health hazard may modify health beliefs in a way that is unhelpful.

  2. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  3. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  4. Bias in risk-benefit analysis

    International Nuclear Information System (INIS)

    Mazur, A.

    1985-01-01

    Risk-benefit analysis has become popular in the past decade as a means of improving decision making, especially in the area of technology policy. Here risk-benefit analysis is compared to other (equally defensible) approaches to decision making, showing how it favors some political interests more than others, and suggesting why it has recently come to the fore as a tool of political analysis. A considerable portion of the discussion concerns nuclear power. 6 references

  5. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  6. Sudden Cardiac Risk Stratification with Electrocardiographic Indices - A Review on Computational Processing, Technology Transfer, and Scientific Evidence

    Directory of Open Access Journals (Sweden)

    Francisco Javier eGimeno-Blanes

    2016-03-01

    Great effort has been devoted in recent years to the development of sudden cardiac risk predictors as a function of electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. But these prediction techniques are still seldom used in clinical practice, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indexes, to structure this review on sudden cardiac risk stratification: first, through the computational techniques that have been widely proposed in the technical literature for obtaining these indexes; second, through the scientific evidence, which, although supported by observational clinical studies, is not always representative enough; and third, through the limited technology transfer of academy-accepted algorithms, which requires further consideration in future systems. We focus on three families of ECG-derived indexes which are tackled from the aforementioned viewpoints, namely, heart rate turbulence, heart rate variability, and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied over large and representative datasets. New scenarios like electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks for suitable new paradigms in the near future.
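
    Of the three index families named above, heart rate variability is the simplest to illustrate: its basic time-domain measures are statistics of successive RR intervals. The sketch below computes SDNN and RMSSD on a short, made-up RR series; it is not taken from any of the reviewed algorithms.

      # Hedged sketch: two standard time-domain heart rate variability indexes (SDNN,
      # RMSSD) computed from a short, made-up series of RR intervals in milliseconds.
      import math

      rr_ms = [812, 798, 805, 790, 820, 835, 810, 795, 802, 818]   # assumed RR intervals

      mean_rr = sum(rr_ms) / len(rr_ms)
      sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

      diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
      rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

      print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")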

  7. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  8. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present
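
    One common way to model a multivariate input structure with dependence, as discussed above, is to couple arbitrary marginal distributions through a Gaussian copula. The sketch below shows the idea for two hypothetical inputs; the marginals, names and correlation are assumptions for illustration, not constructions from the paper.

      # Hedged sketch: sampling two dependent model inputs with a Gaussian copula,
      # one simple way to impose a joint structure on marginal input distributions.
      import numpy as np
      from math import erf, sqrt

      rng = np.random.default_rng(0)
      rho = 0.7                                    # assumed correlation between the inputs
      cov = np.array([[1.0, rho], [rho, 1.0]])

      z = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
      u = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))     # normal CDF -> correlated uniforms

      leak_rate = -np.log(1 - u[:, 0]) / 0.2             # exponential(rate 0.2) marginal
      release_fraction = 0.01 + 0.09 * u[:, 1]           # uniform(0.01, 0.10) marginal

      print(f"mean leak rate {leak_rate.mean():.2f}, "
            f"mean release fraction {release_fraction.mean():.3f}")
      # Pearson correlation of the uniforms = rank correlation between the inputs.
      print("rank correlation between the inputs:",
            round(float(np.corrcoef(u[:, 0], u[:, 1])[0, 1]), 2))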

  9. Computed tomography imaging of early coronary artery lesions in stable individuals with multiple cardiovascular risk factors

    Directory of Open Access Journals (Sweden)

    Xi Yang

    2015-04-01

    OBJECTIVES: To investigate the prevalence, extent, severity, and features of coronary artery lesions in stable patients with multiple cardiovascular risk factors. METHODS: Seventy-seven patients with more than 3 cardiovascular risk factors, suspected of having coronary artery disease (the high-risk group), and 39 controls with no risk factors were enrolled in the study. The relevant risk factors included hypertension, impaired glucose tolerance, dyslipidemia, smoking history, and overweight. The characteristics of coronary lesions were identified and evaluated by 64-slice coronary computed tomography angiography. RESULTS: The incidence of coronary atherosclerosis was higher in the high-risk group than in the no-risk group. The involved branches of the coronary artery, the diffusivity of the lesions, the degree of stenosis, and the nature of the plaques were all significantly more severe in the high-risk group compared with the no-risk group (all p < 0.05). CONCLUSION: Among stable individuals with high-risk factors, early coronary artery lesions are common and severe. Computed tomography has promising value for the early screening of coronary lesions.

  10. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  11. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  12. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    Electromyographic (EMG) signals are bio-signals collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  13. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  14. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
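
    The "computer calculus" idea is essentially automatic differentiation: each arithmetic operation is augmented so that derivative values are propagated alongside ordinary values. The following minimal sketch reproduces that idea in Python with forward-mode dual numbers; it is a conceptual analogue only, not GRESS (which instruments FORTRAN code), and the model function is made up.

      # Hedged sketch: forward-mode automatic differentiation with dual numbers, a
      # small conceptual analogue of compiler-based "computer calculus".
      class Dual:
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)
          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.value * other.deriv + self.deriv * other.value)
          __rmul__ = __mul__

      def model(k, t):
          """Hypothetical response: 3k^2 + 2kt + t."""
          return 3.0 * k * k + 2.0 * k * t + t

      k = Dual(1.5, 1.0)                 # seed the derivative with respect to k
      result = model(k, 4.0)
      # Prints output = 22.75 and d(output)/dk = 17.0, i.e. 6k + 8 at k = 1.5.
      print(f"output = {result.value}, d(output)/dk = {result.deriv}")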

  15. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

    Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis has been given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application especially in the USA. (orig.) [de

  16. Easy calculations of lod scores and genetic risks on small computers.

    Science.gov (United States)

    Lathrop, G M; Lalouel, J M

    1984-01-01

    A computer program that calculates lod scores and genetic risks for a wide variety of both qualitative and quantitative genetic traits is discussed. An illustration is given of the joint use of a genetic marker, affection status, and quantitative information in counseling situations regarding Duchenne muscular dystrophy. PMID:6585139
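
    For the simplest case handled by such programs - phase-known, fully informative meioses - the two-point lod score is the log10 ratio of the likelihood at a candidate recombination fraction theta to the likelihood under free recombination (theta = 0.5). A minimal sketch with made-up counts is shown below; it is not the program described in the abstract.

      # Hedged sketch: two-point lod score for phase-known, fully informative meioses.
      import math

      def lod(recombinants, non_recombinants, theta):
          n = recombinants + non_recombinants
          likelihood_theta = theta ** recombinants * (1 - theta) ** non_recombinants
          likelihood_null = 0.5 ** n                 # no linkage: theta = 0.5
          return math.log10(likelihood_theta / likelihood_null)

      # Hypothetical family data: 2 recombinants out of 12 informative meioses.
      for theta in (0.05, 0.1, 0.2, 0.3):
          print(f"theta = {theta:.2f}: lod = {lod(2, 10, theta):.2f}")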

  17. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. Graphics output used in actual safety analyses is used to illustrate the capabilities of each code. 5 refs., 10 figs

  18. WIPP fire hazards and risk analysis

    International Nuclear Information System (INIS)

    1991-05-01

    The purpose of this analysis was to conduct a fire hazards risk analysis of the Transuranic (TRU) contact-handled waste receipt, emplacement, and disposal activities at the Waste Isolation Pilot Plant (WIPP). The technical bases and safety envelope for these operations are defined in the approved WIPP Final Safety Analysis Report (FSAR). Although the safety documentation for the initial phase of the Test Program, the dry bin scale tests, has not yet been approved by the Department of Energy (DOE), reviews of the draft to date, including those by the Advisory Committee on Nuclear Facility Safety (ACNFS), have concluded that the dry bin scale tests present no significant risks in excess of those estimated in the approved WIPP FSAR. It is the opinion of the authors and reviewers of this analysis, based on sound engineering judgment and knowledge of the WIPP operations, that a Fire Hazards and Risk Analysis specific to the dry bin scale test program is not warranted prior to first waste receipt. This conclusion is further supported by the risk analysis presented in this document which demonstrates the level of risk to WIPP operations posed by fire to be extremely low. 15 refs., 41 figs., 48 tabs

  19. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  20. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  1. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study intends to describe the fault tree methodology and apply it to risk assessment of complex facilities. In the methodology description, it has been attempted to provide all the pertinent basic information, pointing out its more important aspects like, for instance, fault tree construction, evaluation techniques and their use in risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, data bases used in the calculations, and uncertainty evaluation of the results, will be discussed separately, each one in a chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen, due to their availability and to the fact that they have been used in important studies of the nuclear area, like WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop - CTC, of CDTN, was evaluated. (Author) [pt
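
    As a purely illustrative aside (not the PREP/KITT/SAMPLE codes themselves), the minimal sketch below evaluates the top-event probability of a small fault tree from hypothetical minimal cut sets, comparing the rare-event approximation with exact inclusion-exclusion under the assumption of independent basic events.

```python
from itertools import combinations

# Hypothetical minimal cut sets (sets of basic events) and basic-event probabilities.
cut_sets = [{"pump_A", "pump_B"}, {"valve_C"}, {"pump_A", "power_bus"}]
p_basic = {"pump_A": 1e-2, "pump_B": 2e-2, "valve_C": 5e-4, "power_bus": 1e-3}

def cut_set_prob(events):
    """Probability that all basic events in the given set occur (independence assumed)."""
    prob = 1.0
    for event in events:
        prob *= p_basic[event]
    return prob

def union_prob(cut_sets):
    """Exact top-event probability by inclusion-exclusion over the minimal cut sets
    (feasible only for small trees)."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            merged = set().union(*combo)   # union of basic events in the selected cut sets
            total += (-1) ** (k + 1) * cut_set_prob(merged)
    return total

print("Rare-event approximation:", sum(cut_set_prob(cs) for cs in cut_sets))
print("Inclusion-exclusion     :", union_prob(cut_sets))
```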

  2. Risk characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    The PowerPoint presentation covers risks at the estimated levels of exposure, uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, the formulation of guidelines for carcinogenic and genotoxic compounds, and risks in subpopulations

  3. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  4. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, and social and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  5. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    Science.gov (United States)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that for robust flood risk analysis to be undertaken both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, representing a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
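
    A minimal sketch of the hazard-exposure overlay described above (not the authors' actual workflow) is given below: a gridded flood-depth array is intersected with a co-registered gridded population array and the population in flooded cells is summed. The arrays and the 0.1 m depth threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical co-registered grids: flood depth (m) and population count per cell.
depth = rng.gamma(shape=0.3, scale=0.5, size=(1000, 1000))        # mostly dry, a few deep cells
population = rng.poisson(lam=2.0, size=(1000, 1000)).astype(float)

threshold_m = 0.1                        # depth above which a cell counts as flooded
flooded = depth > threshold_m
exposed_population = population[flooded].sum()

print(f"Flooded cells: {flooded.mean():.1%} of domain")
print(f"Exposed population: {exposed_population:,.0f} of {population.sum():,.0f}")
```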

  6. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network with the scope divided into three major categories: (1) integrity management; (2) operations; (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  7. Maritime transportation risk analysis: Review and analysis in light of some foundational issues

    International Nuclear Information System (INIS)

    Goerlandt, Floris; Montewka, Jakub

    2015-01-01

    Many methods and applications for maritime transportation risk analysis have been presented in the literature. In parallel, there is a recent focus on foundational issues in risk analysis, with calls for intensified research on fundamental concepts and principles underlying the scientific field. This paper presents a review and analysis of risk definitions, perspectives and scientific approaches to risk analysis found in the maritime transportation application area, focusing on applications addressing accidental risk of shipping in a sea area. For this purpose, a classification of risk definitions, an overview of elements in risk perspectives and a classification of approaches to risk analysis science are applied. Results reveal that in the application area, risk is strongly tied to probability, both in definitions and perspectives, while alternative views exist. A diffuse situation is also found concerning the scientific approach to risk analysis, with realist, proceduralist and constructivist foundations co-existing. Realist approaches dominate the application area. Very few applications systematically account for uncertainty, neither concerning the evidence base nor in relation to the limitations of the risk model in relation to the space of possible outcomes. Some suggestions are made to improve the current situation, aiming to strengthen the scientific basis for risk analysis. - Highlights: • Risk analyses in maritime transportation analysed in light of foundational issues. • Focus on definitions, perspectives and scientific approaches to risk analysis. • Probability-based definitions and realist approaches dominate the field. • Findings support calls for increased focus on foundational issues in risk research. • Some suggestions are made to improve the current situation

  8. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  9. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  10. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    Full Text Available In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend highly on the practical experience and knowledge of the evaluator and that, therefore, stochastic methods should be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact on the results of the analysis is by solving a practical example of FMECA with uncertainties modelled using Monte Carlo sampling.
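
    As a hedged illustration of the approach (not the paper's worked example), the sketch below propagates uncertainty in the severity, occurrence, and detection scores of a single FMECA line item by Monte Carlo sampling and summarises the resulting distribution of the risk priority number; the triangular input distributions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain FMECA scores modelled as triangular distributions (min, mode, max).
severity   = rng.triangular(6, 7, 9, size=n)
occurrence = rng.triangular(2, 4, 6, size=n)
detection  = rng.triangular(3, 5, 8, size=n)

rpn = severity * occurrence * detection   # risk priority number per Monte Carlo sample

print(f"RPN mean    : {rpn.mean():6.1f}")
print(f"RPN 5-95 %  : {np.percentile(rpn, 5):6.1f} - {np.percentile(rpn, 95):6.1f}")
print(f"P(RPN > 200): {np.mean(rpn > 200):.3f}")
```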

  11. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  12. Risk Analysis of Telecom Enterprise Financing

    Institute of Scientific and Technical Information of China (English)

    YU Hua; SHU Hua-ying

    2005-01-01

    This paper focuses on identifying the causes of telecom enterprises' financing risks and on methods for estimating them. Multi-mode financing gives telecom enterprises flexibility in raising capital and obtaining profit from the corresponding projects, but these financing modes also carry potential risks. After analysing the categories and causes of telecom enterprises' financing risk, a method based on the Analytic Hierarchy Process (AHP) is put forward for estimating the financing risk. The author offers suggestions and opinions based on an example analysis, in order to provide ideas and a basis for telecom enterprises' financing decision-making.

  13. Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.

    Science.gov (United States)

    Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge

    2015-01-01

    Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found on every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back during computer transport was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.

  14. Probabilistic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Hauptmanns, U.

    1988-01-01

    Risk analysis is applied when the calculation of risk from observed failures is not possible because events contributing substantially to risk are too rare, as in the case of nuclear reactors. The process of analysis provides a number of benefits, some of which are listed. After this by no means complete enumeration of possible benefits to be derived from a risk analysis, an outline of risk studies for PWRs, with some comments on the models used, is given. The presentation is indebted to the detailed treatment of the subject given in the PRA Procedures Guide. Thereafter some results of the German Risk Study, Phase B, which is under way, are communicated. The paper concludes with some remarks on probabilistic considerations in licensing procedures. (orig./DG)

  15. Uncertainty Analysis and Overtopping Risk Evaluation of Maroon Dam with Monte Carlo and Latin Hypercube Methods

    Directory of Open Access Journals (Sweden)

    J. M. Vali Samani

    2016-02-01

    Full Text Available Introduction: The greatest part of constructed dams belongs to embankment dams, and there are many examples of their failures throughout history. About one-third of the world’s dam failures have been caused by flood overtopping, which indicates that flood overtopping is an important factor affecting reservoir projects’ safety. Moreover, because of a poor understanding of the randomness of floods, reservoir water levels during flood seasons are often lowered artificially in order to avoid overtopping and protect the lives and property of downstream residents. So, estimation of dam overtopping risk with regard to uncertainties is more important than achieving the dam’s safety. This study presents the procedure for risk evaluation of dam overtopping due to various uncertainties in inflows and the reservoir initial condition. Materials and Methods: This study aims to present a practical approach and compare the different uncertainty analysis methods in the evaluation of dam overtopping risk due to flood. For this purpose, Monte Carlo simulation and Latin hypercube sampling methods were used to calculate the overtopping risk, evaluate the uncertainty, and calculate the highest water level during different flood events. To assess these methods from a practical point of view, the Maroon dam was chosen for the case study. Figure 1 indicates the work procedure, including three parts: (1) identification and evaluation of effective factors on flood routing and dam overtopping, (2) data collection and analysis for reservoir routing and uncertainty analysis, and (3) uncertainty and risk analysis. Figure 1 - Diagram of dam overtopping risk evaluation. Results and Discussion: Figure 2 shows the results of the computed overtopping risks for the Maroon Dam without considering the wind effect, for the initial water level of 504 m as an example. As shown in Figure 2, the trends of the risk curves computed by the different uncertainty analysis methods are similar
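
    As a purely illustrative sketch of the two sampling schemes named above (not the Maroon Dam model itself), the snippet below estimates an overtopping probability with plain Monte Carlo and with Latin hypercube sampling; the crest elevation, the input distributions, and the crude routing surrogate are all hypothetical assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 20_000
crest_elevation_m = 510.0     # hypothetical dam crest elevation

def max_water_level(initial_level, peak_inflow):
    """Toy surrogate for flood routing: crude rise proportional to peak inflow."""
    return initial_level + 1.2e-3 * peak_inflow

# Plain Monte Carlo sampling of the uncertain inputs.
init = rng.normal(504.0, 1.0, n)                               # initial reservoir level (m)
peak = rng.lognormal(mean=np.log(3000), sigma=0.5, size=n)     # peak inflow (m3/s)
risk_mc = np.mean(max_water_level(init, peak) > crest_elevation_m)

# Latin hypercube sampling: stratify the uniform quantiles, then invert the CDFs.
u1 = (rng.permutation(n) + rng.random(n)) / n
u2 = (rng.permutation(n) + rng.random(n)) / n
init_lhs = stats.norm(504.0, 1.0).ppf(u1)
peak_lhs = stats.lognorm(s=0.5, scale=3000).ppf(u2)
risk_lhs = np.mean(max_water_level(init_lhs, peak_lhs) > crest_elevation_m)

print(f"Overtopping risk, Monte Carlo     : {risk_mc:.4f}")
print(f"Overtopping risk, Latin hypercube : {risk_lhs:.4f}")
```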

  16. Single-shell tank interim stabilization risk analysis

    International Nuclear Information System (INIS)

    Basche, A.D.

    1998-01-01

    The purpose of the Single-Shell Tank (SST) Interim Stabilization Risk Analysis is to provide a cost and schedule risk analysis of HNF-2358, Rev. 1, Single-Shell Tank Interim Stabilization Project Plan (Project Plan) (Ross et al. 1998). The analysis compares the required cost profile by fiscal year (Section 4.2) and revised schedule completion date (Section 4.5) to the Project Plan. The analysis also evaluates the executability of the Project Plan and recommends a path forward for risk mitigation

  17. Gastric cancer: texture analysis from multidetector computed tomography as a potential preoperative prognostic biomarker

    Energy Technology Data Exchange (ETDEWEB)

    Giganti, Francesco; Salerno, Annalaura; Marra, Paolo; Esposito, Antonio; Del Maschio, Alessandro; De Cobelli, Francesco [Department of Radiology and Centre for Experimental Imaging San Raffaele Scientific Institute, Milan (Italy); San Raffaele Vita-Salute University, Milan (Italy); Antunes, Sofia [San Raffaele Scientific Institute, Centre for Experimental Imaging, Milan (Italy); Ambrosi, Alessandro [San Raffaele Vita-Salute University, Milan (Italy); Nicoletti, Roberto [Department of Radiology and Centre for Experimental Imaging San Raffaele Scientific Institute, Milan (Italy); Orsenigo, Elena [San Raffaele Scientific Institute, Department of Surgery, Milan (Italy); Chiari, Damiano; Staudacher, Carlo [San Raffaele Vita-Salute University, Milan (Italy); San Raffaele Scientific Institute, Department of Surgery, Milan (Italy); Albarello, Luca [San Raffaele Scientific Institute, Pathology Unit, Milan (Italy)

    2017-05-15

    To investigate the association between preoperative texture analysis from multidetector computed tomography (MDCT) and overall survival in patients with gastric cancer. Institutional review board approval and informed consent were obtained. Fifty-six patients with biopsy-proved gastric cancer were examined by MDCT and treated with surgery. Image features from texture analysis were quantified, with and without filters for fine to coarse textures. The association with survival time was assessed using Kaplan-Meier and Cox analysis. The following parameters were significantly associated with a negative prognosis, according to different thresholds: energy [no filter] - Logarithm of relative risk (Log RR): 3.25; p = 0.046; entropy [no filter] (Log RR: 5.96; p = 0.002); entropy [filter 1.5] (Log RR: 3.54; p = 0.027); maximum Hounsfield unit value [filter 1.5] (Log RR: 3.44; p = 0.027); skewness [filter 2] (Log RR: 5.83; p = 0.004); root mean square [filter 1] (Log RR: - 2.66; p = 0.024) and mean absolute deviation [filter 2] (Log RR: - 4.22; p = 0.007). Texture analysis could increase the performance of a multivariate prognostic model for risk stratification in gastric cancer. Further evaluations are warranted to clarify the clinical role of texture analysis from MDCT. (orig.)

  18. Multiple Sclerosis Increases Fracture Risk: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Guixian Dong

    2015-01-01

    Full Text Available Purpose. The association between multiple sclerosis (MS and fracture risk has been reported, but results of previous studies remain controversial and ambiguous. To assess the association between MS and fracture risk, a meta-analysis was performed. Method. Based on comprehensive searches of the PubMed, Embase, and Web of Science, we identified outcome data from all articles estimating the association between MS and fracture risk. The pooled risk ratios (RRs with 95% confidence intervals (CIs were calculated. Results. A significant association between MS and fracture risk was found. This result remained statistically significant when the adjusted RRs were combined. Subgroup analysis stratified by the site of fracture suggested significant associations between MS and tibia fracture risk, femur fracture risk, hip fracture risk, pelvis fracture risk, vertebrae fracture risk, and humerus fracture risk. In the subgroup analysis by gender, female MS patients had increased fracture risk. When stratified by history of drug use, use of antidepressants, hypnotics/anxiolytics, anticonvulsants, and glucocorticoids increased the risk of fracture risk in MS patients. Conclusions. This meta-analysis demonstrated that MS was significantly associated with fracture risk.
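
    As a hedged illustration of how pooled risk ratios of this kind are computed (not the actual studies or numbers from this meta-analysis), the sketch below applies the DerSimonian-Laird random-effects model to hypothetical per-study risk ratios and confidence intervals.

```python
import numpy as np

# Hypothetical per-study risk ratios and 95% confidence intervals (not the studies in the review).
rr      = np.array([1.8, 1.4, 2.2, 1.1, 1.6])
ci_low  = np.array([1.2, 0.9, 1.5, 0.8, 1.1])
ci_high = np.array([2.7, 2.2, 3.2, 1.5, 2.3])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE of log RR recovered from the CI
w_fixed = 1.0 / se**2

# DerSimonian-Laird estimate of the between-study variance tau^2.
fixed_mean = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
q = np.sum(w_fixed * (log_rr - fixed_mean) ** 2)
df = len(rr) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1.0 / (se**2 + tau2)
pooled = np.sum(w_random * log_rr) / np.sum(w_random)
pooled_se = np.sqrt(1.0 / np.sum(w_random))

print(f"Pooled RR (random effects): {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-{np.exp(pooled + 1.96 * pooled_se):.2f})")
```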

  19. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small scale cluster. Each of the processors involved is a multicore processor with four cores, so the cluster has eight cores in total. The cluster runs the Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were made by executing the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the time required for the calculation is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware which is capable of higher computing power when compared to a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
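
    The study's communication test used an MPI Hello program written in C; as a rough analogue (an assumption, since the original code is not reproduced here), the sketch below performs the same kind of check in Python with mpi4py and would be launched with mpiexec across the cluster nodes.

```python
# Minimal mpi4py analogue of the communication test described above.
# Run with, for example:  mpiexec -n 8 python mpi_hello.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

print(f"Hello from rank {rank} of {size} on {MPI.Get_processor_name()}")

# Simple check that ranks can pass information: sum the ranks with a reduction.
total = comm.reduce(rank, op=MPI.SUM, root=0)
if rank == 0:
    print(f"Sum of ranks gathered on rank 0: {total}")
```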

  20. Big Data Risk Analysis for Rail Safety?

    OpenAIRE

    Van Gulijk, Coen; Hughes, Peter; Figueres-Esteban, Miguel; Dacre, Marcus; Harrison, Chris; HUD; RSSB

    2015-01-01

    Computer scientists believe that the enormous amounts of data on the internet will unchain a management revolution of uncanny proportions. Yet, to date, the potential benefit of this revolution is scantily investigated for safety and risk management. This paper gives a brief overview of a research programme that investigates how the new internet-driven data revolution could benefit safety and risk management for railways in the UK. The paper gives a brief overview of the current activities...

  1. Reference computations of public dose and cancer risk from airborne releases of plutonium. Nuclear safety technical report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, V.L.

    1993-12-23

    This report presents results of computations of doses and the associated health risks of postulated accidental atmospheric releases from the Rocky Flats Plant (RFP) of one gram of weapons-grade plutonium in a form that is respirable. These computations are intended to be reference computations that can be used to evaluate a variety of accident scenarios by scaling the dose and health risk results presented here according to the amount of plutonium postulated to be released, instead of repeating the computations for each scenario. The MACCS2 code has been used as the basis of these computations. The basis and capabilities of MACCS2 are summarized, the parameters used in the evaluations are discussed, and results are presented for the doses and health risks to the public, both the Maximum Offsite Individual (a maximally exposed individual at or beyond the plant boundaries) and the population within 50 miles of RFP. A number of different weather scenarios are evaluated, including constant weather conditions and observed weather for 1990, 1991, and 1992. The isotopic mix of weapons-grade plutonium will change as it ages, the 241Pu decaying into 241Am. The 241Am reaches a peak concentration after about 72 years. The doses to the bone surface, liver, and whole body will increase slightly but the dose to the lungs will decrease slightly. The overall cancer risk will show almost no change over this period. This change in cancer risk is much smaller than the year-to-year variations in cancer risk due to weather. Finally, x/Q values are also presented for other applications, such as for hazardous chemical releases. These include the x/Q values for the MOI, for a collocated worker at 100 meters downwind of an accident site, and the x/Q value integrated over the population out to 50 miles.
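
    Because the report's results are expressed per gram of respirable release, other scenarios can reuse them by linear scaling. A minimal sketch of that scaling is shown below; the numeric reference values in the example are hypothetical placeholders, not figures from the report.

```python
def scale_reference_results(release_grams: float,
                            reference_dose_sv: float,
                            reference_cancer_risk: float,
                            reference_release_grams: float = 1.0):
    """Scale reference dose and cancer risk (computed for a 1 g respirable release)
    linearly to another postulated release quantity."""
    factor = release_grams / reference_release_grams
    return reference_dose_sv * factor, reference_cancer_risk * factor

# Hypothetical reference values for a maximally exposed offsite individual.
dose, risk = scale_reference_results(release_grams=0.25,
                                     reference_dose_sv=1.0e-2,
                                     reference_cancer_risk=5.0e-4)
print(f"Scaled dose: {dose:.2e} Sv, scaled cancer risk: {risk:.2e}")
```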

  2. Risk analysis applied to the development of petroleum fields; Analise de risco aplicada ao desenvolvimento de campos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Ana Paula A. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Shiozer, Denis J. [Universidade Estadual de Campinas, SP (Brazil)

    2004-07-01

    Decision analysis applied to the development phase of petroleum fields must take into account the risk associated with several types of uncertainties. In the transition from the appraisal to the development phase, the importance of the risk associated with the recovery factor may increase significantly. The process is complex due to high investments, the large number of uncertain variables, and the strong dependence of the results on the production strategy definition. This complexity may, in several cases, make it difficult to establish reliable techniques to assess risk correctly, or it may demand great computational effort. Therefore, methodologies to quantify the impact of uncertainties are still not well established, because simplifications are necessary and the impact of such simplifications is not well known. The proposed work brings the main aspects related to the validation of the simplifications necessary for quantifying the impact of uncertainties in the risk analysis process. The adopted techniques are divided into three groups: adoption of an automated process and use of parallel computing; simplification techniques in the treatment of attributes; and techniques for integrating geological uncertainties with the other types of uncertainties (economic, technological, and those related to the production strategy). The integration of the geological uncertainties with the other uncertainties is made through the concept of representative models. The results show that the criteria adopted are good indicators of the viability of the methodology, improving the performance and reliability of the risk analysis process. (author)

  3. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG-1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  4. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the next period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  5. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  6. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  7. Analysis of haemodynamic disturbance in the atherosclerotic carotid artery using computational fluid dynamics

    International Nuclear Information System (INIS)

    Birchall, Daniel; Zaman, Azfar; Hacker, Jacob; Davies, Gavin; Mendelow, David

    2006-01-01

    Computational fluid dynamics (CFD) provides a means for the quantitative analysis of haemodynamic disturbances in vivo, but most work has used phantoms or idealised geometry. Our purpose was to use CFD to analyse flow in carotid atherosclerosis using patient-specific geometry and flow data. Eight atherosclerotic carotid arteries and one healthy control artery were imaged with magnetic resonance angiography (MRA) and duplex ultrasound, and the data used to construct patient-specific computational models used for CFD and wall shear stress (WSS) analysis. There is a progressive change in three-dimensional (3-D) velocity profile and WSS profile with increasing severity of stenosis, characterised by increasing restriction of areas of low WSS, change in oscillation patterns, and progressive rise in WSS within stenoses and downstream jets. Areas of turbulent, retrograde flow and of low WSS are demonstrated in the lee of the stenoses. This study presents the largest CFD analysis of abnormal haemodynamics at the atheromatous carotid bifurcation using patient-specific data and provides the basis for further investigation of causal links between haemodynamic variables and atherogenesis and formation of unstable plaque. We propose that this provides a means for the prospective assessment of relative stroke risk in patients with carotid atherosclerosis. (orig.)

  8. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
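
    GRESS instruments FORTRAN source so that derivatives are carried alongside values; as a purely illustrative sketch of that computer-calculus idea (not GRESS itself), the snippet below implements forward-mode automatic differentiation with dual numbers in Python.

```python
import math
from dataclasses import dataclass

@dataclass
class Dual:
    """Value together with its derivative; arithmetic propagates both (forward-mode AD)."""
    val: float
    der: float = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def dsin(x: Dual) -> Dual:
    """sin with derivative propagation (chain rule)."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Sensitivity of f(x) = x*sin(x) + 3x with respect to x at x = 2.0.
x = Dual(2.0, 1.0)                 # seed derivative dx/dx = 1
f = x * dsin(x) + 3 * x
print(f"f(2.0) = {f.val:.6f}, df/dx = {f.der:.6f}")
```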

  9. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis

  10. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  11. Dietary fibre intake and risk of breast cancer: A systematic review and meta-analysis of epidemiological studies.

    Science.gov (United States)

    Chen, Sumei; Chen, Yuanyuan; Ma, Shenglin; Zheng, Ruzhen; Zhao, Pengjun; Zhang, Lidan; Liu, Yuehua; Yu, Qingqing; Deng, Qinghua; Zhang, Ke

    2016-12-06

    Current evidence from randomised controlled trials on the effects of dietary fibre intake on breast cancer risk is inconsistent. We conducted a meta-analysis to determine the effectiveness of dietary fibre intake in reducing breast cancer risk. We searched for prospective and case-control studies on dietary fibre intake and breast cancer risk in the English language through March 2016. Twenty-four epidemiologic studies obtained through the PubMed, Embase, Web of Science, and Cochrane Library databases were systematically reviewed. A random-effects model was used to compute the pooled risk estimates by extracting the risk estimate of the highest and lowest reported categories of intake from each study. The meta-analyses showed a 12% decrease in breast cancer risk with dietary fibre intake. The association between dietary fibre intake and breast cancer risk was significant when stratified according to Jadad scores, study types, and menopause status. Dose-response analysis showed that every 10 g/d increment in dietary fibre intake was associated with a 4% reduction in breast cancer risk, and little evidence of publication bias was found. Thus, dietary fibre consumption is significantly associated with a reduced risk of breast cancer, particularly in postmenopausal women.
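
    The pooled dose-response estimate above (a 4% risk reduction per 10 g/day of fibre) can be extrapolated to other intake increments under a log-linear assumption. The sketch below illustrates that arithmetic; the log-linear extrapolation and the intake values are illustrative assumptions, not results from the review.

```python
RR_PER_10G = 0.96   # risk ratio per 10 g/day increment of dietary fibre, as reported above

def relative_risk(extra_fibre_g_per_day: float) -> float:
    """Log-linear dose-response extrapolation: RR(d) = RR_per_10g ** (d / 10)."""
    return RR_PER_10G ** (extra_fibre_g_per_day / 10.0)

for grams in (5, 10, 20, 30):
    rr = relative_risk(grams)
    print(f"+{grams:>2} g/day fibre -> RR = {rr:.3f} ({(1 - rr) * 100:.1f}% lower risk)")
```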

  12. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  13. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  14. 38 CFR 75.115 - Risk analysis.

    Science.gov (United States)

    2010-07-01

    38 CFR 75.115 - Risk analysis. Pensions, Bonuses, and Veterans' Relief; Department of Veterans Affairs (Continued); Information Security Matters; Data Breaches. If a data breach involving sensitive personal information that is processed or...

  15. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    Full Text Available A prominent feature of Parkinson's disease (PD is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment in PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  16. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  17. Demonstration of risk-based decision analysis in remedial alternative selection and design

    International Nuclear Information System (INIS)

    Evans, E.K.; Duffield, G.M.; Massmann, J.W.; Freeze, R.A.; Stephenson, D.E.

    1993-01-01

    This study demonstrates the use of risk-based decision analysis (Massmann and Freeze 1987a, 1987b) in the selection and design of an engineering alternative for groundwater remediation at a waste site at the Savannah River Site, a US Department of Energy facility in South Carolina. The investigation focuses on the remediation and closure of the H-Area Seepage Basins, an inactive disposal site that formerly received effluent water from a nearby production facility. A previous study by Duffield et al. (1992), which used risk-based decision analysis to screen a number of ground-water remediation alternatives under consideration for this site, indicated that the most attractive remedial option is ground-water extraction by wells coupled with surface water discharge of treated effluent. The aim of the present study is to demonstrate the iterative use of risk-based decision analysis throughout the design of a particular remedial alternative. In this study, we consider the interaction between two episodes of aquifer testing over a 6-year period and the refinement of a remedial extraction well system design. Using a three-dimensional ground-water flow model, this study employs (1) geostatistics and Monte Carlo techniques to simulate hydraulic conductivity as a stochastic process and (2) Bayesian updating and conditional simulation to investigate multiple phases of aquifer testing. In our evaluation of a remedial alternative, we compute probabilistic costs associated with the failure of an alternative to completely capture a simulated contaminant plume. The results of this study demonstrate the utility of risk-based decision analysis as a tool for improving the design of a remedial alternative through the course of phased data collection at a remedial site

  18. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample given, followed by a discussion of the present status and future development plans

  19. Common approach of risks analysis

    International Nuclear Information System (INIS)

    Noviello, L.; Naviglio, A.

    1996-01-01

    Although, following the resolutions of the High German Court, the protection level of human beings is an objective which can change in time, it is obviously an important point when there is a risk to the population. This is particularly true for industrial plants whose possible accidents could affect the population. Accident risk analysis indicates that there is no conceptual difference between the risks of a nuclear power plant and those of other industrial plants such as chemical plants, gas distribution systems and hydraulic dams. An analysis of the legislation prompted by the Seveso Directive on industrial risks gives some important indications which should always be followed. This work analyses in particular the legislative situation in different European countries and identifies some of its most important characteristics. Indeed, the situation differs from country to country, and this is a source of later difficulties for nuclear power plants. In order to strengthen this reasoning, this paper presents some preliminary results of an analysis of a nuclear power plant following the approach used for other industrial plants. In conclusion, it will be necessary to re-examine the risk assessment approach for nuclear power plants, because the real protection level of human beings in a country is determined by the least regulated of the dangerous industrial plants in its surroundings. (O.M.)

  20. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper

  1. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I: Introduction to Digital Image Processing and Analysis. Digital Image Processing and Analysis: Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II: Digital Image Analysis and Computer Vision. Introduction to Digital Image Analysis: Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read...

  2. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
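
    As a hedged illustration of the embarrassingly parallel resampling described above (hypothetical data, standard-library multiprocessing rather than a cluster scheduler or cloud instances), the sketch below distributes the permutations of a simple two-group test across local worker processes.

```python
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(7)
group_a = rng.normal(0.0, 1.0, 50)       # hypothetical expression values, condition A
group_b = rng.normal(0.4, 1.0, 50)       # hypothetical expression values, condition B
observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

def permutation_batch(args):
    """One worker: count permuted mean differences at least as extreme as the observed one."""
    seed, n_perm = args
    local_rng = np.random.default_rng(seed)
    count = 0
    for _ in range(n_perm):
        perm = local_rng.permutation(pooled)
        diff = perm[:50].mean() - perm[50:].mean()
        count += abs(diff) >= abs(observed)
    return count

if __name__ == "__main__":
    n_workers, per_worker = 4, 2_500
    with Pool(n_workers) as pool:
        counts = pool.map(permutation_batch, [(s, per_worker) for s in range(n_workers)])
    p_value = sum(counts) / (n_workers * per_worker)
    print(f"Permutation p-value: {p_value:.4f}")
```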

  3. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of the industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  4. Introduction of the risk analysis

    International Nuclear Information System (INIS)

    Campon, G.; Martinez, I.

    2013-01-01

    An introduction to risk analysis was given in the presentation, whose main issues were: food safety, the world, regional and national food context, change of paradigms, the definition of health, risk, Codex, standardization, the role of the food chain, trade agreements, the Codex Alimentarius, and the cost impact of food-borne diseases

  5. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    1985-06-01

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include descriptions in some detail of work on general development topics and of risk analysis performed as a contractor. (author)

  6. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims to provide these physicists and policy planners with useful bibliometric results for assessing research activity. To this end, we carried out a co-authorship network analysis of journal articles on high-performance computational physics as a case study, using articles from Elsevier's Scopus database covering the period 2004-2013. Authors in this field were ranked by the number of papers published over the ten years from 2004. Finally, we drew the co-authorship network for the 45 top-ranked authors and their coauthors, and described some features of the network in relation to author rank. Suggestions for further studies are discussed.
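
    Co-authorship networks of this kind are typically built by linking every pair of authors that appear on the same paper and then computing degree or centrality statistics. A minimal sketch of that construction in Python with networkx is shown below; the author lists are invented placeholders, not data from the study.

```python
import itertools
import networkx as nx

# Hypothetical author lists, one entry per paper (placeholder data).
papers = [
    ["Kim", "Park", "Lee"],
    ["Kim", "Lee"],
    ["Park", "Choi"],
    ["Lee", "Choi", "Kang"],
]

G = nx.Graph()
for authors in papers:
    for a, b in itertools.combinations(authors, 2):   # link every co-author pair
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1                    # count repeated collaborations
        else:
            G.add_edge(a, b, weight=1)

# Rank authors by number of distinct collaborators (degree).
for author, degree in sorted(G.degree(), key=lambda x: -x[1]):
    print(author, degree)
```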

  7. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Ursula [Vienna General Hospital, Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Heidinger, Benedikt H.; Bankier, Alexander A. [Harvard Medical School, Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Anderson, Kevin R.; VanderLaan, Paul A. [Harvard Medical School, Pathology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Westmore, Michael S. [Imbio, Delafield, WI (United States)

    2018-01-15

    To assess the performance of the ''Computer-Aided Nodule Assessment and Risk Yield'' (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel-densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive ACs (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of ACs manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)
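
    The decision rule described above — flag a nodule as invasive when its low-risk voxel fraction falls at or below a threshold — and the associated diagnostic metrics can be illustrated with a few lines of Python. The nodule data below are invented placeholders and the 45% cut-off is taken from the abstract; this is a sketch of the evaluation logic, not the CANARY software itself.

```python
import numpy as np

# Placeholder data: fraction of low-risk voxels per nodule and true invasiveness (1 = MIA/IAC).
low_risk_fraction = np.array([0.90, 0.75, 0.40, 0.30, 0.55, 0.20, 0.85, 0.44])
invasive          = np.array([0,    0,    1,    1,    0,    1,    0,    1])

threshold = 0.45                                   # <= 45% low-risk components => call invasive
predicted = (low_risk_fraction <= threshold).astype(int)

tp = np.sum((predicted == 1) & (invasive == 1))
tn = np.sum((predicted == 0) & (invasive == 0))
fp = np.sum((predicted == 1) & (invasive == 0))
fn = np.sum((predicted == 0) & (invasive == 1))

print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("PPV:", tp / (tp + fp))
print("NPV:", tn / (tn + fn))
```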

  8. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    International Nuclear Information System (INIS)

    Nemec, Ursula; Heidinger, Benedikt H.; Bankier, Alexander A.; Anderson, Kevin R.; VanderLaan, Paul A.; Westmore, Michael S.

    2018-01-01

    To assess the performance of the ''Computer-Aided Nodule Assessment and Risk Yield'' (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel-densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive ACs (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of ACs manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)

  9. Importance and sensitivity of parameters affecting the Zion Seismic Risk

    International Nuclear Information System (INIS)

    George, L.L.; O'Connell, W.J.

    1985-06-01

    This report presents the results of a study on the importance and sensitivity of structures, systems, equipment, components and design parameters used in the Zion Seismic Risk Calculations. This study is part of the Seismic Safety Margins Research Program (SSMRP) supported by the NRC Office of Nuclear Regulatory Research. The objective of this study is to provide the NRC with results on the importance and sensitivity of parameters used to evaluate seismic risk. These results can assist the NRC in making decisions dealing with the allocation of research resources on seismic issues. This study uses marginal analysis in addition to importance and sensitivity analysis to identify subject areas (input parameter areas) for improvements that reduce risk, estimate how much the improvement efforts reduce risk, and rank the subject areas for improvement. Importance analysis identifies the systems, components, and parameters that are important to risk. Sensitivity analysis estimates the change in risk per unit improvement. Marginal analysis indicates the reduction in risk or uncertainty for improvement effort made in each subject area. The results described in this study were generated using the SEISIM (Systematic Evaluation of Important Safety Improvement Measures) and CHAIN computer codes. Part 1 of the SEISIM computer code generated the failure probabilities and risk values. Part 2 of SEISIM, along with the CHAIN computer code, generated the importance and sensitivity measures.

  10. Importance and sensitivity of parameters affecting the Zion Seismic Risk

    Energy Technology Data Exchange (ETDEWEB)

    George, L.L.; O' Connell, W.J.

    1985-06-01

    This report presents the results of a study on the importance and sensitivity of structures, systems, equipment, components and design parameters used in the Zion Seismic Risk Calculations. This study is part of the Seismic Safety Margins Research Program (SSMRP) supported by the NRC Office of Nuclear Regulatory Research. The objective of this study is to provide the NRC with results on the importance and sensitivity of parameters used to evaluate seismic risk. These results can assist the NRC in making decisions dealing with the allocation of research resources on seismic issues. This study uses marginal analysis in addition to importance and sensitivity analysis to identify subject areas (input parameter areas) for improvements that reduce risk, estimate how much the improvement efforts reduce risk, and rank the subject areas for improvement. Importance analysis identifies the systems, components, and parameters that are important to risk. Sensitivity analysis estimates the change in risk per unit improvement. Marginal analysis indicates the reduction in risk or uncertainty for improvement effort made in each subject area. The results described in this study were generated using the SEISIM (Systematic Evaluation of Important Safety Improvement Measures) and CHAIN computer codes. Part 1 of the SEISIM computer code generated the failure probabilities and risk values. Part 2 of SEISIM, along with the CHAIN computer code, generated the importance and sensitivity measures.
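
    Importance and sensitivity measures of the kind computed by SEISIM can be illustrated generically: for a risk model expressed as a function of component failure probabilities, the sensitivity of risk to each parameter is the partial derivative (here approximated by a finite difference), and importance can be taken as the fractional risk reduction achieved by perfecting that component. The sketch below uses an invented two-train system model purely for illustration; it does not reproduce the SEISIM or CHAIN codes.

```python
import numpy as np

def system_risk(p):
    """Toy risk model: initiating-event frequency times a simple mitigation failure
    (either both redundant pumps fail, or the valve fails)."""
    f_ie, p_pump_a, p_pump_b, p_valve = p
    return f_ie * (p_pump_a * p_pump_b + p_valve)

params = np.array([1e-3, 0.05, 0.05, 0.01])   # placeholder parameter values
base = system_risk(params)

for i, name in enumerate(["init. event freq.", "pump A", "pump B", "valve"]):
    # Sensitivity: change in risk per unit change of the parameter (finite difference).
    dp = 1e-6
    perturbed = params.copy()
    perturbed[i] += dp
    sensitivity = (system_risk(perturbed) - base) / dp
    # Importance: fraction of risk removed if this parameter were driven to zero.
    perfected = params.copy()
    perfected[i] = 0.0
    importance = (base - system_risk(perfected)) / base
    print(f"{name:18s} sensitivity={sensitivity:.3e}  risk-reduction importance={importance:.2%}")
```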

  11. Prevalence of complaints of arm, neck and shoulder among computer office workers and psychometric evaluation of a risk factor questionnaire

    Directory of Open Access Journals (Sweden)

    Kennes Janneke

    2007-07-01

    Full Text Available Abstract Background Complaints of Arm, Neck and Shoulder (CANS) represent a wide range of complaints, which can differ in severity from mild, periodic symptoms to severe, chronic and debilitating conditions. They are thought to be associated with both physical and psychosocial risk factors. The measurement and identification of the various risk factors for these complaints is an important step towards (a) recognizing high-risk subgroups that are relevant in profiling CANS, and (b) developing targeted and effective intervention plans for treatment. The purpose of the present study was to investigate the prevalence of CANS in a Dutch population of computer workers and to develop a questionnaire aimed at measuring workplace physical and psychosocial risk factors for the presence of these complaints. Methods To examine potential workplace risk factors for the presence of CANS, the Maastricht Upper Extremity Questionnaire (MUEQ), a structured questionnaire, was developed and tested among 264 computer office workers of a branch office of the national social security institution in the Netherlands. The MUEQ holds 95 items covering demographic characteristics, in addition to seven main domains assessing potential risk factors with regard to (1) work station, (2) posture during work, (3) quality of break time, (4) job demands, (5) job control, and (6) social support. The MUEQ further contained some additional questions about the quality of the work environment and the presence of complaints in the neck, shoulder, upper and lower arm, elbow, hand and wrist. The prevalence rates of CANS in the past year were computed. Further, we investigated the psychometric properties of the MUEQ (i.e. factor structure and reliability). Results The one-year prevalence rate of CANS indicated that 54% of the respondents reported at least one complaint in the arm, neck and/or shoulder. The highest prevalence rates were found for neck and shoulder symptoms (33% and 31%, respectively).

  12. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as ''risk-based decision analysis,'' also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk-basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  13. Intrusion detection in cloud computing based attack patterns and risk assessment

    Directory of Open Access Journals (Sweden)

    Ben Charhi Youssef

    2017-05-01

    Full Text Available This paper is an extension of work originally presented at the SYSCO conference. We extend our previous work by presenting the initial results of an implementation of intrusion detection based on risk assessment in cloud computing. The idea centres on a novel approach to detecting cyber-attacks on the cloud environment by analyzing attack patterns with risk assessment methodologies. The aim of our solution is to combine evidence obtained from Intrusion Detection Systems (IDS) deployed in a cloud with a risk assessment for each attack pattern. Our approach offers a new qualitative way of analyzing each symptom, indicator and vulnerability, assessing the impact and likelihood of distributed, multi-step attacks directed at cloud environments. The implementation of this approach reduces the number of false alerts and improves the performance of the IDS.
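
    The combination the authors describe — weighting IDS evidence by the likelihood and impact of the matched attack pattern — can be sketched as a simple risk-matrix scoring of alerts. The pattern catalogue, scales and threshold below are invented placeholders, not the paper's actual model.

```python
# Hypothetical attack-pattern catalogue: likelihood and impact on a 1-5 scale.
attack_patterns = {
    "ssh_bruteforce":     {"likelihood": 4, "impact": 2},
    "vm_escape_probe":    {"likelihood": 1, "impact": 5},
    "ddos_amplification": {"likelihood": 3, "impact": 4},
}

def risk_score(pattern: str) -> int:
    """Classical risk-matrix score: likelihood x impact."""
    p = attack_patterns[pattern]
    return p["likelihood"] * p["impact"]

def triage(alerts, threshold=8):
    """Suppress low-risk alerts; escalate the rest, highest risk first."""
    scored = [(risk_score(a), a) for a in alerts]
    return sorted((s, a) for s, a in scored if s >= threshold)[::-1]

alerts = ["ssh_bruteforce", "vm_escape_probe", "ddos_amplification", "ssh_bruteforce"]
for score, alert in triage(alerts):
    print(f"escalate {alert} (risk={score})")
```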

  14. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    Science.gov (United States)

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In the image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods will be needed to extract quantitative image parameters that characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis for patients and will gain importance in the diagnostics and therapy of the future. From a methodological point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or

  15. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    Science.gov (United States)

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
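
    Stochastic multicriteria acceptability analysis of the kind used here works by sampling criterion weights (and, in general, uncertain criterion measurements) and recording how often each alternative comes out best. A minimal sketch of that Monte Carlo loop is shown below; the alternatives, criterion values and linear value model are invented placeholders rather than the paper's antidepressant data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder benefit-risk table: rows = alternatives, columns = criteria,
# values already rescaled so that higher is better on every criterion.
alternatives = ["placebo", "drug A", "drug B"]
values = np.array([
    [0.20, 0.95, 0.90],   # placebo: low efficacy, few adverse reactions
    [0.70, 0.60, 0.55],   # drug A
    [0.80, 0.40, 0.50],   # drug B
])

n_samples = 10_000
wins = np.zeros(len(alternatives))
for _ in range(n_samples):
    w = rng.dirichlet(np.ones(values.shape[1]))    # uniform random weights summing to 1
    overall = values @ w                           # additive (linear) value model
    wins[np.argmax(overall)] += 1

# First-rank acceptability: fraction of the weight space in which each alternative is best.
for name, acc in zip(alternatives, wins / n_samples):
    print(f"{name}: {acc:.2%}")
```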

  16. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, introduced in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem into a regular process of structural simplification. This review summarizes the developments in computer-assisted synthesis analysis and design in recent years and how machine-learning algorithms have contributed to them. The LHASA system pioneered the encoding of semi-empirical reaction modes in computers, and the rule-based and network-searching work that followed not only expanded the databases but also built new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica turned traditional design into network searching. Later, with the help of machine learning, two-step models combining reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not require any prior knowledge, have been applied to this field. Up to now, however, these methods still cannot replace experienced human organic chemists because of their relatively low accuracies. New algorithms, aided by powerful computational hardware, will make this a promising field with good prospects.

  17. How to Perform an Ethical Risk Analysis (eRA).

    Science.gov (United States)

    Hansson, Sven Ove

    2018-02-26

    Ethical analysis is often needed in the preparation of policy decisions on risk. A three-step method is proposed for performing an ethical risk analysis (eRA). In the first step, the people concerned are identified and categorized in terms of the distinct but compatible roles of being risk-exposed, a beneficiary, or a decisionmaker. In the second step, a more detailed classification of roles and role combinations is performed, and ethically problematic role combinations are identified. In the third step, further ethical deliberation takes place, with an emphasis on individual risk-benefit weighing, distributional analysis, rights analysis, and power analysis. Ethical issues pertaining to subsidiary risk roles, such as those of experts and journalists, are also treated in this phase. An eRA should supplement, not replace, a traditional risk analysis that puts emphasis on the probabilities and severities of undesirable events but does not cover ethical issues such as agency, interpersonal relationships, and justice. © 2018 Society for Risk Analysis.

  18. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  19. RISK ANALYSIS IN INFORMATION TECHNOLOGY AND COMMUNICATION OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Edmir Parada Vasques Prado

    2011-12-01

    Full Text Available This research aims at evaluating the risk analysis process in Information Technology and Communication (ICT) outsourcing conducted by organizations in the private sector. The research is a descriptive, quantitative, cross-sectional study in which the survey method was used. Data were collected through a questionnaire; the sample was non-random, obtained through a convenience sampling process. The research contributes to the understanding of the risk analysis process in ICT services outsourcing, and identified statistically significant relationships between risk analysis and the organization's size and industry, and between risk analysis and the diversity of outsourced services.

  20. Work-related complaints of arm, neck and shoulder among computer office workers in an Asian country: prevalence and validation of a risk-factor questionnaire

    Directory of Open Access Journals (Sweden)

    Jayawardana Naveen

    2011-04-01

    Full Text Available Abstract Background Complaints of arm, neck and/or shoulders (CANS) affect millions of computer office workers. However, its prevalence and associated risk factors in developing countries are yet to be investigated, due to the non-availability of validated assessment tools for these countries. We evaluated the 1-year prevalence of CANS among computer office workers in Sri Lanka and tested the psychometric properties of a translated risk factor questionnaire. Methods Computer office workers at a telecommunication company in Sri Lanka received the Sinhalese version of the validated Maastricht Upper Extremity Questionnaire (MUEQ). The 94 items in the questionnaire cover demographic characteristics and CANS, and evaluate potential risk factors for CANS in six domains. Forward and backward translation of the MUEQ was done by two independent bilingual translators. The one-year prevalence of CANS and the psychometric properties of the Sinhalese questionnaire were investigated. Results The response rate was 97.7% (n = 440). Males were 42.7%. Mean age was 38.2 ± 9.5 years. The one-year prevalence of CANS was 63.6% (mild 53.7%, severe 10%). The highest incidences were for neck (36.1%) and shoulder (34.3%) complaints. Two factors for each domain in the scale were identified by exploratory factor analysis (i.e. work-area, computer-position, incorrect body posture, bad-habits, skills and abilities, decision-making, time-management, work-overload, work-breaks, variation in work, work-environment and social-support). Calculation of internal consistency (Cronbach's alpha 0.43-0.82) and cross-validation provided evidence of reliability and lack of redundancy of items. Conclusion The one-year prevalence of CANS in the study population corresponds strongly with the prevalence in developed countries. The translated version of the MUEQ has satisfactory psychometric properties for it to be used to assess work-related risk factors for the development of CANS among Sri Lankan computer office workers.
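
    Internal consistency figures such as the Cronbach's alpha range reported above are computed from the item-score variances of each questionnaire domain: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small sketch with made-up item responses is shown below; it is not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x questionnaire-items matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Placeholder responses: 6 respondents answering a 4-item domain on a 0-3 scale.
responses = np.array([
    [2, 3, 2, 3],
    [1, 1, 0, 1],
    [3, 3, 3, 2],
    [0, 1, 1, 0],
    [2, 2, 3, 3],
    [1, 0, 1, 1],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```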

  1. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytical instrument based on the principle that liquids containing hydrocarbons emit several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanics and electronics in a single unit and is small, light and practical, so it can be used for surface water sample analysis in oil fields and for impurity analysis of other materials

  2. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
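
    The deterministic uncertainty analysis idea described here — propagate parameter uncertainty through the model using analytically computed derivatives rather than random sampling — can be illustrated with first-order propagation, Var(y) ≈ Σ (∂y/∂xᵢ)² Var(xᵢ). The toy model and parameter uncertainties below are invented for illustration; GRESS/ADGEN obtain the derivatives by computer calculus rather than by hand as done here.

```python
import numpy as np

def model(k, h, c0):
    """Toy response: flux ~ k * h * exp(-c0)."""
    return k * h * np.exp(-c0)

def gradient(k, h, c0):
    """Analytic partial derivatives of the model (the role played by GRESS/ADGEN)."""
    return np.array([h * np.exp(-c0),        # d/dk
                     k * np.exp(-c0),        # d/dh
                     -k * h * np.exp(-c0)])  # d/dc0

params = np.array([2.0, 0.5, 1.0])      # nominal parameter values (placeholders)
sigmas = np.array([0.2, 0.05, 0.3])     # parameter standard deviations (placeholders)

y     = model(*params)
grad  = gradient(*params)
var_y = np.sum((grad * sigmas) ** 2)    # first-order (deterministic) uncertainty propagation
sens  = grad * params / y               # normalized sensitivities (percent per percent)

print(f"nominal result: {y:.3f} +/- {np.sqrt(var_y):.3f}")
print("normalized sensitivities:", np.round(sens, 3))
```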

  3. Conversion from laparoscopic to open cholecystectomy: Multivariate analysis of preoperative risk factors

    Directory of Open Access Journals (Sweden)

    Khan M

    2005-01-01

    Full Text Available BACKGROUND: Laparoscopic cholecystectomy has become the gold standard in the treatment of symptomatic cholelithiasis. Some patients require conversion to open surgery, and several preoperative variables have been identified as risk factors that are helpful in predicting the probability of conversion. However, there is a need to devise a risk-scoring system based on the identified risk factors to (a) predict the risk of conversion preoperatively for selected patients, (b) prepare the patient psychologically, (c) arrange operating schedules accordingly, and (d) minimize the procedure-related cost and help overcome financial constraints, which is a significant problem in developing countries. AIM: This study was aimed to evaluate preoperative risk factors for conversion from laparoscopic to open cholecystectomy in our setting. SETTINGS AND DESIGNS: A case control study of patients who underwent laparoscopic surgery from January 1997 to December 2001 was conducted at the Aga Khan University Hospital, Karachi, Pakistan. MATERIALS AND METHODS: All those patients who were converted to open surgery (n = 73) were enrolled as cases. Two controls who had successful laparoscopic surgery (n = 146) were matched with each case for operating surgeon and closest date of surgery. STATISTICAL ANALYSIS USED: Descriptive statistics were computed and univariate and multivariate analysis was done through multiple logistic regression. RESULTS: The final multivariate model identified two risk factors for conversion: ultrasonographic signs of inflammation (adjusted odds ratio [aOR] = 8.5; 95% confidence interval [CI]: 3.3, 21.9) and age > 60 years (aOR = 8.1; 95% CI: 2.9, 22.2) after adjusting for physical signs, alkaline phosphatase and BMI levels. CONCLUSION: Preoperative risk factors evaluated by the present study confirm the likelihood of conversion. Recognition of these factors is important for understanding the characteristics of patients at a higher risk of conversion.
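
    An adjusted odds ratio like those quoted above comes from a multiple logistic regression, but the underlying quantity is easiest to see for a single risk factor: the crude odds ratio from a 2x2 table, with a confidence interval derived from the standard error of the log odds ratio. The counts below are invented placeholders, not the study's data.

```python
import math

# Placeholder 2x2 table for one preoperative risk factor:
#                 converted   not converted
# factor present      a = 30        b = 40
# factor absent       c = 43        d = 106
a, b, c, d = 30, 40, 43, 106

odds_ratio = (a * d) / (b * c)
se_log_or  = math.sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of log(OR)
z = 1.96                                             # 95% confidence level
ci_low  = math.exp(math.log(odds_ratio) - z * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + z * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```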

  4. Risk-benefit analysis and public policy: a bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  5. Risk-benefit analysis and public policy: a bibliography

    International Nuclear Information System (INIS)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk

  6. Predicted cancer risks induced by computed tomography examinations during childhood, by a quantitative risk assessment approach.

    Science.gov (United States)

    Journy, Neige; Ancelet, Sophie; Rehel, Jean-Luc; Mezzarobba, Myriam; Aubert, Bernard; Laurier, Dominique; Bernier, Marie-Odile

    2014-03-01

    The potential adverse effects associated with exposure to ionizing radiation from computed tomography (CT) in pediatrics must be characterized in relation to their expected clinical benefits. Additional epidemiological data are, however, still awaited for providing a lifelong overview of potential cancer risks. This paper gives predictions of potential lifetime risks of cancer incidence that would be induced by CT examinations during childhood in French routine practices in pediatrics. Organ doses were estimated from standard radiological protocols in 15 hospitals. Excess risks of leukemia, brain/central nervous system, breast and thyroid cancers were predicted from dose-response models estimated in the Japanese atomic bomb survivors' dataset and studies of medical exposures. Uncertainty in predictions was quantified using Monte Carlo simulations. This approach predicts that 100,000 skull/brain scans in 5-year-old children would result in eight (90 % uncertainty interval (UI) 1-55) brain/CNS cancers and four (90 % UI 1-14) cases of leukemia, and that 100,000 chest scans would lead to 31 (90 % UI 9-101) thyroid cancers, 55 (90 % UI 20-158) breast cancers, and one further case in excess of the risks expected without exposure. Compared to background risks, radiation-induced risks would be low for individuals throughout life, but relative risks would be highest in the first decades of life. Heterogeneity in the radiological protocols across the hospitals implies that 5-10 % of CT examinations would be related to risks 1.4-3.6 times higher than those for the median doses. Overall excess relative risks in exposed populations would be 1-10 % depending on the site of cancer and the duration of follow-up. The results emphasize the potential risks of cancer specifically from standard CT examinations in pediatrics and underline the necessity of optimization of radiological protocols.
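
    The uncertainty intervals quoted above come from Monte Carlo propagation: organ doses and dose-response (excess-risk) coefficients are sampled from their uncertainty distributions, multiplied to give predicted excess cases, and the interval is read off the percentiles of the resulting distribution. The dose and risk-coefficient distributions below are invented placeholders chosen only to show the mechanics, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim, n_exams = 100_000, 100_000          # Monte Carlo samples; exams in the exposed cohort

# Placeholder uncertainty distributions (NOT the study's values):
organ_dose_mgy  = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n_sim)   # dose per exam
err_per_100_mgy = rng.lognormal(mean=np.log(0.5),  sigma=0.5, size=n_sim)   # excess relative risk per 100 mGy
baseline_cases  = 50.0                      # expected baseline cases per 100,000 persons (placeholder)

excess_cases = baseline_cases * err_per_100_mgy * (organ_dose_mgy / 100.0) * (n_exams / 100_000)

low, med, high = np.percentile(excess_cases, [5, 50, 95])
print(f"predicted excess cases per 100,000 exams: {med:.1f} (90% UI {low:.1f}-{high:.1f})")
```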

  7. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

    RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). The computer code calculates (1) radiation doses and cancer risks from exposure to radioactive materials, and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations from the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides and toxicity information and properties for chemicals. The user can modify the database for use in the calculation. Sensitivity analysis can be performed while running the computer code to examine the influence of the input parameters. Use of RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving. Furthermore, it ensures consistency in methodology for both radiological and chemical risk analyses.
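
    The two chemical calculations mentioned follow the standard EPA baseline risk equations: carcinogenic risk is the chronic intake multiplied by a slope factor, and the noncarcinogenic hazard quotient is the intake divided by a reference dose, summed over chemicals into a hazard index. A simplified, hypothetical sketch is given below; the concentrations, intake parameters and toxicity values are placeholders, not RESRAD-BASELINE's database, and the same exposure factor is applied to both endpoints for brevity.

```python
# Placeholder exposure scenario: drinking-water ingestion of two chemicals.
INTAKE_L_PER_DAY = 2.0
BODY_WEIGHT_KG   = 70.0
EXPOSURE_FACTOR  = (350 / 365) * (30 / 70)     # exposure days/year and years-of-exposure / averaging time

chemicals = {
    #           conc (mg/L)  slope factor (mg/kg-day)^-1   reference dose (mg/kg-day)
    "chem_A": {"conc": 0.004, "slope": 0.055, "rfd": 0.004},
    "chem_B": {"conc": 0.120, "slope": None,  "rfd": 0.080},   # noncarcinogen
}

total_cancer_risk = 0.0
hazard_index = 0.0
for name, c in chemicals.items():
    intake = c["conc"] * INTAKE_L_PER_DAY * EXPOSURE_FACTOR / BODY_WEIGHT_KG   # mg/kg-day
    if c["slope"] is not None:
        total_cancer_risk += intake * c["slope"]        # incremental lifetime cancer risk
    hazard_index += intake / c["rfd"]                   # hazard quotient

print(f"total cancer risk: {total_cancer_risk:.2e}")
print(f"hazard index:      {hazard_index:.2f}")
```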

  8. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  9. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical-model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes that often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  10. Identification and control of factors influencing flow-accelerated corrosion in HRSG units using computational fluid dynamics modeling, full-scale air flow testing, and risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pietrowski, Ronald L. [The Consolidated Edison Company of New York, Inc., New York, NY (United States)

    2010-11-15

    In 2009, Consolidated Edison's East River heat recovery steam generator units 10 and 20 both experienced economizer tube failures which forced each unit offline. Extensive inspections indicated that the primary failure mechanism was flow-accelerated corrosion (FAC). The inspections revealed evidence of active FAC in all 7 of the economizer modules, with the most advanced stages of degradation being noted in center modules. Analysis determined that various factors were influencing and enabling this corrosion mechanism. Computational fluid dynamics and full-scale air flow testing showed very turbulent feedwater flow prevalent in areas of the modules corresponding with the pattern of FAC damage observed through inspection. It also identified preferential flow paths, with higher flow velocities, in certain tubes directly under the inlet nozzles. A FAC risk analysis identified more general susceptibility to FAC in the areas experiencing damage due to feedwater pH, operating temperatures, local shear fluid forces, and the chemical composition of the original materials of construction. These, in combination, were the primary root causes of the failures. Corrective actions were identified, analyzed, and implemented, resulting in equipment replacements and repairs. (orig.)

  11. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application to a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed-object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the

  12. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  13. Analysis and management of risks experienced in tunnel construction

    Directory of Open Access Journals (Sweden)

    Cagatay Pamukcu

    2015-12-01

    Full Text Available In this study, the terms "risk", "risk analysis", "risk assessment" and "risk management" were first defined to avoid any confusion, and the significance of risk analysis and management in engineering projects was emphasized. Both qualitative and quantitative risk analysis techniques were then reviewed, and within the scope of the study the Event Tree Analysis method was selected to analyze the risks of TBM (Tunnel Boring Machine) operations in tunnel construction. After all hazards that could be encountered during tunnel construction by the TBM method had been identified, they were subjected to a Preliminary Hazard Analysis to sort out and prioritize the risks with high scores. The hazards with high risk scores fell into four groups: excavation- and support-induced accidents, accidents stemming from geological conditions, auxiliary works, and the project contract. For these four groups of initiating events, Event Tree Analysis was conducted considering four countermeasures separately. Finally, the quantitative and qualitative results of the Event Tree Analyses undertaken for all initiating events were examined and interpreted together, making comparisons and referring to previous studies.
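
    In an event tree, each initiating event is followed by branch points for the success or failure of successive countermeasures, and the frequency of each outcome sequence is the initiating-event frequency multiplied by the branch probabilities along its path. A generic sketch is shown below; the initiating-event frequency and the four countermeasure failure probabilities are invented placeholders, not values from the tunnelling study.

```python
from itertools import product

# Placeholder event-tree inputs for one initiating event.
initiating_frequency = 0.05                        # events per year
barrier_failure_prob = [0.10, 0.20, 0.05, 0.30]    # four countermeasures, in order

# Enumerate every branch combination: False = countermeasure works, True = it fails.
sequences = []
for outcome in product([False, True], repeat=len(barrier_failure_prob)):
    p = initiating_frequency
    for fails, pf in zip(outcome, barrier_failure_prob):
        p *= pf if fails else (1.0 - pf)
    sequences.append((outcome, p))

# The sequence in which every countermeasure fails is typically the accident end state.
worst = sequences[-1]
print(f"accident sequence frequency: {worst[1]:.2e} per year")
print(f"sum over all sequences (should equal the initiating frequency): "
      f"{sum(p for _, p in sequences):.3f}")
```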

  14. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  15. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  16. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure error and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  17. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time many network information problems demand attention. This paper analyzes the information security of computer networks on the basis of "big data" and puts forward some solutions.

  18. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    Science.gov (United States)

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  19. 49 CFR 260.17 - Credit risk premium analysis.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk Premium...

  20. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  1. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  2. Work related complaints of neck, shoulder and arm among computer office workers: a cross-sectional evaluation of prevalence and risk factors in a developing country

    Directory of Open Access Journals (Sweden)

    Jayawardana Naveen

    2011-08-01

    Full Text Available Abstract Background Complaints of arms, neck and shoulders (CANS) are common among computer office workers. We evaluated an aetiological model with physical/psychosocial risk factors. Methods We invited 2,500 computer office workers for the study. Data on the prevalence and risk factors of CANS were collected with the validated Maastricht Upper Extremity Questionnaire. Workstations were evaluated with the Occupational Safety and Health Administration (OSHA) Visual Display Terminal workstation checklist. Participants' knowledge and awareness were evaluated by a set of expert-validated questions. A binary logistic regression analysis investigated relationships/correlations between risk factors and symptoms. Results The sample size was 2,210. Mean age was 30.8 ± 8.1 years; 50.8% were males. The 1-year prevalence of CANS was 56.9%; the commonest region of complaint was the forearm/hand (42.6%), followed by the neck (36.7%) and shoulder/arm (32.0%). Of those with CANS, 22.7% had taken treatment from a health care professional, and in only 1.1% of those seeking medical advice had an occupation-related injury been suspected/diagnosed. In addition, 9.3% reported CANS-related absenteeism from work, while 15.4% reported CANS causing disruption of normal activities. A majority of the evaluated workstations, in all participants (88.4%) and in those with CANS (91.9%), were OSHA non-compliant. In the binary logistic regression analyses, female gender, daily computer usage, incorrect body posture, bad work habits, work overload, poor social support and poor ergonomic knowledge were associated with CANS and its severity. In a multiple logistic regression analysis controlling for age, gender and duration of occupation, incorrect body posture, bad work habits and daily computer usage were significant independent predictors of CANS. Conclusions The prevalence of work-related CANS among computer office workers in Sri Lanka, a developing South Asian country, is high and comparable to the prevalence in developed countries

  3. Work related complaints of neck, shoulder and arm among computer office workers: a cross-sectional evaluation of prevalence and risk factors in a developing country.

    Science.gov (United States)

    Ranasinghe, Priyanga; Perera, Yashasvi S; Lamabadusuriya, Dilusha A; Kulatunga, Supun; Jayawardana, Naveen; Rajapakse, Senaka; Katulanda, Prasad

    2011-08-04

    Complaints of arms, neck and shoulders (CANS) are common among computer office workers. We evaluated an aetiological model with physical/psychosocial risk factors. We invited 2,500 computer office workers for the study. Data on the prevalence and risk factors of CANS were collected with the validated Maastricht Upper Extremity Questionnaire. Workstations were evaluated with the Occupational Safety and Health Administration (OSHA) Visual Display Terminal workstation checklist. Participants' knowledge and awareness were evaluated by a set of expert-validated questions. A binary logistic regression analysis investigated relationships/correlations between risk factors and symptoms. The sample size was 2,210. Mean age was 30.8 ± 8.1 years; 50.8% were males. The 1-year prevalence of CANS was 56.9%; the commonest region of complaint was the forearm/hand (42.6%), followed by the neck (36.7%) and shoulder/arm (32.0%). Of those with CANS, 22.7% had taken treatment from a health care professional, and in only 1.1% of those seeking medical advice had an occupation-related injury been suspected/diagnosed. In addition, 9.3% reported CANS-related absenteeism from work, while 15.4% reported CANS causing disruption of normal activities. A majority of the evaluated workstations, in all participants (88.4%) and in those with CANS (91.9%), were OSHA non-compliant. In the binary logistic regression analyses, female gender, daily computer usage, incorrect body posture, bad work habits, work overload, poor social support and poor ergonomic knowledge were associated with CANS and its severity. In a multiple logistic regression analysis controlling for age, gender and duration of occupation, incorrect body posture, bad work habits and daily computer usage were significant independent predictors of CANS. The prevalence of work-related CANS among computer office workers in Sri Lanka, a developing South Asian country, is high and comparable to the prevalence in developed countries. Work-related physical factors, psychosocial factors and

  4. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
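
    The first two decision models compared in the study can be illustrated for a single household choosing whether to invest in a loss-reducing measure: expected utility theory (here with a risk-neutral utility) weighs probabilities linearly, while prospect theory applies a probability-weighting function and a loss-averse value function to the same outcomes. The sketch below uses standard Tversky-Kahneman parameter values and invented flood probabilities, damages and measure costs; it is a simplified illustration, not the article's agent-based model.

```python
# Placeholder household decision problem: invest in a flood-proofing measure or not.
p_flood, damage, measure_cost, damage_reduction = 0.01, 50_000.0, 2_000.0, 0.8

def expected_loss(invest: bool) -> float:
    """Expected-utility view with linear (risk-neutral) utility: minimise expected cost."""
    residual_damage = damage * (1 - damage_reduction) if invest else damage
    return p_flood * residual_damage + (measure_cost if invest else 0.0)

def prospect_value(invest: bool, lam=2.25, beta=0.88, gamma=0.69) -> float:
    """Prospect-theory view: loss-averse value function and probability weighting."""
    residual_damage = damage * (1 - damage_reduction) if invest else damage
    w = p_flood ** gamma / (p_flood ** gamma + (1 - p_flood) ** gamma) ** (1 / gamma)
    value_of_cost = -lam * measure_cost ** beta if invest else 0.0   # sure loss when investing
    value_of_flood = -lam * residual_damage ** beta                  # loss if the flood occurs
    return value_of_cost + w * value_of_flood

for invest in (False, True):
    print(f"invest={invest}: expected cost={expected_loss(invest):8.1f}, "
          f"prospect value={prospect_value(invest):10.1f}")
```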

  5. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, which previously required the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, a functionality rarely available in similar programs. Given its characteristics of rapid prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
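
    The sketch below (Python/NumPy, not the Delphi system described) computes a classical experimental semivariogram, the basic quantity behind the semivariogram modelling and kriging steps mentioned above; the coordinates and values are synthetic.

    # Minimal sketch: classical (Matheron) experimental semivariogram on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 100, size=(200, 2))   # sample locations (e.g. soil C or N sites)
    values = np.sin(coords[:, 0] / 20) + 0.1 * rng.standard_normal(200)

    def experimental_semivariogram(coords, values, lags):
        """gamma(h) = mean of half squared differences of pairs falling in each lag bin."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)        # use each pair once
        d, sq = d[iu], sq[iu]
        gamma = []
        for lo, hi in zip(lags[:-1], lags[1:]):
            mask = (d >= lo) & (d < hi)
            gamma.append(sq[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)

    lags = np.linspace(0, 60, 13)                     # 12 lag bins of 5 distance units
    print(experimental_semivariogram(coords, values, lags))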

  6. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
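
    As a small illustration of the distribution fitting reported above, the following sketch fits a log-normal model to simulated arrival data and an exponential model to simulated deletion data with SciPy; the data are placeholders, not the operational data from "the wild".

    # Minimal sketch: fitting the distributions the paper reports as best characterizing
    # vulnerability arrivals (log-normal) and deletions (exponential). Simulated data only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    arrivals = rng.lognormal(mean=2.0, sigma=0.8, size=500)    # e.g. daily arrival counts
    deletions = rng.exponential(scale=5.0, size=500)           # e.g. daily deletion counts

    # Maximum-likelihood fits of the candidate distributions.
    shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
    loc_e, scale_e = stats.expon.fit(deletions, floc=0)

    # Goodness of fit via Kolmogorov-Smirnov; the fitted parameters could then serve
    # as priors for a subsequent Bayesian analysis, as suggested in the abstract.
    print("log-normal fit:", stats.kstest(arrivals, "lognorm", args=(shape, loc, scale)))
    print("exponential fit:", stats.kstest(deletions, "expon", args=(loc_e, scale_e)))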

  7. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either the formation of plastic conditions or the closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system, and the analysis is restarted using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)

  8. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  9. Prognostic value of absence or presence of coronary artery disease determined by 64-slice computed tomography coronary angiography A systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Abdulla, Jawdat; Asferg, Camilla Lundegaard; Kofoed, Klaus Fuglsang

    2011-01-01

    To determine via a meta-analysis the prognostic value of 64-slice computed tomography angiography (CTA) by quantifying risk of major adverse cardiac events (MACE) in different patient groups classified according to CT angiographic findings. A systematic literature search and meta...

  10. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data of 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
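
    A minimal sketch of the product-of-coefficients approach to mediation mentioned above is given below, with simulated stand-ins for TV time, waist circumference and a biomarker; it is not the study's analysis or data.

    # Minimal sketch of the product-of-coefficients method for mediation, using
    # simulated stand-ins for TV time (exposure), waist circumference (mediator)
    # and a cardiometabolic biomarker (outcome). Not the study's actual data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 634
    tv = rng.normal(2.0, 1.0, n)                       # hours/day of TV time (hypothetical)
    wc = 80 + 1.5 * tv + rng.normal(0, 5, n)           # waist circumference in cm (hypothetical)
    biomarker = 1.0 + 0.002 * tv + 0.01 * wc + rng.normal(0, 0.3, n)

    # Path a: exposure -> mediator
    a = sm.OLS(wc, sm.add_constant(tv)).fit().params[1]
    # Path b: mediator -> outcome, adjusted for exposure
    X = sm.add_constant(np.column_stack([tv, wc]))
    b = sm.OLS(biomarker, X).fit().params[2]

    print("indirect (mediated) effect a*b =", a * b)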

  11. Research on Risk Evaluation of Transnational Power Networking Projects Based on the Matter-Element Extension Theory and Granular Computing

    Directory of Open Access Journals (Sweden)

    Jinying Li

    2017-10-01

    Full Text Available In project management, risk assessment is crucial for stakeholders to identify the risk factors during the whole life cycle of the project. A risk evaluation index system of a transnational networking project, which provides an effective way for the grid integration of clean electricity and the sustainable development of the power industry, is constructed in this paper. Meanwhile, a combination of granular computing and order relation analysis (G1 method) is applied to determine the weight of each indicator, and the matter-element extension evaluation model is also employed to seek the global optimal decision during the risk assessment. Finally, a case study is given to validate the index system and evaluation model established in this paper by assessing two different investment schemes of a transnational high voltage direct current (HVDC) transmission project. The result shows that the comprehensive risk level of Scheme 1 is “Low” and the level of Scheme 2 is “General”, which means Scheme 1 is better for the stakeholders from the angle of risk control. The main practical significance of this paper lies in that it can provide a reference and decision support for the government’s power sectors, investment companies and other stakeholders when carrying out related activities.

  12. Putting problem formulation at the forefront of GMO risk analysis.

    Science.gov (United States)

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  13. Streamlining project delivery through risk analysis.

    Science.gov (United States)

    2015-08-01

    Project delivery is a significant area of concern and is subject to several risks throughout Plan Development : Process (PDP). These risks are attributed to major areas of project development, such as environmental : analysis, right-of-way (ROW) acqu...

  14. Reliability and Availability of Cloud Computing

    CERN Document Server

    Bauer, Eric

    2012-01-01

    A holistic approach to service reliability and availability of cloud computing Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le

  15. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  16. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  17. Computational red teaming risk analytics of big-data-to-decisions intelligent systems

    CERN Document Server

    Abbass, Hussein A

    2015-01-01

    Written to bridge the information needs of management and computational scientists, this book presents the first comprehensive treatment of Computational Red Teaming (CRT).  The author describes an analytics environment that blends human reasoning and computational modeling to design risk-aware and evidence-based smart decision making systems. He presents the Shadow CRT Machine, which shadows the operations of an actual system to think with decision makers, challenge threats, and design remedies. This is the first book to generalize red teaming (RT) outside the military and security domains and it offers coverage of RT principles, practical and ethical guidelines. The author utilizes Gilbert’s principles for introducing a science. Simplicity: where the book follows a special style to make it accessible to a wide range of  readers. Coherence:  where only necessary elements from experimentation, optimization, simulation, data mining, big data, cognitive information processing, and system thinking are blend...

  18. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  19. FRAC (failure rate analysis code): a computer program for analysis of variance of failure rates. An application user's guide

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.; McInteer, C.R.

    1982-03-01

    Probabilistic risk assessments (PRAs) require estimates of the failure rates of various components whose failure modes appear in the event and fault trees used to quantify accident sequences. Several reliability data bases have been designed to provide the reliability data needed to construct these estimates. In the nuclear industry, the Nuclear Plant Reliability Data System (NPRDS) and the In-Plant Reliability Data System (IPRDS), among others, were designed for this purpose. An important characteristic of such data bases is the selection and identification of numerous factors used to classify each reported component and its subsequent failures. However, the presence of such factors often complicates the analysis of reliability data in the sense that it is inappropriate to group (that is, pool) data for those combinations of factors that yield significantly different failure rate values. These types of data can be analyzed by analysis of variance. FRAC (Failure Rate Analysis Code) is a computer code that performs an analysis of variance of failure rates. In addition, FRAC provides failure rate estimates
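
    The sketch below is not FRAC itself; it only illustrates, with SciPy and invented failure-rate data, the kind of one-way analysis of variance that underlies the pooling decision described above.

    # Minimal sketch (not the FRAC code): one-way ANOVA on component failure rates
    # grouped by a classification factor, to decide whether pooling is appropriate.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Hypothetical failure rates (failures per 1e6 hours) for one component type,
    # classified by three plant environments.
    group_a = rng.gamma(shape=4.0, scale=0.5, size=12)
    group_b = rng.gamma(shape=4.0, scale=0.5, size=15)
    group_c = rng.gamma(shape=4.0, scale=0.9, size=10)   # possibly harsher environment

    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
    if p_value < 0.05:
        print("Rates differ significantly: do not pool these groups.")
    else:
        print("No significant difference: pooling the groups may be acceptable.")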

  20. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  1. Multi-hazard risk analysis for management strategies

    Science.gov (United States)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.

  2. Numerical analysis on hydrogen stratification and post-inerting of hydrogen risk

    International Nuclear Information System (INIS)

    Peng, Cheng; Tong, Lili; Cao, Xuewu

    2016-01-01

    Highlights: • A three-dimensional computational model was built and its applicability was discussed. • The formation of helium stratification was further studied. • Three factors influencing the post-inerting mitigation of hydrogen risk were analyzed. - Abstract: In severe accidents, the risk of hydrogen explosion threatens the integrity of the nuclear reactor containment. According to nuclear regulations, hydrogen control is required to ensure the safe operation of the nuclear reactor. In this study, Computational Fluid Dynamics (CFD) has been applied to analyze the process of hydrogen stratification and the post-inerting of hydrogen risk in the Large-Scale Gas Mixing Facility. A three-dimensional computational model was built and the applicability of different turbulence models was discussed. The results show that the helium concentration calculated by the standard k–ε turbulence model is closest to the experimental data. By analyzing the formation of helium stratification at different injection velocities, it is found that when the injection mass flow is constant and the injection velocity of helium increases, the mixing of helium and air is enhanced while there is little influence on the formation of helium stratification. In addition, the influences of mass flow rate, injection location and direction, and inert gas on the post-inerting of hydrogen risk have been analyzed, with the following results: as the mass flow rate increases, the mitigation effect of nitrogen on hydrogen risk is further improved; there is an obvious local difference between the mitigation effects of nitrogen on hydrogen risk for different injection directions and locations; and when the inert gas is injected at the same mass flow rate, the mitigation effect of steam on hydrogen risk is better than that of nitrogen. This study can provide technical support for the mitigation of hydrogen risk in small LWR containments.

  3. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  4. Environmental risk analysis of hazardous material rail transportation

    Energy Technology Data Exchange (ETDEWEB)

    Saat, Mohd Rapik, E-mail: mohdsaat@illinois.edu [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Werth, Charles J.; Schaeffer, David [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Yoon, Hongkyu [Sandia National Laboratories, Albuquerque, NM 87123 (United States); Barkan, Christopher P.L. [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States)

    2014-01-15

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
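
    As a toy illustration of the risk bookkeeping described (not the HMTECM), the sketch below multiplies traffic, accident rate, conditional release probability and cleanup cost into an annual risk and a risk per car-mile; all figures are placeholders.

    # Toy sketch of the annual-risk bookkeeping described above (not the HMTECM model):
    # expected annual cleanup cost = car-miles x accident rate x conditional release
    # probability x expected cleanup cost per release. All numbers are placeholders.
    car_miles_per_year = 5.0e6        # annual traffic for one product
    accident_rate = 1.0e-7            # tank car accidents per car-mile
    release_probability = 0.05        # probability of a release given an accident
    cleanup_cost_per_release = 1.2e6  # expected soil/groundwater cleanup cost (USD)

    annual_risk = (car_miles_per_year * accident_rate
                   * release_probability * cleanup_cost_per_release)
    risk_per_car_mile = annual_risk / car_miles_per_year

    print(f"annual risk: ${annual_risk:,.0f}")
    print(f"risk per car-mile: ${risk_per_car_mile:.4f}")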

  5. Environmental risk analysis of hazardous material rail transportation

    International Nuclear Information System (INIS)

    Saat, Mohd Rapik; Werth, Charles J.; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P.L.

    2014-01-01

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials

  6. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate a transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort was initiated when the program of SIMMER-series computer code development was initiated in the late 1970s in the USA. Successful application of the latest SIMMER-II in USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase multi-component fluid dynamics technologies and their expected implication on a future reliable transition phase analysis. (author)

  7. Green Tea Consumption and Risk of Pancreatic Cancer: A Meta-analysis

    Directory of Open Access Journals (Sweden)

    Jin-Long Zeng

    2014-10-01

    Full Text Available Emerging laboratory and animal studies indicate that green tea inhibits the development and progression of pancreatic cancer, but evidence from epidemiologic studies appears inconsistent and inconclusive. A meta-analysis summarizing published case-control and cohort studies was performed to evaluate the association of green tea consumption with risk of pancreatic cancer. Pertinent studies were identified by a search of PubMed and EMBASE up to April 2014. A random-effects model was used to compute summary risk estimates. A total of three case-control studies and five prospective studies were included, comprising 2,317 incident cases and 288,209 subjects. Of these, three studies were from China and the remainder were conducted in Japan. Overall, neither high vs. low green tea consumption (odds ratio (OR) = 0.99, 95% confidence interval (CI) = 0.78–1.25) nor an increase in green tea consumption of two cups/day (OR = 0.95, 95% CI = 0.85–1.06) was associated with risk of pancreatic cancer. The null association persisted when the analysis was stratified by sex or restricted to non-smokers. In the stratification by study location, the summary OR for the studies from China and for those from Japan was 0.77 (95% CI = 0.60–0.99) and 1.21 (95% CI = 0.94–1.54), respectively (P for difference = 0.04). Cumulative epidemiologic evidence suggests that green tea consumption is not associated with pancreatic cancer risk.
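
    The following sketch shows a generic DerSimonian-Laird random-effects pooling of study odds ratios, the general type of model referred to above; the ORs and confidence intervals in it are invented placeholders, not the eight included studies.

    # Minimal sketch of random-effects pooling of odds ratios (DerSimonian-Laird).
    import numpy as np

    or_list = np.array([0.85, 1.10, 0.95, 1.20, 0.75])   # invented study ORs
    ci_low = np.array([0.60, 0.80, 0.70, 0.90, 0.50])
    ci_high = np.array([1.20, 1.50, 1.30, 1.60, 1.10])

    y = np.log(or_list)                                  # log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96) # standard errors from 95% CIs
    w = 1 / se**2                                        # fixed-effect weights

    # DerSimonian-Laird estimate of the between-study variance tau^2
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (se**2 + tau2)                            # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))

    print(f"pooled OR = {np.exp(y_re):.2f} "
          f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")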

  8. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  9. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to traditional validation by test only mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions.Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations. This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions
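
    As a small illustration of the three-grid discretization-uncertainty estimate that the cited verification methodology is built around, the sketch below performs a Richardson extrapolation and grid convergence index (GCI) calculation; the refinement ratio and airflow values are invented.

    # Sketch of a three-grid Richardson extrapolation / grid convergence index (GCI)
    # calculation. Grid refinement ratio and airflow-speed values are invented.
    import math

    r = 2.0                                          # constant grid refinement ratio
    f_fine, f_medium, f_coarse = 4.95, 5.00, 5.20    # predicted airflow speed (m/s), hypothetical

    # Observed order of accuracy from the three solutions
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

    # Richardson-extrapolated value and GCI on the fine grid (safety factor Fs = 1.25)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)
    gci_fine = 1.25 * abs((f_medium - f_fine) / f_fine) / (r**p - 1)

    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {f_exact:.3f}")
    print(f"GCI (fine grid) = {100 * gci_fine:.2f}% of the fine-grid solution")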

  10. Predictive value and modeling analysis of MSCT signs in gastrointestinal stromal tumors (GISTs) to pathological risk degree.

    Science.gov (United States)

    Wang, J-K

    2017-03-01

    By analyzing MSCT (multi-slice computed tomography) signs associated with different risk degrees in gastrointestinal stromal tumors, this paper aimed to discuss the predictive value and modeling analysis of MSCT signs in GISTs (gastrointestinal stromal tumors) with respect to pathological risk degree. 100 cases of primary GISTs with abdominal and pelvic MSCT scans were involved in this study. All MSCT scan findings and enhanced findings were analyzed and compared among cases of different pathological risk degrees. A GIST diagnostic model was then established using a support vector machine (SVM) algorithm, and its diagnostic value was evaluated. All lesions were solitary; there were 46 low-risk, 24 medium-risk and 30 high-risk cases. Across high-risk, medium-risk and low-risk GISTs, there were statistically significant differences in tumor growth pattern, size, shape, fat space, presence of calcification, ulcer, enhancement pattern, and peritumoral and intratumoral vessels (p < 0.05), whereas no significant differences were found in values at each period (plain scan, arterial phase, venous phase) (p > 0.05). The apparent differences lay in the plain scan, arterial phase and venous phase for each risk degree. The diagnostic accuracy of the SVM diagnostic model established with 10 imaging features as indexes was 70.0%, and it was especially reliable when diagnosing GISTs of high or low risk. Preoperative analysis of MSCT features is clinically significant for the diagnosis of risk degree and prognosis; a GIST diagnostic model established on the basis of SVM possesses high diagnostic value.
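
    The sketch below illustrates, with scikit-learn and random placeholder data, how an SVM classifier can be trained and cross-validated on a table of imaging features; it mirrors the spirit, not the specifics, of the diagnostic model described above.

    # Minimal sketch of an SVM-based diagnostic model trained on a table of imaging
    # features. The 10-feature matrix and risk labels are random placeholders,
    # not the 100 GIST cases analyzed in the paper.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    X = rng.normal(size=(100, 10))              # 10 MSCT-derived features per lesion
    y = rng.integers(0, 3, size=100)            # 0 = low, 1 = medium, 2 = high risk

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} ± {scores.std():.2f}")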

  11. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    Science.gov (United States)

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon that permit the access to ad hoc computing infrastructure scaled according to the complexity of the experiment, so its costs and times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources, and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  12. Traditional cardiovascular risk factors and coronary collateral circulation: Protocol for a systematic review and meta-analysis of case-control studies.

    Science.gov (United States)

    Xing, Zhenhua; Pei, Junyu; Tang, Liang; Hu, Xinqun

    2018-04-01

    Well-developed coronary collateral circulation usually results in smaller infarct size, improved cardiac function, and lower mortality. Traditional coronary risk factors (diabetes, hypertension, and smoking) have some effect on coronary collateral circulation. However, the associations between these risk factors and coronary collateral circulation are controversial. Given the conflicting evidence regarding the effect of traditional cardiovascular risk factors on coronary collateral circulation, we developed this meta-analysis protocol to investigate the relationship between traditional risk factors of coronary artery disease and coronary collateral circulation. MEDLINE, EMBASE, and the Science Citation Index will be searched to identify relevant studies. The primary outcome of this meta-analysis is well-developed coronary collateral circulation. Meta-analysis will be performed to calculate the odds ratio (OR) and 95% confidence interval (CI) for traditional coronary risk factors (diabetes, smoking, hypertension). Pooled ORs will be computed as the Mantel-Haenszel-weighted average of the ORs of all included studies. Sensitivity analysis, quality assessment, publication bias analysis, and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach will be performed to ensure the reliability of our results. This study will provide a high-quality synthesis of the current evidence on traditional risk factors and collateral circulation. The conclusions of our systematic review and meta-analysis will provide evidence to judge whether traditional risk factors affect coronary collateral circulation. Ethics and dissemination: Ethical approval is not required because our systematic review and meta-analysis will be based on published data without interventions on patients. The findings of this study will be published in a peer-reviewed journal.
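
    As an illustration of the Mantel-Haenszel weighting named in the protocol, the sketch below pools an odds ratio from per-study 2x2 tables; the counts are invented placeholders.

    # Minimal sketch of a Mantel-Haenszel pooled odds ratio from per-study 2x2 tables
    # (exposed/unexposed vs. well-developed/poor collaterals). Counts are invented.
    import numpy as np

    # Each row: (a, b, c, d) = exposed cases, exposed controls, unexposed cases, unexposed controls
    tables = np.array([
        [30, 70, 45, 55],
        [22, 78, 35, 65],
        [50, 150, 60, 140],
    ])

    a, b, c, d = tables.T.astype(float)
    n = tables.sum(axis=1)                        # per-study totals

    or_mh = np.sum(a * d / n) / np.sum(b * c / n)
    print(f"Mantel-Haenszel pooled OR = {or_mh:.2f}")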

  13. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    International Nuclear Information System (INIS)

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J.

    1994-01-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force

  14. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows one to estimate the sensitivity indices of each scalar model input, while the 'dispersion model' allows one to derive the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
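
    The sketch below estimates first-order Sobol' indices with a pick-freeze Monte Carlo scheme on a toy analytic function; it illustrates only the generic variance-based setup, not the joint GLM/GAM metamodelling proposed in the paper.

    # Minimal sketch: first-order Sobol' indices from a pick-freeze Monte Carlo scheme
    # on a toy function with three scalar inputs (stand-in for an expensive code).
    import numpy as np

    def model(x):
        # Ishigami-type toy response
        return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

    rng = np.random.default_rng(5)
    n, dim = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, size=(n, dim))
    B = rng.uniform(-np.pi, np.pi, size=(n, dim))
    yA, yB = model(A), model(B)
    var_y = np.concatenate([yA, yB]).var()

    for i in range(dim):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]                         # replace only the i-th input
        yAB_i = model(AB_i)
        s_i = np.mean(yB * (yAB_i - yA)) / var_y     # Saltelli estimator of S_i
        print(f"S_{i + 1} = {s_i:.2f}")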

  15. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...
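
    As a minimal illustration of the shear-frame idealization mentioned above, the sketch below assembles lumped storey masses and stiffnesses into M and K and obtains natural frequencies from the generalized eigenvalue problem; the values are arbitrary.

    # Minimal sketch of a shear-frame model: lumped storey masses and storey shear
    # stiffnesses assembled into M and K, natural frequencies from K x = w^2 M x.
    # All numerical values are arbitrary placeholders.
    import numpy as np
    from scipy.linalg import eigh

    m = np.array([2.0e5, 2.0e5, 1.5e5])          # storey masses (kg), ground to roof
    k = np.array([3.0e8, 2.5e8, 2.0e8])          # storey shear stiffnesses (N/m)

    n = len(m)
    M = np.diag(m)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += k[i]
        if i + 1 < n:
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]

    eigvals, _ = eigh(K, M)                      # generalized symmetric eigenproblem
    freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
    print("natural frequencies (Hz):", np.round(freqs_hz, 2))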

  16. Computational dosimetry and risk assessment of radioinduced cancer: studies in mammary glands radiotherapy, radiopharmaceuticals and internal contamination

    International Nuclear Information System (INIS)

    Mendes, Bruno Melo

    2017-01-01

    The use of ionizing radiation (IR) in medicine has increased considerably. The benefits generated by diagnostic and therapeutic techniques with IR are proven. Nevertheless, the risks arising from these uses should not be underestimated. Justification, a basic principle of radiation protection, states that the benefits from exposures must outweigh the detriment. Cancer induction is one of the components of detriment. Thus, the study of the benefit/detriment ratio should take into account the cancer incidence and mortality estimates resulting from a given diagnostic or therapeutic radiological technique. The risk of cancer induction depends on the absorbed doses in the irradiated organs and tissues. Thus, IR dosimetry is essential to evaluate the benefit/detriment ratio. The present work aims to perform computational dosimetric evaluations and estimations of the cancer induction risk after ionizing radiation exposure. The investigated situations cover the nuclear medicine, radiological contamination and radiotherapy fields. Computational dosimetry, with the MCNPX Monte Carlo code, was used as a tool to calculate the absorbed dose in the organs of interest of the voxelized human models. The simulations were also used to obtain calibration factors and to optimize in vivo monitoring systems for internal contamination dosimetry. A standard breast radiotherapy (RT) protocol was simulated using the MCNPX code. The calculation of the radiation-induced cancer risk was adapted from the BEIR VII methodology for the Brazilian population. The absorbed doses used in the risk calculations were obtained through computational simulations of different exposure scenarios. During this work, two new computational phantoms, DM_BRA and VW, were generated from tomographic images. Twelve additional voxelized phantoms, including the reference phantoms RCP-AM and RCP-AF and the child, baby, and fetus models, were adapted to run on MCNP. Internal Dosimetry Protocols (IDP) for radiopharmaceuticals and for internal contamination

  17. RISK ANALYSIS APPLIED IN OIL EXPLORATION AND PRODUCTION

    African Journals Online (AJOL)

    ES Obe

    Department of Civil Engineering, University of Nigeria, Nsukka, Enugu State, Nigeria. Keywords: risk analysis, oil field, risk management, projects, investment opportunity.

  18. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  19. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  20. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  1. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  2. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    Science.gov (United States)

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2018-02-01

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
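
    The sketch below illustrates, with invented distribution parameters, the kind of frequency/severity simulation that produces cyber-loss distributions with and without a countermeasure, in the spirit of the analyses described above.

    # Minimal sketch of a frequency/severity simulation producing annual cyber-loss
    # distributions with and without a countermeasure. All parameters are invented.
    import numpy as np

    rng = np.random.default_rng(6)
    years = 20_000

    def simulate_annual_losses(attack_rate, severity_sigma):
        """Poisson attack counts with log-normal per-incident losses."""
        counts = rng.poisson(attack_rate, size=years)
        return np.array([rng.lognormal(mean=11.0, sigma=severity_sigma, size=c).sum()
                         for c in counts])

    baseline = simulate_annual_losses(attack_rate=4.0, severity_sigma=1.2)
    with_control = simulate_annual_losses(attack_rate=2.5, severity_sigma=1.0)  # assumed effect

    for name, sample in [("baseline", baseline), ("with countermeasure", with_control)]:
        print(f"{name}: mean annual loss ${sample.mean():,.0f}, "
              f"95th percentile ${np.percentile(sample, 95):,.0f}")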

  3. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context into computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences between various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative, approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231

  4. Prognostic Value of Coronary Computed Tomography Imaging in Patients at High Risk Without Symptoms of Coronary Artery Disease.

    Science.gov (United States)

    Dedic, Admir; Ten Kate, Gert-Jan R; Roos, Cornelis J; Neefjes, Lisan A; de Graaf, Michiel A; Spronk, Angela; Delgado, Victoria; van Lennep, Jeanine E Roeters; Moelker, Adriaan; Ouhlous, Mohamed; Scholte, Arthur J H A; Boersma, Eric; Sijbrands, Eric J G; Nieman, Koen; Bax, Jeroen J; de Feijter, Pim J

    2016-03-01

    At present, traditional risk factors are used to guide the cardiovascular management of asymptomatic subjects. Intensified surveillance may be warranted in those identified as being at high risk of developing cardiovascular disease (CVD). This study aims to determine the prognostic value of coronary computed tomography (CT) angiography (CCTA), in addition to the coronary artery calcium score (CACS), in patients at high CVD risk without symptoms suspect for coronary artery disease (CAD). A total of 665 patients at high risk (mean age 56 ± 9 years, 417 men), having at least one important CVD risk factor (diabetes mellitus, familial hypercholesterolemia, peripheral artery disease, or severe hypertension) or a calculated European Systematic Coronary Risk Evaluation of >10%, were included from outpatient clinics at 2 academic centers. Follow-up was performed for the occurrence of adverse events, including all-cause mortality, nonfatal myocardial infarction, unstable angina, or coronary revascularization. During a median follow-up of 3.0 (interquartile range 1.3 to 4.1) years, adverse events occurred in 40 subjects (6.0%). By multivariate analysis, adjusted for age, gender, and CACS, obstructive CAD on CCTA (≥50% luminal stenosis) was a significant predictor of adverse events (hazard ratio 5.9 [CI 1.3 to 26.1]). Addition of CCTA to age, gender, plus CACS increased the C statistic from 0.81 to 0.84 and resulted in a total net reclassification index of 0.19. In conclusion, CCTA has incremental prognostic value and risk reclassification benefit beyond CACS in patients without CAD symptoms but at high risk of developing CVD. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    ... are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes ..., but also when wading through a flooded area. The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused ...

  6. Probabilistic risk analysis in chemical engineering

    International Nuclear Information System (INIS)

    Schmalz, F.

    1991-01-01

    In risk analysis in the chemical industry, recognising potential risks is considered more important than assessing their quantitative extent. Even in assessing risks, emphasis is not on the probability involved but on the possible extent. Qualitative assessment has proved valuable here. Probabilistic methods are used in individual cases where the wide implications make it essential to be able to assess the reliability of safety precautions. In this case, assessment therefore centres on the reliability of technical systems and not on the extent of a chemical risk. 7 figs

  7. Method for environmental risk analysis (MIRA) revision 2007

    International Nuclear Information System (INIS)

    2007-04-01

    OLF's instruction manual for carrying out environmental risk analyses provides a united approach and a common framework for environmental risk assessments. This is based on the best information available. The manual implies standardizations of a series of parameters, input data and partial analyses that are included in the environmental risk analysis. Environmental risk analyses carried out according to the MIRA method will thus be comparable between fields and between companies. In this revision an update of the text in accordance with today's practice for environmental risk analyses and prevailing regulations is emphasized. Moreover, method adjustments for especially protected beach habitats have been introduced, as well as a general method for estimating environmental risk concerning fish. Emphasis has also been put on improving environmental risk analysis' possibilities to contribute to a better management of environmental risk in the companies (ml)

  8. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for an exhaustive monitoring of user processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  9. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, supported by a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized for end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with techniques that allow remote data access and with optimizations for typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, including virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools that give users exhaustive monitoring of their processes at the site and an efficient support system in case of problems. We report on the results of tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the Consortium.

  10. Report--COMOLA: A Computer System for the Analysis of Interlanguage Data.

    Science.gov (United States)

    Jagtman, Margriet; Bongaerts, Theo

    1994-01-01

    Discusses the design and use of the Computer Model for Language Acquisition (COMOLA), a computer program designed to analyze syntactic development in second-language learners by examining their oral utterances. Also compares COMOLA to the recently developed Computer-Aided Linguistic Analysis (COALA) program. (MDM)

  11. Evaluating the risks of clinical research: direct comparative analysis.

    Science.gov (United States)

    Rid, Annette; Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S; Wendler, David

    2014-09-01

    Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. This study employed a conceptual and normative analysis, and use of an illustrative example. Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks.

  12. Computer users at risk: Health disorders associated with prolonged computer use

    OpenAIRE

    Abida Ellahi; M. Shahid Khalil; Fouzia Akram

    2011-01-01

    In keeping with the ISO standards that emphasize assessing how a product is used, this research aims to assess prolonged computer use and its effects on human health. The objective of this study was to investigate the association between the extent of computer use (per day) and carpal tunnel syndrome, computer stress syndrome, computer vision syndrome and musculoskeletal problems. The second objective was to investigate the extent of simultaneous occurrence of carpal tunnel syndr...

  13. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  14. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Several computer systems for the management of nuclear power reactors have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  15. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    Science.gov (United States)

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. © 2015 Society for Risk Analysis.

  16. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition and instrument control are performed automatically in real time by a dedicated system computer, with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion-accelerating high voltage under accurate computer control

  17. Multifractal Analysis of Seismically Induced Soft-Sediment Deformation Structures Imaged by X-Ray Computed Tomography

    Science.gov (United States)

    Nakashima, Yoshito; Komatsubara, Junko

    Unconsolidated soft sediments deform and mix complexly by seismically induced fluidization. Such geological soft-sediment deformation structures (SSDSs) recorded in boring cores were imaged by X-ray computed tomography (CT), which enables visualization of the inhomogeneous spatial distribution of iron-bearing mineral grains as strong X-ray absorbers in the deformed strata. Multifractal analysis was applied to the two-dimensional (2D) CT images with various degrees of deformation and mixing. The results show that the distribution of the iron-bearing mineral grains is multifractal for less deformed/mixed strata and almost monofractal for fully mixed (i.e. almost homogenized) strata. Computer simulations of deformation of real and synthetic digital images were performed using the egg-beater flow model. The simulations successfully reproduced the transformation from the multifractal spectra into almost monofractal spectra (i.e. almost convergence on a single point) with an increase in deformation/mixing intensity. The present study demonstrates that multifractal analysis coupled with X-ray CT and the mixing flow model is useful to quantify the complexity of seismically induced SSDSs, standing as a novel method for the evaluation of cores for seismic risk assessment.
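
    The box-counting estimate of generalized dimensions D_q is the core computation behind a multifractal spectrum of this kind. The sketch below assumes a generic 2D intensity array rather than the authors' CT slices, and the box sizes, q values and synthetic image are illustrative choices: a wide spread of D_q over q indicates multifractality, while a nearly flat D_q suggests a homogenized (monofractal) distribution.

```python
import numpy as np

def generalized_dimensions(image, qs=(-2, -1, 0, 2, 3), box_sizes=(2, 4, 8, 16, 32)):
    """Estimate generalized dimensions D_q of a 2D intensity map by box counting.

    q = 1 (the information dimension) needs a separate limit and is skipped here.
    """
    img = np.asarray(image, dtype=float)
    img = img / img.sum()                        # normalize to a probability measure
    results = {}
    for q in qs:
        log_eps, log_z = [], []
        for s in box_sizes:
            ny, nx = (img.shape[0] // s) * s, (img.shape[1] // s) * s
            boxes = img[:ny, :nx].reshape(ny // s, s, nx // s, s).sum(axis=(1, 3))
            p = boxes[boxes > 0]                 # keep only occupied boxes
            log_eps.append(np.log(s))
            log_z.append(np.log((p ** q).sum())) # partition function Z(q, eps)
        tau, _ = np.polyfit(log_eps, log_z, 1)   # slope gives the mass exponent tau(q)
        results[q] = tau / (q - 1)               # D_q = tau(q) / (q - 1)
    return results

# Synthetic, deliberately clumpy measure standing in for a CT slice.
rng = np.random.default_rng(0)
demo = rng.random((256, 256)) ** 4
print(generalized_dimensions(demo))
```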

  18. Analgesic use and the risk of kidney cancer: a meta-analysis of epidemiologic studies

    Science.gov (United States)

    Choueiri, Toni K.; Je, Youjin; Cho, Eunyoung

    2013-01-01

    Analgesics are the most commonly used over-the-counter drugs worldwide with certain analgesics having cancer prevention effect. The evidence for an increased risk of developing kidney cancer with analgesic use is mixed. Using a meta-analysis design of available observational epidemiologic studies, we investigated the association between analgesic use and kidney cancer risk. We searched the MEDLINE and EMBASE databases to identify eligible case-control or cohort studies published in English until June 2012 for 3 categories of analgesics: acetaminophen, aspirin or other Non-Steroidal Anti-Inflammatory Drugs (NSAIDs). Study-specific effect estimates were pooled to compute an overall relative risk (RR) and its 95% confidence interval (CI) using a random effects model for each category of the analgesics. We identified 20 studies (14 with acetaminophen, 13 with aspirin, and 5 with other NSAIDs) that were performed in 6 countries, including 8,420 cases of kidney cancer. Use of acetaminophen and non-aspirin NSAIDs were associated with an increased risk of kidney cancer (pooled RR, 1.28; 95% CI, 1.15 to 1.44 and 1.25; 95% CI, 1.06 to 1.46, respectively). For aspirin use, we found no overall increased risk (pooled RR, 1.10; 95% CI, 0.95 to 1.28), except for non-US studies (5 studies, pooled RR=1.17, 95% CI, 1.04 to 1.33). Similar increases in risks were seen with higher analgesic intake. In this largest meta-analysis to date, we found that acetaminophen and non-aspirin NSAIDs are associated with a significant risk of developing kidney cancer. Further work is needed to elucidate biologic mechanisms behind these findings. PMID:23400756
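
    The pooling step described here, combining study-specific effect estimates under a random-effects model, can be sketched with the DerSimonian-Laird estimator, a standard choice for this kind of meta-analysis (the abstract does not state which estimator was used). The relative risks and confidence intervals below are invented for illustration, not the study data.

```python
import numpy as np

def random_effects_pool(rrs, ci_lows, ci_highs, z=1.96):
    """DerSimonian-Laird random-effects pooling of relative risks.

    Works on the log scale; within-study variances are recovered from the
    reported 95% confidence intervals.
    """
    y = np.log(rrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * z)   # log-scale standard errors
    w = 1.0 / se**2                                        # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                     # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study variance
    w_re = 1.0 / (se**2 + tau2)                            # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(y_re), np.exp(y_re - z * se_re), np.exp(y_re + z * se_re)

# Hypothetical study-level relative risks (illustrative only).
rr, lo, hi = random_effects_pool(
    rrs=np.array([1.35, 1.10, 1.42, 1.18]),
    ci_lows=np.array([1.05, 0.90, 1.12, 0.95]),
    ci_highs=np.array([1.74, 1.34, 1.80, 1.47]),
)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```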

  19. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications

  20. Risk evaluation system for facility safeguards and security planning

    International Nuclear Information System (INIS)

    Udell, C.J.; Carlson, R.L.

    1987-01-01

    The Risk Evaluation System (RES) is an integrated approach to determining safeguards and security effectiveness and risk. RES combines the planning and technical analysis into a format that promotes an orderly development of protection strategies, planning assumptions, facility targets, vulnerability and risk determination, enhancement planning, and implementation. In addition, the RES computer database program enhances the capability of the analyst to perform a risk evaluation of the facility. The computer database is menu driven using data input screens and contains an algorithm for determining the probability of adversary defeat and risk. Also, base case and adjusted risk data records can be maintained and accessed easily

  1. Risk evaluation system for facility safeguards and security planning

    International Nuclear Information System (INIS)

    Udell, C.J.; Carlson, R.L.

    1987-01-01

    The Risk Evaluation System (RES) is an integrated approach to determining safeguards and security effectiveness and risk. RES combines the planning and technical analysis into a format that promotes an orderly development of protection strategies, planning assumptions, facility targets, vulnerability and risk determination, enhancement planning, and implementation. In addition, the RES computer database program enhances the capability of the analyst to perform a risk evaluation of the facility. The computer database is menu driven using data input screens and contains an algorithm for determining the probability of adversary defeat and risk. Also, base case and adjusted risk data records can be maintained and accessed easily

  2. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    Zhang, Limao; Wu, Xianguo; Skibniewski, Miroslaw J.; Zhong, Jingbing; Lu, Yujie

    2014-01-01

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations on the current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has a capacity of implementing deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis progress is extended to the entire life cycle of risk-prone events, including the pre-accident, during-construction continuous and post-accident control. A typical hazard concerning the tunnel leakage in the construction of Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis progress is extended to entire life cycle of risk-prone events. • A typical
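
    A minimal sketch of the fuzzy probability assessment step is given below. It assumes one reading of the "3σ criterion", taking the support of the triangular fuzzy number as the expert estimate plus or minus three standard deviations, and it uses a generic weighted-average defuzzifier in place of the paper's exact α-weighted valuation rule; the function names and numbers are illustrative, not taken from the study.

```python
def triangular_from_3sigma(mean, std):
    """Build a triangular fuzzy probability (a, m, b) from an expert estimate,
    placing the support at mean +/- 3*std (one reading of the '3 sigma criterion')."""
    a, m, b = mean - 3 * std, mean, mean + 3 * std
    return max(a, 0.0), m, min(b, 1.0)        # clip into [0, 1] for probabilities

def defuzzify(tfn, alpha=0.5):
    """Alpha-weighted crisp value of a triangular fuzzy number (a, m, b).

    alpha = 0.5 reduces to the usual centre-weighted average; other values tilt
    the estimate toward the pessimistic (a) or optimistic (b) end. This is a
    generic weighted-average defuzzifier, not necessarily the paper's exact rule.
    """
    a, m, b = tfn
    return alpha * (a + m) / 2 + (1 - alpha) * (m + b) / 2

# Hypothetical basic risk factor: expert estimates P ~ 0.02 with std 0.005.
tfn = triangular_from_3sigma(0.02, 0.005)
print(tfn, defuzzify(tfn, alpha=0.5))
```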

  3. On the validation of risk analysis-A commentary

    International Nuclear Information System (INIS)

    Rosqvist, Tony

    2010-01-01

    Aven and Heide (2009) [1] provided interesting views on the reliability and validation of risk analysis. The four validation criteria presented are contrasted with modelling features related to the relative frequency-based and Bayesian approaches to risk analysis. In this commentary I would like to bring forth some issues on validation that partly confirm and partly suggest changes in the interpretation of the introduced validation criteria, especially in the context of low-probability, high-consequence systems. The mental model of an expert in assessing probabilities is argued to be a key notion in understanding the validation of a risk analysis.

  4. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

    The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but moves it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
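
    The probability-of-failure calculation at the heart of capacity-demand analysis can be sketched as a small Monte Carlo experiment: sample the demand (acting stress) and the capacity (resistance) from their assumed statistical distributions and count how often demand meets or exceeds capacity. The distributions and parameter values below are illustrative assumptions, not data from the cited study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical distributions for a single conduit section (illustrative values):
# demand = acting hoop stress, capacity = remaining yield strength after aging.
demand = rng.normal(loc=180.0, scale=20.0, size=n)                  # MPa
capacity = rng.lognormal(mean=np.log(260.0), sigma=0.08, size=n)    # MPa

# Probability of failure = P(demand >= capacity), estimated by simple counting.
p_failure = np.mean(demand >= capacity)
print(f"estimated probability of failure ~= {p_failure:.2e}")
```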

  5. Computer vision syndrome among computer office workers in a developing country: an evaluation of prevalence and risk factors.

    Science.gov (United States)

    Ranasinghe, P; Wathurapatha, W S; Perera, Y S; Lamabadusuriya, D A; Kulatunga, S; Jayawardana, N; Katulanda, P

    2016-03-09

    Computer vision syndrome (CVS) is a group of visual symptoms experienced in relation to the use of computers. Nearly 60 million people suffer from CVS globally, resulting in reduced productivity at work and reduced quality of life of the computer worker. The present study aims to describe the prevalence of CVS and its associated factors among a nationally representative sample of Sri Lankan computer workers. Two thousand five hundred computer office workers were invited for the study from all nine provinces of Sri Lanka between May and December 2009. A self-administered questionnaire was used to collect socio-demographic data, symptoms of CVS and its associated factors. A binary logistic regression analysis was performed in all patients with 'presence of CVS' as the dichotomous dependent variable and age, gender, duration of occupation, daily computer usage, pre-existing eye disease, not using a visual display terminal (VDT) filter, adjusting brightness of screen, use of contact lenses, angle of gaze and ergonomic practices knowledge as the continuous/dichotomous independent variables. A similar binary logistic regression analysis was performed in all patients with 'severity of CVS' as the dichotomous dependent variable and the other continuous/dichotomous independent variables. The sample size was 2210 (response rate 88.4%). Mean age was 30.8 ± 8.1 years and 50.8% of the sample were males. The 1-year prevalence of CVS in the study population was 67.4%. Female gender (OR: 1.28), duration of occupation (OR: 1.07), daily computer usage (OR: 1.10), pre-existing eye disease (OR: 4.49), not using a VDT filter (OR: 1.02), use of contact lenses (OR: 3.21) and ergonomics practices knowledge (OR: 1.24) were all significantly associated with the presence of CVS. The duration of occupation (OR: 1.04) and presence of pre-existing eye disease (OR: 1.54) were significantly associated with the presence of 'severe CVS'. Sri Lankan computer workers had a high prevalence of CVS. Female gender
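
    A binary logistic regression of the kind described, with presence of CVS as the dependent variable and exponentiated coefficients reported as odds ratios, can be sketched as follows. The data are synthetic and the predictor set is reduced to three variables purely for illustration; statsmodels is one common choice for fitting such a model, not necessarily the one used in the study.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for the survey data (the real study had ~2200 respondents).
rng = np.random.default_rng(1)
n = 2000
female = rng.integers(0, 2, n)
hours_per_day = rng.uniform(1, 10, n)
eye_disease = rng.integers(0, 2, n)

# Assumed "true" effects, chosen only to make the example run.
logit = -1.0 + 0.25 * female + 0.10 * hours_per_day + 1.5 * eye_disease
cvs = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit the logistic model and report odds ratios (exp of the coefficients).
X = sm.add_constant(np.column_stack([female, hours_per_day, eye_disease]))
fit = sm.Logit(cvs, X).fit(disp=0)
odds_ratios = np.exp(fit.params[1:])   # skip the intercept
print(dict(zip(["female", "hours/day", "eye disease"], odds_ratios.round(2))))
```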

  6. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the probabilities of accidents and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are highly meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region
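
    A toy illustration of the monitoring idea, not the paper's ESD formalism, is sketched below: one system state variable fluctuates over a mission, an assumed link function maps it to an instantaneous accident probability, and steps where that probability crosses a control threshold are flagged as candidate risk critical events. Every distribution, constant and threshold here is an invented placeholder.

```python
import numpy as np

rng = np.random.default_rng(7)
steps = 500

# One system state variable (e.g. icing severity) drifting over a mission,
# modelled as a bounded random walk -- purely illustrative.
ssv = np.clip(np.cumsum(rng.normal(0.0, 0.05, steps)) + 1.0, 0.0, 3.0)

# Assumed link between the SSV and the per-step accident probability.
p_accident = 1.0 / (1.0 + np.exp(-(ssv - 2.0) * 4.0)) * 1e-3

# Cumulative mission risk, and a simple "risk critical event" flag:
# the first step where the instantaneous risk exceeds a control threshold.
cumulative_risk = 1.0 - np.prod(1.0 - p_accident)
threshold = 5e-4
rce_steps = np.flatnonzero(p_accident > threshold)
print(f"cumulative mission risk ~= {cumulative_risk:.2e}")
print("first risk-critical step:", rce_steps[0] if rce_steps.size else "none")
```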

  7. Instability risk analysis and risk assessment system establishment of underground storage caverns in bedded salt rock

    Science.gov (United States)

    Jing, Wenjun; Zhao, Yan

    2018-02-01

    Stability is an important part of geotechnical engineering research. Operating experience with underground storage caverns in salt rock around the world shows that cavern stability is the key problem for safe operation. Currently, the combination of theoretical analysis and numerical simulation is the most commonly adopted approach to cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, a definition and classification of cavern instability risk is proposed, and the damage mechanism is analyzed from a mechanical point of view. Then the main evaluation indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of cavern instability risk after 30 years of operation for a proposed storage cavern group in the Huai'an salt mine. This research can provide a useful theoretical basis for the safe operation and management of underground storage caverns in salt rock.

  8. Analysis of risk factors and risk assessment for ischemic stroke recurrence

    Directory of Open Access Journals (Sweden)

    Xiu-ying LONG

    2016-08-01

    Objective: To screen the risk factors for recurrence of ischemic stroke and to assess the risk of recurrence. Methods: The Essen Stroke Risk Score (ESRS) was used to evaluate the risk of recurrence in 176 patients with ischemic stroke (96 cases of first onset and 80 cases of recurrence). Univariate and multivariate stepwise logistic regression analysis was used to screen risk factors for recurrence of ischemic stroke. Results: There were significant differences between the first-onset group and the recurrence group in age, the proportion of patients > 75 years old, hypertension, diabetes, coronary heart disease, peripheral angiopathy, transient ischemic attack (TIA) or ischemic stroke, drinking, and ESRS score (P < 0.05 for all). The first-onset group included one case with ESRS 0 (1.04%), 8 cases with 1 (8.33%), 39 cases with 2 (40.63%), 44 cases with 3 (45.83%) and 4 cases with 4 (4.17%). The recurrence group included 2 cases with ESRS 3 (2.50%), 20 cases with 4 (25%), 37 cases with 5 (46.25%), 18 cases with 6 (22.50%) and 3 cases with 7 (3.75%). There was a significant difference between the 2 groups (Z = -11.376, P = 0.000). Logistic regression analysis showed that an ESRS > 3 was an independent risk factor for recurrence of ischemic stroke (OR = 31.324, 95% CI: 3.934-249.430; P = 0.001). Conclusions: An ESRS > 3 is an independent risk factor for recurrence of ischemic stroke. It is important to strengthen risk assessment of ischemic stroke recurrence, and screening and controlling risk factors is the key to secondary prevention of ischemic stroke. DOI: 10.3969/j.issn.1672-6731.2016.07.011

  9. The Evaluation of CEIT Teacher Candidates in Terms of Computer Games, Educational Use of Computer Games and Game Design Qualifications

    Directory of Open Access Journals (Sweden)

    Hakkı BAĞCI

    2014-04-01

    Computer games have an important usage potential in the education of today's digital student profile, and computer teachers, known as the technology leaders in schools, are the main stakeholders of this potential. In this study, the opinions of computer teacher candidates about computer games are examined from different perspectives. 119 computer teacher candidates participated in this study, and the data were collected by a questionnaire. As a result of this study, computer teacher candidates have positive views on the use of computer games in education and see themselves as qualified for the analysis and design of educational games, but they hold partially negative attitudes about some risks such as addiction and loss of time. Candidates who attended educational game courses and who play games on their mobile phones have more positive opinions and see themselves as more qualified than the others. Males have more positive opinions about computer games than females, but in terms of educational games and the analysis and design of computer games, there is no significant difference between males and females.

  10. Risk analysis of industrial plants operation

    International Nuclear Information System (INIS)

    Hubert, Philippe

    1989-12-01

    This study examines the possibilities for systematic analysis of technological risks from the standpoint of territorial management (city, urban community, region), covering both chronic and accidental risks. The objective was to relate this evaluation to those carried out for permanent water and air pollution. Risk management for pollution has been practised for a long time, and a number of studies have been done in urban communities and regions for both air and water pollution. The second objective relates to the management of industrial risks: nuclear, petrochemical, transport of hazardous materials, pipelines, etc. At the outset, three categories of effects are taken into account: human health, economic aspects and water, and the possibilities for their evaluation are identified. Elements of risk identification are presented for the quantification of results [fr

  11. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, there is a nondestructive technique that uses neutrons as the penetrating particle, named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film containing the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. A computed analysis of a film can be performed with a facility whose main components are an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, irradiation in the case of nuclear fuel). The quantitative determination is based on measurements of image parameters: dimensions and optical densities. The illuminator was built specially for this application but can also be used for simple visual observation; the illuminated area is 9x40 cm. The frame of the system is a comparator of the Abbe Carl Zeiss Jena type, which was adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, was developed at INR Pitesti which, together with the program SMTV II of the acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of nuclear fuel pins alongside a dimensional standard; the system was used to measure the length of the pellets of TRIGA nuclear fuel. (authors)

  12. Managing the risk associated with use of contrast media for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Stacul, Fulvio [Department of Radiology, University of Trieste, Cattinara Hospital, Strada di Fiume, 447, 34149 Trieste (Italy)]. E-mail: fulvio.stacul@aots.sanita.fvg.it

    2007-05-15

    Contrast agents are widely used in patients undergoing diagnostic and therapeutic imaging procedures. In recent years, there has been a significant increase in the use of iodinated contrast media (CM) due to the growing number of computed tomography (CT) procedures. Although contrast agents are generally well-tolerated, some patient subsets are at an increased risk of complications from CM. Patients at risk include those with a history of adverse reactions to CM, asthma or severe allergies, impaired renal function, older age, dehydration, congestive heart failure, or concurrent use of some drugs. Although the incidence of CM-associated complications cannot be eliminated, the chances of developing severe adverse reactions can be reduced by incorporating a number of management strategies into clinical practice. Patients at risk for acute adverse reactions can undergo pre-medication with corticosteroids, eventually associated with anti-histamines, although opinion is divided whether this prophylaxis should be used with non-ionic CM. Patients who have been identified as at risk for contrast-induced nephropathy (CIN) should be well-hydrated and have nephrotoxic medications withdrawn prior to CM exposure. Contrast dose should be decreased, as the risk of developing CIN is dose-dependent. For patients with pre-existing renal insufficiency, use of low-osmolar or iso-osmolar CM is preferable to use of high-osmolar CM. Simple strategies for preventing the risk of adverse reactions are reviewed.

  13. Managing the risk associated with use of contrast media for computed tomography

    International Nuclear Information System (INIS)

    Stacul, Fulvio

    2007-01-01

    Contrast agents are widely used in patients undergoing diagnostic and therapeutic imaging procedures. In recent years, there has been a significant increase in the use of iodinated contrast media (CM) due to the growing number of computed tomography (CT) procedures. Although contrast agents are generally well-tolerated, some patient subsets are at an increased risk of complications from CM. Patients at risk include those with a history of adverse reactions to CM, asthma or severe allergies, impaired renal function, older age, dehydration, congestive heart failure, or concurrent use of some drugs. Although the incidence of CM-associated complications cannot be eliminated, the chances of developing severe adverse reactions can be reduced by incorporating a number of management strategies into clinical practice. Patients at risk for acute adverse reactions can undergo pre-medication with corticosteroids, eventually associated with anti-histamines, although opinion is divided whether this prophylaxis should be used with non-ionic CM. Patients who have been identified as at risk for contrast-induced nephropathy (CIN) should be well-hydrated and have nephrotoxic medications withdrawn prior to CM exposure. Contrast dose should be decreased, as the risk of developing CIN is dose-dependent. For patients with pre-existing renal insufficiency, use of low-osmolar or iso-osmolar CM is preferable to use of high-osmolar CM. Simple strategies for preventing the risk of adverse reactions are reviewed

  14. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  15. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: how are supervision systems designed today and why, which strategies are used, and what kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly its risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from eight years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information the operator needs. The method, a combination of a Goal Tree and a Fault Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all the information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and the visualization of their information; its purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  16. New algorithm for risk analysis in radiotherapy

    International Nuclear Information System (INIS)

    Torres, Antonio; Montes de Oca, Joe

    2015-01-01

    Risk analyses applied to radiotherapy treatments have become an undeniable necessity, considering the hazards created by combining powerful radiation fields on patients with the possibility of human errors and equipment failures during treatment. The technique par excellence for executing these analyses has been the risk matrix. This paper presents the development of a new algorithm to execute this task with broad graphic and analytic capabilities, transforming it into a very useful option for risk monitoring and the optimization of quality assurance. The system SECURE-MR, which is the basic software implementing this algorithm, has been successfully used in risk analyses of different kinds of radiotherapy. Compared to previous methods, it offers new possibilities of analysis by considering risk-controlling factors such as the robustness of the barriers that reduce initiator frequencies and their consequences. Its analytic and graphical capabilities support novel ways to classify risk-contributing factors and to represent information processes as well as accident sequences. The paper shows the application of the proposed system to a generic radiotherapy treatment process using a linear accelerator. (author)
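
    The underlying risk-matrix technique that such an algorithm builds on can be sketched as a simple frequency-by-severity lookup, with a barrier modelled as a shift of one frequency band. The band names, matrix entries and example below are generic illustrations, not the SECURE-MR implementation.

```python
# Qualitative risk matrix lookup: frequency band x severity band -> risk class.
FREQUENCY_BANDS = ["very low", "low", "medium", "high"]
SEVERITY_BANDS = ["minor", "moderate", "severe", "catastrophic"]

RISK_MATRIX = [  # rows: frequency, columns: severity
    ["acceptable", "acceptable",  "tolerable",   "tolerable"],
    ["acceptable", "tolerable",   "tolerable",   "intolerable"],
    ["tolerable",  "tolerable",   "intolerable", "intolerable"],
    ["tolerable",  "intolerable", "intolerable", "intolerable"],
]

def classify(frequency, severity):
    """Return the risk class for a (frequency, severity) pair."""
    i = FREQUENCY_BANDS.index(frequency)
    j = SEVERITY_BANDS.index(severity)
    return RISK_MATRIX[i][j]

# A barrier (e.g. an interlock) that lowers the frequency band by one step is a
# crude qualitative stand-in for the 'robustness of frequency reducers' idea.
print(classify("high", "severe"))     # before the barrier
print(classify("medium", "severe"))   # after a one-band frequency reduction
```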

  17. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  18. Comparative risk analysis for the Rocky Flats Plant integrated project planning

    International Nuclear Information System (INIS)

    Jones, M.E.; Shain, D.I.

    1994-01-01

    The Rocky Flats Plant is developing, with active stakeholder participation, a comprehensive planning strategy that will support the transition of the Rocky Flats Plant from a nuclear weapons production facility to site cleanup and final disposition. Consideration of the interrelated nature of sitewide problems, such as material movement and disposition, facility and land use endstates, costs, relative risks to workers and the public, and waste disposition, is needed. Comparative Risk Analysis employs both incremental and cumulative risk evaluations to compare risks from postulated options or endstates, and it is an analytical tool for Rocky Flats Plant Integrated Project Planning that can assist a decision-maker in evaluating relative risks among proposed remediation activities. However, risks from all of the remediation activities, decontamination and decommissioning activities, and normal ongoing operations are imposed upon the Rocky Flats workers, the surrounding public, and the environment. Comparative Risk Analysis will provide risk information, both human health and ecological, to aid in reducing unnecessary resource and monetary expenditures by focusing these resources on the largest risks first. Comparative Risk Analysis has been developed to aggregate the various incremental risk estimates into a sitewide cumulative risk estimate. A Comparative Risk Analysis Methodology Group, consisting of community stakeholders, was established; early stakeholder involvement in the development of the risk analysis methodology provides an opportunity for stakeholders to influence the risk information delivered to a decision-maker. This paper discusses the development of the Comparative Risk Analysis methodology, stakeholder participation, and lessons learned from these challenges

  19. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  20. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports on the uses of such models to this joint volume. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples from the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  1. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  2. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty that has long puzzled engineers and designers. At present, several computational methods of design by analysis have been developed and applied for calculating and categorizing the stress field of pressure vessel components, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, the GLOSS R-Node method and so on. Moreover, the ASME code also gives an inelastic method of design by analysis, addressed only to limiting gross plastic deformation. When pressure vessel components are designed by analysis, there can be large differences between the results obtained with the different calculation and analysis methods mentioned above; this is the main reason limiting the wide application of the design-by-analysis approach. Recently, a new approach presented in the proposal for a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by directly analyzing the various failure mechanisms of the pressure vessel structure on the basis of elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized as a guide to selecting the proper computational method when designing a pressure vessel component by analysis. (authors)

  3. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  4. Network theory-based analysis of risk interactions in large engineering projects

    International Nuclear Information System (INIS)

    Fang, Chao; Marle, Franck; Zio, Enrico; Bocquet, Jean-Claude

    2012-01-01

    This paper presents an approach based on network theory to deal with risk interactions in large engineering projects. Indeed, such projects are exposed to numerous and interdependent risks of various nature, which makes their management more difficult. In this paper, a topological analysis based on network theory is presented, which aims at identifying key elements in the structure of interrelated risks potentially affecting a large engineering project. This analysis serves as a powerful complement to classical project risk analysis. Its originality lies in the application of some network theory indicators to the project risk management field. The construction of the risk network requires the involvement of the project manager and other team members assigned to the risk management process. Its interpretation improves their understanding of risks and their potential interactions. The outcomes of the analysis provide a support for decision-making regarding project risk management. An example of application to a real large engineering project is presented. The conclusion is that some new insights can be found about risks, about their interactions and about the global potential behavior of the project. - Highlights: ► The method addresses the modeling of complexity in project risk analysis. ► Network theory indicators enable other risks than classical criticality analysis to be highlighted. ► This topological analysis improves project manager's understanding of risks and risk interactions. ► This helps project manager to make decisions considering the position in the risk network. ► An application to a real tramway implementation project in a city is provided.
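
    A small sketch of the kind of topological analysis described, using networkx on an invented risk-interaction graph, is shown below. Betweenness and PageRank-style centrality stand in here for the network-theory indicators; the paper's actual indicator set and risk network are not reproduced.

```python
import networkx as nx

# Hypothetical project risks and directed "risk i can trigger risk j" links.
edges = [
    ("supplier delay", "schedule slip"),
    ("schedule slip", "cost overrun"),
    ("design change", "schedule slip"),
    ("design change", "interface mismatch"),
    ("interface mismatch", "rework"),
    ("rework", "cost overrun"),
    ("cost overrun", "funding cut"),
]
g = nx.DiGraph(edges)

# Two indicators of the kind such an analysis relies on:
# betweenness flags "transit" risks that propagate others,
# PageRank-style centrality flags risks that accumulate influence.
betweenness = nx.betweenness_centrality(g)
influence = nx.pagerank(g, alpha=0.85)

for risk in sorted(g, key=betweenness.get, reverse=True)[:3]:
    print(f"{risk}: betweenness={betweenness[risk]:.2f}, pagerank={influence[risk]:.2f}")
```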

  5. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  6. Dynamics of global supply chain and electric power networks: Models, pricing analysis, and computations

    Science.gov (United States)

    Matsypura, Dmytro

    In this dissertation, I develop a new theoretical framework for the modeling, pricing analysis, and computation of solutions to electric power supply chains with power generators, suppliers, transmission service providers, and the inclusion of consumer demands. In particular, I advocate the application of finite-dimensional variational inequality theory, projected dynamical systems theory, game theory, network theory, and other tools that have been recently proposed for the modeling and analysis of supply chain networks (cf. Nagurney (2006)) to electric power markets. This dissertation contributes to the extant literature on the modeling, analysis, and solution of supply chain networks, including global supply chains, in general, and electric power supply chains, in particular, in the following ways. It develops a theoretical framework for modeling, pricing analysis, and computation of electric power flows/transactions in electric power systems using the rationale for supply chain analysis. The models developed include both static and dynamic ones. The dissertation also adds a new dimension to the methodology of the theory of projected dynamical systems by proving that, irrespective of the speeds of adjustment, the equilibrium of the system remains the same. Finally, I include alternative fuel suppliers, along with their behavior into the supply chain modeling and analysis framework. This dissertation has strong practical implications. In an era in which technology and globalization, coupled with increasing risk and uncertainty, complicate electricity demand and supply within and between nations, the successful management of electric power systems and pricing become increasingly pressing topics with relevance not only for economic prosperity but also national security. This dissertation addresses such related topics by providing models, pricing tools, and algorithms for decentralized electric power supply chains. This dissertation is based heavily on the following

  7. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  8. Physics-based Entry, Descent and Landing Risk Model

    Science.gov (United States)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
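
    The direct-computation branch of such a risk calculation can be sketched as a Monte Carlo exceedance estimate: sample the uncertain inputs, push them through a (here deliberately toy) surrogate for the thermal response, and count cases where the bondline temperature exceeds its limit. All distributions, the surrogate formula and the limit value below are illustrative assumptions, not the study's models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Illustrative uncertainty model: heating varies with entry-state dispersion,
# conductivity varies with material lot, MMOD damage removes a random thickness.
heat_load = rng.normal(1.00, 0.08, n)                         # normalized heating
conductivity = rng.normal(1.00, 0.05, n)                      # normalized k
mmod_loss = rng.exponential(0.01, n)                          # fraction of thickness lost
thickness = np.clip(0.05 * (1.0 - mmod_loss), 1e-3, None)     # metres

# Toy surrogate for the 1-D thermal response: bondline temperature grows with
# heating and conductivity and with thinning of the ablator (illustrative units).
t_bondline = 150.0 + 400.0 * heat_load * conductivity * (0.05 / thickness)

limit = 650.0                                                 # assumed bondline limit
p_fail = np.mean(t_bondline > limit)
print(f"estimated probability of bondline over-temperature: {p_fail:.2e}")
```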

  9. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  10. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. The document provides both a broad perspective on data collection and evaluation issues in data analysis and a narrower focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk-informed decision-making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
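
    A canonical calculation in this kind of guidance is the conjugate Bayesian update of a demand failure probability from operating experience. The sketch below is a generic illustration, not an excerpt from the document; the Jeffreys prior and the failure counts are assumed.

```python
from scipy import stats

# Beta prior on the probability of failure on demand (assumed Jeffreys prior).
a0, b0 = 0.5, 0.5

# Hypothetical operating experience: 2 failures observed in 147 demands.
failures, demands = 2, 147

# Conjugate update: posterior is Beta(a0 + failures, b0 + demands - failures).
a_post, b_post = a0 + failures, b0 + demands - failures
posterior = stats.beta(a_post, b_post)

print(f"Posterior mean        : {posterior.mean():.4f}")
print(f"90% credible interval : "
      f"({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")
```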

  11. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), which was held in Zhuhai, China, on November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  12. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. Therefore this report primarily serves as an introduction to probabilistic methods for qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de
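
    As a minimal illustration of the queueing view of a computer system, the sketch below evaluates the standard closed-form M/M/1 results for an assumed arrival and service rate; it is a generic textbook example rather than material from the report.

```python
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state metrics of an M/M/1 queue (requires arrival_rate < service_rate)."""
    if arrival_rate >= service_rate:
        raise ValueError("Queue is unstable: arrival rate must be below service rate.")
    rho = arrival_rate / service_rate                 # server utilisation
    n_mean = rho / (1.0 - rho)                        # mean number of jobs in system
    t_mean = 1.0 / (service_rate - arrival_rate)      # mean response time (Little's law)
    return {"utilisation": rho, "mean_jobs": n_mean, "mean_response_time": t_mean}

# Hypothetical workload: 8 jobs/s arriving at a server that completes 10 jobs/s.
print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
```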

  13. Estimation of the exposure and a risk-benefit analysis for a CT system designed for a lung cancer mass screening unit

    International Nuclear Information System (INIS)

    Nishizawa, K.; Matsumoto, T.; Sakashita, K.; Tateno, Y.; Miyamoto, T.; Iwai, K.; Shimura, A.; Takagi, H.

    1996-01-01

    Organ or tissue doses from examination by a computed tomography system called LSCT were determined by in-phantom measurement. LSCT has been developed for lung cancer screening, with spiral scanning capability. Dose measurements were performed under the actual screening conditions of the chest CT examination. The effective dose recommended by ICRP 60 was evaluated using the organ or tissue doses. Risk-benefit analysis of the LSCT screening was also performed. The resultant effective dose per LSCT examination was 3.6 mSv and the surface dose was 7.6 mGy, roughly one half to one third of the doses delivered by conventional CT systems. The risk-benefit analysis of LSCT showed that the benefit will exceed the risk for Japanese men over forty years of age and women over forty-five. (Author)

  14. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains: several new parts, such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
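
    As one concrete instance of the tree methods the book covers, the sketch below prices a European call on a Cox-Ross-Rubinstein binomial tree; the market parameters are assumed and the code is not reproduced from the book.

```python
import math

def crr_european_call(S0, K, r, sigma, T, steps=500):
    """European call price on a Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))    # up factor
    d = 1.0 / u                            # down factor
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)

    # Option payoffs at maturity.
    values = [max(S0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]

    # Backward induction through the tree.
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1.0 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Assumed parameters: spot 100, strike 100, 5% rate, 20% volatility, 1 year.
print(f"Call price: {crr_european_call(100.0, 100.0, 0.05, 0.20, 1.0):.4f}")
```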

  15. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
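
    The central product of such a code, a family of minimal cut sets, can be illustrated by brute force on a small coherent tree. The example tree is hypothetical and the exhaustive search below is far simpler than the algorithm implemented in FTAP; it is only meant to show what a minimal cut set is.

```python
from itertools import combinations

# Hypothetical coherent fault tree: TOP = (A AND B) OR (A AND C) OR D
def top_event(failed: set) -> bool:
    return ({"A", "B"} <= failed) or ({"A", "C"} <= failed) or ("D" in failed)

basic_events = ["A", "B", "C", "D"]

# Brute-force search: a cut set is a set of basic-event failures that causes the
# top event; it is minimal if no proper subset also causes the top event.
cut_sets = [set(c) for r in range(1, len(basic_events) + 1)
            for c in combinations(basic_events, r) if top_event(set(c))]
minimal = [c for c in cut_sets
           if not any(other < c for other in cut_sets)]

print("Minimal cut sets:", minimal)   # expected: {'D'}, {'A', 'B'}, {'A', 'C'}
```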

  17. Automated analysis of free speech predicts psychosis onset in high-risk youths

    Science.gov (United States)

    Bedi, Gillinder; Carrillo, Facundo; Cecchi, Guillermo A; Slezak, Diego Fernández; Sigman, Mariano; Mota, Natália B; Ribeiro, Sidarta; Javitt, Daniel C; Copelli, Mauro; Corcoran, Cheryl M

    2015-01-01

    Background/Objectives: Psychiatry lacks the objective clinical tests routinely used in other specializations. Novel computerized methods to characterize complex behaviors such as speech could be used to identify and predict psychiatric illness in individuals. AIMS: In this proof-of-principle study, our aim was to test automated speech analyses combined with Machine Learning to predict later psychosis onset in youths at clinical high-risk (CHR) for psychosis. Methods: Thirty-four CHR youths (11 females) had baseline interviews and were assessed quarterly for up to 2.5 years; five transitioned to psychosis. Using automated analysis, transcripts of interviews were evaluated for semantic and syntactic features predicting later psychosis onset. Speech features were fed into a convex hull classification algorithm with leave-one-subject-out cross-validation to assess their predictive value for psychosis outcome. The canonical correlation between the speech features and prodromal symptom ratings was computed. Results: Derived speech features included a Latent Semantic Analysis measure of semantic coherence and two syntactic markers of speech complexity: maximum phrase length and use of determiners (e.g., which). These speech features predicted later psychosis development with 100% accuracy, outperforming classification from clinical interviews. Speech features were significantly correlated with prodromal symptoms. Conclusions: Findings support the utility of automated speech analysis to measure subtle, clinically relevant mental state changes in emergent psychosis. Recent developments in computer science, including natural language processing, could provide the foundation for future development of objective clinical tests for psychiatry. PMID:27336038

  18. Multivariate survival analysis and competing risks

    CERN Document Server

    Crowder, Martin J

    2012-01-01

    Multivariate Survival Analysis and Competing Risks introduces univariate survival analysis and extends it to the multivariate case. It covers competing risks and counting processes and provides many real-world examples, exercises, and R code. The text discusses survival data, survival distributions, frailty models, parametric methods, multivariate data and distributions, copulas, continuous failure, parametric likelihood inference, and non- and semi-parametric methods. There are many books covering survival analysis, but very few that cover the multivariate case in any depth. Written for a graduate-level audience in statistics/biostatistics, this book includes practical exercises and R code for the examples. The author is renowned for his clear writing style, and this book continues that trend. It is an excellent reference for graduate students and researchers looking for grounding in this burgeoning field of research.

  19. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model contains a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.

  20. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  1. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices in connection with the Internet are commonplace in all ages. In this research, in order to find the relevance of behavioral activity and its associated biosignal, biosignal changes before and after as well as during computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis on the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activities during computer game, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in the heart rate variability as compared to the normal group. The results can be valuable for studying internet gaming disorder.
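
    Two heart rate variability measures commonly used in such analyses, SDNN and RMSSD, can be computed from an RR-interval series as in the sketch below; the interval values are invented and the code is not taken from the study.

```python
import numpy as np

# Hypothetical RR intervals in milliseconds (e.g., extracted from the ECG channel).
rr_ms = np.array([812, 798, 805, 820, 790, 783, 801, 815, 808, 795], dtype=float)

sdnn  = np.std(rr_ms, ddof=1)                      # overall beat-to-beat variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))      # short-term variability, often linked
                                                   # to parasympathetic activity
mean_hr = 60_000.0 / rr_ms.mean()                  # beats per minute

print(f"Mean HR: {mean_hr:.1f} bpm, SDNN: {sdnn:.1f} ms, RMSSD: {rmssd:.1f} ms")
```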

  2. Can Designing Self-Representations through Creative Computing Promote an Incremental View of Intelligence and Enhance Creativity among At-Risk Youth?

    Directory of Open Access Journals (Sweden)

    Ina Blau

    2016-12-01

    Full Text Available Creative computing is one of the rapidly growing educational trends around the world. Previous studies have shown that creative computing can empower disadvantaged children and youth. At-risk youth tend to hold a negative view of self and perceive their abilities as inferior compared to “normative” pupils. The Implicit Theories of Intelligence approach (ITI; Dweck, 1999, 2008) suggests a way of changing beliefs regarding one’s abilities. This paper reports findings from an experiment that explores the impact of a short intervention among at-risk youth and “normative” high-school students on (1) changing ITI from being perceived as fixed (entity view of intelligence) to more flexible (incremental view of intelligence) and (2) the quality of digital self-representations programmed through a creative computing app. The participants were 117 Israeli youth aged 14-17, half of whom were at-risk youth. The participants were randomly assigned to the experimental and control conditions. The experimental group watched a video of a lecture regarding brain plasticity that emphasized flexibility and the potential of human intelligence to be cultivated. The control group watched a neutral lecture about brain-functioning and creativity. Following the intervention, all of the participants watched screencasts of basic training for the Scratch programming app, designed artifacts that digitally represented themselves five years later and reported their ITI. The results showed more incremental ITI in the experimental group compared to the control group and among normative students compared to at-risk youth. In contrast to the research hypothesis, the Scratch projects of the at-risk youth, especially in the experimental condition, were rated by neutral judges as being more creative, more aesthetically designed, and more clearly conveying their message. The results suggest that creative computing combined with the ITI intervention is a way of developing

  3. Flood risk analysis procedure for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.

    1982-01-01

    This paper describes a methodology and procedure for determining the impact of floods on nuclear power plant risk. The procedures are based on techniques of fault tree and event tree analysis and use the logic of these techniques to determine the effects of a flood on system failure probability and accident sequence occurrence frequency. The methodology can be applied independently or as an add-on analysis for an existing risk assessment. Each stage of the analysis yields useful results such as the critical flood level, failure flood level, and the flood's contribution to accident sequence occurrence frequency. The results of applications show the effects of floods on the risk from nuclear power plants analyzed in the Reactor Safety Study
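
    The add-on character of the method can be sketched as a simple combination of a flood exceedance frequency with the conditional probabilities taken from an existing event tree; the flood levels, frequencies, and probabilities below are hypothetical.

```python
# Hypothetical annual exceedance frequencies for increasing flood levels (per year).
flood_exceedance = {"10 m": 1e-2, "12 m": 1e-3, "14 m": 1e-4}   # stage: frequency

failure_flood_level = "12 m"   # level at which flood-induced equipment failures occur
p_mitigation_fails  = 0.05     # conditional probability the remaining systems fail

# Flood contribution to the accident sequence occurrence frequency:
flood_sequence_freq = flood_exceedance[failure_flood_level] * p_mitigation_fails
print(f"Flood contribution: {flood_sequence_freq:.1e} per reactor-year")
```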

  4. Computer methods for transient fluid-structure analysis of nuclear reactors

    International Nuclear Information System (INIS)

    Belytschko, T.; Liu, W.K.

    1985-01-01

    Fluid-structure interaction problems in nuclear engineering are categorized according to the dominant physical phenomena and the appropriate computational methods. Linear fluid models that are considered include acoustic fluids, incompressible fluids undergoing small disturbances, and small amplitude sloshing. Methods available in general-purpose codes for these linear fluid problems are described. For nonlinear fluid problems, the major features of alternative computational treatments are reviewed; some special-purpose and multipurpose computer codes applicable to these problems are then described. For illustration, some examples of nuclear reactor problems that entail coupled fluid-structure analysis are described along with computational results

  5. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role of theories for human cognitions. Many computational models have simulated results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], that models two stages of analogical reasoning was applied to our open-ended psychological experiment. In our experiment, subjects were presented a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved on the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.

  6. Optimization of the cumulative risk assessment of pesticides and biocides using computational techniques: Pilot project

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    This pilot project is intended as the first step in developing a computational strategy to assist in refining methods for higher tier cumulative and aggregate risk assessment of exposure to mixture of pesticides and biocides. For this purpose, physiologically based toxicokinetic (PBTK) models were...

  7. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration

  9. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  10. Ontological Analysis of the Project Risk Management Concept ‘Risk’

    Directory of Open Access Journals (Sweden)

    Uzulāns Juris

    2018-02-01

    Full Text Available The aim of the current research series is to examine the definitions of the concept ‘risk’ and analyze the concepts used in the definitions of ‘risk’ in the sources of these definitions, in order to perform an ontological analysis of the concept of ‘risk’. Ontological and epistemological analysis of the concepts in the definition of the concept ‘risk’ was used to answer the question of what ‘risk’ means in project management. This investigation represents a part of the research series where the ontological, epistemological and methodological analysis of project risk is performed with the aim of improving risk registers and risk management as a whole. In the previous studies the author analyzed the concept of ‘event’ that defines the content of the concept ‘risk’. The use of ‘event’ was analyzed in different sources to find out how the concept should be used. The ontological, epistemological and methodological analysis of the definitions of the concept ‘risk’ is the theoretical foundation for risk register creation, because it makes it possible to create a complete and understandable register for the participants of the project risk management process. The author believes that the conducted research helps establish confidence that ontological analysis, together with epistemological and methodological analysis, provides an opportunity to analyze risk management sources with the aim of improving risk management. The results of the study cannot be considered sufficient for deriving valid conclusions about project risk management and developing recommendations for improving risk management with regard to the content of the risk register. For valid conclusions and recommendations, deeper research is needed which, first of all, would analyze a larger number of sources.

  11. Cloud Computing and Risk: A look at the EU and the application of the Data Protection Directive to cloud computing

    OpenAIRE

    Victoria Ostrzenski

    2013-01-01

    The use of cloud services for the management of records presents many challenges, both in terms of the particulars of data security as well as the need to sustain and ensure the greater reliability, authenticity, and accuracy of records. Properly grappling with these concerns requires the development of more specifically applicable and effective binding legislation; an important first step is the examination and identification of the risks specific to cloud computing coupled with an evaluation...

  12. Risk analysis of Leksell Gamma Knife Model C with automatic positioning system

    International Nuclear Information System (INIS)

    Goetsch, Steven J.

    2002-01-01

    Purpose: This study was conducted to evaluate the decrease in risk from misadministration of the new Leksell Gamma Knife Model C with Automatic Positioning System compared with previous models. Methods and Materials: Elekta Instruments, A.B. of Stockholm has introduced a new computer-controlled Leksell Gamma Knife Model C which uses motor-driven trunnions to reposition the patient between isocenters (shots) without human intervention. Previous models required the operators to manually set coordinates from a printed list, permitting opportunities for coordinate transposition, incorrect helmet size, incorrect treatment times, missing shots, or repeated shots. Results: A comparative risk analysis was conducted between craniotomy, which involves hospital admission, and outpatient Gamma Knife radiosurgery. A report of the Institute of Medicine of the National Academies dated November 29, 1999 estimated that medical errors kill between 44,000 and 98,000 people each year in the United States. Another report from the National Nosocomial Infections Surveillance System estimates that 2.1 million nosocomial infections occur annually in the United States in acute care hospitals alone, with 31 million total admissions. Conclusions: All medical procedures have attendant risks of morbidity and possibly mortality. Each patient should be counseled as to the risk of adverse effects as well as the likelihood of good results for alternative treatment strategies. This paper seeks to fill a gap in the existing medical literature, which has a paucity of data involving risk estimates for stereotactic radiosurgery.

  13. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  14. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Sippel; William C. Carrigan; Kenneth D. Luff; Lyn Canter

    2003-11-12

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). The software tools in ICS have been developed for characterization of reservoir properties and evaluation of hydrocarbon potential using a combination of inter-disciplinary data sources such as geophysical, geologic and engineering variables. The ICS tools provide a means for logical and consistent reservoir characterization and oil reserve estimates. The tools can be broadly characterized as (1) clustering tools, (2) neural solvers, (3) multiple-linear regression, (4) entrapment-potential calculator and (5) file utility tools. ICS tools are extremely flexible in their approach and use, and applicable to most geologic settings. The tools are primarily designed to correlate relationships between seismic information and engineering and geologic data obtained from wells, and to convert or translate seismic information into engineering and geologic terms or units. It is also possible to apply ICS in a simple framework that may include reservoir characterization using only engineering, seismic, or geologic data in the analysis. ICS tools were developed and tested using geophysical, geologic and engineering data obtained from an exploitation and development project involving the Red River Formation in Bowman County, North Dakota and Harding County, South Dakota. Data obtained from 3D seismic surveys, and 2D seismic lines encompassing nine prospective field areas were used in the analysis. The geologic setting of the Red River Formation in Bowman and Harding counties is that of a shallow-shelf, carbonate system. Present-day depth of the Red River formation is approximately 8000 to 10,000 ft below ground surface. This report summarizes production results from well demonstration activity, results of reservoir characterization of the Red River Formation at demonstration sites, descriptions of ICS tools and strategies for their application.

  15. CFD: computational fluid dynamics or confounding factor dissemination? The role of hemodynamics in intracranial aneurysm rupture risk assessment.

    Science.gov (United States)

    Xiang, J; Tutino, V M; Snyder, K V; Meng, H

    2014-10-01

    Image-based computational fluid dynamics holds a prominent position in the evaluation of intracranial aneurysms, especially as a promising tool to stratify rupture risk. Current computational fluid dynamics findings correlating both high and low wall shear stress with intracranial aneurysm growth and rupture puzzle researchers and clinicians alike. These conflicting findings may stem from inconsistent parameter definitions, small datasets, and intrinsic complexities in intracranial aneurysm growth and rupture. In Part 1 of this 2-part review, we proposed a unifying hypothesis: both high and low wall shear stress drive intracranial aneurysm growth and rupture through mural cell-mediated and inflammatory cell-mediated destructive remodeling pathways, respectively. In the present report, Part 2, we delineate different wall shear stress parameter definitions and survey recent computational fluid dynamics studies, in light of this mechanistic heterogeneity. In the future, we expect that larger datasets, better analyses, and increased understanding of hemodynamic-biologic mechanisms will lead to more accurate predictive models for intracranial aneurysm risk assessment from computational fluid dynamics. © 2014 by American Journal of Neuroradiology.

  16. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  17. Risk analysis of alternative energy sources

    International Nuclear Information System (INIS)

    Kazmer, D.R.

    1982-01-01

    The author explores two points raised by Miller Spangler in a January 1981 issue: public perception of risks involving nuclear power plants relative to those of conventional plants and criteria for evaluating the way risk analyses are made. On the first point, he concludes that translating public attitudes into the experts' language of probability and risk could provide better information and understanding of both the attitudes and the risks. Viewing risk analysis methodologies as filters which help to test historical change, he suggests that the lack of information favors a lay jury approach for energy decisions. Spangler responds that Congress is an example of lay decision making, but that a lay jury, given public disinterest and polarization, would probably not improve social justice on the nuclear issue. 5 references, 4 figures

  18. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  19. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It uses a computer to simulate operating procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyses operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and the arrangement of instruments in a control room can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  20. Computer program for analysis of hemodynamic response to head-up tilt test

    Science.gov (United States)

    Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor

    2014-11-01

    The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), left ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum value immediately after changing positions and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
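
    A minimal Python analogue of two of the steps described above, moving-average smoothing and automatic detection of the post-tilt extremes, is sketched below; the MATLAB application itself is not reproduced here, and the blood pressure series is synthetic.

```python
import numpy as np

def moving_average(x: np.ndarray, window: int) -> np.ndarray:
    """Simple centred moving average used to smooth a beat-to-beat series."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

# Synthetic systolic blood pressure series sampled beat-to-beat (hypothetical).
rng = np.random.default_rng(0)
sbp = np.concatenate([rng.normal(120, 3, 200),    # supine baseline
                      rng.normal(108, 4, 200)])   # after head-up tilt
tilt_index = 200

smoothed = moving_average(sbp, window=9)
baseline_mean = smoothed[:tilt_index].mean()
post = smoothed[tilt_index:]
i_min, i_max = post.argmin(), post.argmax()

print(f"Baseline mean sBP: {baseline_mean:.1f} mmHg")
print(f"Post-tilt minimum {post[i_min]:.1f} mmHg at beat {tilt_index + i_min}")
print(f"Post-tilt maximum {post[i_max]:.1f} mmHg at beat {tilt_index + i_max}")
```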

  1. Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.

    Science.gov (United States)

    Kemp, Pavlina S; VanderVeen, Deborah K

    2016-01-01

    The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.

  2. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  3. The watchdog role of risk analysis

    International Nuclear Information System (INIS)

    Reijen, G. van; Vinck, W.

    1983-01-01

    The reason why the risks of large-scale technology attract more attention lies in the fact that accidents would have more disastrous results and in the fact that it is probably more attractive to study the risks of some large projects than to do the same for a greater number of smaller projects. Within this presentation there will be some opening remarks on the role of the Commission of the European Community with regard to accident prevention. The development of the concept of quantitative risks is dealt with. This development leads to a combination of deterministic and probabilistic methods. The presentation concludes with some critical remarks on quantitative risk analysis and its use. (orig./HP) [de

  4. Studying Driving Risk Factors using Multi-Source Mobile Computing Data

    Directory of Open Access Journals (Sweden)

    Xianbiao Hu

    2015-09-01

    In this paper, the overall research framework used in this research is presented, which mainly includes data collection, data processing, calibration and analysis methodology. A preliminary case study — including data summary statistics and correlation analysis — is also presented. The results of our study will further existing knowledge about driving exposure factors that are closely linked to crash risk, and provide the foundation for advanced forms of Usage Based Insurance.

  5. Method and system for dynamic probabilistic risk assessment

    Science.gov (United States)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  6. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  7. Identification of High-Risk Plaques Destined to Cause Acute Coronary Syndrome Using Coronary Computed Tomographic Angiography and Computational Fluid Dynamics.

    Science.gov (United States)

    Lee, Joo Myung; Choi, Gilwoo; Koo, Bon-Kwon; Hwang, Doyeon; Park, Jonghanne; Zhang, Jinlong; Kim, Kyung-Jin; Tong, Yaliang; Kim, Hyun Jin; Grady, Leo; Doh, Joon-Hyung; Nam, Chang-Wook; Shin, Eun-Seok; Cho, Young-Seok; Choi, Su-Yeon; Chun, Eun Ju; Choi, Jin-Ho; Nørgaard, Bjarne L; Christiansen, Evald H; Niemen, Koen; Otake, Hiromasa; Penicka, Martin; de Bruyne, Bernard; Kubo, Takashi; Akasaka, Takashi; Narula, Jagat; Douglas, Pamela S; Taylor, Charles A; Kim, Hyo-Soo

    2018-03-14

    We investigated the utility of noninvasive hemodynamic assessment in the identification of high-risk plaques that caused subsequent acute coronary syndrome (ACS). ACS is a critical event that impacts the prognosis of patients with coronary artery disease. However, the role of hemodynamic factors in the development of ACS is not well-known. Seventy-two patients with clearly documented ACS and available coronary computed tomographic angiography (CTA) acquired between 1 month and 2 years before the development of ACS were included. In 66 culprit and 150 nonculprit lesions as a case-control design, the presence of adverse plaque characteristics (APC) was assessed and hemodynamic parameters (fractional flow reserve derived by coronary computed tomographic angiography [FFRCT], change in FFRCT across the lesion [ΔFFRCT], wall shear stress [WSS], and axial plaque stress) were analyzed using computational fluid dynamics. The best cut-off values for FFRCT, ΔFFRCT, WSS, and axial plaque stress were used to define the presence of adverse hemodynamic characteristics (AHC). The incremental discriminant and reclassification abilities for ACS prediction were compared among 3 models (model 1: percent diameter stenosis [%DS] and lesion length, model 2: model 1 + APC, and model 3: model 2 + AHC). The culprit lesions showed higher %DS (55.5 ± 15.4% vs. 43.1 ± 15.0%) and less favorable hemodynamic parameters, including axial plaque stress, than nonculprit lesions. Model 3 showed better discrimination (c-statistic [c-index] 0.789 vs. 0.747; p = 0.014) and reclassification abilities (category-free net reclassification index 0.287; p = 0.047; relative integrated discrimination improvement 0.368; p < 0.001) than model 2. Lesions with both APC and AHC showed significantly higher risk of the culprit for subsequent ACS than those with no APC/AHC (hazard ratio: 11.75; 95% confidence interval: 2.85 to 48.51; p = 0.001) and with either APC or AHC (hazard ratio: 3.22; 95% confidence interval: 1.86 to 5.55; p < 0.001). Noninvasive hemodynamic assessment enhanced

  8. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The methods described are applicable to low-level radioactive waste disposal system performance assessment.
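
    The essence of the derivative-based uncertainty propagation idea is first-order propagation of input variances through model sensitivities. The sketch below uses finite-difference derivatives on a toy response rather than the computer-calculus approach of GRESS and ADGEN; the model and parameter values are assumptions.

```python
import numpy as np

def model(x: np.ndarray) -> float:
    """Toy response: stands in for a large simulation code."""
    return x[0] ** 2 * np.exp(0.1 * x[1]) + 3.0 * x[2]

x0    = np.array([2.0, 5.0, 1.0])     # nominal parameter values (assumed)
sigma = np.array([0.1, 0.5, 0.2])     # parameter standard deviations (assumed)

# Finite-difference sensitivities dy/dx_i at the nominal point.
eps = 1e-6
grad = np.array([(model(x0 + eps * np.eye(3)[i]) - model(x0)) / eps for i in range(3)])

# First-order (linear) propagation: Var(y) ~= sum_i (dy/dx_i)^2 * Var(x_i)
var_y = np.sum((grad * sigma) ** 2)
print(f"Nominal response     : {model(x0):.3f}")
print(f"Propagated std. dev. : {np.sqrt(var_y):.3f}")
print("Sensitivities:", np.round(grad, 3))
```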

  9. Dietary patterns and depression risk: A meta-analysis.

    Science.gov (United States)

    Li, Ye; Lv, Mei-Rong; Wei, Yan-Jin; Sun, Ling; Zhang, Ji-Xiang; Zhang, Huai-Guo; Li, Bin

    2017-07-01

    Although some studies have reported potential associations of dietary patterns with depression risk, no consistent conclusion has been established to date. Therefore, we conducted this meta-analysis to evaluate the relation between dietary patterns and the risk of depression. A literature search was conducted in the MEDLINE and EMBASE databases up to September 2016. In total, 21 studies from ten countries met the inclusion criteria and were included in the present meta-analysis. A dietary pattern characterized by high intakes of fruit, vegetables, whole grain, fish, olive oil, low-fat dairy and antioxidants and low intakes of animal foods was apparently associated with a decreased risk of depression. A dietary pattern characterized by a high consumption of red and/or processed meat, refined grains, sweets, high-fat dairy products, butter, potatoes and high-fat gravy, and low intakes of fruits and vegetables is associated with an increased risk of depression. The results of this meta-analysis suggest that a healthy pattern may decrease the risk of depression, whereas a Western-style pattern may increase it. However, more randomized controlled trials and cohort studies are urgently required to confirm these findings. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  10. Benefit-Risk Analysis for Decision-Making: An Approach.

    Science.gov (United States)

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  11. Implementation of Cloud Computing into VoIP

    Directory of Open Access Journals (Sweden)

    Floriana GEREA

    2012-08-01

    Full Text Available This article defines Cloud Computing and highlights key concepts, the benefits of using virtualization, its weaknesses and ways of combining it with classical VoIP technologies applied to large scale businesses. The analysis takes into consideration management strategies and resources for better customer orientation and risk management, all aimed at sustaining the Service Level Agreement (SLA). An important issue in cloud computing is security, and for this reason several security solutions are presented.

  12. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet, we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors of unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that we can do the assignments of some of the probabilities without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate ideas

  13. Reference computations of public dose and cancer risk from airborne releases of uranium and Class W plutonium

    International Nuclear Information System (INIS)

    Peterson, V.L.

    1995-01-01

    This report presents "reference" computations that can be used by safety analysts in the evaluations of the consequences of postulated atmospheric releases of radionuclides from the Rocky Flats Environmental Technology Site. These computations deal specifically with doses and health risks to the public. The radionuclides considered are Class W Plutonium, all classes of Enriched Uranium, and all classes of Depleted Uranium. (The other class of plutonium, Y, was treated in an earlier report.) In each case, one gram of the respirable material is assumed to be released at ground level, both with and without fire. The resulting doses and health risks can be scaled to whatever amount of release is appropriate for a postulated accident being investigated. The report begins with a summary of the organ-specific stochastic risk factors appropriate for alpha radiation, which poses the main health risk of plutonium and uranium. This is followed by a summary of the atmospheric dispersion factors for unfavorable and typical weather conditions for the calculation of consequences to both the Maximum Offsite Individual and the general population within 80 km (50 miles) of the site.
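
    The scaling described above amounts to a chain of multiplications from released mass to dose to risk. The sketch below shows only the arithmetic; the dispersion factor, breathing rate, dose conversion factor, and risk coefficient are placeholder values, not the site-specific numbers used in the report.

```python
# Per-gram reference computation, scaled to a postulated release (all values hypothetical).
release_g        = 0.5      # grams of respirable material postulated to be released
specific_act     = 0.07     # Ci per gram of the material (placeholder)
chi_over_q       = 1.0e-4   # atmospheric dispersion factor at the receptor, s/m^3
breathing_rate   = 3.3e-4   # m^3/s for the maximum offsite individual
dose_conv_factor = 3.0e2    # rem per Ci inhaled (placeholder)
risk_factor      = 5.0e-4   # lifetime cancer risk per rem (stochastic risk coefficient)

inhaled_ci = release_g * specific_act * chi_over_q * breathing_rate
dose_rem   = inhaled_ci * dose_conv_factor
risk       = dose_rem * risk_factor

print(f"Committed dose: {dose_rem:.2e} rem, lifetime cancer risk: {risk:.2e}")
```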

  14. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology –oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  15. COVAR: Computer Program for Multifactor Relative Risks and Tests of Hypotheses Using a Variance-Covariance Matrix from Linear and Log-Linear Regression

    Directory of Open Access Journals (Sweden)

    Leif E. Peterson

    1997-11-01

    Full Text Available A computer program for multifactor relative risks, confidence limits, and tests of hypotheses using regression coefficients and a variance-covariance matrix obtained from a previous additive or multiplicative regression analysis is described in detail. Data used by the program can be stored and input from an external disk-file or entered via the keyboard. The output contains a list of the input data, point estimates of single or joint effects, confidence intervals and tests of hypotheses based on a minimum modified chi-square statistic. Availability of the program is also discussed.
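
    For a multiplicative (log-linear) model, the point estimates and confidence limits such a program reports follow directly from the coefficient vector and its variance-covariance matrix. The sketch below uses made-up regression output and is not the COVAR program itself.

```python
import numpy as np

# Hypothetical output from a previous log-linear regression: two exposure factors.
beta = np.array([0.405, 0.182])            # log relative risks
cov  = np.array([[0.010, 0.002],
                 [0.002, 0.015]])          # variance-covariance matrix

# Joint effect of being exposed to both factors: x'beta with x = (1, 1).
x = np.array([1.0, 1.0])
log_rr = x @ beta
se     = np.sqrt(x @ cov @ x)

rr   = np.exp(log_rr)
lo95 = np.exp(log_rr - 1.96 * se)
hi95 = np.exp(log_rr + 1.96 * se)
print(f"Joint RR = {rr:.2f} (95% CI {lo95:.2f}-{hi95:.2f})")
```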

  16. Status of computer codes available in AEOI for reactor physics analysis

    International Nuclear Information System (INIS)

    Karbassiafshar, M.

    1986-01-01

    Many of the nuclear computer codes available in the Atomic Energy Organization of Iran (AEOI) can be used for physics analysis of an operating reactor or for design purposes. A grasp of the various methods involved and practical experience with these codes would be the starting point for interesting design studies or analysis of the operating conditions of presently existing and future reactors. A review of the objectives and flowchart of commonly practiced procedures in reactor physics analysis of LWRs and the related computer codes was made, extrapolating to the nationally and internationally available resources. Finally, effective utilization of the existing facilities is discussed and called for

  17. Componential analysis of kinship terminology: a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  18. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  19. Risk Analysis for Performance Improvement in a Romanian Pharmaceutical Company

    Directory of Open Access Journals (Sweden)

    Dana Corina Deselnicu

    2018-05-01

    The paper presents a risk management analysis carried out to investigate the operations of a Romanian company dealing with the distribution of pharmaceutical products. The main risks challenging the company were identified, described and classified, providing a scientific base for further analysis. The identified inherent risks were then evaluated using tools such as the risk index method and the risk matrix in order to emphasize their tolerance level. According to the results of the evaluation, risk mitigation strategies and measures were advanced for the management of the analysed risks. Relevant conclusions were drawn from the experience.
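
    As a hedged illustration of the risk index method and risk matrix mentioned above (generic scales and illustrative scores, not the company's data), a short sketch:

        # Minimal sketch (illustrative values): a risk index computed as
        # likelihood x impact on 1..5 scales, mapped onto a qualitative risk matrix.
        def risk_index(likelihood: int, impact: int) -> int:
            return likelihood * impact

        def tolerance_level(index: int) -> str:
            if index >= 15:
                return "intolerable - mitigate immediately"
            if index >= 8:
                return "tolerable - mitigation measures required"
            return "acceptable - monitor"

        risks = {
            "supplier delivery failure": (4, 3),     # assumed (likelihood, impact)
            "cold-chain breakdown": (2, 5),
            "regulatory non-compliance": (1, 4),
        }

        for name, (lik, imp) in risks.items():
            idx = risk_index(lik, imp)
            print(f"{name}: index = {idx:2d} -> {tolerance_level(idx)}")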

  20. Analysis of sponge zones for computational fluid mechanics

    International Nuclear Information System (INIS)

    Bodony, Daniel J.

    2006-01-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand-side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one-dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer
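
    A minimal numerical sketch of the sponge-zone idea (an assumed one-dimensional advection setup, not the paper's acoustic test cases): the forcing term -σ(x)(q - q_ref) damps the solution toward q_ref in a ramped region near the outflow boundary.

        # Minimal sketch (assumed setup): 1-D advection with a sponge zone added
        # near the outflow boundary; first-order upwind differencing for stability.
        import numpy as np

        nx, L, c = 400, 10.0, 1.0
        x = np.linspace(0.0, L, nx)
        dx = x[1] - x[0]
        dt = 0.4 * dx / c                       # CFL number 0.4

        q_ref = 0.0
        sigma = np.zeros(nx)
        sponge_start = 8.0
        mask = x > sponge_start
        sigma[mask] = 50.0 * ((x[mask] - sponge_start) / (L - sponge_start)) ** 2  # ramped strength

        q = np.exp(-((x - 3.0) ** 2))           # initial Gaussian pulse
        for _ in range(800):
            dqdx = np.zeros_like(q)
            dqdx[1:] = (q[1:] - q[:-1]) / dx    # upwind derivative for c > 0
            q = q - dt * (c * dqdx + sigma * (q - q_ref))

        print("max |q| remaining after the pulse enters the sponge:", np.abs(q).max())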

  1. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)]

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  2. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work

  3. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, and that can help ...

  4. Pilot Study of a Computer-Based Parental Questionnaire and Visual Profile of Obesity Risk in Healthy Preschoolers.

    Science.gov (United States)

    Davies, Marilyn A; Terhorst, Lauren; Zhang, Peng; Nakonechny, Amanda J; Nowalk, Mary Patricia

    2015-01-01

    This group field-tested a computer-based parental questionnaire entitled the Childhood Obesity Risk Questionnaire 2-5 (CORQ 2-5), designed to assess obesity risk in healthy preschoolers. The CORQ 2-5 generates a profile of seven obesity risk factors. Field studies provided good internal reliability data and evidence of discriminant validity for the CORQ 2-5. Pediatric nurse clinicians found the CORQ 2-5 profile to be clinically relevant. The CORQ 2-5 is a promising measure of obesity risk in preschoolers who attend community-based health centers for their well-child visits and who are not yet obese. The CORQ 2-5 is intended to guide provider-parental obesity risk discussions. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    Science.gov (United States)

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis

    Directory of Open Access Journals (Sweden)

    Paul D. Morris, PhD

    2017-08-01

    Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel “pseudotransient” analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf. >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.

  7. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  8. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that supports the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enable accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  9. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  10. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  11. Neural computations underlying social risk sensitivity

    Directory of Open Access Journals (Sweden)

    Nina Lauharatanahirun

    2012-08-01

    Under standard models of expected utility, preferences over stochastic events are assumed to be independent of the source of uncertainty. Thus, in decision-making, an agent should exhibit consistent preferences, regardless of whether the uncertainty derives from the unpredictability of a random process or the unpredictability of a social partner. However, when a social partner is the source of uncertainty, social preferences can influence decisions over and above pure risk attitudes. Here, we compared risk-related hemodynamic activity and individual preferences for two sets of options that differ only in the social or non-social nature of the risk. Risk preferences in social and non-social contexts were systematically related to neural activity during decision and outcome phases of each choice. Individuals who were more risk averse in the social context exhibited decreased risk-related activity in the amygdala during non-social decisions, while individuals who were more risk averse in the non-social context exhibited the opposite pattern. Differential risk preferences were similarly associated with hemodynamic activity in ventral striatum at the outcome of these decisions. These findings suggest that social preferences, including aversion to betrayal or exploitation by social partners, may be associated with variability in the response of these subcortical regions to social risk.

  12. Adapting risk management and computational intelligence network optimization techniques to improve traffic throughput and tail risk analysis.

    Science.gov (United States)

    2014-04-01

    Risk management techniques are used to analyze fluctuations in uncontrollable variables and keep those fluctuations from impeding the core function of a system or business. Examples of this are making sure that volatility in copper and aluminum pri...

  13. Risk analysis and reliability

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1979-01-01

    Mathematical foundations of risk analysis are addressed. The importance of having the same probability space in order to compare different experiments is pointed out. Then the following topics are discussed: consequences as random variables with infinite expectations; the phenomenon of rare events; series-parallel systems and different kinds of randomness that could be imposed on such systems; and the problem of consensus of estimates of expert opinion
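
    The series-parallel structures mentioned in the record can be illustrated with a short reliability computation (hypothetical component reliabilities, not taken from the article):

        # Minimal sketch: reliability of a series-parallel system in which
        # subsystem B consists of two redundant units.
        def series(*rel):
            p = 1.0
            for r in rel:
                p *= r
            return p

        def parallel(*rel):
            q = 1.0
            for r in rel:
                q *= (1.0 - r)          # probability that every redundant unit fails
            return 1.0 - q

        r_a = 0.95                       # assumed reliability of component A
        r_b1 = r_b2 = 0.90               # assumed reliabilities of the redundant B units

        system = series(r_a, parallel(r_b1, r_b2))
        print(f"system reliability: {system:.4f}")   # 0.95 * (1 - 0.1 * 0.1) = 0.9405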

  14. Assessment report on NRP sub-theme 'Risk Analysis'

    International Nuclear Information System (INIS)

    Biesiot, W.; Hendrickx, L.; Olsthoorn, A.A.

    1995-01-01

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs

  15. Applications of probabilistic risk analysis in nuclear criticality safety design

    International Nuclear Information System (INIS)

    Chang, J.K.

    1992-01-01

    Many documents have been prepared that try to define the scope of the criticality analysis and that suggest adding probabilistic risk analysis (PRA) to the deterministic safety analysis. The report of the US Department of Energy (DOE) AL 5481.1B suggested that an accident is credible if the occurrence probability is >1 x 10^-6/yr. The draft DOE 5480 safety analysis report suggested that safety analyses should include the application of methods such as deterministic safety analysis, risk assessment, reliability engineering, common-cause failure analysis, human reliability analysis, and human factor safety analysis techniques. The US Nuclear Regulatory Commission (NRC) report NRC SG830.110 suggested that major safety analysis methods should include but not be limited to risk assessment, reliability engineering, and human factor safety analysis. All of these suggestions have recommended including PRA in the traditional criticality analysis

  16. GIS risk analysis of hazardous materials transport

    International Nuclear Information System (INIS)

    Anders, C.; Olsten, J.

    1991-01-01

    The Geographic Information System (GIS) was used to assess the risks and vulnerability of transporting hazardous materials and wastes (such as gasoline, explosives, poisons, etc) on the Arizona highway system. This paper discusses the methodology that was utilized, and the application of GIS systems to risk analysis problems

  17. A computer-assisted motivational social network intervention to reduce alcohol, drug and HIV risk behaviors among Housing First residents.

    Science.gov (United States)

    Kennedy, David P; Hunter, Sarah B; Chan Osilla, Karen; Maksabedian, Ervant; Golinelli, Daniela; Tucker, Joan S

    2016-03-15

    Individuals transitioning from homelessness to housing face challenges to reducing alcohol, drug and HIV risk behaviors. To aid in this transition, this study developed and will test a computer-assisted intervention that delivers personalized social network feedback by an intervention facilitator trained in motivational interviewing (MI). The intervention goal is to enhance motivation to reduce high-risk alcohol and other drug (AOD) use and reduce HIV risk behaviors. In this Stage 1b pilot trial, 60 individuals who are transitioning from homelessness to housing will be randomly assigned to the intervention or control condition. The intervention condition consists of four biweekly social network sessions conducted using MI. AOD use and HIV risk behaviors will be monitored prior to and immediately following the intervention and compared to control participants' behaviors to explore whether the intervention was associated with any systematic changes in AOD use or HIV risk behaviors. Social network health interventions are an innovative approach for reducing future AOD use and HIV risk problems, but little is known about their feasibility, acceptability, and efficacy. The current study develops and pilot-tests a computer-assisted intervention that incorporates social network visualizations and MI techniques to reduce high-risk AOD use and HIV behaviors among the formerly homeless. ClinicalTrials.gov: NCT02140359.

  18. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and to show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating an appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.

  19. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    In this article the basic principles of the construction of an informational-computer system for the analysis of neutron spectra on the basis of measured reaction rates are given. The basic data files of the system, and the software and hardware needed for the system's operation, are described

  20. Ideas concerning the risk problem in technology

    International Nuclear Information System (INIS)

    Schneider, K.W.

    1975-01-01

    Scientific and technical development is primarily meant to improve the quality of life and to reduce mortality. The increased risk thus caused for the individual leads to a longer life expectation of man and to the ethical question of how safe a technical system should be. One must weigh justifiable effort against risk. Experience in reactor engineering (technical systems with hypothetical extreme risks) leads to considerations of how to analyse and determine the risk posed by large chemical plants to inhabited neighbourhoods, and to attempts to design a conceptual and computational model for a safety analysis that can be of use in decisions regarding new planning. (HP/LH)

  1. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    Science.gov (United States)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The process for the production of yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify the risks in detail, prevent them, and determine how they are to be handled, so that the risks can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment variables, raw material variables, and process variables. These include the critical risk of a lack of an aseptic process, more specifically damage to the yogurt starter due to contamination by fungus or other bacteria, and a lack of sanitation of equipment. The results of the quantitative FTA showed that the highest probability is that of the lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and equipment sanitation using hot water immersion.
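
    To make the FTA step concrete, the following is a minimal sketch of a top-event probability computed from OR and AND gates over independent basic events; the gate structure and probabilities are assumed for illustration and are not the study's data.

        # Minimal sketch (hypothetical basic-event probabilities): top-event
        # probability of a small fault tree, assuming independent basic events.
        def or_gate(*p):
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):
            prod = 1.0
            for pi in p:
                prod *= pi
            return prod

        p_contaminated_starter = 0.02    # assumed
        p_poor_sanitation = 0.03         # assumed
        p_operator_error = 0.01          # assumed

        # top event "non-aseptic process": the starter is contaminated OR
        # (sanitation is poor AND an operator error occurs)
        p_top = or_gate(p_contaminated_starter,
                        and_gate(p_poor_sanitation, p_operator_error))
        print(f"top-event probability: {p_top:.5f}")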

  2. Analysis of risk factors for persistent infection of asymptomatic women with high-risk human papilloma virus.

    Science.gov (United States)

    Shi, Nianmin; Lu, Qiang; Zhang, Jiao; Li, Li; Zhang, Junnan; Zhang, Fanglei; Dong, Yanhong; Zhang, Xinyue; Zhang, Zheng; Gao, Wenhui

    2017-06-03

    This study aims to prevent persistent infection, reduce the incidence of cervical cancer, and improve women's health by understanding the theoretical basis of the risk factors for persistent infection of asymptomatic women with high-risk human papilloma virus (HPV) strains, using the information collected, which includes the persistent infection rate and the most prevalent high-risk HPV types among asymptomatic women in the high-risk area for cervical cancer in Linfen, Shanxi Province. Based on the method of cluster sampling, locations were chosen from an industrial county and an agricultural county of Linfen, Shanxi Province, namely the Xiangfen and Quwo counties. The convenience sampling (CS) method was used to identify women who are sexually active but have no symptoms of cervical abnormality, in order to analyze the risk factors for HPV-DNA detection and to perform a retrospective questionnaire survey in these 2 counties. Firstly, cervical exfoliated cell samples were collected for a thin-layer liquid-based cytology test (TCT) while simultaneously testing for high-risk HPV DNA; samples with positive testing results were then retested to identify the infecting HPV types. Testing over a 6-month period was done to derive the 6-month persistent infection rate. The retrospective survey addressed the following concepts in the questionnaire: the basic situation of the research subjects, menstrual history, marital status, pregnancy history, sexual habits and other aspects. The questionnaire was divided into a case group and a comparison group based on the high-risk HPV-DNA testing result, that is, whether or not there was persistent infection. Statistical analysis employed EpiData 3.1 software for data entry and SPSS 17.0 for statistical analysis. Statistical charts, chi-square analysis, single-factor analysis and multivariate logistic regression analysis were used to analyze the protective factors and risk factors of high-risk HPV infection. Risk factors are predicted by using the

  3. Non-invasive Characterization of the Histopathologic Features of Pulmonary Nodules of the Lung Adenocarcinoma Spectrum using Computer Aided Nodule Assessment and Risk Yield (CANARY) – a Pilot Study

    Science.gov (United States)

    Maldonado, Fabien; Boland, Jennifer M.; Raghunath, Sushravya; Aubry, Marie Christine; Bartholmai, Brian J.; deAndrade, Mariza; Hartman, Thomas E.; Karwoski, Ronald A.; Rajagopalan, Srinivasan; Sykes, Anne-Marie; Yang, Ping; Yi, Eunhee S.; Robb, Richard A.; Peikert, Tobias

    2013-01-01

    Introduction: Pulmonary nodules of the adenocarcinoma spectrum are characterized by distinctive morphological and radiological features and variable prognosis. Non-invasive high-resolution computed-tomography (HRCT)-based risk stratification tools are needed to individualize their management. Methods: Radiological measurements of histopathologic tissue invasion were developed in a training set of 54 pulmonary nodules of the adenocarcinoma spectrum and validated in 86 consecutively resected nodules. Nodules were isolated and characterized by computer-aided analysis and data were analyzed by Spearman correlation, sensitivity, specificity as well as the positive and negative predictive values. Results: Computer Aided Nodule Assessment and Risk Yield (CANARY) can non-invasively characterize pulmonary nodules of the adenocarcinoma spectrum. Unsupervised clustering analysis of HRCT data identified 9 unique exemplars representing the basic radiologic building blocks of these lesions. The exemplar distribution within each nodule correlated well with the proportion of histologic tissue invasion, Spearman R = 0.87, p < 0.0001 and 0.89, p < 0.0001 for the training and the validation set, respectively. Clustering of the exemplars in three-dimensional space corresponding to tissue invasion and lepidic growth was used to develop a CANARY decision algorithm, which successfully categorized these pulmonary nodules as “aggressive” (invasive adenocarcinoma) or “indolent” (adenocarcinoma in situ and minimally invasive adenocarcinoma). Sensitivity, specificity, positive predictive value and negative predictive value of this approach for the detection of “aggressive” lesions were 95.4%, 96.8%, 95.4% and 96.8%, respectively, in the training set and 98.7%, 63.6%, 94.9% and 87.5%, respectively, in the validation set. Conclusion: CANARY represents a promising tool to non-invasively risk stratify pulmonary nodules of the adenocarcinoma spectrum. PMID:23486265

  4. The effect of decreasing computed tomography dosage on radiostereometric analysis (RSA) accuracy at the glenohumeral joint.

    Science.gov (United States)

    Fox, Anne-Marie V; Kedgley, Angela E; Lalone, Emily A; Johnson, James A; Athwal, George S; Jenkyn, Thomas R

    2011-11-10

    Standard, beaded radiostereometric analysis (RSA) and markerless RSA often use computed tomography (CT) scans to create three-dimensional (3D) bone models. However, ethical concerns exist due to risks associated with CT radiation exposure. Therefore, the aim of this study was to investigate the effect of decreasing CT dosage on RSA accuracy. Four cadaveric shoulder specimens were scanned using a normal-dose CT protocol and two low-dose protocols, where the dosage was decreased by 89% and 98%. 3D computer models of the humerus and scapula were created using each CT protocol. Bi-planar fluoroscopy was used to image five different static glenohumeral positions and two dynamic glenohumeral movements, of which a total of five static and four dynamic poses were selected for analysis. For standard RSA, negligible differences were found in bead (0.21±0.31mm) and bony landmark (2.31±1.90mm) locations when the CT dosage was decreased by 98% (p-values>0.167). For markerless RSA kinematic results, excellent agreement was found between the normal-dose and lowest-dose protocol, with all Spearman rank correlation coefficients greater than 0.95. Average root mean squared errors of 1.04±0.68mm and 2.42±0.81° were also found at this reduced dosage for static positions. In summary, CT dosage can be markedly reduced when performing shoulder RSA to minimize the risks of radiation exposure. Standard RSA accuracy was negligibly affected by the 98% CT dose reduction and for markerless RSA, the benefits of decreasing CT dosage to the subject outweigh the introduced errors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Shuttle user analysis (study 2.2): Volume 3. Business Risk And Value of Operations in space (BRAVO). Part 4: Computer programs and data look-up

    Science.gov (United States)

    1974-01-01

    Computer program listings, as well as graphical and tabulated data needed by the analyst to perform a BRAVO analysis, were examined. A graphical aid that can be used to determine the earth coverage of satellites in synchronous equatorial orbits was described. A listing for the satellite synthesis computer program, as well as a sample printout for the DSCS-II satellite program and a listing of the symbols used in the program, were included. The APL-language listing for the payload program cost estimating computer program was given. This language is compatible with many of the time-sharing remote terminal computers used in the United States. Data on the Intelsat communications network were studied. Costs for telecommunications systems leasing, line-of-sight microwave relay communications systems, submarine telephone cables, and terrestrial power generation systems were also described.

  6. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    The article discusses the widely used classical method of analysis, forecasting and decision-making for various economic problems, known as SWOT analysis. As is known, it is a qualitative, multicriteria comparison of the degree of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market developments, and for assessing the status and prospects of development of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of different project management tasks - investment, innovation, marketing, development, design and bringing products to market, and so on. However, in practical competitive market and economic conditions there are various uncertainties, ambiguities and kinds of vagueness, which make the use of SWOT analysis in the classical sense insufficiently justified and ineffective. In this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the respective task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc for processing and operating on fuzzy input data are also given. Finally, considerations for interpreting the results are presented.
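
    A minimal sketch of the fuzzy representation the article argues for (illustrative numbers and weights, not the article's example, and independent of the Fuzicalc software): SWOT factor scores as triangular fuzzy numbers, aggregated by a weighted average and compared through centroid defuzzification.

        # Minimal sketch: SWOT factor scores as triangular fuzzy numbers
        # (low, mode, high) on a 0..10 scale, aggregated with expert weights.
        def weighted_tfn(scores, weights):
            total = sum(weights)
            return tuple(sum(w * s[i] for s, w in zip(scores, weights)) / total
                         for i in range(3))

        def centroid(tfn):
            a, b, c = tfn
            return (a + b + c) / 3.0     # simple centroid defuzzification

        strength_scores = [(6, 7, 9), (4, 6, 7), (7, 8, 10)]    # assumed assessments
        strength_weights = [0.5, 0.2, 0.3]

        weakness_scores = [(3, 5, 6), (5, 6, 8)]
        weakness_weights = [0.6, 0.4]

        S = weighted_tfn(strength_scores, strength_weights)
        W = weighted_tfn(weakness_scores, weakness_weights)
        print("Strengths (fuzzy):", S, "centroid:", round(centroid(S), 2))
        print("Weaknesses (fuzzy):", W, "centroid:", round(centroid(W), 2))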

  7. Chemical risk evaluation, importance of the risk analysis framework uses: Latin America development restrictions

    International Nuclear Information System (INIS)

    Carrillo, M.

    2013-01-01

    The PowerPoint presentation covers the reach and results of risk analysis in Venezuela: chemical dangers in food, human damage, injuries, technology news in food development, toxicity, microbiological risk, and technical recommendations

  8. Computational Fatigue Life Analysis of Carbon Fiber Laminate

    Science.gov (United States)

    Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.

    2018-02-01

    In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for many of their components. Replacing components that are subjected to static or impact loads is less challenging than replacing components that are subjected to dynamic loading. Replacing components with ones made of composite materials demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of the composite material by using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on a (0°/90°/90°/90°/90°)s laminate sequence and a (45°/-45°)2s laminate sequence by using a computer script. The fatigue lives for the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show a better fatigue life than angle-ply laminates.

  9. Spent fuel pool risk analysis for the Dukovany NPP

    Energy Technology Data Exchange (ETDEWEB)

    Hust'ak, S.; Jaros, M.; Kubicek, J. [UJV Rez, a.s., Husinec-Rez (Czech Republic)]

    2013-07-01

    UJV Rez, a.s. maintains a Living Probabilistic Safety Assessment (Living PSA) program for Dukovany Nuclear Power Plant (NPP) in the Czech Republic. This project has been established as a framework for activities related to risk assessment and to support for risk-informed decision making at this plant. The most extensively used PSA application at Dukovany NPP is risk monitoring of instantaneous (point-in-time) risk during plant operation, especially for the purpose of configuration risk management during plant scheduled outages to avoid risk significant configurations. The scope of PSA for Dukovany NPP includes also determination of a risk contribution from spent fuel pool (SFP) operation to provide recommendations for the prevention and mitigation of SFP accidents and to be applicable for configuration risk management. This paper describes the analysis of internal initiating events (IEs) in PSA for Dukovany NPP, which can contribute to the risk from SFP operation. The analysis of those IEs was done more thoroughly in the PSA for Dukovany NPP in order to be used in instantaneous risk monitoring. (orig.)

  10. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    International Nuclear Information System (INIS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user
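
    The relationship between the FSO and the process-level metrics described above is often written, in ICMSF-style notation, as H0 - ΣR + ΣI ≤ FSO, with all terms in log10 units. A minimal sketch with illustrative numbers, not taken from the record:

        # Minimal sketch (illustrative values): checking compliance with a Food
        # Safety Objective via H0 - sum(R) + sum(I) <= FSO, in log10 cfu/g.
        H0 = 2.0                    # assumed initial hazard level
        reductions = [5.0]          # e.g., a heat treatment giving a 5-log reduction
        increases = [0.5, 1.0]      # e.g., recontamination and growth during storage
        FSO = -1.0                  # assumed objective at the time of consumption

        level_at_consumption = H0 - sum(reductions) + sum(increases)
        print("level at consumption (log10 cfu/g):", level_at_consumption)
        print("meets FSO:", level_at_consumption <= FSO)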

  11. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr; Paramythiotis, Spyridon, E-mail: pskan@aua.gr [Laboratory of Food Quality Control and Hygiene, Department of Food Science and Technology, Agricultural University of Athens, Iera Odos 75, 118 55, Athens (Greece)

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  12. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Science.gov (United States)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user-friendly software

  13. Risk analysis for decision support in electricity distribution system asset management: methods and frameworks for analysing intangible risks

    Energy Technology Data Exchange (ETDEWEB)

    Nordgaard, Dag Eirik

    2010-04-15

    During the last 10 to 15 years electricity distribution companies throughout the world have been ever more focused on asset management as the guiding principle for their activities. Within asset management, risk is a key issue for distribution companies, together with handling of cost and performance. There is now an increased awareness of the need to include risk analyses in the companies' decision making processes. Much of the work on risk in electricity distribution systems has focused on aspects of reliability. This is understandable, since it is surely an important feature of the product delivered by the electricity distribution infrastructure, and it is high on the agenda for regulatory authorities in many countries. However, electricity distribution companies are also concerned with other risks relevant for their decision making. This typically involves intangible risks, such as safety, environmental impacts and company reputation. In contrast to the numerous methodologies developed for reliability risk analysis, there are relatively few applications of structured analyses to support decisions concerning intangible risks, even though they represent an important motivation for decisions taken in electricity distribution companies. The overall objective of this PhD work has been to explore risk analysis methods that can be used to improve and support decision making in electricity distribution system asset management, with an emphasis on the analysis of intangible risks. The main contributions of this thesis can be summarised as: An exploration and testing of quantitative risk analysis (QRA) methods to support decisions concerning intangible risks; The development of a procedure for using life curve models to provide input to QRA models; The development of a framework for risk-informed decision making where QRAs are used to analyse selected problems; In addition, the results contribute to clarifying the basic concepts of risk and highlight challenges

  14. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming the blood flow as laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branched and angle-shaped ones, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. The three-dimensional, time-dependent analysis of blood flow, accounting for the effect of body forces with a compliant boundary, was also performed.

  15. The multimedia computer for low-literacy patient education: a pilot project of cancer risk perceptions.

    Science.gov (United States)

    Wofford, J L; Currin, D; Michielutte, R; Wofford, M M

    2001-04-20

    Inadequate reading literacy is a major barrier to better educating patients. Despite its high prevalence, practical solutions for detecting and overcoming low literacy in a busy clinical setting remain elusive. In exploring the potential role for the multimedia computer in improving office-based patient education, we compared the accuracy of information captured from audio-computer interviewing of patients with that obtained from subsequent verbal questioning. The setting was an adult medicine clinic in an urban community health center; participants were a convenience sample of patients awaiting clinic appointments (n = 59). Exclusion criteria included obvious psychoneurologic impairment or a primary language other than English. The intervention was a multimedia computer presentation that used audio-computer interviewing with localized imagery and voices to elicit responses to 4 questions on prior computer use and cancer risk perceptions. Three patients refused or were unable to interact with the computer at all, and 3 patients required restarting the presentation from the beginning but ultimately completed the computerized survey. Of the 51 evaluable patients (72.5% African-American, 66.7% female, mean age 47.5 [+/- 18.1]), the mean time in the computer presentation was significantly longer with older age and with no prior computer use, but did not differ by gender or race. Despite a high proportion of patients with no prior computer use (60.8%), there was a high rate of agreement (88.7% overall) between audio-computer interviewing and subsequent verbal questioning. Audio-computer interviewing is feasible in this urban community health center. The computer offers a partial solution for overcoming literacy barriers inherent in written patient education materials and provides an efficient means of data collection that can be used to better target patients' educational needs.

  16. Computer security engineering management

    International Nuclear Information System (INIS)

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than being appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper directs itself to analysis and management of security-related risks, followed by discussion of the design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users and by coverage of life cycle support for the security of the system

  17. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis on new technological developments is performed. Sensors for position and motion detection, and such for equipment and physiological monitoring are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  18. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    Science.gov (United States)

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. The method thus transforms the risk analysis results from failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical in influencing the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce its influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a use case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
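
    A minimal sketch of the kind of Bayesian-network evaluation the record describes (hypothetical probabilities, not the Shengmai plant data, and without the GeNie/AgenaRisk tools): the marginal failure risk and a node-criticality measure are obtained by direct enumeration over the parent states.

        # Minimal sketch: two-cause Bayesian network for a quality-failure node,
        # evaluated by enumerating the parent configurations.
        from itertools import product

        p_sterilization_fault = 0.02     # P(A = 1), assumed
        p_raw_material_defect = 0.05     # P(B = 1), assumed

        # conditional probability table: P(quality failure = 1 | A, B)
        cpt = {(0, 0): 0.001, (0, 1): 0.10, (1, 0): 0.20, (1, 1): 0.60}

        p_parent = {                     # marginal probability of each parent configuration
            (a, b): (p_sterilization_fault if a else 1 - p_sterilization_fault)
                    * (p_raw_material_defect if b else 1 - p_raw_material_defect)
            for a, b in product((0, 1), repeat=2)
        }

        p_failure = sum(p_parent[s] * cpt[s] for s in p_parent)
        p_a_given_failure = sum(p_parent[s] * cpt[s] for s in p_parent if s[0] == 1) / p_failure

        print(f"P(quality failure) = {p_failure:.4f}")
        print(f"P(sterilization fault | failure) = {p_a_given_failure:.3f}")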

  19. Dietary patterns and colorectal cancer risk: a meta-analysis.

    Science.gov (United States)

    Feng, Yu-Liang; Shu, Long; Zheng, Pei-Fen; Zhang, Xiao-Yan; Si, Cai-Juan; Yu, Xiao-Long; Gao, Wei; Zhang, Lun

    2017-05-01

    The analysis of dietary patterns has recently drawn considerable attention as a method of investigating the association between the overall diet and the risk of colorectal cancer. However, such studies have yielded conflicting findings. Here, we carried out a meta-analysis to identify the association between dietary patterns and the risk of colorectal cancer. A total of 40 studies fulfilled the inclusion criteria and were included in this meta-analysis. The highest category of a 'healthy' dietary pattern, compared with the lowest category, was apparently associated with a decreased risk of colorectal cancer [odds ratio (OR) = 0.75; confidence interval (CI): 0.68-0.83]. An increased risk of colorectal cancer was shown for the highest compared with the lowest category of a 'western-style' dietary pattern (OR = 1.40; CI: 1.26-1.56). There was evidence of an increased risk of colorectal cancer in the highest compared with the lowest category of an 'alcohol-consumption' pattern (OR = 1.44; CI: 1.13-1.82; P = 0.003). The results of this meta-analysis indicate that a 'healthy' dietary pattern may decrease the risk of colorectal cancer, whereas 'western-style' and 'alcohol-consumption' patterns may increase the risk of colorectal cancer.

  20. Conceptual Aspects in the Modeling of Logistical Risk of the Networked Information Economy with the Use of Tools of Natural Computing

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2016-11-01

    Information and communication tools and technologies are rapidly changing the daily lives of people and the business processes of economic activity, primarily in the field of logistics. In particular, the innovative nature of these transformations leads to the emergence of new logistical risks and changes the essence of existing ones, which needs to be taken into account in the management of logistics systems at various levels. Besides, the problem of Big Data has become increasingly urgent: on the one hand, Big Data can improve the validity of managerial decisions; on the other hand, they require modern tools for their production, processing and analysis. Methods and models of natural computing can be used as such tools. In the paper the basics of ant and bee algorithms, the particle swarm method, and artificial immune systems are summarized; the possibilities of their application in the modeling of various types of logistical risk are demonstrated, and the formalization of the problem of risk modeling with the use of an artificial immune system is given as a conditional example.
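
    As a hedged illustration of the natural-computing tools named above (a generic particle swarm optimizer, not the authors' immune-system formalization; the objective function and parameters are assumed), the following sketch minimizes a toy logistics risk-cost function of two route parameters.

        # Minimal sketch: particle swarm optimization of an assumed risk-cost function.
        import random

        def risk_cost(x, y):
            # toy objective: trade-off between route length and exposure to risk
            return (x - 3.0) ** 2 + (y + 1.0) ** 2 + 2.0

        n_particles, n_iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
        pos = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(n_particles)]
        vel = [[0.0, 0.0] for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [risk_cost(*p) for p in pos]
        gbest = min(pbest, key=lambda p: risk_cost(*p))[:]

        for _ in range(n_iters):
            for i in range(n_particles):
                for d in range(2):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = risk_cost(*pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < risk_cost(*gbest):
                        gbest = pos[i][:]

        print("best parameters:", [round(v, 3) for v in gbest],
              "risk cost:", round(risk_cost(*gbest), 3))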