WorldWideScience

Sample records for analysis proper risk

  1. Identifying Proper Names Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The issue of proper name recognition in Chinese text is discussed. An automatic approach based on association analysis to extract rules from a corpus is presented. The method tries to discover rules relevant to external evidence by association analysis, without additional manual effort. These rules can be used to recognize proper nouns in Chinese texts. The experimental results show that our method is practical in some applications. Moreover, the method is language independent.

  2. Access to Heart Transplantation: A Proper Analysis of the Competing Risks of Death and Transplantation Is Required to Optimize Graft Allocation.

    Science.gov (United States)

    Cantrelle, Christelle; Legeai, Camille; Latouche, Aurélien; Tuppin, Philippe; Jasseron, Carine; Sebbag, Laurent; Bastien, Olivier; Dorent, Richard

    2017-08-01

    Heart allocation systems are usually urgency-based, offering grafts to candidates at high risk of waitlist mortality. In the context of a revision of the heart allocation rules, we determined observed predictors of 1-year waitlist mortality in France, considering the competing risk of transplantation, to determine which candidate subgroups are favored or disadvantaged by the current allocation system. Patients registered on the French heart waitlist between 2010 and 2013 were included. Cox cause-specific hazards and Fine and Gray subdistribution hazards were used to determine candidate characteristics associated with waitlist mortality and access to transplantation. Of the 2053 candidates, 7 variables were associated with 1-year waitlist mortality by the Fine and Gray method, including 4 candidate characteristics related to heart failure severity (hospitalization at listing, serum natriuretic peptide level, systolic pulmonary artery pressure, and glomerular filtration rate) and 3 characteristics not associated with heart failure severity but with lower access to transplantation (blood type, age, and body mass index). Observed waitlist mortality for candidates on mechanical circulatory support was similar to that of other candidates. The heart allocation system strongly modifies the risk of pretransplant mortality related to heart failure severity. An in-depth competing risk analysis is therefore a more appropriate method to evaluate graft allocation systems. This knowledge should help to prioritize candidates in the context of a limited donor pool.
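
    The competing-risks machinery described here can be illustrated with the nonparametric Aalen-Johansen estimator, which underlies cumulative incidence curves of the kind the Fine and Gray method models. Below is a minimal pure-Python sketch on invented toy data, not the study's model or data:

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence of `cause`,
    treating other nonzero event codes as competing risks (0 = censored)."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    surv = 1.0   # overall Kaplan-Meier survival just before the current time
    cif = 0.0
    i = 0
    while i < n:
        t = data[i][0]
        tied = [e for (tt, e) in data if tt == t]   # all subjects tied at time t
        d_all = sum(1 for e in tied if e != 0)
        d_cause = sum(1 for e in tied if e == cause)
        cif += surv * d_cause / at_risk
        surv *= 1.0 - d_all / at_risk
        at_risk -= len(tied)
        i += len(tied)
    return cif

# toy waitlist data: event 1 = death, event 2 = transplant, 0 = censored
times, events = [1, 2, 3, 4], [1, 2, 1, 0]
print(round(cumulative_incidence(times, events, cause=1), 6))  # 0.5
print(round(cumulative_incidence(times, events, cause=2), 6))  # 0.25
```

    Naively censoring transplanted patients and reporting 1 minus the Kaplan-Meier estimate would overstate waitlist mortality, which is exactly why the abstract calls for a proper competing-risk analysis.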

  3. Regular inhaled corticosteroids in adult-onset asthma and the risk for future cancer: a population-based cohort study with proper person-time analysis

    Directory of Open Access Journals (Sweden)

    Kok VC

    2015-03-01

    Full Text Available Victor C Kok,1,2 Jorng-Tzong Horng,2,3 Hsu-Kai Huang,3 Tsung-Ming Chao,4 Ya-Fang Hong5 1Division of Medical Oncology, Department of Internal Medicine, Kuang Tien General Hospital, Taichung, Taiwan; 2Department of Biomedical Informatics, Asia University Taiwan, Taichung, Taiwan; 3Department of Computer Science and Information Engineering, National Central University, Jhongli, Taiwan; 4Statistics Unit, Department of Applied Geomatics, Chien Hsin University, Jhongli, Taiwan; 5Institute of Molecular Biology, Academia Sinica, Nankang, Taipei, Taiwan Background: Recent studies have shown that inhaled corticosteroids (ICS) can exert anti-inflammatory effects for chronic airway diseases, and several observational studies suggest that they play a role as cancer chemopreventive agents, particularly against lung cancer. We aimed to examine whether regular ICS use was associated with a reduced risk for future malignancy in patients with newly diagnosed adult-onset asthma. Methods: We used a population-based cohort study between 2001 and 2008 with appropriate person-time analysis. Participants were followed up until the first incident of cancer, death, or the end of 2008. The Cox model was used to derive an adjusted hazard ratio (aHR) for cancer development. Kaplan–Meier cancer-free survival curves of the two groups were compared. Results: The exposed group of 2,117 regular ICS users and the nonexposed group of 17,732 non-ICS users were assembled. After 7,365 (mean, 3.5 years; standard deviation, 2.1) and 73,789 (mean, 4.1 years; standard deviation, 2.4) person-years of follow-up for the ICS users and the comparator group of non-ICS users, respectively, the aHR for overall cancer was nonsignificantly elevated at 1.33 (95% confidence interval [CI], 1.00–1.76; P=0.0501). The difference between the Kaplan–Meier overall cancer-free curves of the two groups was not significant (log-rank, P=0.065). Synergistic interaction of concurrent presence of regular ICS use was…
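
    The person-time analysis mentioned here boils down to incidence rates per person-years. A small sketch; the person-year denominators are taken from the abstract, but the case counts (and hence the rates) are hypothetical, since the abstract does not report them:

```python
def incidence_rate(cases, person_years, per=1000.0):
    """Crude incidence rate per `per` person-years."""
    return per * cases / person_years

# denominators from the abstract; the case counts are hypothetical
ir_ics     = incidence_rate(60, 7365)    # exposed (regular ICS users)
ir_non_ics = incidence_rate(340, 73789)  # nonexposed (non-ICS users)
crude_rate_ratio = ir_ics / ir_non_ics
print(round(ir_ics, 2), round(ir_non_ics, 2), round(crude_rate_ratio, 2))  # 8.15 4.61 1.77
```

    The crude rate ratio is then adjusted for confounders in the Cox model to yield the aHR the abstract reports.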

  4. Towards proper sampling and statistical analysis of defects

    Directory of Open Access Journals (Sweden)

    Cetin Ali

    2014-06-01

    Full Text Available Advancements in applied statistics with great relevance to defect sampling and analysis are presented. Three main issues are considered: (i) proper handling of multiple defect types, (ii) relating sample data originating from polished inspection surfaces (2D) to finite material volumes (3D), and (iii) application of advanced extreme value theory in the statistical analysis of block maximum data. Original and rigorous, but practical, mathematical solutions are presented. Finally, these methods are applied to make predictions regarding defect sizes in a steel alloy containing multiple defect types.
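
    The block-maximum approach from extreme value theory mentioned in (iii) can be sketched as follows. Defect sizes are simulated here, and a Gumbel distribution is fitted to the block maxima by the method of moments; the full generalized extreme value fit used in such studies also estimates a shape parameter, so this is a simplified illustration:

```python
import math
import random
import statistics

random.seed(42)

# hypothetical defect sizes (micrometres) from polished inspection surfaces
defect_sizes = [random.lognormvariate(2.0, 0.5) for _ in range(4000)]

# split into blocks (e.g. one inspection surface per block) and take maxima
block = 40
maxima = [max(defect_sizes[i:i + block]) for i in range(0, len(defect_sizes), block)]

# method-of-moments fit of a Gumbel distribution to the block maxima
euler_gamma = 0.5772156649
beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi
mu = statistics.mean(maxima) - euler_gamma * beta

def return_level(T):
    """Defect size exceeded, on average, once every T blocks (surfaces)."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

print(round(return_level(100), 1))  # predicted largest defect in ~100 surfaces
```

    The return level answers the practical question behind such analyses: what is the largest defect we should expect to encounter once in T inspection surfaces?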

  5. The Lorentzian proper vertex amplitude: Classical analysis and quantum derivation

    CERN Document Server

    Engle, Jonathan

    2015-01-01

    Spin foam models, an approach to defining the dynamics of loop quantum gravity, make use of the Plebanski formulation of gravity, in which gravity is recovered from a topological field theory via certain constraints called simplicity constraints. However, the simplicity constraints in their usual form select more than just one gravitational sector as well as a degenerate sector. This was shown, in previous work, to be the reason for the "extra" terms appearing in the semiclassical limit of the Euclidean EPRL amplitude. In this previous work, a way to eliminate the extra sectors, and hence terms, was developed, leading to what was called the Euclidean proper vertex amplitude. In the present work, these results are extended to the Lorentzian signature, establishing what is called the Lorentzian proper vertex amplitude. This extension is non-trivial and involves a number of new elements since, for Lorentzian bivectors, the split into self-dual and anti-self-dual parts, on which the Euclidean derivation was b...

  6. Proper motion survey and kinematic analysis of the Rho Ophiuchi embedded cluster

    CERN Document Server

    Ducourant, C; Krone-Martins, A; Bontemps, S; Despois, D; Galli, P A B; Bouy, H; Campion, J F Le; Rapaport, M; Cuillandre, J C

    2016-01-01

    We aim at performing a kinematic census of young stellar objects (YSOs) in the Rho Ophiuchi F core and partially in the E core of the L1688 dark cloud. We run a proper motion program at the ESO New Technology Telescope (NTT) with the Son of ISAAC (SOFI) instrument over nine years in the near-infrared. We complemented these observations with various public image databases to enlarge the time base of observations and the field of investigation to 0.5 deg X 0.5 deg. We derived positions and proper motions for 2213 objects. From these, 607 proper motions were derived from SOFI observations with a ~1.8 mas/yr accuracy while the remaining objects were measured only from auxiliary data with a mean precision of about ~3 mas/yr. We performed a kinematic analysis of the most accurate proper motions derived in this work, which allowed us to separate cluster members from field stars and to derive the mean properties of the cluster. From the kinematic analysis we derived a list of 68 members and 14 candidate members, comp...

  7. Arsenic speciation in fish products and seafood as a prerequisite for proper risk assessment

    Directory of Open Access Journals (Sweden)

    Pierluigi Piras

    2015-02-01

    Full Text Available The Boi Cerbus lagoon, facing a mining and industrial site in Sardinia (Italy), is an important fishing area for the local population. Previous studies showed high concentrations of total arsenic (Astot) in fish, molluscs and crustaceans sampled in the lagoon, and a possible exceedance of the provisional tolerable weekly intake set by the Joint FAO/WHO Expert Committee on Food Additives by some local consumer groups. However, the percentage of inorganic As (Asinorg) should be known for a correct assessment of the potential risk, as its toxicity is much higher than that of the organic forms. Eighty samples of 14 different species of fish, molluscs and crustaceans, sampled in the Boi Cerbus lagoon in 3 different seasons (winter, spring and summer), were analysed for Astot by inductively coupled plasma mass spectrometry (ICP-MS) and for Asinorg by high performance liquid chromatography-ICP-MS. All the data obtained from the analysis were statistically processed to evaluate significant differences based on season, taxon and habitat, in preparation for a subsequent risk assessment.

  8. Design Analysis Rules to Identify Proper Noun from Bengali Sentence for Universal Networking language

    Directory of Open Access Journals (Sweden)

    Md. Syeful Islam

    2014-08-01

    Full Text Available Nowadays, hundreds of millions of people of almost all levels of education and attitudes from different countries communicate with each other for different purposes and perform their jobs on the internet or other communication media using various languages. Since no one knows every language, it is very difficult to communicate or work across languages. In this situation, computer scientists have introduced various interlanguage translation programs (machine translation). UNL is one such interlanguage translation program. One of the major problems in UNL is identifying a name in a sentence, which is relatively simple in English because such entities start with a capital letter. Bangla has no concept of small or capital letters, so it is difficult to determine whether a word is a proper noun or not. Here we propose analysis rules to identify proper nouns in a sentence and establish a post-converter that translates named entities from Bangla to UNL. The goal is to make possible Bangla sentence conversion to UNL and vice versa. Theoretical analysis shows that our proposed system is able to identify proper nouns in Bangla sentences and produce the corresponding Universal Words for UNL.

  9. Towards a proper assignment of systemic risk: the combined roles of network topology and shock characteristics.

    Science.gov (United States)

    Loepfe, Lasse; Cabrales, Antonio; Sánchez, Angel

    2013-01-01

    The 2007-2008 financial crisis solidified the consensus among policymakers that a macro-prudential approach to regulation and supervision should be adopted. The currently preferred policy option is the regulation of capital requirements, with the main focus on combating procyclicality and on identifying the banks that have a high systemic importance, those that are "too big to fail". Here we argue that the concept of systemic risk should include the analysis of the system as a whole, and we systematically explore the properties of network topology, for policy purposes, that matter most for resistance to shocks. In a thorough study going from analytical models to empirical data, we show two sharp transitions from safe to risky regimes: 1) diversification becomes harmful with just a small fraction (~2%) of the shocks sampled from a fat-tailed shock distribution, and 2) when large shocks are present, a critical link density exists where an effective giant cluster forms and most firms become vulnerable. This threshold depends on the network topology, especially on modularity. Firm size heterogeneity has important but diverse effects that are heavily dependent on shock characteristics. Similarly, degree heterogeneity increases vulnerability only when shocks are directed at the most connected firms. Furthermore, by studying the structure of the core of the transnational corporation network from real data, we show that its stability could be clearly increased by removing some of the links with the highest betweenness centrality. Our results provide novel insights and arguments for policy makers to focus surveillance on the connections between firms, in addition to capital requirements directed at the nodes.

  10. Towards a proper assignment of systemic risk: the combined roles of network topology and shock characteristics.

    Directory of Open Access Journals (Sweden)

    Lasse Loepfe

    Full Text Available The 2007-2008 financial crisis solidified the consensus among policymakers that a macro-prudential approach to regulation and supervision should be adopted. The currently preferred policy option is the regulation of capital requirements, with the main focus on combating procyclicality and on identifying the banks that have a high systemic importance, those that are "too big to fail". Here we argue that the concept of systemic risk should include the analysis of the system as a whole, and we systematically explore the properties of network topology, for policy purposes, that matter most for resistance to shocks. In a thorough study going from analytical models to empirical data, we show two sharp transitions from safe to risky regimes: 1) diversification becomes harmful with just a small fraction (~2%) of the shocks sampled from a fat-tailed shock distribution, and 2) when large shocks are present, a critical link density exists where an effective giant cluster forms and most firms become vulnerable. This threshold depends on the network topology, especially on modularity. Firm size heterogeneity has important but diverse effects that are heavily dependent on shock characteristics. Similarly, degree heterogeneity increases vulnerability only when shocks are directed at the most connected firms. Furthermore, by studying the structure of the core of the transnational corporation network from real data, we show that its stability could be clearly increased by removing some of the links with the highest betweenness centrality. Our results provide novel insights and arguments for policy makers to focus surveillance on the connections between firms, in addition to capital requirements directed at the nodes.
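
    A toy version of shock propagation on a financial network can make the link-density effect concrete. This is a simplified contagion sketch (random network, uniform capital, equal loss sharing among neighbours), not the authors' model:

```python
import random

random.seed(7)

def cascade(n, link_prob, shock, capital=1.0):
    """Propagate a shock through a random (Erdos-Renyi) network of firms:
    a failing firm passes its loss on, split equally among its neighbours.
    Returns the number of failed firms."""
    neigh = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < link_prob:
                neigh[i].add(j)
                neigh[j].add(i)
    losses = [0.0] * n
    losses[0] = shock                    # shock a single firm
    failed, queue = set(), [0]
    while queue:
        i = queue.pop()
        if i in failed or losses[i] < capital:
            continue
        failed.add(i)
        share = losses[i] / max(len(neigh[i]), 1)
        for j in neigh[i]:
            if j not in failed:
                losses[j] += share
                if losses[j] >= capital:
                    queue.append(j)
    return len(failed)

# same large shock on a sparse vs a denser network
print(cascade(100, 0.01, shock=50.0), cascade(100, 0.10, shock=50.0))
```

    Sweeping `link_prob` in such a simulation is one way to observe the kind of safe-to-risky transition in link density that the abstract describes.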

  11. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  12. Proper motion survey and kinematic analysis of the ρ Ophiuchi embedded cluster

    Science.gov (United States)

    Ducourant, C.; Teixeira, R.; Krone-Martins, A.; Bontemps, S.; Despois, D.; Galli, P. A. B.; Bouy, H.; Le Campion, J. F.; Rapaport, M.; Cuillandre, J. C.

    2017-01-01

    Context. The ρ Ophiuchi molecular complex and in particular the Lynds L1688 dark cloud is unique in its proximity (~130 pc), in its richness in young stars and protostars, and in its youth (0.5 Myr). It is certainly one of the best targets currently accessible from the ground to study the early phases of star-formation. Proper motion analysis is a very efficient tool for separating members of clusters from field stars, but very few proper motions are available in the ρ Ophiuchi region since most of the young sources are deeply embedded in dust and gas. Aims: We aim at performing a kinematic census of young stellar objects (YSOs) in the ρ Ophiuchi F core and partially in the E core of the L1688 dark cloud. Methods: We run a proper motion program at the ESO New Technology Telescope (NTT) with the Son of ISAAC (SOFI) instrument over nine years in the near-infrared. We complemented these observations with various public image databases to enlarge the time base of observations and the field of investigation to 0.5° × 0.5°. We derived positions and proper motions for 2213 objects. From these, 607 proper motions were derived from SOFI observations with a ~1.8 mas/yr accuracy while the remaining objects were measured only from auxiliary data with a mean precision of about 3 mas/yr. Results: We performed a kinematic analysis of the most accurate proper motions derived in this work, which allowed us to separate cluster members from field stars and to derive the mean properties of the cluster. From the kinematic analysis we derived a list of 68 members and 14 candidate members, comprising 26 new objects with a high membership probability. These new members are generally fainter than the known ones. We measured a mean proper motion of (μαcosδ, μδ) = (-8.2, -24.3) ± 0.8 mas/yr for the L1688 dark cloud. A supervised classification was applied to photometric data of members to allocate a spectral energy distribution (SED) classification to the unclassified members…
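
    Kinematic membership selection of the kind described here can be sketched as a cut in proper-motion space around the cluster's mean motion. The mean motion below is the value reported in the abstract; the star sample and dispersion values are simulated, and real analyses use probabilistic membership assignment rather than a hard circular cut:

```python
import math
import random

random.seed(1)

# hypothetical proper motions (mas/yr): a co-moving cluster near the mean
# motion reported for L1688, plus a diffuse field-star population
cluster = [(random.gauss(-8.2, 1.5), random.gauss(-24.3, 1.5)) for _ in range(70)]
field = [(random.gauss(0.0, 15.0), random.gauss(0.0, 15.0)) for _ in range(200)]
stars = cluster + field

def members(stars, center, radius):
    """Select stars whose proper motion lies within `radius` (mas/yr)
    of the assumed cluster motion -- a crude kinematic membership cut."""
    cx, cy = center
    return [s for s in stars if math.hypot(s[0] - cx, s[1] - cy) <= radius]

selected = members(stars, center=(-8.2, -24.3), radius=5.0)
print(len(selected))  # mostly true cluster members, plus a few field interlopers
```

    Because the field population is broadly spread in proper-motion space while the cluster is tightly co-moving, even this crude cut recovers most members with little contamination.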

  13. Risk Analysis

    Science.gov (United States)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a human servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  14. Importance of proper scaling of aerobic power when relating to cardiometabolic risk factors in children

    DEFF Research Database (Denmark)

    McMurray, Robert; Hosick ‎, Peter; Bugge, Anna

    2011-01-01

    BACKGROUND: The relationship between cardiometabolic risk factors (CMRF) and aerobic power (VO(2max)) scaled as mL O(2) per kilogram body mass is controversial because mass includes both fat and fat-free mass, and fat mass is independently associated with the CMRF. AIM: To examine common units used to scale VO(2max) and their relationships to mean blood pressure (MBP), total cholesterol (TC), HDL cholesterol, triglycerides (TG), insulin resistance (HOMA-IR) and cumulative risk score (z-score). SUBJECTS: 1784 youths aged 8-18 years, 938 girls and 886 boys. METHODS: Fasting blood samples were obtained. VO(2max) was estimated in mL/min from cycle ergometry and scaled to body mass (kg), fat free mass (kg(FFM)), body surface area (m(2)), height (cm) and allometrically (mL/kg(0.67)/min). RESULTS: Unadjusted correlations between CMRF and many of the scaled VO(2max) units were significant (p < 0.0001)…
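
    The competing scalings of VO(2max) compared in the study are straightforward to compute. A sketch with a hypothetical subject; the Du Bois body-surface-area formula is an assumption here, since the abstract does not name the formula used:

```python
def scaled_vo2max(vo2_ml_min, mass_kg, fat_fraction, height_cm):
    """Express an absolute VO2max (mL/min) in the scaling units compared
    in the study. The Du Bois body-surface-area formula is one common
    choice and is an assumption, not taken from the abstract."""
    ffm = mass_kg * (1 - fat_fraction)                           # fat-free mass
    bsa = 0.007184 * (height_cm ** 0.725) * (mass_kg ** 0.425)   # Du Bois, m^2
    return {
        "mL/kg/min": vo2_ml_min / mass_kg,
        "mL/kgFFM/min": vo2_ml_min / ffm,
        "mL/m2/min": vo2_ml_min / bsa,
        "mL/cm/min": vo2_ml_min / height_cm,
        "mL/kg^0.67/min": vo2_ml_min / (mass_kg ** 0.67),        # allometric
    }

# hypothetical subject: 2.0 L/min absolute VO2max, 40 kg, 20% body fat, 150 cm
for unit, value in scaled_vo2max(2000, 40, 0.20, 150).items():
    print(f"{unit}: {value:.1f}")
```

    Because fat mass sits in the denominator of the per-kg unit but is itself a risk factor, the different scalings can point in different directions, which is the study's central caution.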

  15. Importance of proper scaling of aerobic power when relating to cardiometabolic risk factors in children

    DEFF Research Database (Denmark)

    McMurray, Robert; Hosick ‎, Peter; Bugge, Anna

    2011-01-01

    BACKGROUND: The relationship between cardiometabolic risk factors (CMRF) and aerobic power (VO(2max)) scaled as mL O(2) per kilogram body mass is controversial because mass includes both fat and fat-free mass, and fat mass is independently associated with the CMRF. AIM: To examine common units used to scale VO(2max) and their relationships to CMRF. RESULTS: Unadjusted correlations between CMRF and many of the scaled VO(2max) units were significant (p < 0.0001), especially for MBP, HOMA-IR, HDL and z-score, with lower correlations for TC and TG. After adjusting for ancestry, sex, height and body fat, associations were greatly weakened. CONCLUSION: Physical characteristics of the child, especially body fat, are more related to CMRF than any scaled units of VO(2max); thus care is needed when relating fitness and health issues.

  16. Properly pricing country risk: a model for pricing long-term fundamental risk applied to central and eastern European countries

    Directory of Open Access Journals (Sweden)

    Debora Revoltella

    2010-09-01

    Full Text Available The private sector has used proxies such as sovereign credit ratings, spreads on sovereign bonds and spreads on sovereign credit default swaps (CDS) to gauge country risk, even though these measures are pricing the risk of default of government bonds, which is different from the risks facing private participants in cross-border financing. Under normal market conditions, the CDS spreads are a very useful source of information on country risk. However, the recent crisis has shown that the CDS spreads might lead to some underpricing or overpricing of fundamentals in the case of excessively low or excessively high risk aversion. In this paper we develop an alternative measure of country risk that extracts the volatile, short-term market sentiment component from the sovereign CDS spread in order to improve its reliability in periods of market distress. We show that adverse market sentiment was a key driver of the sharp increase in sovereign CDS spreads of central and eastern European (CEE) countries during the most severe phase of the crisis. We also show that our measure of country risk sheds some light on the observed stability of cross-border bank flows to CEE banks during the crisis.

  17. Transitional flow analysis in the carotid artery bifurcation by proper orthogonal decomposition and particle image velocimetry.

    Science.gov (United States)

    Kefayati, Sarah; Poepping, Tamie L

    2013-07-01

    Blood flow instabilities in the carotid artery bifurcation have been highly correlated to clot formation and mobilization resulting in ischemic stroke. In this work, PIV-measured flow velocities in normal and stenosed carotid artery bifurcation models were analyzed by means of proper orthogonal decomposition (POD). Through POD analysis, transition to more complex flow was visualized and quantified for increasing stenosis severity. While no evidence of transitional flow was seen in the normal model, the 50%-stenosed model started to show characteristics of transitional flow, which became highly evident in the 70% model, with greatest manifestation during the systolic phase of the cardiac cycle. By means of a model comparison, we demonstrate two quantitative measures of the flow complexity through the power-law decay slope of the energy spectrum and the global entropy. The more complex flow in the 70%-stenosed model showed a flatter slope of energy decay (-0.91 compared to -1.34 for 50% stenosis) and higher entropy values (0.26 compared to 0.17). Finally, the minimum temporal resolution required for POD analysis of carotid artery flow was found to be 100 Hz when determined through a more typical energy-mode convergence test, as compared to 400 Hz based on global entropy values.
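
    Snapshot POD of the kind applied to the PIV velocity fields reduces to a singular value decomposition of the mean-subtracted snapshot matrix. A minimal sketch on synthetic "snapshots" standing in for the measured fields:

```python
import numpy as np

# synthetic "velocity snapshots": a dominant coherent structure plus a weaker
# one and noise, standing in for PIV fields sampled over the cardiac cycle
rng = np.random.default_rng(0)
n_points, n_snaps = 500, 64
x = np.linspace(0, 2 * np.pi, n_points)
t = np.linspace(0, 2 * np.pi, n_snaps)
snapshots = (np.outer(np.sin(x), np.cos(t))                   # dominant mode
             + 0.1 * np.outer(np.sin(3 * x), np.sin(2 * t))   # weaker mode
             + 0.01 * rng.standard_normal((n_points, n_snaps)))

# snapshot POD = SVD of the mean-subtracted snapshot matrix
fluc = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluc, full_matrices=False)   # columns of U are modes
energy = s**2 / np.sum(s**2)   # fraction of fluctuation energy per mode

print(np.round(energy[:3], 3))  # the first modes capture most of the energy
```

    The squared singular values give the energy spectrum whose decay slope (and derived entropy) the authors use to quantify the growing complexity of the stenosed flows.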

  18. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with the implementation of logistics processes; composition of integrated strategies of risk management; and composition of tools for risk analysis in logistics processes.

  19. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and…

  20. Observations on risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.

  1. Around-the-clock ambulatory blood pressure monitoring is required to properly diagnose resistant hypertension and assess associated vascular risk.

    Science.gov (United States)

    Hermida, Ramón C; Ayala, Diana E; Ríos, María T; Fernández, José R; Mojón, Artemio; Smolensky, Michael H

    2014-07-01

    Diagnosis of resistant hypertension (RH) is currently based upon awake-time office blood pressure (BP). An increasing number of studies have documented abnormally elevated sleep-time BP in most RH patients, indicating that diagnosis of true RH cannot be determined solely by comparison of office BP with either patient awake-time BP self-measurements or awake-BP mean from ambulatory monitoring (ABPM), as is customary in the published literature. Moreover, the ABPM-determined sleep-time BP mean is an independent and stronger predictor of cardiovascular and cerebrovascular disease (CVD) risk than either daytime office/ABPM-derived awake or 24-hour means. Results of the recently completed MAPEC (Monitorización Ambulatoria para Predicción de Eventos Cardiovasculares) prospective outcomes study, which included a large cohort of RH patients, established that time of treatment relative to circadian rhythms constituted a critically important yet often neglected variable with respect to BP control. The study found that bedtime versus morning ingestion of the full dose of ≥1 BP-lowering medications resulted in both better therapeutic normalization of sleep-time BP and reduced CVD morbidity and mortality, including in RH patients. Accordingly, ABPM is highly recommended to properly diagnose and manage true RH, with a bedtime hypertension medication regimen as the therapeutic scheme of choice.

  2. On the properness condition for modal analysis of non-symmetric second-order systems

    Science.gov (United States)

    Ouisse, Morvan; Foltête, Emmanuel

    2011-02-01

    Non-symmetric second-order systems can be found in several engineering contexts, including vibroacoustics, rotordynamics, or active control. In this paper, the notion of properness for complex modes is extended to the case of non-self-adjoint problems. The properness condition is related to the ability of a set of complex modes to represent in an exact way the behavior of a physical second-order system, meaning that the modes are the solutions of a quadratic eigenvalue problem whose matrices are those of a physical system. This property can be used to identify the damping matrices which may be difficult to obtain with mathematical modeling techniques. The first part of the paper demonstrates the properness condition for non-symmetric systems in general. In the second part, the authors propose a methodology to enforce that condition in order to perform an optimal reconstruction of the "closest" physical system starting from a given basis of complex modes. The last part is dedicated to numerical and experimental illustrations of the proposed methodology. A simulated academic test case is first used to investigate the numerical aspects of the method. A physical application is then considered in the context of rotordynamics. Finally, an experimental test case is presented using a structure with an active control feedback. An extension of the LSCF identification technique is also introduced to identify both left and right complex mode shapes from measured frequency response functions.
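
    For the classical symmetric case, the properness condition can be checked numerically: with the usual modal normalization, the displacement partitions of the 2n complex modes of M q'' + C q' + K q = 0 satisfy sum_r psi_r psi_r^T = 0. The sketch below covers only this symmetric case; the paper's contribution is the extension to non-symmetric systems, which also involves the left eigenvectors:

```python
import numpy as np

# a small symmetric damped system: M q'' + C q' + K q = 0
M = np.eye(2)
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
C = 0.05 * K + 0.02 * M      # light damping (any symmetric C works here)
n = M.shape[0]

# quadratic eigenproblem via the first companion (state-space) form
A = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
lam, Z = np.linalg.eig(A)
Phi = Z[:n, :]   # displacement partitions of the 2n complex modes

# properness condition: sum_r phi_r phi_r^T / a_r = 0, with the modal
# normalization constant a_r = phi_r^T (2 lam_r M + C) phi_r
S = np.zeros((n, n), dtype=complex)
for r in range(2 * n):
    phi = Phi[:, r]
    a = phi @ (2 * lam[r] * M + C) @ phi
    S += np.outer(phi, phi) / a

print(np.allclose(S, 0, atol=1e-10))  # True for a genuine second-order system
```

    A set of identified modes that violates this sum cannot come from any physical (M, C, K) triple, which is what makes the condition useful for damping identification.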

  3. Factor Analysis with EM Algorithm Never Gives Improper Solutions when Sample Covariance and Initial Parameter Matrices Are Proper

    Science.gov (United States)

    Adachi, Kohei

    2013-01-01

    Rubin and Thayer ("Psychometrika," 47:69-76, 1982) proposed the EM algorithm for exploratory and confirmatory maximum likelihood factor analysis. In this paper, we prove the following fact: the EM algorithm always gives a proper solution with positive unique variances and factor correlations with absolute values that do not exceed one,…

  5. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, one can decide whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  6. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HWAN YANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  9. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from the simple to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  10. Differential diagnosis and proper treatment of acute rhinosinusitis: Guidance based on historical data analysis.

    Science.gov (United States)

    Cevc, Gregor

    2017-06-01

    The time course of rhinovirus positive and negative rhinosinusitis has not been quantified yet, which aggravates proper selection and justification of the optimum treatment for this illness. Such quantitative information would facilitate an early and proper identification of the disease and its differentiation from acute bacterial rhinosinusitis, and could diminish harmful overuse of antibiotics, arguably driven by patients' want for attention and the treating physicians' inability to offer an adequate verbal comfort in its stead. Extraction of the quantitative information needed to identify rhinovirus positive or negative rhinosinusitis and to allow selection of the most appropriate treatment from the published time dependence of individual clinical symptoms of the disease. Scrutiny (and modeling) of temporal evolution of all noteworthy symptoms of rhinosinusitis with a simple mathematical expression that relies on two adjustable parameters per symptom (and potentially a general time offset as an extra adjustable parameter). Adverse effects of rhinosinusitis can be grouped according to the sequence of their exponential appearance and ∼2.6 times slower exponential disappearance, rhinovirus negative rhinosinusitis generally improving ∼25% faster and being ∼40% less severe. The major early local symptoms (throat soreness and scratchiness, headache) vanish with a half-life of ∼1.8 days, whereas further local symptoms take ∼1.6 times longer to disappear. At least 50-60% improvement of two prominent early symptoms, sore throat and sneezing (but not of nasal discharge, cough, and hoarseness) by day 5 of the disease implies a nonbacterial origin of rhinitis and should exclude use of antibiotics. Temporal evolution of all rhinosinusitis symptoms is qualitatively similar, which makes the early symptom decay a good proxy for, and predictor of, the disease perspective. Knowing a symptom intensity at just three to four time points suffices for reconstructing its
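The abstract reports only fitted constants, not the closed-form expression. One hypothetical two-parameter shape consistent with its description (exponential appearance, roughly 2.6 times slower exponential disappearance, and a decay half-life of about 1.8 days for the early symptoms) is a difference of exponentials; the sketch below shows that such a curve reproduces the 50-60% improvement-by-day-5 criterion:

```python
import math

# Hypothetical two-parameter symptom curve consistent with the abstract:
# exponential appearance, ~2.6x slower exponential disappearance.
HALF_LIFE_DAYS = 1.8                      # decay half-life of early symptoms
k_out = math.log(2) / HALF_LIFE_DAYS      # disappearance rate
k_in = 2.6 * k_out                        # appearance is ~2.6x faster

def symptom(t):
    """Relative symptom intensity at t days after onset (arbitrary units)."""
    return math.exp(-k_out * t) - math.exp(-k_in * t)

# Peak time, and the day-5 improvement used as a nonbacterial-origin criterion
t_peak = math.log(k_in / k_out) / (k_in - k_out)
improvement_day5 = 1 - symptom(5) / symptom(t_peak)
print(f"peak at {t_peak:.1f} d, day-5 improvement {improvement_day5:.0%}")
```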

  11. Rank defect analysis and the realization of proper singularity in normal equations of geodetic networks

    Science.gov (United States)

    Kotsakis, C.; Chatzinikos, M.

    2017-06-01

    The singularity of input normal equations (NEQ) is a crucial element for their optimal handling in the context of terrestrial reference frame (TRF) estimation under the minimal-constraint framework. However, this element is often missing in the recovered NEQ from SINEX files after the usual deconstraining based on the stated information for the stored solutions. The same setback also occurs with the original NEQ that are formed by the least-squares processing of space geodetic data due to the datum information which is carried by various modeling choices and/or software-dependent procedures. In the absence of this datum-related singularity, it is not possible to obtain genuine minimally constrained solutions because of the interference between the input NEQ's content and the external datum conditions, a fact that may alter the geometrical information of the original measurements and can cause unwanted distortions in the estimated solution. The main goal of this paper is the formulation of a filtering scheme to enforce the proper (or desired) singularity in the input NEQ with regard to datum parameters that will be handled by the minimal-constraint setting in TRF estimation problems. The importance of this task is extensively discussed and justified with the help of several numerical examples in different GNSS networks.
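The paper's actual filtering scheme is not given in the abstract, but the generic idea can be illustrated on a toy leveling network: a NEQ matrix that has lost its translation rank defect is projected on both sides so that the datum direction becomes singular again. The network, the spurious constraint, and the projector are all illustrative assumptions:

```python
import numpy as np

# Toy leveling network: observed height differences give a NEQ that should be
# singular along the all-ones vector (the translation datum defect).
A = np.array([[-1, 1, 0, 0],
              [0, -1, 1, 0],
              [0, 0, -1, 1],
              [1, 0, 0, -1.0]])
N = A.T @ A                        # genuinely singular: N @ ones = 0

# A previously constrained NEQ has lost that singularity:
N_constrained = N + 0.5 * np.eye(4)

# Generic filtering: project out the datum direction(s) G on both sides
G = np.ones((4, 1))
P = np.eye(4) - G @ np.linalg.pinv(G)
N_filtered = P @ N_constrained @ P

print(np.linalg.eigvalsh(N_filtered))   # smallest eigenvalue back to ~0
```

Note that the projection only restores the rank defect; it does not recover the rest of the original spectrum, which is exactly why the paper's more careful filtering with respect to specific datum parameters matters.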

  12. Proper orthogonal decomposition analysis of vortex shedding behind a rotating circular cylinder

    Directory of Open Access Journals (Sweden)

    Dol Sharul Sham

    2016-01-01

Full Text Available Turbulence studies were made in the wake of a rotating circular cylinder in a uniform free stream with the objective of describing the patterns of the vortex shedding up to suppression of the periodic vortex street at high velocity ratios, λ. The results obtained in the present study establish that shedding of Kármán vortices in a rotating circular cylinder-generated wake is modified by rotation of the cylinder. Alternate vortex shedding is highly visible when λ < 2.0 although the strength of the separated shear layers differ due to the rotation of the cylinder. The spectral density in the wakes indicates significant changes at λ = 2.0. The results indicate that the rotation of the cylinder causes significant disruption in the structure of the flow. Alternate vortex shedding is weak, distorted and close to being suppressed at λ = 2.0. It is clear that flow asymmetries will weaken vortex shedding, and when the asymmetries are significant enough, total suppression of a periodic street occurs. Particular attention was paid to the decomposition of the flow using Proper Orthogonal Decomposition (POD). By analyzing this decomposition with the help of Particle Image Velocimetry (PIV) data, it was found that large scales contribute to the coherent motion. Vorticity structures in the modes become increasingly irregular with downstream distance, suggesting turbulent interactions are occurring at the more downstream locations, especially when the cylinder rotates.

  13. BVRIJHK photometry and proper motion analysis of NGC 6253 and the surrounding field

    CERN Document Server

    Montalto, M; Desidera, S; Platais, I; Carraro, G; Momany, Y; De Marchi, F; Recio-Blanco, A

    2009-01-01

    Context. We present a photometric and astrometric catalog of 187963 stars located in the field around the old super-metal-rich Galactic open cluster NGC 6253. The total field-of-view covered by the catalog is 34' x 33'. In this field, we provide CCD BVRI photometry. For a smaller region close to the cluster's center, we also provide near-infrared JHK photometry. Aims. We analyze the properties of NGC 6253 by using our new photometric data and astrometric membership. Methods. In June 2004, we targeted the cluster during a 10 day multi-site campaign, which involved the MPG/ESO 2.2m telescope with its wide-field imager and the Anglo-Australian 3.9m telescope, equipped with the IRIS2 near-infrared imager. Archival CCD images of NGC 6253 were used to derive relative proper motions and to calculate the cluster membership probabilities. Results. We have refined the cluster's fundamental parameters, deriving (V_0-M_v)=11.15, E(B - V)=0.15, E(V - I)=0.25, E(V - J)=0.50, and E(V - H)=0.55. The color excess ratios obtai...

  15. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    Validation in chemometrics is presented using the exemplar context of multivariate calibration/prediction. A phenomenological analysis of common validation practices in data analysis and chemometrics leads to formulation of a set of generic Principles of Proper Validation (PPV), which is based...

  16. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  17. Maggie Creek Water Quality Data for Ecological Proper Functioning Condition Analysis

    Data.gov (United States)

    U.S. Environmental Protection Agency — These data are "standard" water quality parameters collected for surface water condition analysis (for example pH, conductivity, DO, TSS). This dataset is associated...

  18. Mixed stock analysis (MSA - a tool for proper management in fisheries

    Directory of Open Access Journals (Sweden)

    Viorica Coşier

    2013-03-01

Full Text Available For the natural environment, it is crucial to develop a fishery management plan outlining conservation and restoration measures. From this point of view, if the fishery is managed as a single unit (stock), there is a huge potential for overfishing of the less abundant population. In order to achieve population levels that support harvests, or to protect a particular vulnerable population from anthropogenic pressure, the use of genetic marker information has been proposed in fishery management. Mixed stock analysis uses genetic marker information in several source populations and in a single mixture population to estimate the proportional contribution of each source to the mixed population.
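As a hedged illustration of how genetic markers drive such estimates, the sketch below runs a textbook EM for mixing proportions on simulated data; the two stocks, their allele frequencies, and the marker count are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Known allele-frequency profiles for two hypothetical source stocks
# (rows: stocks, columns: independent biallelic markers, values: freq of allele A)
freqs = np.array([[0.9, 0.8, 0.1],
                  [0.2, 0.3, 0.7]])
true_mix = np.array([0.7, 0.3])          # true contribution of each stock

# Simulate a mixture sample: pick a stock, then draw one allele per marker
n = 2000
stock = rng.choice(2, size=n, p=true_mix)
alleles = rng.random((n, 3)) < freqs[stock]   # True = allele A observed

# Per-individual likelihood under each stock
lik = np.where(alleles[:, None, :], freqs[None, :, :],
               1 - freqs[None, :, :]).prod(axis=2)

# EM for the mixing proportions
mix = np.array([0.5, 0.5])
for _ in range(100):
    post = mix * lik
    post /= post.sum(axis=1, keepdims=True)  # E-step: stock membership posteriors
    mix = post.mean(axis=0)                  # M-step: new proportions

print(mix)   # close to the true 0.7 / 0.3 split
```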

  19. Targeted assets risk analysis.

    Science.gov (United States)

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  20. Broken Robustness Analysis: How to make proper climate change conclusions in contradictory multimodal measurement contexts.

    Science.gov (United States)

    Keyser, V.

    2015-12-01

Philosophers of science discuss how multiple modes of measurement can generate evidence for the existence and character of a phenomenon (Horwich 1982; Hacking 1983; Franklin and Howson 1984; Collins 1985; Sober 1989; Trout 1993; Culp 1995; Keeley 2002; Staley 2004; Weber 2005; Keyser 2012). But how can this work systematically in climate change measurement? Additionally, what conclusions can scientists and policy-makers draw when different modes of measurement fail to be robust by producing contradictory results? First, I present a new technical account of robust measurement (RAMP) that focuses on the physical independence of measurement processes. I detail how physically independent measurement processes "check each other's results." (This account is in contrast to philosophical accounts of robustness analysis that focus on independent model assumptions or independent measurement products or results.) Second, I present a puzzle about contradictory and divergent climate change measures, which has consistently re-emerged in climate measurement. This discussion will focus on land, drilling, troposphere, and computer simulation measures. Third, to systematically solve this climate measurement puzzle, I use RAMP in the context of drought measurement in order to generate a classification of measurement processes. Here, I discuss how multimodal precipitation measures—e.g., measures of precipitation deficit like the Standard Precipitation Index vs. air humidity measures like the Standardized Relative Humidity Index—can help with the classification scheme of climate change measurement processes. Finally, I discuss how this classification of measures can help scientists and policy-makers draw effective conclusions in contradictory multimodal climate change measurement contexts.

  1. Simplified seismic risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pellissetti, Manuel; Klapp, Ulrich [AREVA NP GmbH, Erlangen (Germany)

    2011-07-01

Within the context of probabilistic safety analysis (PSA) for nuclear power plants (NPP's), seismic risk assessment has the purpose of demonstrating that the contribution of seismic events to overall risk is not excessive. The most suitable vehicle for seismic risk assessment is a full scope seismic PSA (SPSA), in which the frequency of core damage due to seismic events is estimated. An alternative method is represented by seismic margin assessment (SMA), which aims at showing sufficient margin between the site-specific safe shutdown earthquake (SSE) and the actual capacity of the plant. Both methods are based on system analysis (fault-trees and event-trees) and hence require fragility estimates for safety relevant systems, structures and components (SSC's). If the seismic conditions at a specific site of a plant are not very demanding, then it is reasonable to expect that the risk due to seismic events is low. In such cases, the cost-benefit ratio for performing a full scale, site-specific SPSA or SMA will be excessive, considering the ultimate objective of seismic risk analysis. Rather, it will be more rational to rely on a less comprehensive analysis, used as a basis for demonstrating that the risk due to seismic events is not excessive. The present paper addresses such a simplified approach to seismic risk assessment, which is used in AREVA to: estimate seismic risk in early design stages; identify needs to extend the design basis; and define a reasonable level of seismic risk analysis. Starting from a conservative estimate of the overall plant capacity, in terms of the HCLPF (High Confidence of Low Probability of Failure), and utilizing a generic value for the variability, the seismic risk is estimated by convolution of the hazard and the fragility curve. Critical importance is attached to the selection of the plant capacity in terms of the HCLPF, without performing extensive fragility calculations of seismically relevant SSC's. A suitable basis
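The convolution step described above can be sketched numerically. All numbers below (HCLPF, variability, hazard curve) are illustrative placeholders, not AREVA values:

```python
import math

# Illustrative seismic risk by convolving a hazard curve with a lognormal
# fragility anchored at an assumed HCLPF; every number here is made up.
HCLPF = 0.30                      # plant capacity, g
BETA = 0.40                       # generic composite log-std of the fragility
A_MEDIAN = HCLPF * math.exp(2.33 * BETA)   # median capacity implied by HCLPF

def fragility(a):
    """Conditional failure probability at peak ground acceleration a (g)."""
    z = math.log(a / A_MEDIAN) / BETA
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # lognormal CDF

def hazard(a):
    """Annual frequency of exceeding acceleration a (power-law hazard curve)."""
    return 1e-4 * (a / 0.3) ** -3

# Numerically convolve: risk = integral of fragility(a) * (-dH/da) da
da = 0.001
risk = sum(fragility(a) * (hazard(a) - hazard(a + da))
           for a in [da * i for i in range(1, 5000)])
print(f"annual core damage frequency = {risk:.1e}")
```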

  2. Metrics, Dose, and Dose Concept: The Need for a Proper Dose Concept in the Risk Assessment of Nanoparticles

    Directory of Open Access Journals (Sweden)

    Myrtill Simkó

    2014-04-01

Full Text Available In order to calculate the dose for nanoparticles (NP), (i) relevant information about the dose metrics and (ii) a proper dose concept are crucial. Since the appropriate metrics for NP toxicity are yet to be elaborated, a general dose calculation model for nanomaterials is not available. Here we propose how to develop a dose assessment model for NP in analogy to the radiation protection dose calculation, introducing the so-called "deposited and the equivalent dose". As a dose metric we propose the total deposited NP surface area (SA), which has been shown frequently to determine toxicological responses e.g. of lung tissue. The deposited NP dose is proportional to the total surface area of deposited NP per tissue mass, and takes into account primary and agglomerated NP. By using several weighting factors the equivalent dose additionally takes into account various physico-chemical properties of the NP which are influencing the biological responses. These weighting factors consider the specific surface area, the surface textures, the zeta-potential as a measure for surface charge, the particle morphology such as the shape and the length-to-diameter ratio (aspect ratio), the band gap energy levels of metal and metal oxide NP, and the particle dissolution rate. Furthermore, we discuss how these weighting factors influence the equivalent dose of the deposited NP.
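A hypothetical worked example of the proposed two-stage dose: first the deposited dose as total NP surface area per tissue mass, then the equivalent dose obtained by applying dimensionless weighting factors. All particle numbers and weight values are invented for illustration; the paper does not tabulate them in the abstract:

```python
import math

# Deposited dose: total surface area of deposited NP per tissue mass
n_particles = 1e9                    # deposited primary particles (assumed)
diameter_m = 50e-9                   # primary particle diameter (50 nm)
sa_per_particle = math.pi * diameter_m ** 2      # sphere surface area, m^2
tissue_mass_kg = 0.5                 # mass of the exposed lung tissue
deposited_dose = n_particles * sa_per_particle / tissue_mass_kg   # m^2/kg

# Equivalent dose: deposited dose scaled by dimensionless weighting factors
# (factor names follow the abstract; the values are purely illustrative)
weights = {
    "specific_surface_area": 1.2,
    "surface_charge": 1.1,           # from zeta-potential
    "aspect_ratio": 1.5,             # elongated particles weighted higher
    "band_gap": 1.0,
    "dissolution_rate": 0.8,         # fast-dissolving particles weighted lower
}
equivalent_dose = deposited_dose
for w in weights.values():
    equivalent_dose *= w

print(deposited_dose, equivalent_dose)
```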

  3. Intermittent Oxygen Inhalation with Proper Frequency Improves Overall Health Conditions and Alleviates Symptoms in a Population at High Risk of Chronic Mountain Sickness with Severe Symptoms

    Institute of Scientific and Technical Information of China (English)

    Bin Feng; Wei-Hao Xu; Yu-Qi Gao; Fu-Yu Liu; Peng Li; Shan-Jun Zheng; Lu-Yue Gai

    2016-01-01

Background: Oxygen inhalation therapy is essential for the treatment of patients with chronic mountain sickness (CMS), but the efficacy of oxygen inhalation for populations at high risk of CMS remains unknown. This research investigated whether oxygen inhalation therapy benefits populations at high risk of CMS. Methods: A total of 296 local residents living at an altitude of 3658 m were included, of which 25 were diagnosed cases of CMS; 8 cases dropped out of the study, and 263 cases were included in the analysis. The subjects were divided into high-risk (180 ≤ hemoglobin (Hb) < 210 g/L, n = 161) and low-risk (Hb < 180 g/L, n = 102) groups, and the cases in each group were divided into severe symptom (CMS score ≥ 6) and mild symptom (CMS score 0-5) subgroups. The severe symptomatic population of either high- or low-risk CMS was randomly assigned to a no oxygen intake group (A group) or an oxygen intake 7 times/week group (D group); the mild symptomatic population of either high- or low-risk CMS was randomly assigned to a no oxygen intake group (A group), an oxygen intake 2 times/week group (B group), or a 4 times/week group (C group). The courses for oxygen intake were all 30 days. The CMS symptoms, sleep quality, physiological biomarkers, biochemical markers, etc., were recorded on the day before oxygen intake, on the 15th and 30th days of oxygen intake, and on the 15th day after terminating oxygen intake therapy. Results: A total of 263 residents were finally included in the analysis. Among these high-altitude residents, CMS symptom scores decreased for oxygen inhalation methods B, C, and D at 15 and 30 days after oxygen intake and 15 days after termination, including the dyspnea, palpitation, and headache indices, compared to those before oxygen intake (B group: Z = 5.604, 5.092, 5.741; C group: Z = 4.155, 4.068, 4.809; D group: Z = 6.021, 6.196, 5.331, at the 3 time points respectively; all P < 0.05/3 vs. before intake). However, dyspnea/palpitation (A group: Z = 5.003, 5.428, 5.493, both P < 0.05/3 vs. before intake) and headache (A

  4. Not proper ROC curves as new tool for the analysis of differentially expressed genes in microarray experiments

    Directory of Open Access Journals (Sweden)

    Pistoia Vito

    2008-10-01

Full Text Available Abstract Background Most microarray experiments are carried out with the purpose of identifying genes whose expression varies in relation with specific conditions or in response to environmental stimuli. In such studies, genes showing similar mean expression values between two or more groups are considered as not differentially expressed, even if hidden subclasses with different expression values may exist. In this paper we propose a new method for identifying differentially expressed genes, based on the area between the ROC curve and the rising diagonal (ABCR). ABCR represents a more general approach than the standard area under the ROC curve (AUC), because it can identify both proper (i.e., concave) and not proper ROC curves (NPRC). In particular, NPRC may correspond to those genes that tend to escape standard selection methods. Results We assessed the performance of our method using data from a publicly available database of 4026 genes, including 14 normal B cell samples (NBC) and 20 heterogeneous lymphomas (namely: 9 follicular lymphomas and 11 chronic lymphocytic leukemias). Moreover, NBC also included two sub-classes, i.e., 6 heavily stimulated and 8 slightly or not stimulated samples. We identified 1607 differentially expressed genes with an estimated False Discovery Rate of 15%. Among them, 16 corresponded to NPRC and all escaped standard selection procedures based on AUC and t statistics. Moreover, a simple inspection of the shape of such plots allowed us to identify the two subclasses in either class in 13 cases (81%). Conclusion NPRC represent a new useful tool for the analysis of microarray data.
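The ABCR idea can be reproduced in a few lines: for a gene with a hidden subclass structure, the empirical ROC curve crosses the diagonal, so AUC stays near 0.5 while the area between the curve and the diagonal remains clearly positive. The simulated expression values below are our own illustration, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(2)

def roc_points(pos, neg):
    """Empirical ROC curve (FPR, TPR) for positive/negative score samples."""
    ts = np.r_[np.inf, np.unique(np.r_[pos, neg])[::-1]]
    tpr = np.array([(pos >= t).mean() for t in ts])
    fpr = np.array([(neg >= t).mean() for t in ts])
    return fpr, tpr

def trap(y, x):
    """Trapezoidal integral of y over x."""
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2))

# A "hidden subclass" gene: half the patients over-express, half under-express,
# so the group means coincide and mean-based selection sees nothing.
normal = rng.normal(0.0, 1.0, 200)
patients = np.r_[rng.normal(-3.0, 1.0, 100), rng.normal(3.0, 1.0, 100)]

fpr, tpr = roc_points(patients, normal)
auc = trap(tpr, fpr)                       # standard area under the curve
abcr = trap(np.abs(tpr - fpr), fpr)        # area between curve and diagonal

print(f"AUC = {auc:.2f}, ABCR = {abcr:.2f}")   # AUC ~0.5, ABCR clearly > 0
```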

  5. LES and Proper Orthogonal Decomposition analysis of vertical entrainment of kinetic energy in large wind farms (Invited)

    Science.gov (United States)

    Meneveau, C. V.; VerHulst, C.

    2013-12-01

Vertical entrainment of kinetic energy has been shown to be an important limiting factor in the performance of very large wind turbine arrays. Given high Reynolds numbers and domain sizes on the order of kilometers, we rely on wall-modeled Large Eddy Simulation (LES) to predict flow within large wind farms. We use Proper Orthogonal Decomposition (POD) to identify energetically important large-scale structures in the flow. The primary large-scale structures are found to be streamwise counter-rotating vortices located above the height of the wind turbines. The contribution of each flow structure to the kinetic energy entrainment is quantified. Surprisingly, fewer flow structures (POD modes) contribute to the vertical kinetic energy flux than to the kinetic energy in the flow, for which the POD analysis is optimal. While the general characteristics of the flow structures are robust, the net kinetic energy entrainment to the turbines depends on the orientation of the wind turbines in the array. The various modes' contributions to variability and intermittency are also quantified. The POD analysis is performed for aligned and staggered wind turbine arrays as well as for atmospheric flow without wind turbines. This research is supported by an NSF Graduate Fellowship and by the WINDINSPIRE project, funded through NSF-OISE 1243482.

  6. Finite volume analysis of temperature effects induced by active MRI implants with cylindrical symmetry: 1. Properly working devices

    Directory of Open Access Journals (Sweden)

    Schnorr Jörg

    2005-04-01

Full Text Available Abstract Background Active Magnetic Resonance Imaging implants are constructed as resonators tuned to the Larmor frequency of a magnetic resonance system with a specific field strength. The resonating circuit may be embedded into or added to the normal metallic implant structure. The resonators form inductively coupled wireless transmit and receive coils and can amplify the signal, normally decreased by eddy currents, inside metallic structures without affecting the rest of the spin ensemble. During magnetic resonance imaging the resonators generate heat, which is additional to the usual one described by the specific absorption rate. This induces temperature increases of the tissue around the circuit paths and inside the lumen of an active implant and may negatively influence patient safety. Methods This investigation provides an overview of the supplementary power absorbed by active implants with a cylindrical geometry, corresponding to vessel implants such as stents, stent grafts or vena cava filters. The knowledge of the overall absorbed power is used in a finite volume analysis to estimate temperature maps around different implant structures inside homogeneous tissue under worst-case assumptions. The "worst-case scenario" assumes thermal heat conduction without blood perfusion inside the tissue around the implant and mostly without any cooling due to blood flow inside vessels. Results The additional power loss of a resonator is proportional to the volume and the quality factor, as well as the field strength of the MRI system and the specific absorption rate of the applied sequence. For properly working devices the finite volume analysis showed only tolerable heating during MRI investigations in most cases. Only resonators transforming a few hundred mW into heat may reach temperature increases over 5 K. This requires resonators with volumes of several tens of cubic centimeters, short inductor circuit paths of only a few tens of centimeters, and a quality

  7. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  8. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard a...

  9. Proper orientation of cacti

    OpenAIRE

    Araujo, Julio; Havet, Frédéric; Linhares Sales, Claudia; Silva, Ana

    2016-01-01

International audience; An orientation of a graph G is proper if two adjacent vertices have different in-degrees. The proper-orientation number χ⃗(G) of a graph G is the minimum maximum in-degree of a proper orientation of G. In [1], the authors ask whether the proper orientation number of a planar graph is bounded. We prove that every cactus admits a proper orientation with maximum in-degree at most 7. We also prove that the bound 7 is tight by showing a cactus having no proper orientati...
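Checking properness and brute-forcing the proper-orientation number of a small graph is straightforward (the cactus below is our toy example, not one from the paper):

```python
from itertools import product

# Brute-force the proper-orientation number of a small graph: orient every
# edge both ways, keep orientations where adjacent vertices get different
# in-degrees, and minimize the maximum in-degree.
def proper_orientation_number(n, edges):
    best = None
    for choice in product([0, 1], repeat=len(edges)):
        indeg = [0] * n
        for (u, v), c in zip(edges, choice):
            indeg[v if c == 0 else u] += 1
        if all(indeg[u] != indeg[v] for u, v in edges):
            m = max(indeg)
            best = m if best is None else min(best, m)
    return best   # None if no proper orientation exists

# A tiny cactus: a triangle with a pendant edge
triangle_plus = [(0, 1), (1, 2), (2, 0), (2, 3)]
print(proper_orientation_number(4, triangle_plus))   # -> 2
```

With 4 edges on 4 vertices, a maximum in-degree of 1 would force every in-degree to equal 1, which is never proper, so the answer 2 is optimal here.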

  10. International Conference on Risk Analysis

    CERN Document Server

    Oliveira, Teresa; Rigas, Alexandros; Gulati, Sneh

    2015-01-01

This book covers the latest results in the field of risk analysis. Presented topics include probabilistic models in cancer research, models and methods in longevity, epidemiology of cancer risk, engineering reliability and economic risk problems. The contributions of this volume originate from the 5th International Conference on Risk Analysis (ICRA 5). The conference brought together researchers and practitioners working in the field of risk analysis in order to present new theoretical and computational methods with applications in biology, environmental sciences, public health, economics and finance.

  11. Proper orthogonal decomposition of velocity gradient fields in a simulated stratified turbulent wake: analysis of vorticity and internal waves

    Science.gov (United States)

    Gurka, R.; Diamessis, P.; Liberzon, A.

    2009-04-01

The characterization of three-dimensional space and time-dependent coherent structures and internal waves in a stratified environment is one of the most challenging tasks in geophysical fluid dynamics. Proper orthogonal decomposition (POD) is applied to 2-D slices of vorticity and horizontal divergence obtained from 3-D DNS of a stratified turbulent wake of a towed sphere at Re = 5×10^3 and Fr = 4. The numerical method employed solves the incompressible Navier-Stokes equations under the Boussinesq approximation. The temporal discretization consists of three fractional steps: an explicit advancement of the nonlinear terms, an implicit solution of the Poisson equation for the pseudo-pressure (which enforces incompressibility), and an implicit solution of the Helmholtz equation for the viscous terms (where boundary conditions are imposed). The computational domain is assumed to be periodic in the horizontal direction and non-periodic in the vertical direction. The 2-D slices are sampled along the stream-depth (Oxz), span-depth (Oyz) and stream-span planes (Oxy) for 231 times during the interval Nt ∈ [12,35] (N is the stratification frequency). During this interval, internal wave radiation from the wake is most pronounced and the vorticity field in the wake undergoes distinct structural transitions. POD was chosen amongst the available statistical tools due to its advantage in characterization of simulated and experimentally measured velocity gradient fields. The computational procedure, applied to any random vector field, finds the most coherent feature from the given ensemble of field realizations. The decomposed empirical eigenfunctions could be referred to as "coherent structures", since they are highly correlated in an average sense with the flow field. In our analysis, we follow the computationally efficient method of 'snapshots' to find the POD eigenfunctions of the ensemble of vorticity field realizations. The results consist of the separate POD modes, along with
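The method of snapshots mentioned above can be sketched on synthetic data (the fields below are invented stand-ins for the DNS vorticity slices): instead of eigen-decomposing the huge spatial correlation matrix, one decomposes the small temporal one and maps its eigenvectors back to spatial modes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Build an ensemble of synthetic 2-D "vorticity" snapshots: two spatial
# structures with random time coefficients, plus noise.
nx, nt = 64, 231
x = np.linspace(0, 2 * np.pi, nx)
mode1 = np.outer(np.sin(x), np.cos(x)).ravel()       # coherent structure 1
mode2 = np.outer(np.sin(2 * x), np.sin(x)).ravel()   # coherent structure 2
snaps = (np.outer(mode1, rng.normal(0, 3.0, nt))
         + np.outer(mode2, rng.normal(0, 1.0, nt))
         + rng.normal(0, 0.1, (nx * nx, nt)))

# Method of snapshots: eigen-decompose the small (nt x nt) temporal
# correlation matrix instead of the much larger spatial one.
mean = snaps.mean(axis=1, keepdims=True)
fluct = snaps - mean
C = fluct.T @ fluct / nt                    # temporal correlation matrix
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]             # sort modes by captured energy
evals, evecs = evals[order], evecs[:, order]
phi = fluct @ evecs / np.sqrt(np.abs(evals * nt) + 1e-30)  # spatial POD modes

energy_frac = evals / evals.sum()
print(f"first two modes capture {energy_frac[:2].sum():.0%} of the energy")
```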

  12. Information and communication on risks related to medications and proper use of medications for healthcare professionals and the general public: precautionary principle, risk management, communication during and in the absence of crisis situations.

    Science.gov (United States)

    Molimard, Mathieu; Bernaud, Corine; Lechat, Philippe; Bejan-Angoulvant, Theodora; Benattia, Cherif; Benkritly, Amel; Braunstein, David; Cabut, Sandrine; David, Nadine; Fourrier-Réglat, Annie; Gallet, Benoit; Gersberg, Marta; Goni, Sylvia; Jolliet, Pascale; Lamarque-Garnier, Véronique; Le Jeunne, Claire; Leurs, Irina; Liard, François; Malbezin, Muriel; Micallef, Joelle; Nguon, Marina

    2014-01-01

    Recent drug crises have highlighted the complexity, benefits and risks of medication communication. The difficulty of this communication stems from the diversity of the sources of information and the target audiences, the credibility of spokespersons, and the difficulty of communicating about scientific uncertainties and the precautionary principle, which is influenced by variable perceptions of, and tolerance for, risk. Globally, there is a lack of training in risk management, together with a tendency of modern society to refuse even the slightest risk. Communication on medications is subject to regulatory or legal requirements, often uses tools and messages that are not adapted to the target audience, and is often based on a poor knowledge of communication techniques. In order to improve this situation, the available information must be coordinated by reinforcing the single medication information website and by coordinating communication between authorities by means of a single spokesperson. A particular effort must be made in the field of training in the proper use and risks of medications, for the general population and patients as well as for healthcare professionals, by setting up a unified academic online teaching platform for continuing medical education on medications and their proper use.

  13. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it is more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (students and infrastructure) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in an HEI (university) fall within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  14. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  15. Campylobacter Risk Analysis

    DEFF Research Database (Denmark)

    Nauta, Maarten

    In several countries quantitative microbiological risk assessments (QMRAs) have been performed for Campylobacter in chicken meat. The models constructed for this purpose provide a good example of the development of QMRA in general and illustrate the diversity of available methods. Despite...... the differences between the models, the most prominent conclusions of the QMRAs are similar. These conclusions for example relate to the large risk of highly contaminated meat products and the insignificance of contamination from Campylobacter positive flocks to negative flocks during slaughter and processing...

  16. Workshop One : Risk Analysis

    NARCIS (Netherlands)

    Carlson, T.J.; Jong, C.A.F. de; Dekeling, R.P.A.

    2012-01-01

    The workshop looked at the assessment of risk to aquatic animals exposed to anthropogenic sound. The discussion focused on marine mammals given the worldwide attention being paid to them at the present time, particularly in relationship to oil and gas exploration, ocean power, and increases in ship

  17. CONSIDERATIONS ON ENTITY'S RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    MIRELA MONEA

    2014-12-01

    Full Text Available In the present paper, given the complexity of this topic, the purpose is to discuss the main aspects involved in risk analysis, starting with a few conceptual approaches to risk and outlining contributions on methods to assess different risk categories, especially methods to assess bankruptcy risk (entity insolvency) from the economic literature. The methods used to estimate bankruptcy risk are based on score functions, which help establish whether an entity is confronted with financial difficulties. Score functions are a diagnostic method built on discriminant analysis, allowing one to assess and predict the bankruptcy risk of an entity using a set of relevant financial ratios.
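    A score function of the kind described can be sketched as a simple linear discriminant. The weights, ratios and risk thresholds below are hypothetical placeholders, not the coefficients of any published model:

```python
def z_score(ratios, weights):
    """Linear discriminant score: weighted sum of financial ratios."""
    return sum(w * r for w, r in zip(weights, ratios))

def risk_band(z):
    """Map a score to an illustrative risk band (thresholds assumed)."""
    if z < 4:
        return "high bankruptcy risk"
    if z < 9:
        return "uncertain"
    return "low bankruptcy risk"

weights = [0.24, 0.22, 0.16, -0.87, -0.10]  # hypothetical coefficients
ratios = [30.0, 15.0, 8.0, 1.2, 4.0]        # hypothetical ratios (in %)
z = z_score(ratios, weights)                # about 10.34
band = risk_band(z)
```

In practice the coefficients and thresholds are estimated by discriminant analysis on samples of healthy and failed entities.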

  18. X-Ray Analysis of the Proper Motion and Pulsar Wind Nebula for PSR J1741-2054

    Science.gov (United States)

    Auchettl, Katie; Slane, Patrick; Romani, Roger W.; Posselt, Bettina; Pavlov, George G.; Kargaltsev, Oleg; Ng, C-Y.; Temim, Tea; Weisskopf, Martin C.; Bykov, Andrei; Swartz, Douglas

    2015-01-01

    We obtained six observations of PSR J1741-2054 using the Chandra ACIS-S detector totaling approx. 300 ks. By registering this new epoch of observations to an archival observation taken 3.2 yr earlier using X-ray point sources in the field of view, we have measured the pulsar proper motion at mu = 109 +/- 10 mas yr^-1 in a direction consistent with the symmetry axis of the observed H(alpha) nebula. We investigated the inferred past trajectory of the pulsar but find no compelling association with OB associations in which the progenitor may have originated. We confirm previous measurements of the pulsar spectrum as an absorbed power law with photon index gamma = 2.68 +/- 0.04, plus a blackbody with an emission radius of (4.5(+3.2/-2.5))d(0.38) km, for a DM-estimated distance of 0.38d(0.38) kpc and a temperature of 61.7 +/- 3.0 eV. Emission from the compact nebula is well described by an absorbed power law model with a photon index of gamma = 1.67 +/- 0.06, while the diffuse emission seen as a trail extending northeast of the pulsar shows no evidence of synchrotron cooling. We also applied image deconvolution techniques to search for small-scale structures in the immediate vicinity of the pulsar, but found no conclusive evidence for such structures.

  19. Focused-based multifractal analysis of the wake in a wind turbine array utilizing proper orthogonal decomposition

    Science.gov (United States)

    Kadum, Hawwa; Ali, Naseem; Cal, Raúl

    2016-11-01

    Hot-wire anemometry measurements have been performed on a 3 x 3 wind turbine array to study the multifractality of the turbulent kinetic energy dissipations. A multifractal spectrum and Hurst exponents are determined at nine locations downstream of the hub height, and bottom and top tips. Higher multifractality is found at 0.5D and 1D downstream of the bottom tip and hub height. The second order of the Hurst exponent and combination factor show an ability to predict the flow state in terms of its development. Snapshot proper orthogonal decomposition is used to identify the coherent and incoherent structures and to reconstruct the stochastic velocity using a specific number of the POD eigenfunctions. The accumulation of the turbulent kinetic energy in top tip location exhibits fast convergence compared to the bottom tip and hub height locations. The dissipation of the large and small scales are determined using the reconstructed stochastic velocities. The higher multifractality is shown in the dissipation of the large scale compared to small-scale dissipation showing consistency with the behavior of the original signals.

  1. X-ray analysis of the proper motion and pulsar wind nebula for PSR J1741-2054

    CERN Document Server

    Auchettl, Katie; Romani, Roger W; Posselt, Bettina; Pavlov, George G; Kargaltsev, Oleg; Ng, C-Y; Temim, Tea; Weisskopf, Martin C; Bykov, Andrei; Swartz, Douglas A

    2015-01-01

    We obtained six observations of PSR J1741-2054 using the $Chandra$ ACIS-S detector totaling $\sim$300 ks. By registering this new epoch of observations to an archival observation taken 3.2 years earlier using X-ray point sources in the field of view, we have measured the pulsar proper motion at $\mu = 109 \pm 10$ mas/yr. The spectrum of the pulsar can be described by an absorbed power law with photon index $\Gamma = 2.68 \pm 0.04$, plus a blackbody with an emission radius of $(4.5^{+3.2}_{-2.5})d_{0.38}$ km, for a DM-estimated distance of $0.38d_{0.38}$ kpc and a temperature of $61.7 \pm 3.0$ eV. Emission from the compact nebula is well described by an absorbed power law model with a photon index of $\Gamma = 1.67 \pm 0.06$, while the diffuse emission seen as a trail extending northeast of the pulsar shows no evidence of synchrotron cooling. We also looked for extended features that might represent a jet or torus-like structure using image deconvolution and PSF-subtraction but we find no conclusive evidence of suc...

  2. PROMOTIONS: PROper MOTION Software

    Science.gov (United States)

    Caleb Wherry, John; Sahai, R.

    2009-05-01

    We report on the development of a software tool (PROMOTIONS) to streamline the process of measuring proper motions of material in expanding nebulae. Our tool makes use of IDL's widget programming capabilities to design a unique GUI that is used to compare images of the objects from two epochs. The software allows us to first orient and register the images to a common frame of reference and pixel scale, using field stars in each of the images. We then cross-correlate specific morphological features in order to determine their proper motions, which consist of the proper motion of the nebula as a whole (PM-neb), and expansion motions of the features relative to the center. If the central star is not visible (quite common in bipolar nebulae with dense dusty waists), point-symmetric expansion is assumed and we use the average motion of high-quality symmetric pairs of features on opposite sides of the nebular center to compute PM-neb. This is then subtracted out to determine the individual movements of these and additional features relative to the nebular center. PROMOTIONS should find wide applicability in measuring proper motions in astrophysical objects such as the expanding outflows/jets commonly seen around young and dying stars. We present first results from using PROMOTIONS to successfully measure proper motions in several pre-planetary nebulae (transition objects between the red giant and planetary nebula phases), using images taken 7-10 years apart with the WFPC2 and ACS instruments on board HST. The authors are grateful to NASA's Undergraduate Scholars Research Program (USRP) for supporting this research.
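    PROMOTIONS itself is an IDL/GUI tool; as a language-neutral sketch of the cross-correlation step, the FFT phase-correlation estimator below recovers the integer-pixel shift between two registered frames (NumPy assumed; a random field stands in for a star field):

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    """Estimate the integer shift d such that moved ~= np.roll(ref, d)."""
    cross = np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))
    cross /= np.abs(cross) + 1e-12           # keep phase information only
    corr = np.fft.ifft2(cross).real          # peaks at the true shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts (wrap-around convention)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(1)
epoch1 = rng.standard_normal((64, 64))           # stand-in for a star field
epoch2 = np.roll(epoch1, (3, -2), axis=(0, 1))   # "second epoch", shifted
shift = phase_correlation_shift(epoch1, epoch2)  # recovers (3, -2)
```

Real pipelines refine such integer estimates to sub-pixel precision and work on individual morphological features rather than whole frames.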

  3. ANALYSIS OF THE TAX RISK OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Safonova M. F.

    2014-09-01

    Full Text Available In the article, the author considers the essence of tax risks, summarized methodological approaches to the determination of tax risk classification and systematization of tax risks; stages in the analysis of tax risks

  4. Conceptual risk assessment framework for global change risk analysis SRP

    CSIR Research Space (South Africa)

    Elphinstone, CD

    2007-12-01

    Full Text Available This report is submitted as a deliverable of the SRP project Global Change Risk Analysis which aims at applying risk analysis as a unifying notion for quantifying and communicating threats to ecosystem services originating from global change...

  5. Henig Proper Efficient Points and Generalized Henig Proper Efficient Points

    Institute of Scientific and Technical Information of China (English)

    Jing Hui QIU

    2009-01-01

    Applying the theory of locally convex spaces to vector optimization, we investigate the relationship between Henig proper efficient points and generalized Henig proper efficient points. In particular, we obtain a sufficient and necessary condition for generalized Henig proper efficient points to be Henig proper efficient points. From this, we derive several convenient criteria for judging Henig proper efficient points.

  6. Proper Islamic Consumption

    DEFF Research Database (Denmark)

    Fischer, Johan

    '[T]his book is an excellent study that is lucidly written, strongly informed by theory, rich in ethnography, and empirically grounded. It has blazed a new trail in employing the tools of both religious studies and cultural studies to dissect the complex subject of “proper Islamic consumption...... because it is the Malay‐dominated state which has been crucial in generating and shaping a particular kind of modernity in order to address the problems posed for nation‐building by a quite radical form of ethnic pluralism.' Reviewed by V.T. (Terry) King, University of Leeds, ASEASUK News 46, 2009   'In...... spite of a long line of social theory analyzing the spiritual in the economic, and vice versa, very little of the recent increase in scholarship on Islam addresses its relationship with capitalism. Johan Fischer’s book,Proper Islamic Consumption, begins to fill this gap. […] Fischer’s detailed...

  7. Spectral proper orthogonal decomposition

    CERN Document Server

    Sieber, Moritz; Paschereit, Christian Oliver

    2015-01-01

    The identification of coherent structures from experimental or numerical data is an essential task when conducting research in fluid dynamics. This typically involves the construction of an empirical mode base that appropriately captures the dominant flow structures. The most prominent candidates are the energy-ranked proper orthogonal decomposition (POD) and the frequency ranked Fourier decomposition and dynamic mode decomposition (DMD). However, these methods fail when the relevant coherent structures occur at low energies or at multiple frequencies, which is often the case. To overcome the deficit of these "rigid" approaches, we propose a new method termed Spectral Proper Orthogonal Decomposition (SPOD). It is based on classical POD and it can be applied to spatially and temporally resolved data. The new method involves an additional temporal constraint that enables a clear separation of phenomena that occur at multiple frequencies and energies. SPOD allows for a continuous shifting from the energetically ...

  8. Proper Islamic Consumption

    DEFF Research Database (Denmark)

    Fischer, Johan

    ”. It is a must-read for researchers and students alike, especially those who want to pursue their study on the middle class, Islam and consumption.' Reviewed by Prof. Abdul Rahman Embong, Asian Anthropology    'This volume does make an important contribution to our understanding of the responses of socially...... spite of a long line of social theory analyzing the spiritual in the economic, and vice versa, very little of the recent increase in scholarship on Islam addresses its relationship with capitalism. Johan Fischer’s book,Proper Islamic Consumption, begins to fill this gap. […] Fischer’s detailed...

  9. Characterizations of proper actions

    Science.gov (United States)

    Biller, Harald

    2004-03-01

    Three kinds of proper actions of increasing strength are defined. We prove that the three definitions specialize to the definitions by Bourbaki, by Palais and by Baum, Connes and Higson in their respective settings. The third of these, which thus turns out to be the strongest, originally only concerns actions of second countable locally compact groups on metrizable spaces. In this situation, it is shown to coincide with the other two definitions if the total space locally has the Lindelöf property and the orbit space is regular.

  10. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  11. Calculating proper transfer prices

    Energy Technology Data Exchange (ETDEWEB)

    Dorkey, F.C. (Meliora Research Associates, Rochester, NY (United States)); Jarrell, G.A. (Univ. of Rochester, NY (United States))

    1991-01-01

    This article deals with developing a proper transfer pricing method. Decentralization is as American as baseball. While managers laud the widespread benefits of both decentralization and baseball, they often greet the term transfer price policy with a yawn. Since transfer prices are as critical to the success of decentralized firms as good pitchers are to baseball teams, this is quite a mistake on the part of our managers. A transfer price is the price charged to one division for a product or service that another division produced or provided. In many, perhaps most, decentralized organizations, the transfer pricing policies actually used are grossly inefficient and sacrifice the potential advantages of decentralization. Experience shows that far too many companies have transfer pricing policies that cost them significantly in foregone growth and profits.

  12. Dam risk assistant analysis system design

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In order to reduce the labor intensity and task difficulty of dam risk analysis and to meet the actual requirements of dam risk analysis, it is necessary to establish a dam risk assistant analysis system. The program structure and the implementation approach of the dam risk assistant analysis system are analyzed, and a procedural framework with a "three-tier and multi-database" structure and "level structure" is established. The concept of modular development for the dam risk assessment system is proposed and the coupled mode of function modules and data is improved. Finally, the dam risk assistant analysis system is developed using the Delphi visual programming language.

  13. Meta analysis of risk factors for colorectal cancer

    Institute of Scientific and Technical Information of China (English)

    Kun Chen; Jiong-Liang Qiu; Yang Zhang; Yu-Wan Zhao

    2003-01-01

    AIM: To study the risk factors for colorectal cancer in China. METHODS: A meta-analysis of the risk factors for colorectal cancer was conducted over 14 case-control studies, reviewing 14 reports from a 13-year period that included 5034 cases and 5205 controls. DerSimonian and Laird random-effects models were used to process the results. RESULTS: Meta-analysis of the 14 studies demonstrated that proper physical activities and dietary fibers were protective factors (pooled OR<0.8), while fecal mucohemorrhage, chronic diarrhea and polyposis were highly associated with colorectal cancer (all pooled OR>4). The stratified results showed that the different OR values of some factors were due to geographic factors or different sources. CONCLUSION: Risks of colorectal cancer are significantly associated with histories of intestinal diseases or related symptoms, a high-lipid diet, emotional trauma and a family history of cancer. Suitable physical activities and dietary fibers are protective factors.
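    The DerSimonian and Laird random-effects pooling used in such meta-analyses can be sketched as follows; the odds ratios and confidence intervals below are invented for illustration, not taken from the cited studies:

```python
import numpy as np

def dersimonian_laird(or_values, ci_low, ci_high):
    """Pool odds ratios from independent studies with a
    DerSimonian-Laird random-effects model (95% CIs assumed)."""
    y = np.log(or_values)                          # log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    # Cochran's Q and between-study variance tau^2
    Q = np.sum(w * (y - y_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (se**2 + tau2)                  # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    pooled_or = np.exp(y_re)
    ci = (np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re))
    return pooled_or, ci

# Illustrative (invented) ORs for a protective factor from five studies
or_vals = np.array([0.7, 0.8, 0.6, 0.9, 0.75])
lo = np.array([0.5, 0.6, 0.4, 0.6, 0.55])
hi = np.array([0.98, 1.07, 0.9, 1.35, 1.02])
pooled, (cl, ch) = dersimonian_laird(or_vals, lo, hi)
```

With these inputs Cochran's Q falls below its degrees of freedom, so the between-study variance estimate is truncated to zero and the random-effects result coincides with the fixed-effect one.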

  14. 38 CFR 75.115 - Risk analysis.

    Science.gov (United States)

    2010-07-01

    ... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational recommendations... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Risk analysis. 75.115...

  15. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2013-12-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the "score method" based on Conan and Holder's model. The data were collected from the Balance Sheet and Profit and Loss Account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlighted the financial situation of the company and the level of the main financial ratios underpinning the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70-80%. In the year 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment in our country is very risky.

  16. How to conduct a proper sensitivity analysis in life cycle assessment: taking into account correlations within LCI data and interactions within the LCA calculation model.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrene; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2015-01-06

    Sensitivity analysis (SA) is a significant tool for studying the robustness of results and their sensitivity to uncertainty factors in life cycle assessment (LCA). It highlights the most important set of model parameters to determine whether data quality needs to be improved, and to enhance interpretation of results. Interactions within the LCA calculation model and correlations within Life Cycle Inventory (LCI) input parameters are two main issues in the LCA calculation process. Here we propose a methodology for conducting a proper SA that takes into account the effects of these two issues. This study first presents the SA in an uncorrelated case, comparing local and independent global sensitivity analysis. Independent global sensitivity analysis aims to analyze the variability of results caused by the variation of input parameters over the whole domain of uncertainty, together with interactions among input parameters. We then apply a dependent global sensitivity approach that makes minor modifications to traditional Sobol indices to address the correlation issue. Finally, we propose some guidelines for choosing the appropriate SA method depending on the characteristics of the model and the goals of the study. Our results clearly show that the choice of sensitivity methods should be made according to the magnitude of uncertainty and the degree of correlation.
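    The independent (uncorrelated) global sensitivity analysis discussed above is typically variance-based. A minimal Monte Carlo estimator of first-order Sobol indices, using a Saltelli-style sampling scheme on a toy linear model (not the authors' LCA model), might look like:

```python
import numpy as np

def sobol_first_order(model, n, d, rng):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli scheme)
    for inputs uniform on [0, 1]^d."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # replace column i with B's
        fABi = model(ABi)
        S[i] = np.mean(fB * (fABi - fA)) / var   # Saltelli (2010) estimator
    return S

# Toy model: y = x0 + 2*x1 (x2 inactive); exact indices are 0.2, 0.8, 0
model = lambda X: X[:, 0] + 2.0 * X[:, 1]
rng = np.random.default_rng(42)
S = sobol_first_order(model, 100_000, 3, rng)
```

Correlated inputs break the independence assumption of this estimator, which is exactly why the paper modifies the traditional Sobol indices.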

  17. Analysis of foreign schools of risk management

    OpenAIRE

    I.M. Posokhov

    2013-01-01

    The aim of the article. The aim of the article is to study the scientific development of foreign scientific schools of risk management, to analyze their main publications, and to delineate the foreign scientific schools of risk management. The results of the analysis. Research on modern risk management is carried out by leading foreign schools. The most famous school in the theory of financial risk and risk management is the American school. Among its current members are D. Galai, H. Greuning, A...

  18. Risk Analysis in Road Tunnels – Most Important Risk Indicators

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Thöns, Sebastian

    2016-01-01

    the effects and highlights the most important risk indicators with the aim to support further developments in risk analysis. Therefore, a system model of a road tunnel was developed to determine the risk measures. The system model can be divided into three parts: the fire part connected to the fire model Fire Dynamics Simulator (FDS); the evacuation part connected to the evacuation model FDS+Evac; and the frequency part connected to a model to calculate the frequency of fires. This study shows that the parts of the system model (and their most important risk indicators) affect the risk measures in the following......, further research can focus on these most important risk indicators with the aim to optimise risk analysis.

  19. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and `what if` questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  20. Risk Analysis for Tea Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is obvious that after all the disasters with dioxins, BSE, pathogens, Foot and Mouth disease a.o., and now, shortly, because of the possibilities of bioterrorism, Food Safety is almost at the top of the agenda of the EU for the years to come. The implementation of certain hygiene principles such as HACCP and a transparent hygiene policy applicable to all food and all food operators, from the farm to the table, together with effective instruments to manage Food Safety, will form a substantial part of this agenda. As an example, external quality factors such as certain pathogens in tea will be discussed. Since risk analysis of e.g. mycotoxins already has a quite long history and development in several international bodies, and tea might bear unwanted (or deliberately added by terroristic action) contaminants, the need to monitor tea much more regularly than is being done today seems to be a "conditio sine qua non". Recently developed Immuno Flow tests may one day help the consumer perhaps to find out if he gets poisoned.

  1. Modality and risk management for orthodontic extrusion procedures in interdisciplinary treatment for generating proper bone and tissue contours for the planned implant: a case report.

    Science.gov (United States)

    Maeda, Sachiko; Sasaki, Takeshi

    2015-12-01

    In adult interdisciplinary treatments using dental implants, limited orthodontic treatment, especially orthodontic extrusion (OE), offers many benefits, both by correcting tooth alignment and by contributing to the regeneration of periodontal tissues. However, orthodontic procedures carry some risks and unpredictabilities that might compromise tooth and/or periodontal tissue health. Especially in complex cases, it is difficult to decide which orthodontic treatment modalities should be combined, in what sequence they should be applied, and what their force systems and treatment times are. To achieve optimum results, some cases require two or more OEs at the same site, carried out at different times while taking the treatment effects into consideration. Such staged OE offers minimum intervention and maximum efficiency. In this case report, OE was first applied for orthodontic extraction. After bone regeneration followed by implant placement and another surgical operation, a second OE was applied to align the inclination of an adjacent tooth. As a result, a predictable prognosis for the implants as well as greatly improved esthetics and periodontal tissue health were achieved.

  2. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for submarine pipeline routing risk quantitative analysis is provided, developing the study from qualitative to quantitative analysis. The characteristics of the potential risk of the submarine pipeline system are considered, and grey-mode identification theory is used. The study process is composed of three parts: establishing the index system of routing risk quantitative analysis, establishing the grey-mode identification model for routing risk quantitative analysis, and establishing the standard for the mode identification result. A computed example shows that this model can directly and concisely reflect the hazard degree of the routing, and it provides a basis for future routing selection.

  3. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
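
    The core step above — crossing a fragility curve with the PSHA output to get an annual failure probability — can be sketched as follows. The lognormal fragility parameters and the power-law hazard curve are hypothetical placeholders, not values from the study:

```python
import math
import numpy as np

def lognormal_fragility(im, median, beta):
    """P(failure | IM = im) for a lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

def annual_failure_probability(im_grid, hazard_rates, median, beta):
    """Cross fragility with hazard: sum of P(fail | im) * |d lambda(im)|,
    where hazard_rates[i] is the annual exceedance rate of im_grid[i]."""
    d_lambda = -np.diff(hazard_rates)             # rate of IM falling in each bin
    im_mid = 0.5 * (im_grid[:-1] + im_grid[1:])   # bin midpoints
    p_fail = np.array([lognormal_fragility(x, median, beta) for x in im_mid])
    return float(np.sum(p_fail * d_lambda))

# Hypothetical power-law hazard curve for a test site (PGA in g)
im = np.linspace(0.05, 2.0, 400)
hazard = 1e-4 * im ** -2.0                        # annual exceedance rate
p_tank_failure = annual_failure_probability(im, hazard, median=0.6, beta=0.5)
```

    The resulting seismic failure probability would then feed the consequence analysis for loss-of-containment events.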

  4. Dynamic Blowout Risk Analysis Using Loss Functions.

    Science.gov (United States)

    Abimbola, Majeed; Khan, Faisal

    2017-08-11

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. The traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
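
    The probability-times-loss structure described above can be sketched minimally as follows. The logistic kick-probability model and the Taguchi-style quadratic loss are hypothetical stand-ins for the paper's models, with pressures in MPa:

```python
import numpy as np

def failure_probability(margin):
    """Hypothetical kick probability rising as the bottom-hole pressure
    margin over pore pressure shrinks (logistic in the margin, MPa)."""
    return 1.0 / (1.0 + np.exp(4.0 * margin))

def quadratic_loss(deviation, k=2.0e6):
    """Taguchi-style quadratic loss ($) for deviation from target pressure."""
    return k * deviation ** 2

def operational_risk(bottom_hole_p, pore_p, target_p):
    """Risk = probability x consequence, re-evaluated at each step."""
    margin = bottom_hole_p - pore_p
    return failure_probability(margin) * quadratic_loss(bottom_hole_p - target_p)

# Risk profile as drilling progresses and the pressure margin erodes
pressures = [52.0, 51.0, 50.5, 50.1]        # MPa, bottom-hole pressure
risk = [operational_risk(p, pore_p=50.0, target_p=52.0) for p in pressures]
```

    Re-running this at each time step with the measured bottom-hole pressure gives the real-time risk profile the abstract describes.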

  5. Supply Chain Visibility with Linked Open Data for Supply Chain Risk Analysis

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Current customs applications are declaration based to support the various customs procedures based on (inter)national laws and regulations. To be able to perform a proper supply chain risk analysis, customs requires all the data in supply chains. The current declaration procedures are not sufficient

  6. Low-dimensional models for the nonlinear vibration analysis of cylindrical shells based on a perturbation procedure and proper orthogonal decomposition

    Science.gov (United States)

    Gonçalves, P. B.; Silva, F. M. A.; Del Prado, Z. J. G. N.

    2008-08-01

    In formulating mathematical models for dynamical systems, obtaining a high degree of qualitative correctness (i.e. predictive capability) may not be the only objective. The model must be useful for its intended application, and models of reduced complexity are attractive in many cases where time-consuming numerical procedures are required. This paper discusses the derivation of discrete low-dimensional models for the nonlinear vibration analysis of thin cylindrical shells. In order to understand the peculiarities inherent to this class of structural problems, the nonlinear vibrations and dynamic stability of a circular cylindrical shell subjected to static and dynamic loads are analyzed. This choice is based on the fact that cylindrical shells exhibit a highly nonlinear behavior under both static and dynamic loads. Geometric nonlinearities due to finite-amplitude shell motions are considered by using Donnell's nonlinear shallow-shell theory. A perturbation procedure, validated in previous studies, is used to derive a general expression for the nonlinear vibration modes and the discretized equations of motion are obtained by the Galerkin method using modal expansions for the displacements that satisfy all the relevant boundary and symmetry conditions. Next, the model is analyzed via the Karhunen-Loève expansion to investigate the relative importance of each mode obtained by the perturbation solution on the nonlinear response and total energy of the system. The responses of several low-dimensional models are compared. It is shown that rather low-dimensional but properly selected models can describe with good accuracy the response of the shell up to very large vibration amplitudes.
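
    The Karhunen-Loève (proper orthogonal decomposition) step above — extracting empirical modes and their relative energy from response snapshots — can be sketched with a plain SVD. The two-mode displacement field below is synthetic, standing in for the shell response:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: two dominant spatial modes plus small noise,
# standing in for displacement fields sampled over time.
x = np.linspace(0, np.pi, 50)
t = np.linspace(0, 10, 200)
snapshots = (np.outer(np.sin(x), np.cos(2 * t))
             + 0.3 * np.outer(np.sin(2 * x), np.sin(5 * t))
             + 0.01 * rng.standard_normal((50, 200)))

# POD: left singular vectors are the empirical (Karhunen-Loeve) modes;
# squared singular values measure the energy captured by each mode.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = s**2 / np.sum(s**2)
```

    Truncating the expansion to the few modes carrying almost all of the energy is what yields the low-dimensional models compared in the paper.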

  7. Risk Analysis of Telecom Enterprise Financing

    Institute of Scientific and Technical Information of China (English)

    YU Hua; SHU Hua-ying

    2005-01-01

    The main research objects in this paper are the search for causes of, and methods for estimating, telecom enterprises' financing risks. Multi-mode financing makes it flexible for telecom enterprises to raise capital and obtain profit through corresponding projects, but there are also potential risks that go with these financing modes. After analyzing the categories and causes of telecom enterprises' financing risk, a method based on the Analytic Hierarchy Process (AHP) is put forward to estimate the financing risk. The author then offers suggestions and opinions based on an example analysis, in order to provide some ideas and a basis for telecom enterprises' financing decision-making.
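
    The AHP step can be sketched as follows; the three financing-risk criteria and the pairwise judgments on Saaty's 1-9 scale are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical pairwise comparison of three financing-risk criteria
# (interest-rate risk, exchange-rate risk, credit risk), Saaty 1-9 scale.
A = np.array([[1.0,  3.0,  5.0],
              [1/3., 1.0,  3.0],
              [1/5., 1/3., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                        # AHP priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 0.58                               # random index RI = 0.58 for n = 3
```

    A consistency ratio below 0.1 is the usual threshold for accepting the judgments; the priority vector then weights the risk factors in the overall estimate.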

  8. Supply-Chain Risk Analysis

    Science.gov (United States)

    2016-06-07

    [Briefing slide fragments; the recoverable points include: a security score at first submission (3/1/2010) measured against the CWE/SANS Top-25 errors; SQL database query output (all records with ID = 48983...); exploitable design or coding errors, with very little data for software supply chains; software supply chain complexity, where a composite inherits risk from any point; and supply-chain risk categories covering operational capabilities, knowledge of supplier capabilities, and knowledge of product attributes.]

  9. Bank Liquidity Risk: Analysis and Estimates

    Directory of Open Access Journals (Sweden)

    Meilė Jasienė

    2012-12-01

    Full Text Available In today’s banking business, liquidity risk and its management are among the most critical elements that underlie the stability and security of the bank’s operations, profit-making and clients’ confidence, as well as many of the decisions that the bank makes. Managing liquidity risk in a commercial bank is not something new, yet scientific literature has not focused enough on different approaches to liquidity risk management and assessment. Furthermore, models, methodologies or policies of managing liquidity risk in a commercial bank have never been examined in detail either. The goal of this article is to analyse the liquidity risk of commercial banks as well as the possibilities of managing it, and to build a liquidity risk management model for a commercial bank. The development, assessment and application of commercial bank liquidity risk management were based on an analysis of scientific resources, a comparative analysis and mathematical calculations.

  10. Yuan Exchange Rate 'Properly Adjusted'

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The currency exchange rate was "properly adjusted" this year and takes into account effects on the country's neighbors and the world, Premier Wen Jiabao said at a regional meeting in Malaysia.

  11. On the Crab Proper Motion

    CERN Document Server

    Caraveo, P A; Caraveo, Patrizia A; Mignani, Roberto

    1998-01-01

    Owing to the dramatic evolution of telescopes as well as optical detectors in the last 20 yrs, we are now able to measure anew the proper motion of the Crab pulsar, after the classical result of Wyckoff and Murray (1977) in a time span 40 times shorter. The proper motion is aligned with the axis of symmetry of the inner Crab nebula and, presumably, with the pulsar spin axis.

  12. Collision Risk Analysis for HSC

    DEFF Research Database (Denmark)

    Urban, Jesper; Pedersen, Preben Terndrup; Simonsen, Bo Cerup

    1999-01-01

    High Speed Craft (HSC) have a risk profile, which is distinctly different from conventional ferries. Due to different hull building material, structural layout, compartmentation and operation, both frequency and consequences of collision and grounding accidents must be expected to be different from...... conventional ships. To reach a documented level of safety, it is therefore not possible directly to transfer experience with conventional ships. The purpose of this paper is to present new rational scientific tools to assess and quantify the collision risk associated with HSC transportation. The paper...

  13. Initial Risk Analysis and Decision Making Framework

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
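
    The propagation of uncertain technical and financial factors into expected profitability can be sketched with a small Monte Carlo model. Every number below (capital cost, operating margin, capacity factor, discount rate) is an illustrative assumption, not a CCSI result:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain inputs for a stylized 650 MW retrofit
capex = rng.normal(1.5e9, 2.0e8, n)          # capital cost, $
margin = rng.normal(35.0, 8.0, n)            # operating margin, $/MWh
mwh_per_year = 650 * 8760 * 0.85             # 85% capacity factor
years, rate = 20, 0.08
annuity = (1 - (1 + rate) ** -years) / rate  # present-value factor

# Net present value of the investment under each sampled scenario
npv = margin * mwh_per_year * annuity - capex
expected_return = npv.mean()
risk = npv.std()                             # variability of net returns
prob_loss = (npv < 0).mean()                 # chance the retrofit loses money
```

    Measuring risk as the spread of net returns, and tracking the probability of loss, mirrors the prototype's framing of enterprise risk.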

  14. On proper linearization, construction and analysis of the Boyle-van't Hoff plots and correct calculation of the osmotically inactive volume.

    Science.gov (United States)

    Katkov, Igor I

    2011-06-01

    The Boyle-van't Hoff (BVH) law of physics has been widely used in cryobiology for calculation of the key osmotic parameters of cells and optimization of cryo-protocols. The proper use of linearization of the Boyle-van't Hoff relationship for the osmotically inactive volume (v(b)) has been discussed in a rigorous way in (Katkov, Cryobiology, 2008, 57:142-149). Nevertheless, scientists in the field have continued to use inappropriate methods of linearization (and curve fitting) of the BVH data, plotting of the BVH line and calculation of v(b). Here, we discuss the sources of incorrect linearization of the BVH relationship using concrete examples from recent publications, analyze the properties of the correct BVH line (which is unique for a given v(b)), provide appropriate statistical formulas for calculation of v(b) from the experimental data, and propose simple instructions (a standard operating procedure, SOP) for proper normalization of the data, appropriate linearization and construction of the BVH plots, and correct calculation of v(b). The possible sources of non-linear behavior or poor fit of the data to the proper BVH line, such as active water and/or solute transport, which can result in a large discrepancy between the hyperosmotic and hypoosmotic parts of the BVH plot, are also discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
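
    The key property of the correct BVH line discussed above is that, in normalized coordinates, it must pass through the isotonic point (1, 1), reducing the fit to the single parameter v(b). A sketch on synthetic normalized data (a true v(b) of 0.3 is assumed for illustration):

```python
import numpy as np

# Synthetic normalized BVH data: v = vb + (1 - vb) * x, with
# x = pi_iso / pi the inverse normalized osmolality.
true_vb = 0.3
inv_osm = np.array([0.5, 0.75, 1.0, 1.5, 2.0])
v = true_vb + (1 - true_vb) * inv_osm
v_noisy = v + np.array([0.01, -0.01, 0.0, 0.008, -0.006])  # measurement noise

# Proper one-parameter fit: constraining the line through the isotonic
# point (1, 1) gives  v - x = vb * (1 - x),  a regression through the origin.
t = 1.0 - inv_osm
vb_hat = float(np.dot(v_noisy - inv_osm, t) / np.dot(t, t))
```

    An unconstrained two-parameter fit, by contrast, can place the line away from (1, 1) and bias the estimate of v(b), which is one of the errors the paper cautions against.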

  15. Collision Risk Analysis for HSC

    DEFF Research Database (Denmark)

    Urban, Jesper; Pedersen, Preben Terndrup; Simonsen, Bo Cerup

    1999-01-01

    for a HSC on a given route, an analysis of the released energy during a collision, analytical closed form solutions for the absorbed energy in the structure and finally an assessment of the overall structural crushing behaviour of the vessel, including the level of acceleration and the size of the crushing...... analysis tools to quantify the effect of the high speed have been available. Instead nearly all research on ship accidents has been devoted to analysis of the consequences of given accident scenarios. The proposed collision analysis includes an analysis which determines the probability of a collision...

  16. Analysis of foreign schools of risk management

    Directory of Open Access Journals (Sweden)

    I.M. Posokhov

    2013-12-01

    Full Text Available The aim of the article is to study the scientific development of foreign scientific schools of risk management, to analyze their main publications, and to delineate those schools. The results of the analysis show that research on modern risk management is carried out by leading foreign schools. The best-known school in the theory of financial risk and risk management is the American school. Among its current members are D. Galai, H. Greuning, A. Damodaran, P. Jorion, J. Kallman, M. Crouhy, M. Mccarthy, R. Mark, T. Flynn and other scientists. An important contribution to the development of the theory and practice of risk management was made by British scientists and economists, the representatives of the English school of risk management: T. Andersen, T. Bedford, A. Griffin, A. Zaman, R. Cooke, P. Sweeting, P. Hopkin, the German P. Schroder and others. The German scientific school has significantly advanced the theory of risk management; building on the classic work of Zadeh, it has obtained significant risk assessment results using fuzzy logic and fuzzy sets. The Graduate School of Risk Management of the University of Cologne is a training and research group funded by the German Research Foundation (DFG) under the project «Theoretical and Empirical Basis of Risk Management». The aim of the Graduate School of Risk Management is to promote young scientists. At the risk management school of the University of Cologne, outstanding research is conducted by German and foreign professors such as K. Mosler, A. Kempf, C. Kuhner, T. Hartmann-Wendels, C. Homburg, D. Hess, D. Sliwka, F. Schmid and H. Schradin. The author also noted the existence and fruitful work of the risk management laboratory in Switzerland (Risk Lab Switzerland) and its leading scientists P. Embrechts, A. Gisler, M. Wüthrich, H. Bühlmann, V. Bignozzi, M. Hofert and P. Deprez, as well as the Basel Committee on banking supervision, developer of the international standards of Basel

  17. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    Science.gov (United States)

    2012-03-01

    [Report and slide fragments; the recoverable points include: the workshop title 'Asbestos Workshop: Sampling, Analysis, and Risk Assessment', presented by Paul Black, PhD (Neptune and Company) and Ralph Perona, DABT (Neptune and...), EMDQ March 2012; and a list of asbestos-containing products: coatings, vinyl/asbestos floor tile, automatic transmission components, clutch facings, disc brake pads, drum brake linings, brake blocks, commercial and...]

  18. Proper conformal symmetries in SD Einstein spaces

    CERN Document Server

    Chudecki, Adam

    2014-01-01

    Proper conformal symmetries in self-dual (SD) Einstein spaces are considered. It is shown that such symmetries are admitted only by Einstein spaces of the type [N]x[N]. Spaces of the type [N]x[-] are considered in detail. Existence of a proper conformal Killing vector implies existence of an isometric, covariantly constant and null Killing vector. It is shown that there are two classes of [N]x[-]-metrics admitting proper conformal symmetry. They can be distinguished by analysis of the associated anti-self-dual (ASD) null strings. Both classes are analyzed in detail. The problem is reduced to a single linear PDE. Some general and special solutions of this PDE are presented.

  19. Project cost analysis under risk

    Directory of Open Access Journals (Sweden)

    Florica LUBAN

    2010-12-01

    Full Text Available In this paper, an integrated approach based on Monte Carlo simulation and the Six Sigma methodology is used to analyze the risk associated with a project's total cost. Monte Carlo simulation is applied to understand the variability in total cost caused by the probabilistic cost items. Through the Six Sigma methodology, the range of variation of the project cost can be reduced by operating on the input factors with the greatest impact on total cost, so as to cover a variation of 6σ between the limits that were established in the design phase of Six Sigma.
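
    The Monte Carlo half of the approach can be sketched as follows; the three probabilistic cost items and their triangular distributions are hypothetical. The correlation of each item with the total identifies the inputs with the greatest impact, i.e. the targets for Six Sigma variation reduction:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical probabilistic cost items (low, mode, high) in k$
items = [(90, 100, 130),   # structural work
         (40, 50, 90),     # equipment
         (20, 25, 45)]     # labour
samples = np.array([rng.triangular(lo, mode, hi, n) for lo, mode, hi in items])
total = samples.sum(axis=0)

mean_cost = total.mean()
p10, p90 = np.percentile(total, [10, 90])          # spread of total cost
# Correlation of each item with the total: the largest value flags the
# input factor whose variation drives total-cost variability the most.
impact = [float(np.corrcoef(s, total)[0, 1]) for s in samples]
```

    Narrowing the distribution of the highest-impact item and re-running the simulation shows directly how much the total-cost spread shrinks.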

  20. Gender Analysis of Risk in Innovation System

    DEFF Research Database (Denmark)

    Ayinde, Ope; Muchie, Mammo; Abaniyan, E. O.

    2011-01-01

    This study analyzed risk by gender in innovation in Kwara State, Nigeria, using downy mildew resistant maize production as a case study. The study employed primary and secondary data. The primary data were collected through well-structured questionnaires administered to both male and female producing ... the new maize variety. The analytical tools used include descriptive statistics, a regression model, risk utility functions and risk parameter analysis. The result showed that invasion by animals, disease and pests, lack of access to credit, wind and price fluctuation were the major risks facing the maize ... producers in the area in the usage of the new innovation. The study also revealed that male producers were more willing to take risk in the new maize variety production than the females, while the females were more indifferent to the risk involved in the new maize production variety than the males. None...

  1. Intentional risk management through complex networks analysis

    CERN Document Server

    Chapela, Victor; Moral, Santiago; Romance, Miguel

    2015-01-01

    This book combines game theory and complex networks to examine intentional technological risk through modeling. As information security risks are in constant evolution,  the methodologies and tools to manage them must evolve to an ever-changing environment. A formal global methodology is explained  in this book, which is able to analyze risks in cyber security based on complex network models and ideas extracted from the Nash equilibrium. A risk management methodology for IT critical infrastructures is introduced which provides guidance and analysis on decision making models and real situations. This model manages the risk of succumbing to a digital attack and assesses an attack from the following three variables: income obtained, expense needed to carry out an attack, and the potential consequences for an attack. Graduate students and researchers interested in cyber security, complex network applications and intentional risk will find this book useful as it is filled with a number of models, methodologies a...

  2. Relative risk regression analysis of epidemiologic data.

    Science.gov (United States)

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risk processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks as have simulation studies of relative risk estimation
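
    The classical Mantel-Haenszel procedure that relative risk regression generalizes can be sketched on two stratified 2x2 tables; the cohort counts below are illustrative only:

```python
# Stratified cohort data: per stratum, (exposed cases, exposed total,
# unexposed cases, unexposed total) -- hypothetical counts.
strata = [(12, 100, 5, 120),
          (25, 150, 10, 140)]

# Mantel-Haenszel pooled risk ratio:
#   RR_MH = sum(a_i * n0_i / N_i) / sum(c_i * n1_i / N_i)
num = sum(a * n0 / (n1 + n0) for a, n1, c, n0 in strata)
den = sum(c * n1 / (n1 + n0) for a, n1, c, n0 in strata)
rr_mh = num / den
```

    The pooled estimate lies between the stratum-specific risk ratios, adjusting for the stratification variable; relative risk regression replaces this fixed stratification with arbitrary baseline rates and time-varying covariates.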

  3. VVV High Proper Motion Survey

    CERN Document Server

    Gromadzki, M; Folkes, S; Beamin, J C; Ramirez, K Pena; Borissova, J; Pinfield, D; Jones, H; Minniti, D; Ivanov, V D

    2013-01-01

    Here we present a survey of proper-motion stars towards the Galactic Bulge and an adjacent plane region based on VISTA-VVV data. The search method is based on cross-matching photometric Ks-band CASU catalogs. The most interesting discoveries are shown.

  4. INTERNAL PROPER MOTIONS IN THE ESKIMO NEBULA

    Energy Technology Data Exchange (ETDEWEB)

    García-Díaz, Ma. T.; Gutiérrez, L.; Steffen, W.; López, J. A. [Instituto de Astronomía, Universidad Nacional Autónoma de México, Km 103 Carretera Tijuana-Ensenada, 22860 Ensenada, B.C. (Mexico); Beckman, J., E-mail: tere@astro.unam.mx, E-mail: leonel@astro.unam.mx, E-mail: wsteffen@astro.unam.mx, E-mail: jal@astro.unam.mx, E-mail: jeb@iac.es [Instituto de Astrofísica de Canarias, La Laguna, Tenerife (Spain)

    2015-01-10

    We present measurements of internal proper motions at more than 500 positions of NGC 2392, the Eskimo Nebula, based on images acquired with WFPC2 on board the Hubble Space Telescope at two epochs separated by 7.695 yr. Comparisons of the two observations clearly show the expansion of the nebula. We measured the amplitude and direction of the motion of local structures in the nebula by determining their relative shift during that interval. In order to assess the potential uncertainties in the determination of proper motions in this object, in general, the measurements were performed using two different methods, used previously in the literature. We compare the results from the two methods, and to perform the scientific analysis of the results we choose one, the cross-correlation method, because it is more reliable. We go on to perform a "criss-cross" mapping analysis on the proper motion vectors, which helps in the interpretation of the velocity pattern. By combining our results of the proper motions with radial velocity measurements obtained from high resolution spectroscopic observations, and employing an existing 3D model, we estimate the distance to the nebula to be 1.3 kpc.
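
    The cross-correlation method preferred above can be sketched on synthetic two-epoch images of a single nebular knot; the frame size, applied shift, and noise level are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_blob(shape, x0, y0, sigma=3.0):
    """Synthetic image of one knot centered at (x0, y0)."""
    y, x = np.indices(shape)
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

# Two epochs of the same knot, displaced by (dx, dy) = (2, -1) px plus noise
epoch1 = gaussian_blob((64, 64), 30.0, 32.0) + 0.01 * rng.standard_normal((64, 64))
epoch2 = gaussian_blob((64, 64), 32.0, 31.0) + 0.01 * rng.standard_normal((64, 64))

# FFT-based cross-correlation; the peak location gives the displacement
cc = np.fft.ifft2(np.fft.fft2(epoch1).conj() * np.fft.fft2(epoch2)).real
dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
# Wrap offsets larger than half the frame back to negative shifts
dy = dy - 64 if dy > 32 else dy
dx = dx - 64 if dx > 32 else dx
```

    Dividing the recovered pixel shift by the epoch separation and the plate scale converts it into a proper motion for that local structure.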

  5. Factores de riesgo para la mortalidad materna: usos del enfoque de riesgo para la atención de las mujeres en edad reproductiva / Risk factors and the risk approach for the proper care of women in reproductive age

    Directory of Open Access Journals (Sweden)

    Luis O. Cataño

    1993-03-01

    Full Text Available

    Different models that have allowed the interpretation of maternal mortality and its causal relationship with risk factors are presented. The risk approach is explained as a contribution of epidemiology that helps interpret the causality of such mortality. The difference between risk and risk factor is established. The elements that epidemiological surveillance offers to the risk approach are described, and the relationship between this approach and the components of primary care, such as research, is identified. The possibilities offered by this approach for materializing decentralization and developing the local health systems (SILOS) are highlighted. Maternal mortality is recognized as an expression of the gender differential and therefore as an indicator of women's health and of the reproductive health of each community. The development of operational research is proposed as one of the solutions to reproductive health problems.

     

    Different models are presented for interpreting maternal mortality and the causal relationship of risk factors. The risk approach is explained and the difference between risk and risk factor is established. Elements of epidemiological surveillance for the risk approach are presented. The relationship between this approach and components of primary care is identified. The possibilities offered by the risk approach for the development of the local health care systems are emphasized. Maternal mortality is accepted as an expression of the generic differential and, therefore, as an Index of women's health and of the reproductive health of the community.

     

  6. Sensitivity Analysis in Vector Optimization under Benson Proper Efficiency

    Institute of Scientific and Technical Information of China (English)

    盛宝怀; 刘三阳

    2002-01-01

    The behavior of the perturbation map G(u) under Benson proper efficiency is analyzed quantitatively by using the concept of contingent derivatives for set-valued maps. Let W(u) = Pmin[G(u), S] and ŷ ∈ W(û). It is shown that, under some conditions, DW(û, ŷ)(u) ⊂ Pmin[DG(û, ŷ)(u), S], and that, under some other conditions, DW(û, ŷ)(u) ⊃ Pmin[DG(û, ŷ)(u), S].

  7. Evaluation of proper height for squatting stool.

    Science.gov (United States)

    Jung, Hwa S; Jung, Hyung-Shik

    2008-05-01

    Many jobs and activities in people's daily lives have them in squatting postures. Jobs such as housekeeping, farming and welding require various squatting activities. It is speculated that prolonged squatting without any type of supporting stool would gradually and eventually impose musculoskeletal injuries on workers. This study aims to examine the proper height of the stool according to the position of working materials for the squatting worker. A total of 40 male and female college students and 10 female farmers participated in the experiment to find the proper stool height. Student participants were asked to sit and work in three different positions: floor level of 50 mm; ankle level of 200 mm; and knee level of 400 mm. They were then provided with stools of various heights and asked to maintain a squatting work posture. For each working position, they were asked to write down their thoughts on a preferred stool height. A Likert summated rating method as well as a pairwise ranking test was applied to evaluate user preference for the provided stools under the different working positions. Under a similar experimental procedure, female farmers were asked to indicate their body part discomfort (BPD) on a body chart before and after performing the work. Statistical analysis showed that comparable results were found with both evaluation measures. When the working position is below 50 mm, the proper stool height is 100 mm and should not be higher than 150 mm. When the working position is 200 mm, the proper stool height is 150 mm. When the working position is 400 mm, the proper stool height is 200 mm. Thus, it is strongly recommended to use a stool of the proper height for the corresponding working position. Moreover, a wearable chair prototype was designed so that workers in a squatting posture do not have to carry and move the stool from one place to another. This stool should ultimately help to relieve physical stress and hence promote the health of squatting workers.

  8. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    Full Text Available At the present time risk management is an integral part of state policy in all countries with a developed market economy. Companies dealing with consulting services and the implementation of risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized process in construction company activity, which often leads to scandals and disapproval in case of unprofessional implementation of projects. The authors present the results of an investigation of the modern understanding of the existing methodology classification and offer their own concept of a classification matrix of risk management methods. Creation of the developed matrix is based on the analysis of each method in the context of incoming and outgoing transformed information, which may include different elements of risk control stages. The offered approach thus allows analysis of the possibilities of each method.

  9. Multicriteria Decision Analysis for banks risks evaluation

    OpenAIRE

    Rakotoarivelo, Jean-Baptiste; ZARATÉ, Pascale; Razafimandimby, Josvah Paul

    2015-01-01

    International audience; This poster aims at a better choice for risk evaluation in financial organisms. Our aim is to support banks during customer operations with respect to funding opportunities, investment or the granting of credit. First of all, we identify the different types of risks associated with this activity; secondly, we analyse them with a multicriteria analysis method, AHP (Analytic Hierarchy Process), with different means adopted to identify them. It should be noted tha...

  10. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at the management level: qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables the development of an "investment policy" towards managing a portfolio of risks. Impact at the research level: quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.
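
    The qualitative scheme described above can be sketched as a simple criticality scoring over a risk portfolio; the risks, ordinal scales and scores below are invented for illustration, not RAIT's actual scheme.

```python
# Hypothetical risk portfolio: each risk gets ordinal likelihood and
# severity scores (1-5); criticality = likelihood x severity drives
# the "investment policy" ranking. All entries are illustrative.
def criticality(likelihood, severity):
    return likelihood * severity

portfolio = {
    "schedule slip": (4, 2),
    "component failure": (2, 5),
    "budget overrun": (3, 3),
}

ranked = sorted(portfolio, key=lambda r: criticality(*portfolio[r]), reverse=True)
for name in ranked:
    print(name, criticality(*portfolio[name]))
```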

  11. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the flood risk assessment. Epistemic uncertainties are therefore the main goal of this work; in particular, understanding the extent of the propagation of uncertainties throughout the process, starting with inundability studies and ending with risk analysis, and how much a proper analysis of the risk of flooding may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, both in numerical and in cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows for the development of a probabilistic model of the system in a deterministic setting. This is done by using random variables and polynomials to handle the effects of uncertainty. The results of the method's application are more robust than those of traditional analysis
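
    A minimal Monte Carlo sketch of uncertainty propagation, one of the methods mentioned above; the damage function and input distributions are invented for illustration, not those of the study.

```python
import numpy as np

# Toy flood-damage model D = a * Q^b with uncertain discharge Q and
# exponent b; Monte Carlo sampling propagates both uncertainties to
# the damage estimate. All parameters are illustrative.
rng = np.random.default_rng(42)
n = 100_000
Q = rng.lognormal(mean=np.log(500.0), sigma=0.3, size=n)  # discharge [m3/s]
b = rng.normal(loc=1.2, scale=0.05, size=n)               # damage exponent
a = 1e3                                                   # scale factor [EUR]

damage = a * Q ** b
lo, hi = np.percentile(damage, [2.5, 97.5])
print(f"mean damage: {damage.mean():.3e} EUR")
print(f"95% interval: {lo:.3e} - {hi:.3e} EUR")
```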

  12. Quantitative risks analysis of maritime terminal petrochemical

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work consists of the application of a computer program (RISKAN) developed for industrial risk quantification studies, together with a revision of the models used in the program. As part of the evaluation, the computer program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. A Quantitative Risk Analysis of the terminal was thus performed, both for the workers and for the nearby population, with a verification of acceptability using the tolerability limits established by the State Licensing Agency (FEPAM-RS). In the risk analysis methodology used internationally, the most common way of presenting societal risk results is graphically, with FN curves, while for individual risk it is common to use iso-risk curves traced on a map of the area where the plant is located. At the beginning of the study, both a historical analysis of accidents and the technique of Preliminary Risk Analysis were used to aid in the identification of possible accident scenarios related to the activities in the terminal. After identifying the initiating events, their frequencies or probabilities of occurrence were estimated, followed by the calculation of physical effects and deaths, using, inside the computer program, published models of the Prins Maurits Laboratory and of the American Institute of Chemical Engineers. The average societal risk obtained for the external population was 8.7x10{sup -7} fatality.year{sup -1}, and for the internal population (people working inside the terminal), 3.2x10{sup -4} fatality.year{sup -1}. The accident scenario that contributed most to the societal risk was death due to exposure to the thermal radiation caused by a pool fire, with 84.3% of the total estimated for the external population and 82.9% for the people inside the terminal. The
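
    The average societal risk figure quoted above is, in the usual QRA formulation, a frequency-weighted sum of fatalities over accident scenarios; the sketch below uses invented scenario numbers, not the terminal's data.

```python
# sum_i f_i * N_i over accident scenarios gives the average societal
# risk in fatalities per year; scenario frequencies and fatality
# counts below are invented for illustration.
scenarios = [
    # (name, frequency [1/yr], expected fatalities)
    ("pool fire thermal radiation", 2.0e-7, 3.0),
    ("toxic cloud", 5.0e-8, 1.0),
    ("vapour cloud explosion", 1.0e-8, 5.0),
]

social_risk = sum(f * n for _, f, n in scenarios)
print(f"average societal risk: {social_risk:.2e} fatalities/yr")

# per-scenario contribution, analogous to the 84.3% pool-fire figure above
for name, f, n in scenarios:
    print(f"  {name}: {100.0 * f * n / social_risk:.1f}%")
```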

  13. Gender Analysis of Risk in Innovation System

    DEFF Research Database (Denmark)

    Ayinde, Ope; Muchie, Mammo; Abaniyan, E. O.

    2011-01-01

    that information and knowledge on new technology and innovation should be made available to the farmers. Also, farmers with less experience, and female farmers in particular, should be encouraged and integrated into the agricultural production and innovation systems. Supply of inputs for the innovative facilities should...... the new maize variety. The analytical tools used include descriptive statistics, a regression model, risk utility functions and risk parameter analysis. The results showed that invasion by animals, disease and pests, lack of access to credit, wind, and price fluctuation were the major risks facing the maize...

  14. Competence Set Analysis Under Risk and Uncertainty

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Competence set analysis can be applied to solve decision making problems successfully and satisfactorily. This paper focuses on research into, and development of, expansion strategies for the competence set under risk and uncertainty. A systematic expression of competence set analysis is described, several expansion principles and strategies for different cases are presented, and their applications to a personnel training program are discussed. Some conclusions and suggestions for further work are also included.

  15. Economic impact assessment in pest risk analysis

    NARCIS (Netherlands)

    Soliman, T.A.A.; Mourits, M.C.M.; Oude Lansink, A.G.J.M.; Werf, van der W.

    2010-01-01

    According to international treaties, phytosanitary measures against introduction and spread of invasive plant pests must be justified by a science-based pest risk analysis (PRA). Part of the PRA consists of an assessment of potential economic consequences. This paper evaluates the main available tec

  16. Game Theoretic Risk Analysis of Security Threats

    CERN Document Server

    Bier, Vicki M

    2008-01-01

    Introduces reliability and risk analysis in the face of threats by intelligent agents. This book covers applications to networks, including problems in both telecommunications and transportation. It provides a set of tools for applying game theory to reliability problems in the presence of intentional, intelligent threats

  17. Proper motions of the HH 1 jet

    Science.gov (United States)

    Raga, A. C.; Reipurth, B.; Esquivel, A.; Castellanos-Ramírez, A.; Velázquez, P. F.; Hernández-Martínez, L.; Rodríguez-González, A.; Rechy-García, J. S.; Estrella-Trujillo, D.; Bally, J.; González-Gómez, D.; Riera, A.

    2017-10-01

    We describe a new method for determining proper motions of extended objects, and a pipeline developed for the application of this method. We then apply this method to an analysis of four epochs of [S II] HST images of the HH 1 jet (covering a period of ≈20 yr). We determine the proper motions of the knots along the jet, and make a reconstruction of the past ejection velocity time-variability (assuming ballistic knot motions). This reconstruction shows an "acceleration" of the ejection velocities of the jet knots, with higher velocities at more recent times. This acceleration will result in an eventual merging of the knots in ≈450 yr and at a distance of ≈80'' from the outflow source, close to the present-day position of HH 1.
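
    The knot proper motions behind this reconstruction come down to fitting position against epoch; below is a minimal sketch with invented positions, not the actual HST measurements.

```python
import numpy as np

# A knot's proper motion is the slope of its offset from the source
# versus observation epoch; a least-squares line over the four epochs
# gives arcsec/yr. Positions below are invented for illustration.
epochs = np.array([1994.6, 2007.6, 2010.1, 2014.6])  # observation dates [yr]
offset = np.array([10.00, 13.90, 14.65, 16.00])      # knot offset [arcsec]

mu, intercept = np.polyfit(epochs, offset, 1)        # linear fit
print(f"proper motion: {mu:.3f} arcsec/yr")
```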

  18. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and later adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport) and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; on the contrary, indirect

  19. A tulajdonnevek pszicho- és neurolingvisztikája. Vizsgálati szempontok és modellek a tulajdonnevek feldolgozásáról [The psycho- and neurolinguistics of proper names. Aspects and models of analysis on processing proper names

    Directory of Open Access Journals (Sweden)

    Reszegi, Katalin

    2014-12-01

    Full Text Available This paper provides an overview of the results of psycho- and neurolinguistic examinations into the mental processes involving proper names (i.e. storing, processing and retrieving proper names). We can denote entities of various types with the help of proper names, and although most of these types are universal, there are in fact some cultural differences. In the fields of science concerned, that is, in psycho- and neurolinguistics and in neuropsychology, attention is given almost exclusively to anthroponyms; mental and neurological features of toponyms and other name types are much less examined. Processing names is generally believed to present more difficulties than processing common nouns, and these difficulties manifest themselves more and more strongly with age. In connection with the special identifying function and semantic features of proper names, many researchers assume that we process the two groups of words in different ways. This paper, reflecting also on these assumptions, summarizes and explains the results of research into (a) selective anomia affecting monolingual speakers (word-finding disturbances); (b) localization; (c) reaction time measurement; and (d) speech disfluency concerning proper names (especially the "tip of the tongue" phenomenon). The author also presents the models of processing proper names, examining to what degree these models can be reconciled with our knowledge of the acquisition of proper names. Finally, the results and possible explanations of the small amount of research into the representation and processing of proper names by bilingual speakers are discussed.

  20. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  1. Contribution of European research to risk analysis.

    Science.gov (United States)

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1-Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumers' risks; development and validation of new modelling approaches; harmonization of risk assessment principles, methodologies and terminology; standardization of methods and systems used for the safety evaluation of transgenic food; provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; development of a communication platform for genetically modified organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; development and validation of new methods for safety testing of transgenic food; evaluation of the safety and efficacy of iron supplementation in pregnant women; to the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.

  2. Risk and safety analysis of nuclear systems

    CERN Document Server

    Lee, John C

    2011-01-01

    The book has been developed in conjunction with NERS 462, a course offered every year to seniors and graduate students in the University of Michigan NERS program. The first half of the book covers the principles of risk analysis, the techniques used to develop and update a reliability data base, the reliability of multi-component systems, Markov methods used to analyze the unavailability of systems with repairs, fault trees and event trees used in probabilistic risk assessments (PRAs), and failure modes of systems. All of this material is general enough that it could be used in non-nuclear a

  3. Risk assessment for benefits analysis: framework for analysis of a thyroid-disrupting chemical.

    Science.gov (United States)

    Axelrad, Daniel A; Baetcke, Karl; Dockins, Chris; Griffiths, Charles W; Hill, Richard N; Murphy, Patricia A; Owens, Nicole; Simon, Nathalie B; Teuschler, Linda K

    Benefit-cost analysis is of growing importance in developing policies to reduce exposures to environmental contaminants. To quantify health benefits of reduced exposures, economists generally rely on dose-response relationships estimated by risk assessors. Further, to be useful for benefits analysis, the endpoints that are quantified must be expressed as changes in incidence of illnesses or symptoms that are readily understood by and perceptible to the layperson. For most noncancer health effects and for nonlinear carcinogens, risk assessments generally do not provide the dose-response functions necessary for economic benefits analysis. This article presents the framework for a case study that addresses these issues through a combination of toxicology, epidemiology, statistics, and economics. The case study assesses a chemical that disrupts proper functioning of the thyroid gland, and considers the benefits of reducing exposures in terms of both noncancer health effects (hypothyroidism) and thyroid cancers. The effects are presumed to be due to a mode of action involving interference with thyroid-pituitary functioning that would lead to nonlinear dose response. The framework integrates data from animal testing, statistical modeling, human data from the medical and epidemiological literature, and economic methodologies and valuation studies. This interdisciplinary collaboration differs from the more typical approach in which risk assessments and economic analyses are prepared independently of one another. This framework illustrates particular approaches that may be useful for expanded quantification of adverse health effects, and demonstrates the potential of such interdisciplinary approaches. Detailed implementation of the case study framework will be presented in future publications.

  4. Risk analysis for earth dam overtopping

    Directory of Open Access Journals (Sweden)

    Mo Chongxun

    2008-06-01

    Full Text Available In this paper, a model of overtopping risk under the joint effects of floods and wind waves, which is based on risk analysis theory and takes into account the uncertainties of floods, wind waves, reservoir capacity and discharge capacity of the spillway, is proposed and applied to the Chengbihe Reservoir in Baise City in Guangxi Zhuang Autonomous Region. The simulated results indicate that the flood control limiting level can be raised by 0.40 m under the condition that the reservoir overtopping risk is controlled within a mean variance of 5×10⁻⁶. As a result, the reservoir storage will increase to 16 million m³ and electrical energy generation and other functions of the reservoir will also increase greatly.
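
    A hedged Monte Carlo sketch of the overtopping-risk idea: the input distributions and crest level below are invented for illustration, not the Chengbihe Reservoir model.

```python
import numpy as np

# Overtopping occurs when flood stage plus wind-wave run-up exceeds
# the crest elevation; sampling both uncertain inputs estimates the
# overtopping probability. All numbers are illustrative.
rng = np.random.default_rng(0)
n = 1_000_000
crest = 12.0                                           # crest elevation [m]
flood_level = rng.gumbel(loc=9.0, scale=0.5, size=n)   # flood stage [m]
wave_runup = rng.exponential(scale=0.3, size=n)        # wave run-up [m]

p_overtop = np.mean(flood_level + wave_runup > crest)
print(f"estimated overtopping risk: {p_overtop:.2e} per year")
```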

  5. Risk analysis for earth dam overtopping

    Institute of Scientific and Technical Information of China (English)

    Mo Chongxun; Liu Fanggui; Yu Mei; Ma Rongyong; Sun Guikai

    2008-01-01

    In this paper, a model of overtopping risk under the joint effects of floods and wind waves, which is based on risk analysis theory and takes into account the uncertainties of floods, wind waves, reservoir capacity and discharge capacity of the spillway, is proposed and applied to the Chengbihe Reservoir in Baise City in Guangxi Zhuang Autonomous Region. The simulated results indicate that the flood control limiting level can be raised by 0.40 m under the condition that the reservoir overtopping risk is controlled within a mean variance of 5×10⁻⁶. As a result, the reservoir storage will increase to 16 million m³ and electrical energy generation and other functions of the reservoir will also increase greatly.

  6. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, ran quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability; (2) nature of the vulnerability; (3) existence and effectiveness of current controls (methods and process).
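
    Phase I's likelihood modelling can be sketched with a plain logistic regression; the data below are synthetic stand-ins for the analysts' responses, and the fit uses simple gradient ascent rather than any particular statistics package.

```python
import numpy as np

# Synthetic data: two governing factors (e.g. threat capability and
# control effectiveness) drive a binary "risk occurred" outcome.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=(n, 2))
true_w = np.array([1.5, -2.0])
p = 1.0 / (1.0 + np.exp(-(x @ true_w)))
y = (rng.random(n) < p).astype(float)

# Fit logistic regression by gradient ascent on the log-likelihood.
w = np.zeros(2)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(x @ w)))
    w += 0.1 * x.T @ (y - pred) / n

print("estimated coefficients:", np.round(w, 2))
```

The recovered coefficients should sit close to the generating values, with sampling noise shrinking as more analysts (rows) are added.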

  7. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the individual risk and the general safety of the public can be affected by the operation of a gas pipeline. In case the individual or societal risks are considered intolerable compared with international standards, measures are recommended to mitigate the risk associated with the operation down to levels that can be considered compatible with best practices in the industry. A quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This calculation methodology is centred on defining the frequencies of occurrence of events according to a database representative of each case under study. In addition, it establishes the consequences according to the particular considerations of each area and the different possibilities of interference with the gas pipeline under study. For each of these interferences, a typical curve of ignition probability is developed as a function of the distance to the pipe. (author)

  8. Integrated Reliability and Risk Analysis System (IRRAS)

    Energy Technology Data Exchange (ETDEWEB)

    Russell, K D; McKay, M K; Sattison, M B; Skinner, N L; Wood, S T [EG and G Idaho, Inc., Idaho Falls, ID (United States); Rasmuson, D M [Nuclear Regulatory Commission, Washington, DC (United States)

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.
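
    The cut-set generation step can be illustrated on a toy fault tree; the tree, gate encoding and MOCUS-style expansion below are a simplified sketch, not IRRAS's actual algorithm or data format.

```python
from itertools import product

# Fault tree as nested tuples: ("OR"/"AND", child, child, ...);
# leaves are basic-event names. cut_sets() expands the tree and
# minimal() drops non-minimal (superset) cut sets.
def cut_sets(node):
    if isinstance(node, str):                    # basic event
        return [frozenset([node])]
    gate, children = node[0], node[1:]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                             # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND: one cut set from each child, merged
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    return [s for s in set(sets) if not any(t < s for t in sets)]

# TOP = (pump A fails AND pump B fails) OR loss of offsite power
tree = ("OR", ("AND", "pumpA", "pumpB"), "power")
mcs = minimal(cut_sets(tree))
print(sorted(sorted(s) for s in mcs))
```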

  9. Risk Analysis of Accounting Information System Infrastructure

    OpenAIRE

    MIHALACHE, Arsenie-Samoil

    2011-01-01

    National economy and security are fully dependent on information technology and infrastructure. At the core of the information infrastructure society relies on, we have the Internet, a system designed initially as a scientists’ forum for unclassified research. The use of communication networks and systems may lead to hazardous situations that generate undesirable effects such as communication systems breakdown, loss of data or taking the wrong decisions. The paper studies the risk analysis of...

  10. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of its different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.
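
    One of the hazard-map products can be sketched in a few lines: derive a slope map from a DTM and flag cells above a tolerance. The tiny grid and the 20° threshold are invented; LandSAfe's actual pipeline is far more involved.

```python
import numpy as np

# Toy 3x3 DTM [m]; np.gradient with the grid spacing gives per-axis
# elevation gradients, from which slope in degrees follows.
dtm = np.array([[10.0, 10.0, 10.0],
                [10.0, 10.1, 10.5],
                [10.0, 10.2, 11.0]])
cell = 1.0                                   # grid spacing [m]

dz_dy, dz_dx = np.gradient(dtm, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
hazard = slope_deg > 20.0                    # e.g. lander tolerates <= 20 degrees
print(np.round(slope_deg, 1))
print("hazardous cells:", int(hazard.sum()))
```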

  11. Conceptual issues with risk analysis in Switzerland

    Science.gov (United States)

    Nicolet, Pierrick; Jaboyedoff, Michel; Lévy, Sébastien

    2015-04-01

    Risk analysis is a tricky procedure, where one can easily make mistakes. Indeed, although risk equations are rather general, transferring a methodology to another context or hazard type can often lead to inaccuracies or even significant errors. To illustrate this, common mistakes made with the Swiss methodology are presented, together with possible solutions. These include the following. Risk analysis for moving objects only takes the process dimension into account (e.g. the length of a road section potentially affected by a landslide), but not the object dimension (e.g. the car's length). This is a fair simplification as long as the object dimension is considerably smaller than the process dimension. However, when the object is large compared to the process (e.g. rockfall blocks hitting a train), the results will be wrong. This problem can be illustrated by considering two blocks: according to the methodology, a 1 m diameter block would be twice as likely to reach a train as a 50 cm block, which is obviously not correct. In the rockfall risk analyses of roads or railways found in the literature, the block dimension is usually neglected in favour of the object dimension, which is a fair assumption in this context. However, it is possible to include both dimensions by using the sum of the lengths instead of one of them. Risk analysis is usually performed using 3 different scenarios, for 3 different ranges of return periods, namely 1-30, 30-100 and 100-300 years. In order to be conservative, the operator commonly considers the magnitude of the worst event that happens with a return period included between the class bounds, which means that the operator evaluates the magnitude reached or exceeded with a return period of 30, 100 and 300 years respectively. 
Then, since the magnitude corresponds to the upper bounds of the classes, risk is calculated using the frequency corresponding to these return periods and not to the middle of the class (and also subtracting the
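
    The sum-of-lengths fix mentioned above can be sketched as a toy encounter model; the train length, traffic and speed figures below are invented for illustration.

```python
# Fraction of a day during which a passing train overlaps the section
# a block can strike: the exposure window is the SUM of the process
# (block) and object (train) lengths. All numbers are illustrative.
def encounter_fraction(block_len_m, train_len_m, passages_per_day, speed_kmh):
    window = block_len_m + train_len_m          # combined exposure length [m]
    speed_ms = speed_kmh / 3.6
    occupied_s = window / speed_ms              # exposed time per passage [s]
    return passages_per_day * occupied_s / 86_400.0

# A 1 m block vs a 0.5 m block and a 200 m train: the ratio is
# (200 + 1.0) / (200 + 0.5), i.e. nearly 1, not a factor of two.
p_1m = encounter_fraction(1.0, 200.0, 50, 80.0)
p_05m = encounter_fraction(0.5, 200.0, 50, 80.0)
print(f"ratio: {p_1m / p_05m:.4f}")
```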

  12. Analysis of Operational Risks in Shipbuilding Industry

    Directory of Open Access Journals (Sweden)

    Daniela MATEI

    2012-11-01

    Full Text Available Our paper emphasizes the opportunities provided, both for academic research and for companies, by a proposed model for analyzing operational risks in business in general and in the shipbuilding industry in particular. The model aims to display the loss distribution from operational risk for each business line / type of event, based on frequency and severity estimation of the events. These estimates are derived mainly from the logs of internal loss events. The calculations extend over a certain period of time in the future with a certain level of confidence. It should also be mentioned that the proposed model estimates unexpected losses, without making any suppositions concerning the values of the expected and unexpected losses. Several ideas were extracted by analyzing and synthesizing the theoretical models available in the literature; these ideas were then used to develop a model for operational risk analysis adapted to shipbuilding. This paper describes this new model, which can be applied in the naval industry to quantify operational risks.
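
    The loss-distribution idea the model builds on can be sketched with a standard frequency-severity Monte Carlo; the Poisson and lognormal parameters below are invented, not estimates from any shipyard's loss log.

```python
import numpy as np

# Annual loss = sum of N event severities, N ~ Poisson (frequency),
# severities ~ lognormal; simulating many years yields the loss
# distribution and its tail. Parameters are illustrative.
rng = np.random.default_rng(7)
years = 20_000
n_events = rng.poisson(lam=4.0, size=years)             # events per year
annual_loss = np.array([
    rng.lognormal(mean=10.0, sigma=1.2, size=n).sum() for n in n_events
])

expected = annual_loss.mean()
q999 = np.percentile(annual_loss, 99.9)                 # 99.9% quantile
print(f"expected annual loss: {expected:,.0f}")
print(f"unexpected loss (99.9% quantile - mean): {q999 - expected:,.0f}")
```

The gap between the tail quantile and the mean is the "unexpected loss" the abstract refers to, obtained here without assuming its value in advance.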

  13. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative inf

  14. Proper alignment of the microscope.

    Science.gov (United States)

    Rottenfusser, Rudi

    2013-01-01

    The light microscope is merely the first element of an imaging system in a research facility. Such a system may include high-speed and/or high-resolution image acquisition capabilities, confocal technologies, and super-resolution methods of various types. Yet more than ever, the proverb "garbage in-garbage out" remains a fact. Image manipulations may be used to conceal a suboptimal microscope setup, but an artifact-free image can only be obtained when the microscope is optimally aligned, both mechanically and optically. Something else is often overlooked in the quest to get the best image out of the microscope: proper sample preparation! The microscope optics can only do its job when its design criteria are matched to the specimen, or vice versa. The specimen itself, the mounting medium, the cover slip, and the type of immersion medium (if applicable) are all part of the total optical makeup. To get the best results out of a microscope, it is important to understand the functions of all of its variable components. Only then does one know how to optimize these components for the intended application. Different approaches might be chosen to discuss all of the microscope's components. We decided to follow the light path, which starts with the light source and ends at the camera or the eyepieces. To make this sequence more transparent, the section up to the microscope stage is called the "Illuminating Section", followed by the "Imaging Section", which starts with the microscope objective. After understanding the various components, we can start working with the microscope. To get the best resolution and contrast from the microscope, the practice of "Koehler Illumination" should be understood and followed by every serious microscopist. Step-by-step instructions as well as illustrations of the beam path in an upright and inverted microscope are included in this chapter. A few practical considerations are listed in Section 3. Copyright © 2013 Elsevier Inc. All rights

  15. Empirical analysis on risk of security investment

    Institute of Scientific and Technical Information of China (English)

    AN Peng; LI Sheng-hong

    2009-01-01

    The paper analyzes the theory and application of the Markowitz Mean-Variance Model and the CAPM. Firstly, it explains the development process and standpoints of the two models and deduces the whole process in detail. Then 30 stocks are chosen from the Shangzheng 50 and tested to determine whether the prices of Shanghai stocks conform to the two models. Using time series and panel data analysis, stock risk and efficient portfolios are studied with the ORIGIN and MATLAB software. The result shows that the Shanghai stock market conforms to the Markowitz Mean-Variance Model to a certain extent and can give investors reliable guidance for earning higher returns, but there is no positive relation between systematic risk and profit ratio, and the CAPM does not function well in China's security market.
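The mean-variance machinery the paper tests can be sketched in a few lines. This is a hedged, hypothetical illustration (two assets with closed-form minimum-variance weights), not the authors' ORIGIN/MATLAB analysis; all variances and the covariance are made up.

```python
# Markowitz minimum-variance portfolio for two risky assets:
# minimize w' * Sigma * w subject to w1 + w2 = 1 (closed form in 2 dimensions).
# All numbers below are hypothetical, for illustration only.

def min_variance_weights(var1, var2, cov12):
    """Closed-form minimum-variance weights for two risky assets."""
    w1 = (var2 - cov12) / (var1 + var2 - 2.0 * cov12)
    return w1, 1.0 - w1

def portfolio_variance(w1, w2, var1, var2, cov12):
    """Variance of the two-asset portfolio with weights (w1, w2)."""
    return w1 * w1 * var1 + w2 * w2 * var2 + 2.0 * w1 * w2 * cov12

# Hypothetical annualized variances and covariance of two stocks.
var_a, var_b, cov_ab = 0.04, 0.09, 0.012
w_a, w_b = min_variance_weights(var_a, var_b, cov_ab)
print(w_a, w_b, portfolio_variance(w_a, w_b, var_a, var_b, cov_ab))
```

Diversification shows up directly: the minimum-variance portfolio's variance is below that of either asset alone.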

  16. Terminological Ontologies for Risk and Vulnerability Analysis

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2014-01-01

    Risk and vulnerability analyses are an important preliminary stage in civil contingency planning. The Danish Emergency Management Agency has developed a generic model and a set of tools that may be used in preparedness planning, i.e. for identifying and describing society’s critical functions, for formulating threat scenarios and for assessing consequences. Terminological ontologies, which are systems of domain-specific concepts comprising concept relations and characteristics, are useful both when describing the central concepts of risk and vulnerability analysis (meta concepts) and for further structuring and enriching the taxonomies of society’s critical functions and threats, which form an important part of the model. Creating terminological ontologies is time-consuming work, and therefore there is a need for automatic tools for the extraction of terms, concept relations and characteristics...

  17. Comparative analysis of seismic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyoo; Kim, Tae Woon; Hwang, Mi Jung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-07-01

    SRA (seismic risk assessment) methodologies are separated into SPSA (seismic probabilistic safety assessment) and SMM (seismic margin method). SPSA methodology, which has been widely used for seismic risk analysis, comprises two methods: the Zion method and the SSMRP method. SPSA methodology is suitable for interfacing with internal-event analysis. However, the results of SPSA carry uncertainties because of uncertainties in seismic hazard analysis and subjective judgement. The Zion method, developed specifically for commercial use, is less expensive and less time-consuming but more uncertain than the SSMRP method, since the former performs the fragility analysis in less detail than the latter. SMM cannot be interfaced with internal-event analysis, but the uncertainties arising during seismic hazard analysis are reduced by screening with an RLE (review level earthquake). Therefore, if an SPSA-based SMM methodology is chosen to be developed, the results of SRA will be more reliable while requiring lower costs and less time. In addition, the new methodology will require the development of a new evaluation code for SRA. (Author) 26 refs., 25 figs., 16 tabs.

  18. Nuclear risk analysis of the Ulysses mission

    Energy Technology Data Exchange (ETDEWEB)

    Bartram, B.W.; Vaughan, F.R. (NUS Corporation, 910 Clopper Road, Gaithersburg, Maryland 20877-0962 (USA)); Englehart, D.R.W. (Office of New Production Reactors, U.S. Department of Energy, Washington, D.C. 20585 (USA))

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDFs) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
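The factor-combination step described above can be sketched as follows. This is a hypothetical illustration of the general Monte Carlo/CCDF technique, not the FSAR model; the three multiplicative lognormal factors and their parameters are invented.

```python
# Sketch of a Monte Carlo CCDF: sample each factor affecting consequences from
# its (hypothetical) distribution, combine them by their functional relationship
# (here simply multiplicative), and estimate P(consequence >= x) from the sample.
import bisect
import random

random.seed(42)

def sample_consequence():
    source_term = random.lognormvariate(0.0, 0.5)   # hypothetical release factor
    dispersion  = random.lognormvariate(-1.0, 0.8)  # hypothetical atmospheric dilution
    dose_factor = random.lognormvariate(-2.0, 0.3)  # hypothetical dose conversion
    return source_term * dispersion * dose_factor

samples = sorted(sample_consequence() for _ in range(10000))

def ccdf(x):
    """Complementary CDF: fraction of sampled consequences >= x."""
    return 1.0 - bisect.bisect_left(samples, x) / len(samples)

print(ccdf(0.05), ccdf(0.5))
```

By construction the CCDF is non-increasing in the consequence level, which is the property the report's exceedance curves display.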

  19. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
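The hazard computation such a program performs can be sketched as below. This is a generic, hypothetical illustration of a seismic hazard integral (truncated Gutenberg-Richter magnitudes, a lognormal attenuation function), not McGuire's FORTRAN code; all coefficients are illustrative.

```python
# Annual rate of exceeding ground-motion level a at a site:
#   lambda(a) = nu * integral over m of f(m) * P(PGA > a | m, r) dm
# with a hypothetical lognormal attenuation ln(PGA) = c0 + c1*m - c2*ln(r).
import math

def p_exceed_given_m(a, m, r, c0=-3.0, c1=0.8, c2=1.0, sigma=0.6):
    """P(PGA > a | magnitude m, distance r km) under lognormal attenuation."""
    mean_ln = c0 + c1 * m - c2 * math.log(r)
    z = (math.log(a) - mean_ln) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(a, r=30.0, nu=0.2, b=1.0, m_min=5.0, m_max=7.5, dm=0.1):
    """Midpoint-rule integration over a truncated Gutenberg-Richter distribution."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    rate = 0.0
    m = m_min + dm / 2.0
    while m < m_max:
        pdf = beta * math.exp(-beta * (m - m_min)) / norm
        rate += nu * pdf * p_exceed_given_m(a, m, r) * dm
        m += dm
    return rate

print(annual_exceedance_rate(0.1), annual_exceedance_rate(0.3))
```

Higher ground-motion levels are exceeded less often, so the computed rate decreases as `a` grows, exactly the shape of a hazard curve.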

  20. 14 CFR 417.225 - Debris risk analysis.

    Science.gov (United States)

    2010-01-01

    ... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert and explosive debris hazards from any one flight of a launch vehicle satisfies the public risk criterion of...

  1. 7 CFR 29.112 - Proper light.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...

  2. Risk Analysis Approach to Rainwater Harvesting Systems

    Directory of Open Access Journals (Sweden)

    Nadia Ursino

    2016-08-01

    Full Text Available Urban rainwater reuse preserves water resources and promotes sustainable development in rapidly growing urban areas. The efficiency of a large number of urban water reuse systems, operating under different climate and demand conditions, is evaluated here on the basis of a new risk analysis approach. Results obtained by probability analysis (PA) indicate that maximum efficiency in low-demand scenarios is above 0.5, and a threshold distinguishing low- from high-demand scenarios indicates that in low-demand scenarios no significant improvement in performance may be attained by increasing the storage capacity of rainwater harvesting tanks. Threshold behaviour is displayed when tank storage capacity is designed to match both the average collected volume and the average reuse volume. The low-demand limit cannot be achieved under climate and operating conditions characterized by a disproportion between harvesting and demand volume.
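The kind of tank simulation underlying such an efficiency analysis can be sketched as follows. This is a hypothetical daily water-balance model, not the paper's PA method; the climate, roof area, and demand figures are invented.

```python
# Daily tank balance: harvested rainfall flows in (overflow spills), demand is
# withdrawn when water is available, and efficiency is the fraction of total
# demand that was actually supplied.
import random

def simulate_efficiency(storage_m3, demand_m3_per_day, roof_m2=200.0, days=3650):
    random.seed(1)  # identical synthetic weather for every run, so tank sizes compare fairly
    stored = 0.0
    supplied = 0.0
    for _ in range(days):
        # hypothetical climate: rain on 30% of days, exponentially distributed depth (m)
        rain_m = random.expovariate(1.0 / 0.004) if random.random() < 0.3 else 0.0
        stored = min(storage_m3, stored + rain_m * roof_m2)  # overflow spills
        use = min(stored, demand_m3_per_day)
        stored -= use
        supplied += use
    return supplied / (demand_m3_per_day * days)

for tank in (1.0, 5.0, 20.0):
    print(tank, round(simulate_efficiency(tank, 0.3), 3))
```

With the same weather series, efficiency is non-decreasing in storage capacity, and it saturates once the tank matches the harvestable volume, which is the threshold behaviour the abstract describes.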

  3. The proper name in the structure of the noun phrase

    Directory of Open Access Journals (Sweden)

    Aleksandra Pronińska

    2013-01-01

    Full Text Available The aim of this paper is to analyse Italian nominal groups containing proper nouns. The article discusses two syntagma types: elliptical structures of the type common noun + proper noun, as well as analogical structures where the common noun and the proper noun are joined by the preposition “di”. The classification and analysis of the nominal groups have been carried out on the basis of the function of the proper noun in the syntagma. Two groups of syntagmas have been distinguished: one with the proper noun functioning as the superordinate constituent and one with the proper noun functioning as a modifier. In the former type (where the proper noun is the superordinate constituent), the common noun functions as a descriptive or restrictive appositive. The syntagmas where the proper noun functions as a modifier are of particularly diverse character. In such cases, the possibility of paraphrasing them by means of the analysed structures constitutes an additional criterion. Consequently, three syntagma types have been distinguished, as represented by the following examples: il progetto Leonardo (which does not allow for an alternative synonymous version, *il progetto di Leonardo), il governo Monti (where the prepositional structure il governo di Monti may be used interchangeably), and un quadro di Rubens (which does not allow for an alternative synonymous version with the ellipsis of the preposition “di”, *un quadro Rubens).

  4. A Global Correction to PPMXL Proper Motions

    CERN Document Server

    Vickers, John J; Grebel, Eva K

    2016-01-01

    In this paper we note that extragalactic sources seem to have non-zero proper motions in the PPMXL proper motion catalog. We collect a large, all-sky sample of extragalactic objects and fit their reported PPMXL proper motions to an ensemble of spherical harmonics in magnitude shells. A magnitude-dependent proper motion correction is thus constructed. This correction is applied to a set of fundamental radio sources, quasars, and is compared to similar corrections to assess its utility. Along with this paper, we publish code which may be used to correct proper motions in the PPMXL catalog over the full sky for sources which have 2 Micron All Sky Survey photometry.
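The idea of a magnitude-dependent correction anchored on extragalactic sources can be sketched in a much-simplified form: below, only the mean quasar proper motion per magnitude shell is estimated and subtracted (the paper fits full spherical-harmonic patterns across the sky). All sample values are hypothetical.

```python
# Quasars are effectively at zero proper motion, so the mean quasar proper
# motion in each magnitude shell estimates the catalog's systematic offset
# there; subtracting it corrects other sources in the same shell.
from collections import defaultdict

def build_correction(quasars, shell_width=1.0):
    """quasars: list of (mag, pmra, pmdec). Returns per-shell mean offsets."""
    shells = defaultdict(list)
    for mag, pmra, pmdec in quasars:
        shells[int(mag / shell_width)].append((pmra, pmdec))
    return {k: (sum(p[0] for p in v) / len(v), sum(p[1] for p in v) / len(v))
            for k, v in shells.items()}

def correct(mag, pmra, pmdec, table, shell_width=1.0):
    """Subtract the shell offset from a source's catalog proper motion."""
    off = table.get(int(mag / shell_width), (0.0, 0.0))
    return pmra - off[0], pmdec - off[1]

# Hypothetical quasar sample with a systematic +2 mas/yr offset in pmra.
qsos = [(15.4, 2.1, -0.1), (15.7, 1.9, 0.1), (16.2, 2.0, 0.0)]
table = build_correction(qsos)
print(correct(15.5, 5.0, 1.0, table))
```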

  5. The Lorentzian proper vertex amplitude: Asymptotics

    CERN Document Server

    Engle, Jonathan; Zipfel, Antonia

    2015-01-01

    In previous work, the Lorentzian proper vertex amplitude for a spin-foam model of quantum gravity was derived. In the present work, the asymptotics of this amplitude are studied in the semi-classical limit. The starting point of the analysis is an expression for the amplitude as an action integral with action differing from that in the EPRL case by an extra `projector' term which scales linearly with spins only in the asymptotic limit. New tools are introduced to generalize stationary phase methods to this case. For the case of boundary data which can be glued to a non-degenerate Lorentzian 4-simplex, the asymptotic limit of the amplitude is shown to equal the single Feynman term, showing that the extra term in the asymptotics of the EPRL amplitude has been eliminated.

  6. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    In response to the challenges of conducting traditional human health and ecological risk assessment for nanomaterials (NM), a number of alternative frameworks have been proposed for NM risk analysis. This paper evaluates various risk analysis frameworks proposed for NM based on a number of criteria...

  7. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    Science.gov (United States)

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results.
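A stochastic CBA of the kind described can be sketched as below: Monte Carlo sampling of avoided disaster losses against a fixed intervention cost, reporting the expected benefit-cost ratio and the probability it exceeds 1. The model and all numbers are hypothetical, not the India/Pakistan case studies.

```python
# Stochastic cost-benefit sketch for a disaster risk reduction intervention:
# each run draws a sequence of annual avoided losses, discounts them to present
# value, and divides by the intervention cost to get a benefit-cost ratio (BCR).
import random

random.seed(7)

def annual_avoided_loss():
    # Hypothetical: a damaging flood occurs with 10% annual probability, and the
    # intervention avoids a lognormally distributed loss when it does.
    return random.lognormvariate(2.0, 0.8) if random.random() < 0.1 else 0.0

def simulate_bcr(cost=5.0, years=20, rate=0.05, runs=5000):
    bcrs = []
    for _ in range(runs):
        pv_benefit = sum(annual_avoided_loss() / (1 + rate) ** t
                         for t in range(1, years + 1))
        bcrs.append(pv_benefit / cost)
    mean_bcr = sum(bcrs) / len(bcrs)
    p_above_1 = sum(b > 1.0 for b in bcrs) / len(bcrs)
    return mean_bcr, p_above_1

print(simulate_bcr())
```

Reporting the full distribution of the BCR, rather than a single point estimate, is what makes the appraisal robust to the risk-estimation uncertainties the paper highlights.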

  8. Insomnia and risk of dementia in older adults: Systematic review and meta-analysis.

    Science.gov (United States)

    de Almondes, Katie Moraes; Costa, Mônica Vieira; Malloy-Diniz, Leandro Fernandes; Diniz, Breno Satler

    2016-06-01

    There is cross-sectional evidence of an association between sleep disorders and cognitive impairment in older adults. However, there is no consensus, based on longitudinal data, on the increased risk of developing dementia related to insomnia. We conducted a systematic review and meta-analysis to evaluate the risk of incident all-cause dementia in individuals with insomnia in population-based prospective cohort studies. Five studies, from 5,242 retrieved references, were included in the meta-analysis. We used the generic inverse variance method with a random effects model to calculate the pooled risk of dementia in older adults with insomnia. We assessed heterogeneity in the meta-analysis by means of the Q-test and the I2 index. Study quality was assessed with the Newcastle-Ottawa Scale. The results showed that insomnia was associated with a significant risk of all-cause dementia (RR = 1.53, 95% CI 1.07-2.18, z = 2.36, p = 0.02). There was evidence of significant heterogeneity in the analysis (q-value = 2.4). Insomnia is associated with an increased risk of dementia. These results provide evidence that future studies should investigate dementia prevention among elderly individuals through screening and proper management of insomnia.
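The generic inverse-variance random-effects pooling mentioned above can be sketched in its DerSimonian-Laird form as follows; the study log risk ratios and standard errors below are hypothetical, not the five studies from this review.

```python
# DerSimonian-Laird random-effects meta-analysis on the log risk ratio scale:
# fixed-effect inverse-variance weights give Cochran's Q, Q yields the
# between-study variance tau^2, and tau^2 re-weights the pooled estimate.
import math

def random_effects_pool(log_rr, se):
    w = [1.0 / s ** 2 for s in se]                               # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))   # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                # between-study variance
    w_star = [1.0 / (s ** 2 + tau2) for s in se]                 # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci, q, tau2

rr, ci, q, tau2 = random_effects_pool(
    log_rr=[0.30, 0.55, 0.10, 0.45, 0.70], se=[0.15, 0.20, 0.25, 0.18, 0.30])
print(rr, ci)
```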

  9. Multiple Sclerosis Increases Fracture Risk: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Guixian Dong

    2015-01-01

    Full Text Available Purpose. The association between multiple sclerosis (MS) and fracture risk has been reported, but results of previous studies remain controversial and ambiguous. To assess the association between MS and fracture risk, a meta-analysis was performed. Method. Based on comprehensive searches of PubMed, Embase, and Web of Science, we identified outcome data from all articles estimating the association between MS and fracture risk. The pooled risk ratios (RRs) with 95% confidence intervals (CIs) were calculated. Results. A significant association between MS and fracture risk was found. This result remained statistically significant when the adjusted RRs were combined. Subgroup analysis stratified by the site of fracture suggested significant associations between MS and tibia fracture risk, femur fracture risk, hip fracture risk, pelvis fracture risk, vertebrae fracture risk, and humerus fracture risk. In the subgroup analysis by gender, female MS patients had increased fracture risk. When stratified by history of drug use, use of antidepressants, hypnotics/anxiolytics, anticonvulsants, and glucocorticoids increased the fracture risk in MS patients. Conclusions. This meta-analysis demonstrated that MS was significantly associated with fracture risk.

  10. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  11. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    Science.gov (United States)

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  12. Putting problem formulation at the forefront of GMO risk analysis.

    Science.gov (United States)

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  13. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
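The probability-of-sliding estimation discussed in the 2011 workshop can be sketched, in toy form, as a Monte Carlo over strength parameters; the limit state and all loads below are hypothetical and far simpler than the benchmark models.

```python
# Toy Monte Carlo for a gravity dam's sliding failure mode: sample friction and
# cohesion, compute the factor of safety (resisting / driving force), and count
# the fraction of runs with FS < 1. All parameters are hypothetical.
import random

random.seed(3)

def factor_of_safety(tan_phi, cohesion_kpa, weight_mn=120.0, uplift_mn=40.0,
                     horiz_mn=55.0, base_area_m2=600.0):
    """Shear-friction factor of safety against sliding on the base plane."""
    resisting = (weight_mn - uplift_mn) * tan_phi + cohesion_kpa * base_area_m2 / 1000.0
    return resisting / horiz_mn

def sliding_failure_probability(runs=20000):
    fails = 0
    for _ in range(runs):
        tan_phi = random.gauss(0.7, 0.1)       # hypothetical friction coefficient
        cohesion = random.gauss(100.0, 40.0)   # hypothetical cohesion, kPa
        if factor_of_safety(tan_phi, max(cohesion, 0.0)) < 1.0:
            fails += 1
    return fails / runs

print(sliding_failure_probability())
```

Even though the mean factor of safety is well above 1, the tails of the strength distributions produce a small but non-zero failure probability, which is exactly what a deterministic FS check cannot reveal.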

  14. Seismic vulnerability assessments in risk analysis

    Science.gov (United States)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for the development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to the OSR-97B (return period T=1,000 years) within big cities and towns, they were divided into unit sites and their coordinates were presented as dots located in the centers of unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in

  15. A remark on proper partitions of unity

    CERN Document Server

    Calcines, Jose M Garcia

    2011-01-01

    In this paper we introduce, by means of the category of exterior spaces and using a process that generalizes the Alexandroff compactification, an analogue notion of numerable covering of a space in the proper and exterior setting. An application is given for fibrewise proper homotopy equivalences.

  16. Risk analysis of colorectal cancer incidence by gene expression analysis

    Science.gov (United States)

    Shangkuan, Wei-Chuan; Lin, Hung-Che; Chang, Yu-Tien; Jian, Chen-En; Fan, Hueng-Chuen; Chen, Kang-Hua; Liu, Ya-Fang; Hsu, Huan-Ming; Chou, Hsiu-Ling; Yao, Chung-Tay

    2017-01-01

    Background Colorectal cancer (CRC) is one of the leading cancers worldwide. Several studies have performed microarray data analyses for cancer classification and prognostic analyses. Microarray assays also enable the identification of gene signatures for molecular characterization and treatment prediction. Objective Microarray gene expression data from the online Gene Expression Omnibus (GEO) database were used to distinguish colorectal cancer from normal colon tissue samples. Methods We collected microarray data from the GEO database to establish colorectal cancer microarray gene expression datasets for a combined analysis. Using the Prediction Analysis for Microarrays (PAM) method and the GSEA MSigDB resource, we analyzed the 14,698 genes that were identified through an examination of their expression values between normal and tumor tissues. Results Ten genes (ABCG2, AQP8, SPIB, CA7, CLDN8, SCNN1B, SLC30A10, CD177, PADI2, and TGFBI) were found to be good indicators of the candidate genes that correlate with CRC. From these selected genes, an average of six significant genes were obtained using the PAM method, with an accuracy rate of 95%. The results demonstrate the potential of utilizing a model with the PAM method for data mining. After a detailed review of the published reports, the results confirmed that the screened candidate genes are good indicators for cancer risk analysis using the PAM method. Conclusions Six genes were selected with 95% accuracy to effectively classify normal and colorectal cancer tissues. We hope that these results will provide the basis for new research projects in clinical practice that aim to rapidly assess colorectal cancer risk using microarray gene expression analysis. PMID:28229027
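The PAM (nearest shrunken centroid) idea can be sketched in bare-bones form: class centroids are soft-thresholded toward the overall centroid, so uninformative genes drop out, and samples are assigned to the nearest shrunken centroid. This sketch omits PAM's within-class standardization, and the tiny expression "dataset" is hypothetical, not GEO data.

```python
# Minimal nearest-shrunken-centroid sketch: shrink each class centroid's
# deviation from the overall centroid by a soft threshold delta, then classify
# a sample by squared distance to the surviving centroids.

def shrunken_centroids(X, y, delta):
    classes = sorted(set(y))
    n_genes = len(X[0])
    overall = [sum(row[j] for row in X) / len(X) for j in range(n_genes)]
    cents = {}
    for c in classes:
        rows = [row for row, label in zip(X, y) if label == c]
        cent = [sum(r[j] for r in rows) / len(rows) for j in range(n_genes)]
        shrunk = []
        for j in range(n_genes):
            d = cent[j] - overall[j]
            d = max(abs(d) - delta, 0.0) * (1 if d > 0 else -1)  # soft threshold
            shrunk.append(overall[j] + d)
        cents[c] = shrunk
    return cents

def classify(x, cents):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda c: dist2(x, cents[c]))

# Hypothetical expression values: gene 1 separates tumor from normal tissue.
X = [[5.0, 1.0], [5.2, 0.9], [1.0, 1.1], [0.8, 1.0]]
y = ["tumor", "tumor", "normal", "normal"]
cents = shrunken_centroids(X, y, delta=0.05)
print(classify([4.8, 1.0], cents), classify([0.9, 1.0], cents))
```

After shrinkage, the second gene's centroids coincide with the overall mean, so only the informative gene drives the classification, which is how PAM arrives at a small signature such as the six genes reported here.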

  17. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    Energy Technology Data Exchange (ETDEWEB)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky [Department of Mechanical Engineering, Diponegoro University, Semarang (Indonesia); Kim, Seon Jin [Department of Mechanical & Automotive Engineering of Pukyong National University (Korea, Republic of)

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG power plant to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG equipment item was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the semi-quantitative risk assessment based on the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing risk assessment activities.

  18. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    Science.gov (United States)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG power plant to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG equipment item was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the semi-quantitative risk assessment based on the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing risk assessment activities.
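The semi-quantitative ranking can be sketched as a probability-consequence matrix lookup. The combination rule below is a hypothetical simplification chosen to be consistent with the ratings quoted in the abstract (4C medium-high, 3C medium); it is not the normative API 581 matrix.

```python
# Toy 5x5 risk matrix lookup: probability category 1-5 crossed with consequence
# category A-E yields a qualitative risk level. The scoring rule is illustrative.

def risk_level(prob_cat, cons_cat):
    cons_index = "ABCDE".index(cons_cat) + 1   # A=1 ... E=5
    score = prob_cat + cons_index              # hypothetical combination rule
    if score <= 4:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium high"
    return "high"

# The abstract's ratings: HP superheater/evaporator 4C, HP economizer 3C.
print(risk_level(4, "C"), risk_level(3, "C"))
```

The point of the matrix is to make the ranking reproducible: any equipment item with the same probability and consequence categories lands in the same cell, so inspection effort can be prioritized cell by cell.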

  19. Spinfoam Cosmology with the Proper Vertex

    Science.gov (United States)

    Vilensky, Ilya

    2017-01-01

    A modification of the EPRL vertex amplitude in the spin-foam framework of quantum gravity - so-called ``proper vertex amplitude'' - has been developed to enable correct semi-classical behavior to conform to the classical Regge calculus. The proper vertex amplitude is defined by projecting to the single gravitational sector. The amplitude is recast into an exponentiated form and we derive the asymptotic form of the projector part of the action. This enables us to study the asymptotics of the proper vertex by applying extended stationary phase methods. We use the proper vertex amplitude to investigate transition amplitudes between coherent quantum boundary states of cosmological geometries. In particular, Hartle-Hawking no-boundary states are computed in the proper vertex framework. We confirm that in the classical limit the Hartle-Hawking wavefunction satisfies the Hamiltonian constraint. Partly supported by NSF grants PHY-1205968 and PHY-1505490.

  20. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  1. KINERJA PENGELOLAAN LIMBAH HOTEL PESERTA PROPER DAN NON PROPER DI KABUPATEN BADUNG, PROVINSI BALI

    Directory of Open Access Journals (Sweden)

    Putri Nilakandi Perdanawati Pitoyo

    2016-07-01

    Full Text Available Bali tourism development can lead to positive and negative impacts that threaten environmental sustainability. This research evaluates the waste management performance of hotels, covering the management of wastewater, emissions, hazardous waste, and solid waste, for hotels that participate in PROPER and for non-PROPER hotels. The research uses a qualitative descriptive method. Not all non-PROPER hotels test wastewater quality and chimney emission quality, inventory hazardous waste, or sort solid waste. Wastewater discharge of PROPER hotels ranged from 290.9 to 571.8 m3/day and of non-PROPER hotels from 8.4 to 98.1 m3/day, with NH3 parameter values that exceed the quality standards. The quality of chimney emissions was still below the quality standard. The volume of hazardous waste from PROPER hotels ranged from 66.1 to 181.9 kg/month and from non-PROPER hotels from 5.003 to 103.42 kg/month. Hazardous waste from the PROPER hotels has been stored in the TPS hazardous waste storage. The volume of solid waste from PROPER hotels ranged from 342.34 to 684.54 kg/day and from non-PROPER hotels from 4.83 to 181.51 kg/day. Neither the PROPER nor the non-PROPER hotels sort their solid waste. Overall, hotel performance in terms of wastewater, emission, hazardous waste, and solid waste management is better at the PROPER hotels than at the non-PROPER participants.

  2. A comparative analysis of risk and quality

    DEFF Research Database (Denmark)

    Lynette, Jennifer Elyse

    2017-01-01

    Within the field of emergency management and fire response, risk and quality are conceptualized to some degree in every response effort. Quality is viewed as a relatively new concept within the field, whereas the concept of risk ... independently, decision making and judgement processes have the potential to be positively impacted by furthering research and developing a deeper understanding of these constructs. By understanding risk management principles and combining them with a quality systems approach, decision making can be improved ... Of interest to the field of emergency management and fire response are the methods used by risk research to measure risk in the field, and how that concept can be utilized for future research focusing on quality in the same field, by analyzing the subjective assessments that involve risk and quality...

  3. Essays on Systemic Risk : An analysis from multiple perspectives

    NARCIS (Netherlands)

    S. Muns (Sander)

    2016-01-01

    This thesis is about systemic risk in the financial sector. It considers several aspects of systemic risk. It is a building block for an analysis of the impact of systemic risk on the real economy. It appears that stocks in the financial industry show a strong interdependence comp...

  4. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The organizational risk analyzer (ORA) tool system is selected to study the network of East Turkistan terrorists. The relationships among the organization's personnel, knowledge, resources, and task entities are represented by the meta-matrix in ORA, which is used to analyze the risks and vulnerabilities of the organizational structure quantitatively and to obtain the vulnerabilities and risks of the organization. A case study in this system shows that it offers a shortcut to effectively destroying the network...

  5. Mixture Distribution Approach In Financial Risk Analysis

    OpenAIRE

    Kocak, Keziban; Calis, Nazif; Unal, Deniz

    2014-01-01

    In recent years, major changes in stock exchange prices have made it necessary to measure financial risk. Nowadays, Value-at-Risk (VaR) is often used to calculate financial risk. Parametric methods, which require normality, are mostly used in the calculation of VaR. If the financial data do not fit the normal distribution, mixture-of-normal-distribution models can be fitted to the data. In this study, the financial risk is calculated by using a normal mixture distribution ...
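To illustrate the approach (with invented mixture parameters, not the study's fitted values): the VaR of a loss distribution modeled as a normal mixture is the α-quantile of the mixture CDF, which can be found by bisection since the CDF is monotone:

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    # normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_var(weights, mus, sigmas, alpha=0.95):
    """alpha-quantile (Value-at-Risk) of a normal-mixture loss
    distribution, located by bisection on the mixture CDF."""
    def cdf(x):
        return sum(w * norm_cdf(x, m, s)
                   for w, m, s in zip(weights, mus, sigmas))
    lo, hi = -100.0, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A calm regime mixed with a volatile one produces a heavier tail
# (hence a larger 95% VaR) than a single normal with the same mean.
var_95 = mixture_var([0.9, 0.1], [0.0, 0.0], [1.0, 4.0], alpha=0.95)
```

With a single standard-normal component the function recovers the familiar parametric quantile (about 1.645 at 95%), while the two-component mixture above pushes the quantile higher.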

  6. Analysis and management of risks experienced in tunnel construction

    Directory of Open Access Journals (Sweden)

    Cagatay Pamukcu

    2015-12-01

    Full Text Available In this study, first of all, the terms "risk", "risk analysis", "risk assessment" and "risk management" were defined to avoid any confusion, and the significance of risk analysis and management in engineering projects was emphasized. Then, both qualitative and quantitative risk analysis techniques were reviewed and, within the scope of the study, the Event Tree Analysis method was selected in order to analyze the risks of TBM (Tunnel Boring Machine) operations in tunnel construction. After all hazards that could be encountered during tunnel construction by the TBM method had been investigated, those hazards were subjected to a Preliminary Hazard Analysis to sort out and prioritize the risks with high scores. When the risk scores were considered, it was seen that the hazards with high risk scores could be classified into 4 groups: excavation- and support-induced accidents, accidents stemming from geologic conditions, auxiliary works, and project contract. For these four groups of initiating events, Event Tree Analysis was conducted, considering 4 distinct countermeasures. Finally, the quantitative and qualitative consequences of the Event Tree Analyses undertaken for all initiating events were examined and interpreted together, making comparisons and referring to previous studies.
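The arithmetic behind an event tree is a branch-probability expansion: an initiating-event frequency is split repeatedly by the success/failure probability of each barrier. The barrier names and numbers below are illustrative, not taken from the study:

```python
def event_tree_outcomes(initiating_freq, barriers):
    """Expand an event tree: each barrier (name, p_success) splits
    every existing branch into a success path and a failure path.
    Returns a map from outcome label to frequency; the frequencies
    sum to the initiating-event frequency."""
    outcomes = {"": initiating_freq}
    for name, p_success in barriers:
        expanded = {}
        for label, freq in outcomes.items():
            expanded[label + name + "=ok;"] = freq * p_success
            expanded[label + name + "=fail;"] = freq * (1.0 - p_success)
        outcomes = expanded
    return outcomes

# Invented example: a TBM-related initiating event at 0.1/yr,
# mitigated by two countermeasures with 90% and 80% success rates.
tree = event_tree_outcomes(0.1, [("support", 0.9), ("drainage", 0.8)])
worst = tree["support=fail;drainage=fail;"]  # 0.1 * 0.1 * 0.2
```

The worst end state (both barriers fail) occurs at 0.1 × 0.1 × 0.2 = 0.002 per year; comparing end-state frequencies is what lets the method rank the four countermeasure options.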

  7. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection...... and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring...... of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure....
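The classical RPN and the probabilistic replacement described above can be contrasted in a few lines; the scores and probabilities are illustrative:

```python
def rpn(occurrence, detection, severity):
    # classical FMEA: three categorical scores (e.g. 1-10) multiplied
    return occurrence * detection * severity

def p_undetected(p_occur, p_detect):
    # probabilistic modification: estimated relative frequency of the
    # failure mode occurring, times the chance detection misses it
    # (severity stays categorical and is handled separately)
    return p_occur * (1.0 - p_detect)

def p_any_undetected(pairs):
    # probability that at least one failure mode in a set (or in the
    # full analytical procedure) occurs and goes undetected,
    # assuming independent failure modes
    p_none = 1.0
    for p_occur, p_detect in pairs:
        p_none *= 1.0 - p_undetected(p_occur, p_detect)
    return 1.0 - p_none
```

Unlike the categorical RPN, the probabilistic quantities compose: per-mode probabilities combine into a procedure-level figure, which is the quantitative estimate the abstract refers to.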

  8. An Analysis on the Intentionality of Proper Names and the Context——Comments on Searle's Theory of Proper Name%专名的意向性分析和语境——评塞尔的专名理论

    Institute of Scientific and Technical Information of China (English)

    赵亮英

    2012-01-01

    Along with the development of his theory of intentionality, Searle's study of proper names shifted from the theory of cluster descriptions to a theory of cluster intentional contents, and his research direction correspondingly moved from the philosophy of language to the philosophy of mind. Searle believes that the sense of a proper name is its intentional content and that its reference is determined by a causal-historical chain of intentional contents. This paper shows that because Searle neglects the factor of context, and because intentional contents are psychological in essence and therefore personal, the meaning of a proper name loses its public character in his theory. Moreover, by comparing and evaluating the causal-historical theories of Searle, Kripke and Chen Xiaoping, the paper shows the necessity of combining the two kinds of causal-historical chain within a context.

  9. Assessment report on NRP sub-theme 'Risk Analysis'

    Energy Technology Data Exchange (ETDEWEB)

    Biesiot, W.; Hendrickx, L. [eds.] [University of Groningen, Center for Energy and Environmental Studies, Groningen (Netherlands); Van Ham, J. [TNO Institute for Environmental Sciences, Delft (Netherlands); Olsthoorn, A.A. [VUA, Free University of Amsterdam, Amsterdam (Netherlands)

    1995-12-31

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs.

  10. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    Science.gov (United States)

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. © 2015 Society for Risk Analysis.

  11. Malaria Prevention, Mefloquine Neurotoxicity, Neuropsychiatric Illness, and Risk-Benefit Analysis in the Australian Defence Force

    Directory of Open Access Journals (Sweden)

    Stuart McCarthy

    2015-01-01

    Full Text Available The Australian Defence Force (ADF has used mefloquine for malaria chemoprophylaxis since 1990. Mefloquine has been found to be a plausible cause of a chronic central nervous system toxicity syndrome and a confounding factor in the diagnosis of existing neuropsychiatric illnesses prevalent in the ADF such as posttraumatic stress disorder and traumatic brain injury. Overall health risks appear to have been mitigated by restricting the drug’s use; however serious risks were realised when significant numbers of ADF personnel were subjected to clinical trials involving the drug. The full extent of the exposure, health impacts for affected individuals, and consequences for ADF health management including mental health are not yet known, but mefloquine may have caused or aggravated neuropsychiatric illness in large numbers of patients who were subsequently misdiagnosed and mistreated or otherwise failed to receive proper care. Findings in relation to chronic mefloquine neurotoxicity were foreseeable, but this eventuality appears not to have been considered during risk-benefit analyses. Thorough analysis by the ADF would have identified this long-term risk as well as other qualitative risk factors. Historical exposure of ADF personnel to mefloquine neurotoxicity now also necessitates ongoing risk monitoring and management in the overall context of broader health policies.

  12. Malaria Prevention, Mefloquine Neurotoxicity, Neuropsychiatric Illness, and Risk-Benefit Analysis in the Australian Defence Force.

    Science.gov (United States)

    McCarthy, Stuart

    2015-01-01

    The Australian Defence Force (ADF) has used mefloquine for malaria chemoprophylaxis since 1990. Mefloquine has been found to be a plausible cause of a chronic central nervous system toxicity syndrome and a confounding factor in the diagnosis of existing neuropsychiatric illnesses prevalent in the ADF such as posttraumatic stress disorder and traumatic brain injury. Overall health risks appear to have been mitigated by restricting the drug's use; however serious risks were realised when significant numbers of ADF personnel were subjected to clinical trials involving the drug. The full extent of the exposure, health impacts for affected individuals, and consequences for ADF health management including mental health are not yet known, but mefloquine may have caused or aggravated neuropsychiatric illness in large numbers of patients who were subsequently misdiagnosed and mistreated or otherwise failed to receive proper care. Findings in relation to chronic mefloquine neurotoxicity were foreseeable, but this eventuality appears not to have been considered during risk-benefit analyses. Thorough analysis by the ADF would have identified this long-term risk as well as other qualitative risk factors. Historical exposure of ADF personnel to mefloquine neurotoxicity now also necessitates ongoing risk monitoring and management in the overall context of broader health policies.

  13. Proper Time in Weyl space-time

    CERN Document Server

    Avalos, R; Romero, C

    2016-01-01

    We discuss the question of whether or not a general Weyl structure is a suitable mathematical model of space-time. This is an issue that has been in debate since Weyl formulated his unified field theory for the first time. We do not present the discussion from the point of view of a particular unification theory, but instead from a more general standpoint, in which the viability of such a structure as a model of space-time is investigated. Our starting point is the well-known axiomatic approach to space-time given by Ehlers, Pirani and Schild (EPS). In this framework, we carry out an exhaustive analysis of what is required for a consistent definition of proper time and show that such a definition leads to the prediction of the so-called "second clock effect". We take the view that if, based on experience, we were to reject space-time models predicting this effect, this could be incorporated as the last axiom in the EPS approach. Finally, we provide a proof that, in this case, we are led to a Weyl integrable ...

  14. Operational Risk Management A Practical Approach to Intelligent Data Analysis

    CERN Document Server

    Kenett, Ron

    2010-01-01

    The book introduces modern Operational Risk (OpR) Management and illustrates the various sources of OpR assessment and OpR mitigation. It discusses how various data sources can be integrated and analyzed, and how OpR is synergistic with other risk management activities such as Financial Risk Management and internationalization. Topics include state-of-the-art technology such as semantic analysis, ontology engineering, data mining and statistical analysis.

  15. Computation of Asteroid Proper Elements: Recent Advances

    Science.gov (United States)

    Knežević, Z.

    2017-06-01

    The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.

  16. Proper Handling and Storage of Human Milk

    Science.gov (United States)

    ... Be sure to wash your hands before expressing or handling breast milk. When collecting milk, be sure to ...

  17. Proper holomorphic mappings between hyperbolic product manifolds

    CERN Document Server

    Janardhanan, Jaikrishnan

    2011-01-01

    We generalize a result of Remmert and Stein, on proper holomorphic mappings between domains that are products of certain planar domains, to finite proper holomorphic mappings between complex manifolds that are products of hyperbolic Riemann surfaces. While an important special case of our result follows from the ideas developed by Remmert and Stein, our proof of the full result relies on the interplay of the latter ideas and a finiteness theorem for Riemann surfaces.

  18. Analysis of Operational Risks in Shipbuilding Industry

    National Research Council Canada - National Science Library

    Daniela MATEI; Mioara CHIRITA

    2012-01-01

    Our paper emphasizes the opportunities provided both for academic research and for companies by using a proposed model for analyzing the operational risks within business in general and shipbuilding...

  19. Ultraviolet exposure from indoor tanning devices as a potential source of health risks: Basic knowledge of the proper use of these devices for practical users, physicians and solarium staff

    Directory of Open Access Journals (Sweden)

    Jolanta Malinowska-Borowska

    2017-10-01

    Full Text Available Bearing in mind the adverse health effects of exposure to ultraviolet (UV) radiation in solarium, especially the risk of carcinogenesis, there is a need to adopt legal regulations by relevant Polish authorities. They should set out the principles for indoor tanning studios operation, supervision and service of the technical parameters of tanning devices and training programs to provide the staff with professional knowledge and other aspects of safety in these facilities. The mechanism of the harmful effects of ultraviolet radiation on the human body, scale of overexposure, resulting from excessive sunbathing are described. Methods for estimating UV exposure and possible actions aimed at reducing the overexposure and preventing from cancer development caused by UV are also presented in this paper. Med Pr 2017;68(5):653–665

  20. [Ultraviolet exposure from indoor tanning devices as a potential source of health risks: Basic knowledge of the proper use of these devices for practical users, physicians and solarium staff].

    Science.gov (United States)

    Malinowska-Borowska, Jolanta; Janosik, Elżbieta

    2017-07-26

    Bearing in mind the adverse health effects of exposure to ultraviolet (UV) radiation in solarium, especially the risk of carcinogenesis, there is a need to adopt legal regulations by relevant Polish authorities. They should set out the principles for indoor tanning studios operation, supervision and service of the technical parameters of tanning devices and training programs to provide the staff with professional knowledge and other aspects of safety in these facilities. The mechanism of the harmful effects of ultraviolet radiation on the human body, scale of overexposure, resulting from excessive sunbathing are described. Methods for estimating UV exposure and possible actions aimed at reducing the overexposure and preventing from cancer development caused by UV are also presented in this paper. Med Pr 2017;68(5):653-665. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  1. The Network's Data Security Risk Analysis

    Directory of Open Access Journals (Sweden)

    Emil BURTESCU

    2008-01-01

    Full Text Available Establishing a network's security risk can be a very difficult operation, especially for small companies which, for financial reasons, cannot call on specialists in this domain, or for medium or large companies that lack experience. The following method proposes to determine the loss level and the value of impact without complex financial calculus, making the determination of the risk level a lot easier.

  2. The Operational Risk – Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Gabriela Victoria Anghelache

    2009-11-01

    Full Text Available In many cases operational risks tend to be underestimated, on the assumption that the losses they cause are generally minor and cannot threaten the survival of a bank. Losses resulting from these events arise from a complex interaction between organizational, personnel and market factors that do not fit into a simple classification scheme. Observing what happened in the past, we can say that operational risk is an important source of financial losses in the banking sector.

  3. Stress Analysis in Managing the Region’s Budget Risks

    Directory of Open Access Journals (Sweden)

    Natalya Pavlovna Pazdnikova

    2014-09-01

    Full Text Available The article addresses the implementation of budget risk management methods in the practice of governmental authorities. Drawing on the example of a particular region, the article aims to demonstrate possible methods of budget risk management. The authors refine the existing approaches to the notion of risk in its relation to the budget system by introducing the notion of "budget risk." Here the focus is the risk that budget spending is not executed in full, which causes underfunding of territories and a decrease in quality of life in the region. The authors have particularized the classification of budget risks and grouped together the criteria and factors which significantly influence the assessment and choice of method to manage budget risks. They hypothesize that budget risk is a financial risk, and therefore that the methods of financial risk management can be applied to budget risk management. The authors suggest a methodological approach to risk assessment based on correlation and regression analysis of program financing. Applying the Kendall rank correlation coefficient made it possible to assess the efficiency of budget spending on the implementation of state programs in Perm Krai. Two clusters, "Nature management and infrastructure" and "Public security," turned out to be in the zone of high budget risk. The method of stress analysis, which consists in calculating Value at Risk (VaR), was applied to budget risks that in terms of probability are classified as critical. In order to assess risk as a probability rate, the Perm Krai budget deficit was calculated as a variable induced from budget revenues and spending. The results demonstrate that contemporary management of public resources in the regions calls for the implementation of new, higher-quality management tools, and budget risk management is one of them.
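The Kendall step can be sketched directly from its definition: concordant minus discordant pairs, divided by the total number of pairs. This ties-free sketch uses invented data, not the Perm Krai figures:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation coefficient (tau-a, assuming no
    ties): (concordant - discordant) / total number of pairs."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        sign = (x[i] - x[j]) * (y[i] - y[j])
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) / 2
    return (concordant - discordant) / n_pairs

# Invented example: program financing vs. execution rate. A tau
# near 1 would suggest spending tracks allocations closely,
# i.e. lower budget risk for that program cluster.
tau = kendall_tau([10, 20, 30, 40], [0.90, 0.92, 0.95, 0.99])
```

Because tau depends only on rank order, it is robust to the skewed financing amounts typical of program budgets, which is presumably why a rank correlation was chosen over Pearson's coefficient.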

  4. ANALYSIS MODEL USING ROBU MIRONIUC IN PREDICTING RISK OF BANKRUPTCY ROMANIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    ŞTEFĂNIŢĂ ŞUŞU

    2014-08-01

    Full Text Available Bankruptcy risk has been the subject of many research studies that aim to identify the moment of bankruptcy, the factors that contribute to this state, and the indicators that best express this orientation (toward bankruptcy). The threats to enterprises require managers to continuously know the economic and financial situation, including the vulnerable areas and those with development potential. Managers need to identify and properly manage the threats that would prevent them from achieving their targets. Among the methods known in the literature for assessing and evaluating bankruptcy risk are static, functional, strategic and scoring methods. Analysis by scoring methods is usually carried out by banks when assessing creditworthiness, when a company asks for a bank loan. Each bank has its own analysis, including a score calculated internally based on indicators defined in its credit manual. To allow national comparability, however, a scoring system should be based on data that are public or available to all stakeholders. In this article, the Robu-Mironiuc model is used to predict bankruptcy risk over the 2009-2013 benchmark period. The source of information is the profit and loss account and the balance sheet of two companies listed on the Bucharest Stock Exchange (Turism Covasna and Dorna Turism). The results of the analysis are interpreted while trying to formulate solutions for the economic and financial viability of the entities.

  5. A Literature Review on Risk Analysis of Production Location Decisions

    OpenAIRE

    Dadpouri, Mohammad; Nunna, Kiran

    2011-01-01

    This report is the result of a master thesis with a focus on risk analysis of production location decisions. The project is a part of "PROLOC-manufacturing footprint during the product's life cycle". The main aim of this thesis is to point out how currently applicable risk analysis techniques evaluate the risks involved in production location decisions, to underline the most important of those risks, and to elicit the strengths and weaknesses of these methods. A sys...

  6. Maternal migration and autism risk: systematic analysis.

    Science.gov (United States)

    Crafa, Daina; Warfa, Nasir

    2015-02-01

    Autism (AUT) is one of the most prevalent developmental disorders emerging during childhood, and can be amongst the most incapacitating mental disorders. Some individuals with AUT require a lifetime of supervised care. Autism Speaks reported estimated costs for 2012 at £34 billion in the UK and $3.2 million-$126 billion in the US, Australia and Canada. Ethnicity and migration experiences appear to increase risks of AUT and relate to underlying biological risk factors. Sociobiological stress factors can affect the uterine environment, or relate to stress-induced epigenetic changes during pregnancy and delivery. Epigenetic risk factors associated with AUT also include poor pregnancy conditions, low birth weight, and congenital malformation. Recent studies report that children from migrant communities are at higher risk of AUT than children born to non-migrant mothers, with the exception of Hispanic children. This paper provides the first systematic review into the prevalence and predictors of AUT, with a particular focus on maternal migration stressors and epigenetic risk factors. AUT rates appear higher in certain migrant communities, potentially relating to epigenetic changes after stressful experiences. Although AUT remains a rare disorder, failures to recognize its public health urgency and local community needs continue to leave certain cultural groups at a disadvantage.

  7. [Maintaining the proper distance for nurses working in the home].

    Science.gov (United States)

    Estève, Sonia

    2016-01-01

    Health professionals must be able to respond to many different situations which require technical knowledge and self-control. Particularly when working in the patient's home, nurses must know how to maintain a proper distance to protect themselves from burnout. In this respect, practice analysis constitutes a well-suited support tool. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  8. Proper Living - Exploring Domestic Ideals in Medieval Denmark

    DEFF Research Database (Denmark)

    Kristiansen, Mette Svart

    2014-01-01

    ..., and ornaments. This paper addresses ideas of proper living in affluent rural and urban milieus in medieval Denmark, particularly as they are expressed through houses, inventories, and murals, and it also addresses current challenges in understanding the materialized ideas based on excavations and analysis...

  9. Risk factors for retained surgical items: a meta-analysis and proposed risk stratification system.

    Science.gov (United States)

    Moffatt-Bruce, Susan D; Cook, Charles H; Steinberg, Steven M; Stawicki, Stanislaw P

    2014-08-01

    Retained surgical items (RSI) are designated as completely preventable "never events". Despite numerous case reports, clinical series, and expert opinions, few studies provide quantitative insight into RSI risk factors and their relative contributions to the overall RSI risk profile. Existing case-control studies lack the ability to reliably detect clinically important differences within the long list of proposed risks. This meta-analysis examines the best available data for RSI risk factors, seeking to provide a clinically relevant risk stratification system. Nineteen candidate studies were considered for this meta-analysis. Three retrospective, case-control studies of RSI-related risk factors contained suitable group comparisons between patients with and without RSI, thus qualifying for further analysis. Comprehensive Meta-Analysis 2.0 (BioStat, Inc, Englewood, NJ) software was used to analyze the following "common factor" variables compiled from the above studies: body-mass index, emergency procedure, estimated operative blood loss >500 mL, incorrect surgical count, lack of surgical count, >1 subprocedure, >1 surgical team, nursing staff shift change, operation "afterhours" (i.e., between 5 PM and 7 AM), operative time, trainee presence, and unexpected intraoperative factors. We further stratified resulting RSI risk factors into low, intermediate, and high risk. Despite the fact that only between three and six risk factors were associated with increased RSI risk across the three studies, our analysis of pooled data demonstrates that seven risk factors are significantly associated with increased RSI risk. Variables found to elevate the RSI risk include intraoperative blood loss >500 mL (odds ratio [OR] 1.6); duration of operation (OR 1.7); >1 subprocedure (OR 2.1); lack of surgical counts (OR 2.5); >1 surgical team (OR 3.0); unexpected intraoperative factors (OR 3.4); and incorrect surgical count (OR 6.1). Changes in nursing staff, emergency surgery, body...
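The pooled odds ratios above come from meta-analytic weighting across the three case-control studies. A minimal fixed-effect, inverse-variance sketch of the pooling step is shown below with illustrative inputs; the study itself used Comprehensive Meta-Analysis 2.0 (BioStat, Inc), not this code:

```python
import math

def pooled_odds_ratio(odds_ratios, variances):
    """Fixed-effect, inverse-variance pooling of log odds ratios:
    each study is weighted by 1/variance of its log OR, the weighted
    mean log OR is taken, and the result is exponentiated back."""
    weights = [1.0 / v for v in variances]
    pooled_log = (sum(w * math.log(o)
                      for w, o in zip(weights, odds_ratios))
                  / sum(weights))
    return math.exp(pooled_log)

# Illustrative inputs: two studies reporting ORs of 2.0 and 8.0 for
# the same risk factor, with equal log-OR variances.
pooled = pooled_odds_ratio([2.0, 8.0], [0.25, 0.25])
```

With equal weights the pooled OR reduces to the geometric mean (4.0 here); unequal variances shift the pooled estimate toward the more precise study.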

  10. Risk Analysis for Environmental Health Triage

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  11. Risk analysis of complex hydrogen infrastructures

    DEFF Research Database (Denmark)

    Markert, Frank; Marangon, Alessia; Carcassi, Marco

    2015-01-01

    Developing a future sustainable refuelling station network is the next important step to establish hydrogen as a fuel for vehicles and related services. Such stations will most likely be integrated in existing refuelling stations and result in multi-fuel storages with a variety of fuels being...... assessment methodologies, and how functional models could support coherent risk and sustainability (Risk Assessment, Life Cycle Assessment / Life Cycle Costing) assessments, in order to find optimal solutions for the development of the infrastructure on a regional or national level....

  12. Proper generalized decompositions an introduction to computer implementation with Matlab

    CERN Document Server

    Cueto, Elías; Alfaro, Icíar

    2016-01-01

    This book is intended to help researchers overcome the entrance barrier to Proper Generalized Decomposition (PGD), by providing a valuable tool to begin the programming task. Detailed Matlab Codes are included for every chapter in the book, in which the theory previously described is translated into practice. Examples include parametric problems, non-linear model order reduction and real-time simulation, among others. Proper Generalized Decomposition (PGD) is a method for numerical simulation in many fields of applied science and engineering. As a generalization of Proper Orthogonal Decomposition or Principal Component Analysis to an arbitrary number of dimensions, PGD is able to provide the analyst with very accurate solutions for problems defined in high dimensional spaces, parametric problems and even real-time simulation.

  13. Risk Analysis and Security Countermeasure Selection

    CERN Document Server

    Norman, Thomas L

    2009-01-01

    Explains how to evaluate the appropriateness of security countermeasures, from a cost-effectiveness perspective. This title guides readers from basic principles to complex processes in a step-by-step fashion, evaluating DHS-approved risk assessment methods, including CARVER, API/NPRA, RAMCAP, and various Sandia methodologies

  14. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  15. Spinfoam cosmology with the proper vertex amplitude

    CERN Document Server

    Vilensky, Ilya

    2016-01-01

    The proper vertex amplitude is derived from the EPRL vertex by restricting to a single gravitational sector in order to achieve the correct semi-classical behaviour. We apply the proper vertex to calculate a cosmological transition amplitude that can be viewed as the Hartle-Hawking wavefunction. To perform this calculation we deduce the integral form of the proper vertex and use extended stationary phase methods to estimate the large-volume limit. We show that the resulting amplitude satisfies an operator constraint whose classical analogue is the Hamiltonian constraint of the Friedmann-Robertson-Walker cosmology. We find that the constraint dynamically selects the relevant family of coherent states and demonstrate a similar dynamic selection in standard quantum mechanics.

  16. Panel Random Analysis of Credit Risk in Business

    Institute of Scientific and Technical Information of China (English)

    LIU Wei; ZHOU Yue-mei; ZHOU Ke

    2005-01-01

    A market economy is essentially a credit economy: the survival and development of individuals in society are closely tied to their credit. Without credit, a market economy cannot function and society can hardly run in good order and health. This paper defines the basic concept of trade credit risk, describes its manifestations, and proposes a basic model for quantitative analysis of credit risk. The data structure of the underlying information is analyzed, a decomposition model of credit risk is constructed, and, with the aid of statistical analysis (regression analysis, analysis of variance, and hypothesis testing), the description, classification, verification, and confirmation of the credit risk model are completed. The model can then be used to describe and control credit risk, providing a basis for building a credit support system in today's society.

  17. Comparison of Management Oversight and Risk Tree and Tripod-Beta in Excavation Accident Analysis

    Directory of Open Access Journals (Sweden)

    Mohamadfam

    2015-01-01

    Full Text Available Background: Accident investigation programs are a necessary part of identifying risks and managing the business process. Objectives: One of the most important features of such programs is the analysis technique for identifying the root causes of accidents in order to prevent their recurrence. The Analytical Hierarchy Process (AHP) was used to compare Management Oversight and Risk Tree (MORT) with Tripod-Beta in order to determine the superior technique for analyzing fatal excavation accidents in the construction industry. Materials and Methods: MORT and Tripod-Beta were used to analyze two major accidents in three main steps. First, the techniques were applied to find the causal factors of the accidents. Second, a number of criteria were developed for comparing the techniques. Third, using AHP, the techniques were prioritized in terms of the criteria to choose the superior one. Results: The Tripod-Beta investigation identified 41 preconditions and 81 latent causes involved in the accidents, while 27 root causes were identified by the MORT analysis. The AHP investigation revealed that MORT had higher priority than Tripod-Beta on only two criteria. Conclusions: Our findings indicate that Tripod-Beta, with a total priority of 0.664, is superior to MORT, with a total priority of 0.33. It is recommended that future research compare the available accident analysis techniques against proper criteria to select the best one for accident analysis.
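A minimal sketch of the AHP weighting step used to prioritize the two techniques. The pairwise comparison matrix below is a hypothetical example, not the study's data, and the geometric-mean (row) method is one common approximation to the principal-eigenvector weights:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights by the geometric-mean method.
    pairwise[i][j] holds how strongly criterion i is preferred over j
    (Saaty's 1-9 scale, with pairwise[j][i] = 1 / pairwise[i][j])."""
    geo = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical 3-criterion comparison matrix (illustrative only).
matrix = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])  # weights sum to 1, ordered by preference
```

In the study the same kind of weight vector, computed over the comparison criteria, is what yields the 0.664 vs. 0.33 total priorities for Tripod-Beta and MORT.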

  18. Isometric Isomorphisms in Proper CQ*-algebras

    Institute of Scientific and Technical Information of China (English)

    Choonkil PARK; Jong Su AN

    2009-01-01

    In this paper, we prove the Hyers-Ulam-Rassias stability of isometric homomorphisms in proper CQ*-algebras for the following Cauchy-Jensen additive mapping: 2f((x1 + x2)/2 + y) = f(x1) + f(x2) + 2f(y). The concept of Hyers-Ulam-Rassias stability originated from Th. M. Rassias' stability theorem that appeared in the paper: On the stability of the linear mapping in Banach spaces, Proc. Amer. Math. Soc., 72 (1978), 297-300. This is applied to investigate isometric isomorphisms between proper CQ*-algebras.
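The stability statement referenced above can be sketched schematically; the control function and the constant below are illustrative assumptions in the usual Hyers-Ulam-Rassias style, not the paper's exact formulation:

```latex
% Cauchy-Jensen additive mapping studied in the paper:
\[
  2 f\!\left(\frac{x_1 + x_2}{2} + y\right) = f(x_1) + f(x_2) + 2 f(y)
\]
% Schematic stability statement: if, for some $\theta \ge 0$ and $p \in [0,1)$,
\[
  \left\| 2 f\!\left(\frac{x_1 + x_2}{2} + y\right) - f(x_1) - f(x_2) - 2 f(y) \right\|
  \le \theta \left( \|x_1\|^p + \|x_2\|^p + \|y\|^p \right),
\]
% then there exists a unique additive mapping $A$ close to $f$:
\[
  \| f(x) - A(x) \| \le C(\theta, p)\, \|x\|^p .
\]
```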

  19. A risk analysis model in concurrent engineering product development.

    Science.gov (United States)

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company.

  20. Management of Microbiologically Influenced Corrosion in Risk Based Inspection analysis

    DEFF Research Database (Denmark)

    Skovhus, Torben Lund; Hillier, Elizabeth; Andersen, Erlend S.

    2016-01-01

    Operating offshore oil and gas production facilities is often associated with high risk. In order to manage the risk, operators commonly use aids to support decision making in the establishment of a maintenance and inspection strategy. Risk Based Inspection (RBI) analysis is widely used in the offshore industry as a means to justify the inspection strategy adopted. RBI analysis is a decision-making technique that enables asset managers to identify the risk related to failure of their most critical systems and components, with an effect on safety, environmental and business related issues. Risk is a measure of possible loss or injury, and is expressed as the combination of the incident probability and its consequences. A component may have several associated risk levels depending on the different consequences of failure and the different probabilities of those failures occurring...

  1. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations, quite different from the real situation in a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve these limitations, but not the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.

  2. Methodology for risk analysis of projects

    Directory of Open Access Journals (Sweden)

    Roșu Maria Magdalena

    2017-01-01

    Full Text Available Risk in the activity of an organization, viewed as an open, adaptive economic and social system with varying degrees of permeability to influences from a business environment that is increasingly unpredictable and in which the only constant is change, refers to the probability of failing to meet objectives in terms of performance, schedule and cost. Insufficient application of recognized project management methodologies can be one of the main causes of project failure in an organization, with major effects on efficiency and recorded performance. The methodology proposed in this paper is therefore intended as an effective, formalized risk management tool, conceived as a cyclical process with several distinct phases, indispensable to current organizational practice, which should contribute to optimizing project performance and successful project completion.

  3. Imposed risk controversies: a critical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sauer, G.L.

    The author focuses on risk controversies where someone in one locale is concerned about being seriously injured as a result of human activities carried on elsewhere. Discussion is limited to unintentional risks, such as nuclear power plants, dams, toxic pollutants and biological laboratories containing virulent organisms. He attempts to fashion a new judicial institution for settling disputes where liability is not well defined. The proposed system is a two-step process. The first step is arbitration, in which defendant participation is voluntary. If the judgment at the first stage is against the defendant, the system progresses to the second stage; otherwise the process is complete. The second stage is a full-scale hearing in which the burden of proof shifts to the defendant. (PSB)

  4. Comparative Analysis and Evaluation of Existing Risk Management Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The focus of this article lies on the specific features of existing software packages for risk management, differentiating three categories. As representative of these categories we consider the Crystal Ball, Haufe Risikomanager and MIS - Risk Management solutions, outlining their strengths and weaknesses in a comparative analysis.

  5. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    Science.gov (United States)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  7. USAWC Coronary Risk and Fitness Analysis

    Science.gov (United States)

    1980-06-04

    Dr. Wood and associates of the Stanford Heart Disease Prevention Program compared the lipoprotein patterns of sedentary and active men 35-39 years... insulates the body and increases the risk of heat exhaustion and heat stroke; it lowers the body's efficient use of oxygen and reduces an... around the heart and throughout the body, and at the same time keeps undiseased blood vessels soft and pliable. It has also been established that...

  8. Schedule Risk Analysis Simulator using Beta Distribution

    Directory of Open Access Journals (Sweden)

    Isha Sharma,

    2011-06-01

    Full Text Available This paper describes an application of simulation and modelling in software risk management: a simulation-based software risk management tool that helps managers identify high-risk areas of the software process. An endeavour has been made to build a stochastic simulator that supports decision making by identifying the critical activities that should be given due priority during development of a software project. In response to new information or revised estimates, it may be necessary to reassign resources, cancel optional tasks, etc. Project management tools that make projections while treating decisions about tasks and resource assignments as static will not yield realistic results. The usual PERT procedure may lead to overly optimistic results, because many paths that are not critical on the basis of estimated or average activity durations are only slightly shorter than the critical path. Due to the randomness of durations, such paths could, under some combinations of activity durations, become longer than the average longest path, yet they would be ignored by the PERT technique, which works from average durations. To overcome this problem and be more realistic, the stochastic simulator generates random samples from a specific probability distribution (the Beta distribution) associated with each activity of the software project. The simulator is thereby also free of the bias of overly optimistic estimates.
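The path-criticality problem described above can be illustrated with a small Monte Carlo sketch: a path that deterministic PERT discards as non-critical still dominates in a non-trivial fraction of runs once Beta-distributed durations are sampled. The activity estimates and two-path network below are hypothetical:

```python
import random

random.seed(42)

def pert_sample(optimistic, most_likely, pessimistic, lam=4.0):
    """Sample a duration from the Beta-PERT distribution over [a, b],
    using the standard PERT reparameterization of the Beta shape parameters."""
    a, m, b = optimistic, most_likely, pessimistic
    alpha = 1.0 + lam * (m - a) / (b - a)
    beta = 1.0 + lam * (b - m) / (b - a)
    return a + random.betavariate(alpha, beta) * (b - a)

# Two parallel paths to the finish, as (optimistic, most likely, pessimistic).
path_A = [(2, 4, 12), (3, 5, 14)]   # shorter on average, but high variance
path_B = [(5, 6, 7), (5, 6, 7)]     # the deterministic PERT critical path

trials = 20000
a_critical = 0
for _ in range(trials):
    dur_a = sum(pert_sample(*act) for act in path_A)
    dur_b = sum(pert_sample(*act) for act in path_B)
    if dur_a >= dur_b:
        a_critical += 1

# Fraction of runs in which the path PERT would ignore is actually critical.
print(a_critical / trials)
```

Deterministic PERT means give path A about 11.2 time units versus 12 for path B, so A would never be flagged; the simulation shows it governs the schedule in a substantial share of runs.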

  9. Environmental risk assessment in GMO analysis.

    Science.gov (United States)

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage across countries is related to the different positions concerning labelling of GMO products: based either on the principle of substantial equivalence or on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs in the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  10. Software Speeds Up Analysis of Breast Cancer Risk

    Science.gov (United States)

    https://medlineplus.gov/news/fullstory_161117.html (HealthDay News, THURSDAY, Sept. 22, 2016): Software that quickly analyzes mammograms and patient history to ...

  11. Risk and Interdependencies in Critical Infrastructures A Guideline for Analysis

    CERN Document Server

    Utne, Ingrid; Vatn, Jørn

    2012-01-01

    Today’s society is completely dependent on critical networks such as water supply, sewage, electricity, ICT and transportation. Risk and vulnerability analyses are needed to grasp the impact of threats and hazards. However, these become quite complex as there are strong interdependencies both within and between infrastructure systems. Risk and Interdependencies in Critical Infrastructures: A guideline for analysis provides methods for analyzing risks and interdependencies of critical infrastructures. A number of analysis approaches are described and adapted to each of these infrastructures. Various approaches are also revised, and all are supported by several examples and illustrations. Particular emphasis is given to the analysis of the various interdependencies that often exist between the infrastructures. Risk and Interdependencies in Critical Infrastructures: A guideline for analysis provides a good tool to identify the hazards that are threatening your infrastructures, and will enhance the un...

  12. Risk Factor Analysis for Oral Precancer among Slum Dwellers in ...

    African Journals Online (AJOL)

    Rajasthan Dental College, Jaipur, Rajasthan; Dental Wing, All India Institute of Medical Sciences (AIIMS), Bhopal; Department of Public ... Keywords: Oral cancer, Risk factor analysis, Slum dwellers.

  13. American Airlines Propeller STOL Transport Economic Risk Analysis

    Science.gov (United States)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis on the economics of STOL transports in air passenger traffic established the probability of making the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.

  14. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results of a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  15. Simulation Approach to Mission Risk and Reliability Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  16. Cardiometabolic risk in Canada: a detailed analysis and position paper by the cardiometabolic risk working group.

    Science.gov (United States)

    Leiter, Lawrence A; Fitchett, David H; Gilbert, Richard E; Gupta, Milan; Mancini, G B John; McFarlane, Philip A; Ross, Robert; Teoh, Hwee; Verma, Subodh; Anand, Sonia; Camelon, Kathryn; Chow, Chi-Ming; Cox, Jafna L; Després, Jean-Pierre; Genest, Jacques; Harris, Stewart B; Lau, David C W; Lewanczuk, Richard; Liu, Peter P; Lonn, Eva M; McPherson, Ruth; Poirier, Paul; Qaadri, Shafiq; Rabasa-Lhoret, Rémi; Rabkin, Simon W; Sharma, Arya M; Steele, Andrew W; Stone, James A; Tardif, Jean-Claude; Tobe, Sheldon; Ur, Ehud

    2011-01-01

    The concepts of "cardiometabolic risk," "metabolic syndrome," and "risk stratification" overlap and relate to the atherogenic process and development of type 2 diabetes. There is confusion about what these terms mean and how they can best be used to improve our understanding of cardiovascular disease treatment and prevention. With the objectives of clarifying these concepts and presenting practical strategies to identify and reduce cardiovascular risk in multiethnic patient populations, the Cardiometabolic Working Group reviewed the evidence related to emerging cardiovascular risk factors and Canadian guideline recommendations in order to present a detailed analysis and consolidated approach to the identification and management of cardiometabolic risk. The concepts related to cardiometabolic risk, pathophysiology, and strategies for identification and management (including health behaviours, pharmacotherapy, and surgery) in the multiethnic Canadian population are presented. "Global cardiometabolic risk" is proposed as an umbrella term for a comprehensive list of existing and emerging factors that predict cardiovascular disease and/or type 2 diabetes. Health behaviour interventions (weight loss, physical activity, diet, smoking cessation) in people identified at high cardiometabolic risk are of critical importance given the emerging crisis of obesity and the consequent epidemic of type 2 diabetes. Vascular protective measures (health behaviours for all patients and pharmacotherapy in appropriate patients) are essential to reduce cardiometabolic risk, and there is growing consensus that a multidisciplinary approach is needed to adequately address cardiometabolic risk factors. Health care professionals must also consider risk factors related to ethnicity in order to appropriately evaluate everyone in their diverse patient populations.

  17. Fuzzy Logic Application in Risk Analysis Due to Lightning

    Directory of Open Access Journals (Sweden)

    Yelennis Godoy Valladares

    2010-05-01

    Full Text Available This work applies fuzzy logic to risk analysis on the basis of the criteria collected in IEC 62305-2, with the objective of developing a simple tool, easy to use and understand, that offers the designer greater scope for interpreting the subjectivity involved in the analysis, using linguistic terms to evaluate the characteristics of the installation under study and the risk of lightning strikes.
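A toy illustration of how fuzzy linguistic evaluation of lightning risk might look. The membership functions, their bounds, and the single rule below are invented for illustration and are not taken from IEC 62305-2:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def lightning_risk_level(flash_density, exposure):
    """Toy Mamdani-style evaluation: risk is HIGH when ground flash density
    (flashes/km^2/yr) is high AND the installation's exposure factor is high.
    min() implements the fuzzy AND; all bounds are illustrative assumptions."""
    density_high = tri(flash_density, 2.0, 8.0, 14.0)
    exposure_high = tri(exposure, 0.3, 0.8, 1.3)
    density_low = tri(flash_density, -4.0, 0.0, 6.0)
    return {
        "high": min(density_high, exposure_high),  # rule: high AND high -> high
        "low": density_low,                        # rule: low density -> low
    }

print(lightning_risk_level(6.0, 0.7))  # graded memberships, not a hard yes/no
```

The point of the fuzzy formulation is exactly this graded output: instead of a binary tolerable/intolerable verdict, the designer sees how strongly the installation belongs to each linguistic risk class.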

  18. Strategy Guideline. Proper Water Heater Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hoeschele, M. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Springer, D. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); German, A. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Staller, J. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Zhang, Y. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States)

    2015-04-09

    This Strategy Guideline on proper water heater selection was developed by the Building America team Alliance for Residential Building Innovation to provide step-by-step procedures for evaluating preferred cost-effective options for energy efficient water heater alternatives based on local utility rates, climate, and anticipated loads.

  20. The Essentials of Proper Wine Service.

    Science.gov (United States)

    Manago, Gary H.

    This instructional unit was designed to assist the food services instructor and/or the restaurant manager in training students and/or staff in the proper procedure for serving wines to guests. The lesson plans included in this unit focus on: (1) the different types of wine glasses and their uses; (2) the parts of a wine glass; (3) the proper…

  1. Isometry groups of proper metric spaces

    CERN Document Server

    Niemiec, Piotr

    2012-01-01

    Given a locally compact Polish space X, a necessary and sufficient condition for a group G of homeomorphisms of X to be the full isometry group of (X,d) for some proper metric d on X is given. It is shown that every locally compact Polish group G acts freely on GxY as the full isometry group of GxY with respect to a certain proper metric on GxY, where Y is an arbitrary locally compact Polish space with (card(G),card(Y)) different from (1,2). Locally compact Polish groups which act effectively and almost transitively on complete metric spaces as full isometry groups are characterized. Locally compact Polish non-Abelian groups on which every left invariant metric is automatically right invariant are characterized and fully classified. It is demonstrated that for every locally compact Polish space X having more than two points the set of proper metrics d such that Iso(X,d) = {id} is dense in the space of all proper metrics on X.

  2. A proper subclass of Maclane's class

    Directory of Open Access Journals (Sweden)

    May Hamdan

    1999-01-01

    In this paper, we define a subclass ℛ of MacLane's class consisting of those functions that have asymptotic values at a dense subset of the unit circle, reached along rectifiable asymptotic paths. We also show that ℛ is a proper subclass of MacLane's class by constructing a function f in the class that admits no asymptotic paths of finite length.

  3. THE ANALYSIS OF RISK MANAGEMENT PROCESS WITHIN MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ROMANESCU MARCEL LAURENTIU

    2016-10-01

    Full Text Available This article highlights risk analysis within management, focusing on how a company could practically integrate risk management into its existing leadership process. We then illustrate how to manage risk effectively, which offers numerous advantages to all firms, including improving their decision-making process. All this leads to the conclusion that the degree of risk specific to companies is very high, but if managers make the best decisions they can diminish it, so that business activity and income are not affected by factors that could disturb them in a negative way.

  4. Quantitative risk analysis in two pipelines operated by TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Bittencourt, Euclides [Centro Universitario FIB, Salvador , BA (Brazil)

    2009-07-01

    Transportation risk analysis techniques were used to study two pipelines operated by TRANSPETRO. Pipeline A carries diesel, gasoline and LPG simultaneously and comprises three parts, all of them crossing rural areas. Pipeline B carries oil, and one of its ends is located in a densely populated area. Both pipelines had their risk studied using the PHAST RISK® software, and the individual risk measures, the only measures considered for licensing purposes in this type of study, presented levels far below the maximum tolerable levels. (author)

  5. An All-Sky Search for Wide Binaries in the SUPERBLINK Proper Motion Catalog

    Science.gov (United States)

    Hartman, Zachary; Lepine, Sebastien

    2017-01-01

    We present initial results from an all-sky search for Common Proper Motion (CPM) binaries in the SUPERBLINK all-sky proper motion catalog of 2.8 million stars with proper motions greater than 40 mas/yr, which has been recently enhanced with data from the GAIA mission. We initially search the SUPERBLINK catalog for pairs of stars with angular separations up to 1 degree and proper motion difference less than 40 mas/yr. In order to determine which of these pairs are real binaries, we develop a Bayesian analysis to calculate probabilities of true companionship based on a combination of proper motion magnitude, angular separation, and proper motion differences. The analysis reveals that the SUPERBLINK catalog most likely contains ~40,000 genuine common proper motion binaries. We provide initial estimates of the distances and projected physical separations of these wide binaries.
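The initial pair search described above (angular separations up to 1 degree, proper motion differences under 40 mas/yr) can be sketched as a brute-force filter. The toy catalog and the small-angle separation approximation are illustrative assumptions, and the Bayesian companionship scoring is not reproduced here:

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Small-angle separation in degrees (adequate for close pairs)."""
    dra = (ra1 - ra2) * math.cos(math.radians(0.5 * (dec1 + dec2)))
    ddec = dec1 - dec2
    return math.hypot(dra, ddec)

def cpm_candidates(stars, max_sep_deg=1.0, max_dpm=40.0):
    """Return index pairs passing the SUPERBLINK-style CPM cuts.
    Each star is (ra_deg, dec_deg, pm_ra, pm_dec), proper motions in mas/yr."""
    pairs = []
    for i in range(len(stars)):
        for j in range(i + 1, len(stars)):
            ra1, de1, pa1, pd1 = stars[i]
            ra2, de2, pa2, pd2 = stars[j]
            if angular_sep_deg(ra1, de1, ra2, de2) > max_sep_deg:
                continue
            if math.hypot(pa1 - pa2, pd1 - pd2) > max_dpm:
                continue
            pairs.append((i, j))
    return pairs

# Toy catalog: stars 0 and 1 share their motion; star 2 moves differently.
catalog = [
    (150.00, 20.00, 210.0, -95.0),
    (150.10, 20.05, 205.0, -98.0),
    (150.20, 20.10, 60.0, 15.0),
]
print(cpm_candidates(catalog))  # → [(0, 1)]
```

These cuts alone admit many chance alignments, which is why the survey follows them with a Bayesian probability of true companionship based on proper motion magnitude, separation, and motion difference.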

  6. Risk Analysis for Resource Planning Optimization

    Science.gov (United States)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  7. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. The document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods needed to implement a comprehensive information repository. The topics addressed cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
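As one example of the kind of data assessment method such a guideline covers, a conjugate Beta-Binomial update of a component failure probability is sketched below; the Jeffreys prior and the test counts are illustrative assumptions, not taken from the document:

```python
def beta_binomial_update(alpha, beta, failures, trials):
    """Conjugate Bayesian update of a Beta(alpha, beta) prior on the
    per-demand failure probability after observing `failures` in `trials`."""
    return alpha + failures, beta + (trials - failures)

# Weakly informative Jeffreys prior, then hypothetical demand test data.
a, b = 0.5, 0.5
a, b = beta_binomial_update(a, b, failures=2, trials=50)

# Posterior mean of the failure probability.
posterior_mean = a / (a + b)
print(round(posterior_mean, 4))  # → 0.049
```

Conjugacy keeps the posterior in the Beta family, so repeated test campaigns can be folded in by calling the update again with the new counts.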

  8. Analysis on Risk Prevention Mechanism for Farmers’ Default in Small Amount Credit

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Analysis indicates that the major sources of default risk in the operation of small amount credit include a low management level and the absence of a normative system, the absence of a risk-sharing mechanism, rating distortion due to an imperfect credit investigation system, and uncertainty about the borrower's credit. On this basis, static and dynamic models are established to analyze the prevention mechanism for default risk in small amount credit. It is concluded that a restriction mechanism must be established in the operation of small amount credit, sustained by increases in three values: N (the potential loss of a bad credit record due to a farmer's default), Q (the probability of successful recovery by the small amount credit institution), and S (the cost imposed on farmers by the institution's punishment after successful recovery). Finally, the following countermeasures and suggestions are put forward: perfect the laws and regulations and the credit reward and punishment mechanism for risk management of small amount credit; bring the loan officer's proper function into play in small amount credit practice; and widely promote the rural "Group Credit Union" system.
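The deterrence role of N, Q, and S can be illustrated with a toy expected-cost comparison; the functional form and the numbers below are our own illustrative assumptions, not the paper's static or dynamic models:

```python
def default_deterred(gain, n_record_loss, q_recovery, s_penalty):
    """Toy deterrence condition: a rational farmer is deterred from default
    when the expected cost of defaulting exceeds the gain from it.

    Expected cost = N (loss from a bad credit record) + Q * S (penalty S
    applied with recovery probability Q). All inputs are hypothetical.
    """
    return n_record_loss + q_recovery * s_penalty > gain

# Raising any of N, Q, or S tips the balance toward repayment:
# 300 + 0.5 * 800 = 700  < 1000 -> default is NOT deterred
assert not default_deterred(gain=1000, n_record_loss=300, q_recovery=0.5, s_penalty=800)
# 300 + 0.9 * 800 = 1020 > 1000 -> default IS deterred
assert default_deterred(gain=1000, n_record_loss=300, q_recovery=0.9, s_penalty=800)
```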

  9. A Project Risk Ranking Approach Based on Set Pair Analysis

    Institute of Scientific and Technical Information of China (English)

    Gao Feng; Chen Yingwu

    2006-01-01

    Set Pair Analysis (SPA) is a new methodology to describe and process system uncertainty. It differs from stochastic or fuzzy methods in reasoning and operation, and it has been applied in many areas recently. In this paper, the application of SPA to risk ranking is presented, which includes a review of risk ranking, an introduction to the Connecting Degree (CD), which plays a key role in SPA, the arithmetic and Tendency Grade (TG) of CDs, and a proposed risk ranking approach. Finally, a case analysis is presented to illustrate the reasonableness of this approach. It is found that this approach is very convenient to operate, while the ranking result is more comprehensible.

  10. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using categorical risk scores for the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are reinterpreted under this probabilistic modification. Using this approach, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
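The two scoring schemes can be sketched side by side. The function names and the 1-10 scales are illustrative conventions (common in FMEA practice), not the authors' exact scheme:

```python
def rpn(severity, occurrence, detection):
    """Traditional FMEA: Risk Priority Number from 1-10 categorical scores."""
    return severity * occurrence * detection

def undetected_failure_frequency(p_occurrence, p_detection):
    """Probabilistic variant: estimated relative frequency of a failure mode
    occurring AND escaping detection. Severity stays categorical and is
    considered separately when selecting modes for correction."""
    return p_occurrence * (1.0 - p_detection)

# Traditional scoring: severity=7, occurrence=3, detection=4 -> RPN = 84
assert rpn(7, 3, 4) == 84

# Probabilistic scoring: a mode occurring in 2% of runs, detected 90% of
# the time, goes undetected in 0.2% of runs
freq = undetected_failure_frequency(0.02, 0.90)
```

Summing such frequencies over a set of failure modes gives the quantitative estimate for the full procedure that the abstract describes.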

  11. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.

  12. The semantic distinction between "risk" and "danger": a linguistic analysis.

    Science.gov (United States)

    Boholm, Max

    2012-02-01

    The analysis combines frame semantic and corpus linguistic approaches in analyzing the role of agency and decision making in the semantics of the words "risk" and "danger" (both nominal and verbal uses). In frame semantics, the meanings of "risk" and of related words, such as "danger," are analyzed against the background of a specific cognitive-semantic structure (a frame) comprising frame elements such as Protagonist, Bad Outcome, Decision, Possession, and Source. Empirical data derive from the British National Corpus (100 million words). Results indicate both similarities and differences in use. First, both "risk" and "danger" are commonly used to represent situations having potential negative consequences as the result of agency. Second, "risk" and "danger," especially their verbal uses (to risk, to endanger), differ in agent-victim structure, i.e., "risk" is used to express that a person affected by an action is also the agent of the action, while "endanger" is used to express that the one affected is not the agent. Third, "risk," but not "danger," tends to be used to represent rational and goal-directed action. The results therefore to some extent confirm the analysis of "risk" and "danger" suggested by German sociologist Niklas Luhmann. As a point of discussion, the present findings arguably have implications for risk communication.

  13. Risk-benefit analysis and public policy: a bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  14. ANALYSIS OF RISK FACTORS ECTOPIC PREGNANCY

    Directory of Open Access Journals (Sweden)

    Budi Santoso

    2017-04-01

    Introduction: Ectopic pregnancy is a pregnancy with extrauterine implantation. This situation is a gynecologic emergency that contributes to maternal mortality; early recognition, based on identification of the risk factors for ectopic pregnancy, is therefore needed. Methods: The design was descriptive observational. The samples were pregnant women who had ectopic pregnancy at the Maternity Room, Emergency Unit, Dr. Soetomo Hospital, Surabaya, from 1 July 2008 to 1 July 2010. The sampling technique was total sampling using medical records. Result: Patients with ectopic pregnancy numbered 99 out of 2090 pregnant women who sought treatment in Dr. Soetomo Hospital; however, only 29 patients had traceable risk factors. Discussion: Most ectopic pregnancies were in the age group of 26-30 years, comprising 32 patients (32.32%), followed by 25 patients aged 31-35 years (25.25%), 18 patients aged 21-25 years (18.18%), 17 patients aged 36-40 years (17.17%), 4 patients aged 41 years and over (4.04%), and the fewest, 3 patients, in the age group of 16-20 years (3.03%). A total of 12 patients with ectopic pregnancy (41.38%) had a history of abortion, and 6 patients (20.69%) each were in the group of ectopic pregnancy patients who used family planning and in the group with a history of surgery. There were 2 patients (6.90%) in the group of ectopic pregnancy patients with both a history of surgery and a history of abortion. The incidence rate of ectopic pregnancy was 4.73%, mostly in the second gravidity (34.34%), whereas nulliparous women had the highest prevalence at 39.39%. Acquired risk factors: history of operations 10.34%, family planning 20.69%, history of abortion 41.38%, history of abortion and operation 6.90%, family planning and history of abortion 20.69%.

  15. Development of a risk-analysis model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    This report consists of a main body, which provides a presentation of risk analysis and its general and specific application to the needs of the Office of Buildings and Community Systems of the Department of Energy, and several case studies employing the risk-analysis model developed. The highlights include a discussion of how risk analysis is currently used in the private, regulated, and public sectors and how this methodology can be employed to meet the policy-analysis needs of the Office of Buildings and Community Systems of the Department of Energy (BCS/DOE). After a review of the primary methodologies available for risk analysis, it was determined that Monte Carlo simulation techniques provide the greatest degree of visibility into uncertainty in the decision-making process. Although the data-collection requirements can be demanding, the benefits, when compared to other methods, are substantial. The data-collection problem can be significantly reduced, without sacrificing proprietary-information rights, if prior arrangements are made with RD&D contractors to provide responses to reasonable requests for base-case data. A total of three case studies were performed on BCS technologies: a gas-fired heat pump; a 1000 ton/day anaerobic digestion plant; and a district heating and cooling system. The three case studies plus the risk-analysis methodology were issued as separate reports. It is concluded, based on the overall research on risk analysis and the case-study experience, that the risk-analysis methodology has significant potential as a policy-evaluation tool within BCS.
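The Monte Carlo approach endorsed by the report can be sketched as follows. The input distributions and dollar figures are hypothetical placeholders, not the report's case-study data:

```python
import random

def monte_carlo_net_benefit(n_trials=10_000, seed=42):
    """Illustrative Monte Carlo risk model: propagate uncertainty in capital
    cost, annual savings, and equipment lifetime into a distribution of net
    benefit, making the decision uncertainty visible."""
    rng = random.Random(seed)  # seeded for reproducibility
    results = []
    for _ in range(n_trials):
        capital = rng.triangular(80_000, 140_000, 100_000)  # $: low, high, mode
        savings = rng.gauss(12_000, 2_000)                  # $/year
        lifetime = rng.randint(8, 15)                       # years of service
        results.append(savings * lifetime - capital)
    return results

net = monte_carlo_net_benefit()
# Downside risk summarized as the probability of a negative net benefit
p_loss = sum(1 for x in net if x < 0) / len(net)
```

The output distribution, rather than a single point estimate, is what gives the decision-maker visibility into uncertainty.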

  16. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    Science.gov (United States)

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute systems of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework comprises three processes: precursor identification, prioritization, and evaluation. We use the example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities, and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk-informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures.

  17. From risk for trauma to unintentional injury risk: falls--a concept analysis. Nursing Diagnosis Extension and Classification Research Team.

    Science.gov (United States)

    Schoenfelder, D P; Crowell, C M

    1999-01-01

    Concept analysis of the nursing diagnosis risk for trauma. To examine the nursing diagnosis risk for trauma and to specify the risk factors for falling. Research and informational articles on falling, and NANDA Nursing Diagnoses: Definitions and Classification, 1999-2000. Replace the current nursing diagnosis risk for trauma with the more specific nursing diagnosis unintentional injury risk: falls. The other risks included in risk for trauma (e.g., burns) also will need to be developed.

  18. Contract Negotiations Supported Through Risk Analysis

    Science.gov (United States)

    Rodrigues, Sérgio A.; Vaz, Marco A.; Souza, Jano M.

    Many clients view software as a commodity; it is therefore critical that IT sellers know how to build value into their offering to differentiate their service from all the others. Clients sometimes refuse to contract software development due to a lack of technical understanding, or simply because they are afraid of IT contractual commitments. The IT negotiators who recognize the importance of this issue, and the reason why it is a problem, will be able to work toward the commercial terms they want. Therefore, this chapter aims to stimulate IT professionals to improve their negotiation skills, and presents a computational tool to support managers in getting the best out of software negotiations through the identification of contract risks.

  19. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    Science.gov (United States)

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2017-07-05

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.

  20. Analysis Methods Of The Insolvency Risk

    Directory of Open Access Journals (Sweden)

    Gabriela Munteanu

    2010-12-01

    A company's capacity to be solvent – to overcome insolvency risk – has an important position within the system of financial-patrimonial analysis. Any problem regarding the payment of obligatory taxes causes losses and requires urgent resolution.

  1. Risk-Based Explosive Safety Analysis

    Science.gov (United States)

    2016-11-30

    A risk-based analysis of scenario 2 would likely determine that the hazard of death or injury to any single person is low due to the separation distance.

  2. Revealing the underlying drivers of disaster risk: a global analysis

    Science.gov (United States)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components, such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used to extract population and economic exposure based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance, as well as other underlying factors (e.g. remoteness), can explain the magnitude of past disasters. An analysis was also performed to highlight the role of future trends in population and climate change, and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk, at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR), first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis, and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL).
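The multiple-regression step can be sketched in pure Python with ordinary least squares on synthetic event records; the predictors, coefficients, and data below are illustrative placeholders, not GAR data:

```python
def fit_multiple_regression(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination. Pure Python, for small toy problems."""
    rows = [[1.0] + list(x) for x in X]  # prepend an intercept column
    k = len(rows[0])
    # Build the k x (k+1) augmented system [X^T X | X^T y]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] +
         [sum(r[i] * yi for r, yi in zip(rows, y))] for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k  # back substitution
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Hypothetical event records: (hazard intensity, log exposed population)
X = [(5.0, 3.0), (6.0, 4.0), (7.0, 3.5), (8.0, 5.0), (6.5, 4.2)]
# Synthetic log-losses generated from an exact linear relation
y = [2.0 + 0.5 * h + 0.8 * e for h, e in X]
intercept, b_hazard, b_exposure = fit_multiple_regression(X, y)
```

With exact synthetic data the fit recovers the generating coefficients; on real event data the residuals would carry the unexplained vulnerability.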

  3. Boltzmann babies in the proper time measure

    Energy Technology Data Exchange (ETDEWEB)

    Bousso, Raphael; Bousso, Raphael; Freivogel, Ben; Yang, I-Sheng

    2007-12-20

    After commenting briefly on the role of the typicality assumption in science, we advocate a phenomenological approach to the cosmological measure problem. Like any other theory, a measure should be simple, general, well defined, and consistent with observation. This allows us to proceed by elimination. As an example, we consider the proper time cutoff on a geodesic congruence. It predicts that typical observers are quantum fluctuations in the early universe, or Boltzmann babies. We sharpen this well-known youngness problem by taking into account the expansion and open spatial geometry of pocket universes. Moreover, we relate the youngness problem directly to the probability distribution for observables, such as the temperature of the cosmic background radiation. We consider a number of modifications of the proper time measure, but find none that would make it compatible with observation.

  4. Is adaptation or transformation needed? Active nanomaterials and risk analysis

    Science.gov (United States)

    Kuzma, Jennifer; Roberts, John Patrick

    2016-07-01

    Nanotechnology has been a key area of funding and policy for the United States and globally for the past two decades. Since nanotechnology research and development became a focus and nanoproducts began to permeate the market, scholars and scientists have been concerned about how to assess the risks that they may pose to human health and the environment. The newest generation of nanomaterials includes biomolecules that can respond to and influence their environments, and there is a need to explore whether and how existing risk-analysis frameworks are challenged by such novelty. To fill this niche, we used a modified approach of upstream oversight assessment (UOA), a subset of anticipatory governance. We first selected case studies of "active nanomaterials," that are early in research and development and designed for use in multiple sectors, and then considered them under several, key risk-analysis frameworks. We found two ways in which the cases challenge the frameworks. The first category relates to how to assess risk under a narrow framing of the term (direct health and environmental harm), and the second involves the definition of what constitutes a "risk" worthy of assessment and consideration in decision making. In light of these challenges, we propose some changes for risk analysis in the face of active nanostructures in order to improve risk governance.

  5. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    potential risks, and estimated development times associated with each technology. The established WSTAT framework utilizes elicitation techniques with...and stakeholder value in order to inform and potentially influence requirements documents and associated specifications. The WSTA tool (WSTAT) can...exploitation, requirements definition, early cost informed trades, requirements analysis, Analysis of Alternatives (AoA), contractor trades, and technology

  6. Ontology-based specification, identification and analysis of perioperative risks.

    Science.gov (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, particularly in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable for the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  7. Proper time method in de Sitter space

    CERN Document Server

    Das, Ashok K

    2015-01-01

    We use the proper time formalism to study a (non-self-interacting) massive Klein-Gordon theory in the two dimensional de Sitter space. We determine the exact Green's function of the theory by solving the DeWitt-Schwinger equation as well as by calculating the operator matrix element. We point out how the one parameter family of arbitrariness in the Green's function arises in this method.

  8. Geotechnical risk analysis by flat dilatometer (DMT)

    Science.gov (United States)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In the last decades we have witnessed a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The State of the Art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements, collected by Monaco et al. (2006), indicates that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates the Beta probability distribution at research sites where DMT tests and observed settlements are available. References: Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
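The Beta-distribution description of bounded parameter variability suggested by Failmezger et al. can be sketched by moment matching; the constrained-modulus statistics below are hypothetical, not a real DMT data set:

```python
import math

def beta_params_from_moments(mean, std, lo, hi):
    """Moment-match a Beta distribution on [lo, hi] to a sample mean and
    standard deviation, a common way to describe bounded soil-parameter
    variability for design."""
    m = (mean - lo) / (hi - lo)            # mean rescaled to [0, 1]
    v = (std / (hi - lo)) ** 2             # variance rescaled to [0, 1]
    common = m * (1 - m) / v - 1           # from the Beta mean/variance formulas
    return m * common, (1 - m) * common    # shape parameters alpha, beta

def beta_pdf(x, a, b, lo, hi):
    """Density of the Beta distribution rescaled to [lo, hi]."""
    t = (x - lo) / (hi - lo)
    coeff = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return coeff * t ** (a - 1) * (1 - t) ** (b - 1) / (hi - lo)

# Constrained modulus M (MPa): hypothetical mean 30, std 6, bounds 10-60
a, b = beta_params_from_moments(30.0, 6.0, 10.0, 60.0)
```

The fitted density can then feed a probabilistic settlement check instead of a single deterministic "operative modulus".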

  9. Emerging frontier technologies for food safety analysis and risk assessment

    Institute of Scientific and Technical Information of China (English)

    DONG Yi-yang; LIU Jia-hui; WANG Sai; CHEN Qi-long; GUO Tian-yang; ZHANG Li-ya; JIN Yong; SU Hai-jia; TAN Tian-wei

    2015-01-01

    Access to secure and safe food is a basic human necessity and essential for a sustainable world. Performing high-end food safety analysis and risk assessment with state-of-the-art technologies is therefore of utmost importance. With applications exemplified by microfluidic immunoassay, aptasensors, direct analysis in real time, high resolution mass spectrometry, benchmark dose and chemical-specific adjustment factor, this review presents frontier food safety analysis and risk assessment technologies, from which both food quality and public health will undoubtedly benefit in the foreseeable future.

  10. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).

  11. VVV IR high proper motion stars

    Science.gov (United States)

    Kurtev, R.; Gromadzki, M.; Beamin, J. C.; Peña, K.; Folkes, S.; Ivanov, V. D.; Borissova, J.; Kuhn, M.; Villanueva, V.; Minniti, D.; Mendez, R.; Lucas, P.; Smith, L.; Pinfield, D.; Antonova, A.

    2015-10-01

    We used the VISTA Variables in the Vía Láctea (VVV) survey to search for large proper motion (PM) objects in the zone of avoidance in the Milky Way bulge and southern Galactic disk. This survey is multi-epoch and already spans a period of more than four years, giving us an excellent opportunity for proper motion and parallax studies. We found around 1700 PM objects with PM > 30 mas/yr. The majority of them are early- and mid-M dwarfs. There are also a few later spectral type objects, as well as numerous new K and G dwarfs. 75 of the stars have PM > 300 mas/yr and 189 stars have PM > 200 mas/yr, whereas there are only 42 previously known stars in the VVV area with PM > 200 mas/yr. We also found three dM+WD binaries and new members of the immediate solar vicinity within 25 pc. We generated a catalog which will be complementary to the existing catalogs outside this zone.

  12. Assessing patient awareness of proper hand hygiene.

    Science.gov (United States)

    Busby, Sunni R; Kennedy, Bryan; Davis, Stephanie C; Thompson, Heather A; Jones, Jan W

    2015-05-01

    The authors hypothesized that patients may not understand the forms of effective hand hygiene employed in the hospital environment. Multiple studies demonstrate the importance of hand hygiene in reducing healthcare-associated infections (HAIs). Extensive research about how to improve compliance has been conducted. Patients' perceptions of proper hand hygiene were evaluated when caregivers used soap and water, waterless hand cleaner, or a combination of these. No significant differences were observed, but many patients reported they did not notice whether their providers cleaned their hands. Educating patients and their caregivers about the protection afforded by proper, consistent hand hygiene practices is important. Engaging patients to monitor healthcare workers may increase compliance, reduce the spread of infection, and lead to better overall patient outcomes. This study revealed a need to investigate the effects of patient education on patient perceptions of hand hygiene. Results of this study appear to indicate a need to focus on patient education and the differences between soap and water versus alcohol-based hand sanitizers as part of proper hand hygiene. Researchers could be asking: "Why have patients not been engaged as members of the healthcare team who have the most to lose?"

  13. [Morphology of neurons of human subiculum proper].

    Science.gov (United States)

    Stanković-Vulović, Maja; Zivanović-Macuzić, Ivana; Sazdanović, Predrag; Jeremić, Dejan; Tosevski, Jovo

    2010-01-01

    The subiculum proper is an archicortical structure of the subicular complex and is the place of origin of the great majority of axons of the whole hippocampal formation. In contrast to the hippocampus, which has been intensively studied, data about the human subiculum proper are quite scarce. The aim of our study was to identify the morphological characteristics of neurons of the human subiculum proper. The study was performed on 10 brains of both genders using Golgi impregnation and Nissl staining. The subiculum has three layers: molecular, pyramidal, and polymorphic. The dominant cell type in the pyramidal layer was the pyramidal neuron, with a pyramid-shaped soma, multiple basal dendrites, and one apical dendrite. The nonpyramidal cells were scattered among the pyramidal cells of the pyramidal layer and were classified as multipolar cells, bipolar cells, and neurons with triangular-shaped somata. The neurons of the molecular layer of the human subiculum were divided into two groups: bipolar and multipolar neurons. The most numerous cells of the polymorphic layer were likewise bipolar and multipolar neurons.

  14. THE PROPER MOTION OF THE LMC

    Directory of Open Access Journals (Sweden)

    R. A. Méndez

    2009-01-01

    We have determined the proper motion of the Large Magellanic Cloud (LMC) relative to a background quasi-stellar object, using observations carried out in seven epochs (six years of baseline). Our proper motion value agrees well with most results obtained by other authors and indicates that the LMC is not a member of a proposed stream of galaxies with similar orbits around our galaxy. Using published values of the radial velocity for the center of the LMC, in combination with the transverse velocity vector derived from our measured proper motion, we have calculated the absolute space velocity of the LMC. This value, along with some assumptions regarding the mass distribution of the Galaxy, has in turn been used to calculate the mass of the latter. This work is part of a program to study the space motion of the Magellanic Clouds system and its relationship to the Milky Way (MW). This knowledge is essential to understand the nature, origin, and evolution of this system, as well as the origin and evolution of the outer parts of the MW.
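The transverse velocity implied by a proper motion follows from the standard conversion v_t = 4.74 μ d (μ in arcsec/yr, d in pc, v_t in km/s) and combines with the radial velocity to give the space velocity. A hedged sketch with placeholder numbers; the μ, distance, and radial velocity below are illustrative, not the paper's measurements:

```python
import math

K = 4.74  # km/s per (arcsec/yr * pc): the AU/yr-to-km/s conversion factor

def space_velocity(mu_mas_yr, distance_pc, v_radial_kms):
    """Total space velocity (km/s) from proper motion, distance and
    radial velocity, assuming the two components are perpendicular."""
    v_t = K * (mu_mas_yr / 1000.0) * distance_pc  # transverse velocity, km/s
    return math.hypot(v_t, v_radial_kms)

# Illustrative inputs: mu = 2.0 mas/yr, d = 50 kpc, v_r = 262 km/s
# give v_t = 474 km/s and a total space velocity of ~542 km/s.
print(space_velocity(2.0, 50000.0, 262.0))
```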

  15. Cable Hot Shorts and Circuit Analysis in Fire Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey; Nowlen, Steven P.; Wyant, Frank

    1999-05-19

    Under existing methods of probabilistic risk assessment (PRA), the analysis of fire-induced circuit faults has typically been conducted on a simplistic basis. In particular, the hot-short methodologies that have been applied remain controversial with regard to the scope of the assessments, the underlying methods, and the assumptions employed. To address weaknesses in fire PRA methodologies, the USNRC has initiated a fire risk analysis research program that includes a task for improving the tools for performing circuit analysis. The objective of this task is to obtain a better understanding of the mechanisms linking fire-induced cable damage to potentially risk-significant failure modes of power, control, and instrumentation cables. This paper discusses the current status of the circuit analysis task.

  16. Risk analysis of early childhood eczema

    DEFF Research Database (Denmark)

    Bisgaard, Hans; Halkjaer, Liselotte B; Hinge, Rikke

    2009-01-01

    BACKGROUND: The increasing prevalence of eczema suggests the role of environmental factors triggering a genetic predisposition. OBJECTIVE: To analyze the effect of environmental exposures in early life and genetic predisposition on the development of eczema before age 3 years. METHODS: The Copenhagen Study on Asthma in Childhood is a prospective clinical study of a birth cohort of 411 children born of mothers with asthma. Eczema was diagnosed, treated, and monitored at the clinical research unit, and complete follow-up for the first 3 years of life was available for 356 children. Risk assessments included filaggrin loss-of-function mutation; parent's atopic disease; sex; social status; previous deliveries; third trimester complications and exposures; anthropometrics at birth; month of birth; duration solely breast-fed; introduction of egg, cow's milk, and fish; time spent in day care; cat...

  17. State of the art in benefit-risk analysis: medicines.

    Science.gov (United States)

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine is subject to debate, suffers from a number of limitations, and is still under development. This state of the art review paper will discuss the various aspects of and approaches to benefit-risk assessment in medicine in a chronological pathway. The review will discuss all types of benefit-risk assessment a medicinal product will undergo during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review we conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvement not only in methodology but also in cooperation between organizations can improve benefit-risk assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Germany wide seasonal flood risk analysis for agricultural crops

    Science.gov (United States)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping has gained attention. Regional to national risk assessments are needed, for example, for national risk policy development, for large-scale disaster management planning, and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have received little scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season, and flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role in agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is presented, along with the crop type most at risk in each region. The risk maps may provide guidance for coordinated, federal state-wide designation of retention areas.

  19. Unwanted Medication Collection Events: The Importance of Proper Disposal

    OpenAIRE

    Henderson, William L

    2015-01-01

    The objective of this service-learning research project was to discover current practices and barriers related to proper medication disposal while protecting the safety of the public and the integrity of the environment. Collection and analysis of unused medications followed by environmentally friendly disposal is a current vision found in the Healthy People 2020 initiative and coincides with this objective. Since 2012, medication collection events have been conducted in multiple locations in...

  20. Risk analysis. HIV / AIDS country profile: Mozambique.

    Science.gov (United States)

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system.

  1. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
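The loss functions named above can be illustrated with the inverted normal form: loss is zero at the process target and rises smoothly toward a maximum as the deviation grows. A minimal sketch; the parameter names and numbers are assumptions for illustration, not the authors' calibration:

```python
import math

def inverted_normal_loss(y, target, gamma, max_loss):
    """Inverted normal loss function: zero at the process target, rising
    smoothly toward max_loss as |y - target| grows. gamma controls how
    quickly economic losses saturate away from the target."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

# No loss when the process variable sits at its target; a large deviation
# approaches the maximum (e.g. total) loss.
print(inverted_normal_loss(100.0, 100.0, 5.0, 1e6))  # → 0.0
print(inverted_normal_loss(130.0, 100.0, 5.0, 1e6))  # ≈ 1e6 (near-total loss)
```

Integrating several such functions, one per loss type, then yields the scenario's total economic consequence.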

  2. ANALYSIS METHODS OF BANKRUPTCY RISK IN ROMANIAN ENERGY MINING INDUSTRY

    Directory of Open Access Journals (Sweden)

    CORICI MARIAN CATALIN

    2016-12-01

    The study is an analysis of bankruptcy risk and an assessment of the economic performance of an entity in the energy mining industry of the southwest region. It assesses the risk of bankruptcy using score-based methods and indicators that reflect the results obtained, drawing on balance-sheet elements of an organization involved in mining and energy that contributes to the stability of the national energy system. The analysis focuses on applying business models that allow a comprehensive assessment of the risk of bankruptcy and can serve as an instrument for forecasting it. The study highlights the evolution of bankruptcy risk within the organization through the Altman and Conan-Holder models, in order to present a versatile image of the organization's ability to ensure business continuity.
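The Altman model mentioned above is commonly stated, in its original 1968 form for publicly traded firms, as Z = 1.2·X1 + 1.4·X2 + 3.3·X3 + 0.6·X4 + 1.0·X5. A sketch under the assumption that this classic variant is meant (the abstract does not say which revision the authors use), with hypothetical balance-sheet figures:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Original (1968) Altman Z-score for publicly traded firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Classic cut-offs: distress below 1.81, safe above 2.99."""
    if z > 2.99:
        return "safe"
    if z >= 1.81:
        return "grey"
    return "distress"

# Hypothetical firm (all figures invented): Z ≈ 1.97 → grey zone.
z = altman_z(100, 150, 80, 400, 900, 1000, 500)
print(z, zone(z))
```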

  3. How does scientific risk assessment of GM crops fit within the wider risk analysis?

    Science.gov (United States)

    Johnson, Katy L; Raybould, Alan F; Hudson, Malcolm D; Poppy, Guy M

    2007-01-01

    The debate concerning genetically modified crops illustrates confusion between the role of scientists and that of wider society in regulatory decision making. We identify two fundamental misunderstandings, which, if rectified, would allow progress with confidence. First, scientific risk assessment needs to test well-defined hypotheses, not simply collect data. Second, risk assessments need to be placed in the wider context of risk analysis to enable the wider 'non-scientific' questions to be considered in regulatory decision making. Such integration and understanding is urgently required because the challenges to regulation will escalate as scientific progress advances.

  4. A Subjective Risk Analysis Approach of Container Supply Chains

    Institute of Scientific and Technical Information of China (English)

    Zai-Li Yang; Jin Wang; Steve Bonsall; Jian-Bo Yang; Quan-Gen Fang

    2005-01-01

    The 9/11 terrorist attacks, the lock-out of the American West Coast ports in 2002, and the outbreak of SARS in 2003 have further focused the minds of both the public and industrialists on taking effective and timely measures to assess and control the risks related to container supply chains (CSCs). However, due to the complexity of the risks in the chains, conventional quantitative risk assessment (QRA) methods may not be capable of providing sufficient safety management information, since achieving that functionality requires conducting risk analysis despite the challenges and uncertainties posed by the unavailability and incompleteness of historical failure data. Combining fuzzy set theory (FST) and an evidential reasoning (ER) approach, the paper presents a subjective method to deal with vulnerability-based risks, which are more ubiquitous and uncertain than the traditional hazard-based ones in the chains.

  5. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for the international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical st...

  6. Chronic wasting disease risk analysis workshop: An integrative approach

    Science.gov (United States)

    Gillette, Shana; Dein, Joshua; Salman, Mo; Richards, Bryan; Duarte, Paulo

    2004-01-01

    Risk analysis tools have been successfully used to determine the potential hazard associated with disease introductions and have facilitated management decisions designed to limit the potential for disease introduction. Chronic Wasting Disease (CWD) poses significant challenges for resource managers due to an incomplete understanding of disease etiology and epidemiology and the complexity of management and political jurisdictions. Tools designed specifically to assess the risk of CWD introduction would be of great value to policy makers in areas where CWD has not been detected.

  7. Credibility analysis of risk classes by generalized linear model

    Science.gov (United States)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, generalized linear models (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for the credible risk classes are interpreted.
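The limited fluctuation (full credibility) standard referenced above has a standard closed form for claim frequency: the expected claim count must reach (z/k)², where z is the two-sided normal quantile for coverage probability p and k the tolerated relative error. A sketch with the textbook defaults p = 0.90, k = 0.05 (the abstract does not give the paper's actual parameters):

```python
from math import sqrt
from statistics import NormalDist

def full_credibility_standard(p=0.90, k=0.05):
    """Expected number of claims needed for full credibility of the claim
    frequency under the limited fluctuation (Poisson) model."""
    z = NormalDist().inv_cdf((1 + p) / 2)  # two-sided normal quantile
    return (z / k) ** 2                    # ≈ 1082 for p=0.90, k=0.05

def credibility_factor(n_claims, p=0.90, k=0.05):
    """Square-root rule for partial credibility: Z = min(1, sqrt(n / n_full))."""
    return min(1.0, sqrt(n_claims / full_credibility_standard(p, k)))

print(round(full_credibility_standard(), 1))
print(round(credibility_factor(270), 3))   # a class with 270 claims gets Z ≈ 0.5
```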

  8. RISK LEVEL ANALYSIS ON THE PREVENTIVE EROSION CAPACITY OF BRIDGES

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Deficiency of the Preventive Erosion Capacity (PEC) of a bridge pier is the main factor leading to bridge failures. In this paper, the PEC of bridge piers was analyzed using a stochastic analysis method. Definitions of the reliability and risk level of a bridge pier subjected to water erosion were proposed, and a computational model for erosion depth and risk level was suggested.
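The reliability and risk-level definitions above can be illustrated with a generic Monte Carlo estimate of the probability that erosion (scour) depth exceeds a pier's foundation depth. The distribution and all numbers below are invented for illustration and are not the paper's computational model:

```python
import random

def failure_probability(n_trials=100_000, seed=1):
    """Monte Carlo sketch: a pier 'fails' when the simulated scour depth
    exceeds its foundation depth. The lognormal scour model and the 6 m
    foundation depth are hypothetical assumptions."""
    rng = random.Random(seed)
    foundation_depth = 6.0  # metres (assumed)
    failures = 0
    for _ in range(n_trials):
        scour = rng.lognormvariate(1.2, 0.5)  # hypothetical scour-depth draw, m
        if scour > foundation_depth:
            failures += 1
    return failures / n_trials

p_fail = failure_probability()
print(f"estimated risk level: {p_fail:.4f}, reliability: {1 - p_fail:.4f}")
```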

  9. Geomorphological risk analysis in the Republic of Belarus

    OpenAIRE

    2014-01-01

    Romanenko V. GIS-Mapping and Assessment of Geomorphological Risk in Belarus / V. Romanenko, D. Kurlovich // The geomorphology of natural hazards: mapping, analysis and prevention. Abstract book. 17th Joint Geomorphological Meeting, Liege (Belgium). 1-3 July 2014. – Liege. – P. 116. In the present study an assessment of geomorphological risk in the Republic of Belarus has been made. Geomorphological districts (according to geomorphological zoning) were the objects of the research.

  10. Ecological food web analysis for chemical risk assessment.

    Science.gov (United States)

    Preziosi, Damian V; Pastorok, Robert A

    2008-12-01

    Food web analysis can be a critical component of ecological risk assessment, yet it has received relatively little attention among risk assessors. Food web data are currently used in modeling bioaccumulation of toxic chemicals and, to a limited extent, in the determination of the ecological significance of risks. Achieving more realism in ecological risk assessments requires new analysis tools and models that incorporate accurate information on key receptors in a food web paradigm. Application of food web analysis in risk assessments demands consideration of: 1) different kinds of food webs; 2) definition of trophic guilds; 3) variation in food webs with habitat, space, and time; and 4) issues for basic sampling design and collection of dietary data. The different kinds of food webs include connectance webs, materials flow webs, and functional (or interaction) webs. These three kinds of webs play different roles throughout various phases of an ecological risk assessment, but risk assessors have failed to distinguish among web types. When modeling food webs, choices must be made regarding the level of complexity for the web, assignment of species to trophic guilds, selection of representative species for guilds, use of average diets, the characterization of variation among individuals or guild members within a web, and the spatial and temporal scales/dynamics of webs. Integrating exposure and effects data in ecological models for risk assessment of toxic chemicals relies on coupling food web analysis with bioaccumulation models (e.g., Gobas-type models for fish and their food webs), wildlife exposure models, dose-response models, and population dynamics models.

  11. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance.

  12. State of the art in benefit-risk analysis: introduction.

    Science.gov (United States)

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed in one currency. This allows the comparison of adverse and beneficial effects to be both qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume.

  13. Risk Analysis and Decision Making FY 2013 Milestone Report

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  14. Analysis of interactions among barriers in project risk management

    Science.gov (United States)

    Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita

    2017-06-01

    In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects have virtually no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the manifestation of barriers to project risk management is a less-explored topic. The success of project management is oftentimes based on understanding the barriers to effective risk management, application of an appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitudes, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and failure to address cultural differences are the high-priority barriers, among many others.
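The MICMAC step described above reduces to row and column sums of a binary reachability matrix: a barrier's row sum is its driving power, its column sum its dependence. A toy sketch (the matrix entries are hypothetical; the barrier names merely echo the paper's high-priority barriers):

```python
# Each row/column is a barrier; reach[i][j] = 1 if barrier i drives barrier j
# (self-reachability included). The matrix below is a hypothetical example.
barriers = ["lack of top mgmt support", "lack of formal training",
            "cultural differences", "worker attitude"]
reach = [
    [1, 1, 1, 1],   # lack of top management support drives everything
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],   # worker attitude drives only itself
]

for i, name in enumerate(barriers):
    driving = sum(reach[i])                    # row sum: driving power
    dependence = sum(row[i] for row in reach)  # column sum: dependence power
    print(f"{name:26s} driving={driving} dependence={dependence}")
```

High driving power with low dependence (here, lack of top management support) marks the root-level barriers that the ISM hierarchy places at the bottom.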

  15. SYNTHETIC ANALYSIS OF CREDIT RISK - PREVENTION AND MANAGEMENT

    Directory of Open Access Journals (Sweden)

    LĂPĂDUSI MIHAELA LOREDANA

    2013-02-01

    The uncertainty of the economic and social environment in which a company operates is the essential feature from which all types of hazards derive. Protection against risks and mitigation of their effects, measured by the losses they generate, are issues that have led to the continuous improvement of risk prevention and management measures. The article highlights a number of aspects related to the prevention and management of credit risk, two key actions in the conduct of a firm's business. In the article we focus on a synthetic analysis of the sources of information used in credit risk analysis, based on information from sources both within the company and outside it. The importance of credit risk prevention and management lies in being able to forecast the possible occurrence of a credit risk event and to take the necessary decisions in time in order to reduce it and its adverse consequences. The essence of credit risk can be expressed as the possibility of quantifying the likelihood of this risk appearing, with consequences that have a direct effect on the activity of banks or financial institutions.

  16. Complicated appendicitis: Analysis of risk factors in children

    Directory of Open Access Journals (Sweden)

    Mahavir Singh

    2014-01-01

    Background: Acute appendicitis (AA) is the most common surgical emergency in childhood. The risk of rupture is negligible within the first 24 h, climbing to 6% after 36 h from the onset of symptoms. Because of the difficulty of accurately diagnosing AA, a significant number of children are still being managed when the appendix is already perforated. There is a continuing need to diagnose AA early and to identify the risk factors associated with the development of complications in this condition. Patients and Methods: A total of 102 patients with a clinical diagnosis of AA were admitted during the study period. On admission, a thorough clinical history and proper physical examination were performed. All eligible patients finally diagnosed clinically with AA were scheduled for emergency open appendectomy. The removed appendix was sent for histopathological examination in all study subjects. Results: Of the 102 cases, 93 were histopathologically confirmed appendicitis; the remaining nine cases showed no evidence of inflammation, so the negative appendectomy rate was around 9%. On histopathology, a normal appendix was found in nine patients (8.9%), AA in 71 patients (69.6%), and complicated appendicitis (CA), which includes perforated and gangrenous appendicitis, in 22 patients (21.5%). Perforations were more common in patients younger than 5 years. More than 60% of patients presented with CA when the duration of pain exceeded 72 h. The presence of an appendicolith increased the probability of CA.

  17. Dietary Patterns and Pancreatic Cancer Risk: A Meta-Analysis.

    Science.gov (United States)

    Lu, Pei-Ying; Shu, Long; Shen, Shan-Shan; Chen, Xu-Jiao; Zhang, Xiao-Yan

    2017-01-05

    A number of studies have examined the associations between dietary patterns and pancreatic cancer risk, but the findings have been inconclusive. Herein, we conducted this meta-analysis to assess the associations between dietary patterns and the risk of pancreatic cancer. MEDLINE (provided by the National Library of Medicine) and EBSCO (Elton B. Stephens Company) databases were searched for relevant articles published up to May 2016 that identified common dietary patterns. Thirty-two studies met the inclusion criteria and were finally included in this meta-analysis. A reduced risk of pancreatic cancer was shown for the highest compared with the lowest categories of healthy patterns (odds ratio, OR = 0.86; 95% confidence interval, CI: 0.77-0.95; p = 0.004) and light-moderate drinking patterns (OR = 0.90; 95% CI: 0.83-0.98; p = 0.02). There was evidence of an increased risk for pancreatic cancer in the highest compared with the lowest categories of western-type pattern (OR = 1.24; 95% CI: 1.06-1.45; p = 0.008) and heavy drinking pattern (OR = 1.29; 95% CI: 1.10-1.48; p = 0.002). The results of this meta-analysis demonstrate that healthy and light-moderate drinking patterns may decrease the risk of pancreatic cancer, whereas western-type and heavy drinking patterns may increase the risk of pancreatic cancer. Additional prospective studies are needed to confirm these findings.
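Pooled odds ratios of the kind reported above are commonly obtained by inverse-variance weighting of log odds ratios. The abstract does not state the pooling model, so the fixed-effect version below, with invented study data, is only an illustration:

```python
from math import exp, log, sqrt

def pooled_or(studies):
    """Fixed-effect inverse-variance pooling of study odds ratios.
    Each study is (odds_ratio, ci_low, ci_high) with 95% CIs; the standard
    error is recovered from the CI width on the log scale."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (log(hi) - log(lo)) / (2 * 1.96)  # log-scale SE from the CI
        w = 1.0 / se ** 2                      # inverse-variance weight
        num += w * log(or_)
        den += w
    mean = num / den
    se_pooled = sqrt(1.0 / den)
    return exp(mean), exp(mean - 1.96 * se_pooled), exp(mean + 1.96 * se_pooled)

# Hypothetical studies, not the meta-analysis' data:
print(pooled_or([(0.80, 0.65, 0.98), (0.90, 0.75, 1.08), (0.85, 0.60, 1.20)]))
```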

  18. Analysis of risk factors and risk assessment for ischemic stroke recurrence

    Directory of Open Access Journals (Sweden)

    Xiu-ying LONG

    2016-08-01

    Objective: To screen the risk factors for recurrence of ischemic stroke and to assess the risk of recurrence. Methods: The Essen Stroke Risk Score (ESRS) was used to evaluate the risk of recurrence in 176 patients with ischemic stroke (96 cases of first onset and 80 cases of recurrence). Univariate and multivariate stepwise logistic regression analyses were used to screen risk factors for recurrence of ischemic stroke. Results: There were significant differences between the first-onset group and the recurrence group in age, the proportion of patients > 75 years old, hypertension, diabetes, coronary heart disease, peripheral angiopathy, transient ischemic attack (TIA) or ischemic stroke, drinking, and ESRS score (P < 0.05 for all). The first-onset group included one case with ESRS 0 (1.04%), 8 cases with ESRS 1 (8.33%), 39 cases with ESRS 2 (40.63%), 44 cases with ESRS 3 (45.83%), and 4 cases with ESRS 4 (4.17%). The recurrence group included 2 cases with ESRS 3 (2.50%), 20 cases with ESRS 4 (25%), 37 cases with ESRS 5 (46.25%), 18 cases with ESRS 6 (22.50%), and 3 cases with ESRS 7 (3.75%). The difference between the two groups was significant (Z = -11.376, P = 0.000). Logistic regression analysis showed that ESRS > 3 was an independent risk factor for recurrence of ischemic stroke (OR = 31.324, 95%CI: 3.934-249.430; P = 0.001). Conclusions: ESRS > 3 is an independent risk factor for recurrence of ischemic stroke. It is important to strengthen risk assessment of recurrence of ischemic stroke; screening and controlling risk factors is the key to secondary prevention of ischemic stroke. DOI: 10.3969/j.issn.1672-6731.2016.07.011
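The reported effect size can be round-tripped through the usual logistic-regression arithmetic: OR = exp(β) and the 95% CI is exp(β ± 1.96·SE). A sketch that back-calculates β and SE from the OR and CI quoted in the abstract (a standard reconstruction, not the authors' computation):

```python
from math import exp, log

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient."""
    return exp(beta), exp(beta - z * se), exp(beta + z * se)

# Working backwards from the reported OR = 31.324 (95% CI 3.934-249.430):
beta = log(31.324)                              # ≈ 3.444
se = (log(249.430) - log(3.934)) / (2 * 1.96)   # ≈ 1.059
print(odds_ratio_ci(beta, se))                  # recovers the reported values
```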

  19. [Competitive karate and the risk of HIV infection--review, risk analysis and risk minimizing strategies].

    Science.gov (United States)

    Müller-Rath, R; Mumme, T; Miltner, O; Skobel, E

    2004-03-01

    Bleeding facial injuries are not uncommon in competitive karate. Nevertheless, the risk of an infection with HIV is extremely low. Guidelines on the prevention of HIV infection are presented. Especially in contact sports and martial arts, athletes, judges and staff have to recognize and apply these recommendations. Bleeding wounds of the hands due to contact with the opponent's teeth can be minimized by fist padding.

  20. Security Risk Minimization for Desktop and Mobile Software Systems. An In-Depth Analysis

    Directory of Open Access Journals (Sweden)

    Florina Camelia PUICAN

    2014-01-01

    Full Text Available In an extremely rapidly growing industry such as information technology today, continuous and efficient workflows need to be established within any integrated enterprise or consumer software system. Taking into consideration the current trend of data and information migrating to mobile devices, which have become more than just simple gadgets, the security threats and vulnerabilities of software products have created a new playground for attackers, especially when a system offers cross-platform (desktop and mobile) functionality and applicability. In this context, the paper proposes an in-depth analysis of some of the weaknesses software systems present, also providing a set of solutions for minimizing and mitigating the risks of any solution, be it mobile or desktop. Even though consumer and enterprise systems have fundamentally different structures and architectures (due to the different needs of the end user), data loss or information leakage may and will affect any type of machine if the systems are not properly secured; risk minimization through an in-depth analysis of any integrated software system therefore becomes mandatory and needs extensive care.

  1. Risk-based analysis for prioritization and processing in the Los Alamos National Laboratory 94-1 program

    Energy Technology Data Exchange (ETDEWEB)

    Boerigter, S.T.; DeMuth, N.S.; Tietjen, G.

    1996-10-01

    A previous report, "Analysis of LANL Options for Processing Plutonium Legacy Materials," LA-UR-95-4301, summarized the development of a risk-based prioritization methodology for the Los Alamos National Laboratory (LANL) Plutonium Facility at Technical Area-55 (TA-55). The methodology described in that report was developed not only to assist processing personnel in prioritizing the remediation of legacy materials but also to evaluate the risk impacts of schedule modifications and changes. Several key activities were undertaken in the development of that methodology. The most notable was that the risk assessments were based on statistically developed data from sampling containers in the vault and evaluating their condition; the data from the vault sampling programs were used as the basis for risk estimates. Also, the time-dependent behavior of the legacy materials was explicitly modeled and included in the risk analysis. The results indicated that significant reductions in program risk can be achieved by proper prioritization of the materials for processing.

  2. From risk analysis to risk governance - Adapting to an ever more complex future

    Directory of Open Access Journals (Sweden)

    Dirk U. Pfeiffer

    2014-09-01

    Full Text Available Risk analysis is now widely accepted amongst veterinary authorities and other stakeholders around the world as a conceptual framework for integrating scientific evidence into animal health decision making. The resulting risk management for most diseases primarily involves linking epidemiological understanding with diagnostics and/or vaccines. Recent disease outbreaks such as Nipah virus, SARS, avian influenza H5N1, bluetongue serotype 8 and Schmallenberg virus have led to the realisation that we need to explicitly take into account the underlying complex interactions between environmental, epidemiological and social factors, which are often spatially and temporally heterogeneous as well as interconnected across affected regions and beyond. A particular challenge is to obtain an adequate understanding of the influence of human behaviour and to translate this into effective mechanisms leading to appropriate behaviour change where necessary. Both the One Health and ecohealth approaches reflect the need for such a holistic systems perspective; however, the current implementation of risk analysis frameworks for animal health and food safety is still dominated by a natural or biomedical perspective of science, as is the implementation of control and prevention policies. This article proposes to integrate the risk analysis approach with a risk governance framework which explicitly adds the socio-economic context to policy development and emphasizes the need for organisational change and stakeholder engagement.

  3. [Knowledge regarding Proper Use Guidelines for Benzodiazepines].

    Science.gov (United States)

    Inada, Ken

    2016-01-01

      Benzodiazepines (BZs) act as agonists at the gamma-aminobutyric acid (GABA)-BZ-receptor complex and thereby produce sedative and anti-anxiety effects. BZs are commonly used in several clinical areas as hypnotics or anti-anxiety drugs. However, once supplied by medical institutions, these drugs often lead to abuse and dependence. It is therefore important for institutions to supply and manage BZs properly. At Tokyo Women's Medical University Hospital, educational activities about the proper use of BZs are performed not only by medical doctors but also by pharmacists. We coordinate the distribution of leaflets and run an educational workshop. As a result of these activities, the number of patients receiving BZ prescriptions was reduced. In performing these activities, pharmacists were required to work for patients, doctors, and nurses; they acquired knowledge about BZs such as action mechanisms, efficacy, adverse effects, problems with co-prescription, and methods of discontinuing BZs, as well as information on coping techniques other than medication. The most important point in attending to patients is to address their anxieties.

  4. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for the risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS was developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  5. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes of variations in the data. Essential limiting factors of urban flooding QMRAs were identified as uncertainty regarding ingestion volumes, the limited use of dose-response models, low numbers of microbial parameter measurements and absent validation of the risk assessments. Because improving knowledge...

  6. Analysis of the Hazard, Vulnerability, and Exposure to the Risk of Flooding (Alba de Yeltes, Salamanca, Spain)

    Directory of Open Access Journals (Sweden)

    Sergio Veleda

    2017-02-01

    Full Text Available The present work has developed a method using GIS technology to evaluate the danger, vulnerability, and exposure to the risk of flooding in the Alba de Yeltes area (Salamanca, Spain). It is a non-structural measure for the prevention and mitigation of the risk of extraordinary flooding. After completing a full analysis of the physical environment (climate, geology, geomorphology, hydrology, hydrogeology, and land use), hydrological-hydraulic modeling was carried out using the GeoHecRas river analysis software. The results obtained from the analysis and the models have generated a danger map that facilitates the efficient evaluation of the spatial distribution of the different severity parameters (depth of the water sheet, current flow rate, and flood-prone areas). Also, map algebra and the databases associated with GIS tools, together with the vulnerability and exposure cartography, have allowed the risk to be analyzed in an integrated manner and an environmental diagnostic map to be produced. The results of this study indicate that there are inhabited areas close to the Yeltes-Morasverdes riverbed that have a high risk of flooding, pointing to the need for proper land planning and the implementation of a series of measures that will help to reduce the risk of flooding and its impact.

  7. Risk analysis by FMEA as an element of analytical validation.

    Science.gov (United States)

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs on authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D) and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated by Risk Priority Numbers (RPNs)=O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
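
The RPN ranking described above is straightforward to reproduce. A minimal sketch in Python, using hypothetical failure modes and scores rather than the paper's actual FMEA data:

```python
# Risk Priority Number: RPN = O x D x S, each factor scored on a 1-10 scale
# (O = occurrence frequency, D = probability of non-detection, S = severity).

def rpn(occurrence: int, detection: int, severity: int) -> int:
    """Risk Priority Number for one failure mode."""
    for factor in (occurrence, detection, severity):
        if not 1 <= factor <= 10:
            raise ValueError("each factor must be on a 1-10 scale")
    return occurrence * detection * severity

# Hypothetical failure modes for an NIR screening procedure: (O, D, S).
failure_modes = {
    "sample mislabelled by operator": (4, 6, 8),
    "spectrometer out of calibration": (2, 3, 7),
    "wrong reference spectrum loaded": (3, 5, 9),
}

# Rank failure modes by descending RPN; the top entries get corrective action.
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, (o, d, s) in ranked:
    print(f"RPN={rpn(o, d, s):3d}  {name}")
```

After corrective actions, the scoring is repeated and the before/after ratio of RPNs gives the improvement index reported in the abstract.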

  8. Empirical Analysis of Urban Residents’ Perceived Climatic Change Risks

    Institute of Scientific and Technical Information of China (English)

    Peihui; DAI; Lingling; HUANG

    2014-01-01

    The impact of climate change on human survival, security and urban development is profound, and it receives more and more attention. To explore urban residents' perception of the risks of climate change and to put forward corresponding countermeasures and suggestions, taking Wuhan as an example and adopting the microscopic point of view of urban residents, we use factor analysis to classify the perceived risks and recognized risk-reduction measures, cluster analysis to divide the urban residents into five groups, and variance analysis to explore differences in the choice of measures between the different cluster groups. We draw the following conclusions: the risk of deterioration of the ecological environment, the risk of economic damage, the risk of damage to mental health, the risk of damage to physical health and the risk of damage to political harmony are the main risks of climate change for urban residents; the recognized risk-reduction measures are for individuals and families to develop good habits, for businesses and governments to strengthen energy conservation, for schools and other agencies to carry out propaganda and education, for multi-agent environmental improvement to be undertaken, and to learn from the West. Depending on the perceived risks, the urban residents cluster into five groups: those concerned about the body and politics, those concerned about mental health, those concerned about economic development, those concerned about ecological safety, and those who ignore climate change. On the roles of the individual and the family, business and government in environmental protection, the different groups hold unanimous views, while on other measures, different groups have different understandings. It is concluded that individuals and families should develop environmentally friendly habits, government should strengthen regulation, businesses should take environmental responsibility, schools should strengthen publicity and education, and exploring

  9. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.

  10. Downside Risk analysis applied to Hedge Funds universe

    CERN Document Server

    Perello, J

    2006-01-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires high-precision risk evaluation and an appropriate risk metric. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index data.

  11. Risk Analysis on Uric Acid Resulting in Carotid Atherosclerosis

    Institute of Scientific and Technical Information of China (English)

    肖敏; 李河; 郭兰; 石美铃; 麦劲壮

    2004-01-01

    Objectives To explore the risk of uric acid (UA) resulting in carotid atherosclerosis. Methods In a cross-sectional study, 643 subjects (aged 41-83 yrs; 552 male and 91 female) were surveyed in 1999 in Guangdong Province, China. The main research variables were uric acid (UA) and the occurrence and size of carotid artery plaque. Results There was no statistically significant difference between the UA means of the plaque occurrence and non-occurrence groups (t = 0.60, df = 242, P = 0.5495). UA did not appear to be a risk factor for carotid atherosclerosis (OR = 1.060, P = 0.8448 > 0.05, n = 244) based on the logistic regression analysis. Conclusions Our results are not consistent with serum UA being an independent risk factor for atherosclerosis and coronary heart disease (CHD). More research is needed to determine the contribution of UA to the progression of atherosclerosis/CHD.

  12. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures by taking the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
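
The distinction between good and bad returns can be made concrete with the downside deviation, one of the standard Downside Risk indicators: only returns below the investor's target (minimum acceptable return) contribute to the risk figure. A minimal sketch with hypothetical monthly returns, not the Tremont index data:

```python
# Downside deviation: root mean square of shortfalls below a target return.
# Returns at or above the target contribute zero.
import math

def downside_deviation(returns, target=0.0):
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return math.sqrt(sum(shortfalls) / len(returns))

monthly_returns = [0.021, -0.034, 0.012, 0.008, -0.051, 0.017]  # hypothetical
dd = downside_deviation(monthly_returns, target=0.0)
print(f"downside deviation: {dd:.4f}")
```

Replacing the standard deviation in the Sharpe Ratio denominator with this quantity yields the Sortino-style ratios discussed in the Downside Risk literature.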

  13. Regional Hazard Analysis For Use In Vulnerability And Risk Assessment

    Directory of Open Access Journals (Sweden)

    Maris Fotios

    2015-09-01

    Full Text Available A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied on the island of Cyprus. The method aggregates the output of a hydrological flow model, forced by observed temperatures and precipitation, with observed discharge data. A scheme supported by observed discharge is applied for model calibration. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use, and thus for the vulnerability of elements at risk, by comparing observed and simulated flow patterns using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  14. Analysis of coastal protection under rising flood risk

    Directory of Open Access Journals (Sweden)

    Megan J. Lickley

    2014-01-01

    Full Text Available Infrastructure located along the U.S. Atlantic and Gulf coasts is exposed to rising risk of flooding from sea level rise, increasing storm surge, and subsidence. In these circumstances coastal management commonly based on 100-year flood maps assuming current climatology is no longer adequate. A dynamic programming cost–benefit analysis is applied to the adaptation decision, illustrated by application to an energy facility in Galveston Bay. Projections of several global climate models provide inputs to estimates of the change in hurricane and storm surge activity as well as the increase in sea level. The projected rise in physical flood risk is combined with estimates of flood damage and protection costs in an analysis of the multi-period nature of adaptation choice. The result is a planning method, using dynamic programming, which is appropriate for investment and abandonment decisions under rising coastal risk.

  15. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Science.gov (United States)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate it into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software

  16. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr; Paramythiotis, Spyridon, E-mail: pskan@aua.gr [Laboratory of Food Quality Control and Hygiene, Department of Food Science and Technology, Agricultural University of Athens, Iera Odos 75, 118 55, Athens (Greece)

    2015-01-22

    The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate it into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user

  17. Flood risk analysis model in the village of St. George/Danube Delta

    Science.gov (United States)

    Armas, I.; Dumitrascu, S.; Nistoran, D.

    2009-04-01

    the Danube Delta. The study area is situated at the mouth of the St. George branch of the Danube, which underwent a series of interventions resulting in its shortening by 31 km (1984-1988). As a direct result, the mean water velocity increased, along with both the liquid and solid discharge. In fact, this is only one example of the human activity in the Danube Delta, starting with the second half of the last century, that altered the hydrological system for a better use of the natural resources offered by the delta. The study is structured in two stages: the analysis of the hydrological hazard, together with the simulation of a series of flood scenarios at various flows, and the risk analysis, expressed as the calculation of the material damage. In the study of the hazard, the methodology was based on the analysis of water depth and velocity maps, produced for various flow scenarios, to which were added correlations between flood risk maps and satellite images, cadastral plans and field data using GIS functions. In addition, the field investigations conducted in September 2008 focused on collecting the data necessary for the assessment of the buildings. The observations synthesizing the features of each construction included in the analysis were also stored in ArcGIS as a table of attributes. This information covers the indicators used in the analysis of the vulnerability of the residences: number of floors, height, construction type, infrastructure and price per property. The analysis revealed an increased degree of visibility of the area, pointing out not only certain sectors affected by floods, but also problems occurring at the more detailed level of individual residences. The cartographic material also plays an important part in the development of a proper public awareness strategy.

  18. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring industrial facilities. The present study aims to support and provide orientation for developing Emergency Planning from the data obtained in Quantitative Risk Analysis, elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis so that they can be used in Emergency Plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from Quantitative Risk Analysis to develop emergency plans. (author)

  19. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    NARCIS (Netherlands)

    Picone, S.; Valstar, J.R.; Gaans, van P.; Grotenhuis, J.T.C.; Rijnaarts, H.H.M.

    2012-01-01

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the v

  20. RISK ANALYSIS APPLIED IN OIL EXPLORATION AND PRODUCTION

    African Journals Online (AJOL)

    ES Obe

    This research investigated the application of risk analysis to Oil exploration and production. Essentially ... uncertainty in Oil field projects; it reduces the impact of the losses should an unfavourable .... own merit but since the company has limited funds it can be ..... ference, New Orleans, LA, September 27-30. (1998). 8. Seba ...

  1. Risk management of domino effects considering dynamic consequence analysis.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2014-06-01

    Domino effects are low-probability high-consequence accidents causing severe damage to humans, process plants, and the environment. Because domino effects affect large areas and are difficult to control, preventive safety measures have been given priority over mitigative measures. As a result, safety distances and safety inventories have been used as preventive safety measures to reduce the escalation probability of domino effects. However, these safety measures are usually designed considering static accident scenarios. In this study, we show that compared to a static worst-case accident analysis, a dynamic consequence analysis provides a more rational approach for risk assessment and management of domino effects. This study also presents the application of Bayesian networks and conflict analysis to risk-based allocation of chemical inventories to minimize the consequences and thus to reduce the escalation probability. It emphasizes the risk management of chemical inventories as an inherent safety measure, particularly in existing process plants where the applicability of other safety measures such as safety distances is limited. © 2013 Society for Risk Analysis.

  2. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    Science.gov (United States)

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
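
The probabilistic fault tree calculation described above can be illustrated with a Monte Carlo simulation of a toy two-gate tree (not the Swedish case study itself): the top event "no water delivered" fires if the source fails, or if both of two redundant pumps fail (an OR gate over an AND gate). The component probabilities below are assumed purely for illustration.

```python
# Monte Carlo estimation of a fault tree's top-event probability.
# Top event = source_fail OR (pump1_fail AND pump2_fail).
import random

def simulate(p_source=0.02, p_pump=0.05, n=200_000, seed=1):
    random.seed(seed)
    failures = 0
    for _ in range(n):
        source_fail = random.random() < p_source
        # Two independent pump failure draws behind an AND gate.
        pumps_fail = random.random() < p_pump and random.random() < p_pump
        if source_fail or pumps_fail:
            failures += 1
    return failures / n

estimate = simulate()
# Exact result for comparison: P(A or B) = P(A) + P(B) - P(A)P(B).
exact = 0.02 + 0.05 * 0.05 - 0.02 * 0.05 * 0.05
print(f"Monte Carlo: {estimate:.4f}  exact: {exact:.4f}")
```

In the study itself, the Monte Carlo draws would be over uncertain probability estimates (hard data plus expert judgement) rather than fixed point values, yielding a distribution of risk rather than a single number.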

  3. Musculoskeletal diseases of spine and risk factors during Dentistry: Multilevel ergonomic analysis

    Directory of Open Access Journals (Sweden)

    Charilaos Koutis

    2011-07-01

    Full Text Available The prevalence of occupational musculoskeletal diseases (MSDs) among dentists is estimated to be high, despite the ergonomic interventions in this sector. The aims of the present study were (a) the evaluation of spine MSDs in dentists and (b) the assessment of risk factors related to dental practice. Material and Method: The sample consisted of 16 dentists (n=16), divided into two groups based on MSDs of the spine. A multilevel ergonomic analysis was conducted in both groups, evaluating individual, physical and occupational risk factors during nine dental procedures. Data were analysed using direct methods (video, observation, the amended OWAS postural analysis), an indirect method (questionnaire) and quantitative methods of ergonomic analysis (computerized mediball postural stabilizer cushion). Results: The most frequent spine MSDs among the dentists in the present research were localized in the low back (66.7%) and neck (8.3%). Based on OWAS analysis of 2348 working postures, statistically significant correlations were found between dentists' MSDs and factors concerning the dentists themselves (weakness of the stabilizer muscles of the spine, awkward positions during working time, fatigue; p<0.05) as well as the nature of dental work (specific dental procedures, the positions of patient, tools and dentist during working time, certain areas of the mouth, working hours, lack of breaks, etc.; p<0.05). Conclusions: Low back pain and neck pain are the most frequent MSDs of dentists' spine. They are related to individual and occupational factors that could be prevented with proper ergonomic interventions.

  4. On the proper motion of auroral arcs

    Energy Technology Data Exchange (ETDEWEB)

    Haerendel, G.; Raaf, B.; Rieger, E. (Max-Planck-Institut fuer Extraterrestrische Physik, Garching (Germany)); Buchert, S. (EISCAT Scientific Association, Kiruna (Sweden)); Hoz, C. la (Univ. of Tromso (Norway))

    1993-04-01

    The authors report on a series of measurements of the proper motion of auroral arcs, made using the EISCAT incoherent scatter radar. Radar measurements are correlated with auroral imaging from the ground to observe the arcs and sense their motion. The authors examine two broad classes of auroral arcs: a slow class (approximately 100 m/s) observed to move either poleward or equatorward, and a much faster class, observed to move poleward, which represents the class of events most studied in the past. They fit their observations to a previous model which provides a potential energy source for these events. The observations are consistent with the model, though no clear explanation for the actual cause of the motion can be reached from these limited measurements.

  5. Tracking magnetogram proper motions by multiscale regularization

    Science.gov (United States)

    Jones, Harrison P.

    1995-01-01

    Long uninterrupted sequences of solar magnetograms from the Global Oscillation Network Group (GONG) network and from the Solar and Heliospheric Observatory (SOHO) satellite will provide the opportunity to study the proper motions of magnetic features. This paper explores the possible use of multiscale regularization, a scale-recursive estimation technique which begins with a prior model of how state variables and their statistical properties propagate over scale. Short magnetogram sequences are analyzed with the multiscale regularization algorithm as applied to optical flow. The algorithm is found to be efficient, provides results for all the spatial scales spanned by the data, and provides error estimates for the solutions. It is also found to be less sensitive to evolutionary changes than correlation tracking.

  6. Survey of stellar associations using proper motions

    Directory of Open Access Journals (Sweden)

    C. Abad

    2001-01-01

    Full Text Available Stellar proper motions can be represented as great circles over the celestial sphere. This point of view creates a geometry on the sphere in which the parallelism of the motions can be studied in a straightforward way. Computing intersections between circles can detect a convergence point of the motions, which indicates parallel spatial motion. The model can be applied to open star clusters, identifying convergence points as the apex in order to obtain membership probabilities, or, more generally, to stars of our galaxy to detect large stellar structures and to infer some details about their kinematics. We present here a short description of the model and some examples using stars of the Hipparcos catalogue.
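The circle-intersection idea can be made concrete: a star's unit position vector and the tangent direction of its proper motion define a great circle whose pole is their cross product, and two such circles meet at plus/minus the cross product of their poles. A minimal sketch with illustrative vectors (not Hipparcos data):

```python
import math

# Each star's great circle is encoded by its pole = position x motion_dir;
# two great circles intersect at the antipodal pair +/- (pole1 x pole2),
# a candidate convergence point (apex) if the spatial motions are parallel.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(sum(x*x for x in v))
    return tuple(x / n for x in v)

def great_circle_pole(position, motion_dir):
    return normalize(cross(position, motion_dir))

def intersection(pole1, pole2):
    p = normalize(cross(pole1, pole2))
    return p, tuple(-x for x in p)  # antipodal pair of candidate apexes

# Two stars whose motions both point toward an apex on the +x axis:
pole_a = great_circle_pole((0, 1, 0), (1, 0, 0))   # circle in the x-y plane
pole_b = great_circle_pole((0, 0, 1), (1, 0, 0))   # circle in the x-z plane
print(intersection(pole_a, pole_b))  # (+/-1, 0, 0): apex on the x axis
```

With many stars, clustering the pairwise intersection points would reveal a common apex and hence candidate members of a moving group.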

  7. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    Science.gov (United States)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, the need is emerging to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  8. Proper body mechanics from an engineering perspective.

    Science.gov (United States)

    Mohr, Edward G

    2010-04-01

    The economic viability of the manual therapy practitioner depends on the number of massages/treatments that can be given in a day or week. Fatigue or injuries can have a major impact on income potential and could ultimately reach the point of causing the practitioner to quit the profession and seek other, less physically demanding employment. Manual therapy practitioners in general, and massage therapists in particular, can utilize a large variety of body postures while giving treatment to a client. The hypothesis of this paper is that there is an optimal method for applying force to the client, which maximizes the benefit to the client and at the same time minimizes the strain and effort required of the practitioner. Two methods were used to quantifiably determine the effect of using "poor" body mechanics (improper method) and "best" body mechanics (proper/correct method). The first approach uses computer modeling to compare the two methods. Both postures were modeled such that the biomechanical effects on the practitioner's elbow, shoulder, hip, knee and ankle joints could be calculated. The force applied to the client, along with the height and angle of application of the force, was held constant for the comparison. The second approach was a field study of massage practitioners (n=18) to determine their maximal force capability, again comparing improper and proper body mechanics. Five application methods were tested at three different application heights, using a digital palm force gauge. Results showed that there was a definite difference between the two methods, and that the use of correct body mechanics can have a large impact on the health and well-being of the massage practitioner over both the short and long term.

  9. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitatively characterises their vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented in order to illustrate the platform's potential.

  10. Recent Advances in Risk Analysis and Management (RAM

    Directory of Open Access Journals (Sweden)

    Arpita Banerjee

    2014-12-01

    Full Text Available In today's age, organizations consider the software development process an investment activity that depends on the comprehensive and precise working of each phase in the Software Development Lifecycle. Flaws from each phase can remain undetected from the requirements phase through the maintenance phase. Flaws or defects left unattended in one phase are carried forward to the next, aggregating the issues. These undetected flaws should be identified and removed as early as possible so as to reduce additional overheads. From the data available, it is concluded that risk analysis is a major factor that is ignored during all phases of the software development process, resulting in the emergence of undetected defects and flaws. Because of the failure of many projects, the importance of risk analysis during the software development process is now well recognized. A range of diverse research efforts are proceeding towards analyzing risk 'right from the beginning' of the software development process. Though researchers have contributed significantly to the field, still more needs to be achieved. This paper presents a review of the current research being done in Risk Analysis and Management (RAM), based on recently published work. The study is carried out with respect to the analysis and management of risk in the various phases of the SDLC. Such a thorough review enables one to identify mature areas of research, as well as areas that need further investigation. Finally, after critical analysis of the current research findings, future research directions are highlighted with their significance.

  12. Risk analysis in radiosurgery treatments using risk matrices; Analisis de riesgos en tratamientos de radiocirugia mediante matrices de riesgo

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, J. M.; Sanchez Cayela, C.; Ramirez, M. L.; Perez, A.

    2011-07-01

    The aim of this study is the risk analysis of the stereotactic single-dose radiotherapy process, with the evaluation of those initiating events that lead to increased risk and possible solutions in the design of barriers.

  13. Broad-scale recombination patterns underlying proper disjunction in humans.

    Directory of Open Access Journals (Sweden)

    Adi Fledel-Alon

    2009-09-01

    Full Text Available Although recombination is essential to the successful completion of human meiosis, it remains unclear how tightly the process is regulated and over what scale. To assess the nature and stringency of constraints on human recombination, we examined crossover patterns in transmissions to viable, non-trisomic offspring, using dense genotyping data collected in a large set of pedigrees. Our analysis supports a requirement for one chiasma per chromosome rather than per arm to ensure proper disjunction, with additional chiasmata occurring in proportion to physical length. The requirement is not absolute, however, as chromosome 21 seems to be frequently transmitted properly in the absence of a chiasma in females, a finding that raises the possibility of a back-up mechanism aiding in its correct segregation. We also found a set of double crossovers in surprisingly close proximity, as expected from a second pathway that is not subject to crossover interference. These findings point to multiple mechanisms that shape the distribution of crossovers, influencing proper disjunction in humans.

  14. Limited-memory adaptive snapshot selection for proper orthogonal decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Oxberry, Geoffrey M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kostova-Vassilevska, Tanya [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arrighi, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chand, Kyle [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-04-02

    Reduced order models are useful for accelerating simulations in many-query contexts, such as optimization, uncertainty quantification, and sensitivity analysis. However, offline training of reduced order models can have prohibitively expensive memory and floating-point operation costs in high-performance computing applications, where memory per core is limited. To overcome this limitation for proper orthogonal decomposition, we propose a novel adaptive selection method for snapshots in time that limits offline training costs by selecting snapshots according to an error control mechanism similar to that found in adaptive time-stepping ordinary differential equation solvers. The error estimator used in this work is related to theory bounding the approximation error in time of proper orthogonal decomposition-based reduced order models, and memory usage is minimized by computing the singular value decomposition using a single-pass incremental algorithm. Results for a viscous Burgers' test problem demonstrate convergence in the limit as the algorithm error tolerances go to zero; in this limit, the full order model is recovered to within discretization error. The resulting method can be used on supercomputers to generate proper orthogonal decomposition-based reduced order models, or as a subroutine within hyperreduction algorithms that require taking snapshots in time, or within greedy algorithms for sampling parameter space.
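The error-controlled selection idea can be illustrated with a simplified greedy variant: keep a snapshot only when its projection residual on the current basis exceeds a tolerance. This sketch substitutes Gram-Schmidt for the paper's incremental SVD and uses toy data; it shows only the error-control mechanism, not the full method:

```python
import math

# Simplified sketch of error-controlled snapshot selection: a snapshot is
# kept only if its residual after projection onto the current basis exceeds
# `tol`; the normalized residual then extends the basis (Gram-Schmidt).
# The paper uses a single-pass incremental SVD instead of Gram-Schmidt.

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def select_snapshots(snapshots, tol=1e-8):
    basis, kept = [], []
    for i, s in enumerate(snapshots):
        r = list(s)
        for q in basis:                      # project out the current basis
            c = dot(q, r)
            r = [ri - c * qi for ri, qi in zip(r, q)]
        if norm(r) > tol:                    # snapshot adds new information
            basis.append([ri / norm(r) for ri in r])
            kept.append(i)
    return kept, basis

snaps = [(1.0, 0.0, 0.0),
         (2.0, 0.0, 0.0),     # linearly dependent -> skipped
         (0.0, 1.0, 0.0)]     # new direction -> kept
print(select_snapshots(snaps)[0])  # [0, 2]
```

Because redundant snapshots are discarded on the fly, memory grows with the basis size rather than with the number of time steps, which is the point of the paper's limited-memory approach.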

  15. Source contribution and risk assessment of airborne toxic metals by neutron activation analysis in Taejeon industrial complex area - Concentration analysis and health risk assessment of airborne toxic metals in Taejeon 1,2 industrial Complex

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. H.; Jang, M. S.; Nam, B. H.; Yun, M. J. [Chungnam National Univ., Taejeon (Korea)

    2000-04-01

    The study centers on a one-year continual concentration analysis using ICP-MS and on health risk assessment of 15 airborne toxic metals in the Taejeon 1,2 industrial complex. The one-year arithmetic means of the human carcinogens arsenic, hexavalent chromium and nickel subsulfide are 6.05, 2.40 and 2.81 ng/m{sup 3}, while the means of the probable human carcinogens beryllium, cadmium and lead are 0.06, 3.92 and 145.99 ng/m{sup 3}, respectively. The long-term arithmetic mean concentration of the non-carcinogenic metal manganese is 44.60 ng/m{sup 3}. The point risk estimate for the inhalation of carcinogenic metals is 7.0 x 10{sup -5}, which is higher than the risk standard of 10{sup -5}. The risk from human carcinogens is 6.2 x 10{sup -5}, while that from probable human carcinogens is 8.0 x 10{sup -6}. About 86% of the cancer risk is due to the inhalation of the human carcinogens arsenic and hexavalent chromium. Thus, it is necessary to properly manage both arsenic and hexavalent chromium risk in the Taejeon 1,2 industrial complex. 37 refs., 13 figs., 9 tabs. (Author)
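Risk figures like these follow the standard inhalation model: lifetime cancer risk = air concentration times inhalation unit risk, summed over carcinogens. A sketch using the study's arsenic and hexavalent chromium means; the unit-risk values are assumptions of the order published by US EPA IRIS, not values taken from this study:

```python
# Minimal sketch: lifetime inhalation cancer risk =
#   concentration (ug/m3) x inhalation unit risk (per ug/m3), summed.
# Unit-risk values below are assumed (order of US EPA IRIS figures).

IUR = {"arsenic": 4.3e-3, "chromium_vi": 1.2e-2}     # per ug/m3 (assumed)
conc_ng_m3 = {"arsenic": 6.05, "chromium_vi": 2.40}  # annual means from the study

risks = {m: (c / 1000.0) * IUR[m] for m, c in conc_ng_m3.items()}  # ng -> ug
total = sum(risks.values())
for m, r in risks.items():
    print(f"{m}: {r:.1e}")
print(f"combined: {total:.1e}")  # ~5e-5, same order as the reported 6.2e-5
```

The result reproduces the order of magnitude of the reported risk; the remainder of the study's 6.2 x 10{sup -5} comes from nickel subsulfide, which this sketch omits.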

  16. Dietary Factors Affecting Thyroid Cancer Risk: A Meta-Analysis.

    Science.gov (United States)

    Cho, Young Ae; Kim, Jeongseon

    2015-01-01

    Some dietary factors are proposed to affect thyroid carcinogenesis, but previous studies have reported inconsistent findings. Therefore, we performed a meta-analysis, including 18 eligible studies, to clarify the role of dietary factors in the risk of thyroid cancer. Relative risks (RRs) with 95% confidence intervals (95% CIs) were estimated to assess the associations; heterogeneity tests, subgroup and sensitivity analyses, and bias assessments were also performed. When the results from all studies were combined, dietary iodine, fish, and cruciferous vegetable intake were not associated with thyroid cancer. However, when the data were divided by geographic location based on iodine availability, a slight increase in the risk of thyroid cancer was observed among those consuming a high total amount of fish in iodine-nondeficient areas (RR: 1.18; 95% CI: 1.03-1.35; P for heterogeneity = 0.282). When excluding studies examining a single food item and those using hospital-based controls, a high intake of cruciferous vegetables was associated with an increased risk of thyroid cancer in iodine-deficient areas (RR: 1.43; 95% CI: 1.18-1.74; P for heterogeneity = 0.426). This meta-analysis implies that the role of dietary factors, such as fish and cruciferous vegetables, in thyroid cancer risk can differ based on iodine availability.
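The pooling behind summary RRs like these is typically an inverse-variance random-effects model (DerSimonian-Laird) on the log scale. A sketch with three hypothetical studies, not the 18 included in this meta-analysis:

```python
import math

# Minimal sketch of DerSimonian-Laird random-effects pooling of relative
# risks. Each study contributes (RR, 95% CI low, 95% CI high); pooling is
# done on the log scale, and the CI width recovers each study's standard
# error. All study data below are hypothetical.

def pooled_rr(studies):
    y = [math.log(rr) for rr, lo, hi in studies]
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
    w = [1 / s**2 for s in se]                       # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)    # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]           # random-effects weights
    est = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(est),
            math.exp(est - 1.96 * se_re), math.exp(est + 1.96 * se_re))

print(pooled_rr([(1.20, 0.95, 1.52), (1.05, 0.80, 1.38), (1.35, 1.02, 1.79)]))
```

When Cochran's Q is below its degrees of freedom, tau-squared is truncated at zero and the estimate coincides with the fixed-effect result, which is what happens with these illustrative inputs.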

  17. Analysis of risk factors for T. brucei rhodesiense sleeping sickness within villages in south-east Uganda

    Directory of Open Access Journals (Sweden)

    Odiit Martin

    2008-06-01

    Full Text Available Abstract Background: Sleeping sickness (HAT) caused by T. b. rhodesiense is a major veterinary and human public health problem in Uganda. Previous studies have investigated spatial risk factors for T. b. rhodesiense at large geographic scales, but none have properly investigated such risk factors at small scales, i.e. within affected villages. In the present work, we use a case-control methodology to analyse both behavioural and spatial risk factors for HAT in an endemic area. Methods: The present study investigates behavioural and occupational risk factors for infection with HAT within villages using a questionnaire-based case-control study conducted in 17 villages endemic for HAT in SE Uganda, and spatial risk factors in 4 high-risk villages. For the spatial analysis, the locations of homesteads with one or more cases of HAT up to three years prior to the beginning of the study were compared to all non-case homesteads. Analysing spatial associations with respect to irregularly shaped geographical objects required the development of a new approach to geographical analysis in combination with a logistic regression model. Results: Among other behavioural risk factors, the study identified having a family member with a history of HAT (p = 0.001), as well as proximity of a homestead to a nearby wetland area, as risk factors. Conclusion: Spatial risk factors for HAT are maintained across geographical scales; this consistency is useful in the design of decision support tools for intervention and prevention of the disease. Familial aggregation of cases was confirmed for T. b. rhodesiense HAT in the study and probably results from shared behavioural and spatial risk factors among members of a household.

  18. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  19. A comprehensive risk analysis of coastal zones in China

    Science.gov (United States)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experience of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second, the severity phase, comprises geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of the different risks in coastal cities.
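The AHP weighting step can be sketched as extracting the principal eigenvector of a pairwise comparison matrix by power iteration. The judgements below are illustrative (a 3x3 comparison on the Saaty 1-9 scale), not those elicited in the study:

```python
# Minimal sketch: AHP indicator weights as the principal eigenvector of a
# pairwise comparison matrix, computed by power iteration. Judgements are
# hypothetical.

def ahp_weights(m, iters=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]   # renormalise so weights sum to 1
    return w

# Hypothetical comparison of three indicators (economy, society, environment):
m = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
print([round(w, 3) for w in ahp_weights(m)])  # roughly [0.648, 0.230, 0.122]
```

A full AHP application would also compute the consistency ratio from the dominant eigenvalue before accepting the judgements; that check is omitted here for brevity.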

  20. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparison of the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and further research developments are proposed.
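The minimum cost flow formulation can be sketched with a successive-shortest-path solver on a toy network. The costs and capacities below are hypothetical and the solver is a generic textbook algorithm, not OPTIPATH's implementation:

```python
# Minimal sketch of the 'minimum cost flow problem': route a required number
# of vehicle units from origin to destination at least total cost, honouring
# arc capacities. Arc costs stand for out-of-pocket plus risk-related cost
# per vehicle; capacities come from the risk criteria. Solved by successive
# shortest augmenting paths (Bellman-Ford on the residual network).

def min_cost_flow(n, arcs, src, dst, flow_needed):
    # arcs: list of (u, v, capacity, unit_cost)
    cap, cost = {}, {}
    for u, v, c, w in arcs:
        cap[(u, v)] = c; cap[(v, u)] = 0
        cost[(u, v)] = w; cost[(v, u)] = -w
    total_cost = 0
    while flow_needed > 0:
        # Bellman-Ford shortest path in the residual network
        dist = [float("inf")] * n
        prev = [None] * n
        dist[src] = 0
        for _ in range(n - 1):
            for (u, v), c in cap.items():
                if c > 0 and dist[u] + cost[(u, v)] < dist[v]:
                    dist[v] = dist[u] + cost[(u, v)]
                    prev[v] = u
        if dist[dst] == float("inf"):
            raise ValueError("demand exceeds network capacity")
        # find the bottleneck along the path, then push flow
        path, v = [], dst
        while v != src:
            path.append((prev[v], v)); v = prev[v]
        push = min(flow_needed, min(cap[e] for e in path))
        for e in path:
            cap[e] -= push
            cap[(e[1], e[0])] += push
        total_cost += push * dist[dst]
        flow_needed -= push
    return total_cost

# Toy network: route 2 vehicle units from depot 0 to destination 3.
arcs = [(0, 1, 1, 4), (0, 2, 2, 2), (1, 3, 2, 1), (2, 3, 1, 3), (2, 1, 1, 1)]
print(min_cost_flow(4, arcs, 0, 3, 2))  # prints 9
```

Splitting flow across routes, as here (one unit via 0-2-1-3, one via a costlier path), is exactly how the formulation spreads shipments when the cheapest arc chain lacks capacity.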

  1. Risk analysis for renewable energy projects due to constraints arising

    Science.gov (United States)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, which could lead to complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain, and the reasonable time periods that may arise, were identified. Renewable energy technologies have to face a number of constraints that delay the scaling-up of their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, the coordination of which also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on the conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyses our results in the context of renewable energy project implementation in Romania and creates a framework for assessing the energy supply to any entity from renewable sources.

  2. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    Science.gov (United States)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The process for the production of yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, the risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analysed. Risk analysis can identify the risks in detail, prevent them, and determine their handling, so that the risks can be minimized. Therefore, this study analyses the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment variables, raw material variables, and process variables. These include the critical risks: a lack of asepsis in the process, more specifically damage to the yogurt starter due to contamination by fungus or other bacteria, and a lack of equipment sanitation. The quantitative FTA showed that the highest probability is that of the lack of an aseptic process, at 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and equipment sanitation using hot water immersion.
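For orientation, the crisp core that fuzzy FMEA generalises is the risk priority number, RPN = severity x occurrence x detection; fuzzy FMEA replaces these crisp products with membership functions and rules. The ratings below are hypothetical, not CV.XYZ's:

```python
# Minimal sketch (hypothetical 1-10 ratings): crisp FMEA risk ranking via
# RPN = severity * occurrence * detection. Fuzzy FMEA would fuzzify these
# ratings instead of multiplying them directly.

failure_modes = {
    "non-aseptic process":       (8, 5, 6),   # (severity, occurrence, detection)
    "damaged yogurt starter":    (7, 4, 5),
    "poor equipment sanitation": (6, 5, 4),
}
rpn = {m: s * o * d for m, (s, o, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(mode, score)
```

Ranking failure modes by RPN (here the non-aseptic process comes out highest) is what directs improvement effort, mirroring the study's finding that the aseptic-process risk is critical.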

  3. Dietary Patterns and Pancreatic Cancer Risk: A Meta-Analysis

    Science.gov (United States)

    Lu, Pei-Ying; Shu, Long; Shen, Shan-Shan; Chen, Xu-Jiao; Zhang, Xiao-Yan

    2017-01-01

    A number of studies have examined the associations between dietary patterns and pancreatic cancer risk, but the findings have been inconclusive. Herein, we conducted this meta-analysis to assess the associations between dietary patterns and the risk of pancreatic cancer. MEDLINE (provided by the National Library of Medicine) and EBSCO (Elton B. Stephens Company) databases were searched for relevant articles published up to May 2016 that identified common dietary patterns. Thirty-two studies met the inclusion criteria and were finally included in this meta-analysis. A reduced risk of pancreatic cancer was shown for the highest compared with the lowest categories of healthy patterns (odds ratio, OR = 0.86; 95% confidence interval, CI: 0.77–0.95; p = 0.004) and light–moderate drinking patterns (OR = 0.90; 95% CI: 0.83–0.98; p = 0.02). There was evidence of an increased risk for pancreatic cancer in the highest compared with the lowest categories of western-type pattern (OR = 1.24; 95% CI: 1.06–1.45; p = 0.008) and heavy drinking pattern (OR = 1.29; 95% CI: 1.10–1.48; p = 0.002). The results of this meta-analysis demonstrate that healthy and light–moderate drinking patterns may decrease the risk of pancreatic cancer, whereas western-type and heavy drinking patterns may increase the risk of pancreatic cancer. Additional prospective studies are needed to confirm these findings. PMID:28067765

  4. Scientific commentary: Strategic analysis of environmental policy risks--heat maps, risk futures and the character of environmental harm.

    Science.gov (United States)

    Prpich, G; Dagonneau, J; Rocks, S A; Lickorish, F; Pollard, S J T

    2013-10-01

    We summarise our recent efforts on the policy-level risk appraisal of environmental risks. These have necessitated working closely with policy teams and a requirement to maintain crisp and accessible messages for policy audiences. Our comparative analysis uses heat maps, supplemented with risk narratives, and employs the multidimensional character of risks to inform debates on the management of current residual risk and future threats. The policy research and ensuing analysis raises core issues about how comparative risk analyses are used by policy audiences, their validation and future developments that are discussed in the commentary below. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Tea consumption and leukemia risk: a meta-analysis.

    Science.gov (United States)

    Zhong, Shanliang; Chen, Zhiyuan; Yu, Xinnian; Chen, Weixian; Lv, Mengmeng; Ma, Tengfei; Zhao, Jianhua

    2014-06-01

    Epidemiologic findings concerning the association between tea consumption and leukemia risk have yielded mixed results. We aimed to investigate the association by performing a meta-analysis of all available studies. One cohort study and six case-control studies with 1,019 cases were identified using PubMed, Web of Science, and EMBASE. We computed summary relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model applied to the relative risks associated with ever, moderate, or highest drinkers vs. non/lowest drinkers. Subgroup analyses were performed based on country (China and USA). Compared with non/lowest drinkers, the combined RR for ever drinkers was 0.76 (95% CI=0.65-0.89). In subgroup analyses, significant inverse associations were found for both the China and USA studies. The summary RR was 0.57 (95% CI=0.41-0.78) for highest drinkers. Similar results were found only in the China studies. No significant associations were found for moderate drinkers in the overall analysis or in subgroup analyses. There was some evidence of publication bias. In conclusion, this meta-analysis suggests a significant inverse association between high tea consumption and leukemia risk. Results should be interpreted cautiously given the potential publication bias.

  6. Risk-based planning analysis for a single levee

    Science.gov (United States)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
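
    The optimization described above, minimizing annualized construction cost plus expected annual flood damage, can be sketched in one dimension (height only). All numbers and the exponential overtopping-probability form are assumptions; the study additionally optimizes crown width through levee fragility curves, which is omitted here:

```python
# Illustrative risk-based levee sizing: choose height to minimize annualized
# construction cost plus expected annual overtopping damage. All values and
# functional forms are assumptions, not the study's data.
import math

DAMAGE = 50e6            # flood damage if the levee fails ($)
COST_PER_M = 0.4e6       # annualized construction cost per metre of height ($)

def p_overtop(height_m):
    """Assumed annual overtopping probability: exponential water-level tail."""
    return math.exp(-height_m / 1.5)

def total_annual_cost(height_m):
    return COST_PER_M * height_m + DAMAGE * p_overtop(height_m)

# Brute-force search over candidate heights (0.1 m grid from 1 m to 10 m)
heights = [h / 10 for h in range(10, 101)]
best = min(heights, key=total_annual_cost)
print(f"optimal height = {best:.1f} m, "
      f"annual total cost = ${total_annual_cost(best) / 1e6:.2f}M")
```

    The minimum balances marginal construction cost against the marginal reduction in expected damage; adding intermediate geotechnical failure, as the study does, adds a second term that depends on crown width.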

  7. Analysis of related risk factors for pancreatic fistula after pancreaticoduodenectomy

    Institute of Scientific and Technical Information of China (English)

    Qi-Song Yu; He-Chao Huang; Feng Ding; Xin-Bo Wang

    2016-01-01

    Objective: To explore the related risk factors for pancreatic fistula after pancreaticoduodenectomy, in order to provide a theoretical basis for effectively preventing its occurrence. Methods: A total of 100 patients who were admitted to our hospital from January 2012 to January 2015 and underwent pancreaticoduodenectomy were included in the study. The related risk factors for developing pancreatic fistula were collected for single-factor and logistic multi-factor analysis. Results: Among the included patients, 16 developed pancreatic fistula, an overall occurrence rate of 16% (16/100). The single-factor analysis showed that upper abdominal operation history, preoperative bilirubin, pancreatic texture, pancreatic duct diameter, intraoperative amount of bleeding, postoperative hemoglobin, and application of somatostatin after operation were risk factors for developing pancreatic fistula (P<0.05). The multi-factor analysis showed that upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin were independent risk factors for developing pancreatic fistula (OR=4.162, 6.104, 5.613, 4.034; P<0.05). Conclusions: The occurrence of pancreatic fistula after pancreaticoduodenectomy is closely associated with upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin; therefore, effective measures should be taken to reduce its occurrence according to each patient's own condition.

  8. Cancer risk in waterpipe smokers: a meta-analysis.

    Science.gov (United States)

    Mamtani, Ravinder; Cheema, Sohaila; Sheikh, Javaid; Al Mulla, Ahmad; Lowenfels, Albert; Maisonneuve, Patrick

    2017-01-01

    To quantify by meta-analysis the relationship between waterpipe smoking and cancer, including cancer of the head and neck, esophagus, stomach, lung and bladder. We performed a systematic literature search to identify relevant studies, scored their quality, used fixed- and random-effect models to estimate summary relative risks (SRR), and evaluated heterogeneity and publication bias. We retrieved information from 28 published reports. Considering only high-quality studies, waterpipe smoking was associated with increased risk of head and neck cancer (SRR 2.97; 95% CI 2.26-3.90), esophageal cancer (1.84; 1.42-2.38) and lung cancer (2.22; 1.24-3.97), with no evidence of heterogeneity or publication bias. Increased risk was also observed for stomach and bladder cancer, but based mainly on poor-quality studies. For the colorectum, the liver, and all sites combined, risk estimates were elevated, but there were insufficient reports to perform a meta-analysis. Contrary to the perception of the relative safety of waterpipe smoking, this meta-analysis provides quantitative estimates of its association with cancers of the head and neck, esophagus and lung. The scarcity and limited quality of available reports point out the need for larger, carefully designed studies in well-defined populations.

  9. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    Science.gov (United States)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and Eurasian plate. After 1999's Chi-Chi earthquake (Mw=7.6), landslide and debris flow occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan and numerous landslides and debris flow events, some with tremendous fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), completed the basic investigation and data collection of 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on landslide risk management framework of Australia, was proposed and conducted on 106 creeks of the 30 villages with debris flow hazard history. Information and value of several types of elements at risk (bridge, road, building and crop) were gathered and integrated into a GIS layer, with the vulnerability model of each elements at risk applied. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5, 10, 25, 50, 100 and 200 years return period) were conducted, the economic losses and fatalities of each scenario were calculated for each creek. When taking annual exceeding probability into account, the annual total risk of each creek was calculated, and the results displayed on a debris flow risk map. The number of fatalities and frequency were calculated, and the F-N curves of 106 creeks were provided. For F-N curves, the individual risk to life per year of 1.0E-04 and slope of 1, which matched with international standards, were considered to be an acceptable risk. 
Applying the results of the 106 creeks to the F-N curve, they were divided into three categories: Unacceptable, ALARP (As Low As Reasonably Practicable), and Acceptable.
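
    The F-N classification step can be sketched as follows: the cumulative annual frequency F(N) of events with N or more fatalities is compared against criterion lines of slope -1 on log-log axes. The scenario frequencies, fatality counts, and anchor values below are illustrative assumptions, not the study's creek data:

```python
# Sketch of F-N curve classification for one creek. Scenario frequencies,
# fatality counts, and the criterion-line anchors are illustrative assumptions.

# Illustrative scenarios: (annual frequency, expected fatalities)
scenarios = [(1.0e-3, 1), (4.0e-4, 3), (2.0e-4, 8), (1.0e-4, 20), (5.0e-5, 45)]

def f_of_n(n, scen):
    """Cumulative annual frequency of events causing N or more fatalities."""
    return sum(f for f, fat in scen if fat >= n)

def classify(scen, upper=1e-2, lower=1e-4):
    """Compare F(N) with criterion lines F = upper/N and F = lower/N (slope -1)."""
    ns = sorted({fat for _, fat in scen if fat > 0})
    if any(f_of_n(n, scen) > upper / n for n in ns):
        return "Unacceptable"
    if all(f_of_n(n, scen) < lower / n for n in ns):
        return "Acceptable"
    return "ALARP"

print(classify(scenarios))
```

    A creek whose F-N curve crosses the upper line at any N is flagged Unacceptable; one lying entirely below the lower line is Acceptable; anything between falls in the ALARP band, where risk-reduction measures are weighed against cost.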

  10. Risk Factors Analysis on Traumatic Brain Injury Prognosis

    Institute of Scientific and Technical Information of China (English)

    Xiao-dong Qu; Resha Shrestha; Mao-de Wang

    2011-01-01

    To investigate the independent risk factors of traumatic brain injury (TBI) prognosis. Methods: A retrospective analysis was performed on 885 hospitalized TBI patients from January 1, 2003 to January 1, 2010 in the First Affiliated Hospital of Medical College of Xi'an Jiaotong University. Single-factor and logistic regression analyses were conducted to evaluate the association of different variables with TBI outcome. Results: The single-factor analysis revealed significant associations between several variables and TBI outcome, including age (P=0.044 for the age group 40-60, P<0.001 for the age group ≥60), complications (P<0.001), cerebrospinal fluid leakage (P<0.001), Glasgow Coma Scale (GCS) score (P<0.001), pupillary light reflex (P<0.001), shock (P<0.001), associated extra-cranial lesions (P=0.01), subdural hematoma (P<0.001), cerebral contusion (P<0.001), diffuse axonal injury (P<0.001), and subarachnoid hemorrhage (P<0.001), suggesting the influence of these factors on the prognosis of TBI. Furthermore, logistic regression analysis identified age, GCS score, pupillary light reflex, subdural hematoma, and subarachnoid hemorrhage as independent risk factors of TBI prognosis. Conclusion: Age, GCS score, pupillary light reflex, subdural hematoma, and subarachnoid hemorrhage may be risk factors influencing the prognosis of TBI. Paying attention to these factors might improve the outcome of TBI in clinical treatment.

  11. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at a nuclear power plant (NPP)? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for intake and exposure for the whole population and for certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  12. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being.

  13. Financial risk analysis and prediction of Chinese power industry

    Energy Technology Data Exchange (ETDEWEB)

    Jin, H.; An, C. [Hebei Univ. of Technology, Tianjin (China). School of Management; Zhang, C. [Nankai Univ., Tianjin (China). School of Business

    2009-03-11

    A study of 57 Shanghai and Shenzhen power industry companies was presented. The study considered financial ratios between companies in order to determine risk factors for financial crises. Financial data from the Shanghai and Shenzhen stock markets were used to investigate power company performance from 2006 to 2008. Data from the China Center for Economic Research (CCER) were also used. Results of the study indicated that the cash-to-current debt ratio, the return on equity (ROE), net asset growth ratio, and inventory turnover presented uncorrelated and significantly varying ratios for failed power companies. The study also showed that most power companies have a high proportion of liabilities, higher debt risk, low asset turnover ratios, and negative net working capital. Results of the analysis were used to design an early warning model that used logistic regression techniques to predict risk. 7 refs., 5 tabs.
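
    An early-warning model of the kind described, logistic regression on financial ratios, can be sketched with synthetic data. The four features mirror the ratios the study flags (cash-to-current-debt, ROE, net asset growth, inventory turnover); every number below is invented, not drawn from the 57-company sample:

```python
# Sketch of a logistic-regression financial-crisis early-warning model,
# fit by gradient descent on synthetic (invented) ratio data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sample: healthy firms (label 0) vs. distressed firms (label 1)
healthy = rng.normal([1.2, 0.10, 0.08, 5.0], [0.3, 0.05, 0.05, 1.0], size=(60, 4))
failed  = rng.normal([0.4, -0.05, -0.02, 2.0], [0.3, 0.05, 0.05, 1.0], size=(20, 4))
X = np.vstack([healthy, failed])
y = np.concatenate([np.zeros(60), np.ones(20)])

# Standardize features, then fit logistic regression by gradient descent
mu, sd = X.mean(0), X.std(0)
Xs = (X - mu) / sd
w, b = np.zeros(4), 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * Xs.T @ (p - y) / len(y)
    b -= 0.5 * (p - y).mean()

def crisis_probability(ratios):
    """Predicted probability of financial crisis for one firm's ratios."""
    z = (np.asarray(ratios) - mu) / sd
    return float(1 / (1 + np.exp(-(z @ w + b))))

print("P(crisis) =", round(crisis_probability([0.5, -0.02, 0.0, 2.5]), 2))
```

    A real model would be fit on the CCER panel data and validated out of sample; this sketch only shows the mechanics of the logistic warning score.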

  14. Risk analysis of landslide disaster in Ponorogo, East Java, Indonesia

    Science.gov (United States)

    Koesuma, S.; Saido, A. P.; Fukuda, Y.

    2016-11-01

    Ponorogo is a regency in the southwest of East Java Province, Indonesia, located in the subduction zone between the Eurasian and Australian tectonic plates. It has extensive mountainous areas that are prone to landslides. We collected landslide data from 305 villages in Ponorogo and compiled them into a Hazard Index. We also calculated a Vulnerability Index, Economic Loss Index, Environmental Damage Index, and Capacity Index. The risk analysis map is composed of three components: H (Hazard), V (Vulnerability, Economic Loss Index, Environmental Damage Index), and C (Capacity Index). The method is based on regulations of the National Disaster Management Authority (BNPB) number 02/2012 and number 03/2012. It has three classes of risk index: Low, Medium, and High. Ponorogo city has a medium landslide risk index.
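
    The composition of the three components into a risk index is commonly expressed in BNPB-style assessments as R = H × V / C, with thresholds splitting scores into the Low/Medium/High classes. The index values and class cut-offs below are illustrative assumptions, not the Ponorogo data:

```python
# Sketch of a BNPB-style risk index R = H * V / C with Low/Medium/High
# classification. Index values and cut-offs are illustrative assumptions.

def risk_index(hazard, vulnerability, capacity):
    """Combine normalized indices (each in (0, 1]); higher capacity lowers risk."""
    return hazard * vulnerability / capacity

def risk_class(r, low=0.33, high=0.66):
    if r < low:
        return "Low"
    if r < high:
        return "Medium"
    return "High"

# Hypothetical village-level indices
r = risk_index(hazard=0.7, vulnerability=0.6, capacity=0.8)
print(f"R = {r:.3f} -> {risk_class(r)}")
```

    Mapping this function over all 305 villages and coloring by class is what produces a risk map like the one described.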

  15. Occupational and Cost Risk : Critical Analysis of Monetization Policy Risk Approach to the Spanish Law Standards

    Directory of Open Access Journals (Sweden)

    Marco Antônio César Villatore

    2016-06-01

    Full Text Available The problem surrounding occupational risk is a phenomenon that plagues every society, because work is a central element that drives the economy. The exposure of workers to harmful activities that may damage their health and their physical and mental integrity, under the risk-monetization policy adopted by the Brazilian legal system, can impose costs on workers, employers, the state, and society. Thus, drawing on labor-law analysis and concepts from the Economic Analysis of Law, the present study seeks to demonstrate that the social costs caused by worker exposure to risk are fallaciously shown to be smaller in the short term than the costs of prevention, but in the long run they may burden all parties to the employment relationship as well as the state and society; economic and legal measures are therefore needed to change the risk-monetization policy, as in foreign systems such as the Spanish law analyzed here.

  16. Aeroservoelastic modeling with proper orthogonal decomposition

    Science.gov (United States)

    Carlson, Henry A.; Verberg, Rolf; Harris, Charles A.

    2017-02-01

    A physics-based, reduced-order, aeroservoelastic model of an F-18 aircraft has been developed using the method of proper orthogonal decomposition (POD), introduced to the field of fluid mechanics by Lumley. The model is constructed with data from high-dimensional, high-fidelity aeroservoelastic computational fluid dynamics (CFD-ASE) simulations that couple the equations of motion of the flow to a modal model of the aircraft structure. Through POD modes, the reduced-order model (ROM) predicts both the structural dynamics and the coupled flow dynamics, offering much more information than the typically employed low-dimensional models based on system identification can provide. ROM accuracy is evaluated through direct comparisons of its predictions of the flow and structural dynamics with predictions from the parent CFD-ASE model. The computational overhead of the ROM is six orders of magnitude lower than that of the CFD-ASE model, accurately predicting the coupled dynamics of an F-18 fighter aircraft undergoing flutter testing over a wide range of transonic and supersonic flight speeds on a single processor in 1.073 s.
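
    The POD machinery itself is generic and can be sketched with the method of snapshots via an SVD; the snapshot matrix below is synthetic (two coherent structures plus noise), not CFD-ASE data:

```python
# Generic proper orthogonal decomposition via SVD of a snapshot matrix.
# The snapshot data are synthetic, standing in for high-fidelity simulation output.
import numpy as np

rng = np.random.default_rng(1)

# Snapshot matrix: each column is a flow state at one time instant
n_dof, n_snap = 500, 40
t = np.linspace(0, 2 * np.pi, n_snap)
x = np.linspace(0, 1, n_dof)[:, None]
snapshots = (np.sin(2 * np.pi * x) * np.cos(t)               # coherent structure 1
             + 0.5 * np.sin(4 * np.pi * x) * np.sin(t)       # coherent structure 2
             + 0.01 * rng.standard_normal((n_dof, n_snap)))  # measurement noise

mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

r = 2                                   # retain the two energetic POD modes
modes = U[:, :r]                        # POD basis (spatial modes)
coeffs = modes.T @ (snapshots - mean)   # reduced (modal) coordinates
recon = mean + modes @ coeffs           # rank-r reconstruction

energy = (s[:r] ** 2).sum() / (s ** 2).sum()
print(f"energy captured by {r} modes: {energy:.1%}")
```

    In a ROM such as the one described above, the governing equations are then projected onto this low-dimensional basis, so the state evolves in the r modal coordinates instead of the full set of degrees of freedom.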

  17. Proper Treatment of Acute Mesenteric Ischemia

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Kwan; Han, Young Min [Dept. of Radiology, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of); Kwak, Hyo Sung [Research Institute of Clinical Medicine, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of); Yu, Hee Chul [Dept. of Radiology, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of)

    2011-10-15

    To evaluate the effectiveness of treatment options for Acute Mesenteric Ischemia and establish proper treatment guidelines. From January 2007 to May 2010, 14 patients (13 men and 1 woman, mean age: 52.1 years) with acute mesenteric ischemia were enrolled in this study. All of the lesions were detected by CT scan and angiography. Initially, 4 patients underwent conservative treatment. Eleven patients were managed by endovascular treatment. We evaluated the therapeutic success and survival rate of each patient. The causes of ischemia included thromboembolism in 6 patients and dissection in 8 patients. Nine patients showed bowel ischemia on CT scans, 4 dissection patients underwent conservative treatment, 3 patients had recurring symptoms, and 5 dissection patients underwent endovascular treatment. Overall success and survival rate was 100%. However, overall success was 83% and survival rate was 40% in the 6 thromboembolism patients. The choice of 20 hours as the critical time in which the procedure is ideally performed was statistically significant (p = 0.0476). A percutaneous endovascular procedure is an effective treatment for acute mesenteric ischemia, especially in patients who underwent treatment within 20 hours. However, further study and a long term follow-up are needed.

  18. H15-42: CFD analysis for risk analysis in urban environments - Tilburg city case study

    NARCIS (Netherlands)

    Hulsbosch-Dam, C.; Mack, A.; Ratingen, S. van; Rosmuller, N.; Trijssenaar, I.

    2013-01-01

    For risk analysis studies, relatively simple dispersion models are generally applied, such as Gaussian dispersion and dense gas dispersion models. For rail transport risk analyses in the Netherlands, fixed consequence distances are applied for various standard scenarios of hazardous materials releases.

  19. Exploring Mexican adolescents' perceptions of environmental health risks: a photographic approach to risk analysis

    Directory of Open Access Journals (Sweden)

    Susanne Börner

    2015-05-01

    Full Text Available The objective of this study was to explore Mexican adolescents' perceptions of environmental health risks in contaminated urban areas, and to test the environmental photography technique as a research tool for engaging adolescents in community-based health research. The study was conducted with 74 adolescents from two communities in the city of San Luis Potosi, Mexico. Participants were provided with disposable cameras and asked to take photographs of elements and situations which they believed affected their personal health both at home and outside their homes. They were also asked to describe each photograph in writing. Photographs and written explanations were analyzed by using quantitative and qualitative content analysis. Risk perception plays a crucial role in the development of Risk Communication Programs (RCPs aimed at the improvement of community health. The photography technique opens up a promising field for environmental health research since it affords a realistic and concise impression of the perceived risks. Adolescents in both communities perceived different environmental health risks as detrimental to their well-being, e.g. waste, air pollution, and lack of hygiene. Yet, some knowledge gaps remain which need to be addressed.

  20. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    Science.gov (United States)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
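
    The event-tree/Monte Carlo combination described above can be sketched as sequential sampling of mission events; the event names and success probabilities below are invented for illustration, not the study's values:

```python
# Monte Carlo event-tree sketch: sample success/failure of sequential mission
# events to estimate end-state probabilities. Events and probabilities are
# invented, not the study's values.
import random

random.seed(42)

# Sequential mission events with assumed success probabilities
EVENTS = [("launch", 0.98), ("mars_landing", 0.90), ("sample_collection", 0.95),
          ("ascent", 0.92), ("earth_return", 0.93)]

def run_mission():
    """Return the first failed event, or None for complete success."""
    for name, p in EVENTS:
        if random.random() > p:
            return name
    return None

N = 100_000
failures = [run_mission() for _ in range(N)]
p_success = failures.count(None) / N
print(f"estimated P(full mission success) = {p_success:.3f}")
```

    Tallying which event failed first gives the intermediate end states (partial success through complete failure) that the study reports alongside the overall success probability.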

  1. Corticosteroids and pediatric septic shock outcomes: a risk stratified analysis.

    Directory of Open Access Journals (Sweden)

    Sarah J Atkinson

    Full Text Available The potential benefits of corticosteroids for septic shock may depend on initial mortality risk. We determined associations between corticosteroids and outcomes in children with septic shock who were stratified by initial mortality risk. We conducted a retrospective analysis of an ongoing, multi-center pediatric septic shock clinical and biological database. Using a validated biomarker-based stratification tool (PERSEVERE), 496 subjects were stratified into three initial mortality risk strata (low, intermediate, and high). Subjects receiving corticosteroids during the initial 7 days of admission (n = 252) were compared to subjects who did not receive corticosteroids (n = 244). Logistic regression was used to model the effects of corticosteroids on 28-day mortality and complicated course, defined as death within 28 days or persistence of two or more organ failures at 7 days. Subjects who received corticosteroids had greater organ failure burden, higher illness severity, higher mortality, and a greater requirement for vasoactive medications, compared to subjects who did not receive corticosteroids. PERSEVERE-based mortality risk did not differ between the two groups. For the entire cohort, corticosteroids were associated with increased risk of mortality (OR 2.3, 95% CI 1.3-4.0, p = 0.004) and a complicated course (OR 1.7, 95% CI 1.1-2.5, p = 0.012). Within each PERSEVERE-based stratum, corticosteroid administration was not associated with improved outcomes. Similarly, corticosteroid administration was not associated with improved outcomes among patients with no comorbidities, nor in groups of patients stratified by PRISM. Risk-stratified analysis failed to demonstrate any benefit from corticosteroids in this pediatric septic shock cohort.

  2. Biological risk factors for suicidal behaviors: a meta-analysis.

    Science.gov (United States)

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-09-13

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09-1.81) and suicide death (wOR=1.28; CI: 1.13-1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias: cytokines (wOR=2.87; CI: 1.40-5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01-1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors.

  3. An analysis of risk factors for asymptomatic cerebral infarction.

    Science.gov (United States)

    Shiga, Tomoko; Owada, Kiyoshi; Hoshino, Tatsuo; Nagahara, Hikaru; Shiratori, Keiko

    2008-01-01

    The aim of this study is to identify risk factors for asymptomatic cerebral infarction (ACI) in the general Japanese population. A total of 634 subjects (272 men aged 55.4+/-8.8 years and 362 women aged 55.2+/-8.5 years) who visited the Health Management Center at Aoyama Hospital (Tokyo, Japan) from January 2004 through January 2005 for an annual brain dry dock examination were analyzed. We evaluated 21 risk factors for ACI by multivariate logistic regression analysis. Abnormal or potentially abnormal conditions were detected in 258 subjects (40.7% of all subjects who had an annual check-up program for brain disease). The most frequent abnormal finding was ACI, which was observed in 208 subjects. The significant risk factors for ACI, as determined by multivariate logistic analysis, were age (P <0.01), hypertension (P <0.01), and hypertensive vascular changes in the fundus (P <0.05). The hypertensive vascular abnormalities in the fundus might be a risk factor for ACI independent of age and hypertension.

  4. Working session 5: Operational aspects and risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cizelj, L. [Jozef Stefan Institute, Ljubljana (Slovenia); Donoghue, J. [Nuclear Regulatory Commission, Washington, DC (United States)

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  5. An Investigation Of Organizational Information Security Risk Analysis

    Directory of Open Access Journals (Sweden)

    Zack Jourdan

    2010-12-01

    Full Text Available Despite a growing number and variety of information security threats, many organizations continue to neglect implementing information security policies and procedures. The likelihood that an organization's information systems can fall victim to these threats is known as information systems risk (Straub & Welke, 1998). To combat these threats, an organization must undergo a rigorous process of self-analysis. To better understand the current state of this information security risk analysis (ISRA) process, this study deployed a questionnaire using both open-ended and closed-ended questions administered to a group of information security professionals (N = 32). The qualitative and quantitative results of this study show that organizations are beginning to conduct regularly scheduled ISRA processes. However, the results also show that organizations still have room for improvement to create ideal ISRA processes.

  6. Risk analysis of tyramine concentration in food production

    OpenAIRE

    Doudová, Lucie; Buňka, František; Michálek, Jaroslav; Sedlačík, Marek; Buňková, Leona

    2013-01-01

    The contribution is focused on risk analysis in food microbiology. This paper evaluates the effect of selected factors on tyramine production in bacterial strains of the Lactococcus genus which were assigned as tyramine producers. Tyramine is a biogenic amine synthesized from an amino acid called tyrosine. It can be found in certain foodstuffs (often in cheese), and can cause a pseudo-response in sensitive individuals. The above-mentioned bacteria are commonly used in the biotechnological process ...

  7. Defining Human Failure Events for Petroleum Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  8. Common pitfalls in statistical analysis: Odds versus risk

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh; Pramesh, C. S.

    2015-01-01

    In biomedical research, we are often interested in quantifying the relationship between an exposure and an outcome. “Odds” and “Risk” are the most common terms which are used as measures of association between variables. In this article, which is the fourth in the series of common pitfalls in statistical analysis, we explain the meaning of risk and odds and the difference between the two. PMID:26623395
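The risk-versus-odds distinction the article explains can be made concrete with a hypothetical 2x2 table (the counts below are invented for illustration):

```python
# Exposure vs. outcome counts for a made-up cohort
exposed_events, exposed_total = 30, 100
control_events, control_total = 10, 100

risk_e = exposed_events / exposed_total                       # 30/100 = 0.30
risk_c = control_events / control_total                       # 10/100 = 0.10
odds_e = exposed_events / (exposed_total - exposed_events)    # 30/70
odds_c = control_events / (control_total - control_events)    # 10/90

rr = risk_e / risk_c    # relative risk
or_ = odds_e / odds_c   # odds ratio
print(f"RR = {rr:.2f}, OR = {or_:.2f}")
```

Here the relative risk is 3.0 while the odds ratio is about 3.86: when the outcome is common, the odds ratio overstates the relative risk, which is exactly the pitfall the article warns about.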

  9. Liquidity Risk Management: An Empirical Analysis on Panel Data Analysis and ISE Banking Sector

    Directory of Open Access Journals (Sweden)

    Sibel ÇELİK

    2012-06-01

    Full Text Available In this paper, we test the factors affecting liquidity risk management in the banking sector in Turkey by using panel regression analysis. We use data for 9 commercial banks traded on the Istanbul Stock Exchange for the period 1998-2008. In conclusion, we find that risky liquid assets and return on equity are negatively related with liquidity risk, whereas external financing and return on assets are positively related with liquidity risk. This finding is important for banks since it underlines the critical factors in liquidity risk management.

  10. RISK ANALYSIS FOR SHIP CONVERTING PROJECT ACCOMPLISHMENT (Case study of KRI KP Converting Project)

    Directory of Open Access Journals (Sweden)

    Dimas Endro W

    2013-10-01

    Full Text Available Ship conversion has become a prospective activity in the shipbuilding sector; operational and economic considerations are the dominant rationale. Based on the new function of the converted ship, a task list of the jobs that must be done is drawn up. This schedule contains not only the task list but also the duration of each job. In practice, job durations are estimated from the project manager's experience, and the total duration is set as the completion time for the project; this set time then becomes the reference for the project bid. If the offered completion time is tight, schedule slippage becomes a potential nightmare. In this situation the project manager needs a crystal-clear basis for deciding whether or not to accept the tender; in practice, he relies on his experience with previous projects and faces a penalty if the project is delayed. This paper focuses on how to evaluate a tender offer using risk analysis, especially for ship conversion tenders with strict completion times. A new method for analyzing a proposed tender based on time and penalty parameters is the topic of this paper.

  11. Do we see how they perceive risk? An integrated analysis of risk perception and its effect on workplace safety behavior.

    Science.gov (United States)

    Xia, Nini; Wang, Xueqing; Griffin, Mark A; Wu, Chunlin; Liu, Bingsheng

    2017-06-20

    While risk perception is a key factor influencing safety behavior, academia has paid little specific attention to the ways that workers perceive risk, and thus little is known about the mechanisms through which different risk perceptions influence safety behavior. Most previous research in the workplace safety domain argues that people tend to perceive risk based on rational formulations of risk criticality. However, individuals' emotions can also be useful in understanding their perceptions. Therefore, this research employs an integrated analysis concerning the rational and emotional perspectives. Specifically, it was expected that the three identified rational ways of perceiving risk, i.e., perceived probability, severity, and negative utility, would influence the direct emotional risk perception. Furthermore, these four risk perceptions were all expected to positively but differently influence safety behavior. The hypotheses were tested using a sample of 120 construction workers. It was found that all three rational risk perceptions significantly influenced workers' direct perception of risk, which is mainly based on emotions. Furthermore, safety behavior among workers relied mainly on emotional perception rather than on rational calculations of risk. This research contributes to workplace safety research by highlighting the importance of integrating the emotional assessment of risk, especially when workers' risk perception and behavior are concerned. Suggested avenues for improving safety behavior through improvement in risk perception include being aware of the possibility of different ways of perceiving risk, promoting experience sharing and accident simulation, and uncovering risk information. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Software requirements specification for the program analysis and control system risk management module

    Energy Technology Data Exchange (ETDEWEB)

    SCHAEFER, J.C.

    1999-06-02

    TWR Program Analysis and Control System Risk Module is used to facilitate specific data processes surrounding the Risk Management program of the Tank Waste Retrieval environment. This document contains the Risk Management system requirements of the database system.

  13. Analysis of risk factors for postoperative pancreatic fistula following pancreaticoduodenectomy.

    Science.gov (United States)

    Liu, Qi-Yu; Zhang, Wen-Zhi; Xia, Hong-Tian; Leng, Jian-Jun; Wan, Tao; Liang, Bin; Yang, Tao; Dong, Jia-Hong

    2014-12-14

    To explore the morbidity and risk factors of postoperative pancreatic fistula (POPF) following pancreaticoduodenectomy. The data from 196 consecutive patients who underwent pancreaticoduodenectomy, performed by different surgeons, in the General Hospital of the People's Liberation Army between January 1st, 2013 and December 31st, 2013 were retrospectively collected for analysis. The diagnoses of POPF and clinically relevant (CR)-POPF following pancreaticoduodenectomy were judged strictly by the International Study Group on Pancreatic Fistula definition. Univariate analysis was performed to analyze the following factors: patient age, sex, body mass index (BMI), hypertension, diabetes mellitus, serum CA19-9 level, history of jaundice, serum albumin level, blood loss volume, pancreatic duct diameter, pylorus-preserving pancreaticoduodenectomy, pancreatic drainage and pancreaticojejunostomy. Multivariate logistic regression analysis was used to determine the main independent risk factors for POPF. POPF occurred in 126 (64.3%) of the patients, and the incidence of CR-POPF was 32.7% (64/196). Patient age, sex, BMI, hypertension, diabetes mellitus, serum CA19-9 level, history of jaundice, serum albumin level, blood loss volume, pylorus-preserving pancreaticoduodenectomy and pancreaticojejunostomy showed no statistically significant association with the morbidity of POPF or CR-POPF. Pancreatic duct diameter was found to be significantly correlated with POPF rates by univariate and multivariate regression analysis, with a pancreatic duct diameter ≤ 3 mm being an independent risk factor for POPF (OR = 0.291; P = 0.000) and CR-POPF (OR = 0.399; P = 0.004). The CR-POPF rate was higher in patients without external pancreatic stenting, which was found to be an independent risk factor for CR-POPF (OR = 0.394; P = 0.012). Among the entire patient series, there were three postoperative deaths, giving a total mortality rate of 1.5% (3/196), and the mortality

  14. Risk Analysis of Hepatocellular Carcinoma in Northeast China

    Institute of Scientific and Technical Information of China (English)

    Zhi-fang Jia; Meng Su; Miao He; Zhi-hua Yin; Wei Wu; Xue-lian Li; Peng Guan; Bao-sen Zhou

    2009-01-01

    Objective: It is known that chronic hepatitis B virus (HBV) infection is a main risk factor for hepatocellular carcinoma (HCC). To assess the effect of HBV infection and its interaction with other factors on the risk for HCC, a hospital-based case-control study was carried out in Northeast China. Methods: A total of 384 cases with hepatocellular carcinoma and 432 controls without evidence of liver diseases were enrolled in the study. Blood samples were collected to detect the serum markers of hepatitis B virus (HBV) and hepatitis C virus (HCV), and questionnaires about lifestyle and family tumor history were administered to all subjects. Results: The infection rate of HBV was 70.8% in hepatocellular carcinoma cases and 10.0% in non-liver disease controls, a statistically significant difference (P < 0.0001; OR = 22.0, 95% CI: 15.0-32.3). Interaction analysis indicated that in chronic HBV carriers with HCV infection, alcohol consumption, or a family history of HCC, the risk for HCC increased (OR = 41.1, 95% CI: 20.2-83.9; OR = 125.0, 95% CI: 66.5-235.2; OR = 56.9, 95% CI: 27.2-119.3, respectively). In addition, hepatitis B history, HCV infection, hepatic cirrhosis and family history of HCC were also potential independent risk factors for HCC. Conclusion: We confirmed that HBV is a chief risk factor for hepatocellular carcinoma and accounts for 67.7% of all hepatocellular carcinoma in Northeast China. HCV infection, alcohol intake and family history could enhance the risk for HCC in chronic HBV carriers.

  15. Living with Risk in Everyday Life - A Comparative Analysis on Handling and Reflecting Risk in Everyday Actions

    DEFF Research Database (Denmark)

    Elverdam, Beth; Hoel Felde, Lina Klara

    phones; chemicals in a nursery; elevated cholesterol was combined to analyse the concept of risk in everyday life. In-depth qualitative interviews with 46 people made it possible to analyse a general perception of risk in everyday life. Interviews were analysed using a phenomenological thematic content analysis. Results: Although risk is communicated in the media and by health personnel, and thus has a general presence in society, participants in everyday life place risk at the periphery of life. Risk is not part of their everyday reflections. When risk manifests itself in everyday life, it is reflected...

  16. Risk of persistent high-grade squamous intraepithelial lesion after electrosurgical excisional treatment with positive margins: a meta-analysis.

    Science.gov (United States)

    Oliveira, Caroline Alves de; Russomano, Fábio Bastos; Gomes Júnior, Saint Clair dos Santos; Corrêa, Flávia de Miranda

    2012-01-01

    Even if precursor lesions of cervical cancer are properly treated, there is a risk of persistence or recurrence. The aim here was to quantify the risks of persistence of high-grade intraepithelial squamous lesions, one and two years after cervical electrosurgical excisional treatment with positive margins. Systematic review of the literature and meta-analysis at Instituto Fernandes Figueira. This meta-analysis was on studies published between January 1989 and July 2009 that were identified in Medline, Scopus, Embase, Cochrane, SciELO, Lilacs, Adolec, Medcarib, Paho, Wholis, Popline, ISI Web of Science and Sigle. Articles were selected if they were cohort studies on electrosurgical excisional treatment of high-grade squamous intraepithelial lesions with a minimum follow-up of one year, a histopathological outcome of persistence of these lesions and a small risk of bias. The search identified 7,066 articles and another 21 in the reference lists of these papers. After applying the selection and exclusion criteria, only four articles were found to have extractable data. The risk of persistence of high-grade intraepithelial lesions after one year was 11.36 times greater (95% confidence interval, CI: 5.529-23.379, P treatment after the first year of follow-up and highlights the need for appropriately chosen electrosurgical techniques based on disease location and extent, with close surveillance of these patients.
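One standard way to pool effect estimates from the selected cohort studies, as in a meta-analysis like the one above, is inverse-variance fixed-effect pooling of log risk ratios. The four (log RR, SE) pairs below are invented for illustration, not the study's extracted data, and the fixed-effect method is an assumption about the approach, not a claim about the paper's exact model.

```python
import math

# Hypothetical per-study estimates: (log risk ratio, standard error)
studies = [(1.9, 0.45), (2.6, 0.50), (2.2, 0.60), (2.5, 0.40)]

weights = [1 / se ** 2 for _, se in studies]          # inverse-variance weights
pooled_log = sum(w * lr for (lr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo = math.exp(pooled_log - 1.96 * pooled_se)
hi = math.exp(pooled_log + 1.96 * pooled_se)
print(f"pooled RR = {math.exp(pooled_log):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The pooled estimate is a weighted average on the log scale, so studies with smaller standard errors dominate; exponentiating returns the result to the risk-ratio scale.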

  17. High proper motion X-ray binaries from the Yale Southern Proper Motion Survey

    CERN Document Server

    Maccarone, Thomas J; Casetti-Dinescu, Dana I

    2014-01-01

    We discuss the results of cross-correlating catalogs of bright X-ray binaries with the Yale Southern Proper Motion catalog (version 4.0). Several objects already known to have large proper motions from Hipparcos are recovered. Two additional objects are found which show substantial proper motions, both of which are unusual in their X-ray properties. One is IGR J17544-2619, one of the supergiant fast X-ray transients. Assuming the quoted distances in the literature for this source of about 3 kpc are correct, this system has a peculiar velocity of about 275 km/sec -- greater than the velocity of a Keplerian orbit at its location of the Galaxy, and in line with the expectations formed from suggestions that the supergiant fast X-ray transients should be highly eccentric. We discuss the possibility that these objects may help explain the existence of short gamma-ray bursts outside the central regions of galaxies. The other is the source 2A 1822-371, which is a member of the small class of objects which are low mas...

  18. Analysis and evaluation of enterprise risk management capability elements

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Research on enterprise risk management capability is conducted with a view of discerning and processing risks, in which an evaluation index system and an evaluation model of enterprise risk management capabilities are constructed. The risk management capability consists of four aspects, i.e. risk identification capability, risk assessment capability, risk planning capability and risk control capability. Risk identification and assessment capabilities reflect the level of enterprises on finding and analyzing...

  19. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    Science.gov (United States)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.

  20. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    Science.gov (United States)

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
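The nonparametric Kaplan-Meier estimator the authors adopt can be sketched with mission distance in place of time. The mission records below (distance travelled, and whether the vehicle was lost or the mission ended without loss, i.e. censored) are invented for illustration.

```python
def kaplan_meier(distances, observed):
    """Return [(distance, S(distance))] from the product-limit estimator.

    distances: distance at loss (observed=1) or at mission end without
               loss (observed=0, i.e. censored).
    """
    events = sorted(zip(distances, observed))
    n_at_risk = len(events)
    surv, curve = 1.0, []
    i = 0
    while i < len(events):
        d = events[i][0]
        ties = sum(1 for dd, _ in events if dd == d)
        deaths = sum(1 for dd, o in events if dd == d and o == 1)
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((d, surv))
        n_at_risk -= ties
        i += ties
    return curve

# Invented missions: (km travelled, 1 = vehicle lost, 0 = censored)
missions = [(50, 0), (120, 1), (200, 0), (200, 1), (350, 0), (400, 1)]
curve = kaplan_meier([d for d, _ in missions], [o for _, o in missions])
for d, s in curve:
    print(f"S({d} km) = {s:.3f}")
```

Censored missions leave the risk set without forcing a step down, so the survival curve drops only at distances where a loss occurred; confidence limits (e.g. Greenwood's formula) would be layered on top of this in the way the paper describes.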

  1. Flood Risk Analysis and Flood Potential Losses Assessment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood duration shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related to the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.

  2. Cascade vulnerability for risk analysis of water infrastructure.

    Science.gov (United States)

    Sitzenfrei, R; Mair, M; Möderl, M; Rauch, W

    2011-01-01

    One of the major tasks in urban water management is failure-free operation for at least most of the time. Accordingly, the reliability of the network systems in urban water management has a crucial role. The failure of a component in these systems impacts potable water distribution and urban drainage. Therefore, water distribution and urban drainage systems are categorized as critical infrastructure. Vulnerability is the degree to which a system is likely to experience harm induced by perturbation or stress. However, for risk assessment, we usually assume that events and failures are singular and independent, i.e. several simultaneous events and cascading events are unconsidered. Although failures can be causally linked, a simultaneous consideration in risk analysis is hardly considered. To close this gap, this work introduces the term cascade vulnerability for water infrastructure. Cascade vulnerability accounts for cascading and simultaneous events. Following this definition, cascade risk maps are a merger of hazard and cascade vulnerability maps. In this work cascade vulnerability maps for water distribution systems and urban drainage systems based on the 'Achilles-Approach' are introduced and discussed. It is shown, that neglecting cascading effects results in significant underestimation of risk scenarios.

  3. [Analysis of risk factors for perinatal brachial plexus palsy].

    Science.gov (United States)

    Gosk, Jerzy; Rutowski, Roman

    2005-04-01

    Risk factors for obstetrical brachial plexus palsy include: (1) large birth weight, (2) shoulder dystocia and a prolonged second stage of labour, (3) instrumental vaginal delivery (forceps delivery, vacuum extraction), (4) diabetes mellitus and maternal obesity, (5) breech presentation, (6) delivery of an infant with obstetrical brachial plexus palsy in an antecedent delivery. The purpose was an analysis of the classical risk factors for brachial plexus palsy based on our own clinical material. The clinical material consists of 83 children with obstetrical brachial plexus palsy treated at the Department of Trauma and Hand Surgery (surgically--54, conservatively--29). The control group consists of 56 healthy newborns. Data recorded included: birth weight, body length, head and chest circumference, Apgar score at 1 min, type of brachial palsy and side affected, type of birth, presentation, duration of delivery (second stage), age of mother, mother's diseases, and parity. The infants treated surgically had a significantly higher birth weight, body length, and head and chest circumference in comparison with the control group and the group treated conservatively. The differences were statistically significant. Shoulder dystocia occurred in 32.9% of all vaginal deliveries. Instrumental vaginal delivery was observed in 11.3% and breech presentation in 4.9% of cases. There were no incidences of recurrent obstetrical brachial plexus palsy. Diabetes mellitus and maternal obesity were found in 3 cases. (1) Fetal macrosomia is an important risk factor for obstetrical brachial plexus palsy. (2) Obstetrical brachial plexus palsy may also occur in the absence of the classical risk factors.

  4. Securitization of Receivables - An Analysis of the Inherent Risks

    Directory of Open Access Journals (Sweden)

    José Roberto Ferreira Savoia

    2009-06-01

    Full Text Available Securitization is a modality of structured finance which allows a company to raise funds based on its receivables through the capital markets. In Brazil, securitization was developed mostly in the form of mutual funds - the FIDC, which raise money by issuing senior quotas for qualified investors, and subordinated quotas, usually bought by the company that originated the receivables. This paper evaluates the risk and return for both kinds of investors through a stochastic model with two main variables: interest rates and default rates. The model is also sensitive to the characteristics of the fund, such as the proportion of subordinated quotas, the type of asset being securitized, and the amount of receivables in relation to the assets. In the case of senior quotas, the risk of returns below the base interest rate is highly improbable; in the case of subordinated quotas, the risk of returns below the base interest rate may still be considered low, due to the high spreads observed in the Brazilian financial market. The simulations indicated that, under historically mean interest rate volatility, default rates are the main component of total risk. According to the analysis of international regulatory standards developed here, the Brazilian Central Bank imposes very strong capital requirements on banks that securitize their assets and purchase the corresponding subordinated quotas.

  5. Risk Propagation Analysis and Visualization using Percolation Theory

    Directory of Open Access Journals (Sweden)

    Sandra Konig

    2016-01-01

    Full Text Available This article presents a percolation-based approach for the analysis of risk propagation, using malware spreading as a showcase example. Conventional risk management is often driven by human (subjective) assessment of how one risk influences the other, respectively, how security incidents can affect subsequent problems in interconnected (sub)systems of an infrastructure. Using percolation theory, a well-established methodology in the fields of epidemiology and disease spreading, a simple simulation-based method is described to assess risk propagation systematically. This simulation is formally analyzed using percolation theory to obtain closed-form criteria that help predict a pandemic incident propagation (or a propagation with average-case bounded implications). The method is designed as a security decision support tool, e.g., to be used in security operation centers. For that matter, a flexible visualization technique is devised, which is naturally induced by the percolation model and the simulation algorithm that derives from it. The main output of the model is a graphical visualization of the infrastructure (physical or logical topology). This representation uses color codes to indicate the likelihood of problems arising from a security incident that initially occurs at a given point in the system. Large likelihoods for problems thus indicate "hotspots", where additional action should be taken.
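As a toy illustration of the simulation style the article describes, the sketch below runs repeated bond-percolation trials on an invented five-node topology: an incident starts at one node and traverses each edge independently with probability p, and per-node hit frequencies approximate the likelihoods that a risk map would color. Node names, topology, and p are all assumptions, not taken from the paper.

```python
import random

# Invented directed infrastructure topology: node -> downstream neighbors
EDGES = {"fw": ["web", "mail"], "web": ["db", "app"], "app": ["db"],
         "mail": [], "db": []}

def spread(start, p, rng):
    """One percolation trial: each edge transmits independently with prob. p."""
    hit, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nb in EDGES[node]:
            if nb not in hit and rng.random() < p:
                hit.add(nb)
                frontier.append(nb)
    return hit

def risk_map(start, p, trials=20000, seed=7):
    """Estimate per-node compromise likelihood by Monte Carlo."""
    rng = random.Random(seed)
    counts = {n: 0 for n in EDGES}
    for _ in range(trials):
        for n in spread(start, p, rng):
            counts[n] += 1
    return {n: c / trials for n, c in counts.items()}

likelihood = risk_map("fw", p=0.4)
# "hotspots": non-origin nodes whose likelihood exceeds a chosen threshold
hotspots = sorted(n for n, v in likelihood.items() if n != "fw" and v > 0.3)
print(likelihood, hotspots)
```

In a real tool these frequencies would drive the color coding of the topology view; here they simply show that nodes one hop from the incident are far more exposed than nodes further downstream.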

  6. Proper Nouns in Translation: Should They Be Translated?

    Directory of Open Access Journals (Sweden)

    Rouhollah Zarei

    2014-11-01

    Full Text Available The translation of proper nouns is not as easy as that of other parts of speech, as it is more challenging for certain reasons. The present article presents a descriptive study of proper nouns in translation, scrutinizing the challenges and exploring the solutions. Building on some scholars' approaches and suggestions from other researchers, the article clarifies the nature and problems of proper nouns in translation; it seeks to answer three questions: 1) Should proper nouns be translated? 2) What are the problems in the translation of proper nouns? 3) How can the translator overcome such problems? Moreover, strategies applied by the researchers to make their translation easier are also discussed. It follows that translating proper nouns is not simple and there is little flexibility about translating proper nouns. Keywords: proper nouns, translation, strategies

  7. Relative Proper Motions in the Rho Ophiuchi Cluster

    Science.gov (United States)

    Wilking, Bruce A.; Vrba, Frederick J.; Sullivan, Timothy

    2015-12-01

    Near-infrared images optimized for astrometry have been obtained for four fields in the high-density L 1688 cloud core over a 12 year period. The targeted regions include deeply embedded young stellar objects (YSOs) and very low luminosity objects too faint and/or heavily veiled for spectroscopy. Relative proper motions in R.A. and decl. were computed for 111 sources and again for a subset of 65 YSOs, resulting in a mean proper motion of (0,0) for each field. Assuming each field has the same mean proper motion, YSOs in the four fields were combined to yield estimates of the velocity dispersions in R.A. and decl. that are consistent with 1.0 km s-1. These values appear to be independent of the evolutionary state of the YSOs. The observed velocity dispersions are consistent with the dispersion in radial velocity derived for optically visible YSOs at the periphery of the cloud core and are consistent with virial equilibrium. The higher velocity dispersion of the YSOs in the plane of the sky relative to that of dense cores may be a consequence of stellar encounters due to dense cores and filaments fragmenting to form small groups of stars or the global collapse of the L 1688 cloud core. An analysis of the differential magnitudes of objects over the 12 year baseline has not only confirmed the near-infrared variability for 29 YSOs established by prior studies, but has also identified 18 new variability candidates. Four of these have not been previously identified as YSOs and may be newly identified cluster members.

  8. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life but also about the safety of life and property, so the impact of disasters on life and property is the problem of greatest concern to residents. For the mitigation of disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung city was evaluated by statistics of social development factors. The hazard factors of Kaohsiung city were calculated from the simulated flood depths of six different return periods and of four typhoon events which resulted in serious flooding in Kaohsiung city. The flood risk can be obtained by combining the flood hazard and social vulnerability. The analysis results provide authorities with a basis to strengthen disaster preparedness and to allocate more resources in high-risk areas.

  9. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    Michael Möderl; Wolfgang Rauch

    2011-01-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused e.g. by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of events in the past. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.

  10. Risk-management and risk-analysis-based decision tools for attacks on electric power.

    Science.gov (United States)

    Simonoff, Jeffrey S; Restrepo, Carlos E; Zimmerman, Rae

    2007-06-01

    Incident data about disruptions to the electric power grid provide useful information that can be used as inputs into risk management policies in the energy sector for disruptions from a variety of origins, including terrorist attacks. This article uses data from the Disturbance Analysis Working Group (DAWG) database, which is maintained by the North American Electric Reliability Council (NERC), to look at incidents over time in the United States and Canada for the period 1990-2004. Negative binomial regression, logistic regression, and weighted least squares regression are used to gain a better understanding of how these disturbances varied over time and by season during this period, and to analyze how characteristics such as number of customers lost and outage duration are related to different characteristics of the outages. The results of the models can be used as inputs to construct various scenarios to estimate potential outcomes of electric power outages, encompassing the risks, consequences, and costs of such outages.
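Of the model families named above, weighted least squares has a simple closed form and can be sketched directly; the outage data below are synthetic stand-ins, not DAWG records, and the weights are illustrative:

```python
import numpy as np

# Synthetic example: relate an outage characteristic (e.g. log duration)
# to a predictor (e.g. scaled customers lost), with assumed observation
# weights. Design matrix has an intercept column plus one predictor.
X = np.array([[1, 1.0], [1, 2.0], [1, 3.0], [1, 4.0]])
y = np.array([1.1, 1.9, 3.2, 3.8])
w = np.array([1.0, 2.0, 2.0, 1.0])

W = np.diag(w)
# Closed-form WLS estimate: beta = (X' W X)^-1 X' W y
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

Negative binomial and logistic regression, the other two families mentioned, require iterative fitting and are usually handled by a statistics library rather than a closed form.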

  11. Toward an Empirical Taxonomy of Suicide Ideation: A Cluster Analysis of the Youth Risk Behavior Survey

    Science.gov (United States)

    Flannery, William Peter; Sneed, Carl D.; Marsh, Penny

    2003-01-01

    In this study we examined adolescent risk behaviors, giving special attention to suicide ideation. Cluster analysis was used to classify adolescents ( N = 2,730) on the Youth Risk Behavior Survey. Six clusters of adolescent risk behavior were identified. Although each risk cluster was distinct, some clusters shared overlapping risk behaviors.…

  12. Risk analysis of investments in farm milk cooling tanks

    Directory of Open Access Journals (Sweden)

    Sant'Anna Danielle D.

    2003-01-01

    Full Text Available A risk analysis for the installation of milk cooling tanks (250, 500 and 1,000 L) on Brazilian rural properties was conducted in this study. The results showed that all investments had a return higher than the annual 12% minimum rate of attractiveness. There was a direct relationship between tank size and investment profitability and an inverse relationship between size and risk. The probability of achieving returns lower than the opportunity cost was highest for the smallest tank (42%). In order to make the investment in small cooling tanks more attractive, the dairy industry incentives offered to farmers for supplying cooled milk could be increased. However, this approach might make investments in bulk milk collection by dairy companies infeasible. Thus, a recommendable strategy for a successful modernization of the Brazilian dairy sector's inbound logistics would be to promote an increase in the volume of milk produced per farm.

  13. [Risk factors analysis of cardiovascular diseases. Is the Life Style Assessment useful?].

    Science.gov (United States)

    Bye, A

    1997-08-10

    The Norwegian Medical Association's Health Control Handbook (1993) introduced a lifestyle risk analysis, a paper-based way of assessing risk factors for cardiovascular disease and translating them into pedagogic risk scores. By using the lifestyle risk analysis alongside our data-based risk profile system LIVDA, we compared and evaluated the two systems over 437 consultations at our occupational health clinic. The lifestyle risk analysis is a pedagogic tool compared with the unsystematic clinical information recorded in journals. We found only small differences between the lifestyle risk analysis and LIVDA, except when assessing total cholesterol and physical exercise. The lifestyle risk analysis does not, however, allow categorisation of risk factor values without adjustments, and does not include all relevant risk factors. Further, it offers no possibility of measuring motivation or of selecting patients for group intervention.

  14. Survival analysis in the presence of competing risks.

    Science.gov (United States)

    Zhang, Zhongheng

    2017-02-01

    Survival analysis in the presence of competing risks imposes additional challenges for clinical investigators, in that the hazard function (the rate) has no one-to-one link to the cumulative incidence function (CIF, the risk). The CIF is of particular interest and can be estimated non-parametrically with the cuminc() function. This function also allows for group comparison and visualization of the estimated CIF. The effect of covariates on the cause-specific hazard can be explored using a conventional Cox proportional hazards model by treating competing events as censoring. However, the effect on the hazard cannot be directly linked to the effect on the CIF because there is no one-to-one correspondence between hazard and cumulative incidence. The Fine-Gray model directly models the covariate effect on the CIF and reports the subdistribution hazard ratio (SHR). However, the SHR only provides information on the ordering of CIF curves at different levels of covariates; it has no practical interpretation in the way the HR does in the absence of competing risks. The Fine-Gray model can be fit with the crr() function shipped with the cmprsk package. Time-varying covariates are allowed in the crr() function and are specified by the cov2 and tf arguments. Prediction and visualization of the CIF for subjects with given covariate values are available for a crr object. Alternatively, competing risk models can be fit with the riskRegression package by employing different link functions between covariates and outcomes. The assumption of proportionality can be checked by testing the statistical significance of interaction terms involving failure time. Schoenfeld residuals provide another way to check model assumptions.
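The quantity that R's cuminc() estimates, the non-parametric cumulative incidence per cause, can be sketched in a few lines of Python. This is a minimal single-sample Aalen-Johansen-style estimator on synthetic data with no tied event times, not a replacement for the cmprsk package:

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Non-parametric CIF for one cause. events: 0 = censored,
    otherwise the event-type code. Assumes no tied event times."""
    order = np.argsort(times)
    t, e = times[order], events[order]
    at_risk = len(t)
    surv = 1.0   # overall (all-cause) survival just before current time
    cif = 0.0
    curve = []
    for ti, ei in zip(t, e):
        if ei == cause:
            cif += surv / at_risk          # CIF jump uses overall survival
        if ei != 0:
            surv *= 1.0 - 1.0 / at_risk    # any event reduces survival
        at_risk -= 1
        curve.append((ti, cif))
    return cif, curve

times = np.array([2.0, 3.0, 5.0, 7.0, 8.0, 10.0, 12.0])
events = np.array([1, 2, 1, 0, 2, 1, 0])   # two competing causes + censoring

cif1, _ = cumulative_incidence(times, events, cause=1)
cif2, _ = cumulative_incidence(times, events, cause=2)
```

Unlike treating the competing event as censoring and computing 1 − KM per cause, the CIFs estimated this way never sum above 1, which is exactly the rate-versus-risk distinction the abstract describes.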

  15. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  16. Vertical velocities from proper motions of red clump giants

    Science.gov (United States)

    López-Corredoira, M.; Abedi, H.; Garzón, F.; Figueras, F.

    2014-12-01

    Aims: We derive the vertical velocities of disk stars in the range of Galactocentric radii R = 5-16 kpc within 2 kpc in height from the Galactic plane. This kinematic information is connected to dynamical aspects of the formation and evolution of the Milky Way, such as the passage of satellites and vertical resonances, and determines whether the warp is a long-lived or a transient feature. Methods: We used the PPMXL survey, which contains the USNO-B1 proper motions catalog cross-correlated with the astrometry and near-infrared photometry of the 2MASS point source catalog. To improve the accuracy of the proper motions, the systematic shifts from zero were calculated using the average proper motions of quasars in the PPMXL survey, and the corresponding correction was applied to the proper motions of the whole survey, which reduces the systematic error. From the color-magnitude diagram K versus (J − K) we selected the standard candles corresponding to red clump giants and used their proper motions to build a map of the vertical motions of our Galaxy. We derived the kinematics of the warp both analytically and through a particle simulation to fit these data. Complementarily, we also carried out the same analysis with red clump giants spectroscopically selected from APOGEE data, and we predict the improvements in accuracy that will be reached with future Gaia data. Results: A simple warp model with disk height z_w(R, φ) = γ(R − R⊙) sin(φ − φ_w) fits the vertical motions if γ̇/γ = −34 ± 17 Gyr⁻¹; the contribution to γ̇ comes from the southern warp and is negligible in the north. If we assume this 2σ detection to be real, the period of this oscillation is shorter than 0.43 Gyr at 68.3% C.L. and shorter than 4.64 Gyr at 95.4% C.L., which excludes with high confidence the slow variations (periods longer than 5 Gyr) that correspond to long-lived features. Our particle simulation also indicates a probable abrupt decrease

  17. ANALYSIS OF ROMANIAN SMALL AND MEDIUM ENTERPRISES BANKRUPTCY RISK

    Directory of Open Access Journals (Sweden)

    Kulcsar Edina

    2014-07-01

    Full Text Available Considering the fundamental role of small and medium enterprises in the Romanian economy, this paper aims to quantify their level of bankruptcy risk over the 2009-2012 period, after the onset of the financial crisis. The main reason for selecting this type of company is that they represent the backbone of the national economy: they play an indispensable role because they offer jobs to a great part of the population, and their contribution to GDP is considerable. Two default risk models were applied, namely the well-known Altman Z-score model, based on five financial ratios, and a bankruptcy prediction model developed by Teti et al. (2012), first used exclusively for Italian small and medium-sized enterprises over the 2006-2009 period. The model proposed by Teti et al. is based on an investigation of financially distressed and financially non-distressed Italian small and medium-sized enterprises during the financial crisis using a discriminant analysis model. They conclude that four financial ratios characterize small and medium-sized enterprise bankruptcy risk well: Debt/Total Assets, Return on Sales (ROS), EBIT/Interest Expenses and Working Capital/EBITDA. They consider that small and medium-sized enterprises require a particular approach in terms of bankruptcy risk analysis. In the present study I compare the efficiency of a traditional bankruptcy risk model with a model specific to small and medium-sized enterprises. The database for the present analysis is provided by the simplified financial reports of 120 small and medium-sized enterprises registered in Bihor County. The selected enterprises operate in the manufacturing industry (21.67%) and in trade (78.33%). The present investigation is of particular value in the current economic context, in which the health and sustainability of small and medium-sized enterprises is a great issue. The results of the study show contradictory
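For reference, the five-ratio Altman Z-score mentioned above is a fixed linear combination and can be computed directly. The coefficients below are the original 1968 ones for public manufacturing firms; the balance-sheet figures are invented, and the Teti et al. model's coefficients are not given in the abstract, so only Altman is shown:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Original (1968) Altman Z-score for public manufacturing firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Invented example firm (figures in thousands):
z = altman_z(working_capital=150, retained_earnings=220, ebit=90,
             market_value_equity=400, sales=1_100, total_assets=1_000,
             total_liabilities=500)
# Conventional zones: Z > 2.99 "safe", Z < 1.81 "distress", else grey area.
```

A firm scoring between the two cutoffs, as here, falls in the grey area where the model makes no confident call.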

  18. Risk analysis and the law: international law, the World Trade Organization, Codex Alimentarius and national legislation.

    Science.gov (United States)

    Horton, L R

    2001-12-01

    This paper discusses the place of risk analysis in international trade from a US perspective, by examining the activities of the World Trade Organization and the Codex Alimentarius Commission. After considering what the trade agreements say about risk analysis and how international bodies are advancing and using it, the paper assesses how risk analysis is used at the national level. Finally, recommendations are made for strengthening international food safety initiatives.

  19. Risk Assessment and Prediction of Flyrock Distance by Combined Multiple Regression Analysis and Monte Carlo Simulation of Quarry Blasting

    Science.gov (United States)

    Armaghani, Danial Jahed; Mahdiyar, Amir; Hasanipanah, Mahdi; Faradonbeh, Roohollah Shirani; Khandelwal, Manoj; Amnieh, Hassan Bakhshandeh

    2016-09-01

    Flyrock is considered one of the main causes of human injury, fatalities, and structural damage among all the undesirable environmental impacts of blasting. Proper prediction/simulation of flyrock is therefore essential, especially for determining the blast safety area. If proper control measures are taken, the flyrock distance can be controlled and, in return, the risk of damage can be reduced or eliminated. The first objective of this study was to develop a predictive model for flyrock estimation based on multiple regression (MR) analysis; the developed MR model was then used to simulate the flyrock phenomenon with the Monte Carlo (MC) approach. To achieve the objectives of this study, 62 blasting operations were investigated in the Ulu Tiram quarry, Malaysia, and several controllable and uncontrollable factors were carefully recorded or calculated. The results of the MC modeling indicated that this approach is capable of simulating flyrock ranges with a good level of accuracy: the mean of the MC-simulated flyrock was 236.3 m, while the measured mean was 238.6 m. Furthermore, a sensitivity analysis was conducted to investigate the effects of the model inputs on the output of the system. The analysis demonstrated that powder factor is the most influential parameter on flyrock among all model inputs. Note that the proposed MR and MC models should be utilized only in the studied area; their direct use under other conditions is not recommended.
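The MR-plus-MC chain described above amounts to sampling the regression inputs from assumed distributions and propagating them through the fitted equation. In this sketch the coefficients and input distributions are invented, not the values fitted for the Ulu Tiram data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical fitted MR model: flyrock (m) from burden (m) and
# powder factor (kg/m^3); coefficients are illustrative only.
b0, b_burden, b_pf = 60.0, -25.0, 200.0

# Sample the uncertain inputs from assumed distributions.
burden = rng.normal(3.0, 0.3, n)
powder_factor = rng.normal(1.2, 0.1, n)

# Propagate each sample through the regression equation.
flyrock = b0 + b_burden * burden + b_pf * powder_factor
mean_flyrock = flyrock.mean()
p95 = np.percentile(flyrock, 95)   # e.g. basis for a blast safety radius
```

A safety area would typically be set from an upper percentile of the simulated distribution rather than its mean.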

  20. Integrated transcriptome and methylome analysis in youth at high risk for bipolar disorder: a preliminary analysis.

    Science.gov (United States)

    Fries, G R; Quevedo, J; Zeni, C P; Kazimi, I F; Zunta-Soares, G; Spiker, D E; Bowden, C L; Walss-Bass, C; Soares, J C

    2017-03-14

    First-degree relatives of patients with bipolar disorder (BD), particularly their offspring, have a higher risk of developing BD and other mental illnesses than the general population. However, the biological mechanisms underlying this increased risk are still unknown, particularly because most of the studies so far have been conducted in chronically ill adults and not in unaffected youth at high risk. In this preliminary study we analyzed genome-wide expression and methylation levels in peripheral blood mononuclear cells from children and adolescents from three matched groups: BD patients, unaffected offspring of bipolar parents (high risk) and controls (low risk). By integrating gene expression and DNA methylation and comparing the lists of differentially expressed genes and differentially methylated probes between groups, we were able to identify 43 risk genes that discriminate patients and high-risk youth from controls. Pathway analysis showed an enrichment of the glucocorticoid receptor (GR) pathway with the genes MED1, HSPA1L, GTF2A1 and TAF15, which might underlie the previously reported role of stress response in the risk for BD in vulnerable populations. Cell-based assays indicate a GR hyporesponsiveness in cells from adult BD patients compared to controls and suggest that these GR-related genes can be modulated by DNA methylation, which poses the theoretical possibility of manipulating their expression as a means to counteract the familial risk presented by those subjects. Although preliminary, our results suggest the utility of peripheral measures in the identification of biomarkers of risk in high-risk populations and further emphasize the potential role of stress and DNA methylation in the risk for BD in youth.

  1. Meta-analysis: Circulating vitamin D and ovarian cancer risk.

    Science.gov (United States)

    Yin, Lu; Grandi, Norma; Raum, Elke; Haug, Ulrike; Arndt, Volker; Brenner, Hermann

    2011-05-01

    To review and summarize evidence from longitudinal studies on the association between circulating 25-hydroxyvitamin D (25(OH)D) and the risk of ovarian cancer (OC), relevant prospective cohort studies and nested case-control studies were identified by systematically searching the Ovid Medline, EMBASE, and ISI Web of Knowledge databases and by cross-referencing. The following data were extracted in a standardized manner from eligible studies: first author, publication year, country, study design, characteristics of the study population, duration of follow-up, OC incidence according to circulating vitamin D status and the respective relative risks, and covariates adjusted for in the analysis. Due to the heterogeneity of studies in categorizing circulating vitamin D levels, all results were recalculated for an increase of circulating 25(OH)D by 20 ng/ml. Summary relative risks (RRs) were calculated using meta-analysis methods. Overall, ten individual-level studies that reported on the association between circulating vitamin D levels and OC incidence were included. Meta-analysis of studies on OC incidence resulted in a summary RR (95% confidence interval, CI) of 0.83 (0.63-1.08) for an increase of 25(OH)D by 20 ng/ml (P=0.160). No indication of heterogeneity or publication bias was found. A tentative inverse association of circulating 25(OH)D with OC incidence was found, which did not reach statistical significance but requires clarification by additional studies due to its potentially high clinical and public health impact.
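The pooling step ("summary relative risks were calculated using meta-analysis methods") is commonly done by inverse-variance weighting on the log scale. The study-level RRs and confidence intervals below are invented for illustration, not the ten studies in this meta-analysis:

```python
import math

# (RR, 95% CI lower, 95% CI upper) per study -- illustrative values.
studies = [(0.75, 0.50, 1.12), (0.90, 0.65, 1.25), (0.85, 0.60, 1.20)]

num = den = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    num += w * log_rr
    den += w

pooled_log = num / den
pooled_rr = math.exp(pooled_log)
se_pooled = den ** -0.5
ci = (math.exp(pooled_log - 1.96 * se_pooled),
      math.exp(pooled_log + 1.96 * se_pooled))
```

This is the fixed-effect version; a random-effects pooling would additionally estimate between-study variance before weighting.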

  2. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or on inhabitant density, computed over statistical or administrative terrain units such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units dominated by rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs more than two times depending on the application of the traditional approach or the dasymetric
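The building areal-weighting step in (iii) reduces to distributing each census unit's population over its buildings in proportion to footprint area. A minimal sketch with invented areas:

```python
# Redistribute a census unit's population to its buildings in proportion
# to building footprint area (simple dasymetric areal weighting).
def dasymetric_allocate(unit_population, building_areas):
    total_area = sum(building_areas)
    return [unit_population * a / total_area for a in building_areas]

# One hypothetical BGRI unit with 900 inhabitants and three buildings:
allocation = dasymetric_allocate(900, [100.0, 200.0, 300.0])
```

The allocation preserves the unit total while refining its spatial detail, which is exactly why the dasymetric map can diverge from the per-unit density in sparsely populated units.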

  3. Probabilistic Approach to Risk Analysis of Chemical Spills at Sea

    Institute of Scientific and Technical Information of China (English)

    Magda Bogalecka; Krzysztof Kolowrocki

    2006-01-01

    Risk analysis of chemical spills at sea and their consequences for the sea environment is discussed. Mutual interactions between the process of sea accident initiating events, the process of sea environment threats, and the process of sea environment degradation are investigated. To describe these three processes, separate semi-Markov models are built. These models are then joined into one general model of the processes' interactions. Moreover, some comments on the method for statistical identification of the considered models are given.
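A semi-Markov model of the kind described, an embedded transition matrix plus state-dependent sojourn times, can be sketched as follows. The states mirror the three processes named above, but the transition probabilities and mean sojourn times are invented, not the paper's fitted models:

```python
import random

random.seed(1)

# Embedded Markov chain over three illustrative states.
P = {
    "initiating_event":   {"initiating_event": 0.2, "environment_threat": 0.7, "degradation": 0.1},
    "environment_threat": {"initiating_event": 0.3, "environment_threat": 0.2, "degradation": 0.5},
    "degradation":        {"initiating_event": 0.6, "environment_threat": 0.3, "degradation": 0.1},
}
# Semi-Markov part: sojourn time in each state (here exponential,
# with assumed means in hours).
MEAN_SOJOURN = {"initiating_event": 2.0, "environment_threat": 5.0, "degradation": 12.0}

def simulate(n_steps, start="initiating_event"):
    state, clock, path = start, 0.0, [start]
    for _ in range(n_steps):
        clock += random.expovariate(1.0 / MEAN_SOJOURN[state])
        r, cum = random.random(), 0.0
        for nxt, p in P[state].items():   # sample next state from the row
            cum += p
            if r <= cum:
                state = nxt
                break
        path.append(state)
    return clock, path

total_time, path = simulate(50)
```

In a true semi-Markov fit, both the transition matrix and the sojourn-time distributions would be identified statistically from accident records, which is the identification step the abstract alludes to.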

  4. Radiological risk analysis of potential SP-100 space mission scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Bartram, B.W.; Weitzberg, A.

    1988-08-19

    This report presents a radiological risk analysis of three representative space mission scenarios utilizing a fission reactor. The mission profiles considered are: a high-altitude mission, launched by a TITAN IV launch vehicle and boosted by chemical upper stages into its operational orbit; an interplanetary nuclear electric propulsion (NEP) mission, started directly from a Shuttle parking orbit; and a low-altitude mission, launched by the Shuttle and boosted by a chemical stage to its operational orbit, with a subsequent disposal boost after operation. 21 refs., 12 figs., 7 tabs.

  5. Department of Defense Energy and Logistics: Implications of Historic and Future Cost, Risk, and Capability Analysis

    Science.gov (United States)

    Tisa, Paul C.

    Every year the DoD spends billions satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and depend on uncertain security-environment and technology futures. These components and their relationships are poorly understood, and without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been a metric known as the Fully Burdened Cost of Energy (FBCE), defined as the commodity price of fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated; the current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework for doing so. It is broken into five parts. The first describes the motivation behind FBCE, the limits of its current implementation, and the outline of a new decision-aiding framework. The second, third, and fourth, respectively, present a historical analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and pull the historical analysis into a revised framework. The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and

  6. Characterizations of Graphs Having Large Proper Connection Numbers

    Directory of Open Access Journals (Sweden)

    Lumduanhom Chira

    2016-05-01

    Full Text Available Let G be an edge-colored connected graph. A path P is a proper path in G if no two adjacent edges of P are colored the same. If P is a proper u − v path of length d(u, v), then P is a proper u − v geodesic. An edge coloring c is a proper-path coloring of a connected graph G if every pair u, v of distinct vertices of G is connected by a proper u − v path in G, and c is a strong proper-path coloring if every two vertices u and v are connected by a proper u − v geodesic in G. The minimum number of colors required for a proper-path coloring or strong proper-path coloring of G is called the proper connection number pc(G) or strong proper connection number spc(G) of G, respectively. If G is a nontrivial connected graph of size m, then pc(G) ≤ spc(G) ≤ m, and pc(G) = m or spc(G) = m if and only if G is the star of size m. In this paper, we determine all connected graphs G of size m for which pc(G) or spc(G) is m − 1, m − 2 or m − 3.
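The proper-path property defined above is easy to check computationally for a given coloring. This sketch (not from the paper) searches over (vertex, color of last edge used) states, so a walk may revisit a vertex as long as consecutive edge colors differ:

```python
from collections import deque
from itertools import combinations

def is_proper_path_coloring(vertices, colored_edges):
    """True if every pair of vertices is joined by a path whose
    consecutive edges have different colors. Edges are (u, v, color)."""
    adj = {v: [] for v in vertices}
    for u, v, c in colored_edges:
        adj[u].append((v, c))
        adj[v].append((u, c))

    def proper_path_exists(s, t):
        start = list(adj[s])               # first edge: any color allowed
        seen = set(start)
        queue = deque(start)
        while queue:
            v, last = queue.popleft()
            if v == t:
                return True
            for w, c in adj[v]:
                if c != last and (w, c) not in seen:  # proper continuation
                    seen.add((w, c))
                    queue.append((w, c))
        return False

    return all(proper_path_exists(u, v) for u, v in combinations(vertices, 2))

# A monochromatic path on three vertices has no proper a-c path,
# so one color is not enough; two colors suffice (pc of a path is 1... 
# here the coloring, not the number, is what is being verified).
one_color = is_proper_path_coloring("abc", [("a", "b", 1), ("b", "c", 1)])
two_colors = is_proper_path_coloring("abc", [("a", "b", 1), ("b", "c", 2)])
```

Computing pc(G) itself would require searching over colorings; the check above is the feasibility test inside such a search.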

  7. Metabolic disease risk in children by salivary biomarker analysis.

    Science.gov (United States)

    Goodson, J Max; Kantarci, Alpdogan; Hartman, Mor-Li; Denis, Gerald V; Stephens, Danielle; Hasturk, Hatice; Yaskell, Tina; Vargas, Jorel; Wang, Xiaoshan; Cugini, Maryann; Barake, Roula; Alsmadi, Osama; Al-Mutawa, Sabiha; Ariga, Jitendra; Soparkar, Pramod; Behbehani, Jawad; Behbehani, Kazem; Welty, Francine

    2014-01-01

    The study of obesity-related metabolic syndrome or Type 2 diabetes (T2D) in children is particularly difficult because of fear of needles. We tested a non-invasive approach to studying inflammatory parameters in an at-risk population of children, to provide proof of principle for future investigations of vulnerable subjects. We evaluated metabolic differences in 744 eleven-year-old children selected from underweight, normal healthy weight, overweight, and obese categories by analyzing fasting saliva samples for 20 biomarkers. Saliva supernatants were obtained following centrifugation and used for the analyses. Salivary C-reactive protein (CRP) was 6 times higher, salivary insulin and leptin were 3 times higher, and adiponectin was 30% lower in obese children compared to healthy normal weight children (all P<0.0001). Categorical analysis suggested that there might be three types of obesity in children. Distinctly inflammatory characteristics appeared in 76% of obese children, while in 13%, salivary insulin was high but not associated with inflammatory mediators. The remaining 11% of obese children had high insulin and reduced adiponectin. Forty percent of the non-obese children were found in groups which, based on biomarker characteristics, may be at risk of becoming obese. Significantly altered levels of salivary biomarkers in obese children from a high-risk population suggest the potential for developing non-invasive screening procedures to identify T2D-vulnerable individuals and a means to test preventative strategies.

  8. Metabolic disease risk in children by salivary biomarker analysis.

    Directory of Open Access Journals (Sweden)

    J Max Goodson

    Full Text Available OBJECTIVE: The study of obesity-related metabolic syndrome or Type 2 diabetes (T2D) in children is particularly difficult because of fear of needles. We tested a non-invasive approach to study inflammatory parameters in an at-risk population of children to provide proof-of-principle for future investigations of vulnerable subjects. DESIGN AND METHODS: We evaluated metabolic differences in 744 11-year-old children selected from underweight, normal healthy weight, overweight and obese categories by analyzing fasting saliva samples for 20 biomarkers. Saliva supernatants were obtained following centrifugation and used for analyses. RESULTS: Salivary C-reactive protein (CRP) was 6 times higher, salivary insulin and leptin were 3 times higher, and adiponectin was 30% lower in obese children compared to healthy normal weight children (all P<0.0001). Categorical analysis suggested that there might be three types of obesity in children. Distinctly inflammatory characteristics appeared in 76% of obese children, while in 13%, salivary insulin was high but not associated with inflammatory mediators. The remaining 11% of obese children had high insulin and reduced adiponectin. Forty percent of the non-obese children were found in groups which, based on biomarker characteristics, may be at risk for becoming obese. CONCLUSIONS: Significantly altered levels of salivary biomarkers in obese children from a high-risk population suggest the potential for developing non-invasive screening procedures to identify T2D-vulnerable individuals and a means to test preventative strategies.

  9. Pressure Systems Stored-Energy Threshold Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective in February 2007, brought to light potential weaknesses in the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based on pressure or any other criterion, so the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as a criterion for establishing a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.
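As an illustration of screening against a stored-energy threshold, one common approximation for compressed-gas stored energy is the Brode form E = (P₁ − P₀)·V/(γ − 1). The abstract does not state which formula PNNL adopted, so treat both the formula choice and the example numbers as assumptions:

```python
# Brode-type estimate of stored energy in a compressed-gas volume,
# screened against the 1000 lbf-ft threshold discussed above.
GAMMA_AIR = 1.4                   # ratio of specific heats for air
LBF_FT_PER_J = 1.0 / 1.3558179    # 1 lbf-ft is about 1.3558 J

def stored_energy_lbf_ft(gauge_pressure_pa, volume_m3, gamma=GAMMA_AIR):
    energy_j = gauge_pressure_pa * volume_m3 / (gamma - 1.0)
    return energy_j * LBF_FT_PER_J

# Example: 10 L vessel at 0.5 MPa gauge (illustrative numbers only).
e = stored_energy_lbf_ft(0.5e6, 0.010)
exceeds_threshold = e > 1000.0
```

Even this modest vessel lands well above the 1000 lbf-ft line, which is consistent with the threshold being a screening criterion rather than a rarity.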

  10. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN, extracting crucial information about the different methodologies used and the profiles of the people who perform risk assessments, and determining whether a risk matrix was used and whether an acceptable level of risk was defined. The second step of the project was to trigger discussion within HSE about risk assessment by suggesting a risk matrix and a risk assessment template.

  11. Risk Perception Analysis Related To Existing Dams In Italy

    Science.gov (United States)

    Solimene, Pellegrino

    2013-04-01

    earthfill dam is illustrated by defining the risk analysis during its construction and operation. A qualitative "Event Tree Analysis" clarifies, with an example, the probability of occurrence of the events triggered by an earthquake and leads to a classification of the damage level. Finally, a System Dynamics (SD) approach is presented to investigate possibilities for preventive planning in relation to the risk, so that shared procedures can be established to achieve correct management in any crisis phase. As a qualitative result of an SD application, figure 1 presents a flow-chart of a case study on the same dam, illustrating the emergency planning in a step-by-step procedure according to the regulations.
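The event tree described above can be mirrored numerically by multiplying the branch probabilities along each path from the initiating earthquake to an outcome. All numbers here are invented for illustration, not the study's dam-specific values:

```python
from math import prod

annual_initiating_freq = 1e-3     # earthquakes per year (assumed)

# Branch probabilities along each path of the tree (illustrative).
paths = {
    "no structural damage": [0.90],
    "damage, no breach":    [0.10, 0.70],
    "dam breach":           [0.10, 0.30],
}

# Outcome frequency = initiating frequency x product of branch probabilities.
outcome_freq = {name: annual_initiating_freq * prod(p)
                for name, p in paths.items()}
```

Because the branches at each node are exhaustive, the outcome frequencies sum back to the initiating frequency, a useful sanity check when the tree grows.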

  12. RAVEN, a New Software for Dynamic Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cristian Rabiti; Andrea Alfonsi; Joshua Cogliati; Diego Mandelli; Robert Kinoshita

    2014-06-01

    RAVEN is a generic software driver for performing parametric and probabilistic analysis of codes that simulate complex systems. Initially developed to provide dynamic risk analysis capabilities to the RELAP-7 code [1], it is currently being generalized with the addition of Application Programming Interfaces (APIs). These interfaces are used to extend RAVEN's capabilities to any software, as long as all the parameters that need to be perturbed are accessible via input files or directly through Python interfaces. RAVEN can investigate the system response by probing the input space using Monte Carlo, grid, or Latin Hypercube schemes, but its strength is its focus on system feature discovery, such as limit surfaces separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. The paper presents an overview of the software capabilities and their implementation schemes, followed by some application examples.
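
Of the sampling schemes named above, Latin Hypercube sampling is the least self-explanatory. The sketch below is a minimal generic implementation (independent of RAVEN's actual code): each dimension is split into equal-probability strata, one point is drawn per stratum, and the strata order is shuffled per dimension so rows are decorrelated.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin Hypercube sample on the unit hypercube [0, 1)^n_dims.

    Each of the n_samples rows falls in a distinct 1/n_samples-wide
    stratum in every dimension, giving better space coverage than
    plain Monte Carlo for the same number of code runs.
    """
    rng = np.random.default_rng(rng)
    # One uniform point inside each of the n_samples strata, per dimension.
    u = rng.random((n_samples, n_dims))
    strata = (np.arange(n_samples)[:, None] + u) / n_samples
    # Independently shuffle the stratum order in each dimension.
    for d in range(n_dims):
        rng.shuffle(strata[:, d])
    return strata
```

The returned unit-cube points would then be mapped through the inverse CDF of each perturbed input parameter's distribution.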

  13. Crash Prediction and Risk Evaluation Based on Traffic Analysis Zones

    Directory of Open Access Journals (Sweden)

    Cuiping Zhang

    2014-01-01

    Full Text Available Traffic safety evaluation for traffic analysis zones (TAZs) plays an important role in transportation safety planning and long-range transportation plan development. This paper aims to present a comprehensive analysis of zonal safety evaluation. First, several criteria are proposed to measure crash risk at the zonal level. These criteria are then integrated into one measure, the average hazard index (AHI), which is used to identify unsafe zones. In addition, the study develops a negative binomial regression model to statistically estimate significant factors for the unsafe zones. The model results indicate that zonal crash frequency can be associated with several socioeconomic, demographic, and transportation system factors. The impact of these significant factors on zonal crashes is also discussed. The findings of this study suggest that safety evaluation and estimation might help engineers and decision makers identify high-crash locations for potential safety improvements.
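
The abstract does not give the AHI formula, so the following is only a hypothetical sketch of the general idea: min-max normalise each zonal risk criterion across zones, then average them with equal weights into a single index per zone.

```python
def average_hazard_index(criteria):
    """Toy average hazard index (AHI) over traffic analysis zones.

    criteria: dict zone_id -> list of raw risk-criterion values
    (e.g. crash frequency, crash rate, severity rate).  Each
    criterion is min-max normalised across zones, then averaged
    with equal weights -- a simplified stand-in for the paper's AHI,
    whose exact normalisation and weights are not given here.
    """
    zones = list(criteria)
    n_crit = len(next(iter(criteria.values())))
    norm = {z: [0.0] * n_crit for z in zones}
    for j in range(n_crit):
        vals = [criteria[z][j] for z in zones]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        for z in zones:
            norm[z][j] = (criteria[z][j] - lo) / span
    return {z: sum(norm[z]) / n_crit for z in zones}
```

Zones whose index exceeds a chosen cutoff would be flagged as unsafe and passed to the regression stage.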

  14. GIS-based spatial statistical analysis of risk areas for liver flukes in Surin Province of Thailand.

    Science.gov (United States)

    Rujirakul, Ratana; Ueng-arporn, Naporn; Kaewpitoon, Soraya; Loyd, Ryan J; Kaewthani, Sarochinee; Kaewpitoon, Natthawut

    2015-01-01

    It is urgently necessary to know the distribution and risk areas of the liver fluke Opisthorchis viverrini for proper allocation of prevention and control measures. This study aimed to investigate the human behavior and environmental factors influencing its distribution in Surin Province of Thailand, and to build a model using stepwise multiple regression analysis with a geographic information system (GIS) on environment and climate data. Human behavior, attitude, and land use as wetland (X64) were correlated with the distribution of liver fluke disease at the 0.000, 0.034, and 0.006 levels, respectively. Multiple regression analysis yielded the equation OV = -0.599 + 0.005(X73) + 0.040(X64), used to predict the distribution of liver fluke, where OV is the number of liver fluke infection patients, X73 is population density (148-169 pop/km2), and X64 is land use as wetland; R square = 0.878 and adjusted R square = 0.849. By GIS analysis, we found Si Narong, Sangkha, Phanom Dong Rak, Mueang Surin, Non Narai, Samrong Thap, Chumphon Buri, and Rattanaburi to have the highest distributions in Surin Province. In conclusion, the combination of GIS and statistical analysis can help simulate the spatial distribution and risk areas of liver fluke, and thus may be an important tool for future planning of prevention and control measures.
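
The fitted equation reported in the abstract can be written directly as a function. The coefficients come from the text; the exact coding of the predictor variables is an assumption here.

```python
def predict_ov(pop_density_x73, wetland_x64):
    """Fitted equation from the abstract (R^2 = 0.878):

        OV = -0.599 + 0.005 * X73 + 0.040 * X64

    OV  : predicted number of liver fluke infection patients
    X73 : population density predictor (148-169 pop/km^2 class)
    X64 : land-use-as-wetland predictor
    Variable scales/coding are assumptions, not stated in the abstract.
    """
    return -0.599 + 0.005 * pop_density_x73 + 0.040 * wetland_x64
```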

  15. Invited commentary: multilevel analysis of individual heterogeneity-a fundamental critique of the current probabilistic risk factor epidemiology.

    Science.gov (United States)

    Merlo, Juan

    2014-07-15

    In this issue of the Journal, Dundas et al. (Am J Epidemiol. 2014;180(2):197-207) apply a hitherto infrequent multilevel analytical approach: multiple membership multiple classification (MMMC) models. Specifically, by adopting a life-course approach, they use a multilevel regression with individuals cross-classified in different contexts (i.e., families, early schools, and neighborhoods) to investigate self-reported health and mental health in adulthood. They provide observational evidence suggesting the relevance of the early family environment for launching public health interventions in childhood in order to improve health in adulthood. In their analyses, the authors distinguish between specific contextual measures (i.e., the association between particular contextual characteristics and individual health) and general contextual measures (i.e., the share of the total interindividual heterogeneity in health that appears at each level). By doing so, they implicitly question traditional probabilistic risk factor epidemiology, including classical "neighborhood effects" studies. In fact, those studies use simple hierarchical structures and disregard the analysis of general contextual measures. The innovative MMMC approach properly responds to the call for a multilevel eco-epidemiology, as against a widespread probabilistic risk factor epidemiology. Risk factor epidemiology is not only reduced to individual-level analyses; it also embraces many current "multilevel analyses" that are exclusively focused on analyzing contextual risk factors.

  16. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

    Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; OR [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in
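
The Mantel-Haenszel fixed-effects pooling used in this meta-analysis is simple enough to show directly. This is a generic textbook implementation of the pooled odds ratio, not the authors' code:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel fixed-effects pooled odds ratio over 2x2 tables.

    tables: list of (a, b, c, d) where, per trial,
      a = events in treated,    b = non-events in treated,
      c = events in control,    d = non-events in control.

    OR_MH = sum(a*d/n) / sum(b*c/n),  with n = a + b + c + d per trial.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

Trials with identical event rates in both arms pool to an OR of 1, while an excess of events in the treated arms pushes the pooled OR above 1.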

  17. FTO gene polymorphisms and obesity risk: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Li Xiaobo

    2011-06-01

    Full Text Available Abstract Background The pathogenesis of obesity is reportedly related to variations in the fat mass and obesity-associated gene (FTO); however, as the number of reports increases, particularly with respect to varying ethnicities, there is a need to determine more precisely the effect sizes in each ethnic group. In addition, some reports have claimed ethnic-specific associations with alternative SNPs, and to that end there has been a degree of confusion. Methods We searched PubMed, MEDLINE, Web of Science, EMBASE, and BIOSIS Previews to identify studies investigating the associations between the five polymorphisms and obesity risk. Individual study odds ratios (ORs) and their 95% confidence intervals (CIs) were estimated using per-allele comparison. Summary ORs were estimated using a random effects model. Results We identified 59 eligible case-control studies in 27 articles, investigating 41,734 obesity cases and 69,837 healthy controls. Significant associations were detected between obesity risk and the five polymorphisms: rs9939609 (OR: 1.31, 95% CI: 1.26 to 1.36), rs1421085 (OR: 1.43, 95% CI: 1.33 to 1.53), rs8050136 (OR: 1.25, 95% CI: 1.13 to 1.38), rs17817449 (OR: 1.54, 95% CI: 1.41 to 1.68), and rs1121980 (OR: 1.34, 95% CI: 1.10 to 1.62). Begg's and Egger's tests provided no evidence of publication bias for the polymorphisms except rs1121980. There is evidence of higher heterogeneity, with I2 test values ranging from 38.1% to 84.5%. Conclusions This meta-analysis suggests that FTO may represent a low-penetrance susceptibility gene for obesity risk. Individual studies with large sample sizes are needed to further evaluate the associations between the polymorphisms and obesity risk in various ethnic populations.

  18. Network analysis of wildfire transmission and implications for risk governance

    Science.gov (United States)

    Ager, Alan A.; Evers, Cody R.; Day, Michelle A.; Preisler, Haiganoush K.; Barros, Ana M. G.; Nielsen-Pincus, Max

    2017-01-01

    We characterized wildfire transmission and exposure within a matrix of large land tenures (federal, state, and private) surrounding 56 communities within a 3.3 million ha fire prone region of central Oregon US. Wildfire simulation and network analysis were used to quantify the exchange of fire among land tenures and communities and analyze the relative contributions of human versus natural ignitions to wildfire exposure. Among the land tenures examined, the area burned by incoming fires averaged 57% of the total burned area. Community exposure from incoming fires ignited on surrounding land tenures accounted for 67% of the total area burned. The number of land tenures contributing wildfire to individual communities and surrounding wildland urban interface (WUI) varied from 3 to 20. Community firesheds, i.e. the area where ignitions can spawn fires that can burn into the WUI, covered 40% of the landscape, and were 5.5 times larger than the combined area of the community core and WUI. For the major land tenures within the study area, the amount of incoming versus outgoing fire was relatively constant, with some exceptions. The study provides a multi-scale characterization of wildfire networks within a large, mixed tenure and fire prone landscape, and illustrates the connectivity of risk between communities and the surrounding wildlands. We use the findings to discuss how scale mismatches in local wildfire governance result from disconnected planning systems and disparate fire management objectives among the large landowners (federal, state, private) and local communities. Local and regional risk planning processes can adopt our concepts and methods to better define and map the scale of wildfire risk from large fire events and incorporate wildfire network and connectivity concepts into risk assessments. PMID:28257416

  19. Anomalous Proper-Motions in the Cygnus Super Bubble Region

    Science.gov (United States)

    Comeron, F.; Torra, J.; Jordi, C.; Gomez, A. E.

    1993-10-01

    In an analysis of proper motions of O and B stars contained in the Input Catalogue for Hipparcos, we have found a clear deviation from the expected pattern of systematic motions which can be readily identified with the associations Cygnus OB1 and Cygnus OB9, located near the edge of the Cygnus Superbubble. The anomalous motions are directed outwards from the center of the Superbubble, which is coincident with the association Cygnus OB2. This seems to support the hypothesis of strong stellar and supernova activity in Cygnus OB2 giving rise to the Superbubble and, by means of gravitational instabilities in its boundaries, to Cygnus OB1 and Cygnus OB9. New uvbyβ aperture photometry of selected O and B stars in the area of Cygnus OB1 and Cygnus OB9 is also presented and analyzed in this paper.

  20. A turbulent jet in crossflow analysed with proper orthogonal decomposition

    DEFF Research Database (Denmark)

    Meyer, Knud Erik; Pedersen, Jakob Martin; Özcan, Oktay

    2007-01-01

    Detailed instantaneous velocity fields of a jet in crossflow have been measured with stereoscopic particle image velocimetry (PIV). The jet originated from a fully developed turbulent pipe flow and entered a crossflow with a turbulent boundary layer. The Reynolds number based on crossflow velocity and pipe diameter was 2400 and the jet to crossflow velocity ratios were R = 3.3 and R = 1.3. The experimental data have been analysed by proper orthogonal decomposition (POD). For R = 3.3, the results in several different planes indicate that the wake vortices are the dominant dynamic flow structures and that they interact strongly with the jet core. The analysis identifies jet shear-layer vortices and finds that these vortical structures are more local and thus less dominant. For R = 1.3, on the other hand, jet shear-layer vortices are the most dominant, while the wake vortices are much less important. For both...
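
Snapshot POD of PIV data reduces, in its standard form, to a singular value decomposition of the mean-subtracted snapshot matrix. The sketch below shows that generic formulation (not the authors' specific implementation): spatial modes, temporal coefficients, and the kinetic-energy fraction captured by each mode.

```python
import numpy as np

def pod_modes(snapshots):
    """Proper orthogonal decomposition of a snapshot matrix.

    snapshots: (n_points, n_snapshots) array with one fluctuating
    velocity field per column (mean flow already subtracted).
    Returns orthonormal spatial modes, their temporal coefficients,
    and the fraction of fluctuating kinetic energy per mode.
    """
    u, s, vt = np.linalg.svd(snapshots, full_matrices=False)
    energy = s**2 / np.sum(s**2)   # energy ranking of the modes
    coeffs = np.diag(s) @ vt       # mode amplitudes over the snapshots
    return u, coeffs, energy
```

Dominant structures such as the wake vortices discussed above would appear as the leading (highest-energy) modes.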

  1. ANALYSIS OF RISK FACTORS IN 3901 PATIENTS WITH STROKE

    Institute of Scientific and Technical Information of China (English)

    Xin-Feng Liu; Guy van Melle; Julien Bogousslavsky

    2005-01-01

    Objective To estimate the frequency of various risk factors for overall stroke and to identify risk factors for cerebral infarction (CI) versus intracerebral hemorrhage (ICH) in a large hospital-based stroke registry. Methods Data from a total of 3901 patients, consisting of 3525 patients with CI and 376 patients with ICH, were prospectively coded and entered into a computerized data bank. Results Hypertension and smoking were the most prominent factors affecting overall stroke, followed by mild internal carotid artery stenosis (< 50%), hypercholesterolemia, transient ischemic attacks (TIAs), diabetes mellitus, and cardiac ischemia. Univariate analysis showed that the factors in males significantly associated with CI versus ICH were old age, a family history of stroke, and intermittent claudication, whereas in females the factors were oral contraception and migraine. By multivariate analysis, in all patients, the factors significantly associated with CI as opposed to ICH were smoking, hypercholesterolemia, migraine, TIAs, atrial fibrillation, structural heart disease, and arterial disease. Hypertension was the only significant factor associated with ICH versus CI. Conclusions The factors for ischemic and hemorrhagic stroke are not exactly the same. Cardiac and arterial disease are the factors most strongly associated with CI rather than ICH.

  2. Dynamic Risk Analysis of Permanent Deformation of Sea Embankment

    Institute of Scientific and Technical Information of China (English)

    高玉峰; 刘汉龙; 余湘娟

    2001-01-01

    For evaluation of the permanent deformation of a sea embankment under stochastic earthquake excitation, a robust dynamic risk analytical method is presented based on conventional permanent deformation analysis and stochastic seismic response analysis. This method can predict not only the mean value of the maximum permanent deformation but also the reliability corresponding to different deformation control standards. The earthquake motion is modelled as a stationary Gaussian filtered white-noise random process. The predicted average maximum horizontal permanent displacement is in agreement with the conventional result. Further studied are the reliability of permanent deformation due to stochastic wave details at one seismic motion level, and the risk of permanent deformation due to stochastic seismic strength, i.e., the maximum acceleration over a long period. It is therefore possible to make an optimal design in terms of safety and economy according to the importance of a sea embankment. It is suggested that improved stochastic seismic models that can capture the behavior of the non-stationary random process for sea embankments be studied further in future work.

  3. Meta-analysis: serum vitamin D and breast cancer risk.

    Science.gov (United States)

    Yin, Lu; Grandi, Norma; Raum, Elke; Haug, Ulrike; Arndt, Volker; Brenner, Hermann

    2010-08-01

    We reviewed and summarised observational epidemiological studies regarding the association between serum vitamin D (measured as 25(OH)D levels) and the risk of breast cancer (BC). Relevant studies published until September 2009 were identified by systematically searching the Ovid Medline, EMBASE and ISI Web of Knowledge databases and by cross-referencing. The following data were extracted in a standardised manner from eligible studies: first author, publication year, country, study design, characteristics of the study population, duration of follow-up, BC incidence/BC mortality according to serum 25-hydroxyvitamin D (25(OH)D) and the respective ratios, and covariates adjusted for in the analysis. All existing observational epidemiological studies that reported at least one serum 25(OH)D level in subjects in any time period before or after a diagnosis of breast cancer were included in our review. Individual and summary risk ratios (RRs) for an increase of serum 25(OH)D by 20 ng/ml were calculated using meta-analysis methods. Only 25(OH)D was considered. Overall, 10 articles were included. Specific results for BC incidence were reported in nine articles and for BC mortality in one article. In meta-analyses, summary RRs (95% confidence intervals (CIs)) for an increase of 25(OH)D by 20 ng/ml were 0.59 (0.48-0.73), 0.92 (0.82-1.04) and 0.73 (0.60-0.88) with P values of risk.

  4. Dietary acrylamide and cancer risk: an updated meta-analysis.

    Science.gov (United States)

    Pelucchi, Claudio; Bosetti, Cristina; Galeone, Carlotta; La Vecchia, Carlo

    2015-06-15

    The debate on the potential carcinogenic effect of dietary acrylamide is open. In consideration of the recent findings from large prospective investigations, we conducted an updated meta-analysis on acrylamide intake and the risk of cancer at several sites. Up to July 2014, we identified 32 publications. We performed meta-analyses to calculate the summary relative risk (RR) of each cancer site for the highest versus lowest level of intake and for an increment of 10 µg/day of dietary acrylamide, through fixed-effects or random-effects models, depending on the heterogeneity test. Fourteen cancer sites could be examined. No meaningful associations were found for most cancers considered. The summary RRs for high versus low acrylamide intake were 0.87 for oral and pharyngeal, 1.14 for esophageal, 1.03 for stomach, 0.94 for colorectal, 0.93 for pancreatic, 1.10 for laryngeal, 0.88 for lung, 0.96 for breast, 1.06 for endometrial, 1.12 for ovarian, 1.00 for prostate, 0.93 for bladder and 1.13 for lymphoid malignancies. The RR was of borderline significance only for kidney cancer (RR = 1.20; 95% confidence interval (CI): 1.00-1.45). All the corresponding continuous estimates ranged between 0.95 and 1.03, and none of them was significant. Among never-smokers, borderline associations with dietary acrylamide emerged for endometrial (RR = 1.23; 95% CI: 1.00-1.51) and ovarian (RR = 1.39; 95% CI: 0.97-2.00) cancers. This systematic review and meta-analysis of epidemiological studies indicates that dietary acrylamide is not related to the risk of most common cancers. A modest association for kidney cancer, and for endometrial and ovarian cancers in never smokers only, cannot be excluded.
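
Reporting an RR "for an increment of 10 µg/day", as above, conventionally rests on a log-linear dose-response assumption: an RR observed across an exposure contrast is rescaled on the log scale. The helper below is a generic sketch of that convention, not the authors' exact procedure:

```python
import math

def rr_per_increment(rr, exposure_delta, increment=10.0):
    """Rescale a relative risk observed across an exposure contrast
    of `exposure_delta` units to a per-`increment` RR, assuming a
    log-linear dose-response:

        RR_inc = exp(ln(RR) * increment / exposure_delta)

    This is the usual convention behind 'RR per 10 ug/day' summaries;
    the study's actual continuous estimates may differ.
    """
    return math.exp(math.log(rr) * increment / exposure_delta)
```

For example, an RR of 1.44 across a 40 µg/day contrast corresponds, under this assumption, to roughly 1.10 per 10 µg/day.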

  5. Retrieval and failure analysis of surgical implants in Brazil: the need for proper regulation

    Directory of Open Access Journals (Sweden)

    Cesar R. de Farias Azevedo

    2002-10-01

    Full Text Available This paper summarizes several cases of metallurgical failure analysis of surgical implants conducted at the Laboratory of Failure Analysis, Instituto de Pesquisas Tecnológicas (IPT), in Brazil. Failures of two stainless steel femoral compression plates, one stainless steel femoral nail plate, one Ti-6Al-4V alloy maxillary reconstruction plate, and five Nitinol wires were investigated, along with the conformity of these materials to the technical specifications of the ABNT (Associação Brasileira de Normas Técnicas) standards. The results showed that none of the implants was in accordance with the minimum ABNT/ISO requirements, and that the premature fractures occurred by corrosion-assisted mechanisms and/or because of manufacturing, assembly, or handling defects. Literature data indicate that implants made of materials that are not biocompatible can cause several types of adverse reactions in the human body, in addition to promoting premature failure of the component, harming the patient, and wasting public investment. There is no health legislation in Brazil that makes the notification and investigation of surgical implant failures compulsory.

  6. Slovenian proper names designing living beings and geographical proper names, in tourist brochures and informative booklets translated into French

    Directory of Open Access Journals (Sweden)

    Alenka Paternoster

    2011-12-01

    Full Text Available The article analyses French translations of Slovenian proper names in tourist brochures and booklets published by the Slovenian Tourist Board and the Government of the Republic of Slovenia, Public Relations and Promotion Office. We analysed the names of living beings (a group expected to be less numerous) and, above all, geographical proper names. While we did not notice any major problems in translating the proper names of living beings, and the same can be said for one-word geographical proper names, the opposite holds true for multiword geographical proper names. As we believe that tourist brochures play an important role in representing the country abroad, we would expect translators to be given more detailed guidelines as far as the translation of proper names is concerned. We hope that the present article highlights the main difficulties of translating proper names in a manner that encourages the creation of such guidelines.

  7. Risk analysis and emergency management of ammonia installations

    NARCIS (Netherlands)

    Ham, J.M.; Gansevoort, J.

    1992-01-01

    The use of Quantitative Risk Assessment has been increasing for evaluating the risk of handling hazardous materials and land-use planning. This article reports on several studies carried out on the risk of handling, storage and transport of ammonia.

  9. Expert systems for prediction of corrosion properties of Zn-based coatings from chemical analysis

    Institute of Scientific and Technical Information of China (English)

    BENGTSON Arne; HILDEBRAND Lars

    2012-01-01

    The purpose of the work is to develop a general method to predict the corrosion resistance of Zn-based coatings, expressed as total mass loss in an accelerated salt spray test. The method is to be based on just three analytical parameters: the total coating weights of Zn, Al and Mg. The reason for this restriction is that determination of these three parameters is possible in on-line analysis. The predicted corrosion resistance could then be included in a process/quality control system. Accelerated corrosion tests have been carried out by Swerea KIMAB IC (Institut de Corrosion) in Brest and by CRM in Belgium. Tests were performed according to the Renault ECC1 test D172028/--C (12 weeks) and an accelerated cyclic corrosion test developed by CRM. Based on total mass loss, the materials were divided into four corrosion-resistance classes. All corrosion tests clearly demonstrated the positive influence of Mg and Al, which for most coatings is much larger than that of Zn alone. A new quantity, the "equivalent zinc coating weight", was therefore introduced; it is linearly related to the Zn, Al and Mg coating weights. A model for predicting corrosion resistance was developed with an expert system based on regression analysis and a decision-tree algorithm. Using the three analytical parameters mentioned above, the model correctly classified 25 of the 27 materials. In conclusion, the method has the potential to predict corrosion behaviour accurately, even on-line. For material development purposes, the expert system was also extended to include additional analytical parameters.

  10. The STABALID project: Risk analysis of stationary Li-ion batteries for power system applications

    OpenAIRE

    2015-01-01

    This work presents a risk analysis performed on stationary Li-ion batteries within the framework of the STABALID project. The risk analysis had as its main objective analysing the variety of hazards and dangerous situations that might be experienced by the battery during its life cycle, and providing useful information on how to prevent or manage those undesired events. The first task of the risk analysis was the identification of all the hazards (or risks) that may arise during the battery life c...

  11. VizieR Online Data Catalog: Proper motions in omega Centauri (van Leeuwen+, 2000)

    Science.gov (United States)

    van Leeuwen, F.; Le Poole, R. S.; Reijns, R. A.; Freeman, K. C.; de Zeeuw, P. T.

    2000-08-01

    The tables present the photometric and astrometric results of an extensive proper motion study of the globular cluster omega Centauri: information on the photographic plates used, variability analysis, astrometric data for 9847 stars, membership determination and surface density profile, cluster proper motion dispersions and systematics and cross-references with star-numbers used by Norris et al. (1997ApJ...487L.187N) and Lynga (1996A&AS..115..297L). (7 data files).

  12. Risk analysis for decision support in electricity distribution system asset management: methods and frameworks for analysing intangible risks

    Energy Technology Data Exchange (ETDEWEB)

    Nordgaard, Dag Eirik

    2010-04-15

    During the last 10 to 15 years, electricity distribution companies throughout the world have been ever more focused on asset management as the guiding principle for their activities. Within asset management, risk is a key issue for distribution companies, together with the handling of cost and performance. There is now an increased awareness of the need to include risk analyses in the companies' decision-making processes. Much of the work on risk in electricity distribution systems has focused on aspects of reliability. This is understandable, since reliability is surely an important feature of the product delivered by the electricity distribution infrastructure, and it is high on the agenda of regulatory authorities in many countries. However, electricity distribution companies are also concerned with other risks relevant to their decision making. This typically involves intangible risks, such as safety, environmental impacts and company reputation. In contrast to the numerous methodologies developed for reliability risk analysis, there are relatively few applications of structured analyses to support decisions concerning intangible risks, even though they represent an important motivation for decisions taken in electricity distribution companies. The overall objective of this PhD work has been to explore risk analysis methods that can be used to improve and support decision making in electricity distribution system asset management, with an emphasis on the analysis of intangible risks. The main contributions of this thesis can be summarised as: an exploration and testing of quantitative risk analysis (QRA) methods to support decisions concerning intangible risks; the development of a procedure for using life curve models to provide input to QRA models; and the development of a framework for risk-informed decision making where QRAs are used to analyse selected problems. In addition, the results contribute to clarifying the basic concepts of risk and highlight challenges

  13. Beyond risk: a psychometric and cultural analysis of risk perception in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Byung-Sun; Chung, Ik Jae [Seoul National Univ., Seoul (Korea, Republic of)

    2002-07-01

    A survey of technological risk perception in Korea was administered in 2001 with a special emphasis on nuclear risks. This paper summarizes the characteristics of risk perception through the analytic lens of the psychometric paradigm. A group of experts identified 8 dimensions of risk: voluntariness, severity, effect manifestation, exposure pattern, controllability, familiarity, benefit and necessity. The survey, with a sample size of 1870, evaluates the perceived level of 25 technological risks in the transportation, chemical, environmental, industrial, and nuclear areas. Research findings confirm that these risk characteristics or dimensions are significant predictors of risk perception. Nuclear risks are perceived as involuntary, catastrophic, delayed, occasional, controllable, beneficial, unfamiliar, and necessary. The paper underlines the need for and importance of nuclear power generation as an environmentally friendly energy resource in Korea. Effective risk communication can improve the awareness and understanding of nuclear risks as well as other technological risks, and ultimately foster the public acceptance of nuclear facilities.

  14. Risk factors for progressive ischemic stroke A retrospective analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    BACKGROUND: Progressive ischemic stroke has a higher fatality rate and disability rate than common cerebral infarction; thus it is very significant to investigate the early predicting factors related to the occurrence of progressive ischemic stroke, the potential pathological mechanism, and the risk factors amenable to early intervention, so as to prevent the occurrence of progressive ischemic stroke and ameliorate its outcome. OBJECTIVE: To analyze the possible related risk factors in patients with progressive ischemic stroke, so as to provide a reference for the prevention and treatment of progressive ischemic stroke. DESIGN: A retrospective analysis. SETTING: Department of Neurology, General Hospital of Beijing Coal Mining Group. PARTICIPANTS: A total of 280 patients with progressive ischemic stroke were selected from the Department of Neurology, General Hospital of Beijing Coal Mining Group from March 2002 to June 2006, including 192 males and 88 females, with a mean age of (62±7) years. All conformed to the diagnostic standards for cerebral infarction set by the Fourth National Academic Meeting for Cerebrovascular Disease in 1995 and were confirmed by CT or MRI; all were admitted within 24 hours after attack, with the neurological deficit progressing gradually or aggravating stepwise within 72 hours after attack, aggravation being defined as a decrease of more than 2 points in the neurological deficit score. Meanwhile, 200 inpatients with non-progressive ischemic stroke (135 males and 65 females) were selected as the control group. METHODS: After admission, a univariate analysis of variance was conducted using the factors of blood pressure, history of diabetes mellitus, fever, leukocytosis, levels of blood lipids, fibrinogen, blood glucose and plasma homocysteine, cerebral arterial stenosis, and CT signs of early infarction, and the significant factors were entered into a multivariate non-conditional logistic regression analysis. MAIN OUTCOME MEASURES

  15. Proper Time for Spin 1/2 Particles

    CERN Document Server

    Kudaka, S; Kudaka, Shoju; Matsumoto, Shuichi

    2005-01-01

    We find a quantum mechanical formulation of proper time for spin 1/2 particles within the framework of the Dirac theory. It is shown that the rate of proper time can be represented by an operator called the ``tempo operator'', and that the proper time itself is given by the integral of the expectation value of this operator. The tempo operator has some terms involving the Pauli spin matrices, and the evolution of the proper time is influenced by the spin state via these terms. The relation between the tempo operator and the metric tensor is elucidated.

  16. An Algorithm for Variable-Length Proper-Name Compression

    Directory of Open Access Journals (Sweden)

    James L. Dolby

    1970-12-01

    Full Text Available Viable on-line search systems require reasonable capabilities to automatically detect (and hopefully correct) variations between request format and stored format. An important requirement is the solution of the problem of matching proper names, not only because both input specifications and storage specifications are subject to error, but also because various transliteration schemes exist and can produce variant proper-name forms in the same data base. This paper reviews several proper-name matching schemes and provides an updated version of these schemes which tests out nicely on the proper-name equivalence classes of a suburban telephone book. An appendix lists the corpus of names used for the algorithm test.
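Dolby's variable-length scheme itself is not reproduced here, but the family of phonetic compression codes it refines can be illustrated with the classic fixed-length Soundex algorithm, which maps variant spellings of a name to a common key. This is a generic sketch of the idea, not Dolby's algorithm:

```python
def soundex(name):
    """Classic American Soundex: first letter plus three digits.

    Illustrates the phonetic-compression family that variable-length
    proper-name schemes improve on; this is NOT Dolby's algorithm.
    """
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = name.lower()
    first = name[0].upper()
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            digits.append(code)
        if ch not in "hw":      # h and w do not break a run of equal codes
            prev = code
    return (first + "".join(digits) + "000")[:4]
```

Variant spellings such as "Smith"/"Smyth" collapse to the same key, which is exactly the equivalence-class behavior tested against the telephone-book corpus above.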

  17. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin

    2007-01-01

    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was used to weight the risk factors, and integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.
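The analytic hierarchy process step can be sketched as follows: factor weights are derived from a pairwise comparison matrix by normalizing each column and averaging across rows. The comparison values below are purely illustrative, not those of the HPAI framework:

```python
def ahp_weights(matrix):
    """Derive factor weights from an AHP pairwise comparison matrix
    (column-normalize, then average each row)."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Illustrative 3-factor comparison (hypothetical factors, e.g.
# poultry density vs. live-bird trade vs. wild-bird contact)
m = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(m)
```

The resulting weights sum to one and preserve the dominance order expressed in the pairwise judgments.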

  18. Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation

    CERN Document Server

    Vaiter, Samuel; Peyré, Gabriel; Dossal, Charles; Fadili, Jalal

    2012-01-01

    This paper studies the recovery of an unknown signal $x_0$ from low dimensional noisy observations $y = \\Phi x_0 + w$, where $\\Phi$ is an ill-posed linear operator and $w$ accounts for some noise. We focus our attention on sparse analysis regularization. The recovery is performed by minimizing the sum of a quadratic data fidelity term and the $\\ell_1$-norm of the correlations between the sought-after signal and atoms in a given (generally overcomplete) dictionary. The $\\ell_1$ prior is weighted by a regularization parameter $\\lambda > 0$ that accounts for the noise level. In this paper, we prove that minimizers of this problem are piecewise-affine functions of the observations $y$ and the regularization parameter $\\lambda$. As a byproduct, we exploit these properties to get an objectively guided choice of $\\lambda$. More precisely, we propose an extension of the Generalized Stein Unbiased Risk Estimator (GSURE) and show that it is an unbiased estimator of an appropriately defined risk. This encompasses special ca...
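The piecewise-affine dependence on $y$ and $\lambda$ can be seen in the simplest special case: scalar $\ell_1$ denoising with $\Phi = \mathrm{Id}$ and an identity dictionary, where the minimizer is soft-thresholding. This is a toy illustration only, not the general analysis setting of the paper:

```python
def soft_threshold(y, lam):
    """Minimizer of 0.5*(x - y)**2 + lam*|x| over x.

    Piecewise-affine in (y, lam): three affine pieces joined at y = +/-lam.
    Toy scalar case of l1 regularization, not the paper's analysis prior.
    """
    if y > lam:
        return y - lam
    if y < -lam:
        return y + lam
    return 0.0
```

On each of the three regions the output is an affine function of both the observation and the regularization parameter, which is the scalar shadow of the structure proved in the paper.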

  19. Capability for Integrated Systems Risk-Reduction Analysis

    Science.gov (United States)

    Mindock, J.; Lumpkins, S.; Shelhamer, M.

    2016-01-01

    NASA's Human Research Program (HRP) is working to increase the likelihoods of human health and performance success during long-duration missions, and subsequent crew long-term health. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.

  20. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    Science.gov (United States)

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets.
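A vine is assembled from bivariate pair copulas. The basic building block can be illustrated by sampling a Gaussian pair copula: correlate two standard normals and push them through the normal CDF to obtain uniform marginals with the desired dependence. This is a generic sketch of one pair copula; the article's minimum-information copulas are not implemented here:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_copula_sample(rho, n, seed=0):
    """Draw n pairs (u1, u2) on [0,1]^2 with Gaussian-copula dependence rho."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        out.append((norm_cdf(z1), norm_cdf(z2)))
    return out

pairs = gaussian_copula_sample(0.8, 2000)
```

Stacking such pair copulas along a tree sequence, with the conditional dependence allowed to vary, is what gives vines their flexibility over a single multivariate Gaussian copula.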

  1. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  2. Risk analysis of sustainable urban drainage and irrigation

    Science.gov (United States)

    Ursino, Nadia

    2015-09-01

    Urbanization, by creating extended impervious areas to the detriment of vegetated ones, may have an undesirable influence on the water and energy balances of urban environments. The storage and infiltration capacity of the drainage system lessens the negative influence of urbanization, and vegetated areas help to re-establish pre-development environmental conditions. Resource limitation, a climate leading to increasing water scarcity, and demographic and socio-institutional shifts all promote more integrated water management. Storm-water harvesting for landscape irrigation mitigates possible water restrictions for the urban population in drought scenarios. A new probabilistic model for sustainable rainfall drainage, storage and re-use systems was implemented in this study. Risk analysis of multipurpose storage capacities was generalized by the use of only a few dimensionless parameters and applied to a case study in a Mediterranean-type climate, although the applicability of the model is not restricted to any particular climatic type.

  3. Retention and risk factors for attrition in a large public health ART program in Myanmar: a retrospective cohort analysis.

    Directory of Open Access Journals (Sweden)

    Aye Thida

    Full Text Available BACKGROUND: The outcomes from an antiretroviral treatment (ART) program within the public sector in Myanmar have not been reported. This study documents retention and the risk factors for attrition in a large ART public health program in Myanmar. METHODS: A retrospective analysis of a cohort of adult patients enrolled in the Integrated HIV Care (IHC) Program between June 2005 and October 2011 and followed up until April 2012 is presented. The primary outcome was attrition (death or loss to follow-up); a total of 10,223 patients were included in the 5-year cumulative survival analysis. Overall 5,718 patients were analyzed for the risk factors for attrition using both logistic regression and flexible parametric survival models. RESULT: The mean age was 36 years, 61% of patients were male, and the median follow up was 13.7 months. Overall 8,564 (84%) patients were retained in the ART program: 750 (7%) were lost to follow-up and 909 (9%) died. During the 3 years follow-up, 1,542 attritions occurred over 17,524 person years at risk, giving an incidence density of 8.8% per year. The retention rates of participants at 12, 24, 36, 48 and 60 months were 86, 82, 80, 77 and 74% respectively. In multivariate analysis, being male, having high WHO staging, a low CD4 count, being anaemic or having low BMI at baseline were independent risk factors for attrition; tuberculosis (TB) treatment at ART initiation, a prior ART course before program enrollment and literacy were predictors for retention in the program. CONCLUSION: A high retention rate for the IHC program was documented within the public sector in Myanmar. Early diagnosis of HIV, nutritional support, proper investigation and treatment for patients with low CD4 counts and for those presenting with anaemia are crucial issues towards improvement of HIV program outcomes in resource-limited settings.
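The rates reported above follow directly from the counts in the abstract: incidence density is events divided by person-years at risk, and the outcome percentages are simple fractions of the enrolled cohort. A quick arithmetic check:

```python
# Counts taken from the abstract above
enrolled = 10223
retained, lost_to_follow_up, died = 8564, 750, 909
attritions, person_years = 1542, 17524

# Incidence density = events per person-year at risk
incidence_density = attritions / person_years   # ~0.088, i.e. 8.8% per year

retention_pct = 100 * retained / enrolled       # ~84%, as reported
ltfu_pct = 100 * lost_to_follow_up / enrolled   # ~7%
died_pct = 100 * died / enrolled                # ~9%
```

All four figures reproduce the values quoted in the abstract once rounded to the reported precision.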

  4. Retention and Risk Factors for Attrition in a Large Public Health ART Program in Myanmar: A Retrospective Cohort Analysis

    Science.gov (United States)

    Thida, Aye; Tun, Sai Thein Than; Zaw, Sai Ko Ko; Lover, Andrew A.; Cavailler, Philippe; Chunn, Jennifer; Aye, Mar Mar; Par, Par; Naing, Kyaw Win; Zan, Kaung Nyunt; Shwe, Myint; Kyaw, Thar Tun; Waing, Zaw Htoon; Clevenbergh, Philippe

    2014-01-01

    Background The outcomes from an antiretroviral treatment (ART) program within the public sector in Myanmar have not been reported. This study documents retention and the risk factors for attrition in a large ART public health program in Myanmar. Methods A retrospective analysis of a cohort of adult patients enrolled in the Integrated HIV Care (IHC) Program between June 2005 and October 2011 and followed up until April 2012 is presented. The primary outcome was attrition (death or loss to follow-up); a total of 10,223 patients were included in the 5-year cumulative survival analysis. Overall 5,718 patients were analyzed for the risk factors for attrition using both logistic regression and flexible parametric survival models. Result The mean age was 36 years, 61% of patients were male, and the median follow up was 13.7 months. Overall 8,564 (84%) patients were retained in the ART program: 750 (7%) were lost to follow-up and 909 (9%) died. During the 3 years follow-up, 1,542 attritions occurred over 17,524 person years at risk, giving an incidence density of 8.8% per year. The retention rates of participants at 12, 24, 36, 48 and 60 months were 86, 82, 80, 77 and 74% respectively. In multivariate analysis, being male, having high WHO staging, a low CD4 count, being anaemic or having low BMI at baseline were independent risk factors for attrition; tuberculosis (TB) treatment at ART initiation, a prior ART course before program enrollment and literacy were predictors for retention in the program. Conclusion A high retention rate for the IHC program was documented within the public sector in Myanmar. Early diagnosis of HIV, nutritional support, proper investigation and treatment for patients with low CD4 counts and for those presenting with anaemia are crucial issues towards improvement of HIV program outcomes in resource-limited settings. PMID:25268903

  5. Risk of persistent high-grade squamous intraepithelial lesion after electrosurgical excisional treatment with positive margins: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Caroline Alves de Oliveira

    Full Text Available CONTEXT AND OBJECTIVE: Even if precursor lesions of cervical cancer are properly treated, there is a risk of persistence or recurrence. The aim here was to quantify the risks of persistence of high-grade intraepithelial squamous lesions, one and two years after cervical electrosurgical excisional treatment with positive margins. DESIGN AND SETTING: Systematic review of the literature and meta-analysis at Instituto Fernandes Figueira. METHODS: This meta-analysis was on studies published between January 1989 and July 2009 that were identified in Medline, Scopus, Embase, Cochrane, SciELO, Lilacs, Adolec, Medcarib, Paho, Wholis, Popline, ISI Web of Science and Sigle. Articles were selected if they were cohort studies on electrosurgical excisional treatment of high-grade squamous intraepithelial lesions with a minimum follow-up of one year, a histopathological outcome of persistence of these lesions and a small risk of bias. RESULTS: The search identified 7,066 articles and another 21 in the reference lists of these papers. After applying the selection and exclusion criteria, only four articles were found to have extractable data. The risk of persistence of high-grade intraepithelial lesions after one year was 11.36 times greater (95% confidence interval, CI: 5.529-23.379; P < 0.0001) in patients with positive margins and after two years was four times greater (95% CI: 0.996-16.164), although without statistical significance. CONCLUSION: This meta-analysis confirms the importance of positive margins as an indicator of incomplete treatment after the first year of follow-up and highlights the need for appropriately chosen electrosurgical techniques based on disease location and extent, with close surveillance of these patients.
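In meta-analyses of this kind, the standard error of a log risk ratio is conventionally recovered from the reported 95% confidence interval as (ln U − ln L)/(2 × 1.96), and the interval can be rebuilt from the point estimate to check internal consistency. A sketch using the one-year figures quoted above:

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Standard error of a log risk ratio recovered from its 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def ci_from_rr(rr, se, z=1.96):
    """Rebuild the 95% CI on the ratio scale from the RR and log-scale SE."""
    return (math.exp(math.log(rr) - z * se),
            math.exp(math.log(rr) + z * se))

# One-year estimate from the abstract: RR 11.36 (95% CI 5.529-23.379)
se = se_from_ci(5.529, 23.379)
lo, hi = ci_from_rr(11.36, se)
```

The rebuilt interval matches the reported one closely, and the implied z statistic ln(11.36)/se is well above 3.29, consistent with the reported P < 0.0001.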

  6. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  7. AN ANALYSIS ON CHOOSING A PROPER ECOMMERCE PLATFORM

    Directory of Open Access Journals (Sweden)

    Radu Lixandroiu

    2015-05-01

    Full Text Available Choosing an electronic trading platform is a very important decision when opening an online store. A suitable platform must be able to meet many requirements without confusing the user, and it should provide the online business with tools for managing the back-office technically. There are dozens, maybe hundreds, of ecommerce platforms available for creating an online store, so the decision to choose a platform is difficult. In this article, we compare 19 of the most popular open source ecommerce platforms using a mathematical model based on the platform functionalities. Each of these allows parameterization of an online store in a very short time, at relatively low or near-zero cost.

  8. AN ANALYSIS ON CHOOSING A PROPER ECOMMERCE PLATFORM

    OpenAIRE

    Radu Lixandroiu; Catalin Maican

    2015-01-01

    Choosing an electronic trading platform is a very important decision when opening an online store. A suitable platform must be able to meet many requirements without confusing the user, and it should provide the online business with tools for managing the back-office technically. There are dozens, maybe hundreds, of ecommerce platforms available for creating an online store, so the decision to choose a platform is difficult. In this article, we compare 19 of the...

  9. Study of Hip Fracture Risk using Tree Structured Survival Analysis

    Directory of Open Access Journals (Sweden)

    Lu Y

    2003-01-01

    Full Text Available In this study the risk of hip fracture in post-menopausal women is examined by classifying the women into subgroups with respect to this risk. Women in the same subgroup have a similar risk, while women in different subgroups have different risks of hip fracture. The subgroups were derived by Tree Structured Survival Analysis (TSSA) from the data of 7,665 women in the SOF (Study of Osteoporotic Fractures). In all study participants, the bone mineral density (BMD) of the forearm, femoral neck, hip and spine was measured, and the time from the BMD measurement to hip fracture was recorded as the endpoint. A sample of 75% of the participants was used to build the prognostic subgroups (training set), while the other 25% served to validate the results (validation set). From the training set, TSSA identified four subgroups whose hip-fracture risks over a mean follow-up of 6.5 years were 19%, 9%, 4% and 1%. Assignment to the subgroups was based on the BMD of Ward's triangle and of the femoral neck, and on age. These results were reproduced on the validation set, confirming the usefulness of the classification rules in a clinical setting. TSSA thus enabled a meaningful, informative and reproducible identification of prognostic subgroups based on age and BMD values.

  10. DNA adducts and cancer risk in prospective studies: a pooled analysis and a meta-analysis

    DEFF Research Database (Denmark)

    Veglia, Fabrizio; Loft, Steffen; Matullo, Giuseppe;

    2008-01-01

    Bulky DNA adducts are biomarkers of exposure to aromatic compounds and of the ability of the individual to metabolically activate carcinogens and to repair DNA damage. Their ability to predict cancer onset is uncertain. We have performed a pooled analysis of three prospective studies on cancer risk in which bulky DNA adducts have been measured in blood samples collected from healthy subjects (N = 1947; average follow-up 51-137 months). In addition, we have performed a meta-analysis by identifying all articles on the same subject published up to the end of 2006, including case-control studies. In the pooled analysis, a weakly statistically significant increase in the risk of lung cancer was apparent (14% per unit standard deviation change in adduct levels, 95% confidence interval 1-28%; using the weighted mean difference method, 0.15 SD units higher adducts in cases than in controls...

  11. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R

    2009-01-01

    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets, showing lipid deposition in the former case. Pre-aneurysm chemical images of knockout-mouse aorta walls and spectra of plaque excised from a living human patient are shown for comparison. In keeping with the theme of the SPEC 2008 conference, "Spectroscopic Diagnosis of Disease", this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue, (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes, and (3) the advantages and limitations of each of these techniques, illustrated with patient care histories related in the first person by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to an appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century. Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  12. Risk of Hypothyroidism following Hemithyroidectomy: Systematic Review and Meta-Analysis of Prognostic Studies.

    NARCIS (Netherlands)

    Verloop, H.; Louwerens, M.; Schoones, J.W.; Kievit, J.; Smit, J.W.A.; Dekkers, O.M.

    2012-01-01

    Context: The reported risk of hypothyroidism after hemithyroidectomy shows considerable heterogeneity in literature. Objective: The aim of this systematic review and meta-analysis was to determine the overall risk of hypothyroidism, both clinical and subclinical, after hemithyroidectomy. Furthermore

  13. Coffee Consumption and Risk of Stroke: A Dose-Response Meta-Analysis of Prospective Studies

    National Research Council Canada - National Science Library

    Larsson, Susanna C; Orsini, Nicola

    2011-01-01

    Coffee consumption has been inconsistently associated with risk of stroke. The authors conducted a meta-analysis of prospective studies to quantitatively assess the association between coffee consumption and stroke risk...

  14. Reliability and risk analysis data base development: an historical perspective

    Energy Technology Data Exchange (ETDEWEB)

    Fragola, Joseph R

    1996-02-01

    Collection of empirical data and data base development for use in the prediction of the probability of future events has a long history. Dating back at least to the 17th century, safe passage events and mortality events were collected and analyzed to uncover prospective underlying classes and associated class attributes. Tabulations of these developed classes and associated attributes formed the underwriting basis for the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. and Galvagni, R. Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18-20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History Of Theory Of Structures In The 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G. Discorsi e dimostrazioni mathematiche intorno a due nuove science, (Discourses and mathematical demonstrations concerning two new sciences, Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York) originally developed in the 19th century (Antona, E., Fragola, J. 
and

  15. Hierarchical Modelling of Flood Risk for Engineering Decision Analysis

    DEFF Research Database (Denmark)

    Custer, Rocco

    Societies around the world are faced with flood risk, prompting authorities and decision makers to manage risk to protect population and assets. With climate change, urbanisation and population growth, flood risk changes constantly, requiring flood risk management strategies that are flexible and robust. Traditional risk management solutions, e.g. dike construction, are not particularly flexible, as they are difficult to adapt to changing risk. Conversely, the recent concept of integrated flood risk management, entailing a combination of several structural and non-structural risk management measures, allows identifying flexible and robust flood risk management strategies. Based on it, this thesis investigates hierarchical flood protection systems, which encompass two, or more, hierarchically integrated flood protection structures on different spatial scales (e.g. dikes, local flood barriers...

  16. The proper longshore current in a wave basin

    NARCIS (Netherlands)

    Visser, P.J.

    1982-01-01

    This report describes an investigation into a method of obtaining the proper longshore current in a wave basin. In this method the basin geometry is optimized and the proper recirculation flow through openings in the wave guides is determined by minimizing the circulation flow between the wave guides.

  17. Participation in "Handwashing University" Promotes Proper Handwashing Techniques for Youth

    Science.gov (United States)

    Fenton, Ginger; Radhakrishna, Rama; Cutter, Catherine Nettles

    2010-01-01

    A study was conducted to assess the effectiveness of the Handwashing University on teaching youth the benefits of proper handwashing. The Handwashing University is an interactive display with several successive stations through which participants move to learn necessary skills for proper handwashing. Upon completion of the Handwashing University,…

  18. 29 CFR 1404.20 - Proper use of expedited arbitration.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Proper use of expedited arbitration. 1404.20 Section 1404... ARBITRATION SERVICES Expedited Arbitration § 1404.20 Proper use of expedited arbitration. (a) FMCS reserves the right to cease honoring requests for Expedited Arbitration if a pattern of misuse of this...

  19. Reasons Why Chinese Learners Can not Use English Words Properly

    Institute of Scientific and Technical Information of China (English)

    杨文娟

    2015-01-01

    English has long been one of the major subjects in the Chinese education system. However, Chinese learners cannot always use English words properly. There are several reasons that explain this phenomenon, and the reasons why Chinese learners cannot use English words properly are discussed in this paper.

  20. Embeddings of (proper) power graphs of finite groups

    OpenAIRE

    Doostabadi, Alireza; Ghouchan, Mohammad Farrokhi Derakhshandeh

    2014-01-01

    The (proper) power graph of a group is a graph whose vertex set is the set of all (nontrivial) elements of the group and two distinct vertices are adjacent if one is a power of the other. Various kinds of planarity of (proper) power graphs of groups are discussed.

  1. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model

    OpenAIRE

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. A stability failure risk ratio described jointly by probability and possibility has deficiencies in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event co...
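A plain (non-fuzzy) Monte Carlo baseline for such a stability analysis estimates the failure probability as the fraction of sampled parameter sets whose sliding safety factor falls below 1. The distributions and numbers below are invented for illustration and are not taken from the article:

```python
import random

def sliding_failure_prob(n=100_000, seed=1):
    """Monte Carlo estimate of P(safety factor < 1) for dam sliding.

    SF = (friction coefficient * normal force) / horizontal load.
    All distributions here are illustrative assumptions, not the
    credibility-theory hybrid model of the article.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        f = rng.gauss(0.7, 0.1)      # friction coefficient (assumed)
        N = rng.gauss(50.0, 5.0)     # normal force, MN (assumed)
        T = rng.gauss(25.0, 4.0)     # horizontal load, MN (assumed)
        if f * N / T < 1.0:
            failures += 1
    return failures / n

p = sliding_failure_prob()
```

The credibility-theory approach of the article replaces some of these purely probabilistic inputs with fuzzy variables, so that the failure chance reflects both randomness and fuzziness rather than probability alone.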

  2. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    OpenAIRE

    2016-01-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequen...
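Expected annual damage, the core quantity in such a quantitative framework, is the integral of consequences over the annual exceedance-probability curve. A minimal trapezoidal sketch, with an invented three-point flood-damage curve (not data from the case study):

```python
def expected_annual_damage(exceed_probs, damages):
    """Trapezoidal integral of damage over annual exceedance probability.

    exceed_probs: decreasing annual exceedance probabilities.
    damages: corresponding consequences (affected population, injuries,
             fatalities or monetary damage, all in consistent units).
    """
    ead = 0.0
    for i in range(len(exceed_probs) - 1):
        dp = exceed_probs[i] - exceed_probs[i + 1]
        ead += dp * (damages[i] + damages[i + 1]) / 2.0
    return ead

# Illustrative curve: 2-, 10- and 100-year floods (invented values)
ead = expected_annual_damage([0.5, 0.1, 0.01], [0.0, 1e6, 5e6])
```

The same integral applied to affected population instead of monetary damage yields the expected annual affected population mentioned above.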

  3. Integrated Risk-Capability Analysis under Deep Uncertainty: an ESDMA Approach

    OpenAIRE

    Pruyt, E.; Kwakkel, J. H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations and nations use truly integrated risk-capability approaches, and almost none use integrated risk-capability approaches that take dynamic complexity and deep uncertainty seriously into account. Thi...

  4. Model-based risk analysis of coupled process steps.

    Science.gov (United States)

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regards to the process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  5. 75 FR 6346 - Notice of Availability of a Pest Risk Analysis for the Importation of Fresh Male Summer Squash...

    Science.gov (United States)

    2010-02-09

    ... Animal and Plant Health Inspection Service Notice of Availability of a Pest Risk Analysis for the... have prepared a pest risk analysis that evaluates the risks associated with the importation of fresh... to mitigate the pest risk. We are making the pest risk analysis available to the public for...

  6. Comparative Analysis of Corporate Risk Management Practices in Croatian and Slovenian Companies

    OpenAIRE

    Miloš Sprčić, Danijela; Šević, Željko

    2008-01-01

    The paper explores differences as well as commonalities in corporate risk management practices and risk exposures in large non-financial Slovenian and Croatian companies. Comparative analysis of the survey results has revealed that the majority of the analysed companies in both Croatia and Slovenia use some form of risk management to manage interest-rate, foreign exchange, or commodity price risk. Regarding the intensity of influence of financial risks on the performance of the analysed co...

  7. Galactic Dynamics: new proper motions from Gaia and UCAC

    Science.gov (United States)

    Zacharias, Norbert

    2017-06-01

    With Gaia DR1 we now have proper motions accurate at the 0.1 mas/yr level for about 100,000 Hipparcos stars. The Tycho-Gaia astrometric solution (TGAS) furthermore provides proper motions of about 2 million stars at the 1 to 2 mas/yr level. Using TGAS as the reference star catalog, the USNO CCD Astrograph Catalog (UCAC) observations were re-reduced, and their positions (near epoch 2001) were combined with Gaia DR1 to obtain proper motions of over 100 million stars to about magnitude R=16.5, with a proper motion accuracy of 1 to 5 mas/yr (depending on brightness). This UCAC5 data largely extends the TGAS data for galactic dynamics studies, and thus provides a preview of some of the exciting science that will be enabled by Gaia DR2 in April 2018, when accurate proper motions will become available for a billion stars.
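As background to how catalogs such as UCAC5 derive proper motions from two positional epochs, a minimal two-epoch sketch (the cos(dec) factor converts the RA offset to a great-circle rate; the input values are illustrative, not catalog data):

```python
import math

def proper_motion(ra1, dec1, epoch1, ra2, dec2, epoch2):
    """Proper motion in mas/yr from two positions (degrees) at two
    epochs (years); the RA component carries the cos(dec) factor."""
    dt = epoch2 - epoch1
    cosd = math.cos(math.radians((dec1 + dec2) / 2.0))
    pm_ra = (ra2 - ra1) * cosd * 3.6e6 / dt    # degrees -> mas
    pm_dec = (dec2 - dec1) * 3.6e6 / dt
    return pm_ra, pm_dec

# illustrative: a star moving 0.001 deg in RA over 15 years at dec = 0
pm_ra, pm_dec = proper_motion(10.000, 0.0, 2001.0, 10.001, 0.0, 2016.0)
```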

  8. The Southern Proper Motion Program IV. The SPM4 Catalog

    CERN Document Server

    Girard, T M; Zacharias, N; Vieira, K; Casetti-Dinescu, D I; Monet, D G; Lopez, C E

    2011-01-01

    We present the fourth installment of the Yale/San Juan Southern Proper Motion Catalog, SPM4. The SPM4 contains absolute proper motions, celestial coordinates, and (B,V) photometry for over 103 million stars and galaxies between the south celestial pole and -20 deg declination. The catalog is roughly complete to V=17.5 and is based on photographic and CCD observations taken with the Yale Southern Observatory's double-astrograph at Cesco Observatory in El Leoncito, Argentina. The proper-motion precision, for well-measured stars, is estimated to be 2 to 3 mas/yr, depending on the type of second-epoch material. At the bright end, proper motions are on the International Celestial Reference System by way of Hipparcos Catalog stars, while the faint end is anchored to the inertial system using external galaxies. Systematic uncertainties in the absolute proper motions are on the order of 1 mas/yr.

  9. Crash risk analysis during fog conditions using real-time traffic data.

    Science.gov (United States)

    Wu, Yina; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2017-05-30

    This research investigates the changes in traffic characteristics and crash risks during fog conditions. Using real-time traffic flow and weather data from two regions in Florida, traffic patterns during fog were compared to traffic patterns during clear conditions. It was found that the average 5-min speed and the average 5-min volume tended to decrease during fog. Based on previous studies, a "Crash Risk Increase Indicator (CRII)" was proposed to explore the differences in crash risk between fog and clear conditions. A binary logistic regression model was applied to link the increase in crash risk with traffic flow characteristics. The results suggested that the proposed indicator worked well in evaluating the increase in crash risk under fog conditions. Crash risk was prone to increase at ramp vicinities in fog, and the average 5-min volume during fog and the lane position are important factors for the crash risk increase. The differences between the regions were also explored: locations with heavier traffic, or at the lanes closest to the median in Region 2, were more likely to see an increase in crash risk in fog conditions. It is expected that the proposed indicator can help identify dangerous traffic states under fog conditions so that proper ITS technologies can be implemented to enhance traffic safety when visibility declines.
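The binary logistic regression the abstract applies can be sketched as follows; the feature names and the synthetic data are assumptions for illustration, not the study's actual variables:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Binary logistic regression fitted by batch gradient descent."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend intercept
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))           # predicted P(y=1)
        w -= lr * Xb.T @ (p - y) / len(y)           # gradient of log-loss
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

# synthetic stand-ins for features such as 5-min volume and a fog flag
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (200, 2))
y = (X[:, 0] + 2.0 * X[:, 1] > 0).astype(int)       # known decision rule
w = fit_logistic(X, y)
accuracy = float(np.mean(predict(w, X) == y))
```

Because the synthetic labels follow a linear rule, the fitted model should recover them with high accuracy.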

  10. An analysis of the public perception of flood risk on the Belgian coast.

    Science.gov (United States)

    Kellens, Wim; Zaalberg, Ruud; Neutens, Tijs; Vanneuville, Wouter; De Maeyer, Philippe

    2011-07-01

    In recent years, the perception of flood risks has become an important topic for policy makers concerned with risk management and safety issues. Knowledge of public risk perception is considered a crucial aspect of modern flood risk management, as it steers the development of effective and efficient flood mitigation strategies. This study aimed at gaining insight into the perception of flood risks along the Belgian coast. Given the importance of the tourism industry on the Belgian coast, the survey considered both inhabitants and residential tourists. Based on actual expert risk assessments, a high-risk and a low-risk area were selected for the study. Risk perception was assessed on the basis of scaled items regarding storm surges and coastal flood risks. In addition, various personal and residence characteristics were measured. Using multiple regression analysis, risk perception was found to be primarily influenced by actual flood risk estimates, age, gender, and experience with previous flood hazards.

  11. Study on risk analysis of supply chain enterprises

    Institute of Scientific and Technical Information of China (English)

    Wu Xiaohui; Zhong Xiaobing; Song Shiji; Wu Cheng

    2006-01-01

    The sources of supply chain enterprise risk are analyzed from different aspects, including material flow, information flow, cash flow, and partner relationships. Measures for risk reduction are also summarized from the aspects of risk sharing, information sharing, change of inventory control mode, and supply chain flexibility. Finally, problems in current research on supply chain risk management are pointed out and a discussion of future research trends is presented.

  12. Risk-analysis of global climate tipping points

    Energy Technology Data Exchange (ETDEWEB)

    Frieler, Katja; Meinshausen, Malte; Braun, N. [Potsdam Institute for Climate Impact Research e.V., Potsdam (Germany). PRIMAP Research Group] [and others

    2012-09-15

    There are many elements of the Earth system that are expected to change gradually with increasing global warming. Some of these changes might prove reversible once global warming returns to lower levels, but others have the potential to show threshold behavior, meaning that they would imply a transition between qualitatively disparate states that can be triggered by only small shifts in background climate (2). Such changes are often expected not to be reversible by returning to the current level of warming, because many of them are characterized by self-amplifying processes that could lead to a new internally stable state which is qualitatively different from before. Several elements of the climate system have already been identified as potential tipping elements. This group includes mass loss from the Greenland and West Antarctic ice sheets, the decline of Arctic summer sea ice, different monsoon systems, the degradation of coral reefs, the dieback of the Amazon rainforest, the thawing of the permafrost regions, and the release of methane hydrates (3). Crucially, these tipping elements have regional- to global-scale effects on human society, biodiversity, and/or ecosystem services. Several of them may have a discernible effect on global climate through a large-scale positive feedback, meaning they would further amplify human-induced climate change. These tipping elements pose risks comparable to risks found in other fields of human activity: high-impact events that have at least a few percent chance of occurring classify as high-risk events. In many of these examples adaptation options are limited, and prevention of occurrence may be a more viable strategy; therefore, a better understanding of the processes driving tipping points is essential. There might be other tipping elements, even more critical but not yet identified. These may also lie within our socio-economic systems that are

  13. Risk Profiles of Children Entering Residential Care: A Cluster Analysis

    Science.gov (United States)

    Hagaman, Jessica L.; Trout, Alexandra L.; Chmelka, M. Beth; Thompson, Ronald W.; Reid, Robert

    2010-01-01

    Children in residential care are a heterogeneous population, presenting various combinations of risks. Existing studies on these children suggest high variability across multiple domains (e.g., academics, behavior). Given this heterogeneity, it is important to begin to identify the combinations and patterns of multiple risks, or risk profiles,…

  14. Foundations of risk analysis a knowledge and decision-oriented perspective

    CERN Document Server

    Aven

    2004-01-01

    Everyday we face decisions that carry an element of risk and uncertainty. The ability to analyse, communicate and control the level of risk entailed by these decisions remains one of the most pressing challenges to the analyst, scientist and manager. This book presents the foundational issues in risk analysis - expressing risk, understanding what risk means, building risk models, addressing uncertainty, and applying probability models to real problems. The principal aim of the book is to give the reader the knowledge and basic thinking they require to approach risk and uncertainty to support d

  15. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    OpenAIRE

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than that in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general proc...

  16. [Improvement of legislation basis for occupational risk analysis in occupational hygiene and work safety].

    Science.gov (United States)

    Zaitseva, N V; Shur, P Z; Alekseev, V B; Andreeva, E E; Sliapniakov, D M

    2014-01-01

    One of the priority trends in health care in the Russian Federation and abroad is the minimization of occupational risks. The authors present an evaluation of the legislative basis for occupational risk analysis. The most promising trend in the improvement of national legislation is its development on the basis of internationally accepted documents, which provides a legislative basis for the analysis of workers' health risk. The findings are that a complete evaluation of occupational risk requires combining data on working conditions with occupational monitoring data, and sometimes with the results of special research. Further improvement is needed to justify hygienic norms by applying criteria of allowable risk for workers' health. The development of risk analysis methodology now enables quantitative evaluation of health risk via mathematical models, including those describing risk evolution.

  17. Tropical cyclone risk analysis: a decisive role of its track

    Science.gov (United States)

    Chelsea Nam, C.; Park, Doo-Sun R.; Ho, Chang-Hoi

    2016-04-01

    The tracks of 85 tropical cyclones (TCs) that made landfall in South Korea during the period 1979-2010 are classified into four clusters using a fuzzy c-means clustering method. The four clusters are characterized as 1) east-short, 2) east-long, 3) west-long, and 4) west-short, based on their moving routes around the Korean Peninsula. We conducted a comparative risk analysis for these four clusters regarding their hazards, exposure, and damages. Here, hazard parameters are calculated independently from two different sources, one from the best-track data (BT) and the other from the 60 weather stations across the country (WS). The results show distinct characteristics of the four clusters in terms of the hazard parameters and economic losses (EL), suggesting a clear track dependency in the overall TC risk. It appears that whether an "effective collision" occurred outweighs the intensity of the TC per se. The EL ranking did not agree with the BT parameters (maximum wind speed, central pressure, or storm radius), but matched the WS parameters (especially daily accumulated rainfall and TC-influenced period). The west-approaching TCs (i.e., the west-long and west-short clusters) generally recorded larger EL than the east-approaching TCs (i.e., the east-short and east-long clusters), although the east-long cluster is the strongest from the BT point of view. This can be explained through the spatial distribution of the WS parameters and the corresponding regional EL maps. West-approaching TCs brought heavy rainfall to the southern regions, aided by the topographic effect along their tracks and by their extended stay over the Korean Peninsula during extratropical transition, neither of which applied to the east-approaching TCs. On the other hand, some regions had EL not directly proportional to the hazards, which is partly attributed to spatial disparity in wealth and vulnerability. Correlation analysis also revealed the importance of rainfall; daily
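A minimal fuzzy c-means implementation, of the kind the abstract applies to TC tracks, can be sketched as follows. The 1-D toy data stand in for track features; the actual study clusters full tracks:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns (centers, U) where U is the
    n_samples x c membership matrix, with rows summing to 1."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # valid initial memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted means
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = dist ** (-2.0 / (m - 1.0))                   # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# toy 1-D "track features": two groups near 0 and near 10
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.3, (20, 1)), rng.normal(10.0, 0.3, (20, 1))])
centers, U = fuzzy_c_means(pts)
```

Unlike hard k-means, each track keeps a graded membership in every cluster, which is why the method suits ambiguous, in-between tracks.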

  18. RISK DISCLOSURE ANALYSIS IN THE CORPORATE GOVERNANCE ANNUAL REPORT USING FUZZY-SET QUALITATIVE COMPARATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pedro Carmona

    2016-05-01

    Full Text Available This paper explores the necessary and sufficient conditions of good Corporate Governance practices for high risk disclosure by firms in their Corporate Governance Annual Report. Additionally, we explore whether those recipes have changed during the financial crisis. With a sample of 271 Spanish listed companies, we applied fuzzy-set qualitative comparative analysis to a database of financial and non-financial data. We report that Board of Directors independence, size, level of activity and gender diversity, CEO duality, Audit Committee independence, being audited by the Big Four auditing firms and the presence of institutional investors are associated with high risk disclosure. The conditions included in almost every combination are the presence of institutional investors and being audited by the Big Four. We found similar combinations for 2006 and 2012, while the analysis for 2009 showed the lowest number of causal configurations.
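Fuzzy-set QCA rests on set-membership measures such as Ragin's consistency score; a minimal sketch, with invented membership values:

```python
import numpy as np

def consistency(x, y):
    """Ragin's fuzzy-set consistency of X as sufficient for Y:
    sum(min(x_i, y_i)) / sum(x_i) over case membership scores."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.minimum(x, y).sum() / x.sum())

# invented membership scores for two cases, e.g. X = "audited by a
# Big Four firm", Y = "high risk disclosure"
score = consistency([0.2, 0.8], [0.4, 0.6])
```

Scores near 1 indicate that X-membership is consistently matched or exceeded by Y-membership, i.e. X behaves as a sufficient condition.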

  19. ANALYSIS OF THE INVESTMENT RISK IN CRYPTOCURRENCY BITCOIN

    OpenAIRE

    Kinga Kądziołka

    2015-01-01

    The aim of the article was to evaluate the risks of investing in the Bitcoin cryptocurrency. Particular attention was paid to the risk of investment on the Polish exchanges: Bitcurex, BitBay, BitMarket.pl and LocalBitcoins. The VaR measure was used to evaluate the risk, and the risk of investing in the Bitcoin cryptocurrency was compared with the risk of investing in selected "traditional" currencies. Attention was also paid to the day-of-the-week effect on the Bitcoin exchanges. T...
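The historical VaR measure the abstract relies on can be sketched in a few lines; the return series here is synthetic, not Bitcoin data:

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """Historical Value-at-Risk: the loss level that returns fall
    below with probability 1 - confidence."""
    return float(-np.percentile(returns, 100.0 * (1.0 - confidence)))

# synthetic daily returns from -4.9% to +5.0% (illustrative only)
rets = np.arange(1, 101) / 1000.0 - 0.05
var_95 = historical_var(rets)
```

A 95% VaR of 0.044 reads as: on 95% of days the loss should not exceed 4.4% of the position.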

  1. Prospective Analysis of Risk for Hypothyroidism after Hemithyroidectomy

    Directory of Open Access Journals (Sweden)

    Virgilijus Beisa

    2015-01-01

    Full Text Available Objectives. To evaluate risk factors and to develop a simple scoring system to grade the risk of postoperative hypothyroidism (PH). Methods. In a controlled prospective study, 109 patients who underwent hemithyroidectomy for a benign thyroid disease were followed up for 12 months. The relation between clinical data and PH was analyzed for significance. A risk scoring system based on the significant risk factors and their clinical implications was developed. Results. The significant risk factors for PH were a higher TSH (thyroid-stimulating hormone) level and a lower ratio of the remaining thyroid weight to the patient's weight (derived weight index). Based on the log of the risk factor, a preoperative TSH level greater than 1.4 mU/L was assigned 2 points, and a level of 0.8-1.4 mU/L was assigned 1 point. A derived weight index lower than 0.8 g/kg was assigned 1 point. The risk score was calculated by summing the points. The incidences of PH were 7.3%, 30.4%, and 69.2% for risk scores of 0-1, 2, and 3, respectively. Conclusion. Risk factors for PH are a higher preoperative TSH level and a lower derived weight index. Our risk scoring system is a valid and reliable tool to identify patients who are at risk for PH before surgery.
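The reported scoring rule is simple enough to state directly in code; the cutoffs follow the abstract, while the function name is ours:

```python
def ph_risk_score(tsh, derived_weight_index):
    """Points-based risk score for postoperative hypothyroidism,
    using the cutoffs reported in the abstract (TSH in mU/L,
    derived weight index in g/kg)."""
    score = 0
    if tsh > 1.4:                    # TSH > 1.4 mU/L -> 2 points
        score += 2
    elif tsh >= 0.8:                 # TSH 0.8-1.4 mU/L -> 1 point
        score += 1
    if derived_weight_index < 0.8:   # remnant/body weight < 0.8 g/kg -> 1 point
        score += 1
    return score

# reported PH incidence: score 0-1 -> 7.3%, score 2 -> 30.4%, score 3 -> 69.2%
```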

  2. Predicting adolescent's cyberbullying behavior: A longitudinal risk analysis.

    Science.gov (United States)

    Barlett, Christopher P

    2015-06-01

    The current study used the risk factor approach to test the unique and combined influence of several possible risk factors for cyberbullying attitudes and behavior using a four-wave longitudinal design with an adolescent US sample. Participants (N = 96; average age = 15.50 years) completed measures of cyberbullying attitudes, perceptions of anonymity, cyberbullying behavior, and demographics four times throughout the academic school year. Several logistic regression equations were used to test the contribution of these possible risk factors. Results showed that (a) cyberbullying attitudes and previous cyberbullying behavior were important unique risk factors for later cyberbullying behavior, (b) anonymity and previous cyberbullying behavior were valid risk factors for later cyberbullying attitudes, and (c) the likelihood of engaging in later cyberbullying behavior increased with the addition of risk factors. Overall, results show the unique and combined influence of such risk factors for predicting later cyberbullying behavior. Results are discussed in terms of theory.

  3. ANALYSIS OF THE INVESTMENT RISK IN CRYPTOCURRENCY BITCOIN

    Directory of Open Access Journals (Sweden)

    Kinga Kądziołka

    2015-09-01

    Full Text Available The aim of the article was to evaluate the risks of investing in the Bitcoin cryptocurrency. Particular attention was paid to the risk of investment on the Polish exchanges: Bitcurex, BitBay, BitMarket.pl and LocalBitcoins. The VaR measure was used to evaluate the risk, and the risk of investing in the Bitcoin cryptocurrency was compared with the risk of investing in selected "traditional" currencies. Attention was also paid to the day-of-the-week effect on the Bitcoin exchanges. The investment in cryptocurrency was characterized by higher risk than investing in "traditional" currencies. The Polish Bitcoin exchange LocalBitcoins was characterized by the highest risk and the highest average daily rate of return.

  4. What Defines Us as Professionals in the Field of Risk Analysis?

    Science.gov (United States)

    Aven, Terje

    2016-08-11

    In a recent issue of Risk Analysis, the then-President of the Society for Risk Analysis (SRA), Pamela Williams, offered some interesting reflections on the risk analysis field. She states that the ability and desire to tackle difficult problems using a risk analytical approach is what uniquely defines us as professionals in the field of risk analysis. The point of departure for her discussion is interviews with the plenary speakers of the 2014 SRA Annual Meeting, who addressed two divisive topics: hydraulic fracking and marijuana use. She points to several themes that invite contributions from the field of risk analysis, including: Has the full spectrum of potential risks and benefits been identified and weighed, and what are the risk tradeoffs or countervailing risks? Inspired by Williams's reflections, and by analyzing the issues raised in the interviews, this article seeks to clarify what our field is really providing. A main conclusion of the article is that it is essential to acknowledge that professionals in the field of risk analysis merely support the tackling of such problems, and that their genuine competence, that which distinguishes them from other professionals, lies in the risk analytical approach itself.

  5. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department Of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergetic development with the RELAP-7 [4] code to advance the state of the art on the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU/hour). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide an adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
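Among the sampling strategies the report mentions, Latin hypercube sampling stratifies each input dimension so that every stratum is sampled exactly once; a minimal unit-cube sketch (not RAVEN's actual implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on the unit cube: each dimension is cut
    into n_samples equal strata and every stratum is hit exactly once."""
    rng = np.random.default_rng(seed)
    jitter = rng.random((n_samples, n_dims))   # position inside each stratum
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (strata + jitter) / n_samples

pts = latin_hypercube(10, 2)
```

Compared with plain Monte Carlo, this guarantees coverage of every marginal stratum, which is why it needs far fewer runs for the same input-space coverage.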

  6. Investigating differences between proper and common nouns using novel word learning

    Directory of Open Access Journals (Sweden)

    Anastasiya Romanova

    2014-04-01

    Full Text Available Empirical studies have shown higher rates of tip-of-the-tongue states for proper nouns than for common nouns in non-brain-damaged speakers (e.g., Valentine & Moore, 1995), and higher retrieval failure rates for proper nouns relative to common nouns in people with aphasia (e.g., Semenza, 2009). Some authors suggest the source of these differences lies in logical properties (e.g., Semenza, 2009): common nouns refer to a category of beings or objects that share certain semantic properties, while proper nouns designate specific individual beings or objects with unique features. Other authors attribute the distinction in processing to a number of statistical properties that differ across common and proper nouns (Kay, Hanley, & Miles, 2001). The aims of the present study were: 1) to dissociate the effects of logical and statistical properties by using novel words with equal statistical properties; and 2) to determine whether people with aphasia show disproportionate impairments in learning proper nouns relative to common nouns, compared to age-matched subjects. Methods. We tested young (n=16) and elderly (n=14) adult non-brain-damaged participants and people with aphasia (n=2). Items to be learnt were presented as representatives of an unknown species (n=10) in the common noun condition, or as individual creatures (n=10) in the proper noun condition. The experiment consisted of 5 sessions, each including a learning phase and a test phase with naming and word-picture verification tasks. Results and Discussion. Preliminary analysis showed learning of both common and proper nouns for both younger (F(4)=140.68, p<.01) and elderly (F(4)=34.87, p<.01) non-brain-damaged participants, with learning being significantly better in the younger group (F(4)=6.5, p<.01).
Contrary to expectations, performance on proper nouns was better than on common nouns for both young and elderly subjects (F(1)=6.47, p=.02 and F(1)=9.75, p<.01, respectively), possibly due to

  7. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

    Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently lack sufficient detail for an accurate assessment of the people potentially exposed to hazardous events, especially events that occur at the local scale, such as landslides. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses basic census units as spatial units, which are the most disaggregated and detailed spatial data available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for obtaining a first, more detailed estimate of the exposed population. The approach based on dasymetric cartography increases the spatial resolution of population data over large areas and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
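The core dasymetric step, reallocating a census unit's population to residential building footprints in proportion to area, can be sketched as follows (a simplified sketch with invented numbers; real workflows operate on GIS layers and may weight by land-use class):

```python
def dasymetric_population(unit_population, building_areas):
    """Allocate a census unit's population to residential building
    footprints in proportion to footprint area."""
    total_area = float(sum(building_areas))
    return [unit_population * a / total_area for a in building_areas]

# invented census unit: 300 inhabitants spread over three footprints
alloc = dasymetric_population(300, [100.0, 50.0, 50.0])
```

The allocation preserves the unit total, so intersecting the result with a susceptibility map counts each inhabitant exactly once.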

  8. Arctic climate change and oil spill risk analysis

    Institute of Scientific and Technical Information of China (English)

    William B. Samuels; David E. Amstutz; Heather A. Crowley

    2011-01-01

    The purpose of this project was to: 1) describe the effects of climate change in the Arctic and its impact on circulation, 2) describe the hindcast data used in the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) Oil Spill Risk Analysis (OSRA) model, 3) evaluate alternatives such as using forecast results in the OSRA model, and 4) recommend future studies. The effects of climate change on winds, sea ice, ocean circulation and river discharge in the Arctic, and their impacts on surface circulation, can be evaluated only through a series of specially designed numerical experiments using high-resolution coupled ice-ocean models to elucidate the sensitivity of the models to various parameterizations or forcings. The results of these experiments will suggest which mechanisms are most important in controlling model response and guide inferences on how OSRA may respond to different climate change scenarios. Climatological change in the Arctic could lead to drastic alterations of the wind, sea ice cover and concentration, and surface current fields, all of which would influence hypothetical oil spill trajectories. Because of the pace at which conditions are changing, BOEMRE needs to assess whether forecast ice/ocean model results might contain useful information for the purpose of calculating hypothetical oil spill trajectories.

  9. Distal wound complications following pedal bypass: analysis of risk factors.

    Science.gov (United States)

    Robison, J G; Ross, J P; Brothers, T E; Elliott, B M

    1995-01-01

    Wound complications of the pedal incision continue to compromise successful limb salvage following aggressive revascularization. Significant distal wound disruption occurred in 14 of 142 (9.8%) patients undergoing pedal bypass with autogenous vein for limb salvage between 1986 and 1993. One hundred forty-two pedal bypass procedures were performed for rest pain in 66 patients and tissue necrosis in 76. Among the 86 men and 56 women, 76% were diabetic and 73% were black. All but eight patients had a history of diabetes and/or tobacco use. Eight wounds were successfully managed with maintenance of patent grafts from 5 to 57 months. Exposure of a patent graft precipitated amputation in three patients, as did graft occlusion in an additional patient. One graft was salvaged by revision to the peroneal artery and one was covered by a local bipedicled flap. Multiple regression analysis identified three factors associated with wound complications at the pedal incision site: diabetes mellitus (p = 0.03), age > 70 years (p = 0.03), and rest pain (p = 0.05). Ancillary techniques ("pie-crusting") to reduce skin tension resulted in no distal wound problems among 15 patients considered to be at greatest risk for wound breakdown. Attention to the technique of distal graft tunneling, a wound closure that reduces tension, and control of swelling by avoiding dependency and by use of gentle elastic compression assume crucial importance in minimizing pedal wound complications following pedal bypass.
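Risk-factor associations of the kind reported above (e.g. diabetes and wound complications) are often summarized by an odds ratio with a confidence interval; the sketch below uses a made-up 2x2 table, not the study's data:

```python
import math

# Hypothetical 2x2 table of wound complication by diabetes status
# (counts are illustrative, not taken from the study):
#                  complication   no complication
# diabetic               12              96
# non-diabetic            2              32

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_diabetes, ci_lo, ci_hi = odds_ratio_with_ci(12, 96, 2, 32)
# OR = (12 * 32) / (96 * 2) = 2.0: in this invented table, the odds of a
# wound complication are twice as high among diabetics.
```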

  10. Sanitary risk analysis for farm workers exposed to environmental pollutants

    Directory of Open Access Journals (Sweden)

    Simone Pascuzzi

    2013-03-01

    Full Text Available In Italy, a large number of agricultural areas are contaminated by organic and inorganic polluting substances. In such areas, the agricultural operators come into contact with the environmental contaminants through inhalation and dermal contact with dusts and vapour, and this exposure can potentially alter the biological equilibrium with consequent poisonings and/or work-related illness. The aim of this paper is to apply a methodological procedure for the numerical evaluation of the health risk for agricultural employees operating in open fields or inside greenhouses located in areas contaminated with organic pollutants. This procedure is in response to the lack of calculation models concerning these types of environment and agricultural activities. As a case study, this methodology has been applied to an agricultural area of southern Italy characterised by the presence of pollutants. The results underline that in this area there is a smaller concentration of pollutants in open field cultivations than inside greenhouses owing to a phenomenon of dispersion into the atmosphere. This numeric analysis will later be verified by measurements carried out in situ in order to evaluate the real situation on the ground.

  11. Risk analysis of tyramine concentration in food production

    Science.gov (United States)

    Doudová, L.; Buňka, F.; Michálek, J.; Sedlačík, M.; Buňková, L.

    2013-10-01

    The contribution is focused on risk analysis in food microbiology. This paper evaluates the effect of selected factors on tyramine production in bacterial strains of the Lactococcus genus which were identified as tyramine producers. Tyramine is a biogenic amine synthesized from an amino acid called tyrosine. It can be found in certain foodstuffs (often in cheese), and can cause a pseudo-response in sensitive individuals. The above-mentioned bacteria are commonly used in the biotechnological process of cheese production as starter cultures. The levels of factors were chosen with respect to the conditions which can occur in this technological process. To describe and compare tyramine production in the chosen microorganisms, generalized regression models were applied. Tyramine production was modelled by Gompertz curves according to the selected factors (lactose concentration of 0-1% w/v, NaCl 0-2% w/v and aero/anaerobiosis) for 3 different types of bacterial cultivation. Moreover, estimates of model parameters were calculated and tested; multiple comparisons were discussed as well. The aim of this paper is to find a combination of factors leading to a similar tyramine production level.
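A hedged sketch of the Gompertz modelling mentioned above; the Zwietering reparameterization used here is one common form for microbial product formation, and all parameter values are illustrative, not fitted to the study's data:

```python
import math

# Gompertz growth model (Zwietering form), often used for microbial
# growth/product formation:
#   y(t) = A * exp(-exp(mu * e / A * (lam - t) + 1))
# with asymptote A, maximum production rate mu, and lag time lam.

def gompertz(t, A, mu, lam):
    """Gompertz curve value at time t."""
    return A * math.exp(-math.exp(mu * math.e / A * (lam - t) + 1.0))

# Illustrative parameters: asymptote 100 (e.g. mg/L tyramine),
# max rate 10 mg/L/h, lag 5 h.
A, mu, lam = 100.0, 10.0, 5.0
early = gompertz(0.0, A, mu, lam)    # during the lag phase, well below A
late = gompertz(1000.0, A, mu, lam)  # approaches the asymptote A
```

Comparing fitted curves (one per factor combination) is then a matter of comparing their estimated A, mu, and lam parameters.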

  12. Framework for risk analysis in Multimedia Environmental Systems (FRAMES)

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, G.; Buck, J.W.; Castleton, K.J.; Hoopes, B.L.; Pelton, M.A.; McDonald, J.P.; Gelston, G.M.; Taira, R.Y. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-05-01

    The objectives of this workshop are to (1) provide the NRC staff and the public with an overview of currently available federally-sponsored dose models appropriate for decommissioning assessments and (2) discuss NRC staff-developed questions related to model selection criteria in the final rule on "Radiological Criteria for License Termination" (62 FR 39058). For over 40 years, medium-specific models have been and will continue to be developed in an effort to understand and predict environmental phenomena, including fluid-flow patterns, contaminant migration and fate, human or wildlife exposures, impacts from specific toxicants on specific species and their organs, cost-benefit analyses, impacts from remediation alternatives, etc. For nearly 40 years, medium-specific models have been combined for either sequential or concurrent assessments. The evolution of multiple-media assessment tools has followed a logical progression. To allow a suite of users the flexibility and versatility to construct, combine, and couple attributes that meet their specific needs without unnecessarily burdening the user with extraneous capabilities, development of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) was begun in 1994. FRAMES represents a platform which links elements together and yet does not represent the models that are linked to or within it; therefore, changes to elements that are linked to or within FRAMES do not change the framework.

  13. Sensitivity analysis on parameters and processes affecting vapor intrusion risk.

    Science.gov (United States)

    Picone, Sara; Valstar, Johan; van Gaans, Pauline; Grotenhuis, Tim; Rijnaarts, Huub

    2012-05-01

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion.
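The dominant role of water-filled versus gas-filled pores described above can be illustrated with the characteristic diffusion time t ≈ L²/D; the diffusivity values below are typical order-of-magnitude figures for VOCs, not parameters from the paper:

```python
# Characteristic diffusion time t ~ L^2 / D illustrates why transport
# through water-filled pores dominates vapor intrusion timescales:
# VOC diffusivities in water are ~4 orders of magnitude smaller than in gas.

D_GAS = 1e-5    # m^2/s, typical VOC diffusivity in air (illustrative)
D_WATER = 1e-9  # m^2/s, typical VOC diffusivity in water (illustrative)

def diffusion_time_days(length_m, diffusivity):
    """Characteristic time to diffuse a distance length_m, in days."""
    return length_m ** 2 / diffusivity / 86400.0

t_gas = diffusion_time_days(1.0, D_GAS)     # ~1 day through 1 m of gas-filled pores
t_water = diffusion_time_days(1.0, D_WATER) # ~10^4 days through water-filled pores
```

This is why the distance from the dissolved source to the nearest gas diffusion front, rather than the total distance to the crawl space, controls both the attenuation factor and the characteristic time.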

  14. Comparative Analysis of Risk, Return and Diversification of Mutual Fund

    Directory of Open Access Journals (Sweden)

    Rais Ahmad

    2015-01-01

    Full Text Available Mutual funds have become a widely popular and effective way for investors to participate in financial markets in an easy, low-cost fashion, while muting risk by spreading the investment across different types of securities, a practice known as diversification. They can play a central role in an individual's investment strategy. With the plethora of schemes available in the Indian markets, an investor needs to evaluate and consider various factors before making an investment decision. The present investigation aims to examine the performance of the safest investment instruments in the security market in the eyes of investors. Five large-cap mutual fund schemes have been selected for this purpose. The examination is carried out by assessing various financial measures such as the Sharpe ratio, standard deviation, alpha, and beta. Furthermore, an in-depth analysis has also been done by considering returns over the last five years on various bases, the expense ratio, corpus size, etc. The data have been taken from the websites of various mutual fund schemes and from www.valueresearch.com. The study will be helpful for researchers and financial analysts analyzing various securities or funds while selecting the best investment alternative out of the galaxy of investment alternatives.
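As a sketch of the measures named above, the Sharpe ratio and beta can be computed directly from return series; the returns and risk-free rate below are invented for illustration, not taken from any actual fund:

```python
import statistics

# Hypothetical annual return series (fractions, not percentages).
fund = [0.12, 0.08, -0.03, 0.15, 0.10]    # made-up fund returns
market = [0.10, 0.06, -0.05, 0.12, 0.09]  # made-up benchmark returns
risk_free = 0.04                           # assumed risk-free rate

def sharpe_ratio(returns, rf):
    """Mean excess return per unit of excess-return volatility."""
    excess = [r - rf for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

def beta(fund_returns, market_returns):
    """Sample covariance with the market divided by market variance."""
    mf = statistics.mean(fund_returns)
    mm = statistics.mean(market_returns)
    cov = sum((f - mf) * (m - mm)
              for f, m in zip(fund_returns, market_returns)) / (len(fund_returns) - 1)
    return cov / statistics.variance(market_returns)

s = sharpe_ratio(fund, risk_free)  # risk-adjusted performance
b = beta(fund, market)             # sensitivity to market movements
```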

  15. Reliability and risk analysis using artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D.G. [Sandia National Labs., Albuquerque, NM (United States)

    1995-12-31

    This paper discusses preliminary research at Sandia National Laboratories into the application of artificial neural networks for reliability and risk analysis. The goal of this effort is to develop a reliability based methodology that captures the complex relationship between uncertainty in material properties and manufacturing processes and the resulting uncertainty in life prediction estimates. The inputs to the neural network model are probability density functions describing system characteristics and the output is a statistical description of system performance. The most recent application of this methodology involves the comparison of various low-residue, lead-free soldering processes with the desire to minimize the associated waste streams with no reduction in product reliability. Model inputs include statistical descriptions of various material properties such as the coefficients of thermal expansion of solder and substrate. Consideration is also given to stochastic variation in the operational environment to which the electronic components might be exposed. Model output includes a probabilistic characterization of the fatigue life of the surface mounted component.
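The input/output mapping described above (probability densities for material properties in, a statistical description of performance out) can be approximated with plain Monte Carlo sampling; the toy fatigue-life model and all parameter values below are illustrative assumptions, not Sandia's methodology:

```python
import random
import statistics

random.seed(42)  # reproducible sampling

def fatigue_life(cte_mismatch_ppm, delta_t):
    """Toy strain-based life estimate in cycles; NOT a real solder-joint model."""
    strain = cte_mismatch_ppm * 1e-6 * delta_t
    return (0.01 / strain) ** 2

# Input uncertainty: CTE mismatch between solder and substrate assumed
# Normal(12, 1) ppm/K; thermal cycling swing assumed 100 K.
samples = [fatigue_life(random.gauss(12.0, 1.0), 100.0) for _ in range(10_000)]

mean_life = statistics.mean(samples)       # central estimate of life
p05 = sorted(samples)[len(samples) // 20]  # conservative 5th-percentile life
```

A neural-network surrogate, as in the paper, replaces the expensive model evaluation inside the loop while preserving this distribution-in, distribution-out structure.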

  16. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    KAUST Repository

    Picone, Sara

    2012-03-30

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion. © 2012 SETAC.

  17. The risk analysis of levee systems: a comparison of international best practices

    Directory of Open Access Journals (Sweden)

    Tourment R.

    2016-01-01

    Full Text Available A risk analysis of a levee system estimates the overall level of flood risk associated with the levee system, according to a series of loading conditions, the levee performance and the vulnerability to flooding of assets in the protected area. This process, which requires the identification and examination of all the components that determine the risk of flooding in a system, includes different steps. Among these steps, ‘levee system failure analysis’, ‘flood consequences analysis’ and ‘risk attribution’ have benefitted from the most important advances of recent research projects. This paper presents a critical analysis of the latest methods to conduct levee system failure analysis, flood consequences analysis and risk attribution. It shows how these methods can contribute to improving the efficiency of the risk analysis process and therefore the design and management of levee systems.

  18. Advancing flood risk analysis by integrating adaptive behaviour in large-scale flood risk assessments

    Science.gov (United States)

    Haer, T.; Botzen, W.; Aerts, J.

    2016-12-01

    In the last four decades, the global population living in the 1/100-year flood zone has doubled from approximately 500 million to a little less than 1 billion people. Urbanization in low-lying, flood-prone cities further increases the exposed assets, such as buildings and infrastructure. Moreover, climate change will further exacerbate flood risk in the future. Accurate flood risk assessments are important to inform policy-makers and society on current and future flood risk levels. However, these assessments suffer from a major flaw in the way they estimate flood vulnerability and the adaptive behaviour of individuals and governments. Current flood risk projections commonly either assume that vulnerability remains constant, or try to mimic vulnerability through an external scenario. Such a static approach leads to a misrepresentation of future flood risk, as humans respond adaptively to flood events, flood risk communication, and incentives to reduce risk. In our study, we integrate adaptive behaviour in a large-scale European flood risk framework through an agent-based modelling approach. This allows for the inclusion of heterogeneous agents, which dynamically respond to each other and to a changing environment. We integrate state-of-the-art flood risk maps based on climate scenarios (RCPs) and socio-economic scenarios (SSPs) with government and household agents, which behave autonomously based on (micro-)economic behaviour rules. We show for the first time that excluding adaptive behaviour leads to a major misrepresentation of future flood risk. The methodology is applied to flood risk, but has similar implications for other research in the field of natural hazards. While more research is needed, this multi-disciplinary study advances our understanding of how future flood risk will develop.

  19. [Risk communication in analysis of occupational health risk for industrial workers].

    Science.gov (United States)

    Barg, A O; Lebedeva-Nesevrya, N A

    2015-01-01

    The article covers problems in the functioning of a risk communication system at an industrial enterprise. A sociological study at a machinery construction enterprise in the Perm region made it possible to examine the main procedures for informing workers exposed to occupational hazards about occupational health risks, to describe the features and mechanisms of risk communication, and to specify its model. The authors show that the main obstacles to an efficient system of occupational risk communication are an insufficiently thorough legal basis, the low corporate social responsibility of the enterprise, and the low social value workers place on health. This article was prepared with the support of the Russian Humanitarian Science Foundation (Project No. 14-16-59011).

  20. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    Science.gov (United States)

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. 
Ultimately, the author concludes that although

  1. Flood risk perceptions and spatial multi-criteria analysis: an exploratory research for hazard mitigation

    NARCIS (Netherlands)

    Raaijmakers, Ruud; Krywkow, Jorg; van der Veen, A.

    2008-01-01

    The conventional method of risk analysis (with risk as a product of probability and consequences) does not allow for a pluralistic approach that includes the various risk perceptions of stakeholders or lay people within a given social system. This article introduces a methodology that combines the

  2. A Risk-Analysis Approach to Implementing Web-Based Assessment

    Science.gov (United States)

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  3. Graduate Education in Risk Analysis for Food, Agriculture, and Veterinary Medicine: Challenges and Opportunities

    Science.gov (United States)

    Correia, Ana-Paula; Wolt, Jeffrey D.

    2010-01-01

    The notion of risk in relation to food and food production has heightened the need to educate students to effectively deal with risk in relation to decision making from a science-based perspective. Curricula and related materials were developed and adopted to support graduate learning opportunities in risk analysis and decision making as applied…

  5. Credit risk determinants analysis: Empirical evidence from Chinese commercial banks

    OpenAIRE

    LU, ZONGQI

    2013-01-01

    Abstract In order to investigate the potential determinants of credit risk in Chinese commercial banks, a panel dataset of 342 bank-year observations of Chinese commercial banks from 2003 to 2012 is used to quantify the relationship between the selected variables and Chinese banks' credit risk. Based on several robustness tests, the empirical results suggest that the inflation rate and loan loss provision are significantly positively related to Chinese commercial banks' credit risk; on the other hand, m...

  6. RISK ANALYSIS AND EVALUATION FOR CRITICAL LOGISTICAL INFRASTRUCTURE

    Directory of Open Access Journals (Sweden)

    Sascha Düerkop

    2016-12-01

    Full Text Available Logistical infrastructure builds the backbone of an economy. Without an effective logistical infrastructure in place, the supply needs of both enterprises and consumers might not be met. But even a high-quality logistical infrastructure can be threatened by risks. Thus, it is important to identify, analyse, and evaluate risks to logistical infrastructure that might threaten logistical processes. Only if those risks are known and their impact estimated can decision makers implement counteractive measures to reduce them. In this article, we develop a network-based approach that allows for the evaluation of risks and their consequences for the logistical network. We demonstrate the relevance of this approach by applying it to the logistics network of the central German state of Hesse. Even though transport data is extensively tracked and recorded nowadays, typical daily risks, like accidents on a motorway, and extraordinary risks, like a bridge at risk of collapse, terrorist attacks or climate-related catastrophes, are not systematically anticipated. Several studies have recently revealed that the overall impact on an economy of possible failures of single nodes and/or edges in a network is not calculated, and particularly critical edges are not identified in advance. We address this information gap with a method that helps to identify and quantify risks in a given network. To reach this objective, we define a mathematical optimization model that quantifies the current “risk-related costs” of the overall network and quantifies the risk by investigating the change of the overall costs in the case a risk is realized.
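The "risk-related cost" idea above can be sketched as the change in shortest-path transport cost when a single link fails; the toy road network below is made up, and real applications would use flow volumes and many origin-destination pairs:

```python
import heapq

def shortest_path_cost(edges, source, target):
    """Dijkstra over an undirected weighted edge list; inf if disconnected."""
    graph = {}
    for u, v, w in edges:
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

# Toy network: two cheap links via B, one expensive direct link A-C.
edges = [("A", "B", 1.0), ("B", "C", 1.0), ("A", "C", 5.0)]
base = shortest_path_cost(edges, "A", "C")                      # route via B
degraded = shortest_path_cost(edges[:1] + edges[2:], "A", "C")  # B-C link removed
risk_related_cost = degraded - base  # extra cost if the B-C link fails
```

Ranking edges by this cost difference identifies the critical links whose failure would hurt the network most.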

  7. Populating a multilingual ontology of proper names from open sources

    Directory of Open Access Journals (Sweden)

    Agata Savary

    2013-11-01

    Full Text Available Even though proper names play a central role in natural language processing (NLP) applications, they are still under-represented in lexicons, annotated corpora, and other resources dedicated to text processing. One of the main challenges is both the prevalence and the dynamicity of proper names. At the same time, large and regularly-updated knowledge sources containing partially-structured data, such as Wikipedia or GeoNames, are publicly available and contain large numbers of proper names. We present a method for the semi-automatic enrichment of Prolexbase, an existing multilingual ontology of proper names dedicated to natural language processing, with data extracted from these open sources in three languages: Polish, English and French. Fine-grained data extraction and integration procedures allow the user to enrich previous contents of Prolexbase with new incoming data. All data are manually validated and available under an open licence.

  8. Cataclysmic variables in the SUPERBLINK proper motion survey

    Energy Technology Data Exchange (ETDEWEB)

    Skinner, Julie N.; Thorstensen, John R. [Department of Physics and Astronomy, 6127 Wilder Laboratory, Dartmouth College, Hanover, NH 03755-3528 (United States); Lépine, Sébastien, E-mail: jns@dartmouth.edu [Department of Physics and Astronomy, Georgia State University, 25 Park Place NE, Atlanta, GA 30303 (United States)

    2014-12-01

    We have discovered a new high proper motion cataclysmic variable (CV) in the SUPERBLINK proper motion survey, which is sensitive to stars with proper motions greater than 40 mas yr⁻¹. This CV was selected for follow-up observations as part of a larger search for CVs selected based on proper motions and their near-UV−V and V−Ks colors. We present spectroscopic observations from the 2.4 m Hiltner Telescope at MDM Observatory. The new CV's orbital period is near 96 minutes, its spectrum shows the double-peaked Balmer emission lines characteristic of quiescent dwarf novae, and its V magnitude is near 18.2. Additionally, we present a full list of known CVs in the SUPERBLINK catalog.

  9. Proper Use of Audio-Visual Aids: Essential for Educators.

    Science.gov (United States)

    Dejardin, Conrad

    1989-01-01

    Criticizes educators as the worst users of audio-visual aids and among the worst public speakers. Offers guidelines for the proper use of an overhead projector and the development of transparencies. (DMM)

  10. Some Remarks on Bonjour on Warrant, Proper Function, and Defeasibility

    Directory of Open Access Journals (Sweden)

    Colin P. Ruloff

    2000-12-01

    Full Text Available A number of counterexamples have recently been leveled against Alvin Plantinga's Proper Functionalism, counterexamples aimed at showing that Plantinga's theory fails to provide sufficient conditions for warrant, that elusive epistemic property which together with true belief yields knowledge. Among these counterexamples, Laurence BonJour's is perhaps the most formidable and, if successful, shows that Proper Functionalism is simply too weak to serve as an acceptable theory of warrant. In this paper, I argue that, contrary to initial appearances, BonJour's counterexample is not successful. More exactly, I argue that, once it is recognized that a defeasibility constraint is deeply embedded within Plantinga's proper function condition for warrant, a constraint which says, in effect, that a belief B is warranted for an agent S only if S does not possess any defeaters against B, BonJour's counterexample to Proper Functionalism can be handled quite straightforwardly.

  11. A properly adjusted forage harvester can save time and money

    Science.gov (United States)

    A properly adjusted forage harvester can save fuel and increase the realizable milk per ton of your silage. This article details the adjustments necessary to minimize energy while maximizing productivity and forage quality....

  12. Cataclysmic Variables in the SUPERBLINK Proper Motion Survey

    CERN Document Server

    Skinner, Julie N; Lépine, Sébastien

    2014-01-01

    We have discovered a new high proper motion cataclysmic variable (CV) in the SUPERBLINK proper motion survey, which is sensitive to stars with proper motions greater than 40 mas/yr. This CV was selected for follow-up observations as part of a larger search for CVs selected based on proper motions and their NUV-V and V-K$_{s}$ colors. We present spectroscopic observations from the 2.4m Hiltner Telescope at MDM Observatory. The new CV's orbital period is near 96 minutes, its spectrum shows the double-peaked Balmer emission lines characteristic of quiescent dwarf novae, and its V magnitude is near 18.2. Additionally, we present a full list of known CVs in the SUPERBLINK catalog.

  13. Risk Analysis and Consumer Protection in B2C Transactions

    Institute of Scientific and Technical Information of China (English)

    YANG Jian-zheng; SHI Qi-liang; Gary Millar; Ruhul A. Sarker

    2005-01-01

    Recent studies have shown that the perceived lack of security is a major obstacle to the wider acceptance of e-commerce. To overcome this barrier, businesses need to implement comprehensive consumer protection systems that protect consumers during every stage of the purchasing process. This paper used the consumer behaviour model as the basis for analysing risks in Business-to-Consumer (B2C) transactions. Four categories of risks were identified: information, agreement, payment and delivery risk. By combining these risk categories with the three dimensions of management, technology and legislation, a comprehensive B2C consumer protection framework is developed.

  14. Contrast-Induced Nephropathy After Computed Tomography in Stable CKD Patients With Proper Prophylaxis: 8-Year Experience of Outpatient Prophylaxis Program.

    Science.gov (United States)

    Park, Sehoon; Kim, Myoung-Hee; Kang, Eunjeong; Park, Seokwoo; Jo, Hyung Ah; Lee, Hajeong; Kim, Sun Moon; Lee, Jung Pyo; Oh, Kook-Hwan; Joo, Kwon Wook; Kim, Yon Su; Kim, Dong Ki

    2016-05-01

    Conflicting data have been reported on the clinical significance of contrast-induced nephropathy after CT scan (CT-CIN). In addition, the epidemiologic characteristics and clinical outcomes of CT-CIN following proper prophylactic intervention remain elusive. We examined the incidence, risk factors, and outcomes of CT-CIN in stable chronic kidney disease (CKD) patients using data collected from our outpatient CT-CIN prophylaxis program conducted between 2007 and 2014. The program recruited patients with a reduced estimated glomerular filtration rate (eGFR) through a pop-up alert system and provided an identical protocol of CIN prophylaxis to all patients. A total of 1666 subjects were included in this study, and 61 of the 1666 subjects (3.7%) developed CT-CIN. Multivariate analysis showed that baseline eGFR, diabetes mellitus, and low serum albumin were significant risk factors for CT-CIN. The generalized additive model analysis revealed a nonlinear relationship between the baseline eGFR and the risk of CT-CIN. In this analysis, the risk of CT-CIN began to increase below an eGFR threshold of 36.8 mL/min/1.73 m². To assess the outcomes of CT-CIN, patients with and without CT-CIN were compared after propensity score-based 1:2 matching. CT-CIN did not increase the mortality rate of patients. However, patients with CT-CIN were significantly more likely to start dialysis within 6 months of follow-up, but not after those initial 6 months. CT-CIN developed in only a small number of stable CKD patients who received proper prophylactic intervention, and the risk of CT-CIN was increased in patients with more advanced CKD. Despite the low incidence, CT-CIN conferred a non-negligible risk for the initiation of dialysis in the acute period, even after prophylaxis.

  15. The PMA Catalogue: 420 million positions and absolute proper motions

    Science.gov (United States)

    Akhmetov, V. S.; Fedorov, P. N.; Velichko, A. B.; Shulga, V. M.

    2017-07-01

    We present a catalogue that contains about 420 million absolute proper motions of stars. It was derived from the combination of positions from Gaia DR1 and 2MASS, with a mean difference of epochs of about 15 yr. Most of the systematic zonal errors inherent in the 2MASS Catalogue were eliminated before deriving the absolute proper motions. The absolute calibration procedure (zero-pointing of the proper motions) was carried out using about 1.6 million positions of extragalactic sources. The mean formal error of the absolute calibration is less than 0.35 mas yr⁻¹. The derived proper motions cover the whole celestial sphere without gaps for a range of stellar magnitudes from 8 to 21 mag. In the sky areas where the extragalactic sources are invisible (the avoidance zone), a dedicated procedure was used that transforms the relative proper motions into absolute ones. The rms error of the proper motions depends on stellar magnitude and ranges from 2-5 mas yr⁻¹ for bright stars up to about 10 mas yr⁻¹ for faint ones. The present catalogue contains the Gaia DR1 positions of stars for the J2015 epoch. The system of the PMA proper motions does not depend on the systematic errors of the 2MASS positions, and in the range from 14 to 21 mag represents an independent realization of a quasi-inertial reference frame in the optical and near-infrared wavelength range. The catalogue also contains stellar magnitudes taken from the Gaia DR1 and 2MASS catalogues. A comparison of the PMA proper motions of stars with similar data from certain recent catalogues has been undertaken.
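The basic computation behind such catalogues, position difference divided by epoch difference, can be sketched as follows for small separations; the coordinates, epoch gap, and function name below are invented for illustration:

```python
import math

def proper_motion_mas_per_yr(ra1_deg, dec1_deg, ra2_deg, dec2_deg, dt_yr):
    """Return (mu_alpha*, mu_delta) in mas/yr from two-epoch positions.

    Small-angle approximation: RA difference is scaled by cos(dec) so both
    components are true angular rates on the sky.
    """
    dra = (ra2_deg - ra1_deg) * math.cos(math.radians(dec1_deg))
    ddec = dec2_deg - dec1_deg
    to_mas = 3.6e6  # degrees -> milliarcseconds
    return dra * to_mas / dt_yr, ddec * to_mas / dt_yr

# A made-up star moving 150 mas in declination over a 15 yr epoch gap.
mu_ra, mu_dec = proper_motion_mas_per_yr(
    10.0, 20.0, 10.0, 20.0 + 150 / 3.6e6, 15.0)  # -> ~10 mas/yr in dec
```

Zero-pointing against extragalactic sources, as in the PMA, then amounts to subtracting the mean apparent motion of objects that should be stationary.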

  16. Fast algorithms for finding proper strategies in game trees

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro; Sørensen, Troels Bjerre

    2008-01-01

    We show how to find a normal form proper equilibrium in behavior strategies of a given two-player zero-sum extensive form game with imperfect information but perfect recall. Our algorithm solves a finite sequence of linear programs and runs in polynomial time. For the case of a perfect information game, we show how to find a normal form proper equilibrium in linear time by a simple backwards induction procedure.
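
    The perfect-information case rests on plain backwards induction over the game tree. The sketch below evaluates a toy zero-sum tree that way; it only computes optimal values and moves, and omits the tie-breaking refinement the paper uses to make the resulting strategy a normal form proper equilibrium.

    ```python
    def solve(node):
        """Backwards induction on a perfect-information zero-sum game tree.

        node is either ('leaf', payoff) or (player, children) with player in
        {'max', 'min'} and children a list of nodes. Returns (value, move),
        where move is the index of the chosen child (None at a leaf).
        Toy illustration; tree encoding is an assumption of this sketch.
        """
        if node[0] == 'leaf':
            return node[1], None
        player, children = node
        values = [solve(child)[0] for child in children]
        pick = max if player == 'max' else min
        best = pick(range(len(values)), key=values.__getitem__)
        return values[best], best
    ```

    On the tree where max chooses between a min-node with leaves 3 and 5 and a leaf worth 4, min would force 3, so max takes the leaf worth 4. Each node is visited once, which is the linear-time behaviour the abstract refers to.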

  17. Uncertainty analysis of EUSES: Improving risk management through probabilistic risk assessment

    NARCIS (Netherlands)

    Jager T; Rikken MGJ; Poel P van der; ECO

    1997-01-01

    In risk assessment of new and existing substances, it is current practice to characterise risk using a deterministic quotient of the exposure concentration, or the dose, and a no-effect level. Feelings of uncertainty are tackled by introducing worst-case assumptions in the methodology. Since this pr
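
    The deterministic quotient the abstract describes divides an exposure concentration by a no-effect level; a probabilistic treatment instead propagates input uncertainty, e.g. by Monte Carlo sampling. A minimal sketch of both, using the conventional PEC/PNEC names and assumed lognormal input distributions (not the actual EUSES distributions), is:

    ```python
    import random

    def risk_quotient(pec, pnec):
        """Deterministic risk characterization: PEC / PNEC (> 1 flags risk)."""
        return pec / pnec

    def exceedance_probability(pec_mu, pec_sigma, pnec_mu, pnec_sigma,
                               n=100_000, seed=1):
        """Fraction of Monte Carlo draws with PEC/PNEC > 1.

        Inputs are the log-scale mean and sd of assumed lognormal PEC and
        PNEC distributions. Illustrative sketch only, not the EUSES model.
        """
        rng = random.Random(seed)
        exceed = 0
        for _ in range(n):
            pec = rng.lognormvariate(pec_mu, pec_sigma)
            pnec = rng.lognormvariate(pnec_mu, pnec_sigma)
            if pec / pnec > 1:
                exceed += 1
        return exceed / n
    ```

    Instead of one worst-case number, the probabilistic version reports how likely exceedance actually is, which is the kind of uncertainty information the report argues should feed into risk management.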

  18. Foundations for proper-time relativistic quantum theory

    Science.gov (United States)

    Gill, Tepper L.; Morris, Trey; Kurtz, Stewart K.

    2015-05-01

    This paper is a progress report on the foundations for the canonical proper-time approach to relativistic quantum theory. We first review the standard square-root equation of relativistic quantum theory, followed by a review of the Dirac equation, providing new insights into the physical properties of both. We then introduce the canonical proper-time theory. For completeness, we give a brief outline of the canonical proper-time approach to electrodynamics and mechanics, and then introduce the canonical proper-time approach to relativistic quantum theory. This theory leads to three new relativistic wave equations. In each case, the canonical generator of proper-time translations is strictly positive definite, so that it represents a particle. We show that the canonical proper-time extension of the Dirac equation for Hydrogen gives results that are consistently closer to the experimental data, when compared to the Dirac equation. However, these results are not sufficient to account for either the Lamb shift or the anomalous magnetic moment.
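
    For reference, the "standard square-root equation" the abstract reviews is the textbook relativistic Schrödinger equation, obtained by quantizing the relativistic energy relation E = √(c²p² + m²c⁴):

    ```latex
    i\hbar \frac{\partial \psi}{\partial t}
      = \sqrt{\,c^{2}\hat{\mathbf{p}}^{2} + m^{2}c^{4}\,}\,\psi ,
    \qquad \hat{\mathbf{p}} = -i\hbar\nabla .
    ```

    The paper's canonical proper-time equations are variants built on this starting point; their exact form is given in the paper itself, not reproduced here.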

  19. Breast Image Analysis for Risk Assessment, Detection, Diagnosis, and Treatment of Cancer

    NARCIS (Netherlands)

    Giger, M.L.; Karssemeijer, N.; Schnabel, J.A.

    2013-01-01

    The role of breast image analysis in radiologists' interpretation tasks in cancer risk assessment, detection, diagnosis, and treatment continues to expand. Breast image analysis methods include segmentation, feature extraction techniques, classifier design, biomechanical modeling, image registration

  20. Method for environmental risk analysis (MIRA) revision 2007; Metode for miljoerettet risikoanalyse (MIRA) revisjon 2007

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-04-15

    OLF's instruction manual for carrying out environmental risk analyses provides a unified approach and a common framework for environmental risk assessments, based on the best information available. The manual implies standardizations of a series of parameters, input data and partial analyses that are included in the environmental risk analysis. Environmental risk analyses carried out according to the MIRA method will thus be comparable between fields and between companies. This revision emphasizes an update of the text in accordance with today's practice for environmental risk analyses and prevailing regulations. Moreover, method adjustments for especially protected beach habitats have been introduced, as well as a general method for estimating environmental risk concerning fish. Emphasis has also been put on improving the ability of environmental risk analyses to contribute to better management of environmental risk in the companies.