WorldWideScience

Sample records for risk analyses based

  1. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

    Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs, which are less susceptible to these uncertainties, are also presented.

  2. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate the age-dependent component failure rates and also aging maintenance models to evaluate and prioritize the aging contributions from active components using the linear aging failure rate model and empirical components aging rates. In the present paper, this methodology is extended to passive components (for example, the pipes, heat exchangers, and the vessel). The analyses of passive components bring in issues different from active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components

  3. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  4. Data analyses and modelling for risk based monitoring of mycotoxins in animal feed

    NARCIS (Netherlands)

    van der Fels-Klerx, H.J. (Ine); Adamse, Paulien; Punt, Ans; van Asselt, Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study

  5. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations

  6. Handbook of methods for risk-based analyses of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Samanta, P.K.; Kim, I.S. [Brookhaven National Lab., Upton, NY (United States); Mankamo, T. [Avaplan Oy, Espoo (Finland); Vesely, W.E. [Science Applications International Corp., Dublin, OH (United States)

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  7. [Health risks in different living circumstances of mothers. Analyses based on a population study].

    Science.gov (United States)

    Sperlich, Stefanie

    2014-12-01

    The objective of this study was to determine the living circumstances ('Lebenslagen') in mothers which are associated with elevated health risks. Data were derived from a cross-sectional population-based sample of German women (n = 3129) with underage children. By means of a two-step cluster analysis, ten different maternal living circumstances were assessed which proved to be distinct with respect to indicators of socioeconomic position, employment status and family-related factors. Out of the ten living circumstances, one could be attributed to higher socioeconomic status (SES), while five were assigned to a middle SES and four to a lower SES. In line with previous findings, mothers with a high SES predominantly showed the best health while mothers with a low SES tended to be at higher health risk with respect to subjective health, mental health (anxiety and depression), obesity and smoking. However, there were important health differences between the different living circumstances within the middle and lower SES. In addition, varying health risks were found among different living circumstances of single mothers, pointing to the significance of family and job-related living conditions in establishing health risks. With this exploratory analysis strategy small-scale living conditions could be detected which were associated with specific health risks. This approach seemed particularly suitable to provide a more precise definition of target groups for health promotion. The findings encourage a more extensive application of the concept of living conditions in medical sociology research as well as health monitoring.

  8. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Directory of Open Access Journals (Sweden)

    H.J. (Ine) van der Fels-Klerx

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials.

  9. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Science.gov (United States)

    van der Fels-Klerx, H.J. (Ine); Adamse, Paulien; Punt, Ans; van Asselt, Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials. PMID:29373559

  10. [The genotype-based haplotype relative risk and transmission disequilibrium test analyses of familial febrile convulsions].

    Science.gov (United States)

    Qi, Y; Wu, X; Guo, Z; Zhang, J; Pan, H; Li, M; Bao, X; Peng, J; Zou, L; Lin, Q

    1999-10-01

    To confirm the linkage of familial febrile convulsions to the short arm of chromosome 6 (6p) or the long arm of chromosome 8 (8q), the authors finished genotyping of the Pst I locus on the coding region of heat shock protein (HSP) 70, the 5' untranslated region of HSP70-1, the 3' untranslated region of HSP70-2, D8S84 and D8S85. The data were processed by the genotype-based haplotype relative risk (GHRR) and transmission disequilibrium test (TDT) methods in PPAP. Some signs of association and disequilibrium between D8S85 and FC were shown by GHRR and TDT. A suspected linkage of familial febrile convulsions to the long arm of chromosome 8 has been proposed.

  11. Use of results of microbiological analyses for risk-based control of Listeria monocytogenes in marinated broiler legs.

    Science.gov (United States)

    Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura

    2008-02-10

    Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study results of microbiological analyses were used to develop a robust single plant level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10⁶, with a 95% credible interval (CI) of 6.7×10⁶-7.7×10⁶. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by-date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single producer level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
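
    A minimal sketch of the kind of Monte Carlo estimate described above, written as a simple conjugate Beta model in Python rather than the authors' WinBUGS model; the positive count and the annual sales volume are assumed for illustration (only the sample size of 186 legs is taken from the record).

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical inputs (not the study's actual data): sampled legs,
      # number testing positive, and annual number of legs sold.
      n_sampled, n_positive = 186, 63
      annual_sales = 21_000_000

      # Beta(1, 1) prior on prevalence updated with the sample results.
      prevalence = rng.beta(1 + n_positive, 1 + n_sampled - n_positive, size=100_000)

      # Propagate prevalence uncertainty to the annual number of positive legs.
      positive_legs = prevalence * annual_sales
      lo, hi = np.percentile(positive_legs, [2.5, 97.5])
      print(f"mean {positive_legs.mean():.2e}, 95% CrI [{lo:.2e}, {hi:.2e}]")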

  12. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
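
    An illustrative sketch of the "synthetic pathway" idea, not any group's actual method: random gene sets of the same size provide a null distribution for a pathway-level statistic, from which an empirical p-value (and hence a false-positive rate under the null) can be estimated. All data are simulated.

      import numpy as np

      rng = np.random.default_rng(11)
      gene_scores = rng.normal(0, 1, 20_000)            # per-gene association z-scores (simulated)
      pathway_genes = rng.choice(20_000, 50, replace=False)

      observed = gene_scores[pathway_genes].mean()      # pathway-level statistic
      null = np.array([
          gene_scores[rng.choice(20_000, 50, replace=False)].mean()
          for _ in range(5_000)                         # "synthetic pathways"
      ])
      p_empirical = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
      print(f"empirical pathway p-value: {p_empirical:.3f}")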

  13. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    Directory of Open Access Journals (Sweden)

    Ester Vilaprinyo

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) to perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) to estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.
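
    A minimal sketch of incremental cost-effectiveness ranking of the kind used to select optimal strategies; the strategy names, costs and effects below are hypothetical placeholders, not the study's results, and only simple dominance (more costly, no more effective) is handled.

      # Hypothetical strategies: (name, cost per woman, effect, e.g. QALYs gained)
      strategies = [
          ("no screening", 0.0, 0.000),
          ("quinquennial 50-69", 120.0, 0.010),
          ("biennial 50-69", 300.0, 0.018),
          ("annual 40-74", 900.0, 0.021),
      ]

      strategies.sort(key=lambda s: s[1])      # order by increasing cost
      frontier = []
      for name, cost, effect in strategies:
          if frontier and effect <= frontier[-1][2]:
              continue                         # dominated: costs more, no extra benefit
          frontier.append((name, cost, effect))

      # Incremental cost-effectiveness ratio (ICER) versus the previous strategy.
      prev = frontier[0]
      for name, cost, effect in frontier[1:]:
          icer = (cost - prev[1]) / (effect - prev[2])
          print(f"{name}: ICER = {icer:.0f} per unit of effect")
          prev = (name, cost, effect)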

  14. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for melt down, radioactive releases, or harmful effects for the environment. Following risk policies for chemical installations as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication "How to deal with risks", probabilistic risk analyses are required for nuclear power plants.

  15. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    Buck, John W.; McDonald, John P.; Taira, Randal Y.

    2002-01-01

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders

  16. Environmental risk factors for autism: an evidence-based review of systematic reviews and meta-analyses.

    Science.gov (United States)

    Modabbernia, Amirhossein; Velthorst, Eva; Reichenberg, Abraham

    2017-01-01

    According to recent evidence, up to 40-50% of variance in autism spectrum disorder (ASD) liability might be determined by environmental factors. In the present paper, we conducted a review of systematic reviews and meta-analyses of environmental risk factors for ASD. We assessed each review for quality of evidence and provided a brief overview of putative mechanisms of environmental risk factors for ASD. Current evidence suggests that several environmental factors including vaccination, maternal smoking, thimerosal exposure, and most likely assisted reproductive technologies are unrelated to risk of ASD. On the contrary, advanced parental age is associated with higher risk of ASD. Birth complications that are associated with trauma or ischemia and hypoxia have also shown strong links to ASD, whereas other pregnancy-related factors such as maternal obesity, maternal diabetes, and caesarian section have shown a less strong (but significant) association with risk of ASD. The reviews on nutritional elements have been inconclusive about the detrimental effects of deficiency in folic acid and omega 3, but vitamin D seems to be deficient in patients with ASD. The studies on toxic elements have been largely limited by their design, but there is enough evidence for the association between some heavy metals (most important inorganic mercury and lead) and ASD that warrants further investigation. Mechanisms of the association between environmental factors and ASD are debated but might include non-causative association (including confounding), gene-related effect, oxidative stress, inflammation, hypoxia/ischemia, endocrine disruption, neurotransmitter alterations, and interference with signaling pathways. Compared to genetic studies of ASD, studies of environmental risk factors are in their infancy and have significant methodological limitations. Future studies of ASD risk factors would benefit from a developmental psychopathology approach, prospective design, precise exposure

  17. RISK ANALYSES USED IN ACCEPTANCE TESTING

    Directory of Open Access Journals (Sweden)

    Oxana STOROJ

    2016-06-01

    This article discusses a risk-based testing approach in user acceptance testing (UAT). Definitions of risk and risk-based testing are presented. In addition, the article discusses risks that can appear during UAT and describes the process of testing based on risks. Some techniques and methods of identifying risks are proposed, such as brainstorming, the Delphi method, the probability analysis method and others. A risk traceability matrix is also presented as a method of prioritizing risks.
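
    A minimal sketch of risk-based prioritization of UAT effort, not the article's own matrix: each requirement is given hypothetical likelihood and impact scores (1-5) and ordered by their product.

      # Hypothetical requirements: (requirement, likelihood of failure, business impact)
      requirements = [
          ("payment processing", 4, 5),
          ("report export", 2, 3),
          ("user profile page", 3, 2),
          ("login / authentication", 3, 5),
      ]

      # Order acceptance-test effort by risk score = likelihood x impact.
      ranked = sorted(requirements, key=lambda r: r[1] * r[2], reverse=True)
      for name, likelihood, impact in ranked:
          print(f"{name}: risk score {likelihood * impact}")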

  18. Building-related symptoms among U.S. office workers and risk factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

    The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
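
    An illustrative sketch of estimating adjusted odds ratios with multivariate logistic regression, using synthetic data standing in for the BASE survey; variable names, effect sizes and sample size are all assumptions, not the study's data.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 2000
      water_damage = rng.integers(0, 2, n)          # risk factor of interest (0/1)
      age = rng.normal(40, 10, n)                   # personal-level confounder
      smoker = rng.integers(0, 2, n)                # personal-level confounder

      # Simulate a weekly symptom outcome with a known logistic relationship.
      logit = -2.0 + 0.25 * water_damage + 0.01 * (age - 40) + 0.3 * smoker
      symptoms = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([water_damage, age, smoker]))
      fit = sm.Logit(symptoms, X).fit(disp=0)

      # Adjusted odds ratios and 95% confidence limits = exponentiated coefficients.
      print("adjusted ORs:", np.round(np.exp(fit.params[1:]), 2))
      print("95% CIs:", np.round(np.exp(fit.conf_int()[1:]), 2))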

  19. Increased risk of stroke in hypertensive women using hormone therapy: analyses based on the Danish Nurse Study

    DEFF Research Database (Denmark)

    Løkkegaard, Ellen; Jovanovic, Zorana; Heitmann, Berit L

    2003-01-01

    by presence of risk factors for stroke. DESIGN: Prospective cohort study. SETTING: In 1993, the Danish Nurse Study was established, and questionnaires on lifestyle and HT use were sent to all Danish nurses older than 44 years, of whom 19,898 (85.8%) replied. PARTICIPANTS: Postmenopausal women (n = 13...

  20. El Niño-Southern Oscillation-based index insurance for floods: Statistical risk analyses and application to Peru

    Science.gov (United States)

    Khalil, Abedalrazq F.; Kwon, Hyun-Han; Lall, Upmanu; Miranda, Mario J.; Skees, Jerry

    2007-10-01

    Index insurance has recently been advocated as a useful risk transfer tool for disaster management situations where rapid fiscal relief is desirable and where estimating insured losses may be difficult, time consuming, or subject to manipulation and falsification. For climate-related hazards, a rainfall or temperature index may be proposed. However, rainfall may be highly spatially variable relative to the gauge network, and in many locations, data are inadequate to develop an index because of short time series and the spatial dispersion of stations. In such cases, it may be helpful to consider a climate proxy index as a regional rainfall index. This is particularly useful if a long record is available for the climate index through an independent source and it is well correlated with the regional rainfall hazard. Here El Niño-Southern Oscillation (ENSO) related climate indices are explored for use as a proxy to extreme rainfall in one of the districts of Peru, Piura. The ENSO index insurance product may be purchased by banks or microfinance institutions to aid agricultural damage relief in Peru. Crop losses in the region are highly correlated with floods but are difficult to assess directly. Beyond agriculture, many other sectors suffer as well. Basic infrastructure is destroyed during the most severe events. This disrupts trade for many microenterprises. The reliability and quality of the local rainfall data are variable. Averaging the financial risk across the region is desirable. Some issues with the implementation of the proxy ENSO index are identified and discussed. Specifically, we explore (1) the reliability of the index at different levels of probability of exceedance of maximum seasonal rainfall, (2) the effect of sampling uncertainties and the strength of the proxy's association to local outcome, (3) the potential for clustering of payoffs, (4) the potential that the index could be predicted with some lead time prior to the flood season, and (5) evidence
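
    A toy sketch of a proxy-index payout rule of the general kind discussed above; the ENSO index values, loss series, strike level and payout scaling are all invented for illustration, not the Piura analysis.

      import numpy as np

      rng = np.random.default_rng(7)
      years = 50
      enso = rng.normal(0, 1, years)                       # hypothetical seasonal ENSO index
      losses = np.clip(80 * enso + rng.normal(0, 40, years), 0, None)  # hypothetical flood losses

      strike = np.quantile(enso, 0.90)                     # index level with 10% exceedance probability
      cap = enso.max()
      payout = np.clip((enso - strike) / (cap - strike), 0, 1) * 100   # payout per unit of cover

      # Basis risk: how well do index-triggered payouts track actual losses?
      print("strike:", round(strike, 2))
      print("corr(payout, loss):", round(np.corrcoef(payout, losses)[0, 1], 2))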

  1. Seismic risk analyses in the German Risk Study, phase B

    International Nuclear Information System (INIS)

    Hosser, D.; Liemersdorf, H.

    1991-01-01

    The paper discusses some aspects of the seismic risk part of the German Risk Study for Nuclear Power Plants, Phase B. First simplified analyses in Phase A of the study allowed only a rough classification of structures and systems of the PWR reference plant according to their seismic risk contribution. These studies were extended in Phase B using improved models for the dynamic analyses of buildings, structures and components as well as for the probabilistic analyses of seismic loading, failure probabilities and event trees. The methodology of deriving probabilistic seismic load descriptions is explained and compared with the methods in Phase A of the study and in other studies. Some details of the linear and nonlinear dynamic analyses of structures are reported in order to demonstrate the influence of different assumptions for material behaviour and failure criteria. The probabilistic structural and event tree analyses are discussed with respect to distribution assumptions, acceptable simplifications and model uncertainties. Some results for the PWR reference plant are given. (orig.)

  2. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed

  3. The Network of Counterparty Risk: Analysing Correlations in OTC Derivatives.

    Science.gov (United States)

    Nanumyan, Vahan; Garas, Antonios; Schweitzer, Frank

    2015-01-01

    Counterparty risk denotes the risk that a party defaults in a bilateral contract. This risk not only depends on the two parties involved, but also on the risk from various other contracts each of these parties holds. In rather informal markets, such as the OTC (over-the-counter) derivative market, institutions only report their aggregated quarterly risk exposure, but no details about their counterparties. Hence, little is known about the diversification of counterparty risk. In this paper, we reconstruct the weighted and time-dependent network of counterparty risk in the OTC derivatives market of the United States between 1998 and 2012. To proxy unknown bilateral exposures, we first study the co-occurrence patterns of institutions based on their quarterly activity and ranking in the official report. The network obtained this way is further analysed by a weighted k-core decomposition, to reveal a core-periphery structure. This allows us to compare the activity-based ranking with a topology-based ranking, to identify the most important institutions and their mutual dependencies. We also analyse correlations in these activities, to show strong similarities in the behavior of the core institutions. Our analysis clearly demonstrates the clustering of counterparty risk in a small set of about a dozen US banks. This not only increases the default risk of the central institutions, but also the default risk of peripheral institutions which have contracts with the central ones. Hence, all institutions indirectly have to bear (part of) the counterparty risk of all others, which needs to be better reflected in the price of OTC derivatives.
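
    A toy illustration of the core-periphery idea on a made-up set of institutions; the paper uses a weighted k-core on reconstructed exposures, whereas this sketch uses the plain (unweighted) k-core available in networkx as a simplification.

      import networkx as nx

      # Hypothetical bilateral relationships between institutions.
      edges = [
          ("BankA", "BankB"), ("BankA", "BankC"), ("BankB", "BankC"),
          ("BankA", "BankD"), ("BankB", "BankD"), ("BankC", "BankD"),
          ("BankD", "DealerE"), ("BankC", "DealerF"), ("DealerF", "DealerG"),
      ]
      G = nx.Graph(edges)

      core = nx.core_number(G)                 # highest k-core each node belongs to
      k_max = max(core.values())
      central = [n for n, k in core.items() if k == k_max]
      print("core numbers:", core)
      print(f"innermost {k_max}-core (where counterparty risk clusters):", central)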

  4. PC based uranium enrichment analyser

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishana, K.R.; Bairi, B.R.

    1991-01-01

    It is important to measure enrichment of unirradiated nuclear fuel elements during production as a quality control measure. An IBM PC based system has recently been tested for enrichment measurements for Nuclear Fuel Complex (NFC), Hyderabad. As required by NFC, the system has ease of calibration. It is easy to switch the system from measuring enrichment of fuel elements to pellets and also automatically store the data and the results. The system uses an IBM PC plug in card to acquire data. The card incorporates programmable interval timers (8253-5). The counter/timer devices are executed by I/O mapped I/O's. A novel algorithm has been incorporated to make the system more reliable. The application software has been written in BASIC. (author). 9 refs., 1 fig

  5. Agent-based and phylogenetic analyses reveal how HIV-1 moves between risk groups: injecting drug users sustain the heterosexual epidemic in Latvia

    Science.gov (United States)

    Graw, Frederik; Leitner, Thomas; Ribeiro, Ruy M.

    2012-01-01

    Injecting drug users (IDU) are a driving force for the spread of HIV-1 in Latvia and other Baltic States, accounting for a majority of cases. However, in recent years, heterosexual cases have increased disproportionately. It is unclear how the changes in incidence patterns in Latvia can be explained, and how important IDU are for the heterosexual sub-epidemic. We introduce a novel epidemic model and use phylogenetic analyses in parallel to examine the spread of HIV-1 in Latvia between 1987 and 2010. Using a hybrid framework with a mean-field description for the susceptible population and an agent-based model for the infecteds, we track infected individuals and follow transmission histories dynamically formed during the simulation. The agent-based simulations and the phylogenetic analysis show that more than half of the heterosexual transmissions in Latvia were caused by IDU, which sustain the heterosexual epidemic. Indeed, we find that heterosexual clusters are characterized by short transmission chains with up to 63% of the chains dying out after the first introduction. In the simulations, the distribution of transmission chain sizes follows a power law distribution, which is confirmed by the phylogenetic data. Our models indicate that frequent introductions reduced the extinction probability of an autonomously spreading heterosexual HIV-1 epidemic, which now has the potential to dominate the spread of the overall epidemic in the future. Furthermore, our model shows that social heterogeneity of the susceptible population can explain the shift in HIV-1 incidence in Latvia over the course of the epidemic. Thus, the decrease in IDU incidence may be due to local heterogeneities in transmission, rather than the implementation of control measures. Increases in susceptibles, through social or geographic movement of IDU, could lead to a boost in HIV-1 infections in this risk group. Targeting individuals that bridge social groups would help prevent further spread of the

  6. A conceptual framework for formulating a focused and cost-effective fire protection program based on analyses of risk and the dynamics of fire effects

    International Nuclear Information System (INIS)

    Dey, M.K.

    1999-01-01

    This paper proposes a conceptual framework for developing a fire protection program at nuclear power plants based on probabilistic risk analysis (PRA) of fire hazards, and modeling the dynamics of fire effects. The process for categorizing nuclear power plant fire areas based on risk is described, followed by a discussion of fire safety design methods that can be used for different areas of the plant, depending on the degree of threat to plant safety from the fire hazard. This alternative framework has the potential to make programs more cost-effective, and comprehensive, since it will allow a more systematic and broader examination of fire risk, and provide a means to distinguish between high and low risk fire contributors. (orig.)

  7. International comparative analyses of healthcare risk management.

    Science.gov (United States)

    Sun, Niuyun; Wang, Li; Zhou, Jun; Yuan, Qiang; Zhang, Zongjiu; Li, Youping; Liang, Minghui; Cheng, Lan; Gao, Guangming; Cui, Xiaohui

    2011-02-01

    Interpretation of the growing body of global literature on health care risk is compromised by a lack of common understanding and language. This series of articles aims to comprehensively compare laws and regulations, institutional management, and administration of incidence reporting systems on medical risk management in the United Kingdom, the United States, Canada, Australia, and Taiwan, so as to provide evidence and recommendations for health care risk management policy in China. We searched the official websites of the healthcare risk management agencies of the four countries and one district for laws, regulatory documents, research reports, reviews and evaluation forms concerned with healthcare risk management and assessment. Descriptive comparative analysis was performed on relevant documents. A total of 146 documents were included in this study, including 2 laws (1.4%), 17 policy documents (11.6%), 41 guidance documents (28.1%), 37 reviews (25.3%), and 49 documents giving general information (33.6%). The United States government implemented one law and one rule of patient safety management, while the United Kingdom and Australia each issued professional guidances on patient safety improvement. The four countries implemented patient safety management policy on four different levels: national, state/province, hospital, and non-governmental organization. The four countries and one district adopted four levels of patient safety management, and the administration modes can be divided into an "NGO-led mode" represented by the United States and Canada and a "government-led mode" represented by the United Kingdom, Australia, and Taiwan. © 2011 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  8. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
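
    A minimal "pinching" sketch, not the report's probability-bounds machinery: uncertainty in two inputs of a toy safety-margin model is carried as intervals, each input is pinched in turn to a point value, and the resulting narrowing of the output range indicates which input's uncertainty matters most. Model form and numbers are illustrative only.

      import numpy as np

      def output_width(h_interval, q_interval, n=20_000, seed=3):
          rng = np.random.default_rng(seed)
          h = rng.uniform(*h_interval, n)        # e.g. dike height [m]
          q = rng.uniform(*q_interval, n)        # e.g. load factor [-]
          margin = h - 2.0 * q                   # toy safety-margin model
          return margin.max() - margin.min()     # width of the output range

      base = output_width((4.0, 6.0), (1.0, 2.5))
      pinch_h = output_width((5.0, 5.0), (1.0, 2.5))    # pinch dike height to a point
      pinch_q = output_width((4.0, 6.0), (1.75, 1.75))  # pinch load factor to a point

      for name, w in [("baseline", base), ("pinch height", pinch_h), ("pinch load", pinch_q)]:
          print(f"{name}: output width {w:.2f}, reduction {100 * (1 - w / base):.0f}%")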

  9. Regionalisation of asset values for risk analyses

    Directory of Open Access Journals (Sweden)

    A. H. Thieken

    2006-01-01

    In risk analysis there is a spatial mismatch of hazard data that are commonly modelled on an explicit raster level and exposure data that are often only available for aggregated units, e.g. communities. Dasymetric mapping techniques that use ancillary information to disaggregate data within a spatial unit help to bridge this gap. This paper presents dasymetric maps showing the population density and a unit value of residential assets for the whole of Germany. A dasymetric mapping approach, which uses land cover data (CORINE Land Cover) as ancillary variable, was adapted and applied to regionalize aggregated census data that are provided for all communities in Germany. The results were validated by two approaches. First, it was ascertained whether population data disaggregated at the community level can be used to estimate population in postcodes. Secondly, disaggregated population and asset data were used for a loss evaluation of two flood events that occurred in 1999 and 2002, respectively. It must be concluded that the algorithm tends to underestimate the population in urban areas and to overestimate population in other land cover classes. Nevertheless, flood loss evaluations demonstrate that the approach is capable of providing realistic estimates of the number of exposed people and assets. Thus, the maps are sufficient for applications in large-scale risk assessments such as the estimation of population and assets exposed to natural and man-made hazards.
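
    A minimal sketch of dasymetric disaggregation: a community's census total is spread over its raster cells in proportion to land-cover weights. The grid and the class weights below are invented for illustration, not the CORINE-based values used in the paper.

      import numpy as np

      land_cover = np.array([
          ["urban",  "urban",  "arable"],
          ["urban",  "forest", "arable"],
          ["forest", "forest", "water"],
      ])
      weights_by_class = {"urban": 1.0, "arable": 0.1, "forest": 0.02, "water": 0.0}

      community_population = 12_000
      w = np.vectorize(weights_by_class.get)(land_cover).astype(float)
      population_grid = community_population * w / w.sum()   # people per cell

      print(np.round(population_grid, 0))
      print("check total:", population_grid.sum())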

  10. Integrating Risk Analyses and Tools at the DOE Hanford Site

    International Nuclear Information System (INIS)

    LOBER, R.W.

    2002-01-01

    Risk assessment and environmental impact analysis at the U.S. Department of Energy (DOE) Hanford Site in Washington State has made significant progress in refining the strategy for using risk analysis to support closing of several hundred waste sites plus 149 single-shell tanks at the Hanford Site. A Single-Shell Tank System Closure Work Plan outlines the current basis for closing the single-shell tank systems. An analogous site approach has been developed to address closure of aggregated groups of similar waste sites. Because of the complexity, decision time frames, proximity of non-tank farm waste sites to tank farms, scale, and regulatory considerations, various projects are providing integrated assessments to support risk analyses and decision-making. Projects and the tools that are being developed and applied at Hanford to support retrieval and cleanup decisions include: (1) Life Cycle Model (LCM) and Risk Receptor Model (RRM)--A site-level set of tools to support strategic analyses through scoping level risk management to assess different alternatives and options for tank closure. (2) Systems Assessment Capability for Integrated Groundwater/Vadose Zone (SAC) and the Site-Wide Groundwater Model (SWGM)--A site-wide groundwater modeling system coupled with a risk-based uncertainty analysis of inventory, vadose zone, groundwater, and river interactions for evaluating cumulative impacts from individual and aggregate waste sites. (3) Retrieval Performance Evaluation (RPE)--A site-specific, risk-based methodology developed to evaluate performance of waste retrieval, leak detection and closure on a tank-specific basis as a function of past tank leaks, potential leakage during retrieval operations, and remaining residual waste inventories following completion of retrieval operations. (4) Field Investigation Report (FIR)--A corrective action program to investigate the nature and extent of past tank leaks through characterization activities and assess future impacts to

  11. Assessing the risk of pelvic and para-aortic nodal involvement in apparent early-stage ovarian cancer: A predictors- and nomogram-based analyses.

    Science.gov (United States)

    Bogani, Giorgio; Tagliabue, Elena; Ditto, Antonino; Signorelli, Mauro; Martinelli, Fabio; Casarin, Jvan; Chiappa, Valentina; Dondi, Giulia; Leone Roberti Maggiore, Umberto; Scaffa, Cono; Borghi, Chiara; Montanelli, Luca; Lorusso, Domenica; Raspagliesi, Francesco

    2017-10-01

    To estimate the prevalence of lymph node involvement in early-stage epithelial ovarian cancer in order to assess the prognostic value of lymph node dissection. Data of consecutive patients undergoing staging for early-stage epithelial ovarian cancer were retrospectively evaluated. Logistic regression and a nomogram-based analysis were used to assess the risk of lymph node involvement. Overall, 290 patients were included. All patients had lymph node dissection including pelvic and para-aortic lymphadenectomy. Forty-two (14.5%) patients were upstaged due to lymph node metastatic disease. Pelvic and para-aortic nodal metastases were observed in 22 (7.6%) and 42 (14.5%) patients. Lymph node involvement was observed in 18/95 (18.9%), 1/37 (2.7%), 4/29 (13.8%), 11/63 (17.4%), 3/41 (7.3%) and 5/24 (20.8%) patients with high-grade serous, low-grade serous, endometrioid G1, endometrioid G2&3, clear cell and undifferentiated histology, respectively (p=0.12, Chi-square test). We observed that high-grade serous histology was associated with an increased risk of pelvic node involvement, while histology other than low-grade serous and bilateral tumors were independently associated with para-aortic lymph node involvement (p<0.05). Nomograms displaying the risk of nodal involvement in the pelvic and para-aortic areas were built. High-grade serous histology and bilateral tumors are the main characteristics suggesting lymph node positivity. Our data suggested that high-grade serous and bilateral early-stage epithelial ovarian cancer are at high risk of having disease harboring in the lymphatic tissues of both pelvic and para-aortic area. After receiving external validation, our data will help to identify patients deserving comprehensive retroperitoneal staging. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. New ventures require accurate risk analyses and adjustments.

    Science.gov (United States)

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.
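
    A minimal sketch of the portfolio principle mentioned above applied to a set of service lines: the combined expected return and standard deviation follow from the unit weights and the covariance of their returns. All figures are hypothetical.

      import numpy as np

      expected_return = np.array([0.04, 0.09, 0.15])   # conservative .. speculative units
      weights = np.array([0.5, 0.3, 0.2])              # share of the capital budget
      cov = np.array([                                 # assumed return covariance matrix
          [0.0004, 0.0002, 0.0001],
          [0.0002, 0.0025, 0.0010],
          [0.0001, 0.0010, 0.0100],
      ])

      portfolio_return = weights @ expected_return
      portfolio_sd = np.sqrt(weights @ cov @ weights)
      print(f"expected return {portfolio_return:.1%}, standard deviation {portfolio_sd:.1%}")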

  13. A critical appraisal of the use of umbilical artery Doppler ultrasound in high-risk pregnancies: use of meta-analyses in evidence-based obstetrics

    DEFF Research Database (Denmark)

    Westergaard, H.B.; Langhoff-Roos, J.; Lingman, G.

    2001-01-01

    Doppler velocimetry; high-risk pregnancy; meta-analysis; intrauterine growth restriction; perinatal mortality; umbilical artery

  14. Risk and reliability analyses (LURI) and expert judgement techniques

    International Nuclear Information System (INIS)

    Pyy, P.; Pulkkinen, U.

    1998-01-01

    Probabilistic safety analysis (PSA) is currently used as a regulatory licensing tool in risk informed and plant performance based regulation. Increasingly, utility safety improvements are also based on PSA calculations as one criterion. PSA attempts to comprehensively identify all important risk contributors, compare them with each other, assess the safety level and suggest improvements based on its findings. The strength of PSA is that it is capable of providing decision makers with numerical estimates of risks. This makes decision making easier than the comparison of purely qualitative results. PSA is the only comprehensive tool that compactly attempts to include all the important risk contributors in its scope. Despite the demonstrated strengths of PSA, there are some features that have reduced its use. For example, the PSA scope has been limited to power operation and process internal events (transients and LOCAs). Only lately have areas such as shutdown, external events and severe accidents been included in PSA models in many countries. Problems related to modelling include, e.g., that rather static fault and event tree models are commonly used in PSA to model dynamic event sequences. Even if a valid model can be generated, there may be no data sources available other than expert judgement. Furthermore, there are a variety of different techniques for human reliability assessment (HRA) giving varying results. In the project Reliability and Risk Analyses (LURI) these limitations and shortcomings have been studied. In the decision making area, case studies on the application of decision analysis and a doctoral thesis have been published. Further, practical aid has been given to utilities and regulatory decision making. The effect of model uncertainty on PSA results has been demonstrated by two case studies. Human reliability has been studied both in the integrated safety analysis study and in the study of maintenance originated NPP component faults based on the

  15. Quantitative risk trends deriving from PSA-based event analyses. Analysis of results from U.S.NRC's accident sequence precursor program

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2004-01-01

    The United States Nuclear Regulatory Commission (U.S.NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk significant events as precursors that occurred at U.S. nuclear power plants. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk significant insights and for monitoring risk trends in the nuclear power industry, there are only a few attempts to determine and develop the trends using the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators, that is, the occurrence frequency of precursors and the annual core damage probability, derived from the results of the ASP analysis. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and the likelihood of risk significant events has been remarkably decreasing. The present study also demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring the risk trends and/or risk characteristics in the nuclear power industry. (author)
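
    A minimal sketch of how the two indicators named above could be computed for a single year, using hypothetical conditional core damage probabilities (CCDPs) and a hypothetical number of reactor-years; these are not ASP results.

      import numpy as np

      ccdp = np.array([3e-5, 1.2e-4, 8e-6, 4e-5])      # one CCDP per precursor event (assumed)
      reactor_years = 104                               # operating reactor-years that year (assumed)

      precursor_frequency = len(ccdp) / reactor_years
      annual_cdp = 1.0 - np.prod(1.0 - ccdp)            # P(at least one precursor leads to core damage)

      print(f"precursor occurrence frequency: {precursor_frequency:.3f} per reactor-year")
      print(f"annual core damage probability: {annual_cdp:.2e}")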

  16. Evaluation of the Risk of Grade 3 Oral and Pharyngeal Dysphagia Using Atlas-Based Method and Multivariate Analyses of Individual Patient Dose Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Otter, Sophie [Department of Clinical Oncology, Royal Marsden Hospital, Sutton and London (United Kingdom); Schick, Ulrike; Gulliford, Sarah [Department of Clinical Oncology, Royal Marsden Hospital, Sutton and London (United Kingdom); The Institute of Cancer Research, London (United Kingdom); Lal, Punita [Sanjay Gandhi Postgraduate Institute of Medical Sciences, Lucknow India (India); Franceschini, Davide [Department of Radiotherapy and Radiosurgery, Humanitas Research Hospital, Milan (Italy); Newbold, Katie; Nutting, Christopher; Harrington, Kevin [Department of Clinical Oncology, Royal Marsden Hospital, Sutton and London (United Kingdom); The Institute of Cancer Research, London (United Kingdom); Bhide, Shreerang, E-mail: shreerang.bhide@icr.ac.uk [Department of Clinical Oncology, Royal Marsden Hospital, Sutton and London (United Kingdom); The Institute of Cancer Research, London (United Kingdom); Department of Radiotherapy and Radiosurgery, Humanitas Research Hospital, Milan (Italy)

    2015-11-01

    Purpose: The study aimed to apply the atlas of complication incidence (ACI) method to patients receiving radical treatment for head and neck squamous cell carcinomas (HNSCC), to generate constraints based on dose-volume histograms (DVHs), and to identify clinical and dosimetric parameters that predict the risk of grade 3 oral mucositis (g3OM) and pharyngeal dysphagia (g3PD). Methods and Materials: Oral and pharyngeal mucosal DVHs were generated for 253 patients who received radiation (RT) or chemoradiation (CRT). They were used to produce ACI for g3OM and g3PD. Multivariate analysis (MVA) of the effect of dosimetry, clinical, and patient-related variables was performed using logistic regression and bootstrapping. Receiver operating characteristic (ROC) analysis was also performed, and the Youden index was used to find volume constraints that discriminated between volumes that predicted for toxicity. Results: We derived statistically significant dose-volume constraints for g3OM over the range v28 to v70. Only 3 statistically significant constraints were derived for g3PD: v67, v68, and v69. On MVA, mean dose to the oral mucosa predicted for g3OM, and concomitant chemotherapy and mean dose to the inferior constrictor (IC) predicted for g3PD. Conclusions: We have used the ACI method to evaluate incidences of g3OM and g3PD and ROC analysis to generate constraints to predict g3OM and g3PD derived from entire individual patient DVHs. On MVA, the strongest predictors were radiation dose (for g3OM) and concomitant chemotherapy (for g3PD).
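
    An illustrative sketch of deriving a single dose-volume constraint with ROC analysis and the Youden index, on synthetic DVH values; the dose level (v50), distributions and patient counts are assumptions, not the study's data.

      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(5)
      v50_no_tox = rng.normal(55, 12, 150)     # % of mucosa receiving >= 50 Gy, no toxicity (simulated)
      v50_tox = rng.normal(70, 12, 60)         # same metric, patients with grade 3 toxicity (simulated)

      volume = np.concatenate([v50_no_tox, v50_tox])
      toxicity = np.concatenate([np.zeros(150), np.ones(60)])

      fpr, tpr, thresholds = roc_curve(toxicity, volume)
      youden = tpr - fpr                        # Youden index J at each candidate cut-off
      best = thresholds[np.argmax(youden)]
      print(f"candidate v50 constraint: keep the volume below ~{best:.0f}% of the oral mucosa")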

  17. Cost tradeoffs in consequence management at nuclear power plants: A risk based approach to setting optimal long-term interdiction limits for regulatory analyses

    International Nuclear Information System (INIS)

    Mubayi, V.

    1995-05-01

    The consequences of severe accidents at nuclear power plants can be limited by various protective actions, including emergency responses and long-term measures, to reduce exposures of affected populations. Each of these protective actions involves costs to society. The costs of the long-term protective actions depend on the criterion adopted for the allowable level of long-term exposure. This criterion, called the "long-term interdiction limit," is expressed in terms of the projected dose to an individual over a certain time period from the long-term exposure pathways. The two measures of offsite consequences, latent cancers and costs, are inversely related and the choice of an interdiction limit is, in effect, a trade-off between these two measures. By monetizing the health effects (through ascribing a monetary value to life lost), the costs of the two consequence measures vary with the interdiction limit, the health effect costs increasing as the limit is relaxed and the protective action costs decreasing. The minimum of the total cost curve can be used to calculate an optimal long-term interdiction limit. The calculation of such an optimal limit is presented for each of five US nuclear power plants which were analyzed for severe accident risk in the NUREG-1150 program by the Nuclear Regulatory Commission.
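
    A toy sketch of the trade-off described above: monetized health-effect costs rise as the interdiction limit is relaxed, protective-action costs fall, and the optimal limit sits at the minimum of the total cost curve. Cost shapes and numbers are illustrative, not NUREG-1150 results.

      import numpy as np

      limit = np.linspace(0.5, 10.0, 200)           # allowed long-term dose (assumed units)

      health_cost = 40.0 * limit                    # more residual exposure, more latent cancers (toy)
      protective_cost = 600.0 / limit               # relocation/decontamination cost (toy)
      total = health_cost + protective_cost

      optimal = limit[np.argmin(total)]
      print(f"optimal interdiction limit ~ {optimal:.1f} (minimum of the total cost curve)")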

  18. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  19. Risk assessment of PCDD/Fs levels in human tissues related to major food items based on chemical analyses and micro-EROD assay.

    Science.gov (United States)

    Tsang, H L; Wu, S C; Wong, C K C; Leung, C K M; Tao, S; Wong, M H

    2009-10-01

    Nine groups of food items (freshwater fish, marine fish, pork, chicken, chicken eggs, leafy, non-leafy vegetables, rice and flour) and three types of human samples (human milk, maternal serum and cord serum) were collected for the analysis of PCDD/Fs. Results of chemical analysis revealed PCDD/Fs concentrations (pg g⁻¹ fat) in the following ascending order: pork (0.289 pg g⁻¹ fat), grass carp (Ctenopharyngodon idellus) (freshwater fish) (0.407), golden thread (Nemipterus virgatus) (marine fish) (0.511), chicken (0.529), mandarin fish (Siniperca kneri) (marine fish) (0.535), chicken egg (0.552), and snubnose pompano (Trachinotus blochii) (marine fish) (1.219). The results of micro-EROD assay showed relatively higher PCDD/Fs levels in fish (2.65 pg g⁻¹ fat) when compared with pork (0.47), eggs (0.33), chicken (0.13), flour (0.07), vegetables (0.05 pg g⁻¹ wet wt) and rice (0.05). The estimated average daily intake of PCDD/Fs of 3.51 pg EROD-TEQ/kg bw/day was within the range of the WHO Tolerable Daily Intake (1-4 pg WHO-TEQ/kg bw/day) and was higher than the Provisional Tolerable Monthly Intake (PTMI) (70 pg for dioxins and dioxin-like PCBs) recommended by the Joint FAO/WHO Expert Committee on Food Additives (JECFA) [Joint FAO/WHO Expert Committee on Food Additives (JECFA), Summary and conclusions of the fifty-seventh meeting, JECFA, 2001.]. Nevertheless, the current findings were significantly lower than the TDI (14 pg WHO-TEQ/kg/bw/day) recommended by the Scientific Committee on Food of the European Commission [European Scientific Committee on Food (EU SCF), Opinions of the SCF on the risk assessment of dioxins and dioxin-like PCBs in food, 2000.]. However, it should be noted that the micro-EROD assay overestimates PCDD/Fs levels by 2- to 7-fold, which may also amplify the estimated intake accordingly. Although the levels of PCDD/Fs obtained from micro-EROD assay were much higher than those obtained by chemical analysis by 2- to 7-fold, it provides a cost-effective and
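
    A minimal sketch of an estimated daily intake (EDI) calculation of the general form used above: EDI = sum of (concentration x daily consumption) divided by body weight, compared against the tolerable intake band. The concentrations and consumption figures are placeholders (and, for simplicity, all concentrations are treated as whole-product values), not the study's data.

      # Hypothetical inputs: food -> (TEQ concentration, pg/g; daily consumption, g/day)
      foods = {
          "marine fish": (0.5, 60),
          "pork":        (0.3, 80),
          "chicken egg": (0.55, 30),
          "rice":        (0.05, 250),
      }
      body_weight_kg = 60.0

      edi = sum(conc * grams for conc, grams in foods.values()) / body_weight_kg
      print(f"estimated daily intake: {edi:.2f} pg TEQ/kg bw/day")
      # Compare against the WHO tolerable daily intake band of 1-4 pg TEQ/kg bw/day.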

  20. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement is the consequence of the safety criteria for nuclear power plants issued by the Federal Ministry of the Interior (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The emphasis is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP) [de

  1. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...
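
    The key idea in the abstract — keeping memoized analysis results and recomputing only the entries affected by changed facts — can be sketched outside Prolog as well. The Python sketch below is an illustrative analogue of incremental tabled evaluation, not the authors' implementation; the class and query names are invented for the example.

```python
# Illustrative analogue of incremental tabled evaluation: analysis results are memoized
# together with the facts they read, and only dependent memo entries are invalidated
# when a fact changes. All names here are hypothetical.

class IncrementalAnalysis:
    def __init__(self, facts):
        self.facts = dict(facts)   # fact name -> value (e.g. per-file analysis inputs)
        self.memo = {}             # query name -> cached result
        self.deps = {}             # fact name -> set of query names that read it

    def query(self, name, compute):
        """Return a cached result, or compute it while recording which facts were read."""
        if name in self.memo:
            return self.memo[name]
        read = set()

        def get(fact):
            read.add(fact)
            return self.facts[fact]

        result = compute(get)
        self.memo[name] = result
        for fact in read:
            self.deps.setdefault(fact, set()).add(name)
        return result

    def update_fact(self, fact, value):
        """Change a fact and invalidate only the memo entries that depend on it."""
        self.facts[fact] = value
        for query in self.deps.pop(fact, set()):
            self.memo.pop(query, None)

# Example: only queries that depend on the changed fact would be recomputed.
a = IncrementalAnalysis({"len_A": 10, "len_B": 99})
print(a.query("A_too_long", lambda get: get("len_A") > 50))   # False, depends on len_A only
a.update_fact("len_B", 5)                                      # does not touch "A_too_long"
print("A_too_long" in a.memo)                                  # True: cached result survived
```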

  2. Accidental Risk Analyses of the Istanbul and Canakkale Straits

    Science.gov (United States)

    Essiz, Betül; Dagkiran, Berat

    2017-12-01

    Maritime transportation plays an important role worldwide, with commercial shipping and naval traffic linking many countries. Straits and channels make these activities easier and faster, and Turkey is of crucial importance here because of its geographical location. The Turkish Straits are a series of internationally significant waterways connecting the Mediterranean Sea and the Black Sea. They consist of the Canakkale Strait, the Sea of Marmara, and the Istanbul Strait, all part of the sovereign sea territory of Turkey and subject to the regime of internal waters, and they are conventionally considered the boundary between the continents of Europe and Asia. Because of this geographical importance, large vessels of all kinds and high-volume cargo transport pass continuously through this waterway. As maritime activity grows, however, so does the risk of accidents, which motivates examining accident risks in the Istanbul and Canakkale Straits and carrying out a risk analysis for them. The study presents general information on the Turkish Straits and the regulatory regime, together with detailed tables of vessel movements in the Turkish Straits by year, in order to show how traffic has varied. The risk analysis is structured in sections covering many variables. The paper outlines ship accidents, applies a risk analysis of ship accidents to the Turkish Straits, and presents the results. The last chapter concerns the Vessel Traffic Service (VTS) System in the Turkish Straits.

  3. Risk based modelling

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Baker, A.E.

    1993-01-01

    Risk based analysis is a tool becoming available to both engineers and managers to aid decision making concerning plant matters such as In-Service Inspection (ISI). In order to develop a risk based method, some form of Structural Reliability Risk Assessment (SRRA) needs to be performed to provide a probability of failure ranking for all sites around the plant. A Probabilistic Risk Assessment (PRA) can then be carried out to combine these possible events with the capability of plant safety systems and procedures, to establish the consequences of failure for the sites. In this way the probabilities of failure are converted into a risk based ranking which can be used to assist the process of deciding which sites should be included in an ISI programme. This paper reviews the technique and typical results of a risk based ranking assessment carried out for nuclear power plant pipework. (author)
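
    The ranking step described above reduces to combining a per-site failure probability (as an SRRA would supply) with a conditional consequence measure (as a PRA would supply) and sorting. The sketch below is a minimal illustration; the site identifiers and numbers are invented.

```python
# Hypothetical sketch of a risk-based ranking for In-Service Inspection:
# risk = probability of failure (from SRRA) x conditional consequence (from PRA).
sites = {
    # site id: (annual failure probability, conditional core damage probability given failure)
    "weld-RCS-12":  (2.0e-4, 1.0e-3),
    "weld-FW-07":   (5.0e-4, 2.0e-5),
    "nozzle-SG-03": (1.0e-5, 5.0e-3),
}

ranking = sorted(sites.items(),
                 key=lambda kv: kv[1][0] * kv[1][1],
                 reverse=True)

for site, (p_fail, ccdp) in ranking:
    print(f"{site:13s} risk contribution = {p_fail * ccdp:.2e} per year")
# Sites at the top of the list would be the strongest candidates for the ISI programme.
```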

  4. 76 FR 46268 - Notice of Availability of Pest Risk Analyses for the Importation of Fresh Pitaya and Pomegranates...

    Science.gov (United States)

    2011-08-02

    ...] Notice of Availability of Pest Risk Analyses for the Importation of Fresh Pitaya and Pomegranates From.... ACTION: Notice. SUMMARY: We are advising the public that we have prepared pest risk analyses that... for approving the importation of commodities that, based on the findings of a pest risk analysis, can...

  5. 75 FR 52302 - Notice of Availability of Pest Risk Analyses for the Importation of Fresh Celery, Arugula, and...

    Science.gov (United States)

    2010-08-25

    ... Inspection Service [Docket No. APHIS-2010-0074] Notice of Availability of Pest Risk Analyses for the... spinach from Colombia. We are making these pest risk analyses available to the public for review and..., based on the findings of a pest risk analysis, can be safely imported subject to one or more of the...

  6. Quality and completeness of risk analyses. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1992-06-01

    The program described was started in 1974, at Risoe National Laboratory. The motivation was criticism then being directed at the Reactor Safety Study WASH 1400, and the view that if risk analysis were to have a future as a scientific study, then its procedures would need to be verified. The material described is a record of a prolonged set of experiments and experiences, and this second edition of the report includes an update of the research to cover a study of 35 risk analyses reviewed and checked between 1988 and 1992. A survey is presented of the ways in which incompleteness, lacunae, and oversights arise in risk analysis. Areas where detailed knowledge of disturbance causes and consequences is needed include alarm priority setting, suppression of nuisance alarms and status annunciation signals, advanced shutdown system design and runback systems, alarm and disturbance analysis, automatic plant supervision, disturbance diagnosis and testing support, and monitoring of safe operating margins. Despite improvements in risk analysis technique, in safety management, and in safety design, there will always be problems of ignorance, of material properties, of chemical reactions, of behavioural patterns and safety decision making, which leave plants vulnerable and hazardous. (AB) (24 refs.)

  7. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  8. Children with Elevated Psychosocial Risk Load Benefit Most from a Family-Based Preventive Intervention: Exploratory Differential Analyses from the German "Strengthening Families Program 10-14" Adaptation Trial.

    Science.gov (United States)

    Bröning, Sonja; Baldus, Christiane; Thomsen, Monika; Sack, Peter-Michael; Arnaud, Nicolas; Thomasius, Rainer

    2017-11-01

    While the effectiveness of substance use prevention programs such as the Strengthening Families Program 10-14 (SFP) has been demonstrated in the USA, European SFP adaptations have not replicated these sizable effects. Following the rationale of the risk moderation hypothesis positing that elevated risk groups may benefit more from a preventive intervention than lower-risk groups, we reanalyzed evaluation data from a randomized controlled trial testing the adapted German version of SFP (SFP-D). We hypothesized a differential impact of risk status on intervention results. The study employed a minimal control condition. Of the N = 292 participating children, 73.5% qualified as at-risk because they lived in a deprived urban district, and 26.5% qualified as high risk because they additionally scored as "difficult" in the German Strengths and Difficulties Questionnaire (parents' reports using gender- and age-specific German norms). Outcomes were children's self-reports on substance use, mental health, family functioning, and quality of life. Data were analyzed with repeated measures linear mixed models and relative risk analyses. The high-risk group in the SFP-D condition achieved the best results compared with all other groups, especially in mental health and quality of life. Relative risk analyses on tobacco [alcohol] abstinence showed that an additional 29.8% [16.0%] of the high-risk children among nonabstinent controls would have remained abstinent if they had participated in SFP-D. We conclude that risk load influences the impact of substance use prevention programs and discuss to what extent differential analyses can add value to prevention research.
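
    As a reading aid for the relative-risk figures quoted above, the toy calculation below shows how a relative risk and a corresponding "additional abstinent" percentage can be derived from abstinence rates in two groups. The counts used are invented, not the trial data, and the calculation is a simplified stand-in for the study's analyses.

```python
# Toy relative-risk calculation (counts are invented, not the SFP-D trial data).
# Groups: high-risk children in the intervention vs. high-risk children in the control condition.
abstinent_int, n_int = 45, 50      # abstinent / total in the intervention group
abstinent_ctl, n_ctl = 30, 48      # abstinent / total in the control group

p_int = abstinent_int / n_int
p_ctl = abstinent_ctl / n_ctl

relative_risk = p_int / p_ctl                  # risk ratio of remaining abstinent
additional_abstinent = (p_int - p_ctl) * 100   # extra percentage expected to stay abstinent

print(f"RR of abstinence = {relative_risk:.2f}; "
      f"an additional {additional_abstinent:.1f}% of controls "
      f"might have remained abstinent with the intervention")
```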

  9. Analysing the external supply chain risk driver competitiveness: a risk mitigation framework and business continuity plan.

    Science.gov (United States)

    Blos, Mauricio F; Wee, Hui-Ming; Yang, Joshua

    2010-11-01

    Innovation challenges for handling supply chain risks have become one of the most important drivers in business competitiveness and differentiation. This study analyses competitiveness at the external supply chain level as a driver of risks and provides a framework for mitigating these risks. The mitigation framework, also called the supply chain continuity framework, provides insight into six stages of the business continuity planning (BCP) process life cycle (risk mitigation management, business impact analysis, supply continuity strategy development, supply continuity plan development, supply continuity plan testing and supply continuity plan maintenance), together with the operational constructs: customer service, inventory management, flexibility, time to market, ordering cycle time and quality. The purpose of the BCP process life cycle and operational constructs working together is to emphasise the way in which a supply chain can deal with disruption risks and, consequently, bring competitive advantage. Future research will consider the new risk scenarios and analyse the consequences to promote the improvement of supply chain resilience.

  10. Methodical treatment of dependent failures in risk analyses

    International Nuclear Information System (INIS)

    Hennings, W.; Mertens, J.

    1987-06-01

    In this report the state-of-the-art regarding dependent failures is compiled and commented on. Among others, the following recommendations are inferred: The term 'common mode failures' should be restricted to failures of redundant, similar components; the generic term is 'dependent failures' with the subsets 'causal failures' and 'common cause failures'. In risk studies, dependent failures should be covered as far as possible by 'explicit methods'. Nevertheless, a residual part remains uncovered, which should be accounted for by sensitivity analyses using 'implicit methods'. For this the homogeneous Marshall-Olkin model is recommended. Because the available reports on operating experiences only record 'common mode failures' systematically, it is recommended to additionally apply other methods, e.g. carry out a 'precursor study'. (orig.) [de

  11. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, and the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies as well as proper calibration of the models to the data from training sample sets.
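
    The validation step described in the abstract — judging a model by how well it separates at-risk and healthy firms on held-out data — can be sketched with any standard classifier. The snippet below uses scikit-learn and purely synthetic data for illustration; it is not the authors' model, and random features stand in for the 28 real financial indicators.

```python
# Illustrative sketch of validating a bankruptcy-prediction classifier on held-out data.
# Synthetic features stand in for the 28 financial indicators used in the study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 28))                       # 400 firms x 28 indicators (synthetic)
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=400) > 0).astype(int)   # 1 = "bankrupt"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

proba = model.predict_proba(X_te)[:, 1]
print("AUC on validation sample:", round(roc_auc_score(y_te, proba), 3))
print("Confusion matrix:\n", confusion_matrix(y_te, model.predict(X_te)))
```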

  12. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

    This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1−xKxFe2As2. After reviewing the current findings on this system, we suggest that a combined phonon-exciton mechanism gives the right order of magnitude for the superconducting transition temperature (TC) of Ba1−xKxFe2As2. By developing ...

  13. Risk-based safety indicators

    International Nuclear Information System (INIS)

    Szikszai, T.

    1997-01-01

    The presentation discusses the following issues: The objectives of the risk-based indicator programme. The characteristics of the risk-based indicators. The objectives of risk-based safety indicators - in monitoring safety; in PSA applications. What indicators? How to produce the risk based indicators? PSA requirements

  14. Biodiversity analyses for risk assessment of genetically modified potato

    NARCIS (Netherlands)

    Lazebnik, Jenny; Dicke, Marcel; Braak, ter Cajo J.F.; Loon, van Joop J.A.

    2017-01-01

    An environmental risk assessment for the introduction of genetically modified crops includes assessing the consequences for biodiversity. In this study arthropod biodiversity was measured using pitfall traps in potato agro-ecosystems in Ireland and The Netherlands over two years. We tested the

  15. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of probabilistic methodology, in order to see how this approach may affect the risk management of hazardous industrial facilities. For this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide sound support to the decision-making and authorization process for any industrial activity posing risks to the environment and public health. In the authors' opinion, probabilistic techniques have been developed to a greater level of sophistication, and have accumulated much more experience, in the nuclear industry than in other fields. For some particular areas of nuclear application, such as plant reliability and plant response to accidents, these techniques have reached a sufficient level of maturity that some results have usefully been taken as a measure of the safety level of the plant itself. The use of limited safety goals is regarded as a relevant element of the nuclear licensing process. The paper argues that it is now time for these methods to be applied with equal success to other hazardous facilities, and makes some comparative considerations on the differences between such plants and nuclear power plants, in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  16. Risk-based configuration control

    International Nuclear Information System (INIS)

    Szikszai, T.

    1997-01-01

    The presentation discusses the following issues: The Configuration Control; The Risk-based Configuration Control (during power operation mode, and during shutdown mode). PSA requirements. Use of Risk-based Configuration Control System. Configuration Management (basic elements, benefits, information requirements)

  17. The necessity for comparative risk analyses as seen from the political point of view

    International Nuclear Information System (INIS)

    Steger, U.

    1981-01-01

    The author describes the currently insufficient utilization of risk analyses in the political decision process and investigates whether other technologies encounter the same acceptance difficulties as in the nuclear energy field. Since this appears likely, he then asks what contribution comparative risk analyses could make to the process of democratic will-formation so that new technologies become accepted. First, the author establishes theses criticizing recent scientific efforts in the field of risk analyses and their usability for the political decision process. He then defines the criteria risk analyses have to meet in order to serve as scientific input to consultative political discussions. (orig./HP) [de

  18. Skin barrier and contact allergy: Genetic risk factor analyses

    DEFF Research Database (Denmark)

    Ross-Hansen, Katrine

    2013-01-01

    allergy. Objectives To evaluate the effect of specific gene polymorphisms on the risk of developing contact allergy by a candidate gene approach. These included polymorphisms in the glutathione S-transferase genes (GSTM1, -T1 and -P1 variants), the claudin-1 gene (CLDN1), and the filaggrin gene (FLG......) in particular. Methods Epidemiological genetic association studies were performed on a general Danish population. Participants were patch tested, answered a questionnaire on general health and were genotyped for GST, CLDN1 and FLG polymorphisms. Filaggrin’s nickel binding potential was evaluated biochemically...

  19. Analyse Risk-Return Paradox: Evidence from Electricity Sector of Pakistan

    OpenAIRE

    Naqi Shah, Sadia; Qayyum, Abdul

    2016-01-01

    This study analyses the risk-return relationship of the electricity companies of Pakistan using the log-return series of these companies. Financial time series exhibit autoregressive heteroscedasticity, which motivates the use of GARCH-family models. Since the study aims to analyse the risk-return relationship, the GARCH-M model of Engle et al. (1987), who empirically found a relationship between risk and return, is used. Results show that risk return in case of Pakistan electricity...
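
    The GARCH-in-mean idea — relating returns to their own conditional volatility — can be approximated in two steps with standard tools, as sketched below on simulated returns. This uses the `arch` and `statsmodels` packages purely for illustration and is not the estimation performed in the study.

```python
# Two-step illustration of a risk-return check on (simulated) log returns:
# (1) fit a GARCH(1,1) to obtain conditional volatility,
# (2) regress returns on that volatility (a rough stand-in for a GARCH-M mean equation).
import numpy as np
import statsmodels.api as sm
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=6, size=1500)        # simulated daily percentage log returns

garch = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = garch.fit(disp="off")
cond_vol = res.conditional_volatility

ols = sm.OLS(returns, sm.add_constant(cond_vol)).fit()
print(ols.params)   # a positive slope would suggest higher risk is rewarded by higher return
```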

  20. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community
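
    A highly simplified sketch of the sampling-based assembly of the analysis components is given below: uncertain inputs for accident frequency, conditional containment failure, and consequences are sampled jointly and propagated to a distribution of risk. The distributions and numbers are placeholders, not NUREG-1150 inputs, and the real analyses used far more structured models and expert-elicited distributions.

```python
# Simplified Monte Carlo propagation through chained risk-analysis components:
# risk = accident frequency x P(containment failure | accident) x consequences of a release.
# All distributions below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                                       # Monte Carlo sample size

freq = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=n)       # core damage frequency (/yr)
p_cf = rng.beta(a=2.0, b=18.0, size=n)                           # conditional containment failure prob.
consequence = rng.lognormal(mean=np.log(300.0), sigma=0.7, size=n)   # consequences per large release

risk = freq * p_cf * consequence                                 # consequence measure per reactor-year

print("mean risk  :", risk.mean())
print("5th / 95th :", np.percentile(risk, [5, 95]))
```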

  1. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which are intended to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The information accuracy depends on the accuracy of measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of the consideration or non-consideration of various influencing factors are discussed, as well as the consequences of the scatter of material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de

  2. Development of new risk based regulations

    International Nuclear Information System (INIS)

    Nielsen, L.

    1999-01-01

    A short presentation of the oil and gas industry in Norway and a brief overview of the regulatory regime in the Norwegian petroleum sector are given. Risk analysis has been performed in Norway since 1981, and the various applications are described. These risk analyses are quite different from a nuclear PSA, and some of these differences are commented on. Risk-based optimisation techniques such as RCM (Reliability Centred Maintenance) and Risk Based Inspection are used in the industry, with very limited support from the risk analysis. Some of the limitations that exist when such techniques are imported from other industries are commented on. The NPD (Norwegian Petroleum Directorate) is revising its regulations, and some of the future plans regarding risk-informed regulatory requirements are presented. (au)

  3. Market analyses of livestock trade networks to inform the prevention of joint economic and epidemiological risks.

    Science.gov (United States)

    Moslonka-Lefebvre, Mathieu; Gilligan, Christopher A; Monod, Hervé; Belloc, Catherine; Ezanno, Pauline; Filipe, João A N; Vergu, Elisabeta

    2016-03-01

    Conventional epidemiological studies of infections spreading through trade networks, e.g., via livestock movements, generally show that central large-size holdings (hubs) should be preferentially surveyed and controlled in order to reduce epidemic spread. However, epidemiological strategies alone may not be economically optimal when costs of control are factored in together with risks of market disruption from targeting core holdings in a supply chain. Using extensive data on animal movements in supply chains for cattle and swine in France, we introduce a method to identify effective strategies for preventing outbreaks with limited budgets while minimizing the risk of market disruptions. Our method involves the categorization of holdings based on position along the supply chain and degree of market share. Our analyses suggest that trade has a higher risk of propagating epidemics through cattle networks, which are dominated by exchanges involving wholesalers, than for swine. We assess the effectiveness of contrasting interventions from the perspectives of regulators and the market, using percolation analysis. We show that preferentially targeting minor, non-central agents can outperform targeting of hubs when the costs to stakeholders and the risks of market disturbance are considered. Our study highlights the importance of assessing joint economic-epidemiological risks in networks underlying pathogen propagation and trade. © 2016 The Authors.
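
    A small sketch of the kind of percolation comparison mentioned above: remove holdings from a trade network according to two different targeting rules and compare the size of the largest connected component that remains. The network here is a synthetic scale-free random graph, not the French cattle or swine data, and the budget is an invented number.

```python
# Percolation-style comparison of two targeting rules on a synthetic trade network
# (a scale-free random graph stands in for the real livestock-movement data).
import networkx as nx

def largest_component_fraction(g, nodes_to_remove):
    h = g.copy()
    h.remove_nodes_from(nodes_to_remove)
    if h.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(h)) / g.number_of_nodes()

g = nx.barabasi_albert_graph(n=2000, m=2, seed=1)
budget = 100                                            # number of holdings we can afford to control

by_degree = sorted(g.nodes, key=lambda n: g.degree(n), reverse=True)
hubs = by_degree[:budget]                               # target central, large holdings
minor = by_degree[-budget:]                             # target peripheral, minor holdings

print("giant component left after targeting hubs :",
      round(largest_component_fraction(g, hubs), 3))
print("giant component left after targeting minor:",
      round(largest_component_fraction(g, minor), 3))
# Hub removal fragments the network more; the paper's point is that the economically
# preferable strategy may differ once market-disruption costs are taken into account.
```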

  4. Market analyses of livestock trade networks to inform the prevention of joint economic and epidemiological risks

    Science.gov (United States)

    Gilligan, Christopher A.; Belloc, Catherine; Filipe, João A. N.; Vergu, Elisabeta

    2016-01-01

    Conventional epidemiological studies of infections spreading through trade networks, e.g. via livestock movements, generally show that central large-size holdings (hubs) should be preferentially surveyed and controlled in order to reduce epidemic spread. However, epidemiological strategies alone may not be economically optimal when costs of control are factored in together with risks of market disruption from targeting core holdings in a supply chain. Using extensive data on animal movements in supply chains for cattle and swine in France, we introduce a method to identify effective strategies for preventing outbreaks with limited budgets while minimizing the risk of market disruptions. Our method involves the categorization of holdings based on position along the supply chain and degree of market share. Our analyses suggest that trade has a higher risk of propagating epidemics through cattle networks, which are dominated by exchanges involving wholesalers, than for swine. We assess the effectiveness of contrasting interventions from the perspectives of regulators and the market, using percolation analysis. We show that preferentially targeting minor, non-central agents can outperform targeting of hubs when the costs to stakeholders and the risks of market disturbance are considered. Our study highlights the importance of assessing joint economic–epidemiological risks in networks underlying pathogen propagation and trade. PMID:26984191

  5. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  6. Insights from the analyses of risk-informed extension of diesel generator allowed outage time

    International Nuclear Information System (INIS)

    Lin, J.C.; He Wei

    2005-01-01

    In recent years, many U.S. nuclear plants have applied and received approval for the risk-informed extension of the Allowed Outage Time (AOT) for Emergency Diesel Generators (EDGs). These risk-informed applications need to meet the regulatory guidance on the risk criteria. This paper discusses in detail insights derived from the risk-informed analyses performed to support these applications. The risk criteria on ΔCDF/ΔLERF evaluate the increase in average risk by extending the AOT for EDGs, induced primarily by an increase in EDG maintenance unavailability due to the introduction of additional EDG preventive maintenance. By performing this preventive maintenance work on-line, the outage duration can be shortened. With proper refinement of the risk model, most plants can meet the ΔCDF/ΔLERF criteria for extending the EDG AOT from, for example, 3 days to 14 days. The key areas for model enhancements to meet these criteria include offsite/onsite power recovery, LERF modeling, etc. The most important LERF model enhancements consist of refinement of the penetrations included in the containment isolation model for the consideration of a large release, and taking credit for operator vessel depressurization during the time period between core damage and vessel failure. A recent study showed that although the frequency of loss of offsite power (LOSP) has decreased, the duration of offsite power recovery has actually increased. However, many of the events used to derive this conclusion may not be applicable to PRAs. One approach develops the offsite power non-recovery factor by first screening the LOSP events for applicability to the plant being analyzed, power operation, and LOSP initiating event, then using the remaining events data for the derivation based on the fraction of events with recovery duration longer than the time window allowed. The risk criteria on ICCDP/ICLERP examine the increase in risk from the average CDF/LERF, based on the increased maintenance
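
    The single-configuration criterion mentioned at the end (ICCDP/ICLERP) is conventionally computed as the increase in conditional core damage frequency multiplied by the duration of the configuration. The sketch below shows the arithmetic with invented placeholder frequencies; it is not data from any of the applications discussed.

```python
# Illustration of the incremental conditional core damage probability (ICCDP)
# for an extended EDG allowed outage time; all frequencies are invented placeholders.
HOURS_PER_YEAR = 8760.0

cdf_baseline = 2.0e-5          # average CDF with the EDG available (/yr)
cdf_edg_out = 9.0e-5           # conditional CDF with one EDG out of service (/yr)

def iccdp(aot_hours):
    return (cdf_edg_out - cdf_baseline) * aot_hours / HOURS_PER_YEAR

for aot in (72, 14 * 24):      # 3-day vs 14-day allowed outage time
    print(f"AOT = {aot:4d} h  ->  ICCDP = {iccdp(aot):.2e}")
# Risk-informed applications compare ICCDP against small acceptance guidelines
# (on the order of 1e-6 for a single planned configuration).
```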

  7. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser
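
    The rocking-curve fitting step can be reproduced with standard curve fitting. The sketch below fits a symmetric Pearson type VII profile to synthetic rocking-curve data with SciPy; the profile parameters and angular units are invented for the example and are not taken from the experiment.

```python
# Fitting a symmetric Pearson type VII function to (synthetic) analyser rocking-curve data.
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(theta, amplitude, centre, hwhm, m):
    """Symmetric Pearson VII profile; m = 1 gives a Lorentzian, large m approaches a Gaussian."""
    return amplitude * (1.0 + ((theta - centre) / hwhm) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# Synthetic rocking curve (parameters invented), with a little added noise.
theta = np.linspace(-20.0, 20.0, 201)                 # analyser angle (arbitrary angular units)
rng = np.random.default_rng(7)
data = pearson_vii(theta, 1.0, 0.0, 5.0, 1.8) + rng.normal(scale=0.01, size=theta.size)

popt, _ = curve_fit(pearson_vii, theta, data, p0=[1.0, 0.0, 4.0, 1.5])
print("fitted amplitude, centre, HWHM, m:", np.round(popt, 3))
# The fitted curve (and its local slope) is what the two-image phase-retrieval step relies on.
```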

  8. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  9. Risk-based safety indicators

    International Nuclear Information System (INIS)

    Sedlak, J.

    2001-12-01

    The report is structured as follows: 1. Risk-based safety indicators: Typology of risk-based indicators (RBIs); Tools for defining RBIs; Requirements for the PSA model; Data sources for RBIs; Types of risks monitored; RBIs and operational safety indicators; Feedback from operating experience; PSA model modification for RBIs; RBI categorization; RBI assessment; RBI applications; Suitable RBI applications. 2. Proposal for risk-based indicators: Acquiring information from operational experience; Method of acquiring safety relevance coefficients for the systems from a PSA model; Indicator definitions; On-line indicators. 3. Annex: Application of RBIs worldwide. (P.A.)

  10. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  11. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer : Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    NARCIS (Netherlands)

    Khankari, Nikhil K.; Shu, Xiao Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Eeles, Rosalind A.; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei; Blalock, Kendra; Campbell, Peter T.; Casey, Graham; Conti, David V.; Edlund, Christopher K.; Figueiredo, Jane; James Gauderman, W.; Gong, Jian; Green, Roger C.; Harju, John F.; Harrison, Tabitha A.; Jacobs, Eric J.; Jenkins, Mark A.; Jiao, Shuo; Li, Li; Lin, Yi; Manion, Frank J.; Moreno, Victor; Mukherjee, Bhramar; Raskin, Leon; Schumacher, Fredrick R.; Seminara, Daniela; Severi, Gianluca; Stenzel, Stephanie L.; Thomas, Duncan C.; Hopper, John L.; Southey, Melissa C.; Makalic, Enes; Schmidt, Daniel F.; Fletcher, Olivia; Peto, Julian; Gibson, Lorna; dos Santos Silva, Isabel; Ahsan, Habib; Whittemore, Alice; Waisfisz, Quinten; Meijers-Heijboer, Hanne; Adank, Muriel; van der Luijt, Rob B.; Uitterlinden, Andre G.; Hofman, Albert; Meindl, Alfons; Schmutzler, Rita K.; Müller-Myhsok, Bertram; Lichtner, Peter; Nevanlinna, Heli; Muranen, Taru A.; Aittomäki, Kristiina; Blomqvist, Carl; Chang-Claude, Jenny; Hein, Rebecca; Dahmen, Norbert; Beckman, Lars; Crisponi, Laura; Hall, Per; Czene, Kamila; Irwanto, Astrid; Liu, Jianjun; Easton, Douglas F.; Turnbull, Clare; Rahman, Nazneen; Eeles, Rosalind; Kote-Jarai, Zsofia; Muir, Kenneth; Giles, Graham; Neal, David; Donovan, Jenny L.; Hamdy, Freddie C.; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher; Schumacher, Fred; Travis, Ruth; Riboli, Elio; Hunter, David; Gapstur, Susan; Berndt, Sonja; Chanock, Stephen; Han, Younghun; Su, Li; Wei, Yongyue; Hung, Rayjean J.; Brhane, Yonathan; McLaughlin, John; Brennan, Paul; McKay, James D.; Rosenberger, Albert; Houlston, Richard S.; Caporaso, Neil; Teresa Landi, Maria; Heinrich, Joachim; Wu, Xifeng; Ye, Yuanqing; Christiani, David C.

    2016-01-01

    Background: Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using

  12. Risk based limits for Operational Safety Requirements

    International Nuclear Information System (INIS)

    Cappucci, A.J. Jr.

    1993-01-01

    OSR limits are designed to protect the assumptions made in the facility safety analysis in order to preserve the safety envelope during facility operation. Normally, limits are set based on ''worst case conditions'' without regard to the likelihood (frequency) of a credible event occurring. In special cases where the accident analyses are based on ''time at risk'' arguments, it may be desirable to control the time at which the facility is at risk. A methodology has been developed to use OSR limits to control the source terms and the times these source terms would be available, thus controlling the acceptable risk to a nuclear process facility. The methodology defines a new term ''gram-days''. This term represents the area under a source term (inventory) vs time curve which represents the risk to the facility. Using the concept of gram-days (normalized to one year) allows the use of an accounting scheme to control the risk under the inventory vs time curve. The methodology results in at least three OSR limits: (1) control of the maximum inventory or source term, (2) control of the maximum gram-days for the period based on a source term weighted average, and (3) control of the maximum gram-days at the individual source term levels. Basing OSR limits on risk based safety analysis is feasible, and a basis for development of risk based limits is defensible. However, monitoring inventories and the frequencies required to maintain facility operation within the safety envelope may be complex and time consuming
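
    The ''gram-days'' bookkeeping described above amounts to integrating the source-term inventory over time and comparing the running total with an annual allowance. A minimal sketch, with invented inventory values and an invented limit, is shown below.

```python
# Minimal sketch of tracking ''gram-days'': the area under the inventory-versus-time
# curve, accumulated over the year and compared against an administrative limit.
# Inventory values and the limit are invented placeholders.
import numpy as np

days = np.array([0, 30, 60, 120, 180, 365], dtype=float)        # time points (days)
inventory = np.array([40, 40, 120, 120, 60, 60], dtype=float)   # source term present (grams)

gram_days = np.trapz(inventory, days)         # area under the inventory vs. time curve
GRAM_DAY_LIMIT = 40_000.0                     # hypothetical annual allowance

print(f"accumulated gram-days = {gram_days:.0f} (limit {GRAM_DAY_LIMIT:.0f})")
print("within OSR limit" if gram_days <= GRAM_DAY_LIMIT else "limit exceeded")
```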

  13. Cost/benefit and risk/benefit analyses in LMFBR program planning

    International Nuclear Information System (INIS)

    Brewer, S.T.; Benson, R.A.; Palmer, R.S.

    1978-01-01

    The subject is discussed under the following headings: incentives analyses, uranium availability, electrical demand, the present value of future savings, alternatives to the breeder, environmental considerations, development program risks, results and conclusions. (U.K.)

  14. Not all risks are created equal: A twin study and meta-analyses of risk taking across seven domains.

    Science.gov (United States)

    Wang, X T Xiao-Tian; Zheng, Rui; Xuan, Yan-Hua; Chen, Jie; Li, Shu

    2016-11-01

    Humans routinely deal with both traditional and novel risks. Different kinds of risks have been a driving force for both evolutionary adaptations and personal development. This study explored the genetic and environmental influences on human risk taking in different task domains. Our approach was threefold. First, we integrated several scales of domain-specific risk-taking propensity and developed a synthetic scale, including both evolutionarily typical and modern risks in the following 7 domains: cooperation/competition, safety, reproduction, natural/physical risk, moral risk, financial risk, and gambling. Second, we conducted a twin study using the scale to estimate the contributions of genes and environment to risk taking in each of these 7 domains. Third, we conducted a series of meta-analyses of extant twin studies across the 7 risk domains. The results showed that individual differences in risk-taking propensity and its consistency across domains were mainly regulated by additive genetic influences and individually unique environmental experiences. The heritability estimates from the meta-analyses ranged from 29% in financial risk taking to 55% in safety. Supporting the notion of risk-domain specificity, both the behavioral and genetic correlations among the 7 domains were generally low. Among the relatively few correlations between pairs of risk domains, our analysis revealed a common genetic factor that regulates moral, financial, and natural/physical risk taking. This is the first effort to separate genetic and environmental influences on risk taking across multiple domains in a single study and integrate the findings of extant twin studies via a series of meta-analyses conducted in different task domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
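
    For orientation, the variance decomposition behind heritability estimates like those quoted above can be sketched with Falconer's classical formulas based on twin correlations. The correlations used here are invented, and the study itself relied on formal biometric model fitting and meta-analysis rather than this shortcut.

```python
# Falconer's approximation for a classical twin design (illustrative only; the correlations
# are invented, and published estimates come from formal model fitting, not this shortcut).
r_mz = 0.52     # correlation of risk-taking scores in monozygotic twin pairs
r_dz = 0.28     # correlation in dizygotic twin pairs

a2 = 2 * (r_mz - r_dz)          # additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz            # shared-environment variance
e2 = 1 - r_mz                   # non-shared environment plus measurement error

print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```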

  15. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....
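
    The normalization idea can be sketched in a few lines: divide the total sequence yield by the average genome size to obtain "genome equivalents", then express marker-gene copies per genome equivalent so that communities with different average genome sizes become comparable. The numbers below are invented and the calculation is a simplified stand-in for the published method.

```python
# Sketch of average-genome-size (AGS) normalization for a metagenome (invented numbers).
# The fraction of genomes carrying a trait is estimated as gene copies per genome equivalent.
total_bases = 2.0e9            # total sequenced bases in the metagenome
avg_genome_size = 3.5e6        # estimated average genome size of the community (bases)

gene_length = 1.2e3            # length of the marker gene for the trait (bases)
bases_mapped_to_gene = 4.1e5   # metagenome bases mapping to that gene

genome_equivalents = total_bases / avg_genome_size
gene_copies = bases_mapped_to_gene / gene_length

print(f"~{100 * gene_copies / genome_equivalents:.1f}% of genomes carry the trait")
```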

  16. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention refers to a process for carrying out analyses based on concurrent reactions. A part of the compound to be analysed is subjected, together with a standard quantity of this compound in labelled form, to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two parts of the reacting compound. The parts of the labelled reaction compound and the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. As an example of the applications of the method, the insulin concentration of a defined serum is measured (radioimmunoassay).

  17. Risk-based decisionmaking (Panel)

    Energy Technology Data Exchange (ETDEWEB)

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, explore the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  18. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved).

  19. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for analysis of practice, such a model is meant to support the development of vocational teachers’ professionalism at courses and in organizational contexts...

  20. Cancer risks, risk-cost-benefit analyses, and the scientific method

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1995-01-01

    Two main changes in risk analysis are increasingly beginning to influence the manner in which, in the perception of scientists, low-dose modeling of radiation carcinogenesis is supposed to be done. In the past, efforts to model radiation risks have been carried out under the banner of scientific endeavors. On closer inspection, however, it has become obvious that these efforts were not guided by the scientific method and that a change in approach is needed. We realize increasingly that risk analysis is not done in a vacuum and that any action taken due to the result of the analysis not only has a benefit in the form of a risk reduction but leads inevitably to an increase in cost and an increase in the risks of persons effecting the benefit. Thus, a risk-cost-benefit analysis should be done and show a clear-cut net benefit before a remedial action is taken

  1. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    Science.gov (United States)

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.
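
    The added value described above — feeding Hazus loss estimates into the LUPM to obtain losses avoided and a return on mitigation investment — reduces to simple arithmetic once the loss estimates exist. The figures below are invented for illustration and are not from the San Francisco Bay area demonstration.

```python
# Illustrative calculation of losses avoided and return on mitigation investment,
# of the kind enabled by feeding Hazus loss estimates into the LUPM (figures invented).
loss_no_mitigation = 850.0e6      # estimated scenario loss without the mitigation policy ($)
loss_with_mitigation = 610.0e6    # estimated scenario loss if the policy is implemented ($)
mitigation_cost = 120.0e6         # up-front cost of the mitigation policy ($)
event_probability = 0.10          # chance the scenario event occurs within the planning horizon

losses_avoided = loss_no_mitigation - loss_with_mitigation
expected_benefit = event_probability * losses_avoided
roi = (expected_benefit - mitigation_cost) / mitigation_cost

print(f"losses avoided if the event occurs: ${losses_avoided/1e6:.0f} M")
print(f"expected return on mitigation investment: {100 * roi:.0f}%")
```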

  2. Risk analysis for decision support in electricity distribution system asset management: methods and frameworks for analysing intangible risks

    Energy Technology Data Exchange (ETDEWEB)

    Nordgaard, Dag Eirik

    2010-04-15

    During the last 10 to 15 years electricity distribution companies throughout the world have been ever more focused on asset management as the guiding principle for their activities. Within asset management, risk is a key issue for distribution companies, together with handling of cost and performance. There is now an increased awareness of the need to include risk analyses into the companies' decision making processes. Much of the work on risk in electricity distribution systems has focused on aspects of reliability. This is understandable, since it is surely an important feature of the product delivered by the electricity distribution infrastructure, and it is high on the agenda for regulatory authorities in many countries. However, electricity distribution companies are also concerned with other risks relevant for their decision making. This typically involves intangible risks, such as safety, environmental impacts and company reputation. In contrast to the numerous methodologies developed for reliability risk analysis, there are relatively few applications of structured analyses to support decisions concerning intangible risks, even though they represent an important motivation for decisions taken in electricity distribution companies. The overall objective of this PhD work has been to explore risk analysis methods that can be used to improve and support decision making in electricity distribution system asset management, with an emphasis on the analysis of intangible risks. The main contributions of this thesis can be summarised as: An exploration and testing of quantitative risk analysis (QRA) methods to support decisions concerning intangible risks; The development of a procedure for using life curve models to provide input to QRA models; The development of a framework for risk-informed decision making where QRA are used to analyse selected problems; In addition, the results contribute to clarify the basic concepts of risk, and highlight challenges

  3. What Risk Assessments of Genetically Modified Organisms Can Learn from Institutional Analyses of Public Health Risks

    Directory of Open Access Journals (Sweden)

    S. Ravi Rajan

    2012-01-01

    Full Text Available The risks of genetically modified organisms (GMOs are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large.

  4. What risk assessments of genetically modified organisms can learn from institutional analyses of public health risks.

    Science.gov (United States)

    Rajan, S Ravi; Letourneau, Deborah K

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large.

  5. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Science.gov (United States)

    2012-08-30

    ...'' framework that includes (1) Risk-based capital requirements for credit risk, market risk, and operational... default and credit quality migration risk for non-securitization credit products. With respect to... securitization positions, the revisions assign a specific risk- weighting factor based on the credit rating of a...

  6. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Science.gov (United States)

    2011-01-11

    ... ``three-pillar'' framework that includes (i) risk-based capital requirements for credit risk, market risk... incremental risk capital requirement to capture default and credit quality migration risk for non... (advanced approaches rules) (collectively, the credit risk capital rules) \\8\\ by requiring any bank subject...

  7. 76 FR 4278 - Notice of Availability of Pest Risk Analyses for the Importation of Fresh Edible Flowers of Izote...

    Science.gov (United States)

    2011-01-25

    ...] Notice of Availability of Pest Risk Analyses for the Importation of Fresh Edible Flowers of Izote... prepared pest risk analyses that evaluate the risks associated with the importation into the continental... risks of introducing or disseminating plant pests or noxious weeds via the importation of fresh edible...

  8. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were available as jpegs, pdfs, shapefiles and Google Earth KML files, and were also available on a variety of websites including Geoplatform and ERMA. From the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, the complete archive is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html. SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager’s primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  9. The importance of probabilistic evaluations in connection with risk analyses according to technical safety laws

    International Nuclear Information System (INIS)

    Mathiak, E.

    1984-01-01

    The nuclear energy sector exemplifies the essential importance of the practical application of probabilistic evaluations (e.g. probabilistic reliability analyses) in connection with the legal risk assessment of technical systems and installations. The study makes use of a triad risk analysis and tries to reconcile the natural-science and legal points of view. Without changing the definitions of 'risk' and 'hazard' in their legal sense, the publication discusses their reconciliation with the laws of natural science and their interpretation and application in view of the latter. (HSCH) [de]

  10. Sandia Transportation Technical Environmental Information Center and its application to transportation risk analyses

    International Nuclear Information System (INIS)

    Foley, J.T.; Davidson, C.A.; McClure, J.D.

    1978-01-01

    The purpose of this paper is to describe an applied research activity which is fundamental to the conduct of transportation analyses: the collection, analysis, storage, and retrieval of information on the intensities of technical environments. The paper describes the collection system which provides such a service to official researchers in transportation analysis, and the applications of this information in the area of risk analysis

  11. Conducting Meta-Analyses Based on p Values

    Science.gov (United States)

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466

  12. PC based 8K multichannel analyser for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Jain, S.K.; Gupta, J.D.; Suman Kumari, B.

    1989-01-01

    An IBM-PC based 8K multichannel analyser (MCA) has been developed which incorporates all the features of an advanced system, such as very high throughput for data acquisition in PHA as well as MCS modes, fast real-time display, extensive display manipulation facilities, various preset controls and concurrent data processing. The compact system hardware consists of a 2 bit wide NIM module and a PC add-on card. Because of the external acquisition hardware, the system, after initial programming by the PC, can acquire data independently, allowing the PC to be switched off. To attain very high throughput, the most desirable feature of an MCA, a dual-port memory architecture has been used. The asymmetric dual-port RAM, housed in the NIM module, offers 24-bit parallel access to the ADC and 8-bit wide access to the PC, which results in a fast real-time histogram display on the monitor. The PC emulation software is menu driven and user friendly. It integrates a comprehensive set of commonly required application routines for concurrent data processing. After transfer of the know-how to the Electronics Corporation of India Ltd. (ECIL), this system is being produced at ECIL. (author). 5 refs., 4 figs

  13. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-01-01

    Analyser-based imaging (ABI) is one of the several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment. (paper)

  14. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system level analyses. It was also determined that temperatures were higher in the fins than the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  15. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.

  16. A case against bio markers as they are currently used in radioecological risk analyses: a problem of linkage

    International Nuclear Information System (INIS)

    Hinton, T.G.; Brechignac, F.

    2005-01-01

    Bio-markers are successfully used in human risk analyses as early indicators of contaminant exposure and predictors of deleterious effects. This has boosted the search for bio-markers in determining ecological risks to non-human biota, and particularly for assessments related to radioactive contaminants. There are difficulties, however, that prevent an easy transfer of the bio-marker concept from humans to non-human biota, as there are significant differences in endpoints of concern, units of observation and dose response relationships between human and ecological risk analyses. The use of bio-markers in ecological risk analyses currently lacks a linkage between molecular-level effects and quantifiable impacts observed in individuals and populations. This is important because ecological risk analyses generally target the population level of biological organisation. We highlight various examples that demonstrate the difficulties of linking individual responses to population-level impacts, such as indirect effects and compensatory interactions. Eco-toxicologists cope with such difficulties through the use of uncertainty or extrapolation factors. Extrapolation factors (EF) typically range from 1 to 1000 when linking effects observed in individuals to those predicted to occur in populations. We question what magnitude of EF will be required when going from a molecular level effect, measured by a bio-marker, all the way up to the population level of biological organisation. Particularly, we stress that a successful application of bio-markers to radioecological risk assessment can only be achieved once the connection has been made between changes in individual resource allocation-based life histories and population dynamics. This clearly emphasises the need to quantify the propagation of molecular and cellular level effects to higher levels of biological organisation, especially in the long-term via several generations of exposure. Finally, we identify pertinent research

  17. Predicting Geomorphic and Hydrologic Risks after Wildfire Using Harmonic and Stochastic Analyses

    Science.gov (United States)

    Mikesell, J.; Kinoshita, A. M.; Florsheim, J. L.; Chin, A.; Nourbakhshbeidokhti, S.

    2017-12-01

    Wildfire is a landscape-scale disturbance that often alters hydrological processes and sediment flux during subsequent storms. Vegetation loss from wildfire induces changes to sediment supply, such as channel erosion and sedimentation, and to streamflow magnitude and flooding. These changes enhance downstream hazards, threatening human populations and physical aquatic habitat over various time scales. Using Williams Canyon, a basin burned by the Waldo Canyon Fire (2012), as a case study, we utilize deterministic and statistical modeling methods (Fourier series and first-order Markov chain) to assess pre- and post-fire geomorphic and hydrologic characteristics, including precipitation, enhanced vegetation index (EVI, a satellite-based proxy of vegetation biomass), streamflow, and sediment flux. Local precipitation, terrestrial Light Detection and Ranging (LiDAR) scanning, and satellite-based products are used for these time series analyses. We present a framework to assess variability of periodic and nonperiodic climatic and multivariate trends to inform development of a post-wildfire risk assessment methodology. To establish the extent to which a wildfire affects hydrologic and geomorphic patterns, a Fourier series was used to fit pre- and post-fire geomorphic and hydrologic characteristics to yearly temporal cycles and subcycles of 6, 4, 3, and 2.4 months. These cycles were analyzed using least-squares estimates of the harmonic coefficients, or amplitudes, of each subcycle's contribution to the overall behavior of the Fourier series. The stochastic variances of these characteristics were analyzed by composing first-order Markov models and probabilistic analysis through direct likelihood estimates. Preliminary results highlight an increased dependence of monthly post-fire hydrologic characteristics on 12- and 6-month temporal cycles. This statistical and probabilistic analysis provides a basis to determine the impact of wildfires on the temporal dependence of
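
    The abstract above describes fitting yearly and sub-yearly harmonics by least squares. As an illustration only, the sketch below (Python with NumPy; the synthetic monthly series and variable names are invented, not the study's data) shows how harmonic coefficients and amplitudes for 12-, 6-, 4-, 3- and 2.4-month cycles can be estimated.

```python
# Hedged sketch: least-squares fit of annual harmonics to a monthly series,
# in the spirit of the Fourier analysis described above. Data are synthetic.
import numpy as np

months = np.arange(60)                            # five years of monthly observations
periods = np.array([12.0, 6.0, 4.0, 3.0, 2.4])    # cycle lengths in months

# Build the harmonic design matrix: a constant plus cos/sin pairs per cycle.
cols = [np.ones_like(months, dtype=float)]
for p in periods:
    cols.append(np.cos(2 * np.pi * months / p))
    cols.append(np.sin(2 * np.pi * months / p))
X = np.column_stack(cols)

rng = np.random.default_rng(0)
y = 3.0 + 1.5 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 0.3, months.size)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)      # harmonic coefficients
amplitudes = np.hypot(coef[1::2], coef[2::2])     # amplitude of each cycle
for p, a in zip(periods, amplitudes):
    print(f"{p:4.1f}-month cycle amplitude: {a:.2f}")
```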

  18. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  19. 75 FR 6344 - Notice of Availability of Pest Risk Analyses for Importation of Fresh Figs, Pomegranates, and...

    Science.gov (United States)

    2010-02-09

    ...] Notice of Availability of Pest Risk Analyses for Importation of Fresh Figs, Pomegranates, and Baby Kiwi...: Notice. SUMMARY: We are advising the public that we have prepared two pest risk analyses, one with... the risks of introducing or disseminating plant pests or noxious weeds via the importation of figs...

  20. Night shift work and breast cancer risk: what do the meta-analyses tell us?

    Science.gov (United States)

    Pahwa, Manisha; Labrèche, France; Demers, Paul A

    2018-05-22

    Objectives This paper aims to compare results, assess the quality, and discuss the implications of recently published meta-analyses of night shift work and breast cancer risk. Methods A comprehensive search was conducted for meta-analyses published from 2007-2017 that included at least one pooled effect size (ES) for breast cancer associated with any night shift work exposure metric and were accompanied by a systematic literature review. Pooled ES from each meta-analysis were ascertained with a focus on ever/never exposure associations. Assessments of heterogeneity and publication bias were also extracted. The AMSTAR 2 checklist was used to evaluate quality. Results Seven meta-analyses, published from 2013-2016, collectively included 30 cohort and case-control studies spanning 1996-2016. Five meta-analyses reported pooled ES for ever/never night shift work exposure; these ranged from 0.99 [95% confidence interval (CI) 0.95-1.03, N=10 cohort studies) to 1.40 (95% CI 1.13-1.73, N=9 high quality studies). Estimates for duration, frequency, and cumulative night shift work exposure were scant and mostly not statistically significant. Meta-analyses of cohort, Asian, and more fully-adjusted studies generally resulted in lower pooled ES than case-control, European, American, or minimally-adjusted studies. Most reported statistically significant between-study heterogeneity. Publication bias was not evident in any of the meta-analyses. Only one meta-analysis was strong in critical quality domains. Conclusions Fairly consistent elevated pooled ES were found for ever/never night shift work and breast cancer risk, but results for other shift work exposure metrics were inconclusive. Future evaluations of shift work should incorporate high quality meta-analyses that better appraise individual study quality.

  1. Meta-analyses: does long-term PPI use increase the risk of gastric premalignant lesions?

    Science.gov (United States)

    Eslami, Layli; Nasseri-Moghaddam, Siavosh

    2013-08-01

    Proton pump inhibitors (PPIs) are the most effective agents available for reducing acid secretion. They are used for medical treatment of various acid-related disorders. PPIs are used extensively and for extended periods of time in gastroesophageal reflux disease (GERD). A troublesome issue regarding maintenance therapy has been the propensity of PPI-treated patients to develop chronic atrophic gastritis while on therapy, which could theoretically lead to an increased incidence of gastric cancer. In addition, animal studies have raised concern for development of enterochromaffin-like cell hyperplasia and carcinoid tumors in the stomachs of mice receiving high-dose PPIs. Current literature does not provide a clear-cut conclusion on the subject and the reports are sometimes contradictory. Therefore, this study is a systematic review of the available literature to address the safety of long-term PPI use and its relation to the development of malignant/premalignant gastric lesions. A literature search of biomedical databases was performed. The reference lists of retrieved articles were reviewed to further identify relevant trials. We hand-searched the abstracts of the American Digestive Disease Week (DDW) and the United European Gastroenterology Week (UEGW) from 1995 to 2013. Only randomized clinical trials (RCTs) that used PPIs as the primary treatment for at least six months versus no treatment, placebo, antacid or anti-reflux surgery (ARS) were included. Two reviewers independently extracted the data. Discrepancies in the interpretation were resolved by consensus. All analyses of outcomes were based on the intention-to-treat principle. We performed statistical analysis using Review Manager software. The effect measure of choice was relative risk (RR) for dichotomous data. Six RCTs with a total of 785 patients met the inclusion criteria. Two multicenter RCTs compared esomeprazole with placebo. One RCT compared omeprazole with ARS. Two RCTs compared omeprazole with

  2. GPM Rainfall-Based Streamflow Analyses for East Africa

    Science.gov (United States)

    Blankenship, Clay B.; Limaye, Ashutosh S.; Mitheu, Faith

    2016-01-01

    SERVIR is a joint project of NASA and the US Agency for International Development (USAID). Its mission is to use satellite data and geospatial technology to help developing countries manage resources, land use, and climate risks. The name means 'to serve' in Spanish.

  3. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  4. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  5. Risk-based performance indicators

    International Nuclear Information System (INIS)

    Azarm, M.A.; Boccio, J.L.; Vesely, W.E.; Lofgren, E.

    1987-01-01

    The purpose of risk-based indicators is to monitor plant safety. Safety is measured by monitoring the potential for core melt (core-melt frequency) and the public risk. Targets for these measures can be set consistent with NRC safety goals. In this process, the performance of safety systems, support systems, major components, and initiating events can be monitored using measures such as unavailability, failure or occurrence frequency. The changes in performance measures and their trends are determined from the time behavior of monitored measures by differentiating between stochastic and actual variations. Therefore, degradation, as well as improvement in the plant safety performance, can be determined. The development of risk-based performance indicators will also provide the means to trace a change in the safety measures to specific problem areas which are amenable to root cause analysis and inspection audits. In addition, systematic methods will be developed to identify specific improvement policies using the plant information system for the identified problem areas. The final product of the performance indicator project will be a methodology, and an integrated and validated set of software packages which, if properly interfaced with the logic model software of a plant, can monitor the plant performance as plant information is provided as input

  6. Designing and evaluating risk-based surveillance systems

    DEFF Research Database (Denmark)

    Willeberg, Preben; Nielsen, Liza Rosenbaum; Salman, Mo

    2012-01-01

    Risk-based surveillance systems reveal occurrence of disease or infection in a sample of population units, which are selected on the basis of risk factors for the condition under study. The purpose of such systems for supporting practical animal disease policy formulations and management decisions...... with prudent use of resources while maintaining acceptable system performance. High-risk category units are selected for testing by identification of the presence of specific high-risk factor(s), while disregarding other factors that might also influence the risk. On this basis we argue that the most...... applicable risk estimate for use in designing and evaluating a risk-based surveillance system would be a crude (unadjusted) relative risk, odds ratio or apparent prevalence. Risk estimates found in the published literature, however, are often the results of multivariable analyses implicitly adjusting...

  7. Serum Lipid Profiles and Cancer Risk in the Context of Obesity: Four Meta-Analyses

    International Nuclear Information System (INIS)

    Melvin, J. C.; Holmberg, L.; Hemelrijck, M. V.

    2013-01-01

    The objective here was to summarize the evidence for, and quantify the link between, serum markers of lipid metabolism and risk of obesity-related cancers. PubMed and Embase were searched using predefined inclusion criteria to conduct meta-analyses on the association between serum levels of TG, TC, HDL, ApoA-I, and risk of 11 obesity-related cancers. Pooled relative risks (RRs) and 95% confidence intervals were estimated using random-effects analyses. Twenty-eight studies were included. Associations between abnormal lipid components and risk of obesity-related cancers when using clinical cut points (TC ≥ 6.50; TG ≥ 1.71; HDL ≤ 1.03; ApoA-I ≤ 1.05 mmol/L) were apparent in all models. RRs were 1.18 (95% CI: 1.08-1.29) for TC, 1.20 (1.07-1.35) for TG, 1.15 (1.01-1.32) for HDL, and 1.42 (1.17-1.74) for ApoA-I. High levels of TC and TG, as well as low levels of HDL and ApoA-I, were consistently associated with increased risk of obesity-related cancers. The modest RRs suggest serum lipids to be associated with the risk of cancer, but indicate it is likely that other markers of the metabolism and/or lifestyle factors may also be involved. Future intervention studies involving lifestyle modification would provide insight into the potential biological role of lipid metabolism in tumorigenesis.
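
    The pooled relative risks quoted above come from random-effects meta-analysis. As a hedged illustration, the sketch below implements one common random-effects estimator (DerSimonian-Laird) in Python with NumPy; the study RRs and confidence intervals are invented, not taken from the included studies.

```python
# Hedged sketch of random-effects pooling of relative risks (DerSimonian-Laird),
# the kind of calculation behind pooled RRs. The inputs are made up for illustration.
import numpy as np

rr = np.array([1.10, 1.25, 0.95, 1.35])            # hypothetical study RRs
ci_low = np.array([0.90, 1.05, 0.80, 1.10])
ci_high = np.array([1.34, 1.49, 1.13, 1.66])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from the 95% CI width

w_fixed = 1 / se**2
q = np.sum(w_fixed * (log_rr - np.average(log_rr, weights=w_fixed))**2)
df = len(rr) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                          # between-study variance

w_rand = 1 / (se**2 + tau2)
pooled = np.average(log_rr, weights=w_rand)
pooled_se = np.sqrt(1 / np.sum(w_rand))
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-{np.exp(pooled + 1.96 * pooled_se):.2f})")
```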

  8. Risk based surveillance for vector borne diseases

    DEFF Research Database (Denmark)

    Bødker, Rene

    of samples and hence early detection of outbreaks. Models for vector borne diseases in Denmark have demonstrated dramatic variation in outbreak risk during the season and between years. The Danish VetMap project aims to make these risk based surveillance estimates available on the veterinarians smart phones...... in Northern Europe. This model approach may be used as a basis for risk based surveillance. In risk based surveillance limited resources for surveillance are targeted at geographical areas most at risk and only when the risk is high. This makes risk based surveillance a cost effective alternative...... sample to a diagnostic laboratory. Risk based surveillance models may reduce this delay. An important feature of risk based surveillance models is their ability to continuously communicate the level of risk to veterinarians and hence increase awareness when risk is high. This is essential for submission...

  9. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  10. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  11. Risk factors for headache in the UK military: cross-sectional and longitudinal analyses.

    Science.gov (United States)

    Rona, Roberto J; Jones, Margaret; Goodwin, Laura; Hull, Lisa; Wessely, Simon

    2013-05-01

    To assess the importance of service demographic, mental disorders, and deployment factors on headache severity and prevalence, and to assess the impact of headache on functional impairment. There is no information on prevalence and risk factors of headache in the UK military. Recent US reports suggest that deployment, especially a combat role, is associated with headache. Such an association may have serious consequences on personnel during deployment. A survey was carried out between 2004 and 2006 (phase 1) and again between 2007 and 2009 (phase 2) of randomly selected UK military personnel to study the health consequences of the Iraq and Afghanistan wars. This study is based on those who participated in phase 2 and includes cross-sectional and longitudinal analyses. Headache severity in the last month and functional impairment at phase 2 were the main outcomes. Forty-six percent complained of headache in phase 2, half of whom endorsed moderate or severe headache. Severe headache was strongly associated with probable post-traumatic stress disorder (multinomial odds ratio [MOR] 9.6, 95% confidence interval [CI] 6.4-14.2), psychological distress (MOR 6.15, 95% CI 4.8-7.9), multiple physical symptoms (MOR 18.2, 95% CI 13.4-24.6) and self-reported mild traumatic brain injury (MOR 3.5, 95% CI 1.4-8.6) after adjustment for service demographic factors. Mild headache was also associated with these variables but at a lower level. Moderate and severe headache were associated with functional impairment, but the association was partially explained by mental disorders. Mental ill health was also associated with reporting moderate and severe headache at both phase 1 and phase 2. Deployment and a combat role were not associated with headache. Moderate and severe headache are common in the military and have an impact on functional impairment. They are more strongly associated with mental disorders than with mild traumatic brain injury. © 2013 American Headache Society.

  12. Effectiveness of a selective alcohol prevention program targeting personality risk factors: Results of interaction analyses.

    Science.gov (United States)

    Lammers, Jeroen; Goossens, Ferry; Conrod, Patricia; Engels, Rutger; Wiers, Reinout W; Kleinjan, Marloes

    2017-08-01

    To explore whether specific groups of adolescents (i.e., scoring high on personality risk traits, having a lower education level, or being male) benefit more from the Preventure intervention with regard to curbing their drinking behaviour. A clustered randomized controlled trial, with participants randomly assigned to a 2-session coping skills intervention or a control no-intervention condition. Fifteen secondary schools throughout The Netherlands; 7 schools in the intervention and 8 schools in the control condition. 699 adolescents aged 13-15; 343 allocated to the intervention and 356 to the control condition; with drinking experience and elevated scores in either negative thinking, anxiety sensitivity, impulsivity or sensation seeking. Differential effectiveness of the Preventure program was examined for the personality traits group, education level and gender on past-month binge drinking (main outcome), binge frequency, alcohol use, alcohol frequency and problem drinking, at 12months post-intervention. Preventure is a selective school-based alcohol prevention programme targeting personality risk factors. The comparator was a no-intervention control. Intervention effects were moderated by the personality traits group and by education level. More specifically, significant intervention effects were found on reducing alcohol use within the anxiety sensitivity group (OR=2.14, CI=1.40, 3.29) and reducing binge drinking (OR=1.76, CI=1.38, 2.24) and binge drinking frequency (β=0.24, p=0.04) within the sensation seeking group at 12months post-intervention. Also, lower educated young adolescents reduced binge drinking (OR=1.47, CI=1.14, 1.88), binge drinking frequency (β=0.25, p=0.04), alcohol use (OR=1.32, CI=1.06, 1.65) and alcohol use frequency (β=0.47, p=0.01), but not those in the higher education group. Post hoc latent-growth analyses revealed significant effects on the development of binge drinking (β=-0.19, p=0.02) and binge drinking frequency (β=-0.10, p=0

  13. Collaborative development of land use change scenarios for analysing hydro-meteorological risk

    Science.gov (United States)

    Malek, Žiga; Glade, Thomas

    2015-04-01

    Simulating future land use changes remains a difficult task, due to uncontrollable and uncertain driving forces of change. Scenario development emerged as a tool to address these limitations. Scenarios offer the exploration of possible futures and environmental consequences, and enable the analysis of possible decisions. Therefore, there is increasing interest of both decision makers and researchers to apply scenarios when studying future land use changes and their consequences. The uncertainties related to generating land use change scenarios are among others defined by the accuracy of data, identification and quantification of driving forces, and the relation between expected future changes and the corresponding spatial pattern. To address the issue of data and intangible driving forces, several studies have applied collaborative, participatory techniques when developing future scenarios. The involvement of stakeholders can lead to incorporating a broader spectrum of professional values and experience. Moreover, stakeholders can help to provide missing data, improve detail, uncover mistakes, and offer alternatives. Thus, collaborative scenarios can be considered as more reliable and relevant. Collaborative scenario development has been applied to study a variety of issues in environmental sciences on different spatial and temporal scales. Still, these participatory approaches are rarely spatially explicit, making them difficult to apply when analysing changes to hydro-meteorological risk on a local scale. Spatial explicitness is needed to identify potentially critical areas of land use change, leading to locations where the risk might increase. In order to allocate collaboratively developed scenarios of land change, we combined participatory modeling with geosimulation in a multi-step scenario generation framework. We propose a framework able to develop scenarios that are plausible, can overcome data inaccessibility, address intangible and external driving forces

  14. From volatility to value: analysing and managing financial and performance risk in energy savings projects

    International Nuclear Information System (INIS)

    Mills, Evan; Kromer, Steve; Weiss, Gary; Mathew, Paul A.

    2006-01-01

    Many energy-related investments are made without a clear financial understanding of their values, risks, and volatilities. In the face of this uncertainty, the investor, such as a building owner or an energy service company, will often choose to implement only the most certain and thus limited energy-efficiency measures. Conversely, commodities traders and other sophisticated investors accustomed to evaluating investments on a value, risk, and volatility basis often overlook energy-efficiency investments because risk and volatility information are not provided. Fortunately, energy-efficiency investments easily lend themselves to such analysis using tools similar to those applied to supply-side risk management. Accurate and robust analysis demands a high level of understanding of the physical aspects of energy efficiency, which enables the translation of physical performance data into the language of investment. With a risk management analysis framework in place, the two groups, energy-efficiency experts and investment decision-makers, can exchange the information they need to expand investment in demand-side energy projects. In this article, we first present the case for financial risk analysis in energy efficiency in the buildings sector. We then describe techniques and examples of how to identify, quantify, and manage risk. Finally, we describe emerging market-based opportunities in risk management for energy efficiency.

  15. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose-built hardware is described. Except for a small interface module, the system consists of two suites of software, one giving a conventional one-dimensional analysis on a span of 1024 channels, and the other a two-dimensional analysis on a 128 × 128 image format. Using the recently introduced ACCELERATOR coprocessor card, the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)

  16. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
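
    Classification tree analysis, as used above to detect interactions between risk factors, can be illustrated with a small sketch. The code below is only a sketch in Python using scikit-learn's decision tree (the original analyses were not necessarily run with this library); the simulated predictors, the built-in interaction, and all thresholds are invented.

```python
# Hedged sketch: a classification tree predicting disorder onset from risk factors.
# Simulated data, feature names and thresholds are illustrative, not the study's.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.normal(21, 3, n),        # BMI
    rng.uniform(0, 1, n),        # body dissatisfaction score
    rng.uniform(0, 1, n),        # dieting score
])
# Simulated interaction: low BMI matters mainly when dissatisfaction is high.
p = 0.02 + 0.15 * ((X[:, 0] < 19) & (X[:, 1] > 0.7))
y = rng.binomial(1, p)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=["bmi", "dissatisfaction", "dieting"]))
```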

  17. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel-based versions of these transformations. This meant implementing the kernel-based methods and developing new theory, since kernel-based MAF and MNF are not yet described in the literature. The traditional methods only...... have two factors that are useful for segmentation and none of them can be used to segment the two types of meat. The kernel-based methods have a lot of useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. You can see a comparison of the most...... useful factor of PCA and kernel-based PCA respectively in Figure 2. The factor of the kernel-based PCA turned out to be able to segment the two types of meat and in general that factor is much more distinct, compared to the traditional factor. After the orthogonal transformation a simple thresholding...
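
    As a rough illustration of the kernel-based transformations discussed above, the sketch below contrasts ordinary PCA with kernel PCA on synthetic multispectral pixel data using scikit-learn; the kernelised MAF/MNF developed in the study are not part of this library and are not shown.

```python
# Hedged sketch of kernel PCA as a nonlinear alternative to ordinary PCA for
# multispectral pixel data. Data here are synthetic pixels, not ham images.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(2)
pixels = rng.normal(size=(500, 10))          # 500 pixels, 10 spectral bands

pca_scores = PCA(n_components=3).fit_transform(pixels)
kpca_scores = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit_transform(pixels)

# A factor image would be obtained by reshaping one score column back onto the
# image grid and thresholding it to separate the two meat types.
print(pca_scores.shape, kpca_scores.shape)
```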

  18. A methodology for analysing human errors of commission in accident scenarios for risk assessment

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. D.; Park, J. K

    2003-01-01

    As concern over the impact of operators' inappropriate interventions, so-called Errors Of Commission (EOCs), on plant safety has been raised, interest in the identification and analysis of EOC events from the risk assessment perspective has increased accordingly. To this purpose, we propose a new methodology for identifying and analysing human errors of commission that might be caused by failures in situation assessment and decision making during accident progressions given an initiating event. The proposed methodology was applied to the accident scenarios of YGN 3 and 4 NPPs, which resulted in about 10 EOC situations that need careful attention

  19. Application of Markowitz model in analysing risk and return a case study of BSE stock

    Directory of Open Access Journals (Sweden)

    Manas Pandey

    2012-03-01

    In this paper the formation of optimal portfolios using real-life data, subject to two different constraint sets, is attempted. The Markowitz model provides a theoretical framework for the analysis of risk-return choices. Decisions are based on the concept of efficient portfolios. Markowitz portfolio analysis gives as output an efficient frontier on which each portfolio is the highest-return portfolio for a specified level of risk. Investors can thereby reduce their risk and maximize their return from the investment. The Markowitz portfolio selections were obtained by solving the portfolio optimization problems to get maximum total returns, constrained by a minimum allowable risk level. Investors can gain considerable knowledge about how, when, and why to invest in a particular portfolio. The analysis basically calculates the standard deviation and returns for each of the feasible portfolios and identifies the efficient frontier, the boundary of the feasible portfolios of increasing returns.
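
    A minimal sketch of the mean-variance optimisation described above is given below in Python with NumPy and SciPy; the expected returns and covariance matrix are invented for illustration and are not BSE data. For each target return, the minimum-variance fully invested portfolio is found, tracing out the efficient frontier.

```python
# Hedged sketch of Markowitz mean-variance portfolio selection on invented data.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.10, 0.12, 0.15, 0.08])           # assumed expected annual returns
cov = np.array([[0.040, 0.006, 0.010, 0.002],
                [0.006, 0.050, 0.008, 0.003],
                [0.010, 0.008, 0.090, 0.004],
                [0.002, 0.003, 0.004, 0.020]])     # assumed return covariance matrix

def min_variance_weights(target):
    """Minimum-variance long-only portfolio meeting a target expected return."""
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1},
            {"type": "eq", "fun": lambda w, t=target: w @ mu - t}]
    res = minimize(lambda w: w @ cov @ w, np.full(n, 1 / n),
                   bounds=[(0, 1)] * n, constraints=cons)
    return res.x

for target in np.linspace(mu.min(), mu.max(), 5):
    w = min_variance_weights(target)
    print(f"target {target:.3f}: risk {np.sqrt(w @ cov @ w):.3f}, weights {np.round(w, 2)}")
```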

  20. Evaluation of an optoacoustic based gas analysing device

    Science.gov (United States)

    Markmann, Janine; Lange, Birgit; Theisen-Kunde, Dirk; Danicke, Veit; Mayorov, Fedor; Eckert, Sebastian; Kettmann, Pascal; Brinkmann, Ralf

    2017-07-01

    The relative occurrence of volatile organic compounds in human respiratory gas is disease-specific (ppb range). A prototype of a gas analysing device using two tuneable laser systems, an OPO laser (2.5 to 10 μm) and a CO2 laser (9 to 11 μm), and an optoacoustic measurement cell was developed to detect concentrations in the ppb range. The sensitivity and resolution of the system were determined by test gas measurements, measuring ethylene and sulfur hexafluoride with the CO2 laser and butane with the OPO laser. System sensitivity was found to be 13 ppb for sulfur hexafluoride and 17 ppb for ethylene. Respiratory gas samples of 8 healthy volunteers were investigated by irradiation with 17 laser lines of the CO2 laser. Several of those lines overlap with strong absorption bands of ammonia. As it is known that ammonia concentration increases with age, a separation of subjects younger and older than 35 years was sought. To evaluate the data, the first seven gas samples were used to train a discriminant analysis algorithm. The eighth subject was then assigned correctly to the group >35 years with the age of 49 years.

  1. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results allow the operators to evaluate network performance, identify any shortcomings and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each function block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which are used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
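
    The bottom-up fuzzy aggregation idea behind STAM can be sketched very simply: raw measurements are mapped to membership grades in linguistic categories and combined with fuzzy (min/max) rules. The Python sketch below is illustrative only; the metrics, membership-function shapes and rule are invented and do not reproduce STAM itself.

```python
# Hedged sketch of fuzzy grading and aggregation of test results (invented example).
def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stability_score(call_drop_rate, jitter_ms):
    # Grade each measurement against a "good" fuzzy set (assumed shapes).
    drop_good = triangular(call_drop_rate, -0.01, 0.0, 0.02)
    jitter_good = triangular(jitter_ms, -1.0, 0.0, 30.0)
    # AND the conditions with min, then treat the result as a 0-1 stability grade.
    return min(drop_good, jitter_good)

print(stability_score(call_drop_rate=0.005, jitter_ms=12.0))
```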

  2. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
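
    One way to quantify the sensitivity described above is to cluster the same ensemble members over two slightly shifted regions and compare the memberships. The sketch below does this in Python with scikit-learn on synthetic fields; the grid, region choices and number of clusters are assumptions, not the paper's setup.

```python
# Hedged sketch of a clustering robustness check on synthetic ensemble fields.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(3)
members = rng.normal(size=(50, 40, 40))      # 50 ensemble members on a 40x40 grid

def cluster_region(fields, rows, cols, k=3):
    """Cluster members using only the grid points inside the selected region."""
    sub = fields[:, rows, :][:, :, cols].reshape(fields.shape[0], -1)
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(sub)

labels_a = cluster_region(members, slice(5, 30), slice(5, 30))
labels_b = cluster_region(members, slice(7, 32), slice(7, 32))   # shifted region
print("membership agreement (ARI):", adjusted_rand_score(labels_a, labels_b))
```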

  3. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via the quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  4. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...

  5. PCR and RFLP analyses based on the ribosomal protein operon

    Science.gov (United States)

    Differentiation and classification of phytoplasmas have been primarily based on the highly conserved 16S rRNA gene. RFLP analysis of 16S rRNA gene sequences has identified 31 16S rRNA (16Sr) groups and more than 100 16Sr subgroups. Classification of phytoplasma strains can, however, become more refin...

  6. Development of reliability databases and the particular requirements of probabilistic risk analyses

    International Nuclear Information System (INIS)

    Meslin, T.

    1989-01-01

    Nuclear utilities have an increasing need to develop reliability databases for their operating experience. The purposes of these databases are often multiple, including both equipment maintenance aspects and probabilistic risk analyses. EDF has therefore been developing experience feedback databases, including the Reliability Data Recording System (SRDF) and the Event File, as well as the history of numerous operating documents. Furthermore, since the end of 1985, EDF has been preparing a probabilistic safety analysis applied to one 1,300 MWe unit, for which a large amount of data of French origin is necessary. This data concerns both component reliability parameters and initiating event frequencies. The study has thus been an opportunity for trying out the performance databases for a specific application, as well as in-depth audits of a number of nuclear sites to make it possible to validate numerous results. Computer aided data collection is also on trial in a number of plants. After describing the EDF operating experience feedback files, we discuss the particular requirements of probabilistic risk analyses, and the resources implemented by EDF to satisfy them. (author). 5 refs

  7. Analysing Leontiev Tube Capabilities in the Space-based Plants

    Directory of Open Access Journals (Sweden)

    N. L. Shchegolev

    2017-01-01

    The paper presents a review of publications dedicated to the gas-dynamic temperature stratification device (the Leontiev tube) and shows the main factors affecting its efficiency. It describes an experimental installation used to obtain data on the value of energy separation in air and to demonstrate the operability of this device. The assumption that there is an optimal relationship between the flow velocities in the subsonic and supersonic channels of the gas-dynamic temperature stratification device is experimentally confirmed. The paper analyses possible ways to raise the efficiency of power plants of various basing (including space basing) and shows that, currently, the mainstream way of increasing their operating efficiency is to complicate design solutions. A scheme of a closed gas-turbine space-based plant operating on a mixture of inert gases (a helium-xenon one) is proposed. What distinguishes it from the simplest variants is the absence of a cooler-radiator and the integration of a gas-dynamic temperature stratification device and a heat compressor. Based on the equations of one-dimensional gas dynamics, it is shown that the ability to restore total pressure while removing heat in the thermal compressor determines the operating capability of this scheme. An exploratory study of creating a heat compressor is performed, and it is shown that when operating on gases with a Prandtl number close to 1 the total pressure does not increase. The operating capability conditions of the heat compressor are operation on gases with a low Prandtl number (a helium-xenon mixture) at high supersonic velocities and with a longitudinal pressure gradient available. It is shown that there is a region of low Prandtl numbers (Pr < 0.3) for which, with a longitudinal pressure gradient available in the supersonic flow of a viscous gas, the total pressure can be restored.

  8. Design of the storage location based on the ABC analyses

    Science.gov (United States)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and savings in storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces fork-lift travel distances and total costs, and increases inventory process efficiency. The suggested solutions and an evaluation of the achieved results are described in detail. The proposed solutions were implemented in real warehouse operation.
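
    A minimal sketch of the ABC (Pareto) classification underlying the redesign is shown below in Python; the cut-off shares (80%/95%) and the item pick counts are invented for illustration.

```python
# Hedged sketch of ABC (Pareto) classification of stock items by picking frequency.
def abc_classify(picks_per_item, a_cut=0.8, b_cut=0.95):
    """Return {item: 'A'|'B'|'C'} based on cumulative share of total picks."""
    total = sum(picks_per_item.values())
    ranked = sorted(picks_per_item.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for item, picks in ranked:
        cumulative += picks / total
        classes[item] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

picks = {"SKU1": 450, "SKU2": 300, "SKU3": 120, "SKU4": 70, "SKU5": 40, "SKU6": 20}
print(abc_classify(picks))   # A items would be placed closest to dispatch
```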

  9. Risk-based regulation: Challenges and opportunities

    International Nuclear Information System (INIS)

    Bari, R.A.

    1995-01-01

    Over the last twenty years there has been a gradual but steady movement toward increased usage of risk-based methods and results in the regulatory process. The "risk perspective" as a supportive view to existing (non-risk-based or deterministic) information used in decision making now has a firm foothold in most countries that regulate nuclear power. Furthermore, in areas outside the nuclear power field, such as health risk assessment, risk-based information is used increasingly to make decisions on potential impacts of chemical, biological, and radiological exposures. Some of the principal concepts and issues that are pertinent to risk-based regulation are reviewed. There is a growing interest in most countries in the use of risk-based methods and results to facilitate decision-making associated with regulatory processes. A summary is presented of the challenges and opportunities related to expanded use of risk-based regulation

  10. Improving Control Room Design and Operations Based on Human Factors Analyses, or How Much Human Factors Upgrade Is Enough?

    Energy Technology Data Exchange (ETDEWEB)

    Higgins, J.C.; O'Hara, J.M.; Almeida, P.

    2002-09-19

    The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  11. Economic evaluation of algae biodiesel based on meta-analyses

    Science.gov (United States)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

    The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that improvements in the yield, quality, and cost of algae feedstock could be the key factors to make algae-derived biodiesel economically viable.

  12. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer: Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    Science.gov (United States)

    Khankari, Nikhil K.; Shu, Xiao-Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei

    2016-01-01

    Background Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using height-associated genetic variants identified in a genome-wide association study (GWAS), to evaluate the association of adult height with these cancers. Methods and Findings A systematic review of prospective studies was conducted using the PubMed, Embase, and Web of Science databases. Using meta-analyses, results obtained from 62 studies were summarized for the association of a 10-cm increase in height with cancer risk. Mendelian randomization analyses were conducted using summary statistics obtained for 423 genetic variants identified from a recent GWAS of adult height and from a cancer genetics consortium study of multiple cancers that included 47,800 cases and 81,353 controls. For a 10-cm increase in height, the summary relative risks derived from the meta-analyses of prospective studies were 1.12 (95% CI 1.10, 1.15), 1.07 (95% CI 1.05, 1.10), and 1.06 (95% CI 1.02, 1.11) for colorectal, prostate, and lung cancers, respectively. Mendelian randomization analyses showed increased risks of colorectal (odds ratio [OR] = 1.58, 95% CI 1.14, 2.18) and lung cancer (OR = 1.10, 95% CI 1.00, 1.22) associated with each 10-cm increase in genetically predicted height. No association was observed for prostate cancer (OR = 1.03, 95% CI 0.92, 1.15). Our meta-analysis was limited to published studies. The sample size for the Mendelian randomization analysis of colorectal cancer was relatively small, thus affecting the precision of the point estimate. Conclusions Our study provides evidence for a potential causal association of adult height with the risk of colorectal and lung cancers and suggests that certain genetic factors and biological pathways affecting adult height may also affect the
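
    The pooling step behind summary relative risks like those above can be illustrated with a generic inverse-variance (fixed-effect) calculation; the per-study estimates below are invented placeholders, not data from the 62 included studies.

```python
# Generic fixed-effect (inverse-variance) pooling of per-10-cm relative risks.
import math

studies = [  # (RR per 10 cm, lower 95% CI, upper 95% CI) -- placeholder values
    (1.10, 1.05, 1.16),
    (1.14, 1.08, 1.21),
    (1.09, 1.01, 1.18),
]

num = den = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
    w = 1.0 / se**2                                  # inverse-variance weight
    num += w * log_rr
    den += w

pooled_log = num / den
half_width = 1.96 / math.sqrt(den)
print(f"pooled RR per 10 cm: {math.exp(pooled_log):.2f} "
      f"(95% CI {math.exp(pooled_log - half_width):.2f}, "
      f"{math.exp(pooled_log + half_width):.2f})")
```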

  13. Evidence for Endothermy in Pterosaurs Based on Flight Capability Analyses

    Science.gov (United States)

    Jenkins, H. S.; Pratson, L. F.

    2005-12-01

    Previous attempts to constrain flight capability in pterosaurs have relied heavily on the fossil record, using bone articulation and apparent muscle allocation to evaluate flight potential (Frey et al., 1997; Padian, 1983; Bramwell, 1974). However, broad definitions of the physical parameters necessary for flight in pterosaurs remain loosely defined and few systematic approaches to constraining flight capability have been synthesized (Templin, 2000; Padian, 1983). Here we present a new method to assess flight capability in pterosaurs as a function of humerus length and flight velocity. By creating an energy-balance model to evaluate the power required for flight against the power available to the animal, we derive a `U'-shaped power curve and infer optimal flight speeds and maximal wingspan lengths for the pterosaurs Quetzalcoatlus northropi and Pteranodon ingens. Our model corroborates empirically derived power curves for the modern black-billed magpie (Pica pica) and accurately reproduces the mechanical power curve for modern cockatiels (Nymphicus hollandicus) (Tobalske et al., 2003). When we adjust our model to include an endothermic metabolic rate for pterosaurs, we find a maximal wingspan length of 18 meters for Q. northropi. Model runs using an ectothermic metabolism derive maximal wingspans of 6-8 meters. As estimates based on fossil evidence show total wingspan lengths reaching up to 15 meters for Q. northropi, we conclude that large pterosaurs may have been endothermic and therefore more metabolically similar to birds than to reptiles.
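
    A rough sketch of the kind of energy-balance power curve described above, using standard flight-mechanics approximations (induced plus parasite power only); the morphological values and coefficients are placeholders, not the parameters used by the authors.

```python
# U-shaped mechanical power curve for steady flight (simplified, Pennycuick-style).
# Mass, wingspan, body area and drag coefficients are rough placeholder assumptions.
import math

g = 9.81
rho = 1.225          # air density, kg/m^3
mass = 200.0         # body mass, kg (placeholder for a very large pterosaur)
wingspan = 11.0      # m
body_area = 0.5      # frontal body area, m^2 (assumption)
C_Db = 0.1           # body drag coefficient (assumption)
k_ind = 1.2          # induced power correction factor

W = mass * g
S_disk = math.pi * wingspan**2 / 4.0   # actuator-disk area swept by the wings

def power_required(V):
    """Mechanical power (W) to fly at speed V (m/s): induced + parasite terms."""
    P_induced = k_ind * W**2 / (2.0 * rho * V * S_disk)
    P_parasite = 0.5 * rho * V**3 * body_area * C_Db
    return P_induced + P_parasite

speeds = [v / 2.0 for v in range(8, 61)]          # 4 .. 30 m/s
powers = [power_required(v) for v in speeds]
v_opt = speeds[powers.index(min(powers))]
print(f"minimum-power speed ~ {v_opt:.1f} m/s, P_min ~ {min(powers)/1000:.1f} kW")
```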

  14. Risk analyses in nuclear engineering, their value in terms of information, and their limits in terms of applicability

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1983-01-01

    This contribution first briefly explains the main pillars of the deterministic safety concept as developed in nuclear engineering, and some basic ideas on risk analyses in general. This is followed by an outline of the methodology and main purposes of risk analyses. The German Risk Study is taken as an example to discuss selected aspects with regard to information value and limits of risk analyses. The main conclusions state that risk analyses are a valuable instrument for quantitative safety evaluation, leading to a better understanding of safety problems and their prevention, and allowing a comparative assessment of various safety measures. They furthermore allow a refined evaluation of a variety of accident parameters and other impacts determining the risk emanating from accidents. The current state of the art in this sector still leaves numerous uncertainties so that risk analyses yield information for assessments rather than for definite predictions. However, the urge for quantifying the lack of knowledge leads to a better and more precise determination of the gaps still to be filled up by researchers and engineers. Thus risk analyses are a useful help in defining suitable approaches and setting up standards, showing the tasks to be fulfilled in safety research in general. (orig./HSCH) [de

  15. Scenario sensitivity analyses performed on the PRESTO-EPA LLW risk assessment models

    International Nuclear Information System (INIS)

    Bandrowski, M.S.

    1988-01-01

    The US Environmental Protection Agency (EPA) is currently developing standards for the land disposal of low-level radioactive waste. As part of the standard development, EPA has performed risk assessments using the PRESTO-EPA codes. A program of sensitivity analysis was conducted on the PRESTO-EPA codes, consisting of single parameter sensitivity analysis and scenario sensitivity analysis. The results of the single parameter sensitivity analysis were discussed at the 1987 DOE LLW Management Conference. Specific scenario sensitivity analyses have been completed and evaluated. Scenario assumptions that were analyzed include: site location, disposal method, form of waste, waste volume, analysis time horizon, critical radionuclides, use of buffer zones, and global health effects

  16. Risk based seismic design criteria

    International Nuclear Information System (INIS)

    Kennedy, R.P.

    1999-01-01

    In order to develop a risk based seismic design criteria the following four issues must be addressed: (1) What target annual probability of seismic induced unacceptable performance is acceptable? (2) What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the safe-shutdown-earthquake (SSE) ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues are addressed. Issues 2 and 3 are integrally tied together so that a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions of these two issues which reasonably achieves the target goal. The remainder of the paper discusses an approach which can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented. (orig.)
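
    The link between Issues 2 and 3 can be illustrated by convolving a seismic hazard curve with a component fragility to obtain an annual probability of seismically induced failure; the hazard slope, median capacity and uncertainty used below are illustrative assumptions, not values from the paper.

```python
# Convolution of a power-law hazard curve with a lognormal fragility.
# All numerical inputs are illustrative assumptions.
import math

def hazard(a, a_ref=0.25, h_ref=1e-4, K_H=2.5):
    """Annual frequency of exceeding peak ground acceleration a (g)."""
    return h_ref * (a / a_ref) ** (-K_H)

def fragility(a, a_median=0.75, beta=0.4):
    """Conditional probability of failure given ground motion a (lognormal CDF)."""
    z = (math.log(a) - math.log(a_median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# numerical convolution: P_F = sum over intervals of (frequency of motions in the
# interval) times (fragility at the interval midpoint)
a_lo, a_hi, n = 0.05, 3.0, 3000
da = (a_hi - a_lo) / n
p_fail = 0.0
for i in range(n):
    a0, a1 = a_lo + i * da, a_lo + (i + 1) * da
    dH = hazard(a0) - hazard(a1)
    p_fail += dH * fragility(0.5 * (a0 + a1))
p_fail += hazard(a_hi) * fragility(a_hi)   # crude tail term above the grid

print(f"annual probability of seismically induced failure ~ {p_fail:.2e}")
```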

  17. Genome-wide diet-gene interaction analyses for risk of colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Jane C Figueiredo

    2014-04-01

    Full Text Available Dietary factors, including meat, fruits, vegetables and fiber, are associated with colorectal cancer; however, there is limited information as to whether these dietary factors interact with genetic variants to modify risk of colorectal cancer. We tested interactions between these dietary factors and approximately 2.7 million genetic variants for colorectal cancer risk among 9,287 cases and 9,117 controls from ten studies. We used logistic regression to investigate multiplicative gene-diet interactions, as well as our recently developed Cocktail method that involves a screening step based on marginal associations and gene-diet correlations and a testing step for multiplicative interactions, while correcting for multiple testing using weighted hypothesis testing. Per quartile increments in the intake of red and processed meat were associated with statistically significant increased risks of colorectal cancer, and vegetable, fruit and fiber intake with lower risks. From the case-control analysis, we detected a significant interaction between rs4143094 (10p14/near GATA3) and processed meat consumption (OR = 1.17; p = 8.7E-09), which was consistently observed across studies (p heterogeneity = 0.78). The risk of colorectal cancer associated with processed meat was increased among individuals with the rs4143094-TG and -TT genotypes (OR = 1.20 and OR = 1.39, respectively) and null among those with the GG genotype (OR = 1.03). Our results identify a novel gene-diet interaction with processed meat for colorectal cancer, highlighting that diet may modify the effect of genetic variants on disease risk, which may have important implications for prevention.
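
    A minimal sketch of testing a multiplicative gene-diet interaction with logistic regression, the core model described above; the simulated data, variable names and effect sizes are purely illustrative, and the Cocktail screening step is not reproduced.

```python
# Logistic regression with a gene x diet interaction term on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
g = rng.binomial(2, 0.35, n)                    # risk-allele count at the SNP
meat_q = rng.integers(0, 4, n)                  # processed-meat intake quartile (0-3)
logit = -1.0 + 0.05 * g + 0.08 * meat_q + 0.10 * g * meat_q
case = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"case": case, "g": g, "meat_q": meat_q})
fit = smf.logit("case ~ g * meat_q", data=df).fit(disp=False)

# odds ratio and 95% CI for the multiplicative interaction term g:meat_q
or_int = np.exp(fit.params["g:meat_q"])
ci = np.exp(fit.conf_int().loc["g:meat_q"])
print(f"interaction OR per allele per quartile: {or_int:.2f} ({ci[0]:.2f}, {ci[1]:.2f})")
```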

  18. Comprehensive review of genetic association studies and meta-analyses on miRNA polymorphisms and cancer risk.

    Directory of Open Access Journals (Sweden)

    Kshitij Srivastava

    Full Text Available MicroRNAs (miRNAs) are small RNA molecules that regulate the expression of corresponding messenger RNAs (mRNAs). Variations in the level of expression of distinct miRNAs have been observed in the genesis, progression and prognosis of multiple human malignancies. The present study aimed to investigate the association between four highly studied miRNA polymorphisms (mir-146a rs2910164, mir-196a2 rs11614913, mir-149 rs2292832 and mir-499 rs3746444) and cancer risk by using a two-sided meta-analytic approach. An updated meta-analysis based on 53 independent case-control studies consisting of 27573 cancer cases and 34791 controls was performed. Odds ratio (OR) and 95% confidence interval (95% CI) were used to investigate the strength of the association. Overall, the pooled analysis showed that mir-196a2 rs11614913 was associated with a decreased cancer risk (OR = 0.846, P = 0.004, TT vs. CC), while other miRNA SNPs showed no association with overall cancer risk. Subgroup analyses based on type of cancer and ethnicity were also performed, and results indicated that there was a strong association between miR-146a rs2910164 and overall cancer risk in the Caucasian population under a recessive model (OR = 1.274, 95% CI = 1.096-1.481, P = 0.002). Stratified analysis by cancer type also associated mir-196a2 rs11614913 with lung and colorectal cancer at the allelic and genotypic level. The present meta-analysis suggests an important role of the mir-196a2 rs11614913 polymorphism in overall cancer risk, especially in the Asian population. Further studies with large sample sizes are needed to evaluate and confirm this association.
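
    The per-study building block of such a meta-analysis is an odds ratio with a log-based (Woolf) confidence interval computed from genotype counts, as in this sketch; the counts are invented placeholders, not data from any included study.

```python
# Odds ratio and Woolf 95% CI from case-control genotype counts (placeholder data).
import math

# genotype counts for TT vs CC at a SNP of interest (hypothetical numbers)
cases_tt, cases_cc = 310, 410
controls_tt, controls_cc = 520, 560

odds_ratio = (cases_tt * controls_cc) / (cases_cc * controls_tt)
se_log_or = math.sqrt(1/cases_tt + 1/cases_cc + 1/controls_tt + 1/controls_cc)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR (TT vs CC) = {odds_ratio:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```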

  19. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  20. Analysis of the prevalence rate and risk factors of pulmonary embolism in patients with dyspnea

    International Nuclear Information System (INIS)

    Cao Yanxia; Su Jian; Wang Bingsheng; Wu Songhong; Dai Ruiting; Cao Caixia

    2005-01-01

    Objective: To analyse the prevalence rate and risk factors of pulmonary embolism (PE) in patients with dyspnea and to explore the predisposing causes and its early clinical manifestations. Methods: Retrospective analysis was done in 461 patients with dyspnea who underwent 99mTc-macroaggregated albumin (MAA) lung perfusion imaging and 99mTc-DTPA ventilation imaging, or 99mTc-MAA perfusion imaging and chest X-ray examination. Among them, 48 cases without apparent disease were considered the control group, whereas the remaining patients with other underlying illnesses formed the patients group. The PEMS statistics software package was used for estimation of the prevalence rate, the χ2 test and PE risk factor analysis. Results: There were 251 PE patients among the 461 patients. The prevalence rate [π, 95% confidence interval (CI)] was: lower extremity thrombosis and varicosity (80.79-95.47), post cesarean section (55.64-87.12), lower extremity bone surgery or fracture (52.76-87.27), cancer operation (52.19-78.19), atrial fibrillation or heart failure (53.30-74.88), obesity (23.14-50.20), post abdominal surgery (20.23-59.43), diabetes (19.12-63.95), chronic bronchitis (1.80-23.06), normal control group (3.47-22.66). Except for chronic bronchitis, the PE prevalence rate differed significantly between the patients group and the control group. In patients with these risk factors, 99mTc-MAA and DTPA lung imaging should be done as early as possible. (authors)
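
    Subgroup prevalences with confidence intervals like those listed above can be computed, for example, with an exact (Clopper-Pearson) interval; this sketch uses placeholder counts, and the interval method actually used in the original study is not stated.

```python
# Exact (Clopper-Pearson) 95% confidence interval for a subgroup prevalence.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Return the exact (lower, upper) CI for k events in n patients."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

k, n = 27, 30   # e.g. 27 PE cases among 30 patients with a given risk factor (assumed)
lo, hi = clopper_pearson(k, n)
print(f"prevalence {k/n:.1%}, 95% CI ({lo:.2%}, {hi:.2%})")
```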

  1. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed

  2. Risk and Performance Analyses Supporting Closure of WMA C at the Hanford Site in Southeast Washington

    International Nuclear Information System (INIS)

    Eberlein, Susan J.; Bergeron, Marcel P.; Kemp, Christopher J.; Hildebrand, R. Douglas; Aly, Alaa; Kozak, Matthew; Mehta, Sunil; Connelly, Michael

    2013-01-01

    The Office of River Protection under the U.S. Department of Energy (DOE) is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C as stipulated by the Hanford Federal Facility Agreement and Consent Order (HFFACO); under federal requirements, work tasks will be done under the State-approved closure plans and permits. An initial step in meeting the regulatory requirements is to develop a baseline risk assessment representing current conditions based on available characterization data and information collected at the WMA C location. The baseline risk assessment will support a Resource Conservation and Recovery Act of 1976 (RCRA) Field Investigation (RFI)/Corrective Measures Study (CMS) for WMA closure and RCRA corrective action. Complying with the HFFACO conditions also involves developing a long-term closure Performance Assessment (PA) that evaluates human health and environmental impacts resulting from radionuclide inventories in residual wastes remaining in WMA C tanks and ancillary equipment. This PA is being developed to meet the requirements necessary for closure authorization under DOE Order 435.1 and the Washington State Hazardous Waste Management Act. To meet the HFFACO conditions, the long-term closure risk analysis will include an evaluation of human health and environmental impacts from hazardous chemical inventories in residual wastes left in WMA C facilities after retrieval and removal, along with other applicable or relevant and appropriate requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA ARARs). This closure risk analysis is needed to comply with the requirements for permitted closure. Progress to date in developing a baseline risk assessment of WMA C has involved aspects of an evaluation of soil characterization and groundwater monitoring data collected as a part of the RFI/CMS and RCRA monitoring. Developing the long-term performance assessment aspects has involved the

  3. Risk and Performance Analyses Supporting Closure of WMA C at the Hanford Site in Southeast Washington

    Energy Technology Data Exchange (ETDEWEB)

    Eberlein, Susan J.; Bergeron, Marcel P.; Kemp, Christopher J.

    2013-11-11

    The Office of River Protection under the U.S. Department of Energy (DOE) is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C as stipulated by the Hanford Federal Facility Agreement and Consent Order (HFFACO); under federal requirements, work tasks will be done under the State-approved closure plans and permits. An initial step in meeting the regulatory requirements is to develop a baseline risk assessment representing current conditions based on available characterization data and information collected at the WMA C location. The baseline risk assessment will support a Resource Conservation and Recovery Act of 1976 (RCRA) Field Investigation (RFI)/Corrective Measures Study (CMS) for WMA closure and RCRA corrective action. Complying with the HFFACO conditions also involves developing a long-term closure Performance Assessment (PA) that evaluates human health and environmental impacts resulting from radionuclide inventories in residual wastes remaining in WMA C tanks and ancillary equipment. This PA is being developed to meet the requirements necessary for closure authorization under DOE Order 435.1 and the Washington State Hazardous Waste Management Act. To meet the HFFACO conditions, the long-term closure risk analysis will include an evaluation of human health and environmental impacts from hazardous chemical inventories in residual wastes left in WMA C facilities after retrieval and removal, along with other applicable or relevant and appropriate requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA ARARs). This closure risk analysis is needed to comply with the requirements for permitted closure. Progress to date in developing a baseline risk assessment of WMA C has involved aspects of an evaluation of soil characterization and groundwater monitoring data collected as a part of the RFI/CMS and RCRA monitoring. Developing the long-term performance assessment aspects has involved the

  4. Conclusive meta-analyses on antenatal magnesium may be inconclusive! Are we underestimating the risk of random error?

    DEFF Research Database (Denmark)

    Brok, Jesper; Huusom, Lene D; Thorlund, Kristian

    2012-01-01

    Results from meta-analyses significantly influence clinical practice. Both simulation and empirical studies have demonstrated that the risk of random error (i.e. spurious chance findings) in meta-analyses is much higher than previously anticipated. Hence, authors and users of systematic reviews a...... about the investigated intervention effect(s). We outline the rationale for conducting trial sequential analysis including some examples of the meta-analysis on antenatal magnesium for women at risk of preterm birth....
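
    Trial sequential analysis compares the accrued number of patients against a required information size (RIS); the sketch below shows one common way to compute a heterogeneity-adjusted RIS for a binary outcome, with all inputs chosen as illustrative assumptions rather than values from the magnesium meta-analysis.

```python
# Heterogeneity-adjusted required information size for a binary-outcome meta-analysis.
from scipy.stats import norm

alpha, power = 0.05, 0.90
p_control = 0.25          # assumed control-group event proportion
rrr = 0.20                # relative risk reduction considered worth detecting
i_squared = 0.30          # anticipated heterogeneity (assumption)

p_exp = p_control * (1 - rrr)
p_bar = (p_control + p_exp) / 2
delta = p_control - p_exp

z_a = norm.ppf(1 - alpha / 2)
z_b = norm.ppf(power)
ris = 4 * (z_a + z_b) ** 2 * p_bar * (1 - p_bar) / delta ** 2
ris_adjusted = ris / (1 - i_squared)      # simple heterogeneity correction

print(f"required information size: {ris:.0f} patients "
      f"({ris_adjusted:.0f} after heterogeneity adjustment)")
```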

  5. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    Full Text Available In order to meet commercial banks' liquidity, safety and profitability objectives, loan portfolio optimization decisions based on risk analysis are needed for the rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. Modelling the loan portfolio rate of return with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives direct control of the bank's potential loss. The paper analyses a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves this highly difficult problem by matrix operations. With this combination it is easy to see that the portfolio problem with VaR and CVaR risk measures is a hyperbola in mean-standard deviation space, and the proposed method is easy to compute.
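
    For comparison, a CVaR-constrained portfolio can also be cast as the standard Rockafellar-Uryasev linear program; the sketch below uses that textbook formulation rather than the Lagrangian/matrix method of the paper, with simulated scenario returns and an assumed return target.

```python
# Minimize portfolio CVaR over return scenarios (Rockafellar-Uryasev LP).
# Scenario returns, confidence level and the return target are placeholder assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n = 500, 4                         # scenarios x loan classes
returns = rng.normal([0.06, 0.08, 0.10, 0.12], [0.02, 0.05, 0.08, 0.12], size=(m, n))
beta, target = 0.95, 0.08             # CVaR confidence level, required mean return

# decision vector x = [w (n weights), t (VaR level), u (m auxiliary excess losses)]
c = np.concatenate([np.zeros(n), [1.0], np.ones(m) / ((1 - beta) * m)])

# u_j >= loss_j - t  with loss_j = -returns_j . w   <=>   -returns_j.w - t - u_j <= 0
A_ub = np.hstack([-returns, -np.ones((m, 1)), -np.eye(m)])
b_ub = np.zeros(m)
# expected-return constraint: mean_returns . w >= target
A_ub = np.vstack([A_ub, np.concatenate([-returns.mean(axis=0), [0.0], np.zeros(m)])])
b_ub = np.append(b_ub, -target)

A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(m)]).reshape(1, -1)  # sum w = 1
b_eq = [1.0]
bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * m

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w, var = res.x[:n], res.x[n]
print("weights:", np.round(w, 3), " VaR:", round(var, 4), " CVaR:", round(res.fun, 4))
```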

  6. Backtesting for Risk-Based Regulatory Capital

    NARCIS (Netherlands)

    Kerkhof, F.L.J.; Melenberg, B.

    2002-01-01

    In this paper we present a framework for backtesting all currently popular risk measurement methods (including value-at-risk and expected shortfall) using the functional delta method. Estimation risk can be taken explicitly into account. Based on a simulation study we provide evidence that tests for

  7. Dangerous dads? Ecological and longitudinal analyses of paternity leave and risk for child injury.

    Science.gov (United States)

    Laflamme, Lucie; Månsdotter, Anna; Lundberg, Michael; Magnusson, Cecilia

    2012-11-01

    In 1974, Sweden became the first country to permit fathers to take paid parental leave. Other countries are currently following suit issuing similar laws. While this reform supports the principles of the United Nations convention of the right for children to be with both parents and enshrines the ethos of gender equality, there has been little systematic examination of its potential impact on child health. Instead, there is uninformed debate that fathers may expose their children to greater risks of injury than mothers. In this Swedish national study, the authors therefore assess whether fathers' parental leave can be regarded as a more serious risk factor for child injuries than that of mothers. Nationwide register-based ecological and longitudinal studies of hospitalisation due to injury (and intoxication) in early childhood, involving the Swedish population in 1973-2009 (ecological design), and children born in 1988 and 1989 (n=118 278) (longitudinal design). An increase in fathers' share of parental leave over time was parallelled by a downward trend in child injury rates (age 0-4 years). At the individual level, the crude incidence of child injury (age 0-2 years) was lower during paternity as compared with maternity leave. This association was, however, explained by parental socio-demographic characteristics (multivariate HR 0.96, 95% CI 0.74 to 1.2). There is no support for the notion that paternity leave increases the risk of child injury.

  8. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    instance, the global standards defined in the frame of GEM project for seismic hazard and risk) will grant the interoperability with other FOSS software and tools and, at the same time, to be on hand of the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, which probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, by considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and by assessing the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to be run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).

  9. Status, results and usefulness of risk analyses for HTGR type reactors of different capacity accessory to planning

    International Nuclear Information System (INIS)

    Kroeger, W.; Mertens, J.

    1985-01-01

    As regards system-inherent risks, HTGR type reactors are evaluated with reference to the established light-water-moderated reactor types. Probabilistic HTGR risk analyses have shown modern HTGR systems to possess a balanced safety concept with a risk remaining distinctly below legally accepted values. Inversely, the development and optimization of the safety concepts have been (and are being) essentially co-determined by the probabilistic analyses, as it is technically sensible and economically necessary to render the specific safety-related HTGR properties eligible for licensing. (orig./HP) [de

  10. Orbitrap-based mass analyser for in-situ characterization of asteroids: ILMA, Ion Laser Mass Analyser

    Science.gov (United States)

    Briois, C.; Cotti, H.; Thirkell, L.; Space Orbitrap Consortium[K. Aradj, French; Bouabdellah, A.; Boukrara, A.; Carrasco, N.; Chalumeau, G.; Chapelon, O.; Colin, F.; Coll, P.; Engrand, C.; Grand, N.; Kukui, A.; Lebreton, J.-P.; Pennanech, C.; Szopa, C.; Thissen, R.; Vuitton, V.; Zapf], P.; Makarov, A.

    2014-07-01

    For about a decade, the boundaries between comets and carbonaceous asteroids have been fading [1,2]. No doubt the Rosetta mission should bring a new wealth of data on the composition of comets. But as promising as it may look, the mass resolving power of the mass spectrometers onboard (so far the best on a space mission) will only be able to partially account for the diversity of chemical structures present. ILMA (Ion-Laser Mass Analyser) is a new generation high mass resolution LDI-MS (Laser Desorption-Ionization Mass Spectrometer) instrument concept using the Orbitrap technique, which has been developed in the frame of the two Marco Polo & Marco Polo-R proposals to the ESA Cosmic Vision program. Flagged by ESA as an instrument concept of interest for the mission in 2012, it has been under study for a few years in the frame of a Research and Technology (R&T) development programme between 5 French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) [3,4], partly funded by the French Space Agency (CNES). The work is undertaken in close collaboration with the Thermo Fisher Scientific Company, which commercialises Orbitrap-based laboratory instruments. The R&T activities are currently concentrating on the core elements of the Orbitrap analyser that are required to reach a sufficient maturity level for allowing design studies of future space instruments. A prototype is under development at LPC2E and a mass resolution (m/Δm FWHM) of 100,000 has been obtained at m/z = 150 for a background pressure of 10^{-8} mbar. ILMA would be a key instrument to measure the molecular, elemental and isotopic composition of objects such as carbonaceous asteroids, comets, or other bodies devoid of atmosphere such as the surface of an icy satellite, the Moon, or Mercury.

  11. Configuration control based on risk matrix for radiotherapy treatment

    International Nuclear Information System (INIS)

    Montes de Oca Quinnones, Joe; Torres Valle, Antonio

    2015-01-01

    The incorporation of scientific and technical breakthroughs into the application of radiotherapy represents a challenge, because the equipment failures and human mistakes that can trigger unfavorable consequences for patients, the public, or occupationally exposed workers also become more diverse, forcing the incorporation of new techniques for risk evaluation and for the detection of the weak points that can lead to these consequences. To evaluate the risks of radiotherapy practices there is the SEVRRA code, based on the Risk Matrix method. The SEVRRA system is the code most frequently used for risk studies of radiotherapy treatment. On the other hand, starting from the development of tools to control dangerous configurations in nuclear power plants, the SECURE code has been developed; in its Risk Matrix variant it provides a convenient man-machine interface for performing risk analyses of radiotherapy treatment, modelling in this way many combinations of scenarios. These capabilities greatly facilitate risk studies and risk optimization applications in these practices. The SECURE-Risk Matrix system incorporates graphic and analytical capabilities that make the analyses, and the subsequent documentation of all results, more flexible. The paper shows the application of the proposed system to an integral risk study of the process of radiotherapy treatment with a linear accelerator. (Author)
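
    A minimal sketch of a risk-matrix evaluation in the spirit of the method described above: each accident sequence is scored on ordinal scales and the combination is mapped to a qualitative risk level; the scales, the mapping and the example sequences are illustrative assumptions, not the SEVRRA/SECURE classification.

```python
# Ordinal scores 0..3 for initiating-event frequency, probability that the
# barriers fail, and consequence severity (hypothetical scales).
def risk_level(frequency, barrier_failure, severity):
    """Map the combined score of one accident sequence to a qualitative risk level."""
    score = frequency + barrier_failure + severity        # 0 .. 9
    if score >= 8:
        return "very high risk"
    if score >= 6:
        return "high risk"
    if score >= 4:
        return "medium risk"
    return "low risk"

# hypothetical accident sequences for a linear-accelerator treatment process
sequences = {
    "wrong patient identification": (2, 1, 3),
    "accelerator output miscalibration": (1, 2, 3),
    "daily QA check omitted": (2, 2, 1),
}
for name, (f, b, s) in sequences.items():
    print(f"{name:38s} -> {risk_level(f, b, s)}")
```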

  12. Risk-based SMA for Cubesats

    Science.gov (United States)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  13. Strategies and criteria for risk-based configuration control

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.

    1991-01-01

    A configuration, as used here, is a set of component operability statuses that defines the state of a nuclear power plant. Risk-based configuration control is the management of component configurations using a risk perspective to control risk and assure safety. If the component configurations that have high risk implications do not occur, then the risk from the operation of nuclear power plants will be minimal. The control of component configurations, i.e., the management of component statuses, so that the risk from components being unavailable is minimized, becomes difficult because the status of a standby safety system component is often not apparent unless it is tested. In this paper, we discuss the strategies and criteria for risk-based configuration control in nuclear power plants. In developing these strategies and criteria, the primary objective is to obtain more direct risk control, but the added benefit is the effective use of plant resources. Implementation of such approaches can result in replacement/modification of parts of Technical Specifications. Specifically, the risk impact or safety impact of a configuration depends upon four factors: (1) the configuration components which are simultaneously down (i.e., inoperable); (2) the backup components which are known to be up (i.e., operable); (3) the duration of time the configuration exists (the outage time); and (4) the frequency at which the configuration occurs. Risk-based configuration control involves managing these factors using risk analyses and risk insights. In this paper, we discuss each of the factors and illustrate how they can be controlled. The information and the tools needed in implementing configuration control are also discussed. The risk-based calculation requirements in achieving the control are also delineated. 4 refs., 4 figs., 1 tab
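
    The four factors can be combined into a simple incremental risk measure, as in this sketch of an incremental core damage probability (ICDP) calculation; the CDF values, outage time and occurrence frequency are illustrative assumptions.

```python
# Yearly incremental core damage probability from one plant configuration.
HOURS_PER_YEAR = 8760.0

def configuration_risk(cdf_baseline, cdf_with_config, outage_hours, occurrences_per_year):
    """Rise in CDF while the configuration exists, times duration, times frequency."""
    delta_frequency = cdf_with_config - cdf_baseline          # per reactor-year
    icdp_per_occurrence = delta_frequency * outage_hours / HOURS_PER_YEAR
    return icdp_per_occurrence * occurrences_per_year

# e.g. one emergency diesel generator out with its redundant train known operable
risk = configuration_risk(
    cdf_baseline=2.0e-5,        # baseline CDF, per year (assumption)
    cdf_with_config=1.2e-4,     # conditional CDF with the component down (assumption)
    outage_hours=72.0,          # duration of the configuration
    occurrences_per_year=2.0,   # expected entries into this configuration per year
)
print(f"incremental core damage probability per year ~ {risk:.1e}")
```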

  14. Risk- and cost-benefit analyses of breast screening programs derived from absorbed dose measurements in the Netherlands

    International Nuclear Information System (INIS)

    Zuur, C.; Broerse, J.J.

    1985-01-01

    Risk- and cost-benefit analyses for breast screening programs are being performed, employing the risk factors for induction of breast cancer from six extensive follow-up studies. For women of the age group above 35 years and for a risk period of 30 years after a 10-year latency period, a risk factor of 20 × 10^-6 extra cases per mGy can be estimated. Measurements are being performed in Dutch hospitals to determine the mean absorbed tissue dose. These doses vary from 0.6 to 4.4 mGy per radiograph. For a dose of 1 mGy per radiograph and yearly screening of women between 35 and 75 years, the risk of radiogenic breast cancer is about 1% of the natural incidence (85,000 per 10^6 women) in this group. A recommended frequency of screening has to be based on medical, social and financial considerations. The gain in woman-years and in completely cured women is being estimated for screening with intervals of 12 instead of 24 months. The medical and social benefit is 1,520 life-years and 69 more cases completely cured per 1,000 breast cancer patients. The financial profit of a completely cured instead of an ultimately fatal cancer can be roughly estimated at 55,000 guilders. In addition the costs per gained woman-year are about 5,000 guilders. In consequence, the extra costs of annual additional rounds of mammographic screening are balanced by the benefit. (Auth.)
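
    The roughly 1% figure quoted above can be checked with back-of-the-envelope arithmetic, simplifying each screening examination to a single 1 mGy radiograph (an assumption for illustration only):

```python
# Rough check: radiogenic breast-cancer cases versus the natural incidence.
dose_per_exam_mGy = 1.0
exams = 75 - 35                      # yearly screening between ages 35 and 75
risk_per_mGy = 20e-6                 # extra breast-cancer cases per woman per mGy

radiogenic_cases_per_million = exams * dose_per_exam_mGy * risk_per_mGy * 1e6
natural_incidence_per_million = 85_000

print(f"radiogenic cases: {radiogenic_cases_per_million:.0f} per 10^6 women")
print(f"fraction of natural incidence: "
      f"{radiogenic_cases_per_million / natural_incidence_per_million:.1%}")
```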

  15. Nuclear insurance risk assessment using risk-based methodology

    International Nuclear Information System (INIS)

    Wendland, W.G.

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance

  16. Medicine and ionizing rays: a help sheet in analysing risks in high rate curietherapy

    International Nuclear Information System (INIS)

    Gauron, C.

    2009-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of high rate curietherapy. Several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment

  17. Medicine and ionizing rays: a help sheet in analysing risks in pulsed rate curietherapy

    International Nuclear Information System (INIS)

    Gauron, C.

    2009-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of pulsed rate curietherapy. Several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment

  18. Medicine and ionizing rays: a help sheet in analysing risks in curietherapy

    International Nuclear Information System (INIS)

    Gauron, C.

    2008-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of low rate and not-pulsed curietherapy. Several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment

  19. Medicine and ionizing rays: a help sheet in analysing risks in exo-buccal dental radiology

    International Nuclear Information System (INIS)

    Gauron, C.

    2009-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of exo-buccal dental radiology. In the first part, several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment

  20. Systematic assessment of environmental risk factors for bipolar disorder: an umbrella review of systematic reviews and meta-analyses.

    Science.gov (United States)

    Bortolato, Beatrice; Köhler, Cristiano A; Evangelou, Evangelos; León-Caballero, Jordi; Solmi, Marco; Stubbs, Brendon; Belbasis, Lazaros; Pacchiarotti, Isabella; Kessing, Lars V; Berk, Michael; Vieta, Eduard; Carvalho, André F

    2017-03-01

    The pathophysiology of bipolar disorder is likely to involve both genetic and environmental risk factors. In our study, we aimed to perform a systematic search of environmental risk factors for BD. In addition, we assessed possible hints of bias in this literature, and identified risk factors supported by high epidemiological credibility. We searched the Pubmed/MEDLINE, EMBASE and PsycInfo databases up to 7 October 2016 to identify systematic reviews and meta-analyses of observational studies that assessed associations between putative environmental risk factors and BD. For each meta-analysis, we estimated its summary effect size by means of both random- and fixed-effects models, 95% confidence intervals (CIs), the 95% prediction interval, and heterogeneity. Evidence of small-study effects and excess of significance bias was also assessed. Sixteen publications met the inclusion criteria (seven meta-analyses and nine qualitative systematic reviews). Fifty-one unique environmental risk factors for BD were evaluated. Six meta-analyses investigated associations with a risk factor for BD. Only irritable bowel syndrome (IBS) emerged as a risk factor for BD supported by convincing evidence (k=6; odds ratio [OR]=2.48; 95% CI=2.35-2.61; P<.001), and childhood adversity was supported by highly suggestive evidence. Asthma and obesity were risk factors for BD supported by suggestive evidence, and seropositivity to Toxoplasma gondii and a history of head injury were supported by weak evidence. Notwithstanding that several environmental risk factors for BD were identified, few meta-analyses of observational studies were available. Therefore, further well-designed and adequately powered studies are necessary to map the environmental risk factors for BD. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  2. Risk-based microbiological criteria for Campylobacter in broiler meat: A comparison of two approaches

    DEFF Research Database (Denmark)

    Nauta, Maarten; Andersen, Jens Kirk; Tuominen, Pirkko

    2015-01-01

    Risk-based microbiological criteria can offer a tool to control Campylobacter in the broiler meat production chain. Recently two approaches have been applied to derive such criteria and to analyse their potential impact in terms of human health risk reduction: the risk-based version...

  3. Impact of cardiovascular risk factors on medical expenditure: evidence from epidemiological studies analysing data on health checkups and medical insurance.

    Science.gov (United States)

    Nakamura, Koshi

    2014-01-01

    Concerns have increasingly been raised about the medical economic burden in Japan, of which approximately 20% is attributable to cardiovascular disease, including coronary heart disease and stroke. Because the management of risk factors is essential for the prevention of cardiovascular disease, it is important to understand the relationship between cardiovascular risk factors and medical expenditure in the Japanese population. However, only a few Japanese epidemiological studies analysing data on health checkups and medical insurance have provided evidence on this topic. Patients with cardiovascular risk factors, including obesity, hypertension, and diabetes, may incur medical expenditures through treatment of the risk factors themselves and through procedures for associated diseases that usually require hospitalization and sometimes result in death. Untreated risk factors may cause medical expenditure surges, mainly due to long-term hospitalization, more often than risk factors preventively treated by medication. On an individual patient level, medical expenditures increase with the number of concomitant cardiovascular risk factors. For single risk factors, personal medical expenditure may increase with the severity of that factor. However, on a population level, the medical economic burden attributable to cardiovascular risk factors results largely from a single, particularly prevalent risk factor, especially from mildly-to-moderately abnormal levels of the factor. Therefore, cardiovascular risk factors require management on the basis of both a cost-effective strategy of treating high-risk patients and a population strategy for reducing both the ill health and medical economic burdens that result from cardiovascular disease.

  4. Complementary Exploratory and Confirmatory Factor Analyses of the French WISC-V: Analyses Based on the Standardization Sample.

    Science.gov (United States)

    Lecerf, Thierry; Canivez, Gary L

    2017-12-28

    Interpretation of the French Wechsler Intelligence Scale for Children-Fifth Edition (French WISC-V; Wechsler, 2016a) is based on a 5-factor model including Verbal Comprehension (VC), Visual Spatial (VS), Fluid Reasoning (FR), Working Memory (WM), and Processing Speed (PS). Evidence for the French WISC-V factorial structure was established exclusively through confirmatory factor analyses (CFAs). However, as recommended by Carroll (1995); Reise (2012), and Brown (2015), factorial structure should derive from both exploratory factor analysis (EFA) and CFA. The first goal of this study was to examine the factorial structure of the French WISC-V using EFA. The 15 French WISC-V primary and secondary subtest scaled scores intercorrelation matrix was used and factor extraction criteria suggested from 1 to 4 factors. To disentangle the contribution of first- and second-order factors, the Schmid and Leiman (1957) orthogonalization transformation (SLT) was applied. Overall, no EFA evidence for 5 factors was found. Results indicated that the g factor accounted for about 67% of the common variance and that the contributions of the first-order factors were weak (3.6 to 11.9%). CFA was used to test numerous alternative models. Results indicated that bifactor models produced better fit to these data than higher-order models. Consistent with previous studies, findings suggested dominance of the general intelligence factor and that users should thus emphasize the Full Scale IQ (FSIQ) when interpreting the French WISC-V. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
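
    The apportioning of common variance reported above rests on the Schmid-Leiman transformation, sketched here for a hypothetical higher-order solution; the loading matrix and second-order loadings are invented placeholders, not French WISC-V values.

```python
# Schmid-Leiman orthogonalization of a higher-order factor solution.
import numpy as np

# first-order pattern loadings: 8 subtests x 2 group factors (hypothetical)
lambda_1 = np.array([
    [0.75, 0.05], [0.70, 0.10], [0.65, 0.00], [0.60, 0.15],
    [0.05, 0.70], [0.10, 0.72], [0.00, 0.68], [0.12, 0.60],
])
gamma = np.array([0.85, 0.80])        # loadings of the group factors on g (hypothetical)

g_loadings = lambda_1 @ gamma                          # subtest loadings on g
group_loadings = lambda_1 * np.sqrt(1.0 - gamma**2)    # residualized group loadings

ss_g = np.sum(g_loadings**2)
ss_groups = np.sum(group_loadings**2, axis=0)
common = ss_g + ss_groups.sum()
print(f"general factor: {ss_g / common:.1%} of common variance")
for i, ss in enumerate(ss_groups, start=1):
    print(f"group factor {i}: {ss / common:.1%} of common variance")
```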

  5. Risk based inspection for atmospheric storage tank

    Science.gov (United States)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack that occurs on a metallic material as a result of reaction with its environment. It causes atmospheric storage tank leakage, material loss, environmental pollution and equipment failure, affects the age of process equipment, and finally leads to financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant for operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries was discussed prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, was discussed. The corrosion risk analysis outcome was used to formulate a Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI directs most inspection resources to `High Risk' and `Medium Risk' items and fewer to `Low Risk' shells. Risk categories of the evaluated equipment were illustrated through the case study analysis outcome.

  6. Direct comparison of risk-adjusted and non-risk-adjusted CUSUM analyses of coronary artery bypass surgery outcomes.

    Science.gov (United States)

    Novick, Richard J; Fox, Stephanie A; Stitt, Larry W; Forbes, Thomas L; Steiner, Stefan

    2006-08-01

    We previously applied non-risk-adjusted cumulative sum methods to analyze coronary bypass outcomes. The objective of this study was to assess the incremental advantage of risk-adjusted cumulative sum methods in this setting. Prospective data were collected in 793 consecutive patients who underwent coronary bypass grafting performed by a single surgeon during a period of 5 years. The composite occurrence of an "adverse outcome" included mortality or any of 10 major complications. An institutional logistic regression model for adverse outcome was developed by using 2608 contemporaneous patients undergoing coronary bypass. The predicted risk of adverse outcome in each of the surgeon's 793 patients was then calculated. A risk-adjusted cumulative sum curve was then generated after specifying control limits and odds ratio. This risk-adjusted curve was compared with the non-risk-adjusted cumulative sum curve, and the clinical significance of this difference was assessed. The surgeon's adverse outcome rate was 96 of 793 (12.1%) versus 270 of 1815 (14.9%) for all the other institution's surgeons combined (P = .06). The non-risk-adjusted curve reached below the lower control limit, signifying excellent outcomes between cases 164 and 313, 323 and 407, and 667 and 793, but transgressed the upper limit between cases 461 and 478. The risk-adjusted cumulative sum curve never transgressed the upper control limit, signifying that cases preceding and including 461 to 478 were at an increased predicted risk. Furthermore, if the risk-adjusted cumulative sum curve was reset to zero whenever a control limit was reached, it still signaled a decrease in adverse outcome at 166, 653, and 782 cases. Risk-adjusted cumulative sum techniques provide incremental advantages over non-risk-adjusted methods by not signaling a decrement in performance when preoperative patient risk is high.
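
    The risk-adjusted chart described above can be sketched with a log-likelihood-ratio CUSUM in the style of Steiner and colleagues, in which each case is weighted by its predicted risk; the predicted risks, simulated outcomes, odds-ratio target and control limit below are placeholders, not the study's data or design values.

```python
# Risk-adjusted CUSUM testing for a doubling of the odds of an adverse outcome.
import math
import random

random.seed(0)
R_A = 2.0        # odds ratio to be detected
h = 4.5          # upper control limit (assumption; set by simulation in practice)

def llr_weight(p, adverse):
    """Log-likelihood-ratio contribution of one case with predicted risk p."""
    denom = 1.0 - p + R_A * p
    return math.log(R_A / denom) if adverse else math.log(1.0 / denom)

s = 0.0
for case in range(1, 301):
    p = random.uniform(0.05, 0.30)            # predicted (logistic-model) risk
    adverse = random.random() < p             # simulated in-control performance
    s = max(0.0, s + llr_weight(p, adverse))
    if s > h:
        print(f"signal of worsening performance at case {case} (CUSUM = {s:.2f})")
        s = 0.0                                # reset after a signal
```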

  7. Risk Based Optimal Fatigue Testing

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M.H.; Kroon, I.B.

    1992-01-01

    Optimal fatigue life testing of materials is considered. Based on minimization of the total expected costs of a mechanical component a strategy is suggested to determine the optimal stress range levels for which additional experiments are to be performed together with an optimal value...

  8. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

    Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. The effect sizes with 95% confidence intervals (CI) were calculated for outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) and success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.

  9. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST funded research project CORAS, where Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to perform a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  10. WTS - Risk Based Resource Targeting (RBRT) -

    Data.gov (United States)

    Department of Transportation — The Risk Based Resource Targeting (RBRT) application supports a new SMS-structured process designed to focus on safety oversight of systems and processes rather than...

  11. All India survey for analyses of colors in sweets and savories: exposure risk in Indian population.

    Science.gov (United States)

    Dixit, Sumita; Khanna, Subhash K; Das, Mukul

    2013-04-01

    In the present study, an attempt has been made to assess exposure to food colors through 2 major groups, sweets and savories, at a national level, so as to evolve a scientific yardstick for fixing levels of colors in commodities based on technological and safety requirements. A vast majority of colored food commodities (83.6%) were found to employ permitted colors, confirming a marked decline in the trend of use of nonpermitted colors (NPCs). Of the 4 zones of India, the East zone showed the maximum adulteration (80.3%), both by exceeding the prescribed limits of permitted colors (72.3%) and by the use of NPCs (28.7%). Tartrazine was the most popular color on the permitted list, with levels ranging from 12.5 to 1091 mg/kg. Rhodamine B was the most prevalent dye in the NPCs group. On the basis of average consumption of food commodities and average levels of detected colors, the intake of Sunset Yellow FCF reaches up to a maximum of 47.8% of the acceptable daily intake limit in children, which is a cause of concern. The uniform maximum permissible limit of synthetic colors at 100 mg/kg under the Indian rules thus needs to be reviewed and should rather be governed by the technological necessity and the consumption profiles of food commodities, so that the vulnerable population is not unnecessarily exposed to excessive amounts of synthetic colors that could pose health risks. © 2013 Institute of Food Technologists®
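
    The "% of acceptable daily intake" figure rests on a simple exposure calculation, sketched below; the consumption levels, colour concentrations, body weight and the ADI value used are illustrative assumptions, not the survey's data.

```python
# Daily colour intake summed over food groups, per kg body weight, versus the ADI.
ADI_MG_PER_KG_BW = 4.0                     # ADI assumed here for illustration
body_weight_kg = 20.0                      # a child (assumption)

foods = {
    # food group: (daily consumption in g, colour concentration in mg/kg)
    "sweets":   (50.0, 150.0),
    "savories": (30.0, 100.0),
    "beverage": (200.0, 20.0),
}

intake_mg = sum(g / 1000.0 * conc for g, conc in foods.values())
intake_per_kg_bw = intake_mg / body_weight_kg
print(f"intake: {intake_per_kg_bw:.2f} mg/kg bw/day "
      f"= {intake_per_kg_bw / ADI_MG_PER_KG_BW:.0%} of the ADI")
```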

  12. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)
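
    The splitting of exergy destruction into avoidable and unavoidable parts, central to the advanced analyses mentioned above, can be sketched as follows; the stream values and the unavoidable ratio are placeholders, not results for the LNG plant.

```python
# Splitting a component's exergy destruction E_D = E_F - E_P into unavoidable and
# avoidable parts, using the (E_D/E_P) ratio of the best conceivable operating case.
def split_exergy_destruction(e_fuel, e_product, ed_per_ep_unavoidable):
    """Return (E_D, E_D_unavoidable, E_D_avoidable) in the same units as the inputs."""
    e_destruction = e_fuel - e_product
    e_d_unavoidable = e_product * ed_per_ep_unavoidable   # (E_D/E_P)^UN from best case
    e_d_avoidable = e_destruction - e_d_unavoidable
    return e_destruction, e_d_unavoidable, e_d_avoidable

# hypothetical expander in an LNG regasification/power train
e_d, e_d_un, e_d_av = split_exergy_destruction(
    e_fuel=12.0,                    # MW of fuel exergy supplied to the component
    e_product=9.5,                  # MW of product exergy delivered
    ed_per_ep_unavoidable=0.12,     # unavoidable E_D per unit of E_P (assumption)
)
print(f"E_D = {e_d:.2f} MW, unavoidable = {e_d_un:.2f} MW, avoidable = {e_d_av:.2f} MW")
```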

  13. Tenants at-risk-of-poverty induced by housing expenditure – exploratory analyses with EU-SILC

    NARCIS (Netherlands)

    Haffner, M.E.A.; Dol, C.P.; Heylen, K.

    2014-01-01

    Combating poverty and social exclusion is a core policy issue in the European Union (EU). The Statistics on Income and Living Conditions (EU-SILC) database facilitates analyses of the extent of poverty and social exclusion. One of the indicators built from the database is the at-risk-of-poverty

  14. Dietary supplement use and colorectal cancer risk: A systematic review and meta-analyses of prospective cohort studies

    NARCIS (Netherlands)

    Heine-Bröring, R.C.; Winkels, R.M.; Renkema, J.M.S.; Kragt, L.; Orten-Luiten, van A.C.B.; Tigchelaar, E.F.; Chan, D.S.M.; Norat, T.; Kampman, E.

    2015-01-01

    Use of dietary supplements is rising in countries where colorectal cancer is prevalent. We conducted a systematic literature review and meta-analyses of prospective cohort studies on dietary supplement use and colorectal cancer risk. We identified relevant studies in Medline, Embase and Cochrane up

  15. Econometric analyses of microfinance credit group formation, contractual risks and welfare impacts in Northern Ethiopia

    NARCIS (Netherlands)

    Berhane Tesfay, G.

    2009-01-01

    Key words
    Microfinance, joint liability, contractual risk, group formation, risk-matching, impact evaluation, Panel data econometrics, dynamic panel probit, trend models, fixed-effects, composite counterfactuals, propensity score matching, farm households, Ethiopia.

    Lack of

  16. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Larry (Fluer, Inc., Paso Robles, CA); Middleton, Bobby

    2009-03-01

    The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  17. Medicine and ionizing rays: a help sheet in analysing risks in radiotherapy and applicable texts

    International Nuclear Information System (INIS)

    Gauron, C.

    2007-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of external radiotherapy. In the first part, several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the exposure level determination, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment. A second part indicates the various applicable legal and regulatory texts (European directives, law and decrees published by public authorities, and texts concerning the general principles in radioprotection, worker protection, specialists, medical devices, nuclear medicine and radiology)

  18. Meta-analyses on behavioral interventions to reduce the risk of transmission of HIV.

    Science.gov (United States)

    Vergidis, Paschalis I; Falagas, Matthew E

    2009-06-01

    Different behavioral interventions have been found to be efficacious in reducing high-risk sexual activity. Interventions have been evaluated in both original research and meta-analytic reviews. Most of the studies have shown that interventions are efficacious among different study populations. In adolescents, both in-classroom and out-of-classroom interventions showed a decrease in the risk of unprotected sex. In African Americans, greater efficacy was found for interventions including peer education. For Latinos, the effect was larger in gender-segmented interventions. Geographic and social isolation are barriers to approaching MSM. For IDUs, interventions provided within a treatment program have an impact on risk reduction above that produced by drug treatment alone. Finally, people diagnosed with HIV tend to reduce their sexual risk behavior. However, adherence to safe sex practices for life can be challenging. Relentless efforts to implement behavioral interventions that decrease high-risk behavior are necessary to decrease HIV transmission.

  19. Coastal and river flood risk analyses for guiding economically optimal flood adaptation policies: a country-scale study for Mexico

    Science.gov (United States)

    Haer, Toon; Botzen, W. J. Wouter; van Roomen, Vincent; Connor, Harry; Zavala-Hidalgo, Jorge; Eilander, Dirk M.; Ward, Philip J.

    2018-06-01

    Many countries around the world face increasing impacts from flooding due to socio-economic development in flood-prone areas, which may be enhanced in intensity and frequency as a result of climate change. With increasing flood risk, it is becoming more important to be able to assess the costs and benefits of adaptation strategies. To guide the design of such strategies, policy makers need tools to prioritize where adaptation is needed and how much adaptation funds are required. In this country-scale study, we show how flood risk analyses can be used in cost-benefit analyses to prioritize investments in flood adaptation strategies in Mexico under future climate scenarios. Moreover, given the often limited availability of detailed local data for such analyses, we show how state-of-the-art global data and flood risk assessment models can be applied for a detailed assessment of optimal flood-protection strategies. Our results show that especially states along the Gulf of Mexico have considerable economic benefits from investments in adaptation that limit risks from both river and coastal floods, and that increased flood-protection standards are economically beneficial for many Mexican states. We discuss the sensitivity of our results to modelling uncertainties, the transferability of our modelling approach and policy implications. This article is part of the theme issue `Advances in risk assessment for climate change adaptation policy'.
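
    As a rough illustration of the calculation underlying such a prioritization, the sketch below estimates expected annual damage (EAD) by integrating event damages over their exceedance probabilities and compares the EAD before and after raising a protection standard; the return periods, damages and trapezoidal integration are illustrative assumptions, not values or methods taken from the study.

        # Illustrative sketch: expected annual damage (EAD) under two
        # flood-protection standards. All figures are invented.
        def expected_annual_damage(return_periods, damages, protection_rp=1):
            """Trapezoidal integration of damage over exceedance probability.
            Events with return periods up to the protection standard are
            assumed to cause no damage."""
            pairs = sorted(zip(return_periods, damages))
            pts = [(1.0 / rp, 0.0 if rp <= protection_rp else d) for rp, d in pairs]
            pts.sort(reverse=True)                       # descending probability
            return sum(0.5 * (d1 + d2) * (p1 - p2)
                       for (p1, d1), (p2, d2) in zip(pts, pts[1:]))

        rps = [2, 10, 50, 100, 500]                      # return periods (years)
        dmg = [5e6, 40e6, 120e6, 200e6, 450e6]           # damage per event (USD)

        ead_current = expected_annual_damage(rps, dmg, protection_rp=10)
        ead_upgraded = expected_annual_damage(rps, dmg, protection_rp=100)
        print(f"EAD now: {ead_current:,.0f}  after upgrade: {ead_upgraded:,.0f}")
        print(f"Annual benefit of upgrade: {ead_current - ead_upgraded:,.0f}")

    In a cost-benefit setting, this annual benefit would then be discounted over the lifetime of the protection works and compared with investment and maintenance costs.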

  20. Risk Classification and Risk-based Safety and Mission Assurance

    Science.gov (United States)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate for their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  1. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit especially railroads. Catchment area analyses are GIS-based buffer and overlay...... analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach where catchment areas are circular. A more detailed approach is the Service Area Approach where catchment areas are determined by a street network...... search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level...
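
    A minimal sketch of the Circular Buffer Approach, using the shapely library, is given below; the station coordinates, buffer radius and resident points are invented, and the Service Area Approach would replace the circular buffer with a polygon obtained from a street-network search (e.g. with a routing library).

        # Circular Buffer Approach: buffer each stop and count how many
        # residential points fall inside. Projected coordinates in metres;
        # all data are illustrative.
        from shapely.geometry import Point

        stations = {"Central": Point(0, 0), "North": Point(0, 1500)}
        radius_m = 600                                   # assumed walking distance

        # Each point stands for, say, 100 inhabitants
        residents = [Point(x, y) for x in range(-1000, 1001, 200)
                                 for y in range(-1000, 2001, 200)]

        for name, stop in stations.items():
            catchment = stop.buffer(radius_m)            # circular catchment area
            covered = sum(catchment.contains(p) for p in residents)
            print(f"{name}: {covered * 100} potential travellers within "
                  f"{radius_m} m straight-line distance")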

  2. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies

    DEFF Research Database (Denmark)

    Gyöngyösi, Mariann; Wojakowski, Wojciech; Navarese, Eliano P

    2016-01-01

    In contrast to multiple publication-based meta-analyses involving clinical cardiac regeneration therapy in patients with recent myocardial infarction, a recently published meta-analysis based on individual patient data reported no effect of cell therapy on left ventricular function or clinical...

  3. Risk-based and deterministic regulation

    International Nuclear Information System (INIS)

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  4. Risk-based remediation: Approach and application

    International Nuclear Information System (INIS)

    Frishmuth, R.A.; Benson, L.A.

    1995-01-01

    The principal objective of remedial actions is to protect human health and the environment. Risk assessments are the only defensible tools available to demonstrate to the regulatory community and public that this objective can be achieved. Understanding the actual risks posed by site-related contamination is crucial to designing cost-effective remedial strategies. All too often remedial actions are overdesigned, resulting in little to no increase in risk reduction while increasing project cost. Risk-based remedial actions have recently been embraced by federal and state regulators, industry, government, the scientific community, and the public as a mechanism to implement rapid and cost-effective remedial actions. Emphasizing risk reduction, rather than adherence to ambiguous and generic standards, ensures that only remedial actions required to protect human health and the environment at a particular site are implemented. Two sites are presented as case studies on how risk-based approaches are being used to remediate two petroleum hydrocarbon contaminated sites. The sites are located at two US Air Force Bases, Wurtsmith Air Force Base (AFB) in Oscoda, Michigan and Malmstrom AFB in Great Falls, Montana

  5. Disentangling the effects of forage, social rank, and risk on movement autocorrelation of elephants using Fourier and wavelet analyses.

    Science.gov (United States)

    Wittemyer, George; Polansky, Leo; Douglas-Hamilton, Iain; Getz, Wayne M

    2008-12-09

    The internal state of an individual, as it relates to thirst, hunger, fear, or reproductive drive, can be inferred by referencing points on its movement path to external environmental and sociological variables. Using time-series approaches to characterize autocorrelative properties of step-length movements collated every 3 h for seven free-ranging African elephants, we examined the influence of social rank, predation risk, and seasonal variation in resource abundance on periodic properties of movement. The frequency domain methods of Fourier and wavelet analyses provide compact summaries of temporal autocorrelation and show both strong diurnal and seasonal periodicities in the step-length time series. This autocorrelation is weaker during the wet season, indicating random movements are more common when ecological conditions are good. Periodograms of socially dominant individuals are consistent across seasons, whereas subordinate individuals show distinct differences, diverging from those of dominants during the dry season. We link temporally localized statistical properties of movement to landscape features and find that diurnal movement correlation is more common within protected wildlife areas, and multiday movement correlations found among lower ranked individuals are typically outside of protected areas where predation risks are greatest. A frequency-related spatial analysis of movement-step lengths reveals that rest cycles related to the spatial distribution of critical resources (i.e., forage and water) are responsible for creating the observed patterns. Our approach generates unique information regarding the spatial-temporal interplay between environmental and individual characteristics, providing an original approach for understanding the movement ecology of individual animals and the spatial organization of animal populations.
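
    To make the frequency-domain idea concrete, the toy sketch below computes a periodogram of a synthetic step-length series sampled every 3 h and recovers its diurnal peak; the data generation and the scipy call are purely illustrative and do not reproduce the authors' analysis pipeline.

        # Toy periodogram of a step-length series sampled every 3 hours,
        # illustrating how a diurnal cycle shows up in the frequency domain.
        import numpy as np
        from scipy.signal import periodogram

        rng = np.random.default_rng(0)
        samples_per_day, days = 8, 90                    # 3-hour sampling
        t = np.arange(days * samples_per_day)

        # Synthetic step lengths (km): diurnal cycle plus noise
        steps = 2.0 + 1.5 * np.sin(2 * np.pi * t / samples_per_day) \
                + rng.normal(0, 0.5, t.size)

        freqs, power = periodogram(steps, fs=samples_per_day)   # cycles per day
        peak = freqs[np.argmax(power[1:]) + 1]                  # skip zero frequency
        print(f"Dominant period: {1 / peak:.2f} days")          # ~1.0 expected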

  6. Cystatin C and Risk of Diabetes and the Metabolic Syndrome - Biomarker and Genotype Association Analyses.

    Directory of Open Access Journals (Sweden)

    Martin Magnusson

    Full Text Available We recently reported a relationship between plasma levels of cystatin C and incidence of the metabolic syndrome (MetS) among the first 2,369 subjects who participated in the re-examination study of the population-based Malmö Diet and Cancer Cardiovascular cohort (MDC-CC-re-exam). In this study we aimed to replicate these results and also investigate whether cystatin C was causally associated with MetS and diabetes. We estimated the effect size of the strongest GWAS-derived cystatin C SNP (major allele of rs13038305) on plasma cystatin C in the now completed MDC-CC-re-exam (n = 3,734) and thereafter examined the association of plasma cystatin C (403 cases of diabetes and 2665 controls) as well as rs13038305 (235 cases and 2425 controls) with incident diabetes. The association of rs13038305 with incident MetS (511 cases of MetS and 1980 controls) was similarly investigated in the whole MDC-CC-re-exam. We also attempted to replicate our previously shown association of cystatin C with incident MetS in subjects from the MDC-CC-re-exam (147 cases and 711 controls) that were not included in our previous report. In the entire MDC-CC-re-exam, each copy of the major allele of rs13038305 was associated with approximately 0.30 standard deviation (SD) higher plasma concentration of cystatin C (β = 0.33, p = 4.2E-28) in age- and sex-adjusted analysis. Cystatin C in plasma was not associated with incident diabetes after adjustment for known diabetes risk factors (OR per 1 SD increment 0.99 (0.86-1.13), p = 0.842). In the replication cohort of the MDC-CC-re-exam, the OR (95% CI) for incident MetS in subjects belonging to quartiles 1, 2, 3 and 4 of plasma cystatin C levels was 1.00 (reference), 1.21 (0.70-2.07), 1.62 (0.95-2.78) and 1.72 (1.01-2.93) (p-trend = 0.026) in age- and sex-adjusted analysis. In the entire MDC-CC-re-exam the odds ratio for incident MetS and diabetes per copy of the major rs13038305 allele was 1.13 (0.95-1.34), p = 0.160, and 1.07, 95% CI 0.89-1.30, p = 0

  7. Evaluating the Investment Benefit of Multinational Enterprises' International Projects Based on Risk Adjustment: Evidence from China

    Science.gov (United States)

    Chen, Chong

    2016-01-01

    This study examines the international risks faced by multinational enterprises to understand their impact on the evaluation of investment projects. Moreover, it establishes a 'three-dimensional' theoretical framework of risk identification to analyse the composition of international risk indicators of multinational enterprises based on the theory…

  8. Medicine and ionizing rays: a help sheet in analysing risks in nuclear medicine

    International Nuclear Information System (INIS)

    Gauron, C.

    2006-01-01

    This document first proposes the various applicable legal and regulatory texts concerning radioprotection in the medical sector (European directives, institutions in charge of radioprotection, general arrangements, regulatory texts concerning worker protection against ionizing radiations, personnel specialized in medical radio-physics, electro-radiology operators, quality control of medical devices, and nuclear medicine and radiology). The second part proposes a synthesis of useful knowledge for radioprotection in the case of nuclear medicine when performing in vivo diagnosis, positron emission tomography (PET) being excluded. Several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment. The next parts present the same kind of information but for positron emission tomography (PET) with fluorine 18, for therapeutic practice without hospitalization (iodine 131 activity less than 740 MBq), for therapeutic practice with hospitalization (iodine 131 activity greater than 740 MBq), and for taking patients into care after treatment in a nuclear medicine unit (in this last case, legal and regulatory information focuses on patients)

  9. CHANGES SDSS: the development of a Spatial Decision Support System for analysing changing hydro-meteorological risk

    Science.gov (United States)

    van Westen, Cees; Bakker, Wim; Zhang, Kaixi; Jäger, Stefan; Assmann, Andre; Kass, Steve; Andrejchenko, Vera; Olyazadeh, Roya; Berlin, Julian; Cristal, Irina

    2014-05-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES (www.changes-itn.eu) and the EU FP7 Copernicus project INCREO (http://www.increo-fp7.eu), a spatial decision support system is under development with the aim of analysing the effect of risk reduction planning alternatives on reducing the risk now and in the future, and of supporting decision makers in selecting the best alternatives. The Spatial Decision Support System will be composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps, and existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if the measure is implemented, and a cost-benefit (or cost-effectiveness/Spatial Multi Criteria Evaluation) component to compare the alternatives and make a decision on the optimal one. The third component of the SDSS is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps, but also in other forms (risk

  10. Establishing sitewide risk perspectives due to cumulative impacts from AB, EP, and NEPA hazard analyses

    International Nuclear Information System (INIS)

    Olinger, S.J.

    1998-06-01

    With the end of the Cold War in 1992, the mission for the Rocky Flats Environmental Technology Site (Site) was changed from production of nuclear weapon components to special nuclear materials (SNM) and waste management, accelerated cleanup, reuse and closure of the Site. This change in mission presents new hazards and risk management challenges. With today's shrinking DOE budget, a balance needs to be achieved between controlling those hazards related to SNM and waste management and interim storage, and those hazards related to accelerated closure of the Site involving deactivation, decontamination, and decommissioning (DD and D) of surplus nuclear facilities. This paper discusses how risk assessments of normal operations and potential accidents have provided insights on the risks of current operations and planned closure activities

  11. Risk and safety analyses for disposal of alpha-contaminated waste in INEL

    International Nuclear Information System (INIS)

    Smith, T.

    1982-01-01

    The author first discusses the context, objectives, and scope of the risk analysis. Then he gives some background on the waste and how it is managed, including the alternatives for long-term management. These are followed by the risk evaluation approach, results, and 7 conclusions and problems. One of his conclusions is that a 100 nCi/g limit would provide adequate safety margins. Raising the limit to 100 nCi/g would allow about 20% of the stored waste to be diverted to near-surface disposal. He added that analyzing waste packages at 10 nCi/g is not now practical. 21 figures

  12. A systematic review of meta-analyses on gene polymorphisms and gastric cancer risk

    NARCIS (Netherlands)

    F. Gianfagna (Francesco); E. de Feo (Emma); C.M. van Duijn (Cornelia); G. Ricciardi (Gualtiero); S. Boccia (Stefania)

    2008-01-01

    textabstractBackground: Individual variations in gastric cancer risk have been associated in the last decade with specific variant alleles of different genes that are present in a significant proportion of the population. Polymorphisms may modify the effects of environmental exposures, and these

  13. Assumptions in quantitative analyses of health risks of overhead power lines

    NARCIS (Netherlands)

    de Jong, A.; Wardekker, J.A.; van der Sluijs, J.P.

    2012-01-01

    One of the major issues hampering the formulation of uncontested policy decisions on contemporary risks is the presence of uncertainties in various stages of the policy cycle. In literature, different lines are suggested to address the problem of provisional and uncertain evidence. Reflective

  14. Climate analyses to assess risks from invasive forest insects: Simple matching to advanced models

    Science.gov (United States)

    Robert C. Venette

    2017-01-01

    Purpose of Review. The number of invasive alien insects that adversely affect trees and forests continues to increase as do associated ecological, economic, and sociological impacts. Prevention strategies remain the most cost-effective approach to address the issue, but risk management decisions, particularly those affecting international trade,...

  15. Analysing Production Technology and Risk in Organic and Conventional Dutch Arable Farming using Panel Data

    NARCIS (Netherlands)

    Gardebroek, C.; Chavez Clemente, M.D.; Oude Lansink, A.G.J.M.

    2010-01-01

    Abstract This paper compares the production technology and production risk of organic and conventional arable farms in the Netherlands. Just–Pope production functions that explicitly account for output variability are estimated using panel data of Dutch organic and conventional farms. Prior

  16. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current generation and future design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of or in combination with deterministic methods in developing regulatory requirements and reaching regulatory decisions.

  17. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

    The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent in standard analysers. The analyser allows high transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest resolution imaging XPS with monochromated laboratory X-ray sources, is outlined and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα-source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultra-violet light show the broad field of applications from imaging of core level electrons with chemical shift identification, high resolution threshold photoelectron emission microscopy (PEEM), work function imaging and band structure imaging.

  18. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies.

    Science.gov (United States)

    Nieuwenhuijsen, Mark J; Dadvand, Payam; Grellier, James; Martinez, David; Vrijheid, Martine

    2013-01-15

    Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyls (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and Agent Orange and some congenital anomalies. The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  19. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies

    Directory of Open Access Journals (Sweden)

    Nieuwenhuijsen Mark J

    2013-01-01

    Full Text Available Abstract Background Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. Methods We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. Results The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyls (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and Agent Orange and some congenital anomalies. Conclusions The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  20. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
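
    The placement loop described above can be sketched as a greedy heuristic: at each step, pick the candidate location that detects the most still-undetected risk, remove that risk, and report the sensor's marginal utility as a fraction of total risk. The threat risks and coverage sets below are invented, and the greedy loop is a simplification of the iterative dynamic programming algorithm used in the paper.

        # Simplified risk-based sensor placement: greedily select the
        # location detecting the most remaining risk. Data are illustrative.
        risk = {"T1": 120.0, "T2": 80.0, "T3": 60.0, "T4": 40.0}   # population at risk
        coverage = {                                     # threats detected per location
            "A": {"T1", "T2"},
            "B": {"T2", "T3"},
            "C": {"T3", "T4"},
            "D": {"T1", "T4"},
        }

        total_risk = sum(risk.values())
        undetected = set(risk)
        sensor_budget = 3

        while undetected and sensor_budget > 0:
            gain = {loc: sum(risk[t] for t in cov & undetected)
                    for loc, cov in coverage.items()}
            best = max(gain, key=gain.get)
            if gain[best] == 0:                          # alternative stop rule:
                break                                    # a marginal-utility threshold
            undetected -= coverage[best]
            sensor_budget -= 1
            print(f"Place sensor at {best}: marginal utility {gain[best] / total_risk:.0%}")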

  1. Risk-based regulation: A regulatory perspective

    International Nuclear Information System (INIS)

    Scarborough, J.C.

    1993-01-01

    In the early development of regulations for nuclear power plants, risk was implicitly considered through qualitative assessments and engineering reliability principles and practices. Examples included worst case analysis, defense in depth, and the single failure criterion. However, the contributions of various systems, structures, components and operator actions to plant safety were not explicitly assessed since a methodology for this purpose had not been developed. As a consequence of the TMI accident, the use of more quantitative risk methodology and information in regulation, such as probabilistic risk analysis (PRA), increased. The use of both qualitative and quantitative consideration of risk in regulation has resulted in a set of regulations, regulatory guides and practices that ensure adequate protection of public health and safety. Presently, PRA techniques have developed to the point that safety goals, expressed in terms of risk, have been established to help guide further regulatory decision making. This paper presents the personal opinions of the author as regards the use of risk today in nuclear power plant regulation, areas of further information needs, and necessary plans for moving toward a more systematic use of risk-based information in regulatory initiatives in the future

  2. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    Science.gov (United States)

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the considered example data, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Moreover, for some previously inconclusive meta-analyses, a study update might yield a statistically significant kidney injury risk increase associated with higher statin exposure. The illustrated contour approach should become a standard tool for the assessment of the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Does mitigation save? Reviewing cost-benefit analyses of disaster risk reduction

    OpenAIRE

    Shreve, Cheney M.; Kelman, Ilan

    2014-01-01

    The benefit-cost-ratio (BCR), used in cost-benefit analysis (CBA), is an indicator that attempts to summarize the overall value for money of a project. Disaster costs continue to rise and the demand has increased to demonstrate the economic benefit of disaster risk reduction (DRR) to policy makers. This study compiles and compares original CBA case studies reporting DRR BCRs, without restrictions as to hazard type, location, scale, or other parameters. Many results were identified supporting ...
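
    As a reminder of what the indicator expresses, a minimal sketch of a DRR benefit-cost ratio computation under assumed cash flows and discount rate is given below; all numbers are invented and none of the reviewed case studies are reproduced.

        # Benefit-cost ratio for a disaster risk reduction measure:
        # discounted avoided losses divided by the up-front cost.
        # All values are illustrative.
        upfront_cost = 2_000_000.0          # mitigation investment
        avoided_annual_loss = 250_000.0     # expected damage avoided per year
        lifetime_years = 30
        discount_rate = 0.04

        pv_benefits = sum(avoided_annual_loss / (1 + discount_rate) ** t
                          for t in range(1, lifetime_years + 1))
        bcr = pv_benefits / upfront_cost
        print(f"BCR = {bcr:.2f} (> 1 means the measure pays off under these assumptions)")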

  4. Negotiation Decision Support Systems: Analysing Negotiations under the Conditions of Risk

    OpenAIRE

    Nipun Agarwal

    2014-01-01

    Negotiation Theory is a research area drawing on three different research streams: game theory, psychology and negotiation analysis. Recently, negotiation theory research has moved towards the combination of game theory and psychology negotiation theory models that could be called Integrated Negotiation Theory (INT). As negotiations are often impacted by external factors, there is risk associated with achieving the expected outcomes. Prospect theory and Negotiation theory are co...

  5. Boxing and mixed martial arts: preliminary traumatic neuromechanical injury risk analyses from laboratory impact dosage data.

    Science.gov (United States)

    Bartsch, Adam J; Benzel, Edward C; Miele, Vincent J; Morr, Douglas R; Prakash, Vikas

    2012-05-01

    In spite of ample literature pointing to rotational and combined impact dosage being key contributors to head and neck injury, boxing and mixed martial arts (MMA) padding is still designed to primarily reduce cranium linear acceleration. The objectives of this study were to quantify preliminary linear and rotational head impact dosage for selected boxing and MMA padding in response to hook punches; compute theoretical skull, brain, and neck injury risk metrics; and statistically compare the protective effect of various glove and head padding conditions. An instrumented Hybrid III 50th percentile anthropomorphic test device (ATD) was struck in 54 pendulum impacts replicating hook punches at low (27-29 J) and high (54-58 J) energy. Five padding combinations were examined: unpadded (control), MMA glove-unpadded head, boxing glove-unpadded head, unpadded pendulum-boxing headgear, and boxing glove-boxing headgear. A total of 17 injury risk parameters were measured or calculated. All padding conditions reduced linear impact dosage. Other parameters significantly decreased, significantly increased, or were unaffected depending on padding condition. Of real-world conditions (MMA glove-bare head, boxing glove-bare head, and boxing glove-headgear), the boxing glove-headgear condition showed the most meaningful reduction in most of the parameters. In equivalent impacts, the MMA glove-bare head condition induced higher rotational dosage than the boxing glove-bare head condition. Finite element analysis indicated a risk of brain strain injury in spite of significant reduction of linear impact dosage. In the replicated hook punch impacts, all padding conditions reduced linear but not rotational impact dosage. Head and neck dosage theoretically accumulates fastest in MMA and boxing bouts without use of protective headgear. The boxing glove-headgear condition provided the best overall reduction in impact dosage. More work is needed to develop improved protective padding to minimize

  6. Determining significant endpoints for ecological risk analyses. 1997 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hinton, T.G.; Congdon, J.; Rowe, C.; Scott, D. [Univ. of Georgia, Aiken, SC (US). Savannah River Ecology Lab.; Bedford, J.; Whicker, F.W. [Colorado State Univ., Fort Collins, CO (US)

    1997-11-01

    This report summarizes the first year's progress of research funded under the Department of Energy's Environmental Management Science Program. The research was initiated to better determine ecological risks from toxic and radioactive contaminants. More precisely, the research is designed to determine the relevancy of sublethal cellular damage to the performance of individuals and to identify characteristics of non-human populations exposed to chronic, low-level radiation, as is typically found on many DOE sites. The authors propose to establish a protocol to assess risks to non-human species at higher levels of biological organization by relating molecular damage to more relevant responses that reflect population health. They think that they can achieve this by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables, and by using novel biological dosimeters in controlled, manipulative dose/effects experiments. They believe that a scientifically defensible endpoint for measuring ecological risks can only be determined once it is understood to what extent molecular damage from contaminant exposure is detrimental at the individual and population levels of biological organization.

  7. Radon progeny exposure and lung cancer risk: Analyses of a cohort of Newfoundland fluorspar miners

    International Nuclear Information System (INIS)

    Morrison, H.I.; Villeneuve, P.J.

    1995-07-01

    A cohort study of the mortality experience (1950-1990) of 1744 underground miners and 321 millers or surface workers has been conducted. Excess mortality among underground miners was noted for cancers of the lung, buccal cavity, pharynx and mouth, urinary tract and for silicosis and pneumoconioses. A highly statistically significant relationship was noted between radon daughter exposure and risk of dying of lung cancer; the small numbers of buccal cavity/pharynx cancers (n = 6) precluded meaningful analysis of exposure-response. No statistically significant excess was found for any cause of death among surface workers. The exposure-response data for lung cancer were fitted to various mathematical models. The model selected included terms for attained age, cumulative dose, dose rate and time since last exposure. Because risk varies according to each of these factors, a single summary risk estimate was felt to be misleading. The joint effects of radon and smoking could not be adequately assessed using this cohort. (author). 46 refs., 16 tabs., 1 fig

  8. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    Full Text Available BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect in clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened on search terms that were related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted by patient groups without pre-existing cardiovascular disease, with cardiovascular disease and heterogeneous groups concerning general populations, groups with and without cardiovascular disease, or miscellaneous. These were subsequently sorted by end-point for cardiovascular disease or stroke and summarized in tables. We have identified 85 relevant full text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, from high to low result: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, from high to low result: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. Limitations reside in that there is no acknowledged search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. Fibrinogen is a

  9. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the
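
    The central comparison in such an analysis can be sketched as follows: each object class carries a protection goal, i.e. the highest flood intensity that is acceptable for a given flood frequency, and any object whose modelled intensity exceeds that goal has a security deficit. The class names, intensity classes and objects below are illustrative, not the Alzette case-study data.

        # Security-deficit check: compare modelled flood intensity (for a
        # 100-year flood) with the protection goal of each object class.
        # Intensity classes: 0 = none, 1 = low, 2 = medium, 3 = high.
        protection_goal = {"residential": 1, "industrial": 2, "agricultural": 3}

        objects = [
            {"id": "house_12", "cls": "residential",  "intensity": 2},
            {"id": "plant_3",  "cls": "industrial",   "intensity": 2},
            {"id": "field_7",  "cls": "agricultural", "intensity": 3},
        ]

        for obj in objects:
            deficit = obj["intensity"] - protection_goal[obj["cls"]]
            verdict = (f"security deficit of {deficit} intensity class(es)"
                       if deficit > 0 else "acceptable risk")
            print(f"{obj['id']} ({obj['cls']}): {verdict}")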

  11. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk-based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, these optimisations did not include any in-depth check of the result sensitivity with regard to methods, model completeness etc. Four different test intervals have been investigated in this study. Aside from the original (nominal) optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any certain conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration. Deterministic uncertainties also seem to affect the optimisation result considerably. The sensitivity to uncertainties in failure data is important to investigate in detail, since the methodology is based on the assumption that the unavailability of a component depends on the length of the test interval.
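
    The assumption that a standby component's unavailability depends on its test interval is commonly captured by the textbook approximation q(T) ≈ λT/2 plus the outage caused by the test itself; the sketch below uses that simple model, with invented parameter values, to show the trade-off a risk-based optimisation explores (the real optimisation would propagate q(T) through the full LPSA model).

        # Standby unavailability as a function of the test interval T:
        # q(T) ~ lambda*T/2 + test_outage/T. Parameter values are invented.
        failure_rate = 2.0e-6      # standby failures per hour
        test_outage = 4.0          # hours unavailable per test

        def mean_unavailability(T_hours):
            return failure_rate * T_hours / 2 + test_outage / T_hours

        for T in (168, 720, 2190, 4380, 8760):           # 1 week ... 1 year
            print(f"T = {T:5d} h  ->  q = {mean_unavailability(T):.2e}")
        # With these numbers the minimum lies near T = sqrt(2*test_outage/failure_rate),
        # i.e. about 2000 h; sensitivity analyses would vary the failure data and model.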

  12. Determining significant endpoints for ecological risk analyses. 1998 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hinton, T.G.; Congdon, J.; Scott, D. [Univ. of Georgia, Aiken, SC (US). Savannah River Ecology Lab.; Rowe, C. [Univ. of Puerto Rico, San Juan (PR); Bedford, J.; Whicker, W. [Colorado State Univ., Fort Collins, CO (US)

    1998-06-01

    The goal of this report is to establish a protocol for assessing risks to non-human populations exposed to environmental stresses typically found on many DOE sites. The authors think that they can achieve this by using novel biological dosimeters in controlled, manipulative dose/effects experiments, and by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables (such as age-specific survivorship, reproductive output, age at maturity and longevity). This research is needed to determine the relevancy of sublethal cellular damage to the performance of individuals and populations exposed to chronic, low-level radiation, and radiation with concomitant exposure to chemicals. They believe that a scientifically defensible endpoint for measuring ecological risks can only be determined once it is understood to what extent molecular damage from contaminant exposure is detrimental at the individual and population levels of biological organization. The experimental facility will allow them to develop a credible assessment tool for appraising ecological risks, and to evaluate the effects of radionuclide/chemical synergisms on non-human species. This report summarizes work completed midway through a 3-year project that began in November 1996. Emphasis to date has centered on three areas: (1) developing a molecular probe to measure stable chromosomal aberrations known as reciprocal translocations, (2) constructing an irradiation facility where the statistical power inherent in replicated mesocosms can be used to address the response of non-human organisms to exposures from low levels of radiation and metal contaminants, and (3) quantifying responses of organisms living in contaminated mesocosms and field sites.

  13. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Risk-based capital credit risk-weight... CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk...)(2) of this section), plus risk-weighted recourse obligations, direct credit substitutes, and certain...

  14. Risk-based classification system of nanomaterials

    International Nuclear Information System (INIS)

    Tervonen, Tommi; Linkov, Igor; Figueira, Jose Rui; Steevens, Jeffery; Chappell, Mark; Merad, Myriam

    2009-01-01

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.

  15. Risk-based classification system of nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Tervonen, Tommi, E-mail: t.p.tervonen@rug.nl [University of Groningen, Faculty of Economics and Business (Netherlands); Linkov, Igor, E-mail: igor.linkov@usace.army.mil [US Army Research and Development Center (United States); Figueira, Jose Rui, E-mail: figueira@ist.utl.pt [Technical University of Lisbon, CEG-IST, Centre for Management Studies, Instituto Superior Tecnico (Portugal); Steevens, Jeffery, E-mail: jeffery.a.steevens@usace.army.mil; Chappell, Mark, E-mail: mark.a.chappell@usace.army.mil [US Army Research and Development Center (United States); Merad, Myriam, E-mail: myriam.merad@ineris.fr [INERIS BP 2, Societal Management of Risks Unit/Accidental Risks Division (France)

    2009-05-15

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
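
    The robustness idea behind SMAA-TRI can be illustrated by repeatedly sampling feasible criteria weights and uncertain criterion scores, classifying the material each time, and reporting how often each risk category is obtained. The weighted-sum-with-thresholds rule below is a deliberately simplified stand-in for the ELECTRE TRI-style assignment procedure, and every number is invented.

        # Monte Carlo robustness check in the spirit of SMAA: sample weights
        # and scores, classify, and report category acceptability indices.
        import random

        criteria = ["toxicity", "persistence", "production_variability"]
        score_ranges = {"toxicity": (0.6, 0.9),          # uncertain measurements
                        "persistence": (0.3, 0.7),
                        "production_variability": (0.4, 0.8)}
        thresholds = [(0.66, "high risk"), (0.33, "medium risk"), (0.0, "low risk")]

        counts = {"high risk": 0, "medium risk": 0, "low risk": 0}
        n_samples = 10_000
        for _ in range(n_samples):
            raw = [random.random() for _ in criteria]    # random feasible weights
            weights = [w / sum(raw) for w in raw]
            scores = [random.uniform(*score_ranges[c]) for c in criteria]
            overall = sum(w * s for w, s in zip(weights, scores))
            label = next(name for cut, name in thresholds if overall >= cut)
            counts[label] += 1

        for name, n in counts.items():
            print(f"{name}: acceptability {n / n_samples:.1%}")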

  16. A combined approach to investigate the toxicity of an industrial landfill's leachate: Chemical analyses, risk assessment and in vitro assays

    International Nuclear Information System (INIS)

    Baderna, D.; Maggioni, S.; Boriani, E.; Gemma, S.; Molteni, M.; Lombardo, A.; Colombo, A.; Bordonali, S.; Rotella, G.; Lodi, M.; Benfenati, E.

    2011-01-01

    Solid wastes constitute an important and growing problem, and landfills are still one of the most common ways to manage waste disposal. The risk assessment of pollutants from landfills is becoming a major environmental issue in Europe, owing to the large number of sites and to the importance of groundwater protection. Furthermore, there is a lack of knowledge about the environmental, ecotoxicological and toxicological characteristics of most contaminants contained in landfill leachates. Understanding leachate composition and creating an integrated strategy for risk assessment are needed to properly address landfill issues and to make projections on the long-term impacts of a landfill, with particular attention to the estimation of possible adverse effects on human health and ecosystems. In the present study, we propose an integrated strategy to evaluate the toxicity of the leachate using chemical analyses, risk assessment guidelines and in vitro assays with hepatoma HepG2 cells as a model. The approach was applied to a real case study: an industrial waste landfill in northern Italy for which data on leachate contaminants are available for the last 11 years. Results from our ecological risk models suggest important toxic effects on freshwater fish and small rodents, mainly due to ammonia and inorganic constituents. Our in vitro data show inhibition of cell proliferation by leachate at low doses and a cytotoxic effect at high doses after 48 h of exposure. - Research highlights: → We study the toxicity of leachate from a non-hazardous industrial waste landfill. → We perform chemical analyses, risk assessments and in vitro assays on HepG2 cells. → Risk models suggest toxic effects due to ammonia and inorganic constituents. → In vitro assays show that leachate inhibits cell proliferation at low doses. → Leachate can induce cytotoxic effects on HepG2 cells at high doses.

  17. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determining the composition of food products and ingredients, are of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance depends significantly on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal components analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
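    A minimal sketch of the general PCA-calibration idea described above: compress the spectra to a few principal component scores and fit a linear calibration model on those scores. The synthetic spectra and reference concentrations are stand-ins; the paper's actual algorithm and data differ:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      n_samples, n_wavelengths = 60, 200
      true_conc = rng.uniform(0, 1, n_samples)                # reference concentrations
      basis = rng.normal(size=n_wavelengths)                  # synthetic absorption profile
      spectra = np.outer(true_conc, basis) + 0.05 * rng.normal(size=(n_samples, n_wavelengths))

      pca = PCA(n_components=5).fit(spectra)                  # compress spectra to a few scores
      scores = pca.transform(spectra)
      calib = LinearRegression().fit(scores, true_conc)       # calibration model on PC scores
      print("calibration R^2:", round(calib.score(scores, true_conc), 3))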

  18. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  19. Risk analyses for disposing nonhazardous oil field wastes in salt caverns

    Energy Technology Data Exchange (ETDEWEB)

    Tomasko, D.; Elcock, D.; Veil, J.; Caudle, D.

    1997-12-01

    Salt caverns have been used for several decades to store various hydrocarbon products. In the past few years, four facilities in the US have been permitted to dispose of nonhazardous oil field wastes in salt caverns. Several other disposal caverns have been permitted in Canada and Europe. This report evaluates the possibility that adverse human health effects could result from exposure to contaminants released from caverns in domal salt formations used for nonhazardous oil field waste disposal. The evaluation assumes normal operations but considers the possibility of leaks in cavern seals and cavern walls during the post-closure phase of operation. In this assessment, several steps were followed to identify possible human health risks. At the broadest level, these steps include identifying a reasonable set of contaminants of possible concern, identifying how humans could be exposed to these contaminants, assessing the toxicities of these contaminants, estimating their intakes, and characterizing their associated human health risks. The contaminants of concern for the assessment are benzene, cadmium, arsenic, and chromium. These were selected as being components of oil field waste that are likely to remain in solution long enough to reach a human receptor.
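    The intake-estimation and risk-characterization steps mentioned above typically follow the generic ingestion formula Intake = C x IR x EF x ED / (BW x AT). The sketch below applies it to an assumed groundwater benzene concentration with a placeholder slope factor; none of the numbers are taken from the report:

      def chronic_daily_intake(conc_mg_per_L, ir_L_day=2.0, ef_d_yr=350, ed_yr=30,
                               bw_kg=70.0, at_days=70 * 365):
          """Ingestion intake (mg/kg-day) = C * IR * EF * ED / (BW * AT)."""
          return conc_mg_per_L * ir_L_day * ef_d_yr * ed_yr / (bw_kg * at_days)

      benzene_conc = 0.005        # mg/L, assumed groundwater concentration at the receptor
      slope_factor = 0.055        # (mg/kg-day)^-1, assumed oral slope factor for benzene
      cdi = chronic_daily_intake(benzene_conc)
      print(f"intake {cdi:.2e} mg/kg-day, incremental lifetime cancer risk ~ {cdi * slope_factor:.1e}")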

  20. Risk Analyses of Charging Pump Control Improvements for Alternative RCP Seal Cooling

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun-Chan [Korea Hydro and Nuclear Power Co. Ltd. Daejeon (Korea, Republic of)

    2015-10-15

    Two events significantly affect the plant risk during a total loss of component cooling water (TLOCCW) event. One is an event in which the seal assembly of a reactor coolant pump (RCP) fails due to thermal stress from the loss of cooling water; the other is an event in which the operators fail to conduct alternative cooling for the RCP seal during the accident. KHNP reviewed the replacement of the RCP seal with a qualified shutdown seal in order to remove the risk due to RCP seal failure during a TLOCCW. As an optional measure, a design improvement in the alternative cooling method for the RCP seal is being considered. This analysis presents the alternative RCP seal cooling improvement and its safety effect. K2 is a nuclear power plant with a Westinghouse design, and it has a relatively high CDF during TLOCCW events because it has a different CCW system design and difficulty in preparing alternative cooling water sources. This analysis confirmed that an operator action providing cold water to the RWST as RCP seal injection water during a TLOCCW event is very important in K2. A control circuit improvement plan for the auxiliary charging pump was established in order to reduce the failure probability of this operator action. This analysis modeled the improvement as a fault tree and evaluated the resulting CDF change. The results demonstrated that the RCP seal injection failure probability was reduced by 89%, and the CDF decreased by 28%.
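    A back-of-the-envelope illustration of how a reduced operator-action failure probability propagates into the core damage frequency of a single accident sequence. All frequencies and probabilities below are invented placeholders, not the plant-specific PSA values, so the computed percentage will not reproduce the 28% reported above:

      # assumed values, for illustration only
      tloccw_freq = 1.0e-3                 # initiating event frequency (/yr)
      p_action_fail_old = 1.0e-2           # operator action failure probability before improvement
      p_action_fail_new = p_action_fail_old * (1 - 0.89)   # 89% reduction from the control-circuit change
      p_other_failures = 5.0e-2            # conditional probability of the remaining failure paths

      cdf_old = tloccw_freq * (p_other_failures + p_action_fail_old)
      cdf_new = tloccw_freq * (p_other_failures + p_action_fail_new)
      print(f"illustrative CDF reduction: {(cdf_old - cdf_new) / cdf_old:.1%}")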

  1. Risk Factor Analyses for the Return of Spontaneous Circulation in the Asphyxiation Cardiac Arrest Porcine Model

    Directory of Open Access Journals (Sweden)

    Cai-Jun Wu

    2015-01-01

    Full Text Available Background: Animal models of asphyxiation cardiac arrest (ACA) are frequently used in basic research to mirror the clinical course of cardiac arrest (CA). The rates of the return of spontaneous circulation (ROSC) in ACA animal models are lower than those from studies that have utilized ventricular fibrillation (VF) animal models. The purpose of this study was to characterize the factors associated with ROSC in the ACA porcine model. Methods: Forty-eight healthy miniature pigs underwent endotracheal tube clamping to induce CA. Once induced, CA was maintained untreated for a period of 8 min. Two minutes following the initiation of cardiopulmonary resuscitation (CPR), defibrillation was attempted until ROSC was achieved or the animal died. To assess the factors associated with ROSC in this CA model, logistic regression analyses were performed to analyze gender, the time of preparation, the amplitude spectrum area (AMSA) at the beginning of CPR and the pH at the beginning of CPR. A receiver-operating characteristic (ROC) curve was used to evaluate the predictive value of AMSA for ROSC. Results: ROSC was achieved in only 52.1% of animals in this ACA porcine model. The multivariate logistic regression analyses revealed that ROSC depended significantly on the time of preparation, AMSA at the beginning of CPR and pH at the beginning of CPR. The area under the ROC curve for AMSA at the beginning of CPR in predicting ROSC was 0.878 (95% confidence interval: 0.773-0.983), and the optimum cut-off value was 15.62 (specificity 95.7% and sensitivity 80.0%). Conclusions: The time of preparation, AMSA and the pH at the beginning of CPR were associated with ROSC in this ACA porcine model. AMSA also predicted the likelihood of ROSC in this ACA animal model.
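    The analysis pattern described above, multivariable logistic regression for ROSC followed by an ROC curve to pick an AMSA cut-off, can be sketched as follows with simulated stand-in data (the variable relationships and coefficients are assumptions, not the porcine experiment's measurements):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(2)
      n = 48
      amsa = rng.normal(15, 5, n)                     # amplitude spectrum area at start of CPR
      ph = rng.normal(7.2, 0.15, n)
      prep_time = rng.normal(60, 15, n)               # preparation time (assumed scale, minutes)
      logit = -15 + 1.0 * amsa + 1.5 * (ph - 7.0) - 0.01 * prep_time
      rosc = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = np.column_stack([amsa, ph, prep_time])
      model = LogisticRegression(max_iter=1000).fit(X, rosc)
      prob = model.predict_proba(X)[:, 1]
      fpr, tpr, thr = roc_curve(rosc, prob)
      best = np.argmax(tpr - fpr)                     # Youden index for an "optimal" cut-off
      print("AUC:", round(roc_auc_score(rosc, prob), 3),
            "| cut-off probability:", round(float(thr[best]), 2))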

  2. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    Science.gov (United States)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-10-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
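    A toy version of the quadtree decomposition step only (not the SBFEM/SCM solvers): homogeneous blocks of a binary image are kept whole, while cells containing a material boundary are refined or flagged as cut. The image and minimum cell size below are arbitrary illustrations:

      import numpy as np

      def quadtree(img, x0, y0, size, min_size=2):
          """Return (x, y, size, kind) blocks; kind is 'uncut' or 'cut'."""
          block = img[y0:y0 + size, x0:x0 + size]
          homogeneous = block.min() == block.max()
          if homogeneous or size <= min_size:
              return [(x0, y0, size, "uncut" if homogeneous else "cut")]
          half = size // 2
          cells = []
          for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
              cells += quadtree(img, x0 + dx, y0 + dy, half, min_size)
          return cells

      img = np.zeros((16, 16), dtype=int)
      img[4:12, 6:14] = 1                      # a simple rectangular inclusion
      cells = quadtree(img, 0, 0, 16)
      print(len(cells), "cells in total,", sum(c[3] == "cut" for c in cells), "cut cells")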

  3. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of the plant for combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics for power generation systems. The results show that the exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air-cooling and natural gas preheating to increase the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection.
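    The core quantity in such an exergy analysis is the specific flow exergy relative to a dead state, e = (h - h0) - T0*(s - s0). The sketch below evaluates it for an ideal gas with constant cp; the property values are textbook approximations, not the plant design data from the paper:

      import math

      cp, R = 1.005, 0.287            # kJ/kg-K, dry air approximation
      T0, p0 = 298.15, 101.325        # dead state: 25 degC, 1 atm (kPa)

      def flow_exergy(T, p):
          dh = cp * (T - T0)
          ds = cp * math.log(T / T0) - R * math.log(p / p0)
          return dh - T0 * ds         # kJ/kg

      print("specific exergy at 850 K, 1 MPa:", round(flow_exergy(850.0, 1000.0), 1), "kJ/kg")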

  4. Antihypertensive drugs and risk of cancer: network meta-analyses and trial sequential analyses of 324,168 participants from randomised trials

    DEFF Research Database (Denmark)

    Bangalore, Sripal; Kumar, Sunil; Kjeldsen, Sverre E

    2011-01-01

    The risk of cancer from antihypertensive drugs has been much debated, with a recent analysis showing increased risk with angiotensin-receptor blockers (ARBs). We assessed the association between antihypertensive drugs and cancer risk in a comprehensive analysis of data from randomised clinical tr...

  5. Geospatial analyses and system architectures for the next generation of radioactive materials risk assessment and routing

    International Nuclear Information System (INIS)

    Ganter, J.H.

    1996-01-01

    This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to these challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technology markets are providing new technologies like commercial off-the-shelf (COTS) software, common datasets, ''open'' GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.

  6. Survey of foreign risk analyses with plans and projects concerning final disposal

    International Nuclear Information System (INIS)

    Hultgren, Aa.

    1977-08-01

    Risk analysis of the back end of the fuel cycle is now receiving increasing effort in several nuclear power countries. A review of the major programmes abroad in this field, especially for terminal storage of high level nuclear waste, is given in the first part of this report. The second part of the report reviews major projects and plans for terminal storage in America and in Western Europe, with a brief reference to co-operation in international fora. The most comprehensive programme is in progress in the United States. For Sweden, the programmes in Canada and France also seem to be of particular interest due to their concentration on terminal storage in crystalline rocks. (author)

  7. Coalescent-based genome analyses resolve the early branches of the euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Full Text Available Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia) because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the tarsier as sister taxon to Anthropoidea, thus belonging to the Haplorrhine clade. When divergence times are short, such as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur and make it preferable to base phylogenomic analyses on coalescent methods.

  8. Alcohol Consumption as a Risk Factor for Acute and Chronic Pancreatitis: A Systematic Review and a Series of Meta-analyses.

    Science.gov (United States)

    Samokhvalov, Andriy V; Rehm, Jürgen; Roerecke, Michael

    2015-12-01

    Pancreatitis is a highly prevalent medical condition associated with a spectrum of endocrine and exocrine pancreatic insufficiencies. While high alcohol consumption is an established risk factor for pancreatitis, its relationship with specific types of pancreatitis and a potential threshold have not been systematically examined. We conducted a systematic literature search for studies on the association between alcohol consumption and pancreatitis based on PRISMA guidelines. Non-linear and linear random-effect dose-response meta-analyses using restricted cubic spline meta-regressions and categorical meta-analyses in relation to abstainers were conducted. Seven studies with 157,026 participants and 3618 cases of pancreatitis were included in the analyses. The dose-response relationship between average volume of alcohol consumption and risk of pancreatitis was monotonic with no evidence of non-linearity for chronic pancreatitis (CP) for both sexes (p = 0.091) and acute pancreatitis (AP) in men (p = 0.396); it was non-linear for AP in women (p = 0.008). Compared to abstention, there was a significant decrease in risk (RR = 0.76, 95%CI: 0.60-0.97) of AP in women below the threshold of 40 g/day. No such association was found in men (RR = 1.1, 95%CI: 0.69-1.74). The RR for CP at 100 g/day was 6.29 (95%CI: 3.04-13.02). The dose-response relationships between alcohol consumption and risk of pancreatitis were monotonic for CP and AP in men, and non-linear for AP in women. Alcohol consumption below 40 g/day was associated with reduced risk of AP in women. Alcohol consumption beyond this level was increasingly detrimental for any type of pancreatitis. The work was financially supported by a grant from the National Institute on Alcohol Abuse and Alcoholism (R21AA023521) to the last author.

  9. THE GOAL OF VALUE-BASED MEDICINE ANALYSES: COMPARABILITY. THE CASE FOR NEOVASCULAR MACULAR DEGENERATION

    Science.gov (United States)

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy

  10. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers

  11. Credit risk evaluation based on social media.

    Science.gov (United States)

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial statements oriented evaluation results based on logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We surprisingly find that the opinions extracted from both posts and commentaries surpass opinions of analysts in terms of credit risk prediction. Copyright © 2015 Elsevier Inc. All rights reserved.
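    The financial-statement benchmark mentioned above is essentially a logit/probit model on accounting ratios. A minimal sketch on synthetic data could look like the following; the ratio names, coefficients and sample are assumptions for illustration only:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 500
      leverage = rng.uniform(0, 1, n)                     # assumed accounting ratio
      roa = rng.normal(0.05, 0.05, n)                     # assumed return on assets
      default = rng.binomial(1, 1 / (1 + np.exp(-(-3 + 4 * leverage - 10 * roa))))

      X = sm.add_constant(np.column_stack([leverage, roa]))
      logit = sm.Logit(default, X).fit(disp=False)        # financial-statement benchmark
      probit = sm.Probit(default, X).fit(disp=False)
      print("logit coefficients:", np.round(logit.params, 2))
      print("probit coefficients:", np.round(probit.params, 2))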

  12. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The upcoming deluge of genome data presents significant challenges for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. The influence of the technologically advanced evacuation models on the risk analyses during accidents in LNG terminal

    Energy Technology Data Exchange (ETDEWEB)

    Stankovicj, Goran; Petelin, Stojan [Faculty for Maritime Studies and Transport, University of Ljubljana, Portorozh (Slovenia)]; and others

    2014-07-01

    The evacuation of people located in different safety zones of an LNG terminal is a complex problem, considering that accidents involving LNG are very hazardous and pose the biggest threat to the safety of the people located near the LNG leakage. The safety risk criteria define the parameters which an LNG terminal should meet in terms of safety. Those criteria also include evacuation as an evasive action with the objective of mitigating the influence of an LNG accident on the people at risk. To date, little attention has been paid to technologically advanced evacuations intended for LNG terminals. Introducing a technologically advanced evacuation directly decreases the probability of fatalities P{sub f,i}, thus influencing the calculation of the individual risk as well as the societal risk, which results in the positioning of the F-N curve in the acceptable part of the ALARP zone. With this paper, we aim to present the difference between the safety analyses in cases where conservative data for P{sub f,i} are used while calculating the risk, and in cases where real data for P{sub f,i} are used. (Author)
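    To illustrate how a lower probability of fatality P{sub f,i} shifts the F-N curve, the sketch below accumulates, for a few invented release scenarios, the frequency F of events causing at least N fatalities under two assumed values of the fatality probability. Scenario frequencies and exposed populations are placeholders, not values from the paper:

      import numpy as np

      scenarios = [(1e-4, 5), (1e-5, 30), (1e-6, 120)]    # (frequency per year, people exposed)

      def fn_curve(p_fatality):
          pts = sorted(((f, n_exp * p_fatality) for f, n_exp in scenarios), key=lambda t: t[1])
          fatalities = np.array([n for _, n in pts])
          freqs = np.array([f for f, _ in pts])
          cum_freq = freqs[::-1].cumsum()[::-1]           # F(N): frequency of >= N fatalities
          return list(zip(fatalities.round(1), cum_freq))

      for p in (0.5, 0.1):   # e.g. conventional vs. technologically advanced evacuation
          print(f"P_f = {p}:", fn_curve(p))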

  14. Risk-based optimization of land reclamation

    International Nuclear Information System (INIS)

    Lendering, K.T.; Jonkman, S.N.; Gelder, P.H.A.J.M. van; Peters, D.J.

    2015-01-01

    Large-scale land reclamations are generally constructed by means of a landfill well above mean sea level. This can be costly in areas where good quality fill material is scarce. An alternative to save materials and costs is a ‘polder terminal’. The quay wall acts as a flood defense and the terminal level is well below the level of the quay wall. Compared with a conventional terminal, the costs are lower, but an additional flood risk is introduced. In this paper, a risk-based optimization is developed for a conventional and a polder terminal. It considers the investment and residual flood risk. The method takes into account both the quay wall and terminal level, which determine the probability and damage of flooding. The optimal quay wall level is found by solving a Lambert function numerically. The terminal level is bounded by engineering boundary conditions, i.e. piping and uplift of the cover layer of the terminal yard. It is found that, for a representative case study, the saving of reclamation costs for a polder terminal is larger than the increase of flood risk. The model is applicable to other cases of land reclamation and to similar optimization problems in flood risk management. - Highlights: • A polder terminal can be an attractive alternative for a conventional terminal. • A polder terminal is feasible at locations with high reclamation cost. • A risk-based approach is required to determine the optimal protection levels. • The depth of the polder terminal yard is bounded by uplifting of the cover layer. • This paper can support decisions regarding alternatives for port expansions.
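    The general shape of the optimization is: choose a protection (quay wall) level h that minimizes investment plus discounted expected flood damage. With an exponential exceedance-probability model the optimum can be expressed with a Lambert W function; the sketch below simply scans h numerically. All parameter values are placeholders, not the case-study figures:

      import numpy as np

      p0, b = 0.1, 0.3          # assumed exceedance probability model: P(h) = p0 * exp(-h / b)
      c, I0 = 5e6, 2e7          # assumed marginal and fixed investment costs
      D, r = 5e8, 0.04          # assumed damage given flooding and discount rate

      h = np.linspace(0.0, 3.0, 301)
      total_cost = I0 + c * h + p0 * np.exp(-h / b) * D / r   # investment + capitalised flood risk
      print("optimal protection level ~", round(float(h[np.argmin(total_cost)]), 2), "m above datum")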

  15. Logistic regression and multiple classification analyses to explore risk factors of under-5 mortality in bangladesh

    International Nuclear Information System (INIS)

    Bhowmik, K.R.; Islam, S.

    2016-01-01

    Logistic regression (LR) analysis is the most common statistical methodology for identifying the determinants of childhood mortality. However, the significant predictors cannot be ranked according to their influence on the response variable. Multiple classification (MC) analysis can be applied to identify the significant predictors with a priority index, which helps to rank the predictors. The main objective of the study is to find the socio-demographic determinants of childhood mortality in the neonatal, post-neonatal, and post-infant periods by fitting an LR model, as well as to rank them through MC analysis. The study is conducted using data from the Bangladesh Demographic and Health Survey 2007, in which birth and death information of children was collected from their mothers. Three dichotomous response variables are constructed from children's age at death to fit the LR and MC models. Socio-economic and demographic variables significantly associated with the response variables are considered separately in the LR and MC analyses. Both the LR and MC models identified the same significant predictors for specific childhood mortality. For both neonatal and child mortality, biological factors of children, regional settings, and parents' socio-economic status are found to be the 1st, 2nd, and 3rd most significant groups of predictors, respectively. Mother's education and household environment are detected as major significant predictors of post-neonatal mortality. This study shows that MC analysis, with or without LR analysis, can be applied to detect and rank determinants, which helps policy makers take initiatives on a priority basis. (author)

  16. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka eKauppi

    2014-01-01

    Full Text Available In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach to analyze complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  17. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach to analyze complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.
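    The core ISC computation is simply the Pearson correlation of each subject pair's time series at a given voxel, averaged over the pairs. The numpy sketch below uses random surrogate data with a shared stimulus-driven component; it is an illustration of the idea, not part of the ISC Toolbox itself:

      import numpy as np
      from itertools import combinations

      rng = np.random.default_rng(4)
      n_subjects, n_timepoints = 10, 200
      shared = rng.normal(size=n_timepoints)                                # stimulus-driven component
      ts = shared + rng.normal(scale=2.0, size=(n_subjects, n_timepoints))  # per-subject noise

      pair_r = [np.corrcoef(ts[i], ts[j])[0, 1]
                for i, j in combinations(range(n_subjects), 2)]
      print("mean inter-subject correlation:", round(float(np.mean(pair_r)), 3))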

  18. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Science.gov (United States)

    2010-01-01

    13 CFR 120.1000 (2010-01-01), Business Credit and Assistance, Small Business Administration, Business Loans, Risk-Based Lender Oversight, Supervision: § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based Lender...

  19. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the identification of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  20. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  1. A protein relational database and protein family knowledge bases to facilitate structure-based design analyses.

    Science.gov (United States)

    Mobilio, Dominick; Walker, Gary; Brooijmans, Natasja; Nilakantan, Ramaswamy; Denny, R Aldrin; Dejoannis, Jason; Feyfant, Eric; Kowticwar, Rupesh K; Mankala, Jyoti; Palli, Satish; Punyamantula, Sairam; Tatipally, Maneesh; John, Reji K; Humblet, Christine

    2010-08-01

    The Protein Data Bank is the most comprehensive source of experimental macromolecular structures. It can, however, be difficult at times to locate relevant structures with the Protein Data Bank search interface. This is particularly true when searching for complexes containing specific interactions between protein and ligand atoms. Moreover, searching within a family of proteins can be tedious. For example, one cannot search for some conserved residue as residue numbers vary across structures. We describe herein three databases, Protein Relational Database, Kinase Knowledge Base, and Matrix Metalloproteinase Knowledge Base, containing protein structures from the Protein Data Bank. In Protein Relational Database, atom-atom distances between protein and ligand have been precalculated allowing for millisecond retrieval based on atom identity and distance constraints. Ring centroids, centroid-centroid and centroid-atom distances and angles have also been included permitting queries for pi-stacking interactions and other structural motifs involving rings. Other geometric features can be searched through the inclusion of residue pair and triplet distances. In Kinase Knowledge Base and Matrix Metalloproteinase Knowledge Base, the catalytic domains have been aligned into common residue numbering schemes. Thus, by searching across Protein Relational Database and Kinase Knowledge Base, one can easily retrieve structures wherein, for example, a ligand of interest is making contact with the gatekeeper residue.

  2. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 3

    International Nuclear Information System (INIS)

    1983-01-01

    Critical review of the analyses of the German Risk Assessment Study on Nuclear Power Plants (DRS) concerning the reliability of the containment under accident conditions and the conditions of fission product release (transport and distribution in the environment). Main point of interest in this context is an explosion in the steam section and its impact on the containment. Critical comments are given on the models used in the DRS for determining the accident consequences. The analyses made deal with the mathematical models and database for propagation calculations, the methods of dose computation and assessment of health hazards, and the modelling of protective and safety measures. Social impacts of reactor accidents are also considered. (RF) [de

  3. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    La Pointe, P.R.; Wallmann, P.; Follin, S.

    1995-09-01

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  4. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 1

    International Nuclear Information System (INIS)

    1983-01-01

    This documentation of the activities of the Oeko-Institut is intended to show errors made and limits encountered in the experimental approaches and in results obtained by the work performed under phase A of the German Risk Assessment Study on Nuclear Power Plants (DRS). Concern is expressed and explained relating to the risk definition used in the Study, and the results of other studies relied on; specific problems of methodology are discussed with regard to the value of fault-tree/accident analyses for describing the course of safety-related events, and to the evaluations presented in the DRS. The Markov model is explained as an approach offering alternative solutions. The identification and quantification of common-mode failures is discussed. Origin, quality and methods of assessing the reliability characteristics used in the DRS as well as the statistical models for describing failure scenarios of reactor components and systems are critically reviewed. (RF) [de

  5. Using risk based tools in emergency response

    International Nuclear Information System (INIS)

    Dixon, B.W.; Ferns, K.G.

    1987-01-01

    Probabilistic Risk Assessment (PRA) techniques are used by the nuclear industry to model the potential response of a reactor subjected to unusual conditions. The knowledge contained in these models can aid in emergency response decision making. This paper presents the requirements developed to date for a PRA-based emergency response support system. A brief discussion of published work provides background for a detailed description of recent developments. A rapid deep assessment capability for specific portions of full plant models is presented. The program uses a screening rule base to control search space expansion in a combinatorial algorithm.

  6. Genome-wide association analyses identify 44 risk variants and refine the genetic architecture of major depression.

    Science.gov (United States)

    Wray, Naomi R; Ripke, Stephan; Mattheisen, Manuel; Trzaskowski, Maciej; Byrne, Enda M; Abdellaoui, Abdel; Adams, Mark J; Agerbo, Esben; Air, Tracy M; Andlauer, Till M F; Bacanu, Silviu-Alin; Bækvad-Hansen, Marie; Beekman, Aartjan F T; Bigdeli, Tim B; Binder, Elisabeth B; Blackwood, Douglas R H; Bryois, Julien; Buttenschøn, Henriette N; Bybjerg-Grauholm, Jonas; Cai, Na; Castelao, Enrique; Christensen, Jane Hvarregaard; Clarke, Toni-Kim; Coleman, Jonathan I R; Colodro-Conde, Lucía; Couvy-Duchesne, Baptiste; Craddock, Nick; Crawford, Gregory E; Crowley, Cheynna A; Dashti, Hassan S; Davies, Gail; Deary, Ian J; Degenhardt, Franziska; Derks, Eske M; Direk, Nese; Dolan, Conor V; Dunn, Erin C; Eley, Thalia C; Eriksson, Nicholas; Escott-Price, Valentina; Kiadeh, Farnush Hassan Farhadi; Finucane, Hilary K; Forstner, Andreas J; Frank, Josef; Gaspar, Héléna A; Gill, Michael; Giusti-Rodríguez, Paola; Goes, Fernando S; Gordon, Scott D; Grove, Jakob; Hall, Lynsey S; Hannon, Eilis; Hansen, Christine Søholm; Hansen, Thomas F; Herms, Stefan; Hickie, Ian B; Hoffmann, Per; Homuth, Georg; Horn, Carsten; Hottenga, Jouke-Jan; Hougaard, David M; Hu, Ming; Hyde, Craig L; Ising, Marcus; Jansen, Rick; Jin, Fulai; Jorgenson, Eric; Knowles, James A; Kohane, Isaac S; Kraft, Julia; Kretzschmar, Warren W; Krogh, Jesper; Kutalik, Zoltán; Lane, Jacqueline M; Li, Yihan; Li, Yun; Lind, Penelope A; Liu, Xiaoxiao; Lu, Leina; MacIntyre, Donald J; MacKinnon, Dean F; Maier, Robert M; Maier, Wolfgang; Marchini, Jonathan; Mbarek, Hamdi; McGrath, Patrick; McGuffin, Peter; Medland, Sarah E; Mehta, Divya; Middeldorp, Christel M; Mihailov, Evelin; Milaneschi, Yuri; Milani, Lili; Mill, Jonathan; Mondimore, Francis M; Montgomery, Grant W; Mostafavi, Sara; Mullins, Niamh; Nauck, Matthias; Ng, Bernard; Nivard, Michel G; Nyholt, Dale R; O'Reilly, Paul F; Oskarsson, Hogni; Owen, Michael J; Painter, Jodie N; Pedersen, Carsten Bøcker; Pedersen, Marianne Giørtz; Peterson, Roseann E; Pettersson, Erik; Peyrot, Wouter J; Pistis, Giorgio; Posthuma, Danielle; Purcell, Shaun M; Quiroz, Jorge A; Qvist, Per; Rice, John P; Riley, Brien P; Rivera, Margarita; Saeed Mirza, Saira; Saxena, Richa; Schoevers, Robert; Schulte, Eva C; Shen, Ling; Shi, Jianxin; Shyn, Stanley I; Sigurdsson, Engilbert; Sinnamon, Grant B C; Smit, Johannes H; Smith, Daniel J; Stefansson, Hreinn; Steinberg, Stacy; Stockmeier, Craig A; Streit, Fabian; Strohmaier, Jana; Tansey, Katherine E; Teismann, Henning; Teumer, Alexander; Thompson, Wesley; Thomson, Pippa A; Thorgeirsson, Thorgeir E; Tian, Chao; Traylor, Matthew; Treutlein, Jens; Trubetskoy, Vassily; Uitterlinden, André G; Umbricht, Daniel; Van der Auwera, Sandra; van Hemert, Albert M; Viktorin, Alexander; Visscher, Peter M; Wang, Yunpeng; Webb, Bradley T; Weinsheimer, Shantel Marie; Wellmann, Jürgen; Willemsen, Gonneke; Witt, Stephanie H; Wu, Yang; Xi, Hualin S; Yang, Jian; Zhang, Futao; Arolt, Volker; Baune, Bernhard T; Berger, Klaus; Boomsma, Dorret I; Cichon, Sven; Dannlowski, Udo; de Geus, E C J; DePaulo, J Raymond; Domenici, Enrico; Domschke, Katharina; Esko, Tõnu; Grabe, Hans J; Hamilton, Steven P; Hayward, Caroline; Heath, Andrew C; Hinds, David A; Kendler, Kenneth S; Kloiber, Stefan; Lewis, Glyn; Li, Qingqin S; Lucae, Susanne; Madden, Pamela F A; Magnusson, Patrik K; Martin, Nicholas G; McIntosh, Andrew M; Metspalu, Andres; Mors, Ole; Mortensen, Preben Bo; Müller-Myhsok, Bertram; Nordentoft, Merete; Nöthen, Markus M; O'Donovan, Michael C; Paciga, Sara A; Pedersen, Nancy L; Penninx, Brenda W J H; Perlis, Roy H; 
Porteous, David J; Potash, James B; Preisig, Martin; Rietschel, Marcella; Schaefer, Catherine; Schulze, Thomas G; Smoller, Jordan W; Stefansson, Kari; Tiemeier, Henning; Uher, Rudolf; Völzke, Henry; Weissman, Myrna M; Werge, Thomas; Winslow, Ashley R; Lewis, Cathryn M; Levinson, Douglas F; Breen, Gerome; Børglum, Anders D; Sullivan, Patrick F

    2018-05-01

    Major depressive disorder (MDD) is a common illness accompanied by considerable morbidity, mortality, costs, and heightened risk of suicide. We conducted a genome-wide association meta-analysis based on 135,458 cases and 344,901 controls and identified 44 independent and significant loci. The genetic findings were associated with clinical features of major depression and implicated brain regions exhibiting anatomical differences in cases. Targets of antidepressant medications and genes involved in gene splicing were enriched for smaller association signal. We found important relationships of genetic risk for major depression with educational attainment, body mass, and schizophrenia: lower educational attainment and higher body mass were putatively causal, whereas major depression and schizophrenia reflected a partly shared biological etiology. All humans carry lesser or greater numbers of genetic risk factors for major depression. These findings help refine the basis of major depression and imply that a continuous measure of risk underlies the clinical phenotype.

  7. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering
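    One common way to build such a cost risk curve is to propagate uncertainty in the technical cost drivers through a parametric cost model by Monte Carlo sampling and read off percentiles. The cost estimating relationship and distributions below are invented for illustration and are not LaRC models:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 50_000
      mass_kg = rng.triangular(800, 1000, 1400, n)      # uncertain technical parameter
      complexity = rng.uniform(0.9, 1.3, n)             # engineering-judgement factor
      cost = 0.12 * mass_kg ** 0.9 * complexity         # made-up cost estimating relationship

      for p in (10, 50, 90):
          print(f"P{p} cost: {np.percentile(cost, p):,.1f} (arbitrary units)")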

  8. Effects of parenting interventions for at-risk parents with infants: a systematic review and meta-analyses.

    Science.gov (United States)

    Rayce, Signe B; Rasmussen, Ida S; Klest, Sihu K; Patras, Joshua; Pontoppidan, Maiken

    2017-12-27

    Infancy is a critical stage of life, and a secure relationship with caring and responsive caregivers is crucial for healthy infant development. Early parenting interventions aim to support families in which infants are at risk of developmental harm. Our objective is to systematically review the effects of parenting interventions on child development and on the parent-child relationship for at-risk families with infants aged 0-12 months. This study is a systematic review with meta-analyses. We extracted publications from 10 databases in June 2013, January 2015 and June 2016, and supplemented these with grey literature and hand searches. We assessed risk of bias, calculated effect sizes and conducted meta-analyses. Inclusion criteria were: (1) randomised controlled trials of structured psychosocial interventions offered to at-risk families with infants aged 0-12 months in Western Organisation for Economic Co-operation and Development (OECD) countries, (2) interventions with a minimum of three sessions and at least half of these delivered postnatally, and (3) outcomes reported for child development or parent-child relationship. Sixteen studies were included. Meta-analyses were conducted on seven outcomes represented in 13 studies. Parenting interventions significantly improved child behaviour (d = 0.14; 95% CI 0.03 to 0.26), parent-child relationship (d = 0.44; 95% CI 0.09 to 0.80) and maternal sensitivity (d = 0.46; 95% CI 0.26 to 0.65) post-intervention. There were no significant effects on cognitive development (d = 0.13; 95% CI -0.08 to 0.41), internalising behaviour (d = 0.16; 95% CI -0.03 to 0.33) or externalising behaviour (d = 0.16; 95% CI -0.01 to 0.30) post-intervention. At long-term follow-up we found no significant effect on child behaviour (d = 0.15; 95% CI -0.03 to 0.31). Interventions offered to at-risk families in the first year of the child's life appear to improve child behaviour, parent-child relationship and maternal sensitivity post-intervention, but not child cognitive

  9. Analysing movements in investor’s risk aversion using the Heston volatility model

    Directory of Open Access Journals (Sweden)

    Alexie ALUPOAIEI

    2013-03-01

    Full Text Available In this paper we intend to identify and analyze, if present, an "epidemiological" relationship between the forecasts of professional investors and short-term developments in the EUR/RON exchange rate. Even though we do not use a typical epidemiological model like those employed in biological research, we investigated the hypothesis that, after the Lehman Brothers crash and the onset of the current financial crisis, the forecasts of professional investors have significant explanatory power for the subsequent short-run movements of EUR/RON. How does this mechanism work? First, professional forecasters account for the current macroeconomic, financial and political conditions and then produce forecasts. Second, based on these forecasts, they take positions in the Romanian exchange market for hedging and/or speculation purposes. Their positions, however, incorporate different degrees of uncertainty. In parallel, part of their anticipations is disseminated to the public via media channels. When important movements are observed in the macroeconomic, financial or political spheres, the positions of professional investors in the FX derivatives market are activated. The current study represents a first step in that direction of analysis for the Romanian case. For the objectives formulated above, different measures of EUR/RON rate volatility have been estimated and compared with implied volatilities. In a second stage, we used co-integration and dynamic correlation based tools to investigate the relationship between implied volatility and daily returns of the EUR/RON exchange rate.
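    For reference, the Heston model named in the title couples the asset (here, exchange rate) dynamics dS = mu*S*dt + sqrt(v)*S*dW1 with a mean-reverting variance dv = kappa*(theta - v)*dt + xi*sqrt(v)*dW2, using correlated Brownian increments. The Euler-Maruyama sketch below simulates one path; the parameter values are illustrative assumptions, not estimates for EUR/RON:

      import numpy as np

      rng = np.random.default_rng(6)
      mu, kappa, theta, xi, rho = 0.0, 2.0, 0.02, 0.3, -0.5   # illustrative parameters
      S, v, dt, n_steps = 4.4, 0.02, 1 / 252, 252             # start rate, variance, daily steps

      for _ in range(n_steps):
          z1 = rng.normal()
          z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.normal()   # correlated shocks
          S += mu * S * dt + np.sqrt(max(v, 0.0) * dt) * S * z1
          v += kappa * (theta - v) * dt + xi * np.sqrt(max(v, 0.0) * dt) * z2

      print("simulated one-year rate:", round(float(S), 4), "| terminal variance:", round(float(v), 4))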

  10. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A; Leppaemaeki, E; Koponen, P; Levander, J; Tapola, E [VTT Energy, Espoo (Finland). Energy Production Technologies

    1998-12-31

    The main purpose of the study was to test the applicability of standard fuel oil methods, developed for petroleum-based fuels, to pyrolysis liquids. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested for several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of the standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor in accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible. The former may cause changes in the composition and structure of the pyrolysis liquid. The latter may remove part of the organic material with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  11. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

    Full Text Available This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low pressure stage centrifugal compressor in a MW level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in the low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned. The outlet diameter of the vaneless diffuser has been reduced, and the original single stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.

  12. Area-based assessment of extinction risk.

    Science.gov (United States)

    Hei, Fangliang

    2012-05-01

    Underpinning the International Union for Conservation of Nature (IUCN) Red List is the assessment of extinction risk as determined by the size and degree of loss of populations. The IUCN system lists a species as Critically Endangered, Endangered, or Vulnerable if its population size declines 80%, 50%, or 30% within a given time frame. However, effective implementation of the system faces substantial challenges and uncertainty because geographic scale data on population size and long-term dynamics are scarce. I develop a model to quantify extinction risk using a measure based on a species' distribution, a much more readily obtained quantity. The model calculates the loss of the area of occupancy that is equivalent to the loss of a given proportion of a population. It is a very simple yet general model that has no free parameters and is independent of scale. The model predicted well the distributions of 302 tree species at a local scale and the distributions of 348 species of North American land birds. This area-based model provides a solution to the long-standing problem for IUCN assessments of lack of data on population sizes, and thus it will contribute to facilitating the quantification of extinction risk worldwide.
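
    The following toy simulation is not the paper's model; it merely illustrates the population-loss/area-loss link the abstract describes: individuals are scattered over grid cells, a fraction of the population is removed at random, and the corresponding loss of area of occupancy is reported.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative only: scatter N individuals over a grid of cells and ask how the
# occupied area shrinks when a given fraction of the population is lost.
n_individuals, n_cells = 5000, 400
cells = rng.integers(0, n_cells, n_individuals)   # cell occupied by each individual

def area_of_occupancy(cell_ids):
    return len(set(cell_ids.tolist()))

loss_fraction = 0.5
survivors = rng.choice(cells, size=int((1 - loss_fraction) * n_individuals), replace=False)

aoo_before = area_of_occupancy(cells)
aoo_after = area_of_occupancy(survivors)
print(f"population loss {loss_fraction:.0%} -> occupancy loss "
      f"{(aoo_before - aoo_after) / aoo_before:.1%}")
```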

  13. Adjuvant Sunitinib for High-risk Renal Cell Carcinoma After Nephrectomy: Subgroup Analyses and Updated Overall Survival Results.

    Science.gov (United States)

    Motzer, Robert J; Ravaud, Alain; Patard, Jean-Jacques; Pandha, Hardev S; George, Daniel J; Patel, Anup; Chang, Yen-Hwa; Escudier, Bernard; Donskov, Frede; Magheli, Ahmed; Carteni, Giacomo; Laguerre, Brigitte; Tomczak, Piotr; Breza, Jan; Gerletti, Paola; Lechuga, Mariajose; Lin, Xun; Casey, Michelle; Serfass, Lucile; Pantuck, Allan J; Staehler, Michael

    2018-01-01

    Adjuvant sunitinib significantly improved disease-free survival (DFS) versus placebo in patients with locoregional renal cell carcinoma (RCC) at high risk of recurrence after nephrectomy (hazard ratio [HR] 0.76, 95% confidence interval [CI] 0.59-0.98; p=0.03). To report the relationship between baseline factors and DFS, pattern of recurrence, and updated overall survival (OS). Data for 615 patients randomized to sunitinib (n=309) or placebo (n=306) in the S-TRAC trial. Subgroup DFS analyses by baseline risk factors were conducted using a Cox proportional hazards model. Baseline risk factors included: modified University of California Los Angeles integrated staging system criteria, age, gender, Eastern Cooperative Oncology Group performance status (ECOG PS), weight, neutrophil-to-lymphocyte ratio (NLR), and Fuhrman grade. Of 615 patients, 97 and 122 in the sunitinib and placebo arms developed metastatic disease, with the most common sites of distant recurrence being lung (40 and 49), lymph node (21 and 26), and liver (11 and 14), respectively. A benefit of adjuvant sunitinib over placebo was observed across subgroups, including: higher risk (T3, no or undetermined nodal involvement, Fuhrman grade ≥2, ECOG PS ≥1, T4 and/or nodal involvement; hazard ratio [HR] 0.74, 95% confidence interval [CI] 0.55-0.99; p=0.04), NLR ≤3 (HR 0.72, 95% CI 0.54-0.95; p=0.02), and Fuhrman grade 3/4 (HR 0.73, 95% CI 0.55-0.98; p=0.04). All subgroup analyses were exploratory, and no adjustments for multiplicity were made. Median OS was not reached in either arm (HR 0.92, 95% CI 0.66-1.28; p=0.6); 67 and 74 patients died in the sunitinib and placebo arms, respectively. A benefit of adjuvant sunitinib over placebo was observed across subgroups. The results are consistent with the primary analysis, which showed a benefit for adjuvant sunitinib in patients at high risk of recurrent RCC after nephrectomy. Most subgroups of patients at high risk of recurrent renal cell carcinoma after

  14. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  15. FY01 Supplemental Science and Performance Analysis: Volume 1,Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  16. Risk-based emergency decision support

    International Nuclear Information System (INIS)

    Koerte, Jens

    2003-01-01

    In the present paper we discuss how to assist critical decisions taken under complex, contingent circumstances, with a high degree of uncertainty and short time frames. In such sharp-end decision regimes, standard rule-based decision support systems do not capture the complexity of the situation. At the same time, traditional risk analysis is of little use due to variability in the specific circumstances. How, then, can an organisation provide assistance to, for example, pilots dealing with such emergencies? A method called 'contingent risk and decision analysis' is presented to provide decision support for decisions under variable circumstances and short available time scales. The method consists of nine steps of definition, modelling, analysis and criteria definition to be performed 'off-line' by analysts, and procedure generation to transform the analysis results into an operational decision aid. Examples of pilots' decisions in response to sudden vibration during offshore helicopter transport are used to illustrate the approach.

  17. DESIGNING EAP MATERIALS BASED ON INTERCULTURAL CORPUS ANALYSES: THE CASE OF LOGICAL MARKERS IN RESEARCH ARTICLES

    Directory of Open Access Journals (Sweden)

    Pilar Mur Dueñas

    2009-10-01

    Full Text Available The ultimate aim of intercultural analyses in English for Academic Purposes is to help non-native scholars function successfully in the international disciplinary community in English. The aim of this paper is to show how corpus-based intercultural analyses can be useful to design EAP materials on a particular metadiscourse category, logical markers, in research article writing. The paper first describes the analysis carried out of additive, contrastive and consecutive logical markers in a corpus of research articles in English and in Spanish in a particular discipline, Business Management. Differences were found in their frequency and also in the use of each of the sub-categories. Then, five activities designed on the basis of these results are presented. They are aimed at raising Spanish Business scholars' awareness of the specific uses and pragmatic function of frequent logical markers in international research articles in English.

  18. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

    Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6, TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses on fusion reactor devices require high-fidelity neutronic models, and flexible, accurate data exchanging between various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for Monte Carlo (MC) codes MCNP5/6, TRIPOLI-4, and translation of nuclear heating data for CFD codes Fluent, CFX and structural mechanical software ANSYS Workbench. The coupling approach has been implemented based on SALOME platform with CAD modeling, mesh generation and data visualization capabilities. A novel meshing approach has been developed for generating suitable meshes for MC geometry descriptions. The coupling approach has been concluded to be reliable and efficient after verification calculations of several application cases.

  19. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    Science.gov (United States)

    van Passel, Mark Wj

    2011-09-01

    Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since still only a small fraction of the total genetic diversity is thought to be uncovered. Alternatively, approaches based on similarities in the genome specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, it does suggest that compositionally very similar regions are likely to have had a common origin. These analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.
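
    A minimal sketch of the compositional idea described above, assuming a simple Euclidean distance between relative tetranucleotide frequencies (the actual analyses may use different oligonucleotide lengths and metrics); the sequences are invented.

```python
from collections import Counter
from itertools import product
import math

# Compare the "genome signature" (relative tetranucleotide frequencies) of two regions.
def tetra_freqs(seq):
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = sum(counts.values())
    return {''.join(k): counts[''.join(k)] / total for k in product("ACGT", repeat=4)}

def signature_distance(seq_a, seq_b):
    fa, fb = tetra_freqs(seq_a), tetra_freqs(seq_b)
    return math.sqrt(sum((fa[k] - fb[k]) ** 2 for k in fa))

genome_region_1 = "ATGCGT" * 200   # invented sequences for illustration
genome_region_2 = "ATGCGA" * 200
print(f"signature distance: {signature_distance(genome_region_1, genome_region_2):.4f}")
```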

  20. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than

  1. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however

  2. Coronary risk assessment by point-based vs. equation-based Framingham models: significant implications for clinical care.

    Science.gov (United States)

    Gordon, William J; Polansky, Jesse M; Boscardin, W John; Fung, Kathy Z; Steinman, Michael A

    2010-11-01

    US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. We assessed 2,543 subjects age 20-79 from the 2001-2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10%, 10-20%, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having "moderate" risk (<10%), with the remainder classified as having moderately high (10-20%) or high (>20%) risk. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25-46% of affected subjects. Patterns of misclassifications varied significantly by gender, age, and underlying CHD risk. Compared to the original Framingham model, the point-based version misclassifies millions of Americans into risk groups for which guidelines recommend different treatment strategies.
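
    A small sketch of the classification comparison described above, assuming the standard ATP-III cut points (<10%, 10-20%, >20% 10-year risk); the per-person risk estimates are invented and the Framingham scoring itself is not reproduced here.

```python
# Compare two 10-year coronary risk estimates per person, classified into
# ATP-III-style groups. Risk values are invented for illustration.
people = [
    {"id": 1, "risk_equation": 0.08, "risk_points": 0.12},
    {"id": 2, "risk_equation": 0.18, "risk_points": 0.17},
    {"id": 3, "risk_equation": 0.22, "risk_points": 0.15},
]

def risk_group(risk):
    if risk < 0.10:
        return "moderate (<10%)"
    if risk <= 0.20:
        return "moderately high (10-20%)"
    return "high (>20%)"

misclassified = [p["id"] for p in people
                 if risk_group(p["risk_equation"]) != risk_group(p["risk_points"])]
print("subjects placed in different risk groups:", misclassified)
```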

  3. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues presented initially. However

  4. Varying geospatial analyses to assess climate risk and adaptive capacity in a hotter, drier Southwestern United States

    Science.gov (United States)

    Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.

    2017-12-01

    Assessing vulnerability of agricultural systems to climate variability and change is vital in securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). Research was based on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate-risk and change. This collection emerged from the development of regional vulnerability assessments for agricultural climate-risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies to enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focuses on biophysical systems. We found that an inconsistent framework for vulnerability and climate risk was necessary to adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through the diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework. We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and

  5. Determination of the spatial response of neutron based analysers using a Monte Carlo based method

    International Nuclear Information System (INIS)

    Tickner, James

    2000-01-01

    One of the principal advantages of using thermal neutron capture (TNC, also called prompt gamma neutron activation analysis or PGNAA) or neutron inelastic scattering (NIS) techniques for measuring elemental composition is the high penetrating power of both the incident neutrons and the resultant gamma-rays, which means that large sample volumes can be interrogated. Gauges based on these techniques are widely used in the mineral industry for on-line determination of the composition of bulk samples. However, attenuation of both neutrons and gamma-rays in the sample and geometric (source/detector distance) effects typically result in certain parts of the sample contributing more to the measured composition than others. In turn, this introduces errors in the determination of the composition of inhomogeneous samples. This paper discusses a combined Monte Carlo/analytical method for estimating the spatial response of a neutron gauge. Neutron propagation is handled using a Monte Carlo technique which allows an arbitrarily complex neutron source and gauge geometry to be specified. Gamma-ray production and detection is calculated analytically which leads to a dramatic increase in the efficiency of the method. As an example, the method is used to study ways of reducing the spatial sensitivity of on-belt composition measurements of cement raw meal
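
    A toy version of the combined Monte Carlo/analytical idea described above: neutron interaction depths are sampled stochastically while the gamma-ray escape factor is evaluated analytically, yielding a depth-binned spatial response. All physical constants below are assumed values for illustration, not those of any real gauge.

```python
import numpy as np

rng = np.random.default_rng(2)

slab_thickness = 30.0     # cm, illustrative sample thickness
neutron_mfp = 8.0         # cm, assumed neutron mean free path
gamma_attenuation = 0.05  # 1/cm, assumed gamma linear attenuation coefficient

# Monte Carlo part: sample neutron interaction depths, keep those inside the sample
depths = rng.exponential(neutron_mfp, 100_000)
depths = depths[depths < slab_thickness]

# Analytical part: gamma-ray escape/detection factor evaluated per interaction depth
detection_weight = np.exp(-gamma_attenuation * depths)

bins = np.linspace(0, slab_thickness, 16)
response, _ = np.histogram(depths, bins=bins, weights=detection_weight)
response /= response.sum()
for lo, hi, r in zip(bins[:-1], bins[1:], response):
    print(f"{lo:5.1f}-{hi:5.1f} cm : {r:.3f}")
```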

  6. Assessing residential building values in Spain for risk analyses - application to the landslide hazard in the Autonomous Community of Valencia

    Science.gov (United States)

    Cantarino, I.; Torrijo, F. J.; Palencia, S.; Gielen, E.

    2014-11-01

    This paper proposes a method of valuing the stock of residential buildings in Spain as the first step in assessing possible damage caused to them by natural hazards. For the purposes of the study we had access to the SIOSE (the Spanish Land Use and Cover Information System), a high-resolution land-use model, as well as to a report on the financial valuations of this type of building throughout Spain. Using dasymetric disaggregation processes and GIS techniques we developed a geolocalized method of obtaining this information, which constitutes the exposure variable in the general risk assessment formula. Applying this exposure layer to a hazard map then yields the risk value. An example of its application is given in a case study that assesses landslide risk across the entire 23 200 km2 of the Valencia Autonomous Community (NUTS2); the results are analysed by municipal areas (LAU2) for the years 2005 and 2009.
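
    A minimal sketch of the dasymetric disaggregation and risk step, assuming that a municipality's total residential value is spread over its land-use polygons in proportion to residential area and then combined with a per-polygon hazard level; the attribute names and figures are invented.

```python
# Distribute a municipal residential value over polygons, then combine with hazard.
municipal_value = 1_200.0  # million EUR of residential building stock (invented)

polygons = [
    {"id": "A", "residential_area_ha": 40.0, "landslide_hazard": 0.02},
    {"id": "B", "residential_area_ha": 10.0, "landslide_hazard": 0.15},
    {"id": "C", "residential_area_ha": 50.0, "landslide_hazard": 0.00},
]

total_area = sum(p["residential_area_ha"] for p in polygons)
for p in polygons:
    p["exposed_value"] = municipal_value * p["residential_area_ha"] / total_area
    p["risk"] = p["exposed_value"] * p["landslide_hazard"]
    print(f"polygon {p['id']}: exposure {p['exposed_value']:.1f} MEUR, "
          f"risk {p['risk']:.2f} MEUR")
```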

  7. CrusView: a Java-based visualization platform for comparative genomics analyses in Brassicaceae species.

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-09-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/.

  8. Questionnaire-based risk assessment system

    International Nuclear Information System (INIS)

    Sakajo, Satoko; Ohi, Tadashi

    2004-01-01

    In order to reduce human errors efficiently, it is important to evaluate error-prone tasks and improve them. Many evaluation methods exist, for example experimental evaluations, investigations by human factors experts, guideline checks, and estimation of human error probabilities. These methods have two broad problems: (1) qualitative evaluation methods do not assess how likely human errors are to occur, nor do they estimate how effective a countermeasure is in reducing human error; and (2) most quantitative evaluation and detailed analysis methods require expert judgment. We developed a questionnaire-based risk assessment method and an accompanying system. In this paper, we introduce the concept of the method, its realization, and applications to a maintenance procedure of a nuclear power plant and to an elevator. The key feature of the method is its simplicity, which allows non-experts to evaluate the risk of human error easily. Furthermore, because it is provided as an application service provider system, many evaluators can use it simultaneously over the internet, and it is easy to collect and aggregate the responses. Through these applications we confirmed that the method is useful for evaluating the risk of human error, analysing the problem, and estimating the effectiveness of countermeasures in advance. (author)

  9. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos

    2011-01-01

    Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case study.

  10. How distributed processing produces false negatives in voxel-based lesion-deficit analyses.

    Science.gov (United States)

    Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J

    2018-07-01

    In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses, (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients

  11. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    Science.gov (United States)

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials: for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.

  12. A Bayesian network model for predicting type 2 diabetes risk based on electronic health records

    Science.gov (United States)

    Xie, Jiang; Liu, Yan; Zeng, Xu; Zhang, Wu; Mei, Zhen

    2017-07-01

    An extensive, in-depth study of diabetes risk factors (DBRF) is of crucial importance to prevent (or reduce) the chance of suffering from type 2 diabetes (T2D). Accumulation of electronic health records (EHRs) makes it possible to build nonlinear relationships between risk factors and diabetes. However, current DBRF research mainly focuses on qualitative analyses, and inconsistency among physical examination items means that risk factors are easily lost, which motivates the study of novel machine learning approaches for risk model development. In this paper, we use Bayesian networks (BNs) to analyze the relationship between physical examination information and T2D, and to quantify the link between risk factors and T2D. Furthermore, with the quantitative analyses of DBRF, we adopt EHRs and propose a machine learning approach based on BNs to predict the risk of T2D. The experiments demonstrate that our approach can lead to better predictive performance than the classical risk model.
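
    A minimal sketch of the idea of quantifying links between examination findings and T2D with a small discrete Bayesian network (here a naive structure in plain Python, not the paper's learned network); all probabilities are invented for illustration.

```python
# Naive Bayesian-network-style risk query: each examination finding depends
# only on diabetes status. All probabilities below are invented.
p_t2d = 0.10  # assumed prior probability of type 2 diabetes

# P(finding = "high" | T2D status)
cpt = {
    "bmi_high":     {True: 0.70, False: 0.30},
    "glucose_high": {True: 0.80, False: 0.10},
    "bp_high":      {True: 0.60, False: 0.35},
}

def posterior_t2d(findings):
    """findings: dict of item -> bool (observed 'high' or not)."""
    like_pos, like_neg = p_t2d, 1.0 - p_t2d
    for item, is_high in findings.items():
        p_high_pos, p_high_neg = cpt[item][True], cpt[item][False]
        like_pos *= p_high_pos if is_high else 1.0 - p_high_pos
        like_neg *= p_high_neg if is_high else 1.0 - p_high_neg
    return like_pos / (like_pos + like_neg)

print(f"P(T2D | high BMI, high glucose, normal BP) = "
      f"{posterior_t2d({'bmi_high': True, 'glucose_high': True, 'bp_high': False}):.2f}")
```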

  13. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the aim of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  14. German data for risk based fire safety assessment

    International Nuclear Information System (INIS)

    Roewekamp, M.; Berg, H.P.

    1998-01-01

    Different types of data are necessary to perform risk based fire safety assessments and, in particular, to quantify the fire event tree considering the plant specific conditions. Data on fire barriers, fire detection and extinguishing, including data on secondary effects of a fire, have to be used for quantifying the potential hazard and damage states. The existing German database on fires in nuclear power plants (NPPs) is very small. Therefore, generic data, mainly from US databases, are generally used for risk based safety assessments. Due to several differences in plant design and conditions, generic data can only be used as conservative assumptions. Generic data available worldwide on personnel failures during fire fighting need only be adapted to the plant specific conditions inside the NPP to be investigated. In contrast, unavailabilities of fire barrier elements may differ strongly depending on different standards, testing requirements, etc. In addition, the operational behaviour of active fire protection equipment may vary depending on type and manufacturer. The need for more detailed and additional plant specific data was the main reason for generating updated German data on the operational behaviour of active fire protection equipment/features in NPPs, to support risk based fire safety analyses being recommended as an additional tool to deterministic fire hazard analyses in the frame of safety reviews. The results of these investigations revealed a broader and more realistic database for the technical reliability of active fire protection means, but improvements as well as collection of further data are still necessary. (author)

  15. 12 CFR 932.3 - Risk-based capital requirement.

    Science.gov (United States)

    2010-01-01

    Title 12 (Banks and Banking), Section 932.3, Federal Housing Finance Board, Federal Home Loan Bank risk management and capital standards: each Bank's risk-based capital requirement equals the sum of its credit risk capital requirement, its market risk capital requirement, and its operations risk capital requirement.

  16. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
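
    An illustrative failure-modes-and-effects tally in the spirit described above, assuming a simple severity x likelihood score per error mode summed into a per-step risk index; the scoring scheme and all entries are invented, not the study's data.

```python
# Score each doffing error mode and accumulate a per-step total risk index.
errors = [
    {"step": "hand hygiene",      "mode": "skipped rub",         "severity": 4, "likelihood": 5},
    {"step": "hand hygiene",      "mode": "insufficient time",   "severity": 3, "likelihood": 4},
    {"step": "PAPR hood removal", "mode": "hood touches scrubs", "severity": 5, "likelihood": 3},
]

totals = {}
for e in errors:
    totals[e["step"]] = totals.get(e["step"], 0) + e["severity"] * e["likelihood"]

for step, risk_index in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{step}: total risk index {risk_index}")
```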

  17. Model-based mitigation of availability risks

    NARCIS (Netherlands)

    Zambon, E.; Bolzoni, D.; Etalle, S.; Salvato, M.

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for risk assessment and mitigation show limitations when evaluating and mitigating availability risks. This is due

  18. Model-Based Mitigation of Availability Risks

    NARCIS (Netherlands)

    Zambon, Emmanuele; Bolzoni, D.; Etalle, Sandro; Salvato, Marco

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for Risk Assessment and Mitigation show limitations when evaluating and mitigating availability risks. This is due

  19. Individual-based analyses reveal limited functional overlap in a coral reef fish community.

    Science.gov (United States)

    Brandl, Simon J; Bellwood, David R

    2014-05-01

    Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. These results may have crucial consequences for our understanding of herbivorous processes on

  20. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

    Full Text Available Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, low nutrient and low water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR). However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043 and 3,091 protein-coding sequences (CDSs), primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  1. Digging the pupfish out of its hole: risk analyses to guide harvest of Devils Hole pupfish for captive breeding

    Directory of Open Access Journals (Sweden)

    Steven R. Beissinger

    2014-09-01

    Full Text Available The Devils Hole pupfish is restricted to one wild population in a single aquifer-fed thermal pool in the Desert National Wildlife Refuge Complex. Since 1995 the pupfish has been in a nearly steady decline, leaving it perched on the brink of extinction at 35–68 fish in 2013. A major strategy for conserving the pupfish has been the establishment of additional captive or "refuge" populations, but all ended in failure. In 2013 a new captive propagation facility designed specifically to breed pupfish was opened. I examine how a captive population can be initiated by removing fish from the wild without unduly accelerating extinction risk for the pupfish in Devils Hole. I construct a count-based PVA model, parameterized from estimates of the intrinsic rate of increase and its variance using counts in spring and fall from 1995–2013, to produce the first risk assessment for the pupfish. Median time to extinction was 26 and 27 years from spring and fall counts, respectively, and the probability of extinction in 20 years was 26–33%. Removing individuals in the fall posed less risk to the wild population than harvest in spring. For both spring and fall harvest, risk increased rapidly when levels exceeded six adult pupfish per year for three years. Extinction risk was unaffected by the apportionment of total harvest among years. A demographic model was used to examine how removal of different stage classes affects the dynamics of the wild population, based on reproductive value (RV) and elasticity. Removing eggs had the least impact on the pupfish in Devils Hole; the RV of an adult was roughly 25 times that of an egg. To evaluate when it might be prudent to remove all pupfish from Devils Hole for captive breeding, I used the count-based model to examine how extinction risk related to pupfish population size. Risk accelerated when initial populations were less than 30 individuals. Results are discussed in relation to the challenges facing pupfish recovery.
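
    A toy count-based PVA in the spirit of the analysis described above: log-scale stochastic growth is simulated forward and quasi-extinction is declared when the count falls below one fish. The growth-rate parameters and starting count below are assumed values, not the published estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

mu, sigma = -0.05, 0.25     # mean and s.d. of annual log population growth (assumed)
n0, years, n_sims = 50, 20, 10_000

extinct = 0
for _ in range(n_sims):
    log_n = np.log(n0)
    for _ in range(years):
        log_n += rng.normal(mu, sigma)
        if log_n < 0:       # fewer than one individual: quasi-extinction
            extinct += 1
            break

print(f"simulated probability of extinction within {years} years: {extinct / n_sims:.2f}")
```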

  2. Risk factors for child maltreatment in an Australian population-based birth cohort.

    Science.gov (United States)

    Doidge, James C; Higgins, Daryl J; Delfabbro, Paul; Segal, Leonie

    2017-02-01

    Child maltreatment and other adverse childhood experiences adversely influence population health and socioeconomic outcomes. Knowledge of the risk factors for child maltreatment can be used to identify children at risk and may represent opportunities for prevention. We examined a range of possible child, parent and family risk factors for child maltreatment in a prospective 27-year population-based birth cohort of 2443 Australians. Physical abuse, sexual abuse, emotional abuse, neglect and witnessing of domestic violence were recorded retrospectively in early adulthood. Potential risk factors were collected prospectively during childhood or reported retrospectively. Associations were estimated using bivariate and multivariate logistic regressions and combined into cumulative risk scores. Higher levels of economic disadvantage, poor parental mental health and substance use, and social instability were strongly associated with increased risk of child maltreatment. Indicators of child health displayed mixed associations and infant temperament was uncorrelated to maltreatment. Some differences were observed across types of maltreatment but risk profiles were generally similar. In multivariate analyses, nine independent risk factors were identified, including some that are potentially modifiable: economic disadvantage and parental substance use problems. Risk of maltreatment increased exponentially with the number of risk factors experienced, with prevalence of maltreatment in the highest risk groups exceeding 80%. A cumulative risk score based on the independent risk factors allowed identification of individuals at very high risk of maltreatment, while a score that incorporated all significant risk and protective factors provided better identification of low-risk individuals. Copyright © 2016 Elsevier Ltd. All rights reserved.
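
    A minimal sketch of a cumulative risk score of the kind described above, assuming a count of exposed risk factors mapped to maltreatment probability through an invented logistic curve; the factor list and coefficients are illustrative, not the cohort's estimates.

```python
import math

# Invented list of independent risk factors and an invented logistic mapping.
risk_factors = ["economic_disadvantage", "parental_substance_use",
                "poor_parental_mental_health", "social_instability",
                "young_maternal_age"]

def maltreatment_probability(n_factors, intercept=-2.5, slope=0.8):
    return 1.0 / (1.0 + math.exp(-(intercept + slope * n_factors)))

child_exposures = {"economic_disadvantage": True, "parental_substance_use": True,
                   "poor_parental_mental_health": False, "social_instability": True,
                   "young_maternal_age": False}
score = sum(child_exposures.get(f, False) for f in risk_factors)
print(f"cumulative risk score = {score}, "
      f"predicted probability of maltreatment = {maltreatment_probability(score):.2f}")
```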

  3. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  4. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  5. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  6. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-01-01

    Given the damage that past earthquakes have caused to offshore structures, which are vital to the oil and gas industries, it is important that their seismic design achieve very high reliability. Accepting Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, this paper studies a jacket-type offshore platform with a height of 304 feet, a deck of 96 feet by 94 feet, and a weight of 290 million pounds. First, Push-Over Analyses (POA) were performed to identify the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA were performed using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of the NLTHA, the damage and rupture probabilities of the critical members were studied to assess the reliability of the jacket structure. Because different structural members of the jacket have different effects on the stability of the platform, an ''importance factor'' was considered for each critical member based on its location and orientation in the structure, and the reliability of the whole structure was then obtained by combining the reliabilities of the critical members, each weighted by its specific importance factor.

  7. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

    The parameters on criticality and reactivity employed for computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two formerly used libraries, JENDL-3.2 and ENDF/B-VI. In the analyses the computational codes MVP, MCNP version 4C and TWOTRAN were used. The following conclusions were obtained from the analyses: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and coefficients of reactivity are within ∼5% between the three libraries. By comparison between calculations and measurements of the parameters, the JENDL-3.3 library is expected to give values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of transient rods expressed in the $ unit shows ∼5% discrepancy between the three libraries according to their respective βeff values, there is little discrepancy in that expressed in the Δk/k unit. (author)

  8. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
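
    The co-citation ranking described above lends itself to a compact implementation. The sketch below, under assumed data structures (a set of "known" article IDs and a list of reference lists, one per citing paper), counts how often each candidate article is cited together with a known article and keeps candidates whose score reaches a screening threshold; all identifiers and the threshold are illustrative, not values from the study.

      from collections import Counter

      def cocitation_ranking(known_ids, citing_papers, threshold):
          """Rank candidate articles by how often they are co-cited with 'known' articles.

          known_ids      -- set of article IDs already known to be relevant
          citing_papers  -- iterable of reference lists, one list of cited IDs per citing paper
          threshold      -- minimum co-citation score required for screening
          """
          scores = Counter()
          for references in citing_papers:
              refs = set(references)
              if refs & known_ids:                    # this paper cites at least one known article
                  for candidate in refs - known_ids:  # every other cited article gains one co-citation
                      scores[candidate] += 1
          # keep candidates whose co-citation count reaches the selection threshold
          return [(pid, n) for pid, n in scores.most_common() if n >= threshold]

      # Hypothetical toy data: three citing papers and one known article "A".
      papers = [["A", "B", "C"], ["A", "C", "D"], ["B", "C", "E"]]
      print(cocitation_ranking({"A"}, papers, threshold=2))   # [('C', 2)]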

  9. Associations of Fitness, Physical Activity, Strength, and Genetic Risk With Cardiovascular Disease: Longitudinal Analyses in the UK Biobank Study.

    Science.gov (United States)

    Tikkanen, Emmi; Gustafsson, Stefan; Ingelsson, Erik

    2018-04-09

    Background: Observational studies have shown inverse associations among fitness, physical activity, and cardiovascular disease. However, little is known about these associations in individuals with elevated genetic susceptibility for these diseases. Methods: We estimated associations of grip strength, objective and subjective physical activity, and cardiorespiratory fitness with cardiovascular events and all-cause death in a large cohort of 502,635 individuals from the UK Biobank (median follow-up, 6.1 years; interquartile range, 5.4-6.8 years). We then further examined these associations in individuals with different genetic burden by stratifying individuals based on their genetic risk scores for coronary heart disease and atrial fibrillation. We compared disease risk among individuals in different tertiles of fitness, physical activity, and genetic risk, using the lowest tertiles as reference. Results: Grip strength, physical activity, and cardiorespiratory fitness showed inverse associations with incident cardiovascular events (coronary heart disease: hazard ratio [HR], 0.79; 95% confidence interval [CI], 0.77-0.81; HR, 0.95; 95% CI, 0.93-0.97; and HR, 0.68; 95% CI, 0.63-0.74, per SD change, respectively; atrial fibrillation: HR, 0.75; 95% CI, 0.73-0.76; HR, 0.93; 95% CI, 0.91-0.95; and HR, 0.60; 95% CI, 0.56-0.65, per SD change, respectively). Higher grip strength and cardiorespiratory fitness were associated with lower risk of incident coronary heart disease and atrial fibrillation in each genetic risk score group (P for trend <0.001). In particular, high levels of cardiorespiratory fitness were associated with 49% lower risk for coronary heart disease (HR, 0.51; 95% CI, 0.38-0.69) and 60% lower risk for atrial fibrillation (HR, 0.40; 95% CI, 0.30-0.55) among individuals at high genetic risk for these diseases. Conclusions: Fitness and physical activity demonstrated inverse associations with incident cardiovascular disease in the general population, as well as in individuals with elevated genetic risk for these diseases.

  10. Apparently conclusive meta-analyses may be inconclusive--Trial sequential analysis adjustment of random error risk due to repetitive testing of accumulating data in apparently conclusive neonatal meta-analyses

    DEFF Research Database (Denmark)

    Brok, Jesper; Thorlund, Kristian; Wetterslev, Jørn

    2008-01-01

    BACKGROUND: Random error may cause misleading evidence in meta-analyses. The required number of participants in a meta-analysis (i.e. information size) should be at least as large as an adequately powered single trial. Trial sequential analysis (TSA) may reduce the risk of random errors due to repetitive testing of accumulating data.

  11. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Background: Office workers sit at work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomfort, especially low back pain, and recently many researchers have focused on home/office-based exercise training for the prevention and treatment of low back pain in this population. Objective: This meta-analysis discusses the most recently suggested exercises for office workers based on the mechanisms and theories behind low back pain among office workers. Method: The author collected relevant papers previously published on the subject. Google Scholar, Scopus, and PubMed were used as sources to find the articles. Only articles that were published using the same methodology, including the keywords office workers, musculoskeletal discomforts, low back pain, and exercise training, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence is available regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers. The findings showed that training exercises had a significant effect (p<0.05) on low back pain discomfort scores, with pain levels decreasing in response to office-based exercise training. Conclusion: Office-based exercise training can improve pain/discomfort scores among office workers through positive effects on muscle flexibility and strength. As such, it should be suggested to occupational therapists as a practical way to treat and prevent low back pain among office workers.

  12. Smokers' increased risk for disability pension: social confounding or health-mediated effects? Gender-specific analyses of the Hordaland Health Study cohort.

    Science.gov (United States)

    Haukenes, Inger; Riise, Trond; Haug, Kjell; Farbu, Erlend; Maeland, John Gunnar

    2013-09-01

    Studies indicate that cigarette smokers have an increased risk for disability pension, presumably mediated by adverse health effects. However, smoking is also related to socioeconomic status. The current study examined the association between smoking and subsequent disability pension, and whether the association is explained by social confounding and/or health-related mediation. A subsample of 7934 men and 8488 women, aged 40-46, from the Hordaland Health Study, Norway (1997-1999), provided baseline information on smoking status, self-reported health measures and socioeconomic status. The outcome was register-based disability pension from 12 months after baseline to the end of 2004. Gender-stratified Cox regression analyses were used, adjusted for socioeconomic status, physical activity, self-reported health and musculoskeletal pain sites. A total of 155 (2%) men and 333 (3.9%) women were granted disability pension during follow-up. The unadjusted disability risk associated with heavy smoking versus non-smoking was 1.88 (95% CI 1.23 to 2.89) among men and 3.06 (95% CI 2.23 to 4.20) among women. In multivariate analyses, adjusting for socioeconomic status, HRs were 1.33 (95% CI 0.84 to 2.11) among men and 2.22 (95% CI 1.58 to 3.13) among women. Final adjustment for physical activity, self-reported health and musculoskeletal pain further reduced the effect of heavy smoking in women (HR=1.53, 95% CI 1.09 to 2.16). Socioeconomic status confounded the smoking-related risk for disability pension; for female heavy smokers, however, a significantly increased risk persisted after adjustment. Women may be particularly vulnerable to heavy smoking and to its sociomedical consequences, such as disability pension.

  13. Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database

    International Nuclear Information System (INIS)

    Lam, Chio; Zhou, Wenxing

    2016-01-01

    This article reports statistical analyses of the mileage and pipe-related incidents data corresponding to the onshore gas transmission pipelines in the US between 2002 and 2013 collected by the Pipeline Hazardous Material Safety Administration of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of which were more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% are Class 2 and Class 3 pipelines. It is found that third-party excavation, external corrosion, material failure and internal corrosion are the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rate of rupture equals 3.1 × 10⁻⁵ per km-year due to all failure causes combined. External corrosion is the leading cause of ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for the quantitative risk assessments of such pipelines. - Highlights: • Analyze PHMSA pipeline mileage and incident data between 2002 and 2013. • Focus on gas transmission pipelines. • Leading causes for pipeline failures are identified. • Provide baseline failure statistics for risk assessments of gas transmission pipelines.
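
    The baseline failure statistics reported above reduce to incident counts divided by pipeline exposure in km-years. The minimal sketch below illustrates that calculation; the rupture count is a hypothetical figure chosen only so that the result lands near the order of magnitude quoted in the abstract.

      def failure_rate(incident_count, km_years):
          """Average incident rate per km-year over the observation period."""
          return incident_count / km_years

      # Hypothetical exposure: 480,000 km of pipeline observed over 12 years.
      exposure = 480_000 * 12            # km-years
      ruptures = 180                     # hypothetical rupture count over the period
      print(f"rupture rate: {failure_rate(ruptures, exposure):.2e} per km-year")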

  14. Adjuvant Sunitinib for High-risk Renal Cell Carcinoma After Nephrectomy: Subgroup Analyses and Updated Overall Survival Results

    DEFF Research Database (Denmark)

    Motzer, Robert J; Ravaud, Alain; Patard, Jean-Jacques

    2018-01-01

    BACKGROUND: Adjuvant sunitinib significantly improved disease-free survival (DFS) versus placebo in patients with locoregional renal cell carcinoma (RCC) at high risk of recurrence after nephrectomy (hazard ratio [HR] 0.76, 95% confidence interval [CI] 0.59-0.98; p=0.03). OBJECTIVE: To report...... sunitinib over placebo was observed across subgroups, including: higher risk (T3, no or undetermined nodal involvement, Fuhrman grade ≥2, ECOG PS ≥1, T4 and/or nodal involvement; hazard ratio [HR] 0.74, 95% confidence interval [CI] 0.55-0.99; p=0.04), NLR ≤3 (HR 0.72, 95% CI 0.54-0.95; p=0.02), and Fuhrman...... grade 3/4 (HR 0.73, 95% CI 0.55-0.98; p=0.04). All subgroup analyses were exploratory, and no adjustments for multiplicity were made. Median OS was not reached in either arm (HR 0.92, 95% CI 0.66-1.28; p=0.6); 67 and 74 patients died in the sunitinib and placebo arms, respectively. CONCLUSIONS...

  15. Handbook of methods for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance

  16. Radon in homes and risk of lung cancer: 13 collaborative analyses of individual data from European case-control studies

    International Nuclear Information System (INIS)

    Darby, S.; Hill, D.; Doll, R.; Auvinen, A.; Barros Dios, J.M.; Ruano Ravina, A.; Baysson, H.; Tirmarche, M.; Bochicchio, F.; Deo, H.; Falk, R.; Forastiere, F.; Hakama, M.; Heid, I.; Schaffrath Rosario, A.; Wichmann, H.E.; Kreienbrock, L.; Kreuzer, M.; Lagarde, F.; Pershagen, G.; Makelainen, I.; Ruosteenoja, E.; Muirhead, C.; Oberaigner, W.; Tomasek, L.; Whitley, E.

    2007-01-01

    Objective: To determine the risk of lung cancer associated with exposure at home to the radioactive disintegration products of naturally occurring radon gas. Design: Collaborative analysis of individual data from 13 case-control studies of residential radon and lung cancer. Setting: Nine European countries. Subjects: 7148 cases of lung cancer and 14 208 controls. Main outcome measures: Relative risks of lung cancer and radon gas concentrations in homes inhabited during the previous 5-34 years measured in becquerels (radon disintegrations per second) per cubic metre (Bq/m3) of household air. Results: The mean measured radon concentration in homes of people in the control group was 97 Bq/m3, with 11% measuring > 200 and 4% measuring > 400 Bq/m3. For cases of lung cancer the mean concentration was 104 Bq/m3. The risk of lung cancer increased by 8.4% (95% confidence interval 3.0% to 15.8%) per 100 Bq/m3 increase in measured radon (P=0.0007). This corresponds to an increase of 16% (5% to 31%) per 100 Bq/m3 increase in usual radon, that is, after correction for the dilution caused by random uncertainties in measuring radon concentrations. The dose-response relation seemed to be linear with no threshold and remained significant (P = 0.04) in analyses limited to individuals from homes with measured radon < 200 Bq/m3. The proportionate excess risk did not differ significantly with study, age, sex, or smoking. In the absence of other causes of death, the absolute risks of lung cancer by age 75 years at usual radon concentrations of 0, 100, and 400 Bq/m3 would be about 0.4%, 0.5%, and 0.7%, respectively, for lifelong non-smokers, and about 25 times greater (10%, 12%, and 16%) for cigarette smokers. Conclusions: Collectively, though not separately, these studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers, and indicate that it is responsible for about 2% of all deaths from cancer in Europe. (author)
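
    The dose-response reported above is linear with no threshold, so the relative risk at a given concentration can be written as RR = 1 + ERR_per_100 x (C / 100). The sketch below evaluates this relation with the excess risks quoted in the abstract (8.4% per 100 Bq/m3 for measured radon, 16% for usual radon); the function name and the chosen concentrations are illustrative.

      def relative_risk(radon_bq_m3, excess_per_100=0.084):
          """Linear no-threshold model: RR = 1 + ERR_per_100 * (C / 100)."""
          return 1.0 + excess_per_100 * (radon_bq_m3 / 100.0)

      for c in (0, 100, 200, 400):
          measured = relative_risk(c)                     # 8.4% per 100 Bq/m3 (measured radon)
          usual = relative_risk(c, excess_per_100=0.16)   # 16% per 100 Bq/m3 (usual radon)
          print(f"{c:>3} Bq/m3  RR(measured)={measured:.2f}  RR(usual)={usual:.2f}")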

  17. Interactions among Candidate Genes Selected by Meta-Analyses Resulting in Higher Risk of Ischemic Stroke in a Chinese Population.

    Directory of Open Access Journals (Sweden)

    Man Luo

    Ischemic stroke (IS) is a multifactorial disorder caused by both genetic and environmental factors. The combined effects of multiple susceptibility genes might result in a higher risk for IS than a single gene. Therefore, we investigated whether interactions among multiple susceptibility genes were associated with an increased risk of IS by evaluating gene polymorphisms identified in previous meta-analyses, including methylenetetrahydrofolate reductase (MTHFR) C677T, beta fibrinogen (FGB, β-FG) A455G and T148C, apolipoprotein E (APOE) ε2-4, angiotensin-converting enzyme (ACE) insertion/deletion (I/D), and endothelial nitric oxide synthase (eNOS) G894T. In order to examine these interactions, 712 patients with IS and 774 controls in a Chinese Han population were genotyped using the SNaPshot method, and multifactor dimensionality reduction analysis was used to detect potential interactions among the candidate genes. The results of this study found that ACE I/D and β-FG T148C were significant synergistic contributors to IS. In particular, the ACE DD + β-FG 148CC, ACE DD + β-FG 148CT, and ACE ID + β-FG 148CC genotype combinations resulted in higher risk of IS. After adjusting for potential confounding IS risk factors (age, gender, family history of IS, hypertension history and history of diabetes mellitus) using a logistic analysis, a significant correlation between the genotype combinations and IS patients persisted (overall stroke: adjusted odds ratio [OR] = 1.57, 95% confidence interval [CI]: 1.22-2.02, P < 0.001; large artery atherosclerosis subtype: adjusted OR = 1.50, 95% CI: 1.08-2.07, P = 0.016; small-artery occlusion subtype: adjusted OR = 2.04, 95% CI: 1.43-2.91, P < 0.001). The results of this study indicate that the ACE I/D and β-FG T148C combination may result in significantly higher risk of IS in this Chinese population.

  18. Analyse of pollution sources in Horna Nitra river basin using the system GeoEnviron such as instrument for groundwater and surface water pollution risk assessment

    International Nuclear Information System (INIS)

    Kutnik, P.

    2004-01-01

    In this presentation the author deals with the analysis of pollution sources in the Horna Nitra river basin, using the GeoEnviron system as an instrument for groundwater and surface water pollution risk assessment

  19. Reviewing PSA-based analyses to modify technical specifications at nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Martinez-Guridi, G.; Vesely, W.E.

    1995-12-01

    Changes to Technical Specifications (TSs) at nuclear power plants (NPPs) require review and approval by the United States Nuclear Regulatory Commission (USNRC). Currently, many requests for changes to TSs use analyses that are based on a plant's probabilistic safety assessment (PSA). This report presents an approach to reviewing such PSA-based submittals for changes to TSs. We discuss the basic objectives of reviewing a PSA-based submittal to modify NPP TSs; the methodology of reviewing a TS submittal; and the differing roles of a PSA review, a PSA Computer Code review, and a review of a TS submittal. To illustrate this approach, we discuss our review of changes to allowed outage time (AOT) and surveillance test interval (STI) in the TS for the South Texas Project Nuclear Generating Station. Based on the experience gained, a checklist of items is given for future reviewers; it can be used to verify that the submittal contains sufficient information, and also that the review has addressed the relevant issues. Finally, recommended steps in the review process and the expected findings of each step are discussed

  20. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

    Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such "green" BSSs impose significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSS power supply system parameters detected through remote and centralized real-time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as: total charging/discharging current of the power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. In the case of powering those BSSs with standalone systems based on a fuel generator, fuel consumption models expressing the interdependence between generator load and fuel consumption are proposed. This has allowed an energy-efficiency comparison of the fuel-powered and RES systems, which is presented in terms of the OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and the on/off regulation of daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.
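
    The abstract refers to fuel consumption models linking generator load to fuel use. A common first-order approximation (not taken from this paper) treats hourly consumption as an affine function of delivered load and rated capacity; the coefficients and the example base-station load below are assumptions for illustration only.

      def fuel_per_hour(load_kw, rated_kw, a=0.246, b=0.08415):
          """Affine diesel-generator fuel model (litres/hour).

          a -- litres per kWh actually delivered (assumed coefficient)
          b -- no-load consumption, litres per hour per kW of rated capacity (assumed)
          """
          return a * load_kw + b * rated_kw

      # Hypothetical base-station load of 2 kW on a 10 kW generator, running 24 h/day.
      daily_litres = 24 * fuel_per_hour(load_kw=2.0, rated_kw=10.0)
      print(f"estimated fuel use: {daily_litres:.1f} litres/day")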

  1. Maintenance evaluation using risk based criteria

    International Nuclear Information System (INIS)

    Torres Valle, A.

    1996-01-01

    Maintenance evaluation is currently performed using economic and, in some cases, technical equipment-failure criteria; however, this is done only at the level of specific equipment. In general, when statistics are used, the analyses for maintenance optimization are made in isolation and with a post mortem character. The integration provided by Probabilistic Safety Assessment (PSA), together with the possibilities of its applications, allows maintenance to be evaluated on the basis of broader criteria than those traditionally used. To evaluate maintenance using risk-based criteria, it is necessary to follow a dynamic and systematic approach to studying the maintenance strategy, allowing the initial probabilistic models to be updated to include the operational changes that often take place during the operation of complex facilities. This paper proposes a dynamic evaluation system for maintenance tasks. The system is illustrated by means of a practical example.
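
    One way to put the risk-based criteria mentioned above into practice is to rank equipment by PSA importance measures such as Fussell-Vesely and Risk Achievement Worth, recomputing the core damage frequency with a component's basic event set to 0 and then to 1. The sketch below shows that arithmetic; the CDF values are hypothetical, and these two measures are a common choice rather than necessarily the ones used in the paper.

      def importance_measures(cdf_base, cdf_without_basic_event, cdf_with_event_failed):
          """Fussell-Vesely (FV) and Risk Achievement Worth (RAW) for one basic event.

          cdf_base                 -- baseline core damage frequency (per year)
          cdf_without_basic_event  -- CDF recomputed with the event's probability set to 0
          cdf_with_event_failed    -- CDF recomputed with the event's probability set to 1
          """
          fv = (cdf_base - cdf_without_basic_event) / cdf_base
          raw = cdf_with_event_failed / cdf_base
          return fv, raw

      # Hypothetical values for a single pump failure-to-start event.
      fv, raw = importance_measures(cdf_base=2.0e-5,
                                    cdf_without_basic_event=1.8e-5,
                                    cdf_with_event_failed=8.0e-5)
      print(f"FV = {fv:.2f}, RAW = {raw:.1f}")   # FV = 0.10, RAW = 4.0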

  2. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  3. Chemometrical characterization of four italian rice varieties based on genetic and chemical analyses.

    Science.gov (United States)

    Brandolini, Vincenzo; Coïsson, Jean Daniel; Tedeschi, Paola; Barile, Daniela; Cereti, Elisabetta; Maietti, Annalisa; Vecchiati, Giorgio; Martelli, Aldo; Arlorio, Marco

    2006-12-27

    This paper describes a method for achieving qualitative identification of four rice varieties from two different Italian regions. To estimate the presence of genetic diversity among the four rice varieties, we used polymerase chain reaction-randomly amplified polymorphic DNA (PCR-RAPD) markers, and to elucidate whether a relationship exists between the ground and the specific characteristics of the product, we studied proximate composition, fatty acid composition, mineral content, and total antioxidant capacity. Using principal component analysis on genomic and compositional data, we were able to classify rice samples according to their variety and their district of production. This work also examined the discrimination ability of different parameters. It was found that genomic data give the best discrimination based on varieties, indicating that RAPD assays could be useful in discriminating among closely related species, while compositional analyses do not depend on the genetic characters only but are related to the production area.
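
    A minimal sketch of the principal component analysis step described above is given here: RAPD band presence/absence indicators and compositional measurements are combined into one feature matrix, standardised, and projected onto the first two components. All sample values, feature choices and variety labels are hypothetical.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Hypothetical feature matrix: rows are rice samples, columns mix RAPD band
      # presence/absence (0/1) with compositional measurements (e.g. % protein, % lipid, K mg/100 g).
      X = np.array([[1, 0, 1, 7.2, 2.1, 110.0],
                    [1, 0, 1, 7.0, 2.0, 108.0],
                    [0, 1, 0, 8.1, 1.4,  95.0],
                    [0, 1, 0, 8.3, 1.5,  97.0]])
      labels = ["variety A", "variety A", "variety B", "variety B"]

      # Standardise, then project onto the first two principal components.
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      for name, (pc1, pc2) in zip(labels, scores):
          print(f"{name}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")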

  4. Stress and deflection analyses of floating roofs based on a load-modifying method

    International Nuclear Information System (INIS)

    Sun Xiushan; Liu Yinghua; Wang Jianbin; Cen Zhangzhi

    2008-01-01

    This paper proposes a load-modifying method for the stress and deflection analyses of floating roofs used in cylindrical oil storage tanks. The formulations of loads and deformations are derived according to the equilibrium analysis of floating roofs. Based on these formulations, the load-modifying method is developed to conduct a geometrically nonlinear analysis of floating roofs with the finite element (FE) simulation. In the procedure with the load-modifying method, the analysis is carried out through a series of iterative computations until a convergence is achieved within the error tolerance. Numerical examples are given to demonstrate the validity and reliability of the proposed method, which provides an effective and practical numerical solution to the design and analysis of floating roofs

  5. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on the Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities

  6. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on the Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities.
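
    Seismic fragility is commonly expressed as a lognormal curve giving the conditional probability of failure at a ground-motion level a, P_f(a) = Phi(ln(a/A_m)/beta), where A_m is the median capacity and beta the composite logarithmic standard deviation. The sketch below evaluates such a curve; the capacity and beta values are assumptions for illustration and are not taken from the paper.

      from math import log, sqrt, erf

      def fragility(pga_g, median_capacity_g, beta):
          """Lognormal fragility curve: P(failure | a) = Phi(ln(a / Am) / beta)."""
          z = log(pga_g / median_capacity_g) / beta
          return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

      # Hypothetical structure: median capacity 0.9 g, composite log-std 0.4.
      for a in (0.2, 0.5, 0.9, 1.5):
          print(f"PGA {a:.1f} g -> P(failure) = {fragility(a, 0.9, 0.4):.3f}")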

  7. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  8. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  9. HOW INTERNAL RISK-BASED AUDIT APPRAISES THE EVALUATION OF RISKS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    N. Dorosh

    2017-09-01

    The article deals with the nature and function of the internal risk-based audit and its process approach to creating risk models and evaluation methods. It also deals with the relationship between a company's level of risk maturity and the method of risk-based internal audit. It is emphasized that internal auditing provides an independent and objective opinion to an organization's management as to whether its risks are being managed to acceptable levels.

  10. Project transition to risk based corrective action

    International Nuclear Information System (INIS)

    Judge, J.M.; Cormier, S.L.

    1996-01-01

    Many states have adopted or are considering the adoption of the American Society for Testing and Materials (ASTM) Standard E 1739, Standard for Risk Based Corrective Action (RBCA) Applied to Petroleum Release Sites. This standard is being adopted to regulate leaking underground storage tank (LUST) sites. The case studies of two LUST sites in Michigan will be presented to demonstrate the decision making process and limiting factors involved in transitioning sites to the RBCA program. Both of these case studies had been previously investigated and one was actively remediated. The first case study involves a private petroleum facility where soil and ground water have been impacted. Remediation involved a ground water pump and treat system. Subsequent monitoring during system operation indicated that analytical data were still above the Tier 1 RBSLs but below the Tier 2 SSTLs. The closure strategy that was developed was based on the compounds of concern that were below the SSTLs. A deed restriction was also developed for the site as an institutional control. The second LUST site exhibited BTEX concentrations in soil and ground water above the Tier 1 RBSLs. Due to the exceedance of the Tier 1 RBSLs, the second site required a Tier 2 assessment to develop SSTLs as remedial objectives and to remove hot spots in the soil and treat the ground water to achieve closure. Again, a deed restriction was instituted along with a performance monitoring plan

  11. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.

    2010-01-01

    of long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser...

  12. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca; Ehrhart, Brian David

    2018-03-01

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work on existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include basing the ventilation rate on area or volume, as well as a ceiling offset that seems ineffective at protecting against flammable gas concentrations.

  13. Cost effectiveness of risk-based closures at UST sites

    International Nuclear Information System (INIS)

    Scruton, K.M.; Baker, J.N.

    1995-01-01

    Risk-based closures have been achieved at Underground Storage Tank (UST) sites throughout the country for a major transportation company. The risk-based closures were cost-effective because a streamlined risk-based approach was used instead of the generic baseline risk assessment approach. USEPA has recently provided guidance encouraging the use of risk-based methodology for achieving closure at UST sites. The risk-based approach used in achieving the site closures involved an identification of potential human and ecological receptors and exposure pathways, and a comparison of maximum onsite chemical concentrations to applicable or relevant and appropriate requirements (ARARs). The ARARs used in the evaluation included Federal and/or State Maximum Contaminant Levels (MCLs) for groundwater and risk-based screening levels for soils. If the maximum concentrations were above the screening levels, a baseline risk assessment was recommended. In several instances, however, the risk-based approach resulted in a regulatory agency acceptance of a ''no further action'' alternative at UST sites which did not pose a significant threat to human health and the environment. The cost of the streamlined risk-based approach is approximately $3,500, while a baseline risk assessment for the same UST site could cost up to $10,000 or more. The use of the streamlined risk-based approach has proven to be successful for achieving a ''no further action'' outcome for the client at a reasonable cost

  14. Risk assessment of logistics outsourcing based on BP neural network

    Science.gov (United States)

    Liu, Xiaofeng; Tian, Zi-you

    The purpose of this article is to evaluate the risks of enterprise logistics outsourcing. To this end, the paper first analyses the main risks existing in logistics outsourcing and sets up a risk evaluation index system for logistics outsourcing; it then applies a BP (back-propagation) neural network to logistics outsourcing risk evaluation and uses MATLAB for the simulation. The results show that the network error is small and the approach is highly practicable, so the method can be used by enterprises to evaluate the risks of logistics outsourcing.
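
    The workflow described above (a risk-index system feeding a back-propagation network) can be sketched outside MATLAB as well. The example below uses scikit-learn's MLPRegressor as a stand-in BP network; the index scores, the weights used to synthesise training targets, and the network size are all hypothetical.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      # Hypothetical training data: each row holds expert scores (0-1) for five risk
      # indices (e.g. information risk, management risk, financial risk, ...);
      # the target is an overall risk rating assigned by experts.
      X = rng.uniform(0.0, 1.0, size=(200, 5))
      y = X @ np.array([0.30, 0.25, 0.20, 0.15, 0.10]) + rng.normal(0.0, 0.02, 200)

      # Small back-propagation network: one hidden layer, trained with a gradient-based optimiser.
      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X, y)

      new_project = np.array([[0.8, 0.6, 0.4, 0.3, 0.2]])   # risk-index scores for a new contract
      print(f"predicted overall risk: {model.predict(new_project)[0]:.2f}")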

  15. A risk-based microbiological criterion that uses the relative risk as the critical limit

    DEFF Research Database (Denmark)

    Andersen, Jens Kirk; Nørrung, Birgit; da Costa Alves Machado, Simone

    2015-01-01

    A risk-based microbiological criterion is described that is based on the relative risk associated with the analytical results of a number of samples taken from a food lot. The acceptable limit is a specific level of risk and not a specific number of microorganisms, as in other microbiological criteria. The approach requires the availability of a quantitative microbiological risk assessment model to obtain risk estimates for food products from sampled food lots. By relating these food lot risk estimates to the mean risk estimate associated with a representative baseline data set, a relative risk estimate can be obtained. This relative risk estimate can then be compared with a critical value defined by the criterion. This microbiological criterion based on a relative risk limit is particularly useful when quantitative enumeration data are available and when the prevalence of the microorganism
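
    The acceptance logic of such a criterion can be sketched as follows: a QMRA model converts enumeration results from the sampled lot into a risk estimate, which is divided by the mean risk of a baseline data set and compared with a critical relative-risk limit. The dose-response model, its parameter, the serving size, the counts and the limit below are all assumptions for illustration, not values from the paper.

      import math

      def risk_per_serving(conc_cfu_g, serving_g=25.0, r=1e-9):
          """Generic exponential dose-response model: P(illness) = 1 - exp(-r * dose)."""
          return 1.0 - math.exp(-r * conc_cfu_g * serving_g)

      def lot_relative_risk(lot_counts, baseline_counts):
          """Mean lot risk divided by mean baseline risk (both from enumeration data, cfu/g)."""
          lot_risk = sum(risk_per_serving(c) for c in lot_counts) / len(lot_counts)
          base_risk = sum(risk_per_serving(c) for c in baseline_counts) / len(baseline_counts)
          return lot_risk / base_risk

      baseline = [10, 50, 100, 20, 80]          # hypothetical baseline survey (cfu/g)
      sampled_lot = [200, 400, 150, 300, 250]   # hypothetical enumeration results for one lot
      rr = lot_relative_risk(sampled_lot, baseline)
      print(f"relative risk = {rr:.1f} ->", "reject" if rr > 2.0 else "accept")  # critical limit RR = 2 (assumed)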

  16. Identification of novel risk factors for community-acquired Clostridium difficile infection using spatial statistics and geographic information system analyses.

    Directory of Open Access Journals (Sweden)

    Deverick J Anderson

    Full Text Available The rate of community-acquired Clostridium difficile infection (CA-CDI is increasing. While receipt of antibiotics remains an important risk factor for CDI, studies related to acquisition of C. difficile outside of hospitals are lacking. As a result, risk factors for exposure to C. difficile in community settings have been inadequately studied.To identify novel environmental risk factors for CA-CDI.We performed a population-based retrospective cohort study of patients with CA-CDI from 1/1/2007 through 12/31/2014 in a 10-county area in central North Carolina. 360 Census Tracts in these 10 counties were used as the demographic Geographic Information System (GIS base-map. Longitude and latitude (X, Y coordinates were generated from patient home addresses and overlaid to Census Tracts polygons using ArcGIS; ArcView was used to assess "hot-spots" or clusters of CA-CDI. We then constructed a mixed hierarchical model to identify environmental variables independently associated with increased rates of CA-CDI.A total of 1,895 unique patients met our criteria for CA-CDI. The mean patient age was 54.5 years; 62% were female and 70% were Caucasian. 402 (21% patient addresses were located in "hot spots" or clusters of CA-CDI (p<0.001. "Hot spot" census tracts were scattered throughout the 10 counties. After adjusting for clustering and population density, age ≥ 60 years (p = 0.03, race (<0.001, proximity to a livestock farm (0.01, proximity to farming raw materials services (0.02, and proximity to a nursing home (0.04 were independently associated with increased rates of CA-CDI.Our study is the first to use spatial statistics and mixed models to identify important environmental risk factors for acquisition of C. difficile and adds to the growing evidence that farm practices may put patients at risk for important drug-resistant infections.

  17. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin

    2017-09-23

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  18. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin; Drillon, Guénola; Ryu, Taewoo; Voolstra, Christian R.; Aranda, Manuel

    2017-01-01

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  19. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Accurate prediction of gas planar distribution is crucial to the selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other one disregarding pre-stack inversion. A comparison of the results indicates that both models predicted a similar trend for gas content distribution, except that the model using pre-stack inversion yielded a prediction result with considerably higher precision than the other model.
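
    The multiple linear regression step described above can be sketched as an ordinary least-squares fit of gas content against seismic attributes at the drilled wells, followed by prediction at undrilled locations. The attribute values, gas contents and attribute selection below are hypothetical.

      import numpy as np

      # Hypothetical calibration data at drilled wells: columns are seismic attributes
      # (absorption attenuation, structure curvature, density, pseudo-Poisson ratio).
      attributes_at_wells = np.array([[0.62, -0.010, 2.45, 0.28],
                                      [0.55, -0.004, 2.50, 0.30],
                                      [0.70, -0.015, 2.40, 0.26],
                                      [0.48,  0.002, 2.55, 0.31],
                                      [0.66, -0.012, 2.43, 0.27]])
      gas_content = np.array([14.2, 11.5, 16.8, 9.7, 15.4])        # m3/t from well tests

      # Fit gas = b0 + b1*x1 + ... + b4*x4 by least squares.
      A = np.column_stack([np.ones(len(gas_content)), attributes_at_wells])
      coeffs, *_ = np.linalg.lstsq(A, gas_content, rcond=None)

      # Predict gas content at an undrilled location from its seismic attributes.
      new_location = np.array([1.0, 0.60, -0.008, 2.47, 0.29])
      print(f"predicted gas content: {new_location @ coeffs:.1f} m3/t")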

  20. Analyses of microstructural and elastic properties of porous SOFC cathodes based on focused ion beam tomography

    Science.gov (United States)

    Chen, Zhangwei; Wang, Xin; Giuliani, Finn; Atkinson, Alan

    2015-01-01

    Mechanical properties of porous SOFC electrodes are largely determined by their microstructures. Measurements of the elastic properties and microstructural parameters can be achieved by modelling of digitally reconstructed 3D volumes based on the real electrode microstructures. However, the reliability of such measurements is greatly dependent on the processing of the raw images acquired for reconstruction. In this work, the actual microstructures of La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) cathodes sintered at an elevated temperature were reconstructed based on dual-beam FIB/SEM tomography. Key microstructural and elastic parameters were estimated and correlated. Analyses of their sensitivity to the grayscale threshold value applied in the image segmentation were performed. The important microstructural parameters included porosity, tortuosity, specific surface area, particle and pore size distributions, and inter-particle neck size distribution, which may have varying extents of effect on the elastic properties simulated from the microstructures using FEM. Results showed that different threshold value ranges resulted in different degrees of sensitivity for a specific parameter. The estimated porosity and tortuosity were more sensitive than the surface area to volume ratio. Pore and neck size were found to be less sensitive than particle size. Results also showed that the modulus was essentially sensitive to the porosity, which was largely controlled by the threshold value.
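
    The threshold-sensitivity check described above amounts to segmenting the grayscale volume at several candidate thresholds and tracking how a parameter such as porosity responds. The sketch below does this on a synthetic stand-in volume; the grayscale distribution and threshold values are illustrative only.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic stand-in for a reconstructed FIB/SEM grayscale volume (0-255),
      # where darker voxels represent pores and brighter voxels solid LSCF.
      volume = rng.normal(loc=160, scale=40, size=(64, 64, 64)).clip(0, 255)

      def porosity(gray_volume, threshold):
          """Fraction of voxels classified as pore (grayscale below the threshold)."""
          return float(np.mean(gray_volume < threshold))

      for t in (100, 110, 120, 130, 140):
          print(f"threshold {t}: porosity = {porosity(volume, t):.3f}")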

  1. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment

    Science.gov (United States)

    2010-01-01

    ...) The bank must have a risk control unit that reports directly to senior management and is independent... management systems at least annually. (c) Market risk factors. The bank's internal model must use risk.... Section 4. Internal Models (a) General. For risk-based capital purposes, a bank subject to this appendix...

  2. Improving the safety of a body composition analyser based on the PGNAA method

    Energy Technology Data Exchange (ETDEWEB)

    Miri-Hakimabad, Hashem; Izadi-Najafabadi, Reza; Vejdani-Noghreiyan, Alireza; Panjeh, Hamed [FUM Radiation Detection And Measurement Laboratory, Ferdowsi University of Mashhad (Iran, Islamic Republic of)

    2007-12-15

    The ²⁵²Cf radioisotope and ²⁴¹Am-Be are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Some features, such as the high flux of neutron emission and the reliable neutron spectrum of these sources, make them suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. ²⁵²Cf and ²⁴¹Am-Be sources generate not only neutrons but are also intense gamma emitters. Furthermore, the sample in medical treatments is a human body, so it may be exposed to bombardment by these gamma-rays. Moreover, accumulations of these high-rate gamma-rays in the detector volume cause simultaneous pulses that can be piled up and distort the spectra in the region of interest (ROI). In order to remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield enclosing the neutron source. Gamma-ray shielding effects and the optimum radius of the spherical Pb shield have been investigated using the MCNP-4C code and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can effectively reduce the risk of exposure to the ²⁵²Cf and ²⁴¹Am-Be sources.

  3. Ecogeographical associations between climate and human body composition: analyses based on anthropometry and skinfolds.

    Science.gov (United States)

    Wells, Jonathan C K

    2012-02-01

    In the 19th century, two "ecogeographical rules" were proposed hypothesizing associations of climate with mammalian body size and proportions. Data on human body weight and relative leg length support these rules; however, it is unknown whether such associations are attributable to lean tissue (the heat-producing component) or fat (energy stores). Data on weight, height, and two skinfold thicknesses were obtained from the literature for 137 nonindustrialized populations, providing 145 male and 115 female individual samples. A variety of indices of adiposity and lean mass were analyzed. Preliminary analyses indicated secular increases in skinfolds in men but not women, and associations of age and height with lean mass in both sexes. Decreasing annual temperature was associated with increasing body mass index (BMI), and increasing triceps but not subscapular skinfold. After adjusting for skinfolds, decreasing temperature remained associated with increasing BMI. These results indicate that colder environments favor both greater peripheral energy stores and greater lean mass. Contrasting results for triceps and subscapular skinfolds might be due to adaptive strategies either constraining central adiposity in cold environments to reduce cardiovascular risk, or favoring central adiposity in warmer environments to maintain energetic support of the immune system. Polynesian populations were analyzed separately and contradicted all of the climate trends, indicating support for the hypothesis that they are cold-adapted despite occupying a tropical region. It is unclear whether such associations emerge through natural selection or through trans-generational and life-course plasticity. These findings nevertheless aid understanding of the wide variability in human physique and adiposity.

  4. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<2 s) seismograms are strongly affected by scattering from small-scale heterogeneities, an effect that is usually neglected when modelling long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
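
    The estimated spectrum quoted above, P(m) = 8πε²a³/(1 + a²m²)² with ε = 0.05 and a = 3.1 km, can be evaluated directly; the corner wavenumber sits near 1/a. The short sketch below does this for a few wavenumbers chosen for illustration.

      import numpy as np

      def heterogeneity_spectrum(m, epsilon=0.05, a=3.1):
          """P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2  (spectrum quoted in the abstract)."""
          return 8.0 * np.pi * epsilon**2 * a**3 / (1.0 + (a * m) ** 2) ** 2

      wavenumbers = np.array([0.1, 1.0 / 3.1, 1.0, 3.0])   # 1/km; 1/a is the corner wavenumber
      for m, p in zip(wavenumbers, heterogeneity_spectrum(wavenumbers)):
          print(f"m = {m:.3f} 1/km -> P(m) = {p:.3f} km^3")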

  5. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

    Full Text Available Among the 13 TLRs in the vertebrate systems, only TLR4 utilizes both the Myeloid differentiation factor 88 (MyD88) and the Toll/Interleukin-1 receptor (TIR)-domain-containing adapter inducing interferon-β (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling. But in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emergent systems-level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used a reaction stoichiometry-based and parameter-independent logical modeling formalism to build a TLR4 signaling network model that captured the feedback regulations, the interdependencies between signaling kinases and phosphatases, and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive; of these, 334 loops had the phosphatase PP1 as an essential component. The analysis of network element interdependencies (positive or negative dependencies under perturbation conditions such as phosphatase knockouts) revealed interdependencies between the dual-specificity phosphatases MKP-1 and MKP-3 and the kinases in the MAPK modules, and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated several previously reported experimental observations. The simulations mimicking Yersinia pestis and E. coli infections identified the key perturbations in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlight the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements; uncover novel signaling connections; and identify potential drug targets for
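
    The loop census described above (feedback loops classified as positive or negative by the product of their edge signs) can be illustrated on a toy signed network; the nodes, edges, and signs below are invented for illustration and are not the published TLR4 model, and networkx is an assumed dependency.

```python
import networkx as nx

# Toy signed network: nodes are signalling components, edge attribute 'sign' is
# +1 (activation) or -1 (inhibition). The topology here is purely illustrative.
edges = [
    ("TLR4", "MyD88", +1), ("MyD88", "IRAK", +1), ("IRAK", "NFkB", +1),
    ("NFkB", "MKP1", +1), ("MKP1", "IRAK", -1),    # a negative feedback loop
    ("NFkB", "TLR4", +1),                          # a positive feedback loop
]
G = nx.DiGraph()
for u, v, s in edges:
    G.add_edge(u, v, sign=s)

positive, negative = 0, 0
for cycle in nx.simple_cycles(G):
    # The sign of a loop is the product of its edge signs.
    sign = 1
    for u, v in zip(cycle, cycle[1:] + cycle[:1]):
        sign *= G[u][v]["sign"]
    positive += sign > 0
    negative += sign < 0

print(f"positive loops: {positive}, negative loops: {negative}")
```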

  6. 12 CFR 652.70 - Risk-based capital level.

    Science.gov (United States)

    2010-01-01

    ... risk-based capital level is the sum of the following amounts: (a) credit and interest rate risk ... (12 CFR 652.70; Banks and Banking; Farm Credit Administration, Farm Credit System, Federal Agricultural Mortgage Corporation; 2010 edition).

  7. Risk-based inspection of nuclear power plants

    International Nuclear Information System (INIS)

    Masopust, R.

    1995-01-01

    A multidiscipline research programme was developed in the USA to establish risk-based inspections for NPP structures and equipment components. Based on this US research effort, the risk-based procedure for developing inspection guidelines for NPPs is described. The procedure includes the definition of systems, qualitative risk assessment, qualitative risk analysis and development of the inspection programme. The method, when adopted and modified, is recommended also for risk-based inspections of structures and equipment of WWER-type NPPs. A pilot application of the method to unit 1 of the Surry NPP is summarized. (Z.S.) 1 tab., 1 fig., 5 refs

  8. Voxel-based morphometry analyses of in-vivo MRI in the aging mouse lemur primate

    Directory of Open Access Journals (Sweden)

    Stephen John Sawiak

    2014-05-01

    Full Text Available Cerebral atrophy is one of the most widespread brain alterations associated with aging. A clear relationship has been established between age-associated cognitive impairments and cerebral atrophy. The mouse lemur (Microcebus murinus) is a small primate used as a model of age-related neurodegenerative processes. It is the first nonhuman primate in which cerebral atrophy has been correlated with cognitive deficits. Previous studies of cerebral atrophy in this model were based on time-consuming manual delineation or measurement of selected brain regions from magnetic resonance images (MRI). These measures could not be used to analyse regions that cannot be easily outlined, such as the nucleus basalis of Meynert or the subiculum. In humans, morphometric assessment of structural changes with age is generally performed with automated procedures such as voxel-based morphometry (VBM). The objective of our work was to perform user-independent assessment of age-related morphological changes in the whole brain of large mouse lemur populations using VBM. The study was based on the SPMMouse toolbox of SPM 8 and involved thirty mouse lemurs aged from 1.9 to 11.3 years. The automatic method revealed for the first time atrophy in regions where manual delineation is prohibitive (nucleus basalis of Meynert, subiculum, prepiriform cortex, Brodmann areas 13-16, hypothalamus, putamen, thalamus, corpus callosum). Some of these regions are described as particularly sensitive to age-associated alterations in humans. The method also revealed age-associated atrophy in cortical regions (cingulate, occipital, parietal), the nucleus septalis, and the caudate. Manual measures performed in some of these regions were in good agreement with results from automatic measures. The templates generated in this study, as well as the toolbox for SPM8, can be downloaded. These tools will be valuable for future evaluation of various treatments that are tested to modulate cerebral aging in lemurs.

  9. Risk-based plant performance indicators

    International Nuclear Information System (INIS)

    Boccio, J.L.; Azarm, M.A.; Hall, R.E.

    1991-01-01

    Tasked by the 1979 President's Commission on the Accident at Three Mile Island, the U.S. nuclear power industry has put into place a performance indicator program as one means for showing a demonstrable record of achievement. Largely through the efforts of the Institute of Nuclear Power Operations (INPO), plant performance data has, since 1983, been collected and analyzed to aid utility management in measuring their plants' performance progress. The U.S. Nuclear Regulatory Commission (NRC) has also developed a set of performance indicators. This program, conducted by NRC's Office for the Analysis and Evaluation of Operational Data (AEOD), is structured to present information on plant operational performance in a manner that could enhance the staff's ability to recognize changes in the safety performance. Both organizations recognized that performance indicators have limitations and could be subject to misinterpretation and misuse with the potential for an adverse impact on safety. This paper reports on performance indicators presently in use, e.g., unplanned automatic scrams, unplanned safety system actuation, safety system failures, etc., which are logically related to safety. But, a reliability/risk-based method for evaluating either individual indicators or an aggregated set of indicators is not yet available

  10. SADA: Ecological Risk Based Decision Support System for Selective Remediation

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is freeware that implements terrestrial ecological risk assessment and yields a selective remediation design using its integral geographical information system, based on ecological and risk assessment inputs. Selective remediation ...

  11. Lifestyle-based risk model for fall risk assessment

    OpenAIRE

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Guiseppe

    2016-01-01

    Purpose: The aim of this study was to identify the explicit relationship between lifestyle and the risk of falling in the form of a mathematical model. Starting from personal and behavioral information about a subject, e.g., weight, height, age, data about physical activity habits, and concern about falling, the model would estimate the score of her/his Mini-Balance Evaluation Systems (Mini-BES) test. This score ranges between 0 and 28, and the lower its value the more likely the subj...

  12. Risk of Debt-Based Financing in Indonesian Islamic Banking

    Directory of Open Access Journals (Sweden)

    Kharisya Ayu Effendi

    2017-05-01

    Full Text Available The purpose of this study is to assess the risk of debt-based financing in Indonesian Islamic banking using accounting-based measures: NPF analysis, the credit risk Z-score, and the Altman Z-score. The data were obtained for 2011 to 2015 from the website of each bank. The result is that the risk of debt-based financing in Indonesian Islamic banking is low: all three accounting-based measures give the consistent result that Indonesian Islamic banks using debt-based financing have high financial stability and low risk. DOI: 10.15408/aiq.v9i2.4821
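
    A minimal sketch of the two Z-score measures named above, assuming the conventional definitions (bank credit-risk Z-score as (mean ROA + equity/assets) divided by the standard deviation of ROA; Altman's 1968 coefficients); the abstract does not state which variants the authors used, and the input figures are fabricated purely for illustration.

```python
import statistics

def credit_risk_z(roa_series, equity_to_assets):
    """Bank Z-score: (mean ROA + equity/assets) / std(ROA). Higher = more stable."""
    return (statistics.mean(roa_series) + equity_to_assets) / statistics.stdev(roa_series)

def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman (1968) Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5."""
    return 1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta + 0.6 * mve_tl + 1.0 * sales_ta

# Illustrative (fabricated) inputs for a single bank over 2011-2015.
print(credit_risk_z([0.012, 0.015, 0.011, 0.014, 0.013], 0.11))
print(altman_z(0.15, 0.10, 0.08, 1.2, 0.9))
```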

  13. Ecology of Subglacial Lake Vostok (Antarctica, Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Full Text Available Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  14. Loss of Flow Accident (LOFA) analyses using LabView-based NRR simulator

    Energy Technology Data Exchange (ETDEWEB)

    Arafa, Amany Abdel Aziz; Saleh, Hassan Ibrahim [Atomic Energy Authority, Cairo (Egypt). Radiation Engineering Dept.; Ashoub, Nagieb [Atomic Energy Authority, Cairo (Egypt). Reactor Physics Dept.

    2016-12-15

    This paper presents a generic Loss of Flow Accident (LOFA) scenario module which is integrated into the LabView-based simulator to imitate Nuclear Research Reactor (NRR) behavior for different user-defined LOFA scenarios. It also provides analyses of a LOFA in a single fuel channel and its impact on operational transients and on the behavior of the reactor. The generic LOFA scenario module includes the graphs needed to clarify the effects of the LOFA under study. Furthermore, the percentage of the loss of mass flow rate, the mode of flow reduction, and the start time and transient time of the LOFA are user defined to add flexibility to the LOFA scenarios. The objective of integrating such a generic LOFA module is to be able to deal with such incidents and avoid their significant effects. It is also useful in developing expertise in this area and in reducing operator training and simulation costs. The results of the implemented generic LOFA module agree well with those of the COBRA-IIIC code and the earlier guidebook for this series of transients.

  15. TAXONOMY AND GENETIC RELATIONSHIPS OF PANGASIIDAE, ASIAN CATFISHES, BASED ON MORPHOLOGICAL AND MOLECULAR ANALYSES

    Directory of Open Access Journals (Sweden)

    Rudhy Gustiano

    2007-12-01

    Full Text Available Pangasiids are economically important riverine catfishes generally residing in freshwater from the Indian subcontinent to the Indonesian Archipelago. The systematics of this family are still poorly known. Consequently, the lack of such basic information impedes the understanding of the biology of the Pangasiids and the study of their aquaculture potential, as well as the improvement of seed production and growth performance. The objectives of the present study are to clarify the phylogeny of this family based on a biometric analysis and molecular evidence using 12S ribosomal mtDNA on a total of 1,070 specimens. The study revealed that 28 species are recognised as valid in Pangasiidae. Four genera are also recognized (Helicophagus Bleeker 1858, Pangasianodon Chevey 1930, Pteropangasius Fowler 1937, and Pangasius Valenciennes 1840) instead of two as reported by previous workers. The phylogenetic analysis supported the recognised genera and the genetic relationships among taxa. Overall, trees from the different analyses show similar topologies and confirm the hypothesis derived from geological history, palaeontology, and similar models in other taxa of fishes from the same area. The oldest genus may already have existed when the Asian mainland was still connected to the islands in the southern part about 20 million years ago.

  16. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Melkerud, Per-Arne; Bain, Derek C.; Olsson, Mats T.

    2003-01-01

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on liming effects and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. Study sites were Horroed and Hassloev in southern Sweden. Both the Horroed and Hassloev sites are located on sandy loamy Weichselian till, at altitudes of 85 and 190 m a.s.l., respectively. Aliquots from volume-determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO3, and analysed by ICP-AES. Results indicated the highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmolc m⁻² yr⁻¹, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied.

  17. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA, in combination with the Monte Carlo transport code KENO-Va, to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors of the order of 100. (authors)
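
    One plausible reading of the two-series idea, sketched below under assumptions not spelled out in the abstract: if both result series share the same epistemic (input-parameter) samples but carry independent Monte Carlo noise, the covariance across paired results estimates the epistemic variance and the remainder is aleatoric. The numbers are synthetic, not XSUSA/KENO output.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100

# Synthetic demonstration: a response (e.g. k_eff) driven by sampled input data
# (epistemic) plus independent Monte Carlo noise (aleatoric) in each series.
epistemic = rng.normal(1.000, 0.005, n_samples)              # shared by both series
series_a = epistemic + rng.normal(0.0, 0.003, n_samples)     # few MC histories
series_b = epistemic + rng.normal(0.0, 0.003, n_samples)     # independent MC noise

total_var = 0.5 * (series_a.var(ddof=1) + series_b.var(ddof=1))
epistemic_var = np.cov(series_a, series_b)[0, 1]             # shared part only
aleatoric_var = total_var - epistemic_var

print(f"epistemic sigma ~ {np.sqrt(epistemic_var):.4f}")
print(f"aleatoric  sigma ~ {np.sqrt(max(aleatoric_var, 0.0)):.4f}")
```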

  18. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.

  19. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Background: There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods: Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results: 104 trainees recorded 23,918 operations between two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision, with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy, with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions: A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.

  20. [Research on fast classification based on LIBS technology and principle component analyses].

    Science.gov (United States)

    Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng

    2014-11-01

    Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were done on thirteen different standard samples of aluminum alloy belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectral data from the LIBS experiments: the three principal components that contribute the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. It was found that the spectrum sample points cluster clearly according to the type of aluminum alloy they belong to. This result confirmed the three principal components and the preliminary aluminum alloy type zoning. In order to verify its accuracy, 20 different aluminum alloy samples were used in the same experiments to verify the aluminum alloy type zoning. The experimental results showed that the spectrum sample points all located in the corresponding area of their aluminum alloy type, which proved the correctness of the earlier aluminum alloy standard sample type zoning method. Based on this, the identification of an unknown type of aluminum alloy can be done. All the experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and that it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can perform detection of the sample in situ and fast, with little sample preparation; therefore, using the combination of LIBS and PCA in areas such as quality testing and on-line industrial control can save a lot of time and cost, and greatly improve the efficiency of detection.
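
    A minimal sketch of the PCA step described above, assuming scikit-learn; the spectra are random placeholders standing in for background-corrected LIBS spectra, and the class labels are invented.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Placeholder LIBS spectra: 13 standard samples x 2048 wavelength channels.
# In practice each row would be an averaged, background-corrected LIBS spectrum.
spectra = rng.random((13, 2048))
alloy_type = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 3, 3, 3])  # 4 alloy classes

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)        # (13, 3) principal-component scores

print("explained variance ratio:", pca.explained_variance_ratio_)
for cls in np.unique(alloy_type):
    centroid = scores[alloy_type == cls].mean(axis=0)   # centre of each class zone
    print(f"class {cls} centroid in PC space: {np.round(centroid, 3)}")
```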

  1. The term 'risk' and its evaluation bases

    International Nuclear Information System (INIS)

    Brueckner, R.

    1976-01-01

    The term risk, the risk itself and its application to radiation exposure in practised medicine are presented from the following points of view: life expectancy, susceptibility to sickness and permanent inability to work, impaired professional and earning capacity, work accidents and sickness. (HP) [de]

  2. Model-based analyses to compare health and economic outcomes of cancer control: inclusion of disparities.

    Science.gov (United States)

    Goldie, Sue J; Daniels, Norman

    2011-09-21

    Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without

  3. Risk and Protective Factors for Intimate Partner Violence Against Women: Systematic Review and Meta-analyses of Prospective-Longitudinal Studies.

    Science.gov (United States)

    Yakubovich, Alexa R; Stöckl, Heidi; Murray, Joseph; Melendez-Torres, G J; Steinert, Janina I; Glavin, Calla E Y; Humphreys, David K

    2018-07-01

    The estimated lifetime prevalence of physical or sexual intimate partner violence (IPV) is 30% among women worldwide. Understanding risk and protective factors is essential for designing effective prevention strategies. To quantify the associations between prospective-longitudinal risk and protective factors and IPV and identify evidence gaps. We conducted systematic searches in 16 databases including MEDLINE and PsycINFO from inception to June 2016. The study protocol is registered with PROSPERO (CRD42016039213). We included published and unpublished studies available in English that prospectively analyzed any risk or protective factor(s) for self-reported IPV victimization among women and controlled for at least 1 other variable. Three reviewers were involved in study screening. One reviewer extracted estimates of association and study characteristics from each study and 2 reviewers independently checked a random subset of extractions. We assessed study quality with the Cambridge Quality Checklists. When studies investigated the same risk or protective factor using similar measures, we computed pooled odds ratios (ORs) by using random-effects meta-analyses. We summarized heterogeneity with I² and τ². We synthesized all estimates of association, including those not meta-analyzed, by using harvest plots to illustrate evidence gaps and trends toward negative or positive associations. Of 18 608 studies identified, 60 were included and 35 meta-analyzed. Most studies were based in the United States. The strongest evidence for modifiable risk factors for IPV against women were unplanned pregnancy (OR = 1.66; 95% confidence interval [CI] = 1.20, 1.31) and having parents with less than a high-school education (OR = 1.55; 95% CI = 1.10, 2.17). Being older (OR = 0.96; 95% CI = 0.93, 0.98) or married (OR = 0.93; 95% CI = 0.87, 0.99) were protective. To our knowledge, this is the first systematic, meta-analytic review of all risk and
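
    A minimal sketch of the pooling described above, using the standard DerSimonian-Laird random-effects estimator with I² and τ²; the study-level odds ratios and confidence intervals are invented for illustration and are not taken from the review.

```python
import numpy as np

# Illustrative study-level odds ratios with 95% CIs (fabricated numbers).
or_ci = [(1.4, 1.1, 1.8), (1.9, 1.2, 3.0), (1.3, 0.9, 1.9), (1.7, 1.3, 2.2)]

y = np.log([o for o, lo, hi in or_ci])                        # log-odds ratios
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for o, lo, hi in or_ci])
w = 1.0 / se**2                                               # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2 and heterogeneity I^2.
y_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fe) ** 2)
df = len(y) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
i2 = max(0.0, (Q - df) / Q) * 100

w_re = 1.0 / (se**2 + tau2)                                   # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
pooled_se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}, "
      f"{np.exp(pooled + 1.96 * pooled_se):.2f}); tau^2 = {tau2:.3f}, I^2 = {i2:.0f}%")
```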

  4. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  5. Data base of accident and agricultural statistics for transportation risk assessment

    International Nuclear Information System (INIS)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs

  6. Pareto frontier analyses based decision making tool for transportation of hazardous waste

    International Nuclear Information System (INIS)

    Das, Arup; Mazumder, T.N.; Gupta, A.K.

    2012-01-01

    Highlights: ► Posteriori method using multi-objective approach to solve bi-objective routing problem. ► System optimization (with multiple source–destination pairs) in a capacity constrained network using non-dominated sorting. ► Tools like cost elasticity and angle based focus used to analyze Pareto frontier to aid stakeholders in making informed decisions. ► A real life case study of Kolkata Metropolitan Area to explain the workability of the model. - Abstract: Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to populations exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional Hazardous Waste Management scheme should incorporate a comprehensive framework for hazardous waste transportation. This framework would incorporate the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing of hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology.
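
    A minimal sketch of the non-dominated (Pareto) filtering at the core of the posteriori approach described above, applied to a handful of hypothetical route options scored on transport cost and population risk; the candidate values are placeholders, not results from the Kolkata case study.

```python
# Each candidate solution: (label, transport cost, population-exposure risk).
candidates = [
    ("route-A", 120.0, 0.85), ("route-B", 150.0, 0.40),
    ("route-C", 100.0, 1.10), ("route-D", 180.0, 0.35),
    ("route-E", 160.0, 0.70),
]

def dominates(p, q):
    """p dominates q if p is no worse in both objectives and better in at least one."""
    return (p[1] <= q[1] and p[2] <= q[2]) and (p[1] < q[1] or p[2] < q[2])

pareto = [p for p in candidates
          if not any(dominates(q, p) for q in candidates if q is not p)]

# Stakeholders then pick from the frontier, e.g. using cost elasticity between
# neighbouring frontier points (how much extra cost buys a unit of risk reduction).
for label, cost, risk in sorted(pareto, key=lambda s: s[1]):
    print(f"{label}: cost={cost}, risk={risk}")
```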

  7. Immunochip analyses identify a novel risk locus for primary biliary cirrhosis at 13q14, multiple independent associations at four established risk loci and epistasis between 1p31 and 7q32 risk variants

    Science.gov (United States)

    Juran, Brian D.; Hirschfield, Gideon M.; Invernizzi, Pietro; Atkinson, Elizabeth J.; Li, Yafang; Xie, Gang; Kosoy, Roman; Ransom, Michael; Sun, Ye; Bianchi, Ilaria; Schlicht, Erik M.; Lleo, Ana; Coltescu, Catalina; Bernuzzi, Francesca; Podda, Mauro; Lammert, Craig; Shigeta, Russell; Chan, Landon L.; Balschun, Tobias; Marconi, Maurizio; Cusi, Daniele; Heathcote, E. Jenny; Mason, Andrew L.; Myers, Robert P.; Milkiewicz, Piotr; Odin, Joseph A.; Luketic, Velimir A.; Bacon, Bruce R.; Bodenheimer, Henry C.; Liakina, Valentina; Vincent, Catherine; Levy, Cynthia; Franke, Andre; Gregersen, Peter K.; Bossa, Fabrizio; Gershwin, M. Eric; deAndrade, Mariza; Amos, Christopher I.; Lazaridis, Konstantinos N.; Seldin, Michael F.; Siminovitch, Katherine A.

    2012-01-01

    To further characterize the genetic basis of primary biliary cirrhosis (PBC), we genotyped 2426 PBC patients and 5731 unaffected controls from three independent cohorts using a single nucleotide polymorphism (SNP) array (Immunochip) enriched for autoimmune disease risk loci. Meta-analysis of the genotype data sets identified a novel disease-associated locus near the TNFSF11 gene at 13q14, provided evidence for association at six additional immune-related loci not previously implicated in PBC and confirmed associations at 19 of 22 established risk loci. Results of conditional analyses also provided evidence for multiple independent association signals at four risk loci, with haplotype analyses suggesting independent SNP effects at the 2q32 and 16p13 loci, but complex haplotype-driven effects at the 3q25 and 6p21 loci. By imputing classical HLA alleles from this data set, four class II alleles independently contributing to the association signal from this region were identified. Imputation of genotypes at the non-HLA loci also provided additional associations, but none with stronger effects than the genotyped variants. An epistatic interaction between the IL12RB2 risk locus at 1p31 and the IRF5 risk locus at 7q32 was also identified and suggests a complementary effect of these loci in predisposing to disease. These data expand the repertoire of genes with potential roles in PBC pathogenesis that need to be explored by follow-up biological studies. PMID:22936693

  8. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as "risk-based decision analysis," also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk-basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA.

  9. Description of OPRA: A Danish database designed for the analyses of risk factors associated with 30-day hospital readmission of people aged 65+ years.

    Science.gov (United States)

    Pedersen, Mona K; Nielsen, Gunnar L; Uhrenfeldt, Lisbeth; Rasmussen, Ole S; Lundbye-Christensen, Søren

    2017-08-01

    To describe the construction of the Older Person at Risk Assessment (OPRA) database, the ability to link this database with existing data sources obtained from Danish nationwide population-based registries and to discuss its research potential for the analyses of risk factors associated with 30-day hospital readmission. We reviewed Danish nationwide registries to obtain information on demographic and social determinants as well as information on health and health care use in a population of hospitalised older people. The sample included all people aged 65+ years discharged from Danish public hospitals in the period from 1 January 2007 to 30 September 2010. We used personal identifiers to link and integrate the data from all events of interest with the outcome measures in the OPRA database. The database contained records of the patients, admissions and variables of interest. The cohort included 1,267,752 admissions for 479,854 unique people. The rate of 30-day all-cause acute readmission was 18.9% (n = 239,077) and the overall 30-day mortality was 5.0% (n = 63,116). The OPRA database provides the possibility of linking data on health and life events in a population of people moving into retirement and ageing. Construction of the database makes it possible to outline individual life and health trajectories over time, transcending organisational boundaries within health care systems. The OPRA database is multi-component and multi-disciplinary in orientation and has been prepared to be used in a wide range of subgroup analyses, including different outcome measures and statistical methods.

  10. Process-based project proposal risk management

    Directory of Open Access Journals (Sweden)

    Alok Kumar

    2016-12-01

    Full Text Available We are all aware of the omnipresence of organizations, and projects within organizations are ubiquitous too. Projects achieve their goals successfully if they are planned, scheduled, controlled and implemented well. The project lifecycle of initiating, planning, scheduling, controlling and implementing is very well planned by project managers and the organizations. Successful projects have well-developed risk management plans to deal with situations impacting projects. Like any other organisation, a university also tries to access funds for different purposes. For such organisations, running a project is not the issue; rather, getting a project proposal approved to fund a project is the key. Project proposal processing is done by the nodal office in every organisation. Usually, these nodal offices help in the administration and submission of a project proposal for accessing funds. Seldom do these nodal project offices within the organizations facilitate project proposal approval by proactively reaching out to the project managers. And as project managers prepare project proposals, little or no attention is paid to preparing a project proposal risk plan so as to maximise project acquisition. Risk plans are submitted while preparing proposals, but these risk plans cater to a requirement to address actual projects upon approval. Hence, a risk management plan for the project proposal is either missing or very little effort is made to treat the risks inherent in project acquisition. This paper is an attempt to highlight the importance of risk treatment at the project proposal stage as an extremely important step prior to preparing the risk management plans made for projects corresponding to their lifecycle phases. Several tools and techniques have been proposed in the paper to help and guide either the project owner (proposer) or the main organisational unit responsible for project management. Development of tools and techniques to further enhance project

  11. Air Quality Monitoring: Risk-Based Choices

    Science.gov (United States)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality. Air quality monitoring requires us to target the credible residual risks. Constraints on monitoring devices are severe. Monitoring must transition from archival to real-time, on-board measurement, and must provide data to the crew in a form they can interpret. Dust management and monitoring may be a major concern for exploration-class missions.

  12. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering

    1990-01-01

    Comparisons of the potential cogeneration efficiencies are made, based on energy and exergy analyses, for several devices for electricity generation. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide), and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device for which fuel and air enter, and electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulnesses of heat and electricity on equivalent bases. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
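
    A minimal sketch of the two efficiency definitions being contrasted, assuming the usual forms: an energy-based cogeneration efficiency that sums electricity and heat directly, and an exergy-based efficiency that discounts heat by its Carnot factor (1 - T0/T). The plant figures and the fuel exergy-to-energy ratio are illustrative, not taken from the paper.

```python
def energy_efficiency(w_elec, q_heat, fuel_energy):
    """Energy-based cogeneration efficiency: (electricity + heat) / fuel energy."""
    return (w_elec + q_heat) / fuel_energy

def exergy_efficiency(w_elec, q_heat, t_supply, t_ambient, fuel_exergy):
    """Exergy-based efficiency: heat is weighted by the Carnot factor (1 - T0/T)."""
    carnot = 1.0 - t_ambient / t_supply
    return (w_elec + q_heat * carnot) / fuel_exergy

# Illustrative cogeneration plant: 40 MW electricity, 50 MW heat delivered at 400 K,
# ambient 298 K, 120 MW fuel energy input, fuel exergy ~ 1.05 x fuel energy.
print(f"energy efficiency: {energy_efficiency(40, 50, 120):.2f}")
print(f"exergy efficiency: {exergy_efficiency(40, 50, 400.0, 298.0, 1.05 * 120):.2f}")
```

    With these illustrative numbers the energy-based figure is noticeably higher than the exergy-based one, which is the overly optimistic tendency of energy analysis that the abstract points to.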

  13. Incentivising flood risk adaptation through risk based insurance premiums : Trade-offs between affordability and risk reduction

    NARCIS (Netherlands)

    Hudson, Paul F.; Botzen, W.J.W.; Feyen, L.; Aerts, Jeroen C.J.H.

    2016-01-01

    The financial incentives offered by the risk-based pricing of insurance can stimulate policyholder adaptation to flood risk while potentially conflicting with affordability. We examine the trade-off between risk reduction and affordability in a model of public-private flood insurance in France and

  14. Teleseism-based Relative Time Corrections for Modern Analyses of Digitized Analog Seismograms

    Science.gov (United States)

    Lee, T. A.; Ishii, M.

    2017-12-01

    With modern-day instruments and seismic networks timed by GPS systems, synchronization of data streams is all but a foregone conclusion. However, during the analog era, when each station had its own clock, comparing data timing from different stations was a far more daunting prospect. Today, with recently developed methods by which analog data can be digitized, the ability to accurately reconcile the timings of two separate stations would open decades worth of data to modern analyses. For example, one possible and exciting application would be using noise interferometry with digitized analog data to investigate changing structural features (on a volcano, for example) over a much longer timescale than was previously possible. With this in mind, we introduce a new approach to synchronize time between stations based on teleseismic arrivals. P-wave arrivals are identified at stations for pairs of earthquakes from the digital and analog eras that have nearly identical distances, locations, and depths. Assuming accurate timing of the modern data, relative time corrections between a pair of stations can then be inferred for the analog data. This method for time correction depends upon the analog stations having modern equivalents, and both having sufficiently long durations of operation to allow for recording of usable teleseismic events. The Hawaii Volcano Observatory (HVO) network is an especially ideal environment for this, as it not only has a large and well-preserved collection of analog seismograms, but also has a long operating history (1912-present), with many of the older stations having modern equivalents. As such, the scope of this project is to calculate and apply relative time corrections to analog data from two HVO stations, HILB (1919-present) and UWE (1928-present), HILB now being part of the Pacific Tsunami network. Further application of this method could be for investigation of the effects of relative clock-drift, that is, the determining factor for how

  15. Molecular Characterization of Five Potyviruses Infecting Korean Sweet Potatoes Based on Analyses of Complete Genome Sequences

    Directory of Open Access Journals (Sweden)

    Hae-Ryun Kwak

    2015-12-01

    Full Text Available Sweet potatoes (Ipomoea batatas L.) are grown extensively in tropical and temperate regions and are important food crops worldwide. In Korea, potyviruses, including Sweet potato feathery mottle virus (SPFMV), Sweet potato virus C (SPVC), Sweet potato virus G (SPVG), Sweet potato virus 2 (SPV2), and Sweet potato latent virus (SPLV), have been detected in sweet potato fields at a high (~95%) incidence. In the present work, complete genome sequences of 18 isolates, representing the five potyviruses mentioned above, were compared with previously reported genome sequences. The complete genomes consisted of 10,081 to 10,830 nucleotides, excluding the poly-A tails. Their genomic organizations were typical of the genus Potyvirus, including one large open reading frame coding for a putative polyprotein. Based on phylogenetic analyses and sequence comparisons, the Korean SPFMV isolates belonged to the strains RC and O with >98% nucleotide sequence identity. Korean SPVC isolates had 99% identity to the Japanese isolate SPVC-Bungo and 70% identity to the SPFMV isolates. The Korean SPVG isolates showed 99% identity to the three previously reported SPVG isolates. Korean SPV2 isolates had 97% identity to the SPV2 GWB-2 isolate from the USA. Korean SPLV isolates had a relatively low (88%) nucleotide sequence identity with the Taiwanese SPLV-TW isolates, and they were phylogenetically distantly related to the SPFMV isolates. Recombination analysis revealed that possible recombination events occurred in the P1, HC-Pro and NIa-NIb regions of the SPFMV and SPLV isolates, and these regions were identified as hotspots for recombination in the sweet potato potyviruses.

  16. RiskREP: Risk-Based Security Requirements Elicitation and Prioritization

    OpenAIRE

    Herrmann, Andrea; Morali, A.; Etalle, Sandro; Wieringa, Roelf J.; Niedrite, Laila; Strazdina, Renate; Wangler, Benkt

    2011-01-01

    Companies are under pressure to be in control of their assets but at the same time they must operate as efficiently as possible. This means that they aim to implement "good-enough security" but need to be able to justify their security investment plans. In this paper, we present a Risk-Based Requirements Prioritization method (RiskREP) that extends misuse-case-based methods with IT-architecture-based risk assessment and countermeasure definition and prioritization. Countermeasure prioritizati...

  17. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    Science.gov (United States)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is an item of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each item of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment, using the semi-quantitative method of the API 581 standard, place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
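
    A minimal sketch of how a semi-quantitative probability x consequence ranking such as "4C" maps onto a risk level; the banding used below is illustrative only and is not the actual API 581 risk matrix.

```python
# Probability categories 1-5 (low to high) and consequence categories A-E
# (low to high). The banding below is illustrative only; API 581 defines
# its own risk matrix.
def risk_level(prob_cat: int, cons_cat: str) -> str:
    score = prob_cat + "ABCDE".index(cons_cat.upper()) + 1   # 2 (1A) .. 10 (5E)
    if score <= 3:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium-high"
    return "high"

equipment = {"HP superheater": (4, "C"),
             "HP evaporator": (4, "C"),
             "HP economizer": (3, "C")}
for item, (p, c) in equipment.items():
    print(f"{item}: {p}{c} -> {risk_level(p, c)}")
```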

  18. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    International Nuclear Information System (INIS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-01-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is an item of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each item of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment, using the semi-quantitative method of the API 581 standard, place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.

  19. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    Energy Technology Data Exchange (ETDEWEB)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky [Department of Mechanical Engineering, Diponegoro University, Semarang (Indonesia); Kim, Seon Jin [Department of Mechanical & Automotive Engineering of Pukyong National University (Korea, Republic of)

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is an item of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each item of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment, using the semi-quantitative method of the API 581 standard, place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.

  20. Deconvoluting complex tissues for expression quantitative trait locus-based analyses

    DEFF Research Database (Denmark)

    Seo, Ji-Heui; Li, Qiyuan; Fatima, Aquila

    2013-01-01

    Breast cancer genome-wide association studies have pinpointed dozens of variants associated with breast cancer pathogenesis. The majority of risk variants, however, are located outside of known protein-coding regions. Therefore, identifying which genes the risk variants are acting through present...

  1. Risk perception, risk evaluation and human values: cognitive bases of acceptability of a radioactive waste repository

    International Nuclear Information System (INIS)

    Earle, T.C.; Lindell, M.K.; Rankin, W.L.

    1981-07-01

    Public acceptance of radioactive waste management alternatives depends in part on public perception of the associated risks. Three aspects of those perceived risks were explored in this study: (1) synthetic measures of risk perception based on judgments of probability and consequences; (2) acceptability of hypothetical radioactive waste policies, and (3) effects of human values on risk perception. Both the work on synthetic measures of risk perception and on the acceptability of hypothetical policies included investigations of three categories of risk: (1) Short-term public risk (affecting persons living when the wastes are created), (2) Long-term public risk (affecting persons living after the time the wastes were created), and (3) Occupational risk (affecting persons working with the radioactive wastes). The human values work related to public risk perception in general, across categories of persons affected

  2. Risk perception, risk evaluation and human values: cognitive bases of acceptability of a radioactive waste repository

    Energy Technology Data Exchange (ETDEWEB)

    Earle, T.C.; Lindell, M.K.; Rankin, W.L.

    1981-07-01

    Public acceptance of radioactive waste management alternatives depends in part on public perception of the associated risks. Three aspects of those perceived risks were explored in this study: (1) synthetic measures of risk perception based on judgments of probability and consequences; (2) acceptability of hypothetical radioactive waste policies, and (3) effects of human values on risk perception. Both the work on synthetic measures of risk perception and on the acceptability of hypothetical policies included investigations of three categories of risk: (1) Short-term public risk (affecting persons living when the wastes are created), (2) Long-term public risk (affecting persons living after the time the wastes were created), and (3) Occupational risk (affecting persons working with the radioactive wastes). The human values work related to public risk perception in general, across categories of persons affected. Respondents were selected according to a purposive sampling strategy.

  3. Pathways and cost-risk-benefit analyses for INEL radioactively contaminated soil areas being evaluated for decontamination and decommissioning

    International Nuclear Information System (INIS)

    Chapin, J.A.

    1980-12-01

    Several radioactively contaminated soil areas exist at the Idaho National Engineering Laboratory; virtually all are contaminated with nuclides of cesium, strontium, and cobalt at low levels of activity. This study develops a method of analysis to determine cost-effective alternatives for decommissioning these areas, considering the risk to the workers and the general public, as well as the benefits to be gained. Because much of the input data to the analysis is highly subjective and detailed radiological characterization of the soil areas is minimal, it was decided that an analysis based on a relative weighting method be employed. The results of this analysis constitute a relative prioritization list of the soil areas being considered for decommissioning as well as the recommended decommissioning alternatives. The results of this analysis indicate that, of the 46 areas considered, 11 should be left in place under protective storage and 16 should be left as is. Nineteen areas were not analyzed because they were either operational or characterization data were not available. These results are based on a maximum exposure to a member of the general population, through realistic exposure pathways, of 5 mrem/yr
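
    A minimal sketch of a relative-weighting prioritization of the kind described above: each area receives a weighted-sum score over normalized cost, risk, and benefit attributes. The areas, attribute values, and weights are hypothetical.

```python
# Illustrative relative-weighting prioritization: a higher score means a higher
# priority for decommissioning. Attribute values are normalized to 0-1; weights
# are the subjective importance assigned to each criterion (they sum to 1).
weights = {"public_risk": 0.4, "worker_risk": 0.2, "benefit": 0.3, "cost": 0.1}

areas = {   # hypothetical soil areas with normalized attribute scores
    "Area-1": {"public_risk": 0.8, "worker_risk": 0.3, "benefit": 0.7, "cost": 0.4},
    "Area-2": {"public_risk": 0.2, "worker_risk": 0.6, "benefit": 0.4, "cost": 0.9},
    "Area-3": {"public_risk": 0.5, "worker_risk": 0.5, "benefit": 0.6, "cost": 0.5},
}

def priority(attrs):
    # Cost counts against priority, so it enters with a negative sign.
    return sum(weights[k] * v * (-1 if k == "cost" else 1) for k, v in attrs.items())

for name, score in sorted(((a, priority(v)) for a, v in areas.items()),
                          key=lambda x: x[1], reverse=True):
    print(f"{name}: priority score {score:.2f}")
```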

  4. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    Directory of Open Access Journals (Sweden)

    Zhigang Zuo

    2014-04-01

    Full Text Available It is important, from the manufacturer's point of view, to evaluate the economic efficiency of boiler circulating pumps in the manufacturing process. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were first carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were discussed. A model with a casing diameter of 0.875D40 was selected for further analyses. FEM analyses were then carried out on different combinations of casing sizes, casing wall thicknesses, and materials to evaluate safety with respect to pressure integrity, in both static and fatigue strength analyses. Two models with forged and cast materials were selected as final results.

  5. Risk assessment and model for community-based construction ...

    African Journals Online (AJOL)

    It, therefore, becomes necessary to systematically manage uncertainty in community-based construction in order to increase the likelihood of meeting project objectives using necessary risk management strategies. Risk management, which is an iterative process due to the dynamic nature of many risks, follows three main ...

  6. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2007-01-01

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more intensively, so as to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of failure of equipment and the consequences of failure. In this paper, the risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.
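
    As a rough illustration of the risk-based maintenance idea (risk = probability of failure x consequence of failure, with inspection effort concentrated on the highest-risk items), the sketch below uses invented equipment tags, probabilities, consequence costs, and thresholds.

        # Risk-based maintenance sketch: rank equipment by risk = P(failure) * consequence
        # and assign shorter inspection intervals to higher-risk items.
        # Tags, probabilities, consequence costs, and thresholds are illustrative only.

        equipment = [
            {"tag": "P-101", "p_fail_per_year": 0.02, "consequence_cost": 5.0e6},
            {"tag": "V-201", "p_fail_per_year": 0.10, "consequence_cost": 2.0e5},
            {"tag": "E-301", "p_fail_per_year": 0.005, "consequence_cost": 1.0e7},
        ]

        for item in equipment:
            item["risk"] = item["p_fail_per_year"] * item["consequence_cost"]

        def inspection_interval_months(risk, high=1.0e5, medium=1.0e4):
            """Simple policy: the higher the risk, the more frequent the inspection."""
            if risk >= high:
                return 6
            if risk >= medium:
                return 12
            return 36

        for item in sorted(equipment, key=lambda e: e["risk"], reverse=True):
            print(item["tag"], f"risk={item['risk']:.0f}",
                  f"inspect every {inspection_interval_months(item['risk'])} months")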

  7. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    International Nuclear Information System (INIS)

    Kitchen, Marcus J.; Pavlov, Konstantin M.; Hooper, Stuart B.; Vine, David J.; Siu, Karen K.W.; Wallace, Megan J.; Siew, Melissa L.L.; Yagi, Naoto; Uesugi, Kentaro; Lewis, Rob A.

    2008-01-01

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution

  8. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kitchen, Marcus J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: Marcus.Kitchen@sci.monash.edu.au; Pavlov, Konstantin M. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia); Physics and Electronics, School of Science and Technology, University of New England, NSW 2351 (Australia)], E-mail: Konstantin.Pavlov@sci.monash.edu.au; Hooper, Stuart B. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Stuart.Hooper@med.monash.edu.au; Vine, David J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: David.Vine@sci.monash.edu.au; Siu, Karen K.W. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Karen.Siu@sci.monash.edu.au; Wallace, Megan J. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Megan.Wallace@med.monash.edu.au; Siew, Melissa L.L. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Melissa.Siew@med.monash.edu.au; Yagi, Naoto [SPring-8/JASRI, Sayo (Japan)], E-mail: yagi@spring8.or.jp; Uesugi, Kentaro [SPring-8/JASRI, Sayo (Japan)], E-mail: ueken@spring8.or.jp; Lewis, Rob A. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Rob.Lewis@sync.monash.edu.au

    2008-12-15

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  9. A Knowledge-Based Model of Audit Risk

    OpenAIRE

    Dhar, Vasant; Lewis, Barry; Peters, James

    1988-01-01

    Within the academic and professional auditing communities, there has been growing concern about how to accurately assess the various risks associated with performing an audit. These risks are difficult to conceptualize in terms of numeric estimates. This article discusses the development of a prototype computational model (computer program) that assesses one of the major audit risks -- inherent risk. This program bases most of its inferencing activities on a qualitative model of a typical bus...

  10. Big data based fraud risk management at Alibaba

    OpenAIRE

    Chen, Jidong; Tao, Ye; Wang, Haoran; Chen, Tao

    2015-01-01

    With the development of mobile internet and finance, fraud risk comes in all shapes and sizes. This paper introduces the fraud risk management practice at Alibaba under big data. Alibaba has built a fraud risk monitoring and management system based on real-time big data processing and intelligent risk models. It captures fraud signals directly from huge amounts of user behavior and network data, analyzes them in real time using machine learning, and accurately predicts bad users and transactions....

  11. SeeSway - A free web-based system for analysing and exploring standing balance data.

    Science.gov (United States)

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy to use and platform independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion reference commercial force platforms and three dimensional motion analysis, smartphones, accelerometers and low-cost technology such as Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate
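
    As one concrete example of the standard sway measures mentioned above, the sketch below (not SeeSway's actual code) computes path length, amplitude, and root mean square from an anterior-posterior/medial-lateral coordinate trace; the input trace is random placeholder data, not real posturography output.

        import numpy as np

        # Standing-balance metrics from a centre-of-pressure (COP) trace.
        # cop is an (n, 2) array of anterior-posterior and medial-lateral coordinates;
        # the data below are random placeholders standing in for a 30 s, 100 Hz recording.
        rng = np.random.default_rng(0)
        cop = np.cumsum(rng.normal(scale=0.05, size=(3000, 2)), axis=0)

        def sway_metrics(cop):
            centred = cop - cop.mean(axis=0)
            step = np.diff(cop, axis=0)
            path_length = np.sum(np.hypot(step[:, 0], step[:, 1]))   # total COP excursion
            amplitude = centred.max(axis=0) - centred.min(axis=0)    # AP/ML range
            rms = np.sqrt(np.mean(centred ** 2, axis=0))             # AP/ML root mean square
            return {"path_length": path_length, "amplitude_ap_ml": amplitude, "rms_ap_ml": rms}

        print(sway_metrics(cop))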

  12. Principles of risk-based decision making

    National Research Council Canada - National Science Library

    United States. Coast Guard. Risk based decision-making guidelines

    2001-01-01

    ... original content in order to make this product more generically applicable and less Coast Guard specific. Risk assessment and risk management are important topics in industry and government. Because of limited resources and increasing demands for services, most organizations simply cannot continue business as usual. Even if resources are not dec...

  13. Internet-based screening for dementia risk.

    Science.gov (United States)

    Brandt, Jason; Sullivan, Campbell; Burrell, Larry E; Rogerson, Mark; Anderson, Allan

    2013-01-01

    The Dementia Risk Assessment (DRA) is an online tool consisting of questions about known risk factors for dementia, a novel verbal memory test, and an informant report of cognitive decline. Its primary goal is to educate the public about dementia risk factors and encourage clinical evaluation where appropriate. In Study 1, more than 3,000 anonymous persons over age 50 completed the DRA about themselves; 1,000 people also completed proxy reports about another person. Advanced age, lower education, male sex, complaints of severe memory impairment, and histories of cerebrovascular disease, Parkinson's disease, and brain tumor all contributed significantly to poor memory performance. A high correlation was obtained between proxy-reported decline and actual memory test performance. In Study 2, 52 persons seeking first-time evaluation at dementia clinics completed the DRA prior to their visits. Their responses (and those of their proxy informants) were compared to the results of independent evaluation by geriatric neuropsychiatrists. The 30 patients found to meet criteria for probable Alzheimer's disease, vascular dementia, or frontotemporal dementia differed on the DRA from the 22 patients without dementia (most other neuropsychiatric conditions). Scoring below criterion on the DRA's memory test had moderately high predictive validity for clinically diagnosed dementia. Although additional studies of larger clinical samples are needed, the DRA holds promise for wide-scale screening for dementia risk.

  14. LOGISTICS RISK RESEARCH OF PREFABRICATED HOUSE CONSTRUCTION ENGINEERING BASED ON CREDIBILITY METHOD

    Directory of Open Access Journals (Sweden)

    Xiaoping Bai

    2017-07-01

    In recent years, the prefabricated house industry has developed rapidly. Because of fewer suppliers, more demanding transport schemes, and complex quality testing, the risks in construction engineering logistics links are relatively high. Studying how to effectively evaluate the risks of construction engineering logistics links is therefore significant. According to the characteristics of prefabricated house construction engineering, we analyse the construction engineering logistics risks and use the combined weights method, which accounts for both subjective and objective factors, to determine the weights of the indexes and so improve the scientific value and validity of the assessment. Based on the credibility measure method, a new logistics risk evaluation model for prefabricated housing is established to estimate the risk during prefabricated house construction engineering. The presented model avoids the subjectivity of selecting the membership function and, to a certain extent, solves the problem of how to comprehensively assess construction engineering logistics risk.
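
    One common way to realise a "combined weights" scheme is to merge subjective (expert) weights with objective weights derived from the data, for example by the entropy method; the sketch below illustrates that idea with invented index names, scores, and expert weights, and is not the paper's exact model.

        import numpy as np

        # Combined subjective/objective weighting sketch for logistics risk indexes.
        # Rows: candidate logistics links; columns: risk indexes (higher = riskier).
        # All numbers are illustrative assumptions.
        scores = np.array([
            [0.6, 0.3, 0.8],
            [0.4, 0.7, 0.5],
            [0.9, 0.2, 0.4],
        ])
        subjective = np.array([0.5, 0.3, 0.2])          # expert-assigned weights

        # Objective weights from the entropy method: more discriminating indexes
        # (lower entropy across links) receive larger weights.
        p = scores / scores.sum(axis=0)
        entropy = -(p * np.log(p)).sum(axis=0) / np.log(scores.shape[0])
        objective = (1 - entropy) / (1 - entropy).sum()

        # One common combination rule: multiply and renormalise.
        combined = subjective * objective
        combined /= combined.sum()

        link_risk = scores @ combined                   # weighted risk score per link
        print("combined weights:", np.round(combined, 3))
        print("link risk scores:", np.round(link_risk, 3))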

  15. Suicide risk in relation to level of urbanicity - a population-based linkage study

    DEFF Research Database (Denmark)

    Qin, Ping

    2005-01-01

    BACKGROUND: The extent to which the high suicide rate in urban areas is influenced by exposures to risk factors for suicide other than urbanicity remains unknown. This population-based study aims to investigate suicide risk in relation to the level of urbanicity in the context of other factors ... from various Danish longitudinal registers. Data were analysed with conditional logistic regression. RESULTS: This study confirms that people living in more urbanized areas are at a higher risk of suicide than their counterparts in less urbanized areas. However, this excess risk is largely eliminated when adjusted for personal marital, income, and ethnic differences; it is even reversed when further adjusted for psychiatric status. Moreover, the impact of urbanicity on suicide risk differs significantly by sex and across age. Urban living reduces suicide risk significantly among men, especially ...

  16. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000 cases. We developed an original flexible excess rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates versus nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We analysed successfully 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies as it has major advantages over stratified analyses.

  17. Risk based rulemaking and design - Proceed with caution

    DEFF Research Database (Denmark)

    Zachariadis, Panos; Psaraftis, Harilaos N.; Kontovas, Christos A.

    2007-01-01

    The trend towards a risk based regulatory framework at IMO and within classification societies is expanding while some voices claim that a full ship risk based scantlings design approach can be immediately implementable. This paper attempts to clarify some widely used, but confusing to many, noti...

  18. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    Science.gov (United States)

    2016-09-01

    ... performance study of these algorithms in the particular problem of analysing backscatter signals from rotating blades. The report is organised as follows ... provide further insight into the behaviour of the techniques. Here, the algorithms for MP, OMP, CGP, gOMP and ROMP terminate when 10 atoms are ...
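
    For orientation, the sketch below shows Orthogonal Matching Pursuit with the fixed-atom stopping rule mentioned above (terminate after 10 atoms); the dictionary and signal are random placeholders rather than radar backscatter data, and this is not the report's own implementation.

        import numpy as np

        def omp(D, y, n_atoms=10):
            """Greedy OMP: pick n_atoms dictionary columns to approximate y."""
            residual = y.copy()
            support, coeffs = [], None
            for _ in range(n_atoms):
                correlations = np.abs(D.T @ residual)
                if support:
                    correlations[support] = -np.inf        # do not reselect chosen atoms
                support.append(int(np.argmax(correlations)))
                coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coeffs
            x = np.zeros(D.shape[1])
            x[support] = coeffs
            return x, support

        rng = np.random.default_rng(0)
        D = rng.normal(size=(128, 256))
        D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms
        true_x = np.zeros(256)
        true_x[[5, 50, 200]] = [2.0, -1.5, 1.0]            # sparse ground truth
        y = D @ true_x + 0.01 * rng.normal(size=128)
        x_hat, support = omp(D, y, n_atoms=10)
        print("selected atoms:", sorted(support))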

  19. Conformational determination of [Leu]enkephalin based on theoretical and experimental VA and VCD spectral analyses

    DEFF Research Database (Denmark)

    Abdali, Salim; Jalkanen, Karl J.; Cao, X.

    2004-01-01

    Conformational determination of [Leu]enkephalin in DMSO-d6 is carried out using VA and VCD spectral analyses. Conformational energies, vibrational frequencies and VA and VCD intensities are calculated using DFT at B3LYP/6-31G* level of theory. Comparison between the measured spectra...

  20. Scientific information and the Tongass land management plan: key findings derived from the scientific literature, species assessments, resource analyses, workshops, and risk assessment panels.

    Science.gov (United States)

    Douglas N. Swanston; Charles G. Shaw; Winston P. Smith; Kent R. Julin; Guy A. Cellier; Fred H. Everest

    1996-01-01

    This document highlights key items of information obtained from the published literature and from specific assessments, workshops, resource analyses, and various risk assessment panels conducted as part of the Tongass land management planning process. None of this information dictates any particular decision; however, it is important to consider during decisionmaking...

  1. A systematic review and meta-analyses show that carbapenem use and medical devices are the leading risk factors for carbapenem-resistant Pseudomonas aeruginosa

    NARCIS (Netherlands)

    A.F. Voor (Anne); J.A. Severin (Juliëtte); E.M.E.H. Lesaffre (Emmanuel); M.C. Vos (Margreet)

    2014-01-01

    A systematic review and meta-analyses were performed to identify the risk factors associated with carbapenem-resistant Pseudomonas aeruginosa and to identify sources and reservoirs for the pathogen. A systematic search of PubMed and Embase databases from 1 January 1987 until 27 January

  2. Interruption of antiretroviral therapy and risk of cardiovascular disease in persons with HIV-1 infection: exploratory analyses from the SMART trial

    DEFF Research Database (Denmark)

    Phillips, Andrew N; Carr, Andrew; Neuhaus, Jacquie

    2008-01-01

    BACKGROUND: The SMART trial found a raised risk of cardiovascular disease (CVD) events in patients undergoing CD4+ T cell-count guided intermittent antiretroviral therapy (ART) compared with patients on continuous ART. Exploratory analyses were performed to better understand the reasons for this ...

  3. Risk based technique for improving technical specifications

    International Nuclear Information System (INIS)

    Kim, I. S.; Jae, M. S.; Kim, B. S.; Hwang, S. W.; Kang, K. M.; Park, S. S.; Yu, Y. S.

    2001-03-01

    The objective of this study is to develop systematic guidance for reviewing the documents associated with changes of technical specifications. The work done in this fiscal year is the following: surveys of TS requirements, TS improvements, and TS regulations in foreign countries as well as Korea; surveys on the state of the art of RITSs and their use in Korea; development of a decision-making framework for both the licensee and the regulatory agency; description of risk measures; an assessment methodology for STI/AOT; and adverse effects caused by periodic maintenance, which are explained in the appendix. The results of this study might contribute to enhancing the quality of the current technical specifications and to preparing the risk-informed regulation program using the decision-making framework developed in this study.

  4. Risk-based decisionmaking in the DOE: Challenges and status

    International Nuclear Information System (INIS)

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-01-01

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  5. Risk-based decisionmaking in the DOE: Challenges and status

    Energy Technology Data Exchange (ETDEWEB)

    Henry, C.J.; Alchowiak, J.; Moses, M. [Dept. of Energy, Washington, DC (United States)] [and others]

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  6. Incremental Validity Analyses of the Violence Risk Appraisal Guide and the Psychopathy Checklist: Screening Version in a Civil Psychiatric Sample

    Science.gov (United States)

    Edens, John F.; Skeem, Jennifer L.; Douglas, Kevin S.

    2006-01-01

    This study compares two instruments frequently used to assess risk for violence, the Violence Risk Appraisal Guide (VRAG) and the Psychopathy Checklist: Screening Version (PCL:SV), in a large sample of civil psychiatric patients. Despite a strong bivariate relationship with community violence, the VRAG could not improve on the predictive validity…

  7. Response to Ecological Risk Assessment Forum Request for Information on the Benefits of PCB Congener-Specific Analyses

    Science.gov (United States)

    In August, 2001, the Ecological Risk Assessment Forum (ERAF) submitted a formal question to the Ecological Risk Assessment Support Center (ERASC) on the benefits of evaluating PCB congeners in environmental samples. This question was developed by ERAF members Bruce Duncan and Cla...

  8. Risk-based zoning for urbanizing floodplains.

    Science.gov (United States)

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
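
    A toy version of the kind of linear program described above, under made-up coefficients: it allocates floodplain area among residential, commercial, and flood-channel uses to maximise annual net benefit (development benefit minus expected flood damage), subject to a total-area limit and a zoning rule linking channel area to developed area. scipy.optimize.linprog is used purely for illustration; this is not the study's model.

        from scipy.optimize import linprog

        # Decision variables: x = [residential_ha, commercial_ha, channel_ha]
        # Net annual value per hectare ($k), already net of expected flood damage.
        # All coefficients are illustrative assumptions.
        value = [80.0, 110.0, 20.0]
        c = [-v for v in value]                      # linprog minimises, so negate

        A_ub = [[1.0, 1.0, 1.0],                     # total developable area (ha)
                [0.15, 0.15, -1.0]]                  # zoning: channel >= 15% of developed area
        b_ub = [100.0, 0.0]
        bounds = [(0, None), (0, None), (0, 30.0)]   # at most 30 ha can become channel

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        print("residential, commercial, channel (ha):", res.x.round(1))
        print("annual net value ($k):", round(-res.fun, 1))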

  9. Medicine and ionizing rays: a help sheet in analysing risks in intra-oral dental radiology and applicable texts

    International Nuclear Information System (INIS)

    Gauron, C.

    2009-01-01

    This document proposes a synthesis of useful knowledge for radioprotection in the case of intra-oral dental radiology. In the first part, several aspects are considered: the concerned personnel, the course of treatment procedures, the hazards, the identification of the risk associated with ionizing radiation, the risk assessment and the determination of exposure levels, the strategy to control the risks (reduction of risks, technical measures concerning the installation or the personnel, teaching and information, prevention and medical monitoring), and risk control assessment. A second part indicates the various applicable legal and regulatory texts (European directives, institutions in charge of radioprotection, general arrangements applicable to workers and patients, and regulatory texts concerning worker protection or patient protection against ionizing radiations)

  10. THE EFFECT OF THE IMPLEMENTATION OF RISK-BASED INTERNAL AUDITING ON FRAUD PREVENTION

    Directory of Open Access Journals (Sweden)

    Rozmita Dewi Yuniarti Rozali

    2015-12-01

    This study aims to determine the effect of the implementation of risk-based internal auditing on fraud prevention in the internal audit function of the Inspection Office of Bank BRI Bandung Region. The sample consisted of 18 internal auditors in the Inspection Office of Bank BRI Bandung Region, selected by the saturated sampling method. Based on the calculation of a simple regression analysis, the results show that every increase in the implementation of risk-based internal auditing (X) leads to an increase in fraud prevention (Y). This indicates a positive influence of the implementation of risk-based internal auditing on fraud prevention in the internal audit of the Inspection Office of Bank BRI Bandung Region.

  11. Molecular systematics of Indian Alysicarpus (Fabaceae) based on analyses of nuclear ribosomal DNA sequences.

    Science.gov (United States)

    Gholami, Akram; Subramaniam, Shweta; Geeta, R; Pandey, Arun K

    2017-06-01

    Alysicarpus Necker ex Desvaux (Fabaceae, Desmodieae) consists of ~30 species that are distributed in tropical and subtropical regions of the world. In India, the genus is represented by ca. 18 species, of which seven are endemic. Sequences of the nuclear internal transcribed spacer from 38 accessions representing 16 Indian species were subjected to phylogenetic analyses. The ITS sequence data strongly support the monophyly of the genus Alysicarpus. Analyses revealed four major well-supported clades within Alysicarpus. Ancestral state reconstructions were done for two morphological characters, namely calyx length in relation to pod (macrocalyx and microcalyx) and pod surface ornamentation (transversely rugose and nonrugose). The present study is the first report on molecular systematics of Indian Alysicarpus.

  12. Developing points-based risk-scoring systems in the presence of competing risks.

    Science.gov (United States)

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
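
    For readers unfamiliar with how such scores are typically constructed, the sketch below shows the usual points-from-coefficients step (divide each regression coefficient by a base unit and round to an integer); the predictors and coefficients are invented, and the competing-risks adjustment described in the paper is not reproduced here.

        # Points-based risk score sketch: convert (hypothetical) model coefficients
        # into integer points so risk can be tallied without a computer.
        # Coefficients, base unit, and risk factors below are invented for illustration.

        coefficients = {                 # log-hazard-ratio style coefficients
            "age_per_10y": 0.45,
            "diabetes": 0.30,
            "prior_mi": 0.60,
            "low_sbp": 0.90,
        }
        base_unit = 0.15                 # e.g. the coefficient of one reference increment

        points = {name: round(beta / base_unit) for name, beta in coefficients.items()}
        print("points per risk factor:", points)

        def total_points(patient):
            """Sum points for the factors present; age contributes per decade over 50."""
            score = max(0, (patient["age"] - 50) // 10) * points["age_per_10y"]
            for factor in ("diabetes", "prior_mi", "low_sbp"):
                if patient.get(factor):
                    score += points[factor]
            return score

        example = {"age": 72, "diabetes": True, "prior_mi": False, "low_sbp": True}
        print("total points:", total_points(example))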

  13. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus, in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform the engineering level of simulations, considering the mathematical models for naval ships, such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying DEVS and DTSS formalisms makes the structure of simulation models flexible and reusable. To verify the applicability of this simulation core, it was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations were composed of two scenarios. The first scenario, submarine diving, carried out maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, carried out detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations ensure that the simulation core of this study could be applied to the performance analyses of naval ships considering their specifications.
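
    To make the engineering-level, event-driven simulation idea concrete, here is a minimal discrete-event loop in the spirit of DEVS (not the paper's simulation core): events are processed in time order and a periodic sonar check applies a toy detection rule. Positions, speeds, and the detection range are invented.

        import heapq
        import math

        # Minimal discrete-event sketch: a time-ordered event queue with a periodic
        # "sonar check" event and a toy passive-sonar detection rule.
        events = []                                   # (time_s, name) priority queue
        for t in range(0, 601, 60):                   # schedule a sonar check every 60 s
            heapq.heappush(events, (float(t), "sonar_check"))

        own_pos, target_pos = (0.0, 0.0), (8000.0, 1000.0)
        target_vel = (-5.0, 0.0)                      # m/s, target closing on own ship
        detection_range = 6000.0                      # toy detection range (m)

        last_t = 0.0
        while events:
            t, name = heapq.heappop(events)
            dt = t - last_t                           # advance the target between events
            target_pos = (target_pos[0] + target_vel[0] * dt, target_pos[1] + target_vel[1] * dt)
            last_t = t
            if name == "sonar_check":
                rng = math.dist(own_pos, target_pos)
                print(f"t={t:5.0f} s  range={rng:7.0f} m  detected={rng <= detection_range}")
                if rng <= detection_range:
                    break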

  14. Regional analyses of labor markets and demography: a model based Norwegian example.

    Science.gov (United States)

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  15. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    OpenAIRE

    Zhigang Zuo; Shuhong Liu; Yizhang Fan; Yulin Wu

    2014-01-01

    It is important to evaluate the economic efficiency of boiler circulating pumps in manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were firstly carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were...

  16. Epidemiologic Analyses of Risk Factors for Bone Loss and Recovery Related to Long-Duration Space Flight

    Science.gov (United States)

    Sibonga, Jean; Amin, Shreyasee

    2010-01-01

    AIM 1: To investigate the risk of microgravity exposure on long-term changes in bone health and fracture risk. Compare data from crew members ("observed") with what would be "expected" from the Rochester Bone Health Study. AIM 2: To provide a summary of current evidence available on potential risk factors for bone loss, recovery and fracture following long-duration space flight. Integrative review of all data pre-, in-, and post-flight across disciplines (cardiovascular, nutrition, muscle, etc.) and their relation to bone loss and recovery.

  17. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.; Lofgren, E.V.

    1989-01-01

    This paper presents an evaluation of the configuration risks associated with the operation of a nuclear power plant and the approaches to control these risks using risk-based configuration control considerations. In that context, the actual and maximum potential configuration risks at a plant are analyzed and the alternative types of criteria for a risk-based configuration control system are described. The risk-based configuration calculations which are studied here focus on the core-melt frequency impacts from given plant configurations. By calculating the core-melt frequency for given configurations, the configurations which cause large core-melt frequency increases can be identified and controlled. The duration time in which the configuration can exist can then be limited or the core-melt frequency level associated with the configuration can be reduced by various actions. Furthermore, maintenance and tests can be scheduled to avoid the configurations which cause large core-melt frequency increases. Present technical specifications do not control many of these configurations which can cause large core-melt frequency increases but instead focus on many risk-unimportant allowed outage times. Hence, risk-based configuration management can be effectively used to reduce core-melt frequency associated risks at a plant and at the same time can provide flexibility in plant operation. The alternative strategies for controlling the core-melt frequency and other risk contributions include: (1) controlling the increased risk level which is associated with the configuration; (2) controlling the individual configuration risk which is associated with a given duration of a configuration; (3) controlling the time period configuration risk from configurations which occur in a time period.

  18. Toward risk-based control of nuclear power plant configurations

    International Nuclear Information System (INIS)

    Samanta, P.K.; Veseley, W.E.; Kim, I.S.

    1992-01-01

    This paper presents an evaluation of the configuration risks associated with the operation of a nuclear power plant and the approaches to control these risks using risk-based configuration control considerations. In that context, the actual and maximum potential configuration risks at a plant are analyzed and the alternative types of criteria for a risk-based configuration control system are described. The risk-based configuration calculations which are studied here focus on the core-melt frequency impacts from given plant configurations. By calculating the core-melt frequency for given configurations, the configurations which cause large core-melt frequency increases can be identified and controlled. The duration time in which the configuration can exist can then be limited or the core-melt frequency level associated with the configuration can be reduced by various actions. Furthermore, maintenance and tests can be scheduled to avoid the configurations which cause large core-melt frequency increases. Present technical specifications do not control many of these configurations which can cause large core-melt frequency increases but instead focus on many risk-unimportant allowed outage times. Hence, risk-based configuration management can be effectively used to reduce core-melt frequency associated risks at a plant and at the same time can provide flexibility in plant operation. The alternative strategies for controlling the core-melt frequency and other risk contributions include: (1) controlling the increased risk level which is associated with the configuration; (2) controlling the individual configuration risk which is associated with a given duration of a configuration; (3) controlling the time period configuration risk from configurations which occur in a time period. (orig.)

  19. Impact of shutdown risk on risk-based assessment of technical specifications

    International Nuclear Information System (INIS)

    Deriot, S.

    1992-10-01

    This paper describes the current work performed by the Research and Development Division of EDF concerning risk-based assessment of Operating Technical Specifications (OTS). The current risk-based assessment of OTS at EDF is presented. Then, the level 1 Probabilistic Safety Assessment of unit 3 of the Paluel nuclear power station (called PSA 1300) is described. It is fully computerized and takes into account the risk in shutdown states. A case study is presented. It shows that taking shutdown risk into account suggests that the current OTS should be modified.

  20. Intake of butylated hydroxyanisole and butylated hydroxytoluene and stomach cancer risk: results from analyses in the Netherlands Cohort Study

    NARCIS (Netherlands)

    Botterweck, A.A.M.; Verhagen, H.; Goldbohm, R.A.; Kleinjans, J.; Brandt, P.A. van den

    2000-01-01

    Both carcinogenic and anticarcinogenic properties have been reported for the synthetic antioxidants butylated hydroxyanisole (BHA) and butylated hydroxytoluene (BHT). The association between dietary intake of BHA and BHT and stomach cancer risk was investigated in the Netherlands Cohort Study (NLCS)

  1. Analyses of cyanobacterial toxins (microcystins, cylindrospermopsins) in the reservoirs of the Czech Republic and evaluation of health risks

    Czech Academy of Sciences Publication Activity Database

    Bláhová, Lucie; Babica, Pavel; Adamovský, Ondřej; Kohoutek, Jiří; Maršálek, Blahoslav

    2008-01-01

    Vol. 6, No. 4 (2008), pp. 223-227, ISSN 1610-3653. Institutional research plan: CEZ:AV0Z60050516. Keywords: microcystins * cylindrospermopsin * health risks. Subject RIV: EF - Botanics. Impact factor: 1.366, year: 2008

  2. Process Review for Development of Quantitative Risk Analyses for Transboundary Animal Disease to Pathogen-Free Territories

    Directory of Open Access Journals (Sweden)

    Jonathan Miller

    2017-10-01

    Outbreaks of transboundary animal diseases (TADs) have the potential to cause significant detriment to animal, human, and environmental health; severe economic implications; and national security. Challenges concerning data sharing, model development, decision support, and disease emergence science have recently been promoted. These challenges and recommendations have been recognized and advocated in the disciplines intersecting with outbreak prediction and forecast modeling regarding infectious diseases. To advance the effective application of computation and risk communication, analytical products ought to follow a collaboratively agreed common plan for implementation. Research articles should seek to inform and assist prioritization of national and international strategies in developing established criteria to identify and follow best practice standards to assess risk model attributes and performance. A well-defined framework to help eliminate gaps in policy, process, and planning knowledge areas would help alleviate the intense need for the formation of a comprehensive strategy for countering TAD outbreak risks. A quantitative assessment that accurately captures the risk of introduction of a TAD through various pathways can be a powerful tool in guiding where government, academic, and industry resources ought to be allocated, whether implementation of additional risk management solutions is merited, and where research efforts should be directed to minimize risk. This review outlines a part of a process for the development of quantitative risk analysis to collect, analyze, and communicate this knowledge. A more comprehensive and unabridged manual was also developed. The framework used in supporting the application of aligning computational tools for readiness continues our approach to apply a preparedness mindset to challenges concerning threats to global biosecurity, secure food systems, and risk-mitigated agricultural economies.

  3. A Risk Metric Assessment of Scenario-Based Market Risk Measures for Volatility and Risk Estimation: Evidence from Emerging Markets

    Directory of Open Access Journals (Sweden)

    Sitima Innocent

    2015-03-01

    The study evaluated the sensitivity of Value-at-Risk (VaR) and Expected Shortfall (ES) with respect to portfolio allocation in emerging markets with an index portfolio of a developed market. This study utilised different models for VaR and ES techniques using various scenario-based models such as covariance methods, historical simulation and GARCH(1,1) to test the predictive ability of these models in both relatively stable market conditions and extreme market conditions. The results showed that Expected Shortfall has less risk tolerance than VaR based on the same scenario-based market risk measures.
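
    The sketch below shows the basic relationship between historical-simulation VaR and Expected Shortfall on a single return series; the returns are random placeholder data, not an emerging-market index, and the GARCH and covariance variants from the study are not reproduced.

        import numpy as np

        # Historical-simulation VaR and Expected Shortfall at a given confidence level.
        rng = np.random.default_rng(42)
        returns = rng.standard_t(df=4, size=2500) * 0.01      # fat-tailed placeholder returns

        def var_es(returns, level=0.99):
            """Return (VaR, ES) as positive loss fractions at the given confidence level."""
            tail = -np.sort(returns)[: int(len(returns) * (1 - level))]  # worst losses
            var = tail.min()    # loss at the VaR threshold
            es = tail.mean()    # average loss beyond the threshold
            return var, es

        var99, es99 = var_es(returns, 0.99)
        print(f"99% VaR: {var99:.3%}   99% ES: {es99:.3%}   (ES >= VaR by construction)")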

  4. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, a risk and reliability centered maintenance (RRCM) framework was introduced with a shift in the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper, the authors explain a novel methodology for risk quantification and ranking of the critical items for prioritizing the maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of RPN values of items of a system, and the maintenance significant precipitating factors (MSPF) of items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient is called the hazardous risk coefficient, which is due to anticipated hazards which may occur in the future; the risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks with corresponding costs for consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and then the final rankings of critical items are estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimizing financial losses and the timing of maintenance actions.
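
    As a rough illustration of combining criticality numbers with random-number simulation (not the authors' actual three-coefficient model), the sketch below samples uncertain severity, occurrence, and detection ratings, computes a Risk Priority Number for each draw, and ranks items by the mean value. The items and rating ranges are invented.

        import random

        # Simulated Risk Priority Number (RPN = severity x occurrence x detection) ranking.
        # The items and rating ranges are illustrative assumptions.
        items = {
            "bearing":  {"severity": (6, 8), "occurrence": (4, 7), "detection": (3, 5)},
            "seal":     {"severity": (4, 6), "occurrence": (6, 9), "detection": (2, 4)},
            "impeller": {"severity": (7, 9), "occurrence": (2, 4), "detection": (4, 6)},
        }

        def mean_rpn(ranges, draws=10_000, seed=1):
            rng = random.Random(seed)
            total = 0.0
            for _ in range(draws):
                total += (rng.uniform(*ranges["severity"])
                          * rng.uniform(*ranges["occurrence"])
                          * rng.uniform(*ranges["detection"]))
            return total / draws

        for name in sorted(items, key=lambda k: mean_rpn(items[k]), reverse=True):
            print(f"{name:9s} mean RPN = {mean_rpn(items[name]):.0f}")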

  5. Risk Based Maintenance in Electricity Network Organisations

    NARCIS (Netherlands)

    Mehairjan, R.P.Y.

    2016-01-01

    Presently, maintenance management of assets in infrastructure utilities such as electricity, gas and water are widely undergoing changes towards new working environments. These are mainly driven against the background of stringent regulatory regimes, an ageing asset base, increased customer demands

  6. III: Use of biomarkers as Risk Indicators in Environmental Risk Assessment of oil based discharges offshore.

    Science.gov (United States)

    Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M

    2017-06-01

    Offshore oil and gas activities are required not to cause adverse environmental effects, and risk based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters to monitor the development of risk affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for management of oil based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data and assessment methods. Data shows several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways that these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system when linked with whole organism responses are discussed. This can be a useful approach to integrate biomarkers into probabilistic risk assessment related to oil based discharges, representing a potential supplement to information that biomarkers already provide about environmental impact and risk related to these kind of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Atmospheric radiation environment analyses based-on CCD camera at various mountain altitudes and underground sites

    Directory of Open Access Journals (Sweden)

    Li Cavoli Pierre

    2016-01-01

    The purpose of this paper is to discriminate secondary atmospheric particles and identify muons by measuring the natural radiative environment in atmospheric and underground locations. A CCD camera has been used as a cosmic ray sensor. The Low Noise Underground Laboratory of Rustrel (LSBB, France) gives access to a unique low-noise scientific environment deep enough to ensure screening from the neutron and proton radiative components. Analyses of the charge levels induced in the pixels of the CCD camera by radiation events, and maps of the charge events versus the hit pixel, are proposed.

  8. C-reactive protein as a risk factor for coronary heart disease: a systematic review and meta-analyses for the U.S. Preventive Services Task Force.

    Science.gov (United States)

    Buckley, David I; Fu, Rongwei; Freeman, Michele; Rogers, Kevin; Helfand, Mark

    2009-10-06

    C-reactive protein (CRP) may help to refine global risk assessment for coronary heart disease (CHD), particularly among persons who are at intermediate risk on the basis of traditional risk factors alone. To assist the U.S. Preventive Services Task Force (USPSTF) in determining whether CRP should be incorporated into guidelines for CHD risk assessment. MEDLINE search of English-language articles (1966 to November 2007), supplemented by reference lists of reviews, pertinent studies, editorials, and Web sites and by expert suggestions. Prospective cohort, case-cohort, and nested case-control studies relevant to the independent predictive ability of CRP when used in intermediate-risk persons. Included studies were reviewed according to predefined criteria, and the quality of each study was rated. The validity of the body of evidence and the net benefit or harm of using CRP for CHD risk assessment were evaluated. The combined magnitude of effect was determined by meta-analysis. The body of evidence is of good quality, consistency, and applicability. For good studies that adjusted for all Framingham risk variables, the summary estimate of relative risk for incident CHD was 1.58 (95% CI, 1.37 to 1.83) for CRP levels greater than 3.0 mg/L compared with levels less than 1.0 mg/L. Analyses from 4 large cohorts were consistent in finding evidence that including CRP improves risk stratification among initially intermediate-risk persons. C-reactive protein has desirable test characteristics, and good data exist on the prevalence of elevated CRP levels in intermediate-risk persons. Limited evidence links changes in CRP level to primary prevention of CHD events. Study methods for measuring Framingham risk variables and other covariates varied. Ethnic and racial minority populations were poorly represented in most studies, limiting generalizability. Few studies directly assessed the effect of CRP on risk reclassification in intermediate-risk persons. Strong evidence indicates
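
    For orientation, the pooled relative risk quoted above is the kind of figure produced by inverse-variance meta-analysis; the sketch below shows a minimal fixed-effect pooling of log relative risks using invented study values, not the studies from this review.

        import math

        # Fixed-effect (inverse-variance) pooling of relative risks: weight each
        # study's log-RR by the inverse of its variance, then back-transform.
        # The per-study RRs and confidence intervals are invented examples.
        studies = [          # (RR, lower 95% CI, upper 95% CI)
            (1.45, 1.10, 1.91),
            (1.70, 1.25, 2.31),
            (1.52, 1.18, 1.96),
        ]

        log_rrs, weights = [], []
        for rr, lo, hi in studies:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
            log_rrs.append(math.log(rr))
            weights.append(1.0 / se ** 2)

        pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        lo, hi = math.exp(pooled - 1.96 * pooled_se), math.exp(pooled + 1.96 * pooled_se)
        print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")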

  9. Application of risk-based inspection methods for cryogenic equipment

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Risk-based Inspection (RBI) is widely applied across the world as part of Pressure Equipment Integrity Management, especially in the oil and gas industry, to generally reduce costs compared with time-based approaches and assist in assigning resources to the most critical equipment. One of the challenges in RBI is to apply it for low temperature and cryogenic applications, as there are usually no degradation mechanisms by which to determine a suitable probability of failure in the overall risk assessment. However, the assumptions used for other degradation mechanisms can be adopted to determine, qualitatively and semi-quantitatively, a consequence of failure within the risk assessment. This can assist in providing a consistent basis for the assumptions used in ensuring adequate process safety barriers and determining suitable sizing of relief devices. This presentation will discuss risk-based inspection in the context of cryogenic safety, as well as present some of the considerations for the risk assessme...

  10. Research needs for risk-informed, performance-based regulations

    International Nuclear Information System (INIS)

    Thadani, A.C.

    1997-01-01

    This article summarizes the activities of the Office of Research of the NRC, both from a historical perspective and as they apply to risk-based decision making. The office has been actively involved in problems related to understanding the risks of core accidents, the aging of reactor components and materials from years of service, and the understanding and analysis of severe accidents. In addition, new policy statements regarding the role of risk assessment in regulatory applications have given focus to the need for further work. The NRC has used risk assessment in regulatory questions in the past, but in a fairly ad hoc manner. The new policies will clearly require a better defined application of risk assessment, and will help people evaluating applications to judge the applicability of such applications when a component of them is based on risk-based decision making. To address this, standard review plans are being prepared to serve as guides for such questions. In addition, with regulatory decisions being allowed to be based upon risk-based decisions, it is necessary to have an adequate database prepared, and made publicly available, to support such a position.

  11. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  12. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  13. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    Science.gov (United States)

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.
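
    The phenetic step described above is ordinary average-linkage (UPGMA) clustering; a minimal sketch with scipy is shown below, using placeholder specimen names and trait values rather than the study's data.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        # UPGMA (average-linkage) clustering of a small, standardised trait matrix.
        traits = np.array([          # rows: specimens, columns: leaf-anatomy traits
            [0.9, 0.1, 0.3],
            [0.8, 0.2, 0.4],
            [0.1, 0.9, 0.7],
            [0.2, 0.8, 0.6],
        ])
        labels = ["morpho-A1", "morpho-A2", "morpho-B1", "morpho-B2"]

        Z = linkage(pdist(traits, metric="euclidean"), method="average")   # UPGMA
        groups = fcluster(Z, t=2, criterion="maxclust")
        for name, group in zip(labels, groups):
            print(name, "-> group", group)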

  14. Bioassay-based risk assessment of complex mixtures

    International Nuclear Information System (INIS)

    Donnelly, K.C.; Huebner, H.J.

    1996-01-01

    The baseline risk assessment often plays an integral role in various decision-making processes at Superfund sites. The present study reports on risk characterizations prepared for seven complex mixtures using biological and chemical analysis. Three of the samples (A, B, and C) were complex mixtures of polycyclic aromatic hydrocarbons (PAHs) extracted from coal tar, while four samples extracted from munitions-contaminated soil contained primarily nitroaromatic hydrocarbons. The chemical-based risk assessment ranked sample C as least toxic, while the risk associated with samples A and B was approximately equal. The microbial bioassay was in general agreement for the coal tar samples. The weighted activity of the coal tar extracts in Salmonella was 4,960 for sample C, and 162,000 and 206,000 for samples A and B, respectively. The bacterial mutagenicity of 2,4,6-trinitrotoluene-contaminated soils exhibited an indirect correlation with the chemical-based risk assessment. The aqueous extract of sample 004 induced 1,292 net revertants in Salmonella, while the estimated risk to ingestion and dermal adsorption was 2E-9. The data indicate that the chemical-based risk assessment accurately predicted the genotoxicity of the PAHs, while the accuracy of the risk assessment for munitions-contaminated soils was limited due to the presence of metabolites of TNT degradation. The biological tests used in this research provide a valuable complement to chemical analysis for characterizing the genotoxic risk of complex mixtures.

  15. Structural and Treatment Analyses of Safe and At-Risk Behaviors and Postures Performed by Pharmacy Employees

    Science.gov (United States)

    Fante, Rhiannon; Gravina, Nicole; Betz, Alison; Austin, John

    2010-01-01

    This study employed structural and treatment analyses to determine factors that contributed to wrist posture safety in a small pharmacy. The pharmacy was located on a university campus and participants were three female pharmacy technicians. These particular employees had experienced various repetitive-motion injuries that resulted in a total of…

  16. 78 FR 63479 - Meta-Analyses of Randomized Controlled Clinical Trials (RCTs) for the Evaluation of Risk To...

    Science.gov (United States)

    2013-10-24

    ... pharmaceutical industry and health care organizations, and others from the general public, about the use of meta-analyses of randomized trials as a tool for safety assessment in the regulation of pharmaceutical products... PDUFA Goals Letter, titled ``Enhancing Regulatory Science and Expediting Drug Development,'' includes an...

  17. Neural Network-Based Model for Landslide Susceptibility and Soil Longitudinal Profile Analyses

    DEFF Research Database (Denmark)

    Farrokhzad, F.; Barari, Amin; Choobbasti, A. J.

    2011-01-01

    The purpose of this study was to create an empirical model for assessing the landslide risk potential at Savadkouh Azad University, which is located in the rural surroundings of Savadkouh, about 5 km from the city of Pol-Sefid in northern Iran. The soil longitudinal profile of the city of Babol......, located 25 km from the Caspian Sea, also was predicted with an artificial neural network (ANN). A multilayer perceptron neural network model was applied to the landslide area and was used to analyze specific elements in the study area that contributed to previous landsliding events. The ANN models were...... studies in landslide susceptibility zonation....
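
    A minimal sketch of the kind of multilayer perceptron model mentioned above, using hypothetical terrain features and scikit-learn; the feature names, network size, and data are assumptions, not the study's setup.

```python
# Minimal sketch (assumed, not the study's model): a multilayer perceptron
# classifying landslide susceptibility from hypothetical terrain features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical features: slope, elevation, distance to drainage, soil index
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_tr)

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(scaler.transform(X_tr), y_tr)
print("held-out accuracy:", mlp.score(scaler.transform(X_te), y_te))
```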

  18. Risk-based audit selection of dairy farms

    NARCIS (Netherlands)

    Asseldonk, van M.A.P.M.; Velthuis, A.G.J.

    2014-01-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process

  19. Risk-Based Educational Accountability in Dutch Primary Education

    Science.gov (United States)

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  20. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
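
    The baseline computation the abstract refers to, plain historical VaR from a window of past returns, can be sketched as below; the returns, confidence level, and the crude outlier mask standing in for news-event filtering are all hypothetical.

```python
# Sketch of plain historical Value-at-Risk (the baseline the paper starts
# from); the event-based refinement of excluding news-driven outliers is
# only hinted at with a hypothetical mask.
import numpy as np

rng = np.random.default_rng(2)
returns = rng.normal(loc=0.0005, scale=0.01, size=750)   # hypothetical daily returns

def historical_var(r, level=0.95):
    """Loss threshold exceeded on (1 - level) of historical days."""
    return -np.quantile(r, 1.0 - level)

var_all = historical_var(returns)

# Hypothetical event filter: drop the most extreme 1% of days, as a stand-in
# for removing returns attributable to identified news events.
dev = np.abs(returns - returns.mean())
mask = dev < np.quantile(dev, 0.99)
var_filtered = historical_var(returns[mask])

print(f"95% historical VaR: {var_all:.4f}  (event-filtered: {var_filtered:.4f})")
```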

  1. Risk based decision tool for space exploration missions

    Science.gov (United States)

    Meshkat, Leila; Cornford, Steve; Moran, Terrence

    2003-01-01

    This paper presents an approach and corresponding tool to assess and analyze the risks involved in a mission during the pre-phase A design process. This approach is based on creating a risk template for each subsystem expert involved in the mission design process and defining appropriate interactions between the templates.

  2. Risk-based inspection in the context of nuclear power plants

    International Nuclear Information System (INIS)

    Soares, Wellington A.; Vasconcelos, Vanderley de; Rabello, Emerson G.

    2015-01-01

    Nuclear power plant owners have to consider several aspects, such as safety, availability, costs and radiation exposure, during operation of nuclear power plants. They also need to demonstrate to regulatory bodies that risk assessment and inspection planning processes are being implemented in an effective and appropriate manner. Risk-Based Inspection (RBI) is a methodology that, unlike time-based inspection, involves a quantitative assessment of both the failure probability and the consequence associated with each safety-related item. A correctly implemented RBI program classifies individual equipment by its risk and prioritizes inspection efforts based on this classification. While in the traditional deterministic approach the inspection frequencies are constant, in the RBI approach the inspection interval for each item depends on its risk level. Typically, inspection intervals derived from RBI result in risk levels lower than or equal to those obtained with deterministic inspection intervals. According to the literature, RBI solutions improve integrity and reduce costs through more effective inspection. Risk-Informed In-service Inspection (RI-ISI) is the equivalent term used in the nuclear area, and its use in nuclear power plants around the world is briefly reviewed in this paper. The identification of practical methodologies for performing risk-based analyses presented in this paper can help both the Brazilian nuclear power plant operator and the regulatory body in evaluating the feasibility of the RI-ISI technique as a tool for optimizing inspections within nuclear plants. (author)
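
    The core RBI idea, ranking items by risk (failure probability times consequence) and letting the inspection interval shrink as risk grows, can be sketched as follows; the items, thresholds, and intervals are illustrative only and are not taken from any standard or from the paper.

```python
# Hedged sketch of the core RBI idea: rank items by risk = probability of
# failure x consequence and let the inspection interval shrink as risk grows.
# All items, thresholds and intervals below are illustrative.
items = {
    # name: (annual failure probability, consequence score 1-5)
    "feedwater line": (1e-3, 4),
    "service-water pipe": (5e-4, 2),
    "steam header": (2e-4, 5),
}

def risk_category(pof, cof):
    risk = pof * cof
    if risk >= 2e-3:
        return "high", 2      # inspect every 2 years
    if risk >= 5e-4:
        return "medium", 5
    return "low", 10

for name, (pof, cof) in sorted(items.items(), key=lambda kv: -kv[1][0] * kv[1][1]):
    cat, interval = risk_category(pof, cof)
    print(f"{name:18s} risk={pof*cof:.1e} category={cat:6s} interval={interval} yr")
```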

  3. Risk-based inspection in the context of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Wellington A.; Vasconcelos, Vanderley de; Rabello, Emerson G., E-mail: soaresw@cdtn.br, E-mail: vasconv@cdtn.br, E-mail: egr@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    Nuclear power plant owners have to consider several aspects, such as safety, availability, costs and radiation exposure, during operation of nuclear power plants. They also need to demonstrate to regulatory bodies that risk assessment and inspection planning processes are being implemented in an effective and appropriate manner. Risk-Based Inspection (RBI) is a methodology that, unlike time-based inspection, involves a quantitative assessment of both the failure probability and the consequence associated with each safety-related item. A correctly implemented RBI program classifies individual equipment by its risk and prioritizes inspection efforts based on this classification. While in the traditional deterministic approach the inspection frequencies are constant, in the RBI approach the inspection interval for each item depends on its risk level. Typically, inspection intervals derived from RBI result in risk levels lower than or equal to those obtained with deterministic inspection intervals. According to the literature, RBI solutions improve integrity and reduce costs through more effective inspection. Risk-Informed In-service Inspection (RI-ISI) is the equivalent term used in the nuclear area, and its use in nuclear power plants around the world is briefly reviewed in this paper. The identification of practical methodologies for performing risk-based analyses presented in this paper can help both the Brazilian nuclear power plant operator and the regulatory body in evaluating the feasibility of the RI-ISI technique as a tool for optimizing inspections within nuclear plants. (author)

  4. Activity Based Learning in a Freshman Global Business Course: Analyses of Preferences and Demographic Differences

    Science.gov (United States)

    Levine, Mark F.; Guy, Paul W.

    2007-01-01

    The present study investigates pre-business students' reaction to Activity Based Learning in a lower division core required course entitled Introduction to Global Business in the business curriculum at California State University Chico. The study investigates students' preference for Activity Based Learning in comparison to a more traditional…

  5. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding...

  6. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  7. A Semi Risk-Based Approach for Managing Urban Drainage Systems under Extreme Rainfall

    Directory of Open Access Journals (Sweden)

    Carlos Salinas-Rodriguez

    2018-03-01

    Full Text Available Conventional design standards for urban drainage systems are not set to deal with extreme rainfall events. As these events are becoming more frequent, there is room for proposing new planning approaches and standards that are flexible enough to cope with a wide range of rainfall events. In this paper, a semi risk-based approach is presented as a simple and practical way for the analysis and management of rainfall flooding at the precinct scale. This approach uses various rainfall events as input parameters for the analysis of the flood hazard and impacts, and categorises the flood risk into different levels, ranging from very low to very high risk. When visualised on a map, the insight into the risk levels across the precinct will enable engineers and spatial planners to identify and prioritise interventions to manage the flood risk. The approach is demonstrated for a sewer district in the city of Rotterdam, the Netherlands, using a one-dimensional (1D)/two-dimensional (2D) flood model. The risk level of this area is classified as being predominantly very low or low, with a couple of locations with high and very high risk. For these locations, interventions such as disconnection and lowering street profiles have been proposed and analysed with the 1D/2D flood model. The interventions were shown to be effective in reducing the risk levels from very high/high risk to medium/low risk.

  8. Risk Based Corrosion Studies at SRS

    International Nuclear Information System (INIS)

    Hoffman, E.

    2010-01-01

    TYPE I and II (ASTM 285-B) - experienced stress corrosion cracking (SCC); 2 have been closed, 22 are scheduled for closure by 2017, and there are no active leak sites today. TYPE III (ASTM A516-70 and A537 Class I) - post-fabrication relief of weld residual stresses, improved resistance to SCC and brittle fracture, no leakage history, and receives new waste. The objectives are to utilize statistical methods to reduce conservatism in the current chemistry control program and to express nitrite inhibitor limits in terms of pitting risk on waste tank steel. Conclusions are: (1) a statistically designed experimental study has been undertaken to improve the effectiveness of the minimum nitrite concentrations to inhibit pitting corrosion; (2) the mixture/amount model supports that pitting depends on the ratio of aggressive to inhibitive anions, as well as on the concentration of each species; (3) the secondary aggressive species Cl⁻ and SO₄²⁻ significantly affect the corrosion response; and (4) the results support the reduction of the chemistry control nitrite inhibitor concentrations in the regime of 0.8-1.0 M nitrate.

  9. Analyses of integrated aircraft cabin contaminant monitoring network based on Kalman consensus filter.

    Science.gov (United States)

    Wang, Rui; Li, Yanxiao; Sun, Hui; Chen, Zengqiang

    2017-11-01

    Modern civil aircraft use air-ventilated pressurized cabins subject to limited space. In order to monitor multiple contaminants and overcome the hypersensitivity of a single sensor, the paper constructs an output-correction integrated sensor configuration using sensors with different measurement principles, after comparing it to two other configurations. This proposed configuration works as a node in the distributed wireless sensor network for contaminant monitoring. The corresponding measurement error models of the integrated sensors are also proposed, using the Kalman consensus filter to estimate states and perform data fusion in order to regulate the single-sensor measurement results. The paper develops a sufficient proof of Kalman consensus filter stability in the presence of system and observation noise and compares the mean estimation and mean consensus errors of the Kalman consensus filter and the local Kalman filter. The numerical example analyses show the effectiveness of the algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
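
    A minimal sketch of the kind of estimator discussed above: a scalar Kalman consensus filter in which each node fuses its own noisy measurement and nudges its estimate toward its neighbours' estimates. The network topology, noise levels, and consensus gain are assumptions, not the paper's values.

```python
# Minimal sketch (assumptions, not the paper's algorithm): a scalar Kalman
# consensus filter where each sensor node fuses its own noisy measurement of
# a contaminant level and nudges its estimate toward its neighbours' estimates.
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_steps = 4, 50
true_level = 1.0
R = np.array([0.05, 0.10, 0.20, 0.08])     # per-node measurement noise variance
Q, eps = 1e-4, 0.3                          # process noise, consensus gain

x = np.zeros(n_nodes)                       # state estimates
P = np.ones(n_nodes)                        # estimate variances
neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}  # ring topology

for _ in range(n_steps):
    z = true_level + rng.normal(scale=np.sqrt(R))          # noisy measurements
    x_prior, P_prior = x.copy(), P + Q                     # prediction (static state)
    for i in range(n_nodes):
        K = P_prior[i] / (P_prior[i] + R[i])               # Kalman gain
        consensus = sum(x_prior[j] - x_prior[i] for j in neighbours[i])
        x[i] = x_prior[i] + K * (z[i] - x_prior[i]) + eps * P_prior[i] * consensus
        P[i] = (1 - K) * P_prior[i]

print("final estimates:", np.round(x, 3))
```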

  10. Toxicity testing and chemical analyses of recycled fibre-based paper for food contact

    DEFF Research Database (Denmark)

    Binderup, Mona-Lise; Pedersen, Gitte Alsing; Vinggaard, Anne

    2002-01-01

    of different qualities as food-contact materials and to perform a preliminary evaluation of their suitability from a safety point of view, and, second, to evaluate the use of different in vitro toxicity tests for screening of paper and board. Paper produced from three different categories of recycled fibres (B...... of the paper products were extracted with either 99% ethanol or water. Potential migrants in the extracts were identified and semiquantified by GC-1R-MS or GC-HRMS. In parallel to the chemical analyses, a battery of four different in vitro toxicity tests with different endpoints was applied to the same...... was less cytotoxic than the extracts prepared from paper made from recycled fibres, and extracts prepared from C were the most cytotoxic. None of the extracts showed mutagenic activity. No conclusion about the oestrogenic activity could be made, because all extracts were cytotoxic to the test organism (yeast...

  11. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    Full Text Available The milk price paid by a cooperative institution to farmers does not fully cover the production cost, even though dairy farmers encounter various risks and uncertainties in conducting their business. The highest risk in milk supply lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at farmer level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking and milk sales. This research used farm locations in the West Java region. There were five main stages in the preparation of this model: (1) identification and analysis of influential factors, (2) development of a conceptual model, (3) structural analysis and determination of production costs, (4) calculation of production cost with risk factors, and (5) the risk-based milk pricing model. This research built a relationship between the risks on smallholder dairy farms and the production costs to be incurred by the farmers. A formulation for calculating the risk adjustment factor for the variable production costs of a dairy cattle farm was also obtained. The difference between the production cost with risk and the total production cost without risk was about 8% to 10%. It could be concluded that the basic milk price proposed based on this research was around IDR 4,250-IDR 4,350/L for ownership of 3 to 4 cows. Increasing farmer income is expected to be obtained by entering the value of this risk into the calculation of production costs.
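
    The pricing logic described above can be sketched with back-of-envelope arithmetic; every number below (costs, output, margin, and the 9% risk adjustment factor within the reported 8-10% range) is hypothetical.

```python
# Back-of-envelope sketch of the pricing logic described above, with made-up
# numbers: a risk adjustment factor inflates the variable production cost,
# and the milk price is the risk-adjusted cost per litre plus a margin.
variable_cost = 3_500_000      # IDR per month, hypothetical (3-4 cows)
fixed_cost = 400_000           # IDR per month, hypothetical
milk_output = 1_000            # litres per month, hypothetical

risk_adjustment = 0.09         # ~8-10% as reported in the abstract
margin = 0.02                  # hypothetical farmer margin

risk_adjusted_cost = variable_cost * (1 + risk_adjustment) + fixed_cost
price_per_litre = risk_adjusted_cost * (1 + margin) / milk_output
print(f"risk-based milk price: IDR {price_per_litre:,.0f}/L")
```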

  12. Aroma profile of Garnacha Tintorera-based sweet wines by chromatographic and sensorial analyses.

    Science.gov (United States)

    Noguerol-Pato, R; González-Álvarez, M; González-Barreiro, C; Cancho-Grande, B; Simal-Gándara, J

    2012-10-15

    The aroma profiles of three Garnacha Tintorera-based wines were studied: a base wine, a naturally sweet wine, and a mixture of the naturally sweet wine with another sweet wine obtained by fortification with spirits. The aroma fingerprint was traced by GC-MS analysis of volatile compounds and by sensorial analysis of odours and tastes. Among the volatile compounds, sotolon (73 μg/L) and acetoin (122 μg/L) were the two main compounds found in the naturally sweet wine. With regard to the odorant series, those most dominant for the Garnacha Tintorera base wine were floral, fruity and spicy. In contrast, the odorant series most markedly affected by off-vine drying of the grapes were floral, caramelized and vegetal-wood. Finally, the odorant series affected by the switch-off of alcoholic fermentation with ethanol 96% (v/v) fit for human consumption, followed by oak barrel aging, were caramelized and vegetal-wood. A partial least squares analysis (PLS-2) was used to detect correlations between sets of sensory data (those obtained with mouth and nose) with the ultimate aim of improving our current understanding of the flavour of Garnacha Tintorera red wines, both base and sweet. Based on the sensory dataset analysis, the descriptors with the highest weight for separating base and sweet wines from Garnacha Tintorera were sweetness, dried fruit and caramel (for sweet wines) vs. bitterness, astringency and geranium (for base wines). Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purpose of this study was to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, B and C consisted of the function-based and non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  14. Study of a risk-based piping inspection guideline system.

    Science.gov (United States)

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts--the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits to the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the rational unified process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections and, hence, to enable effective prediction of potential piping risks and to enhance the degree of safety in plant operations that the petrochemical industries can be expected to achieve. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risks resulted from a small number of pipelines.

  15. Assessment of Home-Based Nigerian Engineers on Risk ...

    African Journals Online (AJOL)

    Assessment of Home-Based Nigerian Engineers on Risk Management Approach ... Journal of Applied Sciences and Environmental Management ... The Analysis of Variance (ANOVA) and Correlation methods were adopted for statistical ...

  16. RiskREP: Risk-Based Security Requirements Elicitation and Prioritization (extended version)

    NARCIS (Netherlands)

    Herrmann, Andrea; Morali, A.

    2010-01-01

    Today, companies are required to be in control of the security of their IT assets. This is especially challenging in the presence of limited budgets and conflicting requirements. Here, we present Risk-Based Requirements Elicitation and Prioritization (RiskREP), a method for managing IT security

  17. Cancer risk of anti-TNF-α at recommended doses in adult rheumatoid arthritis: a meta-analysis with intention to treat and per protocol analyses.

    Directory of Open Access Journals (Sweden)

    Guillaume Moulis

    Full Text Available BACKGROUND: The risk of malignancies on TNF-α antagonists is controversial. The aim of this survey was to assess cancer risk on TNF-α antagonists in adult rheumatoid arthritis patients, including the five marketed drugs (infliximab, etanercept, adalimumab, golimumab and certolizumab) used in line with the New Drug Application. Furthermore, the relative interest of modified intention to treat or per protocol analyses to assess such sparse events remains unknown. METHODOLOGY/PRINCIPAL FINDINGS: Data sources were MEDLINE, CENTRAL, ISI Web of Science, ACR and EULAR meeting abstracts, scientific evaluation of the drugs leading to their marketing approval, and clinicaltrials.gov, until 31 December 2012. We selected double-blind randomized controlled trials in adult rheumatoid arthritis patients, including at least one treatment arm in line with the New Drug Application. We performed random-effects meta-analyses, with modified intention to treat and per protocol analyses. Thirty-three trials were included. There was no excess risk of malignancies on anti-TNF-α administered in line with the New Drug Application in the per protocol model (OR 0.93, 95% CI [0.59-1.44]), nor in the modified intention to treat model (OR 1.27, 95% CI [0.82-1.98]). There was a non-significant tendency for an excess non-melanoma skin cancer risk in both models (respectively, 1.37 [0.71-2.66] and 1.90 [0.98-3.67]). With a fixed-effect Peto model restricted to trials lasting at least 52 weeks, the overall cancer risk was respectively 1.60 [0.97-2.64] and 1.22 [0.72-2.08]. Whatever the model, modified intention to treat analysis led to higher estimations than per protocol analysis. The latter may underestimate the treatment effect when assessing very sparse events and when many patients dropped out in placebo arms. In metaregression, there was no differential risk among the five drugs. CONCLUSIONS/SIGNIFICANCE: This study did not find any evidence for an excess cancer risk on TNF
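
    A random-effects pooling of odds ratios of the kind reported above can be sketched with the DerSimonian-Laird estimator; the trial counts below are invented and serve only to show the mechanics.

```python
# Illustrative DerSimonian-Laird random-effects pooling of odds ratios,
# with hypothetical trial data (events/total per arm); not the study's data.
import numpy as np

# (events_treat, n_treat, events_ctrl, n_ctrl) for a few hypothetical trials
trials = [(3, 200, 2, 200), (1, 150, 2, 150), (4, 300, 3, 300)]

log_or, var = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    # 0.5 continuity correction guards against zero cells in sparse data
    a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    log_or.append(np.log(a * d / (b * c)))
    var.append(1/a + 1/b + 1/c + 1/d)

log_or, var = np.array(log_or), np.array(var)
w = 1 / var                                            # fixed-effect weights
q = np.sum(w * (log_or - np.sum(w * log_or) / w.sum())**2)
tau2 = max(0.0, (q - (len(trials) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (var + tau2)                                # random-effects weights
pooled = np.sum(w_re * log_or) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"[{np.exp(pooled - 1.96*se):.2f}, {np.exp(pooled + 1.96*se):.2f}]")
```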

  18. Ontology-based specification, identification and analysis of perioperative risks.

    Science.gov (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable for the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  19. Assessing residential buildings value in Spain for risk analyses. Application to the landslide hazard in the Autonomous Community of Valencia

    Science.gov (United States)

    Cantarino, I.; Torrijo, F. J.; Palencia, S.; Gielen, E.

    2014-05-01

    This paper proposes a method of valuing the stock of residential buildings in Spain as the first step in assessing possible damage caused to them by natural hazards. For the purposes of the study we had access to the SIOSE (the Spanish Land Use and Cover Information System), a high-resolution land-use model, as well as to a report on the financial valuations of this type of buildings throughout Spain. Using dasymetric disaggregation processes and GIS techniques we developed a geolocalized method of obtaining this information, which was the exposure variable in the general risk assessment formula. If hazard maps and risk assessment methods - the other variables - are available, the risk value can easily be obtained. An example of its application is given in a case study that assesses the risk of a landslide in the entire 23 200 km2 of the Valencia Autonomous Community (NUT2), the results of which are analyzed by municipal areas (LAU2) for the years 2005 and 2009.
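
    The dasymetric disaggregation step described above can be sketched conceptually: a municipal total is spread over land-use polygons in proportion to their residential area. The polygon areas and the total value below are hypothetical.

```python
# Conceptual sketch (hypothetical numbers) of dasymetric disaggregation:
# a municipality's total residential value is spread over its SIOSE-like
# land-use polygons in proportion to their residential surface area.
municipal_value = 1_250_000_000   # EUR, hypothetical total residential stock value

# polygon id -> residential area in hectares (hypothetical)
residential_area = {"poly_01": 12.0, "poly_02": 3.5, "poly_03": 0.0, "poly_04": 8.5}

total_area = sum(residential_area.values())
polygon_value = {
    pid: municipal_value * area / total_area for pid, area in residential_area.items()
}
for pid, val in polygon_value.items():
    print(f"{pid}: EUR {val:,.0f}")
```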

  20. Circulating biomarkers for predicting cardiovascular disease risk : a systematic review and comprehensive overview of meta-analyses

    NARCIS (Netherlands)

    Holten, van T.C.; Waanders, L.F.; Groot, de P.G.; Vissers, J.; Hoefer, I.E.; Pasterkamp, G.; Prins, M.W.J.; Roest, M.

    2013-01-01

    Background : Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect in clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming

  1. Event-Specific Analyses of Poly-Drug Abuse and Concomitant Risk Behavior in a College Bar District in Florida

    Science.gov (United States)

    Thombs, Dennis L.; O'Mara, Ryan; Dodd, Virginia J.; Merves, Michele L.; Weiler, Robert M.; Goldberger, Bruce A.; Pokorny, Steven B.; Moore, Christine; Reingle, Jennifer; Gullet, Sara E.

    2009-01-01

    Objective: The authors describe the epidemiology of risk behavior associated with poly-drug use in a college bar district of a large campus community. Participants: A total of 469 bar patrons participated in the study. Methods: The authors used self-report data and biological measures collected from patrons outside bars in July and August of…

  2. Prediction of Banking Systemic Risk Based on Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Shouwei Li

    2013-01-01

    Full Text Available Banking systemic risk is a complex nonlinear phenomenon, and the recent financial crisis has shed light on the importance of safeguarding financial stability. Given the complex nonlinear characteristics of banking systemic risk, in this paper we apply the support vector machine (SVM) to the prediction of banking systemic risk in an attempt to suggest a new model with better explanatory power and stability. We conduct a case study of an SVM-based prediction model for Chinese banking systemic risk and find that the experimental results show that the support vector machine is an efficient method in such a case.
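
    A minimal sketch of an SVM classifier of the kind the abstract describes, built with scikit-learn on hypothetical macro-financial indicators; the features, labels, and kernel settings are assumptions, not the paper's data or tuning.

```python
# Hedged sketch, not the paper's model: an SVM classifier trained on
# hypothetical macro-financial indicators to flag high-systemic-risk periods.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# Hypothetical indicators: credit growth, interbank spread, leverage, NPL ratio
X = rng.normal(size=(200, 4))
y = (X[:, 1] + 0.8 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0.5).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```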

  3. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
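
    The contrast between the first two decision rules can be sketched with stylised numbers: expected utility (reduced here to expected value) versus a simple prospect-theory valuation with probability weighting and loss aversion. All probabilities, damages, costs, and parameter values are hypothetical.

```python
# Stylised sketch (all numbers and parameter values hypothetical) of the two
# decision rules compared in the article: expected value versus a simple
# prospect-theory valuation of investing in a flood loss-reducing measure.
p_flood = 0.01          # annual flood probability (assumed)
loss = 50_000           # flood damage without the measure, EUR (assumed)
loss_reduced = 20_000   # flood damage with the measure, EUR (assumed)
cost = 400              # annual cost of the measure, EUR (assumed)

def expected_value(invest: bool) -> float:
    damage = loss_reduced if invest else loss
    return -(cost if invest else 0.0) - p_flood * damage

def prospect_value(invest: bool, gamma=0.69, lam=2.25, alpha=0.88) -> float:
    """Tversky-Kahneman-style probability weighting and loss aversion."""
    w = p_flood**gamma / (p_flood**gamma + (1 - p_flood)**gamma) ** (1 / gamma)
    damage = loss_reduced if invest else loss
    outlay = cost if invest else 0.0
    # both the certain outlay and the weighted flood loss are valued as losses
    return -lam * outlay**alpha - lam * w * damage**alpha

for rule in (expected_value, prospect_value):
    decision = "invest" if rule(True) > rule(False) else "do not invest"
    print(f"{rule.__name__}: {decision}")
```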

  4. Risk based regulation: a convenient concept for legislation and regulation in the field of technical risks?

    International Nuclear Information System (INIS)

    Seiler, J.H.

    1998-01-01

    Legislation and regulation concerning risk activities are traditionally based on deterministic safety measures. This may lead to inefficient results: sometimes the law requires safety measures which are - from an economic viewpoint - not justified because of their poor cost-effectiveness; sometimes it does not require safety measures although they would be very efficient. The risk based regulation approach aims to make the law more efficient and to obtain more safety at lower cost. Legislation and regulation should be formulated in terms of risk rather than deterministic rules. Risk should be expressed in quantitative terms and risk regulation should be based on the cost-effectiveness of safety measures. Thus a most efficient (in the sense of the economic analysis of the law) strategy for safety and environmental law could be established. The approach is economically reasonable and theoretically convincing. Its practical implementation, however, raises a lot of technical and legal questions. The project 'Risk Based Regulation' (1996-1999), sponsored by the Swiss National Fund for Scientific Research, intends to evaluate the practical feasibility of the approach from a technical and a legal point of view. It contains a general part, which describes the risk based regulation approach and its legal and technical questions, and case studies which try to practically implement the risk based regulation approach; the case studies are: storage and management of explosives in the army, storage and management of explosives for non-military purposes, safety at work, accident prevention in the non-professional field (mainly road accidents), fire protection, transportation of dangerous goods, waste disposal: traditional waste, waste disposal: radioactive waste, nuclear energy (reactor safety), and a synthesis with recommendations for the future legislation and regulation in the field of technical risks. The paper presents the project and its preliminary results. (author)

  5. Factors predicting the development of pressure ulcers in an at-risk population who receive standardized preventive care: secondary analyses of a multicentre randomised controlled trial.

    Science.gov (United States)

    Demarre, Liesbet; Verhaeghe, Sofie; Van Hecke, Ann; Clays, Els; Grypdonck, Maria; Beeckman, Dimitri

    2015-02-01

    To identify predictive factors associated with the development of pressure ulcers in patients at risk who receive standardized preventive care. Numerous studies have examined factors that predict risk for pressure ulcer development. Only a few studies identified risk factors associated with pressure ulcer development in hospitalized patients receiving standardized preventive care. Secondary analyses of data collected in a multicentre randomized controlled trial. The sample consisted of 610 consecutive patients at risk for pressure ulcer development (Braden Score Pressure ulcers in category II-IV were significantly associated with non-blanchable erythema, urogenital disorders and higher body temperature. Predictive factors significantly associated with superficial pressure ulcers were admission to an internal medicine ward, incontinence-associated dermatitis, non-blanchable erythema and a lower Braden score. Superficial sacral pressure ulcers were significantly associated with incontinence-associated dermatitis. Despite the standardized preventive measures they received, hospitalized patients with non-blanchable erythema, urogenital disorders and a higher body temperature were at increased risk for developing pressure ulcers. Improved identification of at-risk patients can be achieved by taking into account specific predictive factors. Even if preventive measures are in place, continuous assessment and tailoring of interventions is necessary in all patients at risk. Daily skin observation can be used to continuously monitor the effectiveness of the intervention. © 2014 John Wiley & Sons Ltd.

  6. Analyses of Helsinki 2012 European Athletics Championships injury and illness surveillance to discuss elite athletes risk factors.

    Science.gov (United States)

    Edouard, Pascal; Depiesse, Frédéric; Branco, Pedro; Alonso, Juan-Manuel

    2014-09-01

    To further analyze newly incurred injuries and illnesses (I&Is) during Athletics International Championships to discuss risk factors. Prospective recording of newly occurred injuries and illnesses. The 2012 European Athletics (EA) Championships in Helsinki, Finland. National team and local organizing committee physicians and physiotherapists and 1342 registered athletes. Incidence and characteristics of new injuries and illnesses. Ninety-three percent of athletes were covered by medical teams, with a response rate of 91%. One hundred thirty-three injuries were reported (incidence of 98.4 injuries per 1000 registered athletes). Sixty-two injuries (47%) resulted in time loss from sport. The most common diagnosis was hamstring strain (11.4% of injuries and 21% of time-loss injuries). Injury risk was higher in males and increased with age. The highest incidences of injuries were found in combined events and middle- and long-distance events. Twenty-seven illnesses were reported (4.0 illnesses per 1000 athlete days). The most common diagnoses were upper respiratory tract infection (33.3%) and gastroenteritis/diarrhea (25.9%). During outdoor EA Championships, injury and illness incidences were slightly lower and injury characteristics were comparable with those during outdoor World Athletics Championships. During elite athletics Championships, gender (male), age (older than 30 years), finals, and some events (combined events and middle- and long-distance races) seem to be injury risk factors. Illness risk factors remain unclear. As in previous recommendations, preventive interventions should focus on overuse injuries, hamstring strains, and adequate rehabilitation of previous injuries, decreasing risk of infectious diseases transmission, appropriate event scheduling, sports clothes, and heat acclimatization.

  7. Pedestrian road traffic injuries in urban Peruvian children and adolescents: case control analyses of personal and environmental risk factors.

    Directory of Open Access Journals (Sweden)

    Joseph Donroe

    2008-09-01

    Full Text Available Child pedestrian road traffic injuries (RTIs) are an important cause of death and disability in poorer nations; however, RTI prevention strategies in those countries largely draw upon studies conducted in wealthier countries. This research investigated personal and environmental risk factors for child pedestrian RTIs relevant to an urban, developing world setting. This is a case control study of personal and environmental risk factors for child pedestrian RTIs in San Juan de Miraflores, Lima, Perú. The analysis of personal risk factors included 100 cases of serious pedestrian RTIs and 200 age and gender matched controls. Demographic, socioeconomic, and injury data were collected. The environmental risk factor study evaluated vehicle and pedestrian movement and infrastructure at the sites in which 40 of the above case RTIs occurred and at 80 control sites. After adjustment, factors associated with increased risk of child pedestrian RTIs included high vehicle volume (OR 7.88, 95% CI 1.97-31.52), absent lane demarcations (OR 6.59, 95% CI 1.65-26.26), high vehicle speed (OR 5.35, 95% CI 1.55-18.54), high street vendor density (OR 1.25, 95% CI 1.01-1.55), and more children living in the home (OR 1.25, 95% CI 1.00-1.56). Protective factors included more hours/day spent in school (OR 0.52, 95% CI 0.33-0.82) and years of family residence in the same home (OR 0.97, 95% CI 0.95-0.99). Reducing traffic volumes and speeds, limiting the number of street vendors on a given stretch of road, and improving lane demarcation should be evaluated as components of child pedestrian RTI interventions in poorer countries.

  8. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Lofgren, E.V.; Vesely, W.E.

    1990-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations, and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also practical considerations such as adequate repair times and/or options to transfer to low risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety
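
    The configuration-risk bookkeeping described above can be sketched numerically: while a configuration exists, the plant runs at an elevated conditional core-damage frequency, and its probability contribution is that elevation times the expected time spent in the configuration. All frequencies and durations below are invented.

```python
# Simple numeric sketch (all values invented) of the configuration-risk idea:
# each outage configuration has a conditional core-damage frequency while it
# exists; its core-damage probability contribution is the frequency increase
# times the expected time spent in the configuration.
HOURS_PER_YEAR = 8760.0

baseline_cdf = 5e-5  # per year, hypothetical baseline core-damage frequency

configs = {
    # name: (conditional CDF per year while in config, expected hours per year in config)
    "one diesel generator out":        (4e-4, 60.0),
    "one aux feedwater train out":     (2e-4, 40.0),
    "diesel + service water pump out": (3e-3, 2.0),
}

for name, (cdf_cond, hours) in configs.items():
    delta_p = (cdf_cond - baseline_cdf) * hours / HOURS_PER_YEAR
    print(f"{name:32s} risk increase per year ~ {delta_p:.1e}")
```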

  9. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Kim, I.S.; Lofgren, E.V.

    1989-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations, and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also practical considerations such as adequate repair times and/or options to transfer to low risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety. 3 refs., 7 figs., 2 tabs

  10. Risk-based inspection--Development of guidelines

    International Nuclear Information System (INIS)

    1993-07-01

    Effective inservice inspection programs can play a significant role in minimizing equipment and structural failures. Most of the current inservice inspection programs for light water reactor (LWR) nuclear power plant components are based on experience and engineers' qualitative judgment. These programs include only an implicit consideration of risk, which combines the probability of failure of a component under its operation and loading conditions and the consequences of such failure, if it occurs. This document recommends appropriate methods for establishing a risk-based inspection program for LWR nuclear power plant components. The process has been built from a general methodology (Volume 1) and has been expanded to involve five major steps: defining the system; evaluating qualitative risk assessment results; using this and information from plant probabilistic risk assessments to perform a quantitative risk analysis; selecting target failure probabilities; and developing an inspection program for components using economic decision analysis and structural reliability assessment methods

  11. Risk-based decision making for terrorism applications.

    Science.gov (United States)

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.
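
    The multiattribute utility aggregation underlying tools of this kind can be sketched with a toy additive model; the attributes, weights, alternatives, and scores below are invented for illustration.

```python
# Toy multiattribute-utility sketch (attributes, weights and scores are
# invented) of how alternatives can be ranked by ARDA-style tools.
weights = {"risk_reduction": 0.5, "cost": 0.3, "operational_impact": 0.2}

# single-attribute utilities on a 0-1 scale (higher is better), hypothetical
alternatives = {
    "barriers":       {"risk_reduction": 0.7, "cost": 0.4, "operational_impact": 0.8},
    "extra patrols":  {"risk_reduction": 0.5, "cost": 0.7, "operational_impact": 0.6},
    "sensor upgrade": {"risk_reduction": 0.8, "cost": 0.3, "operational_impact": 0.9},
}

def utility(scores):
    # additive multiattribute utility: weighted sum of single-attribute utilities
    return sum(weights[a] * u for a, u in scores.items())

for name, scores in sorted(alternatives.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name:15s} U = {utility(scores):.2f}")
```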

  12. Risk-based rules for crane safety systems

    Energy Technology Data Exchange (ETDEWEB)

    Ruud, Stian [Section for Control Systems, DNV Maritime, 1322 Hovik (Norway)], E-mail: Stian.Ruud@dnv.com; Mikkelsen, Age [Section for Lifting Appliances, DNV Maritime, 1322 Hovik (Norway)], E-mail: Age.Mikkelsen@dnv.com

    2008-09-15

    The International Maritime Organisation (IMO) has recommended a method called formal safety assessment (FSA) for future development of rules and regulations. The FSA method has been applied in a pilot research project for development of risk-based rules and functional requirements for systems and components for offshore crane systems. This paper reports some developments in the project. A method for estimating target reliability for the risk-control options (safety functions) by means of the cost/benefit decision criterion has been developed in the project and is presented in this paper. Finally, a structure for risk-based rules is proposed and presented.

  13. Risk-based rules for crane safety systems

    International Nuclear Information System (INIS)

    Ruud, Stian; Mikkelsen, Age

    2008-01-01

    The International Maritime Organisation (IMO) has recommended a method called formal safety assessment (FSA) for future development of rules and regulations. The FSA method has been applied in a pilot research project for development of risk-based rules and functional requirements for systems and components for offshore crane systems. This paper reports some developments in the project. A method for estimating target reliability for the risk-control options (safety functions) by means of the cost/benefit decision criterion has been developed in the project and is presented in this paper. Finally, a structure for risk-based rules is proposed and presented

  14. Labour Market Flexibility between Risk and Opportunity for Gender Equality Analyses of Self-employment, Part-time Work, and Job Autonomy

    OpenAIRE

    König, Stefanie

    2016-01-01

    The dissertation “Labour Market Flexibility between Risk and Opportunity for Gender Equality – Analyses of Self-employment, Part-time Work, and Job Autonomy” addresses the main research question: Is flexibility the key to a less gendered labour market, or does it rather foster more traditional roles and gender inequality? In four empirical studies, different aspects in life were investigated in order to gain a holistic understanding of gender inequalities related to flexibility at work: the d...

  15. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  16. Phylogenetic tree based on complete genomes using fractal and correlation analyses without sequence alignment

    Directory of Open Access Journals (Sweden)

    Zu-Guo Yu

    2006-06-01

    Full Text Available The complete genomes of living organisms have provided much information on their phylogenetic relationships. Similarly, the complete genomes of chloroplasts have helped resolve the evolution of this organelle in photosynthetic eukaryotes. In this review, we describe two algorithms to construct phylogenetic trees based on the theories of fractals and dynamic language using complete genomes. These algorithms were developed by our research group in the past few years. Our distance-based phylogenetic tree of 109 prokaryotes and eukaryotes agrees with the biologists' "tree of life" based on the 16S-like rRNA genes in a majority of basic branchings and most lower taxa. Our phylogenetic analysis also shows that the chloroplast genomes are separated into two major clades corresponding to chlorophytes s.l. and rhodophytes s.l. The interrelationships among the chloroplasts are largely in agreement with the current understanding on chloroplast evolution.
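
    One alignment-free idea related to this line of work can be sketched briefly: represent each sequence by its k-mer composition and build a distance-based tree from correlation distances. The sequences below are short made-up strings, and the method is a generic stand-in rather than the authors' specific algorithm.

```python
# Sketch of a generic alignment-free, distance-based approach: k-mer
# composition vectors, correlation distances, and UPGMA-style joining.
from itertools import product
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

seqs = {
    "taxonA": "ATGCGTACGTTAGCATGCGT",
    "taxonB": "ATGCGTACGTAAGCATGCGA",
    "taxonC": "TTGCCCACGGTAGCTTGCGT",
}
k = 3
kmers = ["".join(p) for p in product("ACGT", repeat=k)]

def composition(seq, k=3):
    # overlapping k-mer counts, normalised to frequencies
    counts = {m: 0 for m in kmers}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    v = np.array([counts[m] for m in kmers], dtype=float)
    return v / v.sum()

X = np.array([composition(s) for s in seqs.values()])
d = pdist(X, metric="correlation")       # 1 - Pearson correlation
Z = linkage(d, method="average")         # UPGMA-style joining
print("pairwise correlation distances:", np.round(d, 3))
print("linkage (merge order):\n", Z)
```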

  17. Integration of Evidence Base into a Probabilistic Risk Assessment

    Science.gov (United States)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data that necessitates a systematic, integrated approach for data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data are the most desirable input for the Integrated Medical Model. Non-attributable inflight data are collected from the Lifetime Surveillance for Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data are unavailable, cohort studies, other models and Bayesian analyses are used, in addition to subject matter experts' input on occasion. To determine the quality of evidence of a medical condition, the data source is categorized and assigned a level of evidence from 1 to 5, with one being the highest level. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  18. Perception of mobile phone and base station risks.

    Science.gov (United States)

    Siegrist, Michael; Earle, Timothy C; Gutscher, Heinz; Keller, Carmen

    2005-10-01

    Perceptions of risks associated with mobile phones, base stations, and other sources of electromagnetic fields (EMF) were examined. Data from a telephone survey conducted in the German- and French-speaking parts of Switzerland are presented (N = 1,015). Participants assessed both risks and benefits associated with nine different sources of EMF. Trust in the authorities regulating these hazards was assessed as well. In addition, participants answered a set of questions related to attitudes toward EMF and toward mobile phone base stations. According to respondents' assessments, high-voltage transmission lines are the most risky source of EMF. Mobile phones and mobile phone base stations received lower risk ratings. Results showed that trust in authorities was positively associated with perceived benefits and negatively associated with perceived risks. People who use their mobile phones frequently perceived lower risks and higher benefits than people who use their mobile phones infrequently. People who believed they lived close to a base station did not significantly differ in their level of risks associated with mobile phone base stations from people who did not believe they lived close to a base station. Regarding risk regulation, a majority of participants were in favor of fixing limiting values based on the worst-case scenario. Correlations suggest that belief in paranormal phenomena is related to level of perceived risks associated with EMF. Furthermore, people who believed that most chemical substances cause cancer also worried more about EMF than people who did not believe that chemical substances are that harmful. Practical implications of the results are discussed.

  19. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple load shapes, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in the subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently while remaining reliable and secure. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is carried out by means of Gauss numerical integration and the Jacobian of the transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
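
    The kind of half-space integral such software evaluates can be sketched with the Boussinesq surface-settlement solution for a uniform rectangular load, integrated by Gauss-Legendre quadrature; the stiffness, load, geometry, and observation point below are hypothetical, and the point is placed outside the loaded area to avoid the singularity.

```python
# Minimal sketch of a half-space integral of the type such software evaluates:
# surface settlement from a uniform rectangular load, integrated with
# Gauss-Legendre quadrature (Boussinesq solution; all values illustrative).
import numpy as np

E, nu = 20e6, 0.30           # Pa, hypothetical subsoil stiffness
q = 100e3                    # Pa, uniform contact pressure
a, b = 2.0, 3.0              # m, loaded rectangle [0,a] x [0,b]
px, py = 4.0, 1.5            # m, observation point outside the load (avoids singularity)

n = 8
xi, wi = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]

def settlement(px, py):
    s = 0.0
    for u, wu in zip(xi, wi):
        for v, wv in zip(xi, wi):
            x = 0.5 * a * (u + 1.0)            # map [-1,1] -> [0,a]
            y = 0.5 * b * (v + 1.0)            # map [-1,1] -> [0,b]
            r = np.hypot(px - x, py - y)
            s += wu * wv * (0.25 * a * b) / r  # Jacobian of the mapping = ab/4
    return q * (1.0 - nu**2) / (np.pi * E) * s

print(f"settlement at ({px}, {py}) m: {settlement(px, py)*1000:.2f} mm")
```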

  20. Sediment Characteristics of Mergui Basin, Andaman Sea based on Multi-proxy Analyses

    Directory of Open Access Journals (Sweden)

    Rina Zuraida

    2018-02-01

    Full Text Available This paper presents the characteristics of sediment from core BS-36 (6°55.85’ S and 96°7.48’ E, 1147.1 m water depth) that was acquired in the Mergui Basin, Andaman Sea. The analyses involved megascopic description, core scanning by a multi-sensor core logger, and carbonate content measurement. The purpose of this study is to determine the physical and chemical characteristics of the sediment to infer the depositional environment. The results show that this core can be divided into 5 lithologic units that represent various environmental conditions. The bottom part, Units V and IV, was inferred to have been deposited under suboxic to anoxic bottom conditions combined with high productivity and low precipitation. Unit III was deposited during high precipitation and under oxic conditions due to ocean ventilation. In the upper part, Units II and I were deposited during higher precipitation and higher carbonate production under suboxic to anoxic conditions. Keywords: sediment characteristics, Mergui Basin, Andaman Sea, suboxic, anoxic, oxic, carbonate content

  1. Revised age of deglaciation of Lake Emma based on new radiocarbon and macrofossil analyses

    Science.gov (United States)

    Elias, S.A.; Carrara, P.E.; Toolin, L.J.; Jull, A.J.T.

    1991-01-01

    Previous radiocarbon ages of detrital moss fragments in basal organic sediments of Lake Emma indicated that extensive deglaciation of the San Juan Mountains occurred prior to 14,900 yr B.P. (Carrara et al., 1984). Paleoecological analyses of insect and plant macrofossils from these basal sediments cast doubt on the reliability of the radiocarbon ages. Subsequent accelerator radiocarbon dates of insect fossils and wood fragments indicate an early Holocene age, rather than a late Pleistocene age, for the basal sediments of Lake Emma. These new radiocarbon ages suggest that by at least 10,000 yr B.P. deglaciation of the San Juan Mountains was complete. The insect and plant macrofossils from the basal organic sediments indicate a higher-than-present treeline during the early Holocene. The insect assemblages consisted of about 30% bark beetles, which contrasts markedly with the composition of insects from modern lake sediments and modern specimens collected in the Lake Emma cirque, in which bark beetles comprise only about 3% of the assemblages. In addition, in the fossil assemblages there were a number of flightless insect species (not subject to upslope transport by wind) indicative of coniferous forest environments. These insects were likewise absent in the modern assemblage. © 1991.

  2. Is autoimmunology a discipline of its own? A big data-based bibliometric and scientometric analyses.

    Science.gov (United States)

    Watad, Abdulla; Bragazzi, Nicola Luigi; Adawi, Mohammad; Amital, Howard; Kivity, Shaye; Mahroum, Naim; Blank, Miri; Shoenfeld, Yehuda

    2017-06-01

    Autoimmunology is a super-specialty of immunology specifically dealing with autoimmune disorders. To assess the extant literature concerning autoimmune disorders, bibliometric and scientometric analyses (namely, research topics/keywords co-occurrence, journal co-citation, citations, and scientific output trends - both crude and normalized, authors network, leading authors, countries, and organizations analysis) were carried out using open-source software, namely, VOSviewer and SciCurve. A corpus of 169,519 articles containing the keyword "autoimmunity" was utilized, selecting PubMed/MEDLINE as the bibliographic thesaurus. Six journals are specifically devoted to autoimmune disorders, covering approximately 4.15% of the entire scientific production. Compared with the corpus as a whole (which dates from 1946 on), these specialized journals were established only a few decades ago. Top countries were the United States, Japan, Germany, United Kingdom, Italy, China, France, Canada, Australia, and Israel. Trending topics are represented by the role of microRNAs (miRNAs) in the etiopathogenesis of autoimmune disorders, the contributions of genetics and of epigenetic modifications, the role of vitamins, management during pregnancy, and the impact of gender. New subsets of immune cells have been extensively investigated, with a focus on interleukin production and release and on Th17 cells. Autoimmunology is emerging as a new discipline within immunology, with its own bibliometric properties, an identified scientific community, and specifically devoted journals.

  3. Shielding analysis method applied to nuclear ship 'MUTSU' and its evaluation based on experimental analyses

    International Nuclear Information System (INIS)

    Yamaji, Akio; Miyakoshi, Jun-ichi; Iwao, Yoshiaki; Tsubosaka, Akira; Saito, Tetsuo; Fujii, Takayoshi; Okumura, Yoshihiro; Suzuoki, Zenro; Kawakita, Takashi.

    1984-01-01

    Procedures of shielding analysis are described which were used for the shielding modification design of the Nuclear Ship ''MUTSU''. The calculations of the radiation distribution on board were made using the Sn codes ANISN and TWOTRAN, a point kernel code QAD and a Monte Carlo code MORSE. The accuracies of these calculations were investigated through the analysis of various shielding experiments: the shield tank experiment of the Nuclear Ship ''Otto Hahn'', the shielding mock-up experiment for ''MUTSU'' performed in JRR-4, the shielding benchmark experiment using the 16N radiation facility of AERE Harwell and the shielding effect experiment of the ship structure performed in the training ship ''Shintoku-Maru''. The values calculated by ANISN agree with the data measured at ''Otto Hahn'' within a factor of 2 for fast neutrons and within a factor of 3 for epithermal and thermal neutrons. The γ-ray dose rates calculated by QAD agree with the measured values within 30% for the analysis of the experiment in JRR-4. The design values for ''MUTSU'' were determined in consequence of these experimental analyses. (author)

  4. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    Full Text Available In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, data on the papers published in the Journal of Informetrics and Scientometrics from 2007 to 2013 were retrieved from the Web of Science database as input to the topic modeling approach. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scope and analysis methods differ from those of previous studies, the topics generated in this study are consistent with the results produced by expert analyses. Empirical case studies and measurements of bibliometric indicators were considered important in every year of the analytic period, and the field showed increasing stability. Both core journals broadly paid attention to all of the topics in the field of Informetrics. The Journal of Informetrics put particular emphasis on the construction and applications of bibliometric indicators, and Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.
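
    A rough sketch of the perplexity-based choice of topic count described above, using scikit-learn's LDA on a few toy abstracts; the corpus, the candidate topic counts, and all parameters are placeholders rather than the study's Web of Science data.

```python
# Sketch (toy corpus): pick the number of LDA topics with the lowest perplexity.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "citation analysis of journal impact indicators",
    "h-index and research productivity of institutions",
    "co-authorship networks and collaboration patterns",
    "patent citations and technology transfer studies",
]  # stand-in documents, not the 2007-2013 corpus

X = CountVectorizer(stop_words="english").fit_transform(abstracts)

perplexities = {}
for k in range(2, 6):                        # candidate numbers of topics
    lda = LatentDirichletAllocation(n_components=k, random_state=0).fit(X)
    perplexities[k] = lda.perplexity(X)      # lower perplexity = better fit

best_k = min(perplexities, key=perplexities.get)
print(perplexities, "-> chosen number of topics:", best_k)
```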

  5. Analysing Surface Exposure to Climate Dynamics in the Himalayas to Adopt a Planning Framework for Landslide Risk Reduction

    Science.gov (United States)

    Tiwari, A.

    2017-12-01

    The Himalayas rank first among the most densely populated and congested high-altitude mountain regions on the planet. The region is mostly characterized by inadequate infrastructure and a lack of mitigation tools, along with terrain constraints that undermine the carrying capacity and resilience of urban ecosystems. Moreover, climate change has increased the vulnerability of the poor and marginalized population living in rapidly urbanizing mountain towns to the increased frequency and severity of risks from extreme weather events. Such events pose a multifold threat by easily translating into hazards where there is no capacity to respond and mitigate. Additionally, recent extreme climate dynamics such as changing rainfall patterns have influenced the natural rate of surface/slope processes in the Himalaya. The aim of the study was to analyze the extent of interaction between climate dynamics and the upland surface in order to develop a participatory planning framework for landslide risk reduction using an Integral Geographic Information System (integral GIS). At this stage, the study is limited to rainfall-triggered landslides (RTL). The study region lies in the middle Himalayan range (Himachal). The research utilized terrain analysis tools in integral GIS and identified risk-susceptible surfaces without (1) adding to their often complex fragmentation or (2) interfering in surface/slope processes. The analysis covered most of the relevant surface factors, including geology, slope instability, infrastructure development, natural and urban drainage systems, and land cover and land use. The outcome included an exposure-reduced model of the existing terrain and the surface processes accommodated by it, using the local technical tools available to the poor and fragile mountain community. The final participatory planning framework successfully harmonized people's perception and adaptation knowledge, and incorporated the priorities of local authorities. This research is significant as it rises above the fundamental

  6. Food intake patterns and cardiovascular risk factors in Japanese adults: analyses from the 2012 National Health and nutrition survey, Japan

    OpenAIRE

    Htun, Nay Chi; Suga, Hitomi; Imai, Shino; Shimizu, Wakana; Takimoto, Hidemi

    2017-01-01

    Background There is an increasing global interest in the role of Japanese diet as a possible explanation for the nation's healthy diet, which contributes to the world's highest life-expectancy enjoyed in Japan. However, nationwide studies on current food intake status among general Japanese population have not been established yet. This study examined the association between food intake patterns and cardiovascular risk factors (CVRF) such as waist circumference (WC), body mass index (BMI), bl...

  7. Modeling the potential risk factors of bovine viral diarrhea prevalence in Egypt using univariable and multivariable logistic regression analyses

    Directory of Open Access Journals (Sweden)

    Abdelfattah M. Selim

    2018-03-01

    Full Text Available Aim: The present cross-sectional study was conducted to determine the seroprevalence and potential risk factors associated with Bovine viral diarrhea virus (BVDV) disease in cattle and buffaloes in Egypt, to model the potential risk factors associated with the disease using logistic regression (LR) models, and to fit the best predictive model for the current data. Materials and Methods: A total of 740 blood samples were collected between November 2012 and March 2013 from animals aged between 6 months and 3 years. The potential risk factors studied were species, age, sex, and herd location. All serum samples were examined with an indirect ELISA test for antibody detection. Data were analyzed with different statistical approaches such as the Chi-square test, odds ratios (OR), univariable LR models, and multivariable LR models. Results: Results revealed a non-significant association between BVDV seropositivity and all risk factors except animal species. Seroprevalence percentages were 40% and 23% for cattle and buffaloes, respectively. ORs for all categories were close to one, with the highest OR, 2.237, for cattle relative to buffaloes. Likelihood ratio tests showed a significant drop in the -2LL from the univariable to the multivariable LR models. Conclusion: There was evidence of high seroprevalence of BVDV among cattle compared with buffaloes, with the possibility of infection in different age groups of animals. In addition, the multivariable LR model proved to provide more information for association and prediction purposes relative to univariable LR models and Chi-square tests when more than one predictor is available.
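
    As a rough illustration of the univariable versus multivariable logistic-regression comparison reported above, the sketch below fits both models to synthetic records, reports an odds ratio, and runs a likelihood-ratio test; the variable names, prevalences, and effect sizes are assumptions, not the survey data.

```python
# Sketch (synthetic data): univariable vs multivariable logistic regression with
# an odds ratio and a likelihood-ratio test, in the spirit of the record above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 740
df = pd.DataFrame({
    "species": rng.choice(["cattle", "buffalo"], n),
    "age_yr": rng.uniform(0.5, 3.0, n),
    "sex": rng.choice(["F", "M"], n),
})
# Synthetic outcome: cattle assumed somewhat more likely to be seropositive
df["seropositive"] = rng.binomial(1, np.where(df.species == "cattle", 0.40, 0.23))

uni = smf.logit("seropositive ~ species", data=df).fit(disp=0)
multi = smf.logit("seropositive ~ species + age_yr + sex", data=df).fit(disp=0)

print("OR, cattle vs buffalo:", np.exp(uni.params["species[T.cattle]"]))
lr = 2 * (multi.llf - uni.llf)                       # likelihood-ratio statistic
print("LR test p =", stats.chi2.sf(lr, df=multi.df_model - uni.df_model))
```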

  8. Analysing a Web-Based E-Commerce Learning Community: A Case Study in Brazil.

    Science.gov (United States)

    Joia, Luiz Antonio

    2002-01-01

    Demonstrates the use of a Web-based participative virtual learning environment for graduate students in Brazil enrolled in an electronic commerce course in a Masters in Business Administration program. Discusses learning communities; computer-supported collaborative work and collaborative learning; influences on student participation; the role of…

  9. Group analyses of connectivity-based cortical parcellation using repeated k-means clustering

    NARCIS (Netherlands)

    Nanetti, Luca; Cerliani, Leonardo; Gazzola, Valeria; Renken, Remco; Keysers, Christian

    2009-01-01

    K-means clustering has become a popular tool for connectivity-based cortical segmentation using Diffusion Weighted Imaging (DWI) data. A sometimes ignored issue is, however, that the output of the algorithm depends on the initial placement of starting points, and that different sets of starting
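
    A small sketch of the starting-point sensitivity the record above raises: run k-means several times from different initial seeds and measure how well the resulting parcellations agree. The random "connectivity profiles" and the cluster count are placeholders, not DWI data.

```python
# Sketch (random stand-in data): repeated k-means runs and their pairwise agreement.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
profiles = rng.random((500, 40))            # one toy "connectivity profile" per voxel

labelings = [
    KMeans(n_clusters=4, n_init=1, random_state=seed).fit_predict(profiles)
    for seed in range(10)                   # different starting points
]

# Agreement of each run with the first; values near 1 indicate a stable parcellation
scores = [adjusted_rand_score(labelings[0], lab) for lab in labelings[1:]]
print("mean adjusted Rand agreement:", np.mean(scores))
```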

  10. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  11. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  12. Using Arc GIS to analyse urban growth towards torrent risk areas (Aswan city as a case study)

    International Nuclear Information System (INIS)

    Hamdy, Omar; Zhao, Shichen; Salheen, Mohamed A; Eid, Y Y

    2014-01-01

    Areas affected by storm-water drainage are considered the places most at risk; water torrents affect urban areas and can cause extensive damage to buildings and infrastructure. Moreover, there is a dangerous situation whereby urban growth is occurring towards these at-risk areas. The urban growth rate in risk areas rose to 24.9% in 2001 and reached 48.8% in 2013. Urban growth in the ''Abouelreesh'' village has been influenced by the construction of larger buildings, because most people want to live in bigger houses. This problem can be seen by observing the increase in average building area from 2001 to 2013, especially in risky areas, where the average building area grew from 254 m2 in 2001 to 411 m2 in 2013. This phenomenon is considered a very important factor that attracts urban growth towards the risky areas in spite of the danger surrounding them

  13. Cardiovascular Disease Risk in a Large, Population-Based Cohort of Breast Cancer Survivors

    Energy Technology Data Exchange (ETDEWEB)

    Boekel, Naomi B.; Schaapveld, Michael [Epidemiology, Netherlands Cancer Institute, Amsterdam (Netherlands); Gietema, Jourik A. [Medical Oncology, University Medical Center Groningen, Groningen (Netherlands); Russell, Nicola S. [Radiation Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Poortmans, Philip [Radiation Oncology, Institute Verbeeten, Tilburg (Netherlands); Radiation Oncology, Radboud University Nijmegen Medical Center, Nijmegen (Netherlands); Theuws, Jacqueline C.M. [Radiotherapy, Catharina Hospital Eindhoven, Eindhoven (Netherlands); Schinagl, Dominic A.X. [Radiation Oncology, Radboud University Nijmegen Medical Center, Nijmegen (Netherlands); Rietveld, Derek H.F. [Radiation Oncology, VU University Medical Center Amsterdam, Amsterdam (Netherlands); Versteegh, Michel I.M. [Steering Committee Cardiac Interventions Netherlands, Leiden University Medical Center, Leiden (Netherlands); Visser, Otto [Registration and Research, Comprehensive Cancer Center The Netherlands, Utrecht (Netherlands); Rutgers, Emiel J.T. [Surgery, Netherlands Cancer Institute, Amsterdam (Netherlands); Aleman, Berthe M.P. [Radiation Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Leeuwen, Flora E. van, E-mail: f.v.leeuwen@nki.nl [Epidemiology, Netherlands Cancer Institute, Amsterdam (Netherlands)

    2016-04-01

    Purpose: To conduct a large, population-based study on cardiovascular disease (CVD) in breast cancer (BC) survivors treated in 1989 or later. Methods and Materials: A large, population-based cohort comprising 70,230 surgically treated stage I to III BC patients diagnosed before age 75 years between 1989 and 2005 was linked with population-based registries for CVD. Cardiovascular disease risks were compared with the general population, and within the cohort using competing risk analyses. Results: Compared with the general Dutch population, BC patients had a slightly lower CVD mortality risk (standardized mortality ratio 0.92, 95% confidence interval [CI] 0.88-0.97). Only death due to valvular heart disease was more frequent (standardized mortality ratio 1.28, 95% CI 1.08-1.52). Left-sided radiation therapy after mastectomy increased the risk of any cardiovascular event compared with both surgery alone (subdistribution hazard ratio (sHR) 1.23, 95% CI 1.11-1.36) and right-sided radiation therapy (sHR 1.19, 95% CI 1.04-1.36). Radiation-associated risks were found for not only ischemic heart disease, but also for valvular heart disease and congestive heart failure (CHF). Risks were more pronounced in patients aged <50 years at BC diagnosis (sHR 1.48, 95% CI 1.07-2.04 for left- vs right-sided radiation therapy after mastectomy). Left- versus right-sided radiation therapy after wide local excision did not increase the risk of all CVD combined, yet an increased ischemic heart disease risk was found (sHR 1.14, 95% CI 1.01-1.28). Analyses including detailed radiation therapy information showed an increased CVD risk for left-sided chest wall irradiation alone, left-sided breast irradiation alone, and internal mammary chain field irradiation, all compared with right-sided breast irradiation alone. Compared with patients not treated with chemotherapy, chemotherapy used ≥1997 (ie, anthracycline-based chemotherapy) increased the risk of CHF (sHR 1.35, 95% CI 1

  14. Cardiovascular Disease Risk in a Large, Population-Based Cohort of Breast Cancer Survivors

    International Nuclear Information System (INIS)

    Boekel, Naomi B.; Schaapveld, Michael; Gietema, Jourik A.; Russell, Nicola S.; Poortmans, Philip; Theuws, Jacqueline C.M.; Schinagl, Dominic A.X.; Rietveld, Derek H.F.; Versteegh, Michel I.M.; Visser, Otto; Rutgers, Emiel J.T.; Aleman, Berthe M.P.; Leeuwen, Flora E. van

    2016-01-01

    Purpose: To conduct a large, population-based study on cardiovascular disease (CVD) in breast cancer (BC) survivors treated in 1989 or later. Methods and Materials: A large, population-based cohort comprising 70,230 surgically treated stage I to III BC patients diagnosed before age 75 years between 1989 and 2005 was linked with population-based registries for CVD. Cardiovascular disease risks were compared with the general population, and within the cohort using competing risk analyses. Results: Compared with the general Dutch population, BC patients had a slightly lower CVD mortality risk (standardized mortality ratio 0.92, 95% confidence interval [CI] 0.88-0.97). Only death due to valvular heart disease was more frequent (standardized mortality ratio 1.28, 95% CI 1.08-1.52). Left-sided radiation therapy after mastectomy increased the risk of any cardiovascular event compared with both surgery alone (subdistribution hazard ratio (sHR) 1.23, 95% CI 1.11-1.36) and right-sided radiation therapy (sHR 1.19, 95% CI 1.04-1.36). Radiation-associated risks were found for not only ischemic heart disease, but also for valvular heart disease and congestive heart failure (CHF). Risks were more pronounced in patients aged <50 years at BC diagnosis (sHR 1.48, 95% CI 1.07-2.04 for left- vs right-sided radiation therapy after mastectomy). Left- versus right-sided radiation therapy after wide local excision did not increase the risk of all CVD combined, yet an increased ischemic heart disease risk was found (sHR 1.14, 95% CI 1.01-1.28). Analyses including detailed radiation therapy information showed an increased CVD risk for left-sided chest wall irradiation alone, left-sided breast irradiation alone, and internal mammary chain field irradiation, all compared with right-sided breast irradiation alone. Compared with patients not treated with chemotherapy, chemotherapy used ≥1997 (ie, anthracycline-based chemotherapy) increased the risk of CHF (sHR 1.35, 95% CI 1
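
    For readers unfamiliar with the standardized mortality ratio quoted in the two records above, the short sketch below computes an SMR with an exact Poisson confidence interval; the observed and expected counts are invented for illustration, not taken from the cohort.

```python
# Sketch (invented counts): standardized mortality ratio with an exact Poisson CI.
from scipy import stats

observed = 920                  # deaths observed in the cohort (illustrative)
expected = 1000.0               # deaths expected from general-population rates

smr = observed / expected
lo = stats.chi2.ppf(0.025, 2 * observed) / (2 * expected)
hi = stats.chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)
print(f"SMR = {smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```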

  15. Palivizumab for immunoprophylaxis of respiratory syncytial virus (RSV) bronchiolitis in high-risk infants and young children: a systematic review and additional economic modelling of subgroup analyses.

    Science.gov (United States)

    Wang, D; Bayliss, S; Meads, C

    2011-01-01

    find any relevant studies that may have been missed. The risk factors identified from the systematic review of included studies were analysed and synthesised using Stata. The base-case decision tree model developed in the original HTA journal publication [Health Technol Assess 2008;12(36)] was used to derive the cost-effectiveness of immunoprophylaxis of RSV using palivizumab in different subgroups of pre-term infants and young children who are at high risk of serious morbidity from RSV infection. Cost-effective spectra of prophylaxis with palivizumab compared with no prophylaxis for children without CLD/CHD, children with CLD, children with acyanotic CHD and children with cyanotic CHD were derived. Thirteen studies were included in this analysis. Analysis of 16,128 subgroups showed that prophylaxis with palivizumab may be cost-effective [at a willingness-to-pay threshold of £30,000/quality-adjusted life-year (QALY)] for some subgroups. For example, for children without CLD or CHD, the cost-effective subgroups included children under 6 weeks old at the start of the RSV season who had at least two other risk factors that were considered in this report and were born at 24 weeks gestational age (GA) or less, but did not include children who were > 9 months old at the start of the RSV season or had a GA of > 32 weeks. For children with CLD, the cost-effective subgroups included children 21 months old at the start of the RSV season. For children with acyanotic CHD, the cost-effective subgroups included children 21 months old at the start of the RSV season. For children with cyanotic CHD, the cost-effective subgroups included children 12 months old at the start of the RSV season. The poor quality of the studies feeding numerical results into this analysis means that the true cost-effectiveness may vary considerably from that estimated here. There is a risk that the relatively high mathematical precision of the point estimates of cost-effectiveness may be quite inaccurate

  16. Risk-based technical specifications program: Site interview results

    International Nuclear Information System (INIS)

    Andre, G.R.; Baker, A.J.; Johnson, R.L.

    1991-08-01

    The Electric Power Research Institute and Pacific Gas and Electric Company are sponsoring a program directed at improving Technical Specifications using risk-based methods. The major objectives of the program are to develop risk-based approaches to improve Technical Specifications and to develop an Interactive Risk Advisor (IRA) prototype. The IRA is envisioned as an interactive system that is available to plant personnel to assist in controlling plant operation. Use of an IRA is viewed as a method to improve plant availability while maintaining or improving plant safety. In support of the program, interviews were conducted at several PWR and BWR plant sites, to elicit opinions and information concerning risk-based approaches to Technical Specifications and IRA requirements. This report presents the results of these interviews, including the functional requirements of an IRA. 2 refs., 6 figs., 2 tabs

  17. SU-F-T-243: Major Risks in Radiotherapy. A Review Based On Risk Analysis Literature

    Energy Technology Data Exchange (ETDEWEB)

    López-Tarjuelo, J; Guasp-Tortajada, M; Iglesias-Montenegro, N; Monasor-Denia, P [Servicio de Radiofísica y Protección Radiológica, Consorcio Hospitalario Provincial de Castellón, Castellón de la Plana, España/Spain (Spain); Bouché-Babiloni, A; Morillo-Macías, V; Ferrer-Albiach, C [Servicio de Oncología Radioterápica, Consorcio Hospitalario Provincial de Castellón, Castellón de la Plana, España/Spain (Spain)

    2016-06-15

    Purpose: We present a literature review of risk analyses in radiotherapy to highlight the most frequently reported risks and facilitate the spread of this valuable information, so that professionals can be aware of these major threats before performing their own studies. Methods: We considered studies with at least an estimation of the probability of occurrence of an adverse event (O) and its associated severity (S). They cover external beam radiotherapy, brachytherapy, intraoperative radiotherapy, and stereotactic techniques. We selected only the works containing a detailed ranked series of elements or failure modes and focused as far as possible on the first fully reported quartile. Afterward, we sorted the risk elements according to a regular radiotherapy procedure so that the resulting groups were cited in several works and could be ranked in this way. Results: 29 references published between 2007 and February 2016 were studied. The publication trend has been generally rising. The most employed analysis has been failure mode and effect analysis (FMEA). Among the references, we selected 20 works listing 258 ranked risk elements. They were sorted into 31 groups appearing in at least two different works. 11 groups appeared in at least 5 references and 5 groups did so in 7 or more papers. These last groups of risks were choosing an incorrect set of images or plan for planning or treatment, errors related to contours, errors in patient positioning for treatment, human mistakes when programming treatments, and planning errors. Conclusion: There is a sufficient amount and variety of references for identifying which failure modes or elements should be addressed in a radiotherapy department before attempting a specific analysis. FMEA prevailed, but other studies such as “risk matrix” or “occurrence × severity” analyses can also guide professionals’ efforts. Risks associated with human actions rank very high; therefore, such actions should be automated or at least peer-reviewed.

  18. SU-F-T-243: Major Risks in Radiotherapy. A Review Based On Risk Analysis Literature

    International Nuclear Information System (INIS)

    López-Tarjuelo, J; Guasp-Tortajada, M; Iglesias-Montenegro, N; Monasor-Denia, P; Bouché-Babiloni, A; Morillo-Macías, V; Ferrer-Albiach, C

    2016-01-01

    Purpose: We present a literature review of risk analyses in radiotherapy to highlight the most frequently reported risks and facilitate the spread of this valuable information, so that professionals can be aware of these major threats before performing their own studies. Methods: We considered studies with at least an estimation of the probability of occurrence of an adverse event (O) and its associated severity (S). They cover external beam radiotherapy, brachytherapy, intraoperative radiotherapy, and stereotactic techniques. We selected only the works containing a detailed ranked series of elements or failure modes and focused as far as possible on the first fully reported quartile. Afterward, we sorted the risk elements according to a regular radiotherapy procedure so that the resulting groups were cited in several works and could be ranked in this way. Results: 29 references published between 2007 and February 2016 were studied. The publication trend has been generally rising. The most employed analysis has been failure mode and effect analysis (FMEA). Among the references, we selected 20 works listing 258 ranked risk elements. They were sorted into 31 groups appearing in at least two different works. 11 groups appeared in at least 5 references and 5 groups did so in 7 or more papers. These last groups of risks were choosing an incorrect set of images or plan for planning or treatment, errors related to contours, errors in patient positioning for treatment, human mistakes when programming treatments, and planning errors. Conclusion: There is a sufficient amount and variety of references for identifying which failure modes or elements should be addressed in a radiotherapy department before attempting a specific analysis. FMEA prevailed, but other studies such as “risk matrix” or “occurrence × severity” analyses can also guide professionals’ efforts. Risks associated with human actions rank very high; therefore, such actions should be automated or at least peer-reviewed.
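
    The reviews above rank failure modes by occurrence and severity; the toy sketch below shows that ranking step on invented failure modes and O/S scores (none of the numbers come from the cited analyses).

```python
# Sketch (invented scores): rank failure modes by occurrence x severity.
failure_modes = {
    "wrong image set or plan selected": (3, 9),
    "contouring error": (4, 7),
    "patient positioning error": (5, 6),
    "treatment programming (human) error": (3, 8),
    "planning error": (4, 6),
}

ranked = sorted(failure_modes.items(),
                key=lambda kv: kv[1][0] * kv[1][1],   # risk index = O x S
                reverse=True)
for mode, (o, s) in ranked:
    print(f"O x S = {o * s:2d}  (O={o}, S={s})  {mode}")
```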

  19. Community-Based Diabetes Screening and Risk Assessment in Rural West Virginia

    Directory of Open Access Journals (Sweden)

    Ranjita Misra

    2016-01-01

    Full Text Available This project utilized a cross-sectional study design to assess diabetes risk among 540 individuals from 12 counties using trained extension agents and community organizations in West Virginia. Individuals were screened for diabetes using (1) the validated 7-item diabetes risk assessment survey and (2) hemoglobin A1c tests. Demographic and lifestyle behaviors were also collected. The average age, body mass index, and A1c were 51.2±16.4, 31.1±7.5, and 5.8±0.74, respectively. The majority were females, Non-Hispanic Whites with no prior diagnosis of diabetes. Screenings showed that 61.8% of participants were at high risk for diabetes. Family history of diabetes (siblings or parents), overweight or obese status, sedentary lifestyle, and older age were common risk factors. Higher risk scores computed from the 7-item questions correlated positively with higher A1c (r=0.221, P<0.001). In multivariate logistic regression analyses, higher diabetes risk was predicted by obesity, older age, family history of hypertension, and gestational diabetes. Females were at 4 times higher risk than males. The findings indicated that community-based screenings were an effective way to assess diabetes risk in rural West Virginia. Linking diabetes screenings with referrals to lifestyle programs for high-risk individuals can help reduce the burden of diabetes in the state.

  20. White matter disruption in moderate/severe pediatric traumatic brain injury: Advanced tract-based analyses

    Directory of Open Access Journals (Sweden)

    Emily L. Dennis

    2015-01-01

    Full Text Available Traumatic brain injury (TBI) is the leading cause of death and disability in children and can lead to a wide range of impairments. Brain imaging methods such as DTI (diffusion tensor imaging) are uniquely sensitive to the white matter (WM) damage that is common in TBI. However, higher-level analyses using tractography are complicated by the damage and decreased FA (fractional anisotropy) characteristic of TBI, which can result in premature tract endings. We used the newly developed autoMATE (automated multi-atlas tract extraction) method to identify differences in WM integrity. 63 pediatric patients aged 8–19 years with moderate/severe TBI were examined with cross-sectional scanning at one or two time points after injury: a post-acute assessment 1–5 months post-injury and a chronic assessment 13–19 months post-injury. A battery of cognitive function tests was performed in the same time periods. 56 children were examined in the first phase, 28 TBI patients and 28 healthy controls. In the second phase 34 children were studied, 17 TBI patients and 17 controls (27 participants completed both post-acute and chronic phases). We did not find any significant group differences in the post-acute phase. Chronically, we found extensive group differences, mainly for mean and radial diffusivity (MD and RD). In the chronic phase, we found higher MD and RD across a wide range of WM. Additionally, we found correlations between these WM integrity measures and cognitive deficits. This suggests a distributed pattern of WM disruption that continues over the first year following a TBI in children.

  1. Geology of Southern Guinevere Planitia, Venus, based on analyses of Goldstone radar data

    International Nuclear Information System (INIS)

    Arvidson, R.E.; Plaut, J.J.; Jurgens, R.F.; Saunders, R.S.; Slade, M.A.

    1989-01-01

    The ensemble of 41 backscatter images of Venus acquired by the S Band (12.6 cm) Goldstone radar system covers approx. 35 million km2 and includes the equatorial portion of Guinevere Planitia, Navka Planitia, Heng-O Chasma, and Tinatin Planitia, and parts of Devana Chasma and Phoebe Regio. The images and associated altimetry data combine relatively high spatial resolution (1 to 10 km) with small incidence angles (less than 10 deg) for regions not covered by either Venera Orbiter or Arecibo radar data. Systematic analyses of the Goldstone data show that: (1) Volcanic plains dominate, including groups of small volcanic constructs, radar bright flows on a NW-SE arm of Phoebe Regio and on Ushas Mons and circular volcano-tectonic depressions; (2) Some of the regions imaged by Goldstone have high radar cross sections, including the flows on Ushas Mons and the NW-SE arm of Phoebe Regio, and several other unnamed hills, ridged terrains, and plains areas; (3) A 1000 km diameter multiringed structure is observed and appears to have a morphology not observed in Venera data (The northern section corresponds to Heng-O Chasma); (4) A 150 km wide, 2 km deep, 1400 km long rift valley with upturned flanks is located on the western flank of Phoebe Regio and extends into Devana Chasma; (5) A number of structures can be discerned in the Goldstone data, mainly trending NW-SE and NE-SW, directions similar to those discerned in Pioneer-Venus topography throughout the equatorial region; and (6) The abundance of circular and impact features is similar to the plains global average defined from Venera and Arecibo data, implying that the terrain imaged by Goldstone has typical crater retention ages, measured in hundreds of millions of years. The rate of resurfacing is less than or equal to 4 km/Ga

  2. Intra-specific genetic relationship analyses of Elaeagnus angustifolia based on RP-HPLC biochemical markers

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Elaeagnus angustifolia Linn. has various ecological, medicinal and economical uses. An approach was established using RP-HPLC (reversed-phase high-performance liquid chromatography) to classify and analyse the intra-specific genetic relationships of seventeen populations of E. angustifolia, collected from the Xinjiang areas of China. Chromatograms of alcohol-soluble proteins produced by seventeen populations of E. angustifolia were compared. Each chromatogram of alcohol-soluble proteins came from a single seed of one wild plant only. The results showed that when using a Waters Delta Pak. C18, 5 μm particle size reversed phase column (150 mm×3.9 mm), a linear gradient of 25%~60% solvent B with flow rate of 1 ml/min and run time of 67 min, the chromatography yielded optimum separation of E. angustifolia alcohol-soluble proteins. Representative peaks in each population were chosen according to peak area and occurrence in every seed. The converted data on the elution peaks of each population were different and could be used to represent those populations. GSC (genetic similarity coefficients) of 41% to 62% showed a medium degree of genetic diversity among the populations in these eco-areas. Cluster analysis showed that the seventeen populations of E. angustifolia could be divided into six clusters at the GSC=0.535 level and indicated the general and unique biochemical markers of these clusters. We suggest that E. angustifolia distribution in these eco-areas could be classified into six variable species. RP-HPLC was shown to be a rapid, repeatable and reliable method for E. angustifolia classification and identification and for analysis of genetic diversity.

  3. A risk-based approach to flammable gas detector spacing.

    Science.gov (United States)

    Defriend, Stephen; Dejmek, Mark; Porter, Leisa; Deshotels, Bob; Natvig, Bernt

    2008-11-15

    Flammable gas detectors allow an operating company to address leaks before they become serious, by automatically alarming and by initiating isolation and safe venting. Without effective gas detection, there is very limited defense against a flammable gas leak developing into a fire or explosion that could cause loss of life or escalate to cascading failures of nearby vessels, piping, and equipment. While it is commonly recognized that some gas detectors are needed in a process plant containing flammable gas or volatile liquids, there is usually a question of how many are needed. The areas that need protection can be determined by dispersion modeling from potential leak sites. Within the areas that must be protected, the spacing of detectors (or alternatively, number of detectors) should be based on risk. Detector design can be characterized by spacing criteria, which is convenient for design - or alternatively by number of detectors, which is convenient for cost reporting. The factors that influence the risk are site-specific, including process conditions, chemical composition, number of potential leak sites, piping design standards, arrangement of plant equipment and structures, design of isolation and depressurization systems, and frequency of detector testing. Site-specific factors such as those just mentioned affect the size of flammable gas cloud that must be detected (within a specified probability) by the gas detection system. A probability of detection must be specified that gives a design with a tolerable risk of fires and explosions. To determine the optimum spacing of detectors, it is important to consider the probability that a detector will fail at some time and be inoperative until replaced or repaired. A cost-effective approach is based on the combined risk from a representative selection of leakage scenarios, rather than a worst-case evaluation. This means that probability and severity of leak consequences must be evaluated together. In marine and

  4. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large number of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  5. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

    The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP and the two-dimensional spectrum calculation code PHOENIX-CP. The latter code employs the Discrete Angular Flux Method based on Collision Probability. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  6. Increased migraine risk in osteoporosis patients: a nationwide population-based study

    OpenAIRE

    Wu, Chieh-Hsin; Zhang, Zi-Hao; Wu, Ming-Kung; Wang, Chiu-Huan; Lu, Ying-Yi; Lin, Chih-Lung

    2016-01-01

    Background Osteoporosis and migraine are both important public health problems and may have overlapping pathophysiological mechanisms. The aim of this study was to use a Taiwanese population-based dataset to assess migraine risk in osteoporosis patients. Methods The Taiwan National Health Insurance Research Database was used to analyse data for 40,672 patients aged ≥20 years who had been diagnosed with osteoporosis during 1996–2010. An additional 40,672 age-matched patients without osteoporos...

  7. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessment of the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM) considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluation of crack growth behavior considering certain crack location and orientation patterns, and evaluation of failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code

  8. Financial and Performance Analyses of Microcontroller Based Solar-Powered Autorickshaw for a Developing Country

    Directory of Open Access Journals (Sweden)

    Abu Raihan Mohammad Siddique

    2016-01-01

    Full Text Available This paper presents a case study to examine the economic viability and performance of a microcontroller-based solar-powered battery-operated autorickshaw (m-SBAR) for developing countries, which is compared with different types of rickshaws such as the pedal rickshaw (PR), battery-operated autorickshaw (BAR), and solar-powered battery-operated autorickshaw (SBAR) available in Bangladesh. The BAR consists of a rickshaw structure, a battery bank, a battery charge controller, a DC motor driver, and a DC motor, whereas the proposed m-SBAR contains additional components such as a solar panel and a microcontroller-based DC motor driver. The complete design considered the local radiation data and load profile of the proposed m-SBAR. The Levelized Cost of Energy (LCOE) analysis, Net Present Worth, payback periods, and Benefit-to-Cost Ratio methods have been used to carry out the financial feasibility and sensitivity analyses of the m-SBAR, the grid-powered BAR, and the PR. The numerical analysis reveals that the LCOE and Benefit-to-Cost Ratio of the proposed m-SBAR are lower compared to the grid-powered BAR. It has also been found that the microcontroller-based DC motor control circuit reduces the battery discharge rate, improves battery life, and controls motor speed efficiently.
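
    A minimal sketch of the levelized-cost-of-energy and benefit-to-cost calculations named above; the capital cost, yearly cash flows, energy yield and discount rate are placeholders rather than the study's inputs.

```python
# Sketch (placeholder numbers): LCOE and benefit-to-cost ratio from discounted flows.
def pv(flows, rate):
    """Present value of yearly flows; element 0 is year 0."""
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

years, rate = 10, 0.08
capex = 1200.0                                  # purchase cost (USD), assumed
opex = [0.0] + [60.0] * years                   # yearly maintenance / battery share
energy = [0.0] + [900.0] * years                # kWh delivered per year
income = [0.0] + [450.0] * years                # fare income per year

lcoe = (capex + pv(opex, rate)) / pv(energy, rate)
bcr = pv(income, rate) / (capex + pv(opex, rate))
print(f"LCOE = {lcoe:.3f} USD/kWh, benefit-to-cost ratio = {bcr:.2f}")
```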

  9. Risk based maintenance to increase safety and decrease costs

    International Nuclear Information System (INIS)

    Phillips, J.H.

    2000-01-01

    Risk-based techniques have been developed for commercial nuclear power plants over the last eight years by a team working through the ASME Center for Research and Technology Development (CRTD). System boundaries and success criteria are defined using the Probabilistic Risk Analysis or Probabilistic Safety Analysis developed to meet the Individual Plant Evaluation. Final ranking of components is performed by a plant expert panel similar to the one developed for the Maintenance Rule. Components are identified as being of high or low risk significance. Maintenance and resources are focused on those components that have the highest risk significance. The techniques have been developed and applied at a number of plants. Results from the first risk-based inspection pilot plant indicate that safety with respect to pipe failure can be doubled while inspection is reduced to about 80% of that required by current inspection programs. Pilot studies on risk-based testing indicate that about 60% of pumps and 25 to 30% of valves in plants are of high safety significance. The reduction in inspection and testing reduces person-rem exposure, resulting in further increases in safety. These techniques have been documented in publications by the ASME CRTD, which are referenced. (author)

  10. Impacts of extreme heat on emergency medical service calls in King County, Washington, 2007-2012: relative risk and time series analyses of basic and advanced life support.

    Science.gov (United States)

    Calkins, Miriam M; Isaksen, Tania Busch; Stubbs, Benjamin A; Yost, Michael G; Fenske, Richard A

    2016-01-28

    Exposure to excessive heat kills more people than any other weather-related phenomenon, aggravates chronic diseases, and causes direct heat illness. Strong associations between extreme heat and health have been identified through increased mortality and hospitalizations and there is growing evidence demonstrating increased emergency department visits and demand for emergency medical services (EMS). The purpose of this study is to build on an existing regional assessment of mortality and hospitalizations by analyzing EMS demand associated with extreme heat, using calls as a health metric, in King County, Washington (WA), for a 6-year period. Relative-risk and time series analyses were used to characterize the association between heat and EMS calls for May 1 through September 30 of each year for 2007-2012. Two EMS categories, basic life support (BLS) and advanced life support (ALS), were analyzed for the effects of heat on health outcomes and transportation volume, stratified by age. Extreme heat was model-derived as the 95th (29.7 °C) and 99th (36.7 °C) percentile of average county-wide maximum daily humidex for BLS and ALS calls respectively. Relative-risk analyses revealed an 8 % (95 % CI: 6-9 %) increase in BLS calls, and a 14 % (95 % CI: 9-20 %) increase in ALS calls, on a heat day (29.7 and 36.7 °C humidex, respectively) versus a non-heat day for all ages, all causes. Time series analyses found a 6.6 % increase in BLS calls, and a 3.8 % increase in ALS calls, per unit-humidex increase above the optimum threshold, 40.7 and 39.7 °C humidex respectively. Increases in "no" and "any" transportation were found in both relative risk and time series analyses. Analysis by age category identified significant results for all age groups, with the 15-44 and 45-64 year old age groups showing some of the highest and most frequent increases across health conditions. Multiple specific health conditions were associated with increased risk of an EMS call including abdominal
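
    The relative-risk comparison described above reduces to a rate ratio between heat days and non-heat days; the sketch below computes such a ratio with an approximate Poisson confidence interval from invented call counts.

```python
# Sketch (invented counts): call-rate ratio for heat days vs non-heat days.
import numpy as np

calls_heat, days_heat = 5400, 40        # BLS calls and number of heat days (toy)
calls_ref, days_ref = 110000, 878       # BLS calls and number of non-heat days (toy)

rr = (calls_heat / days_heat) / (calls_ref / days_ref)

# Approximate 95% CI on the log rate ratio, treating counts as Poisson
se = np.sqrt(1.0 / calls_heat + 1.0 / calls_ref)
lo, hi = rr * np.exp(-1.96 * se), rr * np.exp(1.96 * se)
print(f"relative risk = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```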

  11. Assessing an organizational culture instrument based on the Competing Values Framework: Exploratory and confirmatory factor analyses

    Science.gov (United States)

    Helfrich, Christian D; Li, Yu-Fang; Mohr, David C; Meterko, Mark; Sales, Anne E

    2007-01-01

    Background The Competing Values Framework (CVF) has been widely used in health services research to assess organizational culture as a predictor of quality improvement implementation, employee and patient satisfaction, and team functioning, among other outcomes. CVF instruments generally are presented as well-validated with reliable aggregated subscales. However, only one study in the health sector has been conducted for the express purpose of validation, and that study population was limited to hospital managers from a single geographic locale. Methods We used exploratory and confirmatory factor analyses to examine the underlying structure of data from a CVF instrument. We analyzed cross-sectional data from a work environment survey conducted in the Veterans Health Administration (VHA). The study population comprised all staff in non-supervisory positions. The survey included 14 items adapted from a popular CVF instrument, which measures organizational culture according to four subscales: hierarchical, entrepreneurial, team, and rational. Results Data from 71,776 non-supervisory employees (approximate response rate 51%) from 168 VHA facilities were used in this analysis. Internal consistency of the subscales was moderate to strong (α = 0.68 to 0.85). However, the entrepreneurial, team, and rational subscales had higher correlations across subscales than within, indicating poor divergent properties. Exploratory factor analysis revealed two factors, comprising the ten items from the entrepreneurial, team, and rational subscales loading on the first factor, and two items from the hierarchical subscale loading on the second factor, along with one item from the rational subscale that cross-loaded on both factors. Results from confirmatory factor analysis suggested that the two-subscale solution provides a more parsimonious fit to the data as compared to the original four-subscale model. Conclusion This study suggests that there may be problems applying conventional
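
    As a rough illustration of the reliability and exploratory-factor steps reported above, the sketch below computes Cronbach's alpha for a block of items and fits a two-factor model to synthetic survey responses; nothing here uses the VHA data.

```python
# Sketch (synthetic responses): Cronbach's alpha and an exploratory factor step.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
latent = rng.normal(size=(1000, 2))                     # two underlying factors
loadings = rng.uniform(0.5, 0.9, size=(2, 14))          # 14 survey items
items = latent @ loadings + rng.normal(scale=0.7, size=(1000, 14))

def cronbach_alpha(block):
    k = block.shape[1]
    return k / (k - 1) * (1 - block.var(axis=0, ddof=1).sum()
                          / block.sum(axis=1).var(ddof=1))

print("alpha for items 1-4:", round(cronbach_alpha(items[:, :4]), 2))

fa = FactorAnalysis(n_components=2).fit(items)          # exploratory factor model
print("item loadings on factor 1:", np.round(fa.components_[0], 2))
```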

  12. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    Full Text Available This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS) data, which includes the determination of correlations between different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some different crime types are correlated and that some crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in their area.

  13. Genome-Wide Interaction Analyses between Genetic Variants and Alcohol Consumption and Smoking for Risk of Colorectal Cancer.

    Directory of Open Access Journals (Sweden)

    Jian Gong

    2016-10-01

    Full Text Available Genome-wide association studies (GWAS) have identified many genetic susceptibility loci for colorectal cancer (CRC). However, variants in these loci explain only a small proportion of familial aggregation, and there are likely additional variants that are associated with CRC susceptibility. Genome-wide studies of gene-environment interactions may identify variants that are not detected in GWAS of marginal gene effects. To study this, we conducted a genome-wide analysis for interaction between genetic variants and alcohol consumption and cigarette smoking using data from the Colon Cancer Family Registry (CCFR) and the Genetics and Epidemiology of Colorectal Cancer Consortium (GECCO). Interactions were tested using logistic regression. We identified an interaction between CRC risk and alcohol consumption and variants in the 9q22.32/HIATL1 (Pinteraction = 1.76×10-8; permuted p-value 3.51×10-8) region. Compared to non-/occasional drinking, light to moderate alcohol consumption was associated with a lower risk of colorectal cancer among individuals with the rs9409565 CT genotype (OR, 0.82 [95% CI, 0.74-0.91]; P = 2.1×10-4) and TT genotype (OR, 0.62 [95% CI, 0.51-0.75]; P = 1.3×10-6), but not among those with the CC genotype (p = 0.059). No genome-wide statistically significant interactions were observed for smoking. If replicated, our suggestive finding of a genome-wide significant interaction between genetic variants and alcohol consumption might contribute to understanding colorectal cancer etiology and identifying subpopulations with differential susceptibility to the effect of alcohol on CRC risk.
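
    A minimal sketch of the single-variant gene-environment interaction test applied genome-wide in the study above: a logistic regression with a genotype-by-alcohol interaction term fitted to synthetic data; the allele coding, prevalences and effect sizes are assumptions.

```python
# Sketch (synthetic data): logistic regression with a gene-environment interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "genotype": rng.binomial(2, 0.3, n),     # 0/1/2 copies of the minor allele
    "alcohol": rng.binomial(1, 0.5, n),      # light/moderate drinker indicator
})
logit_p = -1.0 + 0.05 * df.genotype + 0.10 * df.alcohol - 0.20 * df.genotype * df.alcohol
df["crc"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("crc ~ genotype * alcohol", data=df).fit(disp=0)
print("interaction OR:", np.exp(fit.params["genotype:alcohol"]),
      "p =", fit.pvalues["genotype:alcohol"])
```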

  14. Genome-Wide Interaction Analyses between Genetic Variants and Alcohol Consumption and Smoking for Risk of Colorectal Cancer

    Science.gov (United States)

    Newcomb, Polly A.; Campbell, Peter T.; Baron, John A.; Berndt, Sonja I.; Bezieau, Stephane; Brenner, Hermann; Casey, Graham; Chan, Andrew T.; Chang-Claude, Jenny; Du, Mengmeng; Figueiredo, Jane C.; Gallinger, Steven; Giovannucci, Edward L.; Haile, Robert W.; Harrison, Tabitha A.; Hayes, Richard B.; Hoffmeister, Michael; Hopper, John L.; Hudson, Thomas J.; Jeon, Jihyoun; Jenkins, Mark A.; Küry, Sébastien; Le Marchand, Loic; Lin, Yi; Lindor, Noralane M.; Nishihara, Reiko; Ogino, Shuji; Potter, John D.; Rudolph, Anja; Schoen, Robert E.; Seminara, Daniela; Slattery, Martha L.; Thibodeau, Stephen N.; Thornquist, Mark; Toth, Reka; Wallace, Robert; White, Emily; Jiao, Shuo; Lemire, Mathieu; Hsu, Li; Peters, Ulrike

    2016-01-01

    Genome-wide association studies (GWAS) have identified many genetic susceptibility loci for colorectal cancer (CRC). However, variants in these loci explain only a small proportion of familial aggregation, and there are likely additional variants that are associated with CRC susceptibility. Genome-wide studies of gene-environment interactions may identify variants that are not detected in GWAS of marginal gene effects. To study this, we conducted a genome-wide analysis for interaction between genetic variants and alcohol consumption and cigarette smoking using data from the Colon Cancer Family Registry (CCFR) and the Genetics and Epidemiology of Colorectal Cancer Consortium (GECCO). Interactions were tested using logistic regression. We identified interaction between CRC risk and alcohol consumption and variants in the 9q22.32/HIATL1 (Pinteraction = 1.76×10−8; permuted p-value 3.51×10−8) region. Compared to non-/occasional drinking, light to moderate alcohol consumption was assoc