WorldWideScience

Sample records for quantitative risk analysis

  1. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
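The cross-hazard comparison the abstract argues for reduces, in its simplest form, to putting every hazard on one common scale such as expected annual loss. The sketch below illustrates only that idea; the hazard list, frequencies, and loss figures are hypothetical placeholders, not values from the report.

```python
# Minimal sketch of cross-hazard risk quantification: each hazard is scored on
# the same scale (expected annual loss = frequency x conditional consequence),
# so mitigation investments can be ranked consistently. All numbers are
# hypothetical placeholders, not data from the report.
hazards = {
    # name: (annual frequency of initiating event, expected loss given event, $)
    "earthquake":        (0.002, 50_000_000),
    "flood":             (0.01,   8_000_000),
    "cyber intrusion":   (0.3,    1_500_000),
    "physical sabotage": (0.05,   4_000_000),
}

expected_annual_loss = {
    name: freq * loss for name, (freq, loss) in hazards.items()
}

for name, eal in sorted(expected_annual_loss.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} expected annual loss = ${eal:,.0f}")
```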

  2. Quantitative Risk Analysis of Submarine Pipeline Routing

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative risk analysis of submarine pipeline routing is presented, extending the study from qualitative to quantitative analysis. The characteristics of the potential risks of the submarine pipeline system are considered, and grey-mode identification theory is applied. The study comprises three parts: establishing an index system for quantitative routing risk analysis, establishing a grey-mode identification model for that analysis, and establishing a standard for interpreting the identification results. A computed example shows that the model reflects the hazard level of a routing directly and concisely, and it lays the groundwork for future route selection.

  3. Quantitative risk analysis of a maritime petrochemical terminal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work consists of the application of a computer program (RISKAN) developed for industrial risk quantification studies, together with a revision of the models used in the program. As part of the evaluation, the program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. A quantitative risk analysis of the terminal was thus performed, covering both the workers and the nearby population, with acceptability verified against the tolerability limits established by the state licensing agency (FEPAM-RS). In the risk analysis methodology used internationally, societal risk is most often presented graphically as FN curves, while individual risk is commonly shown as iso-risk contours traced on a map of the plant area. At the beginning of the study, a historical accident analysis and a preliminary hazard analysis were carried out to aid the identification of possible accident scenarios related to the terminal's activities. After the initiating events were identified, their frequencies or probabilities of occurrence were estimated, followed by calculation of the physical effects and fatalities using published models from the Prins Maurits Laboratory and the American Institute of Chemical Engineers implemented in the program. The average societal risk obtained was 8.7×10^-7 fatality/year for the external population and 3.2×10^-4 fatality/year for the internal population (people working inside the terminal). The accident scenario contributing most to societal risk was death from exposure to thermal radiation caused by pool fire, accounting for 84.3% of the total estimated for the external population and 82.9% for the people inside the terminal. …
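Societal risk in such studies is typically presented as an FN curve: the complementary cumulative frequency F of accidents causing N or more fatalities. A minimal sketch of how such a curve is built from a scenario list follows; the scenario frequencies and fatality counts are invented for illustration, not results from this terminal study.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical accident scenarios: (annual frequency, expected fatalities).
scenarios = [(1e-4, 1), (5e-5, 3), (2e-5, 10), (5e-6, 30), (1e-6, 100)]

# F(N) = cumulative annual frequency of events with N or more fatalities.
fatalities = np.array(sorted({n for _, n in scenarios}))
F = np.array([sum(f for f, n in scenarios if n >= N) for N in fatalities])

plt.loglog(fatalities, F, drawstyle="steps-post")
plt.xlabel("Number of fatalities, N")
plt.ylabel("Frequency of N or more fatalities (1/yr)")
plt.title("FN curve (hypothetical data)")
plt.show()
```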

  4. Preoperational quantitative risk analysis of a gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and public safety in general may be affected by the operation of a gas pipeline. Where individual or societal risks are found to be intolerable against international standards, mitigation measures are recommended to bring the risk associated with the operation down to levels compatible with best industry practice. Quantitative risk analysis calculates the probability of occurrence of an event from its frequency of occurrence and requires a complex mathematical treatment. The objective of the present work is to develop a calculation methodology based on the publication mentioned above. This methodology centres on defining event occurrence frequencies from databases representative of each case under study. It also establishes the consequences according to the particular features of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical ignition-probability curve is developed as a function of distance to the pipe. (author)
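A rough sketch of the ignition-probability-versus-distance idea described above: the exponential form and every parameter value are assumptions for illustration, not the curves developed in the paper.

```python
import math

def ignition_probability(distance_m, p0=0.3, decay_m=50.0):
    """Assumed ignition-probability curve: probability of ignition for an
    interference event, decaying exponentially with distance from the pipe.
    The functional form and parameters are illustrative, not from the paper."""
    return p0 * math.exp(-distance_m / decay_m)

# Event frequency for one interference type (hypothetical): strikes per km-year.
strike_frequency = 1e-3
for d in (0, 25, 50, 100, 200):
    f_ignited = strike_frequency * ignition_probability(d)
    print(f"distance {d:3d} m: ignited-release frequency = {f_ignited:.2e} /km-yr")
```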

  5. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance objectivity, in finance for instance, closely paralleled its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). The objective of this paper is therefore to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). The paper is the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. Likelihood of occurrence of risk was analysed by logistic regression and percentages to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer-Lemeshow test was non-significant, with a chi-square of X² = 8.181 (p = 0.300), indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
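For readers unfamiliar with the Hosmer-Lemeshow statistic reported above, the sketch below fits a logistic model to simulated analyst ratings and computes the test by hand over risk deciles; the data, predictors, and effect sizes are synthetic, not the study's.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical analyst data: two governing factors scored 1-5 and a binary
# outcome "risk likely to be exercised" (illustrative, not the study's data).
X = rng.integers(1, 6, size=(100, 2)).astype(float)
logit = -4.0 + 0.6 * X[:, 0] + 0.5 * X[:, 1]
y = rng.random(100) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]

def hosmer_lemeshow(y_true, p_hat, g=10):
    """Hosmer-Lemeshow goodness-of-fit: chi-square over g groups ordered by
    predicted risk. A non-significant p-value indicates adequate model fit."""
    order = np.argsort(p_hat)
    groups = np.array_split(order, g)
    chi2 = 0.0
    for idx in groups:
        obs, exp, n = y_true[idx].sum(), p_hat[idx].sum(), len(idx)
        chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    return chi2, stats.chi2.sf(chi2, g - 2)

chi2, pval = hosmer_lemeshow(y.astype(float), p)
print(f"Hosmer-Lemeshow chi2 = {chi2:.3f}, p = {pval:.3f}")
```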

  6. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for the international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical st...

  7. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA obtained by considering only process-related top events is reported for reference.
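The core quantitative step described here, crossing a component fragility curve with a probabilistic hazard curve to obtain an annual failure probability, can be sketched as a short numerical integral. The hazard-curve shape and the lognormal fragility median/dispersion below are illustrative stand-ins, not the paper's site-specific values.

```python
import numpy as np
from scipy import stats

# Intensity measure grid (peak ground acceleration, in g).
pga = np.linspace(0.01, 2.0, 400)

# Hypothetical PSHA hazard curve: annual rate of exceeding each PGA level.
hazard = 1e-2 * (pga / 0.1) ** (-2.0)

# Lognormal fragility for a steel tank (illustrative median and dispersion):
median, beta = 0.6, 0.5
fragility = stats.norm.cdf(np.log(pga / median) / beta)

# Annual failure rate = integral of P(fail | im) over the hazard occurrence
# rates, done here as a trapezoidal sum over PGA bins.
d_rate = -np.diff(hazard)                      # event rate falling in each bin
mid_fragility = 0.5 * (fragility[:-1] + fragility[1:])
annual_failure_rate = float(np.sum(mid_fragility * d_rate))
print(f"annual probability of seismic loss of containment ~ {annual_failure_rate:.2e}")
```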

  8. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents occurred in Brazil and around the world during the 1970s and 1980s, strongly motivating emergency preparedness in the chemical and petrochemical industries. Such accidents affect the environment and the communities neighbouring industrial facilities. The present study aims at supporting and orienting the development of emergency planning from the data obtained in quantitative risk analyses elaborated according to Technical Standard P4.261/03 of CETESB (the Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study are the re-evaluation of hazard identification for the emergency plans, consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from quantitative risk analysis in developing emergency plans. (author)

  9. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. …

  10. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    Energy Technology Data Exchange (ETDEWEB)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky [Department of Mechanical Engineering, Diponegoro University, Semarang (Indonesia); Kim, Seon Jin [Department of Mechanical & Automotive Engineering of Pukyong National University (Korea, Republic of)

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard thus places the existing equipment at medium risk, and in fact there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing risk assessment activities.
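The "4C"-style ratings above come from a probability-consequence risk matrix: a probability-of-failure category (1-5) and a consequence category (A-E) index a risk level. The sketch below shows only the lookup mechanics; the banding of cells into risk levels is an assumed illustration, not the actual API 581 table.

```python
# Sketch of a semi-quantitative risk matrix in the spirit of API 581. The
# banding below is illustrative, not the standard's published matrix.
RISK_BANDS = {
    "low":         {(1, "A"), (1, "B"), (2, "A")},
    "medium":      {(1, "C"), (2, "B"), (2, "C"), (3, "A"), (3, "B"), (3, "C")},
    "medium-high": {(2, "D"), (3, "D"), (4, "B"), (4, "C")},
}

def risk_level(pof_category: int, cof_category: str) -> str:
    """Map a (probability, consequence) cell such as (4, 'C') to a risk band."""
    for band, cells in RISK_BANDS.items():
        if (pof_category, cof_category) in cells:
            return band
    return "high"  # anything outside the enumerated cells

for component, (pof, cof) in {
    "HP superheater": (4, "C"),
    "HP evaporator":  (4, "C"),
    "HP economizer":  (3, "C"),
}.items():
    print(f"{component}: {pof}{cof} -> {risk_level(pof, cof)} risk")
```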

  12. Issues in qualitative and quantitative risk analysis for developmental toxicology.

    Science.gov (United States)

    Kimmel, C A; Gaylor, D W

    1988-03-01

    The qualitative and quantitative evaluation of risk in developmental toxicology has been discussed in several recent publications, but a number of issues in this area remain to be resolved. The qualitative evaluation and interpretation of endpoints in developmental toxicology depends on an understanding of the biological events leading to the endpoints observed, the relationships among endpoints, and their relationship to dose and to maternal toxicity. The interpretation of these endpoints is also affected by the statistical power of the experiments used to detect the various endpoints observed. Quantitative risk assessment attempts to estimate human risk for developmental toxicity as a function of dose. The current approach is to apply safety (uncertainty) factors to the no observed effect level (NOEL). An alternative presented and discussed here is to model the experimental data and apply a safety factor to an estimated risk level to achieve an "acceptable" level of risk. In cases where the dose-response curve bends upward, this approach provides a conservative estimate of risk. This procedure does not preclude the existence of a threshold dose. More research is needed to develop appropriate dose-response models that can provide better estimates for low-dose extrapolation of developmental effects.
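The alternative the authors discuss, modelling the dose-response data and applying a safety factor at an estimated risk level, can be sketched as below. The data, the simple one-hit-style model form, the 5% benchmark risk, and the 100x factor are all assumptions for illustration, not the paper's procedure.

```python
import numpy as np
from scipy.optimize import brentq, curve_fit

# Hypothetical developmental-toxicity data: dose vs. fraction of affected litters.
doses = np.array([0.0, 10.0, 30.0, 100.0, 300.0])
affected = np.array([0.02, 0.03, 0.08, 0.25, 0.60])

def model(d, background, slope):
    """Simple quantal dose-response: P(d) = bg + (1 - bg) * (1 - exp(-slope*d))."""
    return background + (1 - background) * (1 - np.exp(-slope * d))

(bg, slope), _ = curve_fit(model, doses, affected, p0=[0.02, 1e-3], bounds=(0, [1, 1]))

# Dose at a benchmark extra risk of 5%, then a safety factor applied to it.
bmr = 0.05
extra = lambda d: (model(d, bg, slope) - bg) / (1 - bg) - bmr
bmd = brentq(extra, 1e-6, doses.max())
print(f"benchmark dose (5% extra risk) ~ {bmd:.1f}; with 100x safety factor -> {bmd/100:.2f}")
```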

  13. Quantitative Risks

    Science.gov (United States)

    2015-02-24


  14. Quantitative risk analysis in two pipelines operated by TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Bittencourt, Euclides [Centro Universitario FIB, Salvador , BA (Brazil)

    2009-07-01

    Transportation risk analysis techniques were used to study two pipelines operated by TRANSPETRO. Pipeline A carries diesel, gasoline and LPG simultaneously and comprises three sections, all of them crossing rural areas. Pipeline B transports oil, and one of its ends is located in a high-population-density area. The risks of both pipelines were studied using the PHAST RISK(R) software, and the individual risk measures, the only measures considered for licensing purposes in this type of study, presented levels far below the maximum tolerable levels considered. (author)

  15. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    … are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge of microbial water quality, and of the related health risks, makes it difficult to implement microbial risk analysis as part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes of variation in the data. Essential limiting factors of urban flooding QMRAs were identified as uncertainty regarding ingestion volumes, the limited use of dose-response models, the low number of microbial parameter measurements, and the absent validation of the risk assessments. …

  16. Quantitative risk analysis offshore-Human and organizational factors

    Energy Technology Data Exchange (ETDEWEB)

    Espen Skogdalen, Jon, E-mail: jon.espen.skogdalen@gmail.co [Department of Industrial Economics, Risk Management and Planning, University of Stavanger, 4036 Stavanger (Norway); Vinnem, Jan Erik, E-mail: jev@preventor.n [Department of Industrial Economics, Risk Management and Planning, University of Stavanger, 4036 Stavanger (Norway)

    2011-04-15

    Quantitative risk analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models and at the fact that the QRAs do not include human and organizational factors (HOF factors). Norwegian and UK offshore legislation and guidelines require that HOF factors be included in QRAs. A study of 15 QRAs shows that the factors are included to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF factors at all. Relevant research projects have been conducted that fulfill the requirements of Level 3 analyses: at this level there is a systematic collection of data related to HOF, the methods are systematic and documented, and the QRAs are adjusted accordingly. None of the QRAs fulfill the Level 4 requirements, under which the QRA includes the model, describes the HOF factors, and explains how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and to speed up the development.

  18. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of Pest Risk Analysis Techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. …

  19. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative, index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditure within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability, data quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories; likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)
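A minimal sketch of the calibration idea described above: anchoring a semi-quantitative likelihood score to peer-system failure rates through an assumed log-linear mapping. The scores, the rate anchors, and the mapping form are hypothetical, not the paper's calibration.

```python
import numpy as np

# Hypothetical semi-quantitative likelihood scores (0-100) for pipeline
# segments, and an industry-peer failure-rate range anchoring the two ends.
scores = np.array([22.0, 45.0, 63.0, 80.0])
rate_at_worst, rate_at_best = 1e-3, 1e-6   # failures per km-yr (assumed anchors)

def score_to_rate(score, worst=rate_at_worst, best=rate_at_best):
    """Log-linear calibration: score 0 -> worst peer rate, score 100 -> best."""
    frac = score / 100.0
    return 10 ** (np.log10(worst) + frac * (np.log10(best) - np.log10(worst)))

for s in scores:
    print(f"score {s:5.1f} -> calibrated failure rate {score_to_rate(s):.2e} /km-yr")
```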

  20. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and to identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this approach. The main advantage of the methodology presented herein lies in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest for decision makers as it provides rational and solid information.
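The before-and-after risk estimation described above can be sketched as a small event-tree calculation combining flood loads, system response, and consequences into an expected annual affected population. The flood probabilities, warning reliabilities, and exposure numbers below are invented for illustration, not SUFRI case-study values.

```python
# Sketch of an event-tree style flood risk calculation: flood loads (annual
# exceedance probabilities), a warning branch (system response), and
# consequences (affected population). All numbers are hypothetical.
floods = [  # (annual probability of this flood magnitude, affected if no warning)
    (0.04, 500), (0.01, 2000), (0.002, 8000),
]

def expected_affected(p_warning_works, mitigation_factor):
    total = 0.0
    for p_flood, affected in floods:
        # branch 1: warning works and exposure is reduced; branch 2: it fails
        total += p_flood * (p_warning_works * affected * mitigation_factor
                            + (1 - p_warning_works) * affected)
    return total

before = expected_affected(p_warning_works=0.3, mitigation_factor=0.6)
after = expected_affected(p_warning_works=0.8, mitigation_factor=0.4)
print(f"expected annual affected population: {before:.0f} -> {after:.0f} "
      f"({100 * (1 - after / before):.0f}% reduction)")
```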

  2. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    Science.gov (United States)

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis makes it possible to indicate to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from the consumption of deli meats.
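A two-dimensional Monte Carlo simulation of the kind this procedure operates on keeps uncertainty in an outer loop and variability in an inner loop, so each outer draw yields one variability-averaged risk estimate. The toy growth model and dose-response numbers below are placeholders, not the paper's assessment.

```python
import numpy as np

rng = np.random.default_rng(1)
n_outer, n_inner = 200, 5000   # uncertainty loop, variability loop

risks = []
for _ in range(n_outer):
    # Outer loop samples *uncertain* hyperparameters (e.g. true mean storage temp).
    mean_temp = rng.normal(7.0, 0.5)            # degC, uncertain
    log_dose_response_r = rng.normal(-12, 0.3)  # uncertain dose-response parameter

    # Inner loop samples *variability* across servings given those hyperparameters.
    temps = rng.normal(mean_temp, 2.0, n_inner)
    log_counts = 2.0 + 0.25 * (temps - 4.0)     # toy growth model, log10 CFU
    doses = 10 ** log_counts
    p_ill = 1 - np.exp(-(10 ** log_dose_response_r) * doses)  # exponential model
    risks.append(p_ill.mean())  # variability-averaged risk for this outer draw

risks = np.array(risks)
print(f"median risk {np.median(risks):.2e}, "
      f"95% uncertainty interval [{np.percentile(risks, 2.5):.2e}, "
      f"{np.percentile(risks, 97.5):.2e}]")
```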

  3. Quantitative risk analysis for landslides ‒ Examples from Bíldudalur, NW-Iceland

    Directory of Open Access Journals (Sweden)

    R. Bell

    2004-01-01

    Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. The ultimate goal of this study is therefore to develop such a method for landslides, focusing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. This study considers risk analysis only, focusing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed: all existing vector data are transferred into raster data using a resolution of 1 m x 1 m, and the specific attribute data are assigned to the grid cells, resulting in a specific raster data layer for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact, and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and presented as individual risk to life and object risk to life for each process. …
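Computationally, the per-cell risk function described above is a product of raster layers followed by upscaling. A sketch with randomly generated layers follows; every value is illustrative, not data from the Bíldudalur case study.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (200, 200)  # 1 m x 1 m cells (illustrative extent)

# Illustrative raster layers, one per input parameter named in the abstract.
hazard = rng.random(shape) * 1e-3                           # annual event probability
damage_potential = rng.integers(0, 3, shape).astype(float)  # persons per cell
vulnerability = rng.random(shape)                           # P(death | impact)
p_spatial = rng.random(shape)                               # P(event hits the cell)
p_temporal = 0.5                                            # fraction of time present
p_seasonal = 1.0                                            # no seasonal restriction

# Cell-wise risk as the product of the input layers.
risk = hazard * damage_potential * vulnerability * p_spatial * p_temporal * p_seasonal

# Upscale 1 m -> 20 m cells by aggregating 20x20 blocks, as described above.
risk_20m = risk.reshape(10, 20, 10, 20).sum(axis=(1, 3))
print("max object risk to life per 20 m cell:", risk_20m.max())
```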

  4. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    Science.gov (United States)

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and resources allocated in a manner that manages risk to an acceptable level.

  5. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    Science.gov (United States)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis replicability at regional and national scale.

  6. Spatial stochastic simulation offers potential as a quantitative method for pest risk analysis.

    Science.gov (United States)

    Rafoss, Trond

    2003-08-01

    Pest risk analysis represents an emerging field of risk analysis that evaluates the potential risks of the introduction and establishment of plant pests into a new geographic location and then assesses the management options to reduce those potential risks. Development of new and adapted methodology is required to answer questions concerning pest risk analysis of exotic plant pests. This research describes a new method for predicting the potential establishment and spread of a plant pest into new areas using a case study, Ralstonia solanacearum, a bacterial disease of potato. This method combines current quantitative methodologies, stochastic simulation, and geographic information systems with knowledge of pest biology and environmental data to derive new information about pest establishment potential in a geographical region where a pest had not been introduced. This proposed method extends an existing methodology for matching pest characteristics with environmental conditions by modeling and simulating dissemination behavior of a pest organism. Issues related to integrating spatial variables into risk analysis models are further discussed in this article.
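A minimal sketch of spatial stochastic simulation of spread: establishment in a neighbouring grid cell requires both dissemination to the cell and the cell being suitable. The grid, spread probability, and suitability values are arbitrary illustrations, not the R. solanacearum case study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical habitat-suitability grid (0..1); 1 = ideal for establishment.
suitability = rng.random((50, 50))
infested = np.zeros_like(suitability, dtype=bool)
infested[25, 25] = True  # point of introduction

P_SPREAD = 0.4  # per-year chance of reaching each neighbour (assumed)

for year in range(10):
    for r, c in np.argwhere(infested):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < 50 and 0 <= cc < 50 and not infested[rr, cc]:
                # establishment requires both arrival and a suitable cell
                if rng.random() < P_SPREAD * suitability[rr, cc]:
                    infested[rr, cc] = True
    print(f"year {year + 1}: {infested.sum()} infested cells")
```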

  7. A semi-quantitative approach to GMO risk-benefit analysis.

    Science.gov (United States)

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  8. Quantitative microbiological risk assessment.

    Science.gov (United States)

    Hoornstra, E; Notermans, S

    2001-05-21

    The production of safe food is increasingly based on the use of risk analysis, and this process is now in use to establish national and international food safety objectives. It is also being used more frequently to guarantee that safety objectives are met and that such guarantees are achieved in a cost-effective manner. One part of the overall risk analysis procedure, risk assessment, is the scientific process in which the hazards and risk factors are identified and the risk estimate or risk profile is determined. Risk assessment is an especially important tool for governments when food safety objectives have to be developed for 'new' contaminants in known products or known contaminants causing trouble in 'new' products. Risk assessment is also an important approach for food companies (i) during product development, (ii) during (hygienic) process optimization, and (iii) as an extension (validation) of the more qualitative HACCP plan. This paper discusses these two different types of risk assessment, and uses probability distribution functions to assess the risks posed by Escherichia coli O157:H7 in each case. Such approaches are essential elements of risk management, as they draw on all available information to derive accurate and realistic estimations of the risk posed. The paper also discusses the potential of scenario analysis in simulating the impact of different or modified risk factors during the consideration of new or improved control measures.
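A sketch of the probability-distribution approach described: per-serving exposure is sampled from an assumed distribution, pushed through a dose-response model, and a control measure is tested by scenario analysis. The Beta-Poisson parameters and exposure distribution below are placeholders, not fitted E. coli O157:H7 values from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # simulated servings

# Hypothetical exposure: log10 dose per serving (CFU/serving).
log_dose = rng.normal(loc=-1.0, scale=1.0, size=n)
dose = 10 ** log_dose

# Beta-Poisson dose-response, a common form in microbial QMRA; alpha/beta
# here are illustrative placeholders, not fitted pathogen parameters.
alpha, beta = 0.16, 50.0
p_ill = 1 - (1 + dose / beta) ** (-alpha)

baseline = p_ill.mean()
# Scenario analysis: a control measure assumed to give a 2-log10 reduction.
p_ill_controlled = 1 - (1 + (dose / 100) / beta) ** (-alpha)
print(f"risk per serving: baseline {baseline:.2e}, "
      f"with 2-log reduction {p_ill_controlled.mean():.2e}")
```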

  9. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    Science.gov (United States)

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in the ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and, in the case of two-dimensional Monte Carlo simulations, their hyperparameters. This research aims to highlight the background information, assumptions and truncations of a two-dimensional QMRA and of advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case study, we consider the occurrence of listeriosis from smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo simulation and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors in the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported figure for the same period and population. Variability in the final risk estimate is determined by variability in (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability in the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need to gather more information on the reference growth rate and the minimum growth temperature…

  10. New Approaches to Transport Project Assessment: Reference Scenario Forecasting and Quantitative Risk Analysis

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2010-01-01

    This presentation sets out a new methodology for examining the uncertainties relating to transport decision making based on infrastructure appraisals. Traditional transport infrastructure projects are appraised using cost-benefit analyses in order to assess their feasibility. Recent research, however, has shown that the point estimates derived from such analyses are embedded with a large degree of uncertainty. Thus, a new scheme is proposed in which quantitative risk analysis (QRA) and Monte Carlo simulation are applied in order to represent the uncertainties within the cost-benefit analysis. Additionally, the handling of uncertainties is supplemented by use of the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits (user demands, i.e. travel time savings) and underestimating investment costs.
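A compact sketch of the proposed scheme: Monte Carlo simulation around the deterministic cost-benefit point estimates, with distributions skewed in the direction Optimism Bias predicts (benefits overestimated, costs underestimated). The point estimates and triangular ranges are invented for illustration, not values from the presentation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical point estimates from a deterministic CBA (millions, present value).
benefits_pt, costs_pt = 1200.0, 900.0

# Monte Carlo around the point estimates, skewed in the direction Optimism
# Bias predicts; the triangular ranges below are assumed uplift factors.
benefits = benefits_pt * rng.triangular(0.6, 0.9, 1.1, size=n)
costs = costs_pt * rng.triangular(0.9, 1.2, 1.8, size=n)

bcr = benefits / costs
print(f"deterministic BCR = {benefits_pt / costs_pt:.2f}")
print(f"simulated median BCR = {np.median(bcr):.2f}, "
      f"P(BCR < 1) = {(bcr < 1).mean():.1%}")
```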

  11. Quantitative Risk - Phases 1 & 2

    Science.gov (United States)

    2013-11-12


  12. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amounts of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, the fibroglandular tissues, and the enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used fuzzy c-means clustering with automatic selection of the cluster number to segment the fibroglandular tissues within the segmented whole-breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of the pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were computed automatically. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, and achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.

  13. A Note on Quantitative Definition of Risk

    Institute of Scientific and Technical Information of China (English)

    李力; 赵联文; 杨宁

    2004-01-01

    Risk analysis has become more and more important in practical applications, but there has not been a widely accepted framework and model for the research. Motivated by the factor analysis method and Kaplan and Garrick's quantitative definition of risk, a general risk model is established based on an analysis of risk situations and the use of information systems and evidence theory, and Kaplan and Garrick's result is improved to introduce a framework for analyzing and managing risk.

  14. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice surveys) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs for devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed.

  15. Quantitative Structure Activity Relationship and Risk Analysis of Some Pesticides in the Goat milk

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2013-01-01

    The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan, was performed by HPLC using solid-phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean ± SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34 ± 0.007, 0.063 ± 0.002, 0.034 ± 0.002 and 0.092 ± 0.002, respectively, whereas methyl parathion was not detected in any of the analyzed samples. Quantitative structure-activity relationship (QSAR) models were proposed to predict the residues of unknown pesticides in goat milk from their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol-water partition coefficient (Ko/w), in relation to milk characteristics such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R² = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index were higher for the studied pesticides in goat milk, suggesting that these are the better determinants for predicting pesticide residues in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that the daily intake levels of cyhalothrin, chlorpyrifos and cypermethrin residues in the goat milk in the present study are 2.68, 5.19 and 2.71 times higher, respectively, than the tolerable levels. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.
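The QSAR model form described, residue level regressed on physicochemical descriptors, can be sketched with ordinary least squares. The descriptor values below are approximate public figures paired with the residue means quoted in the abstract, and with only four observations the fit is purely illustrative of the model form, not a reproduction of the paper's fitted model.

```python
import numpy as np

# Hypothetical QSAR-style training data: pesticide descriptors -> residue (ppm).
#              MW      MP(degC)  logKow
X = np.array([[449.9,  49.0,    6.9],   # cyhalothrin-like
              [406.9,  70.0,    3.8],   # endosulfan-like
              [350.6,  42.0,    5.0],   # chlorpyrifos-like
              [416.3,  61.0,    6.6]])  # cypermethrin-like
y = np.array([0.34, 0.063, 0.034, 0.092])  # residue means (ppm) from the text

# Ordinary least squares with an intercept; with four observations and four
# coefficients this is exactly determined, so treat it as a form sketch only.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print("coefficients (intercept, MW, MP, logKow):", np.round(coef, 4))
print("fitted residues (ppm):", np.round(pred, 3))
```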

  16. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and …

  17. Microbiological Quantitative Risk Assessment

    Science.gov (United States)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  18. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  19. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Diversification of the methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators that can be used to assess the risk of bankruptcy of enterprises, but making an assessment requires reducing the number of indicators, which can be achieved through principal component, cluster and discriminant analysis techniques. In this context, the article aims to build a scoring function for identifying bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
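A sketch of building a discriminant scoring function of the kind described: linear discriminant analysis on financial ratios yields a Z-score-like linear rule. The ratios and class structure below are simulated, not Bucharest Stock Exchange data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)

# Hypothetical financial ratios for 200 firms: [liquidity, leverage, profitability]
healthy = rng.normal([1.8, 0.4, 0.10], [0.4, 0.10, 0.05], size=(150, 3))
bankrupt = rng.normal([0.9, 0.8, -0.05], [0.3, 0.15, 0.05], size=(50, 3))
X = np.vstack([healthy, bankrupt])
y = np.array([0] * 150 + [1] * 50)  # 1 = went bankrupt

# Discriminant analysis yields a linear scoring function of the ratios,
# analogous to the Z-score-style functions discussed in the article.
lda = LinearDiscriminantAnalysis().fit(X, y)
print("score weights:", np.round(lda.coef_[0], 3),
      "intercept:", round(float(lda.intercept_[0]), 3))

firm = np.array([[1.1, 0.7, 0.0]])  # a firm to classify (hypothetical ratios)
print("P(bankrupt) =", round(float(lda.predict_proba(firm)[0, 1]), 3))
```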

  20. Quantitative Risk - Phase 1

    Science.gov (United States)

    2013-09-03


  1. A Quantitative Risk-Benefit Analysis of Prophylactic Surgery Prior to Extended-Duration Spaceflight

    Science.gov (United States)

    Carroll, Danielle; Reyes, David; Kerstman, Eric; Walton, Marlei; Antonsen, Erik

    2017-01-01

    INTRODUCTION: Among otherwise healthy astronauts undertaking deep space missions, the risks for acute appendicitis (AA) and acute cholecystitis (AC) are not zero. If these conditions were to occur during spaceflight, they may require surgery for definitive care. The proposed study quantifies and compares the risks of developing de novo AA and AC in-flight against the surgical risks of prophylactic laparoscopic appendectomy (LA) and cholecystectomy (LC), using NASA's Integrated Medical Model (IMM). METHODS: The IMM is a Monte Carlo simulation that forecasts medical events during spaceflight missions and estimates the impact of these medical events on crew health. In this study, four Design Reference Missions (DRMs) were created to assess the probability of an astronaut developing in-flight small-bowel obstruction (SBO) following prophylactic 1) LA, 2) LC, 3) LA and LC, or 4) neither surgery (SR# S-20160407-351). Model inputs were drawn from a large, population-based 2011 Swedish study that examined the incidence and risks of post-operative SBO over a 5-year follow-up period; the study group included 1,152 patients who underwent LA and 16,371 who underwent LC. RESULTS: Preliminary results indicate that prophylactic LA may yield higher mission risks than the control DRM. Complete analyses are pending and will be available subsequently. DISCUSSION: The risks versus benefits of prophylactic surgery in astronauts to decrease the probability of acute surgical events during spaceflight have only been qualitatively examined in prior studies. Within the assumptions and limitations of the IMM, this work provides the first quantitative guidance on this important question for future deep space exploration missions.

  2. Quantitative risk assessment of CO2 transport by pipelines

    NARCIS (Netherlands)

    Koornneef, J.; Spruijt, M.; Molag, M.; Ramírez, A.; Turkenburg, W.; Faaij, A.

    2010-01-01

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainty that have been assessed include failure rates, pipeline pressure, temperature, …

  3. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. The fields of application discussed range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling …

  4. Quantitative data analysis of chemical contamination in the Venice lagoon. A risk management perspective

    Energy Technology Data Exchange (ETDEWEB)

    Miniero, R.; Domenico, A. di [Istituto Superiore di Sanita, Rome (Italy). Dept. Environment and Primary Prevention

    2004-09-15

    A comprehensive risk management strategy for the contaminants present in the bottom sediments of the Venice lagoon is complicated by three issues: the past, present, and future influence of human pressure; the obvious sensitivity of a wetland like the lagoon; and its sheer extent. The situation can be viewed as typical of stressors acting at a regional scale. The relationships between a coastal city and its environment are one of the central questions addressed in Chapter 17 of Agenda 21, adopted at the United Nations Conference on Environment and Development (UNCED). That chapter stresses the importance of coasts in a life-supporting system and the positive opportunity for sustainable development that coastal areas represent. However, in industrialized countries a practicable co-existence of environment and development will depend largely on regulatory measures. The Venice lagoon is one of the leading shellfish production areas in Italy, harvesting several metric tons per year of the clam Tapes philippinarum and the mussel Mytilus galloprovincialis. A number of studies in recent years have characterized the chemical contamination of matrices such as biota and sediment. The chemicals analyzed belong to different families, including organic contaminants (such as polychlorinated dibenzodioxins (PCDDs) and dibenzofurans (PCDFs)), chlorinated pesticides, heavy metals, organometals, etc. The primary contamination sources have been clearly identified as the Porto Marghera industrial settlement and the city of Venice with its canals, motorboats, and dense anthropogenic activity. The impacts of all these activities appear to be concentrated in the central basin, although the industrial area is situated at the southern boundaries of the northern basin. From the studies on sediments, the following four impact types were identified in the lagoon: industrial, urban, "not classifiable", and lagoon background. In this paper, the PCDD

  5. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2017-08-12

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
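
    As a rough illustration of the multivariable step, the sketch below fits a logistic regression of a binary high-risk-plaque label on three per-segment quantitative measures and reports exponentiated coefficients as odds ratios. The data are synthetic stand-ins (the ROMICAT II data are not reproduced here) and scikit-learn is an assumed dependency.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 888  # number of segments, mirroring the study; values below are synthetic

    low_atten_vol = rng.gamma(2.0, 2.0, n)    # low CT attenuation plaque volume, mm^3
    remodeling    = rng.normal(1.1, 0.15, n)  # remodeling index
    burden        = rng.beta(2.0, 3.0, n)     # plaque burden (fraction)

    # Generate a synthetic "qualitative high-risk plaque" label
    logit = -4.0 + 0.12 * low_atten_vol + 2.2 * (remodeling - 1.0) + 4.3 * burden
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([low_atten_vol, remodeling, burden])
    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Odds ratios per unit increase (the paper scales some ORs per 0.1 unit)
    ors = np.exp(model.coef_[0])
    print(dict(zip(["low_atten_vol", "remodeling", "burden"], ors.round(2))))
    ```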

  6. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  8. Quantitative risk-benefit analysis of fish consumption for women of child-bearing age in Hong Kong.

    Science.gov (United States)

    Chen, M Y Y; Wong, W W K; Chung, S W C; Tran, C H; Chan, B T P; Ho, Y Y; Xiao, Y

    2014-01-01

    Maternal fish consumption is associated with both risks from methylmercury (MeHg) and beneficial effects from omega-3 fatty acids to the developing foetal brain. This paper assessed the dietary exposure to MeHg of women of child-bearing age (20-49 years) in Hong Kong, and conducted a risk-benefit analysis in terms of the effects on children's intelligence quotient (IQ), based on local data and the quantitative method derived by the expert consultation of FAO/WHO. Results showed that average and high consumers consume 450 and 1500 g of fish (including seafood) per week, respectively. About 11% of women of child-bearing age had a dietary exposure to MeHg exceeding the PTWI of 1.6 µg kg⁻¹ bw; in pregnant women such MeHg intake may pose health risks to the developing foetus. For average consumers, eating any of the 19 types of the most commonly consumed fish and seafood during pregnancy would result in a gain of 0.79-5.7 IQ points by their children. For high consumers, if they only ate tuna during pregnancy, it would cause a 2.3-point IQ reduction in their children. The results indicated that for pregnant women the benefit outweighed the risk associated with eating fish, provided they consume different varieties of fish in moderation.
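
    A back-of-the-envelope version of the exposure-screening step is sketched below. The 1.6 µg kg⁻¹ bw PTWI and the 450/1500 g weekly intakes come from the abstract; the species MeHg concentrations and the 55 kg body weight are assumptions for illustration only.

    ```python
    PTWI = 1.6  # provisional tolerable weekly intake of MeHg, ug/kg bw/week

    def weekly_mehg_exposure(fish_g_per_week: float, mehg_ug_per_g: float,
                             bw_kg: float = 55.0) -> float:
        """Dietary MeHg exposure in ug per kg body weight per week."""
        return fish_g_per_week * mehg_ug_per_g / bw_kg

    # Assumed MeHg concentrations (ug/g); real values vary widely by species.
    species = {"tuna": 0.30, "pomfret": 0.08, "salmon": 0.05}

    for name, conc in species.items():
        for grams, label in ((450, "average"), (1500, "high")):
            e = weekly_mehg_exposure(grams, conc)
            print(f"{label:7s} consumer, {name:8s}: "
                  f"{e:5.2f} ug/kg bw/week ({e / PTWI:.0%} of PTWI)")
    ```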

  9. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    Science.gov (United States)

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

    This work reports the quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils marketed in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in oil samples was 54.37 μg kg⁻¹. Low molecular weight PAH compounds were the predominant contaminants. In particular, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg⁻¹, which is below the limits set by the European Union and China. A preliminary human health risk assessment for PAHs was accomplished using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10⁻⁶, indicating a high potential carcinogenic risk for the populations exposed through diet.
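
    The ILCR step follows a standard pattern: convert each PAH concentration to a BaP-equivalent, then scale by intake, exposure duration and a cancer slope factor. The sketch below uses the Nisbet and LaGoy TEFs and commonly cited exposure defaults; apart from the 1.28 µg kg⁻¹ BaP mean quoted above, all concentrations and factors are illustrative assumptions, not the study's values.

    ```python
    # Toxic equivalency factors (Nisbet & LaGoy); subset for brevity.
    TEF = {"BaP": 1.0, "BaA": 0.1, "BbF": 0.1, "BkF": 0.01, "Chr": 0.01}
    # BaP mean from the paper; the other concentrations are made up (ug/kg oil).
    conc = {"BaP": 1.28, "BaA": 3.10, "BbF": 2.40, "BkF": 0.90, "Chr": 4.00}

    bap_eq_mg_per_kg = sum(conc[p] * TEF[p] for p in TEF) / 1000.0

    IR = 0.025             # assumed oil intake, kg/day
    EF, ED = 365, 70       # exposure frequency (days/yr) and duration (yr)
    BW, AT = 60, 70 * 365  # body weight (kg), averaging time (days)
    CSF = 7.3              # oral cancer slope factor for BaP, (mg/kg/day)^-1

    ilcr = (bap_eq_mg_per_kg * IR * EF * ED * CSF) / (BW * AT)
    print(f"BaPeq = {bap_eq_mg_per_kg * 1000:.2f} ug/kg, ILCR = {ilcr:.1e}")
    ```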

  10. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Kriticos, D.J.; Baker, R.H.A.; Battisti, A.; Brunel, S.; Dupin, M.; Eyre, D.; Faccoli, M.; Ilieva, Z.; Kenis, M.; Knight, J.; Reynaud, P.; Yart, A.; Werf, van der W.

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts.

  11. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  12. A toolbox for rockfall Quantitative Risk Assessment

    Science.gov (United States)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years for both spatially distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in the collection of the required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets in the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and of the related fragility curves, as functions of block velocity and block size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock block.
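
    The risk equation these tools implement can be reduced to a few lines: for each rockfall scenario, expected annual loss is the product of event probability, reach/impact probability, vulnerability and the value of the element at risk. The sketch below is a stripped-down stand-in for that calculation, not QuRAR itself; all numbers are placeholders.

    ```python
    # (annual event probability, P(block reaches the building), vulnerability 0-1)
    scenarios = [
        (0.100, 0.05, 0.05),  # small, frequent blocks
        (0.020, 0.15, 0.30),  # medium blocks
        (0.005, 0.30, 0.80),  # large, rare blocks
    ]
    BUILDING_VALUE = 500_000  # assumed value of the element at risk, EUR

    expected_annual_cost = sum(
        p_event * p_reach * vulnerability * BUILDING_VALUE
        for p_event, p_reach, vulnerability in scenarios
    )
    print(f"Expected annual cost: {expected_annual_cost:,.0f} EUR/yr")
    ```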

  13. Quantitative Analysis of Uncertainty in Financial Risk Assessment of Road Transportation of Wood in Uruguay

    Directory of Open Access Journals (Sweden)

    Danilo Simões

    2016-06-01

    Uncertainty in the road transportation of wood is inherent in its operational costs, the amount of wood transported, the distance traveled, the revenue obtained, and more. Although it is not possible to measure this uncertainty fully, it can be quantified by the investment risk, which is the probability and degree of financial loss. The objective of this study is to quantify the financial risk of investment in wood transportation through Monte Carlo simulation, which uses realistic situations to estimate the operational cost of vehicles used for the road transportation of wood. We quantify these uncertainties by assessing financial risk and building pseudorandom scenarios with the Monte Carlo simulation method, together with the Net Present Value, Modified Internal Rate of Return, and Profitability Index techniques commonly used in financial investment projects. The results show that the estimated operational costs are equivalent to the actual ones, and give evidence that fuel, driver labor, and tires are the cost components that most increase the degree of financial risk of an investment project in the road transportation of wood. In contrast, the volume of wood transported and the average price of wood transportation correlate significantly and positively with returns, so optimizing the amount of wood transported and the transportation price leads to a reduction in the degree of financial risk.
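
    The core of such an assessment fits in a short Monte Carlo loop: sample uncertain cost and revenue drivers, compute a Net Present Value per scenario, and read the financial risk off the share of scenarios with negative NPV. The sketch below is a generic stand-in with invented figures, not the paper's model.

    ```python
    import random

    random.seed(42)
    N, YEARS, RATE = 10_000, 5, 0.10
    INVESTMENT = 120_000  # assumed truck purchase cost, USD

    def annual_cash_flow() -> float:
        volume = random.triangular(18_000, 26_000, 22_000)  # m3 hauled per year
        price  = random.triangular(7.0, 10.0, 8.5)          # revenue, USD per m3
        cost   = random.triangular(6.0, 9.5, 7.5)           # operating cost per m3
        return volume * (price - cost)

    npvs = []
    for _ in range(N):
        flows = [annual_cash_flow() for _ in range(YEARS)]
        npv = sum(c / (1 + RATE) ** (t + 1) for t, c in enumerate(flows)) - INVESTMENT
        npvs.append(npv)

    risk = sum(npv < 0 for npv in npvs) / N
    print(f"P(NPV < 0) = {risk:.1%}, mean NPV = {sum(npvs) / N:,.0f} USD")
    ```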

  14. Retrospective analysis of a listeria monocytogenes contamination episode in raw milk goat cheese using quantitative microbial risk assessment tools.

    Science.gov (United States)

    Delhalle, L; Ellouze, M; Yde, M; Clinquart, A; Daube, G; Korsak, N

    2012-12-01

    In 2005, the Belgian authorities reported a Listeria monocytogenes contamination episode in cheese made from raw goat's milk. The presence of an asymptomatic shedder goat in the herd caused this contamination. On the basis of data collected at the time of the episode, a retrospective study was performed using an exposure assessment model covering the production chain from the milking of goats up to delivery of cheese to the market. Predictive microbiology models were used to simulate the growth of L. monocytogenes during the cheese process as a function of temperature, pH, and water activity. The model showed significant growth of L. monocytogenes during chilling and storage of the milk collected the day before cheese production (median increase of 2.2 log CFU/ml) and during the addition of starter and rennet to the milk (median increase of 1.2 log CFU/ml). The L. monocytogenes concentration in the fresh unripened cheese was estimated at 3.8 log CFU/g (median). This result is consistent with the number of L. monocytogenes in the fresh cheese (3.6 log CFU/g) reported during the contamination episode. A variance-based sensitivity analysis identified the most important factors affecting cheese contamination, and a scenario analysis then evaluated several options for risk mitigation. Thus, by using quantitative microbial risk assessment tools, this study provides reliable information to identify and control critical steps in a local production chain of cheese made from raw goat's milk.
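
    The predictive-microbiology step typically combines a secondary model (growth rate as a function of temperature) with a primary exponential-growth model. A minimal sketch using a Ratkowsky square-root secondary model is given below; the parameters are illustrative, not those fitted in the study.

    ```python
    import math

    B, T_MIN = 0.04, -1.18  # assumed Ratkowsky parameters for L. monocytogenes

    def mu_max(temp_c: float) -> float:
        """Maximum specific growth rate (ln CFU per hour) at a given temperature."""
        if temp_c <= T_MIN:
            return 0.0
        return (B * (temp_c - T_MIN)) ** 2

    n0_log10 = 1.0             # initial contamination, log10 CFU/ml
    temp_c, hours = 8.0, 16.0  # overnight chilled storage of the raw milk

    delta_log10 = mu_max(temp_c) * hours / math.log(10)
    print(f"growth: +{delta_log10:.2f} log -> {n0_log10 + delta_log10:.2f} log CFU/ml")
    ```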

  15. Quantitative risk analysis of gas explosions in tunnels; probability, effects, and consequences

    NARCIS (Netherlands)

    Weerheijm, J.; Voort, M.M. van der; Verreault, J.; Berg, A.C. van den

    2015-01-01

    Tunnel accidents with transports of combustible liquefied gases may lead to explosions. Depending on the substance involved, this can be a Boiling Liquid Expanding Vapour Explosion (BLEVE), a Gas Expansion Explosion (GEE) or a gas explosion. Quantification of the risk of these scenarios is important.

  17. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Science.gov (United States)

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.

  19. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  20. Use of quantitative hazard analysis to evaluate risk associated with US Department of Energy Nuclear Explosive Operations

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, S.R.; O'Brien, D.A.; Martinez, J.; LeDoux, M.

    1996-03-01

    Quantitative hazard assessments (QHAs) are being used to support the US Department of Energy (DOE) Integrated Safety Process (SS-21), Nuclear Explosive Safety Studies (NESS), and Environmental Safety and Health (ES&H) initiatives. The QHAs are used to identify hazards associated with DOE nuclear explosive operations. In 1994, Los Alamos National Laboratory, Sandia National Laboratory, and the Pantex Plant participated in a joint effort to demonstrate the utility of performing hazard assessments (HAs) concurrently with process design and development efforts. Early identification of high-risk operations allows for process modifications before the final process design is completed. This demonstration effort, which used an integrated design process (SS-21), resulted in the redesign of the dismantlement process for the B61 center case. The SS-21 program integrates environment, safety, and health (ES&H) and nuclear explosive safety requirements. QHAs are used to identify accidents that have the potential for worker injury or public health or environmental impact. The HA is used to evaluate the likelihood of accident sequences that have the potential for worker or public injury or environmental damage; identify safety-critical tooling and procedural steps; identify operational safety controls; identify safety-class/significant systems, structures and components; identify dominant accident sequences; demonstrate that the facility Safety Analysis Report (SAR) design-basis accident envelops process-specific accidents; and support future change control activities.

  1. The use of quantitative risk assessment in HACCP

    NARCIS (Netherlands)

    Hoornstra, E.; Northolt, M.D.; Notermans, S.; Barendsz, A.W.

    2001-01-01

    During the hazard analysis as part of the development of a HACCP-system, first the hazards (contaminants) have to be identified and then the risks have to be assessed. Often, this assessment is restricted to a qualitative analysis. By using elements of quantitative risk assessment (QRA) the hazard a

  2. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  3. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    Science.gov (United States)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, yet these integration risks often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to decide whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood in the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. The approach provides a set of guidelines that may help arrive at a more realistic quantification of risk prior to acceptance by a program.

  4. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. [Figure 2: Transfer of a solid with a spatula.] The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  5. Quantitative risk analysis applied to innocuity and potency tests on the oil-adjuvanted vaccine against foot and mouth disease in Argentina.

    Science.gov (United States)

    Cané, B G; Rodríguez Toledo, J; Falczuk, A; Leanes, L F; Manetti, J C; Maradei, E; Verde, P

    1995-12-01

    The authors describe the method used in Argentina for the quantification of risk in controls of the potency and innocuity of foot and mouth disease vaccine. Quantitative risk analysis is a relatively new tool in the animal health field, and is in line with the principles of transparency and equivalency of the Sanitary and Phytosanitary Agreement of the Uruguay Round of the General Agreement on Tariffs and Trade (GATT; now World Trade Organisation [WTO]). The risk assessment is presented through a description of the steps involved in manufacturing the vaccine and the controls performed by the manufacturer and by the National Animal Health Service (Servicio Nacional de Sanidad Animal: SENASA). The adverse outcome considered is a lack of potency or innocuity of the vaccine, and the risk is estimated using a combination of Monte Carlo simulation and a Bayesian model.
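
    A minimal version of the "Monte Carlo plus Bayesian model" combination might look like the sketch below: a Beta posterior for the probability that a batch lacks potency, propagated through a simulation of one year of batch releases. The prior, the counts and the test sensitivity are all invented for illustration.

    ```python
    import random

    random.seed(7)
    ALPHA0, BETA0 = 1, 99    # assumed Beta prior: ~1% of batches deficient
    TESTED, FAILED = 200, 1  # assumed historical control results

    alpha, beta = ALPHA0 + FAILED, BETA0 + TESTED - FAILED  # Beta posterior

    N_SIM, BATCHES_PER_YEAR = 20_000, 50
    P_DETECT = 0.95  # assumed sensitivity of the potency/innocuity controls

    years_with_release = 0
    for _ in range(N_SIM):
        p_bad = random.betavariate(alpha, beta)  # draw from the posterior
        released = sum(random.random() < p_bad * (1 - P_DETECT)
                       for _ in range(BATCHES_PER_YEAR))
        years_with_release += released > 0

    print(f"P(>=1 deficient batch released per year) ~ {years_with_release / N_SIM:.2%}")
    ```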

  6. Quantitative analysis of privatization

    CERN Document Server

    Vahabi, M

    2008-01-01

    In recent years the economic policy of privatization, defined as the transfer of property or responsibility from the public sector to the private sector, has been a global phenomenon that increases the use of markets to allocate resources. One important motivation for privatization is to help develop factor and product markets, as well as security markets. Progress in privatization is correlated with improvements in perceived political and investment risk. Many emerging countries have gradually reduced their political risks during the course of sustained privatization. In fact, most risk resolution seems to take place as privatization proceeds to its later stages. Additional benefits of privatization are improved risk sharing and increased liquidity and activity of the market. One of the main ways to advance privatization is to introduce a new stock to the markets to foster competition. However, attention to the capability of the markets to absorb a new stock is essential. Without considering the above st...

  7. Quantitative Risk Analysis of Software Project Schedule Management

    Institute of Scientific and Technical Information of China (English)

    He Jianhong; Bai Xiaoying; Hu Linping

    2011-01-01

    Project delay is still a common phenomenon in software development; it can seriously affect software quality and cost, and may even result in project failure. Risk control is therefore essential in project schedule management. This paper analyzes quantitative approaches to schedule risk analysis and management, surveying and comparing three typical models of software project scheduling: PERT/CPM, Critical Chain/Buffer Management, and the Event Chain Methodology. Among them, the Event Chain Methodology is a model and risk analysis method newly proposed in 2004. It introduces an event model on top of the traditional activity models, capturing the basic characteristics of risk occurrence and propagation; by keeping a separate, dedicated risk model, it offers a new viewpoint and promotes an innovative approach to quantitative risk analysis.
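
    The contrast between the activity-only and event-chain views can be captured in a few lines: a three-point (PERT-style) duration estimate per activity, plus discrete risk events that, when triggered, add delay. The sketch below is a generic illustration with invented durations, not a reconstruction of any of the surveyed models.

    ```python
    import random

    random.seed(3)
    # (optimistic, most likely, pessimistic) durations in days
    activities = [(10, 14, 25), (5, 8, 15), (12, 15, 30)]
    # event-chain-style risk events: (probability per project, delay in days)
    risk_events = [(0.2, 10), (0.1, 20)]

    def simulate_once() -> float:
        total = sum(random.triangular(a, c, b) for a, b, c in activities)
        total += sum(delay for p, delay in risk_events if random.random() < p)
        return total

    samples = sorted(simulate_once() for _ in range(20_000))
    mean = sum(samples) / len(samples)
    p80 = samples[int(0.8 * len(samples))]
    print(f"mean schedule = {mean:.1f} days, 80th percentile = {p80:.1f} days")
    ```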

  8. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis; hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  9. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  10. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  11. Asbestos exposure--quantitative assessment of risk

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today -- in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
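
    The arithmetic behind such extrapolations is essentially linear in cumulative exposure. The sketch below back-calculates a risk slope from the worker figure quoted above and reapplies it to the school scenario; it reproduces the order of magnitude but not the exact published value, since the actual models also adjust for exposure pattern, fiber type and age at exposure.

    ```python
    # Linear no-threshold sketch: excess lifetime risk ~ slope * (f/ml * years).
    WORKER_RISK = 82 / 10_000           # lifetime excess risk quoted above
    slope = WORKER_RISK / (0.5 * 40.0)  # risk per fiber-year (f/ml * yr)

    school_risk = slope * 0.001 * 6.0   # 0.001 f/ml over 6 school years
    print(f"slope = {slope:.1e} per fiber-year")
    print(f"school estimate ~ {school_risk * 1e6:.1f} excess cancers per million")
    ```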

  12. A Team Mental Model Perspective of Pre-Quantitative Risk

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  13. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    Science.gov (United States)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and for the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue for risk analysts.
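
    The kind of indices the tool computes can be sketched in a few lines: annual risk per attack scenario as frequency times success probability times consequence, and return on investment as risk reduction over measure cost. The scenario values and the mitigation effect below are invented placeholders.

    ```python
    # (scenario, annual attack frequency, P(success), consequence in EUR)
    scenarios = [
        ("vandalism on rolling stock", 12.00, 0.9, 5_000),
        ("intrusion into a depot",      2.00, 0.5, 80_000),
        ("sabotage of signalling",      0.05, 0.3, 5_000_000),
    ]

    def annual_risk(freq: float, p_success: float, consequence: float) -> float:
        return freq * p_success * consequence

    baseline = sum(annual_risk(f, p, c) for _, f, p, c in scenarios)

    # Assume a protection measure (e.g. CCTV plus access control) halves P(success).
    MEASURE_COST = 60_000  # EUR per year, assumed
    residual = sum(annual_risk(f, p * 0.5, c) for _, f, p, c in scenarios)

    roi = (baseline - residual) / MEASURE_COST
    print(f"baseline {baseline:,.0f} EUR/yr, residual {residual:,.0f}, ROI {roi:.2f}")
    ```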

  14. Cluster analysis of bone microarchitecture from high resolution peripheral quantitative computed tomography demonstrates two separate phenotypes associated with high fracture risk in men and women.

    Science.gov (United States)

    Edwards, M H; Robinson, D E; Ward, K A; Javaid, M K; Walker-Bone, K; Cooper, C; Dennison, E M

    2016-07-01

    Osteoporosis is a major healthcare problem which is conventionally assessed by dual energy X-ray absorptiometry (DXA). New technologies such as high resolution peripheral quantitative computed tomography (HRpQCT) also predict fracture risk. HRpQCT measures a number of bone characteristics that may inform specific patterns of bone deficits. We used cluster analysis to define different bone phenotypes and their relationships to fracture prevalence and areal bone mineral density (BMD). 177 men and 159 women, in whom fracture history was determined by self-report and vertebral fracture assessment, underwent HRpQCT of the distal radius and femoral neck DXA. Five clusters were derived, with two clusters associated with elevated fracture risk. "Cluster 1" contained 26 women (50.0% fractured) and 30 men (50.0% fractured) with a lower mean cortical thickness and cortical volumetric BMD, and in men only, a mean total and trabecular area more than the sex-specific cohort mean. "Cluster 2" contained 20 women (50.0% fractured) and 14 men (35.7% fractured) with a lower mean trabecular density and trabecular number than the sex-specific cohort mean. Logistic regression showed fracture rates in these clusters to be significantly higher than in the lowest-fracture-risk cluster [5] (p < 0.05). Areal BMD was lower than in cluster 5 for women in clusters 1 and 2 (p < 0.05) and for men in cluster 2 (p < 0.05). These results suggest distinct high-risk clusters in both men and women which may differ in etiology and response to treatment. As cluster 1 in men does not have low areal BMD, these men may not be identified as high risk by conventional DXA alone.

  15. Quantitative analysis of glycated proteins.

    Science.gov (United States)

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [¹³C₆]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [¹²C₆]- and in vitro [¹³C₆]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.

  16. Contract farming risks: A quantitative assessment

    Directory of Open Access Journals (Sweden)

    Arkins M Kabungo

    2016-03-01

    The objective of this study is to identify the key risks facing each of the stakeholders in the export-focused paprika value chain in Zambia. Although a deterministic cost-benefit analysis indicated that this outgrower scheme would have a very satisfactory net present value (NPV), a Monte Carlo analysis using an integrated financial–economic–stakeholder model identifies a number of risk variables that could make this system unsustainable. The major risks include the variability of the real exchange rate in Zambia, the international price of paprika, and the farm yield rates. This analysis points out that irrigation systems are very important for both stabilising and increasing yields. The analysis also shows the limitations of loan financing for such outgrower arrangements when, at the sector level, it is difficult or even impossible to mitigate the risks from real exchange rate movements and changes in international commodity prices. This micro-level analysis shows how critical real exchange rate management policies are in achieving the sustainability of such export-oriented value chains.

  17. Quantitative analysis of risk factors associated with brucellosis in livestock in the Katavi-Rukwa ecosystem, Tanzania.

    Science.gov (United States)

    Assenga, Justine A; Matemba, Lucas E; Malakalinga, Joseph J; Muller, Shabani K; Kazwala, Rudovick R

    2016-02-01

    Brucellosis is a neglected contagious bacterial disease of public health and economic importance. Nevertheless, its spread is not well known to many livestock farmers. An unmatched case-control study was carried out to identify risk factors associated with brucellosis in cattle and goats at the herd level in the Mpanda, Mlele and Nsimbo districts of Katavi region, Tanzania, between September 2012 and July 2013. A total of 138 adult respondents were selected randomly for interview using a structured questionnaire. The criterion for inclusion was having at least one Brucella-positive animal in the herd, while controls were chosen from among herds in which the animals tested negative. The presence of seropositive herds was statistically linked (P < 0.05) to herd-level risk factors; the findings underline the need for public awareness of the transmission of brucellosis between animals and humans and for the implementation of disease prevention and control programmes.

  18. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David;

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... on three fragrance ingredients in order to provide a practical assessment of the predictive value of the QRA approach. It is important to have data to assess that the methodology provides a robust approach for primary prevention of contact sensitization induction for fragrance ingredients identified...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  19. Increased brain iron deposition is a risk factor for brain atrophy in patients with haemodialysis: a combined study of quantitative susceptibility mapping and whole brain volume analysis.

    Science.gov (United States)

    Chai, Chao; Zhang, Mengjie; Long, Miaomiao; Chu, Zhiqiang; Wang, Tong; Wang, Lijun; Guo, Yu; Yan, Shuo; Haacke, E Mark; Shen, Wen; Xia, Shuang

    2015-08-01

    To explore the correlation between increased brain iron deposition and brain atrophy in patients on haemodialysis, and their correlation with clinical biomarkers and neuropsychological tests. Forty-two patients on haemodialysis and forty-one age- and gender-matched healthy controls were recruited in this prospective study. 3D whole-brain high-resolution T1WI and susceptibility-weighted imaging were acquired on a 3 T MRI system. Brain volume was analyzed in patients using voxel-based morphometry (VBM) and compared with that of healthy controls. Quantitative susceptibility mapping was used to measure and compare the susceptibility of different structures between patients and healthy controls. Correlation analysis was used to investigate the relationship between brain volume, iron deposition and neuropsychological scores. Stepwise multiple regression analysis was used to explore the effect of clinical biomarkers on brain volumes in patients. Compared with healthy controls, patients on haemodialysis showed decreased volume of the bilateral putamen and left insular lobe (all P < 0.05), while the susceptibility values of the bilateral putamen, substantia nigra, red nucleus and dentate nucleus were significantly higher (all P < 0.05); the increased susceptibility of the putamen correlated with its decreased volume (P < 0.05). Our study indicated that increased brain iron deposition and dialysis duration were risk factors for brain atrophy in patients on haemodialysis. The decreased gray matter volume of the left insular lobe was correlated with neurocognitive impairment.

  20. Quantitative and Qualitative Assessment of Soil Erosion Risk in Małopolska (Poland), Supported by an Object-Based Analysis of High-Resolution Satellite Images

    Science.gov (United States)

    Drzewiecki, Wojciech; Wężyk, Piotr; Pierzchalski, Marcin; Szafrańska, Beata

    2014-06-01

    In 2011 the Marshal Office of Małopolska Voivodeship decided to evaluate the vulnerability of soils to water erosion for the entire region. The quantitative and qualitative assessment of the erosion risk for the soils of the Małopolska region was based on the USLE approach, with a dedicated workflow of geoinformation technologies used to fulfil this goal. A high-resolution soil map, together with rainfall data, a detailed digital elevation model and statistical information about areas sown with particular crops, formed the input for erosion modelling in a GIS environment. Satellite remote sensing and the object-based image analysis (OBIA) approach gave valuable support to this study. RapidEye satellite images were used to obtain essential up-to-date data about land use and vegetation cover for the entire region (15,000 km²). The application of OBIA also led to defining the direction of field cultivation and the mapping of contour tillage areas. As a result, spatially differentiated values of the erosion control practice factor were used. Both the potential and the actual soil erosion risk were assessed quantitatively and qualitatively. The results of the erosion assessment in the Małopolska Voivodeship reveal that a majority of its agricultural land is characterized by moderate or low erosion risk levels. However, high-resolution erosion risk maps show substantial spatial diversity. According to our study, average or higher actual erosion intensity levels occur on 10.6% of agricultural land, i.e. 3.6% of the entire voivodeship area. In 20% of the municipalities the demand for erosion control is very urgent, and in a further 23% erosion control is urgent. Our study showed that even a slight improvement of P-factor estimation may influence the modelling results. In our case, despite a marginal change of erosion assessment figures on a regional scale, there was an influence on the final prioritization of
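
    At the level of a single raster cell, the USLE calculation behind such a workflow is a product of five factors. The values below are illustrative stand-ins; in the study they come from rainfall records, the soil map, the DEM, and the RapidEye/OBIA land-cover and tillage-direction layers.

    ```python
    # USLE: A = R * K * LS * C * P  (predicted soil loss, t/ha/yr)
    R  = 60.0   # rainfall erosivity (assumed)
    K  = 0.35   # soil erodibility, from the soil map (assumed)
    LS = 1.8    # slope length/steepness factor, from the DEM (assumed)
    C  = 0.25   # cover-management factor, from land cover (assumed)
    P  = 0.6    # support practice factor, e.g. contour tillage (assumed)

    A = R * K * LS * C * P
    print(f"predicted soil loss: {A:.1f} t/ha/yr")
    ```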

  1. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    Science.gov (United States)

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    Given that the stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depend greatly on the chosen research emphasis, and that the hazard ranking of the three representative contaminants differs from the rankings of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, using rank ordering to normalize the three properties and to unify the quantified property results can amplify or attenuate the relative characteristics of different representative contaminants.
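
    The analytic-hierarchy-process step reduces to extracting the principal eigenvector of a pairwise comparison matrix. The sketch below uses an invented 3 x 3 judgement matrix for the three contaminant properties; NumPy is an assumed dependency.

    ```python
    import numpy as np

    # Assumed pairwise comparisons of the three properties (example values only).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # normalized priority weights

    ci = (eigvals.real[k] - 3) / (3 - 1)  # consistency index
    print(dict(zip(["property_1", "property_2", "property_3"], w.round(3))),
          f"CI = {ci:.3f}")
    ```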

  2. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Contents: Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  3. Quantitative Measures of Mineral Supply Risk

    Science.gov (United States)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available; that of the OECD captures a country’s commitment to rule of law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, these measures allow decision makers to prioritize risk reduction efforts.
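
    Concentration of supply can be scored with the standard Herfindahl-Hirschman index over country production shares, optionally weighted by a political risk rating. The shares and ratings below are rough illustrations of an REE-like case, not current production statistics.

    ```python
    # Country production shares (sum to 1) and political risk ratings (0-1).
    share = {"China": 0.95, "USA": 0.03, "Australia": 0.02}
    political_risk = {"China": 0.6, "USA": 0.1, "Australia": 0.1}  # assumed

    hhi = sum(s ** 2 for s in share.values())  # 0..1; 1 = single-source supply
    weighted = sum(s ** 2 * political_risk[c] for c, s in share.items())

    print(f"HHI = {hhi:.2f}, risk-weighted concentration = {weighted:.2f}")
    ```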

  4. Risk Analysis

    Science.gov (United States)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges that a human servicing mission be flown instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save one of the major achievements of the American space program, in the words of Louis J. Lanzerotti, its chairman.

  5. The Application of the Semi-quantitative Risk Assessment Method to Urban Natural Gas Pipelines

    OpenAIRE

    Bai, Yongqiang; LiangHai LV; Wang, Tong

    2013-01-01

    This paper provides a method of semi-quantitative risk assessment for urban gas pipelines, obtained by modifying the Kent analysis method. The factors influencing fault frequency and consequence for urban gas pipelines are analyzed, and grading rules for them are studied. The grading rules of fault frequency and consequence for urban natural gas pipelines are provided. Using a semi-quantitative risk matrix, the risk grade of the urban gas pipelines is obtained, and the preliminary risk ranking for gas pipel...

  6. Simulation Analysis of Quantitative Analysis Model of Personal Credit Risk Assessment in Commercial Banks

    Institute of Scientific and Technical Information of China (English)

    Song, Qiang; Liu, Yang

    2015-01-01

    Personal credit in commercial banks is affected by factors such as inadequate management and lagging risk information management systems, so it is necessary to build a quantitative analysis model of personal credit risk assessment in commercial banks in order to improve their capacity for credit risk management. A quantitative analysis model of personal credit risk assessment in commercial banks based on multilayer step-by-step dynamic assessment is therefore proposed. First, the types and influence factors of commercial banks' personal credit risk are analyzed, and a parameter evaluation system for personal credit risk assessment is constructed; then, based on preliminary data analysis, the rank gradient values in the personal credit risk standard of commercial banks are calculated, and the design of the quantitative analysis model for personal credit risk assessment is optimized. The simulation results show that using this model for the quantitative analysis of personal credit risk assessment in commercial banks yields accurate data-fitting results, effectively guards against credit risk and, by establishing effective loss-compensation mechanisms through multiple channels, improves commercial banks' ability to prevent and control credit risk.

  7. QUANTITATIVE RISK MAPPING OF URBAN GAS PIPELINE NETWORKS USING GIS

    Directory of Open Access Journals (Sweden)

    P. Azari

    2017-09-01

    Full Text Available Natural gas is considered an important source of energy in the world. With the increasing growth of urbanization, the urban gas pipelines that transmit natural gas from transmission pipelines to consumers will become a dense network. The increase in the density of urban pipelines raises the probability of severe accidents occurring in urban areas. These accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in the urban planning and management of large cities, in order to decrease the probability of accidents and to control them. It is therefore important to assess risk values and determine their location on the urban map using an appropriate method. In previous risk analyses of urban natural gas pipeline networks, the pipelines have always been considered one by one, and their density in the urban area has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, where the EGIG historical data are used, and fatal length calculation, which involves calculation of the gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion in investigating the consequences of gas pipeline failure. The outcome of this method is individual risk, shown as a risk map.
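
    A minimal sketch of the risk calculation this abstract describes: individual risk at a grid point is the pipeline failure rate multiplied by the "fatal length", the stretch of pipe whose failure would be lethal at that point, summed over failure scenarios (and, in the paper's extension, over all pipelines affecting the point). The numbers below are illustrative assumptions, not EGIG figures:

      failure_rate = 3.5e-4          # failures per km-year (assumed order of magnitude)
      fatal_length_km = {            # fatal length per scenario, assumed already
          "jet fire": 0.050,         # weighted by scenario probability and
          "fireball": 0.020,         # fatality rate
          "explosion": 0.010,
      }
      individual_risk = sum(failure_rate * fl for fl in fatal_length_km.values())
      print(f"individual risk at the grid point: {individual_risk:.2e} per year")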

  8. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was important to control the likelihood of occurrence of risks rather than their impact, so as to have a direct effect on the entire University. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (University) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  9. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    Science.gov (United States)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, the environment, the economy and society. This paper deals with drought risk assessment, which is the first step designed to find out what the problems are and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; along with risk management, which is not covered in this paper, there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the drought risk assessment processes undertaken are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time; it also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity and frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece.
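
    The abstract does not spell the index out; for reference, a commonly published formulation of the RDI (an assumption here, not quoted from the paper) is:

      % Commonly published form of the RDI, for the first k months of a
      % hydrological year (assumed formulation, not quoted from the abstract):
      \alpha_k = \frac{\sum_{j=1}^{k} P_j}{\sum_{j=1}^{k} PET_j}, \qquad
      y_k = \ln \alpha_k, \qquad
      \mathrm{RDI}_{st}(k) = \frac{y_k - \bar{y}}{\hat{\sigma}_y}
      % where P_j and PET_j are monthly precipitation and potential
      % evapotranspiration, and the standardization uses the long-term
      % mean and standard deviation of y_k.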

  10. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    Science.gov (United States)

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted from a national perspective using probabilistic techniques. Published monitoring data for PAHs were gathered and converted into BaP-equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk estimates. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, deserve more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction.
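
    A hedged sketch of the probabilistic estimate described above: lifetime carcinogenic risk from ingestion of BaP-equivalent PAH concentrations, propagated by Monte Carlo sampling. The distributions below are invented for illustration; 7.3 (mg/kg-day)^-1 is the widely used oral slope factor for benzo[a]pyrene:

      import random

      SLOPE_FACTOR = 7.3   # oral slope factor for BaP, (mg/kg-day)^-1
      N = 100_000
      exceed = 0
      for _ in range(N):
          c_bapeq = random.lognormvariate(-11.0, 1.0)  # mg/L (assumed distribution)
          intake = random.normalvariate(2.0, 0.4)      # L/day (assumed)
          weight = random.normalvariate(60.0, 10.0)    # kg (assumed)
          ed, at = 30 * 365, 70 * 365                  # exposure days, averaging days
          risk = c_bapeq * intake * ed * SLOPE_FACTOR / (weight * at)
          exceed += risk > 1.0e-5
      print(f"P(lifetime risk > 1e-5) = {exceed / N:.3f}")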

  11. A comparison of risk assessment techniques from qualitative to quantitative

    Energy Technology Data Exchange (ETDEWEB)

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative approaches, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
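
    A toy example of the purely qualitative end of the spectrum surveyed here: categorical frequency and consequence axes mapped to a risk rank. The categories and the banding are assumptions for demonstration only:

      FREQUENCY = ["improbable", "remote", "occasional", "frequent"]
      CONSEQUENCE = ["negligible", "marginal", "critical", "catastrophic"]

      def risk_rank(frequency, consequence):
          # Sum the two category indices and band the result into a rank.
          score = FREQUENCY.index(frequency) + CONSEQUENCE.index(consequence)
          return ("low", "low", "medium", "medium", "high", "high", "high")[score]

      print(risk_rank("occasional", "critical"))   # -> high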

  12. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    Directory of Open Access Journals (Sweden)

    Upasna Saluja

    2012-06-01

    Full Text Available Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm, most organizations end up choosing the more flexible, easier-to-deploy and customizable qualitative models of risk assessment. In practice, such models often call upon practitioners to make qualitative judgments on a relative rating scale, which brings in considerable room for error, bias and subjectivity. Under the quantitative risk analysis approach, on the other hand, the estimation of risk is connected with the application of numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for Information Security Risk Management: we can use this considerably developed understanding of risk management from the medical field, especially Survival Analysis, towards handling the risks that information infrastructures face. Similarly, the financial risk management discipline prides itself on perhaps the most quantifiable of risk management models, Market Risk and Credit Risk; Information Security Risk Management can make risk measurement more objective and quantitative by referring to the approach of Credit Risk. During the recent financial crisis, many investors and financial institutions lost money or went bankrupt because they did not apply the basic principles of risk management. Learning from the financial crisis provides some valuable lessons for information risk management.

  13. Quantitative Risk Analysis of Eco-Environment Quality in Towns

    Institute of Scientific and Technical Information of China (English)

    Shang, Lingling; Lin, Yanxin

    2011-01-01

    ... evaluation of eco-environment quality for towns comprehensively and objectively. It is a suitable method for the eco-environmental quantitative risk analysis of Chinese towns.

  14. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and

  15. Quantitative analysis and reduction of the eco-toxicity risk of heavy metals for the fine fraction of automobile shredder residue (ASR) using H2O2.

    Science.gov (United States)

    Singh, Jiwan; Yang, Jae-Kyu; Chang, Yoon-Young

    2016-02-01

    The automobile shredder residue (ASR) fine fraction (size < 0.25 mm) can be considered hazardous due to the presence of high concentrations of heavy metals. Hydrogen peroxide combined with nitric acid has been used for the recovery of heavy metals (Zn, Cu, Mn, Fe, Ni, Pb, Cd and Cr) from the fine fraction of ASR. A sequential extraction procedure has also been used to determine the heavy metal speciation in the fine fraction of ASR before and after treatment. A risk analysis of the fine fraction of ASR before and after treatment was conducted to assess the bioavailability and eco-toxicity of heavy metals. The results showed that the recovery of heavy metals from ASR increased with an increase in the hydrogen peroxide concentration. A high concentration of heavy metals was found to be present in the Cbio fractions (the sum of the exchangeable and carbonate fractions) in the fine fraction of ASR, indicating a high toxicity risk. The Cbio rate of all selected heavy metals was found to range from 8.6% to 33.4% of the total metal content in the fine fraction of ASR. After treatment with 2.0% hydrogen peroxide, Cbio was reduced to 0.3-3.3% of total metal. On the basis of the risk assessment code (RAC), the environmental risk values for heavy metals in the fine fraction of ASR reflect high risk/medium risk. However, after treatment, the heavy metals would be categorized as low risk/no risk. The present study concludes that hydrogen peroxide combined with nitric acid is a promising treatment for the recovery and the reduction of the eco-toxicity risk of heavy metals in ASR.
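
    The risk assessment code logic used above is easy to state: RAC is the bioavailable share (exchangeable plus carbonate fractions) as a percentage of total metal, banded into categories. A sketch with the commonly used RAC band edges; the sample concentrations are invented:

      def rac_category(exchangeable, carbonate, total):
          """RAC in percent and its risk band (commonly used band edges)."""
          rac = 100.0 * (exchangeable + carbonate) / total
          if rac < 1:    return rac, "no risk"
          if rac <= 10:  return rac, "low risk"
          if rac <= 30:  return rac, "medium risk"
          if rac <= 50:  return rac, "high risk"
          return rac, "very high risk"

      # e.g. one metal in the untreated fine fraction (hypothetical mg/kg values)
      print(rac_category(exchangeable=120.0, carbonate=210.0, total=1000.0))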

  16. Comparative QRA (Quantitative Risk Analysis) of natural gas distribution pipelines in urban areas

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luiz Fernando S. de [Energy Solutions South America (Brazil); Cardoso, Cassia de O.; Storch, Rafael [Det Norske Veritas (DNV) (Brazil)

    2008-07-01

    The natural gas pipeline network grows around the world, but its operation inherently imposes a risk on the people living next to the pipelines. For this reason, a risk analysis must be conducted during environmental licensing in Brazil. Although the risk analysis methodology is well established, some points of its application to distribution pipelines are still under discussion. This paper presents a methodology that examines the influence of the main design and operating parameters on the calculated risk of a distribution pipeline accident in urban areas, as well as the complexity of assessing the possible accident scenarios. The impact of some scenarios has been evaluated using a Computational Fluid Dynamics tool. The results indicate that, under certain conditions, the risks from pipeline operation at operating pressures of 20 bar may be acceptable in location class 3 or even in class 4. These results play a very important role in management decisions on the growth of the natural gas distribution network in densely populated areas, as well as in the improvement of the laws governing natural gas distribution. (author)

  17. Quantitative analysis of saccadic search strategy

    NARCIS (Netherlands)

    Over, E.A.B.

    2007-01-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movements.

  18. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
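
    The risk definition quoted above (probability of a successful attack times the value of the resulting loss) already suggests how a risk reduction estimate could be reported; a minimal sketch with invented scenario numbers:

      scenarios = [
          # (P(success) before, P(success) after mitigation, loss in dollars)
          (0.10, 0.02, 5_000_000),
          (0.05, 0.04, 1_000_000),
      ]
      # Expected annual loss, summed over attack scenarios
      before = sum(p * loss for p, _, loss in scenarios)
      after = sum(q * loss for _, q, loss in scenarios)
      print(f"estimated risk reduction: ${before - after:,.0f} per year")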

  1. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed, based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length; its perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.

  2. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use in order to make the memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and especially analyzed Boehm's C code, which is a real-time mark-sweep GC running under Linux and the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. Reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory management techniques, even for low-level languages. It is more a trade-off for certain systems than an all-or-nothing proposition.

  3. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  4. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    Directory of Open Access Journals (Sweden)

    Yao Liu

    Full Text Available Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). Test outcome is reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), and "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage the "atypical" patients in order to diagnose OSCC early and prevent OSCC. In this study, we collected exfoliative cytology data, histopathology data, and clinical data of normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, the Support Vector Machine (SVM) was found to be optimal, with a high sensitivity (median > 0.98) and specificity (median > 0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve the cost-effectiveness of clinical follow-up of OLK patients, and help design clinical chemoprevention trials for high-risk populations.

  5. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    Science.gov (United States)

    Liu, Yao; Li, Jianying; Liu, Xiaoyong; Liu, Xudong; Khawar, Waqaar; Zhang, Xinyan; Wang, Fan; Chen, Xiaoxin; Sun, Zheng

    2015-01-01

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). Test outcome is reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), and "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage the "atypical" patients in order to diagnose OSCC early and prevent OSCC. In this study, we collected exfoliative cytology data, histopathology data, and clinical data of normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, the Support Vector Machine (SVM) was found to be optimal, with a high sensitivity (median > 0.98) and specificity (median > 0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve the cost-effectiveness of clinical follow-up of OLK patients, and help design clinical chemoprevention trials for high-risk populations.
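
    A hedged sketch of the modelling step described in both records above: an SVM on reconstructed cytology features, with the predicted probability serving as a 0-1 risk index in the spirit of the OCRI. The features and labels below are random placeholders, and the EdTAR reconstruction itself is not reproduced:

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 8))            # placeholder cytology features
      y = (X[:, 0] + X[:, 1] > 0).astype(int)  # placeholder labels (1 = high risk)

      model = make_pipeline(StandardScaler(), SVC(probability=True))
      model.fit(X, y)
      ocri = model.predict_proba(X[:5])[:, 1]  # risk index in [0, 1]
      print(np.round(ocri, 2))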

  6. Observations on risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.

  7. The anatomy of risk: a quantitative investigation into injection drug users' taxonomy of risk attitudes and perceptions.

    Science.gov (United States)

    Marsch, Lisa A; Bickel, Warren K; Badger, Gary J; Quesnel, Kimberly J

    2007-04-01

    The authors report on the first study to use the systematic quantitative methods of the psychometric paradigm of risk analysis to examine risk perceptions among substance abusers. Fifty opioid-dependent injection drug users (IDUs) and 50 matched, control individuals completed a series of measures to provide quantitative representations of risk perceptions about 53 risk-laden items (including activities, substances, technologies, and diseases). Results indicated that risk perceptions of IDUs and controls were highly correlated on many items; however, IDUs perceived several items, such as hepatitis, HIV, handguns, and unprotected sex, as markedly more risky. IDUs also perceived both themselves and others as having greater risk of contracting hepatitis and HIV. IDUs wanted significantly reduced regulation of drugs, including prescription drugs, heroin, valium, and barbiturates. Factor analyses conducted to understand how risk ratings related to various characteristics that have been shown to influence risk perception revealed 3 factors that account for approximately 80% of the variability in risk perception across groups: Factor 1, related to the severity of the risk; Factor 2, related to the certainty of the risk; and Factor 3, related to the immediacy of the risk. However, IDUs more strongly associated the extent to which they were personally affected by a risk item and the extent to which the risk affected fewer or more people to Factor 1, whereas control participants more strongly associated these characteristics with Factor 2. Identifying improved methodologies for evaluating the risk perceptions of IDUs may be of considerable utility in understanding their high-risk behavior.

  8. Quantitative histogram analysis of images

    Science.gov (United States)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
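
    A rough Python equivalent of the histogram statistics the LabVIEW routine reports, shown only to make the quantities concrete; the original program is not reproduced:

      import numpy as np

      def histogram_stats(grey):
          """Basic histogram statistics of a 2-D greyscale intensity array."""
          v = grey.ravel().astype(float)
          mean, std = v.mean(), v.std()
          skew = ((v - mean) ** 3).mean() / std ** 3
          kurt = ((v - mean) ** 4).mean() / std ** 4 - 3.0
          return {"mean": mean, "var": std ** 2, "min": v.min(), "max": v.max(),
                  "skewness": skew, "kurtosis": kurt, "median": np.median(v)}

      # synthetic 8-bit image standing in for the converted greyscale image
      img = np.random.default_rng(1).integers(0, 256, (64, 64))
      print(histogram_stats(img))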

  9. Integrating a quantitative risk appraisal in a health impact assessment

    DEFF Research Database (Denmark)

    Adám, Balázs; Molnár, Agnes; Gulis, Gabriel;

    2013-01-01

    BACKGROUND: Although the quantification of health outcomes in a health impact assessment (HIA) is scarce in practice, it is preferred by policymakers, as it assists various aspects of the decision-making process. This article provides an example of integrating a quantitative risk appraisal in an ... a practical example of applying quantification in an HIA, thereby promoting its incorporation into political decision making.

  10. Status and future of Quantitative Microbiological Risk Assessment in China.

    Science.gov (United States)

    Dong, Q L; Barker, G C; Gorris, L G M; Tian, M S; Song, X Y; Malakar, P K

    2015-03-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives.

  11. Quantitative risk assessment for a glass fiber insulation product.

    Science.gov (United States)

    Fayerweather, W E; Bender, J R; Hadley, J G; Eastes, W

    1997-04-01

    California Proposition 65 (Prop65) provides a mechanism by which the manufacturer may perform a quantitative risk assessment to be used in determining the need for cancer warning labels. This paper presents a risk assessment under this regulation for professional and do-it-yourself insulation installers. It determines the level of insulation glass fiber exposure (specifically Owens Corning's R-25 PinkPlus with Miraflex) that, assuming a working lifetime exposure, poses no significant cancer risk under Prop65's regulations. "No significant risk" is defined under Prop65 as a lifetime risk of no more than one additional cancer case per 100,000 exposed persons, and nonsignificant exposure is defined as a working lifetime exposure associated with "no significant risk." This determination can be carried out despite the fact that the relevant underlying studies (i.e., chronic inhalation bioassays) of comparable glass wool fibers do not show tumorigenic activity. Nonsignificant exposures are estimated from (1) the most recent RCC chronic inhalation bioassay of nondurable fiberglass in rats; (2) intraperitoneal fiberglass injection studies in rats; (3) a distributional, decision analysis approach applied to four chronic inhalation rat bioassays of conventional fiberglass; (4) an extrapolation from the RCC chronic rat inhalation bioassay of durable refractory ceramic fibers; and (5) an extrapolation from the IOM chronic rat inhalation bioassay of durable E glass microfibers. When the EPA linear nonthreshold model is used, central estimates of nonsignificant exposure range from 0.36 fibers/cc (for the RCC chronic inhalation bioassay of fiberglass) through 21 fibers/cc (for the i.p. fiberglass injection studies). Lower 95% confidence bounds on these estimates vary from 0.17 fibers/cc through 13 fibers/cc. Estimates derived from the distributional approach or from applying the EPA linear nonthreshold model to chronic bioassays of durable fibers such as refractory ceramic fiber

  12. Status and future of Quantitative Microbiological Risk Assessment in China

    OpenAIRE

    Dong, Q. L.; Barker, G. C.; Gorris, L.G.M.; Tian, M.S.; Song, X. Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, eff...

  13. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, one decides whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  14. New developments in quantitative risk assessment of campylobacteriosis

    DEFF Research Database (Denmark)

    Havelaar, Arie; Nauta, Maarten

    the formulation of quantitative criteria for broilers and other meats. New modelling methods (e.g. Bayesian approaches) are being proposed, but are currently less detailed than conventional Monte Carlo simulation models. There is considerable uncertainty in dose-response relationships and in particular, current......Quantitative microbiological risk assessment (QMRA) is now broadly accepted as an important decision support tool in food safety risk management. It has been used to support decision making at the global level (Codex Alimentarius, FAO and WHO), at the European level (European Food Safety Authority...... go undetected after on-farm monitoring. Processing models indicate that defeathering and evisceration are key processes leading to carcass contamination, but there is still considerable uncertainty of the quantitative effects of these processes. In the kitchen, cross-contamination from contaminated...

  15. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HWAN; YANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  16. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage ti

  17. Students perceiving risk: a quantitative assessment on three South ...

    African Journals Online (AJOL)

    Students perceiving risk: a quantitative assessment on three South African ... The project originated in qualitative research conducted over two years (2008-2009) ... This article documents the risk perceived by students at three universities ...

  18. Estimation of prevalence of Salmonella on pig carcasses and pork joints, using a quantitative risk assessment model aided by meta-analysis.

    Science.gov (United States)

    Barron, Ursula Gonzales; Soumpasis, Ilias; Butler, Francis; Prendergast, Deirdre; Duggan, Sharon; Duffy, Geraldine

    2009-02-01

    This risk assessment study aimed to estimate the prevalence of Salmonella on pig carcasses and pork joints produced in slaughterhouses, on the basis that within slaughter groups there is a strong association between the proportion of Salmonella-positive animals entering the slaughter lines (x) and the resulting proportion of contaminated eviscerated pig carcasses (y). To this effect, the results of a number of published studies reporting estimates of x and y were assembled in order to model a stochastic weighted regression considering the sensitivities of the diverse Salmonella culture methods. Meta-analysis was used to assign weights to the regression and to estimate the overall effect of chilling on Salmonella incidence on pig carcasses. The model's ability to produce accurate estimates and the intrinsic effectiveness of the modeling capabilities of meta-analysis were appraised using Irish data for the input parameter of prevalence of Salmonella carrier slaughter pigs. The model approximated a Salmonella prevalence in pork joints from Irish boning halls of 4.0% (95% confidence interval, 0.3 to 12.0%) and was validated by the results of a large survey (n = 720) of Salmonella in pork joints (mean, 3.3%; 95% confidence interval, 2.0 to 4.6%) carried out in four commercial pork abattoirs as part of this research project. Sensitivity analysis reinforced the importance of final rinsing (r = -0.382) and chilling (r = -0.221) as stages that contribute considerably to reducing the occurrence of Salmonella on the final product, while hygiene practices during jointing seemed to moderate only marginally the amount of contaminated pork joints. Finally, the adequacy of meta-analysis for integrating different findings and producing distributions for use in stochastic modeling was demonstrated.
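
    The core of the approach, stripped of the stochastic and sensitivity machinery, is a weighted regression of carcass prevalence on carrier prevalence across studies; a sketch with invented study data (the paper's culture-method sensitivity corrections are not reproduced):

      import numpy as np

      x = np.array([0.10, 0.20, 0.35, 0.50])   # carrier prevalence per study
      y = np.array([0.03, 0.05, 0.10, 0.15])   # carcass prevalence per study
      w = np.array([120, 300, 80, 200])        # study sample sizes as weights

      slope, intercept = np.polyfit(x, y, 1, w=np.sqrt(w))
      print(f"predicted carcass prevalence at x=0.30: {slope * 0.30 + intercept:.3f}")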

  1. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from the simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  2. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    Science.gov (United States)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  3. Campylobacter Risk Analysis

    DEFF Research Database (Denmark)

    Nauta, Maarten

    In several countries quantitative microbiological risk assessments (QMRAs) have been performed for Campylobacter in chicken meat. The models constructed for this purpose provide a good example of the development of QMRA in general and illustrate the diversity of available methods. Despite...... the differences between the models, the most prominent conclusions of the QMRAs are similar. These conclusions for example relate to the large risk of highly contaminated meat products and the insignificance of contamination from Campylobacter positive flocks to negative flocks during slaughter and processing...

  4. Quantitative risk assessment methods for cancer and noncancer effects.

    Science.gov (United States)

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36. Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatographic techniques (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading it should be enough to use it properly.

  6. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction ...

  7. Quantitative analysis of arm movement smoothness

    Science.gov (United States)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the quantitative smoothness analysis of motion data. We investigated values of movement units, fluidity and jerk for the healthy and the paralyzed arm of patients with hemiparesis after stroke. The patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
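
    One common formulation of the jerk-based smoothness measure named above is the dimensionless jerk (the paper's exact definitions may differ), sketched here on a synthetic reach:

      import numpy as np

      def dimensionless_jerk(pos, dt):
          """Dimensionless jerk of a sampled 1-D position trajectory."""
          vel = np.gradient(pos, dt)
          jerk = np.gradient(np.gradient(vel, dt), dt)
          duration = dt * (len(pos) - 1)
          amplitude = abs(pos[-1] - pos[0])
          # integral of squared jerk, scaled to be unit-free; lower = smoother
          return (jerk ** 2).sum() * dt * duration ** 5 / amplitude ** 2

      t = np.linspace(0.0, 1.0, 200)
      reach = 0.3 * (10 * t**3 - 15 * t**4 + 6 * t**5)  # smooth minimum-jerk profile
      print(dimensionless_jerk(reach, t[1] - t[0]))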

  8. Seniors' Online Communities: A Quantitative Content Analysis

    Science.gov (United States)

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  9. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. Copyright © 2011 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
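
    Fractal dimension of a binarized image is typically estimated by box counting; a sketch of that estimator (the confocal segmentation and the lacunarity computation are not reproduced, and the input is assumed to be a non-empty binary mask):

      import numpy as np

      def fractal_dimension(mask):
          """Box-counting estimate of D from a 2-D boolean mask."""
          n = 2 ** int(np.log2(min(mask.shape)))   # crop to a power-of-two square
          mask = mask[:n, :n]
          sizes, counts = [], []
          size = n
          while size >= 2:
              # count boxes of side `size` containing at least one foreground pixel
              boxed = mask.reshape(n // size, size, n // size, size).any(axis=(1, 3))
              sizes.append(size)
              counts.append(boxed.sum())
              size //= 2
          slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
          return -slope   # N(s) ~ s^(-D)

      rng = np.random.default_rng(2)
      print(fractal_dimension(rng.random((256, 256)) > 0.5))  # ~2 for dense noise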

  10. Optimized immunohistochemistry in combination with image analysis: a reliable alternative to quantitative ELISA determination of uPA and PAI-1 for routine risk group discrimination in breast cancer.

    Science.gov (United States)

    Lang, D S; Heilenkötter, U; Schumm, W; Behrens, O; Simon, R; Vollmer, E; Goldmann, T

    2013-10-01

    The determination of the invasion markers urokinase-type plasminogen activator (uPA) and plasminogen activator inhibitor-1 (PAI-1) has further improved the possibilities for individualized therapy of breast cancer. To date, quantitative measurement by ELISA, which needs large amounts of fresh-frozen material, is the only standardized procedure for diagnostic purposes. Therefore, the aim of this study was the establishment of a reliable alternative method based on immunohistochemistry (IHC) and image analysis, requiring only small amounts of fixed tumor tissue. Protein expression of uPA and PAI-1 was analyzed in HOPE-fixed tumor samples using tissue microarrays (TMAs) and semiquantitative image analysis. The results of both methods were significantly correlated, and risk assessment showed an overall concordance of 78% (83/107; high- and low-risk) and of 94% (74/79) regarding only high-risk patients. The data demonstrate that optimized IHC in combination with image analysis can provide adequate clinical significance compared to ELISA-derived determination of uPA and PAI-1.

  11. A QUANTITATIVE ANALYSIS METHOD FOR SECURITY RISK AGAINST DPA ATTACK ON PASSWORD CHIP

    Institute of Scientific and Technical Information of China (English)

    Li, Tingshun; Tan, Wen; Cheng, Zhihua; Zhang, Zonghua

    2016-01-01

    Aiming at the problem of how to estimate, in a quantified and precise way, the risk of password (cryptographic) chips being attacked by differential power analysis (DPA), this paper proposes a quantitative analysis method based on applied simulation. The method derives, through a kernel-function mechanism, the probability density of the power consumption of a password chip in normal operation, and introduces the mutual-information entropy between the attack analysis model used during key acquisition and the power consumption. Experimental results indicate that the quantitative analysis method not only accurately computes correlation parameters analogous to the mutual-information entropy, but also effectively improves the capability for all-round analysis of key-chip risks.
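
    A hedged sketch of the two ingredients the abstract names: a Gaussian kernel density estimate of the chip's power-consumption distribution, and the mutual information between a key-dependent model value and the measured power. The traces are simulated here; real DPA uses measured traces:

      import numpy as np

      rng = np.random.default_rng(0)
      model_bit = rng.integers(0, 2, 5000)                   # key-dependent value
      power = 0.5 * model_bit + rng.normal(0.0, 0.4, 5000)   # leaky traces (assumed)

      def kde(samples, x, bw=0.15):
          """Gaussian kernel density estimate evaluated at points x."""
          z = (x[:, None] - samples[None, :]) / bw
          return np.exp(-0.5 * z ** 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

      print(kde(power, np.array([0.0, 0.5, 1.0])))   # density of the power draw

      # Mutual information I(model; power) in bits, from a binned joint estimate
      joint, _, _ = np.histogram2d(model_bit, power, bins=[2, 30])
      joint /= joint.sum()
      p_m = joint.sum(axis=1, keepdims=True)
      p_p = joint.sum(axis=0, keepdims=True)
      nz = joint > 0
      mi = (joint[nz] * np.log2(joint[nz] / (p_m * p_p)[nz])).sum()
      print(f"leakage estimate: {mi:.3f} bits")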

  12. A qualitative and quantitative analysis of risk perception and treatment options as related to wildfires in the USDA FS Region 3 National Forests

    Science.gov (United States)

    Ingrid M. Martin; Wade E. Martin; Carol B. Raish

    2011-01-01

    As the incidence of devastating fires rises, managing the risk posed by these fires has become critical. This report provides important information to examine the ways that different groups or disaster subcultures develop the mentalities or perceived realities that affect their views and responses concerning risk and disaster preparedness. Fire risk beliefs and...

  13. Quantitative basin evaluation, a tool for reducing exploration risks

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    Over the last decade, the evaluation of the petroleum potential of sedimentary basins has shifted from a qualitative to a quantitative approach. This new perspective has led to the elaboration of basin simulators, and a new discipline has emerged: basin modeling, which is nowadays widely used by the oil industry. This methodology is an integration platform including geology, geochemistry and physics. Combined with uncertainty assessments, it is a powerful tool for reducing exploration risks. In this respect, IFP dedicated one of its IFP Sessions to 'Quantitative Basin Evaluation, a Tool for Reducing Exploration Risks'. This session was held on 18 June 2001. The purpose of this workshop is to review the most modern advances used by professionals in the oil industry to improve the rate of hydrocarbon discovery and to lower exploration costs in frontier areas. IFP Sessions are conferences held for decision-makers and technical leaders to promote exchanges among involved parties: oil and gas companies, petroleum services and supply industries, engineering firms and research centers. Program: quantitative basin evaluation: the Berkine basin and its results; application of 2-D and 3-D basin modelling in the Gulf of Mexico; 2-D modelling in Brazilian sedimentary basins: lessons from recent case histories; quantitative modelling in the North Sea: towards a more confident assessment of the critical risks (*); uncertainties evaluation in petroleum system modelling (*); prediction of hydrocarbon type in the deep-water, Gulf of Mexico (*); constraining source and charge risk in deep-water areas - the Gulf of Mexico and offshore Angola (*); an overview of basin modelling and geochemistry activities in PDO; quantitative basin evaluation: how does it affect decision in exploration (*); panel discussion: evolution of the modelling tools. Abstracts are available only for 5 presentations (marked with (*)). (J.S.)

  15. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  16. Quantitative finance and risk management a physicist's approach

    CERN Document Server

    Dash, Jan W

    2004-01-01

    Written by a physicist with over 15 years of experience as a quant on Wall Street, this book treats a wide variety of topics. Presenting the theory and practice of quantitative finance and risk, it delves into the "how to" and "what it's like" aspects not covered in textbooks or research papers. Both standard and new results are presented. A "Technical Index" indicates the mathematical level, from zero to PhD mathematical background, for each section. The finance aspect in each section is self-contained. Real-life comments on "life as a quant" are included. This book is designed for scientists and engineers desiring to learn quantitative finance, and for quantitative analysts and finance graduate students. Parts will be of interest to research academics.

  17. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  18. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the actual basic data of the gas pipelines and the precision requirements of the risk assessment.
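
    The two quantitative outputs named here, individual risk and social risk, reduce to simple aggregations over accident scenarios. A minimal sketch with invented frequencies and consequence figures (not the paper's data):

        # Sketch only: scenario numbers are invented for illustration.
        scenarios = [
            # (name, frequency per year, P(fatality at point), expected fatalities)
            ("jet flame",           1e-5, 0.10,  2),
            ("fireball",            2e-6, 0.50, 12),
            ("UVCE",                5e-7, 0.90, 45),
            ("toxic gas diffusion", 1e-6, 0.30,  8),
        ]

        # Individual risk at a fixed location: sum of frequency x lethality
        individual_risk = sum(f * p for _, f, p, _ in scenarios)
        print(f"individual risk = {individual_risk:.2e} per year")

        # Social risk as an FN curve: cumulative frequency of >= N fatalities
        for n in (1, 10, 50):
            freq = sum(f for _, f, _, deaths in scenarios if deaths >= n)
            print(f"F(N >= {n:2d}) = {freq:.2e} per year")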

  19. New developments in quantitative risk assessment of campylobacteriosis

    DEFF Research Database (Denmark)

    Havelaar, Arie; Nauta, Maarten

    Quantitative microbiological risk assessment (QMRA) is now broadly accepted as an important decision support tool in food safety risk management. It has been used to support decision making at the global level (Codex Alimentarius, FAO and WHO), at the European level (European Food Safety Authority...... go undetected after on-farm monitoring. Processing models indicate that defeathering and evisceration are key processes leading to carcass contamination, but there is still considerable uncertainty of the quantitative effects of these processes. In the kitchen, cross-contamination from contaminated...... meat to ready-to-eat foods is the main pathway of consumer exposure. Undercooking appears to be of minor importance. However, this conclusion may need to be reconsidered in the light of increasing consumption of minced meat preparations. Five QMRA models have been compared in detail, and detailed...

  20. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Full Text Available Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    Science.gov (United States)

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  2. Quantitative image analysis of celiac disease.

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-07

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixular data of endoscopic images that is acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  3. Quantitative image analysis of celiac disease

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixular data of endoscopic images that is acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  4. Private risk in public infrastructure: a quantitative risk analysis as a contract modeling tool

    Directory of Open Access Journals (Sweden)

    Luiz E. T. Brandão

    2007-12-01

    Full Text Available Public private partnerships (PPP) are contractual arrangements in which the government assumes future obligations by providing project guarantees. They are considered a way of increasing government efficiency through a more efficient allocation of risks and incentives. On the other hand, the assessment and determination of the optimal level of these guarantees is usually subjective, exposing the government to potentially high future liabilities. This article proposes a quantitative model for the evaluation of government guarantees in PPP projects under the real options approach, and applies this model to a toll highway concession with a minimum revenue guarantee. It studies the impact of different guarantee levels on the value and the risk of the project, as well as the expected future cash payments to be made by the government in each situation, concluding that the government can determine the optimal guarantee level as a function of the desired degree of risk reduction, and that the design and contractual modeling of PPP projects can benefit from the quantitative tools presented here.
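
    The valuation idea is readily sketched with Monte Carlo simulation: treat the minimum revenue guarantee as a strip of yearly put options on stochastic toll revenue. The model below assumes geometric Brownian motion revenue under a risk-neutral drift and entirely invented parameters; it illustrates the approach, not the authors' calibrated model.

        # Sketch only: Monte Carlo value of a minimum revenue guarantee.
        import numpy as np

        rng = np.random.default_rng(1)
        R0, sigma, r = 100.0, 0.25, 0.08   # initial revenue, volatility, discount rate
        guarantee = 85.0                   # guaranteed minimum annual revenue
        years, n_paths = 20, 100_000

        z = rng.standard_normal((n_paths, years))
        growth = np.exp((r - 0.5 * sigma**2) + sigma * z)   # yearly GBM steps
        revenue = R0 * np.cumprod(growth, axis=1)

        t = np.arange(1, years + 1)
        payoff = np.maximum(guarantee - revenue, 0.0)       # government outlay per year
        value = (payoff * np.exp(-r * t)).sum(axis=1).mean()
        print(f"expected PV of government guarantee payments ~ {value:.1f}")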

  5. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
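
    A minimal sketch of the prioritization logic, with invented scenarios and scores: rank by severity times likelihood, discounted by modeling difficulty, so that tractable high-risk scenarios surface first for quantitative analysis.

        # Sketch only: scenario names and 1-5 scores are illustrative.
        scenarios = [
            # (name, severity, likelihood, modeling difficulty)
            ("wake encounter on final",         5, 3, 2),
            ("blunder during paired approach",  5, 2, 4),
            ("missed-approach conflict",        4, 2, 3),
            ("runway confusion",                3, 3, 5),
        ]

        # High severity*likelihood and low difficulty -> candidates for
        # quantitative modeling; the rest go to procedural controls.
        ranked = sorted(scenarios, key=lambda s: (s[1] * s[2]) / s[3], reverse=True)
        for name, sev, lik, diff in ranked:
            print(f"{name:32s} priority = {(sev * lik) / diff:.1f}")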

  6. Quantitative proteomics analysis using 2D-PAGE to investigate the effects of cigarette smoke and aerosol of a prototypic modified risk tobacco product on the lung proteome in C57BL/6 mice.

    Science.gov (United States)

    Elamin, Ashraf; Titz, Bjoern; Dijon, Sophie; Merg, Celine; Geertz, Marcel; Schneider, Thomas; Martin, Florian; Schlage, Walter K; Frentzel, Stefan; Talamo, Fabio; Phillips, Blaine; Veljkovic, Emilija; Ivanov, Nikolai V; Vanscheeuwijck, Patrick; Peitsch, Manuel C; Hoeng, Julia

    2016-08-11

    Smoking is associated with several serious diseases, such as lung cancer and chronic obstructive pulmonary disease (COPD). Within our systems toxicology framework, we are assessing whether potential modified risk tobacco products (MRTP) can reduce smoking-related health risks compared to conventional cigarettes. In this article, we evaluated to what extent 2D-PAGE/MALDI MS/MS (2D-PAGE) can complement the iTRAQ LC-MS/MS results from a previously reported mouse inhalation study, in which we assessed a prototypic MRTP (pMRTP). Selected differentially expressed proteins identified by both LC-MS/MS and 2D-PAGE approaches were further verified using reverse-phase protein microarrays. LC-MS/MS captured the effects of cigarette smoke (CS) on the lung proteome more comprehensively than 2D-PAGE. However, an integrated analysis of both proteomics data sets showed that 2D-PAGE data complement the LC-MS/MS results by supporting the overall trend of lower effects of pMRTP aerosol than CS on the lung proteome. Biological effects of CS exposure supported by both methods included increases in immune-related, surfactant metabolism, proteasome, and actin cytoskeleton protein clusters. Overall, while 2D-PAGE has its value, especially as a complementary method for the analysis of effects on intact proteins, LC-MS/MS approaches will likely be the method of choice for proteome analysis in systems toxicology investigations. Quantitative proteomics is anticipated to play a growing role within systems toxicology assessment frameworks in the future. To further understand how different proteomics technologies can contribute to toxicity assessment, we conducted a quantitative proteomics analysis using 2D-PAGE and isobaric tag-based LC-MS/MS approaches and compared the results produced by the two approaches. Using a prototypic modified risk tobacco product (pMRTP) as our test item, we show how, in comparisons with cigarette smoke, 2D-PAGE results can complement and support LC-MS/MS data, demonstrating

  7. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods-the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  8. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  9. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model...... provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easy-readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept...... of health impact categorization that has been successfully in force for several years within several emergency planning programs. Four health impact categories are distinguished: No-Health Impact, Low-Health Impact, Moderate-Health Impact and Severe-Health Impact. Two different charts are presented...

  10. Quantitative risk assessment for skin sensitization: Success or failure?

    Science.gov (United States)

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used.
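
    The premise "risk = hazard x exposure" takes a concrete form in skin sensitization QRA: an acceptable exposure level (AEL), derived from a no-expected-sensitization induction level (NESIL) and sensitization assessment factors (SAFs), is compared with the consumer exposure level (CEL). All numbers below are illustrative assumptions, not values from the review.

        # Sketch only: standard sensitization QRA arithmetic, invented inputs.
        nesil = 250.0              # ug/cm2/day from predictive testing (illustrative)
        saf_interindividual = 10.0
        saf_matrix = 3.0           # product matrix effects
        saf_use = 3.0              # frequency/duration of use

        ael = nesil / (saf_interindividual * saf_matrix * saf_use)
        cel = 1.5                  # ug/cm2/day estimated consumer exposure (illustrative)

        print(f"AEL = {ael:.2f} ug/cm2/day, CEL = {cel:.2f} ug/cm2/day")
        print("acceptable" if cel <= ael else "risk management needed")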

  11. Targeted assets risk analysis.

    Science.gov (United States)

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  12. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    Directory of Open Access Journals (Sweden)

    Lauren E. Joly

    2016-01-01

    Full Text Available Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high-risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability.

  13. Simplified seismic risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pellissetti, Manuel; Klapp, Ulrich [AREVA NP GmbH, Erlangen (Germany)

    2011-07-01

    Within the context of probabilistic safety analysis (PSA) for nuclear power plants (NPPs), seismic risk assessment has the purpose of demonstrating that the contribution of seismic events to overall risk is not excessive. The most suitable vehicle for seismic risk assessment is a full scope seismic PSA (SPSA), in which the frequency of core damage due to seismic events is estimated. An alternative method is represented by seismic margin assessment (SMA), which aims at showing sufficient margin between the site-specific safe shutdown earthquake (SSE) and the actual capacity of the plant. Both methods are based on system analysis (fault-trees and event-trees) and hence require fragility estimates for safety relevant systems, structures and components (SSCs). If the seismic conditions at a specific site of a plant are not very demanding, then it is reasonable to expect that the risk due to seismic events is low. In such cases, the cost-benefit ratio for performing a full-scale, site-specific SPSA or SMA will be excessive, considering the ultimate objective of seismic risk analysis. Rather, it will be more rational to rely on a less comprehensive analysis, used as a basis for demonstrating that the risk due to seismic events is not excessive. The present paper addresses such a simplified approach to seismic risk assessment which is used in AREVA to: - estimate seismic risk in early design stages, - identify needs to extend the design basis, - define a reasonable level of seismic risk analysis. Starting from a conservative estimate of the overall plant capacity, in terms of the HCLPF (High Confidence of Low Probability of Failure), and utilizing a generic value for the variability, the seismic risk is estimated by convolution of the hazard and the fragility curve. Critical importance is attached to the selection of the plant capacity in terms of the HCLPF, without performing extensive fragility calculations of seismically relevant SSCs. A suitable basis
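
    A minimal sketch of the convolution step described here, assuming a power-law hazard curve, a lognormal fragility curve, and the common HCLPF-to-median conversion HCLPF = Am * exp(-2.33 * beta); all numbers are invented, not AREVA's.

        # Sketch only: annual failure frequency as hazard-fragility convolution.
        import numpy as np
        from scipy.stats import norm
        from scipy.integrate import trapezoid

        hclpf, beta = 0.4, 0.4                 # plant HCLPF (g), composite variability
        a_m = hclpf * np.exp(2.33 * beta)      # median capacity implied by the HCLPF

        k0, k = 1e-4, 2.5                      # hazard curve H(a) = k0 * a**(-k)
        a = np.linspace(0.05, 3.0, 2000)       # peak ground acceleration grid (g)
        hazard = k0 * a**(-k)                  # annual exceedance frequency
        fragility = norm.cdf(np.log(a / a_m) / beta)

        # risk = integral of fragility(a) * |dH/da| da
        risk = trapezoid(fragility * (-np.gradient(hazard, a)), a)
        print(f"annual seismic failure frequency ~ {risk:.2e}")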

  14. Quantitative and qualitative analysis and interpretation of CT perfusion imaging.

    Science.gov (United States)

    Valdiviezo, Carolina; Ambrose, Marietta; Mehra, Vishal; Lardo, Albert C; Lima, Joao A C; George, Richard T

    2010-12-01

    Coronary artery disease (CAD) remains the leading cause of death in the United States. Rest and stress myocardial perfusion imaging has an important role in the non-invasive risk stratification of patients with CAD. However, diagnostic accuracies have been limited, which has led to the development of several myocardial perfusion imaging techniques. Among them, myocardial computed tomography perfusion imaging (CTP) is especially interesting as it has the unique capability of providing anatomic- as well as coronary stenosis-related functional data when combined with computed tomography angiography (CTA). The primary aim of this article is to review the qualitative, semi-quantitative, and quantitative analysis approaches to CTP imaging. In doing so, we will describe the image data required for each analysis and discuss the advantages and disadvantages of each approach.

  15. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D: Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  16. Quantitative fire risk analysis and fire prevention for flammable liquid storerooms

    Institute of Scientific and Technical Information of China (English)

    王以革; 任常兴; 吕东; 王婕; 郝爱玲; 李晋

    2011-01-01

    Taking a class-A flammable liquid storeroom as an example, the typical accident scenarios of flammable liquid leakage and the probability of fire and explosion accidents are discussed. The worst-case accident scenario is evaluated quantitatively with the TNO multi-energy method, and the injury and damage zones for people and buildings are determined. The influence of accident ventilation on the explosive concentration range of the flammable gas cloud is also analyzed. Finally, safety countermeasures for fire and explosion prevention are put forward.

  17. Campylobacter detection along the food chain--towards improved quantitative risk analysis by live/dead discriminatory culture-independent methods.

    Science.gov (United States)

    Stingl, Kerstin; Buhler, Christiane; Krüger, Nora-Johanna

    2015-01-01

    Death, although absolute in its consequence, is not measurable by an absolute parameter in bacteria. Viability assays address different aspects of life, e.g. the capability to form a colony on an agar plate (CFU), metabolic properties or membrane integrity. For food safety, presence of infectious potential is the relevant criterion for risk assessment, currently only partly reflected by the quantification of CFU. It will be necessary for future improved risk assessment, in particular when fastidious bacterial pathogens are implicated, to enhance the informative value of CFU. This might be feasible by quantification of the number of intact and potentially infectious Campylobacter, impermeable to the DNA intercalating dye propidium monoazide (PMA). The latter are quantifiable by the combination of PMA with real-time PCR, although thorough controls have to be developed for standardization and the circumvention of pitfalls. Under consideration of different physiological states of the food-borne pathogen, we provide an overview of current and future suitable detection/quantification targets along the food chain, including putative limitations of detection.

  18. Semi-quantitative assessment of the physical vulnerability of buildings for the landslide risk analysis. A case study in the Loures municipality, Lisbon district, Portugal

    Science.gov (United States)

    Guillard-Gonçalves, Clémence; Zêzere, José Luis; Pereira, Susana; Garcia, Ricardo

    2016-04-01

    The physical vulnerability of the buildings of Loures (a Portuguese municipality) to landslides was assessed, and the landslide risk was computed as the product of the landslide hazard by the vulnerability and the market economic value of the buildings. First, the hazard was assessed by combining the spatio-temporal probability and the frequency-magnitude relationship of the landslides, which was established by plotting the probability of a landslide area. The susceptibility of deep-seated and shallow landslides was assessed by a bi-variate statistical method and was mapped. The annual and multiannual spatio-temporal probabilities were estimated, providing a landslide hazard model. Then, an assessment of building vulnerability to landslides, based on an inquiry of a pool of European landslide experts, was developed and applied to the study area. The inquiry was based on nine magnitude scenarios and four structural building types. A sub-pool of the landslide experts who know the study area was extracted from the pool, and the variability of the answers coming from the pool and the sub-pool was assessed with standard deviation. Moreover, the average vulnerability of the basic geographic entities was compared by changing the map unit and applying the vulnerability to all the buildings of a test site (included in the study area), the inventory of which was compiled in the field. Next, the market economic value of the buildings was calculated using an adaptation of the Portuguese Tax Services approach. Finally, the annual and multiannual landslide risk was computed for the nine landslide magnitude scenarios and different spatio-temporal probabilities by multiplying the potential loss (Vulnerability × Economic Value) by the hazard probability. As a rule, the vulnerability values given by the sub-pool of experts who know the study area are higher than those given by the European experts, namely for the high magnitude landslides. The obtained vulnerabilities vary from 0
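
    The final multiplication is simple to sketch: annual risk per building as hazard probability times vulnerability times market value, summed over magnitude scenarios. All figures below are invented, not the Loures results.

        # Sketch only: risk = hazard probability x vulnerability x value.
        scenarios = [
            # (annual hazard probability, vulnerability of this building type)
            (1e-3, 0.05),   # small, shallow landslide
            (2e-4, 0.30),   # medium magnitude
            (5e-5, 0.85),   # large, deep-seated landslide
        ]
        building_value = 250_000.0  # illustrative market value (EUR)

        annual_risk = sum(p * v for p, v in scenarios) * building_value
        print(f"expected annual loss ~ {annual_risk:.0f} EUR/year")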

  19. Benefits and risks of fish consumption Part I. A quantitative analysis of the intake of omega-3 fatty acids and chemical contaminants.

    Science.gov (United States)

    Domingo, José L; Bocio, Ana; Falcó, Gemma; Llobet, Juan M

    2007-02-12

    In recent years, and based on the importance of fish as a part of a healthy diet, there has been a notable promotion of fish consumption. However, the balance between health benefits and risks, due to the intake of chemical contaminants, is not well characterized. In the present study, edible samples of 14 marine species were analyzed for the concentrations of omega-3 fatty acids, as well as a number of metals and organic pollutants. Daily intakes were specifically determined for a standard 70-kg adult, and compared with the tolerable/admissible intakes of the pollutants, if available. Salmon, mackerel, and red mullet were the species showing the highest content of omega-3 fatty acids. The daily intakes of cadmium, lead, and mercury through fish consumption were 1.1, 2.0, and 9.9 μg, respectively. Dioxins and furans plus dioxin-like polychlorinated biphenyls (PCBs) intake was 38.0 pg WHO-TEQ/day, whereas those of polybrominated diphenyl ethers (PBDEs), polychlorinated diphenyl ethers (PCDEs), polychlorinated naphthalenes (PCNs) and hexachlorobenzene (HCB) were 20.8, 39.4, 1.53, and 1.50 ng/day, respectively. In turn, the total intake of 16 analyzed polycyclic aromatic hydrocarbons (PAHs) was 268 ng/day. The monthly fish consumption limits for human health endpoints based on the intake of these chemical contaminants were calculated for 70 years of exposure. In general terms, most of the marine species analyzed here should not pose adverse health effects for consumers. However, the type of fish, the frequency of consumption, and the meal size are essential issues for the balance of the health benefits and risks of regular fish consumption.
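
    Monthly consumption limits of this kind typically follow the standard intake arithmetic, limit = reference dose x body weight / concentration. A sketch with the US EPA methylmercury reference dose; the meal size and fish concentration are assumptions, not the paper's measurements.

        # Sketch only: monthly meal limit from a reference dose.
        bw = 70.0             # kg body weight (the paper's standard adult)
        meal_size = 0.227     # kg fish per meal (illustrative 8-oz meal)
        days_per_month = 30.44

        rfd_mehg = 1e-4       # mg/kg-bw/day, US EPA reference dose for methylmercury
        c_mehg = 0.14         # mg/kg concentration in fish (illustrative)

        daily_limit_kg = rfd_mehg * bw / c_mehg            # kg fish per day
        meals_per_month = daily_limit_kg * days_per_month / meal_size
        print(f"allowed consumption ~ {meals_per_month:.1f} meals/month")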

  20. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.

  1. Quantitative laryngeal electromyography: turns and amplitude analysis.

    Science.gov (United States)

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). Retrospective case-control study. We performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We first present interference pattern analysis in the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.

  2. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    Science.gov (United States)

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels characterized by such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels because the existing QRA models for road tunnels are inapplicable to assess the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogenous sections. Individual risk for road tunnel sections as well as the integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. This article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
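
    The Poisson regression step can be sketched with statsmodels: accident counts per section regressed on section characteristics, with traffic exposure as an offset. All data below are invented, not the Singapore case study.

        # Sketch only: per-section accident frequencies via a Poisson GLM.
        import numpy as np
        import statsmodels.api as sm

        # Per tunnel section: [gradient (%), curvature index], accident counts,
        # and vehicle-kilometres of exposure (all illustrative).
        X = np.array([[1.0, 0.2], [2.5, 0.1], [0.5, 0.4], [3.0, 0.3], [1.5, 0.2]])
        accidents = np.array([3, 6, 2, 9, 4])
        exposure = np.array([1.2e6, 1.5e6, 0.8e6, 1.1e6, 1.3e6])  # veh-km

        model = sm.GLM(accidents, sm.add_constant(X),
                       family=sm.families.Poisson(),
                       offset=np.log(exposure))
        fit = model.fit()
        print(fit.params)   # log-rate coefficients
        print(fit.predict(sm.add_constant(X), offset=np.log(exposure)))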

  3. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  4. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise debates the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, and advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.
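
    A minimal sketch of the Bayesian, fact-based HRA the author advocates: a Beta prior on a human error probability (HEP) updated with observed performance counts. The prior and the data are illustrative.

        # Sketch only: Beta-binomial update of a human error probability.
        from scipy.stats import beta

        a0, b0 = 1.0, 99.0          # prior belief: mean HEP = 0.01
        errors, trials = 2, 500     # observed operator performance (illustrative)

        a1, b1 = a0 + errors, b0 + (trials - errors)
        posterior = beta(a1, b1)
        print(f"posterior mean HEP = {posterior.mean():.4f}")
        print(f"90% credible interval = "
              f"({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")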

  5. Is there a place for quantitative risk assessment?

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Eric J [Columbia University Medical Center, New York, NY (United States)

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  6. Quantitative Risk Analysis of En-route Controllers' Cognitive Process

    Institute of Scientific and Technical Information of China (English)

    王洁宁; 李培; 周沅

    2015-01-01

    The air traffic management (ATM) system is the result of complex interactions and joint cognition among human operators, procedures and technical systems. Based on a summary of en-route controllers' daily work, including its message flows and cooperation modes, the logical relations between the activities of the cognitive process were modeled with UML activity diagrams. Taking advantage of the rigor of the BPMN specification, a cognitive model of the confirming/updating-MP task was built. Then, to make the risk analysis quantitative, the BPMN model was extended by combining it with event tree analysis. The results show that this method can locate the safety weaknesses in the process from the computed risk values, providing support for risk identification and analysis in ATM.
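
    The event-tree extension reduces to branch-probability arithmetic. A minimal sketch with invented probabilities for a confirm/update-MP task; the branch structure is hypothetical, not the paper's model.

        # Sketch only: quantifying failure end-states of a small event tree.
        initiating_frequency = 0.5   # task demands per hour (illustrative)
        p_detect = 0.98              # controller notices the aircraft deviation
        p_confirm = 0.95             # correctly confirms/updates the MP
        p_recover = 0.90             # error caught by a downstream cross-check

        # End-states that leave an incorrect MP in the system:
        p_fail = (1 - p_detect) + p_detect * (1 - p_confirm) * (1 - p_recover)
        risk = initiating_frequency * p_fail
        print(f"P(unsafe end-state per demand) = {p_fail:.4f}")
        print(f"unsafe end-state frequency ~ {risk:.4f} per hour")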

  7. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    Science.gov (United States)

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

    There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixtythree subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentration on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations for which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10(6) Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
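
    A minimal sketch of how such log reductions translate into illness risk, using an approximate beta-Poisson dose-response model for Shigella. The alpha and N50 parameters are commonly cited QMRA values, and the transfer fraction is an assumption, so treat everything here as illustrative rather than the paper's model.

        # Sketch only: log reduction -> ingested dose -> P(illness).
        alpha, n50 = 0.265, 1480.0          # beta-Poisson parameters (literature values)
        beta_param = n50 / (2 ** (1 / alpha) - 1)

        def p_ill(dose):
            return 1.0 - (1.0 + dose / beta_param) ** (-alpha)

        hand_load = 1e6     # Shigella on hands before washing
        transfer = 0.1      # assumed fraction transferred to the food
        for name, log_red in [("non-antibacterial", 2.0), ("antibacterial", 3.5)]:
            dose = hand_load * 10 ** (-log_red) * transfer
            print(f"{name:18s}: dose ~ {dose:8.1f}, P(ill) ~ {p_ill(dose):.3f}")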

  8. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  9. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...

  10. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
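
    The discrete-event idea can be sketched in a few lines: storage times at a retail shelf emerge from an ordering rule and random demand, instead of being sampled independently at each chain step. The replenishment policy and demand below are invented.

        # Sketch only: FIFO shelf with an order-up-to policy; storage times
        # are an output of the logistics, not an input distribution.
        import random
        from collections import deque

        random.seed(42)
        shelf = deque()          # FIFO queue of stocking days
        order_up_to = 30         # replenishment policy
        storage_times = []

        for day in range(365):
            shelf.extend([day] * (order_up_to - len(shelf)))    # morning restock
            demand = random.randint(10, 35)
            for _ in range(min(demand, len(shelf))):
                stocked = shelf.popleft()                       # oldest sold first
                storage_times.append(day - stocked)

        mean_t = sum(storage_times) / len(storage_times)
        print(f"mean shelf storage time = {mean_t:.2f} days, "
              f"max = {max(storage_times)} days")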

  11. Nonlinear dynamics and quantitative EEG analysis.

    Science.gov (United States)

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  12. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows a quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
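
    A minimal sketch of the loss-modeling approach described here: Poisson claim frequency, lognormal claim severity, Monte Carlo simulation of the annual loss distribution, and percentiles as risk indexes. Parameters are invented, not estimated from the Lodi claims.

        # Sketch only: compound Poisson-lognormal annual loss simulation.
        import numpy as np

        rng = np.random.default_rng(7)
        lam = 25                  # expected claims per year (illustrative)
        mu, sigma = 9.0, 1.2      # lognormal severity parameters (log-euros)
        n_years = 50_000

        counts = rng.poisson(lam, size=n_years)
        annual_loss = np.array(
            [rng.lognormal(mu, sigma, size=n).sum() for n in counts])

        print(f"expected annual loss = {annual_loss.mean():,.0f}")
        print(f"95th / 99.5th percentile = {np.percentile(annual_loss, 95):,.0f}"
              f" / {np.percentile(annual_loss, 99.5):,.0f}")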

  13. Quantitative analysis of protein turnover in plants.

    Science.gov (United States)

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of ¹³C, ¹⁵N, ²H and ¹⁸O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
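
    In the simplest first-order case, the modelling step reduces to fitting a turnover rate k to the labelled fraction f(t) = 1 - exp(-k*t). A sketch on synthetic time-course data:

        # Sketch only: first-order turnover fit to synthetic labelling data.
        import numpy as np
        from scipy.optimize import curve_fit

        def labelled_fraction(t, k):
            return 1.0 - np.exp(-k * t)

        t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])          # days after label switch
        f_obs = np.array([0.0, 0.18, 0.33, 0.55, 0.80])  # synthetic measurements

        (k_hat,), _ = curve_fit(labelled_fraction, t, f_obs, p0=[0.1])
        half_life = np.log(2) / k_hat
        print(f"k = {k_hat:.3f} per day, protein half-life = {half_life:.1f} days")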

  14. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    Science.gov (United States)

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
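
    A minimal sketch of the distribution-fitting-with-bootstrap step the abstract highlights, run on synthetic survey answers; the gamma model and the sample are assumptions, not the survey's results.

        # Sketch only: fit a candidate distribution and bootstrap its uncertainty.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        storage_days = rng.gamma(shape=2.0, scale=1.5, size=300)  # synthetic answers

        shape, loc, scale = stats.gamma.fit(storage_days, floc=0)
        boot = [stats.gamma.fit(rng.choice(storage_days, storage_days.size), floc=0)
                for _ in range(500)]
        shapes = np.array([b[0] for b in boot])
        print(f"gamma shape = {shape:.2f}, 95% bootstrap CI = "
              f"({np.percentile(shapes, 2.5):.2f}, {np.percentile(shapes, 97.5):.2f})")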

  15. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  16. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  17. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    Science.gov (United States)

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiology risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a level of contamination between 0.03 and 460 CFU g⁻¹ was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10⁻⁴, and the risk assessed per month was 1.61 × 10⁻³. Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiology risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
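
    A second-order Monte Carlo of this kind separates an outer loop over uncertain parameters from an inner loop over variable ones. In the sketch below only the 24.4% prevalence and the 0.03 to 460 CFU/g contamination range come from the abstract; the exponential dose-response parameter and the other distributions are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n_unc, n_var = 200, 5_000        # outer (uncertainty) / inner (variability) samples

        risks = []
        for _ in range(n_unc):
            # Uncertain parameter: log10 reductions achieved at the minimum cooking time.
            log10_red = rng.normal(loc=4.0, scale=0.5)
            # Variable parameters: contamination (CFU/g), serving (g), contamination status.
            conc = 10 ** rng.uniform(np.log10(0.03), np.log10(460), size=n_var)
            serving = rng.triangular(50, 100, 200, size=n_var)
            positive = rng.random(n_var) < 0.244
            dose = positive * conc * serving * 10 ** (-log10_red)
            p_ill = 1 - np.exp(-0.00752 * dose)      # illustrative exponential dose-response
            risks.append(p_ill.mean())

        lo, hi = np.percentile(risks, [2.5, 97.5])
        print(f"mean risk per barbecue: {np.mean(risks):.2e} (95% UI {lo:.2e} - {hi:.2e})")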

  18. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and costs savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the necessary information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information comes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
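
    At its core, the ranking step rests on an FMEA-style risk priority number, the product of probability, severity and detection-difficulty scores. The sketch below shows only that core calculation; the QFMEA model described above additionally folds in cost and fatality factors, and the failure modes and scores here are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            name: str
            probability: int   # 1 (rare) .. 5 (frequent)
            severity: int      # 1 (minor) .. 5 (catastrophic)
            detection: int     # 1 (detected early) .. 5 (hard to detect)

            @property
            def rpn(self) -> int:
                return self.probability * self.severity * self.detection

        modes = [
            FailureMode("CO2 pipeline rupture", 2, 5, 2),
            FailureMode("Wellbore casing leak", 3, 4, 4),
            FailureMode("Caprock fracture migration", 1, 5, 5),
        ]
        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(f"{fm.name:30s} RPN = {fm.rpn}")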

  19. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  20. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative inf

  1. Quantitative color analysis for capillaroscopy image segmentation.

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that any alterations of the capillary pattern, at the periungual skin region, are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted, highlighting the capability of the algorithm to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, presents values of average accuracy higher than 0.8, and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.

  2. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass domestic production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in number of moles of gold per liter) or population (in number of particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  3. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  4. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard a...

  5. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    Science.gov (United States)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Our method may be useful within precision medicine for monitoring high-risk populations.
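
    Step (c), the density ratio after an Otsu split, can be sketched as below, assuming a breast mask from step (a) is already available; scikit-image's threshold_otsu does the intensity split. Which side of the threshold counts as dense depends on the pulse sequence, so the comparison direction here is an assumption.

        import numpy as np
        from skimage.filters import threshold_otsu

        def volumetric_density(volume: np.ndarray, breast_mask: np.ndarray) -> float:
            """Ratio of dense voxels to total breast voxels in a pre-contrast volume."""
            breast_voxels = volume[breast_mask]
            t = threshold_otsu(breast_voxels)   # splits the two intensity classes
            dense = breast_voxels > t           # assumed: dense tissue is the brighter class
            return dense.sum() / breast_mask.sum()

        # Toy volume with a cubic "breast" mask, for demonstration only.
        vol = np.random.default_rng(0).normal(100, 20, size=(64, 64, 64))
        mask = np.zeros_like(vol, dtype=bool)
        mask[16:48, 16:48, 16:48] = True
        print(f"volumetric density: {volumetric_density(vol, mask):.2f}")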

  6. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk from landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  7. Dermal sensitization quantitative risk assessment (QRA) for fragrance ingredients.

    Science.gov (United States)

    Api, Anne Marie; Basketter, David A; Cadby, Peter A; Cano, Marie-France; Ellis, Graham; Gerberick, G Frank; Griem, Peter; McNamee, Pauline M; Ryan, Cindy A; Safford, Robert

    2008-10-01

    Based on chemical, cellular, and molecular understanding of dermal sensitization, an exposure-based quantitative risk assessment (QRA) can be conducted to determine safe use levels of fragrance ingredients in different consumer product types. The key steps are: (1) determination of benchmarks (no expected sensitization induction level (NESIL)); (2) application of sensitization assessment factors (SAF); and (3) consumer exposure (CEL) calculation through product use. Using these parameters, an acceptable exposure level (AEL) can be calculated and compared with the CEL. The ratio of AEL to CEL must be favorable to support safe use of the potential skin sensitizer. This ratio must be calculated for the fragrance ingredient in each product type. Based on the Research Institute for Fragrance Materials, Inc. (RIFM) Expert Panel's recommendation, RIFM and the International Fragrance Association (IFRA) have adopted the dermal sensitization QRA approach described in this review for fragrance ingredients identified as potential dermal sensitizers. This now forms the fragrance industry's core strategy for primary prevention of dermal sensitization to these materials in consumer products. This methodology is used to determine global fragrance industry product management practices (IFRA Standards) for fragrance ingredients that are potential dermal sensitizers. This paper describes the principles of the recommended approach, provides detailed review of all the information used in the dermal sensitization QRA approach for fragrance ingredients and presents key conclusions for its use now and refinement in the future.

  8. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  9. The Curriculum in Quantitative Analysis: Results of a Survey.

    Science.gov (United States)

    Locke, David C.; Grossman, William E. L.

    1987-01-01

    Reports on the results of a survey of college level instructors of quantitative analysis courses. Discusses what topics are taught in such courses, how much weight is given to these topics, and which experiments are used in the laboratory. Poses some basic questions about the curriculum in quantitative analysis. (TW)

  10. Benchmark dose profiles for joint-action continuous data in quantitative risk assessment.

    Science.gov (United States)

    Deutsch, Roland C; Piegorsch, Walter W

    2013-09-01

    Benchmark analysis is a widely used tool in biomedical and environmental risk assessment. Therein, estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a prespecified benchmark response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This paper demonstrates how the benchmark modeling paradigm can be expanded from the single-agent setting to joint-action, two-agent studies. Focus is on continuous response outcomes. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR) is defined for use in quantitative risk characterization and assessment.

  11. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  12. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    Science.gov (United States)

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment.
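
    The reassessment factor named above, the coefficient of variation (CV) of the per-tablet coating mass, is a one-line computation on each simulation output, as in this minimal sketch with synthetic masses:

        import numpy as np

        # Simulated per-tablet coating mass (mg) for two hypothetical parameter settings.
        runs = {
            "low_spray_rate":  np.random.default_rng(0).normal(5.0, 0.45, 10_000),
            "high_spray_rate": np.random.default_rng(1).normal(5.0, 0.20, 10_000),
        }
        for name, masses in runs.items():
            cv = masses.std(ddof=1) / masses.mean()   # coating mass uniformity metric
            print(f"{name}: CV = {cv:.3f}")           # lower CV -> lower risk priority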

  13. International Conference on Risk Analysis

    CERN Document Server

    Oliveira, Teresa; Rigas, Alexandros; Gulati, Sneh

    2015-01-01

    This book covers the latest results in the field of risk analysis. Presented topics include probabilistic models in cancer research, models and methods in longevity, epidemiology of cancer risk, engineering reliability and economical risk problems. The contributions of this volume originate from the 5th International Conference on Risk Analysis (ICRA 5). The conference brought together researchers and practitioners working in the field of risk analysis in order to present new theoretical and computational methods with applications in biology, environmental sciences, public health, economics and finance.

  14. Quantitative genetic analysis of injury liability in infants and toddlers

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  15. A quantitative approach for integrating multiple lines of evidence for the evaluation of environmental health risks

    Directory of Open Access Journals (Sweden)

    Jerome J. Schleier III

    2015-01-01

    Decision analysis often considers multiple lines of evidence during the decision making process. Researchers and government agencies have advocated for quantitative weight-of-evidence approaches in which multiple lines of evidence can be considered when estimating risk. Therefore, we utilized Bayesian Markov Chain Monte Carlo to integrate several human-health risk assessment, biomonitoring, and epidemiology studies that have been conducted for two common insecticides (malathion and permethrin) used for adult mosquito management to generate an overall estimate of risk quotient (RQ). The utility of the Bayesian inference for risk management is that the estimated risk represents a probability distribution from which the probability of exceeding a threshold can be estimated. The mean RQs after all studies were incorporated were 0.4386, with a variance of 0.0163 for malathion and 0.3281 with a variance of 0.0083 for permethrin. After taking into account all of the evidence available on the risks of ULV insecticides, the probability that malathion or permethrin would exceed a level of concern was less than 0.0001. Bayesian estimates can substantially improve decisions by allowing decision makers to estimate the probability that a risk will exceed a level of concern by considering seemingly disparate lines of evidence.
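
    The study used Bayesian Markov chain Monte Carlo; a simpler conjugate normal combination conveys the same idea of pooling disparate lines of evidence into one posterior risk quotient (RQ) and reading off an exceedance probability. All study means, standard deviations and the prior below are invented for illustration.

        from math import erfc, sqrt

        # Hypothetical per-study RQ estimates (mean, sd): risk assessment,
        # biomonitoring, epidemiology.
        studies = [(0.52, 0.15), (0.41, 0.10), (0.38, 0.20)]
        prior_mean, prior_sd = 0.5, 0.5            # weakly informative prior (assumed)

        # Conjugate normal updating: precisions add, means combine precision-weighted.
        precisions = [1 / prior_sd**2] + [1 / sd**2 for _, sd in studies]
        weighted = [prior_mean / prior_sd**2] + [m / sd**2 for m, sd in studies]
        post_var = 1.0 / sum(precisions)
        post_mean = sum(weighted) * post_var

        # Probability that the pooled RQ exceeds the level of concern (RQ = 1).
        z = (1.0 - post_mean) / sqrt(post_var)
        p_exceed = 0.5 * erfc(z / sqrt(2))
        print(f"RQ ~ N({post_mean:.3f}, sd={sqrt(post_var):.3f}); P(RQ > 1) = {p_exceed:.1e}")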

  16. Some Epistemological Considerations Concerning Quantitative Analysis

    Science.gov (United States)

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  17. Quantitative analysis of γ-H2AX and p53 nuclear expression levels in ovarian and fallopian tube epithelium from risk-reducing salpingo-oophorectomies in BRCA1 and BRCA2 mutation carriers.

    Science.gov (United States)

    Staff, Synnöve; Tolonen, Teemu; Laasanen, Satu-Leena; Mecklin, Jukka-Pekka; Isola, Jorma; Mäenpää, Johanna

    2014-05-01

    Mutations in BRCA1 and BRCA2 genes confer an increased lifetime risk for breast and ovarian cancer. The increased lifetime ovarian cancer risk among BRCA1/BRCA2 mutation carriers can be substantially decreased by risk-reducing salpingo-oophorectomy (RRSO), which also provides material for molecular research on the early pathogenesis of serous ovarian cancer. RRSO studies have suggested the fallopian tube as a primary site of serous high-grade ovarian cancer. In this study, the nuclear expression levels of γ-H2AX and p53 were quantitatively assessed by immunohistochemistry (IHC) in ovarian and fallopian tube epithelium derived from RRSOs in 29 BRCA1 and BRCA2 mutation carriers and in 1 patient with a strong family history of breast and ovarian cancer but an unknown BRCA status. Both p53 and γ-H2AX nuclear staining levels were significantly higher in BRCA1/2 mutation-positive fallopian tube epithelium than in control fallopian tube epithelium, whereas no corresponding difference was found between mutation-positive ovarian epithelium and controls. Both γ-H2AX and p53 showed significantly higher nuclear expression levels in BRCA1/2 mutation-positive fallopian tube epithelium than in BRCA1/2 mutation-positive ovarian epithelium. The mutation-positive fallopian tube epithelium showed a positive correlation between the γ-H2AX and p53 nuclear expression levels (Pearson r=0.508, P=0.003). Our results on quantitative nuclear p53 and γ-H2AX expression levels in ovarian and fallopian tube epithelium derived from RRSO in high-risk patients support the previously suggested role of fallopian tube epithelium as a possible site of initial serous ovarian carcinogenesis.

  18. Campylobacter Risk Analysis

    DEFF Research Database (Denmark)

    Nauta, Maarten

    by quantitative data. The easy way out of this problem is to conclude that the models are inaccurate. However, this may be too simple. The models are based on the best knowledge and insights of a diverse set of scientists and poultry processing experts, combined with some logical argumentation. This implies...

  19. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    Science.gov (United States)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of impacts of natural disasters is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires the analysis of all natural and social phenomena, the involvement of all available data, the construction of advanced analytical tools, and a transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Within the framework of risk analysis models, the risk perception issues were described. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y)·p(x,y). As was shown, risk perception essentially influences the damage function. Based on prospect theory and decision making under uncertainty, on cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), which is the system of a risk perception function (rp) and a function of education and long-term experience (c), as: α(t) → (c - rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
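
    The superposition formula and the awareness-modified damage function can be made concrete as below; attenuating damage by a factor (1 - alpha) is one illustrative reading of p(x,y|α(t)), not the authors' exact specification.

        import numpy as np

        def risk(f: np.ndarray, p: np.ndarray, awareness: float, delta: float = 1.0) -> float:
            """P = delta * sum_{x,y} f(x,y) * p(x,y | alpha), damage scaled by awareness."""
            return delta * np.sum(f * p * (1.0 - awareness))

        rng = np.random.default_rng(5)
        f = rng.dirichlet(np.ones(100)).reshape(10, 10)   # event distribution over a region
        p = rng.uniform(0, 1e6, size=(10, 10))            # damage per event (e.g. losses)

        for alpha in (0.0, 0.3, 0.6):   # education and experience raise awareness
            print(f"alpha = {alpha:.1f} -> risk = {risk(f, p, alpha):,.0f}")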

  20. Structural and quantitative analysis of Equisetum alkaloids.

    Science.gov (United States)

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. A HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by the HPLC-ESI-MS/MS method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and the alkaloid palustridiene (3) always represented the main alkaloids.

  1. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  2. Quantitative Land Reserve Risk Assessment Based on Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    朱家明; 程倩倩; 朱海龙

    2015-01-01

    Aiming at the risk assessment of land reserve programs, the authors used Excel to screen and process the data related to the given land reserve programs, and then employed principal component analysis, fuzzy C-means clustering analysis and normality testing to establish a risk evaluation model for land reserve programs. Finally, Matlab programming was used to obtain the total risk, mainly by establishing risk evaluation indexes to measure the risk of land reserve programs and defining a corresponding risk evaluation standard, yielding an effective approach for land reserve risk assessment.

  3. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  4. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  5. Quantitative and qualitative analysis of sterols/sterolins and ...

    African Journals Online (AJOL)

    2008-06-03

    Quantitative and qualitative analysis of sterols/sterolins ... method was developed to identify and quantify sterols (especially β-sitosterol) in chloroform extracts of ... Studies with phytosterols, especially β-sitosterol, have.

  6. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    Science.gov (United States)

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) were in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs (the magnitude of which may vary across cultures) constitutes a mechanism by which background genetic liability

  7. Benchmark dose profiles for joint-action quantal data in quantitative risk assessment.

    Science.gov (United States)

    Deutsch, Roland C; Piegorsch, Walter W

    2012-12-01

    Benchmark analysis is a widely used tool in public health risk analysis. Therein, estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a prespecified Benchmark Response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This article demonstrates how the benchmark modeling paradigm can be expanded from the single-dose setting to joint-action, two-agent studies. Focus is on response outcomes expressed as proportions. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (BMP), a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR, is defined for use in quantitative risk characterization and assessment. The resulting joint low-dose guidelines can improve public health planning and risk regulation when dealing with low-level exposures to combinations of hazardous agents.
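
    For a concrete joint-action quantal model the benchmark profile can be traced analytically. The sketch below assumes a logistic model linear in both doses and the extra-risk definition of the BMR; the coefficients are invented for illustration.

        import numpy as np

        b0, b1, b2 = -3.0, 0.8, 0.5          # assumed logistic coefficients
        bmr = 0.10                           # benchmark response (extra risk)

        p0 = 1 / (1 + np.exp(-b0))           # background response at zero dose
        target = p0 + bmr * (1 - p0)         # extra risk: (p - p0) / (1 - p0) = BMR
        eta = np.log(target / (1 - target))  # linear predictor achieving the target

        # The benchmark profile: dose pairs (d1, d2) with b0 + b1*d1 + b2*d2 = eta.
        d1 = np.linspace(0, (eta - b0) / b1, 25)
        d2 = (eta - b0 - b1 * d1) / b2
        for a, b in zip(d1[::6], d2[::6]):
            print(f"BMD pair: d1 = {a:.2f}, d2 = {b:.2f}")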

  8. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  9. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided by the environment in which they are embedded, allowing verification procedures to quantify judgements, on how suitable a model is for a given specification, hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  10. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety.

    Science.gov (United States)

    van Gerwen, S J; te Giffel, M C; van't Riet, K; Beumer, R R; Zwietering, M H

    2000-06-01

    This paper describes a system for the microbiological quantitative risk assessment of food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important problems. First, risks are assessed broadly, using order of magnitude estimates. Characteristic numbers are used to quantitatively characterize microbial behaviour during the production process. These numbers help to highlight the major risk-determining phenomena, and to find negligible aspects. Second, the risk-determining phenomena are studied in more detail. Both general and/or specific models can be used for this, and varying situations can be simulated to quantitatively describe the risk-determining phenomena. Third, even more detailed studies can be performed where necessary, for instance by using stochastic variables. The system for quantitative risk assessment has been implemented as a decision-supporting expert system called SIEFE: Stepwise and Interactive Evaluation of Food safety by an Expert System. SIEFE performs bacterial risk assessments in a structured manner, using various information sources. Because all steps are transparent, every step can easily be scrutinized. In the current study the effectiveness of SIEFE is shown for a cheese spread. With this product, quantitative data concerning the major risk-determining factors were not completely available to carry out a fully detailed assessment. However, this did not necessarily hamper adequate risk estimation. Using ranges of values instead helped to identify the quantitatively most important parameters and the magnitude of their impact. This example shows that SIEFE provides quantitative insights into production processes and their risk-determining factors to both risk assessors and decision makers, and highlights critical gaps in knowledge.
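
    The characteristic numbers used in the first, order-of-magnitude step can be as simple as log10 changes per process stage, derived from doubling times for growth and D-values for inactivation. A minimal sketch with invented process parameters flags the risk-determining stages as those with the largest absolute log10 change.

        import numpy as np

        def growth_log10(hours: float, doubling_time_h: float) -> float:
            return np.log10(2) * hours / doubling_time_h

        def inactivation_log10(minutes: float, d_value_min: float) -> float:
            return -minutes / d_value_min   # one D-value = one decimal reduction

        steps = [
            ("storage 48 h, Td = 6 h", growth_log10(48, 6)),
            ("heating 0.5 min, D = 0.2 min", inactivation_log10(0.5, 0.2)),
        ]
        for name, delta in steps:
            print(f"{name:32s} {delta:+.1f} log10")
        print(f"net change: {sum(d for _, d in steps):+.1f} log10")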

  11. Towards a quantitative OCT image analysis.

    Directory of Open Access Journals (Sweden)

    Marina Garcia Garrido

    Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions. Our results highlight the

  12. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  13. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.

  14. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    Science.gov (United States)

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
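
    The contrast between the two choices is easy to reproduce: under a drastic inactivation-then-growth scenario, a continuous concentration keeps fractional survivors that later regrow, while integer numbers yield many zero-survivor units. The dose-response parameter and distributions below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(11)
        n, r = 200_000, 1e-3                      # units simulated; dose-response parameter

        # Scenario: -4 log10 inactivation, then +4 log10 growth, in 25 g units.
        c0 = 10 ** rng.normal(0.0, 0.5, n)        # initial concentration (CFU/g)
        mean_n0 = c0 * 25                         # expected initial number per unit

        # (1) Concentration-based: the log changes cancel on the continuous scale.
        risk_conc = (1 - np.exp(-r * mean_n0 * 10 ** (-4) * 10 ** 4)).mean()

        # (2) Number-based: integer survivors, then growth of the survivors only.
        n0 = rng.poisson(mean_n0)
        survivors = rng.binomial(n0, 10.0 ** (-4))
        risk_num = (1 - np.exp(-r * survivors * 10 ** 4)).mean()

        print(f"concentration-based: {risk_conc:.2e}, number-based: {risk_num:.2e}")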

  15. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety

    NARCIS (Netherlands)

    Gerwen, van S.J.C.; Giffel, te M.C.; Riet, van 't K.; Beumer, R.R.; Zwietering, M.H.

    2000-01-01

    This paper describes a system for the microbiological quantitative risk assessment for food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important problems. First, risks are assessed broadly

  16. Quantitative Analysis of the Operating Risks of Private Banks: A Study Based on AHP Analysis

    Institute of Scientific and Technical Information of China (English)

    吴少新; 李敏芳

    2015-01-01

    The main sources of the operating risk of private banks are the risk of losing residents' trust, the risk of related-party transactions and the risk of insufficient deposits. Six suggestions are made to prevent the operating risk of private banks. First, private banks can innovate the financial allocation system to improve residents' trust; second, they could establish regulatory technology for related-party transactions to prevent the corresponding risk; third, a deposit insurance system should be built as soon as possible to attract deposits; fourth, limiting the number of shareholders' credit loans avoids the risk of internal control; fifth, serving county-level and remote economies improves competitiveness in the industry; finally, a good credit environment needs to be built to prevent the risk of default.

  17. Workshop One : Risk Analysis

    NARCIS (Netherlands)

    Carlson, T.J.; Jong, C.A.F. de; Dekeling, R.P.A.

    2012-01-01

    The workshop looked at the assessment of risk to aquatic animals exposed to anthropogenic sound. The discussion focused on marine mammals given the worldwide attention being paid to them at the present time, particularly in relation to oil and gas exploration, ocean power, and increases in ship

  18. CONSIDERATIONS ON ENTITY'S RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    MIRELA MONEA

    2014-12-01

    In the present paper, because of the complexity of this topic, the purpose is to discuss the main aspects involved in risk analysis, starting with a few conceptual approaches to risk, and to outline contributions on methods to assess different risk categories, especially methods for bankruptcy risk prediction (entity insolvency), from the economic literature. The methods used to estimate bankruptcy risk are based on score functions, which help to find out whether an entity is confronted with financial difficulties. Score functions are a diagnosis method elaborated on the basis of discriminant analysis, allowing the assessment and prediction of the bankruptcy risk of an entity using a set of relevant financial ratios.

  19. A quantitative microbiological risk assessment for Campylobacter in petting zoos.

    Science.gov (United States)

    Evers, Eric G; Berk, Petra A; Horneman, Mijke L; van Leusden, Frans M; de Jonge, Rob

    2014-09-01

    The significance of petting zoos for transmission of Campylobacter to humans and the effect of interventions were estimated. A stochastic QMRA model simulating a child or adult visiting a Dutch petting zoo was built. The model describes the transmission of Campylobacter in animal feces from the various animal species, fences, and the playground to ingestion by visitors through touching these so-called carriers and subsequently touching their lips. Extensive field and laboratory research was done to fulfill data needs. Fecal contamination on all carriers was measured by swabbing in 10 petting zoos, using Escherichia coli as an indicator. Carrier-hand and hand-lip touching frequencies were estimated by, in total, 13 days of observations of visitors by two observers at two petting zoos. The transmission from carrier to hand and from hand to lip by touching was measured using preapplied cow feces to which E. coli WG5 was added as an indicator. Via a Beta-Poisson dose-response function, the number of Campylobacter cases for the whole of the Netherlands (16 million population) in a year was estimated at 187 and 52 for children and adults, respectively, so 239 in total. This is significantly lower than previous QMRA results on chicken fillet and drinking water consumption. Scenarios of 90% reduction of the contamination (meant to mimic cleaning) of all fences and just goat fences reduces the number of cases by 82% and 75%, respectively. The model can easily be adapted for other fecally transmitted pathogens. © 2014 Society for Risk Analysis.
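
    The dose-response step can be sketched with the approximate Beta-Poisson form commonly used for Campylobacter in QMRA; the parameter values and the toy dose distribution below are assumptions, not the study's fitted inputs.

        import numpy as np

        def beta_poisson(dose, alpha=0.145, beta=7.59):
            """Approximate Beta-Poisson dose-response (parameters assumed)."""
            return 1 - (1 + dose / beta) ** (-alpha)

        rng = np.random.default_rng(9)
        dose = rng.lognormal(mean=-2.0, sigma=1.5, size=100_000)  # CFU ingested per visit
        print(f"mean infection probability per visit: {beta_poisson(dose).mean():.2e}")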

  20. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.

  1. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    Science.gov (United States)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
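
    The final steps, building a risk curve and annualizing risk, can be sketched with a synthetic loss distribution and an assumed annual event frequency; producing minimum, average and maximum curves amounts to repeating the calculation with the corresponding input bounds.

        import numpy as np

        rng = np.random.default_rng(21)
        losses = np.sort(rng.lognormal(mean=11.0, sigma=1.0, size=10_000))  # EUR per event
        annual_freq = 0.8                                   # assumed events per year

        # Risk curve: annual exceedance probability versus loss level.
        levels = np.quantile(losses, np.linspace(0.01, 0.999, 50))
        p_event = 1.0 - np.searchsorted(losses, levels) / losses.size
        aep = 1.0 - np.exp(-annual_freq * p_event)          # Poisson event arrivals

        print(f"expected annual loss: {annual_freq * losses.mean():,.0f} EUR")
        print(f"loss exceeded with 1% annual probability: "
              f"{levels[np.argmin(np.abs(aep - 0.01))]:,.0f} EUR")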

  2. Stability and reproducibility of semi-quantitative risk assessment methods within the occupational health and safety scope.

    Science.gov (United States)

    Carvalho, F; Melo, R B

    2015-01-01

    In many enterprises the semi-quantitative approach turns out to be the most readily available and suitable technique to perform a risk assessment. Despite its advantages, we cannot disregard the existing gap in terms of validation of this type of application. This paper reports a study of risk assessment reliability, namely both inter-coder (reproducibility) and intra-coder (stability) reliability of the semi-quantitative approach. The study comprised four fundamental stages. Data collection relied on free and systematized observations and made use of video recording, documental research, analysis grids and questionnaires specifically developed for this purpose. A set of analysts was asked to use four semi-quantitative risk assessment methods (at two different moments) to estimate and assess six risks identified in two tasks performed to produce airbags. Krippendorff's alpha coefficient (αK) was the agreement measure selected to evaluate both inter-coder and intra-coder consensus. The preliminary results revealed generally low concordance (αK), particularly between risk assessment results obtained by individuals with different levels of experience or expertise. This study revealed that the use of the semi-quantitative approach should be done with caution.
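
    As a rough illustration of the agreement measure used in the study, the sketch below computes Krippendorff's alpha for nominal data from a coders-by-units rating matrix; the rating data are made up, and the implementation follows the standard coincidence-matrix formulation rather than any code from the paper.

```python
import numpy as np

def krippendorff_alpha_nominal(ratings):
    """Krippendorff's alpha for nominal data; ratings is coders x units, NaN = missing."""
    r = np.asarray(ratings, dtype=float)
    cats = np.unique(r[~np.isnan(r)])
    idx = {c: i for i, c in enumerate(cats)}
    o = np.zeros((len(cats), len(cats)))           # coincidence matrix
    for unit in r.T:
        vals = unit[~np.isnan(unit)]
        m = len(vals)
        if m < 2:
            continue                               # units rated once carry no pairs
        for i, a in enumerate(vals):
            for j, b in enumerate(vals):
                if i != j:
                    o[idx[a], idx[b]] += 1.0 / (m - 1)
    n_c = o.sum(axis=1)
    n = n_c.sum()
    d_o = (o.sum() - np.trace(o)) / n                    # observed disagreement
    d_e = (n * n - (n_c ** 2).sum()) / (n * (n - 1))     # expected disagreement
    return 1.0 - d_o / d_e

# Two analysts scoring six risks on a 1-4 semi-quantitative scale (made-up data).
print(krippendorff_alpha_nominal([[1, 2, 3, 3, 2, 4],
                                  [1, 2, 3, 4, 2, 4]]))
```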

  3. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection...... and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring...... of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
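
    The following sketch illustrates the general idea of replacing categorical occurrence and detection scores with estimated probabilities; the failure modes and all numbers are hypothetical, and the independence assumption in the last step is a simplification not taken from the paper.

```python
import math

# Hypothetical failure modes of an analytical procedure (all values illustrative):
# name: (occurrence frequency per run, probability of detection, severity 1-10)
failure_modes = {
    "wrong dilution":  (0.020, 0.90, 7),
    "detector drift":  (0.050, 0.60, 5),
    "sample mix-up":   (0.005, 0.95, 9),
}

# Probabilistic replacement for the categorical O x D part of the RPN:
for name, (freq, p_det, sev) in failure_modes.items():
    p_undetected = freq * (1.0 - p_det)
    print(f"{name:15s}  P(undetected) = {p_undetected:.4f}  severity = {sev}")

# Chance that at least one failure mode goes undetected in a run,
# assuming the failure modes occur independently.
p_any = 1.0 - math.prod(1.0 - f * (1.0 - d) for f, d, _ in failure_modes.values())
print(f"P(any undetected failure per run) = {p_any:.4f}")
```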

  4. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  5. A quantitative framework for estimating risk of collision between marine mammals and boats

    Science.gov (United States)

    Martin, Julien; Sabatier, Quentin; Gowan, Timothy A.; Giraud, Christophe; Gurarie, Eliezer; Calleson, Scott; Ortega-Ortiz, Joel G.; Deutsch, Charles J.; Rycyk, Athena; Koslovsky, Stacie M.

    2016-01-01

    Speed regulations of watercraft in protected areas are designed to reduce lethal collisions with wildlife but can have economic consequences. We present a quantitative framework for investigating the risk of deadly collisions between boats and wildlife.

  6. Quantitative and Public Perception of Landslide Risk in Badulla, Sri Lanka

    Science.gov (United States)

    Gunasekera, R.; Bandara, R. M. S.; Mallawatantri, A.; Saito, K.

    2009-04-01

    Landslides are often triggered by intense precipitation and are exacerbated by increased urbanisation and human activity. There is a significant risk of large-scale landslides in Sri Lanka, and when they do occur they have the potential to cause devastation to property, lives and livelihoods. There are several high landslide-risk areas in seven districts (Nuwara Eliya, Badulla, Ratnapura, Kegalle, Kandy, Matale and Kalutara) in Sri Lanka. These are also some of the poorest areas in the country, and consequently the recovery process after catastrophic landslides becomes more problematic. Landslide risk management is therefore an important concern in poverty reduction strategies. We focused on the district of Badulla, Sri Lanka to evaluate (a) the quantitative scientific analysis of landslide risk and (b) the qualitative public perception of landslides in the area. Combining high-resolution hazard and susceptibility data, we quantified the risk of landslides in the area. We also evaluated the public perception of landslides in the area using participatory GIS techniques. The evaluation of public perception of landslide risk has been complemented by use of LandScan data. The methodology for the LandScan data is based on second-order administrative population data from the census; each 30 arc-second cell within the administrative units receives a probability coefficient based on slope, proximity to roads and land cover. Providing the information from these complementary methods to regional planners helps to strengthen disaster risk reduction options and to improve sustainable land-use practices through enhanced public participation in decision making and governance processes.

  7. Application of Bayesian networks in quantitative risk assessment of subsea blowout preventer operations.

    Science.gov (United States)

    Cai, Baoping; Liu, Yonghong; Liu, Zengkai; Tian, Xiaojie; Zhang, Yanzhen; Ji, Renjie

    2013-07-01

    This article proposes a methodology for the application of Bayesian networks in conducting quantitative risk assessment of operations in offshore oil and gas industry. The method involves translating a flow chart of operations into the Bayesian network directly. The proposed methodology consists of five steps. First, the flow chart is translated into a Bayesian network. Second, the influencing factors of the network nodes are classified. Third, the Bayesian network for each factor is established. Fourth, the entire Bayesian network model is established. Lastly, the Bayesian network model is analyzed. Subsequently, five categories of influencing factors, namely, human, hardware, software, mechanical, and hydraulic, are modeled and then added to the main Bayesian network. The methodology is demonstrated through the evaluation of a case study that shows the probability of failure on demand in closing subsea ram blowout preventer operations. The results show that mechanical and hydraulic factors have the most important effects on operation safety. Software and hardware factors have almost no influence, whereas human factors are in between. The results of the sensitivity analysis agree with the findings of the quantitative analysis. The three-axiom-based analysis partially validates the correctness and rationality of the proposed Bayesian network model.
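
    The record does not include the network itself, but the flavor of the factor-level combination can be sketched with a simple noisy-OR model, in which the operation fails on demand if any factor causes failure; the factor probabilities below are assumed for illustration only.

```python
# Illustrative probabilities of each factor class causing failure on demand.
p_factor = {
    "human":      1.0e-3,
    "hardware":   1.0e-5,
    "software":   1.0e-5,
    "mechanical": 5.0e-3,
    "hydraulic":  4.0e-3,
}

# Noisy-OR combination: the operation fails if any factor causes failure.
p_fail = 1.0
for p in p_factor.values():
    p_fail *= (1.0 - p)
p_fail = 1.0 - p_fail
print(f"P(failure on demand) = {p_fail:.3e}")

# Crude sensitivity check: P(fail) with each factor removed in turn.
for name, p in p_factor.items():
    p_without = 1.0 - (1.0 - p_fail) / (1.0 - p)
    print(f"without {name:10s}: {p_without:.3e}")
```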

  8. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  9. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  10. Vulnerability of Russian regions to natural risk: experience of quantitative assessment

    Science.gov (United States)

    Petrova, E.

    2006-01-01

    One of the important tracks leading to natural risk prevention, disaster mitigation or the reduction of losses due to natural hazards is the vulnerability assessment of an "at-risk" region. The majority of researchers propose to assess vulnerability according to an expert evaluation of several qualitative characteristics, scoring each of them usually using three ratings: low, average, and high. Unlike these investigations, we attempted a quantitative vulnerability assessment using multidimensional statistical methods. Cluster analysis for all 89 Russian regions revealed five different types of region, each characterized by a single (rarely two) prevailing factor that increases vulnerability. These factors are: the sensitivity of the technosphere to unfavorable influences; a "human factor"; a high volume of stored toxic waste, which increases the possibility of natural disasters with serious consequences; a low per capita GRP, which determines reduced prevention and protection costs; and a heightened exposure of regions to natural disasters that can be complicated by unfavorable social processes. The proposed methods permitted us to find differences in the prevailing risk (vulnerability) factor for the region types, which helps to show where risk management should focus.

  12. Quantitative analysis of cascade impactor samples - revisited

    Science.gov (United States)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were collected continuously using a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique, and concentrations were corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized; a broad-beam PIXE approach is certainly not adequate in these cases.

  13. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ = 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.

  14. Quantitative MRI analysis of dynamic enhancement of focal liver lesions

    Directory of Open Access Journals (Sweden)

    S. S. Bagnenko

    2012-01-01

    Full Text Available In our study 45 patients with different focal liver lesions (110 nodules) were examined using a high-field MR system (1.5 T). In this investigation, quantitative MRI analysis of the dynamic enhancement of various hepatic lesions and parenchymatous abdominal organs was performed. It was shown that quantitative evaluation of enhanced MRI improves the understanding of vascular transformation processes in pathologic hepatic foci and in the liver itself, which is important for the differential diagnosis of these diseases.

  15. Analysis and management of risks experienced in tunnel construction

    Directory of Open Access Journals (Sweden)

    Cagatay Pamukcu

    2015-12-01

    Full Text Available In this study, the terms "risk", "risk analysis", "risk assessment" and "risk management" are first defined to avoid any confusion, and the significance of risk analysis and management in engineering projects is emphasized. Both qualitative and quantitative risk analysis techniques are then reviewed and, within the scope of the study, the Event Tree Analysis method was selected to analyze the risks of TBM (Tunnel Boring Machine) operations in tunnel construction. After all hazards that might be encountered during tunnel construction by the TBM method had been identified, they were subjected to a Preliminary Hazard Analysis to sort out and prioritize the risks with high scores. When the risk scores were considered, the hazards with high risk scores could be classified into four groups: excavation- and support-induced accidents, accidents stemming from geologic conditions, auxiliary works, and the project contract. For these four groups of initiating events, Event Tree Analysis was conducted, taking into account four distinct countermeasures. Finally, the quantitative and qualitative consequences of the Event Tree Analyses for all initiating events were examined and interpreted together, by making comparisons and referring to previous studies.
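
    As a minimal numerical sketch of the event tree logic (not the study's actual trees or values), the following assigns an assumed initiating-event frequency and two barrier probabilities and propagates them to outcome frequencies.

```python
# Minimal event tree: one initiating event, two safety barriers (assumed values).
f_initiating = 0.8   # initiating events per year (e.g., face instability)
p_barrier1 = 0.90    # P(support system holds)
p_barrier2 = 0.70    # P(emergency response succeeds, given support fails)

outcomes = {
    "controlled (support holds)":    f_initiating * p_barrier1,
    "mitigated (response succeeds)": f_initiating * (1 - p_barrier1) * p_barrier2,
    "accident (both barriers fail)": f_initiating * (1 - p_barrier1) * (1 - p_barrier2),
}
for name, freq in outcomes.items():
    print(f"{name:32s} {freq:.3f} / year")
```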

  16. A comprehensive quantitative assessment of bird extinction risk in Brazil.

    Directory of Open Access Journals (Sweden)

    Nathália Machado

    Full Text Available In an effort to avoid species loss, scientists have focused their efforts on the mechanisms making some species more prone to extinction than others. However, species show different responses to threats given their evolutionary history, behavior, and intrinsic biological features. We used bird biological features and external threats (1) to understand the multiple pathways driving Brazilian bird species to extinction, (2) to investigate if and how extinction risk is geographically structured, and (3) to quantify how much diversity is currently represented inside protected areas. We modeled the extinction risk of 1557 birds using classification trees and evaluated the relative contribution of each biological feature and external threat in predicting extinction risk. We also quantified the proportion of species and of their geographic ranges currently protected by the network of Brazilian protected areas. The optimal classification tree showed different pathways to bird extinction. Habitat conversion was the most important predictor driving extinction risk, though other variables, such as geographic range size, type of habitat, hunting or trapping and trophic guild, were also relevant in our models. Species under higher extinction risk were concentrated mainly in the Cerrado Biodiversity Hotspot and were poorly represented inside protected areas, in both richness and range. Predictive models could assist conservation actions, and this study contributes by highlighting the importance of natural history and ecology in these actions.

  17. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    Science.gov (United States)

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g) and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed: (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative and semi-quantitative analyses with nondetects of Listeria monocytogenes in smoked fish samples. The first case study is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way to meta-analyze the many relevant yet diverse data sets available in the literature and in (inter)national surveillance or baseline-survey reports, and therefore increases the information input of a risk assessment and, in consequence, the correctness of its outcome.
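
    A minimal sketch of the MLE step for one of the simplest cases, a normal distribution of log10 counts fitted to quantitative results plus left-censored nondetects, could look as follows; the data, the limit of quantification and the distributional choice are illustrative assumptions, not the paper's framework.

```python
import numpy as np
from scipy import optimize, stats

# Illustrative data on log10 CFU/g: exact counts plus nondetects (< LOQ).
quantified = np.array([1.2, 0.8, 1.9, 2.3, 1.1, 0.5])   # log10 CFU/g
loq = 0.0                                                # log10 of 1 CFU/g
n_nondetect = 14                                         # samples below the LOQ

def neg_log_lik(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    ll = stats.norm.logpdf(quantified, mu, sigma).sum()  # exact observations
    ll += n_nondetect * stats.norm.logcdf(loq, mu, sigma)  # censored terms
    return -ll

res = optimize.minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f} (log10 CFU/g)")
```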

  18. Financial indicators for municipalities: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    Srečko Devjak

    2009-12-01

    Full Text Available From the characterization of local authority financing models and structures in Portugal and Slovenia, a set of financial and generic budget indicators has been established. These indicators may be used in a comparative analysis considering the Bragança District in Portugal and municipalities of similar population size in Slovenia. The research identified significant differences in terms of financing sources, due to discrepancies between the financial models and the competences of municipalities in each country. The results show that Portuguese and Slovenian municipalities had similar ranking behaviour for the economy indicator in 2003, but changed this behaviour in 2004.

  19. QUANTITATIVE ANALYSIS OF DRAWING TUBES MICROSTRUCTURE

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2009-05-01

    Full Text Available The final properties of formed pieces are affected by production conditions, primarily those of mechanical working. Application of stereology methods to the statistical reconstruction of the three-dimensional structure of material plastically deformed by bulk forming allows a detailed analysis of changes in the material structure. The microstructure of cold-drawn tubes from STN 411353 steel was analyzed. Grain boundary orientation was measured on perpendicular and parallel sections of tubes with different degrees of deformation. Macroscopic deformation leads to deformation of the grain boundaries, and the two were compared.

  20. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...... time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow an understanding of some genetic mechanisms related to the length of the productive life...

  1. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  2. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  3. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    Science.gov (United States)

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  5. Coffee Consumption and Risk of Stroke: A Dose-Response Meta-Analysis of Prospective Studies

    National Research Council Canada - National Science Library

    Larsson, Susanna C; Orsini, Nicola

    2011-01-01

    Coffee consumption has been inconsistently associated with risk of stroke. The authors conducted a meta-analysis of prospective studies to quantitatively assess the association between coffee consumption and stroke risk...

  6. A quantitative risk assessment for bovine spongiform encephalopathy in Japan

    NARCIS (Netherlands)

    Kadohira, M.; Stevenson, M.A.; Hogasen, H.R.; Koeijer, de A.A.

    2012-01-01

    A predictive case-cohort model was applied to Japanese data to analyze the interaction between challenge and stability factors for bovine spongiform encephalopathy (BSE) for the period 1985–2020. BSE risk in cattle was estimated as the expected number of detectable cases per year. The model was comp

  7. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  8. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  10. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: To evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for the results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were the economic value of forestry resources, Contraction Factor analysis and the definition of the extinction cost function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The proposed method may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, the economic value of the property to be protected and the speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in forest fire containment work and the potential projection of losses; different types of plant fuel and local conditions favoring the spread of fire considerably broaden the admissible ranges of a, φ and Ce.

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. It did not exceed 35%, appropriate for traditional plate-count methods.

  12. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Science.gov (United States)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  13. A Comprehensive Quantitative Assessment of Bird Extinction Risk in Brazil

    OpenAIRE

    Nathália Machado; Rafael Dias Loyola

    2013-01-01

    In an effort to avoid species loss, scientists have focused their efforts on the mechanisms making some species more prone to extinction than others. However, species show different responses to threats given their evolutionary history, behavior, and intrinsic biological features. We used bird biological features and external threats to (1) understand the multiple pathways driving Brazilian bird species to extinction, (2) to investigate if and how extinction risk is geographically structured,...

  14. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  15. ‘Omics’ technologies in quantitative microbial risk assessment

    NARCIS (Netherlands)

    Brul, S.; Basset, J.; Cook, P.; Kathariou, S.; McClure, P.; Jasti, P.R.; Betts, R.

    2012-01-01

    ‘Omics’ tools are being developed at an ever-increasing pace. Collectively, genome sequencing, genome-wide transcriptional analysis (transcriptomics), proteomics, metabolomics, flux analysis (‘fluxomics’) and other applications are captured under the term omics. The data generated using these tools

  16. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein also lie the conditions...

  17. Quantitative relations between risk, return and firm size

    Science.gov (United States)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Stanley, H. E.

    2009-03-01

    We analyze, for a large set of stocks comprising four financial indices, the annual logarithmic growth rate R and the firm size, quantified by the market capitalization MC. For the Nasdaq Composite and the New York Stock Exchange Composite we find that the probability density functions of growth rates are Laplace distributions in the broad central region, where the standard deviation σ(R), as a measure of risk, decreases with MC as a power law σ(R) ~ (MC)^(-β). For both the Nasdaq Composite and the S&P 500, we find that the average growth rate ⟨R⟩ decreases faster than σ(R) with MC, implying that the return-to-risk ratio ⟨R⟩/σ(R) also decreases with MC. For the S&P 500, ⟨R⟩ and ⟨R⟩/σ(R) also follow power laws. For a 20-year time horizon, for the Nasdaq Composite we find that σ(R) vs. MC exhibits a functional form called a volatility smile, while for the NYSE Composite, we find power-law stability between σ(R) and MC.
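
    The power-law exponent β in σ(R) ~ (MC)^(-β) is typically estimated from the slope of a log-log regression; the sketch below does this on synthetic data with a known exponent, not on the indices studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic (MC, sigma) pairs following sigma ~ MC^(-beta) with beta = 0.2.
mc = 10 ** rng.uniform(6, 11, size=400)                   # market capitalization
sigma = mc ** -0.2 * np.exp(rng.normal(0, 0.1, size=400))  # noisy power law

# Estimate beta from the slope of log(sigma) vs log(MC).
slope, intercept = np.polyfit(np.log(mc), np.log(sigma), 1)
print(f"estimated beta = {-slope:.3f}")
```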

  18. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  20. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    Science.gov (United States)

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease-free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidence collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10^-6 (95% CI = 7.25×10^-8 to 1.92×10^-5). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI = 0.042-0.97). The expected number of CBPP-infected animals exported in any given year is 1.28 (95% CI = 0.021-5.42). According to the risk estimate, an average of 2.73×10^6 animals (90% CI = 10,674-5.9×10^6) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI = 0.24-23.18) for the first infected animal to be included in a consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk.
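
    The structure of such a stochastic risk calculation can be sketched as follows; the per-animal infection probability distribution is an assumption chosen for illustration (loosely anchored to the reported point estimate), not the paper's fitted scenario tree.

```python
import numpy as np

rng = np.random.default_rng(2)
n_iter, n_exported = 10_000, 280_000

# Per-animal probability of being infected, with uncertainty represented by a
# lognormal distribution (parameters are illustrative, not the paper's values).
p_animal = rng.lognormal(mean=np.log(4.76e-6), sigma=1.0, size=n_iter)

expected_infected = p_animal * n_exported                 # E[infected per year]
p_at_least_one = 1.0 - (1.0 - p_animal) ** n_exported     # P(>=1 infected per year)

print(f"E[infected/year]:     mean = {expected_infected.mean():.2f}")
print(f"P(>=1 infected/year): mean = {p_at_least_one.mean():.2f}")
```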

  1. ANALYSIS OF THE TAX RISK OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Safonova M. F.

    2014-09-01

    Full Text Available In this article, the author considers the essence of tax risks, summarizes methodological approaches to the determination of tax risk, and presents a classification and systematization of tax risks together with the stages in the analysis of tax risks.

  2. Conceptual risk assessment framework for global change risk analysis SRP

    CSIR Research Space (South Africa)

    Elphinstone, CD

    2007-12-01

    Full Text Available This report is submitted as a deliverable of the SRP project Global Change Risk Analysis which aims at applying risk analysis as a unifying notion for quantifying and communicating threats to ecosystem services originating from global change...

  3. Accuracy of Image Analysis in Quantitative Study of Cement Paste

    Directory of Open Access Journals (Sweden)

    Feng Shu-Xia

    2016-01-01

    Full Text Available Quantitative study of cement paste, especially blended cement paste, has been a hot and difficult issue over the years, and the technique of backscattered electron image analysis has shown unique advantages in this field. This paper compared the test results for cement hydration degree, Ca(OH)2 content and pore size distribution in pure pastes obtained by image analysis and by other methods, and then analyzed the accuracy of quantitative study by image analysis. The results showed that the image analysis technique displayed higher accuracy in quantifying cement hydration degree and Ca(OH)2 content than the non-evaporable water test and thermal analysis, respectively.

  4. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma, Yitao; Shakiryanova, Dinara; Vardya, Irina

    2004-01-01

    the translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required...

  5. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources ... 350 km long, extends from the eastern border of Sierra Leone all the way to Ghana. ... consider whether data will likely fit the assumptions of a selected model. ... These tests are not alternatives to parametric tests, but rather are a means of ...

  6. Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods

    Directory of Open Access Journals (Sweden)

    B. Rama Sanjeeva Sresta

    2016-09-01

    Full Text Available This paper focuses on the analysis of sales forecasting using quantitative and qualitative methods. The forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether we have met our goals with respect to measures, targets and initiatives.

  7. Insights Into Quantitative Biology: analysis of cellular adaptation

    OpenAIRE

    Agoni, Valentina

    2013-01-01

    In recent years many powerful techniques have emerged to measure protein interactions as well as gene expression. Much progress has been made since the introduction of these techniques, but little toward the quantitative analysis of the data. In this paper we show how to study cellular adaptation and how to detect cellular subpopulations. Moreover, we go deeper into analyzing the dynamics of signal transduction pathways.

  8. Quantitating the subtleties of microglial morphology with fractal analysis.

    Science.gov (United States)

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.
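
    Box counting, the core technique named above, can be sketched in a few lines: cover a binary image with boxes of decreasing size, count occupied boxes, and read the fractal dimension off the log-log slope. The toy image below is an assumption for demonstration, not microglial data.

```python
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # boxes containing pixels
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Toy example: a filled square should give a dimension near 2.
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(f"estimated dimension = {box_count_dimension(img):.2f}")
```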

  10. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    Science.gov (United States)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
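
    A minimal sketch of the decorrelation measure on simulated speckle could look as follows; the Rayleigh-distributed intensities and the mixing used to mimic flow are illustrative assumptions, not the authors' processing chain.

```python
import numpy as np

rng = np.random.default_rng(3)

def decorrelation(a, b):
    """1 - Pearson correlation between two successive A-scan intensity profiles."""
    a, b = a - a.mean(), b - b.mean()
    return 1.0 - (a @ b) / np.sqrt((a @ a) * (b @ b))

# Simulated speckle: static tissue gives nearly identical successive A-scans,
# while flowing scatterers partially refresh the speckle pattern between A-scans.
static = rng.rayleigh(size=1024)
static_next = static + 0.02 * rng.standard_normal(1024)
flow_next = 0.6 * static + 0.4 * rng.rayleigh(size=1024)

print(f"static pair : {decorrelation(static, static_next):.3f}")   # near 0
print(f"flowing pair: {decorrelation(static, flow_next):.3f}")     # larger
```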

  11. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    Science.gov (United States)

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has attracted growing interest in recent years, in relation to globally rising CO2 levels and related considerations of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels using a Species Sensitivity Distribution (SSD). It became evident that the information currently available in the literature is mostly insufficient for such a quantitative approach. Most studies focus on the effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
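
    When sufficient dose-response data do become available, the SSD step itself is straightforward; the sketch below fits a log-normal SSD to hypothetical species effect concentrations and derives HC5, the concentration expected to protect 95% of species. All input values are made up for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical species effect concentrations (e.g., EC50s, arbitrary units);
# real SSD inputs would come from standardized tests across many species.
ec50 = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.7, 9.4, 12.0, 15.5, 21.0])

# Fit a log-normal SSD and derive HC5 (5th percentile of the distribution).
mu, sigma = stats.norm.fit(np.log(ec50))
hc5 = np.exp(stats.norm.ppf(0.05, mu, sigma))
print(f"HC5 = {hc5:.2f} (same units as the input data)")
```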

  12. Quantitative numerical analysis of transient IR-experiments on buildings

    Science.gov (United States)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been carried out to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite-difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size, and the thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
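
    A minimal finite-difference sketch of the transient heat conduction underlying such simulations is shown below (1-D explicit FTCS scheme); the material constants and boundary conditions are generic assumptions, not the paper's calculation setup.

```python
import numpy as np

# 1-D explicit finite-difference (FTCS) model of a concrete element cooling
# after impulse heating; all material and geometry values are generic.
alpha = 8e-7                     # thermal diffusivity of concrete, m^2/s
L, nx = 0.10, 101                # thickness (m), number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha         # below the FTCS stability limit of 0.5

T = np.zeros(nx)                 # excess temperature over ambient, K
T[0] = 40.0                      # idealized impulse deposited at the surface

for step in range(1, 20_001):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = T[1]                  # adiabatic front face after the pulse
    T[-1] = 0.0                  # rear face held at ambient
    if step % 5_000 == 0:
        print(f"t = {step * dt:8.1f} s   surface excess T = {T[0]:.3f} K")
```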

  13. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  15. Quantitative NDI integration with probabilistic fracture mechanics for the assessment of fracture risk in pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Jochen H.; Cioclov, Dragos; Dobmann, Gerd; Boller, Christian [Fraunhofer Inst. fuer Zerstoerungsfreie Pruefverfahren (IZFP), Saarbruecken (Germany)

    2009-07-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The key concept in the analysis is the stress intensity factor (SIF), which accounts for the geometry of the component and the size of a pre-existing crack-like defect. FAD assessments can be made in a deterministic sense, which yields the end result in the binary terms of fail/not-fail. The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one (in the mean sense) is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. An important feature of the PVrisk software is the ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI). This is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure (increase in reliability) when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses of the fracture toughness are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. (orig.)
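
    The integration of POD into a Monte Carlo fracture assessment can be sketched as follows; the distributions, the simplified stress-intensity expression and the exponential POD curve are all illustrative assumptions rather than the PVrisk implementation.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Assumed input distributions (illustrative, not calibrated inputs).
a = rng.lognormal(np.log(10.0), 0.6, n)     # crack depth, mm
kic = rng.normal(60.0, 8.0, n)              # fracture toughness, MPa*sqrt(m)
stress = 250.0                              # applied stress, MPa

# Simplified SIF for a shallow surface crack: K = Y * stress * sqrt(pi * a).
K = 1.12 * stress * np.sqrt(np.pi * a / 1000.0)   # a converted to metres
fails = K > kic                                   # brittle-fracture criterion only

# Exponential POD curve: deeper cracks are more likely to be found and repaired.
pod = 1.0 - np.exp(-a / 10.0)
missed = rng.random(n) > pod

print(f"P(fail) without NDI: {fails.mean():.4f}")
print(f"P(fail) with NDI   : {(fails & missed).mean():.4f}")
```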

  16. [Identification of risk factors for work-related stress in a hospital: a qualitative and quantitative approach].

    Science.gov (United States)

    Cortese, C G; Gerbaudo, Laura; Manconi, Maria Paola; Violante, B

    2013-01-01

    Italian legislation establishes the obligation for the employer to assess any risks to the safety and health of workers, including those relating to work-related stress (WRS). Several studies have proved the existence of a link between WRS and both individual diseases and organizational results. The research aimed at detecting WRS risk factors in a hospital consisting of 53 departments and employing 2334 workers. A qualitative and quantitative approach was adopted, divided into six steps: 1) analysis of the hospital indicators; 2) semi-structured interviews with the 53 department heads; 3) preparation of a checklist including 42 WRS risk indicators; 4) observation by shadowing of the 53 departments; 5) setting up of 53 focus groups with staff from each department; 6) distribution of the checklist to a representative sample of 747 employees. Data analysis showed a "low" level of WRS risk for the hospital as a whole and a "medium" level for six transversal indicators and eight departments. Three indicators considered particularly significant were examined in detail: "workplace and ergonomic conditions", "shift work", and "interruptions in work flow". The results helped to identify a series of best practices aimed at reducing WRS risk that are applicable to other health care settings. The qualitative and quantitative approach produced a keen involvement of the hospital's employees, which will positively support the real efficacy of the measures taken.

  17. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  18. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    Science.gov (United States)

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience to make subjective decisions on risk assessment, without systematic decision making. A sound understanding of the underlying principles of risk assessment is therefore important. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions of it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of works, their recognition of the occurrence of different types of accidents, and the inter-relationships of these perceptions with safety performance in terms of accident rates. In the second stage, the deficiencies of the current general practice of risk assessment are first identified. Based on the findings from Stage 1 and 3-year average historical accident data from 15 large-scale construction projects, a risk evaluation model is developed quantitatively that prioritizes the risk levels of different trades of works causing different types of site accident due to various accident causes. With the suggested systematic accident recording techniques, this model can be implemented in the construction industry at both the project and organizational levels. The model (Q(2)REM) not only acts as a useful supplementary guideline of risk assessment for construction safety professionals, but also assists them in pinpointing the potential on-site risks for construction workers under the respective trades of works through safety training and education. It, in turn, raises their awareness of safety risk. As the Q(2)REM can clearly show the potential accident causes leading to

  19. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.

  20. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union

    DEFF Research Database (Denmark)

    Snary, Emma L.; Swart, Arno N.; Simons, Robin R. L.

    2016-01-01

    A farm‐to‐consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter‐pig prevalence and the imp...

  1. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Science.gov (United States)

    2013-02-11

    ... Quantitative Assessment of the Risk of Listeriosis From Soft-Ripened Cheese Consumption in the United States... Canada--Santé Canada Quantitative Assessment of the Risk of Listeriosis From Soft-Ripened Cheese... overall risk of invasive listeriosis to the consumer in the United States or Canada of soft-...

  2. Preparing for the unprecedented - Towards quantitative oil risk assessment in the Arctic marine areas.

    Science.gov (United States)

    Nevalainen, Maisa; Helle, Inari; Vanhatalo, Jarno

    2017-01-15

    The probability of major oil accidents in Arctic seas is increasing alongside increasing maritime traffic. Hence, there is a growing need to understand the risks posed by oil spills to these unique and sensitive areas. So far these risks have mainly been acknowledged in terms of qualitative descriptions. We introduce a probabilistic framework, based on a general food web approach, to analyze the ecological impacts of oil spills. We argue that a food web approach based on key functional groups is more appropriate for providing a holistic view of the risks involved than assessments based on single species. We discuss the issues characteristic of the Arctic that need special attention in risk assessment, and provide examples of how to proceed towards quantitative risk estimates. The conceptual model presented in the paper helps to identify the most important risk factors and can be used as a template for more detailed risk assessments. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. A risk analysis model in concurrent engineering product development.

    Science.gov (United States)

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating the graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve the proposed concurrent engineering product development risk management model, and the solutions lead to identification of key risk-controlling points. This article demonstrates the value of the approach to risk analysis as a means to monitor the various risks typical in the manufacturing sector. The article makes three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose the use of existing quantitative approaches for PD risk analysis: GERT, FMEA, and product database management (PDM). Based on these quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying the approach using data from a typical Chinese motor company.
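
    For readers unfamiliar with the FMEA step used in this kind of approach, the following minimal sketch computes and ranks Risk Priority Numbers; the failure modes and 1-10 scores are hypothetical, not taken from the article.

```python
# Classic FMEA scoring: RPN = Severity x Occurrence x Detection, each on a
# 1-10 categorical scale. Failure modes and scores below are hypothetical.
failure_modes = [
    # (name, severity, occurrence, detection)
    ("spec change not propagated to tooling", 8, 5, 6),
    ("supplier part out of tolerance",        7, 4, 3),
    ("late design review feedback",           5, 7, 4),
]

# Rank by RPN, highest first: the top entries are candidate risk-controlling
# points for early warning and correction.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s*o*d:3d}  S={s} O={o} D={d}  {name}")
```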

  4. Quantitative nanoscale analysis in 3D using electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kuebel, Christian [Karlsruhe Institute of Technology, INT, 76344 Eggenstein-Leopoldshafen (Germany)

    2011-07-01

    State-of-the-art electron tomography has been established as a powerful tool to image complex structures with nanometer resolution in 3D. STEM tomography in particular is used extensively in materials science in such diverse areas as catalysis, semiconductor materials, and polymer composites, mainly providing qualitative information on the morphology, shape, and distribution of materials. However, an increasing number of studies need quantitative information, e.g. surface area, fractal dimensions, particle distribution or porosity. A quantitative analysis is typically performed after segmenting the tomographic data, which is one of the main sources of error for the quantification. In addition to noise, systematic errors due to the missing wedge and artifacts from the reconstruction algorithm itself are responsible for these segmentation errors, and improved algorithms are needed. This presentation provides an overview of the possibilities and limitations of quantitative nanoscale analysis by electron tomography. Using catalysts and nanocomposites as application examples, the intensities and intensity variations observed for 3D volumes reconstructed by WBP and SIRT are quantitatively compared to those from alternative reconstruction algorithms; implications for the quantification of electron (or X-ray) tomographic data are discussed and illustrated for the quantification of particle size distributions, particle correlations, surface area, and fractal dimensions in 3D.

  5. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin area of a human foot and face. The full source code of the developed application is provided as an attachment. [Graphical abstract: the main window of the program during dynamic analysis of the foot thermal image.]

  6. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and Maxquant output are presented in the supporting information. The generated list of proteins under different regimes of fractionation allows the nature of the identified proteins and the variability in quantitative analysis associated with the different sampling strategies to be assessed, and allows a proper number of replicates to be defined for future quantitative analyses.

  7. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and Maxquant output are presented in the supporting information. The generated list of proteins under different regimes of fractionation allows the nature of the identified proteins and the variability in quantitative analysis associated with the different sampling strategies to be assessed, and allows a proper number of replicates to be defined for future quantitative analyses.

  8. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM reflected well the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
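
    A minimal sketch of the GLCM texture step, using scikit-image (the paper's own implementation is not specified); the random array stands in for the IMF1 image, and the four properties are the classic GLCM texture parameters.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

# A stand-in 8-bit image; in practice this would be the IMF1 image from BEMD.
image = (np.random.default_rng(0).random((128, 128)) * 255).astype(np.uint8)

# Co-occurrence matrix at distance 1 for four directions.
glcm = graycomatrix(image, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

# Four classic GLCM texture parameters, averaged over the four directions.
for prop in ("contrast", "correlation", "energy", "homogeneity"):
    print(prop, graycoprops(glcm, prop).mean())
```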

  9. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu

    2014-01-01

    Full Text Available The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM reflected well the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.

  10. Vulnerability of Russian regions to natural risk: experience of quantitative assessment

    OpenAIRE

    Petrova, E.

    2006-01-01

    One of the important tracks leading to natural risk prevention, disaster mitigation or the reduction of losses due to natural hazards is the vulnerability assessment of an "at-risk" region. The majority of researchers propose to assess vulnerability according to an expert evaluation of several qualitative characteristics, scoring each of them usually using three ratings: low, average, and high. Unlike these investigations, we attempted a quantitative vulnerability asse...

  11. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  12. Qualitative and quantitative stability analysis of penta-rhythmic circuits

    Science.gov (United States)

    Schwabedal, Justus T. C.; Knapper, Drake E.; Shilnikov, Andrey L.

    2016-12-01

    Inhibitory circuits of relaxation oscillators are often-used models for dynamics of biological networks. We present a qualitative and quantitative stability analysis of such a circuit constituted by three generic oscillators (of a Fitzhugh-Nagumo type) as its nodes coupled reciprocally. Depending on inhibitory strengths, and parameters of individual oscillators, the circuit exhibits polyrhythmicity of up to five simultaneously stable rhythms. With methods of bifurcation analysis and phase reduction, we investigate qualitative changes in stability of these circuit rhythms for a wide range of parameters. Furthermore, we quantify robustness of the rhythms maintained under random perturbations by monitoring phase diffusion in the circuit. Our findings allow us to describe how circuit dynamics relate to dynamics of individual nodes. We also find that quantitative and qualitative stability properties of polyrhythmicity do not always align.

  13. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H;

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  14. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  15. The value of quantitative patient preferences in regulatory benefit-risk assessment

    NARCIS (Netherlands)

    Oude Egbrink, Mart; IJzerman, Maarten Joost

    2014-01-01

    Quantitative patient preferences are a method to involve patients in regulatory benefit-risk assessment. Assuming preferences can be elicited, there might be multiple advantages to their use. Legal, methodological and procedural issues do however imply that preferences are currently at most part of

  16. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    Science.gov (United States)

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  17. Quantitative risk assessment for skin sensitisation : Consideration of a simplified approach for hair dye ingredients

    NARCIS (Netherlands)

    Goebel, C.; Diepgen, T.L.; Krasteva, M.; Schlatter, H.; Nicolas, J.F.; Blomeke, B.; Coenraads, P.J.; Schnuch, A.; Taylor, J.S.; Pungier, J.; Fautz, R.; Fuchs, A.; Schuh, W.; Gerberick, G.F.; Kimber, I.

    2012-01-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-ind

  18. Preoperative Prediction of Microvascular Invasion in Hepatocellular Carcinoma using Quantitative Image Analysis.

    Science.gov (United States)

    Zheng, Jian; Chakraborty, Jayasree; Chapman, William C; Gerst, Scott; Gonen, Mithat; Pak, Linda M; Jarnagin, William R; DeMatteo, Ronald P; Do, Richard Kg; Simpson, Amber L; Allen, Peter J; Balachandran, Vinod P; D'Angelica, Michael I; Kingham, T Peter; Vachharajani, Neeta

    2017-09-20

    Microvascular invasion (MVI) is a significant risk factor for early recurrence after resection or transplantation for hepatocellular carcinoma (HCC). Knowledge of MVI status would help guide treatment recommendations, but MVI is generally identified only after surgery. This study aims to predict MVI preoperatively using quantitative image analysis. From 2 institutions, 120 patients who underwent resection of HCC from 2003 to 2015 were included. The largest tumor from preoperative CT was subjected to quantitative image analysis, which uses an automated computer algorithm to capture regional variation in CT enhancement patterns. Quantitative imaging features from the automatic analysis, qualitative radiographic descriptors from 2 radiologists, and preoperative clinical variables were included in a multivariate analysis to predict histologic MVI. Histologic MVI was identified in 19 (37%) patients with tumors ≤ 5 cm and 34 (49%) patients with tumors > 5 cm. Among patients with tumors ≤ 5 cm, none of the clinical findings or radiographic descriptors was associated with MVI; however, a quantitative feature based on the angle co-occurrence matrix predicted MVI with an area under the curve (AUC) of 0.80, a positive predictive value (PPV) of 63%, and a negative predictive value (NPV) of 85%. In patients with tumors > 5 cm, higher α-fetoprotein (AFP) level, larger tumor size, and viral hepatitis history were associated with MVI, whereas radiographic descriptors were not. However, a multivariate model combining AFP, tumor size, hepatitis status, and a quantitative feature based on the local binary pattern predicted MVI with an AUC of 0.88, a PPV of 72%, and an NPV of 96%. This study reveals the potential importance of quantitative image analysis as a predictor of MVI. Copyright © 2017. Published by Elsevier Inc.
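
    As a reminder of how the reported statistics are computed, here is a minimal sketch with toy data (not the study's data); the 0.5 decision threshold is an arbitrary illustration.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# y_true: histologic MVI (1 = present); y_score: model probability. Toy data.
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 0, 0])
y_score = np.array([0.1, 0.4, 0.8, 0.7, 0.2, 0.9, 0.3, 0.4, 0.2, 0.6])

auc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, y_score >= 0.5).ravel()
ppv = tp / (tp + fp)  # positive predictive value
npv = tn / (tn + fn)  # negative predictive value
print(f"AUC={auc:.2f}  PPV={ppv:.0%}  NPV={npv:.0%}")
```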

  19. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    Science.gov (United States)

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates currently available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide to feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research, to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  20. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    Science.gov (United States)

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  1. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies

    Directory of Open Access Journals (Sweden)

    Siu-Leung Chau

    2016-08-01

    Full Text Available Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%–0.21% and 53.49%–58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  2. Panel Random Analysis of Credit Risk in Business

    Institute of Scientific and Technical Information of China (English)

    LIU Wei; ZHOU Yue-mei; ZHOU Ke

    2005-01-01

    A market economy is a kind of credit economy. The survival and development of an individual in society are closely related to his credit. Without credit, a market economy cannot continue, and society can hardly run in good order and good health. This paper defines the basic concept of trade credit risk and its manifestations, and puts forward a basic model for quantitatively analyzing credit risk. The data structure of the information is analyzed and a decomposition model of credit risk is constructed; with the aid of statistical analysis, including regression analysis, analysis of variance, and hypothesis testing, the description, classification, verification and confirmation of the credit risk model are completed. The model can then be used to describe and control credit risk, providing a basis for building a credit support system in today's society.

  3. Quantitative analysis for nonlinear fluorescent spectra based on edges matching

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A novel spectra-edge-matching approach is proposed for the quantitative analysis of the nonlinear fluorescence spectra of air impurities excited by a femtosecond laser. The fluorescence spectra are first denoised and compressed, both by wavelet transform, and several peak groups are then picked from each spectrum according to an intensity threshold and used to extract the spectral features through principal component analysis. The first two principal components actually cover up to 98% of the total information and are sufficient for the final concentration analysis. The analysis reveals a monotone relationship between the spectral intensity and the concentration of the air impurities, suggesting that femtosecond-laser-induced fluorescence spectroscopy, along with the proposed spectral analysis method, can become a powerful tool for monitoring environmental pollutants.
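
    A minimal sketch of the PCA step described above, with toy data in place of the peak-group features; the 98% variance criterion mirrors the abstract, while the data and dimensions are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = denoised/compressed spectra, columns = peak-group features (toy data).
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 12))
X[:, 0] += np.linspace(0, 5, 40)  # inject a dominant concentration-like trend

pca = PCA().fit(X)
cum = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(cum, 0.98) + 1)  # components covering 98% of variance
print(f"{n_keep} components explain {cum[n_keep - 1]:.1%} of the variance")

# Reduced features used for the final concentration analysis
scores = PCA(n_components=n_keep).fit_transform(X)
```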

  4. Quantitative gait analysis following hemispherotomy for Rasmussen's encephalitis

    Directory of Open Access Journals (Sweden)

    Santhosh George Thomas

    2007-01-01

    Full Text Available Peri-insular hemispherotomy is a form of disconnective hemispherectomy involving complete disconnection of all ascending/descending and commissural connections of one hemisphere. We report a case of a seven-and-a-half-year-old child with intractable epilepsy due to Rasmussen's encephalitis who underwent peri-insular hemispherotomy and achieved complete freedom from seizures. Quantitative gait analysis was used to describe the changes in the kinematic and kinetic parameters of gait, with surface electromyographs, 18 months after surgery. The focus of this paper is to highlight the utility of gait analysis following hemispherotomy with a view to directing postsurgical motor training and rehabilitation.

  5. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    Science.gov (United States)

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at the early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  6. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
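
    The Program metric itself is easy to state in code. The sketch below computes LCOE as discounted lifetime cost over discounted lifetime energy and propagates an assumed capital-cost uncertainty by Monte Carlo; all numbers are hypothetical, not from the report, and R&D outcomes would be modeled as shifts in these inputs.

```python
import numpy as np

def lcoe(capex, opex_per_yr, energy_mwh_per_yr, rate, years):
    """Levelized cost of energy: discounted costs / discounted energy."""
    t = np.arange(1, years + 1)
    disc = (1.0 + rate) ** -t
    costs = capex + np.sum(opex_per_yr * disc)
    energy = np.sum(energy_mwh_per_yr * disc)
    return costs / energy

# Hypothetical geothermal plant with uncertain drilling/plant capital cost.
rng = np.random.default_rng(7)
capex_samples = rng.normal(200e6, 30e6, size=10_000)
samples = [lcoe(c, 8e6, 400_000, 0.08, 30) for c in capex_samples]
print(f"LCOE: mean ${np.mean(samples):.1f}/MWh, "
      f"90th percentile ${np.percentile(samples, 90):.1f}/MWh")
```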

  7. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.

  8. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    Science.gov (United States)

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the relatively low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters is also provided that optimizes the yield of identified sites. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.

  9. Dam risk assistant analysis system design

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In order to reduce the labor intensity and difficulty of dam risk analysis and to meet its actual requirements, it is necessary to establish a dam risk assistant analysis system. The program structure and implementation of the dam risk assistant analysis system are analyzed, and a procedural framework with a "three-tier and multi-database" structure and "level structure" is established. The concept of modular development of the dam risk assessment system is proposed and the coupling of function modules and data is improved. Finally, the dam risk assistant analysis system is developed using the Delphi visual programming language.

  10. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  11. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    Science.gov (United States)

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to a better understanding of developmental biology and to engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however, elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data are extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatio-temporal pattern kinetic differences in ESC aggregates of different sizes.

  12. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes, and high values for both traits together with oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance, facilitating the choice of variables on which the genotypes' clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits, with the final number of clusters determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that perform similarly with regard to the traits included in this study can easily be detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
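
    A minimal sketch of the two-way clustering step with SciPy, using toy data; the paper determined the cluster count by bootstrapping, which is simply assumed as a fixed cut here.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Rows = genotypes, columns = standardized quantitative traits (toy data).
rng = np.random.default_rng(3)
X = rng.normal(size=(30, 8))

# Cluster genotypes and traits separately -- the "two-way" part that a
# heatmap visualizes with a dendrogram on each axis.
geno_link = linkage(X, method="ward")
trait_link = linkage(X.T, method="ward")

# Cut the genotype dendrogram into 9 clusters (the paper fixed this number
# by bootstrapping; here it is simply assumed).
labels = fcluster(geno_link, t=9, criterion="maxclust")
for k in np.unique(labels):
    print(f"cluster {k}: {(labels == k).sum()} genotypes")
```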

  13. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The organizational risk analyzer (ORA) tool system is selected to study the network of East Turkistan terrorists. The model of the relationships among its personnel, knowledge, resources and task entities is represented by the meta-matrix in ORA, with which the risks and vulnerabilities of the organizational structure are analyzed quantitatively and the organization's vulnerabilities and risks are obtained. A case study in this system shows that it should be a shortcut to destroy the network effectively...

  14. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    Science.gov (United States)

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  15. Segregation Analysis on Genetic System of Quantitative Traits in Plants

    Institute of Scientific and Technical Information of China (English)

    Gai Junyi

    2006-01-01

    Based on the traditional polygene inheritance model of quantitative traits, the author suggests a mixed major-gene and polygene inheritance model. This model is considered a general one, with the pure major-gene and pure polygene inheritance models as specific cases. Based on the proposed theory, the author established a segregation analysis procedure to study the genetic system of quantitative traits in plants. At present, this procedure can be used to evaluate the genetic effects of individual major genes (up to two to three major genes), the collective genetic effect of polygenes, and their heritability values. This paper introduces how the procedure was established, its main achievements, and its applications. An example is given to illustrate the steps, methods, and effectiveness of the procedure.

  16. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

    Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We used a method based on the analysis of quantitative pore features, which differs from traditional qualitative methods. We applied mathematical morphology methods such as dilation and erosion, opening and closing transformations of wood cross-sections, image repair, noise filtering and edge detection to segment the pores from their background. The mean square errors (MSE) of the pores were then computed to describe the pore distribution. Our experiment shows that it is easy to classify the pore features into three basic types, just as in traditional qualitative methods, but using the MSE of the pores. This quantitative method improves wood identification considerably.
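
    A minimal sketch of a morphology-based pore pipeline in scikit-image, with toy data in place of a wood cross-section; note that the abstract does not define its pore MSE precisely, so the dispersion measure below is only one plausible reading.

```python
import numpy as np
from skimage import filters, measure, morphology

# Toy grayscale "cross-section": smoothed noise stands in for a wood image.
rng = np.random.default_rng(5)
section = filters.gaussian(rng.random((256, 256)), sigma=5)

# Morphological opening/closing to suppress residual noise, then threshold
# (pores assumed darker than background) and label connected pores.
smoothed = morphology.closing(morphology.opening(section, morphology.disk(2)),
                              morphology.disk(2))
binary = smoothed < filters.threshold_otsu(smoothed)
binary = morphology.remove_small_objects(binary, min_size=20)
pores = measure.label(binary)

# One plausible reading of the pore-distribution MSE: mean squared distance
# of pore centroids from their common centroid.
centroids = np.array([p.centroid for p in measure.regionprops(pores)])
mse = np.mean(np.sum((centroids - centroids.mean(axis=0)) ** 2, axis=1))
print(f"{pores.max()} pores, distribution MSE = {mse:.1f} px^2")
```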

  17. Quantitative analysis of in vivo confocal microscopy images: a review.

    Science.gov (United States)

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software are available, each with their own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  18. 38 CFR 75.115 - Risk analysis.

    Science.gov (United States)

    2010-07-01

    ... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational recommendations... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Risk analysis. 75.115...

  19. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2013-12-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the "Score Method" based on Canon and Holder's model. The data were collected from the Balance Sheet and Profit and Loss Account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlights the financial situation of the company and the levels of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy risk. The worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70% and 80%. In 2007, the bankruptcy risk was lower, ranging between 50% and 70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment in our country is very risky.

  20. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
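
    The proposed modification is easy to express in code. In the sketch below, occurrence and detection are relative frequencies and severity stays categorical; the failure modes and numbers are hypothetical, not those of the NIR example.

```python
# Probabilistic FMEA: occurrence and detection as estimated relative
# frequencies, severity kept on a categorical scale. Values are hypothetical.
failure_modes = [
    # (name, p_occurrence, p_detection, severity 1-10)
    ("wrong reference spectrum loaded", 0.020, 0.90, 8),
    ("tablet mislabeled at intake",     0.005, 0.50, 9),
    ("drifted NIR calibration",         0.010, 0.95, 7),
]

# Frequency of each failure mode occurring AND escaping detection
for name, p_occ, p_det, sev in failure_modes:
    p_undetected = p_occ * (1.0 - p_det)
    print(f"{name:35s} P(undetected)={p_undetected:.4f}  severity={sev}")

# Frequency that at least one failure mode goes undetected in a run
p_none = 1.0
for _, p_occ, p_det, _ in failure_modes:
    p_none *= 1.0 - p_occ * (1.0 - p_det)
print(f"P(any undetected failure) = {1.0 - p_none:.4f}")
```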

  1. Geothermal Power Plant Maintenance: Evaluating Maintenance System Needs Using Quantitative Kano Analysis

    Directory of Open Access Journals (Sweden)

    Reynir S. Atlason

    2014-07-01

    Full Text Available A quantitative Kano model is used in this study to identify which features top-level maintenance engineers within Icelandic geothermal power plants prefer to have implemented in a maintenance tool or software. Visits were conducted to the largest Icelandic energy companies operating geothermal power plants. Thorough interviews with chiefs of operations and maintenance were used as a basis for a quantitative Kano analysis. Thirty-seven percent of all maintenance engineers at Reykjavik Energy and Landsvirkjun, responsible for 71.5% of the total energy production from geothermal resources in Iceland, answered the Kano questionnaire. Findings show that solutions focusing on (1) planning maintenance according to condition, (2) shortening documentation times, and (3) risk analysis are sought after by the energy companies but not provided for the geothermal sector specifically.
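
    A common way to quantify Kano questionnaire data is through the Berger satisfaction ("better") and dissatisfaction ("worse") coefficients; whether the study used exactly these coefficients is not stated, and the category counts below are hypothetical.

```python
# Kano category counts per candidate feature: Attractive, One-dimensional,
# Must-be, Indifferent (Reverse/Questionable omitted). Counts are hypothetical.
features = {
    "condition-based maintenance planning": dict(A=5, O=4, M=1, I=1),
    "faster documentation":                 dict(A=3, O=5, M=2, I=1),
    "built-in risk analysis":               dict(A=6, O=2, M=1, I=2),
}

for name, c in features.items():
    total = c["A"] + c["O"] + c["M"] + c["I"]
    better = (c["A"] + c["O"]) / total   # satisfaction gain if feature is present
    worse = -(c["O"] + c["M"]) / total   # dissatisfaction if feature is absent
    print(f"{name:38s} better={better:+.2f}  worse={worse:+.2f}")
```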

  2. Quantitative phosphoproteomic analysis using iTRAQ method.

    Science.gov (United States)

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of, and reaction to, developmental and environmental cues. Phosphoproteomics is useful to identify target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We perform quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified by the Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified by a MALDI TOF/TOF analyzer. We identified many phosphoproteins that were decreased in the mapkk mutant compared with wild type.

  3. A quantitative analysis of IRAS maps of molecular clouds

    Science.gov (United States)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  4. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
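
    A minimal Python sketch of the bootstrap construction the abstract describes follows; the prey species, signature arrays and sample sizes are hypothetical, and the paper's objective algorithm for choosing the bootstrap sample size is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def pseudo_predator(prey_sigs, diet, n_boot):
    """Build one pseudo-predator fatty acid signature by bootstrap
    resampling prey signatures and mixing them by known diet
    proportions.

    prey_sigs : dict of prey type -> (n_samples, n_fatty_acids) array
    diet      : dict of prey type -> proportion (sums to 1)
    n_boot    : bootstrap sample size drawn per prey type
    """
    parts = []
    for prey, prop in diet.items():
        sigs = prey_sigs[prey]
        idx = rng.integers(0, len(sigs), size=n_boot)  # bootstrap resample
        parts.append(prop * sigs[idx].mean(axis=0))    # diet-weighted mean
    sig = np.sum(parts, axis=0)
    return sig / sig.sum()                             # renormalize to proportions

# hypothetical example: 3 fatty acids, 2 prey species
prey = {"ringed_seal": rng.dirichlet([4, 2, 1], size=30),
        "bearded_seal": rng.dirichlet([1, 3, 3], size=25)}
print(pseudo_predator(prey, {"ringed_seal": 0.7, "bearded_seal": 0.3}, n_boot=15))
```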

  5. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    Directory of Open Access Journals (Sweden)

    Peterhansel Christoph

    2007-09-01

    Full Text Available Abstract Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishment of the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data is often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We found that the use of quantitative PCR (QPCR) is crucial for obtaining high quality ChIP data and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and a strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and present comprehensive insights into data normalization.
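
    As an illustration of the kind of quantitative read-out the authors argue for, the widely used percent-of-input normalization for ChIP-QPCR can be written in a few lines. This is a generic sketch assuming perfect amplification efficiency, not necessarily the strategy the paper recommends:

```python
import math

def percent_of_input(ct_input, ct_ip, input_fraction=0.01):
    """Percent-of-input normalization for ChIP-QPCR.

    ct_input       : Ct of the diluted input sample
    ct_ip          : Ct of the immunoprecipitated sample
    input_fraction : fraction of chromatin kept as input (e.g. 1%)

    Assumes perfect doubling per PCR cycle (efficiency = 2).
    """
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)  # dilution-correct input Ct
    return 100.0 * 2.0 ** (ct_input_adj - ct_ip)

# e.g. 1% input measured at Ct 24.0, IP at Ct 28.5
print(f"{percent_of_input(24.0, 28.5):.3f} % of input")
```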

  6. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescent foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this 'FociQuant' tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880

  7. Analysis of foreign schools of risk management

    OpenAIRE

    I.M. Posokhov

    2013-01-01

    The aim of the article. The aim of the article is to study the scientific development of foreign scientific schools of risk management, to analyse their main publications, and to identify the foreign scientific schools of risk management. The results of the analysis. Research on modern risk management is carried out by leading foreign schools. The most famous school in the theory of financial risk and risk management is the American school. Among its current members are D. Galai, H. Greuning, A...

  8. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths depended strongly on the working treatment, thus providing an important analytical element with which to investigate ancient manufacturing techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some particular advantages of neutron diffraction diagnostics in archaeometric applications. (orig.)

  9. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    Science.gov (United States)

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and at the times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log CFU/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64 × 10^(-8) × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24 × 10^(-9); maximum: 7.97 × 10^(-6)) than for natural cheese (mean: 7.84 × 10^(-10); maximum: 2.32 × 10^(-6)). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
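
    To make the risk characterization step concrete, the following minimal Python sketch draws doses from a placeholder exposure distribution and pushes them through the exponential dose-response model quoted above; the lognormal parameters are invented for illustration and are not the study's fitted inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_iter = 100_000                                  # Monte Carlo iterations

# placeholder exposure model: dose per serving (CFU) varies lognormally
dose = rng.lognormal(mean=2.0, sigma=1.5, size=n_iter)

# exponential dose-response: P(ill) = 1 - exp(-r * N)
r = 7.64e-8
p_ill = 1.0 - np.exp(-r * dose)

print(f"mean risk per serving: {p_ill.mean():.2e}")
print(f"95th percentile:       {np.percentile(p_ill, 95):.2e}")
```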

  10. A quantitative method for risk assessment of agriculture due to climate change

    Science.gov (United States)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks owing to its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help in actively dealing with climate change and ensuring food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase of 88.3%, the maximum yield decrease and its probability were 3.5% and 64.6%, respectively, and the risk was 2.2%. For the maximum precipitation decrease of 35.2%, the maximum yield decrease and its probability were 14.1% and 56.1%, respectively, and the risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6% and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and so will the risk.
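
    The paper's core definition, risk as the product of the degree of loss and its probability of occurrence, fits in a few lines of code. The sketch below uses made-up yield anomalies, and the empirical exceedance frequency is only a stand-in for the paper's CCEAF construction.

```python
import numpy as np

# hypothetical detrended yield anomalies (% change from trend)
yield_anomaly = np.array([-3.5, 1.2, -8.0, 4.1, -14.1,
                          0.3, -6.2, 2.8, -17.6, -1.0])

def climate_risk(anomalies, threshold):
    """Risk = degree of loss x probability of occurrence.

    Degree of loss: mean loss amplitude among years at or below the
    threshold. Probability: empirical frequency of such years (a
    stand-in for the climate change effect-accumulated frequency)."""
    losses = -anomalies[anomalies <= threshold]
    if losses.size == 0:
        return 0.0
    prob = losses.size / anomalies.size
    return (losses.mean() / 100.0) * prob

print(f"risk of losses beyond -10%: {climate_risk(yield_anomaly, -10.0):.3f}")
```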

  11. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems widely applied in industry are nonlinear dynamic systems of multiple degrees of freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using trajectory-based stability-preserving dimensional reduction, a quantitative stability analysis method for rotor systems is presented. First, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  12. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    Full Text Available In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis of the retrieved parameters from that approach suggests that further investigations are required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications to the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for value initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by the German Aerospace Center's (DLR's) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of checking quantitatively the accuracy of model-based PolSAR techniques for a

  13. Spatially quantitative models for vulnerability analyses and resilience measures in flood risk management: Case study Rafina, Greece

    Science.gov (United States)

    Karagiorgos, Konstantinos; Chiari, Michael; Hübl, Johannes; Maris, Fotis; Thaler, Thomas; Fuchs, Sven

    2013-04-01

    We will address spatially quantitative models for vulnerability analyses in flood risk management in the catchment of Rafina, 25 km east of Athens, Greece, and potential measures to reduce damage costs. The evaluation of flood damage losses is relatively advanced. Nevertheless, major problems arise since market prices for the evaluation process are not available. Moreover, there is a particular gap in quantifying the damages and the expenditures necessary for the implementation of mitigation measures with respect to flash floods. The key issue is to develop prototypes for assessing flood losses and the impact of mitigation measures on flood resilience by adjusting a vulnerability model, and to further develop the method in a Mediterranean region influenced by both mountain and coastal characteristics of land development. The objective of this study is to create a spatial and temporal analysis of the vulnerability factors based on a method combining spatially explicit loss data, data on the value of exposed elements at risk, and data on flood intensities. In this contribution, a methodology for the development of a flood damage assessment as a function of the process intensity and the degree of loss is presented. It is shown that (1) such relationships for defined object categories are dependent on site-specific and process-specific characteristics, but there is a correlation between process types that have similar characteristics; (2) existing semi-quantitative approaches of vulnerability assessment for elements at risk can be improved based on the proposed quantitative method; and (3) the concept of risk can be enhanced with respect to a standardised and comprehensive implementation by applying the vulnerability functions to be developed within the proposed research. Therefore, loss data were collected from responsible administrative bodies and analysed on an object level. The used model is based on a basin scale approach as well as data on elements at risk exposed
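
    Vulnerability functions of the kind proposed relate process intensity to a degree of loss on a 0-1 scale. The sketch below is a generic illustration with placeholder coefficients, not the relation fitted for Rafina:

```python
import numpy as np

def degree_of_loss(intensity, a=1.5, b=2.0):
    """Generic vulnerability function mapping process intensity
    (e.g. flow depth in m) to a degree of loss in [0, 1].
    Second-order rational form; coefficients are illustrative."""
    loss = (a * intensity**2) / (1.0 + b * intensity**2)
    return np.clip(loss, 0.0, 1.0)

depths = np.array([0.25, 0.5, 1.0, 2.0])   # flow depths in metres
damage = degree_of_loss(depths)            # fraction of object value lost
reconstruction_value = 250_000             # EUR, hypothetical building
print(dict(zip(depths, np.round(damage * reconstruction_value))))
```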

  14. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    OpenAIRE

    A. O. Domanov

    2014-01-01

    The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St. Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial chara...

  15. Quantitative analysis of sideband coupling in photoinduced force microscopy

    Science.gov (United States)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  16. Quantitative and comparative analysis of hyperspectral data fusion performance

    Institute of Scientific and Technical Information of China (English)

    王强; 张晔; 李硕; 沈毅

    2002-01-01

    Hyperspectral data fusion has become a key technique in hyperspectral data processing in recent years. Many fusion methods have been proposed, but little research has been done to evaluate the performance of different data fusion methods. In order to meet this urgent need, quantitative correlation analysis (QCA) is proposed to analyse and compare the performance of different fusion methods directly from the data before and after fusion. Experimental results show that the new method is effective and that the results of the comparison are in agreement with the results of application.

  17. Linkage of DNA Methylation Quantitative Trait Loci to Human Cancer Risk

    Directory of Open Access Journals (Sweden)

    Holger Heyn

    2014-04-01

    Full Text Available Epigenetic regulation and, in particular, DNA methylation have been linked to the underlying genetic sequence. DNA methylation quantitative trait loci (meQTL have been identified through significant associations between the genetic and epigenetic codes in physiological and pathological contexts. We propose that interrogating the interplay between polymorphic alleles and DNA methylation is a powerful method for improving our interpretation of risk alleles identified in genome-wide association studies that otherwise lack mechanistic explanation. We integrated patient cancer risk genotype data and genome-scale DNA methylation profiles of 3,649 primary human tumors, representing 13 solid cancer types. We provide a comprehensive meQTL catalog containing DNA methylation associations for 21% of interrogated cancer risk polymorphisms. Differentially methylated loci harbor previously reported and as-yet-unidentified cancer genes. We suggest that such regulation at the DNA level can provide a considerable amount of new information about the biology of cancer-risk alleles.

  18. Quantitative assessment of the risk of introduction of bovine viral diarrhea virus in Danish dairy herds

    DEFF Research Database (Denmark)

    Foddai, Alessandro; Boklund, Anette; Stockmarr, Anders

    2014-01-01

    A quantitative risk assessment was carried out to estimate the likelihood of introducing bovine viral diarrhea virus (BVDV) in Danish dairy herds per year and per trimester, respectively. The present study gives important information on the impact of risk mitigation measures and sources of uncertainty due to lack of data. As suggested in the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement), the OIE Terrestrial Animal Health Code was followed for a transparent science-based risk assessment. Data from 2010 on imports of live cattle, semen, and embryos, exports of live cattle, as well as use of vaccines were analyzed. Information regarding the application of biosecurity measures, by veterinarians and hoof trimmers practicing in Denmark and in other countries, was obtained by contacting several stakeholders, public institutions and experts. Stochastic scenario...

  19. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    Science.gov (United States)

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and pollen and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
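
    The proposed refinement amounts to scaling the exposure term of a standard risk quotient by a crop attractiveness factor derived from nectar sugar concentration. A minimal sketch of that arithmetic follows; the function name, the scaling rule and the numbers are hypothetical, since the paper proposes the concept rather than a fixed formula.

```python
def tier1_risk_quotient(exposure, toxicity, attractiveness_factor=1.0):
    """Tier I risk quotient with a crop attractiveness adjustment.

    exposure              : estimated exposure (e.g. ug a.i./bee)
    toxicity              : toxicity endpoint (e.g. LD50, ug a.i./bee)
    attractiveness_factor : crop-specific scaling in (0, 1] derived
                            from nectar sugar concentration (hypothetical)
    """
    return (exposure * attractiveness_factor) / toxicity

# hypothetical: a weakly attractive crop halves the effective exposure
print(tier1_risk_quotient(exposure=0.8, toxicity=4.0, attractiveness_factor=0.5))
```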

  20. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    Science.gov (United States)

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations, in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
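
    The separation mc2d implements, variability in one dimension and uncertainty in the other, can be illustrated without R. Below is a minimal Python sketch of a second-order Monte Carlo simulation: each row fixes one draw of an uncertain parameter, each column one draw of the variable quantity; all distributions and parameters are placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
n_unc, n_var = 200, 1000   # uncertainty draws x variability draws

# uncertainty dimension: dose-response parameter r is imperfectly known
r = rng.lognormal(np.log(5e-7), 0.5, size=(n_unc, 1))

# variability dimension: dose varies from serving to serving
dose = rng.lognormal(3.0, 1.2, size=(1, n_var))

risk = 1.0 - np.exp(-r * dose)         # (n_unc, n_var) risk matrix

mean_risk = risk.mean(axis=1)          # variability integrated out per uncertainty draw
lo, hi = np.percentile(mean_risk, [2.5, 97.5])
print(f"mean risk {mean_risk.mean():.2e}, 95% uncertainty interval [{lo:.2e}, {hi:.2e}]")
```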

  1. Quantitative chemical analysis of ocular melanosomes in the TEM.

    Science.gov (United States)

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses. The material was mounted on Al grids and samples were analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with an ultrathin window. Melanosomes consist of C and O as main components; mole fractions are about 90 and 3-10 at.%, respectively, with small mole fraction ratios, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl, Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction ratio of less than 0.1 at.% was found, which gives the melanin its paramagnetic properties. Its mole fraction is, however, close to or below the minimum detectable mass fraction of the equipment used. Only in the human eye, and only in the retinal pigment epithelium (RPE), were the mole fractions of Zn (0.1 at.% or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eye the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The obtained results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium (RPE) of the three different species. The results of the chemical analysis are discussed by means of mole fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation. It indicates that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases. The chemical analysis also

  2. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    Science.gov (United States)

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved from using different soap formulations after diaper changing, using a quantitative microbial risk assessment approach. To achieve this, a probability-of-infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, which can be reduced by a further 20% by the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.

  3. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    Science.gov (United States)

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, the importance of lag-time models for E. coli O157:H7 in leafy greens, and validation of the importance of cross-contamination during the washing process.
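
    The quoted growth assumption (up to about 1 log CFU/day under optimal temperatures) is straightforward to encode. The sketch below uses a hypothetical linear-with-temperature rate purely for illustration; the abstract does not specify the growth model actually extracted for the assessment.

```python
def log_increase(temp_c, hours, t_min=5.0, rate_at_30c=1.0):
    """Hypothetical growth of E. coli O157:H7 on cut leafy greens.

    Linear interpolation: no growth at or below t_min, rising to
    rate_at_30c (log CFU/day) at 30 degrees C. Illustrative only."""
    rate_per_day = rate_at_30c * max(0.0, temp_c - t_min) / (30.0 - t_min)
    return rate_per_day * hours / 24.0

n0 = -1.0                                 # log CFU/g in the field
n_retail = n0 + log_increase(12.0, 48.0)  # two days of mild abuse at 12 C
print(f"predicted level at retail: {n_retail:.2f} log CFU/g")
```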

  4. Risk Analysis in Road Tunnels – Most Important Risk Indicators

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Thöns, Sebastian

    2016-01-01

    the effects and highlights the most important risk indicators with the aim to support further developments in risk analysis. Therefore, a system model of a road tunnel was developed to determine the risk measures. The system model can be divided into three parts: the fire part, connected to the fire model Fire Dynamics Simulator (FDS); the evacuation part, connected to the evacuation model FDS+Evac; and the frequency part, connected to a model that calculates the frequency of fires. This study shows that the parts of the system model (and their most important risk indicators) affect the risk measures in the following... Further research can focus on these most important risk indicators with the aim to optimise risk analysis.

  5. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K, obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  6. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.

  7. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    Science.gov (United States)

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to evaluate the interest of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters objectified adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2 cases). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  8. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as computational fluid dynamics (CFD) can resolve the previous limitations; however, it cannot resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only be limited to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.

  9. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  10. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  11. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of an object because it can present the content of the constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was solved successfully with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about the material evidence, which could point out the direction for case detection and court proceedings.
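
    For reference, the standard relation used to turn refined Rietveld scale factors into weight fractions (the Hill-Howard formula) is given below; the abstract does not spell it out, so this is supplied from the general Rietveld QPA literature.

```latex
% Weight fraction of phase p from Rietveld scale factors S_i, where
% Z = formula units per unit cell, M = formula mass, V = cell volume.
\begin{equation}
  W_p = \frac{S_p \, (Z M V)_p}{\sum_{i=1}^{n} S_i \, (Z M V)_i}
\end{equation}
```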

  12. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    Science.gov (United States)

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in a 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out with a 5% SE-30 on Chromosorb AW DMCS glass column (2 m x 3 mm i.d.) at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard is di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min, respectively. This method has a recovery of 98.62%-100.77%, and the coefficients of variation of the analysis of butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57%, respectively. All coefficients of linear correlation were higher than 0.999.
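
    The internal standard method used here rests on one relation: the analyte concentration follows from the analyte-to-standard peak-area ratio through a response factor determined during calibration. A generic statement (not the paper's own notation) is:

```latex
% Internal standard quantitation: concentration C_a of the analyte from
% peak areas A and a relative response factor F measured with standards.
\begin{equation}
  C_a = \frac{A_a}{A_{is}} \cdot \frac{C_{is}}{F},
  \qquad
  F = \frac{A_a^{\,std} / A_{is}^{\,std}}{C_a^{\,std} / C_{is}^{\,std}}
\end{equation}
```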

  14. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    Science.gov (United States)

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Owing to the drawbacks of chromatography for the online monitoring of transformer dissolved gases, namely the need for a carrier gas and regular calibration, and its low safety, an attempt was made to establish a dissolved gas analysis system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the many components, the detection limits and safety requirements, and the difficulty for the degasser to exclude interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature variable extraction algorithm using improved TR regularization. For the characteristic gases CH4, C2H6, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm(-1) and an optical path of 10 cm.

  15. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be presented in rolled-up form to support high-level strategy decisions.

  16. Risk Analysis for Tea Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is obvious that after all the disasters with dioxins, BSE, pathogens, Foot and Mouth disease and others, and now, shortly, because of the possibilities of bioterrorism, Food Safety is almost at the top of the agenda of the EU for the years to come. The implementation of certain hygiene principles such as HACCP and a transparent hygiene policy applicable to all food and all food operators, from the farm to the table, together with effective instruments to manage Food Safety, will form a substantial part of this agenda. As an example, external quality factors such as certain pathogens in tea will be discussed. Since risk analysis of e.g. mycotoxins already has a quite long history and development in several international bodies, and tea might bear unwanted (or deliberately added by terroristic action) contaminants, the need to monitor tea much more regularly than is being done today seems to be a "conditio sine qua non". Recently developed Immuno Flow tests may one day help the consumer perhaps to find out if he gets poisoned.

  17. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended
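
    Of the morphometric tools listed, the hypsometric integral is the most compact to compute: it reduces a basin's elevation distribution to a single number, with high values suggesting youthful, tectonically active relief. A minimal sketch using the standard elevation-relief ratio approximation follows; the DEM array is synthetic.

```python
import numpy as np

def hypsometric_integral(elevations):
    """Elevation-relief ratio approximation of the hypsometric
    integral: HI = (mean - min) / (max - min). Values near 1 suggest
    youthful, recently uplifted basins; values near 0, eroded ones."""
    e = np.asarray(elevations, dtype=float)
    return (e.mean() - e.min()) / (e.max() - e.min())

# synthetic DEM extract for one drainage basin (elevations in m)
rng = np.random.default_rng(3)
dem = rng.normal(420.0, 80.0, size=(200, 200)).clip(200.0, 700.0)
print(f"HI = {hypsometric_integral(dem):.2f}")
```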

  18. Local scale multiple quantitative risk assessment and uncertainty evaluation in a densely urbanised area (Brescia, Italy

    Directory of Open Access Journals (Sweden)

    S. Lari

    2012-11-01

    Full Text Available The study of the interactions between natural and anthropogenic risks is necessary for quantitative risk assessment in areas affected by active natural processes, high population density and strong economic activities.

    We present a multiple quantitative risk assessment on a 420 km2 high-risk area (Brescia and surroundings, Lombardy, Northern Italy) for flood, seismic and industrial accident scenarios. Expected annual economic losses are quantified for each scenario, and annual exceedance probability-loss curves are calculated. Uncertainty on the input variables is propagated by means of three different methodologies: Monte Carlo simulation, first-order second-moment, and point estimate.

    Expected losses calculated by means of the three approaches show similar values for the whole study area, about 64 000 000 € for earthquakes, about 10 000 000 € for floods, and about 3000 € for industrial accidents. Locally, expected losses assume quite different values if calculated with the three different approaches, with differences up to 19%.

    The uncertainties on the expected losses and their propagation, performed with the three methods, are compared and discussed in the paper. In some cases, uncertainty reaches significant values (up to almost 50% of the expected loss). This underlines the necessity of including uncertainty in quantitative risk assessment, especially when it is used as a support for territorial planning and decision making. The method is developed with a view to possible application at a regional to national scale, on the basis of data available in Italy over the national territory.
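
    The two quantities at the heart of the assessment, expected annual loss and the annual exceedance probability-loss curve, can be sketched in a few lines. The scenario losses and frequencies below are invented and deliberately unrelated to the study's figures.

```python
import numpy as np

# purely illustrative scenario losses (EUR) and annual occurrence probabilities
losses = np.array([50_000.0, 2_000_000.0, 30_000_000.0])
annual_prob = np.array([0.05, 0.01, 0.002])

# expected annual loss: probability-weighted sum over scenarios
eal = np.sum(losses * annual_prob)
print(f"expected annual loss: {eal:,.0f} EUR")

# annual exceedance probability-loss curve: P(loss >= x)
order = np.argsort(losses)
exceed = np.cumsum(annual_prob[order][::-1])[::-1]
for x, p in zip(losses[order], exceed):
    print(f"P(loss >= {x:>12,.0f} EUR) ~ {p:.3f}")
```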

  19. Glioblastoma multiforme: exploratory radiogenomic analysis by using quantitative image features.

    Science.gov (United States)

    Gevaert, Olivier; Mitchell, Lex A; Achrol, Achal S; Xu, Jiajing; Echegaray, Sebastian; Steinberg, Gary K; Cheshier, Samuel H; Napel, Sandy; Zaharchuk, Greg; Plevritis, Sylvia K

    2014-10-01

    To derive quantitative image features from magnetic resonance (MR) images that characterize the radiographic phenotype of glioblastoma multiforme (GBM) lesions and to create radiogenomic maps associating these features with various molecular data. Clinical, molecular, and MR imaging data for GBMs in 55 patients were obtained from the Cancer Genome Atlas and the Cancer Imaging Archive after local ethics committee and institutional review board approval. Regions of interest (ROIs) corresponding to enhancing necrotic portions of tumor and peritumoral edema were drawn, and quantitative image features were derived from these ROIs. Robust quantitative image features were defined on the basis of an intraclass correlation coefficient of 0.6 for a digital algorithmic modification and a test-retest analysis. The robust features were visualized by using hierarchic clustering and were correlated with survival by using Cox proportional hazards modeling. Next, these robust image features were correlated with manual radiologist annotations from the Visually Accessible Rembrandt Images (VASARI) feature set and GBM molecular subgroups by using nonparametric statistical tests. A bioinformatic algorithm was used to create gene expression modules, defined as a set of coexpressed genes together with a multivariate model of cancer driver genes predictive of the module's expression pattern. Modules were correlated with robust image features by using the Spearman correlation test to create radiogenomic maps and to link robust image features with molecular pathways. Eighteen image features passed the robustness analysis and were further analyzed for the three types of ROIs, for a total of 54 image features. Three enhancement features were significantly correlated with survival, 77 significant correlations were found between robust quantitative features and the VASARI feature set, and seven image features were correlated with molecular subgroups (P < .05 for all). A radiogenomics map was

  20. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    Sujay Rakshit; Arunita Rakshit; J. V. Patil

    2011-04-01

    Most traits of interest to medical, agricultural and animal scientists show continuous variation and a complex mode of inheritance. DNA-based markers are being deployed to analyse such complex traits, which are known as quantitative trait loci (QTL). In conventional QTL analysis, F2 and backcross populations, recombinant inbred lines, backcross inbred lines and double haploids from biparental crosses are commonly used. Introgression lines and near-isogenic lines are also being used for QTL analysis. However, such populations have major limitations, such as relying predominantly on the recombination events taking place in the F1 generation and mapping only the allelic pairs present in the two parents. Second-generation mapping resources like association mapping, nested association mapping and multiparent intercross populations potentially address these major limitations. The potential of multiparent intercross populations in gene mapping is discussed here. In such populations both linkage and association analysis can be conducted without encountering the limitations of structured populations. In such populations, larger genetic variation in the germplasm is accessed and various allelic and cytoplasmic interactions are assessed. For all practical purposes, across crop species, the use of eight founders and a fixed population of 1000 individuals is most appropriate. Limitations of multiparent intercross populations are that they require more time and resources to be generated and they are likely to show extensive segregation for developmental traits, limiting their use in the analysis of complex traits. However, multiparent intercross population resources are likely to bring a paradigm shift in QTL analysis in plant species.

  1. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    Science.gov (United States)

    Doerner, Peter

    2008-03-01

    INTRODUCTION: The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, which are taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.

  2. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor statistical power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm from interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions. In other words, we take a genome region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model that collectively tests interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of quantitative traits have the correct type I error rates and much greater power to detect interactions than current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10⁻¹⁰) in the ESP, and 11 were replicated in the CHARGE-S study.

  3. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
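
    As a rough illustration of the loss-function idea (not the authors' exact formulation), the sketch below evaluates a quadratic Taguchi-type loss for a process variable deviating from its target; the coefficient k, the setpoints, and the pressure trajectory are all hypothetical.

    ```python
    import numpy as np

    def taguchi_loss(x, target, k):
        """Quadratic (Taguchi-type) loss: cost grows with squared deviation."""
        return k * (x - target) ** 2

    # Hypothetical pressure trajectory (bar) drifting from its setpoint.
    pressure = np.array([10.0, 10.2, 10.8, 11.5, 12.1])
    target, k = 10.0, 250.0  # k converts squared deviation to dollars (assumed)

    losses = taguchi_loss(pressure, target, k)
    total_loss = np.trapz(losses)  # integrate loss over the deviation scenario
    print(f"per-sample losses: {losses}, integrated loss: {total_loss:.0f}")
    ```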

  4. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm{sup 2} and 5–10 mm{sup 2} and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DL{sub CO}/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DL{sub CO}/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than for COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.
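
    A minimal sketch of the kind of measurement involved (an assumed implementation, not the authors' pipeline): threshold a 2-D CT slice to a crude vessel mask, label connected components, and sum the areas of components below a size cutoff to obtain %CSA. The threshold and pixel area below are illustrative values.

    ```python
    import numpy as np
    from scipy import ndimage

    def percent_csa(slice_hu, lung_mask, vessel_hu=-500, max_area_mm2=5.0,
                    pixel_area_mm2=0.3):
        """Percentage of lung area occupied by small-vessel cross-sections."""
        vessels = (slice_hu > vessel_hu) & lung_mask      # crude vessel mask
        labels, n = ndimage.label(vessels)                # connected components
        areas = ndimage.sum(vessels, labels, range(1, n + 1)) * pixel_area_mm2
        small = np.isin(labels, np.nonzero(areas < max_area_mm2)[0] + 1)
        return 100.0 * small.sum() / lung_mask.sum()
    ```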

  5. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity improvement.

  6. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Software is available at http://github.com/piquelab/QuASAR.
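
    For intuition only: the core ASE test the abstract describes, counting reads per allele at a heterozygous site and testing a 1:1 ratio, reduces to a binomial test (QuASAR itself additionally models genotype uncertainty, base-call error, and over-dispersion). The read counts below are made up.

    ```python
    from scipy.stats import binomtest

    ref_reads, alt_reads = 84, 46          # hypothetical counts at a het site
    result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
    print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
          f"p = {result.pvalue:.3g}")      # small p suggests allele-specific expression
    ```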

  7. A Subjective Risk Analysis Approach of Container Supply Chains

    Institute of Scientific and Technical Information of China (English)

    Zai-Li Yang; Jin Wang; Steve Bonsall; Jian-Bo Yang; Quan-Gen Fang

    2005-01-01

    The 9/11 terrorist attacks, the lock-out of the American West Coast ports in 2002 and the outbreak of the SARS disease in 2003 have focused the minds of both the public and industrialists on taking effective and timely measures for assessing and controlling the risks related to container supply chains (CSCs). However, due to the complexity of the risks in the chains, conventional quantitative risk assessment (QRA) methods may not be capable of providing sufficient safety management information, since achieving such functionality requires conducting risk analysis in view of the challenges and uncertainties posed by the unavailability and incompleteness of historical failure data. Combining fuzzy set theory (FST) and an evidential reasoning (ER) approach, the paper presents a subjective method to deal with vulnerability-based risks, which are more ubiquitous and uncertain than the traditional hazard-based ones in the chains.

  8. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    Science.gov (United States)

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at the Sandvika recreational beaches, Norway, by combining hydrodynamic modelling with quantitative microbial risk assessment (QMRA) and published dose-response relationships. Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used to simulate the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA, and the public health risk was estimated as the probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at the Kalvøya-small and Kalvøya-big beaches, supporting the advice to avoid swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions.
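
    A toy version of the QMRA step (illustrative parameters, not the study's calibrated values): convert a simulated pathogen concentration and an assumed ingestion volume into a dose, then apply an exponential dose-response model.

    ```python
    import numpy as np

    def infection_prob(conc_per_L, volume_mL, r):
        """Exponential dose-response model: P(inf) = 1 - exp(-r * dose)."""
        dose = conc_per_L * volume_mL / 1000.0
        return 1.0 - np.exp(-r * dose)

    # Hypothetical simulated norovirus concentrations on days 1-3 after rainfall
    conc = np.array([120.0, 35.0, 8.0])              # particles per litre (assumed)
    p = infection_prob(conc, volume_mL=50.0, r=0.1)  # 50 mL swallowed per bathe (assumed)
    for day, prob in enumerate(p, 1):
        print(f"day {day}: P(infection) = {prob:.3f}")
    ```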

  9. Quantitative modeling and data analysis of SELEX experiments

    Science.gov (United States)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure.
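
    The biophysical picture the authors build on can be sketched with standard two-state binding statistics (the energies, pool composition, and chemical potential below are arbitrary illustrative numbers): the probability that a sequence with binding energy E survives a selection round at fixed chemical potential mu takes a Fermi-Dirac form, and repeated rounds enrich low-energy sequences.

    ```python
    import numpy as np

    def selection_prob(energy, mu):
        """Fermi-Dirac selection probability for binding energy E at potential mu."""
        return 1.0 / (1.0 + np.exp(energy - mu))   # energies in units of kT

    energies = np.array([-2.0, 0.0, 2.0])   # three hypothetical oligomer classes
    freqs = np.array([0.2, 0.3, 0.5])       # initial pool composition

    mu = -1.0                                # held fixed across rounds, per the paper's proposal
    for rnd in range(1, 4):
        freqs = freqs * selection_prob(energies, mu)
        freqs /= freqs.sum()                 # renormalize the surviving pool
        print(f"round {rnd}: {np.round(freqs, 3)}")
    ```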

  10. Quantitative analysis of multiple sclerosis: a feasibility study

    Science.gov (United States)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data are modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  11. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    Science.gov (United States)

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. This value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph on which the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as the measuring instrument, and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and readily accessible worldwide. This can be especially advantageous for countries where available resources for investment in analytical equipment are scarce. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation.
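
    The quantitation step amounts to ordinary external calibration. A minimal sketch with made-up signals: fit a line to the standards, then interpolate the unknown.

    ```python
    import numpy as np

    # Hypothetical colour-channel signals from the digital image of each standard
    std_conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # mg/L Ni standards (assumed)
    std_signal = np.array([12.1, 24.5, 48.7, 99.2, 196.4])

    slope, intercept = np.polyfit(std_conc, std_signal, 1)  # linear calibration fit
    sample_signal = 80.3                                    # meteorite solution (assumed)
    sample_conc = (sample_signal - intercept) / slope       # interpolate the unknown
    print(f"Ni in sample solution: {sample_conc:.2f} mg/L")
    ```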

  12. Quantitative analysis of tumor burden in mouse lung via MRI.

    Science.gov (United States)

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
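
    The burden metric itself is simple once the lungs are segmented: tumors raise the mean signal of the normally dark, air-filled lung. A sketch of the measurement on hypothetical arrays (the real pipeline's parametric lung model and registration are not reproduced here):

    ```python
    import numpy as np

    def tumor_burden_metric(image, lung_mask):
        """Mean image intensity inside the segmented lungs.

        Healthy lung is mostly air and appears dark in MRI, so increasing
        tumor burden raises this average intensity.
        """
        return float(image[lung_mask].mean())

    rng = np.random.default_rng(0)
    img = rng.random((128, 128))            # stand-in for a respiratory-gated MR slice
    mask = np.zeros((128, 128), bool)
    mask[32:96, 20:60] = True               # stand-in for the lung segmentation
    print(f"burden metric: {tumor_burden_metric(img, mask):.3f}")
    ```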

  13. Risk analysis and emergency management of ammonia installations

    NARCIS (Netherlands)

    Ham, J.M.; Gansevoort, J.

    1992-01-01

    The use of Quantitative Risk Assessment has been increasing for evaluating the risk of handling hazardous materials and land-use planning. This article reports on several studies carried out on the risk of handling, storage and transport of ammonia.

  15. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens

    DEFF Research Database (Denmark)

    Rosenquist, Hanne; Nielsen, N. L.; Sommer, Helle Mølgaard

    2003-01-01

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption … covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules …

  16. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    Science.gov (United States)

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms that are transmitted by the fecal-oral route, are zoonotic, and are prevalent in all socioeconomic segments, with greater emphasis in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis for Potam dwellers consuming drinking water from communal wells. To achieve this goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by the information collection rule method, (b) assessment of exposure of healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate mean annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish an educational and treatment program to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico.
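
    A sketch of how a per-exposure probability turns into annual risks of the magnitude quoted above. The dose is assumed; r = 0.0199 is a commonly cited exponential dose-response parameter for Giardia in the QMRA literature, used here only for illustration.

    ```python
    import numpy as np

    def single_exposure_risk(dose, r):
        """Exponential dose-response model."""
        return 1.0 - np.exp(-r * dose)

    def annual_risk(p_single, exposures_per_year=365):
        """Aggregate assuming independent daily exposures."""
        return 1.0 - (1.0 - p_single) ** exposures_per_year

    p1 = single_exposure_risk(dose=0.8, r=0.0199)   # daily dose of cysts (assumed)
    print(f"single exposure: {p1:.4f}, annual: {annual_risk(p1):.3f}")
    ```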

  17. The Impact of Arithmetic Skills on Mastery of Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bruce K. Blaylock

    2012-01-01

    Over the past several years, math education has moved from a period in which all calculations were done by hand to an era in which most calculations are done using a calculator or computer. There are certainly benefits to this approach, but when one also notes the declining scores on national standardized mathematics exams, it raises the question: could the lack of by-hand arithmetic manipulation skills carry over to the understanding of higher-level mathematical concepts, or is it just a spurious correlation? Eighty-seven students were tested for their ability to do simple arithmetic and algebra by hand. These scores were then regressed on three important areas of quantitative analysis: recognizing the appropriate tool to use in an analysis, creating a model to carry out the analysis, and interpreting the results of the analysis. The study revealed a significant relationship between the ability to do arithmetic calculations accurately and the abilities to recognize the appropriate tool and to create a model. It found no significant relationship between results interpretation and arithmetic skills.

  18. Analysis of generalized interictal discharges using quantitative EEG.

    Science.gov (United States)

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of the absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges, and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  19. Quantitative approaches for health risk assessment of environmental pollutants : estimation of differences in sensitivity, relative potencies, and margins of exposure

    OpenAIRE

    Kalantari, Fereshteh

    2012-01-01

    Historically, quantitative health risk assessment of chemical substances has been based on deterministic approaches. For a more realistic and informative health risk assessment, however, the variability and uncertainty inherent in measurements should be taken into consideration. The variability and uncertainty may be assessed by applying probabilistic methods when performing dose-response assessment, exposure assessment and risk characterization. The benchmark dose (BMD) method has been …

  20. Risk factors for retained surgical items: a meta-analysis and proposed risk stratification system.

    Science.gov (United States)

    Moffatt-Bruce, Susan D; Cook, Charles H; Steinberg, Steven M; Stawicki, Stanislaw P

    2014-08-01

    Retained surgical items (RSI) are designated as completely preventable "never events". Despite numerous case reports, clinical series, and expert opinions, few studies provide quantitative insight into RSI risk factors and their relative contributions to the overall RSI risk profile. Existing case-control studies lack the ability to reliably detect clinically important differences within the long list of proposed risks. This meta-analysis examines the best available data for RSI risk factors, seeking to provide a clinically relevant risk stratification system. Nineteen candidate studies were considered for this meta-analysis. Three retrospective, case-control studies of RSI-related risk factors contained suitable group comparisons between patients with and without RSI, thus qualifying for further analysis. Comprehensive Meta-Analysis 2.0 (BioStat, Inc, Englewood, NJ) software was used to analyze the following "common factor" variables compiled from the above studies: body-mass index, emergency procedure, estimated operative blood loss >500 mL, incorrect surgical count, lack of surgical count, >1 subprocedure, >1 surgical team, nursing staff shift change, operation "afterhours" (i.e., between 5 PM and 7 AM), operative time, trainee presence, and unexpected intraoperative factors. We further stratified the resulting RSI risk factors into low, intermediate, and high risk. Although only three to six risk factors were associated with increased RSI risk in each of the three studies, our analysis of the pooled data demonstrates that seven risk factors are significantly associated with increased RSI risk. Variables found to elevate the RSI risk include intraoperative blood loss >500 mL (odds ratio [OR] 1.6); duration of operation (OR 1.7); >1 subprocedure (OR 2.1); lack of surgical counts (OR 2.5); >1 surgical team (OR 3.0); unexpected intraoperative factors (OR 3.4); and incorrect surgical count (OR 6.1). Changes in nursing staff, emergency surgery, body
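
    One way such pooled odds ratios become a bedside stratification tool (a hypothetical scoring scheme for illustration; the paper's own low/intermediate/high cut-offs are not reproduced here) is to sum log-odds-ratio points for the factors present in a case:

    ```python
    import math

    # Pooled odds ratios reported in the abstract
    odds_ratios = {
        "blood loss > 500 mL": 1.6,
        "long operation": 1.7,
        "> 1 subprocedure": 2.1,
        "no surgical count": 2.5,
        "> 1 surgical team": 3.0,
        "unexpected intraoperative factors": 3.4,
        "incorrect surgical count": 6.1,
    }

    def rsi_score(present):
        """Sum of log-ORs for the risk factors present in this case."""
        return sum(math.log(odds_ratios[f]) for f in present)

    case = ["blood loss > 500 mL", "incorrect surgical count"]
    print(f"score = {rsi_score(case):.2f}")   # higher score -> higher RSI risk
    ```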

  1. Revisiting support optimization at the Driskos tunnel using a quantitative risk approach

    Directory of Open Access Journals (Sweden)

    J. Connor Langford

    2016-04-01

    With the scale and cost of geotechnical engineering projects increasing rapidly over the past few decades, there is a clear need for the careful consideration of calculated risks in design. While risk is typically dealt with subjectively through the use of conservative design parameters, with the advent of reliability-based methods, this no longer needs to be the case. Instead, a quantitative risk approach can be considered that incorporates uncertainty in ground conditions directly into the design process to determine the variable ground response and support loads. This allows for the optimization of support on the basis of both worker safety and economic risk. This paper presents the application of such an approach to review the design of the initial lining system along a section of the Driskos twin tunnels as part of the Egnatia Odos highway in northern Greece. Along this section of tunnel, weak rock masses were encountered as well as high in situ stress conditions, which led to excessive deformations and failure of the as built temporary support. Monitoring data were used to validate the rock mass parameters selected in this area and a risk approach was used to determine, in hindsight, the most appropriate support category with respect to the cost of installation and expected cost of failure. Different construction sequences were also considered in the context of both convenience and risk cost.

  2. A quantitative test of the cultural theory of risk perceptions: comparison with the psychometric paradigm.

    Science.gov (United States)

    Marris, C; Langford, I H; O'Riordan, T

    1998-10-01

    This paper seeks to compare two frameworks which have been proposed to explain risk perceptions, namely, cultural theory and the psychometric paradigm. A structured questionnaire which incorporated elements from both approaches was administered to 129 residents of Norwich, England. The qualitative risk characteristics generated by the psychometric paradigm explained a far greater proportion of the variance in risk perceptions than cultural biases, though it should be borne in mind that the qualitative characteristics refer directly to risks whereas cultural biases are much more distant variables. Correlations between cultural biases and risk perceptions were very low, but the key point was that each cultural bias was associated with concern about distinct types of risks and that the pattern of responses was compatible with that predicted by cultural theory. The cultural approach also provided indicators for underlying beliefs regarding trust and the environment; beliefs which were consistent within each world view but divergent between them. An important drawback, however, was that the psychometric questionnaire could only allocate 32% of the respondents unequivocally to one of the four cultural types. The rest of the sample expressed several cultural biases simultaneously, or none at all. Cultural biases are therefore probably best interpreted as four extreme world views, and a mixture of qualitative and quantitative research methodologies would generate better insights into who might defend these views in what circumstances, whether there are only four mutually exclusive world views or not, and how these views are related to patterns of social solidarity, and judgments on institutional trust.

  3. Quantitative assessment of the risk of lung cancer associated with occupational exposure to refractory ceramic fibers.

    Science.gov (United States)

    Moolgavkar, S H; Luebeck, E G; Turim, J; Hanna, L

    1999-08-01

    We present the results of a quantitative assessment of the lung cancer risk associated with occupational exposure to refractory ceramic fibers (RCF). The primary sources of data for our risk assessment were two long-term oncogenicity studies in male Fischer rats conducted to assess the potential pathogenic effects associated with prolonged inhalation of RCF. An interesting feature of the data was the availability of the temporal profile of fiber burden in the lungs of experimental animals. Because of this information, we were able to conduct both exposure-response and dose-response analyses. Our risk assessment was conducted within the framework of a biologically based model for carcinogenesis, the two-stage clonal expansion model, which allows for the explicit incorporation of the concepts of initiation and promotion in the analyses. We found that a model positing that RCF was an initiator had the highest likelihood. We proposed an approach based on biological considerations for the extrapolation of risk to humans. This approach requires estimation of human lung burdens for specific exposure scenarios, which we did by using an extension of a model due to Yu. Our approach acknowledges that the risk associated with exposure to RCF depends on exposure to other lung carcinogens. We present estimates of risk in two populations: (1) a population of nonsmokers and (2) an occupational cohort of steelworkers not exposed to coke oven emissions, a mixed population that includes both smokers and nonsmokers.

  4. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    Science.gov (United States)

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops are central priorities of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens ( spp., spp., and O157:H7) on both the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance is provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health.

  5. Quantitative microstructure analysis of polymer-modified mortars.

    Science.gov (United States)

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed, including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging-related topics are discussed. As a case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages, ranging from the early fresh mortar to the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  6. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    A comparison of quantitative methods for ozone analysis, spectrophotometric versus volumetric, has been studied. The aim of this research is to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/minute. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results of this research show that both the concentration and the volume of the reagent affect the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.

  7. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  8. Quantitative image analysis of WE43-T6 cracking behavior

    Science.gov (United States)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  9. qfasar: quantitative fatty acid signature analysis with R

    Science.gov (United States)

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.

  10. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  11. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  12. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    Adipocytes are not only central players in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as proteins secreted by 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  13. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin

    2007-01-01

    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was used to weight the risk factors, and integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.
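
    For readers unfamiliar with the analytic hierarchy process step, a minimal sketch (a toy 3-factor pairwise-comparison matrix, not the paper's actual judgments): factor weights are taken from the principal eigenvector of the reciprocal comparison matrix.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons for three risk factors
    # (entry [i, j] = how much more important factor i is than factor j).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)          # dominant eigenvalue
    weights = np.abs(eigvecs[:, principal].real)
    weights /= weights.sum()                     # normalize weights to sum to 1
    print(np.round(weights, 3))                  # approximately [0.65, 0.23, 0.12]
    ```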

  14. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates pept

  15. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
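
    A sketch of the per-pixel curve analysis described, on a synthetic time-intensity profile; the metrics below (peak enhancement, maximum upslope, time to peak) are common first-pass perfusion parameters, assumed here rather than taken from the paper.

    ```python
    import numpy as np

    def perfusion_params(times, intensities):
        """Basic first-pass metrics from one myocardial time-intensity profile."""
        baseline = intensities[:3].mean()                  # pre-contrast signal level
        peak_idx = int(np.argmax(intensities))
        peak = intensities[peak_idx] - baseline
        upslope = np.max(np.gradient(intensities, times))  # maximum wash-in rate
        time_to_peak = times[peak_idx] - times[0]
        return {"peak": peak, "upslope": upslope, "time_to_peak": time_to_peak}

    t = np.arange(0, 30, 1.0)                      # one frame per heartbeat (s)
    signal = 100 + 80 / (1 + np.exp(-(t - 12)))    # synthetic wash-in curve
    print(perfusion_params(t, signal))
    ```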

  16. QTL analysis for some quantitative traits in bread wheat

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits utilizing data from four different mapping populations and involving different approaches to QTL analysis. Analysis of grain protein content (GPC) suggested that the major part of the genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main-effect QTL (M-QTL), with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL detected by different methods for the four growth-related traits taken together ranged from 37 to 40; nine QTL that were detected by both single-locus and two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield-contributing traits in two populations allowed detection of 25 and 50 QTL, respectively, by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM), and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker-aided selection.

  17. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    Science.gov (United States)

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  18. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    Directory of Open Access Journals (Sweden)

    Metzenberg Stan

    2010-04-01

    Background: Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results: We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions: This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of the absolute initial copy number of template, and controlled tests show it generates accurate results.

  19. Dynamic Blowout Risk Analysis Using Loss Functions.

    Science.gov (United States)

    Abimbola, Majeed; Khan, Faisal

    2017-08-11

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout.

  20. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), generates components with sparse loadings; used in conjunction with Hotelling's T2 statistical analysis, it allows us to compare, qualify, and detect faults in the tested systems.

  1. Quantitative Risk Assessment of CO2 Sequestration in a commercial-scale EOR Site

    Science.gov (United States)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and the economic opportunities of associated oil production. Probably the most significant source of uncertainty in CO2 storage forecasts is the heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of the risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After performing stepwise regression analysis, a series of response surface models of the output variables at each step were constructed and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for the FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of the dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at 95th percentile after 5 years are about 3.65 times, and 1
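
    A minimal sketch of the response-surface workflow described here: fit a quadratic proxy to a handful of "simulator" runs, check goodness of fit, then Monte Carlo sample the cheap proxy to build output CDFs. The two inputs, coefficients and ranges are synthetic stand-ins, not FWU values.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical "reservoir simulation" results: oil production as a
        # function of permeability k and WAG time ratio w (synthetic data).
        k = rng.uniform(10, 200, 25)            # mD
        w = rng.uniform(0.5, 2.0, 25)
        oil = 3.0 + 0.02 * k + 1.5 * w - 0.004 * k * w + rng.normal(0, 0.1, 25)

        # Quadratic response-surface (proxy) model fitted by least squares
        X = np.column_stack([np.ones_like(k), k, w, k * w, k**2, w**2])
        beta, *_ = np.linalg.lstsq(X, oil, rcond=None)
        pred = X @ beta
        r2 = 1 - np.sum((oil - pred) ** 2) / np.sum((oil - oil.mean()) ** 2)
        print(f"proxy R^2 = {r2:.3f}")    # the paper requires R^2 > 0.9

        # Monte Carlo through the cheap proxy to build an output CDF
        ks = rng.uniform(10, 200, 100_000)
        ws = rng.uniform(0.5, 2.0, 100_000)
        Xs = np.column_stack([np.ones_like(ks), ks, ws, ks * ws, ks**2, ws**2])
        sims = Xs @ beta
        print("P50 =", np.percentile(sims, 50), " P95 =", np.percentile(sims, 95))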

  2. Quantitative Deficits of Preschool Children at Risk for Mathematical Learning Disability

    Directory of Open Access Journals (Sweden)

    Felicia W. Chu

    2013-05-01

    Full Text Available The study tested the hypothesis that acuity of the potentially inherent approximate number system (ANS) contributes to risk of mathematical learning disability (MLD). Sixty-eight (35 boys) preschoolers at risk for school failure were assessed on a battery of quantitative tasks, and on intelligence, executive control, preliteracy skills, and parental education. Mathematics achievement scores at the end of one year of preschool indicated that 34 of these children were at high risk for MLD. Relative to the 34 typically achieving children, the at risk children were less accurate on the ANS task, and a one standard deviation deficit on this task resulted in a 2.4-fold increase in the odds of MLD status. The at risk children also had a poor understanding of ordinal relations, and had slower learning of Arabic numerals, number words, and their cardinal values. Poor performance on these tasks resulted in 3.6- to 4.5-fold increases in the odds of MLD status. The results provide some support for the ANS hypothesis but also suggest these deficits are not the primary source of poor mathematics learning.

  3. Quantitative deficits of preschool children at risk for mathematical learning disability.

    Science.gov (United States)

    Chu, Felicia W; Vanmarle, Kristy; Geary, David C

    2013-01-01

    The study tested the hypothesis that acuity of the potentially inherent approximate number system (ANS) contributes to risk of mathematical learning disability (MLD). Sixty-eight (35 boys) preschoolers at risk for school failure were assessed on a battery of quantitative tasks, and on intelligence, executive control, preliteracy skills, and parental education. Mathematics achievement scores at the end of 1 year of preschool indicated that 34 of these children were at high risk for MLD. Relative to the 34 typically achieving children, the at risk children were less accurate on the ANS task, and a one standard deviation deficit on this task resulted in a 2.4-fold increase in the odds of MLD status. The at risk children also had a poor understanding of ordinal relations, and had slower learning of Arabic numerals, number words, and their cardinal values. Poor performance on these tasks resulted in 3.6- to 4.5-fold increases in the odds of MLD status. The results provide some support for the ANS hypothesis but also suggest these deficits are not the primary source of poor mathematics learning.
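
    The reported odds ratios translate directly into logistic-regression terms: a 2.4-fold increase per one standard deviation corresponds to a coefficient of ln(2.4) on a z-scored predictor. A small sketch (the baseline odds value is hypothetical):

        import math

        # In a logistic regression of MLD status on a z-scored predictor,
        # the odds ratio per one-standard-deviation change is exp(beta).
        or_ans = 2.4                # reported odds ratio for the ANS task
        beta = math.log(or_ans)     # implied coefficient, ~0.875

        base_odds = 0.5             # hypothetical baseline odds of MLD
        for sd_deficit in (0, 1, 2):
            odds = base_odds * math.exp(beta * sd_deficit)
            prob = odds / (1 + odds)
            print(f"{sd_deficit} SD deficit: odds={odds:.2f}, P(MLD)={prob:.2f}")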

  4. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    Science.gov (United States)

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen at a specific site are treated as the risk. The aggregate crash occurrence probability over all exposed vehicles is estimated based on the empirical Bayesian method. As for the consequences, crashes may not only cause direct losses (e.g., occupant injuries and property damage) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct and indirect losses are uniformly monetized and treated as the consequences of this risk. The potential cost of crashes, the criterion used to rank high-risk sites, can then be explicitly expressed as the crash probability summed over all passing vehicles, each weighted by the corresponding crash consequences. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society.
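
    A sketch of the computation described above: expected crash cost as crash probability summed over exposure, with indirect losses from a deterministic queuing ("delay triangle") model. All traffic volumes, costs and probabilities are hypothetical.

        # Deterministic queuing "triangle": a crash cuts capacity for t_inc
        # hours; the queue grows at (arrival - reduced capacity) and then
        # discharges at (capacity - arrival) after clearance.
        def total_delay_veh_h(arrival, capacity, reduced_capacity, t_inc):
            growth = arrival - reduced_capacity          # veh/h while blocked
            discharge = capacity - arrival               # veh/h afterwards
            if growth <= 0:
                return 0.0
            t_clear = t_inc * growth / discharge         # queue dissipation time
            peak_queue = growth * t_inc                  # vehicles
            return 0.5 * peak_queue * (t_inc + t_clear)  # area of the triangle

        # Expected crash cost at a site = sum over vehicles of P(crash) x losses
        p_crash_per_veh = 2e-7        # hypothetical EB-estimated probability
        aadt = 120_000                # vehicles/day
        direct_loss = 80_000          # $ per crash (injury + property)
        delay = total_delay_veh_h(arrival=4000, capacity=4400,
                                  reduced_capacity=2500, t_inc=0.5)
        indirect_loss = delay * 20    # $ at an assumed 20 $/veh-h value of time
        expected_cost = p_crash_per_veh * aadt * 365 * (direct_loss + indirect_loss)
        print(f"delay={delay:.0f} veh-h, expected annual cost=${expected_cost:,.0f}")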

  5. Risk Analysis of Telecom Enterprise Financing

    Institute of Scientific and Technical Information of China (English)

    YU Hua; SHU Hua-ying

    2005-01-01

    This paper investigates the causes of telecom enterprises' financing risks and methods for estimating them. Multi-mode financing gives telecom enterprises flexibility in raising capital and earning returns on the corresponding projects, but these financing modes also carry potential risks. After analyzing the categories and causes of telecom enterprises' financing risk, a method based on the Analytic Hierarchy Process (AHP) is put forward for estimating the financing risk. An example analysis illustrates the method and offers suggestions, providing ideas and a basis for telecom enterprises' financing decision-making.
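
    AHP reduces to computing the principal eigenvector of a pairwise comparison matrix and checking its consistency. A minimal sketch with a hypothetical 4x4 matrix of financing risks (the comparison values are invented for illustration):

        import numpy as np

        # Pairwise comparison matrix for four hypothetical financing risks
        # (interest-rate, exchange-rate, credit, liquidity), Saaty 1-9 scale.
        A = np.array([[1,   3,   5,   2],
                      [1/3, 1,   3,   1/2],
                      [1/5, 1/3, 1,   1/4],
                      [1/2, 2,   4,   1]], dtype=float)

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # priority weights of the risks

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
        ri = 0.90                             # random index for n = 4
        print("weights:", np.round(w, 3), " CR =", round(ci / ri, 3))
        # A consistency ratio CR < 0.1 is conventionally acceptable.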

  6. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
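
    The study's Methylation Index and binary cut-point reduce to a simple computation. The sketch below uses simulated per-CpG methylation values, not the study's data, to show the index, the >15% cut-point, and the resulting sensitivity and specificity.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical percent methylation per CpG site (rows = subjects,
        # columns = ~6 CpG sites); cases drawn more methylated than controls.
        cases = rng.normal(30, 12, (49, 6)).clip(0, 100)
        controls = rng.normal(5, 4, (22, 6)).clip(0, 100)

        def methylation_index(pct):   # mean % methylation across CpG sites
            return pct.mean(axis=1)

        mi_cases = methylation_index(cases)
        mi_controls = methylation_index(controls)

        cutoff = 15.0                 # binary cut-point used in the study
        sensitivity = np.mean(mi_cases > cutoff)
        specificity = np.mean(mi_controls <= cutoff)
        print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")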

  7. Quantitative analysis and parametric display of regional myocardial mechanics

    Science.gov (United States)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or dyskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping. The absolute values of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, the local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggests that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and dyskinesis associated with ischemia and infarction. Similarly, disturbances in
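
    The per-vertex excursion and velocity computations described here are straightforward once vertex trajectories are available. A sketch with synthetic trajectories (vertex count, units and cycle length are assumed, not taken from the study):

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical deformable model: V vertices tracked over 15 time
        # points through one cardiac cycle (x, y, z in mm).
        V, T = 500, 15
        verts = np.cumsum(rng.normal(0, 0.4, (T, V, 3)), axis=0) + 40.0
        dt = 0.8 / (T - 1)                  # s, assuming a ~0.8 s cycle

        # Per-vertex displacement and velocity over successive intervals
        step = np.linalg.norm(np.diff(verts, axis=0), axis=2)  # (T-1, V) mm
        velocity = step / dt                                   # mm/s
        max_vel = velocity.max(axis=0)      # regional maximum velocity
        t_max = velocity.argmax(axis=0)     # interval of the maximum
                                            # (would be color-mapped)
        excursion = step.sum(axis=0)        # total path length over the cycle
        print(f"median max velocity {np.median(max_vel):.1f} mm/s, "
              f"median excursion {np.median(excursion):.1f} mm")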

  8. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    Science.gov (United States)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system is a combination of a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed. Both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement on the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all steps necessary to extract from the PA raw data the significant information required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms, regarding their influence on lateral resolution as well as on the signal-to-noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection and the classical delay-and-sum approach which is implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source, combining diode and flash lamp pumping. Compared to all-flash-lamp pumped systems it features a significantly higher
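
    Of the four reconstruction algorithms compared, delay-and-sum is the simplest to sketch: each image pixel accumulates the sensor samples at the corresponding times of flight. The array geometry, sampling rate and pulse below are synthetic, not the OPUS hardware parameters.

        import numpy as np

        def delay_and_sum(traces, fs, c, sensor_xy, grid_xy):
            """Classical delay-and-sum reconstruction of an optoacoustic image.

            traces:    (n_sensors, n_samples) RF data
            sensor_xy: (n_sensors, 2) element positions [m]
            grid_xy:   (n_pixels, 2) pixel positions [m]
            """
            n_sensors, n_samples = traces.shape
            image = np.zeros(len(grid_xy))
            for s in range(n_sensors):
                dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)
                idx = np.round(dist / c * fs).astype(int)  # time-of-flight sample
                valid = idx < n_samples
                image[valid] += traces[s, idx[valid]]
            return image / n_sensors

        # Tiny synthetic test: one absorber at (0, 20 mm), 64-element array
        fs, c = 40e6, 1540.0
        sensors = np.column_stack([np.linspace(-0.019, 0.019, 64), np.zeros(64)])
        src = np.array([0.0, 0.02])
        t = np.arange(2048) / fs
        traces = np.stack([np.exp(-((t - np.linalg.norm(src - s) / c) * 4e6) ** 2)
                           for s in sensors])
        grid = np.array([[0.0, 0.02], [0.005, 0.02], [0.0, 0.01]])
        print(np.round(delay_and_sum(traces, fs, c, sensors, grid), 3))
        # The first pixel (at the absorber) should show the strongest value.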

  9. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Science.gov (United States)

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  10. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Science.gov (United States)

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  11. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  12. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    Science.gov (United States)

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, relevance to disease pathogenesis and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none for another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved and their inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, 7 specifically bind TDP-43-linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons during the depletion of TDP-43 but not FUS, providing new insight into RNA-processing dysfunction as a cause or consequence in ALS/FTD.

  13. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
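
    The "branching index" is branch points per unit area. A simplified stand-in computation on a binary skeleton, not AngioTool's actual implementation, can be written as:

        import numpy as np

        def branching_index(skeleton):
            """Branch points per unit area on a binary vessel skeleton.
            A skeleton pixel with >= 3 skeleton neighbours (8-connectivity)
            is treated as a branch point. Simplified stand-in, not
            AngioTool's actual algorithm."""
            s = np.pad(skeleton.astype(int), 1)
            neighbours = sum(np.roll(np.roll(s, i, 0), j, 1)
                             for i in (-1, 0, 1) for j in (-1, 0, 1)
                             if (i, j) != (0, 0))[1:-1, 1:-1]
            branch_points = np.logical_and(skeleton, neighbours >= 3).sum()
            return branch_points, branch_points / skeleton.size

        # Toy skeleton: a cross; branch points cluster at the intersection
        sk = np.zeros((11, 11), bool)
        sk[5, :] = True
        sk[:, 5] = True
        n_bp, bi = branching_index(sk)
        print(f"branch points: {n_bp}, branching index: {bi:.4f}")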

  14. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  15. Correlation between two methods of florbetapir PET quantitative analysis.

    Science.gov (United States)

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid positive and amyloid negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
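
    The reported regression makes the cutoff equivalence easy to verify: applying the fitted line to the research-method cutoff of 1.1 returns nearly the same value, which is why the commercial cutoff also lands at 1.1.

        # Regression relating the two florbetapir SUVr pipelines, as
        # reported in the study: commercial = 0.9757 * research + 0.0299
        def research_to_commercial(suvr_research):
            return 0.9757 * suvr_research + 0.0299

        # The research-method positivity cutoff of 1.1 maps to ~1.103,
        # consistent with the reported commercial cutoff of 1.1.
        print(research_to_commercial(1.1))   # 1.10317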

  16. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    Science.gov (United States)

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A receiver operating characteristic analysis in which fractal dimension was used to predict the variable that splits maturation stages into ABC and D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
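
    Fractal dimension of a skeletonized image is typically estimated by box counting: cover the image with boxes of decreasing size and regress log box count against log inverse box size. A minimal sketch on a toy image, not CBCT data:

        import numpy as np

        def box_counting_dimension(binary_img):
            """Estimate fractal dimension of a binary (skeletonized) image
            as the slope of log N(boxes occupied) vs log(1 / box size)."""
            n = binary_img.shape[0]
            sizes = [s for s in (2, 4, 8, 16, 32) if s < n]
            counts = []
            for s in sizes:
                m = n // s * s
                blocks = binary_img[:m, :m].reshape(m // s, s, m // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)
            return slope

        # Toy "suture" trace: a jagged diagonal line on a 128x128 grid
        img = np.zeros((128, 128), bool)
        x = np.arange(128)
        y = (x + 10 * np.sin(x / 6)).astype(int).clip(0, 127)
        img[y, x] = True
        print("fractal dimension ~", round(box_counting_dimension(img), 3))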

  17. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    Science.gov (United States)

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  18. Groundwater vulnerability and pollution risk assessment of porous aquifers to nitrate: Modifying the DRASTIC method using quantitative parameters

    Science.gov (United States)

    Kazakis, Nerantzis; Voudouris, Konstantinos S.

    2015-06-01

    In the present study the DRASTIC method was modified to estimate the vulnerability and pollution risk of porous aquifers to nitrate. The qualitative parameters of aquifer type, soil and impact of the vadose zone were replaced with the quantitative parameters of aquifer thickness, nitrogen losses from soil and hydraulic resistance. Nitrogen losses from soil were estimated based on climatic, soil and topographic data using indices produced by the GLEAMS model. Additionally, the class range of each parameter and the final index were modified using their correlation with nitrate concentration under four grading methods (natural breaks, equal interval, quantile and geometrical intervals). For this purpose, seventy-seven (77) groundwater samples were collected and analyzed for nitrate. Land uses were added to estimate the pollution risk to nitrates. The two new methods, DRASTIC-PA and DRASTIC-PAN, were then applied in the porous aquifer of Anthemountas basin together with the initial versions of DRASTIC and the LOSN-PN index. The two modified methods displayed the highest correlations with nitrate concentrations and provided higher discretisation of the vulnerability and pollution risk, while high ANOVA F statistics confirmed that average NO3- concentrations increase from the low to the high vulnerability and pollution risk classes. The importance of the parameters of hydraulic resistance of the vadose zone, aquifer thickness and land use was confirmed by single-parameter sensitivity analysis.
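
    A DRASTIC-type index is a weighted sum of rated parameters, with class breaks applied afterwards. The sketch below uses the generic DRASTIC weights and illustrative ratings and breaks, not the paper's modified DRASTIC-PA parameters or calibrated class ranges.

        import numpy as np

        # DRASTIC-style vulnerability index: weighted sum of parameter
        # ratings (1-10). Weights are the classic DRASTIC values; the
        # ratings and class breaks are illustrative.
        weights = {"Depth": 5, "Recharge": 4, "Aquifer": 3, "Soil": 2,
                   "Topography": 1, "Vadose": 5, "Conductivity": 3}
        ratings = {"Depth": 7, "Recharge": 6, "Aquifer": 8, "Soil": 9,
                   "Topography": 10, "Vadose": 4, "Conductivity": 6}

        index = sum(weights[p] * ratings[p] for p in weights)
        print("DRASTIC index:", index)   # higher = more vulnerable

        # Class breaks (e.g., from natural breaks or quantiles over cells)
        breaks = np.array([80, 120, 160, 200])
        label = ["very low", "low", "moderate", "high", "very high"][
            int(np.searchsorted(breaks, index))]
        print("vulnerability class:", label)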

  19. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise increased by only about 3 to 30%, depending on target and attacker skill level.
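
    The compromise-graph computation amounts to a shortest-path search in which edge weights are expected times-to-compromise. A sketch with a hypothetical SCADA topology (all node names and times are assumed for illustration):

        import heapq

        def min_time_to_compromise(graph, start, target):
            """Shortest expected time over a compromise graph: nodes are
            attack stages, edge weights are expected days to compromise."""
            dist, heap = {start: 0.0}, [(0.0, start)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == target:
                    return d
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in graph.get(u, []):
                    nd = d + w
                    if nd < dist.get(v, float("inf")):
                        dist[v] = nd
                        heapq.heappush(heap, (nd, v))
            return float("inf")

        # Hypothetical compromise graph for an intermediate-skill attacker
        g = {"internet": [("dmz", 5.0)],
             "dmz": [("hmi", 12.0), ("historian", 9.0)],
             "historian": [("hmi", 6.0)],
             "hmi": [("plc", 4.0)]}

        before = min_time_to_compromise(g, "internet", "plc")
        g["dmz"] = [("hmi", 30.0), ("historian", 25.0)]  # after remediation
        after = min_time_to_compromise(g, "internet", "plc")
        print(f"TTC before: {before} d, after: {after} d, "
              f"relative increase: {(after - before) / before:.0%}")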

  20. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.
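
    Translating R&D uncertainty into LCOE risk is essentially a Monte Carlo exercise over a cost model. A minimal sketch using a simplified LCOE formula and hypothetical input distributions, not the expert panel's judgments or the report's spreadsheet tool:

        import numpy as np

        rng = np.random.default_rng(3)

        def lcoe(capex, fcr, om, energy_mwh):
            """Simplified levelized cost of energy [$/MWh]:
            (annualized capital + annual O&M) / annual generation."""
            return (capex * fcr + om) / energy_mwh

        # Monte Carlo over uncertain capital cost, O&M and plant output
        # (hypothetical distributions for illustration only)
        n = 100_000
        capex = rng.normal(40e6, 6e6, n)           # $ per plant
        om = rng.normal(1.5e6, 0.2e6, n)           # $/yr
        energy = rng.normal(350_000, 40_000, n)    # MWh/yr
        samples = lcoe(capex, fcr=0.1, om=om, energy_mwh=energy)
        print(f"LCOE P10/P50/P90: {np.percentile(samples, [10, 50, 90]).round(1)} $/MWh")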

  1. Evaluation of skeletal status by quantitative ultrasonometry in postmenopausal women without known risk factors for osteoporosis.

    Science.gov (United States)

    Mandato, Vincenzo Dario; Sammartino, Annalidia; Di Carlo, Costantino; Tommaselli, Giovanni A; Tauchmanovà, Libuse; D'Elia, Antonio; Nappi, Carmine

    2005-09-01

    The objective of our study was to evaluate bone density in Italian postmenopausal women without clinical risk factors for osteoporosis resident in the Naples area using quantitative ultrasonometry of bone (QUS). Subjects were 1149 Italian postmenopausal women (age: 54.9 +/- 5.0 years (mean +/- standard deviation); range: 45-74 years) resident in the Naples area. Clinical risk factors for osteoporosis resulting in exclusion from the study were family history of osteoporosis, dietary, smoking and alcohol habits, personal history of fractures and/or metabolic diseases. The following QUS parameters were calculated: amplitude-dependent speed of sound (AD-SoS), T-score and Z-score. We found significant inverse correlations between AD-SoS and age (r = - 0.23), time since menopause (r = - 0.25) and body mass index (BMI) (r = - 0.16). The same was observed for T-score. In contrast, Z-score showed a significant positive correlation with age and time since menopause, and a negative correlation with BMI. A T-score suggestive of high risk for osteoporosis (less than -3.2) was found in 1.6% of subjects, while a T-score suggestive of moderate risk for osteoporosis (between -3.2 and -2) was found in 19.3% of patients. In this group of women without clinical risk factors for osteoporosis we found a very low prevalence of QUS results suggesting a high risk for osteoporosis. However, a condition of 'moderate' risk for osteoporosis was present in a remarkable percentage of these women.
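
    T-scores and Z-scores follow directly from reference means and standard deviations, with the abstract's −3.2 and −2 cut-offs applied to the T-score. The reference values in this sketch are illustrative, not the device's actual normative data.

        # QUS scores relative to reference populations (reference values
        # below are illustrative, not actual normative data).
        YOUNG_MEAN, YOUNG_SD = 2125.0, 65.0   # AD-SoS (m/s), young adults
        AGE_MEAN, AGE_SD = 2050.0, 70.0       # age-matched reference

        def t_score(ad_sos):
            return (ad_sos - YOUNG_MEAN) / YOUNG_SD

        def z_score(ad_sos):
            return (ad_sos - AGE_MEAN) / AGE_SD

        ad_sos = 1980.0
        t, z = t_score(ad_sos), z_score(ad_sos)
        # Study cut-offs: T < -3.2 high risk; -3.2 <= T < -2 moderate risk
        risk = "high" if t < -3.2 else "moderate" if t < -2 else "low"
        print(f"T={t:.2f}, Z={z:.2f}, osteoporosis risk: {risk}")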

  2. State of the art in benefit-risk analysis: medicines.

    Science.gov (United States)

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-standing use, benefit-risk assessment in medicine remains subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review discusses the various aspects of and approaches to benefit-risk assessment in medicine along a chronological pathway. The review covers all types of benefit-risk assessment a medicinal product undergoes during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. We conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation; improvements not only in methodology but also in cooperation between organizations can strengthen benefit-risk assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Use of global sensitivity analysis in quantitative microbial risk assessment: application to the evaluation of a biological time temperature integrator as a quality and safety indicator for cold smoked salmon.

    Science.gov (United States)

    Ellouze, M; Gauchi, J-P; Augustin, J-C

    2011-06-01

    The aim of this study was to apply a global sensitivity analysis (SA) method to model simplification and to evaluate (eO)®, a biological Time Temperature Integrator (TTI), as a quality and safety indicator for cold smoked salmon (CSS). Models were thus developed to predict the evolution of Listeria monocytogenes and the indigenous food flora in CSS and to predict the TTI endpoint. A global SA was then applied to the three models to identify the less important factors and simplify the models accordingly. Results showed that the subset of the most important factors of the three models was mainly composed of the durations and temperatures of two chill chain links that are out of the control of the manufacturers: the domestic refrigerator and the retail/cabinet links. Then, the simplified versions of the three models were run with 10^4 time-temperature profiles representing the variability associated with the microbial behavior, the TTI evolution and the French chill chain characteristics. The results were used to assess the distributions of the microbial contamination obtained at the TTI endpoint and at the end of the simulated profiles, and proved that, in the case of poor storage conditions, use of the TTI could reduce the number of unacceptable foods by 50%.
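
    Growth predictions of this kind typically combine a secondary model for growth rate as a function of temperature with a time-temperature profile. A sketch using a square-root (Ratkowsky-type) model with assumed parameter values, not the study's validated models:

        import numpy as np

        def ratkowsky_mu(T, b=0.023, T_min=-2.9):
            """Square-root (Ratkowsky) secondary model for a Listeria
            monocytogenes-like organism; b and T_min are assumed,
            illustrative parameter values."""
            mu_sqrt = np.maximum(b * (T - T_min), 0.0)
            return mu_sqrt ** 2        # specific growth rate, ln units/h

        # A chill-chain profile: retail cabinet, transport, domestic
        # refrigerator, given as (temperature degC, duration h)
        profile = [(4.0, 48), (8.0, 6), (7.0, 120)]

        log10_increase = 0.0
        for temp, hours in profile:
            log10_increase += ratkowsky_mu(temp) * hours / np.log(10)
        print(f"predicted growth: +{log10_increase:.2f} log10 CFU/g")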

  4. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    Science.gov (United States)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects on health. The yogurt production process is highly susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify the risks in detail, prevent them, and determine how they should be handled, so that they can be minimized. This study therefore analyzes the risks of the production process in a case study of CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment, raw material, and process variables. The critical risks are a lack of asepsis in the process, specifically damage to the yogurt starter through contamination by fungi or other bacteria, and a lack of equipment sanitation. The quantitative FTA results showed that the highest probability is that of a non-aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and equipment sanitation using hot water immersion.
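
    The quantitative FTA step combines basic-event probabilities through the gates of the tree. A sketch with hypothetical probabilities; the 3.902% figure above comes from the study's own data, not from these numbers.

        def or_gate(*p):          # at least one basic event occurs
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):         # all basic events must occur
            out = 1.0
            for pi in p:
                out *= pi
            return out

        # Hypothetical basic-event probabilities for yogurt production
        p_starter_contaminated = 0.025  # fungi/other bacteria in starter
        p_sanitation_lapse = 0.015      # equipment not properly sanitized
        p_temp_deviation = 0.010        # fermentation temperature drift

        p_non_aseptic = or_gate(p_starter_contaminated, p_sanitation_lapse)
        p_batch_failure = or_gate(p_non_aseptic, p_temp_deviation)
        print(f"P(non-aseptic process) = {p_non_aseptic:.3%}")
        print(f"P(batch failure)       = {p_batch_failure:.3%}")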

  5. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software packages assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
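
    Genome length bias can be corrected by dividing each taxon's read count by its genome length before normalizing, so that equal cell counts yield equal abundances. A sketch with hypothetical counts and genome sizes:

        # Correcting taxon counts for genome length bias: longer genomes
        # shed more reads per cell, so divide read counts by genome length
        # before normalizing. All values below are hypothetical.
        reads = {"short_genome_sp": 12_000, "mid_genome_sp": 30_000,
                 "long_genome_sp": 58_000}
        genome_len = {"short_genome_sp": 1.2e6, "mid_genome_sp": 3.0e6,
                      "long_genome_sp": 6.0e6}

        raw = {t: r / sum(reads.values()) for t, r in reads.items()}
        w = {t: reads[t] / genome_len[t] for t in reads}
        corrected = {t: wi / sum(w.values()) for t, wi in w.items()}

        for t in reads:
            print(f"{t}: raw {raw[t]:.2%} -> length-corrected {corrected[t]:.2%}")
        # Equal cell counts now come out as roughly equal thirds.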

  6. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain
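
    Feature-tracking strain reduces to comparing tracked segment lengths against the end-diastolic reference. A sketch with hypothetical segment lengths and frame spacing, not the study's data:

        import numpy as np

        def lagrangian_strain(lengths):
            """Strain relative to the end-diastolic length: (L - L0) / L0."""
            L = np.asarray(lengths, float)
            return (L - L[0]) / L[0]

        # Hypothetical myocardial segment length over one cycle (mm), as
        # tracked by feature tracking in CCT or echo frames dt apart
        seg_len = [80.0, 78.5, 75.0, 71.0, 68.0, 70.5, 76.0, 80.0]
        dt = 0.1                                  # s between frames
        strain = lagrangian_strain(seg_len)
        strain_rate = np.gradient(strain, dt)     # 1/s
        print(f"peak longitudinal strain: {strain.min():.1%}")
        print("peak systolic strain rate:", round(strain_rate.min(), 2), "1/s")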

  7. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  8. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    Science.gov (United States)

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential capacity of digital image texture analysis in evaluating the severity of hip osteoarthritis (OA) and in monitoring its progression. 19 textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom-developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA-severity, to the monitoring of OA-progression and to the evaluation of a chondroprotective therapy.
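
    Grey level non-uniformity rewards uneven grey-level distributions. The sketch below uses a simplified histogram-based variant (the study's feature is computed from run-length statistics, which this does not reproduce) on synthetic textures:

        import numpy as np

        def glnu(image, levels=8):
            """Simplified grey-level non-uniformity: sum over grey levels
            of squared pixel counts, divided by total pixels. A histogram
            stand-in for the run-length GLNU feature; higher values mean
            the grey-level distribution is less uniform."""
            q = (image.astype(float) / 256 * levels).astype(int).clip(0, levels - 1)
            counts = np.bincount(q.ravel(), minlength=levels).astype(float)
            return (counts ** 2).sum() / counts.sum()

        rng = np.random.default_rng(4)
        flat = rng.integers(0, 256, (64, 64))                    # near-uniform
        clumped = rng.choice([10, 240], (64, 64), p=[0.9, 0.1])  # two levels
        print("GLNU, uniform texture:", round(glnu(flat), 1))
        print("GLNU, clumped texture:", round(glnu(clumped), 1))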

  9. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Graphs were then produced to give insight into various trends, and the patents were analyzed by classification. The graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. Patents were classified by the solution they provide on the basis of their claims, titles, abstracts and full texts, each read separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars that apply to them. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as relating to automobile parts; on reading it, one deduces that it solves the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classes, namely "reduction in friction" and "engine", were created. Similarly, after studying all the patents, a classification matrix was created.

  10. Quantitative cancer risk assessment for occupational exposures to asphalt fumes during built-up roofing asphalt (BURA) operations.

    Science.gov (United States)

    Rhomberg, Lorenz R; Mayfield, David B; Goodman, Julie E; Butler, Eric L; Nascarella, Marc A; Williams, Daniel R

    2015-01-01

    The International Agency for Research on Cancer qualitatively characterized occupational exposure to oxidized bitumen emissions during roofing as probably carcinogenic to humans (Group 2A). We examine chemistry, exposure, epidemiology and animal toxicity data to explore quantitative risks for roofing workers applying built-up roofing asphalt (BURA). Epidemiology studies do not consistently report elevated risks, and generally do not have sufficient exposure information or adequately control for confounders, precluding their use for dose-response analysis. Dermal carcinogenicity bioassays using mice report increased tumor incidence with single high doses. In order to quantify potential cancer risks, we develop time-to-tumor model methods [consistent with U.S. Environmental Protection Agency (EPA) dose-response analysis and mixtures guidelines], using the dose-time-response shape of concurrent benzo[a]pyrene (B[a]P) control exposures (which had several exposure levels) to infer presumed parallel dose-time-response curves for BURA-fume condensate. We compare EPA relative potency factor approaches, based on the observed relative potency of BURA to B[a]P in similar experiments, and direct observation of the inferred BURA dose-time-response (scaled to humans) as means for characterizing a dermal unit risk factor. We apply similar approaches to limited data on asphalt-fume inhalation and respiratory cancers in rats. We also develop a method for adjusting potency estimates for asphalts that vary in composition using measured fluorescence. Overall, the various methods indicate that cancer risks to roofers from both dermal and inhalation exposure to BURA are within a range typically deemed acceptable within regulatory frameworks. The approaches developed may be useful in assessing the carcinogenic potency of other complex mixtures of polycyclic aromatic compounds.

  11. Modelling bacterial growth in quantitative microbiological risk assessment: is it possible?

    Science.gov (United States)

    Nauta, Maarten J

    2002-03-01

    Quantitative microbiological risk assessment (QMRA), predictive modelling and HACCP may be used as tools to increase food safety and can be integrated fruitfully for many purposes. However, when QMRA is applied for public health issues like the evaluation of the status of public health, existing predictive models may not be suited to model bacterial growth. In this context, precise quantification of risks is more important than in the context of food manufacturing alone. In this paper, the modular process risk model (MPRM) is briefly introduced as a QMRA modelling framework. This framework can be used to model the transmission of pathogens through any food pathway, by assigning one of six basic processes (modules) to each of the processing steps. Bacterial growth is one of these basic processes. For QMRA, models of bacterial growth need to be expressed in terms of probability, for example to predict the probability that a critical concentration is reached within a certain amount of time. In contrast, available predictive models are developed and validated to produce point estimates of population sizes and therefore do not fit with this requirement. Recent experience from a European risk assessment project is discussed to illustrate some of the problems that may arise when predictive growth models are used in QMRA. It is suggested that a new type of predictive models needs to be developed that incorporates modelling of variability and uncertainty in growth.
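
    The kind of probabilistic growth model the author calls for can be illustrated with a short Monte Carlo sketch: rather than a point estimate of the final population size, it returns the probability that a critical concentration is reached within a given time. All parameter values and distributions below are illustrative assumptions, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(42)

        def p_exceed_critical(n0, log10_crit, t_hours, mu_rate=0.2,
                              sigma_rate=0.05, n_sim=100_000):
            """Monte Carlo estimate of the probability that a population
            starting at n0 CFU/g, growing exponentially at an uncertain
            specific rate (log10 units per hour), reaches log10_crit
            within t_hours. All parameter values are illustrative."""
            rates = rng.normal(mu_rate, sigma_rate, n_sim)  # uncertain growth rate
            rates = np.clip(rates, 0.0, None)               # ignore die-off here
            log10_final = np.log10(n0) + rates * t_hours    # log10 CFU/g at t_hours
            return float(np.mean(log10_final >= log10_crit))

        # probability that 100 CFU/g reaches 10^6 CFU/g within 24 h
        print(p_exceed_critical(n0=100, log10_crit=6.0, t_hours=24))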

  12. Rat Mcs5a is a compound quantitative trait locus with orthologous human loci that associate with breast cancer risk

    Science.gov (United States)

    Samuelson, David J.; Hesselson, Stephanie E.; Aperavich, Beth A.; Zan, Yunhong; Haag, Jill D.; Trentham-Dietz, Amy; Hampton, John M.; Mau, Bob; Chen, Kai-Shun; Baynes, Caroline; Khaw, Kay-Tee; Luben, Robert; Perkins, Barbara; Shah, Mitul; Pharoah, Paul D.; Dunning, Alison M.; Easton, Doug F.; Ponder, Bruce A.; Gould, Michael N.

    2007-01-01

    Breast cancer risk is a polygenic trait. To identify breast cancer modifier alleles that have a high population frequency and low penetrance we used a comparative genomics approach. Quantitative trait loci (QTL) were initially identified by linkage analysis in a rat mammary carcinogenesis model followed by verification in congenic rats carrying the specific QTL allele under study. The Mcs5a locus was identified by fine-mapping Mcs5 in a congenic model. Here we characterize the Mcs5a locus, which when homozygous for the Wky allele, reduces mammary cancer risk by 50%. The Mcs5a locus is a compound QTL with at least two noncoding interacting elements: Mcs5a1 and Mcs5a2. The resistance phenotype is only observed in rats carrying at least one copy of the Wky allele of each element on the same chromosome. Mcs5a1 is located within the ubiquitin ligase Fbxo10, whereas Mcs5a2 includes the 5′ portion of Frmpd1. Resistant congenic rats show a down-regulation of Fbxo10 in the thymus and an up-regulation of Frmpd1 in the spleen. The association of the Mcs5a1 and Mcs5a2 human orthologs with breast cancer was tested in two population-based breast cancer case-control studies (≈12,000 women). The minor alleles of rs6476643 (MCS5A1) and rs2182317 (MCS5A2) were independently associated with breast cancer risk. The minor allele of rs6476643 increases risk, whereas the rs2182317 minor allele decreases risk. Both alleles have a high population frequency and a low penetrance toward breast cancer risk. PMID:17404222

  13. Quantitative analysis of polymorphic mixtures of ranitidine hydrochloride by Raman spectroscopy and principal components analysis.

    Science.gov (United States)

    Pratiwi, Destari; Fawcett, J Paul; Gordon, Keith C; Rades, Thomas

    2002-11-01

    Ranitidine hydrochloride exists as two polymorphs, forms I and II, both of which are used to manufacture commercial tablets. Raman spectroscopy can be used to differentiate the two forms but univariate methods of quantitative analysis of one polymorph as an impurity in the other lack sensitivity. We have applied principal components analysis (PCA) of Raman spectra to binary mixtures of the two polymorphs and to binary mixtures prepared by adding one polymorph to powdered tablets of the other. Based on absorption measurements of seven spectral regions, it was found that >97% of the spectral variation was accounted for by three principal components. Quantitative calibration models generated by multiple linear regression predicted a detection limit and quantitation limit for either forms I or II in mixtures of the two of 0.6 and 1.8%, respectively. This study demonstrates that PCA of Raman spectroscopic data provides a sensitive method for the quantitative analysis of polymorphic impurities of drugs in commercial tablets with a quantitation limit of less than 2%.
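
    As a rough illustration of the calibration strategy described (PCA scores fed into a multiple linear regression), the sketch below applies the same two-step idea to synthetic spectra. The peak shapes, noise level and component count are assumptions; it does not reproduce the paper's seven spectral regions or its measured data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for Raman spectra of binary polymorph mixtures:
        # two pure-component peaks mixed in known proportions plus noise.
        wavenumbers = np.linspace(0, 1, 200)
        form_I  = np.exp(-((wavenumbers - 0.3) / 0.02) ** 2)
        form_II = np.exp(-((wavenumbers - 0.6) / 0.02) ** 2)

        fractions = rng.uniform(0, 0.1, 40)             # 0-10% form I in form II
        spectra = np.outer(fractions, form_I) + np.outer(1 - fractions, form_II)
        spectra += rng.normal(0, 0.005, spectra.shape)  # measurement noise

        # Compress spectra to a few principal components, then regress the
        # known form-I fraction on the scores (PCR-style calibration).
        scores = PCA(n_components=3).fit_transform(spectra)
        model = LinearRegression().fit(scores, fractions)
        print("calibration R^2:", model.score(scores, fractions))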

  14. Supply-Chain Risk Analysis

    Science.gov (United States)

    2016-06-07

    Only fragmentary presentation text is available for this record: slide excerpts on scoring software against the CWE/SANS Top-25 coding errors, the scarcity of data for software supply chains, the way a composite product inherits risk from any point in its supply chain, and a categorization of supply-chain risks.

  15. Bank Liquidity Risk: Analysis and Estimates

    Directory of Open Access Journals (Sweden)

    Meilė Jasienė

    2012-12-01

    Full Text Available In today's banking business, liquidity risk and its management are among the most critical elements underlying the stability and security of a bank's operations, its profit-making, and its clients' confidence, as well as many of the decisions that the bank makes. Managing liquidity risk in a commercial bank is not something new, yet the scientific literature has not focused enough on the different approaches to liquidity risk management and assessment. Furthermore, models, methodologies and policies for managing liquidity risk in a commercial bank have never been examined in detail either. The goal of this article is to analyse the liquidity risk of commercial banks, as well as the possibilities for managing it, and to build a liquidity risk management model for a commercial bank. The development, assessment and application of the commercial bank liquidity risk management model were based on an analysis of scientific resources, a comparative analysis, and mathematical calculations.

  16. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    Science.gov (United States)

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating, as estimated from published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities by the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. The QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data, and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated from the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for
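
    The multiplicative structure of the QMRA chain described above (bradyzoite number, processing reduction factor, dose-response, prevalence, portions per year) can be sketched as follows. The exponential dose-response form and every numeric value are illustrative assumptions, not the study's fitted model.

        import numpy as np

        def expected_infections(n_portions_per_year, prevalence,
                                bradyzoites_raw, reduction_factor, r=0.005):
            """Illustrative QMRA chain: expected human T. gondii infections
            per year from one meat product. An exponential dose-response,
            P(infection) = 1 - exp(-r * dose), stands in for the
            mouse-derived relation; every number here is hypothetical."""
            dose = bradyzoites_raw * reduction_factor  # bradyzoites after processing
            p_inf = 1.0 - np.exp(-r * dose)            # probability per infected portion
            return n_portions_per_year * prevalence * p_inf

        # e.g. 1e8 beef portions/year, 2% infected, 100 bradyzoites/portion,
        # heating reduces viable bradyzoites 1000-fold
        print(expected_infections(1e8, 0.02, 100, 1e-3))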

  17. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix over its credible motion set can be obtained. The player whose desire is strongest initiates the move, and the overall state-transition matrix of the information system can then be derived. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis, so that the results, with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of a system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative analysis of survivability, and it has good application prospects in practice.
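
    The Markov step of such an analysis can be illustrated briefly: given a state-transition matrix over the feasible conflict situations, the long-run occurrence probability of each situation is the stationary distribution. The matrix below is hypothetical; the real one would be derived from the players' efficiency matrices as the paper describes.

        import numpy as np

        # Hypothetical 3-state transition matrix over feasible situations;
        # rows sum to 1.
        P = np.array([[0.6, 0.3, 0.1],
                      [0.2, 0.5, 0.3],
                      [0.1, 0.4, 0.5]])

        # Stationary distribution: left eigenvector of P for eigenvalue 1,
        # i.e. the long-run occurrence probability of each situation.
        eigvals, eigvecs = np.linalg.eig(P.T)
        stationary = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
        stationary /= stationary.sum()
        print(stationary)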

  18. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    Full Text Available

    In this study, five prototypes of catchup were developed by replacing the sucrose in the formulation partially or totally with the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes concerned color, consistency, mouthfeel, sweet taste and tomato taste, for which lower means were obtained as the sugar level was decreased, and also salty taste, which had higher means as sugar decreased. For bitter and sweetener aftertastes, the prototype 100% sweetened with NTM presented the highest mean scores, but with no significant difference compared to the other prototypes containing sucrose; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. In terms of physicochemical characteristics, the differences were mainly in consistency, solids and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for use in reduced-calorie and no-sugar versions.

  19. Quantitative Analysis of AGV System in FMS Cell Layout

    Directory of Open Access Journals (Sweden)

    B. Ramana

    1997-01-01

    Full Text Available Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing systems (FMSs) due to their flexibility. The quantitative analysis of an AGV system is useful for determining the material flow rates, operation times, length of delivery, length of empty moves and the number of AGVs required for a typical FMS cell layout. The efficiency of a material handling system such as an AGV system can be improved by reducing the length of the empty moves, which depends on the despatching and scheduling methods used. If these methods are not properly planned, the length of the empty move exceeds the length of delivery; this increases material handling time, which in turn increases the number of AGVs required in the FMS cell. This paper presents a method for optimising the length of empty travel of AGVs in a typical FMS cell layout.
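
    A common back-of-the-envelope version of this fleet-sizing calculation divides the total vehicle-minutes of handling work per hour by the effective minutes each AGV has available. The function below is such a sketch; all parameter values are assumed rather than taken from the paper.

        def agvs_required(deliveries_per_hour, avg_loaded_m, avg_empty_m,
                          speed_m_per_min=50.0, load_unload_min=1.0,
                          traffic_factor=0.85):
            """Classical AGV fleet-size estimate: vehicle-minutes of work per
            hour divided by effective vehicle-minutes available per hour.
            Parameter values are illustrative."""
            cycle_min = (avg_loaded_m + avg_empty_m) / speed_m_per_min \
                        + load_unload_min
            workload_min_per_hour = deliveries_per_hour * cycle_min
            available_min_per_hour = 60.0 * traffic_factor
            return workload_min_per_hour / available_min_per_hour

        # 30 deliveries/h, 120 m loaded and 80 m empty travel per cycle;
        # shortening the empty move directly shrinks the required fleet.
        print(agvs_required(30, 120, 80))  # round up in practice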

  20. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    Science.gov (United States)

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  1. Quantitative produced water analysis using mobile 1H NMR

    Science.gov (United States)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as by infrared analysis and gas chromatography.
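
    The self-calibrating aspect lends itself to a one-line calculation: the oil concentration follows from the ratio of the oil and chloroform peak integrals, scaled by the reference's known effective concentration. The function below is a schematic of that ratio logic only; the proton-count scaling and all values are assumptions, not the authors' procedure.

        def oil_ppm_from_nmr(area_oil, area_ref, ref_ppm_equiv,
                             protons_ref=1.0, protons_oil=2.0):
            """Internal-reference quantification: oil concentration from the
            ratio of oil and chloroform peak integrals, scaled by the
            reference's known effective concentration and the number of 1H
            contributing to each signal. All values are illustrative."""
            return (area_oil / area_ref) * (protons_ref / protons_oil) \
                   * ref_ppm_equiv

        # e.g. oil integral twice the chloroform integral,
        # reference worth 10 ppm
        print(oil_ppm_from_nmr(area_oil=2.0, area_ref=1.0, ref_ppm_equiv=10.0))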

  2. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    Science.gov (United States)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology that causes these magnetic resonance imaging (MRI) gray scale changes is not fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current data base. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then mapped back to the original images to identify the distribution of abnormality. Our results indicated that the disease process involved the pallidus, mesencephalon, and subthalamic regions.

  3. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. O. Domanov

    2014-01-01

    Full Text Available The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St-Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial characteristics and citizens' identities. As the geographical position at the border of Russia provides the citizens with geopolitical alternatives to identify their location as a fortress defending the nation (as in the case of Kronstadt) or a bridge between cultures, the given study allows us to compare the reasons for these geopolitical choices of inhabitants. Furthermore, the research aims at bridging the gap in the studies of European and multiple identity in Russian regions and provides a Northwest Russian perspective on the perpetual discussion about the subjective Eastern border of Europe.

  4. Quantitative analysis of plasma interleiukin-6 by immunoassay on microchip

    Science.gov (United States)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for the deposition and fixation of the 1st antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), one of the inflammatory cytokines. After blocking the microchannel, antigen, biotin-labeled 2nd antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. This assay enables us to determine plasma IL-6 with accuracy, high sensitivity, short assay time, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.

  5. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    Energy Technology Data Exchange (ETDEWEB)

    Haase, A.T.; Zupancic, M.; Cavert, W. [Univ. of Minnesota Medical School, Minneapolis, MN (United States)] [and others]

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  6. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  7. Quantitative Analysis and Comparisons of EPON Protection Schemes

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; JIN Depeng; ZENG Lieguang; SU Li

    2005-01-01

    This paper presents the relationship between the intensity of network damage and network survivability, and studies a method for quantitatively analyzing the survivability of tree networks. Based on this analysis, the survivability of Ethernet passive optical networks (EPON) with three kinds of protection schemes (i.e., trunk-fiber protection, node-fiber protection, and bus-fiber protection) is discussed. Following this, comparisons of the survivability of these three kinds of EPON protection schemes are put forward. The simulation results show that, when the coverage area is the same, the survivability of EPON with the node-fiber protection scheme is better than that of EPON with the trunk-fiber protection scheme, and when the number and distribution of optical network units (ONUs) are the same, the survivability of EPON with the bus-fiber protection scheme is better than that of EPON with the node-fiber protection scheme. Under the same constraints, the bus-fiber protection scheme requires the least fiber when there are more than 12 ONU nodes. These results are useful not only for forecasting and evaluating the survivability of EPON access networks, but also for their topology design.

  8. Quantitative analysis of regulatory flexibility under changing environmental conditions

    Science.gov (United States)

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops contribute the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate the flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818

  9. Quantitative analysis of piperine in ayurvedic formulation by UV Spectrophotometry

    Directory of Open Access Journals (Sweden)

    Gupta Vishvnath

    2011-02-01

    Full Text Available A simple and reproducible UV-spectrophotometric method for the quantitative determination of piperine in Sitopaladi churna (STPLC) was developed and validated in the present work. The parameters linearity, precision, accuracy and standard error were studied according to the Indian Herbal Pharmacopoeia. In the present study a new, simple, rapid, sensitive, precise and economic spectrophotometric method in the ultraviolet region was developed for the determination of piperine in market and laboratory herbal formulations of Sitopaladi churna, which were procured and purchased respectively from the local market, and they were evaluated as per Indian Herbal Pharmacopoeia and WHO guidelines. The concentration of piperine present in the raw material of STPLC was found to be 1.45±0.014 w/w in Piper longum fruits. Piperine has its absorption maximum at 342.5 nm, and hence the UV-spectrophotometric measurements were performed at 342.5 nm. The samples were prepared in methanol, and the method obeys Beer's law in the concentration range employed for evaluation. The content of piperine in the ayurvedic formulation was determined. The results of the analysis were validated statistically, and recovery studies confirmed the accuracy of the proposed method. Hence the proposed method can be used for the reliable quantification of piperine in the crude drug and its herbal formulation.
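
    Since the method relies on Beer's law linearity at 342.5 nm, quantification reduces to a straight-line calibration. The sketch below shows that calculation with invented standards; the paper's actual calibration data are not reproduced.

        import numpy as np

        # Hypothetical calibration data at 342.5 nm: absorbance of piperine
        # standards (Beer's law: A = epsilon * b * c, linear in concentration).
        conc_ug_ml = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
        absorbance = np.array([0.121, 0.244, 0.360, 0.482, 0.601])

        slope, intercept = np.polyfit(conc_ug_ml, absorbance, 1)

        def piperine_conc(a_sample):
            """Concentration of an unknown from its absorbance, via the
            fitted calibration line."""
            return (a_sample - intercept) / slope

        print(piperine_conc(0.300))  # ~5 ug/mL for these illustrative numbers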

  10. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules. It is found that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
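
    The underlying effect is simple to quantify: ribbon loss scales as I^2 R, and halving a cell halves its current while roughly doubling the number of ribbon segments, so the net ribbon loss falls by about half. The numbers below are illustrative assumptions, not the paper's measured values.

        def ribbon_loss_watts(cell_current_a, ribbon_resistance_ohm, n_segments):
            """Resistive loss in interconnect ribbons: P = I^2 * R per segment,
            summed over segments. Illustrative numbers only."""
            return n_segments * cell_current_a**2 * ribbon_resistance_ohm

        I_full, R_seg, n = 9.0, 0.002, 60   # full-size cell: ~9 A, 60 segments
        full = ribbon_loss_watts(I_full, R_seg, n)
        # Halved cells: half the current per cell, twice as many segments
        half = ribbon_loss_watts(I_full / 2, R_seg, 2 * n)
        print(full, half, full - half)      # halved layout cuts ribbon loss ~50%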

  11. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    Science.gov (United States)

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.

  12. Communication about vaccinations in Italian websites: a quantitative analysis.

    Science.gov (United States)

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movements' websites and blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of the websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR", in order to identify the most frequently used websites. The 2 keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) were personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy between Public Health institutions, private and scientific associations, and social movements.

  13. Quantitative Risk Analysis for Homeland Security Resource Allocation

    Science.gov (United States)

    2006-12-01

    Only fragmentary text is available for this record: citation fragments referencing A. Papoulis, Probability, Random Variables and Stochastic Processes (New York: McGraw-Hill, 1965), and a report published by W.W. Norton (New York).

  14. Collision Risk Analysis for HSC

    DEFF Research Database (Denmark)

    Urban, Jesper; Pedersen, Preben Terndrup; Simonsen, Bo Cerup

    1999-01-01

    High Speed Craft (HSC) have a risk profile which is distinctly different from that of conventional ferries. Due to different hull building materials, structural layout, compartmentation and operation, both the frequency and the consequences of collision and grounding accidents must be expected to differ from those of conventional ships. To reach a documented level of safety, it is therefore not possible to directly transfer experience with conventional ships. The purpose of this paper is to present new rational scientific tools to assess and quantify the collision risk associated with HSC transportation. The paper...

  15. Quantitative Assessment of the Association between Genetic Variants in MicroRNAs and Colorectal Cancer Risk

    Directory of Open Access Journals (Sweden)

    Xiao-Xu Liu

    2015-01-01

    Full Text Available Background. The associations between polymorphisms in microRNAs and susceptibility to colorectal cancer (CRC) were inconsistent in previous studies. This study aims to quantify the strength of the association between four common polymorphisms in microRNAs (hsa-mir-146a rs2910164, hsa-mir-149 rs2292832, hsa-mir-196a2 rs11614913, and hsa-mir-499 rs3746444) and CRC risk. Methods. We searched PubMed, Web of Knowledge, and CNKI to find relevant studies. The combined odds ratio (OR) with 95% confidence interval (95% CI) was used to estimate the strength of the association in a fixed or random effects model. Results. 15 studies involving 5,486 CRC patients and 7,184 controls were included. Meta-analyses showed that rs3746444 was associated with CRC risk in Caucasians (OR = 0.57, 95% CI = 0.34–0.95). In the subgroup analysis, we found significant associations between rs2910164 and CRC in hospital-based studies (OR = 1.24, 95% CI = 1.03–1.49). rs2292832 may be a high risk factor for CRC in population-based studies (OR = 1.18, 95% CI = 1.08–1.38). Conclusion. This meta-analysis showed that rs2910164 and rs2292832 may increase the risk of CRC, whereas the rs11614913 polymorphism may reduce it; rs3746444 may be associated with a decreased risk of CRC in Caucasians.
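
    The pooling step of such a meta-analysis can be sketched with an inverse-variance fixed-effect model, recovering each study's standard error of log(OR) from its 95% CI. The three studies below are invented for illustration; the record's actual study-level data are not reproduced.

        import numpy as np

        def pooled_or_fixed(ors, ci_lows, ci_highs):
            """Inverse-variance fixed-effect pooling of odds ratios. Each
            study's SE of log(OR) is recovered from its 95% CI. The study
            values used below are invented for illustration."""
            log_or = np.log(ors)
            se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)
            w = 1.0 / se**2
            pooled = np.sum(w * log_or) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
            return np.exp(pooled), ci

        print(pooled_or_fixed(np.array([1.30, 1.10, 1.25]),
                              np.array([1.05, 0.90, 1.02]),
                              np.array([1.61, 1.34, 1.53])))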

  16. Initial Risk Analysis and Decision Making Framework

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
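
    The prototype's core logic, propagating uncertainty in technical and financial factors to a distribution of expected net returns, can be sketched as a Monte Carlo calculation. The factor names, distributions, and the crude undiscounted 10-year payback below are all invented for illustration; this is not the CCSI tool.

        import numpy as np

        rng = np.random.default_rng(7)

        def retrofit_npv_draws(n_sim=100_000):
            """Toy uncertainty propagation for a carbon-capture retrofit:
            a few uncertain factors mapped to net present value draws.
            All distributions and values are hypothetical."""
            capture_eff = rng.normal(0.90, 0.03, n_sim)           # CO2 fraction captured
            capex_musd = rng.normal(900, 120, n_sim)              # capital cost, $M
            carbon_price = rng.lognormal(np.log(40), 0.3, n_sim)  # $/tonne CO2
            tonnes_per_yr = 3.5e6 * capture_eff
            annual_revenue_musd = tonnes_per_yr * carbon_price / 1e6
            return -capex_musd + 10 * annual_revenue_musd         # crude 10-yr, undiscounted

        npv = retrofit_npv_draws()
        print("mean NPV ($M):", npv.mean())
        print("P(loss):", np.mean(npv < 0))  # risk as downside probability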

  17. Fatalities in high altitude mountaineering: a review of quantitative risk estimates.

    Science.gov (United States)

    Weinbruch, Stephan; Nordby, Karl-Christian

    2013-12-01

    Quantitative estimates for mortality in high altitude mountaineering are reviewed. Special emphasis is placed on the heterogeneity of the risk estimates and on confounding. Crude estimates for mortality are on the order of 1/1000 to 40/1000 persons above base camp, for both expedition members and high altitude porters. High altitude porters have mostly a lower risk than expedition members (risk ratio for all Nepalese peaks requiring an expedition permit: 0.73; 95 % confidence interval 0.59-0.89). The summit bid is generally the most dangerous part of an expedition for members, whereas most high altitude porters die during route preparation. On 8000 m peaks, the mortality during descent from summit varies between 4/1000 and 134/1000 summiteers (members plus porters). The risk estimates are confounded by human and environmental factors. Information on confounding by gender and age is contradictory and requires further work. There are indications for safety segregation of men and women, with women being more risk averse than men. Citizenship appears to be a significant confounder. Prior high altitude mountaineering experience in Nepal has no protective effect. Commercial expeditions in the Nepalese Himalayas have a lower mortality than traditional expeditions, though after controlling for confounding, the difference is not statistically significant. The overall mortality is increasing with increasing peak altitude for expedition members but not for high altitude porters. In the Nepalese Himalayas and in Alaska, a significant decrease of mortality with calendar year was observed. A few suggestions for further work are made at the end of the article.

  18. Quantitative Polymerase Chain Reaction to Assess Response to Treatment of Bacterial Vaginosis and Risk of Preterm Birth.

    Science.gov (United States)

    Abramovici, Adi; Lobashevsky, Elena; Cliver, Suzanne P; Edwards, Rodney K; Hauth, John C; Biggio, Joseph R

    2015-10-01

    The aim of this study was to determine whether quantitative polymerase chain reaction (qPCR) bacterial load measurement is a valid method to assess response to treatment of bacterial vaginosis and the risk of preterm birth in pregnant women. This was a secondary analysis utilizing stored vaginal samples obtained during a previous randomized controlled trial studying the effect of antibiotics on preterm birth (PTB). All women had risk factors for PTB: (1) positive fetal fibronectin (n=146), (2) bacterial vaginosis (BV) and a prior PTB (n=43), or (3) BV and a low prepregnancy weight. Bacterial loads were correlated with the Nugent score and each of its specific bacterial components (all significant). Pretreatment total bacterial load did not differ by treatment group (p=0.87). Posttreatment total bacterial load was significantly lower in the antibiotics group than in the placebo group, and total bacterial load decreased significantly posttreatment in the antibiotics group, whereas the corresponding changes in the placebo group were not statistically significant. qPCR correlates with the Nugent score and demonstrates decreased bacterial load after antibiotic treatment. It is therefore a valid method of vaginal flora assessment in pregnant women who are at high risk for PTB.

  19. QUANTITATIVE AND QUALITATIVE METHODS IN RISK-BASED RELIABILITY ASSESSING UNDER EPISTEMIC UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    M. Khalaj

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The overall objective of reliability allocation is to increase the profitability of the operation and optimise the total life cycle cost without losses from failure issues. The methodology, called risk-based reliability (RBR), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum cost and an acceptable risk. This assessment integrates reliability with the losses from failure issues, and so can be used as a tool for decision-making. The approach of maximising reliability should be replaced with a risk-based reliability assessment approach, in which reliability planning, based on risk analysis, minimises the probability of system failure and its consequences. This paper proposes a new methodology for risk-based reliability under epistemic uncertainty, using possibility theory and probability theory. The methodology is applied to a case study in order to determine the reliability and risk of production systems when the available data are insufficient, and to help make decisions.

    AFRIKAANS SUMMARY (translated): The main objective of reliability allocation is to increase the profitability of system operation against the background of an optimum total life-cycle cost. The methodology called "risk-based reliability" involves determining a risk estimate to achieve optimum cost at an acceptable risk. The research then proposes a new analysis method for epistemic uncertainty based on possibility and probability theory. An accompanying case study is presented.

  20. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    Science.gov (United States)

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-07-20

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations.
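
    A minimal version of the QMRA calculation for an airborne exposure combines an inhaled dose (concentration x breathing rate x duration) with an exponential dose-response model. The sketch below assumes that form; all inputs, including the dose-response parameter (a value sometimes quoted for adenovirus 4 inhalation), should be treated as illustrative assumptions rather than this study's fitted values.

        import numpy as np

        def p_infection_airborne(conc_viruses_m3, breathing_m3_h, hours,
                                 r=0.4172):
            """Exponential dose-response QMRA: P = 1 - exp(-r * dose), with
            dose = airborne concentration x breathing rate x exposure time.
            Treat r and all inputs as illustrative assumptions."""
            dose = conc_viruses_m3 * breathing_m3_h * hours
            return 1.0 - np.exp(-r * dose)

        # e.g. 0.05 viral particles/m^3, light-work breathing rate, 8-h shift
        print(p_infection_airborne(0.05, 1.5, 8.0))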

  1. A probabilistic approach to quantitatively assess the inhalation risk for airborne endotoxin in cotton textile workers

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Vivian Hsiu-Chuan, E-mail: vivianliao@ntu.edu.tw [Department of Bioenvironmental Systems Engineering, National Taiwan University, 1 Roosevelt Road, Sec. 4, Taipei 106, Taiwan (China); Chou, Wei-Chun; Chio, Chia-Pin; Ju, Yun-Ru; Liao, Chung-Min [Department of Bioenvironmental Systems Engineering, National Taiwan University, 1 Roosevelt Road, Sec. 4, Taipei 106, Taiwan (China)

    2010-05-15

    Endotoxin, a component of gram-negative bacterial cell walls, is a proinflammatory agent that induces local and systemic inflammatory responses in normal subjects, which can contribute to the risk of developing asthma and chronic obstructive lung diseases. A probabilistic approach linking models of exposure, internal dosimetry, and health effects was carried out to quantitatively assess the potential inhalation risk of airborne endotoxin for workers in cotton textile plants. Combining empirical data and modeling results, we show that the half-maximum-effect endotoxin doses (ED50) were estimated to be 3.3 x 10^5 (95% confidence interval (CI): 1.9-14.7 x 10^5) endotoxin units (EU) for the blood C-reactive protein (CRP) concentration, 1.1 x 10^5 (95% CI: 0.6-1.7 x 10^5) EU for the blood polymorphonuclear neutrophil (PMN) count, and 1.5 x 10^5 (95% CI: 0.4-2.5 x 10^5) EU for the sputum PMN count. Our study offers a risk-management framework for discussing the future establishment of limits for respiratory exposure to airborne endotoxin for workers in cotton textile plants.

  2. Collision Risk Analysis for HSC

    DEFF Research Database (Denmark)

    Urban, Jesper; Pedersen, Preben Terndrup; Simonsen, Bo Cerup

    1999-01-01

    The proposed collision analysis includes an analysis which determines the probability of a collision for a HSC on a given route, an analysis of the released energy during a collision, analytical closed-form solutions for the absorbed energy in the structure, and finally an assessment of the overall structural crushing behaviour of the vessel, including the level of acceleration and the size of the crushing zone. Hitherto, no dedicated analysis tools to quantify the effect of the high speed have been available; instead nearly all research on ship accidents has been devoted to analysis of the consequences of given accident scenarios.

  3. A quantitative analysis of Salmonella Typhimurium metabolism during infection

    OpenAIRE

    Steeb, Benjamin

    2012-01-01

    In this thesis, Salmonella metabolism during infection was investigated. The goal was to gain a quantitative and comprehensive understanding of Salmonella in vivo nutrient supply, utilization and growth. To achieve this goal, we used a combined experimental / in silico approach. First, we generated a reconstruction of Salmonella metabolism ([1], see 2.1). This reconstruction was then combined with in vivo data from experimental mutant phenotypes to build a comprehensive quantitative in vivo...

  4. Using MFM methodology to generate and define major accident scenarios for quantitative risk assessment studies

    DEFF Research Database (Denmark)

    Hua, Xinsheng; Wu, Zongzhi; Lind, Morten

    2017-01-01

    to calculate the likelihood of each MAS. Combining the likelihood of each scenario with a qualitative risk matrix, each major accident scenario is ranked for consideration in detailed consequence analysis. The methodology is demonstrated using part of the BMA process for the production of hydrogen...

  5. State of the art in benefit-risk analysis: introduction.

    Science.gov (United States)

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed into one currency. This allows the comparison of adverse and beneficial effects to be qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume.

  6. Quantitative analysis of flavanones and chalcones from willow bark.

    Science.gov (United States)

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research has revealed that not only the salicylic alcohol derivatives but also the polyphenols contribute significantly to these effects. The quantitative analysis in the European Pharmacopoeia still focuses on the determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for the determination of as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species, as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin and eriodictyol and the chalcone chalconaringenin, a sequential acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 μM for naringenin and 0.45 μM for eriodictyol. The LOQ was 2.34 μM for naringenin and 1.35 μM for eriodictyol. The method is robust with regard to sample weight, but susceptible to enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations.

  7. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  8. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
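
    Transfer entropy itself can be estimated from discretized time series with plain histogram counts, using TE(X->Y) = H(Y1,Y0) - H(Y0) - H(Y1,Y0,X0) + H(Y0,X0). The sketch below is a minimal estimator of that quantity on synthetic data and is not the authors' implementation; real analyses need bias correction and longer series.

        import numpy as np

        def _entropy(*cols):
            """Shannon entropy (bits) of the joint distribution of the
            given discrete columns."""
            arr = np.stack(cols, axis=1)
            _, counts = np.unique(arr, axis=0, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def transfer_entropy(x, y, bins=8):
            """Histogram estimate of TE(X -> Y) in bits via the identity
            TE = H(Y1,Y0) - H(Y0) - H(Y1,Y0,X0) + H(Y0,X0)."""
            xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
            yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
            y1, y0, x0 = yd[1:], yd[:-1], xd[:-1]
            return (_entropy(y1, y0) - _entropy(y0)
                    - _entropy(y1, y0, x0) + _entropy(y0, x0))

        # X drives Y with a one-step lag, so TE(X->Y) should exceed TE(Y->X)
        rng = np.random.default_rng(1)
        x = rng.normal(size=5000)
        y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)
        print(transfer_entropy(x, y), transfer_entropy(y, x))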

  9. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    Science.gov (United States)

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

    Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) to determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptable in terms of surgical deformity correction.

  10. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, both qualitatively and quantitatively, the variables that determine the sale price of vegetables whose price is held constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by two vegetable characteristics: (1) vegetable segmentation (from low to high daily consumption) and (2) vegetable age (how long the item stays fresh); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with product age (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the item lasts only one day after it is ordered and must then be discarded, whereas a+1 indicates a life of more than one day, as for beet, white radish, and string beans. The shelf life refers to how long the item can be displayed on a supermarket shelf, in line with its age. Under the cost-plus pricing method with a full-costing approach, production costs, non-production costs, and markup are adjusted differently for each category: a holding cost is added to the sale price of non-leafy vegetables, while a zero holding cost is assumed for the leafy category. The expected margin of each category is correlated with the vegetable characteristics. A hedged sketch of this pricing rule is given below.
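
    The cost-plus rule described in the record can be sketched as follows; the function name, the additive holding-cost term, and all figures are hypothetical illustrations of the full-costing approach rather than the paper's exact formulation.

```python
def sale_price(production_cost, nonproduction_cost, markup_rate,
               holding_cost_per_day=0.0, shelf_days=0):
    """Cost-plus price under a full-costing approach (illustrative)."""
    full_cost = production_cost + nonproduction_cost
    holding = holding_cost_per_day * shelf_days  # zero for leafy vegetables
    return (full_cost + holding) * (1.0 + markup_rate)

# Leafy vegetable: one-day shelf life, no holding cost
print(sale_price(2.0, 0.5, 0.30))
# Non-leafy vegetable: holding cost accrues over its shelf life
print(sale_price(2.0, 0.5, 0.30, holding_cost_per_day=0.05, shelf_days=4))
```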

  11. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  12. Analysis of foreign schools of risk management

    Directory of Open Access Journals (Sweden)

    I.M. Posokhov

    2013-12-01

    Full Text Available The aim of the article. The aim of the article is to study the scientific development of foreign scientific schools of risk management, to analyse their main publications, and to delineate the foreign scientific schools of risk management. The results of the analysis. Research on modern risk management is carried out by leading foreign schools. The most famous school in the theory of financial risk and risk management is the American school. Among its current members are D. Galai, H. Greuning, A. Damodaran, P. Jorion, J. Kallman, M. Crouhy, M. Mccarthy, R. Mark, T. Flynn and other scientists. An important contribution to the development of the theory and practice of risk management was made by British scientists and economists, the representatives of the English school of risk management: T. Andersen, T. Bedford, A. Griffin, A. Zaman, R. Cooke, P. Sweeting, P. Hopkin, the German P. Schroder and others. The German scientific school has significantly advanced the theory of risk management: building on the classic work of Zadeh, it has obtained significant results in risk assessment using fuzzy logic and fuzzy sets. The Graduate School of Risk Management of the University of Cologne is a training and research group funded by the German Research Foundation (DFG) under the project "Theoretical and Empirical Basis of Risk Management"; its aim is to promote young scientists. At the school, outstanding research is conducted by German and foreign professors such as K. Mosler, A. Kempf, C. Kuhner, T. Hartmann-Wendels, C. Homburg, D. Hess, D. Sliwka, F. Schmid, H. Schradin. The author also notes the fruitful work in Switzerland of the Laboratory of risk management (Risk Lab Switzerland) and its leading scientists P. Embrechts, A. Gisler, M. Wüthrich, H. Bühlmann, V. Bignozzi, M. Hofert, P. Deprez, and of the Basel Committee on Banking Supervision, developer of the international standards of Basel

  13. Legionellae in engineered systems and use of quantitative microbial risk assessment to predict exposure.

    Science.gov (United States)

    Buse, Helen Y; Schoen, Mary E; Ashbolt, Nicholas J

    2012-03-15

    While it is well-established that Legionella are able to colonize engineered water systems, the number of interacting factors contributing to their occurrence, proliferation, and persistence are unclear. This review summarizes current methods used to detect and quantify legionellae as well as the current knowledge of engineered water system characteristics that both favour and promote legionellae growth. Furthermore, the use of quantitative microbial risk assessment (QMRA) models to predict potentially critical human exposures to legionellae are also discussed. Understanding the conditions favouring Legionella occurrence in engineered systems and their overall ecology (growth in these systems/biofilms, biotic interactions and release) will aid in developing new treatment technologies and/or systems that minimize or eliminate human exposure to potentially pathogenic legionellae.
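
    For context, QMRA exposure models of the kind referenced above typically feed an ingested or inhaled dose into a dose-response function. Below is a sketch of the widely used exponential dose-response model; the parameter value is a placeholder assumption, since r must be fitted per pathogen and exposure route.

```python
import numpy as np

def p_infection_exponential(dose_cfu, r):
    """Exponential dose-response model common in QMRA:
    P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - np.exp(-r * np.asarray(dose_cfu, dtype=float))

# Illustrative only: r is pathogen- and route-specific and must come
# from fitted dose-response data; the value below is a placeholder.
doses = [1, 10, 100, 1000]
print(p_infection_exponential(doses, r=0.06))
```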

  14. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Institute of Scientific and Technical Information of China (English)

    Yong Chen

    2007-01-01

    Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  15. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    Science.gov (United States)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    fractured reservoir, fracture propagation, fault zones and their role with regard to fluid migration into shallow aquifers). A quantitative risk assessment, which should be the main aim of future work in this field, has much higher demands, especially on site-specific data, as the estimation of statistical parameter uncertainty requires site-specific parameter distributions. There is already ongoing research on risk assessment in related fields such as CO2 sequestration. We therefore propose that these methodologies be transferred to risk estimation relating to the use of hydraulic fracturing, be it for unconventional gas or enhanced geothermal energy production. The overall aim should be to set common and transparent standards for different uses of the subsurface and their associated risks, and to communicate those to policy makers and stakeholders.

  16. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    Science.gov (United States)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued initiating from cut slopes along the railway and road alignment. The landslides were grouped into three magnitude classes based on the landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using the Gumbel distribution and the probability of landslide magnitude. In total 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from the traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to the local business, and loss of revenue to the railway department. The results indicate that the total loss, including both direct and indirect loss, from 1 to 50 years return period, varies from US$ 90 840 to US
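
    The return-period step of the hazard calculation can be illustrated with a Gumbel fit, as sketched below; the annual landslide counts are invented for illustration, and the paper's magnitude-class treatment is not reproduced.

```python
from scipy import stats

# Hypothetical annual numbers of landslides of one magnitude class per
# km of cut slope (illustrative data, not the Nilgiri records)
annual = [2, 5, 3, 8, 4, 6, 12, 3, 7, 5, 9, 4]

loc, scale = stats.gumbel_r.fit(annual)

# Hazard for a T-year return period: the level exceeded with
# probability 1/T in any one year
for T in (3, 5, 15, 25, 50):
    level = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)
    print(f"{T:>2}-yr return period: {level:.1f} landslides/km")
```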

  17. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    Directory of Open Access Journals (Sweden)

    P. Jaiswal

    2010-06-01

    Full Text Available A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued initiating from cut slopes along the railway and road alignment. The landslides were grouped into three magnitude classes based on the landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using the Gumbel distribution and the probability of landslide magnitude. In total 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from the traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to the local business, and loss of revenue to the railway department. The results indicate that the total loss, including both direct and indirect loss, from 1 to 50 years return period, varies from

  18. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    Science.gov (United States)

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T
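
    The adjusted odds-ratio analysis described above can be sketched as follows with statsmodels; the simulated covariates and coefficients are placeholders, not the Parogene data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: age, gender, diabetes, number of teeth, and a 0/1
# flag for a high salivary bacterial burden index (all placeholders)
rng = np.random.default_rng(0)
n = 462
X = np.column_stack([
    rng.normal(62.9, 9.2, n),      # age
    rng.integers(0, 2, n),         # gender
    rng.integers(0, 2, n),         # diabetes
    rng.integers(10, 33, n),       # number of teeth and implants
    rng.integers(0, 2, n),         # high bacterial burden index
])
p = 1 / (1 + np.exp(-(-3.0 + 0.03 * X[:, 0] + 0.9 * X[:, 4])))
y = (rng.random(n) < p).astype(float)  # moderate-to-severe periodontitis

# Adjusted odds ratio for the burden index, with 95% CI
fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
or_ = np.exp(fit.params[-1])
lo, hi = np.exp(fit.conf_int()[-1])
print(f"adjusted OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```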

  19. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract. Background: Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results: Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions: The established universal papillation assay platform should be widely applicable to a

  20. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino eSalminen

    2015-10-01

    Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥ 6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio, 3.59 (95% CI 1.94–6.63), was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary

  1. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    Science.gov (United States)

    2012-03-01

    Report documentation summary: Asbestos Workshop: Sampling, Analysis, and Risk Assessment, presented in March 2012 by Paul Black, PhD, and Ralph Perona, DABT (Neptune and Company). Asbestos-containing products discussed include coatings, vinyl/asbestos floor tile, automatic transmission components, clutch facings, disc brake pads, drum brake linings, and brake blocks.

  2. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  3. [Bibliometric analysis of bacterial quantitative proteomics in English literatures].

    Science.gov (United States)

    Zhang, Xin; She, Danyang; Liu, Youning; Wang, Rui; Di, Xiuzhen; Liang, Beibei; Wang, Yue

    2014-07-01

    To analyze the worldwide advances in bacterial quantitative proteomics over the past fifteen years using a bibliometric approach. Literature retrieval was conducted in the PubMed, Embase and Science Citation Index (SCI) databases, using "bacterium" and "quantitative proteomics" as the key words, with a search cutoff of July 2013. We sorted and analyzed the articles with EndNote X6 by publication year, first author, journal name, publishing institution, citation frequency and publication type. After removal of duplicates, 932 English-language articles were included in our research. The first article on bacterial quantitative proteomics was published in 1999, and output peaked at 163 articles in 2012. Up to July 2013, authors from more than 23 countries and regions had published articles in this field; China ranks fourth. The main publication type is original articles. The most frequently cited article is "Absolute quantification of proteins by LCMSE: a virtue of parallel MS acquisition" by Silva JC, Gorenstein MV, Li GZ, et al. in Mol Cell Proteomics, 2006. The most productive author is Smith RD from the Biological Sciences Division, Pacific Northwest National Laboratory. The journal publishing the most articles on bacterial quantitative proteomics is Proteomics. More and more researchers are paying attention to quantitative proteomics, which will be widely used in bacteriology.

  4. Project cost analysis under risk

    Directory of Open Access Journals (Sweden)

    Florica LUBAN

    2010-12-01

    Full Text Available In this paper, an integrated approach based on Monte Carlo simulation and Six Sigma methodology is used to analyze the risk associated with a project's total cost. Monte Carlo simulation is applied to understand the variability in total cost caused by the probabilistic cost items. Through the Six Sigma methodology, the range of variation of the project cost can be reduced by operating on the input factors with the greatest impact on total cost, so as to cover a 6σ spread between the limits established in the design phase of Six Sigma.
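
    A minimal sketch of the Monte Carlo step follows; the cost items, triangular distributions, and threshold are invented for illustration, and the Six Sigma variance-reduction step is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical cost items, each triangular(min, mode, max) in k$;
# distributions and figures are illustrative assumptions
items = [(90, 100, 130), (40, 55, 90), (20, 25, 45), (10, 12, 30)]

# Total project cost per simulation run
total = sum(rng.triangular(lo, mode, hi, N) for lo, mode, hi in items)

print(f"mean total cost : {total.mean():.1f} k$")
print(f"5th-95th pct    : {np.percentile(total, 5):.1f} - "
      f"{np.percentile(total, 95):.1f} k$")
print(f"P(cost > 230 k$): {(total > 230).mean():.3f}")
```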

  5. An Investigation Of Organizational Information Security Risk Analysis

    Directory of Open Access Journals (Sweden)

    Zack Jourdan

    2010-12-01

    Full Text Available Despite a growing number and variety of information security threats, many organizations continue to neglect implementing information security policies and procedures. The likelihood that an organization's information systems can fall victim to these threats is known as information systems risk (Straub & Welke, 1998). To combat these threats, an organization must undergo a rigorous process of self-analysis. To better understand the current state of this information security risk analysis (ISRA) process, this study deployed a questionnaire using both open-ended and closed-ended questions administered to a group of information security professionals (N=32). The qualitative and quantitative results of this study show that organizations are beginning to conduct regularly scheduled ISRA processes. However, the results also show that organizations still have room for improvement to create ideal ISRA processes.

  6. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality.
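
    The chemometric pipeline described above can be sketched as follows; the random peak-area matrix stands in for real chromatograms, and the preprocessing choices (normalization, Ward linkage, three clusters) are assumptions drawn from the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical matrix: 19 batches x 18 fingerprint peak areas
rng = np.random.default_rng(1)
peaks = rng.gamma(2.0, 1.0, size=(19, 18))

# Normalize each fingerprint so classification reflects peak pattern
X = peaks / peaks.sum(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(X)       # PCA projection
groups = fcluster(linkage(X, method="ward"), t=3,   # HCA into 3 groups
                  criterion="maxclust")
print(scores[:3])
print(groups)
```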

  7. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    Science.gov (United States)

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
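
    The isotopic dilution principle behind the experiment can be sketched in a few lines; the peak areas, spike amount, and d3-labeled internal standard are hypothetical, and single-point quantitation simplifies the calibration used in practice.

```python
def isotope_dilution_amount(area_analyte, area_labeled, spike_ng, rf=1.0):
    """Single-point isotope dilution: analyte amount is the measured
    intensity ratio times the known spike, scaled by a response factor
    from a standard mixture. A simplified sketch of the general
    approach, not the exact protocol in the article."""
    return rf * spike_ng * (area_analyte / area_labeled)

# e.g. ion at m/z 194 (caffeine) vs m/z 197 (hypothetical caffeine-d3 spike)
print(isotope_dilution_amount(area_analyte=8.4e5, area_labeled=6.0e5,
                              spike_ng=500.0))
```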

  8. Quantitative analysis of norfloxacin by 1H NMR and HPLC.

    Science.gov (United States)

    Frackowiak, Anita; Kokot, Zenon J

    2012-01-01

    1H NMR and previously developed HPLC methods were applied to the quantitative determination of norfloxacin in a veterinary solution form for pigeons. Changes in concentration can lead to significant changes in the 1H chemical shifts of non-exchangeable aromatic protons as a result of extensive self-association phenomena. This chemical shift variation of the protons was analyzed and applied in the quantitative determination of norfloxacin. The method is simple, rapid, precise and accurate, and can be used for quality control of this drug.

  9. Gender Analysis of Risk in Innovation System

    DEFF Research Database (Denmark)

    Ayinde, Ope; Muchie, Mammo; Abaniyan, E. O.

    2011-01-01

    This study analyzed risk by gender in innovation in Kwara State, Nigeria, using downy mildew resistant maize production as a case study. The study employed primary and secondary data. The primary data were collected through well-structured questionnaires administered to both male and female producers of the new maize variety. The analytical tools used include descriptive statistics, a regression model, risk utility functions, and risk parameter analysis. The results showed that invasion by animals, disease and pest, lack of access to credit, wind, and price fluctuation were the major risks facing the maize producers in the area in using the new innovation. The study also revealed that male producers were more willing to take risks in producing the new maize variety than female producers, while the females were more indifferent than the males to the risk involved in producing the new variety. None...

  10. Intentional risk management through complex networks analysis

    CERN Document Server

    Chapela, Victor; Moral, Santiago; Romance, Miguel

    2015-01-01

    This book combines game theory and complex networks to examine intentional technological risk through modeling. As information security risks are in constant evolution,  the methodologies and tools to manage them must evolve to an ever-changing environment. A formal global methodology is explained  in this book, which is able to analyze risks in cyber security based on complex network models and ideas extracted from the Nash equilibrium. A risk management methodology for IT critical infrastructures is introduced which provides guidance and analysis on decision making models and real situations. This model manages the risk of succumbing to a digital attack and assesses an attack from the following three variables: income obtained, expense needed to carry out an attack, and the potential consequences for an attack. Graduate students and researchers interested in cyber security, complex network applications and intentional risk will find this book useful as it is filled with a number of models, methodologies a...

  11. Relative risk regression analysis of epidemiologic data.

    Science.gov (United States)

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risks processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks as have simulation studies of relative risk estimation
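
    Since the abstract presents relative risk regression as a generalization of the Mantel-Haenszel procedure, a sketch of the classical Mantel-Haenszel pooled relative risk may help fix ideas; the strata below are invented.

```python
def mantel_haenszel_rr(strata):
    """Mantel-Haenszel pooled relative risk across strata.

    Each stratum is (exposed_cases, exposed_total,
                     unexposed_cases, unexposed_total).
    Shown as the classical procedure that relative risk regression
    generalizes; the data are illustrative."""
    num = den = 0.0
    for a, n1, c, n0 in strata:
        N = n1 + n0
        num += a * n0 / N
        den += c * n1 / N
    return num / den

strata = [(12, 100, 5, 120),   # stratum 1: younger subjects
          (30, 90, 14, 160)]   # stratum 2: older subjects
print(f"MH relative risk = {mantel_haenszel_rr(strata):.2f}")
```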

  12. Quantitative analysis of sensor for pressure waveform measurement

    Directory of Open Access Journals (Sweden)

    Tyan Chu-Chang

    2010-01-01

    Full Text Available Abstract. Background: Arterial pressure waveforms contain important diagnostic and physiological information, since their contour depends on a healthy cardiovascular system [1]. A sensor was placed at the measured artery and some contact pressure was used to measure the pressure waveform. However, where should the sensor be located to detect a complete pressure waveform for diagnosis, and how much contact pressure should be applied over the pulse point? These two problems remain unresolved. Method: In this study, we propose a quantitative analysis to evaluate the pressure waveform for locating the position and applying the appropriate force between the sensor and the radial artery. A two-axis mechanism and a modified sensor were designed to estimate the radial arterial width and detect the contact pressure. The template matching method was used to analyze the pressure waveform. In the X-axis scan, we found that the arterial diameter changed waveform (ADCW) and the pressure waveform would change from small to large and then back to small again as the sensor was moved across the radial artery. In the Z-axis scan, we also found that the ADCW and the pressure waveform would change from small to large and then back to small again as the applied contact pressure continuously increased. Results: In the X-axis scan, the template correlation coefficients of the left and right boundaries of the radial arterial width were 0.987 ± 0.016 and 0.978 ± 0.028, respectively. In the Z-axis scan, when the excessive contact pressure was more than 100 mm Hg, the template correlation was below 0.983. In applying force, when using the maximum amplitude as the criterion level, the lower contact pressure (r = 0.988 ± 0.004) was better than the higher contact pressure (r = 0.976 ± 0.012). Conclusions: Although the optimal detection position has to be close to the middle of the radial artery, the pressure waveform also has a good completeness with
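
    The template-matching criterion can be sketched as a normalized (Pearson) correlation between a measured waveform and a stored template, as below; the idealized pulse shape is an assumption, not the authors' template.

```python
import numpy as np

def template_correlation(signal, template):
    """Pearson correlation between a detected pressure waveform and a
    stored template, as a completeness criterion. A minimal sketch;
    the article's template construction is not reproduced."""
    s = (signal - signal.mean()) / signal.std()
    t = (template - template.mean()) / template.std()
    return float(np.mean(s * t))

t_axis = np.linspace(0, 1, 200)
template = np.sin(np.pi * t_axis) ** 2            # idealized pulse shape
measured = template + 0.05 * np.random.default_rng(3).normal(size=200)
print(f"r = {template_correlation(measured, template):.3f}")  # ~0.99
```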

  13. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    Science.gov (United States)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features. The GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17), demonstrating the potential of GLCM-derived texture features for assessing vertebral fracture risk.
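
    The GLCM feature extraction described above can be sketched with scikit-image (which provides graycomatrix/graycoprops in recent versions); the random ROI stands in for a calibrated BMD map, and the distances and angles are illustrative choices.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Hypothetical 8-bit BMD map of the trabecular ROI (placeholder data)
roi = np.random.default_rng(7).integers(0, 256, (64, 64), dtype=np.uint8)

# Gray-level co-occurrence matrix at distance 1, two orientations
glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
corr = graycoprops(glcm, "correlation").mean()
print(f"GLCM correlation = {corr:.3f}")
```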

  14. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    Science.gov (United States)

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by (1) considering the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which establishment of acceptable and safe exposure levels were demonstrated.
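
    The margin-of-safety arithmetic at the core of the QRA can be sketched as follows; every figure (extractable amount, transfer fraction, use frequency, body weight, NOAEL) is a placeholder, not actual diaper data.

```python
def margin_of_safety(noael, exposure):
    # MOS = NOAEL / estimated exposure, both in mg/kg body weight/day
    return noael / exposure

# Placeholder exposure chain for one component:
# extractable amount per diaper x fraction transferred to skin
mg_per_use = 0.010 * 0.05
# 6 diaper changes per day, 8 kg infant
daily_exposure = mg_per_use * 6 / 8.0
print(f"MOS = {margin_of_safety(5.0, daily_exposure):.0f}")
```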

  15. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    Energy Technology Data Exchange (ETDEWEB)

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K. (Gradient Corporation, Cambridge, MA (United States))

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data.38 references.

  16. Quantitative Risk Assessment for Lung Cancer from Exposure to Metal Ore Dust

    Institute of Scientific and Technical Information of China (English)

    FU Hua; JING Xipeng; et al.

    1992-01-01

    To quantitatively assess the risk for lung cancer of metal miners, a historical cohort study was conducted. The cohort consisted of 1113 miners who were employed in underground work for at least 12 months between January 1, 1960 and December 12, 1974. According to the records of dust concentration, a cumulative dust dose for each miner in the cohort was estimated. There were 162 deaths in total and 45 deaths from lung cancer, with an SMR of 2184. The SMR for lung cancer increased from 1019 for those with a cumulative dust dose of less than 500 mg-year to 2469 for those with a dose greater than 4500 mg-year. Furthermore, the risk in the highest category of combined cumulative dust dose and cigarette smoking was 46-fold greater than in the lowest category of dust dose and smoking. This study showed that there was an exposure-response relationship between metal ore dust and lung cancer, and an interaction for lung cancer between smoking and metal ore dust exposure.
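
    The SMR figures quoted above follow from the usual observed-to-expected ratio, sketched below; the expected count is back-calculated from the reported SMR purely for illustration.

```python
def smr(observed, expected, scale=100):
    # Standardized mortality ratio, x100 as reported in the abstract
    return scale * observed / expected

# 45 observed lung-cancer deaths; the expected count below is
# back-calculated from the reported SMR of 2184, for illustration only
expected = 45 / 21.84
print(f"SMR = {smr(45, expected):.0f}")   # -> 2184
```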

  17. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  18. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    Science.gov (United States)

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.
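
    Quantitative interobserver ROI comparison of the kind such platforms automate is commonly based on overlap metrics; below is a sketch of the Dice similarity coefficient on binary masks (an assumed metric for illustration; the article's exact metrics are not reproduced).

```python
import numpy as np

def dice(a, b):
    """Dice similarity between two binary ROI masks, a standard
    interobserver agreement metric for contour comparison."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Two observers' hypothetical contours of the same target volume
obs1 = np.zeros((100, 100), dtype=bool); obs1[20:60, 20:60] = True
obs2 = np.zeros((100, 100), dtype=bool); obs2[25:65, 25:65] = True
print(f"Dice = {dice(obs1, obs2):.2f}")
```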

  19. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitatively characterised vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented to illustrate the platform's potential.

  20. [Improvement of legislation basis for occupational risk analysis