WorldWideScience

Sample records for quantitative risk analysis

  1. What is a risk? [Quantitative risk analysis]

    Energy Technology Data Exchange (ETDEWEB)

    Schoen, G [Physikalisch-Technische Bundesanstalt, Braunschweig (Germany, F.R.)]

    1979-02-01

    The following article is a revised version of a lecture given by the author during the VDE meeting 'Technical Expert Activities' in Brunswick. First, the concept of 'risk' is discussed, which leads to a probability scale that then permits a definition of the 'justifiable risk' as the boundary between 'hazard' and 'safety'. The boundary is quantified indirectly, from laws, regulations, instructions, etc. down to the 'technological rules' for special fields of application, by means of minimum-requirement data. These viewpoints, described in detail, are of substantial significance not only for the creation of safety regulations but also for their application and consequently for jurisdiction.

  2. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
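
    A quick way to see what a risk-adjusted net present value amounts to is the sketch below: the expected annual fire-loss reduction is treated as a cash inflow and discounted alongside the investment. All figures (cost, loss reduction, rate, horizon) are invented for illustration; the paper's Bayesian decision model is more elaborate.

        # Sketch: risk-adjusted NPV of a fire safety investment (illustrative numbers only).
        investment = 100_000.0            # up-front cost of the fire safety system
        annual_risk_reduction = 18_000.0  # expected yearly loss avoided (probability x loss)
        rate = 0.05                       # discount rate
        years = 15                        # time horizon

        ranpv = -investment + sum(
            annual_risk_reduction / (1.0 + rate) ** t for t in range(1, years + 1)
        )
        print(f"Risk-adjusted NPV: {ranpv:,.0f}")  # > 0 favors investing in the system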

  3. Preoperational quantitative risk analysis of a gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)]

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and the public's general safety. If the individual or societal risks are considered intolerable compared with international standards, mitigation measures are recommended until the risk associated with the operation reaches levels compatible with the best practices in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This calculation methodology centres on defining the frequencies of occurrence of events according to databases representative of each case under study. It also establishes the consequences according to the particular conditions of each area and the different possibilities of interference with the gas pipeline under study. For each interference, a typical curve of ignition probability as a function of distance to the pipe is developed. (author)
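
    The ignition-probability-versus-distance curve mentioned above can be sketched as a decaying function combined with a leak frequency. The exponential shape and all parameter values below are assumptions for illustration, not the curves derived in the paper.

        import math

        def ignition_probability(distance_m, p0=0.3, decay_length_m=50.0):
            # Hypothetical ignition probability versus distance to the pipe.
            return p0 * math.exp(-distance_m / decay_length_m)

        leak_frequency = 1.0e-4  # assumed interference-induced leak frequency (per km-year)
        for d in (10, 50, 100, 200):
            ignited = leak_frequency * ignition_probability(d)
            print(f"{d:4d} m: ignited-event frequency = {ignited:.2e} per km-year")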

  4. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations, COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The likelihood of occurrence of risk was analysed by logistic regression and percentages to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer-Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (method and process).
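
    The model-fit check reported (Hosmer-Lemeshow chi-square = 8.181, p = 0.300) can be reproduced in outline: fit a logistic regression, split observations into deciles of predicted risk, and compare observed with expected counts. The data below are synthetic, standing in for the 100 sampled analysts.

        import numpy as np
        from scipy.stats import chi2
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 2))                      # synthetic predictors
        y = (X[:, 0] + rng.normal(size=100) > 0).astype(int)
        p = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]

        def hosmer_lemeshow(y, p, groups=10):
            order = np.argsort(p)
            hl = 0.0
            for idx in np.array_split(order, groups):      # deciles of predicted risk
                n, obs, exp = len(idx), y[idx].sum(), p[idx].sum()
                hl += (obs - exp) ** 2 / exp + ((n - obs) - (n - exp)) ** 2 / (n - exp)
            return hl, chi2.sf(hl, groups - 2)             # p > 0.05: no evidence of misfit

        print(hosmer_lemeshow(y, p))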

  5. Risk management and analysis: risk assessment (qualitative and quantitative)

    OpenAIRE

    Valentin Mazareanu

    2007-01-01

    We usually define risk as the possibility of suffering a loss. Starting from this, risk management is defined as a business process whose purpose is to ensure that the organization is protected against risks and their effects. In order to prioritize, to develop a response plan and, after that, to monitor the identified risks, we need to assess them. But at this point a question arises: should I choose a qualitative approach or a quantitative one? This paper will give a short overview of the risk eva...

  6. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that, in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot-study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach.

  7. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and in the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims at subsidizing and providing orientation for developing emergency planning from the data obtained in quantitative risk analyses elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed, during the development of the research, that the data generated in these studies need complementation and deeper analysis so that it is possible to use them in emergency plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of the communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from quantitative risk analysis to develop emergency plans. (author)

  8. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  9. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  10. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work the quantitative risk analysis for the external public of the pipeline Cabiunas - REDUC (GASDUC III), 180 km long, linking the municipalities of Macae and Duque de Caxias - RJ, was performed by the companies PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm{sup 2}), carrying natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, obtaining as a result individual risk contours with frequencies of 1x10{sup -06} per year involving sensitive occupations, therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas, with their respective FN-curves situated below the advised limit established by INEA, except for two areas that required the proposal of additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN-curve should lie below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures, the results lay below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)
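
    The societal-risk comparison described, an FN-curve checked against a limit line, reduces to a cumulative-frequency computation over accident scenarios. The scenario list and the criterion line below are invented placeholders, not INEA's actual values.

        import numpy as np

        # Hypothetical scenarios: (annual frequency, expected number of fatalities)
        scenarios = [(2e-5, 1), (8e-6, 5), (3e-6, 20), (5e-7, 80)]

        Ns = np.arange(1, 101)
        # F(N): cumulative frequency of scenarios causing at least N fatalities
        F = np.array([sum(f for f, n in scenarios if n >= N) for N in Ns])

        criterion = 1e-3 / Ns**2  # illustrative F <= C/N^2 limit line only
        print("FN-curve below criterion:", bool(np.all(F <= criterion)))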

  11. Risk analysis of heat recovery steam generator with semi-quantitative risk-based inspection API 581

    Science.gov (United States)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the power plant, to stop operating, and it can furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guideline API 581 by the American Petroleum Institute has been used for the risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk, and in fact there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing the risk assessment activities.
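
    The letter-number rankings quoted (4C, 3C) come from a probability-consequence risk matrix. The sketch below shows a simplified matrix lookup whose banding is chosen to reproduce the rankings in this abstract; the actual API 581 matrix differs in detail.

        # Simplified risk matrix: probability category 1-5 x consequence category A-E.
        # Banding is illustrative (4C -> medium-high, 3C -> medium, as in the abstract).
        RISK_BAND = {1: "low", 2: "low", 3: "low", 4: "medium", 5: "medium",
                     6: "medium-high", 7: "high", 8: "high", 9: "high"}

        def risk_level(prob_cat: int, cons_cat: str) -> str:
            score = prob_cat + "ABCDE".index(cons_cat)  # 1A -> 1 ... 5E -> 9
            return RISK_BAND[score]

        for item, cell in [("HP superheater", (4, "C")), ("HP economizer", (3, "C"))]:
            print(item, cell, "->", risk_level(*cell))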

  12. Risk analysis of heat recovery steam generator with semi-quantitative risk-based inspection API 581

    International Nuclear Information System (INIS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-01-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the power plant, to stop operating, and it can furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guideline API 581 by the American Petroleum Institute has been used for the risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk, and in fact there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing the risk assessment activities.

  13. Risk analysis of heat recovery steam generator with semi-quantitative risk-based inspection API 581

    Energy Technology Data Exchange (ETDEWEB)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky [Department of Mechanical Engineering, Diponegoro University, Semarang (Indonesia); Kim, Seon Jin [Department of Mechanical & Automotive Engineering of Pukyong National University (Korea, Republic of)

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the power plant, to stop operating, and it can furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guideline API 581 by the American Petroleum Institute has been used for the risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk, and in fact there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing the risk assessment activities.

  14. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    ... are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this Ph.D. thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes ..., but also when wading through a flooded area. The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused ...

  15. Quantitative risk analysis in two pipelines operated by TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Bittencourt, Euclides [Centro Universitario FIB, Salvador, BA (Brazil)]

    2009-07-01

    Transportation risk analysis techniques were used to study two pipelines operated by TRANSPETRO. Pipeline A is used for the simultaneous transportation of diesel, gasoline and LPG and comprises three parts, all of them crossing rural areas. Pipeline B is used for oil transportation, and one of its ends is located in a high-density population area. Both pipelines had their risk studied using the PHAST RISK{sup R} software, and the individual risk measures, the only measures considered for licensing purposes in this type of study, presented levels far below the maximum tolerable levels considered. (author)

  16. Quantitative Risks

    Science.gov (United States)

    2015-02-24

    design failure modes and effects analysis (DFMEA), (b) Fault Tree Analysis (FTA) for all essential functions listed in the Failure Definition and... subsystem reliability data from Accomplishment 3, completed (a) updated DFMEA, (b) updated FTA, (c) updated reliability and maintainability estimates, (d)... www.gao.gov/assets/660/658615.pdf [4] Karen Richey. Update to GAO's Cost Estimating Assessment Guide and Scheduling Guide (draft). GAO. Mar 2013

  17. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

    The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in an attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but moves it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts, ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
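
    The core computation of capacity-demand analysis, the probability that demand exceeds capacity given their statistical distributions, can be sketched with a simple Monte Carlo sample. The distributions and parameters below are invented stand-ins for real material-strength and loading data.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # Hypothetical distributions for one part (units: MPa)
        capacity = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # strength
        demand = rng.normal(loc=220.0, scale=25.0, size=n)                # acting stress

        p_failure = np.mean(capacity < demand)
        print(f"Estimated probability of failure: {p_failure:.2e}")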

  18. Quantitative risk analysis offshore - Human and organizational factors

    International Nuclear Information System (INIS)

    Espen Skogdalen, Jon; Vinnem, Jan Erik

    2011-01-01

    Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models and at the fact that QRAs do not include human and organizational factors (HOF factors). Norwegian and UK offshore legislation and guidelines require that HOF factors be included in the QRAs. A study of 15 QRAs shows that these factors are included to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF factors at all. Relevant research projects have been conducted that fulfill the requirements of Level 3 analyses. At this level, there is a systematic collection of data related to HOF, the methods are systematic and documented, and the QRAs are adjusted. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include the model, describe the HOF factors, and explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and to speed up the development.

  19. From risk analysis to risk control in land transport of dangerous materials. Contribution of quantitative evaluation

    International Nuclear Information System (INIS)

    Hubert, Ph.; Pages, P.

    1985-03-01

    The different approaches to risk and risk management systems are described: statistics, potential risk, prevention, information and intervention. Quantitative evaluation is developed: data collection, purposes and methods. Two examples of application are given on the risks associated with the road transport of propane and of uranium hexafluoride. In conclusion, the level of risk and the practical use of risk studies are examined. 41 refs [fr]

  20. Use of quantitative uncertainty analysis for human health risk assessment

    International Nuclear Information System (INIS)

    Duncan, F.L.W.; Gordon, J.W.; Kelly, M.

    1994-01-01

    Current human health risk assessment methods for environmental risks typically use point estimates of risk accompanied by qualitative discussions of uncertainty. Alternatively, Monte Carlo simulations may be used with distributions for input parameters to estimate the resulting risk distribution and descriptive risk percentiles. These two techniques are applied to the ingestion of 1,1-dichloroethene in ground water. The results indicate that Monte Carlo simulations provide significantly more information for risk assessment and risk management than do point estimates.
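
    The two techniques contrasted here can be sketched with the standard ingestion-dose equation, risk = (C x IR x EF x ED) / (BW x AT) x SF: a point estimate plugs in single values, while Monte Carlo propagates distributions. All distributions and the slope factor below are placeholders, not the study's values.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        C = rng.lognormal(np.log(0.01), 0.5, n)  # concentration in ground water (mg/L)
        IR = rng.triangular(1.0, 2.0, 3.0, n)    # water intake rate (L/day)
        EF, ED = 350.0, 30.0                     # exposure frequency (d/yr), duration (yr)
        BW = rng.normal(70.0, 10.0, n)           # body weight (kg)
        AT = 70.0 * 365.0                        # averaging time (days)
        SF = 0.6                                 # hypothetical slope factor, (mg/kg-day)^-1

        risk = C * IR * EF * ED / (BW * AT) * SF
        print("median risk:", np.median(risk))
        print("95th percentile:", np.percentile(risk, 95))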

  1. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  2. A Quantitative Risk Analysis of Deficient Contractor Business System

    Science.gov (United States)

    2012-04-30

    Mathematically, Jorion's concept of VaR looks like this: P(L > VaR) ≤ 1 − c (2), where L is the portfolio loss and c is the confidence level. ... presents three models for calculating VaR. The local-valuation method determines the value of a portfolio once and uses mathematical derivatives... management. In the insurance industry, actuarial data is applied to model risk, and risk capital reserves are "held" to cover the expected values for
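
    The definition reconstructed above can be made concrete with a historical-simulation sketch: the VaR at confidence c is the loss quantile beyond which losses occur with probability at most 1 − c. The loss series below is synthetic.

        import numpy as np

        rng = np.random.default_rng(7)
        losses = rng.normal(0.0, 1_000_000.0, 10_000)  # synthetic daily losses

        c = 0.95
        var = np.quantile(losses, c)                   # historical-simulation VaR
        print(f"95% VaR: {var:,.0f}")
        print("empirical P(L > VaR):", np.mean(losses > var))  # roughly 1 - c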

  3. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  5. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)
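
    One plausible reading of "calibration of semi-quantitative estimates to failure rates for peer pipeline systems" is a regression of observed peer failure rates on index scores, as sketched below. The score-rate pairs and the exponential form are assumptions for illustration.

        import numpy as np

        # Hypothetical peer data: (semi-quantitative likelihood score, failures per km-yr)
        scores = np.array([20.0, 40.0, 60.0, 80.0])
        rates = np.array([2e-5, 8e-5, 3e-4, 1.2e-3])

        # Fit log(rate) = a + b * score, i.e. rate = exp(a) * exp(b * score)
        b, a = np.polyfit(scores, np.log(rates), 1)

        def score_to_rate(score):
            return float(np.exp(a + b * score))

        print(f"calibrated frequency for score 55: {score_to_rate(55.0):.2e} per km-yr")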

  6. Quantitative risk analysis for landslides ‒ Examples from Bíldudalur, NW-Iceland

    Directory of Open Access Journals (Sweden)

    R. Bell

    2004-01-01

    Full Text Available Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk by 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, only risk analysis is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existing vector data are transferred into raster data using a resolution of 1 m x 1 m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process. Within the quantitative
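
    The raster-based risk function described is, in essence, a cell-by-cell product of hazard, impact-probability, vulnerability and elements-at-risk layers. The sketch below uses small random NumPy arrays as stand-ins for the 1 m x 1 m grids; every layer value is invented.

        import numpy as np

        rng = np.random.default_rng(3)
        shape = (100, 100)  # stand-in for the 1 m x 1 m raster grids

        hazard = rng.uniform(0, 1e-2, shape)      # annual probability of a damaging event
        p_spatial = rng.uniform(0, 1, shape)      # probability of spatial impact
        p_temporal = rng.uniform(0, 1, shape)     # probability of temporal impact
        p_seasonal = rng.uniform(0, 1, shape)     # probability of seasonal occurrence
        vulnerability = rng.uniform(0, 1, shape)  # expected degree of loss given impact
        persons = rng.integers(0, 4, shape)       # elements at risk: people per cell

        risk_to_life = (hazard * p_spatial * p_temporal * p_seasonal
                        * vulnerability * persons)
        print("expected annual loss of life (sum over grid):", risk_to_life.sum())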

  7. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    Science.gov (United States)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis replicability at regional and national scale.
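
    The event-tree calculation behind "expected annual affected population" amounts to summing probability-weighted consequences over branches. A toy two-event sketch with invented numbers, not the Oliva model itself:

        # Toy event tree: flood event -> warning outcome -> population affected.
        events = [
            # (annual probability of flood, [(branch probability, affected people), ...])
            (0.02, [(0.7, 150), (0.3, 900)]),    # warning works / warning fails
            (0.005, [(0.7, 800), (0.3, 3000)]),
        ]

        eaap = sum(p_event * sum(p_br * n for p_br, n in branches)
                   for p_event, branches in events)
        print("expected annual affected population:", eaap)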

  8. New Approaches to Transport Project Assessment: Reference Scenario Forecasting and Quantitative Risk Analysis

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2010-01-01

    however has proved that the point estimates derived from such analyses are embedded with a large degree of uncertainty. Thus, a new scheme was proposed in terms of applying quantitative risk analysis (QRA) and Monte Carlo simulation in order to represent the uncertainties within the cost-benefit analysis....... Additionally, the handling of uncertainties is supplemented by making use of the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits (user demands i.e. travel time savings) and underestimating investment costs....

  9. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  10. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    Science.gov (United States)

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case-study, we are considering the occurrence of listeriosis in smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors of the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported result obtained for the same period and for the same population. Variability in the final risk estimate is determined by the variability regarding (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability regarding the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of the variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need to gather more information on the reference growth rate and the minimum
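
    The two-dimensional structure referred to separates uncertainty (an outer loop over hyperparameters) from variability (an inner loop over individual servings), and Spearman correlation then ranks input influence. The sketch below uses generic stand-in growth and dose-response relations, not the listeriosis model itself.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(11)
        n_outer, n_inner = 50, 2_000  # uncertainty x variability iterations

        risks = []
        for _ in range(n_outer):
            growth_rate = rng.normal(0.3, 0.05)        # uncertain hyperparameter
            temp = rng.normal(7.0, 2.0, n_inner)       # variable fridge temperature (degC)
            dose = np.exp(growth_rate * np.clip(temp - 2.0, 0.0, None))  # stand-in growth
            p_ill = 1.0 - np.exp(-1e-6 * dose)         # stand-in dose-response
            risks.append(p_ill.mean())

        rho, _ = spearmanr(temp, p_ill)                # sensitivity from the last outer run
        print("risk estimate (mean over uncertainty):", np.mean(risks))
        print("Spearman rho, temperature vs risk:", rho)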

  11. Quantitative Risk Analysis of a Pervaporation Process for Concentrating Hydrogen Peroxide

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Ho Jin; Yoon, Ik Keun [Korea Gas Corporation, Ansan (Korea, Republic of); Choi, Soo Hyoung [Chonbuk National University, Jeonju (Korea, Republic of)

    2014-12-15

    Quantitative risk analysis has been performed for a pervaporation process for production of high test peroxide. Potential main accidents are explosion and fire caused by a decomposition reaction. As the target process has a laboratory scale, the consequence is considered to belong to Category 3. An event tree has been developed as a model for occurrence of a decomposition reaction in the target process. The probability functions of the accident causes have been established based on the frequency data of similar events. Using the constructed model, the failure rate has been calculated. The result indicates that additional safety devices are required in order to achieve an acceptable risk level, i.e. an accident frequency less than 10{sup -4}/yr. Therefore, a layer of protection analysis has been applied. As a result, it is suggested to introduce inherently safer design to avoid catalytic reaction, a safety instrumented function to prevent overheating, and a relief system that prevents explosion even if a decomposition reaction occurs. The proposed method is expected to contribute to developing safety management systems for various chemical processes including concentration of hydrogen peroxide.
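
    The layer-of-protection arithmetic implied here is a product of the initiating-event frequency with each layer's probability of failure on demand (PFD), checked against the 10{sup -4}/yr target. All numbers below are hypothetical.

        initiating_frequency = 0.1  # hypothetical decomposition-initiating upsets per year
        pfds = {
            "inherently safer design (no catalytic surfaces)": 0.1,
            "safety instrumented function against overheating": 0.01,
            "relief system": 0.1,
        }

        mitigated = initiating_frequency
        for layer, pfd in pfds.items():
            mitigated *= pfd  # protection layers assumed independent

        target = 1e-4  # tolerable accident frequency per year
        print(f"mitigated frequency: {mitigated:.0e}/yr; meets target: {mitigated <= target}")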

  12. Quantitative Structure activity relationship and risk analysis of some pesticides in the cattle milk

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad, Ijaz Javed, Masood Akhtar, Zia-ur-Rahman, Mian Muhammad Awais, Muhammad Kashif Saleemi and Muhammad Irfan Anwar

    2012-10-01

    Full Text Available Milk of cattle was collected from various localities of Faisalabad, Pakistan. Pesticide concentrations were determined by HPLC using solid phase microextraction. The residue analysis revealed that about 40% of milk samples were contaminated with pesticides. The mean±SE levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.38±0.02, 0.26±0.02, 0.072±0.01 and 0.085±0.02, respectively. Quantitative structure activity relationship (QSAR) models were used to predict the residues of unknown pesticides in the milk of cattle using their known physicochemical properties such as molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w), as well as the milk characteristics such as pH, % fat, and specific gravity (SG) in this species. The analysis revealed a good correlation coefficient (R2 = 0.91) for the cattle QSAR model. The coefficient for Ko/w for the studied pesticides was higher in cattle milk. Risk analysis was conducted based upon the determined pesticide residues and their provisional tolerable daily intakes. The daily intake levels of pesticide residues including cyhalothrin, chlorpyrifos and cypermethrin in the present study were 3, 11 and 2.5 times higher, respectively, in cattle milk. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.
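
    A QSAR model of the kind described is, at its simplest, a regression of residue level on physicochemical and milk properties. The sketch below fits such a model to the four mean residue levels quoted in the abstract, with approximate descriptor values chosen for illustration; with so few samples the fit is exact, so a real QSAR needs many more compounds.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Columns: MW, MP (degC), log Ko/w, milk pH, % fat, specific gravity (illustrative)
        X = np.array([
            [449.9, 49.2, 6.9, 6.6, 4.1, 1.030],  # cyhalothrin-like descriptors
            [406.9, 70.0, 3.8, 6.6, 4.1, 1.030],  # endosulfan-like
            [350.6, 42.0, 4.7, 6.6, 4.1, 1.030],  # chlorpyrifos-like
            [416.3, 61.5, 6.6, 6.6, 4.1, 1.030],  # cypermethrin-like
        ])
        y = np.array([0.38, 0.26, 0.072, 0.085])  # mean residues (ppm) from the abstract

        model = LinearRegression().fit(X, y)
        print("R2 on training data:", model.score(X, y))  # exact with only 4 samples
        print("predicted residue (ppm):",
              model.predict([[380.0, 55.0, 5.5, 6.6, 4.1, 1.030]])[0])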

  13. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk.

    Science.gov (United States)

    Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan

    2013-01-04

    The detection and quantification of different pesticides in the goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; whereas, methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in the goat milk using their known physicochemical characteristics including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w) in relation to the characteristics such as pH, % fat, specific gravity and refractive index of goat milk. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index for the studied pesticides were higher in goat milk. This suggests that these are better determinants for pesticide residue prediction in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, risk analysis was also conducted, which showed that daily intake levels of pesticide residues including cyhalothrin, chlorpyrifos and cypermethrin in the present study are 2.68, 5.19 and 2.71 times higher, respectively, in the goat milk. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.

  14. Quantitative Structure Activity Relationship and Risk Analysis of Some Pesticides in the Goat milk

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2013-01-01

    Full Text Available The detection and quantification of different pesticides in the goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; whereas, methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in the goat milk using their known physicochemical characteristics including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w) in relation to the characteristics such as pH, % fat, specific gravity and refractive index of goat milk. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index for the studied pesticides were higher in goat milk. This suggests that these are better determinants for pesticide residue prediction in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, risk analysis was also conducted, which showed that daily intake levels of pesticide residues including cyhalothrin, chlorpyrifos and cypermethrin in the present study are 2.68, 5.19 and 2.71 times higher, respectively, in the goat milk. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.

  15. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs in devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed.

  16. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concept of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and example experiments, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration and quantitative analysis.

  17. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  18. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and

  19. Are risks quantitatively determinable

    International Nuclear Information System (INIS)

    Buetzer, P.

    1985-01-01

    ''Chemical risks'' can only be determined with accurate figures in a few extraordinary cases. The difficulties lie, as has been shown by the example of the Flixborough catastrophe, mostly in the determination of the probabilities of occurrence. With a rough semi-quantitative estimate of the potential hazards and the corresponding probabilities, we can predict the risks with astonishing accuracy. Statistical data from incidents in the chemical industry are very useful, and they also show that ''chemical catastrophes'' are only to a very small extent initiated by uncontrolled chemical reactions. (orig.) [de]

  20. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Full Text Available Diversification of the methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators that can be used to assess the risk of bankruptcy of enterprises, but to make an assessment the number of indicators needs to be reduced, and this can be achieved through principal component, cluster and discriminant analysis techniques. In this context, the article aims to build a scoring function used to identify bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
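
    The pipeline described, reducing many financial ratios with principal component analysis and then building a discriminant scoring function, maps directly onto standard tooling. A sketch on synthetic data, not the Bucharest Stock Exchange sample:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        n = 200
        ratios = rng.normal(size=(n, 10))  # 10 synthetic financial ratios
        bankrupt = (ratios[:, 0] - ratios[:, 1] + rng.normal(size=n) < -1).astype(int)

        # Reduce dimensionality, then derive a discriminant (Z-type) scoring function
        scorer = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
        scorer.fit(ratios, bankrupt)

        z_scores = scorer.decision_function(ratios)  # the scoring function itself
        print("training accuracy:", scorer.score(ratios, bankrupt))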

  1. Quantitative Structure activity relationship and risk analysis of some pesticides in the cattle milk

    OpenAIRE

    Faqir Muhammad, Ijaz Javed, Masood Akhtar, Zia-ur-Rahman, Mian Muhammad Awais, Muhammad Kashif Saleemi and Muhammad Irfan Anwar

    2012-01-01

    Milk of cattle was collected from various localities of Faisalabad, Pakistan. Pesticides concentration was determined by HPLC using solid phase microextraction. The residue analysis revealed that about 40% milk samples were contaminated with pesticides. The mean±SE levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.38±0.02, 0.26±0.02, 0.072±0.01 and 0.085±0.02, respectively. Quantitative structure activity relationship (QSAR) models were used to predict the residues...

  2. Qualitative and Quantitative Analysis of Off-Shore Wind Energy Project’s Risks

    Directory of Open Access Journals (Sweden)

    Sayed Amir Hamzeh Mirkheshti

    2017-01-01

    Full Text Available The benefits of wind power can address the issue of growing power consumption with insufficient distribution facilities. Based on extensive research covering more than 20 studies, this study explores the risks associated with off-shore wind energy in the Persian Gulf in Iran. This paper tries to identify the risks in the related off-shore wind energy project in order to specify, through qualitative analysis of the impact and probability of every risk, which variables have the most impact on the project. A survey was conducted in order to determine the relative importance of variables and risks. Certain key components in the completion of the project should be taken into account, such as technology, the research team, expert teams (personnel who have a good knowledge of this industry), and choosing the right spot where the wind farms will be located. The objective of this paper is to present the variables encountered in a wind power project and to highlight the risks that must be controlled by the project developers, project team, supply chain actors, manufacturers, and all the stakeholders involved in the successful completion of a project.

  3. Quantitative and qualitative analysis of the expert and non-expert opinion in fire risk in buildings

    International Nuclear Information System (INIS)

    Hanea, D.M.; Jagtman, H.M.; Alphen, L.L.M.M. van; Ale, B.J.M.

    2010-01-01

    The expert judgment procedure is a method very often used in the area of risk assessment of complex systems or processes to fill in quantitative data. Although it has proved to be a very reliable source of information when no other data are available, the choice of experts is always questioned. When the available data are limited, the seed questions cover the domains of expertise only partially, which may cause problems: expertise is assessed not on the full object of study but only on those topics for which seed questions can be formulated. The commonly used quantitative analysis of an expert judgment exercise is combined here with a qualitative analysis. The latter adds more insight into the relation between the assessors' field and statistical knowledge and their performance in an expert judgment. In addition, the qualitative analysis identifies different types of seed questions. Three groups of assessors with different levels of statistical and domain knowledge are studied. The quantitative analysis shows no differences between field experts and non-experts and no differences between having advanced statistical knowledge or not. The qualitative analysis supports these findings. In addition, it is found that especially technical questions are answered with larger intervals. Caution is required when using seed questions for which the real value can be calculated, which was the case for one of the seed questions.

  4. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes, landslides and soil erosion, which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003) and their integration into a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories at high risk from geomorphologic processes with a high degree of occurrence, and represent a useful tool in the process of spatial planning.
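
    The USLE part of the model is a per-cell product of factor rasters, A = R * K * LS * C * P (mean annual soil loss). A NumPy sketch with invented factor grids and an illustrative vulnerability threshold:

        import numpy as np

        rng = np.random.default_rng(9)
        shape = (200, 200)  # stand-in raster grid

        R = rng.uniform(60, 120, shape)    # rainfall erosivity
        K = rng.uniform(0.1, 0.5, shape)   # soil erodibility
        LS = rng.uniform(0.5, 5.0, shape)  # slope length-steepness factor
        C = rng.uniform(0.01, 0.4, shape)  # cover-management factor
        P = rng.uniform(0.5, 1.0, shape)   # support-practice factor

        A = R * K * LS * C * P             # mean annual soil loss (t/ha/yr)
        vulnerable = A > 10.0              # illustrative high-erosion threshold
        print("share of high-erosion cells:", vulnerable.mean())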

  5. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    Science.gov (United States)

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.
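
    A small sketch, on synthetic data, of the multivariable logistic regression used to test such predictors; the variable names, coefficients and sample size are invented stand-ins for the study's measurements (statsmodels is assumed to be available).

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 227
        # Synthetic predictors: fraction of hematoma drained and postoperative
        # hematoma density in Hounsfield units (made up, not the study's data).
        drained = rng.uniform(0.4, 1.0, n)
        density = rng.normal(60.0, 10.0, n)
        logit = -1.5 - 3.0 * (drained - 0.7) + 0.08 * (density - 60.0)
        recurrence = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = sm.add_constant(np.column_stack([drained, density]))
        fit = sm.Logit(recurrence, X).fit(disp=0)
        odds_ratios = np.exp(fit.params[1:])   # per unit of each predictor
        print(dict(zip(["drained", "density"], np.round(odds_ratios, 2))))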

  6. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Data requirements and end-user training have to be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  7. Challenges in Risk Assessment: Quantitative Risk Assessment

    OpenAIRE

    Jacxsens, Liesbeth; Uyttendaele, Mieke; De Meulenaer, Bruno

    2016-01-01

    The process of risk analysis consists of three components: risk assessment, risk management and risk communication. These components have been spread internationally by the Codex Alimentarius Commission as the basis for setting science-based standards and criteria on food safety hazards, e.g. maximum limits for mycotoxins in foodstuffs. However, the technical component, risk assessment, is hard to elaborate and to understand. Key in a risk assessment is the translation of biological or...

  8. A Statistical-Probabilistic Pattern for Determination of Tunnel Advance Step by Quantitative Risk Analysis

    Directory of Open Access Journals (Sweden)

    Sasan Ghorbani

    2017-12-01

    Full Text Available One of the main challenges faced in the design and construction phases of tunneling projects is the determination of the maximum allowable advance step, so as to maximize the excavation rate and reduce project delivery time. Considering the complexity of determining this factor and the unexpected risks associated with setting it inappropriately, it is necessary to employ a method capable of accounting for interactions among uncertain geotechnical parameters and the advance step. The main objective of the present research is to undertake optimization and risk management of the advance step length in the water diversion tunnel at Shahriar Dam based on the uncertainty of geotechnical parameters, following a statistical-probabilistic approach. In order to determine the optimum advance step for the excavation operation, two hybrid methods were used: strength reduction method / discrete element method / Monte Carlo simulation (SRM/DEM/MCS) and strength reduction method / discrete element method / point estimate method (SRM/DEM/PEM). Moreover, Taguchi analysis was used to investigate the sensitivity of the advance step to changes in the statistical distribution functions of the input parameters under three tunneling scenarios at sections of poor to good quality (as per the RMR classification system). The final results implied the optimality of the advance step defined in Scenario 2, where an advance of 2 m per excavation round was proposed, according to the shear strain criterion and SRM/DEM/MCS, with a minimum failure probability of 8.05% and a risk of $75,281.56, at the 95% confidence level. Moreover, for each of the normal, lognormal, and gamma distributions, as the advance step increased from Scenario 1 to 2, the failure probability was observed to increase at a lower rate than when the advance step in Scenario 2 was increased to that in Scenario 3. In addition, the Taguchi tests were subjected to signal-to-noise analysis and the results indicated that, considering the three statistical
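
    The Monte Carlo stage of such a hybrid scheme can be sketched as follows; the closed-form factor-of-safety function below merely stands in for the SRM/DEM runs of the actual study, and all distributions, thresholds and costs are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        # Uncertain geotechnical inputs: lognormal cohesion (kPa) and normal
        # friction angle (degrees); the moments are made up for the example.
        cohesion = rng.lognormal(mean=np.log(35.0), sigma=0.3, size=n)
        phi = rng.normal(30.0, 4.0, size=n)

        def factor_of_safety(c, phi_deg, advance_step_m):
            # Stand-in response surface: FS falls as the unsupported advance
            # step grows (the real study evaluates FS with SRM/DEM models).
            return (c / 40.0 + np.tan(np.radians(phi_deg))) / (0.45 * advance_step_m)

        consequence_cost = 75_000.0   # USD per failure, illustrative
        for step in (1.0, 2.0, 3.0):
            p_f = np.mean(factor_of_safety(cohesion, phi, step) < 1.0)
            print(f"advance step {step:.0f} m: "
                  f"Pf = {p_f:.4f}, risk = {p_f * consequence_cost:,.0f} USD")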

  9. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of the Fe3+/Fe2+ concentration ratio, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  10. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    Science.gov (United States)

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  11. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per-coronary-segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In the qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  12. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    Science.gov (United States)

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

    This work studies the quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils marketed in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in oil samples was 54.37 μg kg⁻¹. Low molecular weight PAH compounds were the predominant contamination. In particular, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg⁻¹, which was lower than the limits set by the European Union and China. A preliminary human health risk assessment for PAHs was accomplished using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10⁻⁶, indicating a high potential carcinogenic risk for the dietary-exposed populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
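
    The ILCR behind such an assessment follows the generic chronic-daily-intake form sketched below; the intake rate, exposure duration and body weight are illustrative assumptions, while 7.3 (mg/kg/day)^-1 is the commonly cited oral slope factor for BaP.

        # Incremental lifetime cancer risk from dietary BaP-equivalent intake.
        def ilcr(conc_ug_per_kg, intake_kg_per_day, exposure_years,
                 body_weight_kg, slope_factor, lifetime_years=70.0):
            # Chronic daily dose averaged over a lifetime, in mg/kg/day.
            dose = (conc_ug_per_kg * 1e-3 * intake_kg_per_day * exposure_years) \
                   / (body_weight_kg * lifetime_years)
            return dose * slope_factor

        # BaP toxic-equivalent concentration of 1.28 ug/kg oil, adult scenario.
        risk = ilcr(conc_ug_per_kg=1.28, intake_kg_per_day=0.025,
                    exposure_years=30.0, body_weight_kg=60.0, slope_factor=7.3)
        print(f"ILCR = {risk:.2e}")   # values above 1e-6 flag potential concern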

  13. Quantitative risk assessment via uncertainty analysis in combination with error propagation for the determination of the dynamic Design Space of the primary drying step during freeze-drying

    DEFF Research Database (Denmark)

    Van Bockstal, Pieter Jan; Mortier, Séverine Thérèse F.C.; Corver, Jos

    2017-01-01

    of a freeze-drying process, allowing to quantitatively estimate and control the risk of cake collapse (i.e., the Risk of Failure (RoF)). The propagation of the error on the estimation of the thickness of the dried layer Ldried as function of primary drying time was included in the uncertainty analysis...

  14. Quantitative risk analysis of gas explosions in tunnels; probability, effects, and consequences

    NARCIS (Netherlands)

    Weerheijm, J.; Voort, M.M. van der; Verreault, J.; Berg, A.C. van den

    2015-01-01

    Tunnel accidents with transports of combustible liquefied gases may lead to explosions. Depending on the substance involved this can be a Boiling Liquid Expanding Vapour Explosion (BLEVE), a Gas Expansion Explosion (GEE) or a gas explosion. Quantification of the risk of these scenarios is important

  15. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Science.gov (United States)

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species-specific, biologically and technically complex, and data-hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
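
    A presence/absence model of the simplest kind described here fits in a few lines: each annual time step, the organism occupies every climatically suitable cell within a fixed dispersal radius of an occupied cell. Grid size, suitability mask and radius are invented for the example, and scipy is assumed to be available.

        import numpy as np
        from scipy.ndimage import binary_dilation

        rng = np.random.default_rng(3)
        size, years, radius = 60, 10, 2            # grid, horizon, dispersal
        suitable = rng.random((size, size)) < 0.7  # habitat-and-climate mask
        present = np.zeros((size, size), bool)
        present[size // 2, size // 2] = suitable[size // 2, size // 2] = True

        yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        kernel = (yy ** 2 + xx ** 2) <= radius ** 2

        for _ in range(years):
            # Spread to all suitable cells within the dispersal radius.
            present = binary_dilation(present, structure=kernel) & suitable

        print(f"occupied cells after {years} years: {int(present.sum())}")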

  16. Quantitative Identification of Construction Risk

    OpenAIRE

    Kasprowicz T.

    2017-01-01

    Risks pertaining to construction work relate to situations in which various events may randomly change the duration and cost of the project or worsen its quality. Because random events can bring significant changes, favorable, moderate, and difficult conditions of construction work are considered. This is the first stage of the construction risk analysis. The probabilistic parameters of construction are identified and described by using the design characteristics model of the structure and...

  17. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk

    OpenAIRE

    Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan

    2013-01-01

    The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean ± SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34 ± 0.007, 0.063 ± 0.002, 0.034 ± 0.002 and 0.092 ± 0.002, respectively; whereas methyl parathion was not detected in an...

  18. Quantitative risk analysis using vulnerability indicators to assess food insecurity in the Niayes agricultural region of West Senegal

    Directory of Open Access Journals (Sweden)

    Mateugue Diack

    2017-11-01

    Full Text Available There is an increasing need to develop indicators of vulnerability and adaptive capacity to determine the robustness of response strategies over time and to better understand the underlying processes. This study aimed to determine levels of risk of food insecurity using defined vulnerability indicators. For the purpose of this study, factors influencing food insecurity and different vulnerability indicators were examined using quantitative and qualitative research methods. Observations made on the physical environment (using tools for spatial analysis) and socio-economic surveys conducted with local populations have quantified vulnerability indicators in the Niayes agricultural region. Application of the Classification and Regression Tree (CART) model has enabled us to quantify the level of vulnerability of the zone. The results show that the decrease in agricultural surface areas is the most discriminant indicator in this study. The speed of reduction of the agricultural areas has especially increased between 2009 and 2014, with a loss of 65% of these areas. Therefore, a decision-making system centred on the need to reinforce the resilience of local populations, by preserving the agricultural vocation of the Niayes region and even of the Sahelian regions, requires support and extension services for the farmers in order to promote sustainable agricultural practices.
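
    A minimal sketch of fitting a CART model to vulnerability indicators, on invented household-level data; the indicator names and the rule generating the labels are illustrative only (scikit-learn is assumed to be available).

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(11)
        n = 300
        # Made-up indicators: loss of agricultural area (%), distance to
        # market (km) and household income (USD/year).
        area_loss = rng.uniform(0, 80, n)
        distance = rng.uniform(1, 40, n)
        income = rng.uniform(200, 3000, n)
        # Label driven mainly by area loss, echoing the study's finding that
        # the decrease in agricultural surface is the most discriminant factor.
        insecure = ((area_loss > 40) & (income < 1500)).astype(int)

        X = np.column_stack([area_loss, distance, income])
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, insecure)
        print(export_text(tree, feature_names=["area_loss", "distance", "income"]))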

  19. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does, or even can, occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of quantifying risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  1. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  2. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  3. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing the quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer and the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  4. Quantitative risk trends deriving from PSA-based event analyses. Analysis of results from U.S.NRC's accident sequence precursor program

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2004-01-01

    The United States Nuclear Regulatory Commission (U.S.NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk-significant events as precursors that occurred at U.S. nuclear power plants. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk-significant insights and for monitoring risk trends in the nuclear power industry, there have been only a few attempts to determine and develop the trends using the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators derived from the results of the ASP analysis: the occurrence frequency of precursors and the annual core damage probability. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and that the likelihood of risk-significant events has been decreasing remarkably. As well, the present study demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring the risk trends and/or risk characteristics of the nuclear power industry. (author)
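
    The two indicators can be computed from ASP-style outputs as sketched below, assuming each identified precursor carries a conditional core damage probability (CCDP); the yearly CCDP lists and fleet size are invented for illustration.

        # Illustrative CCDPs of the precursors identified in each year.
        ccdps_by_year = {
            1997: [3e-4, 1e-4, 5e-5, 2e-5],
            1998: [2e-4, 8e-5],
            1999: [1e-4, 6e-5, 1e-5],
        }
        reactor_years = 100.0   # operating units in the fleet, illustrative

        for year in sorted(ccdps_by_year):
            ccdps = ccdps_by_year[year]
            # Annual core damage probability: chance that at least one of
            # the year's precursors would have proceeded to core damage.
            no_damage = 1.0
            for p in ccdps:
                no_damage *= (1.0 - p)
            acdp = 1.0 - no_damage
            freq = len(ccdps) / reactor_years   # precursor occurrence frequency
            print(f"{year}: {freq:.2f} precursors per reactor-year, "
                  f"ACDP = {acdp:.2e}")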

  5. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    Science.gov (United States)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly. These sources often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful for arriving at a more realistic quantification of risk prior to acceptance by a program.

  6. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  7. The use of quantitative risk assessment in HACCP

    NARCIS (Netherlands)

    Hoornstra, E.; Northolt, M.D.; Notermans, S.; Barendsz, A.W.

    2001-01-01

    During the hazard analysis performed as part of the development of a HACCP system, first the hazards (contaminants) have to be identified and then the risks have to be assessed. Often, this assessment is restricted to a qualitative analysis. By using elements of quantitative risk assessment (QRA) the hazard

  8. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    Energy Technology Data Exchange (ETDEWEB)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-03-15

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, and moreover one considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
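
    One way such a semi-quantitative evolution can work is sketched below: discrete output-variability scores of upstream functions are sampled by Monte Carlo and combined into a variability score for the downstream audited function. The functions, probabilities, aggregation rule and threshold are invented stand-ins, not the authors' actual scheme.

        import random

        random.seed(5)
        # Output-variability scores of three upstream functions feeding the
        # audited function (1 = dampened, 2 = nominal, 3 = amplified); the
        # probabilities stand in for audit judgments.
        upstream = {
            "monitor emissions":  ([1, 2, 3], [0.60, 0.30, 0.10]),
            "maintain filters":   ([1, 2, 3], [0.50, 0.35, 0.15]),
            "report exceedances": ([1, 2, 3], [0.70, 0.25, 0.05]),
        }

        def trial():
            # Resonance proxy: downstream variability as the product of the
            # sampled upstream output-variability scores.
            score = 1
            for values, weights in upstream.values():
                score *= random.choices(values, weights)[0]
            return score

        n = 50_000
        critical = sum(trial() >= 6 for _ in range(n)) / n   # threshold assumed
        print(f"P(critical variability) ~ {critical:.3f}")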

  9. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Patriarca, Riccardo; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-01-01

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, and moreover one considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  10. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  11. Quantitative Concept Analysis

    NARCIS (Netherlands)

    Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas

    2012-01-01

    Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all

  12. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and to prioritize management and research among species. Despite their application to over 1,000 fish populations, and despite an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  13. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    The toxicity of heavy metals from industrialization poses critical concerns, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in its sub-regions can provide more instructive information for protecting specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as the major regions of activity), the major pollution sources are an abandoned lead-acid battery plant (ALP), traffic emissions and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories and to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Incorporating assumption deviation risk in quantitative risk assessments: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Khorsandi, Jahon; Aven, Terje

    2017-01-01

    Quantitative risk assessments (QRAs) of complex engineering systems are based on numerous assumptions and expert judgments, as there is limited information available for supporting the analysis. In addition to sensitivity analyses, the concept of assumption deviation risk has been suggested as a means for explicitly considering the risk related to inaccuracies and deviations in the assumptions, which can significantly impact the results of the QRAs. However, challenges remain for its practical implementation, considering the number of assumptions and the magnitude of deviations to be considered. This paper presents an approach for integrating an assumption deviation risk analysis into QRAs. The approach begins with identifying the safety objectives that the QRA aims to support, and then identifies critical assumptions with respect to ensuring the objectives are met. Key issues addressed include the deviations required to violate the safety objectives, the uncertainties related to the occurrence of such events, and the strength of knowledge supporting the assessments. Three levels of assumptions are considered, covering assumptions related to the system's structural and operational characteristics, the effectiveness of the established barriers, and the consequence analysis process. The approach is illustrated for the case of an offshore installation. - Highlights: • An approach for assessing the risk of deviations in QRA assumptions is presented. • Critical deviations and uncertainties related to their occurrence are addressed. • The analysis promotes critical thinking about the foundation and results of QRAs. • The approach is illustrated for the case of an offshore installation.

  15. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    Science.gov (United States)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows via scenarios, capturing not only the marginal distributions but also their persistence. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov Chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the one actually operated and a scenario optimization, are evaluated in the flood risk and hydropower profit analysis. For the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and that most of the risk comes from the forecast lead time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias small for reservoir operational purposes.
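
    A toy version of the two-stage calculation on a synthetic forecast ensemble: the lead-time risk is the fraction of scenarios whose level exceeds the critical value, and the unpredicted-time risk is approximated from the level reached at the horizon point. All levels, distributions and margins are invented.

        import numpy as np

        rng = np.random.default_rng(8)
        n_scen, lead_days = 1000, 7
        z0, z_crit = 160.0, 168.0      # initial and critical pool levels (m)

        # Ensemble scenarios of daily pool-level rise (m); the cumulative sum
        # carries the persistence that scenario forecasts preserve.
        daily_rise = rng.normal(0.6, 0.8, (n_scen, lead_days))
        levels = z0 + np.cumsum(daily_rise, axis=1)

        # Stage 1: risk within the forecast lead time, counted over scenarios.
        fail_lead = levels.max(axis=1) >= z_crit
        risk_lead = fail_lead.mean()

        # Stage 2: risk beyond the horizon, approximated per scenario from the
        # level at the horizon point plus a design-flood allowance (stand-in
        # for the reservoir routing of the design floods).
        risk_after = np.mean(~fail_lead & (levels[:, -1] + 2.0 >= z_crit))

        print(f"lead-time risk = {risk_lead:.3f}, "
              f"unpredicted-time risk = {risk_after:.3f}, "
              f"total = {risk_lead + risk_after:.3f}")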

  16. Effect of human movement on airborne disease transmission in an airplane cabin: study using numerical modeling and quantitative risk analysis.

    Science.gov (United States)

    Han, Zhuyang; To, Gin Nam Sze; Fu, Sau Chung; Chao, Christopher Yu-Hang; Weng, Wenguo; Huang, Quanyi

    2014-08-06

    Airborne transmission of respiratory infectious disease in indoor environments (e.g. airplane cabins, conference rooms, hospitals, isolation rooms and inpatient wards) may cause outbreaks of infectious diseases, which may lead to many infection cases and significantly influence public health. This issue has received increasing attention from academics. This work investigates the influence of human movement on the airborne transmission of respiratory infectious diseases in an airplane cabin by using an accurate human model in numerical simulation and comparing the influences of different human movement behaviors on disease transmission. The Eulerian-Lagrangian approach is adopted to simulate the dispersion and deposition of the expiratory aerosols. The dose-response model is used to assess the infection risks of the occupants. A likelihood analysis is performed as a hypothesis test on the input parameters and the different human movement pattern assumptions. An in-flight SARS outbreak case is used for investigation. A moving person with different moving speeds is simulated to represent the movement behaviors. A digital human model was used to represent the detailed profile of the occupants, obtained by scanning a real thermal manikin with a 3D laser scanning system. The analysis results indicate that human movement can strengthen the downward transport of the aerosols, significantly reduce the overall deposition and removal rate of the suspended aerosols, and increase the average infection risk in the cabin. The likelihood estimation result shows that the risk assessment results better fit the outcome of the outbreak case when the movements of the seated passengers are considered. The intake fraction of the moving person is significantly higher than that of most of the seated passengers. The infection risk distribution in the airplane cabin depends highly on the movement behaviors of the passengers and the index patient. The walking activities of the crew
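
    The dose-response step of such an assessment is commonly an exponential model, P(infection) = 1 - exp(-r * dose), as sketched below; the parameter r, aerosol concentration and breathing rate are illustrative assumptions, not values from the study.

        import math

        r = 0.00246   # pathogen-specific dose-response parameter (assumed)

        def infection_risk(dose_pfu):
            # Exponential dose-response model used in many QMRA studies.
            return 1.0 - math.exp(-r * dose_pfu)

        # Intake grows with aerosol concentration, breathing rate and time.
        concentration = 40.0    # PFU per m^3 near the index patient (assumed)
        breathing_rate = 0.5    # m^3 per hour for a seated passenger (assumed)
        for hours in (1.0, 3.0, 6.0):
            dose = concentration * breathing_rate * hours
            print(f"{hours:.0f} h exposure: risk = {infection_risk(dose):.3f}")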

  17. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  18. A quantitative analysis of a risk impact due to a starting time extension of the emergency diesel generator in optimized power reactor-1000

    International Nuclear Information System (INIS)

    Lim, Ho-Gon; Yang, Joon-Eon; Hwang, Mee-Jeong

    2007-01-01

    An emergency diesel generator (EDG) is the ultimate electric power supply source for the operation of emergency engineered safety features when a nuclear power plant experiences a loss of off-site power (LOOP). If a loss of coolant accident (LOCA) with a simultaneous LOOP occurs, the EDG should reach full power within 10 s, which is a prescribed regulatory requirement in the technical specifications (TS) of the Optimized Power Reactor-1000 (OPR-1000). Recently, the US Nuclear Regulatory Commission (NRC) has been preparing a new risk-informed emergency core cooling system (ECCS) rule, a revision of 10 CFR 50.46. The new rule redefines the size of the design basis LOCA and relaxes some of the requirements, such as the single failure criteria, the simultaneous LOOP, and the methods of analysis. The revision of the ECCS rule will provide flexibility for plant changes if the plant risks are checked and balanced against the specified criteria. The present study performed a quantitative analysis of the plant risk impact of an EDG starting time extension, given that the new rule will be applied to OPR-1000. A thermal-hydraulic analysis and the OPR-1000 probabilistic safety assessment (PSA) model were combined to estimate the whole-plant risk impact. Also, sensitivity analyses were performed for the important uncertainty parameters.

  19. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed at the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation toward the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field, as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date, including the new developments and improvements continuously arising in this field [es

  20. QMRA (quantitative microbial risk assessment) and HACCP (hazard analysis and critical control points) for management of pathogens in wastewater and sewage sludge treatment and reuse.

    Science.gov (United States)

    Westrell, T; Schönning, C; Stenström, T A; Ashbolt, N J

    2004-01-01

    Hazard Analysis and Critical Control Points (HACCP) was applied for identifying and controlling exposure to pathogenic microorganisms encountered during normal sludge and wastewater handling at a 12,500 m³/d treatment plant utilising tertiary wastewater treatment and mesophilic sludge digestion. The hazardous scenarios considered were human exposure during treatment, handling, soil application and crop consumption, and exposure via water at the wetland area and during recreational swimming. A quantitative microbial risk assessment (QMRA), including rotavirus, adenovirus, haemorrhagic E. coli, Salmonella, Giardia and Cryptosporidium, was performed in order to prioritise pathogen hazards for control purposes. Human exposures were treated as individual risks but also related to the endemic situation in the general population. The highest individual health risk from a single exposure was via aerosols for workers at the belt press for sludge dewatering (virus infection risk = 1). The largest impact on the community would arise if children ingested sludge at the unprotected storage site, although in the worst-case situation the largest number of infections would arise through vegetables fertilised with sludge and eaten raw (not allowed in Sweden). Acceptable risk for various hazardous scenarios and treatment and/or reuse strategies could be tested in the model.

  1. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  2. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data ... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients, cinnamic aldehyde, citral, and isoeugenol, to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  3. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing quantitative assessments of exposure risks in human beings, and the several problems that accompany the assessment and the introduction of the risk of exposure to high- and low-LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application to one agent alone of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents. (orig.) [de

  4. Quantitative influence of risk factors on blood glucose level.

    Science.gov (United States)

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on the blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and confirming the intervention index for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age. According to the minimum error principle, nine BP models are trained respectively. The quantitative values of the influence of different risk factors on the blood glucose change can be obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449). The next most influential factors are cholesterol, age and triglycerides. The combined share of these four factors reaches 77% among the nine screened risk factors. The sensitivity rankings can provide a basis for individual intervention decisions. This method can be applied to the quantitative analysis of risk factors for other diseases and can potentially be used by clinical practitioners to identify high-risk populations for type 2 diabetes as well as other diseases.
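
    A sketch of the perturbation-style sensitivity calculation on a trained network, using synthetic data; the factor names, generating weights and network size are invented, and the normalised sensitivities play the role of the influence values reported above (scikit-learn is assumed to be available).

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)
        n = 500
        # Standardised synthetic risk factors; the generating weights are
        # made up purely to exercise the method.
        X = rng.normal(0.0, 1.0, (n, 4))
        y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * X[:, 3]
             + rng.normal(0.0, 0.05, n))

        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                           random_state=0).fit(X, y)

        # Sensitivity: mean absolute prediction change for a small symmetric
        # perturbation of each input, normalised to sum to one.
        eps, sens = 0.1, []
        for j in range(X.shape[1]):
            up, down = X.copy(), X.copy()
            up[:, j] += eps
            down[:, j] -= eps
            sens.append(np.mean(np.abs(net.predict(up) - net.predict(down))))
        sens = np.array(sens) / np.sum(sens)
        print(dict(zip(["weight", "cholesterol", "age", "triglyceride"],
                       np.round(sens, 3))))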

  5. Quantitative Relationship Between Cumulative Risk Alleles Based on Genome-Wide Association Studies and Type 2 Diabetes Mellitus: A Systematic Review and Meta-analysis

    Directory of Open Access Journals (Sweden)

    Satoru Kodama

    2018-01-01

    Full Text Available Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. The logarithm of the estimated OR (log OR) of T2DM for a 1-allele increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13–1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08–1.13). There was large heterogeneity in the magnitude of the log OR (P < 0.001) in both the cross-sectional and the longitudinal studies. The top 10 most commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest increase in the risk of prevalent or incident T2DM, although the heterogeneity in the genes used among studies requires the results to be interpreted with caution.
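
    The random-effects pooling of per-study log ORs can be sketched with the DerSimonian-Laird estimator; the (log OR, standard error) pairs below are invented for illustration.

        import math

        # Invented per-study estimates: (log OR per risk allele, standard error).
        studies = [(0.16, 0.04), (0.12, 0.03), (0.20, 0.06),
                   (0.09, 0.05), (0.15, 0.02)]

        # Fixed-effect weights and Cochran's Q for heterogeneity.
        w = [1.0 / se ** 2 for _, se in studies]
        ybar = sum(wi * y for wi, (y, _) in zip(w, studies)) / sum(w)
        q = sum(wi * (y - ybar) ** 2 for wi, (y, _) in zip(w, studies))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(studies) - 1)) / c)   # between-study variance

        # Random-effects weights and pooled estimate with 95% CI.
        w_star = [1.0 / (se ** 2 + tau2) for _, se in studies]
        pooled = sum(wi * y for wi, (y, _) in zip(w_star, studies)) / sum(w_star)
        se_pooled = math.sqrt(1.0 / sum(w_star))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"pooled OR per risk allele = {math.exp(pooled):.2f} "
              f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")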

  6. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.

  7. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.

  8. A quantitative risk-analysis for introduction of Bovine Viral Diarrhoea Virus in the Netherlands through cattle imports.

    Science.gov (United States)

    Santman-Berends, I M G A; Mars, M H; Van Duijn, L; Van den Broek, K W H; Van Schaik, G

    2017-10-01

    Many countries have implemented control programmes aiming to eradicate Bovine Viral Diarrhoea Virus (BVDV). After the free status has been obtained, a risk of re-introduction of the virus through imports may remain. Therefore, the risk of introduction of BVDV through cattle imports into the Netherlands was quantified and the effectiveness of subsequent intervention measures was assessed. Data, literature and expert opinion were used to estimate values for the input parameters that feed a stochastic simulation model. The probability that BVDV was imported was differentiated into persistently infected (PI) cattle, trojan cows that transmitted the virus vertically resulting in a PI foetus (TR) and transiently infected (TI) cattle. The import risk was stratified into beef, dairy, small-scale, suckler, trade, veal and young stock herds. Each year, 334 (5th and 95th percentiles: 65-902) Dutch cattle herds were estimated to be infected with BVDV through imports. Veal herds account for most infections associated with imports (87%), whereas in the other herd types only 9 beef, 6 dairy, 2 small-scale, 16 suckler, 10 trade and 2 young stock herds are infected through imports per year. Import of PI cattle is the most important risk for introduction in veal herds, while import of TR cows is the main source of BVDV introduction in dairy, small-scale and suckler herds. The intervention scenarios that were evaluated consisted of virus testing, a combination of virus testing and antibody testing in pregnant cows, abolishment of imports from high-risk countries (i.e. countries with a BVDV prevalence >15%) and a combination of import restrictions and testing prior to import. With the intervention scenarios, the number of BVDV-infected herds in the Netherlands could be reduced to 81 or 58 herds per year when, respectively, virus testing or a combination of virus and antibody testing was applied, or to 108 herds when imports from high-risk countries were abolished. With the scenario in which both import from high

  9. The Age-Specific Quantitative Effects of Metabolic Risk Factors on Cardiovascular Diseases and Diabetes: A Pooled Analysis

    Science.gov (United States)

    Farzadfar, Farshad; Stevens, Gretchen A.; Woodward, Mark; Wormser, David; Kaptoge, Stephen; Whitlock, Gary; Qiao, Qing; Lewington, Sarah; Di Angelantonio, Emanuele; vander Hoorn, Stephen; Lawes, Carlene M. M.; Ali, Mohammed K.; Mozaffarian, Dariush; Ezzati, Majid

    2013-01-01

    Background The effects of systolic blood pressure (SBP), serum total cholesterol (TC), fasting plasma glucose (FPG), and body mass index (BMI) on the risk of cardiovascular diseases (CVD) have been established in epidemiological studies, but consistent estimates of effect sizes by age and sex are not available. Methods We reviewed large cohort pooling projects, evaluating effects of baseline or usual exposure to metabolic risks on ischemic heart disease (IHD), hypertensive heart disease (HHD), stroke, diabetes, and, as relevant, selected other CVDs, after adjusting for important confounders. We pooled all data to estimate relative risks (RRs) for each risk factor and examined effect modification by age or other factors, using random effects models. Results Across all risk factors, an average of 123 cohorts provided data on 1.4 million individuals and 52,000 CVD events. Each metabolic risk factor was robustly related to CVD. At the baseline age of 55–64 years, the RR for 10 mmHg higher SBP was largest for HHD (2.16; 95% CI 2.09–2.24), followed by effects on both stroke subtypes (1.66; 1.39–1.98 for hemorrhagic stroke and 1.63; 1.57–1.69 for ischemic stroke). In the same age group, RRs for 1 mmol/L higher TC were 1.44 (1.29–1.61) for IHD and 1.20 (1.15–1.25) for ischemic stroke. The RRs for 5 kg/m2 higher BMI for ages 55–64 ranged from 2.32 (2.04–2.63) for diabetes to 1.44 (1.40–1.48) for IHD. For 1 mmol/L higher FPG, RRs in this age group were 1.18 (1.08–1.29) for IHD and 1.14 (1.01–1.29) for total stroke. For all risk factors, proportional effects declined with age, were generally consistent by sex, and differed by region in only a few age groups for certain risk factor-disease pairs. Conclusion Our results provide robust, comparable and precise estimates of the effects of major metabolic risk factors on CVD and diabetes by age group. PMID:23935815

  10. The profile of quantitative risk indicators in Krsko NPP

    International Nuclear Information System (INIS)

    Vrbanic, I.; Basic, I.; Bilic-Zabric, T.; Spiler, J.

    2004-01-01

    During the past decade a strong initiative was observed aimed at incorporating information on risk into various aspects of the operation of nuclear power plants. The initiative was observable in activities carried out by regulators as well as utilities and industry. It resulted in establishing the process, or procedure, often referred to as integrated decision making or risk informed decision making. In this process, engineering analyses and evaluations that are usually termed traditional, and that rely on considerations of safety margins and defense in depth, are supplemented by quantitative indicators of risk. Throughout the process, plant risk was most commonly expressed in terms of the likelihood of events involving damage to the reactor core and events with radiological releases to the environment. These became two commonly used quantitative indicators, or metrics, of plant risk (or, reciprocally, plant safety). They were evaluated for their magnitude (e.g. the expected number of events per specified time interval), as well as their profile (e.g. the types of contributing events). The information for quantitative risk indicators (to be used in the risk informing process) is obtained from the plant's probabilistic safety analyses or analyses of hazards. It depends on issues such as the availability of input data and the quality of the model or analysis. The Krsko nuclear power plant has recently performed a Periodic Safety Review, which was a good opportunity to evaluate and integrate the plant-specific information on quantitative plant risk indicators and their profile. The paper discusses some aspects of the quantitative plant risk profile and its perception. (author)

  11. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1979-01-01

    Although the overall aim of radiobiology is to understand the biological effects of radiation, it also has the implied practical purpose of developing rational measures for the control of radiation exposure in man. The emphasis in this presentation is on showing that the enormous effort expended over the years to develop quantitative dose-effect relationships in biochemical and cellular systems, animals, and human beings now seems to be paying off. The pieces appear to be falling into place, and a framework is evolving to utilize these data. Specifically, quantitative risk assessments are discussed in terms of the cellular, animal, and human data on which they are based; their use in the development of radiation protection standards; and their present and potential impact and meaning in relation to the quantity dose equivalent and its special unit, the rem.

  12. Quantitative risk assessment of drinking water contaminants

    International Nuclear Information System (INIS)

    Cothern, C.R.; Coniglio, W.A.; Marcus, W.L.

    1986-01-01

    The development of criteria and standards for the regulation of drinking water contaminants involves a variety of processes, one of which is risk estimation. This estimation process, called quantitative risk assessment, involves combining data on the occurrence of the contaminant in drinking water and its toxicity. The human exposure to a contaminant can be estimated from occurrence data. Usually the toxicity, or number of health effects per concentration level, is estimated from animal bioassay studies using the multistage model. For comparison, other models are also used, including the Weibull, probit, logit and quadratic ones. Because exposure and toxicity data are generally incomplete, assumptions need to be made, and this generally results in a wide range of uncertainty in the estimates. This range can be as wide as four to six orders of magnitude in the case of the volatile organic compounds in drinking water, and a factor of four to five for the estimation of risk due to radionuclides in drinking water. As examples of the differences encountered in risk assessment of drinking water contaminants, discussions are presented on benzene, lead, radon and alachlor. The lifetime population risk estimates for these contaminants are, respectively, in the ranges of: <1 - 3000, <1 - 8000, 2000 - 40,000 and <1 - 80. 11 references, 1 figure, 1 table
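
    For reference, the multistage model named above is commonly written in the standard form used in regulatory dose-response work (the exact parametrization in the cited assessment may differ):

        P(d) = 1 - \exp\!\left[-\left(q_0 + q_1 d + q_2 d^2 + \cdots + q_k d^k\right)\right], \qquad q_i \ge 0,

    with extra risk over background A(d) = [P(d) - P(0)] / [1 - P(0)]. At low doses A(d) \approx q_1 d, so the linear term dominates the extrapolation, which is one reason competing models (Weibull, probit, logit, quadratic) can disagree by orders of magnitude at environmental exposure levels.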

  13. Quantitative risk analysis for potentially resistant E. coli in surface waters caused by antibiotic use in agricultural systems.

    Science.gov (United States)

    Limayem, Alya; Martin, Elizabeth M

    2014-01-01

    Antibiotics are frequently used in agricultural systems to promote livestock health and to control bacterial contaminants. Given the upsurge of resistant fecal indicator bacteria (FIB) in surface waters, a microbial risk assessment (MRA) was performed to evaluate the probability of infection by resistant FIB in populations exposed to recreational waters. Diarrheagenic Escherichia coli, except E. coli O157:H7, were selected for their prevalence in aquatic ecosystems. A comparative study between a typical E. coli pathway and a case scenario aggravated by antibiotic use was performed via Crystal Ball® software, analyzing a set of available inputs provided by US institutions, including E. coli concentrations in the US Great Lakes, through random sampling and probability distributions. Results from forecasting a possible worst-case dose-response scenario indicated an approximately 50% chance that 20% of the exposed human populations would be infected by recreational water in the U.S. In a typical scenario, however, there is a 50% chance of infection for only 1% of the exposed human populations. The uncertain variable E. coli concentration was the major contributing factor of the dose-response model, accounting for approximately 92.1% of the output variability in a typical scenario. Resistant FIB in recreational waters that are exacerbated by a low dose of antibiotic pollutants would increase the adverse health effects in exposed human populations by 10 fold.
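
    The Crystal Ball® workflow described above can be reproduced with open tools. A minimal Monte Carlo sketch in Python; the concentration, ingestion and beta-Poisson dose-response parameters below are illustrative placeholders, not the study's values:

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        # Exposure: water concentration (CFU/100 mL, log-normal) x volume ingested (mL).
        conc = rng.lognormal(mean=np.log(50), sigma=1.0, size=N)   # CFU per 100 mL
        vol = rng.uniform(10, 100, size=N)                         # mL per swim event
        dose = conc / 100.0 * vol                                  # CFU ingested

        # Beta-Poisson dose-response (alpha and N50 are hypothetical here).
        alpha, n50 = 0.16, 2.1e6
        p_inf = 1.0 - (1.0 + dose / n50 * (2 ** (1 / alpha) - 1)) ** (-alpha)

        print("median infection risk per event: %.2e" % np.median(p_inf))
        print("P(risk > 1e-4): %.1f%%" % (100 * np.mean(p_inf > 1e-4)))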

  14. Her-2/neu expression in node-negative breast cancer: direct tissue quantitation by computerized image analysis and association of overexpression with increased risk of recurrent disease.

    Science.gov (United States)

    Press, M F; Pike, M C; Chazin, V R; Hung, G; Udove, J A; Markowicz, M; Danyluk, J; Godolphin, W; Sliwkowski, M; Akita, R

    1993-10-15

    The HER-2/neu proto-oncogene (also known as c-erb B-2) is homologous with, but distinct from, the epidermal growth factor receptor. Amplification of this gene in node-positive breast cancers has been shown to correlate with both earlier relapse and shorter overall survival. In node-negative breast cancer patients, the subgroup for which accurate prognostic data could make a significant contribution to treatment decisions, the prognostic utility of HER-2/neu amplification and/or overexpression has been controversial. The purpose of this report is to address the issues surrounding this controversy and to evaluate the prognostic utility of overexpression in a carefully followed group of patients using appropriately characterized reagents and methods. In this report we present data from a study of HER-2/neu expression designed specifically to test whether or not overexpression is associated with an increased risk of recurrence in node-negative breast cancers. From a cohort of 704 women with node-negative breast cancer, 105 who experienced recurrent disease (relapsed cases) were matched with 105 women with no recurrence (disease-free controls) after an equivalent follow-up period. Immunohistochemistry was used to assess HER-2/neu expression in archival tissue blocks from both relapsed cases and their matched disease-free controls. Importantly, a series of molecularly characterized breast cancer specimens was used to confirm that the antibody used was of sufficient sensitivity and specificity to identify those cancers overexpressing the HER-2/neu protein in this formalin-fixed, paraffin-embedded tissue cohort. In addition, a quantitative approach was developed to more accurately assess the amount of HER-2/neu protein identified by immunostaining tumor tissue. This was done using a purified HER-2/neu protein synthesized in a bacterial expression vector and protein lysates derived from a series of cell lines, engineered to express a defined range of HER-2/neu oncoprotein

  15. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current (PEC) testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  16. Quantitative rockfall hazard and risk analysis in selected municipalities of the České Švýcarsko National Park, northwestern Czechia

    Czech Academy of Sciences Publication Activity Database

    Blahůt, Jan; Klimeš, Jan; Vařilová, Z.

    2013-01-01

    Roč. 118, č. 3 (2013), s. 205-220 ISSN 1212-0014 R&D Projects: GA ČR GP205/09/P383 Institutional support: RVO:67985891 Keywords : rockfall hazard and risk * quantitative risk * Cretaceous sandstones * CONEFALL Subject RIV: DE - Earth Magnetism, Geodesy, Geography Impact factor: 0.400, year: 2013 http://geography.cz/sbornik/wp-content/uploads/downloads/2013/10/g13-3-s205-220-blahút.pdf

  17. Risk analysis

    International Nuclear Information System (INIS)

    Correa Lizarazu, X.

    2013-01-01

    This PowerPoint presentation covers Colombia's experiences with risk evaluation, the evolution of sanitary regulations, chemical hazards in food, biological hazards in food, the Codex Alimentarius, trade, industrial effects, hazard identification, data collection, and risk profiling

  18. Contract farming risks: A quantitative assessment

    Directory of Open Access Journals (Sweden)

    Arkins M Kabungo

    2016-03-01

    Full Text Available The objective of this study is to identify the key risks facing each of the stakeholders in the export-focused paprika value chain in Zambia. Although a deterministic cost-benefit analysis indicated that this outgrower scheme would have a very satisfactory net present value (NPV, a Monte Carlo analysis using an integrated financial–economic–stakeholder model identifies a number of risk variables that could make this system unsustainable. The major risks include the variability of the real exchange rate in Zambia; the international price of paprika; and the farm yield rates. This analysis points out that irrigation systems are very important for both stabilising and increasing yields. The analysis also shows the limitations of loan financing for such outgrower arrangements when at the sector level it is difficult or even impossible to mitigate the risks from real exchange rate movements and changes in international commodity prices. This micro-level analysis shows how critical real exchange rate management policies are in achieving sustainability of such export-oriented value chains.
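
    A minimal sketch of the Monte Carlo NPV logic described above, in Python. The price, exchange rate, yield and cost figures are invented for illustration, not taken from the Zambian case:

        import numpy as np

        rng = np.random.default_rng(3)
        N, YEARS, DISC = 50_000, 10, 0.12

        # Key risk variables, drawn per simulated year.
        price = rng.normal(2.0, 0.4, size=(N, YEARS))              # USD/kg paprika
        fx = rng.lognormal(np.log(1.0), 0.15, size=(N, YEARS))     # real exchange rate index
        yield_t = rng.normal(1.5, 0.5, size=(N, YEARS)).clip(0)    # t/ha farm yield

        revenue = price * fx * yield_t * 1000    # USD/ha/year
        costs = 1800.0                           # USD/ha/year, held fixed here
        cash = revenue - costs
        disc = (1 + DISC) ** -np.arange(1, YEARS + 1)
        npv = cash @ disc                        # NPV per Monte Carlo draw

        print("mean NPV/ha: %.0f USD, P(NPV < 0): %.1f%%"
              % (npv.mean(), 100 * np.mean(npv < 0)))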

  19. Increased brain iron deposition is a risk factor for brain atrophy in patients with haemodialysis: a combined study of quantitative susceptibility mapping and whole brain volume analysis.

    Science.gov (United States)

    Chai, Chao; Zhang, Mengjie; Long, Miaomiao; Chu, Zhiqiang; Wang, Tong; Wang, Lijun; Guo, Yu; Yan, Shuo; Haacke, E Mark; Shen, Wen; Xia, Shuang

    2015-08-01

    To explore the correlation between increased brain iron deposition and brain atrophy in patients with haemodialysis, and their correlation with clinical biomarkers and neuropsychological tests. Forty-two patients with haemodialysis and forty-one age- and gender-matched healthy controls were recruited in this prospective study. 3D whole-brain high-resolution T1WI and susceptibility-weighted imaging were scanned on a 3 T MRI system. Brain volume was analyzed using voxel-based morphometry (VBM) in patients and compared with that of healthy controls. Quantitative susceptibility mapping was used to measure and compare the susceptibility of different structures between patients and healthy controls. Correlation analysis was used to investigate the relationship between brain volume, iron deposition and neuropsychological scores. Stepwise multiple regression analysis was used to explore the effect of clinical biomarkers on the brain volumes in patients. Compared with healthy controls, patients with haemodialysis showed decreased volume of the bilateral putamen and left insular lobe (all P < 0.05). Increased brain iron deposition was negatively correlated with the decreased volume of the bilateral putamen (P < 0.05). Increased brain iron deposition and dialysis duration were risk factors for brain atrophy in patients with haemodialysis. The decreased gray matter volume of the left insular lobe was correlated with neurocognitive impairment.

  20. Comparative risk analysis

    International Nuclear Information System (INIS)

    Niehaus, F.

    1988-01-01

    In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is from everyday small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety related information. 19 tabs. (author)
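
    For independent basic events, the fault tree quantification mentioned above reduces to simple gate algebra. A toy Python example; the tree and the probabilities are invented:

        # AND gate: all inputs must fail; OR gate: any failing input suffices.
        def and_gate(*p):
            out = 1.0
            for x in p:
                out *= x
            return out

        def or_gate(*p):
            out = 1.0
            for x in p:
                out *= (1.0 - x)   # probability that this input does NOT fail
            return 1.0 - out

        pump_a, pump_b = 3e-3, 3e-3    # redundant pumps (failure per demand)
        power, signal = 1e-4, 5e-4     # support-system failures

        # Cooling fails if both pumps fail, or power fails, or the signal fails.
        cooling_fails = or_gate(and_gate(pump_a, pump_b), power, signal)
        print("top event probability per demand: %.2e" % cooling_fails)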

  1. Quantitative and Qualitative Assessment of Soil Erosion Risk in Małopolska (Poland), Supported by an Object-Based Analysis of High-Resolution Satellite Images

    Science.gov (United States)

    Drzewiecki, Wojciech; Wężyk, Piotr; Pierzchalski, Marcin; Szafrańska, Beata

    2014-06-01

    In 2011 the Marshal Office of Małopolska Voivodeship decided to evaluate the vulnerability of soils to water erosion for the entire region. The quantitative and qualitative assessment of the erosion risk for the soils of the Małopolska region was done based on the USLE approach. A dedicated workflow of geoinformation technologies was used to fulfil this goal. A high-resolution soil map, together with rainfall data, a detailed digital elevation model and statistical information about areas sown with particular crops, created the input information for erosion modelling in a GIS environment. Satellite remote sensing technology and the object-based image analysis (OBIA) approach gave valuable support to this study. RapidEye satellite images were used to obtain the essential up-to-date data about land use and vegetation cover for the entire region (15,000 km2). The application of OBIA also led to defining the direction of field cultivation and the mapping of contour tillage areas. As a result, spatially differentiated values of the erosion control practice (P) factor were used. Both the potential and the actual soil erosion risk were assessed quantitatively and qualitatively. The results of the erosion assessment in the Małopolska Voivodeship reveal that the majority of its agricultural land is characterized by moderate or low erosion risk levels. However, high-resolution erosion risk maps show substantial spatial diversity. According to our study, average or higher actual erosion intensity levels occur for 10.6% of agricultural land, i.e. 3.6% of the entire voivodeship area. In 20% of the municipalities there is a very urgent demand for erosion control; in a further 23%, erosion control is urgent. Our study showed that even a slight improvement of P-factor estimation may have an influence on modeling results. In our case, despite a marginal change of erosion assessment figures on a regional scale, the influence on the final prioritization of
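
    For reference, the USLE model underlying the study multiplies five factors; a one-function Python sketch (all factor values below are placeholders) also shows how the OBIA-derived support-practice factor enters:

        def usle(R, K, LS, C, P):
            """Average annual soil loss A [t/ha/yr] from rainfall erosivity R,
            soil erodibility K, slope length-steepness LS, cover-management C,
            and support-practice P factors: A = R * K * LS * C * P."""
            return R * K * LS * C * P

        # Contour tillage detected by OBIA lowers the P factor (e.g. 1.0 -> 0.6).
        print(usle(R=700, K=0.035, LS=1.8, C=0.25, P=1.0))   # up-and-down tillage
        print(usle(R=700, K=0.035, LS=1.8, C=0.25, P=0.6))   # contour tillage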

  2. Quantitative Measures of Mineral Supply Risk

    Science.gov (United States)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
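
    A minimal Python sketch of the concentration measure discussed above: the Herfindahl-Hirschman index over country production shares, optionally weighted by a 0-1 political-risk score. The REE shares and risk scores are rough placeholders, not sourced figures:

        def hhi(shares):
            """Herfindahl-Hirschman index; 1.0 means single-source supply."""
            total = sum(shares)
            return sum((s / total) ** 2 for s in shares)

        def risk_weighted_hhi(shares, risk):
            """Weight each country's squared share by a 0-1 political-risk score."""
            total = sum(shares)
            return sum(r * (s / total) ** 2 for s, r in zip(shares, risk))

        ree_shares = [0.95, 0.03, 0.02]   # e.g. China, USA, others (illustrative)
        ree_risk = [0.6, 0.2, 0.3]        # hypothetical political-risk scores
        print(hhi(ree_shares), risk_weighted_hhi(ree_shares, ree_risk))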

  3. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  4. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    Science.gov (United States)

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Lanzhou reach of the Yellow River and the Qiantang River, deserve more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
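
    A minimal Python sketch of the BaP-equivalent transformation described above. The toxic equivalency factors follow the commonly cited Nisbet and LaGoy scheme, and the sample concentrations are invented:

        # Toxic equivalency factors (TEFs) relative to benzo[a]pyrene.
        TEF = {"BaP": 1.0, "DahA": 1.0, "BaA": 0.1, "BbF": 0.1,
               "Chr": 0.01, "Phe": 0.001}

        def bap_eq(conc_ng_per_L):
            """Total BaP-equivalent concentration (ng/L): sum of C_i * TEF_i."""
            return sum(c * TEF[pah] for pah, c in conc_ng_per_L.items())

        sample = {"BaP": 1.2, "BaA": 5.0, "Chr": 8.0, "Phe": 40.0}  # invented
        print("BaP(eq) = %.2f ng/L" % bap_eq(sample))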

  5. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author deduces the equation relating coupler frequency deviation Δf to coupling coefficient β, instead of only giving the adjustment direction in the coupler matching process. According to this equation, automatic measurement and quantitative display are realized in a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems.

  6. Current developments in the assessment of petroleum hydrocarbon contaminated sites: Analysis, interpretation, and use of the TPH parameter for quantitative risk assessment

    International Nuclear Information System (INIS)

    Garcia-Surette, M.; Maynard, P.; Lamie, P.O.; Kaslick, C.

    1995-01-01

    In 1994, the Massachusetts Department of Environmental Protection (MDEP) estimated that petroleum-only cases comprised approximately one-half of the state's hazardous waste sites currently under investigation and/or remediation. Because of this significant percentage, it became clear that it was necessary to assess petroleum sites more efficiently in terms of risk and cleanup alternatives. A key MDEP policy describes an alternative risk assessment approach enabling the quantitative characterization of total petroleum hydrocarbon (TPH)-related health risks. The approach relies on the use of an analytical technique by which the mass of petroleum hydrocarbons within specified carbon ranges is quantified. MDEP's TPH risk assessment approach was successfully employed at a residential site contaminated with No. 2 fuel oil. The combined use of MDEP's suggested analytical methods, alternative reference compounds and toxicity values, as well as chromatograms, standard dose equations, and an EPA-approved box model, facilitated the performance of a more realistic and cost-effective assessment of risk. Such an assessment provided key management information to regulatory agencies and project managers, as well as property owners concerned with potential property value loss

  7. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSO, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward, it is more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, the study revealed that although business cycles vary considerably depending on the industry and/or the institution, most impacts in an HEI (university) fall within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.
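
    A minimal Python sketch of the Bayesian side of such an analysis. The Beta prior, incident counts and impact figure are invented placeholders, and this simple conjugate update illustrates the idea rather than the study's actual model:

        from scipy import stats

        # Prior belief about the annual likelihood of a given risk event.
        prior = stats.beta(a=2, b=18)            # mean ~0.10 per year

        # Observed incidents update the belief (Beta-Binomial conjugacy).
        events, years = 3, 12
        posterior = stats.beta(a=2 + events, b=18 + years - events)

        impact_zar = 5e6                         # estimated impact if the risk occurs
        print("posterior mean likelihood: %.3f" % posterior.mean())
        print("expected annual loss: %.0f" % (posterior.mean() * impact_zar))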

  8. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores by chemical analysis techniques is a lengthy process. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determination. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  9. Approaches to quantitative risk assessment with applications to PP

    International Nuclear Information System (INIS)

    Geiger, G.; Schaefer, A.

    2002-01-01

    Full text: Experience with accidents such as Goiania in Brazil, and indications of a considerable number of orphan sources, suggest that improved protection would be desirable for some types of radioactive material in widespread use, such as radiation sources for civil purposes. Given the large potential health and economic consequences (in particular, if terrorist attacks cannot be excluded), the significant costs of preventive actions, and the large uncertainties about both the likelihood of occurrence and the potential consequences of PP safety and security incidents, an optimum relationship between preventive and mitigative efforts is likely to be a key issue for successful risk management in this field. Thus, possible violations of physical protection combined with threats of misuse of nuclear materials, including terrorist attack, pose considerable challenges to global security from various perspectives. In view of these challenges, recent advances in applied risk and decision analysis suggest methodological and procedural improvements in quantitative risk assessment, the demarcation of acceptable risk, and risk management. The advances are based on a recently developed model of optimal risky choice suitable for assessing and comparing the cumulative probability distribution functions attached to safety and security risks. Besides the quantification of risk (e.g., in economic terms), the standardization of various risk assessment models frequently used in operations research can be approached on this basis. The paper explores possible applications of these improved methods to the safety and security management of nuclear materials, the cost efficiency of risk management measures, and the establishment of international safety and security standards for PP. Examples will be presented that are based on selected scenarios of misuse involving typical radioactive sources. (author)

  10. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    Directory of Open Access Journals (Sweden)

    Upasna Saluja

    2012-06-01

    Full Text Available Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm, most organizations end up choosing the more flexible, easier to deploy and customize qualitative models of risk assessment. In practice, such models often call upon practitioners to make qualitative judgments on a relative rating scale, which brings in considerable room for errors, biases and subjectivity. Under the quantitative risk analysis approach, on the other hand, the estimation of risk is connected with the application of numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for information security risk management. We can use this considerably developed understanding of risk management from the medical field, especially survival analysis, towards handling the risks that information infrastructures face. Similarly, the financial risk management discipline prides itself on perhaps the most quantifiable of risk management models, such as those for market risk and credit risk. Information security risk management can make risk measurement more objective and quantitative by referring to the credit risk approach. During the recent financial crisis many investors and financial institutions lost money or went bankrupt, respectively, because they did not apply the basic principles of risk management. Learning from the financial crisis provides some valuable lessons for information risk management.

  11. Quantitative flood risk assessment for Polders

    International Nuclear Information System (INIS)

    Manen, Sipke E. van; Brinkhuis, Martine

    2005-01-01

    In the Netherlands, the design of dikes and other water retaining structures is based on an acceptable probability (frequency) of overtopping. In 1993 a new safety concept was introduced based on total flood risk. Risk was defined as the product of probability and consequences. In recent years advanced tools have become available to calculate the actual flood risk of a polder. This paper describes the application of these tools to an existing lowland river area. The complete chain of calculations necessary to estimate the risk of flooding of a polder (or dike ring) is presented. The difficulties in applying the present day tools and the largest uncertainties in the calculations are shown

  12. Quantitative flood risk assessment for Polders

    Energy Technology Data Exchange (ETDEWEB)

    Manen, Sipke E. van [Ministry of Transport, Public Works and Water Management, Bouwdienst Rijkswaterstaat, Griffioenlaan 2, Utrecht 3526 (Netherlands)]. E-mail: s.e.vmanen@bwd.rws.minvenw.nl; Brinkhuis, Martine [Ministry of Transport, Public Works and Water Management, Delft (Netherlands)

    2005-12-01

    In the Netherlands, the design of dikes and other water retaining structures is based on an acceptable probability (frequency) of overtopping. In 1993 a new safety concept was introduced based on total flood risk. Risk was defined as the product of probability and consequences. In recent years advanced tools have become available to calculate the actual flood risk of a polder. This paper describes the application of these tools to an existing lowland river area. The complete chain of calculations necessary to estimate the risk of flooding of a polder (or dike ring) is presented. The difficulties in applying the present day tools and the largest uncertainties in the calculations are shown.
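
    The "risk = probability x consequence" definition used in both records above leads directly to an expected annual damage when summed over breach scenarios. A toy Python sketch with invented numbers:

        scenarios = [
            # (annual probability of this breach mode, damage in million EUR)
            (1 / 1250, 900.0),    # dike overtopping
            (1 / 4000, 1500.0),   # dike instability
            (1 / 10000, 400.0),   # hydraulic structure failure
        ]

        # Expected annual damage: sum of probability times consequence.
        ead = sum(p * d for p, d in scenarios)
        print("expected annual damage: %.3f MEUR/yr" % ead)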

  13. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  14. Mare Risk Analysis monitor

    International Nuclear Information System (INIS)

    Fuente Prieto, I.; Alonso, P.; Carretero Fernandino, J. A.

    2000-01-01

    The Nuclear Safety Council's requirement that Spanish power plants comply with the requirements of the Maintenance Rule associated with plant risk assessment during power operation, arising from the partial unavailability of systems due to maintenance activities, has led to the need for additional tools to facilitate compliance with those requirements. While the impact on risk produced by individual equipment unavailabilities can easily be evaluated, either qualitatively or quantitatively, the process becomes more complicated when unplanned unavailabilities occur simultaneously in various systems, making it necessary to evaluate their functional impact. It is especially complex in the case of support systems that can affect the functionality of multiple systems. In view of the above, a computer application has been developed that is capable of providing the operator with quick answers based on the specific plant model, in order to allow fast risk assessment using the information compiled as part of the Probabilistic Safety Analysis. The paper describes the most important characteristics of this application and the basic design requirements of the MARE Risk Monitor. (Author)

  15. Comparative QRA (Quantitative Risk Analysis) of natural gas distribution pipelines in urban areas; Analise comparativa dos riscos da operacao de linhas de gas natural em areas urbanas

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luiz Fernando S. de [Energy Solutions South America (Brazil); Cardoso, Cassia de O.; Storch, Rafael [Det Norske Veritas (DNV) (Brazil)

    2008-07-01

    The natural gas pipeline network is growing around the world, but its operation inherently imposes a risk on people living next to pipelines. Because of this, it is necessary to conduct a risk analysis during environmental licensing in Brazil. Although the risk analysis methodology is well established, some points of its application to distribution pipelines are still under discussion. This paper presents a methodology that examines the influence of major design and operating parameters on the calculated risk of a distribution pipeline accident in urban areas, as well as the complexity of assessing the possible accident scenarios. The impact of some scenarios has been evaluated using a Computational Fluid Dynamics tool. The results indicate that, under certain conditions, the risks from pipeline operation at pressures of 20 bar may be acceptable in location class 3 or even in class 4. These results can play a very important role in management decisions on the growth of the natural gas distribution network in densely populated areas, as well as in the improvement of laws regulating the distribution of natural gas. (author)

  16. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of the occupational risk of a worker exposed to a single hazard is presented. The model connects working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury and death. Working conditions and safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: a) the number of accidents observed over a period of time and b) an assessment of exposure data on activities and working conditions over the same period of time and the same working population. The effectiveness of risk reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • An influence diagram connects working conditions, worker behaviour and safety barriers. • Necessary data include the number of accidents and the total exposure of workers. • Effectiveness of risk reducing measures is quantified through the impact on the risk. • An example illustrates the methodology.
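
    A minimal Python sketch of the exposure-based quantification described above. The accident count, exposure hours and consequence split are invented, and the influence-diagram structure is collapsed into a single rate for illustration:

        accidents = 24                 # accidents observed in the period
        exposure_hours = 3.2e7         # summed worker-hours of exposure

        # Accident rate per worker-hour, split over the three consequence types.
        rate = accidents / exposure_hours
        consequence_split = {"recoverable": 0.80, "permanent": 0.15, "death": 0.05}

        for kind, frac in consequence_split.items():
            print("%s injury risk: %.2e per hour worked" % (kind, rate * frac))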

  17. On the quantitative definition of risk

    International Nuclear Information System (INIS)

    Kaplan, S.; Garrick, B.J.

    1985-01-01

    The purpose of this paper is to provide some suggestions and contributions toward a uniform conceptual/linguistic framework for quantifying and making precise the notion of risk. The concepts and definitions the authors present in this connection have shown themselves to be sturdy and serviceable in practical application to a wide variety of risk situations. They have demonstrated, in the courtroom and elsewhere, the ability to improve communication and greatly diminish the confusion and controversy that often swirl around public decision making involving risk. The authors hope therefore with this paper to widen the understanding and adoption of this framework, and to that end adopt a leisurely and tutorial pace. In particular, they carefully draw a distinction between "probability" and "frequency." Then, using this distinction, they return to the idea of risk and give a "second-level" definition of risk, which generalizes the first-level definition and is large enough and flexible enough to include at least all the aspects and subtleties of risk that have been encountered in the authors' experience
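
    For context, the first-level definition referred to here is the authors' well-known set-of-triplets formulation (quoted from memory of the published paper, so treat the notation as a paraphrase): risk is the complete set of answers to "What can happen?", "How likely is it?" and "What are the consequences?", i.e.

        R = \{ \langle s_i, p_i, x_i \rangle \}, \qquad i = 1, 2, \ldots, N,

    where s_i is a scenario, p_i its probability, and x_i its consequence measure. The second-level definition replaces p_i by a frequency together with a degree-of-belief probability distribution over that frequency, which is precisely where the probability/frequency distinction does its work.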

  18. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis, as well as new results on Zircaloy corrosion and materials transfer in liquid sodium. (author)

  19. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  20. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk of complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. Failures are often due to excessive polyethylene wear. Glenoid liners are complex and hemispherical in shape and present challenges when assessing damage. Therefore, the study of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by the observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by the four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way, and they had engineering backgrounds.

  1. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber performing the analog-to-digital conversion into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can take one of 256 (2^8) values, from 0 to 255. The human eye seems to be quite content with a display of 5-bit images (corresponding to 32 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
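
    The two shading corrections described above are one-liners in numpy; a minimal sketch (the function names are ours, not from the paper):

        import numpy as np

        def shading_subtract(img, background):
            """Additive correction: remove the background (white) image pixel
            by pixel, then re-centre and clip to the valid grey range."""
            out = img.astype(float) - background + background.mean()
            return np.clip(out, 0, 255).astype(np.uint8)

        def shading_divide(img, background):
            """Multiplicative correction: normalise by the background image."""
            out = img.astype(float) / np.maximum(background, 1) * background.mean()
            return np.clip(out, 0, 255).astype(np.uint8)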

  2. Quantitative risk assessment of digitalized safety systems

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Sung Min; Lee, Sang Hun; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Lee, Seung Jun [UNIST, Ulsan (Korea, Republic of)

    2016-05-15

    A report published by the U.S. National Research Council indicates that appropriate methods for assessing reliability are key to establishing the acceptability of digital instrumentation and control (I and C) systems in safety-critical plants such as NPPs. Since this issue was raised, the methodology for the probabilistic safety assessment (PSA) of digital I and C systems has been studied; however, there is still no widely accepted method. Kang and Sung found three critical factors for the safety assessment of digital systems: detection coverage of fault-tolerant techniques, software reliability quantification, and network communication risk. In reality, the various factors composing digitalized I and C systems are not independent of each other but rather closely connected. Thus, from a macro point of view, a method that can integrate risk factors with different characteristics needs to be considered, together with micro approaches to address the challenges facing each factor.

  3. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and

  4. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
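
    A minimal Python sketch of the risk-reduction arithmetic such a tool is meant to support; the probabilities, loss and cost are invented placeholders:

        def risk(p_attack, loss):
            """Risk = probability of a successful attack x value of the loss."""
            return p_attack * loss

        p_before, p_after = 0.08, 0.02   # annual probability of successful attack
        loss = 12e6                      # USD lost if the attack succeeds
        mitigation_cost = 250_000        # annualised cost of the control

        reduction = risk(p_before, loss) - risk(p_after, loss)
        print("annual risk reduction: %.0f USD" % reduction)
        print("benefit/cost ratio: %.1f" % (reduction / mitigation_cost))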

  5. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of pigment and extender mixtures frequently used in the paint industry. The results obtained have shown the usefulness of these methods, but their accuracy still needs improvement. (Author) [pt

  6. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  7. EFFICIENT QUANTITATIVE RISK ASSESSMENT OF JUMP PROCESSES: IMPLICATIONS FOR FOOD SAFETY

    OpenAIRE

    Nganje, William E.

    1999-01-01

    This paper develops a dynamic framework for efficient quantitative risk assessment from the simplest general risk, combining three parameters (contamination, exposure, and dose response) in a Kataoka safety-first model and a Poisson probability representing the uncertainty effect or jump processes associated with food safety. Analysis indicates that incorporating jump processes in food safety risk assessment provides more efficient cost/risk tradeoffs. Nevertheless, increased margin of safety...

  8. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  9. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.

  10. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    Science.gov (United States)

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
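
    A minimal Python sketch of computing a limiting conditional (quasi-stationary) distribution: take the sub-stochastic transition matrix over the transient disease states of a Markov chain with death and normalise its dominant left eigenvector. The three-state matrix below is invented:

        import numpy as np

        # Transient states: mild, moderate, severe (the death column is omitted,
        # so rows sum to less than 1; the remainder is the absorption probability).
        Q = np.array([[0.90, 0.07, 0.01],
                      [0.05, 0.85, 0.07],
                      [0.00, 0.04, 0.86]])

        evals, evecs = np.linalg.eig(Q.T)   # left eigenvectors of Q
        k = np.argmax(evals.real)           # Perron root of the transient block
        qsd = np.abs(evecs[:, k].real)
        qsd /= qsd.sum()                    # equilibrium proportions among survivors
        print("limiting conditional distribution:", np.round(qsd, 3))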

  11. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; pervasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates now appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  12. Observations on risk analysis

    International Nuclear Information System (INIS)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent, whereas WASH 1400 takes a per-demand failure rate approach that obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested

  13. Stochastic evaluation of tsunami inundation and quantitative estimating tsunami risk

    International Nuclear Information System (INIS)

    Fukutani, Yo; Anawat, Suppasri; Abe, Yoshi; Imamura, Fumihiko

    2014-01-01

    We performed a stochastic evaluation of tsunami inundation by using the results of a stochastic tsunami hazard assessment at the Soma port in the Tohoku coastal area. Eleven fault zones along the Japan trench were selected as earthquake faults generating tsunamis. The results show that the estimated inundation area for a return period of about 1,200 years is in good agreement with that observed in the 2011 Tohoku earthquake. In addition, we quantitatively evaluated the tsunami risk for four types of buildings at the Soma port (reinforced concrete, steel, brick and wood) by combining the results of the inundation assessment with a tsunami fragility assessment. The quantitative risk estimates properly reflect the vulnerability of the buildings: the wood building has a high risk and the reinforced concrete building a low risk. (author)
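    A minimal sketch of the hazard-fragility combination described above, with invented numbers (the study's actual hazard curves and fragility functions are not reproduced here):

```python
# Combine an inundation hazard curve (annual probability of reaching each
# depth bin) with depth-dependent fragility curves (probability of collapse
# given depth) to get an annual collapse risk per building type.
hazard = {0.5: 1e-2, 1.0: 5e-3, 2.0: 1e-3, 5.0: 8e-4}  # depth (m) -> annual prob.

fragility = {  # P(collapse | inundation depth), illustrative values
    "wood":                {0.5: 0.20, 1.0: 0.55, 2.0: 0.90, 5.0: 1.00},
    "reinforced_concrete": {0.5: 0.01, 1.0: 0.03, 2.0: 0.10, 5.0: 0.35},
}

for building, frag in fragility.items():
    annual_risk = sum(p_h * frag[depth] for depth, p_h in hazard.items())
    print(f"{building:>20}: annual collapse probability = {annual_risk:.2e}")
```

    With these (made-up) inputs the wood building comes out roughly an order of magnitude riskier than the reinforced concrete one, mirroring the qualitative ordering reported in the abstract.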

  14. Quantitative Security Risk Assessment of Android Permissions and Applications

    OpenAIRE

    Wang , Yang; Zheng , Jun; Sun , Chen; Mukkamala , Srinivas

    2013-01-01

    Part 6: Mobile Computing; International audience; The booming of the Android platform in recent years has attracted the attention of malware developers. However, the permission-based model used in the Android system to prevent the spread of malware has been shown to be ineffective. In this paper, we propose DroidRisk, a framework for quantitative security risk assessment of both Android permissions and applications (apps) based on permission request patterns from benign apps and malware, which aims ...
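    A scoring sketch in the spirit of the framework described above, but not the paper's exact model: permissions that malware requests far more often than benign apps receive higher weights, and an app's risk is the sum of its permissions' weights (all frequencies below are invented).

```python
# Illustrative permission-based risk scoring; frequencies are hypothetical.
malware_freq = {"SEND_SMS": 0.60, "READ_CONTACTS": 0.40, "INTERNET": 0.95}
benign_freq  = {"SEND_SMS": 0.05, "READ_CONTACTS": 0.15, "INTERNET": 0.90}

def permission_weight(perm: str) -> float:
    # Higher when a permission is characteristic of malware rather than benign apps.
    return malware_freq.get(perm, 0.0) / max(benign_freq.get(perm, 1e-6), 1e-6)

def app_risk(permissions: list[str]) -> float:
    return sum(permission_weight(p) for p in permissions)

print(app_risk(["SEND_SMS", "INTERNET"]))   # higher score: SMS is malware-typical
print(app_risk(["INTERNET"]))               # lower score: requested by almost all apps
```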

  15. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, the concentration of zinc was derived; the concentrations of other elements were then obtained by regarding zinc as an internal standard. As a result, the concentration values of sulphur for 40 samples agree well with the average value for a typical Japanese subject, and with each other within 20%, confirming the validity of the present method. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent concentration values for many elements were obtained by the standard-free method.
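    A minimal sketch of the internal-standard step described above, under assumed intensities and relative sensitivities (all values are hypothetical):

```python
# Internal-standard quantification: once the zinc concentration is known, the
# concentration of another element follows from the ratio of measured X-ray
# intensities, corrected by the element's sensitivity relative to zinc.
c_zn = 150.0    # zinc concentration, ug/g (from the standard-free step)
i_zn = 4200.0   # measured X-ray intensity for Zn (counts)

def concentration(i_x: float, rel_sensitivity: float) -> float:
    """Concentration of element X using Zn as internal standard.

    rel_sensitivity: counts per (ug/g) for X, relative to Zn.
    """
    return c_zn * (i_x / i_zn) / rel_sensitivity

print(concentration(i_x=9800.0, rel_sensitivity=0.045))  # e.g. sulphur, ug/g
```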

  16. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  17. Implementing quantitative analysis and its complement

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Nelson, W.R.; Shepherd, J.C.

    1982-01-01

    This paper presents an application of risk analysis to the evaluation of nuclear reactor facility operation. Common cause failure analysis (CCFA) techniques to identify potential problem areas are discussed. The integration of CCFA and response trees, a particular form of the path sets of a success tree, to gain significant insight into the operation of the facility is also demonstrated. An example illustrating the development of the risk analysis methodology, the development of the fault trees, the generation of response trees, and the evaluation of the CCFA is presented to explain the technique.

  18. A qualitative and quantitative analysis of risk perception and treatment options as related to wildfires in the USDA FS Region 3 National Forests

    Science.gov (United States)

    Ingrid M. Martin; Wade E. Martin; Carol B. Raish

    2011-01-01

    As the incidence of devastating fires rises, managing the risk posed by these fires has become critical. This report provides important information to examine the ways that different groups or disaster subcultures develop the mentalities or perceived realities that affect their views and responses concerning risk and disaster preparedness. Fire risk beliefs and...

  19. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    A method for the quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique was described. The calculated indices are the washout factor, the vitality index and the redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to counts immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial imaging, in order to correct for Tl-201 washout from the myocardium under the assumption that washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of counts in the region of interest in serial imaging after exercise to those immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty were presented. (author)
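    The three indices translate directly into code; the counts below are invented for illustration:

```python
# Hypothetical counts illustrating the three indices defined above.
whole_myo_immediate = 50_000.0   # whole-myocardium counts right after exercise
whole_myo_delayed   = 40_000.0   # whole-myocardium counts after the delay
roi_immediate = 900.0            # ROI counts right after exercise
roi_delayed   = 820.0            # ROI counts after the delay
roi_max_uptake = 1250.0          # maximum ROI uptake in the study

washout_factor = whole_myo_delayed / whole_myo_immediate   # global washout
vitality_index = roi_immediate / roi_max_uptake            # relative uptake
redistribution_factor = roi_delayed / roi_immediate        # per-ROI serial change

# A redistribution factor above the global washout factor suggests delayed
# Tl-201 accumulation (redistribution) in an ischemic region.
print(washout_factor, vitality_index, redistribution_factor)
```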

  20. Quantitative evaluation of risks for individuals in diagnostic radiology

    Energy Technology Data Exchange (ETDEWEB)

    Iinuma, T A; Tateno, Y; Hashizume, T [National Inst. of Radiological Sciences, Chiba (Japan)

    1980-05-01

    A method to quantitatively estimate the risks to individual patients from exposure to diagnostic radiation (the carcinogenic and genetic effects of radiation) was proposed on the basis of ICRP-26. The carcinogenic effect of radiation was calculated by multiplying the mean dose equivalent for each organ per radiological examination by the shortening of average life expectancy, which was calculated from the incidence of fatal carcinoma of each organ, the latent period of the carcinoma, and its incidence period. The genetic effect of radiation was calculated by multiplying the mean gonad dose equivalent per radiological examination by the incidence of genetically severe radiation damage due to parental exposure and by the child expectancy rate. Three examples were given of risk calculations for photofluorographic examinations of the stomach and chest, and for mammography. The same method of calculation can be applied to in-vivo nuclear medicine examinations. Further investigation is required to calculate the risks quantitatively for the various types of diagnostic procedures using radiation.
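    Schematically, both effects reduce to a dose multiplied by a risk coefficient; the short sketch below shows the structure of the calculation with invented coefficients (not ICRP-26 values and not the paper's data):

```python
# Structure of the individual-risk estimate described above; every number here
# is hypothetical.
organ_doses_mSv = {"stomach": 3.0, "lung": 0.2}            # mean dose equivalent per exam
life_loss_days_per_mSv = {"stomach": 1.2e-4, "lung": 0.8e-4}

# Carcinogenic effect: organ dose times loss of life expectancy per unit dose.
life_expectancy_loss_days = sum(
    dose * life_loss_days_per_mSv[organ] for organ, dose in organ_doses_mSv.items()
)

# Genetic effect: gonad dose times hereditary-damage incidence times child expectancy.
gonad_dose_mSv = 0.1
severe_damage_per_mSv = 1e-5
child_expectancy = 0.4
genetic_risk = gonad_dose_mSv * severe_damage_per_mSv * child_expectancy

print(f"life-expectancy loss per exam: {life_expectancy_loss_days:.2e} days")
print(f"expected severe hereditary effects: {genetic_risk:.2e}")
```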

  1. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    Science.gov (United States)

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
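    Reading the definition as the common logarithm of the ratio of accessible records to authentication steps reproduces the figures quoted above; in particular, one million records behind three authentication steps gives 5.52. A minimal sketch under that reading:

```python
import math

def breach_risk_magnitude(records: int, auth_steps: int) -> float:
    # BRM = log10(accessible records / authentication steps); the system BRM
    # is the maximum of this value over all users.
    return math.log10(records / auth_steps)

# Reproducing the article's figures for a one-million-record database:
print(breach_risk_magnitude(1_000_000, 1))  # 6.00
print(breach_risk_magnitude(1_000_000, 3))  # 5.52
```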

  2. Quantitative evaluation of risks for individuals in diagnostic radiology

    International Nuclear Information System (INIS)

    Iinuma, T.A.; Tateno, Yukio; Hashizume, Tadashi

    1980-01-01

    A method to quantitatively estimate the risks to individual patients from exposure to diagnostic radiation (the carcinogenic and genetic effects of radiation) was proposed on the basis of ICRP-26. The carcinogenic effect of radiation was calculated by multiplying the mean dose equivalent for each organ per radiological examination by the shortening of average life expectancy, which was calculated from the incidence of fatal carcinoma of each organ, the latent period of the carcinoma, and its incidence period. The genetic effect of radiation was calculated by multiplying the mean gonad dose equivalent per radiological examination by the incidence of genetically severe radiation damage due to parental exposure and by the child expectancy rate. Three examples were given of risk calculations for photofluorographic examinations of the stomach and chest, and for mammography. The same method of calculation can be applied to in-vivo nuclear medicine examinations. Further investigation is required to calculate the risks quantitatively for the various types of diagnostic procedures using radiation. (Tsunoda, M.)

  3. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance (NMR) spectroscopy are reviewed: specifically, the determination of moisture in liquid N2O4 as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be at optimal conditions to eliminate errors, particularly when computation is done by machine. A higher sweep velocity is preferable in view of the S/N ratio, but it may be limited to 30 Hz/s. The relative error in the measurement of peak area is generally 1%, but when signals of dilute concentration are integrated, the error becomes smaller by one digit. If impurities are treated carefully, the water content in N2O4 can be determined with an accuracy of about 0.002%. The comparison of peak heights is as accurate as that of peak areas when the uniformity of the magnetic field and T2 are not in question. In the case of a chemical shift that moves with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T1 with 90 deg pulses.

  4. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    The excellent performance of a ship is assured by accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis, due to a shortage of process data resulting from limited samples, varied process types and non-standardized processes. Aiming to predict and control the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulas for calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to analyzing the process reliability of a ship's shaft gear box installation, which demonstrates the applicability and effectiveness of the method. The analysis results can serve as a useful reference for setting key quality inspection points and optimizing key processes.

  5. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round-robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their own characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is accepted as highly reliable only under limited conditions or for particular samples. Neutron diffraction is especially capable for oxides, owing to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become a stronger tool for the analysis of industrial materials. Through this study we hope not only to perform an instrument performance test on our HRPD but also to improve our ability to analyze neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  6. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    Science.gov (United States)

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin, mirabegron, and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to the monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and with mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
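    A generic weighted-sum MCDA sketch of the kind described above; the weights, attributes and normalized scores below are invented and do not reproduce the study's actual model:

```python
# Weighted-sum clinical utility: each arm's normalized attribute scores are
# combined with attribute weights (all numbers are hypothetical).
weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

arms = {
    "solifenacin 5 mg":        {"efficacy": 0.55, "safety": 0.90, "tolerability": 0.85},
    "mirabegron 50 mg":        {"efficacy": 0.60, "safety": 0.85, "tolerability": 0.90},
    "combination 5 mg + 50 mg": {"efficacy": 0.80, "safety": 0.80, "tolerability": 0.80},
}

def clinical_utility(scores: dict[str, float]) -> float:
    return sum(weights[a] * scores[a] for a in weights)

for arm, scores in sorted(arms.items(), key=lambda kv: -clinical_utility(kv[1])):
    print(f"{arm:>25}: utility = {clinical_utility(scores):.2f}")
```

    With these made-up inputs the combination arm ranks first, mirroring the ordering reported in the abstract; in a real application the weights and value functions would be elicited from decision makers.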

  7. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
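    One of the simplest quantitative bias analyses, adjusting an observed 2x2 table for nondifferential exposure misclassification, illustrates the practice; all counts and bias parameters below are hypothetical:

```python
# Back-correct observed exposed counts given assumed sensitivity/specificity
# of exposure classification (standard misclassification correction).
def corrected_count(observed_exposed: float, total: float,
                    sensitivity: float, specificity: float) -> float:
    return (observed_exposed - (1 - specificity) * total) / (sensitivity + specificity - 1)

a_obs, cases = 150.0, 400.0      # observed exposed cases / total cases
b_obs, controls = 100.0, 400.0   # observed exposed controls / total controls
se, sp = 0.85, 0.95              # assumed bias parameters

a = corrected_count(a_obs, cases, se, sp)
b = corrected_count(b_obs, controls, se, sp)

or_obs = (a_obs / (cases - a_obs)) / (b_obs / (controls - b_obs))
or_adj = (a / (cases - a)) / (b / (controls - b))
print(f"observed OR = {or_obs:.2f}, bias-adjusted OR = {or_adj:.2f}")
```

    As expected for nondifferential misclassification, the adjusted odds ratio moves away from the null relative to the observed one; a full bias analysis would also propagate uncertainty in the bias parameters.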

  8. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    Science.gov (United States)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  9. Campylobacter Risk Analysis

    DEFF Research Database (Denmark)

    Nauta, Maarten

    In several countries quantitative microbiological risk assessments (QMRAs) have been performed for Campylobacter in chicken meat. The models constructed for this purpose provide a good example of the development of QMRA in general and illustrate the diversity of available methods. Despite the differences between the models, the most prominent conclusions of the QMRAs are similar. These conclusions for example relate to the large risk of highly contaminated meat products and the insignificance of contamination from Campylobacter positive flocks to negative flocks during slaughter and processing...

  10. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and an optimal concentration of antibody, to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture is incubated and human erythrocytes are added; after further incubation, ice-cold EDTA-containing buffer is added and the erythrocytes are centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  11. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to measure the density of dark spectral lines; by analyzing these lines, the content of a sample and its concentration can be determined, an analysis known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to produce a graph, making it possible to relate the density of a dark spectral line to the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the element under study is determined. The automatic data acquisition, calculation and production of results is done by means of a computer (PC) and a computer program. The signal conditioning circuits have the function of delivering TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible.
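    The three steps can be sketched with a least-squares fit at each stage; all measurements below are invented for illustration:

```python
import numpy as np

# 1. Emulsion calibration: measured line density vs. relative incident intensity.
log_intensity = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
density = np.array([0.15, 0.42, 0.71, 0.97, 1.24])
cal = np.polyfit(density, log_intensity, 1)      # density -> log10(intensity)

# 2. Working curve: log10(intensity) vs. log10(concentration) for known standards.
log_conc_std = np.array([0.0, 0.5, 1.0, 1.5])    # log10(ppm)
log_int_std = np.array([0.10, 0.55, 1.02, 1.49])
work = np.polyfit(log_int_std, log_conc_std, 1)

# 3. Analytical result: from a sample's measured line density to a concentration.
sample_density = 0.80
log_i = np.polyval(cal, sample_density)
conc_ppm = 10 ** np.polyval(work, log_i)
print(f"estimated concentration: {conc_ppm:.1f} ppm")
```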

  12. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and for process control. In general, such calibrations are product-specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, however, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat samples, metal drillings, thin layers on substrates etc.), which could also contain elements that are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  13. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized organelles of plant cells that differentiate into various forms, including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining overall cell metabolism and in sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of plastid differentiation and functional change in order to enhance our understanding of plant biology. In this chapter, a method is detailed for the extraction of intact plastids that makes analysis possible while maintaining plastid functions; in addition, a quantitative shotgun method is described for analyzing the composition and changes in the content of plastid proteins in response to environmental impacts.

  14. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  15. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, covering five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision. The model relates these elements to the domains of experts and decision makers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  16. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  17. Private risk in public infrastructure: a quantitative risk analysis as a contract modeling tool

    Directory of Open Access Journals (Sweden)

    Luiz E. T. Brandão

    2007-12-01

    Public-private partnerships (PPP) are contractual arrangements in which the government assumes future obligations through guarantees and options. They are considered a way of increasing government efficiency through a more efficient allocation of risks and incentives. On the other hand, the determination of the optimal level of these guarantees and the allocation of risks themselves are usually carried out subjectively, potentially exposing the government to significant future liabilities. This article proposes a quantitative model for the evaluation of government guarantees in PPP projects under the real options approach, and applies this model to a toll highway concession with a minimum revenue guarantee. It studies the impact of different guarantee levels on the value and the risk of the project, as well as the expected level of future cash payments to be made by the government in each situation, concluding that the government can determine the optimal level of guarantee as a function of the desired degree of risk reduction, and that the design and contractual modeling of PPP projects can benefit from the quantitative tools presented here.
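    A minimal Monte Carlo sketch of valuing a minimum revenue guarantee in the spirit of the real options approach described above; all parameters are invented, revenue is modeled as geometric Brownian motion, and a flat discount rate stands in for a full risk-neutral valuation:

```python
import numpy as np

# Value the government's annual top-up payments max(guarantee - revenue, 0).
rng = np.random.default_rng(0)
r, sigma, mu = 0.08, 0.25, 0.05        # discount rate, volatility, revenue drift
rev0, guarantee = 100.0, 80.0          # initial revenue and guaranteed floor
years, n_paths = 20, 100_000

z = rng.standard_normal((n_paths, years))
growth = np.exp((mu - 0.5 * sigma**2) + sigma * z)   # annual growth factors
revenue = rev0 * np.cumprod(growth, axis=1)          # simulated annual revenues

payments = np.maximum(guarantee - revenue, 0.0)      # government outlay per year
discount = np.exp(-r * np.arange(1, years + 1))
guarantee_value = (payments * discount).sum(axis=1).mean()

print(f"expected PV of government guarantee payments: {guarantee_value:.1f}")
```

    Re-running the valuation across a grid of guarantee levels traces out the trade-off studied in the paper: higher floors reduce project risk for the concessionaire but raise the government's expected liability.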

  18. New developments in quantitative risk assessment of campylobacteriosis

    DEFF Research Database (Denmark)

    Havelaar, Arie; Nauta, Maarten

    Quantitative microbiological risk assessment (QMRA) is now broadly accepted as an important decision support tool in food safety risk management. It has been used to support decision making at the global level (Codex Alimentarius, FAO and WHO), at the European level (European Food Safety Authority)... ...meat to ready-to-eat foods is the main pathway of consumer exposure. Undercooking appears to be of minor importance. However, this conclusion may need to be reconsidered in the light of increasing consumption of minced meat preparations. Five QMRA models have been compared in detail, and detailed...

  19. Quantitative proteomics analysis using 2D-PAGE to investigate the effects of cigarette smoke and aerosol of a prototypic modified risk tobacco product on the lung proteome in C57BL/6 mice.

    Science.gov (United States)

    Elamin, Ashraf; Titz, Bjoern; Dijon, Sophie; Merg, Celine; Geertz, Marcel; Schneider, Thomas; Martin, Florian; Schlage, Walter K; Frentzel, Stefan; Talamo, Fabio; Phillips, Blaine; Veljkovic, Emilija; Ivanov, Nikolai V; Vanscheeuwijck, Patrick; Peitsch, Manuel C; Hoeng, Julia

    2016-08-11

    Smoking is associated with several serious diseases, such as lung cancer and chronic obstructive pulmonary disease (COPD). Within our systems toxicology framework, we are assessing whether potential modified risk tobacco products (MRTPs) can reduce smoking-related health risks compared to conventional cigarettes. In this article, we evaluated to what extent 2D-PAGE/MALDI MS/MS (2D-PAGE) can complement the iTRAQ LC-MS/MS results from a previously reported mouse inhalation study, in which we assessed a prototypic MRTP (pMRTP). Selected differentially expressed proteins identified by both LC-MS/MS and 2D-PAGE approaches were further verified using reverse-phase protein microarrays. LC-MS/MS captured the effects of cigarette smoke (CS) on the lung proteome more comprehensively than 2D-PAGE. However, an integrated analysis of both proteomics data sets showed that 2D-PAGE data complement the LC-MS/MS results by supporting the overall trend of lower effects of pMRTP aerosol than CS on the lung proteome. Biological effects of CS exposure supported by both methods included increases in immune-related, surfactant metabolism, proteasome, and actin cytoskeleton protein clusters. Overall, while 2D-PAGE has its value, especially as a complementary method for the analysis of effects on intact proteins, LC-MS/MS approaches will likely be the method of choice for proteome analysis in systems toxicology investigations. Quantitative proteomics is anticipated to play a growing role within systems toxicology assessment frameworks in the future. To further understand how different proteomics technologies can contribute to toxicity assessment, we conducted a quantitative proteomics analysis using 2D-PAGE and isobaric tag-based LC-MS/MS approaches and compared the results produced from the two approaches. Using a prototypic modified risk tobacco product (pMRTP) as our test item, we show, compared with cigarette smoke, how 2D-PAGE results can complement and support LC-MS/MS data, demonstrating

  20. N reactor individual risk comparison to quantitative nuclear safety goals

    International Nuclear Information System (INIS)

    Wang, O.S.; Rainey, T.E.; Zentner, M.D.

    1990-01-01

    A full-scope level III probabilistic risk assessment (PRA) has been completed for the N reactor, a US Department of Energy (DOE) production reactor located on the Hanford Reservation in the state of Washington. Sandia National Laboratories (SNL) provided the technical leadership for this work, using the state-of-the-art NUREG-1150 methodology developed for the US Nuclear Regulatory Commission (NRC). The main objectives of this effort were to assess the risks to the public and to the on-site workers posed by the operation of the N reactor, to identify changes to the plant that could reduce the overall risk, and to compare those risks to the proposed NRC and DOE quantitative safety goals. This paper presents the methodology adopted by Westinghouse Hanford Company (WHC) and SNL for individual health risk evaluation, its results, and a comparison to the NRC safety objectives and the DOE nuclear safety guidelines. The N reactor results are also compared with the five NUREG-1150 nuclear plants. Only internal events are compared here because external events are not yet reported in the current draft of NUREG-1150. This is the first full-scope level III PRA study with a detailed quantitative safety goal comparison performed for DOE production reactors.

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low- and middle-income countries (LMICs), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) participatory mapping; 2) quantitative interviews; 3) sex work venue field observation; 4) time-location-activity diaries; 5) in-depth interviews about daily activity spaces. We found that the mixed methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment in drug and sexual risk behaviors among high-risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource-constrained contexts provides new opportunities for informing public health interventions.

  2. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is shown that both methods can be applied in practice, and the choice between them depends on the available basic data for the gas pipelines and on the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
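    Schematically, the individual-risk output of such a quantitative method is a frequency-weighted sum over accident scenarios; the sketch below uses the four consequence types named above with invented frequencies and location-specific fatality probabilities:

```python
# Individual risk at a fixed location: sum over scenarios of the annual
# accident frequency times the conditional probability of fatality there.
scenarios = [
    # (consequence type, annual frequency, P(fatality | scenario, location))
    ("jet flame",           2e-5, 0.30),
    ("fireball combustion", 5e-6, 0.60),
    ("UVCE",                1e-6, 0.80),
    ("toxic gas diffusion", 8e-6, 0.10),
]

individual_risk = sum(freq * p_fatal for _, freq, p_fatal in scenarios)
print(f"individual risk = {individual_risk:.2e} per year")
```

    Societal (social) risk is built from the same scenario set by replacing the per-person fatality probability with the expected number of fatalities and plotting the resulting frequency-consequence (F-N) pairs.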

  3. Quantitative risk assessment for environmental and occupational health. The practical solution

    International Nuclear Information System (INIS)

    Hallenbeck, W.H.; Cunningham, K.M.

    1986-01-01

    The following topics are covered in this book: concepts, methods, and limitations; exposure characterization; qualitative evaluation of human and animal studies; quantitative evaluation of human and animal studies; risk analysis; acceptable concentrations; environmental and occupational exposure to a hypothetical toxicant; and environmental exposure to a natural toxicant, radon-222 and its daughters

  4. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  5. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). The eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies Tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivities of 0.88, 0.82, and 0.83, specificities of 0.72, 0.83, and 0.76, and AUCs of 0.90, 0.84, and 0.87, respectively. In per-territory...
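    The principal summary measures follow directly from the extracted 2x2 counts; the sketch below uses invented counts (not the pooled study data):

```python
# Diagnostic accuracy from a 2x2 table of test results vs. reference standard.
tp, fp, tn, fn = 88, 28, 72, 12   # hypothetical counts

sensitivity = tp / (tp + fn)      # 0.88: diseased correctly detected
specificity = tn / (tn + fp)      # 0.72: non-diseased correctly ruled out
ppv = tp / (tp + fp)              # positive predictive value
npv = tn / (tn + fn)              # negative predictive value

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f}")
```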

  6. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of human reliability analysis (HRA) in the context of the nuclear power industry. In fact, the author criticizes historic and current HRA as having failed to inform policy makers who make decisions, based on risk, about what humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  7. Supply chain risk management of newspaper industry: A quantitative study

    Science.gov (United States)

    Sartika, Viny; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    The newspaper industry has several distinctive features that make it stand out from other industries. The strict delivery deadline and zero inventory lead to a very short time frame for production and distribution. On the one hand, there is pressure from the newsroom to postpone the start of production as long as possible in order to include late news, while on the other hand there is pressure from production and distribution to start production as early as possible. Supply chain risk management is needed to determine the best strategy for dealing with possible risks in the newspaper industry. In a case study of a newspaper in Surakarta, a quantitative approach to newspaper supply chain risk management is taken by calculating the expected cost of each risk from the magnitude of its impact and the probability of the risk event. The calculations show that the five risks with the highest values are newspaper delays to the end customer, broken plates, misprints, machine breakdowns, and delayed delivery of newspaper content. Appropriate mitigation strategies for coping with these risk events are then analyzed.
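    The expected-cost calculation described above is straightforward; the five risk events are taken from the abstract, while the probabilities and impacts below are invented:

```python
# Rank risk events by expected cost = probability x impact (hypothetical values).
risks = {
    "newspaper delays to the end customer":  (0.30,  900.0),
    "broken plate":                          (0.10, 1500.0),
    "misprint":                              (0.25,  500.0),
    "machine breakdown":                     (0.05, 2500.0),
    "delayed delivery of newspaper content": (0.20,  600.0),
}

expected_cost = {name: p * impact for name, (p, impact) in risks.items()}
for name, cost in sorted(expected_cost.items(), key=lambda kv: -kv[1]):
    print(f"{name:<40} expected cost = {cost:7.1f}")
```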

  8. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enables the development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  9. Generalized indices for radiation risk analysis

    International Nuclear Information System (INIS)

    Bykov, A.A.; Demin, V.F.

    1989-01-01

    A new approach to ensuring nuclear safety began forming in the early eighties. This approach, based on probabilistic safety analysis, the principle of acceptable risk, the optimization of safety measures, etc., has required the development of an adequate complex of quantitative methods for assessment, safety analysis and risk management. Methods of radiation risk assessment and analysis hold a prominent place in this complex. National and international research and regulatory organizations (ICRP, IAEA, WHO, UNSCEAR, OECD/NEA) have given much attention to the development of the conceptual and methodological basis of these methods. Some resolutions of the National Commission of Radiological Protection (NCRP) and the Problem Commission on Radiation Hygiene of the USSR Ministry of Health should also be noted. Both cost-benefit analysis (CBA) and other methods of radiation risk analysis and safety management use a system of natural and socio-economic indices characterizing the radiation risk or damage. A number of problems are associated with the introduction, justification and use of these indices. For example, the price, a, of radiation damage per unit of collective dose is a noteworthy index. The difficulties in its qualitative and quantitative determination are still an obstacle to a wide application of CBA in radiation risk analysis and management. During the past 10-15 years these problems have been a subject of consideration for many authors. The present paper also considers the issues of the qualitative and quantitative justification of the indices of radiation risk analysis.

  10. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film; Winston-Lutz test tools; and the Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations were identified with offset values greater than 1 mm. In addition, when the method developed here was compared with the one previously studied, the data obtained were very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the person performing the test. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)

  11. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  12. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easy-readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept... ...detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken...

  13. Supplementing quantitative risk assessments with a stage addressing the risk understanding of the decision maker

    International Nuclear Information System (INIS)

    Aven, Terje

    2016-01-01

    A quantitative probabilistic risk assessment produces a conditional risk description given the knowledge of the analysts (formulated to a large extent through assumptions). However, important aspects of the risk may be concealed in the background knowledge of the analyst and the assumptions. This paper discusses this issue, the main purpose being to present a two-stage risk assessment approach where the second stage addresses the risk understanding of the decision maker. This second stage is to a large extent qualitative. The approach is novel in its separation between the analysts' conditional risk descriptions using probability judgments and the decision maker's risk understanding. The approach aims at improving the use of risk assessment in practical decision making by ensuring that the results of the risk assessments are properly interpreted and the key aspects of risk, uncertainty and knowledge are brought to the attention of the decision makers. Examples are used to illustrate the approach. - Highlights: • A quantitative risk assessment produces a conditional risk description. • The decision maker (DM) needs to address risk beyond this description. • The paper presents a related two-stage process, covering analyst and DM judgments. • The second stage relates to the DM's risk understanding. • Strength-of-knowledge judgments are included in both stages.

  14. Component of the risk analysis

    International Nuclear Information System (INIS)

    Martinez, I.; Campon, G.

    2013-01-01

    The PowerPoint presentation reviews issues such as risk analysis (Codex), risk management, the risk manager's preliminary activities, the relationship between government and industry, microbiological hazards and risk communication

  15. Unsharpness-risk analysis

    International Nuclear Information System (INIS)

    Preyssl, C.

    1986-01-01

    Safety analysis provides the only tool for the evaluation and quantification of rare or hypothetical events leading to system failure. So far, probability theory has been used for the fault- and event-tree methodology. Uncertainty constitutes an important aspect of risk analysis. Uncertainties can be classified as originating from 'randomness' or from 'fuzziness'. Probability theory addresses randomness only. The use of fuzzy set theory makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements: 'human failure' and 'conditionality' can be treated correctly, and only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty, as an implicit part of 'fuzzy risk', can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees that the importance attached to the parts of the risk exceeding or complying with the standard is taken into account. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of fuzzy risk analysis has to be free of hypostatization when reducing subjective to objective information. (Author)
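    The min-max calculus mentioned above can be sketched directly: each basic event carries a possibility value in [0, 1], an AND gate takes the minimum of its inputs and an OR gate the maximum (the event names and values below are invented):

```python
# Possibilistic fault-tree combination using only min-max relations.
events = {
    "pump fails":         0.3,
    "valve stuck":        0.5,
    "operator error":     0.7,
    "backup unavailable": 0.4,
}

def and_gate(*poss: float) -> float:
    return min(poss)   # all inputs must occur

def or_gate(*poss: float) -> float:
    return max(poss)   # any input suffices

# Top event: (pump fails OR valve stuck) AND (operator error OR backup unavailable)
top = and_gate(
    or_gate(events["pump fails"], events["valve stuck"]),
    or_gate(events["operator error"], events["backup unavailable"]),
)
print(f"possibility of top event = {top:.2f}")   # 0.50
```

    Unlike the probabilistic calculus, these min-max operations require no independence assumptions, which is one reason the approach handles 'conditionality' cleanly.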

  16. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short-axis and vertical long-axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single-vessel disease, and 100% in double- and triple-vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  17. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    Science.gov (United States)

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  18. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    Directory of Open Access Journals (Sweden)

    Lauren E. Joly

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high-risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability.

  19. The Pesticide Risk Beliefs Inventory: A Quantitative Instrument for the Assessment of Beliefs about Pesticide Risks

    OpenAIRE

    LePrevost, Catherine E.; Blanchard, Margaret R.; Cope, W. Gregory

    2011-01-01

    Recent media attention has focused on the risks that agricultural pesticides pose to the environment and human health; thus, these topics provide focal areas for scientists and science educators to enhance public understanding of basic toxicology concepts. This study details the development of a quantitative inventory to gauge pesticide risk beliefs. The goal of the inventory was to characterize misconceptions and knowledge gaps, as well as expert-like beliefs, concerning pesticide risk. This...

  20. Adversarial risk analysis

    CERN Document Server

    Banks, David L; Rios Insua, David

    2015-01-01

    Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...
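
    A minimal sketch of the ARA idea, assuming a toy two-option defend-attack game: the defender simulates the attacker's optimization under uncertain attacker payoffs to obtain a predictive distribution over attacks, then picks the defense with the smallest expected loss. All names and payoff numbers here are invented:

```python
import random

# Hypothetical 2x2 defend-attack game: defender losses LOSS[defense][attack]
LOSS = {"harden": {"cyber": 2.0, "physical": 1.0},
        "patrol": {"cyber": 4.0, "physical": 0.5}}

def predicted_attack(d, n=10_000):
    """Defender's subjective distribution over the attack, obtained by
    solving the attacker's problem under random draws of his utilities
    (here: attacker gain ~ defender loss plus Gaussian noise)."""
    counts = {"cyber": 0, "physical": 0}
    for _ in range(n):
        gain = {a: random.gauss(LOSS[d][a], 0.5) for a in counts}
        counts[max(gain, key=gain.get)] += 1
    return {a: c / n for a, c in counts.items()}

# Defender picks the defense with the smallest predictive expected loss
best = min(LOSS, key=lambda d: sum(p * LOSS[d][a]
                                   for a, p in predicted_attack(d).items()))
print("recommended defense:", best)
```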

  1. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and the normalized elastic recovery factor was defined in terms of the measured deflections. Experiments showed that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament, rather than just one section through the filament as in the metallographic method, it measures the degree of recrystallization more accurately. The method also takes considerably less time and cost than the conventional method.
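
    The abstract does not spell out how the factor is normalized. One plausible form (an assumption for illustration, not necessarily the author's definition) interpolates the recovered deflection between fully recrystallized and unrecrystallized reference states:

```latex
% Hypothetical normalization, not taken from the paper:
% \delta is the recovered elastic deflection of the tested sample;
% \delta_{rx} and \delta_{unrx} are reference deflections of fully
% recrystallized and unrecrystallized samples, respectively.
R = \frac{\delta - \delta_{\mathrm{rx}}}{\delta_{\mathrm{unrx}} - \delta_{\mathrm{rx}}},
\qquad f_{\mathrm{unrx}} \approx R
```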

  2. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: 1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; 2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and 3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
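
    The benchmark dose (BMD) approach mentioned above fits a dose-response model and inverts it at a preset benchmark response (BMR). A minimal sketch, assuming an exponential model and invented mutation-frequency data (real analyses use dedicated BMD software and model averaging):

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative mutation-frequency data (not from the workshop papers)
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
resp = np.array([1.0, 1.1, 1.3, 1.8, 2.9, 5.2])

def expo(d, a, b):
    """Simple exponential dose-response model: a * exp(b * d)."""
    return a * np.exp(b * d)

(a, b), _ = curve_fit(expo, dose, resp, p0=(1.0, 0.1))

# BMD for a 10% increase over background (BMR = 0.10):
# a*exp(b*BMD) = a*(1 + 0.10)  =>  BMD = ln(1.10) / b
bmd = np.log(1.10) / b
print(f"fitted a={a:.2f}, b={b:.3f}, BMD10={bmd:.2f}")
```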

  3. Is there a place for quantitative risk assessment?

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Eric J [Columbia University Medical Center, New York, NY (United States)

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  4. Is there a place for quantitative risk assessment?

    International Nuclear Information System (INIS)

    Hall, Eric J

    2009-01-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  5. Quantitative risk assessment of foods containing peanut advisory labeling.

    Science.gov (United States)

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate.
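
    The probabilistic assessment described above combines distributions for residual peanut level, intake, and allergic individuals' threshold doses. A minimal Monte Carlo sketch; every parameter value below is an illustrative placeholder, not the paper's survey data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Residual whole-peanut level in an advisory-labeled bar (ppm = mg/kg)
ppm = rng.lognormal(mean=np.log(100.0), sigma=1.0, size=n)
serving_g = rng.normal(50.0, 10.0, size=n).clip(10, 100)   # g per eating occasion

# mg peanut protein ingested (~25% protein content assumed for whole peanut)
dose_mg = ppm * serving_g / 1000.0 * 0.25

# Population distribution of individual threshold doses (mg protein);
# a lognormal shape is a common assumption in allergen risk assessments
threshold_mg = rng.lognormal(mean=np.log(25.0), sigma=1.5, size=n)

risk = np.mean(dose_mg > threshold_mg)
print(f"predicted reaction probability per eating occasion: {risk:.3f}")
```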

  6. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storage points, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
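
    A minimal discrete-event sketch of the point being made: a FIFO shelf with batch deliveries and random single-item demand yields a storage-time distribution whose tail a QMRA would feed into growth models. Delivery and demand rates are invented:

```python
import random
from collections import deque

random.seed(7)
shelf, storage_times = deque(), []
t, next_delivery, next_sale = 0.0, 0.0, 0.5

# FIFO retail shelf: one batch delivery per day vs. Poisson single-item demand
while t < 10_000.0:                      # simulate ~10,000 hours
    if next_delivery <= next_sale:
        t = next_delivery
        shelf.extend([t] * 20)           # 20 items arrive, stamped with arrival time
        next_delivery = t + 24.0         # next delivery in 24 h
    else:
        t = next_sale
        if shelf:                        # oldest item sold first (FIFO)
            storage_times.append(t - shelf.popleft())
        next_sale = t + random.expovariate(1.0)   # ~1 sale per hour

storage_times.sort()
mean = sum(storage_times) / len(storage_times)
p95 = storage_times[int(0.95 * len(storage_times))]
print(f"mean storage {mean:.1f} h, 95th percentile {p95:.1f} h")
```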

  7. Quantitative relationships between aging failure data and risk

    International Nuclear Information System (INIS)

    Vesely, W.E.; Vora, J.P.

    1988-01-01

    As part of the United States Nuclear Regulatory Commission's Nuclear Plant Aging Research program, a project is being carried out to quantify the risk effects of aging. The project is called the Risk Evaluation of Aging Phenomena (REAP) Project. Within the REAP Project, a procedure has been developed to quantify nuclear power plant risks from aging failure data. The procedure utilizes the linear aging model and its extensions in order to relate component aging failure rates to aging mechanism parameters which are estimable from failure and maintenance data. The aging failure rates can then be used to quantify the age-dependent plant risks, such as system unavailabilities, core melt frequency and public health risks. The REAP procedure is different from standard time-dependent approaches in that the failure rates are phenomenologically based, allowing engineering information to be utilized. Furthermore, gross data and incomplete data can be utilized. A software package has been developed which systematically analyzes data for aging effects and interfaces with a time-dependent risk analysis module to determine the risk implications of the aging effects. (author)
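
    The linear aging model named above takes the component failure rate as λ(t) = λ₀ + a·t, so the cumulative failure probability follows by integrating the hazard. A small sketch with illustrative rate parameters:

```python
import math

def failure_prob(t, lam0, a):
    """Linear aging model lambda(t) = lam0 + a*t; cumulative failure
    probability F(t) = 1 - exp(-(lam0*t + 0.5*a*t**2))."""
    return 1.0 - math.exp(-(lam0 * t + 0.5 * a * t ** 2))

lam0, a = 1e-6, 1e-12   # illustrative base rate (/h) and aging rate (/h^2)
for years in (1, 10, 20):
    t = years * 8760.0  # hours
    print(f"{years:2d} yr: P(failure) = {failure_prob(t, lam0, a):.3f}")
```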

  8. Integrating a quantitative risk appraisal in a health impact assessment

    DEFF Research Database (Denmark)

    Adám, Balázs; Molnár, Agnes; Gulis, Gabriel

    2013-01-01

    BACKGROUND: Although the quantification of health outcomes in a health impact assessment (HIA) is scarce in practice, it is preferred by policymakers, as it assists various aspects of the decision-making process. This article provides an example of integrating a quantitative risk appraisal in an HIA performed for the recently adopted Hungarian anti-smoking policy, which introduced a smoking ban in closed public places, workplaces and public transport vehicles, and is one of the most effective measures to decrease smoking-related ill health. METHODS: A comprehensive, prospective HIA ... to decrease the prevalence of active and passive smoking and result in a considerably positive effect on several diseases, among which lung cancer, chronic pulmonary diseases, coronary heart diseases and stroke have the greatest importance. The health gain calculated for the quantifiable health outcomes ...

  9. Environmental risk analysis

    International Nuclear Information System (INIS)

    Lima-e-Silva, Pedro Paulo de

    1996-01-01

    Conventional Risk Analysis (RA) usually relates the frequency of a certain undesired event to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly under the human approach, valuing loss of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting environmental aspects of human activities face huge difficulties in producing technical specifications and procedures leading to acceptable levels of impact, all the more so given the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its narrow scope (accident considerations only) and its wrong paradigm (direct human damages only). A paper by the author on the former was already submitted to the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)

  10. Simplified quantitative treatment of uncertainty and interindividual variability in health risk assessment

    International Nuclear Information System (INIS)

    Bogen, K.T.

    1993-01-01

    A distinction between uncertainty (the extent of lack of knowledge) and interindividual variability (the extent of person-to-person heterogeneity) regarding the values of input variates must be maintained if a quantitative characterization of uncertainty in population risk or in individual risk is sought. Here, some practical methods are presented that should facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear (1,2). Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form of modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in characterizing both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte Carlo-based procedures for the quantitative integration of uncertainty and variability in risk assessment
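
    The uncertainty/variability distinction is commonly operationalized as a nested, two-dimensional Monte Carlo: an outer loop over uncertain quantities and an inner loop over person-to-person variability (one of the procedures the paper's discrete calculus is meant to simplify). A sketch with invented potency and dose distributions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_unc, n_var = 200, 5_000        # outer: uncertainty; inner: variability

pop_risks = []
for _ in range(n_unc):
    # Uncertain (unknown but fixed) potency per unit dose
    potency = rng.lognormal(np.log(1e-3), 0.5)
    # Person-to-person variability in daily dose (mg/kg-day)
    doses = rng.lognormal(np.log(0.2), 0.8, size=n_var)
    pop_risks.append(np.mean(potency * doses))    # population-average risk

pop_risks = np.array(pop_risks)
print("median population risk:", np.median(pop_risks))
print("95% uncertainty interval:", np.percentile(pop_risks, [2.5, 97.5]))
```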

  11. Impact Analysis for Risks in Informatics Systems

    OpenAIRE

    Baicu, Floarea; Baches, Maria Alexandra

    2013-01-01

    This paper presents qualitative and quantitative methods for analyzing the impact of security incidents on informatics systems, starting from definitions of risk and information-system security. It describes the relationship between the risks of exploiting vulnerabilities of a security system, the security level of these informatics systems, and the probability that weak points will be exploited, together with the resulting financial losses of a company and the impact of a security incident on the company. Herewit...

  12. Safety analysis, risk assessment, and risk acceptance criteria

    International Nuclear Information System (INIS)

    Jamali, K.

    1997-01-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude

  13. Probabilistic risk analysis in chemical engineering

    International Nuclear Information System (INIS)

    Schmalz, F.

    1991-01-01

    In risk analysis in the chemical industry, recognising potential risks is considered more important than assessing their quantitative extent. Even in assessing risks, emphasis is not on the probability involved but on the possible extent. Qualitative assessment has proved valuable here. Probabilistic methods are used in individual cases where the wide implications make it essential to be able to assess the reliability of safety precautions. In this case, assessment therefore centres on the reliability of technical systems and not on the extent of a chemical risk. 7 figs

  14. Risk analysis and reliability

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1979-01-01

    Mathematical foundations of risk analysis are addressed. The importance of having the same probability space in order to compare different experiments is pointed out. Then the following topics are discussed: consequences as random variables with infinite expectations; the phenomenon of rare events; series-parallel systems and different kinds of randomness that could be imposed on such systems; and the problem of consensus of estimates of expert opinion

  15. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb₃Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for the pure elements.

  16. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  17. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand of health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in northern Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
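
    A minimal sketch of the core computation described above: simulate an annual aggregate loss distribution (Poisson claim frequency, lognormal claim severity) and read off its expected value and percentiles. Parameters are illustrative, not fitted to the Lodi claims:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 50_000

# Illustrative parameters (a real analysis would fit these to claims data)
lam = 25               # expected claims per year (Poisson frequency)
mu, sigma = 9.0, 1.2   # lognormal severity of a single claim (EUR)

annual_loss = np.array([
    rng.lognormal(mu, sigma, size=rng.poisson(lam)).sum()
    for _ in range(n_years)
])

print("expected annual loss:", annual_loss.mean())
print("95th / 99.5th percentile:", np.percentile(annual_loss, [95, 99.5]))
```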

  18. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  19. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code for coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces.

  20. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  1. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  2. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components

  3. Using MFM methodology to generate and define major accident scenarios for quantitative risk assessment studies

    DEFF Research Database (Denmark)

    Hua, Xinsheng; Wu, Zongzhi; Lind, Morten

    2017-01-01

    Generating and defining Major Accident Scenarios (MAS) is commonly agreed to be the key step for quantitative risk assessment (QRA). The aim of the study is to explore the feasibility of using the Multilevel Flow Modeling (MFM) methodology for formulating MAS. Traditionally this is usually done based ... to calculate the likelihood of each MAS. Combining the likelihood of each scenario with a qualitative risk matrix, each major accident scenario is thereby ranked for consideration for detailed consequence analysis. The methodology is successfully highlighted using part of the BMA process for production of hydrogen ...

  4. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  5. The watchdog role of risk analysis

    International Nuclear Information System (INIS)

    Reijen, G. van; Vinck, W.

    1983-01-01

    The reason why the risks of large-scale technology attract more attention lies in the fact that accidents would have more disastrous results, and in the fact that it is probably more attractive to study the risks of a few large projects than to do the same for a greater number of smaller projects. This presentation opens with some remarks on the role of the Commission of the European Community with regard to accident prevention. The development of the concept of quantitative risk is then dealt with; this development leads to a combined use of deterministic and probabilistic methods. The presentation concludes with some critical remarks on quantitative risk analysis and its use. (orig./HP)

  6. Characterizing health risks associated with recreational swimming at Taiwanese beaches by using quantitative microbial risk assessment.

    Science.gov (United States)

    Jang, Cheng-Shin; Liang, Ching-Ping

    2018-01-01

    Taiwan is surrounded by oceans, and therefore numerous pleasure beaches attract millions of tourists annually to participate in recreational swimming activities. However, impaired water quality because of fecal pollution poses a potential threat to the tourists' health. This study probabilistically characterized the health risks associated with recreational swimming engendered by waterborne enterococci at 13 Taiwanese beaches by using quantitative microbial risk assessment. First, data on enterococci concentrations at coastal beaches monitored by the Taiwan Environmental Protection Administration were reproduced using nonparametric Monte Carlo simulation (MCS). The ingestion volumes of recreational swimming, based on uniform and gamma distributions, were subsequently determined using MCS. Finally, after combining the distributions of the two parameters, the beta-Poisson dose-response function was employed to quantitatively estimate health risks to recreational swimmers. Moreover, various levels of risk to recreational swimmers were classified and spatially mapped to explore feasible recreational and environmental management strategies at the beaches. The study results revealed that although the health risks associated with recreational swimming did not exceed an acceptable benchmark of 0.019 illnesses daily at any beach, they approached this benchmark at certain beaches. Beaches with relatively high risks are located in northwestern Taiwan owing to the current movements.
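
    A minimal sketch of the exposure-and-dose-response chain described above. The beta-Poisson form is the one named in the abstract, but the parameter values and exposure distributions below are placeholders, not the study's fitted inputs:

```python
import numpy as np

def beta_poisson(dose, alpha=0.0157, beta=0.526):
    """Approximate beta-Poisson dose-response:
    P(ill) = 1 - (1 + dose/beta)**(-alpha).
    Parameter values here are placeholders, not the study's fit."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

rng = np.random.default_rng(3)
n = 100_000
# Illustrative swimmer exposure: enterococci conc. (CFU/100 mL) and ingestion (mL)
conc = rng.lognormal(np.log(35.0), 1.0, size=n)     # per 100 mL of water
volume = rng.uniform(10.0, 50.0, size=n)            # mL swallowed per swim
dose = conc / 100.0 * volume                        # CFU ingested

print(f"mean per-swim illness probability: {beta_poisson(dose).mean():.4f}")
```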

  7. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
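
    A minimal sketch of the ranking step: in FMEA-style scoring, each failure mode gets 1-10 ratings for probability, severity and difficulty of early detection, and modes are prioritized by their product; the QFMEA described above additionally weighs fatality potential and costs. All modes and scores below are invented:

```python
# Each tuple: (failure mode, probability P, severity S, detection difficulty D),
# each rated 1-10; risk priority number RPN = P * S * D.
modes = [
    ("caprock fracture / CO2 leak",       3, 9, 7),
    ("wellbore casing corrosion",         5, 7, 4),
    ("pipeline rupture",                  2, 8, 3),
    ("brine migration to aquifer",        4, 6, 8),
]

ranked = sorted(modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, p, s, d in ranked:
    print(f"RPN {p * s * d:4d}  {name}")
```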

  8. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  9. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
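
    A minimal stand-in for the rank-constrained idea, assuming nothing beyond the abstract: drawing uniform random compositions and sorting them descending respects a weight-fraction-ordered label. This is not the ExpoCast model, which also conditions on reporting thresholds and unreported mass:

```python
import numpy as np

def sample_weight_fractions(n_ingredients, n_draws=10_000, seed=0):
    """Sample compositions consistent with a weight-fraction-ordered label:
    draw uniform compositions (flat Dirichlet), sort descending, and read
    off the distribution of the fraction at each list rank."""
    rng = np.random.default_rng(seed)
    x = rng.dirichlet(np.ones(n_ingredients), size=n_draws)
    x.sort(axis=1)
    x = x[:, ::-1]                       # rank 1 = largest weight fraction
    return x.mean(axis=0), np.percentile(x, [5, 95], axis=0)

mean, (lo, hi) = sample_weight_fractions(6)
for rank, (m, l, h) in enumerate(zip(mean, lo, hi), start=1):
    print(f"rank {rank}: mean {m:.3f}  90% interval [{l:.3f}, {h:.3f}]")
```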

  10. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the previous April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  11. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the north-eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional-scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve specifically for our case study area, were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  12. Asset backed securities : risks, ratings and quantitative modelling

    NARCIS (Netherlands)

    Jönsson, B.H.B.; Schoutens, W.

    2009-01-01

    Asset backed securities (ABSs) are structured finance products backed by pools of assets and are created through a securitisation process. The risks in asset backed securities, such as credit risk, prepayment risk, market risk, operational risk, and legal risk, are directly connected with the

  13. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  14. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins ...

  15. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  16. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative

  17. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  18. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  19. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ... the residuals of a regression on centroid size produced.

  20. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent, as contents of milk, are high-priority criteria for financial aims and selection programs in dairy cattle.

  1. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331 nm for chloroquine. ...

  2. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  3. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity

  4. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF₆. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives. (1) Absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF. (2) Absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen. (3) Absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can be quantitatively analyzed via infrared methods.

  5. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    Science.gov (United States)

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first of a kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
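
    A minimal sketch of assembling risk triplets (scenario, likelihood, consequence) into a dose-frequency exceedance representation; all scenarios, frequencies and doses below are invented for illustration:

```python
# Each triplet: (scenario, annual frequency, radiation dose in mSv)
scenarios = [
    ("erosion gully exposes waste",       1e-3,   50.0),
    ("slope failure breaches trench",     2e-4,  200.0),
    ("intrusion during land development", 5e-5,  500.0),
    ("extreme flood disperses residue",   1e-5, 1000.0),
]

# Frequency of exceeding each dose level = sum of frequencies of all
# scenarios whose consequence is at least that severe
for level in sorted({d for _, _, d in scenarios}):
    freq = sum(f for _, f, d in scenarios if d >= level)
    print(f"frequency of dose >= {level:6.0f} mSv: {freq:.1e} /yr")
```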

  6. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard (Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  7. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  8. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis was performed for CO, NO, C₂H₂, C₆H₆ and so on; quantitative analysis was performed for C₃H₅N, C₁₀H₁₀, C₈H₈N₂ and so on. The method used in this article is feasible. The results show that the detonation composition in this study has a negative oxygen balance, and there were many pollutants in the detonation products. (authors)

  9. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations ... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow ...

  10. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    Science.gov (United States)

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and a risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment.
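
    A minimal sketch of the reassessment metric: the coefficient of variation (CV) of simulated coating mass per tablet, compared against an invented criticality cutoff:

```python
import numpy as np

def coating_cv(mass_per_tablet):
    """Coefficient of variation of coating mass, the uniformity measure
    used for the quantitative criticality reassessment."""
    m = np.asarray(mass_per_tablet, dtype=float)
    return m.std(ddof=1) / m.mean()

# Illustrative simulated coating masses (mg) for two parameter settings
slow_pan = [9.8, 10.3, 9.5, 10.9, 9.2, 10.6, 11.0, 9.0]
fast_pan = [10.1, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0]

for name, data in (("slow pan speed", slow_pan), ("fast pan speed", fast_pan)):
    cv = coating_cv(data)
    rating = "high" if cv > 0.06 else "low"   # invented criticality cutoff
    print(f"{name}: CV = {cv:.3f} -> {rating} criticality")
```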

  11. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    Science.gov (United States)

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to identify effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using data on the process parameters in poultry and the Salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with the @Risk software. The concentration of Salmonella on carcasses after chilling, as calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the Salmonella concentration after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making these the primary factors influencing the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control contamination of carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
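
    A minimal sketch of such a modular chain and the correlation-based sensitivity analysis, assuming hypothetical input distributions in place of the Jinan surveillance data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 10_000  # Monte Carlo iterations

# Hypothetical module inputs (log10 MPN/g); not the surveillance data themselves
after_defeathering = rng.normal(1.0, 0.5, n)   # contamination after defeathering
chill_pool = rng.normal(0.0, 0.4, n)           # Salmonella level in chilling pool
evisceration_add = rng.normal(0.2, 0.2, n)     # cross-contamination at evisceration
chill_reduction = rng.uniform(0.5, 1.5, n)     # log reduction during chilling

# Module-by-module transformation of the contamination level
after_chill = (after_defeathering + evisceration_add
               + 0.3 * chill_pool - chill_reduction)

print("median after chilling: %.2f MPN/g" % 10 ** np.median(after_chill))
for name, x in [("after defeathering", after_defeathering),
                ("chilling pool", chill_pool)]:
    rho, _ = spearmanr(x, after_chill)
    print(f"sensitivity ({name}): rho = {rho:.2f}")
```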

  12. Epidemiology and quantitation of environmental risk in humans from radiation and other agents

    International Nuclear Information System (INIS)

    Castellani, Amleto

    1985-01-01

    The identification and quantitation of environmental risk in humans is one of the main problems to be solved in order to improve the protection of individuals and of human populations against physical and chemical pollutants. Epidemiology plays a central role in the evaluation of health risk directly in human populations. This volume collects 33 lectures presented at the ASI course on 'Epidemiology and quantitation of environmental risk in humans from radiation and other agents: potential and limitations', sponsored by NATO and the Italian Association of Radiobiology and organized by ENEA. The course was devoted to a number of aspects of environmental risk analysis and evaluation based on epidemiological investigation. Basic epidemiological concepts and methods were reviewed. Fundamentals of dosimetry and microdosimetry were presented in relation to the contribution of epidemiology in defining the dose-effect relationships for radiation carcinogenesis and their relation with age, sex and ethnicity. The mechanisms of carcinogenesis as a multi-stage process were illustrated. One of the main topics was cancer epidemiology and its correlation with: occupational and non-occupational exposure to radiation; diagnostic and therapeutic irradiation; cancer proneness; hereditary and familial diseases; abnormal response to carcinogens; environmental pollution of air and water; exposure to radon in mines and in building materials; atomic bomb explosions; chemotherapy; and dioxin and related compounds.

  13. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  14. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exhaust the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  15. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  16. A quantitative approach for integrating multiple lines of evidence for the evaluation of environmental health risks

    Directory of Open Access Journals (Sweden)

    Jerome J. Schleier III

    2015-01-01

    Decision analysis often considers multiple lines of evidence during the decision-making process. Researchers and government agencies have advocated for quantitative weight-of-evidence approaches in which multiple lines of evidence can be considered when estimating risk. We therefore utilized Bayesian Markov chain Monte Carlo to integrate several human-health risk assessment, biomonitoring, and epidemiology studies that have been conducted for two common insecticides (malathion and permethrin) used for adult mosquito management, to generate an overall estimate of the risk quotient (RQ). The utility of Bayesian inference for risk management is that the estimated risk represents a probability distribution from which the probability of exceeding a threshold can be estimated. The mean RQs after all studies were incorporated were 0.4386 (variance 0.0163) for malathion and 0.3281 (variance 0.0083) for permethrin. After taking into account all of the evidence available on the risks of ULV insecticides, the probability that malathion or permethrin would exceed a level of concern was less than 0.0001. Bayesian estimates can substantially improve decisions by allowing decision makers to estimate the probability that a risk will exceed a level of concern by considering seemingly disparate lines of evidence.
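
    The exceedance probability reported above can be illustrated with a simple Monte Carlo readout of the posterior summaries; a normal posterior shape and an RQ threshold of 1 are assumptions here, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Posterior summaries reported in the abstract (risk quotient, RQ)
posteriors = {"malathion": (0.4386, 0.0163), "permethrin": (0.3281, 0.0083)}
level_of_concern = 1.0   # assumed threshold (RQ >= 1), conventional in risk assessment

for chem, (mean, var) in posteriors.items():
    # Normal approximation of the posterior; the paper's MCMC posterior
    # need not be normal, so this is illustrative only.
    draws = rng.normal(mean, np.sqrt(var), 1_000_000)
    p_exceed = (draws > level_of_concern).mean()
    print(f"{chem}: P(RQ > {level_of_concern}) = {p_exceed:.2e}")
```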

  17. Comparison of Qualitative and Quantitative Risk Results for Shutdown Operation

    International Nuclear Information System (INIS)

    Oh, Hae Cheol; Kim, Myung Ki; Chung, Bag Soon; Seo, Mi Ro; Hong, Sung Yull

    2006-01-01

    The Defense-In-Depth philosophy is a fundamental concept of nuclear safety. The objective of Defense-In-Depth (DID) evaluation is to assess the level of Defense-In-Depth maintained during the various plant maintenance activities. Especially for shutdown and outage operations, Defense-In-Depth might be challenged due to the reduction in redundancy and diversity resulting from the maintenance. A qualitative defense-in-depth evaluation using deterministic trees, such as the SFAT (Safety Function Assessment Tree), can provide safety-related information on the levels of defense-in-depth according to the plant configuration, including the levels of redundancy and diversity. For a more reasonable color decision of the SFAT, it is necessary to identify the risk impact of degraded redundancy and diversity of mitigation systems. The probabilistic safety analysis for the shutdown state can provide risk information on the degradation of the redundancy and diversity levels of the safety functions during an outage. Insights from the two methods for the same plant status can be the same or different. The results of the DID approach and the PSA for the shutdown state are compared in this paper.

  18. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
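
    A hedged illustration of the reward idea: if a probabilistic workflow is reduced to an absorbing Markov chain with per-state rewards, expected drug usage per patient follows from one linear solve. The workflow, probabilities and rewards below are invented, and this is plain Python rather than the authors' BPMN/model-checking toolchain:

```python
import numpy as np

# Toy medical workflow as an absorbing Markov chain (states 0..3 transient,
# state 4 = "treatment complete"). Entirely illustrative numbers.
P = np.array([
    [0.0, 0.9, 0.1, 0.0, 0.0],   # triage -> examine / retest
    [0.0, 0.0, 0.7, 0.2, 0.1],   # examine -> retest / prescribe / done
    [0.0, 0.3, 0.0, 0.7, 0.0],   # retest -> examine / prescribe
    [0.0, 0.0, 0.0, 0.0, 1.0],   # prescribe -> done
    [0.0, 0.0, 0.0, 0.0, 1.0],   # done (absorbing)
])
reward = np.array([0.0, 0.0, 0.0, 1.0, 0.0])  # doses dispensed per visit to each state

Q = P[:4, :4]                  # transitions among transient states
r = reward[:4]
# Expected total reward accumulated from each start state: v = (I - Q)^(-1) r
v = np.linalg.solve(np.eye(4) - Q, r)
print("expected doses per patient entering triage:", round(v[0], 3))
```

    Multiplying the per-patient expectation by forecast patient arrivals gives a stock estimate, which is the provisioning use case the abstract mentions.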

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Background: The opportunity offered by whole-slide scanners for automated histological analysis implies an ever-increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole-slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  20. International Conference on Risk Analysis

    CERN Document Server

    Oliveira, Teresa; Rigas, Alexandros; Gulati, Sneh

    2015-01-01

    This book covers the latest results in the field of risk analysis. Presented topics include probabilistic models in cancer research, models and methods in longevity, epidemiology of cancer risk, engineering reliability and economic risk problems. The contributions of this volume originate from the 5th International Conference on Risk Analysis (ICRA 5). The conference brought together researchers and practitioners working in the field of risk analysis in order to present new theoretical and computational methods with applications in biology, environmental sciences, public health, economics and finance.

  1. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  2. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  3. Quantitative analysis of deuterium by gas chromatography

    International Nuclear Information System (INIS)

    Isomura, Shohei; Kaetsu, Hayato

    1981-01-01

    An analytical method for the determination of deuterium concentration in water and hydrogen gas by gas chromatography is described. HD and D2 in a hydrogen gas sample were separated from H2 by a column packed with Molecular Sieve 13X, using extra-pure hydrogen gas as carrier. A thermal conductivity detector was used. Concentrations of deuterium were determined by comparison with standard samples. The error inherent in the present method was less than 1% on the basis of the calibration curves obtained with the standard samples. The average time required for the analysis was about 3 minutes. (author)

  4. Influence of corrosion layers on quantitative analysis

    International Nuclear Information System (INIS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Roehrich, J.; Strub, E.

    2005-01-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed

  5. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    Science.gov (United States)

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69). Operator agreement for the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was associated with increased risk of breast cancer for both operators (OR = 4.0, 95% confidence interval (CI) 1.6-10.1, and 2.4, 95% CI 1.2-4.7). Quantitative
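
    A small sketch of the region-of-interest ratio described above, on synthetic data; the mask placement and uptake factor are illustrative assumptions:

```python
import numpy as np

def quantitative_bpu(mbi_image, fibroglandular_mask, fat_mask):
    """Unitless BPU: mean counts/pixel in fibroglandular tissue divided by
    mean counts/pixel in fat (ROIs drawn with the mammogram as anatomical
    reference)."""
    return mbi_image[fibroglandular_mask].mean() / mbi_image[fat_mask].mean()

# Synthetic 128x128 "MBI image" with a brighter fibroglandular region
rng = np.random.default_rng(3)
img = rng.poisson(40, (128, 128)).astype(float)
fg_mask = np.zeros(img.shape, dtype=bool); fg_mask[30:70, 30:70] = True
fat_mask = np.zeros(img.shape, dtype=bool); fat_mask[90:120, 90:120] = True
img[fg_mask] *= 1.4                     # simulate higher sestamibi uptake
print("quantitative BPU:", round(quantitative_bpu(img, fg_mask, fat_mask), 2))
```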

  6. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  7. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experimental results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.

  8. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface...

  9. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution-fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporate the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burdens through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  10. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap...... and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time...... of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview......

  11. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies accomplishing complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until they reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material.
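
    A toy genetic algorithm in the same spirit, fitting assumed efficiency-curve parameters to synthetic detector data; the functional form, GA settings and parameter values are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
E = np.linspace(2.0, 20.0, 30)            # X-ray energies (keV)
true = (0.9, -0.4, 3.0)                   # hidden "true" parameters

def efficiency(p, E):
    a, b, c = p
    return a * E**b * np.exp(-c / E)      # illustrative functional form

target = efficiency(true, E) * rng.normal(1, 0.02, E.size)  # noisy "measurement"

def fitness(p):
    return -np.mean((efficiency(p, E) - target) ** 2)

# Initial population of candidate parameter triples within search bounds
pop = rng.uniform([0.1, -1.0, 0.5], [2.0, 0.0, 6.0], (60, 3))
for gen in range(200):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[np.argsort(scores)[-20:]]            # selection: keep the fittest
    # Recombination: average two random parents; mutation: Gaussian noise
    idx = rng.integers(0, 20, (40, 2))
    children = elite[idx].mean(axis=1) + rng.normal(0, 0.05, (40, 3))
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("recovered parameters:", np.round(best, 2), "true:", true)
```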

  12. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    Science.gov (United States)

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the assumptions applied, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may still be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
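
    A stochastic sketch of the border-control calculation, with hypothetical input distributions chosen only to land in the same order of magnitude as the reported per-animal risk:

```python
import numpy as np

rng = np.random.default_rng(11)
n_iter = 100_000

# Hypothetical inputs, not the study's fitted distributions
n_imports = rng.poisson(4000, n_iter)        # cats/dogs legally imported per year
prevalence = rng.beta(1, 20000, n_iter)      # rabies prevalence in source population
p_vaccine_protects = 0.95                    # effective pre-export vaccination
p_quarantine_detects = 0.98                  # quarantine plus waiting period

# Per-animal probability of a rabid animal slipping through all barriers
p_slip = prevalence * (1 - p_vaccine_protects) * (1 - p_quarantine_detects)
rabid_entries = rng.binomial(n_imports, np.minimum(p_slip, 1.0))

print("median per-animal risk: %.2e" % np.median(p_slip))
print("P(at least one rabid entry per year): %.4f" % (rabid_entries > 0).mean())
```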

  13. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon-in-plutonium standard is described; the carbon content of this standard is then determined and its validity shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface of the samples in contact with sulphur, temperature, and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. With this standard available, we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and, finally, method of cleaning the plutonium samples. [fr]

  14. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Aim of study: To evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were the economic value of forestry resources, Contraction Factor analysis and the definition of the extinction cost function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, the economic value of the property to be protected and the speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in the containment of forest fires and the potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  15. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    Science.gov (United States)

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e. limited granularity and limited inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing prediction of fall risk. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
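
    A minimal sketch of the final classification step, assuming synthetic Kinect-derived features and dichotomized clinical scores; scikit-learn's random forest stands in for whichever supervised classifier the authors used:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Each row: features extracted from Kinect joint trajectories during one
# balance exercise (e.g. sway area, mean velocity, task duration, ...);
# labels are dichotomized clinical scale scores. Synthetic data only.
n = 200
X = rng.normal(0, 1, (n, 6))
y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 1, n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print("cross-validated balanced accuracy: %.2f" % scores.mean())
```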

  16. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    Science.gov (United States)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural-disaster impacts is increasing as climatic and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects a complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to quantitative analysis of human behavior and risk perception. Within this framework of risk analysis models, risk perception issues were described. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As shown, risk perception essentially influences the damage function. Based on prospect theory and decision making under uncertainty with cognitive bias in the handling of risk, a modification of the damage function is proposed: p(x,y | α(t)). The modified damage function includes an awareness function α(t), which combines a risk perception function rp and a function of education and long-term experience c, as α(t) → (c − rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
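
    Read literally, these formulas suggest a direct computation; the sketch below assumes a particular (invented) form for how awareness scales the damage function, since the abstract does not specify one:

```python
import numpy as np

rng = np.random.default_rng(4)
grid = (50, 50)

f = rng.dirichlet(np.ones(grid[0] * grid[1])).reshape(grid)  # hazard distribution f(x,y)
p = rng.uniform(0, 100, grid)                                # baseline damage p(x,y)
delta = 1.0                                                   # scale factor

def awareness(c, rp):
    """alpha(t) -> (c - rp): education/experience c offset by perception bias rp."""
    return np.clip(c - rp, 0.0, 1.0)

def risk(f, p, alpha):
    # Assumed modification: awareness scales damage down,
    # p(x,y | alpha) = p(x,y) * (1 - alpha)
    return delta * np.sum(f * p * (1.0 - alpha))

print("risk, low awareness :", round(risk(f, p, awareness(0.2, 0.1)), 2))
print("risk, high awareness:", round(risk(f, p, awareness(0.8, 0.1)), 2))
```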

  17. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Bone is a hard tissue, and it has long been difficult to observe the interior of living bone tissue. With the progress of microscopy and fluorescent-probe technology in recent years, it has become possible to observe the various activities of the many cell types that make up bone. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection. A methodology for processing microscopic images and analyzing the data has therefore been awaited. In this article, we introduce the research field of bioimage informatics, the boundary area between biology and information science, and then outline basic image-processing technology for quantitative analysis of live-imaging data of bone.

  18. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
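
    The mathematical combination step can be illustrated as a root-sum-of-squares of relative uncertainties; the component values below are placeholders, not the paper's estimates:

```python
import math

# Relative standard uncertainties (%) of dominant components of the kind
# identified in the study; placeholder numbers for illustration only.
components = {
    "microorganism type": 15.0,
    "pharmaceutical product": 12.0,
    "reading/interpreting error": 10.0,
}

# Root-sum-of-squares combination of independent relative uncertainties
combined = math.sqrt(sum(u ** 2 for u in components.values()))
expanded = 2 * combined     # coverage factor k = 2 (~95% confidence)
print(f"combined relative uncertainty: {combined:.1f} %")
print(f"expanded (k=2): {expanded:.1f} %")
```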

  19. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    Science.gov (United States)

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) were in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs (the magnitude of which may vary across cultures) constitutes a mechanism by which background genetic liability

  20. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in a sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
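
    As one example of such a characteristic, a Zipf's law coefficient over n-grams can be computed in a few lines; this is a generic illustration with a made-up sequence, not the Quantiprot API itself:

```python
from collections import Counter
import numpy as np

def zipf_coefficient(seq, n=2):
    """Slope of log(frequency) vs log(rank) for the n-grams of a sequence;
    a generic illustration of one Quantiprot-style characteristic."""
    grams = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    freqs = np.sort(np.array(list(grams.values()), dtype=float))[::-1]
    ranks = np.arange(1, freqs.size + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

protein = "MKVLAAGLLALLAASATSMKVLWAALLVTFLAGCQA"  # made-up sequence
print("Zipf coefficient (2-grams):", round(zipf_coefficient(protein), 3))
```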

  1. Country risk analysis

    International Nuclear Information System (INIS)

    David, A.

    1992-01-01

    This paper reports that the oil industry has been an internationally based industry heavily dependent on outside financing sources. Historically, financing came from investment houses that, in most cases, participated in the projects as equity investors. However, investment companies can no longer satisfy the capital requirements of the current high level of exploration and development activities. The current trend is to involve commercial banks on a purely lending basis. Commercial banks, by their nature, are risk-averse. In the case of oil and gas exploration and production, they are asked to take on not only technical risk and price risk but geopolitical risk as well. Methods have been developed by commercial banks to reduce technical and price risks to a point that enables them to be comfortable with a loan. However, geopolitical risks are more difficult to assess. The risks associated with many countries are nationalization of the investment, new tax restrictions, restriction of currency movements, and/or revisions to production-sharing agreements.

  2. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    OpenAIRE

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, d...

  3. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired with scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundations of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  4. Risk analysis of Odelouca cofferdam

    OpenAIRE

    Pimenta, L.; Caldeira, L.; Maranha das Neves, E.

    2009-01-01

    In this paper we present the risk analysis of Odelouca Cofferdam, using an event tree analysis. The initiating events, failure modes and analysed limit states are discussed based on an influence diagram. The constructed event trees and their interpretation are presented. The obtained risk values are represented in an FN plot superimposed on the acceptability and tolerability risk limits proposed for Portuguese dams. Initially, particular emphasis is placed on the main characteristic...

  5. Quantitative risk assessment of human salmonellosis in the smallholder pig value chains in urban of Vietnam.

    Science.gov (United States)

    Dang-Xuan, Sinh; Nguyen-Viet, Hung; Unger, Fred; Pham-Duc, Phuc; Grace, Delia; Tran-Thi, Ngan; Barot, Max; Pham-Thi, Ngoc; Makita, Kohei

    2017-02-01

    To quantify salmonellosis risk in humans through consumption of boiled pork in urban Hung Yen Province, Vietnam, a quantitative microbial risk assessment was used. We collected 302 samples along the pork value chain in Hung Yen between April 2014 and February 2015. We developed a model in @Risk, based on microbiological, market and household surveys of cooking, cross-contamination and consumption, and conducted a sensitivity analysis. Salmonella prevalence in pen floor swabs, slaughterhouse carcasses and cut pork was 33.3, 41.7 and 44.4%, respectively. The annual incidence rate of salmonellosis in humans was estimated to be 17.7% (90% CI 0.89-45.96). The parameters with the greatest influence on risk were household pork handling practices, followed by prevalence in pork sold in the central market. The wide confidence interval in the incidence estimate was mainly due to variability in the degree of reduction of bacterial concentration by cooking and in pork consumption patterns. The risk of salmonellosis in humans due to boiled pork consumption appears to be high. Control measures may include improving the safety of retailed pork and improving household hygiene.
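
    A compact sketch of how such an @Risk-style model can be reproduced in code, assuming hypothetical input distributions and the widely cited FAO/WHO beta-Poisson parameters for Salmonella (check these against the paper before any reuse):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 50_000   # simulated servings

# Hypothetical chain: retail contamination -> cooking -> cross-contamination
log_conc = rng.normal(1.0, 1.0, n)            # log10 CFU/g on raw pork
cook_reduction = rng.uniform(3.0, 7.0, n)     # log10 reduction by boiling
cross_contam = rng.uniform(0.0, 0.002, n)     # fraction transferred raw -> cooked
serving_g = 100.0

dose = serving_g * (10 ** (log_conc - cook_reduction)
                    + cross_contam * 10 ** log_conc)

# Beta-Poisson dose-response for non-typhoidal Salmonella
# (FAO/WHO parameters alpha = 0.1324, beta = 51.45)
alpha, beta = 0.1324, 51.45
p_ill = 1 - (1 + dose / beta) ** -alpha

servings_per_year = 100   # assumed consumption frequency
annual_risk = 1 - (1 - p_ill) ** servings_per_year
print("mean annual risk of salmonellosis: %.3f" % annual_risk.mean())
```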

  6. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 PIXE subjects in various research fields are pursued there, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a 'standard-free method' for quantitative analysis made it possible to analyze infinitesimal samples, powdered samples and untreated biological samples, which could not be analyzed quantitatively well in the past. The 'standard-free method' and a 'powdered internal standard method' made the target preparation process much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity arising from complicated target preparation processes. (author)

  7. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. Quantitative analysis of the constructed scenario is studied in terms of annual effective dose equivalent. The study proceeds sequentially through the performance assessment of the radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, dose and pathways. The program module chain VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. The detailed data used in these modules come from experimental data for Korean territory and from default data given within the modules. In case of missing data for code execution, values are estimated through reasonable engineering sense.

  8. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  9. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    Science.gov (United States)

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
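
    The overestimation effect is easy to reproduce: under a drastic inactivation step followed by growth, a concentration model keeps a fractional level in every unit, while an integer model sterilizes most units outright. The numbers below are illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(6)
n_units = 100_000
initial = 100               # bacteria per contaminated unit
survival = 1e-3             # drastic inactivation step
growth = 1e4                # subsequent growth factor
r = 1e-4                    # exponential dose-response parameter

# Approach 1: concentrations -- every unit retains a fractional level
dose_conc = np.full(n_units, initial * survival * growth)

# Approach 2: integer numbers -- most units end up with zero survivors
survivors = rng.binomial(initial, survival, n_units)
dose_num = survivors * growth

risk = lambda dose: (1 - np.exp(-r * dose)).mean()
print("mean risk (concentration model): %.4f" % risk(dose_conc))
print("mean risk (number model)       : %.4f" % risk(dose_num))
```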

  10. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted on the basis of GRAVY score), corresponding to 30% of the total

  11. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    The dataset presented in this work was obtained using label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in quantitative analysis associated with the different sampling strategies to be evaluated, and a proper number of replicates for future quantitative analyses to be defined. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  12. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  13. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Science.gov (United States)

    2011-12-13

    ... psychology'' (section 3507(b), Pub. L. 111-148, 124 Stat. 530), and to consult manufacturers and consumers... communication of quantitative benefit and risk information. FDA is making available the literature review report...

  14. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time behavior.

  15. A quantitative framework for estimating risk of collision between marine mammals and boats

    Science.gov (United States)

    Martin, Julien; Sabatier, Quentin; Gowan, Timothy A.; Giraud, Christophe; Gurarie, Eliezer; Calleson, Scott; Ortega-Ortiz, Joel G.; Deutsch, Charles J.; Rycyk, Athena; Koslovsky, Stacie M.

    2016-01-01

    Speed regulations of watercraft in protected areas are designed to reduce lethal collisions with wildlife but can have economic consequences. We present a quantitative framework for investigating the risk of deadly collisions between boats and wildlife.

  16. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  17. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  18. RISK ANALYSIS IN INFORMATION TECHNOLOGY AND COMMUNICATION OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Edmir Parada Vasques Prado

    2011-12-01

    This research aims at evaluating the risk analysis process in Information Technology and Communication (ICT) outsourcing conducted by private-sector organizations. The research is a descriptive, quantitative, cross-sectional study that used the survey method. Data were collected through a questionnaire; the sample is non-random, drawn by a convenience sampling process. The research contributes to understanding the risk analysis process in ICT services outsourcing, and identified statistically significant relationships between risk analysis and the organization's size and industry, and between risk analysis and the diversity of outsourced services.

  19. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  1. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  2. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Science.gov (United States)

    2012-07-17

    ... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...] Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A Technical Workshop...

  3. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    Science.gov (United States)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and numbers of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared to the historical damage reports.

  4. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

    Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis has been given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application especially in the USA. (orig.)

  5. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    OpenAIRE

    Upasna Saluja; Norbik Bashah Idris

    2012-01-01

    Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm most organizations end up choosing the more flexible, easier to deploy and customize qualitative models of risk assessment. In practice one sees that such models often call upon the practitioners to make qualitative judgments on a relative rating scale which brings in consid...

  6. Introduction of the risk analysis

    International Nuclear Information System (INIS)

    Campon, G.; Martinez, I.

    2013-01-01

    An introduction to risk analysis was given in this presentation, whose main topics were: food safety; the global, regional and national food context; the change of paradigms; the definition of health; risk; standardization; the role of the food chain; trade agreements; the Codex Alimentarius; and the cost impact of food-borne diseases.

  7. Hydroproject risk analysis

    International Nuclear Information System (INIS)

    Murdock, R.V.; Gulliver, J.S.

    1991-01-01

    Traditionally, economic feasibility studies performed for potential hydropower plant sites have included either no uncertainty or at best an ad hoc value associated with estimated benefits. However, formal methods for analyzing uncertainty do exist and have been outlined in the past. An application of these methods is demonstrated through conversion of a hydropower survey program, HYFEAS, to run on LOTUS 1-2-3, using the add-in software package RISK. In this paper the program's principles are outlined and a case study of its application to a hydropower site is presented.
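
    The core idea, propagating uncertain benefits through an economic feasibility calculation by Monte Carlo sampling rather than using point estimates, can be sketched as follows. All distributions and figures are hypothetical; the original used a LOTUS 1-2-3 spreadsheet with the RISK add-in rather than Python.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000          # Monte Carlo trials
years = 30           # project life
rate = 0.07          # discount rate

# Uncertain inputs (all values hypothetical): capital cost, annual energy
# and energy price are sampled instead of being treated as point estimates.
capital = rng.triangular(18e6, 20e6, 25e6, n)          # USD
energy  = rng.normal(40_000, 5_000, n).clip(min=0)     # MWh/yr
price   = rng.uniform(45, 65, n)                       # USD/MWh

pv_factor = (1 - (1 + rate) ** -years) / rate          # annuity factor
npv = energy * price * pv_factor - capital

print(f"mean NPV: {npv.mean() / 1e6:.1f} M USD")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")
```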

  8. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  9. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for analysis of both minor and major elements. As encountered in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one among them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelemental effects, if any, are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the given certified values. (author)
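
    A minimal sketch of the regression step described above: known concentrations from certified standards are regressed on the measured intensities of all elements, so the fitted cross-element coefficients absorb the matrix effects. The data and model form are hypothetical simplifications, not the paper's exact formulation.

```python
import numpy as np

# Measured secondary X-ray intensities for 3 elements in 6 standards
# (arbitrary units, hypothetical data), one row per standard.
I = np.array([
    [120.0, 35.0, 10.0],
    [ 95.0, 55.0, 12.0],
    [ 80.0, 70.0,  9.0],
    [110.0, 40.0, 20.0],
    [ 60.0, 90.0, 15.0],
    [ 75.0, 65.0, 25.0],
])
# Certified concentration (wt%) of the analyte in each standard.
c = np.array([18.2, 14.9, 12.8, 17.0, 10.1, 12.3])

# Empirical model: c = b0 + b1*I_analyte + b2*I_2 + b3*I_3, where the
# cross terms absorb interelemental (matrix) effects.
X = np.column_stack([np.ones(len(c)), I])
coef, *_ = np.linalg.lstsq(X, c, rcond=None)
print("correction coefficients:", coef.round(4))
print("predicted wt%:", (X @ coef).round(2))
```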

  10. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    Directory of Open Access Journals (Sweden)

    Heeyoung Lee

    2016-08-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. The hazard posed by C. perfringens on cheese was identified through the literature, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be a low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distributions (α1 = 1, α2 = 91; α1 = 1, α2 = 309) × uniform distributions (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed in the exposure assessment under simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that the probability of C. perfringens foodborne illness from cheese consumption is very low.
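
    The simulation chain described (initial contamination → serving size → ingested dose → exponential dose-response) can be sketched in a few lines. The distributions below are simplified stand-ins loosely based on the figures quoted in the abstract, not a reproduction of the authors' @RISK model, so the output will differ from their reported estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Initial contamination of natural cheese (CFU/g), below the detection
# limit: beta(1, 91) x uniform(0, 2) as quoted in the abstract (a
# simplified reading; the mean is of the order of the reported Log CFU/g).
conc = rng.beta(1, 91, n) * rng.uniform(0, 2, n)

# Daily consumption (g): normal with the reported mean and spread for
# natural cheese, truncated at zero.
serving = rng.normal(12.40, 19.43, n).clip(min=0)

dose = conc * serving                # ingested CFU per person per day
r = 1.82e-11                         # exponential dose-response parameter
p_ill = 1 - np.exp(-r * dose)        # P(illness | dose)

print(f"mean P(illness)/person/day: {p_ill.mean():.2e}")
```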

  11. ‘Omics’ technologies in quantitative microbial risk assessment

    NARCIS (Netherlands)

    Brul, S.; Basset, J.; Cook, P.; Kathariou, S.; McClure, P.; Jasti, P.R.; Betts, R.

    2012-01-01

    ‘Omics’ tools are being developed at an ever increasing pace. Collectively, genome sequencing, genome-wide transcriptional analysis (transcriptomics), proteomics, metabolomics, flux analysis (‘fluxomics’) and other applications are captured under the term omics. The data generated using these tools

  12. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  13. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial
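
    Receiver-operating characteristic analysis of the kind reported here can be illustrated with a short sketch on synthetic data; the flow values and disease labels below are invented, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)

# Synthetic stand-in labels: 1 = significant stenosis on quantitative
# angiography, 0 = none (mimicking the reference standard; data invented).
ischemia = rng.integers(0, 2, 128)
# Quantitative stress myocardial blood flow (mL/g/min): lower flow in
# ischemic territories, drawn from overlapping normal distributions.
mbf = np.where(ischemia == 1,
               rng.normal(1.5, 0.5, 128),
               rng.normal(2.6, 0.6, 128))

# Low flow indicates disease, so the diagnostic score is the negated flow.
auc = roc_auc_score(ischemia, -mbf)
fpr, tpr, thresholds = roc_curve(ischemia, -mbf)
# Youden index picks the threshold maximising sensitivity + specificity.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}, sens = {tpr[best]:.1%}, spec = {1 - fpr[best]:.1%}")
```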

  14. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and Maxquant output are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allows a proper number of replicates to be defined for future quantitative analyses.

  15. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Chemical analysis, mostly performed by X-ray fluorescence (XRF), therefore forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as microscopic point counting are well known. With XRD and Rietveld analysis, a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made an integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software such as complexity, low performance and severe numerical instability were prohibitive for automated use. The latest developments of on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction.

  16. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition used a commercially available whole slide scanner and an image-based quantitation algorithm.

  17. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  18. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  19. Analysis and management of risks experienced in tunnel construction

    Directory of Open Access Journals (Sweden)

    Cagatay Pamukcu

    2015-12-01

    In this study, first of all, the definitions of "risk", "risk analysis", "risk assessment" and "risk management" were given to avoid any confusion about these terms, and the significance of risk analysis and management in engineering projects was emphasized. Both qualitative and quantitative risk analysis techniques were then outlined and, within the scope of the study, the Event Tree Analysis method was selected to analyze the risks of TBM (Tunnel Boring Machine) operations in tunnel construction. After all hazards that would be encountered during tunnel construction by the TBM method had been investigated, those hazards were subjected to a Preliminary Hazard Analysis to sort out and prioritize the risks with high scores. When the risk scores were taken into consideration, it was seen that the hazards with high risk scores could be classified into 4 groups: excavation- and support-induced accidents, accidents stemming from geologic conditions, auxiliary works, and the project contract. For these four groups of initiating events, Event Tree Analysis was conducted, considering 4 countermeasures separately. Finally, the quantitative and qualitative consequences of the Event Tree Analyses undertaken for all initiating events were investigated and interpreted together, by making comparisons and referring to previous studies.
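
    The arithmetic of an event tree is simple: an initiating-event frequency is multiplied along each branch by the success or failure probabilities of the countermeasures, giving a frequency for each outcome. A minimal sketch with invented TBM-related numbers:

```python
# Minimal event tree sketch (all numbers hypothetical): the initiating
# event frequency is multiplied along each branch of success/failure
# probabilities of the countermeasures to give outcome frequencies.

initiating_freq = 0.05   # e.g. face collapse events per km of TBM drive

# Each countermeasure either works (mitigated branch) or fails.
p_support_ok = 0.90      # ground support holds
p_warning_ok = 0.80      # monitoring gives early warning if support fails

outcomes = {
    "no damage":      initiating_freq * p_support_ok,
    "minor damage":   initiating_freq * (1 - p_support_ok) * p_warning_ok,
    "major accident": initiating_freq * (1 - p_support_ok) * (1 - p_warning_ok),
}

for name, freq in outcomes.items():
    print(f"{name:14s}: {freq:.4f} per km")
# Branch frequencies must sum back to the initiating event frequency.
assert abs(sum(outcomes.values()) - initiating_freq) < 1e-12
```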

  20. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  1. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
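
    The symmetric and asymmetric normal modes studied in these experiments follow from the eigenvalues of the stiffness matrix. A minimal sketch for two identical oscillators joined by a coupling spring (all parameters hypothetical):

```python
import numpy as np

# Two equal masses m, each attached to a wall by a spring k and to each
# other by a coupling spring kc (parameters hypothetical).
m, k, kc = 0.1, 25.0, 5.0   # kg, N/m, N/m

# Equations of motion x'' = -(K/m) x with stiffness matrix K.
K = np.array([[k + kc, -kc],
              [-kc, k + kc]])
eigvals, eigvecs = np.linalg.eigh(K / m)
freqs = np.sqrt(eigvals) / (2 * np.pi)

# Symmetric (in-phase) mode at sqrt(k/m); asymmetric at sqrt((k + 2*kc)/m).
print("normal mode frequencies (Hz):", freqs.round(3))
print("mode shapes (columns):\n", eigvecs.round(3))
```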

  2. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out, limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent, relative to the failure reports and work order sheets at the reactor sites together with SKI's ''Safety Related Occurrences''. In general there has been an improvement compared to previous years. (Auth.)

  3. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  4. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  5. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator and that stochastic methods should therefore be introduced. The new risk analysis methods should consider the uncertainties in input values. We show how large the impact on the results of the analysis can be by solving a practical example of FMECA with uncertainties modelled using Monte Carlo sampling.
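
    A minimal sketch of the idea: in a classical FMECA the risk priority number is RPN = severity × occurrence × detection; here each rating is sampled from a discrete uniform range instead of being a single expert score, so Monte Carlo sampling turns the RPN into a distribution. All ranges below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Classical FMECA: RPN = S * O * D. Instead of single expert scores, each
# rating is an uncertain range (values hypothetical); rng.integers has an
# exclusive upper bound, so (6, 9) samples the range 6-8.
severity   = rng.integers(6, 9, n)   # expert range 6-8
occurrence = rng.integers(3, 7, n)   # expert range 3-6
detection  = rng.integers(2, 6, n)   # expert range 2-5

rpn = severity * occurrence * detection
print(f"RPN mean = {rpn.mean():.0f}")
print(f"RPN 5th-95th percentile = "
      f"{np.percentile(rpn, 5):.0f}-{np.percentile(rpn, 95):.0f}")
print(f"P(RPN > 100) = {(rpn > 100).mean():.1%}")
```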

  6. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, passing through their origins, using the least squares method.
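
    The slope fits are ordinary least squares constrained through the origin, with the weight fraction given by the ratio of the two slopes. A sketch with invented intensity-ratio data:

```python
import numpy as np

# Ratio-of-slopes sketch (data hypothetical): the intensity ratio
# I_analyte/I_standard plotted against the added analyte fraction gives
# the "analysis line"; the same plot for a pure reference gives the
# "reference line". Both pass through the origin, and the weight fraction
# follows from the ratio of the two slopes.
x_analysis = np.array([0.05, 0.10, 0.15, 0.20])      # added analyte fraction
y_analysis = np.array([0.021, 0.043, 0.062, 0.085])  # intensity ratio

x_reference = np.array([0.05, 0.10, 0.15, 0.20])
y_reference = np.array([0.060, 0.118, 0.181, 0.238])

def slope_through_origin(x, y):
    # Least-squares slope of y = b*x (no intercept): b = sum(xy)/sum(x^2).
    return np.sum(x * y) / np.sum(x * x)

b_analysis = slope_through_origin(x_analysis, y_analysis)
b_reference = slope_through_origin(x_reference, y_reference)
print(f"estimated weight fraction ~ {b_analysis / b_reference:.3f}")
```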

  7. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guess work and uncertainty from the vulnerability and risk assessment activities of web based applications while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the research done by the OWASP Foundation on this subject but their risk rating methodology needed de-bugging and updates in different in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that can’t be determined without guess work and it was tested in vulnerability assessment activities on real production systems and in theory by assigning discrete uniform assumptions to all risk charac-teristics (risk attributes and evaluate the results after 1.5 million rounds of Monte Carlo simu-lations.

  8. QUANTITATIVE ANALYSIS OF FLUX REGULATION THROUGH HIERARCHICAL REGULATION ANALYSIS

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of V(max) can be dissected into the

  9. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the
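
    Both copies of this record break off mid-sentence. For context, the core identity of regulation analysis, as commonly written in this literature, splits a flux change into a hierarchical (gene-expression) part and a metabolic part; the following is a sketch of that standard formulation, not taken from the truncated abstracts.

```latex
% Enzyme rate written as v = f(e) * g(X): a capacity term f(e) ~ V_max set
% by gene expression, times a function g of metabolite concentrations X.
% For a change in the steady-state flux J through that enzyme:
\[
  \underbrace{\frac{\Delta \log V_{\max}}{\Delta \log J}}_{\rho_h}
  \;+\;
  \underbrace{\frac{\Delta \log g(X)}{\Delta \log J}}_{\rho_m}
  \;=\; 1,
\]
% where \rho_h quantifies hierarchical (gene-expression) regulation and
% \rho_m metabolic regulation.
```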

  10. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterize the examined defect in detail over a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. Differences between the two measurement analyses can be associated with these error factors. (Author)

  11. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of a subsurface anomaly using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. But the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and hence contribute only a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a close estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates this enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
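
    The spectral zooming itself can be sketched with the chirp z-transform: instead of spreading FFT bins over the whole band, the same number of points is placed inside a narrow band of interest. The sketch below assumes scipy.signal.czt is available (SciPy 1.8 or later); the signal and band are invented.

```python
import numpy as np
from scipy.signal import czt   # assumed available in SciPy >= 1.8

fs = 100.0                      # sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)
# A test signal with a dominant component near 4 Hz (values hypothetical).
x = np.sin(2 * np.pi * 4.00 * t) + 0.8 * np.sin(2 * np.pi * 4.12 * t)

# Zoom the spectrum into the 3.5-4.5 Hz band with m output points:
# evaluate the z-transform along the unit-circle arc covering that band.
f1, f2, m = 3.5, 4.5, 512
w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))   # ratio between points
a = np.exp(2j * np.pi * f1 / fs)                 # arc starting point
X = czt(x, m, w, a)

freqs = f1 + np.arange(m) * (f2 - f1) / m
peak = freqs[np.argmax(np.abs(X))]
print(f"strongest component ~ {peak:.3f} Hz "
      f"on a {1000 * (f2 - f1) / m:.1f} mHz grid")
```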

  12. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
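
    The recommended dose/distance-to-agreement criterion can be made concrete with a simplified one-dimensional gamma-index sketch; a clinical implementation works on 2D or 3D dose grids, and the profiles below are hypothetical.

```python
import numpy as np

# Simplified 1D gamma-index sketch for a 3% / 3 mm acceptance criterion.
dd, dta = 0.03, 3.0                          # dose diff (fraction), DTA (mm)
x = np.arange(0.0, 100.0, 1.0)               # positions (mm)
ref = np.exp(-((x - 50.0) / 20.0) ** 2)      # reference (planned) profile
meas = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)  # measured: scaled, shifted

gamma = np.empty_like(x)
for i in range(len(x)):
    # Generalised distance in the combined dose/space metric; the gamma
    # value at a measured point is its minimum over all reference points.
    dist = np.sqrt(((x - x[i]) / dta) ** 2
                   + ((ref - meas[i]) / (dd * ref.max())) ** 2)
    gamma[i] = dist.min()

# A point passes the 3%/3 mm criterion when gamma <= 1.
print(f"gamma pass rate: {(gamma <= 1).mean():.1%}")
```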

  13. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that perform similarly with respect to the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
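
    The PCA-plus-clustering workflow described can be sketched briefly; the trait matrix below is a random stand-in for the genotype-by-trait data, and scikit-learn and SciPy are assumed to be available.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(11)
# Hypothetical stand-in data: 40 genotypes x 7 quantitative traits
# (e.g. pods per plant, 1000-seed weight, oil content, ...).
traits = rng.normal(size=(40, 7))

# PCA on standardised traits; keep the components explaining most variance.
z = (traits - traits.mean(0)) / traits.std(0)
pca = PCA(n_components=3).fit(z)
print("variance explained:", pca.explained_variance_ratio_.round(2))

# Hierarchical (Ward) clustering of genotypes in PC space.
scores = pca.transform(z)
clusters = fcluster(linkage(scores, method="ward"), t=5, criterion="maxclust")
print("genotypes per cluster:", np.bincount(clusters)[1:])
```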

  14. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and have proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  15. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in the imaging plate (IP) system allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method for clarifying the distribution of radioisotopes or labeled compounds in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure, the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd), and a digital autoradiographic image is produced. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, the application of the IP system to the distribution of receptor-binding ARG, the analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)

  16. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  17. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described, and crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested, and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions.

  18. ProRisk : risk analysis instrument : developed for William properties

    NARCIS (Netherlands)

    van Doorn, W.H.W.; Egeberg, Ingrid; Hendrickx, Kristoff; Kahramaner, Y.; Masseur, B.; Waijers, Koen; Weglicka, K.A.

    2005-01-01

    This report presents a Risk Analysis Instrument developed for William Properties. Based on the analysis, it appears that the practice of risk analysis exists within the organization, yet rather implicitly. The Risk Analysis Instrument comes with a package of four components: an activity diagram, a

  19. Common approach of risks analysis

    International Nuclear Information System (INIS)

    Noviello, L.; Naviglio, A.

    1996-01-01

    Although, following the rulings of the German High Court, the level of protection of human beings is an objective that can change over time, it is clearly an important point wherever there is a risk to the population. This is particularly true for industrial plants whose possible accidents could affect the population. Analysis of accident risks indicates that there is no conceptual difference between the risks of a nuclear power plant and those of other industrial plants such as chemical plants, gas distribution systems and hydraulic dams. An analysis of the legislation prompted by the Seveso Directive on industrial risks gives some important indications which should always be followed. This work analyses in particular the legislative situation in different European countries and identifies some of its most important characteristics. Indeed, the situation differs in most countries, and this is a later source of difficulties for nuclear power plants. To strengthen this reasoning, this paper presents some preliminary results of an analysis of a nuclear power plant following the approach used for other industrial plants. In conclusion, it will be necessary to re-examine the risk assessment approach for nuclear power plants, because the real level of protection of human beings in a country is determined by the least regulated of the dangerous industrial plants existing in the surroundings. (O.M.)

  20. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  1. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with a quantitative X-ray fractographic investigation of fatigue fractures of samples with sharp notches, tested at various stresses and temperatures, with the purpose of establishing a connection between material crack resistance parameters and the local plastic instability zones restraining and controlling the crack growth. In the fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect is singled out, analogous to the zone in which the crack growth rate is controlled by micro-shear mechanisms. The size of the notch-effect zone in the investigated steel is unambiguously tied to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. It is also shown that one of the threshold values of cyclic material fracture toughness can be determined from the results of fatigue testing and fractography of notched sample fractures. A correlation has been found between the size of the h_s crack-effect zone in the notched sample, the material yield limit δ, and the cyclic fracture toughness characteristic K_s. Such a correlation widens the possibilities of quantitative diagnostics of fractures by the methods of X-ray fractography.

  2. Risk characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    This PowerPoint presentation addresses risks at the estimated levels of exposure; uncertainty and variability in the analysis; sensitivity analysis; risks from exposure to multiple substances; the formulation of guidelines for carcinogenic and genotoxic compounds; and risks to subpopulations.

  3. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the study to a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.
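
    A minimal sketch of this kind of cross-section cost regression: regress log cost on log output (and other drivers) by OLS, and read scale economies off an output elasticity below one. The data below are synthetic and the specification is illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 147                      # number of distribution zones in the study

# Hypothetical stand-in data for a cross-section cost regression:
# log C = a + b*log Q + c*log Area + e.  Scale economies if b < 1.
log_q    = rng.normal(6.0, 0.8, n)          # log energy distributed
log_area = rng.normal(5.0, 0.6, n)          # log zone area
log_c    = 1.0 + 0.85 * log_q + 0.10 * log_area + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), log_q, log_area])
beta, *_ = np.linalg.lstsq(X, log_c, rcond=None)
print(f"output elasticity of cost = {beta[1]:.2f} "
      f"({'scale economies' if beta[1] < 1 else 'no scale economies'})")
```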

  4. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition. These were produced through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths were strictly dependent on the working treatment, thus providing an important analytical element for investigating ancient fabrication techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some distinctive prospects of neutron diffraction diagnostics in archaeometric applications. (orig.)

  5. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and of analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; particular emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  6. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples, measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample.

  7. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established earlier in an extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of nuclear reactions induced by the incident protons on the target's light elements were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results related to chlorine are also presented, supporting the reliability of this PIGE method for thin film analysis.

  8. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin area of a human foot and of a face. The full source code of the developed application is also provided as an attachment.

  9. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  10. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric ceramic with piezoelectric and pyroelectric properties, and is used mostly as a piezoelectric material. It is also the main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since their development, various electronic parts utilizing the piezoelectric characteristics have been put to practical use. The characteristics can be set by changing the composition of the PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have drawbacks; therefore, the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, which has developed remarkably in recent years. As a result, a method was established of dissolving a specimen with hydrochloric and hydrofluoric acids, masking unstable lead with disodium ethylenediaminetetraacetate, and masking fluoride ions with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  11. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres, selected on the basis of SEM images of multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated from the near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  12. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in progression-free survival (PFS) (p = .044), "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
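
    A minimal sketch of the clustering step described above, assuming tumor centroids have already been mapped into the common coordinate space; the placeholder data, the choice of Ward linkage, and the two-cluster cut are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Each row: centroid of one tumor in the common (aligned, scaled, rotated) space.
centroids = np.random.default_rng(0).normal(size=(38, 3))  # placeholder data

# Agglomerative (hierarchical) clustering of centroid locations.
Z = linkage(centroids, method="ward")

# Cut the dendrogram into two groups, mirroring the two clusters reported.
groups = fcluster(Z, t=2, criterion="maxclust")
print(groups)
```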

  13. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. Applying Fourier analysis additionally yields criteria for evaluating the volume curve as a whole: the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal smoothing method because, for the type of function concerned, it converges with minimum quadratic error. (orig./MG) [de
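
    A minimal sketch of smoothing by Fourier resynthesis as described: transform the time-activity curve, keep only the low-order harmonics, and transform back (the number of retained harmonics and the synthetic curve are assumptions for illustration):

```python
import numpy as np

def fourier_resynthesis(curve, n_harmonics=4):
    """Smooth a periodic time-activity curve by keeping low-order harmonics."""
    spectrum = np.fft.rfft(curve)
    spectrum[n_harmonics + 1:] = 0.0          # discard high-order harmonics
    return np.fft.irfft(spectrum, n=len(curve))

# Noisy left-ventricular volume curve over one representative cycle.
t = np.linspace(0.0, 1.0, 64, endpoint=False)
noisy = 1.0 - 0.4 * np.sin(np.pi * t) ** 2 \
        + 0.05 * np.random.default_rng(1).normal(size=t.size)
smooth = fourier_resynthesis(noisy)
```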

  14. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the value of the technique for predicting possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium-3 generated by tritium decay. ((orig.))

  15. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards

  16. Quantitative analysis with energy dispersive X-ray fluorescence analyser

    International Nuclear Information System (INIS)

    Kataria, S.K.; Kapoor, S.S.; Lal, M.; Rao, B.V.N.

    1977-01-01

    Quantitative analysis of samples using a radioisotope-excited energy dispersive X-ray fluorescence system is described. The complete set-up is built around a locally made Si(Li) detector X-ray spectrometer with an energy resolution of 220 eV at 5.94 keV. The photopeaks observed in the X-ray fluorescence spectra are fitted with a Gaussian function and the intensities of the characteristic X-ray lines are extracted, which in turn are used for calculating the elemental concentrations. The results for a few typical cases are presented. (author)

  17. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as the theoretical line shape of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA
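
    A minimal sketch of the core fitting step, assuming a single Lorentzian line fitted by nonlinear least squares (the program described above additionally handles overlapping components, baseline/phase compensation, and moving-average smoothing):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, width):
    """Lorentzian line shape; width is the half-width at half-maximum."""
    return amplitude * width**2 / ((x - center)**2 + width**2)

# Synthetic spectrum: one weak trace component plus random noise.
x = np.linspace(-10, 10, 500)
y = lorentzian(x, 0.2, 1.5, 0.8) + 0.01 * np.random.default_rng(2).normal(size=x.size)

popt, pcov = curve_fit(lorentzian, x, y, p0=(0.1, 1.0, 1.0))
print("amplitude=%.3f  center=%.3f  width=%.3f" % tuple(popt))
```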

  18. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (more systematic increases caused by overlap in the spectrum), and stutters (peaks located four base pairs before the true peak). We present filtering techniques for all three technical artifacts, based on statistical analysis of data from controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.

  19. Quantitative assessment of exposure and risk for three carcinogens in long-standing pollution sites

    International Nuclear Information System (INIS)

    Wichmann, H.E.; Wuppertal Univ.; Ihme, W.; Mekel, O.C.L.; Wuppertal Univ.

    1993-01-01

    The project attempts a quantitative assessment of risks for three carcinogens that are common in sites of long-standing pollution. Benzo(a)pyrene stands for the group of polycyclic aromatic hydrocarbons, cadmium for heavy metals, and benzene for volatile aromatic compounds. The report discusses the general fundamentals of exposure and risk assessment. The exposure model is described in detail and applied to the three test substances. (orig./MG) [de

  20. Environmental modeling and health risk analysis (ACTS/RISK)

    National Research Council Canada - National Science Library

    Aral, M. M

    2010-01-01

    ... presents a review of the topics of exposure and health risk analysis. The Analytical Contaminant Transport Analysis System (ACTS) and Health RISK Analysis (RISK) software tools are an integral part of the book and provide computational platforms for all the models discussed herein. The most recent versions of these two softwa...

  1. A method of quantitative risk assessment for transmission pipeline carrying natural gas

    International Nuclear Information System (INIS)

    Jo, Young-Do; Ahn, Bum Jong

    2005-01-01

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment of natural gas pipelines and introduces the parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using information on pipeline geometry and population density from a Geographic Information System (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of the European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria for individual risk taken into account, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and for modification of a buried pipeline
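
    A minimal sketch of the fatal-length idea: integrating a fatality probability along the pipeline for a hypothetical accident at a fixed point. The step-function lethality model and all numbers are placeholders, not the paper's consequence model:

```python
import numpy as np

def fatal_length(lethality, pipeline_km, accident_km, n=100_000):
    """Approximate the fatality probability integrated along the pipeline (km)."""
    x = np.linspace(0.0, pipeline_km, n)
    return float(np.mean(lethality(np.abs(x - accident_km))) * pipeline_km)

# Placeholder lethality model: certain fatality within 50 m of the accident.
step = lambda d: (d < 0.05).astype(float)
print(f"fatal length: {fatal_length(step, pipeline_km=10.0, accident_km=5.0):.3f} km")
```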

  2. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe2O3. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easy to prepare. Pressed samples of 0.150 /cm² and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be calculated conveniently by formula. In an unknown sample, however, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content of these samples was linear. This result indicates that the correction factor can be used to adjust the accuracy of the X-ray intensity. This correction factor is therefore essential in the quantitative analysis of the elements present in any sample by the X-ray fluorescence technique

  3. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  4. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is the most convenient to apply to high-Tc superconductor surfaces because it does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a proposition is made to use for this purpose a modification of the relative sensitivity factor approach accounting for the matrix and the instrumental effects. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. A slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature

  5. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.

  6. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu

    2001-01-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative evaluation and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. The drainage half-time clearance (T1/2) of the activity at each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
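
    A minimal sketch of estimating a drainage half-time from the post-diuretic segment of a renogram, assuming simple mono-exponential washout (the clinical standardization described above is not modeled; all numbers are synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

def washout(t, a0, k):
    """Mono-exponential drainage of tracer activity."""
    return a0 * np.exp(-k * t)

t = np.arange(0.0, 20.0, 0.5)   # minutes after diuretic administration
counts = washout(t, 1000.0, 0.08) + np.random.default_rng(3).normal(0.0, 10.0, t.size)

(a0, k), _ = curve_fit(washout, t, counts, p0=(counts[0], 0.1))
print(f"T1/2 = {np.log(2) / k:.1f} min")   # drainage half-time clearance
```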

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  8. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2001-04-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to speculate its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with {sup 99m}Tc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or {sup 99m}Tc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity at each region of interest set to encompass the entire kidney and the dilated pelvis was used as an index of quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)

  9. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the hexavalent to the tetravalent state was examined. The experiments showed that the decomposition is fast in alkaline solution: it takes 30 minutes with an alkaline solution and 40 minutes with an acid solution to reach constant absorption. A method of analyzing samples containing less than 5 ppm tellurium was studied. The experiments revealed that a sample containing a very small amount of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into portions of one gram each, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed good recovery of added analyte and reproducibility within a relative error of 5%. Comparison between the calibration curve of the standard tellurium(IV) solution subjected to the reaction with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results of the bismuthiol-2 method and the atomic absorption method on the same sample agreed quite well. (Iwakiri, K.)

  10. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor-analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the concentration of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
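
    A minimal sketch of a PLS calibration of the kind described, using scikit-learn; the synthetic "spectra", the hidden absorption band, and the component count are placeholders:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 200))            # 60 spectra, 200 wavelength channels
beta = np.zeros(200)
beta[40:60] = 1.0                         # band that carries the analyte signal
y = X @ beta + rng.normal(0.0, 0.5, 60)   # analyte concentration

pls = PLSRegression(n_components=5)
print(cross_val_score(pls, X, y, cv=5, scoring="r2"))  # cross-validated fit
```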

  11. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  12. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  13. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended

  14. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: -Definitions of default -Estimation of default probabilities -Exposures -Recovery Rates -Pricing -Concepts of portfolio dependence -Time horizons for risk calculations -Quantification of portfolio risk -Estimation of risk measures -Portfolio analysis and portfolio improvement -Evaluation and comparison of credit risk models -Analytic portfolio loss distributions The thesis contributes to the evaluatio...

  15. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at average power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix based on a heavy element like uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  16. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components depends mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
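
    A minimal sketch of computing the four co-occurrence features named above with scikit-image; the random image and the single separation vector are placeholders (the study used 52 separation vectors per manually delineated lobe):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = np.random.default_rng(5).integers(0, 256, size=(128, 128), dtype=np.uint8)

# Co-occurrence matrix for one separation vector (distance 1 pixel, angle 0).
glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

features = {p: graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "correlation", "energy", "homogeneity")}
print(features)
```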

  17. Quantitative prediction of oral cancer risk in patients with oral leukoplakia.

    Science.gov (United States)

    Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng

    2017-07-11

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We previously developed an oral cancer risk index using the DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia and cancer subjects (a training set and a validation set). Peaks were defined on the basis of first derivatives with positive values, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model, with high sensitivity (100%) and specificity (99.2%). Using the peaks-random forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 less than 0.5 developed cancer (32 ± 31 months). OCRI2 is better than other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.
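
    A minimal sketch of using a random-forest class probability as a quantitative risk index, in the spirit of OCRI2; the features, outcome labels, and the 0.5 cut-off are illustrative placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 10))                 # cytology-derived features
y = (X[:, 0] + rng.normal(0.0, 1.0, 200)) > 1  # 1 = progressed to cancer

clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

new_patient = rng.normal(size=(1, 10))
risk_index = clf.predict_proba(new_patient)[0, 1]   # analogous to OCRI2
print(f"risk index: {risk_index:.2f} ->",
      "high risk" if risk_index > 0.5 else "low risk")
```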

  18. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for the detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than methods based on protein-level ratios. We also demonstrate the superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Quantitative risk assessment of continuous liquid spill fires based on spread and burning behaviours

    DEFF Research Database (Denmark)

    Zhao, Jinlong; Huang, Hong; Li, Yuntao

    2017-01-01

    Spill fires usually occur during the storage and transportation of hazardous materials, posing a threat to the people and environment in their immediate proximity. In this paper, a classical Quantitative Risk Assessment (QRA) method is used to assess the risk of spill fires. In this method, the maximum spread area and the steady burning area are introduced as parameters to clearly assess the range of influence of the spill fire. In the calculations, a modified spread model that takes into consideration the burning rate variation is established to calculate the maximum spread area. Furthermore, large-scale experiments of spill fires on water and a glass sheet were conducted to verify the accuracy and application of the model. The results show that the procedure we developed can be used to quantitatively calculate the risk associated with a continuous spill fire.

  20. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
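
    A minimal sketch of the two non-parametric tests named above, using scipy; the net-profit figures for the three estate locations are invented placeholders:

```python
from scipy.stats import kruskal, friedmanchisquare

# Hypothetical net-profit samples from the three industrial-estate locations.
ambattur        = [12.1, 9.8, 11.4, 10.2, 13.0]
thirumalizai    = [10.9, 11.7, 9.5, 12.3, 10.8]
thirumudivakkam = [11.2, 10.1, 12.6, 9.9, 11.8]

h, p_kw = kruskal(ambattur, thirumalizai, thirumudivakkam)
print(f"Kruskal-Wallis: H={h:.2f}, p={p_kw:.3f}")   # p > 0.05: no location effect

# Friedman test treating the three lists as repeated measures (illustrative only).
chi2, p_fm = friedmanchisquare(ambattur, thirumalizai, thirumudivakkam)
print(f"Friedman: chi2={chi2:.2f}, p={p_fm:.3f}")
```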

  1. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating the multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both the linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants, adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both the genetic linkage and LD information of multiple genetic variants in a genome and the similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  2. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that share clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm{sup 2} and 5–10 mm{sup 2} and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DL{sub CO}/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DL{sub CO}/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  3. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
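
    A minimal sketch of the basic ASE test that QuASAR builds on: a binomial test of the null 1:1 allelic ratio at a heterozygous site (QuASAR itself additionally infers genotypes and models base-call error and over-dispersion; the read counts are invented):

```python
from scipy.stats import binomtest

ref_reads, alt_reads = 63, 37   # RNA-seq reads per allele at one het site
result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
      f"p = {result.pvalue:.4f}")
```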

  4. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  5. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  6. Risk Analysis in Road Tunnels – Most Important Risk Indicators

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Thöns, Sebastian

    2016-01-01

    Methodologies for fire risk analysis in road tunnels consider numerous factors affecting risk (risk indicators) and express the results through risk measures, but only a few comprehensive studies on the effects of risk indicators on risk measures are available. For this reason, this study quantifies the effects and highlights the most important risk indicators, with the aim of supporting further developments in risk analysis. Therefore, a system model of a road tunnel was developed to determine the risk measures. The system model can be divided into three parts: the fire part, connected to the fire model Fire Dynamics Simulator (FDS); the evacuation part, connected to the evacuation model FDS+Evac; and the frequency part, connected to a model to calculate the frequency of fires. This study shows that the parts of the system model (and their most important risk indicators) affect the risk measures in the following

  7. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code, designated UNRAC (UNReliability Analysis Code), calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as a point estimate and/or in time-dependent form; (3) quantitative importance of each component involved; and (4) error bounds on the top event unavailability. UNRAC can analyze fault trees with any kind of gate (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code was benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall it is demonstrated that UNRAC is an efficient, easy-to-use code that has the advantage of being able to do a complete fault tree analysis with a single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered
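
    A minimal sketch of one step such a code performs: bounding the top-event unavailability from minimal cut sets with the min-cut-set upper bound. The cut sets and component unavailabilities are invented for illustration:

```python
from math import prod

# Minimal cut sets (sets of component names) and component unavailabilities.
cut_sets = [{"pump_A", "pump_B"}, {"valve", "pump_A"}, {"power"}]
q = {"pump_A": 1e-2, "pump_B": 2e-2, "valve": 5e-3, "power": 1e-4}

def top_event_unavailability(cut_sets, q):
    """Min-cut-set upper bound: 1 - prod over cut sets of (1 - P(cut set))."""
    return 1.0 - prod(1.0 - prod(q[c] for c in cs) for cs in cut_sets)

print(f"Q_top <= {top_event_unavailability(cut_sets, q):.3e}")
```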

  8. Quantitative analysis and classification of AFM images of human hair.

    Science.gov (United States)

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to its high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, comprising hair samples from (a) untreated and bleached hair, and (b) the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis (PLS-DA) is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.

  9. A Quantitative Accident Sequence Analysis for a VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jintae; Lee, Joeun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    In Korea, the basic design features of the VHTR are currently discussed in various design concepts. Probabilistic risk assessment (PRA) offers a logical and structured method to assess the risks of a large and complex engineered system, such as a nuclear power plant. It will be introduced at an early stage in the design, and will be upgraded at various design and licensing stages as the design matures and the design details are defined. Risk insights developed from the PRA are viewed as essential to developing a design that is optimized in meeting safety objectives and in interpreting the applicability of existing demands to the safety design approach of the VHTR. In this study, initiating events which may occur in VHTRs were selected through the MLD (master logic diagram) method. The initiating events were then grouped into four categories for the accident sequence analysis. Initiating event frequencies and safety system failure rates were calculated using reliability data obtained from the available sources and fault tree analysis. After quantification, an uncertainty analysis was conducted. The SR and LR frequencies are calculated to be 7.52E-10/RY and 7.91E-16/RY, respectively, which are considerably lower than the core damage frequency of LWRs.

  10. Process Review for Development of Quantitative Risk Analyses for Transboundary Animal Disease to Pathogen-Free Territories

    Directory of Open Access Journals (Sweden)

    Jonathan Miller

    2017-10-01

    Outbreaks of transboundary animal diseases (TADs) have the potential to cause significant detriment to animal, human, and environmental health; severe economic consequences; and threats to national security. Challenges concerning data sharing, model development, decision support, and disease emergence science have recently been highlighted, and recommendations have been advocated in the disciplines intersecting with outbreak prediction and forecast modeling of infectious diseases. To advance the effective application of computation and risk communication, analytical products ought to follow a collaboratively agreed common plan for implementation. Research articles should seek to inform and assist prioritization of national and international strategies by developing established criteria to identify and follow best-practice standards for assessing risk model attributes and performance. A well-defined framework to help eliminate gaps in policy, process, and planning knowledge would help alleviate the pressing need for a comprehensive strategy for countering TAD outbreak risks. A quantitative assessment that accurately captures the risk of introduction of a TAD through various pathways can be a powerful tool in guiding where government, academic, and industry resources ought to be allocated, whether implementation of additional risk management solutions is merited, and where research efforts should be directed to minimize risk. This review outlines part of a process for the development of quantitative risk analyses to collect, analyze, and communicate this knowledge. A more comprehensive and unabridged manual was also developed. The framework used in supporting the application of aligning computational tools for readiness continues our approach of applying a preparedness mindset to challenges concerning threats to global biosecurity, secure food systems, and risk-mitigated agricultural economies.

  11. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of paintings has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  12. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the beverages most widely consumed in the world, and the 'cafezinho' is normally prepared from a blend of the roasted powder of two species, Coffea arabica and Coffea canephora. Each one exhibits differences in taste and in chemical composition, especially in the caffeine percentage. There are several procedures proposed in the literature for caffeine determination in different samples such as soft drinks, coffee and medicines, but most of them need a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques, without any sample pre-treatment. (author)

  13. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantitative image analysis of WE43-T6 cracking behavior

    International Nuclear Information System (INIS)

    Ahmad, A; Yahya, Z

    2013-01-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Yt, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks which lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  15. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, and they show strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistics, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object, with 68 boreholes and 9 mechanically distinct strata. The parameters are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from them the spatial law of the geotechnical parameters is obtained.
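
    The correlation step is a plain Pearson correlation across boreholes. A minimal sketch, with placeholder data standing in for the measured parameters:

        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.normal(size=(68, 4))        # 68 boreholes x 4 parameters (water content, ...)
        corr = np.corrcoef(data, rowvar=False) # parameter-by-parameter Pearson r
        print(np.round(corr, 2))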

  16. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian, model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  17. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the cooling time and of the irradiation time. The variation of the ratio (144Ce + 144Pr activity)/(137Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141Ce, 144Ce + 144Pr, 103Ru, 106Ru + 106Rh, 137Cs, 95Zr + 95Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of least squares was used for the quantitative analysis of the afore-mentioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr
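
    The least-squares step treats the observed spectrum as a linear combination of single-emitter reference spectra, with the activities as the fitted coefficients. A minimal sketch with synthetic spectra (the reference shapes and noise level are placeholders):

        import numpy as np

        rng = np.random.default_rng(2)
        channels = 128
        references = rng.random((channels, 6))   # columns: 141Ce, 144Ce+144Pr, 103Ru, ...
        true_activity = np.array([1.0, 0.5, 2.0, 0.3, 1.5, 0.8])
        observed = references @ true_activity + rng.normal(scale=0.01, size=channels)

        activity, *_ = np.linalg.lstsq(references, observed, rcond=None)
        print(np.round(activity, 3))             # recovered activities per emitter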

  18. Quantitative image analysis for investigating cell-matrix interactions

    Science.gov (United States)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
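
    At the heart of digital volume correlation, the displacement of a subvolume is the location of the peak of its cross-correlation with the deformed image. A minimal 1-D sketch of that step (real DVC works on 3-D confocal stacks with subvoxel refinement):

        import numpy as np

        rng = np.random.default_rng(3)
        ref = rng.random(256)
        deformed = np.roll(ref, 7)               # known displacement of 7 samples

        # Circular cross-correlation via FFT; the argmax gives the displacement.
        xcorr = np.fft.ifft(np.fft.fft(deformed) * np.conj(np.fft.fft(ref))).real
        print("estimated displacement:", int(np.argmax(xcorr)))   # -> 7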

  19. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    Adipocytes are not only central players in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, there are 317 proteins predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  20. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting the insight into the women's perspective. In this mixed-methods study, we include qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean number of 2 children, 82% ≥ some college education, and 56% with previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  1. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
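
    For the simplest case mentioned, a chemical-shift titration, KD is obtained by fitting the observed shift change to the standard 1:1 binding isotherm. A minimal sketch with synthetic data (concentrations, shifts, and starting values are illustrative):

        import numpy as np
        from scipy.optimize import curve_fit

        P = 0.1  # total protein concentration, mM (held fixed during the titration)

        def isotherm(L, kd, dmax):
            """Observed shift change for 1:1 binding at total ligand concentration L."""
            b = P + L + kd
            return dmax * (b - np.sqrt(b**2 - 4 * P * L)) / (2 * P)

        L = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])   # ligand titration, mM
        dobs = isotherm(L, 0.15, 0.8)                          # synthetic "measurements"

        (kd, dmax), _ = curve_fit(isotherm, L, dobs, p0=(0.1, 1.0))
        print(f"KD ~ {kd:.3f} mM, max shift ~ {dmax:.2f} ppm")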

  2. Aspects of the risk analysis in the process engineering industry

    International Nuclear Information System (INIS)

    Hennings, W.; Madjar, M.; Mock, R.; Reer, B.

    1996-01-01

    This document is the result of a multi-disciplinary working group within a project called 'Risk analysis for chemical plants'. Within the framework of the project, only selected methods and tools of risk analysis, that is, methodological aspects, could be discussed and developed further. Case examples from the chemical industry are dealt with in order to discuss the application of computer-assisted quantitative error analysis in this industrial sector. Also included is comprehensive documentation of the data and results used in the examples. figs., tabs., refs

  3. Risk analysis in radiosurgery treatments using risk matrices

    International Nuclear Information System (INIS)

    Delgado, J. M.; Sanchez Cayela, C.; Ramirez, M. L.; Perez, A.

    2011-01-01

    The aim of this study is to apply a risk matrix analysis to stereotactic single-dose radiotherapy, evaluating those initiating events that lead to increased risk and proposing possible solutions in the design of barriers.
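
    A risk matrix scores each initiating event by a frequency class and a severity class and reads the risk level off the corresponding cell. A minimal sketch; the class boundaries and labels are illustrative, not those of the study:

        RISK = [  # rows: frequency class (low..high); columns: severity class (low..high)
            ["low",    "low",    "medium"],
            ["low",    "medium", "high"],
            ["medium", "high",   "high"],
        ]

        def risk_level(freq_class: int, sev_class: int) -> str:
            return RISK[freq_class][sev_class]

        print(risk_level(2, 1))   # frequent event, moderate severity -> "high"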

  4. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method which is well established in science and industry. This task is solved by means of vector mass spectrometry, where a mass spectrum is repeatedly measured, but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze underdetermined mass spectra of complex hydrogen-helium isotope mixtures. In this work, experimental investigations are presented which show that different parameters are suitable for the UMS method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3%. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1%) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10%. Finally, this work deals with the effects of the measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both in general mass-spectrometric gas analysis and in the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  5. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the lack of a strong chromophore or readily ionized functional group in the structure. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were low. The method allows the analysis of numerous sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.

  6. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies at the same time that avoidable expenditures may exist. Nevertheless, drug expenditures are not usually included in analyses of clinical practice variability. The objective was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed regarding the drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from €0.97/inh to €8.90/inh and from €208.45/HPU to €1,471.95/HPU. The Pearson correlation between drug expenditure and population was 0.25, and that between expenditure and homogeneous production was 0.35 (p=0.32 and p=0.15, respectively), both coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.

  7. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with a minimum of laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In a test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of network errors.

  8. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  9. DYNAMIC HYBRIDS UNDER SOLVENCY II: RISK ANALYSIS AND MODIFICATION POSSIBILITIES

    Directory of Open Access Journals (Sweden)

    Christian Maier

    2017-06-01

    In this study, we investigate the new and standardized European system of supervision called Solvency II. In essence, the asymmetric distribution of information between policyholder and insurer triggered this new regulation, which aims at better protecting policyholders. Its three-pillar model is about to challenge both insurers and policyholders. The first pillar includes quantitative aspects, the second pillar contains qualitative aspects and the third pillar comprises market transparency and reporting obligations. Underwriting risks, the default risk of a bank and market risks can be identified for the dynamic hybrid. Solvency II covers all these risks in the first pillar, and insurers shall deposit sufficient risk-bearing capital. In our analysis, we first identify the dynamic hybrid-specific risks under the Solvency II regime and then develop product modifications to reduce this risk.

  10. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow systematic and repeatable measurement of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components with sparse loadings, as compared to standard principal component analysis (PCA), in conjunction with the Hotelling T² statistical analysis method to compare, qualify, and detect faults in the tested systems.

  11. A CGE analysis for quantitative evaluation of electricity market changes

    International Nuclear Information System (INIS)

    Hwang, Won-Sik; Lee, Jeong-Dong

    2015-01-01

    Risk and uncertainty entailed by electricity industry privatization impose a heavy burden on political decision-making. In this sense, ex ante analyses are important in order to investigate the economic effects of privatization or liberalization in the electricity industry. To carry out these quantitative analyses, a novel approach is developed, incorporating a top-down and bottom-up model that takes into account economic effects and technological constraints simultaneously. This study also examines various counterfactual scenarios after the Korean electricity industry reform through the integrated framework. Simulation results imply that the authorities should prepare an improved regulatory system and policy measures such as forward contracts for the industry reform, in order to promote competition in the distribution sector as well as the generation sector. -- Highlights: •A novel approach is proposed incorporating a top-down and bottom-up model. •This study examines various counterfactual scenarios after Korean electricity industry reform. •An improved regulatory system and policy measures are required before the reform

  12. Predicted cancer risks induced by computed tomography examinations during childhood, by a quantitative risk assessment approach.

    Science.gov (United States)

    Journy, Neige; Ancelet, Sophie; Rehel, Jean-Luc; Mezzarobba, Myriam; Aubert, Bernard; Laurier, Dominique; Bernier, Marie-Odile

    2014-03-01

    The potential adverse effects associated with exposure to ionizing radiation from computed tomography (CT) in pediatrics must be characterized in relation to their expected clinical benefits. Additional epidemiological data are, however, still awaited to provide a lifelong overview of potential cancer risks. This paper gives predictions of the potential lifetime risks of cancer incidence that would be induced by CT examinations during childhood in French routine pediatric practice. Organ doses were estimated from standard radiological protocols in 15 hospitals. Excess risks of leukemia, brain/central nervous system, breast and thyroid cancers were predicted from dose-response models estimated in the Japanese atomic bomb survivors' dataset and studies of medical exposures. Uncertainty in the predictions was quantified using Monte Carlo simulations. This approach predicts that 100,000 skull/brain scans in 5-year-old children would result in eight (90% uncertainty interval (UI) 1-55) brain/CNS cancers and four (90% UI 1-14) cases of leukemia, and that 100,000 chest scans would lead to 31 (90% UI 9-101) thyroid cancers, 55 (90% UI 20-158) breast cancers, and one case of leukemia, small numbers compared with the corresponding risks without exposure. Compared to background risks, radiation-induced risks would be low for individuals throughout life, but relative risks would be highest in the first decades of life. Heterogeneity in the radiological protocols across the hospitals implies that 5-10% of CT examinations would be related to risks 1.4-3.6 times higher than those for the median doses. Overall excess relative risks in exposed populations would be 1-10% depending on the site of cancer and the duration of follow-up. The results emphasize the potential risks of cancer specifically from standard CT examinations in pediatrics and underline the necessity of optimization of radiological protocols.

  13. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of the educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater detail.

  14. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates

  15. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    1985-06-01

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include fairly detailed descriptions of work on general development topics and of risk analyses performed under contract. (author)

  16. A quantitative method for risk assessment of agriculture due to climate change

    Science.gov (United States)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks owing to its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help in actively dealing with climate change and ensuring food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for the risk assessment of agriculture under climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5 and 64.6%, respectively, for the maximum temperature increase of 88.3%, and its risk was 2.2%. The maximum yield decrease and its probability were 14.1 and 56.1%, respectively, for the maximum precipitation decrease of 35.2%, and its risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase, and the risk will grow accordingly.
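
    The paper's risk measure is simply loss times probability; the reported spring-wheat figures can be reproduced directly (values match to rounding):

        def risk(loss_pct, prob_pct):
            """Risk = degree of loss x probability of occurrence (both in %)."""
            return loss_pct * prob_pct / 100.0

        print(round(risk(3.5, 64.6), 1))    # temperature case   -> 2.3 (reported 2.2 %)
        print(round(risk(14.1, 56.1), 1))   # precipitation case -> 7.9 %
        print(round(risk(17.6, 53.4), 1))   # combined case      -> 9.4 %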

  17. Risk analysis: assessing uncertainties beyond expected values and probabilities

    National Research Council Canada - National Science Library

    Aven, T. (Terje)

    2008-01-01

    Contents include: 2.2 How to describe risk quantitatively; 2.2.1 Description of risk in a financial context; 2.2.2 Description ...

  18. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age alone, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation levels of four genes appear to increase specificity for identifying cancer compared to HPV detection alone. Alterations in the DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  19. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may carry large uncertainty, that uncertainty can be characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there would be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed were slowed down by 20%. In addition, there would be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT were reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation. © 2010 Elsevier Ltd. All rights reserved.
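
    Structurally, the model sums frequency times consequence over the branches of the event tree. A minimal sketch of that aggregation (branch probabilities and consequences are illustrative placeholders, not the Michigan data):

        crash_freq = 12.0   # expected work-zone crashes per year

        scenarios = [
            # (probability of the branch through A, CU, VT, AL, LC, CT, S; fatalities per crash)
            (0.002, 1.0),
            (0.010, 0.3),
            (0.050, 0.0),
        ]

        individual_risk = crash_freq * sum(p * c for p, c in scenarios)
        print(f"expected fatalities per year ~ {individual_risk:.3f}")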

  20. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    This paper aimed to evaluate bankruptcy risk using the score method based on Conan and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study put in evidence the financial situation of the company and the level of the main financial ratios underpinning the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70 and 80%. In the year 2007, the bankruptcy risk was lower, ranging between 50 and 70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment is very risky in our country.
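
    A Conan and Holder-type score is a weighted sum of financial ratios whose value is read against published bands to obtain a bankruptcy-risk percentage. A minimal sketch; the weights and ratio values below are illustrative stand-ins, not the company's figures or necessarily the exact published coefficients:

        def z_score(ratios, weights):
            """Weighted sum of financial ratios (Conan & Holder-style score)."""
            return sum(r * w for r, w in zip(ratios, weights))

        weights = (0.24, 0.22, 0.16, -0.87, -0.10)   # illustrative coefficients
        ratios  = (0.10, 0.30, 0.45, 0.02, 0.15)     # computed from the balance sheet

        z = z_score(ratios, weights)
        print(round(z, 3))   # compared against published bands to read off a risk percentage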

  1. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    International Nuclear Information System (INIS)

    Charland, P.; Peters, T.; McGill Univ., Montreal, Quebec

    1996-01-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters (namely the shape, color, and depth cue associated with a cursor), as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can have an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  2. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild change, separated by shorter periods of more rapid alteration. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.

  3. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), that is, the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis method for liver function. Information was obtained every quarter minute over 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow from the disappearance rate multiplied by the percentage of hepatic uptake, expressed as (liver counts)/(total counts of the field). Our method automatically recorded the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, computed using the BASIC language. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)

  4. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD), and normal washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) normals and 24 normal volunteers (12 from Emory and 12 from Yale). Oblique-angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a 'bullseye' functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49 to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well-defined variations in Tl-201 uptake in the normal myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched normal data sets.

  5. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    A comprehensive experimental study to analyze the security performance of a WLAN based on the IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. The study establishes that there is always a tradeoff between security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented a statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing a robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to assessing the suitability of security protocols for given real-time applications.

  6. An approach to quantitative assessment of relative proliferation risks from nuclear fuel cycles

    International Nuclear Information System (INIS)

    Silvennoinen, P.; Vira, J.

    1981-01-01

    Feasibility of quantitative assessments of the risk of nuclear weapons proliferation is discussed in this paper. The proliferation risk is defined as a combined utility of the different fuel cycle processes or materials for the proscribed acquisition of a nuclear weapon. Based on a set of selected weighted criteria, the process utilities are calculated employing utility functions or fuzzy expectation values. The methods are compared to each other. The scheme appears feasible in relative comparisons while certain leeway must still be retained for political judgement. (author)
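
    A minimal sketch of the weighted-criteria aggregation the paper describes: each fuel-cycle process is scored against proliferation-relevant criteria and the weighted scores are combined into a relative utility. Criteria names, weights, and scores are illustrative placeholders:

        weights = {"material_quality": 0.40, "accessibility": 0.35, "detectability": 0.25}

        processes = {
            "enrichment":   {"material_quality": 0.9, "accessibility": 0.3, "detectability": 0.6},
            "reprocessing": {"material_quality": 0.8, "accessibility": 0.5, "detectability": 0.7},
        }

        for name, scores in processes.items():
            utility = sum(weights[c] * scores[c] for c in weights)
            print(name, round(utility, 3))   # higher utility = higher relative proliferation risk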

  7. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union

    DEFF Research Database (Denmark)

    Snary, Emma L.; Swart, Arno N.; Simons, Robin R. L.

    2016-01-01

    A farm‐to‐consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter‐pig prevalence. Estimated per‐serving risks of illness ranged between about 1 in …,000 and 1 in 10 million servings, given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready‐to‐eat sausages). Further analyses of the farm‐to‐consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration …

  8. Environmental risk analysis of hazardous material rail transportation

    International Nuclear Information System (INIS)

    Saat, Mohd Rapik; Werth, Charles J.; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P.L.

    2014-01-01

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials
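
    The headline risk-cost figures have the form frequency times consequence accumulated over route traffic. A minimal sketch of that normalization, with illustrative numbers (not the study's data):

        route_miles = 800.0
        cars_per_year = 500
        release_rate = 1.5e-7      # release-causing accidents per car-mile
        cleanup_cost = 1.2e6       # expected cost per release, from a consequence model

        annual_risk = release_rate * route_miles * cars_per_year * cleanup_cost
        print(f"annual risk ~ ${annual_risk:,.0f}")                       # -> $72,000
        print(f"risk per car-mile ~ ${release_rate * cleanup_cost:.2f}")  # -> $0.18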

  9. Environmental risk analysis of hazardous material rail transportation

    Energy Technology Data Exchange (ETDEWEB)

    Saat, Mohd Rapik, E-mail: mohdsaat@illinois.edu [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Werth, Charles J.; Schaeffer, David [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Yoon, Hongkyu [Sandia National Laboratories, Albuquerque, NM 87123 (United States); Barkan, Christopher P.L. [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States)

    2014-01-15

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.

  10. Linkage of DNA Methylation Quantitative Trait Loci to Human Cancer Risk

    Directory of Open Access Journals (Sweden)

    Holger Heyn

    2014-04-01

    Epigenetic regulation and, in particular, DNA methylation have been linked to the underlying genetic sequence. DNA methylation quantitative trait loci (meQTL) have been identified through significant associations between the genetic and epigenetic codes in physiological and pathological contexts. We propose that interrogating the interplay between polymorphic alleles and DNA methylation is a powerful method for improving our interpretation of risk alleles identified in genome-wide association studies that otherwise lack mechanistic explanation. We integrated patient cancer risk genotype data and genome-scale DNA methylation profiles of 3,649 primary human tumors, representing 13 solid cancer types. We provide a comprehensive meQTL catalog containing DNA methylation associations for 21% of interrogated cancer risk polymorphisms. Differentially methylated loci harbor previously reported and as-yet-unidentified cancer genes. We suggest that such regulation at the DNA level can provide a considerable amount of new information about the biology of cancer-risk alleles.

  11. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    Science.gov (United States)

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients, in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
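
    A minimal sketch of the proposed adjustment: the exposure term of a tier I risk quotient is scaled by a crop attractiveness factor (CAF) derived from relative nectar sugar concentration. Parameter names and values are illustrative, not from the paper:

        def risk_quotient(exposure, toxicity_endpoint, caf=1.0):
            """RQ = (exposure x crop attractiveness factor) / toxicity endpoint."""
            return exposure * caf / toxicity_endpoint

        # CAF as the crop's nectar sugar concentration relative to a reference crop:
        caf = 0.6 / 1.0
        print(risk_quotient(exposure=2.5, toxicity_endpoint=10.0, caf=caf))   # -> 0.15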

  12. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    Science.gov (United States)

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (Pinf; r = 0.44) and the storage time (r = 0.08). This model allows the impact of different factors influencing the risk of acquiring trichinellosis to be assessed, and may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. © 2015 Blackwell Verlag GmbH.
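
    Below is a minimal Monte Carlo sketch of a storage-to-consumption infection model with rank-correlation sensitivity, in the spirit of this kind of QMRA. The distributions, die-off rate and dose-response slope are invented placeholders, not the study's fitted values.

```python
# Toy QMRA: larvae survive storage, then an exponential dose-response
# converts the surviving dose into a per-serving infection risk.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 100_000
larvae0 = rng.lognormal(mean=2.0, sigma=1.0, size=n)   # larvae per sausage
storage_days = rng.uniform(5, 40, size=n)
surviving = larvae0 * np.exp(-0.1 * storage_days)      # die-off in storage
p_inf = 1.0 - np.exp(-0.005 * surviving)               # exponential dose-response

print(f"mean risk per serving: {p_inf.mean():.2e}")
for name, x in [("surviving larvae", surviving), ("storage time", storage_days)]:
    rho, _ = spearmanr(x, p_inf)                       # rank-based sensitivity
    print(f"sensitivity to {name}: r = {rho:+.2f}")
```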

  13. Risk analysis; Analisis de riesgos

    Energy Technology Data Exchange (ETDEWEB)

    Baron, J H; Nunez McLeod, J; Rivera, S S [Universidad Nacional de Cuyo, Mendoza (Argentina). Instituto de Capacitacion Especial y Desarrollo de Ingenieria Asistida por Computadora (CEDIAC)]

    1997-07-01

    This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want an insight into the risk analysis field as a tool for solving problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field.

  14. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plant. (BP)

  15. A background to risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  16. Using a quantitative risk register to promote learning from a patient safety reporting system.

    Science.gov (United States)

    Mansfield, James G; Caplan, Robert A; Campos, John S; Dreis, David F; Furman, Cathie

    2015-02-01

    Patient safety reporting systems are now used in most health care delivery organizations. These systems, such as the one in use at Virginia Mason (Seattle) since 2002, can provide valuable reports of risk and harm from the front lines of patient care. In response to the challenge of how to quantify and prioritize safety opportunities, a risk register system was developed and implemented. Basic risk register concepts were refined to provide a systematic way to understand risks reported by staff. The risk register uses a comprehensive taxonomy of patient risk and algorithmically assigns each patient safety report to 1 of 27 risk categories in three major domains (Evaluation, Treatment, and Critical Interactions). For each category, a composite score was calculated on the basis of event rate, harm, and cost. The composite scores were used to identify the "top five" risk categories, and patient safety reports in these categories were analyzed in greater depth to find recurrent patterns of risk and associated opportunities for improvement. The top five categories of risk were easy to identify and had distinctive "profiles" of rate, harm, and cost. The ability to categorize and rank risks across multiple dimensions yielded insights not previously available. These results were shared with leadership and served as input for planning quality and safety initiatives. This approach provided actionable input for the strategic planning process, while at the same time strengthening the Virginia Mason culture of safety. The quantitative patient safety risk register serves as one solution to the challenge of extracting valuable safety lessons from large numbers of incident reports and could profitably be adopted by other organizations.
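
    The ranking step described here is straightforward to express in code. The sketch below shows the scoring-and-ranking idea behind a quantitative risk register; the category names, values, and the multiplicative composite rule are illustrative assumptions, not Virginia Mason's taxonomy or weights.

```python
# Sketch: score each risk category on event rate, harm, and cost,
# then surface the "top five". All data are invented placeholders.
categories = {
    "medication":       {"rate": 120, "harm": 3.0, "cost": 2000},
    "falls":            {"rate":  60, "harm": 4.0, "cost": 5000},
    "diagnostic delay": {"rate":  25, "harm": 4.5, "cost": 9000},
    "handoff":          {"rate":  90, "harm": 2.0, "cost":  800},
    "equipment":        {"rate":  15, "harm": 3.5, "cost": 4000},
    "documentation":    {"rate": 200, "harm": 1.0, "cost":  300},
}

def composite(c):
    """Harm- and cost-weighted event rate as a single score."""
    return c["rate"] * c["harm"] * c["cost"]

top_five = sorted(categories, key=lambda k: composite(categories[k]),
                  reverse=True)[:5]
for name in top_five:
    print(f"{name:18s} score = {composite(categories[name]):,.0f}")
```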

  17. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Science.gov (United States)

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h²) of resistance. Sibling analysis and...

  18. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  19. Quantitative assessment of early diabetic retinopathy using fractal analysis.

    Science.gov (United States)

    Cheung, Ning; Donaghue, Kim C; Liew, Gerald; Rogers, Sophie L; Wang, Jie Jin; Lim, Shueh-Wen; Jenkins, Alicia J; Hsu, Wynne; Li Lee, Mong; Wong, Tien Y

    2009-01-01

    Fractal analysis can quantify the geometric complexity of the retinal vascular branching pattern and may therefore offer a new method to quantify early diabetic microvascular damage. In this study, we examined the relationship between retinal fractal dimension and retinopathy in young individuals with type 1 diabetes. We conducted a cross-sectional study of 729 patients with type 1 diabetes (aged 12-20 years) who had seven-field stereoscopic retinal photographs taken of both eyes. From these photographs, retinopathy was graded according to the modified Airlie House classification, and fractal dimension was quantified using a computer-based program following a standardized protocol. In this study, 137 patients (18.8%) had diabetic retinopathy signs; of these, 105 had mild retinopathy. Median (interquartile range) retinal fractal dimension was 1.46214 (1.45023-1.47217). After adjustment for age, sex, diabetes duration, A1C, blood pressure, and total cholesterol, increasing retinal vascular fractal dimension was significantly associated with increasing odds of retinopathy (odds ratio 3.92 [95% CI 2.02-7.61] for fourth versus first quartile of fractal dimension). In multivariate analysis, each 0.01 increase in retinal vascular fractal dimension was associated with a nearly 40% increased odds of retinopathy (1.37 [1.21-1.56]). This association remained after additional adjustment for retinal vascular caliber. Greater retinal fractal dimension, representing increased geometric complexity of the retinal vasculature, is independently associated with early diabetic retinopathy signs in type 1 diabetes. Fractal analysis of fundus photographs may allow quantitative measurement of early diabetic microvascular damage.
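
    The core measurement here is the box-counting fractal dimension of the segmented vascular pattern. The sketch below shows a standard box-counting estimate on a toy binary mask; it does not reproduce the study's standardized protocol, and the grid sizes are arbitrary choices.

```python
# Minimal box-counting estimate of fractal dimension for a binary mask.
import numpy as np

def box_count(mask, size):
    """Count boxes of the given size containing at least one set pixel."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if mask[i:i+size, j:j+size].any():
                count += 1
    return count

def fractal_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(mask, s) for s in sizes]
    # slope of log(count) vs log(1/size) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

mask = np.eye(256, dtype=bool)   # toy "vessel": a line has dimension ~1
print(f"estimated D = {fractal_dimension(mask):.2f}")
```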

  20. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searches. An overview of the patents related to nanotechnology in the automobile industry is provided. The work starts from a worldwide patent search to find patents on nanotechnology in the automobile industry, and classifies them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, followed by an analysis of the patents in the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents by the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to patenting such inventions. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, and a strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts; on analysis, it was deduced that it addresses the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classes, reduction in friction and engine, were created. Similarly, after studying all the patents, a full classification matrix was created.

  1. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available Although the term collocation is fairly common in English grammar, it is not a well-known or commonly used term in textbooks and scientific papers written in the Serbian language. Collocating is usually defined as the natural co-occurrence of two (or more) words, which usually appear next to one another even though they can be separated in the text, while collocations are defined as words joined in a sentence through natural semantic and/or syntactic relations. Collocations occur naturally in all English written texts, including scientific texts and papers. Using two textbooks of English for Specific Purposes (ESP) for intermediate courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between the lexical and grammatical collocations found in the ESP texts and the reasons for their presence, and gives an overview of the most frequently used subtypes of lexical collocations. Furthermore, applying basic quantitative corpus analysis, the paper reports the numbers of open, restricted and bound collocations in the ESP texts, drawing conclusions about their frequency and hence about modes of learning them. A section is also devoted to the number and usage of scientific collocations, both common scientific and narrowly professional ones. The conclusion is that the number of collocations present in the two selected textbooks calls for further analysis of these lexical connections, as well as for new modes of teaching and presenting them to students learning English.

  2. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    Science.gov (United States)

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach for analyzing such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination on Twitter. The results highlight the higher impact of stories contributed by news organizations, compared with direct tweets by health organizations, in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. The analysis also revealed the relationship between community engagement on Twitter and state policies regarding child vaccination: residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise for advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  3. Local scale multiple quantitative risk assessment and uncertainty evaluation in a densely urbanised area (Brescia, Italy)

    Directory of Open Access Journals (Sweden)

    S. Lari

    2012-11-01

    Full Text Available The study of the interactions between natural and anthropogenic risks is necessary for quantitative risk assessment in areas affected by active natural processes, high population density and strong economic activities.

    We present a multiple quantitative risk assessment on a 420 km² high-risk area (Brescia and surroundings, Lombardy, Northern Italy) for flood, seismic and industrial accident scenarios. Expected economic annual losses are quantified for each scenario and annual exceedance probability-loss curves are calculated. Uncertainty on the input variables is propagated by means of three different methodologies: Monte Carlo simulation, first-order second-moment (FOSM), and point estimate.

    Expected losses calculated by means of the three approaches show similar values for the whole study area, about 64 000 000 € for earthquakes, about 10 000 000 € for floods, and about 3000 € for industrial accidents. Locally, expected losses assume quite different values if calculated with the three different approaches, with differences up to 19%.

    The uncertainties on the expected losses and their propagation, performed with the three methods, are compared and discussed in the paper. In some cases, uncertainty reaches significant values (up to almost 50% of the expected loss). This underlines the necessity of including uncertainty in quantitative risk assessment, especially when it is used as a support for territorial planning and decision making. The method was developed with a view to possible application at a regional-national scale, on the basis of data available in Italy over the national territory.
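
    As a rough illustration of how Monte Carlo and first-order (FOSM-style) propagation can give similar aggregate estimates while differing locally, here is a sketch for a toy multiplicative loss model; the model and all distributions are invented, not the paper's.

```python
# Compare Monte Carlo and first-order (FOSM) propagation of input
# uncertainty into an expected-loss estimate. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# loss = hazard frequency * exposed value * vulnerability
freq = rng.normal(0.01, 0.002, n)     # events/year
value = rng.normal(5e6, 1e6, n)       # euros exposed
vuln = rng.normal(0.3, 0.05, n)       # damage fraction
loss = freq * value * vuln

print(f"Monte Carlo: mean = {loss.mean():,.0f} EUR, std = {loss.std():,.0f}")

# FOSM: first-order mean and variance for a product of independent inputs
mu = 0.01 * 5e6 * 0.3
cv2 = (0.002/0.01)**2 + (1e6/5e6)**2 + (0.05/0.3)**2
print(f"FOSM approx: mean = {mu:,.0f} EUR, std = {mu*np.sqrt(cv2):,.0f}")
```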

  4. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    Science.gov (United States)

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information to what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 lag time models in leafy greens, and validation of the importance of cross-contamination during the washing process.
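
    The growth step of such a model is easy to sketch. Below is a minimal Monte Carlo version of it in Python rather than @RISK: servings start at -1 log CFU/g with 0.1% prevalence and grow up to about 1 log/day under temperature abuse. The abuse-time distribution and serving size are invented placeholders, not the study's inputs.

```python
# Toy growth-and-exposure step of a leafy-greens QMRA.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
serving_g = 85.0                                     # assumed serving size

contaminated = rng.random(n) < 0.001                 # 0.1% prevalence
log_conc = np.full(n, -1.0)                          # log CFU/g in the field
abuse_days = rng.uniform(0, 2, n)                    # days of temperature abuse
log_conc += rng.uniform(0, 1.0, n) * abuse_days      # up to 1 log CFU/day

cells_per_serving = np.where(contaminated,
                             10.0 ** log_conc * serving_g, 0.0)
positives = cells_per_serving[contaminated]
print(f"median cells per contaminated serving: {np.median(positives):,.0f}")
```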

  5. Economic analysis and management of climatic risks

    Energy Technology Data Exchange (ETDEWEB)

    Hourcade, J.C. (Centre International de Recherche sur l' Environnement et le Developpement, 92 - Montrouge (France))

    1994-01-01

    This paper aims at framing the collective decision problem posed by climate change. It shows why it would be inappropriate to handle it within a classical decision-under-uncertainty framework, in which a cost-benefit analysis is carried out using probability distributions on damages and risk-aversion coefficients. A sequential approach to policy making is then proposed as an alternative, in order to account for the inertia of socio-economic dynamics and the value of information. A simple model illustrates the gap between the two approaches; it shows the importance of combining investment in climate research, innovation policies and so-called 'no regret' short-term decisions. Even if quantitatively moderate, these measures have a critical impact on the long-term viability of development: they embed a very high information value, lengthening the time available for learning about potentially major but controversial risks. (author). 21 refs., 3 figs.

  6. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
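
    The quantitative core of folding a flood into a probabilistic risk assessment is a conditional-probability product: sequence frequency = P(flood) × P(safety-system failure | flood) × P(remaining failures). A minimal sketch, with placeholder probabilities only, showing how sequence frequency scales with flood probability:

```python
# Illustrative flood-initiated accident-sequence frequency.
# All probabilities are placeholders, not Reactor Safety Study values.

def sequence_frequency(p_flood_per_yr, p_sys_fail_given_flood, p_remaining):
    """Annual frequency of an accident sequence initiated by a flood."""
    return p_flood_per_yr * p_sys_fail_given_flood * p_remaining

for p_flood in (1e-4, 1e-3, 1e-2):
    f = sequence_frequency(p_flood, 0.5, 1e-2)
    print(f"P(flood) = {p_flood:.0e}/yr -> sequence frequency {f:.1e}/yr")
```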

  7. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower, in order to make amendments to the existing pharmacopoeian monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS), which cause notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeian procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that, in its extracting ability, this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by infusing six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours; the quantity of lipophilic extractives was then determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  8. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved cell module. Also, we investigate theoretically how this effect scales to large-size modules. It is found that the performance increment of halved cell PV modules is even higher for high-efficiency solar cells. After that, the resistive loss of large-size modules with different interconnection schemes is analysed. Finally, factors influencing the performance and cost of industrial halved cell PV modules are discussed.
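
    The halved-cell benefit follows from the quadratic dependence of ribbon loss on current: halving the cell halves its current, so even with twice as many interconnects the total I²R loss is halved. A back-of-envelope check, with illustrative numbers that are not the paper's measured modules:

```python
# Why halved cells cut resistive ribbon loss: loss scales as I^2 * R.
I_full = 9.0        # A, operating current of a full cell (assumed)
R_ribbon = 0.005    # ohm, effective resistance per interconnect ribbon (assumed)

loss_full = I_full**2 * R_ribbon               # one ribbon at full current
loss_halved = 2 * (I_full / 2)**2 * R_ribbon   # twice the ribbons, half the current

print(f"full-cell ribbon loss:   {loss_full*1e3:.0f} mW")
print(f"halved-cell ribbon loss: {loss_halved*1e3:.1f} mW (50% reduction)")
```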

  9. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image, and path obliquity contribute to the signal nonuniformity of mammograms. To make the best use of digitized mammograms for quantitative image analysis, these field nonuniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse-square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. An analytical correction for path obliquity in the breast can then be applied to the images. It was found that the correction reduces the errors associated with field nonuniformity from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study showed that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute less than 2% to the overall corrected signal.
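
    The correction itself is an additive flat-field operation in log space. A synthetic sketch of the idea follows; the falloff field and "tissue" image are invented, and the path-obliquity step is omitted:

```python
# Log-space flat-field correction: adding the normalized phantom image
# (in log space) cancels the multiplicative field nonuniformity.
import numpy as np

rng = np.random.default_rng(7)
field = np.linspace(1.0, 0.7, 256)[None, :] * np.ones((256, 256))  # heel-like falloff
tissue = rng.uniform(0.4, 0.6, (256, 256))                         # "true" signal
mammogram = tissue * field                                         # acquired image

log_phantom = -np.log(field)                 # normalized phantom image, log space
corrected = np.exp(np.log(mammogram) + log_phantom)

print(f"max error before: {np.abs(mammogram - tissue).max():.3f}")
print(f"max error after:  {np.abs(corrected - tissue).max():.2e}")  # ~0
```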

  10. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n = 13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared with idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  11. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to the movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method in which an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets; it can thus be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies of fixed cells and performs favorably in association studies of live cells.
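
    An object-based association measure of this flavor can be sketched with a one-to-one assignment between the two point sets followed by a distance gate. The sketch below uses optimal assignment as a stand-in; it illustrates the idea only and is not the authors' point-pattern-matching algorithm. The channel shift, noise levels and gating distance are assumptions.

```python
# Object-based association via point matching across two channels.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(4)
ch1 = rng.uniform(0, 100, size=(40, 2))          # vesicle centroids, channel 1
shift = np.array([1.5, -0.8])                    # channel misalignment (assumed)
ch2 = ch1 + shift + rng.normal(0, 0.5, (40, 2))  # imperfect counterparts

dist = np.linalg.norm(ch1[:, None, :] - ch2[None, :, :], axis=2)
rows, cols = linear_sum_assignment(dist)         # optimal one-to-one pairing
matched = dist[rows, cols] < 5.0                 # gating distance (assumed)
print(f"association: {matched.mean():.0%} of vesicles matched")
```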

  12. Qualitative and quantitative analysis of plutonium in solid waste drums

    International Nuclear Information System (INIS)

    Anno, Jacques; Escarieux, Emile

    1977-01-01

    An assessment of the results of a study carried out to develop the qualitative and quantitative analysis, by γ spectrometry, of plutonium in solid waste drums is presented. After a reminder of the standards and their implications for the quantities of plutonium to be measured (application to industrial Pu: 20% ²⁴⁰Pu), the equipment used is described. The measurement station is provided with a mechanical system consisting of a rail and a pulley block to bring in the drums, and a pit and a hydraulic jack with a rotating platform. The detection instrumentation consists of a high-volume coaxial Ge(Li) detector with a γ-ray resolution of 2 keV, the associated electronics, and data processing by a 'Plurimat 20' minicomputer. The principles of the identification and measurements are specified and supported by experimental results, namely: determination of the quality of the Pu by measuring the ratio between the γ-ray intensities of the 129 keV line of ²³⁹Pu and the 148 keV line of ²⁴¹Pu; measurement of the ²³⁹Pu mass by estimating the γ-ray counting rate of the 375 keV line against calibration curves obtained with plutonium samples ranging from 32 mg to 80 g; and correction of the results as a function of the source position within the drum and of the plastic-material filling of the drum. The experimental results obtained on 40 solid waste drums are presented along with the error estimates.
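
    The calibration-curve step lends itself to a short numeric sketch: the 375 keV count rate is converted to a mass by interpolating between gravimetric standards. The calibration points below are invented placeholders, not the study's data.

```python
# Estimate 239Pu mass from the 375 keV count rate via a calibration curve.
import numpy as np

cal_mass = np.array([0.032, 0.5, 5.0, 20.0, 80.0])      # g, standards (assumed)
cal_rate = np.array([1.2, 18.0, 170.0, 640.0, 2400.0])  # counts/s at 375 keV (assumed)

measured_rate = 95.0
mass = np.interp(measured_rate, cal_rate, cal_mass)
print(f"estimated 239Pu mass: {mass:.1f} g")
# Position and matrix (plastic filling) corrections would then be
# applied, as described in the record.
```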

  13. A temperature-controlled photoelectrochemical cell for quantitative product analysis

    Science.gov (United States)

    Corson, Elizabeth R.; Creel, Erin B.; Kim, Youngsang; Urban, Jeffrey J.; Kostecki, Robert; McCloskey, Bryan D.

    2018-05-01

    In this study, we describe the design and operation of a temperature-controlled photoelectrochemical cell for analysis of gaseous and liquid products formed at an illuminated working electrode. This cell is specifically designed to quantitatively analyze photoelectrochemical processes that yield multiple gas and liquid products at low current densities and exhibit limiting reactant concentrations that prevent these processes from being studied in traditional single chamber electrolytic cells. The geometry of the cell presented in this paper enables front-illumination of the photoelectrode and maximizes the electrode surface area to electrolyte volume ratio to increase liquid product concentration and hence enhances ex situ spectroscopic sensitivity toward them. Gas is bubbled through the electrolyte in the working electrode chamber during operation to maintain a saturated reactant concentration and to continuously mix the electrolyte. Gaseous products are detected by an in-line gas chromatograph, and liquid products are analyzed ex situ by nuclear magnetic resonance. Cell performance was validated by examining carbon dioxide reduction on a silver foil electrode, showing comparable results both to those reported in the literature and identical experiments performed in a standard parallel-electrode electrochemical cell. To demonstrate a photoelectrochemical application of the cell, CO2 reduction experiments were carried out on a plasmonic nanostructured silver photocathode and showed different product distributions under dark and illuminated conditions.

  14. FFT transformed quantitative EEG analysis of short term memory load.

    Science.gov (United States)

    Singh, Yogesh; Singh, Jayvardhan; Sharma, Ratna; Talwar, Anjana

    2015-07-01

    The EEG is considered a building block of functional signaling in the brain, and the role of EEG oscillations in human information processing has been intensively investigated. The aim was to study the quantitative EEG correlates of short term memory load as assessed through the Sternberg memory test. The study was conducted on 34 healthy male student volunteers. The intervention consisted of the Sternberg memory test, which runs as a version of the Sternberg memory scanning paradigm software on a computer. Electroencephalography (EEG) was recorded from 19 scalp locations according to the 10-20 international system of electrode placement, and the signals were analyzed offline. To overcome the problems of a fixed band system, an individual alpha frequency (IAF)-based frequency band selection method was adopted. The outcome measures were FFT-transformed absolute powers in the six bands at the 19 electrode positions. The Sternberg memory test served as a model of short term memory load. During the memory task, EEG changes were reflected as decreased absolute power in the Upper alpha band at nearly all electrode positions, increased power in the Theta band in the fronto-temporal region, and increased power in the Lower 1 alpha band in the fronto-central region. Lower 2 alpha, Beta and Gamma band power remained unchanged. Short term memory load thus has distinct electroencephalographic correlates resembling a mentally stressed state. This is evident from the decreased power in the Upper alpha band (corresponding to the Alpha band of the traditional EEG system), which is the representative band of a relaxed mental state. The fronto-temporal Theta power changes may reflect the encoding and execution of the memory task.
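
    A minimal sketch of the analysis pipeline follows: FFT-based absolute band power with bands anchored to the IAF. The band offsets follow the common Klimesch-style convention and should be treated as an assumption rather than the study's exact definitions; the signal is synthetic.

```python
# FFT-transformed absolute band power with IAF-anchored bands.
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    freqs = np.fft.rfftfreq(len(signal), d=1.0/fs)
    psd = np.abs(np.fft.rfft(signal))**2 / len(signal)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return psd[band].sum()

fs, iaf = 256, 10.0                     # Hz; IAF estimated per subject
t = np.arange(0, 10, 1.0/fs)
eeg = np.sin(2*np.pi*iaf*t) + 0.5*np.random.default_rng(5).normal(size=t.size)

bands = {"theta":         (iaf - 6, iaf - 4),
         "lower 1 alpha": (iaf - 4, iaf - 2),
         "upper alpha":   (iaf,     iaf + 2)}
for name, (lo, hi) in bands.items():
    print(f"{name:14s} power = {band_power(eeg, fs, lo, hi):.1f}")
```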

  15. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, P; Weeke, B; Loewenstein, H [Rigshospitalet, Copenhagen (Denmark)]

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, and 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A.

  16. Quantitative produced water analysis using mobile 1H NMR

    International Nuclear Information System (INIS)

    Wagner, Lisabeth; Fridjonsson, Einar O; May, Eric F; Stanwix, Paul L; Graham, Brendan F; Carroll, Matthew R J; Johns, Michael L; Kalli, Chris

    2016-01-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile ¹H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable ¹H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference ¹H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography. (paper)
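
    The self-calibrating arithmetic is a simple ratio against the internal reference. The sketch below shows the shape of that calculation; the calibration constant and signal areas are invented placeholders, not values from the paper.

```python
# Internal-reference quantitation: scale the oil/CHCl3 signal-area
# ratio by a single calibration constant determined with standards.
def oil_ppm(area_oil, area_chcl3, k=12.5):
    """k maps the oil/CHCl3 1H area ratio to oil ppm
    (assumed value, fixed once by calibration)."""
    return k * area_oil / area_chcl3

print(f"estimated oil content: {oil_ppm(3.2, 1.6):.1f} ppm")  # 25.0 ppm
```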

  17. Photographers’ Nomenclature Units: A Structural and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Margarita A. Mihailova

    2017-11-01

    Full Text Available Addressing the needs of cross- and intercultural communication as well as the methodology of contrastive research, the paper presents the results of a complex analysis conducted to describe the semantic and pragmatic parameters of nomenclature units denoting photography equipment in the modern Russian informal discourse of professional photographers. The research is exemplified by 34 original nomenclature units and their 34 Russian equivalents used in 6,871 comments posted on the “Клуб.Foto.ru” website in 2015. The structural and quantitative analyses of photographers' nomenclature demonstrate the users' morphological and graphic preferences and indirectly reflect their social and professional values. The corpus-based approach developed by Kast-Aigner (2009: 141) was applied in the study with the aim of identifying the nomenclature units denoting photography equipment and of validating and elaborating the data of the existing corpus. The research also throws light on the problems of professional language development and derivational processes. A perspective for the study lies in research on the broader context of professional nomenclature.

  18. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, and 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  19. Social media in epilepsy: A quantitative and qualitative analysis.

    Science.gov (United States)

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  1. B1-sensitivity analysis of quantitative magnetization transfer imaging.

    Science.gov (United States)

    Boudreau, Mathieu; Stikov, Nikola; Pike, G Bruce

    2018-01-01

    To evaluate the sensitivity of quantitative magnetization transfer (qMT) fitted parameters to B1 inaccuracies, focusing on the difference between two categories of T1 mapping techniques: B1-independent and B1-dependent. The B1-sensitivity of qMT was investigated and compared using two T1 measurement methods: inversion recovery (IR; B1-independent) and variable flip angle (VFA; B1-dependent). The study was separated into four stages: 1) numerical simulations, 2) sensitivity analysis of the Z-spectra, 3) healthy subjects at 3T, and 4) comparison using three different B1 imaging techniques. For typical B1 variations in the brain at 3T (±30%), the simulations resulted in errors of the pool-size ratio (F) ranging from -3% to 7% for VFA and from -40% to >100% for IR, agreeing with the Z-spectra sensitivity analysis. In healthy subjects, pooled whole-brain Pearson correlation coefficients for F (comparing measured double-angle and nominal flip angle B1 maps) were ρ = 0.97/0.81 for VFA/IR. This work describes the B1-sensitivity characteristics of qMT, demonstrating that the sensitivity depends substantially on the B1-dependency of the T1 mapping method. In particular, the pool-size ratio is more robust against B1 inaccuracies if VFA T1 mapping is used, so much so that B1 mapping could be omitted without substantially biasing F. Magn Reson Med 79:276-285, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  2. Quantitative Gait Analysis in Patients with Huntington’s Disease

    Directory of Open Access Journals (Sweden)

    Seon Jong Pyo

    2017-09-01

    Full Text Available Objective Gait disturbance is the main factor contributing to a negative impact on quality of life in patients with Huntington’s disease (HD). Understanding gait features in patients with HD is essential for planning a successful gait strategy. The aim of this study was to investigate temporospatial gait parameters in patients with HD compared with healthy controls. Methods We investigated 7 patients with HD. Diagnosis was confirmed by genetic analysis, and patients were evaluated with the Unified Huntington’s Disease Rating Scale (UHDRS). Gait features were assessed with a gait analyzer. We compared the results of patients with HD to those of 7 age- and sex-matched normal controls. Results Step length and stride length were decreased and the base of support was increased in the HD group compared to the control group. In addition, the coefficients of variability for step and stride length were increased in the HD group. The HD group showed slower walking velocity, an increased stance phase and decreased swing phase in the gait cycle, and a decreased proportion of single support time compared to the control group. Cadence did not differ significantly between groups. Among the UHDRS subscores, total motor score and total behavior score were positively correlated with step length, and total behavior score was positively correlated with walking velocity in patients with HD. Conclusion Increased variability in step and stride length, slower walking velocity, increased stance phase, and decreased swing phase and single support time with preserved cadence suggest that HD gait patterns are slow, ataxic and ineffective. This study suggests that quantitative gait analysis is needed to assess gait problems in HD.
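
    The variability metric reported here is the coefficient of variation, CV = SD/mean × 100%. A two-line sketch with invented step-length series illustrates the calculation:

```python
# Coefficient of variation of step length; values are illustrative only.
import numpy as np

hd_steps = np.array([38, 45, 31, 50, 36, 42, 29.0])       # cm
control_steps = np.array([60, 62, 59, 61, 63, 60, 61.0])  # cm

for label, steps in [("HD", hd_steps), ("control", control_steps)]:
    cv = steps.std(ddof=1) / steps.mean() * 100
    print(f"{label:8s} mean {steps.mean():.1f} cm, CV {cv:.1f}%")
```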

  3. Structural reliability analysis applied to pipeline risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom)]; Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)]

    2009-07-01

    Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss of containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS have used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in parameters to mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)
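
    The limit-state evaluation at the heart of SRA, the probability that load exceeds resistance, can be sketched directly by Monte Carlo. The limit state and distributions below are illustrative placeholders, not GL or PETROBRAS models.

```python
# Minimal structural-reliability sketch: P(load >= resistance)
# for a single failure mechanism, estimated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

resistance = rng.normal(15.0, 1.5, n)   # MPa, e.g. remaining-wall burst pressure
load = rng.normal(10.0, 0.8, n)         # MPa, e.g. operating pressure

pof = np.mean(load >= resistance)       # probability of failure
print(f"estimated PoF = {pof:.2e} (illustrative)")
```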

  4. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

    Full Text Available The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
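
    Using Shannon entropy as an uncertainty index can be sketched in a few lines: H = -Σ p·log₂(p) over the outcome probabilities of an activity, so a risk event that flattens the distribution raises the index. The probabilities below are invented for illustration, and the sketch is not the paper's full formulation.

```python
# Entropy-based uncertainty index for a project activity.
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability outcomes are skipped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

baseline = [0.8, 0.15, 0.05]   # duration outcomes: on-time / late / very late
with_risk = [0.5, 0.3, 0.2]    # same activity after a risk event

print(f"uncertainty index, baseline:  {entropy(baseline):.2f} bits")
print(f"uncertainty index, with risk: {entropy(with_risk):.2f} bits")
```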

  5. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be provided in rolled-up form to support high-level strategy decisions.

  6. Quantitative microbial risk assessment to estimate the health risk from exposure to noroviruses in polluted surface water in South Africa.

    Science.gov (United States)

    Van Abel, Nicole; Mans, Janet; Taylor, Maureen B

    2017-10-01

    This study assessed the risks posed by noroviruses (NoVs) in surface water used for drinking, domestic, and recreational purposes in South Africa (SA), using a quantitative microbial risk assessment (QMRA) methodology that took a probabilistic approach coupling an exposure assessment with four dose-response models to account for uncertainty. Water samples from three rivers were found to be contaminated with NoV GI (80-1,900 gc/L) and GII (420-9,760 gc/L), leading to risk estimates that were lower for GI than GII. The volume of water consumed and the probabilities of infection were lower for domestic (2.91 × 10⁻⁸ to 5.19 × 10⁻¹) than for drinking water exposures (1.04 × 10⁻⁵ to 7.24 × 10⁻¹). The annual probabilities of illness varied depending on the type of recreational water exposure, with boating (3.91 × 10⁻⁶ to 5.43 × 10⁻¹) and swimming (6.20 × 10⁻⁶ to 6.42 × 10⁻¹) being slightly greater than playing next to or in the river (5.30 × 10⁻⁷ to 5.48 × 10⁻¹). The QMRA was sensitive to the choice of dose-response model. The risk of NoV infection or illness from contaminated surface water is extremely high in SA, especially for lower socioeconomic individuals, but is similar to risks reported in the limited international studies available.
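
    The exposure-to-annual-risk chain in such a QMRA can be sketched compactly: dose per event, a dose-response model for per-event infection probability, then aggregation over the year. The dose-response form, its parameter, and the exposure values below are illustrative assumptions, not the study's fitted inputs.

```python
# Per-event to annual risk in a QMRA chain. Illustrative parameters only.
import numpy as np

def p_infection(dose, r=0.005):
    """Exponential dose-response model (assumed r)."""
    return 1.0 - np.exp(-r * dose)

def annual_risk(p_event, exposures_per_year):
    """Annual probability from independent exposure events."""
    return 1.0 - (1.0 - p_event) ** exposures_per_year

dose = 100 * 0.05            # gc/L x litres ingested per swim (assumed)
p_event = p_infection(dose)
print(f"per-event infection risk: {p_event:.2e}")
print(f"annual risk (20 events):  {annual_risk(p_event, 20):.2e}")
```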

  7. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years, owing to the increasingly severe disasters that have occurred in the process industries. Because of the computational complexity involved, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. Moreover, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations; this is quite different from the real situation in a chemical process plant. Such models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. A more rigorous model, such as a computational fluid dynamics (CFD) model, can overcome these limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.

  8. Quantitative analysis of LISA pathfinder test-mass noise

    International Nuclear Information System (INIS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-01-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3 × 10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: (i) excess noise detection and (ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on the test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise
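    A minimal sketch of the Kolmogorov-Smirnov idea for excess noise detection (not the authors' estimator; the simulated data and normalization below are assumptions): under Gaussian noise, periodogram bins are exponentially distributed about the model PSD, so excess noise shows up as a departure from the exponential law:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated detrended time series: white Gaussian noise (the null hypothesis).
x = rng.normal(0.0, 1.0, 4096)

# Periodogram bins of Gaussian noise follow an exponential distribution
# around the expected PSD level (chi-squared with 2 degrees of freedom).
spec = np.abs(np.fft.rfft(x)[1:-1]) ** 2
norm_spec = spec / spec.mean()   # in practice, divide by the *model* PSD

# One-sample KS test against the unit-mean exponential distribution;
# a small p-value would flag excess noise.
stat, pvalue = stats.kstest(norm_spec, "expon")
print(f"KS statistic = {stat:.3f}, p = {pvalue:.3f}")
```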

  9. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  10. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    Science.gov (United States)

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms that are transmitted by the fecal-oral route, are zoonotic, and are prevalent in all socioeconomic segments, with greater emphasis in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis for Potam dwellers consuming drinking water from a communal well. To achieve this goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by the information collection rule method, (b) exposure assessment for healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate mean annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish educational and treatment programs to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico.
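    A minimal sketch of the exponential dose-response and annual-risk calculation used in QMRA studies of this kind (the daily dose below is invented; r ≈ 0.004 is a value often quoted for Cryptosporidium in the QMRA literature, used here only for illustration):

```python
import math

def p_infection(dose, r):
    # Exponential dose-response model.
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily, days=365):
    # Independent daily exposures compound over a year.
    return 1.0 - (1.0 - p_daily) ** days

p_daily = p_infection(dose=3.0, r=0.004)            # illustrative values
print(f"daily risk  = {p_daily:.4f}")               # ~0.012
print(f"annual risk = {annual_risk(p_daily):.2f}")  # ~0.99
# Even a modest daily infection probability compounds to a near-certain
# annual risk, which is how mean annual risks close to 1 arise.
```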

  11. Revisiting support optimization at the Driskos tunnel using a quantitative risk approach

    Directory of Open Access Journals (Sweden)

    J. Connor Langford

    2016-04-01

    Full Text Available With the scale and cost of geotechnical engineering projects increasing rapidly over the past few decades, there is a clear need for the careful consideration of calculated risks in design. While risk is typically dealt with subjectively through the use of conservative design parameters, with the advent of reliability-based methods this no longer needs to be the case. Instead, a quantitative risk approach can be considered that incorporates uncertainty in ground conditions directly into the design process to determine the variable ground response and support loads. This allows for the optimization of support on the basis of both worker safety and economic risk. This paper presents the application of such an approach to review the design of the initial lining system along a section of the Driskos twin tunnels, part of the Egnatia Odos highway in northern Greece. Along this section of tunnel, weak rock masses as well as high in situ stress conditions were encountered, which led to excessive deformations and failure of the as-built temporary support. Monitoring data were used to validate the rock mass parameters selected in this area, and a risk approach was used to determine, in hindsight, the most appropriate support category with respect to the cost of installation and the expected cost of failure. Different construction sequences were also considered in the context of both convenience and risk cost.
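    A minimal sketch of the risk-cost idea behind such support optimization (all loads, capacities, and costs below are invented, not the Driskos values): sample the variable ground response, estimate each support category's failure probability, and pick the category with the lowest installation-plus-expected-failure cost:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical variable ground response: support load in MPa.
load = rng.lognormal(mean=np.log(8.0), sigma=0.35, size=N)

# Candidate support categories: (capacity in MPa, installation cost per m).
candidates = {"light": (10.0, 1.0), "medium": (14.0, 1.6), "heavy": (18.0, 2.4)}
cost_of_failure = 50.0  # consequence cost per m if the support fails

for name, (capacity, install) in candidates.items():
    p_fail = np.mean(load > capacity)
    expected_cost = install + p_fail * cost_of_failure
    print(f"{name:6s} P(failure)={p_fail:.3f} expected cost={expected_cost:.2f}")
# The optimal category minimizes total expected cost, not installation cost.
```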

  12. Risk of vertebral insufficiency fractures in relation to compressive strength predicted by quantitative computed tomography

    International Nuclear Information System (INIS)

    Biggemann, M.; Hilweg, D.; Seidel, S.; Horst, M.; Brinckmann, P.

    1991-01-01

    Vertebral insufficiency fractures may result from excessive loading of normal spines and routine loading of osteoporotic spines. Fractures occur when the mechanical load exceeds the vertebral compressive strength, i.e., the maximum load a vertebra can tolerate. Vertebral compressive strength is determined by trabecular bone density and the size of the end-plate area. Both parameters can be measured non-invasively by quantitative computed tomography (QCT). In 75 patients, the compressive strength (i.e., trabecular bone density and end-plate area) of the vertebra L3 was determined using QCT. In addition, conventional radiographs of the spines were analysed for the prevalence of insufficiency fractures in each case. By relating fracture prevalence to strength, three fracture risk groups were found, ranging from a high-risk group with L3 strength values below 3 kN to a low-risk group with L3 strength values above 5 kN and a fracture risk near 0 percent. Biomechanical measurements and model calculations indicate that spinal loads of 3 to 4 kN at L3/4 will be common in everyday activities. These data and the results described above suggest that spines with strength values of L3 < 3 kN are at an extremely high risk of insufficiency fractures in daily life. Advantages of fracture risk assessment by strength determination over risk estimation based on clinically used trabecular bone density measurements are discussed. (author). 18 refs.; 4 figs

  13. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative psychological research on personality carried out by the 'Prognoz' Laboratory and in Taiwan, and presents the primary results of research on psychological personality assessment of Chinese Nuclear Power Plant (NPP) operators, based on an MMPI survey. The main contents of quantitative psychological personality research in Chinese NPPs are presented, and the need to carry out psychological selection and training in the nuclear industry is emphasized

  14. Quantitative analysis of intraoperative communication in open and laparoscopic surgery.

    Science.gov (United States)

    Sevdalis, Nick; Wong, Helen W L; Arora, Sonal; Nagpal, Kamal; Healey, Andrew; Hanna, George B; Vincent, Charles A

    2012-10-01

    Communication is important for patient safety in the operating room (OR). Several studies have assessed OR communications qualitatively or have focused on communication in crisis situations. This study used prospective, quantitative observation based on well-established communication theory to assess similarities and differences in communication patterns between open and laparoscopic surgery. Based on communication theory, a standardized proforma was developed for assessment in the OR via real-time observation of communication types, their purpose, their content, and their initiators/recipients. Data were collected prospectively in real time in the OR for 20 open and 20 laparoscopic inguinal hernia repairs. Assessors were trained and calibrated, and their reliability was established statistically. During 1,884 min of operative time, 4,227 communications were observed and analyzed (2,043 laparoscopic vs 2,184 open communications). The mean operative duration (laparoscopic, 48 min vs open, 47 min), mean communication frequency (laparoscopic, 102 communications/procedure vs open, 109 communications/procedure), and mean communication rate (laparoscopic, 2.13 communications/min vs open, 2.23 communications/min) did not differ significantly across laparoscopic and open procedures. Communications were most likely to be initiated by surgeons (80-81%), to be received by either other surgeons (46-50%) or OR nurses (38-40%), to be associated with equipment/procedural issues (39-47%), and to provide direction for the OR team (38-46%) in open and laparoscopic cases. Moreover, communications in laparoscopic cases were significantly more equipment related (laparoscopic, 47% vs open, 39%) and aimed significantly more at providing direction (laparoscopic, 46% vs open, 38%) and at consulting (laparoscopic, 17% vs open, 12%) than at sharing information (laparoscopic, 17% vs open, 31%). High volumes of communication were found in both laparoscopic and open cases during a relatively low-risk

  15. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence of the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  16. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, both qualitatively and quantitatively, the variables affecting the determination of vegetable sale prices that remain constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (from low to high daily consumption); and (2) vegetable age (how long it lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with age (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the items last only one day after being ordered and must then be discarded, whereas a+1 means they last longer than a day, as with beet, white radish, and string beans. The shelf life refers to how long a vegetable is kept on a supermarket shelf, in line with its age. Following the cost-plus pricing method with a full-costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is correlated with the vegetable characteristics.
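    A minimal sketch of the cost-plus (full costing) rule described above, with a holding cost applied only to the non-leafy category (all figures below are invented, not the study's data):

```python
def sale_price(production, non_production, markup,
               holding_per_day=0.0, shelf_days=0):
    """Cost-plus pricing: full cost times (1 + markup).
    Leafy vegetables (age a = 0) carry no holding cost; non-leafy
    vegetables accrue holding cost over their shelf life."""
    full_cost = production + non_production + holding_per_day * shelf_days
    return full_cost * (1.0 + markup)

# Illustrative numbers only.
print(sale_price(2000, 500, markup=0.30))                       # spinach-type
print(sale_price(2000, 500, markup=0.30,
                 holding_per_day=25, shelf_days=4))             # beet-type
```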

  17. A methodology for the quantitative risk assessment of major accidents triggered by seismic events

    International Nuclear Information System (INIS)

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-01-01

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both by raising the overall expected frequency of single scenarios and by causing specific severe scenarios that simultaneously involve several plant units
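    A minimal sketch of the fragility-curve step (the hazard bins and fragility parameters below are invented): convolving seismic event frequencies with a lognormal fragility curve gives the expected annual frequency of equipment damage, which then feeds the event trees for the follow-up scenarios:

```python
from math import log
from scipy.stats import norm

def fragility(pga, median, beta):
    """Lognormal fragility curve: P(damage | peak ground acceleration)."""
    return norm.cdf(log(pga / median) / beta)

seismic_events = [  # (annual frequency, PGA in g) - illustrative hazard bins
    (1e-2, 0.10),
    (1e-3, 0.25),
    (1e-4, 0.50),
]

# Hypothetical storage tank: median capacity 0.30 g, lognormal beta 0.5.
annual_damage_freq = sum(f * fragility(pga, 0.30, 0.5)
                         for f, pga in seismic_events)
print(f"expected damage frequency: {annual_damage_freq:.2e} /year")
```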

  18. Streamlining project delivery through risk analysis.

    Science.gov (United States)

    2015-08-01

    Project delivery is a significant area of concern and is subject to several risks throughout the Plan Development Process (PDP). These risks are attributed to major areas of project development, such as environmental analysis, right-of-way (ROW) acqu...

  19. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  20. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Science.gov (United States)

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and

  1. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  2. Communicating quantitative risks and benefits in promotional prescription drug labeling or print advertising.

    Science.gov (United States)

    West, Suzanne L; Squiers, Linda B; McCormack, Lauren; Southwell, Brian G; Brouwer, Emily S; Ashok, Mahima; Lux, Linda; Boudewyns, Vanessa; O'Donoghue, Amie; Sullivan, Helen W

    2013-05-01

    Under the Food, Drug, and Cosmetic Act, all promotional materials for prescription drugs must strike a fair balance in presentation of risks and benefits. How to best present this information is not clear. We sought to determine if the presentation of quantitative risk and benefit information in drug advertising and labeling influences consumers', patients', and clinicians' information processing, knowledge, and behavior by assessing available empirical evidence. We used PubMed for a literature search, limiting to articles published in English from 1990 forward. Two reviewers independently reviewed the titles and abstracts for inclusion, after which we reviewed the full texts to determine if they communicated risk/benefit information either: (i) numerically (e.g., percent) versus non-numerically (e.g., using text such as "increased risk") or (ii) numerically using different formats (e.g., "25% of patients", "one in four patients", or use of pictographs). We abstracted information from included articles into standardized evidence tables. The research team identified a total of 674 relevant publications, of which 52 met our inclusion criteria. Of these, 37 focused on drugs. Presenting numeric information appears to improve understanding of risks and benefits relative to non-numeric presentation; presenting both numeric and non-numeric information when possible may be best practice. No single specific format or graphical approach emerged as consistently superior. Numeracy and health literacy also deserve more empirical attention as moderators. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Quantitative Deficits of Preschool Children at Risk for Mathematical Learning Disability

    Directory of Open Access Journals (Sweden)

    Felicia W. Chu

    2013-05-01

    Full Text Available The study tested the hypothesis that acuity of the potentially inherent approximate number system (ANS) contributes to risk of mathematical learning disability (MLD). Sixty-eight (35 boys) preschoolers at risk for school failure were assessed on a battery of quantitative tasks, and on intelligence, executive control, preliteracy skills, and parental education. Mathematics achievement scores at the end of one year of preschool indicated that 34 of these children were at high risk for MLD. Relative to the 34 typically achieving children, the at-risk children were less accurate on the ANS task, and a one standard deviation deficit on this task resulted in a 2.4-fold increase in the odds of MLD status. The at-risk children also had a poor understanding of ordinal relations and were slower to learn Arabic numerals, number words, and their cardinal values. Poor performance on these tasks resulted in 3.6- to 4.5-fold increases in the odds of MLD status. The results provide some support for the ANS hypothesis but also suggest these deficits are not the primary source of poor mathematics learning.

  4. Farm to Fork Quantitative Risk Assessment of Listeria monocytogenes Contamination in Raw and Pasteurized Milk Cheese in Ireland.

    Science.gov (United States)

    Tiwari, Uma; Cummins, Enda; Valero, Antonio; Walsh, Des; Dalmasso, Marion; Jordan, Kieran; Duffy, Geraldine

    2015-06-01

    The objective of this study was to model and quantify the level of Listeria monocytogenes in raw milk cheese (RMc) and pasteurized milk cheese (PMc) from farm to fork, using a Bayesian inference approach combined with a quantitative risk assessment. The modeling approach included a prediction of contamination arising from the farm environment as well as from cross-contamination within the cheese-processing facility through storage and subsequent human exposure. The model predicted a high concentration of L. monocytogenes in contaminated RMc (mean 2.19 log10 CFU/g) compared to PMc (mean -1.73 log10 CFU/g). The mean probabilities of illness for low-risk (P1) and high-risk (P2, e.g., immunocompromised) adult Irish consumers following exposure to contaminated cheese were 7 × 10⁻⁸ (P1) and 9 × 10⁻⁴ (P2) for RMc, and 7 × 10⁻¹⁰ (P1) and 8 × 10⁻⁶ (P2) for PMc, respectively. In addition, the model was used to evaluate performance objectives at various stages, namely the cheese-making and ripening stages, and to set a food safety objective at the time of consumption. A scenario analysis predicted various probabilities of L. monocytogenes contamination along the cheese-processing chain for both RMc and PMc. The sensitivity analysis showed that the critical factors for both cheeses were the serving size of the cheese, the storage time, and the temperature at the distribution stage. The developed model will allow food processors and policymakers to identify possible routes of contamination along the cheese-processing chain and to reduce the risk posed to human health. © 2015 Society for Risk Analysis.

  5. A Quantitative Approach to Credit Risk Management in the Underwriting Process for the Retail Portfolio

    Directory of Open Access Journals (Sweden)

    Andreea Costea

    2017-03-01

    Full Text Available The core of this paper is a mathematical approach to credit risk management, based on a scorecard model used in the bank's underwriting process. The main purpose of this paper is to present how to develop, validate, and apply a rating model in practice. Using 21,568 loan applications provided by one of the largest banks in Romania, a scorecard is built for underwriting purposes. The customer data used in the modeling are based on socio-demographic characteristics. The model is developed according to a set of statistical methods for parameter estimation. A real-life example of how to use such a model in the strategic decisions of a bank is presented. The cut-off score for the acceptance of applications is calibrated to the potential risk appetite of the four main banks in Romania. From an evaluative perspective, this paper is compatible with an exploratory approach to quantitative research methodology.
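    A minimal sketch of such a scorecard workflow (synthetic data; the score scaling and cut-off are conventions assumed here, not the paper's calibration): fit a logistic regression for probability of default, map it to a score, and apply a cut-off reflecting the risk appetite:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical socio-demographic predictors: age, income, years employed.
X = np.column_stack([
    rng.integers(18, 70, n),
    rng.normal(3000, 900, n),
    rng.integers(0, 30, n),
])
# Synthetic default flag for illustration only.
logit = -2.0 - 0.02 * (X[:, 0] - 40) - 0.0005 * (X[:, 1] - 3000)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
pd_hat = model.predict_proba(X)[:, 1]     # estimated probability of default

# Points-to-double-odds scaling; cut-off set from the bank's risk appetite.
score = (600 - 50 * np.log2(pd_hat / (1 - pd_hat))).round()
cutoff = 620
print(f"acceptance rate at cut-off {cutoff}: {(score >= cutoff).mean():.1%}")
```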

  6. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  7. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Passaro, Bruno Martins

    2011-01-01

    The linear accelerators represent the most important, practical, and versatile source of ionizing radiation in radiotherapy. Their functional characteristics influence the geometric and dosimetric accuracy of the therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures, or mechanical breakdowns, or may drift due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on the quality control the institutions use to monitor deviations in beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a radiotherapy quality control program. Over a period of approximately four years, the average calibration factors were (0.998 ± 0.012) for the Clinac 600C and (0.996 ± 0.014) for the Clinac 6EX, and (1.008 ± 0.009) and (1.006 ± 0.010) for the Clinac 2100CD at 6 MV and 15 MV, respectively. Statistical analysis of the three linear accelerators found that the coefficients of variation of the calibration factors were below 2%, which shows consistency in the data. From the normal distribution of the calibration factors, it is expected that for the Clinac 600C and Clinac 2100CD more than 90% of the values lie within acceptable limits according to TG-142, while around 85% is expected for the Clinac 6EX, since this accelerator underwent several exchanges of components. The TPR20,10 values of the three accelerators are practically constant and within acceptable limits according to TG-142. It can be concluded that a detailed quantitative study of the accelerator calibration factor data and TPR20,10 is extremely useful in a quality assurance program. (author)
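    A minimal sketch of the probability-within-tolerance calculation implied above, using the reported means and standard deviations (the ±2% action limits are an assumption for illustration; TG-142 defines the applicable tolerances):

```python
from scipy.stats import norm

def prob_within_limits(mean, sd, lo=0.98, hi=1.02):
    """Fraction of calibration factors expected inside the assumed limits."""
    return norm.cdf(hi, mean, sd) - norm.cdf(lo, mean, sd)

for name, mean, sd in [("Clinac 600C", 0.998, 0.012),
                       ("Clinac 6EX", 0.996, 0.014),
                       ("Clinac 2100CD 6 MV", 1.008, 0.009)]:
    print(f"{name:18s} P(within limits) = {prob_within_limits(mean, sd):.1%}")
# With these inputs the 600C and 2100CD come out above 90% and the 6EX
# near 85%, consistent with the figures quoted in the abstract.
```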

  8. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, California

    Science.gov (United States)

    Stock, G. M.; Luco, N.; Collins, B. D.; Harp, E.; Reichenbach, P.; Frankel, K. L.

    2011-12-01

    Rock falls are a considerable hazard in Yosemite Valley, California, with more than 835 rock falls and other slope movements documented since 1857. Rock falls thus pose potentially significant risk to the nearly four million annual visitors to Yosemite National Park. Building on earlier hazard assessment work by the U.S. Geological Survey, we performed a quantitative rock-fall hazard and risk assessment for Yosemite Valley. This work was aided by several new data sets, including precise Geographic Information System (GIS) maps of rock-fall deposits, airborne and terrestrial LiDAR-based point cloud data and digital elevation models, and numerical ages of talus deposits. Using the Global Positioning System (GPS), we mapped the positions of over 500 boulders on the valley floor and measured their distance relative to the mapped base of the talus. Statistical analyses of these data yielded an initial hazard zone based on the 90th percentile distance of rock-fall boulders beyond the talus edge. This distance was subsequently scaled (either inward or outward from the 90th percentile line) based on rock-fall frequency information derived from a combination of cosmogenic beryllium-10 exposure dating of boulders beyond the edge of the talus and computer model simulations of rock-fall runout. The scaled distances provide the basis for a new hazard zone on the floor of Yosemite Valley. Once this zone was delineated, we assembled visitor, employee, and resident use data for each structure within the hazard zone to quantitatively assess risk exposure. Our results identify areas within the new hazard zone that may warrant more detailed study, for example of rock-fall susceptibility, which can be assessed through examination of high-resolution photographs, structural measurements on the cliffs, and empirical calculations derived from LiDAR point cloud data. This hazard and risk information is used to inform the placement of existing and potential future infrastructure in Yosemite Valley.
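    A minimal sketch of the 90th-percentile step (the boulder distances below are invented):

```python
import numpy as np

# Hypothetical distances (m) of mapped boulders beyond the talus edge.
distances = np.array([2.0, 5.5, 8.1, 12.4, 3.3, 20.7, 9.9, 15.2, 6.8, 28.4])

# Initial hazard line at the 90th percentile runout distance, later
# scaled inward or outward using rock-fall frequency information.
print(f"90th percentile distance: {np.percentile(distances, 90):.1f} m")
```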

  9. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  10. Predictive value of quantitative dipyridamole-thallium scintigraphy in assessing cardiovascular risk after vascular surgery in diabetes mellitus

    International Nuclear Information System (INIS)

    Lane, S.E.; Lewis, S.M.; Pippin, J.J.; Kosinski, E.J.; Campbell, D.; Nesto, R.W.; Hill, T.

    1989-01-01

    Cardiac complications represent a major risk to patients undergoing vascular surgery. Diabetic patients may be particularly prone to such complications due to the high incidence of concomitant coronary artery disease, the severity of which may be clinically unrecognized. Attempts to stratify groups by clinical criteria have been useful but lack the predictive value of currently used noninvasive techniques such as dipyridamole-thallium scintigraphy. One hundred one diabetic patients were evaluated with dipyridamole-thallium scintigraphy before undergoing vascular surgery. The incidence of thallium abnormalities was high (80%) and did not correlate with clinical markers of coronary disease. Even in a subgroup of patients with no overt clinical evidence of underlying heart disease, thallium abnormalities were present in 59%. Cardiovascular complications, however, occurred in only 11% of all patients. Statistically significant prediction of risk was not achieved with simple assessment of thallium results as normal or abnormal. Quantification of total number of reversible defects, as well as assessment of ischemia in the distribution of the left anterior descending coronary artery was required for optimum predictive accuracy. The prevalence of dipyridamole-thallium abnormalities in a diabetic population is much higher than that reported in nondiabetic patients and cannot be predicted by usual clinical indicators of heart disease. In addition, cardiovascular risk of vascular surgery can be optimally assessed by quantitative analysis of dipyridamole-thallium scintigraphy and identification of high- and low-risk subgroups

  11. Quantitative analysis of DNA methylation in chronic lymphocytic leukemia patients.

    Science.gov (United States)

    Lyko, Frank; Stach, Dirk; Brenner, Axel; Stilgenbauer, Stephan; Döhner, Hartmut; Wirtz, Michaela; Wiessler, Manfred; Schmitz, Oliver J

    2004-06-01

    Changes in the genomic DNA methylation level have been found to be closely associated with tumorigenesis. In order to analyze the relation of aberrant DNA methylation to clinical and biological risk factors, we determined the cytosine methylation level of 81 patients diagnosed with chronic lymphocytic leukemia (CLL). The analysis was based on DNA hydrolysis followed by derivatization of the 2'-deoxyribonucleoside-3'-monophosphates with BODIPY FL EDA. Derivatives were separated by micellar electrokinetic chromatography, and laser-induced fluorescence was used for detection. We analyzed potential correlations between DNA methylation levels and numerous patient parameters, including clinical observations and biological data. As a result, we observed a significant correlation with the immunoglobulin variable heavy chain gene (VH) mutation status. This factor has been repeatedly proposed as a reliable prognostic marker for CLL, which suggests that the methylation level might be a valuable factor in determining the prognostic outcome of CLL. We are now in the process of refining our method to broaden its application potential. In this context, we show here that the oxidation of the fluorescence marker in the samples and the evaporation of methanol in the electrolytes can be prevented by a film of paraffin oil. In summary, our results establish capillary electrophoresis as a valuable tool for analyzing the DNA methylation status of clinical samples.

  12. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  13. Quantitative analysis of soluble elements in environmental waters by PIXE

    International Nuclear Information System (INIS)

    Niizeki, T.; Kawasaki, K.; Adachi, M.; Tsuji, M.; Hattori, T.

    1999-01-01

    We have started PIXE research for environmental science at the Van de Graaff accelerator facility of the Tokyo Institute of Technology. Quantitative measurements of soluble fractions in river waters have been carried out using the preconcentration method developed at Tohoku University. We show that this PIXE target preparation can also be applied to waste water samples. (author)

  14. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...

  15. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  16. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    SAM

    2014-05-14

    African Journal of Biotechnology. Full Length ... quantitative trait loci (QTLs) on chromosomes 1, 6, 7 and 20 in ... Protein and fat percent, as contents of milk, are high-priority criteria for financial aims and selection programs ...

  17. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  18. Quantitative assessments of indoor air pollution and the risk of childhood acute leukemia in Shanghai

    International Nuclear Information System (INIS)

    Gao, Yu; Zhang, Yan; Kamijima, Michihiro; Sakai, Kiyoshi; Khalequzzaman, Md; Nakajima, Tamie; Shi, Rong; Wang, Xiaojin; Chen, Didi; Ji, Xiaofan; Han, Kaiyi; Tian, Ying

    2014-01-01

    We investigated the association between indoor air pollutants and childhood acute leukemia (AL). A total of 105 newly diagnosed cases and 105 1:1 gender-, age-, and hospital-matched controls were included. Measurements of indoor pollutants (including nitrogen dioxide (NO2) and 17 types of volatile organic compounds (VOCs)) were taken with diffusive samplers for 64 pairs of cases and controls. Higher concentrations of NO2 and almost half of the VOCs were observed in the cases than in the controls and were associated with an increased risk of childhood AL. The use of synthetic materials for wall decoration and furniture in the bedroom was related to the risk of childhood AL. Renovating the house in the last 5 years, changing furniture in the last 5 years, closing the doors and windows overnight in the winter and/or summer, paternal smoking history, and outdoor pollutants affected VOC concentrations. Our results support the association between childhood AL and indoor air pollution. - Highlights: • We present the first assessment of the effects of indoor air pollution on childhood AL in China. • Indoor air pollutants were assessed by questionnaire and quantitative measurements. • NO2 and 17 types of VOCs were measured in the bedrooms of both cases and controls. • Higher concentrations of indoor air pollutants increased the risk of childhood AL. • Indoor behavioral factors and outdoor pollution might affect indoor air pollution. - Higher concentrations of indoor air pollutants were related to an elevated risk of childhood AL

  19. Risk analysis and safety rationale

    International Nuclear Information System (INIS)

    Bengtsson, G.

    1989-01-01

    Decision making with respect to safety is becoming more and more complex. The risk involved must be taken into account together with numerous other factors, such as benefits, uncertainties, and public perception. Can the decision maker be aided by some kind of system, general rules of thumb, or a broader perspective on similar decisions? This question has been addressed in a joint Nordic project relating to nuclear power. Modern techniques for risk assessment and management have been studied, and parallels drawn to such areas as offshore safety and the management of toxic chemicals in the environment. The report summarises the findings of five major technical reports published in the NORD series. The topics include developments, uncertainties and limitations in probabilistic safety assessments, negligible risks, risk-cost trade-offs, optimisation of nuclear safety and radiation protection, and the role of risks in the decision-making process. (author) 84 refs

  20. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    O' Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    2017-05-01

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.

  1. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance.

    Science.gov (United States)

    O'Daniel, Jennifer C; Yin, Fang-Fang

    2017-05-01

    To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number. Copyright © 2017 Elsevier Inc. All rights reserved.
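    A minimal sketch of the ranking idea common to both records above (the occurrence rates and severity scores below are invented): combine each QA test's failure occurrence rate with its failure severity into a risk priority, and test higher-priority items more frequently:

```python
# Occurrence and severity values are illustrative, not the paper's data.
tests = {
    # name: (failures per year, severity score 1-10)
    "output":            (6.0, 6),
    "lasers":            (5.0, 4),
    "imaging isocenter": (1.5, 9),
    "ODI":               (0.8, 3),
}

# Higher risk priority -> test more frequently.
ranked = sorted(tests.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (occ, sev) in ranked:
    print(f"{name:18s} risk priority = {occ * sev:.1f}")
```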

  2. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples that might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where a point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  3. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  4. Risk Analysis of Telecom Enterprise Financing

    Institute of Scientific and Technical Information of China (English)

    YU Hua; SHU Hua-ying

    2005-01-01

    This paper investigates the causes of telecom enterprises' financing risks and methods for estimating them. Multi-mode financing gives telecom enterprises the flexibility to raise capital and to profit from the corresponding projects, but potential risks accompany each of these financing modes. After analyzing the categories and causes of telecom enterprises' financing risks, a method based on the Analytic Hierarchy Process (AHP) is put forward for estimating the financing risk. The author illustrates the approach with an example analysis, in order to provide ideas and a basis for telecom enterprises' financing decision-making.
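    A minimal sketch of the AHP step (the risk factors and pairwise judgments below are invented): derive priority weights from the principal eigenvector of a pairwise comparison matrix and check the consistency ratio:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical financing risks
# (e.g., interest-rate, exchange-rate, credit), on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority vector

# Consistency index and ratio (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), " CR:", round(ci / 0.58, 3))
```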

  5. Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.

    Science.gov (United States)

    Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K

    2017-02-01

    Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
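    A minimal sketch of how interventions enter such a model (all distributions and parameters below are invented; the published model is far more detailed): represent each intervention as a log reduction of the dose and compare predicted annual illness cases against the baseline:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000
servings_per_year = 1e9          # illustrative, not the model's consumption data

# Baseline contamination per serving (log10 CFU); values invented.
log_dose = rng.normal(-2.0, 1.5, N)
r = 0.001                        # illustrative exponential dose-response parameter

def annual_cases(extra_log_reduction=0.0):
    p_ill = 1 - np.exp(-r * 10.0 ** (log_dose - extra_log_reduction))
    return p_ill.mean() * servings_per_year

base = annual_cases()
for name, lr in [("chlorine wash", 1.5), ("irradiation", 3.0)]:
    print(f"{name:14s} fold reduction = {base / annual_cases(lr):.1f}")
```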

  6. PRA and Risk Informed Analysis

    International Nuclear Information System (INIS)

    Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.

    2006-01-01

    The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk-based approach into Section XI, which covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk-based approach requires the application of probabilistic risk assessments (PRAs). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk-based application. The paper describes the PRA standard, Section XI applications of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. An important consideration is the special methods (surrogate components) used to overcome the lack of PRA treatment of passive components. The approach allows calculation of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of those systems. The paper relates the explicit risk-based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk-based programs

  7. Advances in Risk Analysis with Big Data.

    Science.gov (United States)

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  8. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromotography of amino acids by A spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. To date, paper chromatography has been used mainly as a separation tool, and its use in quantifying the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography was carried out by methods that depend on using relative spot-area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity: no complicated equipment or procedures are necessary

  9. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    Science.gov (United States)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The yogurt production process is very susceptible to failure because it involves bacteria and fermentation. For an industry, such risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify the risks in detail, help prevent them, and determine their handling, so that the risks can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment variables, raw material variables, and process variables. The critical risk among them is a lack of an aseptic process, specifically when the yogurt starter is damaged by contamination with fungus or other bacteria or when equipment sanitation is insufficient. The quantitative FTA results showed that the highest probability is that of the lack of an aseptic process, at 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and equipment sanitation using hot water immersion.
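
    The quantitative FTA step can be illustrated with a short sketch. For independent basic events, an OR gate gives P = 1 - (1 - p1)(1 - p2)... and an AND gate multiplies the probabilities; the event names and probabilities below are hypothetical, chosen only to land near the order of magnitude reported above.

        # Hedged FTA sketch; basic-event probabilities are hypothetical.
        from functools import reduce

        def or_gate(probs):
            """Top-event probability for independent events under an OR gate."""
            return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

        def and_gate(probs):
            """Top-event probability for independent events under an AND gate."""
            return reduce(lambda acc, p: acc * p, probs, 1.0)

        # "lack of an aseptic process" as an OR of two hypothetical causes
        p_contaminated_starter = 0.025   # starter damaged by fungus or bacteria
        p_poor_sanitation      = 0.015   # insufficient equipment sanitation

        print(f"P(lack of aseptic process) = "
              f"{or_gate([p_contaminated_starter, p_poor_sanitation]):.3%}")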

  10. Quantitative analysis of Moessbauer backscatter spectra from multilayer films

    International Nuclear Information System (INIS)

    Bainbridge, J.

    1975-01-01

    The quantitative interpretation of Moessbauer backscatter spectra, with particular reference to internal conversion electrons, has been treated assuming that electron attenuation in a surface film can be satisfactorily described by a simple exponential law. The theory of Krakowski and Miller has been extended to include multi-layer samples, and a relation between the Moessbauer spectrum area and an individual layer thickness has been derived. As an example, numerical results are obtained for a duplex oxide film grown on pure iron. (Auth.)
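
    Under a simple exponential attenuation law, the spectrum-area contribution of a layer lying between depths d and d + t is proportional to the integral of exp(-x/λ) across the layer, i.e. to λ[exp(-d/λ) - exp(-(d+t)/λ)]. A minimal sketch with a hypothetical effective escape depth and layer thicknesses:

        # Hedged sketch: relative conversion-electron signal from stacked
        # layers under exponential attenuation; all numbers hypothetical.
        import math

        def layer_area(depth_nm, thickness_nm, lam_nm, resonant_density=1.0):
            """Integral of exp(-x/lam) over the layer, weighted by the density
            of resonant nuclei -> proportional to the spectrum area."""
            return resonant_density * lam_nm * (
                math.exp(-depth_nm / lam_nm)
                - math.exp(-(depth_nm + thickness_nm) / lam_nm))

        lam = 60.0                             # effective escape depth, nm
        oxide = layer_area(0.0, 20.0, lam)     # 20 nm oxide film on top
        metal = layer_area(20.0, 1e6, lam)     # substrate, effectively infinite
        print(f"oxide fraction of total area: {oxide / (oxide + metal):.1%}")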

  11. Geometrical conditions at the quantitative neutronographic texture analysis

    International Nuclear Information System (INIS)

    Tobisch, J.; Kleinstueck, K.

    1975-10-01

    The beam geometry for measuring quantitative pole figures with a neutronographic texture diffractometer is explained for transmission and reflection arrangements of spherical samples and sheets. For given dimensions of the counter aperture, the maximum possible cross sections of the incident beam are calculated as a function of the sample dimensions and the Bragg angle theta. Methods for the calculation of absorption factors and volume corrections are given. Under special conditions, advantages result in the transmission case for sample motion in the +α direction. (author)
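
    As a rough illustration of an absorption factor in the transmission case, the attenuation along the straight path through a sheet of thickness t traversed at an angle α to the sheet normal is exp(-μt/cos α); this is a deliberate simplification of the treatment in the report, with hypothetical numbers:

        # Simplified flat-sheet transmission absorption factor (hypothetical
        # values); the full treatment also includes volume corrections.
        import math

        def transmission_factor(mu_per_mm, t_mm, alpha_deg):
            path_mm = t_mm / math.cos(math.radians(alpha_deg))
            return math.exp(-mu_per_mm * path_mm)

        print(transmission_factor(mu_per_mm=0.11, t_mm=2.0, alpha_deg=30.0))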

  12. Quantitative analysis of strategic and tactical purchasing decisions

    OpenAIRE

    Heijboer, G.J.

    2003-01-01

    Purchasing management is a relatively new scientific research field, partly due to the fact that purchasing has only recently been recognized as a factor of strategic importance to an organization. In this thesis, the author focuses on a selection of strategic and tactical purchasing decision problems. New quantitative models are developed for these decision problems using a range of mathematical techniques, thereby contributing to the further development of purchasing theory and its appliati...

  13. Quantitative analysis of carbon radiation in edge plasmas of LHD

    International Nuclear Information System (INIS)

    Dong, C.F.; Morita, S.; Oishi, T.; Goto, M.; Murakami, I.; Wang, E.R.; Huang, X.L.

    2013-01-01

    It is of interest to compare the carbon radiation loss between LHD and tokamaks. Since the radiation from C³⁺ is much smaller than that from C⁵⁺, it is also interesting to examine the difference in the detached plasma. In addition, it is important to study quantitatively the radiation from each ionization stage of carbon, which is uniquely the dominant impurity in most tokamaks and LHD. (J.P.N.)

  14. 38 CFR 75.115 - Risk analysis.

    Science.gov (United States)

    2010-07-01

    38 CFR Pensions, Bonuses, and Veterans' Relief (2010-07-01 edition), DEPARTMENT OF VETERANS AFFAIRS (CONTINUED), INFORMATION SECURITY MATTERS, Data Breaches, § 75.115 Risk analysis: If a data breach involving sensitive personal information that is processed or...

  15. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  16. Skin sensitisation quantitative risk assessment (QRA) based on aggregate dermal exposure to methylisothiazolinone in personal care and household cleaning products.

    NARCIS (Netherlands)

    Ezendam, J; Bokkers, B G H; Bil, W; Delmaar, J E

    2017-01-01

    Contact allergy to preservatives is an important public health problem. Ideally, new substances should be evaluated for the risk of skin sensitisation before market entry, for example by using a quantitative risk assessment (QRA) as developed for fragrances. As a proof-of-concept, this QRA was
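
    In a fragrance-style QRA, a consumer exposure level (CEL) is estimated per product from the amount applied, a retention factor, the substance concentration, and the exposed skin area; aggregate exposure is the sum over products and is compared with an acceptable exposure level (AEL). A minimal sketch with hypothetical product data and a hypothetical AEL:

        # Hedged sketch of aggregate dermal exposure; every number below is
        # hypothetical and not taken from the cited assessment.
        products = [
            # (name, amount g/day, retention factor, MI mass fraction, skin area cm2)
            ("shampoo",     8.0, 0.01, 1.5e-5,  1440.0),
            ("dish soap",   5.0, 0.01, 1.5e-5,   860.0),
            ("body lotion", 7.5, 1.00, 1.0e-5, 15670.0),
        ]

        def cel(amount_g, retention, mass_frac, area_cm2):
            """Consumer exposure level in ug/cm2/day."""
            return amount_g * 1e6 * retention * mass_frac / area_cm2

        aggregate = sum(cel(a, r, c, s) for _, a, r, c, s in products)
        ael = 0.47   # hypothetical acceptable exposure level, ug/cm2/day
        print(f"aggregate CEL = {aggregate:.4f} ug/cm2/day "
              f"({'below' if aggregate <= ael else 'above'} the AEL of {ael})")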

  17. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and the aluminum sheets. The diffraction patterns of the samples were measured with neutrons and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)
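
    Rietveld quantitative phase analysis conventionally converts refined scale factors into weight fractions through the Hill-Howard relation W_p = S_p(ZMV)_p / Σ S_i(ZMV)_i, where S is the refined scale factor, Z the number of formula units per cell, M the formula mass, and V the unit-cell volume. A minimal sketch for a two-phase Zr/Al sample; the scale factors are hypothetical:

        # Hill-Howard weight fractions; the refined scale factors below are
        # hypothetical, the Z, M, V values are standard room-temperature data.
        phases = {
            # name: (refined scale S, Z, M [g/mol], V [A^3])
            "Zr (hcp)": (1.8e-4, 2, 91.224, 46.6),
            "Al (fcc)": (3.1e-4, 4, 26.982, 66.4),
        }

        zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
        total = sum(zmv.values())
        for name, val in zmv.items():
            print(f"{name}: {100.0 * val / total:.1f} wt%")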

  18. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to the determination of the average polymer chain length of a silicone polymer by quantitative flow ²⁹Si NMR is also presented. 10 references, 4 figures, 3 tables
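
    The underlying difficulty with slowly relaxing nuclei in static quantitative NMR is that only the fraction 1 - exp(-TR/T1) of the equilibrium magnetization recovers between scans, biasing peak areas unless the repetition time TR greatly exceeds T1; flow recycling sidesteps this by continuously supplying fully relaxed spins to the detection region. A short sketch with a hypothetical ²⁹Si T1:

        # Fraction of equilibrium magnetization recovered after repetition
        # time TR, assuming exponential T1 recovery; the T1 is hypothetical.
        import math

        def recovered_fraction(tr_s, t1_s):
            return 1.0 - math.exp(-tr_s / t1_s)

        t1_si29 = 60.0   # hypothetical long 29Si T1, seconds
        for tr in (5.0, 60.0, 300.0):
            print(f"TR = {tr:5.0f} s -> {recovered_fraction(tr, t1_si29):.1%} of M0")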

  19. The quantitative analysis of team game performance by men's basketball teams at the 2008 Olympic Games

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of team game performance by men's basketball teams at the 2008 Olympic Games. Aims: To find the reasons for the successes and failures of teams in the Olympic Games play-off, using quantitative (numerical) observation of selected game statistics. Method: The thesis was based on quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained describe the most essential events behind a team's win or loss. Keywords: basketball, team...

  20. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    Science.gov (United States)

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to bring scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry enables comparison of the efficiency of different risk reduction measures, and more precisely of different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; schematically, this includes three important phases. Firstly, the model structure has to be defined, i.e. the connections between the different operational processing steps. An important step in the food industry is thermal processing, which leads to microbial inactivation. Growth of surviving heat-treated microorganisms and/or post-process contamination during the storage phase is also important to take into account. Secondly, mathematical equations are determined to estimate the change in microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting experts. Finally, the model outputs are obtained by simulation procedures; they have to be interpreted and communicated to the targeted stakeholders. In this latter phase, tools such as what-if scenarios provide an essential added value. These different MRA phases are illustrated through two examples covering important issues in industry. The first covers process optimization in a food safety context; the second covers shelf-life determination in a food quality context. Although both contexts require the same methodology, they do not have the same endpoint: up to human health in the foie gras case study, illustrating a safety application, and up to the food portion in the
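
    A minimal Monte Carlo sketch of such a model chain, with entirely hypothetical parameters, propagates an initial contamination through a D-value-based thermal inactivation step and a storage growth step to obtain the distribution of the final microbial load:

        # Hedged MRA sketch; every distribution and parameter is hypothetical.
        import random

        def simulate_once():
            log_n0 = random.gauss(2.0, 0.5)            # initial load, log10 CFU/g
            d_value_min = random.uniform(1.5, 2.5)     # D-value at process temperature
            heating_min = 5.0                          # heating time, minutes
            log_after_heat = log_n0 - heating_min / d_value_min
            growth_rate = random.uniform(0.05, 0.15)   # log10 CFU/g per day in storage
            storage_days = random.uniform(5.0, 20.0)
            return min(log_after_heat + growth_rate * storage_days, 8.0)  # cap at Nmax

        loads = [simulate_once() for _ in range(100_000)]
        limit = 2.0                                    # e.g. a log10 CFU/g criterion
        p_exceed = sum(l > limit for l in loads) / len(loads)
        print(f"P(load at consumption > {limit} log10 CFU/g) = {p_exceed:.4f}")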