WorldWideScience

Sample records for risk adjustment methods

  1. A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems

    Science.gov (United States)

    Juhnke, Christin; Bethge, Susanne

    2016-01-01

Introduction: Effective risk adjustment is given increasing weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment as well as of the crucial weighting factors used in risk adjustment. Moreover, the predictive performance of selected methods in international healthcare systems was analysed. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted, encompassing an interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: in terms of risk horizons, in terms of risk factors, or in terms of the combination of indicators included. Within these, a further differentiation into three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, and those based on information on (self-reported) health status. Conclusions and discussion: The final examination of the different methods of risk adjustment showed that the methodology used to adjust risks varies considerably. The models differ greatly in terms of the morbidity indicators they include. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can be incorporated into quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts. PMID:28316544

  2. A method to adjust radiation dose-response relationships for clinical risk factors

    DEFF Research Database (Denmark)

    Appelt, Ane Lindegaard; Vogelius, Ivan R

    2012-01-01

    Several clinical risk factors for radiation induced toxicity have been identified in the literature. Here, we present a method to quantify the effect of clinical risk factors on radiation dose-response curves and apply the method to adjust the dose-response for radiation pneumonitis for patients...

  3. Direct risk standardisation: a new method for comparing casemix adjusted event rates using complex models.

    Science.gov (United States)

    Nicholl, Jon; Jacques, Richard M; Campbell, Michael J

    2013-10-29

    Comparison of outcomes between populations or centres may be confounded by any casemix differences and standardisation is carried out to avoid this. However, when the casemix adjustment models are large and complex, direct standardisation has been described as "practically impossible", and indirect standardisation may lead to unfair comparisons. We propose a new method of directly standardising for risk rather than standardising for casemix which overcomes these problems. Using a casemix model which is the same model as would be used in indirect standardisation, the risk in individuals is estimated. Risk categories are defined, and event rates in each category for each centre to be compared are calculated. A weighted sum of the risk category specific event rates is then calculated. We have illustrated this method using data on 6 million admissions to 146 hospitals in England in 2007/8 and an existing model with over 5000 casemix combinations, and a second dataset of 18,668 adult emergency admissions to 9 centres in the UK and overseas and a published model with over 20,000 casemix combinations and a continuous covariate. Substantial differences between conventional directly casemix standardised rates and rates from direct risk standardisation (DRS) were found. Results based on DRS were very similar to Standardised Mortality Ratios (SMRs) obtained from indirect standardisation, with similar standard errors. Direct risk standardisation using our proposed method is as straightforward as using conventional direct or indirect standardisation, always enables fair comparisons of performance to be made, can use continuous casemix covariates, and was found in our examples to have similar standard errors to the SMR. It should be preferred when there is a risk that conventional direct or indirect standardisation will lead to unfair comparisons.
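The core computation described above, a weighted sum of risk-category-specific event rates per centre, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name, the binning by fixed cut points, and the skipping of empty categories are all assumptions.

```python
def drs_rate(predicted_risk, events, centre_ids, centre, cuts, weights):
    """Directly risk-standardised event rate for one centre.

    predicted_risk : model-predicted risk per admission (from the casemix model)
    events         : observed 0/1 outcome per admission
    centre_ids     : centre label per admission
    centre         : the centre to standardise
    cuts           : ascending upper bounds defining the risk categories
    weights        : reference-population weight per category (len(cuts) + 1,
                     summing to 1)
    """
    def category(r):
        for k, c in enumerate(cuts):
            if r <= c:
                return k
        return len(cuts)

    n = [0] * len(weights)   # admissions per risk category at this centre
    e = [0] * len(weights)   # events per risk category at this centre
    for r, y, c in zip(predicted_risk, events, centre_ids):
        if c == centre:
            k = category(r)
            n[k] += 1
            e[k] += y
    # weighted sum of category-specific event rates; categories with no
    # admissions are skipped here (in practice they would be flagged)
    return sum(w * (e[k] / n[k]) for k, w in enumerate(weights) if n[k] > 0)
```

In practice the predicted risks would come from the same casemix model used for indirect standardisation, and the weights from the pooled reference population.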

  4. Risk adjustment methods for Home Care Quality Indicators (HCQIs) based on the minimum data set for home care

    Directory of Open Access Journals (Sweden)

    Hirdes John P

    2005-01-01

Background There has been increasing interest in enhancing accountability in health care. As such, several methods have been developed to compare the quality of home care services. These comparisons can be problematic if client populations vary across providers and no adjustment is made to account for these differences. The current paper explores the effects of risk adjustment for a set of home care quality indicators (HCQIs) based on the Minimum Data Set for Home Care (MDS-HC). Methods A total of 22 home care providers in Ontario and the Winnipeg Regional Health Authority (WRHA) in Manitoba, Canada, gathered data on their clients using the MDS-HC. These assessment data were used to generate HCQIs for each agency and for the two regions. Three types of risk adjustment methods were contrasted: a) client covariates only; b) client covariates plus an "Agency Intake Profile" (AIP) to adjust for ascertainment and selection bias by the agency; and c) client covariates plus the intake Case Mix Index (CMI). Results The mean age and gender distribution in the two populations was very similar. Across the 19 risk-adjusted HCQIs, Ontario CCACs had a significantly higher AIP adjustment value for eight HCQIs, indicating a greater propensity to trigger on these quality issues on admission. On average, Ontario had unadjusted rates that were 0.3% higher than the WRHA. Following risk adjustment with the AIP covariate, Ontario rates were, on average, 1.5% lower than the WRHA. In the WRHA, individual agencies were likely to experience a decline in their standing, whereby they were more likely to be ranked among the worst performers following risk adjustment. The opposite was true for sites in Ontario. Conclusions Risk adjustment is essential when comparing quality of care across providers when home care agencies provide services to populations with different characteristics. While such adjustment had a relatively small effect for the two regions, it did

  5. Risk adjustment methods for Home Care Quality Indicators (HCQIs) based on the minimum data set for home care

    Science.gov (United States)

    Dalby, Dawn M; Hirdes, John P; Fries, Brant E

    2005-01-01

    Background There has been increasing interest in enhancing accountability in health care. As such, several methods have been developed to compare the quality of home care services. These comparisons can be problematic if client populations vary across providers and no adjustment is made to account for these differences. The current paper explores the effects of risk adjustment for a set of home care quality indicators (HCQIs) based on the Minimum Data Set for Home Care (MDS-HC). Methods A total of 22 home care providers in Ontario and the Winnipeg Regional Health Authority (WRHA) in Manitoba, Canada, gathered data on their clients using the MDS-HC. These assessment data were used to generate HCQIs for each agency and for the two regions. Three types of risk adjustment methods were contrasted: a) client covariates only; b) client covariates plus an "Agency Intake Profile" (AIP) to adjust for ascertainment and selection bias by the agency; and c) client covariates plus the intake Case Mix Index (CMI). Results The mean age and gender distribution in the two populations was very similar. Across the 19 risk-adjusted HCQIs, Ontario CCACs had a significantly higher AIP adjustment value for eight HCQIs, indicating a greater propensity to trigger on these quality issues on admission. On average, Ontario had unadjusted rates that were 0.3% higher than the WRHA. Following risk adjustment with the AIP covariate, Ontario rates were, on average, 1.5% lower than the WRHA. In the WRHA, individual agencies were likely to experience a decline in their standing, whereby they were more likely to be ranked among the worst performers following risk adjustment. The opposite was true for sites in Ontario. Conclusions Risk adjustment is essential when comparing quality of care across providers when home care agencies provide services to populations with different characteristics. While such adjustment had a relatively small effect for the two regions, it did substantially affect the

  6. Response Adjusted for Days of Antibiotic Risk (RADAR): evaluation of a novel method to compare strategies to optimize antibiotic use.

    Science.gov (United States)

    Schweitzer, V A; van Smeden, M; Postma, D F; Oosterheert, J J; Bonten, M J M; van Werkhoven, C H

    2017-12-01

    The Response Adjusted for Days of Antibiotic Risk (RADAR) statistic was proposed to improve the efficiency of trials comparing antibiotic stewardship strategies to optimize antibiotic use. We studied the behaviour of RADAR in a non-inferiority trial in which a β-lactam monotherapy strategy (n = 656) was non-inferior to fluoroquinolone monotherapy (n = 888) for patients with moderately severe community-acquired pneumonia. Patients were ranked according to clinical outcome, using five or eight categories, and antibiotic use. RADAR was calculated as the probability that the β-lactam group had a more favourable ranking than the fluoroquinolone group. To investigate the sensitivity of RADAR to detrimental clinical outcome we simulated increasing rates of 90-day mortality in the β-lactam group and performed the RADAR and non-inferiority analysis. The RADAR of the β-lactam group compared with the fluoroquinolone group was 60.3% (95% CI 57.9%-62.7%) using five and 58.4% (95% CI 56.0%-60.9%) using eight clinical outcome categories, all in favour of β-lactam. Sample sizes for RADAR were 38% (250/653) and 89% (580/653) of the non-inferiority sample size calculation, using five or eight clinical outcome categories, respectively. With simulated mortality rates, loss of non-inferiority of the β-lactam group occurred at a relative risk of 1.125 in the conventional analysis, whereas using RADAR the β-lactam group lost superiority at a relative risk of mortality of 1.25 and 1.5, with eight and five clinical outcome categories, respectively. RADAR favoured β-lactam over fluoroquinolone therapy for community-acquired pneumonia. Although RADAR required fewer patients than conventional non-inferiority analysis, the statistic was less sensitive to detrimental outcomes. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
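RADAR is, in essence, a win probability over all between-group patient pairs. A minimal sketch under that reading (ties counted as half-wins; the construction of the underlying ranks from clinical outcome category and antibiotic use is assumed to have been done already, and lower rank is taken as more favourable by convention here):

```python
def radar(ranks_a, ranks_b):
    """P(a random patient from group A has a more favourable ranking than a
    random patient from group B), with ties counted as 1/2."""
    wins = 0.0
    for a in ranks_a:
        for b in ranks_b:
            if a < b:        # A's patient ranks better
                wins += 1.0
            elif a == b:     # tie: split the pair
                wins += 0.5
    return wins / (len(ranks_a) * len(ranks_b))
```

A value of 0.5 indicates no difference between strategies; the abstract's 60.3% corresponds to this probability favouring the β-lactam group.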

  7. Direct comparison of risk-adjusted and non-risk-adjusted CUSUM analyses of coronary artery bypass surgery outcomes.

    Science.gov (United States)

    Novick, Richard J; Fox, Stephanie A; Stitt, Larry W; Forbes, Thomas L; Steiner, Stefan

    2006-08-01

    We previously applied non-risk-adjusted cumulative sum methods to analyze coronary bypass outcomes. The objective of this study was to assess the incremental advantage of risk-adjusted cumulative sum methods in this setting. Prospective data were collected in 793 consecutive patients who underwent coronary bypass grafting performed by a single surgeon during a period of 5 years. The composite occurrence of an "adverse outcome" included mortality or any of 10 major complications. An institutional logistic regression model for adverse outcome was developed by using 2608 contemporaneous patients undergoing coronary bypass. The predicted risk of adverse outcome in each of the surgeon's 793 patients was then calculated. A risk-adjusted cumulative sum curve was then generated after specifying control limits and odds ratio. This risk-adjusted curve was compared with the non-risk-adjusted cumulative sum curve, and the clinical significance of this difference was assessed. The surgeon's adverse outcome rate was 96 of 793 (12.1%) versus 270 of 1815 (14.9%) for all the other institution's surgeons combined (P = .06). The non-risk-adjusted curve reached below the lower control limit, signifying excellent outcomes between cases 164 and 313, 323 and 407, and 667 and 793, but transgressed the upper limit between cases 461 and 478. The risk-adjusted cumulative sum curve never transgressed the upper control limit, signifying that cases preceding and including 461 to 478 were at an increased predicted risk. Furthermore, if the risk-adjusted cumulative sum curve was reset to zero whenever a control limit was reached, it still signaled a decrease in adverse outcome at 166, 653, and 782 cases. Risk-adjusted cumulative sum techniques provide incremental advantages over non-risk-adjusted methods by not signaling a decrement in performance when preoperative patient risk is high.
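A risk-adjusted CUSUM of the kind described accumulates log-likelihood-ratio scores that depend on each patient's predicted risk, so a run of high-risk patients with expected outcomes does not trigger a false alarm. The sketch below follows the Steiner-style scoring commonly used for surgical outcomes; the tuning odds ratio, control limit, and reset-on-signal behaviour are illustrative assumptions rather than the study's exact settings.

```python
import math

def risk_adjusted_cusum(risks, outcomes, odds_ratio=2.0, limit=4.5):
    """Upper risk-adjusted CUSUM for detecting a worsening of outcomes.

    risks      : predicted probability of adverse outcome per case
    outcomes   : observed 0/1 adverse outcome per case
    odds_ratio : the shift the chart is tuned to detect (e.g. a doubling of odds)
    limit      : control limit h; the chart signals when the statistic exceeds it

    Returns the CUSUM path and the case indices at which the chart signalled.
    """
    x, path, signals = 0.0, [], []
    for t, (p, y) in enumerate(zip(risks, outcomes)):
        denom = 1.0 - p + odds_ratio * p
        # log-likelihood-ratio score: positive after an adverse outcome,
        # negative (and risk-dependent) after a good outcome
        w = math.log(odds_ratio / denom) if y else math.log(1.0 / denom)
        x = max(0.0, x + w)
        path.append(x)
        if x > limit:
            signals.append(t)
            x = 0.0  # reset after a signal, as in the resetting variant
    return path, signals
```

A mirror-image lower chart (or a chart with odds ratio below 1) would detect the improvements in outcome that the article also reports.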

  8. Underwriters' view of risk - An adjuster's perspective

    International Nuclear Information System (INIS)

    Smith, M.

    1992-01-01

    This paper reviews how a risk assessment is performed by an insurance adjuster to determine rates and insurability of a client. It provides a historical perspective on insurance and how information systems are used to monitor past claims to determine future risk. Although this paper does not specifically address the oil and gas industry, it is informative in identifying how insurance rates are determined and risk assessments for various oil and gas operations are performed

  9. A Machine Learning Framework for Plan Payment Risk Adjustment.

    Science.gov (United States)

    Rose, Sherri

    2016-12-01

To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
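The cross-validated R² on which the framework's model comparison rests can be sketched in a few lines of numpy; here ordinary least squares stands in for one candidate formula (the paper itself compares penalized regression, trees, neural networks, and a super learner). The function names and fold scheme are illustrative assumptions.

```python
import numpy as np

def cv_r2(X, y, fit, predict, folds=5, seed=0):
    """Cross-validated R^2 for one candidate risk-adjustment formula.

    fit(X, y) -> model; predict(model, X) -> predictions. Running this over
    several candidate formulas (full vs simplified variable sets) and keeping
    the best cross-validated R^2 is the comparison the framework performs.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    chunks = np.array_split(idx, folds)
    ss_res = ss_tot = 0.0
    for k in range(folds):
        test = chunks[k]
        train = np.concatenate([chunks[j] for j in range(folds) if j != k])
        model = fit(X[train], y[train])
        pred = predict(model, X[test])
        ss_res += float(np.sum((y[test] - pred) ** 2))
        ss_tot += float(np.sum((y[test] - np.mean(y[train])) ** 2))
    return 1.0 - ss_res / ss_tot

# Ordinary least squares as one candidate "formula"
def ols_fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def ols_predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta
```

Because every candidate is scored on held-out folds, a smaller formula that predicts almost as well as the full one is identified without favouring in-sample overfitting.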

  10. Spatial implications of covariate adjustment on patterns of risk

    DEFF Research Database (Denmark)

    Sabel, Clive Eric; Wilson, Jeff Gaines; Kingham, Simon

    2007-01-01

Epidemiological studies that examine the relationship between environmental exposures and health often address other determinants of health that may influence the relationship being studied by adjusting for these factors as covariates. While disease surveillance methods routinely control for covariates such as deprivation, there has been limited investigative work on the spatial movement of risk at the intraurban scale due to the adjustment. It is important that the nature of any spatial relocation be well understood, as a relocation to areas of increased risk may also introduce additional localised factors that influence the exposure-response relationship. This paper examines the spatial patterns of relative risk and clusters of hospitalisations based on an illustrative small-area example from Christchurch, New Zealand. A four-stage test of the spatial relocation effects of covariate

  11. Competition Leverage : How the Demand Side Affects Optimal Risk Adjustment

    NARCIS (Netherlands)

    Bijlsma, M.; Boone, J.; Zwart, Gijsbert

    2011-01-01

    We study optimal risk adjustment in imperfectly competitive health insurance markets when high-risk consumers are less likely to switch insurer than low-risk consumers. First, we find that insurers still have an incentive to select even if risk adjustment perfectly corrects for cost differences

  12. 42 CFR 422.310 - Risk adjustment data.

    Science.gov (United States)

    2010-10-01

    ... that are used in the development and application of a risk adjustment payment model. (b) Data... (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Payments to Medicare Advantage Organizations § 422... risk adjustment factors used to adjust payments, as required under §§ 422.304(a) and (c). CMS also may...

  13. Portfolio balancing and risk adjusted values under constrained budget conditions

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1996-01-01

    For a given hydrocarbon exploration opportunity, the influences of value, cost, success probability and corporate risk tolerance provide an optimal working interest that should be taken in the opportunity in order to maximize the risk adjusted value. When several opportunities are available, but when the total budget is insufficient to take optimal working interest in each, an analytic procedure is given for optimizing the risk adjusted value of the total portfolio; the relevant working interests are also derived based on a cost exposure constraint. Several numerical illustrations are provided to exhibit the use of the method under different budget conditions, and with different numbers of available opportunities. When value, cost, success probability, and risk tolerance are uncertain for each and every opportunity, the procedure is generalized to allow determination of probable optimal risk adjusted value for the total portfolio and, at the same time, the range of probable working interest that should be taken in each opportunity is also provided. The result is that the computations of portfolio balancing can be done quickly in either deterministic or probabilistic manners on a small calculator, thereby providing rapid assessments of opportunities and their worth to a corporation. (Author)
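The abstract does not reproduce its formulas, but this literature typically builds on the exponential-utility (Cozzolino-type) risk-adjusted value, under which the RAV-maximising working interest has a closed form. A sketch under that assumption; the function names and the clipping of the working interest to [0, 1] are illustrative.

```python
import math

def rav(w, value, cost, p, risk_tolerance):
    """Cozzolino-style risk-adjusted value of taking working interest w in an
    opportunity with success value `value`, failure cost `cost`, success
    probability p, and corporate risk tolerance `risk_tolerance` (same units
    as value and cost)."""
    rt = risk_tolerance
    return -rt * math.log(p * math.exp(-w * value / rt)
                          + (1.0 - p) * math.exp(w * cost / rt))

def optimal_working_interest(value, cost, p, risk_tolerance):
    """Unconstrained RAV-maximising working interest, clipped to [0, 1].

    Obtained by setting d(RAV)/dw = 0 in the expression above."""
    w = risk_tolerance / (value + cost) * math.log(p * value / ((1.0 - p) * cost))
    return min(1.0, max(0.0, w))
```

Under a budget constraint, working interests across the portfolio would then be scaled back from these unconstrained optima, which is the balancing problem the paper solves analytically.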

  14. INSTITUTIONAL OWNERSHIP LEVEL AND RISK-ADJUSTED RETURN

    OpenAIRE

    Isaiah, Chioma; Li, Meng (Emma)

    2017-01-01

This paper examines the relationship between the level of institutional ownership and risk-adjusted return on stocks. We find a significant positive relationship between the level of institutional ownership on a stock and its risk-adjusted return. This result holds both in the long run and in shorter time periods. Our findings suggest that, all things being equal, it is possible to obtain risk-adjusted return by going short on the stocks with low institutional ownership and going long on those with...

  15. Risk-adjusted capitation: recent experiences in The Netherlands.

    Science.gov (United States)

    van de Ven, W P; van Vliet, R C; van Barneveld, E M; Lamers, L M

    1994-01-01

    The market-oriented health care reforms taking place in the Netherlands show a clear resemblance to the proposals for managed competition in U.S. health care. In both countries good risk adjustment mechanisms that prevent cream skimming--that is, that prevent plans from selecting the best health risks--are critical to the success of the reforms. In this paper we present an overview of the Dutch reforms and of our research concerning risk-adjusted capitation payments. Although we are optimistic about the technical possibilities for solving the problem of cream skimming, the implementation of good risk-adjusted capitation is a long-term challenge.

  16. Measurement Of Shariah Stock Performance Using Risk Adjusted Performance

    Directory of Open Access Journals (Sweden)

    Zuhairan Y Yunan

    2015-03-01

The aim of this research is to analyze shariah stock performance using risk-adjusted performance methods. Three parameters are used to measure stock performance: Sharpe, Treynor, and Jensen. These performance measures incorporate both the return and the risk factor of shariah stocks. The data used in this research are the stocks in the Jakarta Islamic Index. The sampling method used in this paper is purposive sampling, with ten companies taken as the sample. The results show that, across the three parameters, the stocks with the best performance are AALI, ANTM, ASII, CPIN, INDF, KLBF, LSIP, and UNTR. DOI: 10.15408/aiq.v7i1.1364
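The three measures named, Sharpe, Treynor, and Jensen, can be illustrated with a short self-contained sketch; the inputs (periodic returns and a per-period risk-free rate) and the population-variance convention are assumptions for illustration, not details taken from the paper.

```python
def mean(xs):
    return sum(xs) / len(xs)

def beta(stock, market):
    """CAPM beta: covariance of stock and market returns over market variance."""
    ms, mm = mean(stock), mean(market)
    cov = sum((s - ms) * (m - mm) for s, m in zip(stock, market)) / len(stock)
    var = sum((m - mm) ** 2 for m in market) / len(market)
    return cov / var

def sharpe_ratio(stock, rf):
    """Excess return per unit of total risk (standard deviation)."""
    mu = mean(stock)
    sd = (sum((s - mu) ** 2 for s in stock) / len(stock)) ** 0.5
    return (mu - rf) / sd

def treynor_ratio(stock, market, rf):
    """Excess return per unit of systematic risk (beta)."""
    return (mean(stock) - rf) / beta(stock, market)

def jensen_alpha(stock, market, rf):
    """Realised excess return minus the CAPM-expected excess return."""
    return (mean(stock) - rf) - beta(stock, market) * (mean(market) - rf)
```

Sharpe penalises total volatility, Treynor only market-related volatility, and Jensen's alpha measures outperformance relative to the CAPM benchmark, which is why the three can rank the same stocks differently.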

  17. ACA Risk Adjustment - Overview, Context, and Challenges

    Data.gov (United States)

    U.S. Department of Health & Human Services — Volume 4, Issue 3 of the Medicare and Medicaid Research Review includes three articles describing the Department of Health and Human Services (HHS) developed risk...

  18. Risk adjusted financial costs of photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Sandor; Jaeger-Waldau, Arnulf [Joint Research Centre, Institute for Energy, Via E. Fermi 2749, I-21020 Ispra (Italy); Szabo, Laszlo [Joint Research Centre, Institute for Prospective Technological Studies C. Inca Garcilaso, 3. E-41092 Sevilla (Spain)

    2010-07-15

Recent research shows significant differences in levelised photovoltaics (PV) electricity cost calculations. The present paper points out that no unique or absolute cost figure can be justified; the correct approach is to use a range of cost figures determined by the dynamic interaction of the power portfolio with the financial scheme, the support mechanism, and industry cost reduction. The paper draws attention to the increasing role of financial investors in the PV segment of the renewable energy market and the importance they attribute to the risks of all options in the power generation portfolio. Based on these trends, an earlier version of a financing model is adapted to project the energy mix changes in the EU electricity market due to the behaviour of investors with different risk tolerance/aversion. The dynamic process of translating these risks into return expectations in financial appraisal and investment decision making is also introduced. In doing so, the paper sets out a potential electricity market trend with the associated risk perception and classification. The necessary risk mitigation tasks for all stakeholders in the PV market are summarised, with the aim of avoiding the burden of excessive risk premiums in this market segment. (author)

  19. Risk adjusted financial costs of photovoltaics

    International Nuclear Information System (INIS)

    Szabo, Sandor; Jaeger-Waldau, Arnulf; Szabo, Laszlo

    2010-01-01

Recent research shows significant differences in levelised photovoltaics (PV) electricity cost calculations. The present paper points out that no unique or absolute cost figure can be justified; the correct approach is to use a range of cost figures determined by the dynamic interaction of the power portfolio with the financial scheme, the support mechanism, and industry cost reduction. The paper draws attention to the increasing role of financial investors in the PV segment of the renewable energy market and the importance they attribute to the risks of all options in the power generation portfolio. Based on these trends, an earlier version of a financing model is adapted to project the energy mix changes in the EU electricity market due to the behaviour of investors with different risk tolerance/aversion. The dynamic process of translating these risks into return expectations in financial appraisal and investment decision making is also introduced. In doing so, the paper sets out a potential electricity market trend with the associated risk perception and classification. The necessary risk mitigation tasks for all stakeholders in the PV market are summarised, with the aim of avoiding the burden of excessive risk premiums in this market segment.

  20. The Persistence of Risk-Adjusted Mutual Fund Performance.

    OpenAIRE

    Elton, Edwin J; Gruber, Martin J; Blake, Christopher R

    1996-01-01

    The authors examine predictability for stock mutual funds using risk-adjusted returns. They find that past performance is predictive of future risk-adjusted performance. Applying modern portfolio theory techniques to past data improves selection and allows the authors to construct a portfolio of funds that significantly outperforms a rule based on past rank alone. In addition, they can form a combination of actively managed portfolios with the same risk as a portfolio of index funds but with ...

  1. Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.

    Science.gov (United States)

    Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan

    2015-03-01

A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to the hospital's expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performances, since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For cases with binary outcomes, the method commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble-of-trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have recently been shown to have excellent performance for prediction of outcomes in many settings. We apply these methods to carry out risk adjustment for the performance of neonatal intensive care units (NICUs). We show that these ensemble-of-trees methods outperform logistic regression in predicting mortality among babies treated in NICUs, and provide a superior method of risk adjustment compared to logistic regression.
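The essence of an ensemble-of-trees risk adjuster is averaging many simple trees fit to bootstrap resamples so that the average estimates P(outcome | patient characteristics). The toy below uses single-split stumps on one feature, which is far simpler than the paper's random forests or BART but shows the bagging mechanic; all names are illustrative.

```python
import random

def stump_fit(xs, ys):
    """Best single-split stump on a 1-D feature: predicts the mean outcome
    (an event probability for 0/1 outcomes) on each side of the cut."""
    best = None
    for cut in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= cut]
        right = [y for x, y in zip(xs, ys) if x > cut]
        if not left or not right:
            continue
        pl, pr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((y - pl) ** 2 for y in left) + sum((y - pr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, cut, pl, pr)
    if best is None:                      # degenerate sample: constant feature
        p = sum(ys) / len(ys)
        return (xs[0], p, p)
    return best[1:]

def forest_fit(xs, ys, n_trees=50, seed=0):
    """Bagging: fit one stump per bootstrap resample of the data."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in xs]
        trees.append(stump_fit([xs[i] for i in idx], [ys[i] for i in idx]))
    return trees

def forest_predict(trees, x):
    """Averaged stump outputs: an estimate of P(outcome | x)."""
    return sum((pl if x <= cut else pr) for cut, pl, pr in trees) / len(trees)
```

A hospital's risk-adjusted comparison then uses these predicted probabilities in place of the logistic-regression ones: observed events versus the sum of predictions over its patients.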

  2. Inappropriate use of payment weights to risk adjust readmission rates.

    Science.gov (United States)

    Fuller, Richard L; Goldfield, Norbert I; Averill, Richard F; Hughes, John S

    2012-01-01

    In this article, the authors demonstrate that the use of relative weights, as incorporated within the National Quality Forum-endorsed PacifiCare readmission measure, is inappropriate for risk adjusting rates of hospital readmission.

  3. Belgium: risk adjustment and financial responsibility in a centralised system.

    Science.gov (United States)

    Schokkaert, Erik; Van de Voorde, Carine

    2003-07-01

Since 1995, Belgian sickness funds have been partially financed through a risk adjustment system and held partially financially responsible for the difference between their actual and their risk-adjusted expenditures. However, they have not been given the necessary instruments for exerting real influence on expenditures, and the health insurance market has not been opened to new entrants. At the same time, the sickness funds have powerful tools for risk selection, because they also dominate the market for supplementary health insurance. The present risk-adjustment system is based on the results of a regression analysis with aggregate data. The main proclaimed purpose of this system is to guarantee fair treatment to all the sickness funds. Until now the danger of risk selection has not been taken seriously. Consumer mobility has remained rather low. However, since the degree of financial responsibility is programmed to increase in the near future, the potential profits from cream skimming will increase.

  4. Can rent adjustment clauses reduce the income risk of farms?

    OpenAIRE

    Hotopp, Henning; Mußhoff, Oliver

    2012-01-01

    Risk management is gaining importance in agriculture. In addition to traditional instruments, new risk management instruments are increasingly being proposed. These proposals include the rent adjustment clauses (RACs), which seem to be an unusual instrument at first sight. In contrast with conventional instruments, RACs intentionally allow fixed-cost ‘rent payments’ to fluctuate. We investigate the whole-farm risk reduction potential of different types of RACs via a historical simulation....

  5. Incorporating Comorbidity Within Risk Adjustment for UK Pediatric Cardiac Surgery.

    Science.gov (United States)

    Brown, Katherine L; Rogers, Libby; Barron, David J; Tsang, Victor; Anderson, David; Tibby, Shane; Witter, Thomas; Stickley, John; Crowe, Sonya; English, Kate; Franklin, Rodney C; Pagel, Christina

    2017-07-01

    When considering early survival rates after pediatric cardiac surgery it is essential to adjust for risk linked to case complexity. An important but previously less well understood component of case mix complexity is comorbidity. The National Congenital Heart Disease Audit data representing all pediatric cardiac surgery procedures undertaken in the United Kingdom and Ireland between 2009 and 2014 was used to develop and test groupings for comorbidity and additional non-procedure-based risk factors within a risk adjustment model for 30-day mortality. A mixture of expert consensus based opinion and empiric statistical analyses were used to define and test the new comorbidity groups. The study dataset consisted of 21,838 pediatric cardiac surgical procedure episodes in 18,834 patients with 539 deaths (raw 30-day mortality rate, 2.5%). In addition to surgical procedure type, primary cardiac diagnosis, univentricular status, age, weight, procedure type (bypass, nonbypass, or hybrid), and era, the new risk factor groups of non-Down congenital anomalies, acquired comorbidities, increased severity of illness indicators (eg, preoperative mechanical ventilation or circulatory support) and additional cardiac risk factors (eg, heart muscle conditions and raised pulmonary arterial pressure) all independently increased the risk of operative mortality. In an era of low mortality rates across a wide range of operations, non-procedure-based risk factors form a vital element of risk adjustment and their presence leads to wide variations in the predicted risk of a given operation. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Diagnostic Risk Adjustment for Medicaid: The Disability Payment System

    Science.gov (United States)

    Kronick, Richard; Dreyfus, Tony; Lee, Lora; Zhou, Zhiyuan

    1996-01-01

    This article describes a system of diagnostic categories that Medicaid programs can use for adjusting capitation payments to health plans that enroll people with disability. Medicaid claims from Colorado, Michigan, Missouri, New York, and Ohio are analyzed to demonstrate that the greater predictability of costs among people with disabilities makes risk adjustment more feasible than for a general population and more critical to creating health systems for people with disability. The application of our diagnostic categories to State claims data is described, including estimated effects on subsequent-year costs of various diagnoses. The challenges of implementing adjustment by diagnosis are explored. PMID:10172665

  7. Performance of Comorbidity, Risk Adjustment, and Functional Status Measures in Expenditure Prediction for Patients With Diabetes

    OpenAIRE

    Maciejewski, Matthew L.; Liu, Chuan-Fen; Fihn, Stephan D.

    2009-01-01

    OBJECTIVE: To compare the ability of generic comorbidity and risk adjustment measures, a diabetes-specific measure, and a self-reported functional status measure to explain variation in health care expenditures for individuals with diabetes. RESEARCH DESIGN AND METHODS: This study included a retrospective cohort of 3,092 diabetic veterans participating in a multisite trial. Two comorbidity measures, four risk adjusters, a functional status measure, a diabetes complication count, and baseline ex...

  8. Evaluating intergenerational risks: Probability adjusted rank-discounted utilitarianism

    OpenAIRE

    Asheim, Geir B.; Zuber, Stéphane

    2015-01-01

    Climate policies have stochastic consequences that involve a great number of generations. This calls for evaluating social risk (what kind of societies will future people be born into) rather than individual risk (what will happen to people during their own lifetimes). As a response we propose and axiomatize probability adjusted rank-discounted critical-level generalized utilitarianism (PARDCLU), through a key axiom that requires that the social welfare order both be ethical and satisfy first...

  9. Personality, emotional adjustment, and cardiovascular risk: marriage as a mechanism.

    Science.gov (United States)

    Smith, Timothy W; Baron, Carolynne E; Grove, Jeremy L

    2014-12-01

    A variety of aspects of personality and emotional adjustment predict the development and course of coronary heart disease (CHD), as do indications of marital quality (e.g., satisfaction, conflict, strain, disruption). Importantly, the personality traits and aspects of emotional adjustment that predict CHD are also related to marital quality. In such instances of correlated risk factors, traditional epidemiological and clinical research typically either ignores the potentially overlapping effects or examines independent associations through statistical controls, approaches that can misrepresent the key components and mechanisms of psychosocial effects on CHD. The interpersonal perspective in personality and clinical psychology provides an alternative and integrative approach, through its structural and process models of interpersonal behavior. We present this perspective on psychosocial risk and review research on its application to the integration of personality, emotional adjustment, and marital processes as closely interrelated influences on health and disease. © 2013 Wiley Periodicals, Inc.

  10. Risk-adjusted capitation: Recent experiences in the Netherlands

    NARCIS (Netherlands)

    W.P.M.M. van de Ven (Wynand); R.C.J.A. van Vliet (René); E.M. van Barneveld (Erik); L.M. Lamers (Leida)

    1994-01-01

    textabstractThe market-oriented health care reforms taking place in the Netherlands show a clear resemblance to the proposals for managed competition in U.S. health care. In both countries good risk adjustment mechanisms that prevent cream skimming--that is, that prevent plans from selecting the

  12. Methods of risk assessment

    International Nuclear Information System (INIS)

    Jones, D.R.

    1981-01-01

    The subject is discussed under the headings: introduction (identification, quantification of risk); some approaches to risk evaluation (use of the 'no risk' principle; the 'acceptable risk' method; risk balancing; comparison of risks, benefits and other costs); cost benefit analysis; an alternative approach (tabulation and display; description and reduction of the data table); identification of potential decision sets consistent with the constraints. Some references are made to nuclear power. (U.K.)

  13. Diagnosis-Based Risk Adjustment for Medicare Capitation Payments

    Science.gov (United States)

    Ellis, Randall P.; Pope, Gregory C.; Iezzoni, Lisa I.; Ayanian, John Z.; Bates, David W.; Burstin, Helen; Ash, Arlene S.

    1996-01-01

    Using 1991-92 data for a 5-percent Medicare sample, we develop, estimate, and evaluate risk-adjustment models that utilize diagnostic information from both inpatient and ambulatory claims to adjust payments for aged and disabled Medicare enrollees. Hierarchical coexisting conditions (HCC) models achieve greater explanatory power than diagnostic cost group (DCG) models by taking account of multiple coexisting medical conditions. Prospective models predict average costs of individuals with chronic conditions nearly as well as concurrent models. All models predict medical costs far more accurately than the current health maintenance organization (HMO) payment formula. PMID:10172666
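The additive, diagnosis-flag structure of such diagnosis-based payment models can be sketched on synthetic data. This is only an illustration of the general approach (cost regressed on demographics plus condition indicators), not the actual HCC or DCG model; the conditions, coefficients, and costs below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Synthetic enrollee data: a demographic band, sex, and two condition flags.
age65plus = rng.integers(0, 2, n)
female = rng.integers(0, 2, n)
diabetes = rng.integers(0, 2, n)
chf = rng.integers(0, 2, n)

# Simulated annual costs with additive condition increments plus noise.
cost = (2000 + 1500 * age65plus + 300 * female
        + 4000 * diabetes + 9000 * chf
        + rng.normal(0, 1000, n))

# Additive risk-adjustment model: cost ~ demographics + condition flags,
# fit by ordinary least squares.
X = np.column_stack([np.ones(n), age65plus, female, diabetes, chf])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
pred = X @ beta

r2 = 1 - np.sum((cost - pred) ** 2) / np.sum((cost - cost.mean()) ** 2)
print(f"estimated CHF payment increment: {beta[4]:.0f}")
print(f"explanatory power (R^2): {r2:.2f}")
```

The fitted coefficients play the role of the per-condition payment increments; in real models the flag set numbers in the dozens of hierarchically arranged condition categories rather than two.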

  14. Ants avoid superinfections by performing risk-adjusted sanitary care.

    Science.gov (United States)

    Konrad, Matthias; Pull, Christopher D; Metzler, Sina; Seif, Katharina; Naderlinger, Elisabeth; Grasse, Anna V; Cremer, Sylvia

    2018-03-13

    Being cared for when sick is a benefit of sociality that can reduce disease and improve survival of group members. However, individuals providing care risk contracting infectious diseases themselves. If they contract a low pathogen dose, they may develop low-level infections that do not cause disease but still affect host immunity by either decreasing or increasing the host's vulnerability to subsequent infections. Caring for contagious individuals can thus significantly alter the future disease susceptibility of caregivers. Using ants and their fungal pathogens as a model system, we tested if the altered disease susceptibility of experienced caregivers, in turn, affects their expression of sanitary care behavior. We found that low-level infections contracted during sanitary care had protective or neutral effects on secondary exposure to the same (homologous) pathogen but consistently caused high mortality on superinfection with a different (heterologous) pathogen. In response to this risk, the ants selectively adjusted the expression of their sanitary care. Specifically, the ants performed less grooming and more antimicrobial disinfection when caring for nestmates contaminated with heterologous pathogens compared with homologous ones. By modulating the components of sanitary care in this way, the ants acquired fewer infectious particles of the heterologous pathogens, resulting in reduced superinfection. The performance of risk-adjusted sanitary care reveals the remarkable capacity of ants to react to changes in their disease susceptibility, according to their own infection history, and to flexibly adjust collective care to individual risk.

  15. Risk Selection, Risk Adjustment and Choice: Concepts and Lessons from the Americas

    Science.gov (United States)

    Ellis, Randall P.; Fernandez, Juan Gabriel

    2013-01-01

    Interest has grown worldwide in risk adjustment and risk sharing due to their potential to contain costs, improve fairness, and reduce selection problems in health care markets. Significant steps have been made in the empirical development of risk adjustment models, and in the theoretical foundations of risk adjustment and risk sharing. This literature has often modeled the effects of risk adjustment without highlighting the institutional setting, regulations, and diverse selection problems that risk adjustment is intended to fix. Perhaps because of this, the existing literature and their recommendations for optimal risk adjustment or optimal payment systems are sometimes confusing. In this paper, we present a unified way of thinking about the organizational structure of health care systems, which enables us to focus on two key dimensions of markets that have received less attention: what choices are available that may lead to selection problems, and what financial or regulatory tools other than risk adjustment are used to influence these choices. We specifically examine the health care systems, choices, and problems in four countries: the US, Canada, Chile, and Colombia, and examine the relationship between selection-related efficiency and fairness problems and the choices that are allowed in each country, and discuss recent regulatory reforms that affect choices and selection problems. In this sample, countries and insurance programs with more choices have more selection problems. PMID:24284351

  17. A risk adjustment approach to estimating the burden of skin disease in the United States.

    Science.gov (United States)

    Lim, Henry W; Collins, Scott A B; Resneck, Jack S; Bolognia, Jean; Hodge, Julie A; Rohrer, Thomas A; Van Beek, Marta J; Margolis, David J; Sober, Arthur J; Weinstock, Martin A; Nerenz, David R; Begolka, Wendy Smith; Moyano, Jose V

    2018-01-01

    Direct insurance claims tabulation and risk adjustment statistical methods can be used to estimate health care costs associated with various diseases. In this third manuscript derived from the new national Burden of Skin Disease Report from the American Academy of Dermatology, a risk adjustment method, based on modeling the average annual costs of individuals with or without specific diseases and tailored to 24 skin disease categories, was used to estimate the economic burden of skin disease. The results were compared with the claims tabulation method used in the first 2 parts of this project. The risk adjustment method estimated the direct health care costs of skin diseases to be $46 billion in 2013, approximately $15 billion less than estimates using claims tabulation. For individual skin diseases, the risk adjustment cost estimates ranged from 11% to 297% of those obtained using claims tabulation for the 10 most costly skin disease categories. Although either method may be used for purposes of estimating the costs of skin disease, the choice of method will affect the end result. These findings serve as an important reference for future discussions about the method chosen in health care payment models to estimate both the cost of skin disease and the potential cost impact of care changes. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  18. The Experience of Risk-Adjusted Capitation Payment for Family Physicians in Iran: A Qualitative Study.

    Science.gov (United States)

    Esmaeili, Reza; Hadian, Mohammad; Rashidian, Arash; Shariati, Mohammad; Ghaderi, Hossien

    2016-04-01

    When a country's health system is faced with fundamental flaws that require the redesign of financing and service delivery, primary healthcare payment systems are often reformed. This study explored the experiences of urban family physicians in Iran with risk-adjusted capitation payment for providing primary health care (PHC). This is a qualitative study using the framework method. Data were collected via digitally audio-recorded semi-structured interviews with 24 family physicians and 5 executive directors in two provinces of Iran running the urban family physician pilot program. The participants were selected using purposive and snowball sampling. The codes were extracted using inductive and deductive methods. Regarding the effects of risk-adjusted capitation on the primary healthcare setting, five themes with 11 subthemes emerged, including service delivery, institutional structure, financing, people's behavior, and the challenges ahead. Our findings indicated that the health system is undergoing major changes in the primary healthcare setting through the implementation of risk-adjusted capitation payment. With regard to the current challenges in Iran's health system, using risk-adjusted capitation as a primary healthcare payment system can lead to useful changes in the health system's features. However, future research should focus on the development of the risk-adjusted capitation model.

  19. Seasonal adjustment methods and real time trend-cycle estimation

    CERN Document Server

    Bee Dagum, Estela

    2016-01-01

    This book explores widely used seasonal adjustment methods and recent developments in real time trend-cycle estimation. It discusses in detail the properties and limitations of X12ARIMA, TRAMO-SEATS and STAMP - the main seasonal adjustment methods used by statistical agencies. Several real-world cases illustrate each method and real data examples can be followed throughout the text. The trend-cycle estimation is presented using nonparametric techniques based on moving averages, linear filters and reproducing kernel Hilbert spaces, taking recent advances into account. The book provides a systematical treatment of results that to date have been scattered throughout the literature. Seasonal adjustment and real time trend-cycle prediction play an essential part at all levels of activity in modern economies. They are used by governments to counteract cyclical recessions, by central banks to control inflation, by decision makers for better modeling and planning and by hospitals, manufacturers, builders, transportat...

  20. Comparing treatment effects after adjustment with multivariable Cox proportional hazards regression and propensity score methods

    NARCIS (Netherlands)

    Martens, Edwin P; de Boer, Anthonius; Pestman, Wiebe R; Belitser, Svetlana V; Stricker, Bruno H Ch; Klungel, Olaf H

    PURPOSE: To compare adjusted effects of drug treatment for hypertension on the risk of stroke from propensity score (PS) methods with a multivariable Cox proportional hazards (Cox PH) regression in an observational study with censored data. METHODS: From two prospective population-based cohort
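A minimal illustration of the propensity score approach the abstract compares against multivariable regression, using synthetic data: fit a logistic model for treatment assignment, then estimate the treatment effect within propensity score strata. The single confounder, the true effect of 1.0, and stratification into quintiles are assumptions of this sketch, not the study's actual variables or survival-analysis setting.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Synthetic confounder, treatment assignment, and outcome.
x = rng.normal(size=n)                        # confounder (e.g., baseline risk)
p_treat = 1 / (1 + np.exp(-0.8 * x))          # sicker patients treated more often
treated = rng.binomial(1, p_treat)
y = 1.0 * treated + 2.0 * x + rng.normal(size=n)   # true treatment effect = 1.0

# Fit the propensity model P(treated | x) by Newton-Raphson logistic regression.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (treated - p)
    hess = (X * (p * (1 - p))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)
ps = 1 / (1 + np.exp(-X @ beta))

# Naive (confounded) estimate vs. estimate after stratifying on the PS.
naive = y[treated == 1].mean() - y[treated == 0].mean()
edges = np.quantile(ps, [0.2, 0.4, 0.6, 0.8])
stratum = np.digitize(ps, edges)
effects = [y[(stratum == s) & (treated == 1)].mean()
           - y[(stratum == s) & (treated == 0)].mean()
           for s in range(5)]
adjusted = np.mean(effects)
print(f"naive: {naive:.2f}, PS-stratified: {adjusted:.2f} (truth: 1.00)")
```

Stratification removes most, though not all, of the confounding; the naive contrast overstates the effect substantially while the stratified estimate lands near the truth.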

  1. HIV quality report cards: impact of case-mix adjustment and statistical methods.

    Science.gov (United States)

    Ohl, Michael E; Richardson, Kelly K; Goto, Michihiko; Vaughan-Sarrazin, Mary; Schweizer, Marin L; Perencevich, Eli N

    2014-10-15

    There will be increasing pressure to publicly report and rank the performance of healthcare systems on human immunodeficiency virus (HIV) quality measures. To inform discussion of public reporting, we evaluated the influence of case-mix adjustment when ranking individual care systems on the viral control quality measure. We used data from the Veterans Health Administration (VHA) HIV Clinical Case Registry and administrative databases to estimate case-mix-adjusted viral control for 91 local systems caring for 12,368 patients. We compared results using 2 adjustment methods, the observed-to-expected estimator and the risk-standardized ratio. Overall, 10,913 patients (88.2%) achieved viral control (viral load ≤400 copies/mL). Prior to case-mix adjustment, system-level viral control ranged from 51% to 100%. Seventeen (19%) systems were labeled as low outliers (performance significantly below the overall mean) and 11 (12%) as high outliers. Adjustment for case mix (patient demographics, comorbidity, CD4 nadir, time on therapy, and income from VHA administrative databases) reduced the number of low outliers by approximately one-third, but results differed by method. The adjustment model had moderate discrimination (c statistic = 0.66), suggesting potential for unadjusted risk when using administrative data to measure case mix. Case-mix adjustment affects rankings of care systems on the viral control quality measure. Given the sensitivity of rankings to the selection of case-mix adjustment methods, and the potential for unadjusted risk when using variables limited to current administrative databases, the HIV care community should explore optimal methods for case-mix adjustment before moving forward with public reporting. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
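The observed-to-expected estimator, one of the two adjustment methods named above, can be sketched on synthetic data: sum each system's model-predicted probabilities to get the expected count, then divide observed successes by it. The patient count, predicted probabilities, and system B's 10-point true performance deficit are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400

# Case-mix model output: each patient's predicted probability of viral control.
p_expected = rng.uniform(0.6, 0.95, n)

# System A performs as its case mix predicts; system B truly underperforms.
obs_a = rng.binomial(1, p_expected)
obs_b = rng.binomial(1, np.clip(p_expected - 0.10, 0, 1))

for name, obs in [("A", obs_a), ("B", obs_b)]:
    oe = obs.sum() / p_expected.sum()   # observed-to-expected estimator
    print(f"system {name}: crude rate {obs.mean():.2f}, O/E {oe:.2f}")
```

An O/E near 1 indicates performance in line with the case mix; system B's ratio falls visibly below 1 even though its crude rate alone would not reveal whether the deficit is due to case mix or care quality.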

  2. Risk-adjusted payment and performance assessment for primary care.

    Science.gov (United States)

    Ash, Arlene S; Ellis, Randall P

    2012-08-01

    Many wish to change incentives for primary care practices through bundled population-based payments and substantial performance feedback and bonus payments. Recognizing patient differences in costs and outcomes is crucial, but customized risk adjustment for such purposes is underdeveloped. Using MarketScan's claims-based data on 17.4 million commercially insured lives, we modeled bundled payment to support expected primary care activity levels (PCAL) and 9 patient outcomes for performance assessment. We evaluated models using 457,000 people assigned to 436 primary care physician panels, and among 13,000 people in a distinct multipayer medical home implementation with commercially insured, Medicare, and Medicaid patients. Each outcome is separately predicted from age, sex, and diagnoses. We define the PCAL outcome as a subset of all costs that proxies the bundled payment needed for comprehensive primary care. Other expected outcomes are used to establish targets against which actual performance can be fairly judged. We evaluate model performance using R² values at patient and practice levels, and within policy-relevant subgroups. The PCAL model explains 67% of variation in its outcome, performing well across diverse patient ages, payers, plan types, and provider specialties; it explains 72% of practice-level variation. In 9 performance measures, the outcome-specific models explain 17%-86% of variation at the practice level, often substantially outperforming a generic score like the one used for full capitation payments in Medicare: for example, with grouped R² values of 47% versus 5% for predicting "prescriptions for antibiotics of concern." Existing data can support the risk-adjusted bundled payment calculations and performance assessments needed to encourage desired transformations in primary care.

  3. A Fast and Effective Block Adjustment Method with Big Data

    Directory of Open Access Journals (Sweden)

    ZHENG Maoteng

    2017-02-01

    Full Text Available To deal with multi-source, complex and massive data in photogrammetry, and to solve the high memory requirement and low computation efficiency of the irregular normal equation caused by randomly aligned, large-scale datasets, we introduce the preconditioned conjugate gradient method combined with an inexact Newton method to solve the normal equation, which does not have strip characteristics because of the randomly aligned images. We also use an effective sparse matrix compression format to compress the big normal matrix, and a new bundle adjustment workflow is developed. Our method avoids direct inversion of the big normal matrix, and the memory requirement of the normal matrix is decreased by the proposed sparse matrix compression format. Combining all these techniques, the proposed method can not only decrease the memory requirement of the normal matrix but also largely improve the efficiency of bundle adjustment while maintaining the same accuracy as the conventional method. Preliminary experimental results show that the bundle adjustment of a dataset with about 4500 images and 9 million image points can be done in only 15 minutes while achieving sub-pixel accuracy.
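The central idea of solving the normal equation without inverting the normal matrix can be illustrated with a generic Jacobi-preconditioned conjugate gradient solver. This is a toy stand-in: the matrix sizes, the regularization, and the diagonal preconditioner are arbitrary choices for the sketch, not the authors' inexact-Newton pipeline or their sparse compression format.

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b (A symmetric positive definite) by conjugate gradients
    with a Jacobi (diagonal) preconditioner; A is only touched through
    matrix-vector products and its inverse is never formed."""
    m_inv = 1.0 / np.diag(A)          # Jacobi preconditioner M^-1
    x = np.zeros_like(b)
    r = b - A @ x
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = m_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD "normal matrix" stand-in for the bundle-adjustment system.
rng = np.random.default_rng(3)
J = rng.normal(size=(50, 10))         # mock Jacobian
A = J.T @ J + 10 * np.eye(10)         # normal matrix N = J^T J (regularized)
b = rng.normal(size=10)
x = jacobi_pcg(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```

In a real bundle adjustment, `A` would be a large sparse block matrix accessed through a compressed storage format, which is exactly why an iterative solver that needs only matrix-vector products is attractive.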

  4. Adjust the method of the FMEA to the requirements of the aviation industry

    Directory of Open Access Journals (Sweden)

    Andrzej FELLNER

    2015-12-01

    Full Text Available The article presents a summary of current methods used in aviation and rail transport. It also contains a proposal to adjust the FMEA method to the latest requirements of the airline industry. The authors suggest tables of the indicators Zn, Pr and Dt necessary to implement the FMEA method of risk analysis, taking into account current achievements in aerospace and rail safety. They also propose acceptable limits for the RPN number, which allow threats to be classified.
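The RPN-based classification the abstract describes can be sketched in a few lines: multiply the three indicator ratings and compare against an acceptability limit. The indicator names Zn, Pr and Dt follow the abstract; the 1-10 scales, the example failure modes, and the limit of 100 are illustrative assumptions, not the tables proposed by the authors.

```python
# Minimal FMEA risk-priority sketch.
RPN_LIMIT = 100  # assumed acceptability threshold (illustrative only)

failure_modes = [
    # (description, Zn severity, Pr occurrence, Dt detectability), each 1-10
    ("pitot tube icing undetected", 9, 3, 6),
    ("landing-gear sensor false reading", 7, 2, 3),
    ("cabin-pressure valve drift", 8, 4, 5),
]

results = []
for name, zn, pr, dt in failure_modes:
    rpn = zn * pr * dt  # risk priority number = Zn * Pr * Dt
    status = "UNACCEPTABLE" if rpn > RPN_LIMIT else "acceptable"
    results.append((name, rpn, status))
    print(f"{name}: RPN = {rpn} -> {status}")
```

Threats whose RPN exceeds the limit would be flagged for mitigation and re-scored after corrective action, which is the usual FMEA iteration loop.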

  5. Magnetic field adjustment structure and method for a tapered wiggler

    Science.gov (United States)

    Halbach, Klaus

    1988-01-01

    An improved method and structure is disclosed for adjusting the magnetic field generated by a group of electromagnet poles spaced along the path of a charged particle beam to compensate for energy losses in the charged particles which comprises providing more than one winding on at least some of the electromagnet poles; connecting one respective winding on each of several consecutive adjacent electromagnet poles to a first power supply, and the other respective winding on the electromagnet pole to a different power supply in staggered order; and independently adjusting one power supply to independently vary the current in one winding on each electromagnet pole in a group whereby the magnetic field strength of each of a group of electromagnet poles may be changed in smaller increments.

  6. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  7. Modified risk evaluation method

    International Nuclear Information System (INIS)

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection

  8. Risk selection and risk adjustment: improving insurance in the individual and small group markets.

    Science.gov (United States)

    Baicker, Katherine; Dow, William H

    2009-01-01

    Insurance market reforms face the key challenge of addressing the threat that risk selection poses to the availability of stable, high-value insurance policies that provide long-term risk protection. Many of the strategies in use today fail to address this breakdown in risk pooling, and some even exacerbate it. Flexible risk adjustment schemes are a promising avenue for promoting market stability and limiting insurer cream-skimming, potentially providing greater benefits at lower cost. Reforms intended to increase insurance coverage and the value of care delivered will be much more effective if implemented in conjunction with policies that address these fundamental selection issues.

  9. Risk-adjusted hospital outcomes for children's surgery.

    Science.gov (United States)

    Saito, Jacqueline M; Chen, Li Ern; Hall, Bruce L; Kraemer, Kari; Barnhart, Douglas C; Byrd, Claudia; Cohen, Mark E; Fei, Chunyuan; Heiss, Kurt F; Huffman, Kristopher; Ko, Clifford Y; Latus, Melissa; Meara, John G; Oldham, Keith T; Raval, Mehul V; Richards, Karen E; Shah, Rahul K; Sutton, Laura C; Vinocur, Charles D; Moss, R Lawrence

    2013-09-01

    BACKGROUND: The American College of Surgeons National Surgical Quality Improvement Program-Pediatric was initiated in 2008 to drive quality improvement in children's surgery. Low mortality and morbidity in previous analyses limited differentiation of hospital performance. Participating institutions included children's units within general hospitals and free-standing children's hospitals. Cases selected by Current Procedural Terminology codes encompassed procedures within pediatric general, otolaryngologic, orthopedic, urologic, plastic, neurologic, thoracic, and gynecologic surgery. Trained personnel abstracted demographic, surgical profile, preoperative, intraoperative, and postoperative variables. Incorporating procedure-specific risk, hierarchical models for 30-day mortality and morbidities were developed with significant predictors identified by stepwise logistic regression. Reliability was estimated to assess the balance of information versus error within models. In 2011, 46,281 patients from 43 hospitals were accrued; 1,467 codes were aggregated into 226 groupings. Overall mortality was 0.3%, composite morbidity 5.8%, and surgical site infection (SSI) 1.8%. Hierarchical models revealed outlier hospitals with above or below expected performance for composite morbidity in the entire cohort, pediatric abdominal subgroup, and spine subgroup; SSI in the entire cohort and pediatric abdominal subgroup; and urinary tract infection in the entire cohort. Based on reliability estimates, mortality discriminates performance poorly due to very low event rate; however, reliable model construction for composite morbidity and SSI that differentiate institutions is feasible. The National Surgical Quality Improvement Program-Pediatric expansion has yielded risk-adjusted models to differentiate hospital performance in composite and specific morbidities. However, mortality has low utility as a children's surgery performance indicator. Programmatic improvements have resulted in

  10. Health-Based Capitation Risk Adjustment in Minnesota Public Health Care Programs

    Science.gov (United States)

    Gifford, Gregory A.; Edwards, Kevan R.; Knutson, David J.

    2004-01-01

    This article documents the history and implementation of health-based capitation risk adjustment in Minnesota public health care programs, and identifies key implementation issues. Capitation payments in these programs are risk adjusted using an historical, health plan risk score, based on concurrent risk assessment. Phased implementation of capitation risk adjustment for these programs began January 1, 2000. Minnesota's experience with capitation risk adjustment suggests that: (1) implementation can accelerate encounter data submission, (2) administrative decisions made during implementation can create issues that impact payment model performance, and (3) changes in diagnosis data management during implementation may require changes to the payment model. PMID:25372356

  11. Remotely adjustable fishing jar and method for using same

    International Nuclear Information System (INIS)

    Wyatt, W.B.

    1992-01-01

    This patent describes a method for providing a jarring force to dislodge objects stuck in well bores, the method comprising: connecting a jarring tool between an operating string and an object in a well bore; selecting a jarring force to be applied to the object; setting the selected reference jarring force into a mechanical memory mechanism by progressively engaging a first latch body and a second latch body; retaining the reference jarring force in the mechanical memory mechanism during diminution of tensional force applied by the operating string; and initiating an upwardly directed impact force within the jarring tool by increasing tensional force on the operating string to a value greater than the tensional force corresponding with the selected jarring force. This patent also describes a remotely adjustable downhole fishing jar apparatus comprising: an operating mandrel; an impact release spring; a mechanical memory mechanism; and releasable latching means

  12. Risk-adjusted performance evaluation in three academic thoracic surgery units using the Eurolung risk models.

    Science.gov (United States)

    Pompili, Cecilia; Shargall, Yaron; Decaluwe, Herbert; Moons, Johnny; Chari, Madhu; Brunelli, Alessandro

    2018-01-03

    The objective of this study was to evaluate the performance of 3 thoracic surgery centres using the Eurolung risk models for morbidity and mortality. This was a retrospective analysis performed on data collected from 3 academic centres (2014-2016). Seven hundred and twenty-one patients in Centre 1, 857 patients in Centre 2 and 433 patients in Centre 3 who underwent anatomical lung resections were analysed. The Eurolung1 and Eurolung2 models were used to predict risk-adjusted cardiopulmonary morbidity and 30-day mortality rates. Observed and risk-adjusted outcomes were compared within each centre. The observed morbidity of Centre 1 was in line with the predicted morbidity (observed 21.1% vs predicted 22.7%, P = 0.31). Centre 2 performed better than expected (observed morbidity 20.2% vs predicted 26.7%). The Eurolung models were successfully used as risk-adjusting instruments to internally audit the outcomes of 3 different centres, showing their applicability for future quality improvement initiatives. © The Author(s) 2018. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  13. 12 CFR 615.5210 - Risk-adjusted assets.

    Science.gov (United States)

    2010-01-01

    ... appropriate credit conversion factor in § 615.5212, is assigned to one of the risk categories specified in... risk-based capital requirement for the credit-enhanced assets, the risk-based capital required under..., determine the appropriate risk weight for any asset or credit equivalent amount that does not fit wholly...

  14. Use of risk-adjusted CUSUM charts to monitor 30-day mortality in Danish hospitals

    Directory of Open Access Journals (Sweden)

    Rasmussen TB

    2018-04-01

    Full Text Available Thomas Bøjer Rasmussen, Sinna Pilgaard Ulrichsen, Mette Nørgaard Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus N, Denmark Background: Monitoring hospital outcomes and clinical processes as a measure of clinical performance is an integral part of modern health care. The risk-adjusted cumulative sum (CUSUM) chart is a frequently used sequential analysis technique that can be implemented to monitor a wide range of different types of outcomes. Objective: The aim of this study was to describe how risk-adjusted CUSUM charts based on population-based nationwide medical registers were used to monitor 30-day mortality in Danish hospitals and to give an example of how alarms of increased hospital mortality from the charts can guide further in-depth analyses. Materials and methods: We used routinely collected administrative data from the Danish National Patient Registry and the Danish Civil Registration System to create risk-adjusted CUSUM charts. We monitored 30-day mortality after hospital admission with one of 77 selected diagnoses in 24 hospital units in Denmark in 2015. The charts were set to detect a 50% increase in 30-day mortality, and control limits were determined by simulations. Results: Among 1,085,576 hospital admissions, 441,352 admissions had one of the 77 selected diagnoses as their primary diagnosis and were included in the risk-adjusted CUSUM charts. The charts yielded a total of eight alarms of increased mortality. The median of the hospitals’ estimated average time to detect a 50% increase in 30-day mortality was 50 days (interquartile interval, 43-54). In the selected example of an alarm, descriptive analyses indicated performance problems with 30-day mortality following hip fracture surgery and diagnosis of chronic obstructive pulmonary disease. Conclusion: The presented implementation of risk-adjusted CUSUM charts can detect significant increases in 30-day mortality within 2 months, on average, in most
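The mechanics of such a monitoring chart can be sketched as follows. This is a generic Steiner-style log-likelihood-ratio CUSUM for a doubled-or-halved odds alternative, not the registry-specific implementation from the study; the control limit of 4.5 is purely illustrative (the study determined its limits by simulation):

```python
import math

def cusum_weight(died, risk, odds_ratio=1.5):
    """Log-likelihood-ratio weight for one patient: outcome `died` (0/1),
    model-predicted mortality `risk`, testing an odds increase `odds_ratio`."""
    return died * math.log(odds_ratio) - math.log(1 - risk + odds_ratio * risk)

def risk_adjusted_cusum(outcomes, risks, odds_ratio=1.5, control_limit=4.5):
    """Run the chart; returns the CUSUM path and the indices of any alarms."""
    s, path, alarms = 0.0, [], []
    for i, (died, risk) in enumerate(zip(outcomes, risks)):
        s = max(0.0, s + cusum_weight(died, risk, odds_ratio))
        path.append(s)
        if s >= control_limit:
            alarms.append(i)
            s = 0.0  # restart the chart after an alarm
    return path, alarms
```

Deaths push the statistic up, survivals pull it down in proportion to predicted risk, so a run of deaths among low-risk patients triggers an alarm much sooner than the same run among high-risk patients.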

  15. Evaluation of trauma care using TRISS method: the role of adjusted misclassification rate and adjusted w-statistic

    Directory of Open Access Journals (Sweden)

    Bytyçi Cen I

    2009-01-01

    Full Text Available Abstract Background Major trauma is a leading cause of death worldwide. Evaluation of trauma care using the Trauma and Injury Severity Score (TRISS) method focuses on trauma outcomes (deaths and survivors). The TRISS misclassification rate is used to test the TRISS method. By calculating the w-statistic, the difference between observed and TRISS-expected survivors, we compare our trauma care results with the TRISS standard. Aim The aim of this study is to analyze the interaction between the misclassification rate and the w-statistic and to adjust these parameters to be closer to the truth. Materials and methods Analysis of the components of the TRISS misclassification rate, the w-statistic and actual trauma outcome. Results The false negative (FN) component (deaths unexpected by the TRISS method) has two parts: preventable (Pd) and non-preventable (nonPd) trauma deaths. Pd reflects inappropriate trauma care at an institution, whereas non-preventable trauma deaths reflect errors in the TRISS method. Removing patients with preventable trauma deaths gives an adjusted misclassification rate: (FP + FN - Pd)/N or (b+c-Pd)/N. Subtracting nonPd from the FN value in the w-statistic formula gives an adjusted w-statistic: [FP-(FN - nonPd)]/N, that is (FP-Pd)/N or (b-Pd)/N. Conclusion Because the adjusted formulas clean the method of inappropriate trauma care, and clean trauma care of the method's error, the TRISS adjusted misclassification rate and adjusted w-statistic give more realistic results and may be used in trauma outcome research.

  16. Evaluation of trauma care using TRISS method: the role of adjusted misclassification rate and adjusted w-statistic.

    Science.gov (United States)

    Llullaku, Sadik S; Hyseni, Nexhmi Sh; Bytyçi, Cen I; Rexhepi, Sylejman K

    2009-01-15

    Major trauma is a leading cause of death worldwide. Evaluation of trauma care using the Trauma and Injury Severity Score (TRISS) method focuses on trauma outcomes (deaths and survivors). The TRISS misclassification rate is used to test the TRISS method. By calculating the w-statistic, the difference between observed and TRISS-expected survivors, we compare our trauma care results with the TRISS standard. The aim of this study is to analyze the interaction between the misclassification rate and the w-statistic and to adjust these parameters to be closer to the truth. Analysis of the components of the TRISS misclassification rate, the w-statistic and actual trauma outcome. The false negative (FN) component (deaths unexpected by the TRISS method) has two parts: preventable (Pd) and non-preventable (nonPd) trauma deaths. Pd reflects inappropriate trauma care at an institution, whereas non-preventable trauma deaths reflect errors in the TRISS method. Removing patients with preventable trauma deaths gives an adjusted misclassification rate: (FP + FN - Pd)/N or (b+c-Pd)/N. Subtracting nonPd from the FN value in the w-statistic formula gives an adjusted w-statistic: [FP-(FN - nonPd)]/N, that is (FP-Pd)/N or (b-Pd)/N. Because the adjusted formulas clean the method of inappropriate trauma care, and clean trauma care of the method's error, the TRISS adjusted misclassification rate and adjusted w-statistic give more realistic results and may be used in trauma outcome research.
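The adjusted formulas above are plain arithmetic on the confusion-table counts; a minimal sketch (function and variable names are mine, not the authors'):

```python
def adjusted_misclassification_rate(fp, fn, preventable_deaths, n):
    """(FP + FN - Pd)/N: misclassification with preventable deaths removed,
    so the rate reflects the TRISS method's error, not the institution's care."""
    return (fp + fn - preventable_deaths) / n

def adjusted_w_statistic(fp, fn, preventable_deaths, n):
    """[FP - (FN - nonPd)]/N, which algebraically reduces to (FP - Pd)/N,
    so the w-statistic reflects care quality, not the method's error."""
    non_preventable = fn - preventable_deaths
    return (fp - (fn - non_preventable)) / n
```

For example, with FP = 4 unexpected survivors, FN = 6 unexpected deaths of which Pd = 2 were preventable, and N = 100 patients, the adjusted misclassification rate is 0.08 and the adjusted w-statistic is 0.02.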

  17. New ventures require accurate risk analyses and adjustments.

    Science.gov (United States)

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.

  18. Adjusted Empirical Likelihood Method in the Presence of Nuisance Parameters with Application to the Sharpe Ratio

    Directory of Open Access Journals (Sweden)

    Yuejiao Fu

    2018-04-01

    Full Text Available The Sharpe ratio is a widely used risk-adjusted performance measurement in economics and finance. Most of the known statistical inferential methods devoted to the Sharpe ratio are based on the assumption that the data are normally distributed. In this article, without making any distributional assumption on the data, we develop the adjusted empirical likelihood method to obtain inference for a parameter of interest in the presence of nuisance parameters. We show that the log adjusted empirical likelihood ratio statistic is asymptotically distributed as the chi-square distribution. The proposed method is applied to obtain inference for the Sharpe ratio. Simulation results illustrate that the proposed method is comparable to Jobson and Korkie’s method (1981) and outperforms the empirical likelihood method when the data are from a symmetric distribution. In addition, when the data are from a skewed distribution, the proposed method significantly outperforms all other existing methods. A real-data example is analyzed to exemplify the application of the proposed method.
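The quantity being inferred is itself simple to compute; below is the plug-in Sharpe ratio estimate that the inference targets (the adjusted-empirical-likelihood machinery is beyond a short sketch, and the choice of a sample standard deviation with `n - 1` and a `risk_free` rate of 0 are conventions of this sketch, not the paper's):

```python
import math

def sharpe_ratio(returns, risk_free=0.0):
    """Plug-in Sharpe ratio: mean excess return over its sample standard
    deviation. No normality assumption is needed for the point estimate."""
    n = len(returns)
    excess = [r - risk_free for r in returns]
    mean = sum(excess) / n
    var = sum((x - mean) ** 2 for x in excess) / (n - 1)
    return mean / math.sqrt(var)
```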

  19. Simple method for generating adjustable trains of picosecond electron bunches

    Directory of Open Access Journals (Sweden)

    P. Muggli

    2010-05-01

    Full Text Available A simple, passive method for producing an adjustable train of picosecond electron bunches is demonstrated. The key component of this method is an electron beam mask consisting of an array of parallel wires that selectively spoils the beam emittance. This mask is positioned in a high magnetic dispersion, low beta-function region of the beam line. The incoming electron beam striking the mask has a time/energy correlation that corresponds to a time/position correlation at the mask location. The mask pattern is transformed into a time pattern or train of bunches when the dispersion is brought back to zero downstream of the mask. Results are presented of a proof-of-principle experiment demonstrating this novel technique that was performed at the Brookhaven National Laboratory Accelerator Test Facility. This technique allows for easy tailoring of the bunch train for a particular application, including varying the bunch width and spacing, and enabling the generation of a trailing witness bunch.

  20. Health plans and selection: formal risk adjustment vs. market design and contracts.

    Science.gov (United States)

    Frank, R G; Rosenthal, M B

    2001-01-01

    In this paper, we explore the demand for risk adjustment by health plans that contract with private employers by considering the conditions under which plans might value risk adjustment. Three factors reduce the value of risk adjustment from the plans' point of view. First, only a relatively small segment of privately insured Americans face a choice of competing health plans. Second, health plans share much of their insurance risk with payers, providers, and reinsurers. Third, de facto experience rating that occurs during the premium negotiation process and management of coverage appear to substitute for risk adjustment. While the current environment has not generated much demand for risk adjustment, we reflect on its future potential.

  1. Impact of Race/Ethnicity and Socioeconomic Status on Risk-Adjusted Hospital Readmission Rates Following Hip and Knee Arthroplasty.

    Science.gov (United States)

    Martsolf, Grant R; Barrett, Marguerite L; Weiss, Audrey J; Kandrack, Ryan; Washington, Raynard; Steiner, Claudia A; Mehrotra, Ateev; SooHoo, Nelson F; Coffey, Rosanna

    2016-08-17

    Readmission rates following total hip arthroplasty (THA) and total knee arthroplasty (TKA) are increasingly used to measure hospital performance. Readmission rates that are not adjusted for race/ethnicity and socioeconomic status, patient risk factors beyond a hospital's control, may not accurately reflect a hospital's performance. In this study, we examined the extent to which risk-adjusting for race/ethnicity and socioeconomic status affected hospital performance in terms of readmission rates following THA and TKA. We calculated 2 sets of risk-adjusted readmission rates by (1) using the Centers for Medicare & Medicaid Services standard risk-adjustment algorithm that incorporates patient age, sex, comorbidities, and hospital effects and (2) adding race/ethnicity and socioeconomic status to the model. Using data from the Healthcare Cost and Utilization Project, 2011 State Inpatient Databases, we compared the relative performances of 1,194 hospitals across the 2 methods. Addition of race/ethnicity and socioeconomic status to the risk-adjustment algorithm resulted in (1) little or no change in the risk-adjusted readmission rates at nearly all hospitals; (2) no change in the designation of the readmission rate as better, worse, or not different from the population mean at >99% of the hospitals; and (3) no change in the excess readmission ratio at >97% of the hospitals. Inclusion of race/ethnicity and socioeconomic status in the risk-adjustment algorithm led to a relative-performance change in readmission rates following THA and TKA at only a small fraction of hospitals, informing the debate over including socioeconomic status in risk-adjusted THA and TKA readmission rates used for hospital accountability, payment, and public reporting. Prognostic Level III. See instructions for Authors for a complete description of levels of evidence. Copyright © 2016 by The Journal of Bone and Joint Surgery, Incorporated.
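The "excess readmission ratio" compared across the two models is, in simplified form, an observed-to-expected ratio. The sketch below uses a plain O/E ratio rather than CMS's hierarchical predicted-to-expected calculation, so treat it as an illustration of the idea only:

```python
def excess_readmission_ratio(readmitted, predicted_risks):
    """Observed readmissions divided by the model-expected count.

    `readmitted` holds 0/1 outcomes per discharge; `predicted_risks` holds the
    risk model's predicted readmission probability for the same discharges.
    A ratio above 1 means more readmissions than the risk model expected.
    """
    return sum(readmitted) / sum(predicted_risks)
```

Computing this ratio twice per hospital, once under the standard model and once under the model with race/ethnicity and socioeconomic covariates added, shows how much the extra covariates move a hospital's apparent performance.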

  2. Monitoring risk-adjusted outcomes in congenital heart surgery: does the appropriateness of a risk model change with time?

    Science.gov (United States)

    Tsang, Victor T; Brown, Katherine L; Synnergren, Mats Johanssen; Kang, Nicholas; de Leval, Marc R; Gallivan, Steve; Utley, Martin

    2009-02-01

    Risk adjustment of outcomes in pediatric congenital heart surgery is challenging due to the great diversity in diagnoses and procedures. We have previously shown that variable life-adjusted display (VLAD) charts provide an effective graphic display of risk-adjusted outcomes in this specialty. A question arises as to whether the risk model used remains appropriate over time. We used a recently developed graphic technique to evaluate the performance of an existing risk model among those patients at a single center during 2000 to 2003 originally used in model development. We then compared the distribution of predicted risk among these patients with that among patients in 2004 to 2006. Finally, we constructed a VLAD chart of risk-adjusted outcomes for the latter period. Among 1083 patients between April 2000 and March 2003, the risk model performed well at predicted risks above 3%, underestimated mortality at 2% to 3% predicted risk, and overestimated mortality below 2% predicted risk. There was little difference in the distribution of predicted risk among these patients and among 903 patients between June 2004 and October 2006. Outcomes for the more recent period were appreciably better than those expected according to the risk model. This finding cannot be explained by any apparent bias in the risk model combined with changes in case-mix. Risk models can, and hopefully do, become out of date. There is scope for complacency in the risk-adjusted audit if the risk model used is not regularly recalibrated to reflect changing standards and expectations.
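A VLAD chart plots the running total of expected minus observed deaths, so the curve drifts upward when a unit outperforms its risk model and downward when it underperforms; a minimal sketch:

```python
def vlad_path(outcomes, predicted_risks):
    """Variable life-adjusted display: cumulative 'statistical lives saved'.

    Each survivor (outcome 0) adds their predicted risk p to the total;
    each death (outcome 1) subtracts 1 - p.
    """
    total, path = 0.0, []
    for died, p in zip(outcomes, predicted_risks):
        total += p - died
        path.append(total)
    return path
```

A miscalibrated risk model, of the kind the study describes at low predicted risks, shows up as a systematic drift in this path even when care is unchanged, which is why regular recalibration matters.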

  3. Do insurers respond to risk adjustment? A long-term, nationwide analysis from Switzerland.

    Science.gov (United States)

    von Wyl, Viktor; Beck, Konstantin

    2016-03-01

    Community rating in social health insurance calls for risk adjustment in order to eliminate incentives for risk selection. Swiss risk adjustment is known to be insufficient, and substantial risk selection incentives remain. This study develops five indicators to monitor residual risk selection. Three indicators target activities of conglomerates of insurers (with the same ownership), which steer enrollees into specific carriers based on applicants' risk profiles. As a proxy for their market power, those indicators estimate the amount of premium-, health care cost-, and risk-adjustment transfer variability that is attributable to conglomerates. Two additional indicators, derived from linear regression, describe the amount of residual cost differences between insurers that are not covered by risk adjustment. All indicators measuring conglomerate-based risk selection activities showed increases between 1996 and 2009, paralleling the establishment of new conglomerates. At their maxima in 2009, the indicator values imply that 56% of the net risk adjustment volume, 34% of premium variability, and 51% cost variability in the market were attributable to conglomerates. From 2010 onwards, all indicators decreased, coinciding with a pre-announced risk adjustment reform implemented in 2012. Likewise, the regression-based indicators suggest that the volume and variance of residual cost differences between insurers that are not equaled out by risk adjustment have decreased markedly since 2009 as a result of the latest reform. Our analysis demonstrates that risk-selection, especially by conglomerates, is a real phenomenon in Switzerland. However, insurers seem to have reduced risk selection activities to optimize their losses and gains from the latest risk adjustment reform.
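An indicator of the form "share of variability attributable to conglomerates" can be approximated by the between-group share of the total sum of squares; the study's actual indicators are more elaborate, so this is only a sketch of the underlying idea:

```python
from collections import defaultdict

def between_group_share(values, groups):
    """Fraction of total sum of squares explained by group membership
    (e.g. premium variability attributable to insurer conglomerates)."""
    overall = sum(values) / len(values)
    by_group = defaultdict(list)
    for v, g in zip(values, groups):
        by_group[g].append(v)
    ss_total = sum((v - overall) ** 2 for v in values)
    ss_between = sum(
        len(vs) * (sum(vs) / len(vs) - overall) ** 2 for vs in by_group.values()
    )
    return ss_between / ss_total
```

A share near 1 means nearly all variation in premiums, costs, or risk-adjustment transfers lines up with conglomerate boundaries, consistent with steering of enrollees; a share near 0 means conglomerate membership explains little.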

  4. Refining Risk Adjustment for the Proposed CMS Surgical Hip and Femur Fracture Treatment Bundled Payment Program.

    Science.gov (United States)

    Cairns, Mark A; Ostrum, Robert F; Clement, R Carter

    2018-02-21

    The U.S. Centers for Medicare & Medicaid Services (CMS) has been considering the implementation of a mandatory bundled payment program, the Surgical Hip and Femur Fracture Treatment (SHFFT) model. However, bundled payments without appropriate risk adjustment may be inequitable to providers and may restrict access to care for certain patients. The SHFFT proposal includes adjustment using the Diagnosis-Related Group (DRG) and geographic location. The goal of the current study was to identify and quantify patient factors that could improve risk adjustment for SHFFT bundled payments. We retrospectively reviewed a 5% random sample of Medicare data from 2008 to 2012. A total of 27,898 patients were identified who met SHFFT inclusion criteria (DRG 480, 481, and 482). Reimbursement was determined for each patient over the bundle period (the surgical hospitalization and 90 days of post-discharge care). Multivariable regression was performed to test demographic factors, comorbidities, geographic location, and specific surgical procedures for associations with reimbursement. The average reimbursement was $23,632 ± $17,587. On average, reimbursements for male patients were $1,213 higher than for female patients. Age was also associated with payments; e.g., reimbursement for those ≥85 years of age averaged $2,282 ± $389 less than for those aged 65 to 69. Most comorbidities were associated with higher reimbursement, but dementia was associated with lower payments, by an average of $2,354 ± $243. The most common procedures varied in average reimbursement from $22,527 to $24,033, and less common procedures varied by >$20,000 in average reimbursement. DRG 480 was reimbursed by an average of $10,421 ± $543 more than DRG 482. Payments varied significantly by state (p ≤ 0.01). Risk adjustment incorporating specific comorbidities demonstrated better performance than with use of DRG alone (r = 0.22 versus 0.15). Our results suggest that the proposed SHFFT bundled payment model should use more robust risk-adjustment methods to ensure that providers are reimbursed fairly and that

  5. LSL: a logarithmic least-squares adjustment method

    International Nuclear Information System (INIS)

    Stallmann, F.W.

    1982-01-01

    To meet regulatory requirements, spectral unfolding codes must not only provide reliable estimates for spectral parameters, but must also be able to determine the uncertainties associated with these parameters. The newer codes, which are more appropriately called adjustment codes, use the least squares principle to determine estimates and uncertainties. The principle is simple and straightforward, but there are several different mathematical models to describe the unfolding problem. In addition to a sound mathematical model, ease of use and range of options are important considerations in the construction of adjustment codes. Based on these considerations, a least squares adjustment code for neutron spectrum unfolding was constructed some time ago and tentatively named LSL.
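The least-squares update at the heart of such adjustment codes is compact. Below is a sketch of one generalized-least-squares step; LSL specifically applies this kind of update to logarithms of the spectrum parameters, and the names (prior estimate `x0` with covariance `P`, response matrix `A`, measurements `m` with covariance `V`) are illustrative, not the code's own:

```python
import numpy as np

def ls_adjust(x0, P, A, m, V):
    """One generalized-least-squares adjustment step.

    Returns the adjusted parameter vector and its (reduced) covariance,
    combining prior knowledge with measurements m = A @ x + noise.
    """
    r = m - A @ x0                  # residual: measured minus computed response
    S = A @ P @ A.T + V             # covariance of the residual
    K = P @ A.T @ np.linalg.inv(S)  # gain: how strongly to trust the residual
    x_adj = x0 + K @ r
    P_adj = P - K @ A @ P           # adjustment always shrinks the uncertainty
    return x_adj, P_adj
```

In the scalar case with equal prior and measurement variance, the adjusted estimate lands halfway between prior and measurement and the variance halves, which is the expected least-squares behaviour.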

  6. Development and Evaluation of an Automated Machine Learning Algorithm for In-Hospital Mortality Risk Adjustment Among Critical Care Patients.

    Science.gov (United States)

    Delahanty, Ryan J; Kaufman, David; Jones, Spencer S

    2018-06-01

    Risk adjustment algorithms for ICU mortality are necessary for measuring and improving ICU performance. Existing risk adjustment algorithms are not widely adopted. Key barriers to adoption include licensing and implementation costs as well as labor costs associated with human-intensive data collection. Widespread adoption of electronic health records makes automated risk adjustment feasible. Using modern machine learning methods and open source tools, we developed and evaluated a retrospective risk adjustment algorithm for in-hospital mortality among ICU patients. The Risk of Inpatient Death score can be fully automated and is reliant upon data elements that are generated in the course of usual hospital processes. One hundred thirty-one ICUs in 53 hospitals operated by Tenet Healthcare. A cohort of 237,173 ICU patients discharged between January 2014 and December 2016. The data were randomly split into training (36 hospitals), and validation (17 hospitals) data sets. Feature selection and model training were carried out using the training set while the discrimination, calibration, and accuracy of the model were assessed in the validation data set. Model discrimination was evaluated based on the area under receiver operating characteristic curve; accuracy and calibration were assessed via adjusted Brier scores and visual analysis of calibration curves. Seventeen features, including a mix of clinical and administrative data elements, were retained in the final model. The Risk of Inpatient Death score demonstrated excellent discrimination (area under receiver operating characteristic curve = 0.94) and calibration (adjusted Brier score = 52.8%) in the validation dataset; these results compare favorably to the published performance statistics for the most commonly used mortality risk adjustment algorithms. Low adoption of ICU mortality risk adjustment algorithms impedes progress toward increasing the value of the healthcare delivered in ICUs. The Risk of Inpatient Death
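The two headline metrics are easy to compute directly; a dependency-free sketch follows. Note the study reports an *adjusted* Brier score (a scaled variant), whereas this is the raw Brier score:

```python
def roc_auc(y_true, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the probability a random positive case outscores a random negative one."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier_score(y_true, probs):
    """Mean squared error between predicted probabilities and outcomes;
    lower is better, 0 is perfect."""
    return sum((p - y) ** 2 for y, p in zip(y_true, probs)) / len(y_true)
```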

  7. Latino risk-adjusted mortality in the men screened for the Multiple Risk Factor Intervention Trial.

    Science.gov (United States)

    Thomas, Avis J; Eberly, Lynn E; Neaton, James D; Smith, George Davey

    2005-09-15

    Latinos are now the largest minority in the United States, but their distinctive health needs and mortality patterns remain poorly understood. Proportional hazards regressions were used to compare Latino versus White risk- and income-adjusted mortality over 25 years' follow-up from 5,846 Latino and 300,647 White men screened for the Multiple Risk Factor Intervention Trial. Men were aged 35-57 years and residing in 14 states when screened in 1973-1975. Data on coronary heart disease risk factors, self-reported race/ethnicity, and home addresses were obtained at baseline; income was estimated by linking addresses to census data. Mortality follow-up through 1999 was obtained using the National Death Index. The fully adjusted Latino/White hazard ratio for all-cause mortality was 0.82 (95% confidence interval (CI): 0.77, 0.87), based on 1,085 Latino and 73,807 White deaths; this pattern prevailed over time and across states (thus, likely across Latino subgroups). Hazard ratios were significantly greater than one for stroke (hazard ratio = 1.30, 95% CI: 1.01, 1.68), liver cancer (hazard ratio = 2.02, 95% CI: 1.21, 3.37), and infection (hazard ratio = 1.69, 95% CI: 1.24, 2.32). A substudy found only minor racial/ethnic differences in the quality of Social Security numbers, birth dates, soundex-adjusted names, and National Death Index searches. Results were not likely an artifact of return migration or incomplete mortality data.

  8. Some adjustments to the human capital and the friction cost methods.

    Science.gov (United States)

    Targoutzidis, Antonis

    2018-03-21

    The cost of lost output is a major component of total cost-of-illness estimates, especially those for the cost of workplace accidents and diseases. The two main methods for estimating this output, the human capital and the friction cost method, lead to very different results, particularly for cases of long-term absence, which makes the choice of method a critical dilemma. Two hidden assumptions, one for each method, are identified in this paper: for the human capital method, the assumption that, had the accident not happened, the individual would have remained alive, healthy and employed until retirement; and for the friction cost method, the assumption that any created vacancy is covered by an unemployed person. Relevant adjustments to compensate for their impact are proposed: (a) to depreciate the estimates of the human capital method for the risks of premature death, disability or unemployment and (b) to multiply the estimates of the friction cost method by the expected number of job shifts that will be caused by a disability. The impact of these adjustments on the final estimates is very important in terms of magnitude and can lead to better results for each method.
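The two proposed adjustments translate directly into arithmetic; a sketch with hypothetical inputs (the survival-and-employment probability curve and the expected job-shift count would come from actuarial and labour-market data, and the discount rate is an assumption of this sketch):

```python
def adjusted_human_capital(annual_output, years_to_retirement,
                           p_alive_employed, discount_rate=0.03):
    """Human capital estimate with each future year's lost output depreciated
    by the probability the person would still have been alive, healthy and
    employed in year t, then discounted to present value."""
    return sum(
        annual_output * p_alive_employed(t) / (1 + discount_rate) ** t
        for t in range(1, years_to_retirement + 1)
    )

def adjusted_friction_cost(cost_per_vacancy, expected_job_shifts):
    """Friction cost multiplied by the expected number of job shifts the
    disability will cause over the person's remaining career."""
    return cost_per_vacancy * expected_job_shifts
```

With probability 1 and no discounting the first function reduces to the unadjusted human capital estimate, which makes the depreciation effect easy to check.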

  9. The risk-adjusted vision beyond casemix (DRG) funding in Australia. International lessons in high complexity and capitation.

    Science.gov (United States)

    Antioch, Kathryn M; Walsh, Michael K

    2004-06-01

    Hospitals throughout the world using funding based on diagnosis-related groups (DRG) have incurred substantial budgetary deficits, despite high efficiency. We identify the limitations of DRG funding, which lacks risk (severity) adjustment for State-wide referral services. Methods to risk adjust DRGs are instructive. The average price in casemix funding in the Australian State of Victoria is policy based, not benchmarked. Average cost weights are too low for high-complexity DRGs relating to State-wide referral services such as heart and lung transplantation and trauma. Risk-adjusted specified grants (RASG) are required for five high-complexity respiratory, cardiology and stroke DRGs incurring annual deficits of $3.6 million due to high casemix complexity and government under-funding despite high efficiency. Five stepwise linear regressions for each DRG excluded non-significant variables and assessed heteroskedasticity and multicollinearity. Cost per patient was the dependent variable. Significant independent variables were age, length-of-stay outliers, number of disease types, diagnoses, procedures and emergency status. Diagnosis and procedure severity markers were identified. The methodology and the work of the State-wide Risk Adjustment Working Group can facilitate risk adjustment of DRGs State-wide and for Treasury negotiations for expenditure growth. The Alfred Hospital previously negotiated RASG of $14 million over 5 years for three trauma and chronic DRGs. Some chronic diseases require risk-adjusted capitation funding models for Australian Health Maintenance Organizations as an alternative to casemix funding. The use of Diagnostic Cost Groups can facilitate State and Federal government reform via new population-based risk adjusted funding models that measure health need.

  10. A system and method for adjusting and presenting stereoscopic content

    DEFF Research Database (Denmark)

    2013-01-01

    on the basis of one or more vision-specific parameters (θM, θmax, θmin, Δθ) indicating abnormal vision for the user. In this way, presentation of stereoscopic content that is adjusted specifically to the given person is enabled. This may e.g. be used for training purposes or for improved

  11. Method for adjusting warp measurements to a different board dimension

    Science.gov (United States)

    William T. Simpson; John R. Shelly

    2000-01-01

    Warp in lumber is a common problem that occurs while lumber is being dried. In research or other testing programs, it is sometimes necessary to compare warp of different species or warp caused by different process variables. If lumber dimensions are not the same, then direct comparisons are not possible, and adjusting warp to a common dimension would be desirable so...

  12. The risk-adjusted performance of companies with female directors: A South African case

    Directory of Open Access Journals (Sweden)

    Mkhethwa Mkhize

    2013-04-01

    Full Text Available The objective of this research was to examine the effects of female directors on the risk-adjusted performance of firms listed on the JSE Securities Exchange of South Africa (the JSE). The theoretical underpinning for the relationship between representation of female directors and the risk-adjusted performance of companies was based on institutional theory. The hypothesis that there is no difference between the risk-adjusted performance of companies with female directors and that of companies without female directors was rejected. Implications of the results are discussed and suggestions for future research presented.

  13. Risk-adjusted Outcomes of Clinically Relevant Pancreatic Fistula Following Pancreatoduodenectomy: A Model for Performance Evaluation.

    Science.gov (United States)

    McMillan, Matthew T; Soi, Sameer; Asbun, Horacio J; Ball, Chad G; Bassi, Claudio; Beane, Joal D; Behrman, Stephen W; Berger, Adam C; Bloomston, Mark; Callery, Mark P; Christein, John D; Dixon, Elijah; Drebin, Jeffrey A; Castillo, Carlos Fernandez-Del; Fisher, William E; Fong, Zhi Ven; House, Michael G; Hughes, Steven J; Kent, Tara S; Kunstman, John W; Malleo, Giuseppe; Miller, Benjamin C; Salem, Ronald R; Soares, Kevin; Valero, Vicente; Wolfgang, Christopher L; Vollmer, Charles M

    2016-08-01

    To evaluate surgical performance in pancreatoduodenectomy using clinically relevant postoperative pancreatic fistula (CR-POPF) occurrence as a quality indicator. Accurate assessment of surgeon and institutional performance requires (1) standardized definitions for the outcome of interest and (2) a comprehensive risk-adjustment process to control for differences in patient risk. This multinational, retrospective study of 4301 pancreatoduodenectomies involved 55 surgeons at 15 institutions. Risk for CR-POPF was assessed using the previously validated Fistula Risk Score, and pancreatic fistulas were stratified by International Study Group criteria. CR-POPF variability was evaluated and hierarchical regression analysis assessed individual surgeon and institutional performance. There was considerable variability in both CR-POPF risk and occurrence. Factors increasing the risk for CR-POPF development included increasing Fistula Risk Score (odds ratio 1.49 per point, P ratio 3.30, P performance outliers were identified at the surgeon and institutional levels. Of the top 10 surgeons (≥15 cases) for nonrisk-adjusted performance, only 6 remained in this high-performing category following risk adjustment. This analysis of pancreatic fistulas following pancreatoduodenectomy demonstrates considerable variability in both the risk and occurrence of CR-POPF among surgeons and institutions. Disparities in patient risk between providers reinforce the need for comprehensive, risk-adjusted modeling when assessing performance based on procedure-specific complications. Furthermore, beyond inherent patient risk factors, surgical decision-making influences fistula outcomes.

  14. Risk-adjusted scoring systems in colorectal surgery.

    Science.gov (United States)

    Leung, Edmund; McArdle, Kirsten; Wong, Ling S

    2011-01-01

    Consequent to recent advances in surgical techniques and management, survival rate has increased substantially over the last 25 years, particularly in colorectal cancer patients. However, post-operative morbidity and mortality from colorectal cancer vary widely across the country. Therefore, standardised outcome measures are emphasised not only for professional accountability, but also for comparison between treatment units and regions. In a heterogeneous population, the use of crude mortality as an outcome measure for patients undergoing surgery is simply misleading. Meaningful comparisons, however, require accurate risk stratification of patients being analysed before conclusions can be reached regarding the outcomes recorded. Sub-specialised colorectal surgical units usually dedicated to more complex and high-risk operations. The need for accurate risk prediction is necessary in these units as both mortality and morbidity often are tools to justify the practice of high-risk surgery. The Acute Physiology And Chronic Health Evaluation (APACHE) is a system for classifying patients in the intensive care unit. However, APACHE score was considered too complex for general surgical use. The American Society of Anaesthesiologists (ASA) grade has been considered useful as an adjunct to informed consent and for monitoring surgical performance through time. ASA grade is simple but too subjective. The Physiological & Operative Severity Score for the enUmeration of Mortality and morbidity (POSSUM) and its variant Portsmouth POSSUM (P-POSSUM) were devised to predict outcomes in surgical patients in general, taking into account of the variables in the case-mix. POSSUM has two parts, which include assessment of physiological parameters and operative scores. There are 12 physiological parameters and 6 operative measures. The physiological parameters are taken at the time of surgery. 
Each physiological parameter or operative variable is sub-divided into three or four levels with
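The P-POSSUM variant mentioned above combines the physiology and operative scores in a logistic equation; a minimal sketch follows, using the coefficients commonly cited in the literature (these should be verified against the original publication before any real use):

```python
import math

def p_possum_mortality(physiology_score: int, operative_score: int) -> float:
    """Predicted mortality risk from the commonly cited P-POSSUM logistic
    equation:
        ln(R / (1 - R)) = -9.065 + 0.1692 * PS + 0.1550 * OS
    where PS is the physiology score (12 parameters) and OS the operative
    score (6 measures). Coefficients are as widely quoted in the surgical
    audit literature; verify before clinical use."""
    logit = -9.065 + 0.1692 * physiology_score + 0.1550 * operative_score
    return 1.0 / (1.0 + math.exp(-logit))
```

For a patient with the minimum possible scores (PS = 12, OS = 6) the predicted mortality is well under 1%, rising steeply as the scores increase.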

  15. Methods of Financial Risk Management

    Directory of Open Access Journals (Sweden)

    Korzh Natalia

    2016-10-01

    Full Text Available The essence and nature of financial risks are investigated and their classification is presented. The features of financial risk management and the main methods of management are considered, and ways of risk compensation are identified. It is argued that the objective external basis of risk lies in market imperfections such as enterprise externalities and incomplete information about the operation of the business environment, while the internal objective basis of risk is the objective function of profit maximisation in a competitive environment. To compensate for market imperfections, business entities should develop a strategy that combines filling in missing information with neutralising or minimising externalities, implemented tactically in financial risk management programmes.

  16. Comparison of different methods for liquid level adjustment in tank prover calibration

    International Nuclear Information System (INIS)

    Garcia, D A; Farias, E C; Gabriel, P C; Aquino, M H; Gomes, R S E; Aibe, V Y

    2015-01-01

    The adjustment of the liquid level during the calibration of fixed-volume tank provers is normally done by overfill, but it can be done in different ways. In this article four level-adjustment techniques are compared: plate, pipette, ruler and overfill adjustment. The plate and pipette methods showed good agreement with the tank's nominal volume and the lowest uncertainty among the tested methods.

  17. Use of surveillance data for prevention of healthcare-associated infection: risk adjustment and reporting dilemmas.

    LENUS (Irish Health Repository)

    O'Neill, Eoghan

    2009-08-01

    Healthcare-associated or nosocomial infection (HCAI) is of increasing importance to healthcare providers and the public. Surveillance is crucial but must be adjusted for risk, especially when used for interhospital comparisons or for public reporting.

  18. A simple signaling rule for variable life-adjusted display derived from an equivalent risk-adjusted CUSUM chart.

    Science.gov (United States)

    Wittenberg, Philipp; Gan, Fah Fatt; Knoth, Sven

    2018-04-17

    The variable life-adjusted display (VLAD) is the first risk-adjusted graphical procedure proposed in the literature for monitoring the performance of a surgeon. It displays the cumulative sum of expected minus observed deaths. It has since become highly popular because the plotted statistic is easy to understand. But it is also easy to misinterpret a surgeon's performance from the VLAD, potentially leading to grave consequences. The problem of misinterpretation is essentially caused by the variance of the VLAD's statistic, which increases with sample size. For the VLAD to be truly useful, a simple signaling rule is needed. Various forms of signaling rules have been developed, but they are usually quite complicated; without signaling rules, making inferences using the VLAD alone is difficult if not misleading. In this paper, we establish an equivalence between a VLAD with V-mask and a risk-adjusted cumulative sum (RA-CUSUM) chart based on the difference between the estimated probability of death and the surgical outcome. Average run length analysis based on simulation shows that this particular RA-CUSUM chart performs similarly to the established RA-CUSUM chart based on the log-likelihood ratio statistic obtained by testing the odds ratio of death. We provide a simple design procedure for determining the V-mask parameters based on a resampling approach; resampling from a real data set ensures that these parameters can be estimated appropriately. Finally, we illustrate the monitoring of a real surgeon's performance using a VLAD with V-mask. Copyright © 2018 John Wiley & Sons, Ltd.
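The VLAD statistic itself (cumulative expected minus observed deaths) is straightforward to compute; a minimal sketch in plain Python, taking model-predicted death probabilities and 0/1 outcomes:

```python
def vlad(predicted_risks, outcomes):
    """Variable life-adjusted display: running cumulative sum of expected
    minus observed deaths. Positive values mean fewer deaths than the risk
    model predicts ("lives saved"). The variance of the statistic grows
    with case count, which is why a signalling rule (e.g. a V-mask or an
    equivalent RA-CUSUM) is needed before drawing conclusions."""
    total, path = 0.0, []
    for p, died in zip(predicted_risks, outcomes):
        total += p - died   # expected (p) minus observed (1 if death else 0)
        path.append(total)
    return path
```

A rising path indicates better-than-expected survival; as the abstract stresses, the path alone should not be over-interpreted without a signalling rule.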

  19. Monitoring risk-adjusted medical outcomes allowing for changes over time.

    Science.gov (United States)

    Steiner, Stefan H; Mackay, R Jock

    2014-10-01

    We consider the problem of monitoring and comparing medical outcomes, such as surgical performance, over time. Performance is subject to change due to a variety of reasons including patient heterogeneity, learning, deteriorating skills due to aging, etc. For instance, we expect inexperienced surgeons to improve their skills with practice. We propose a graphical method to monitor surgical performance that incorporates risk adjustment to account for patient heterogeneity. The procedure gives more weight to recent outcomes and down-weights the influence of outcomes further in the past. The chart is clinically interpretable as it plots an estimate of the failure rate for a "standard" patient. The chart also includes a measure of uncertainty in this estimate. We can implement the method using historical data or start from scratch. As the monitoring proceeds, we can base the estimated failure rate on a known risk model or use the observed outcomes to update the risk model as time passes. We illustrate the proposed method with an example from cardiac surgery. © The Author 2013. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
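One concrete way to give more weight to recent outcomes, as this record describes, is an exponentially weighted estimate; the ratio-type risk standardisation below is an illustrative simplification, not the authors' exact updating scheme:

```python
def ewma_failure_rate(predicted_risks, outcomes, p0, lam=0.1):
    """EWMA-style estimate of the failure rate for a 'standard' patient
    with baseline risk p0. Each 0/1 outcome is first rescaled by p0/p_i,
    so a death in a high-risk patient counts for less than a death in a
    low-risk patient; older outcomes are then geometrically down-weighted
    by the smoothing constant lam. Illustrative simplification only, not
    the Steiner-Mackay estimator itself."""
    estimate = p0
    history = []
    for p, died in zip(predicted_risks, outcomes):
        adjusted = died * p0 / p            # risk-standardised outcome
        estimate = lam * adjusted + (1 - lam) * estimate
        history.append(estimate)
    return history
```

With lam = 0.1, an outcome roughly 30 cases back contributes only a few percent of its original weight, matching the "down-weights the influence of outcomes further in the past" idea.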

  20. Behavioral Risk Factors: Selected Metropolitan Area Risk Trends (SMART) MMSA Age-adjusted Prevalence Data (2011 to Present)

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2011 to present. BRFSS SMART MMSA age-adjusted prevalence combined land line and cell phone data. The Selected Metropolitan Area Risk Trends (SMART) project uses the...

  1. [Risk adjusted assessment of quality of perinatal centers - results of perinatal/neonatal quality surveillance in Saxonia].

    Science.gov (United States)

    Koch, R; Gmyrek, D; Vogtmann, Ch

    2005-12-01

    The weak point of the country-wide perinatal/neonatal quality surveillance as a tool for evaluating the achievements of an individual clinic is that it ignores interhospital differences in the case-mix of patients; that approach therefore cannot result in reliable benchmarking. The aim was to adjust the results of quality assessment of different hospitals according to the risk profile of their patients by multivariate analysis. The perinatal/neonatal database of 12,783 newborns from the Saxonian quality surveillance from 1998 to 2000 was analysed. 4 relevant quality indicators of newborn outcome -- a) severe intraventricular hemorrhage in preterm infants 2500 g and d) hypoxic-ischemic encephalopathy -- were targeted to find specific risk predictors, considering 26 risk factors. A logistic regression model was used to develop the risk predictors. Risk predictors for the 4 quality indicators could be described by 3 - 9 of the 26 analysed risk factors. The AUC (ROC) values for these quality indicators were 82, 89, 89 and 89 %, which signifies their reliability. Using the new specific predictors to calculate risk-adjusted incidence rates of the quality indicators yielded some remarkable changes: the apparent differences in the outcome criteria of the analysed hospitals were found to be much less pronounced. The application of the proposed method for risk adjustment of quality indicators makes it possible to perform a more objective comparison of neonatal outcome criteria between different hospitals or regions.
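The adjustment step described in this record, turning logistic-regression risk predictors into risk-adjusted incidence rates, can be sketched via indirect standardisation; the intercept and coefficients below are placeholders for illustration, not the Saxonian predictors:

```python
import math

def risk_adjusted_rate(intercept, coefs, patients, outcomes, overall_rate):
    """Indirectly standardised incidence rate for one hospital: observed
    events divided by the events expected under a logistic risk model
    fitted on the pooled data, scaled by the overall (pooled) rate. The
    model parameters here are hypothetical placeholders."""
    expected = 0.0
    for x in patients:
        logit = intercept + sum(c * xi for c, xi in zip(coefs, x))
        expected += 1.0 / (1.0 + math.exp(-logit))   # predicted probability
    return overall_rate * sum(outcomes) / expected
```

A hospital with a high-risk case-mix accumulates a large expected count, so the same observed count yields a lower adjusted rate than it would for a low-risk hospital.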

  2. Sperm competition risk drives rapid ejaculate adjustments mediated by seminal fluid.

    Science.gov (United States)

    Bartlett, Michael J; Steeves, Tammy E; Gemmell, Neil J; Rosengrave, Patrice C

    2017-10-31

    In many species, males can make rapid adjustments to ejaculate performance in response to sperm competition risk; however, the mechanisms behind these changes are not understood. Here, we manipulate male social status in an externally fertilising fish, chinook salmon ( Oncorhynchus tshawytscha ), and find that in less than 48 hr, males can upregulate sperm velocity when faced with an increased risk of sperm competition. Using a series of in vitro sperm manipulation and competition experiments, we show that rapid changes in sperm velocity are mediated by seminal fluid and the effect of seminal fluid on sperm velocity directly impacts paternity share and therefore reproductive success. These combined findings, completely consistent with sperm competition theory, provide unequivocal evidence that sperm competition risk drives plastic adjustment of ejaculate quality, that seminal fluid harbours the mechanism for the rapid adjustment of sperm velocity and that fitness benefits accrue to males from such adjustment.

  3. Process monitoring in intensive care with the use of cumulative expected minus observed mortality and risk-adjusted P charts.

    Science.gov (United States)

    Cockings, Jerome G L; Cook, David A; Iqbal, Rehana K

    2006-02-01

    A health care system is a complex adaptive system. The effect of a single intervention, incorporated into a complex clinical environment, may be different from that expected. A national database such as the Intensive Care National Audit & Research Centre (ICNARC) Case Mix Programme in the UK represents a centralised monitoring, surveillance and reporting system for retrospective quality and comparative audit. This can be supplemented with real-time process monitoring at a local level for continuous process improvement, allowing early detection of the impact of both unplanned and deliberately imposed changes in the clinical environment. Demographic and UK Acute Physiology and Chronic Health Evaluation II (APACHE II) data were prospectively collected on all patients admitted to a UK regional hospital between 1 January 2003 and 30 June 2004 in accordance with the ICNARC Case Mix Programme. We present a cumulative expected minus observed (E-O) plot and the risk-adjusted p chart as methods of continuous process monitoring. We describe the construction and interpretation of these charts and show how they can be used to detect planned or unplanned organisational process changes affecting mortality outcomes. Five hundred and eighty-nine adult patients were included. The overall death rate was 0.78 of predicted. Calibration showed excess survival in ranges above 30% risk of death. The E-O plot confirmed a survival above that predicted. Small transient variations were seen in the slope that could represent random effects, or real but transient changes in the quality of care. The risk-adjusted p chart showed several observations below the 2 SD control limits of the expected mortality rate. These plots provide rapid analysis of risk-adjusted performance suitable for local application and interpretation. The E-O chart provided rapid easily visible feedback of changes in risk-adjusted mortality, while the risk-adjusted p chart allowed statistical evaluation. Local analysis of
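The risk-adjusted p chart in this record sets control limits from the patients' predicted (APACHE II-style) mortality risks; a minimal sketch under a normal approximation with 2 SD limits (the exact construction used in the study may differ):

```python
import math

def risk_adjusted_p_limits(predicted_risks, z=2.0):
    """Centre line and z-SD control limits for one subgroup of patients,
    using each patient's model-predicted death probability. The centre is
    the mean predicted risk; the standard error comes from the sum of the
    Bernoulli variances p_i * (1 - p_i). Limits are truncated to [0, 1].
    An observed death proportion outside the limits signals mortality
    different from that expected for the subgroup's case-mix."""
    n = len(predicted_risks)
    centre = sum(predicted_risks) / n
    se = math.sqrt(sum(p * (1.0 - p) for p in predicted_risks)) / n
    return max(centre - z * se, 0.0), centre, min(centre + z * se, 1.0)
```

Observations below the lower limit, as in the study's data, correspond to better-than-expected survival.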

  4. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    Full Text Available In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and that stochastic methods should therefore be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact of such uncertainties on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
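The idea of propagating input uncertainty through FMECA via Monte Carlo sampling can be sketched as follows; treating the severity, occurrence and detection scores as triangular distributions is an illustrative assumption, not something prescribed by the paper:

```python
import random

def tri(rng, low, mode, high):
    # random.triangular takes (low, high, mode) -- note the argument order
    return rng.triangular(low, high, mode)

def rpn_distribution(sev, occ, det, n=10000, seed=1):
    """Monte Carlo FMECA sketch: instead of single expert scores, each of
    severity, occurrence and detection is given as a (low, mode, high)
    triple and sampled from a triangular distribution, so the risk
    priority number RPN = S * O * D becomes a distribution rather than a
    point estimate. Returns the median and the 95th percentile."""
    rng = random.Random(seed)
    samples = sorted(tri(rng, *sev) * tri(rng, *occ) * tri(rng, *det)
                     for _ in range(n))
    return samples[n // 2], samples[int(0.95 * n)]

median_rpn, p95_rpn = rpn_distribution((6, 7, 8), (2, 3, 5), (3, 4, 6))
```

Reporting a percentile alongside the median makes the evaluator's uncertainty visible instead of hiding it in a single RPN number.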

  5. Early Parental Positive Behavior Support and Childhood Adjustment: Addressing Enduring Questions with New Methods.

    Science.gov (United States)

    Waller, Rebecca; Gardner, Frances; Dishion, Thomas; Sitnick, Stephanie L; Shaw, Daniel S; Winter, Charlotte E; Wilson, Melvin

    2015-05-01

    A large literature provides strong empirical support for the influence of parenting on child outcomes. The current study addresses enduring research questions testing the importance of early parenting behavior to children's adjustment. Specifically, we developed and tested a novel multi-method observational measure of parental positive behavior support at age 2. Next, we tested whether early parental positive behavior support was related to child adjustment at school age, within a multi-agent and multi-method measurement approach and design. Observational and parent-reported data from mother-child dyads (N = 731; 49 percent female) were collected from a high-risk sample at age 2. Follow-up data were collected via teacher report and child assessment at age 7.5. The results supported combining three different observational methods to assess positive behavior support at age 2 within a latent factor. Further, parents' observed positive behavior support at age 2 predicted multiple types of teacher-reported and child-assessed problem behavior and competencies at 7.5 years old. Results supported the validity and predictive capability of a multi-method observational measure of parenting and the importance of a continued focus on the early years within preventive interventions.

  6. Willingness to pay for a quality-adjusted life year: an evaluation of attitudes towards risk and preferences

    OpenAIRE

    Martín-Fernández, Jesus; Polentinos-Castro, Elena; del Cura-González, Ma Isabel; Ariza-Cardiel, Gloria; Abraira, Victor; Gil-LaCruz, Ana Isabel; García-Pérez, Sonia

    2014-01-01

    Background This paper examines the Willingness to Pay (WTP) for a quality-adjusted life year (QALY) expressed by people who attended the healthcare system as well as the association of attitude towards risk and other personal characteristics with their response. Methods Health-state preferences, measured by EuroQol (EQ-5D-3L), were combined with WTP for recovering a perfect health state. WTP was assessed using close-ended, iterative bidding, contingent valuation method. Data on demographic an...

  7. Calculation of Credit Valuation Adjustment Based on Least Square Monte Carlo Methods

    Directory of Open Access Journals (Sweden)

    Qian Liu

    2015-01-01

    Full Text Available Counterparty credit risk has become one of the highest-profile risks facing participants in the financial markets. Despite this, relatively little is known about how counterparty credit risk is actually priced mathematically. We examine this issue using interest rate swaps. This widely traded financial product allows us to identify well the risk profiles of both institutions and their counterparties. Concretely, the Hull-White model for the rate and a mean-reverting model for the default intensity have proven to correspond well with reality and to be well suited for financial institutions. Besides, we find that the least squares Monte Carlo method is quite efficient in the calculation of the credit valuation adjustment (CVA, for short), as it avoids the redundant step of generating inner scenarios and as a result accelerates the convergence speed of the CVA estimators. In the second part, we propose a new method to calculate bilateral CVA that avoids the double counting present in the existing literature; several copula functions are adopted to describe the dependence of the two first-to-default times.
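The core of the least squares Monte Carlo idea, replacing nested "inner" simulations with a cross-sectional regression, can be illustrated with a deliberately simplified toy model. Everything below (the Gaussian exposure driver, the linear regression basis, the flat hazard rate, zero discounting) is an assumption for illustration, not the calibrated Hull-White / mean-reverting intensity set-up of the paper:

```python
import math
import random

def lsmc_cva(n_paths=5000, n_steps=10, horizon=5.0, lgd=0.6, hazard=0.02, seed=7):
    """Toy least squares Monte Carlo CVA. A single Gaussian state variable
    x drives a hypothetical contract value V(t) = x. At each date the
    positive exposure expected one step ahead is obtained by regressing
    simulated future values on the current state (simple linear
    regression), which avoids generating nested inner scenarios. Default
    times follow a flat hazard rate; discounting is omitted for brevity."""
    rng = random.Random(seed)
    dt = horizon / n_steps
    x = [0.0] * n_paths
    cva = 0.0
    for step in range(1, n_steps + 1):
        x_next = [xi + rng.gauss(0.0, math.sqrt(dt)) for xi in x]
        y = [max(v, 0.0) for v in x_next]            # positive exposure next date
        mx = sum(x) / n_paths
        my = sum(y) / n_paths
        var = sum((a - mx) ** 2 for a in x) or 1.0   # guard the first step
        beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / var
        # per-path conditional expected positive exposure, then its average
        ee = sum(max(my + beta * (a - mx), 0.0) for a in x) / n_paths
        # probability of default in this interval under the flat hazard
        pd = math.exp(-hazard * (step - 1) * dt) - math.exp(-hazard * step * dt)
        cva += lgd * pd * ee
        x = x_next
    return cva
```

The regression is the step that "avoids the redundant step to generate inner scenarios": the conditional expected exposure at each date is read off a fitted function of the state rather than re-simulated per path.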

  8. Optimum adjusting method of fuel in a reactor

    International Nuclear Information System (INIS)

    Otsuji, Niro; Shirakawa, Toshihisa; Toyoshi, Isamu; Tatemichi, Shin-ichiro; Mukai, Hideyuki.

    1976-01-01

    Object: To effectively select an intermediate control rod pattern and thereby shorten the time required to adjust the fuel. Structure: The control rods are divided into several regions within the core, in a concentric, circular or similar fashion. A control rod position that satisfies the thermal limit value at the maximum power level is preset as a target position by a three-dimensional nuclear thermal-hydraulic computation code or the like. Next, an intermediate pattern of the control rods in each region is determined on the basis of the target position and, while the operational condition is monitored, a portion of the fuel rods is maintained for a given period of time at a level above the power density of the target position, with the power increased within a range that does not produce pellet-cladding interaction (PCI), so that the fuel is conditioned to that power density. Thereafter, the power is rapidly decreased. A similar operation may be applied to the other fuel rods, after which the control rods may be set to the target position to obtain the maximum power level. (Ikeda, J.)

  9. Risk adjustment and the fear of markets: the case of Belgium.

    Science.gov (United States)

    Schokkaert, E; Van de Voorde, C

    2000-02-01

    In Belgium the management and administration of the compulsory and universal health insurance is left to a limited number of non-governmental non-profit sickness funds. Since 1995 these sickness funds are partially financed in a prospective way. The risk adjustment scheme is based on a regression model to explain medical expenditures for different social groups. Medical supply is taken out of the formula to construct risk-adjusted capitation payments. The risk-adjustment formula still leaves scope for risk selection. At the same time, the sickness funds were not given the instruments to exert a real influence on expenditures and the health insurance market has not been opened for new entrants. As a consequence, Belgium runs the danger of ending up in a situation with little incentives for efficiency and considerable profits from cream skimming.

  10. Adjusting the general growth balance method for migration

    OpenAIRE

    Hill, Kenneth; Queiroz, Bernardo

    2010-01-01

    Death distribution methods, proposed for estimating death registration coverage by comparison with census age distributions, assume no net migration. This assumption makes it problematic to apply these methods to national and sub-national populations affected by substantial net migration. In this paper, we propose and explore a two-step process in which the Growth Balance Equation is first used to estimate net migration rates, using a model of age-specific migration, and then it is used to compare the obs...

  11. A risk-adjusted O-E CUSUM with monitoring bands for monitoring medical outcomes.

    Science.gov (United States)

    Sun, Rena Jie; Kalbfleisch, John D

    2013-03-01

    In order to monitor a medical center's survival outcomes using simple plots, we introduce a risk-adjusted Observed-Expected (O-E) Cumulative SUM (CUSUM) along with monitoring bands as a decision criterion. The proposed monitoring bands can be used in place of a more traditional but complicated V-shaped mask or the simultaneous use of two one-sided CUSUMs. The resulting plot is designed to simultaneously monitor for failure time outcomes that are "worse than expected" or "better than expected." The slopes of the O-E CUSUM provide direct estimates of the relative risk (as compared to a standard or expected failure rate) for the data being monitored. Appropriate rejection regions are obtained by controlling the false alarm rate (type I error) over a period of given length. Simulation studies are conducted to illustrate the performance of the proposed method. A case study is carried out for 58 liver transplant centers. The use of CUSUM methods for quality improvement is stressed. Copyright © 2013, The International Biometric Society.

  12. Conference Innovations in Derivatives Market : Fixed Income Modeling, Valuation Adjustments, Risk Management, and Regulation

    CERN Document Server

    Grbac, Zorana; Scherer, Matthias; Zagst, Rudi

    2016-01-01

    This book presents 20 peer-reviewed chapters on current aspects of derivatives markets and derivative pricing. The contributions, written by leading researchers in the field as well as experienced authors from the financial industry, present the state of the art in: • Modeling counterparty credit risk: credit valuation adjustment, debit valuation adjustment, funding valuation adjustment, and wrong way risk. • Pricing and hedging in fixed-income markets and multi-curve interest-rate modeling. • Recent developments concerning contingent convertible bonds, the measuring of basis spreads, and the modeling of implied correlations. The recent financial crisis has cast tremendous doubts on the classical view on derivative pricing. Now, counterparty credit risk and liquidity issues are integral aspects of a prudent valuation procedure and the reference interest rates are represented by a multitude of curves according to their different periods and maturities. A panel discussion included in the book (featuring D...

  13. Beyond preadoptive risk: The impact of adoptive family environment on adopted youth's psychosocial adjustment.

    Science.gov (United States)

    Ji, Juye; Brooks, Devon; Barth, Richard P; Kim, Hansung

    2010-07-01

    Adopted children often are exposed to preadoptive stressors--such as prenatal substance exposure, child maltreatment, and out-of-home placements--that increase their risks for psychosocial maladjustment. Psychosocial adjustment of adopted children emerges as the product of pre- and postadoptive factors. This study builds on previous research, which fails to simultaneously assess the influences of pre- and postadoptive factors, by examining the impact of adoptive family sense of coherence on adoptee's psychosocial adjustment beyond the effects of preadoptive risks. Using a sample of adoptive families (n = 385) taking part in the California Long Range Adoption Study, structural equation modeling analyses were performed. Results indicate a significant impact of family sense of coherence on adoptees' psychosocial adjustment and a considerably less significant role of preadoptive risks. The findings suggest the importance of assessing adoptive family's ability to respond to stress and of helping families to build and maintain their capacity to cope with stress despite the sometimes fractious pressures of adoption.

  14. Health-based risk adjustment: is inpatient and outpatient diagnostic information sufficient?

    Science.gov (United States)

    Lamers, L M

    Adequate risk adjustment is critical to the success of market-oriented health care reforms in many countries. Currently used risk adjusters based on demographic and diagnostic cost groups (DCGs) do not reflect expected costs accurately. This study examines the simultaneous predictive accuracy of inpatient and outpatient morbidity measures and prior costs. DCGs, pharmacy cost groups (PCGs), and prior year's costs improve the predictive accuracy of the demographic model substantially. DCGs and PCGs seem complementary in their ability to predict future costs. However, this study shows that the combination of DCGs and PCGs still leaves room for cream skimming.

  15. Risk-adjusted survival after tissue versus mechanical aortic valve replacement: a 23-year assessment.

    Science.gov (United States)

    Gaca, Jeffrey G; Clare, Robert M; Rankin, J Scott; Daneshmand, Mani A; Milano, Carmelo A; Hughes, G Chad; Wolfe, Walter G; Glower, Donald D; Smith, Peter K

    2013-11-01

    Detailed analyses of risk-adjusted outcomes after mitral valve surgery have documented significant survival decrements with tissue valves at any age. Several recent studies of prosthetic aortic valve replacement (AVR) have also suggested a poorer performance of tissue valves, although analyses have been limited to small matched series. The study aim was to test the hypothesis that AVR with tissue valves is associated with a lower risk-adjusted survival, as compared to mechanical valves. Between 1986 and 2009, primary isolated AVR, with or without coronary artery bypass grafting (CABG), was performed with currently available valve types in 2148 patients (1108 tissue valves, 1040 mechanical). Patients were selected for tissue valves primarily in the elderly. Baseline and operative characteristics were documented prospectively with a consistent variable set over the entire 23-year period. Follow up was obtained with mailed questionnaires, supplemented by National Death Index searches. The average time to death or follow up was seven years, and follow up for survival was 96.2% complete. Risk-adjusted survival characteristics for the two groups were evaluated using a Cox proportional hazards model with stepwise selection of candidate variables. Differences in baseline characteristics between groups were (tissue versus mechanical): median age 73 versus 61 years; non-elective surgery 32% versus 28%; CABG 45% versus 35%; median ejection fraction 55% versus 55%; renal failure 6% versus 1%; diabetes 18% versus 7% (all p significant); however, after risk adjustment for the adverse profiles of tissue valve patients, no significant difference was observed in survival after tissue or mechanical AVR. Thus, the hypothesis did not hold, and risk-adjusted survival was equivalent, qualified of course by the fact that selection bias was evident. With selection criteria that employed tissue AVR more frequently in elderly patients, tissue and mechanical valves achieved similar survival.

  16. Does risk-adjusted payment influence primary care providers’ decision on where to set up practices?

    DEFF Research Database (Denmark)

    Anell, Anders; Dackehag, Margareta; Dietrichson, Jens

    2018-01-01

    Background: Providing equal access to healthcare is an important objective in most health care systems. It is especially pertinent in systems like the Swedish primary care market, where private providers are free to establish themselves in any part of the country. To improve equity in access to care, 15 out of 21 county councils in Sweden have implemented risk-adjusted capitation based on the Care Need Index, which increases capitation to primary care centers with a large share of patients with unfavorable socioeconomic and demographic characteristics. Our aim is to estimate the effects of using ... Index values. Results: Risk-adjusted capitation significantly increases the number of private primary care centers in areas with relatively high Care Need Index values. The adjustment results in a changed distribution of private centers within county councils; the total number of private centers does...

  17. Evidence that Risk Adjustment is Unnecessary in Estimates of the User Cost of Money

    Directory of Open Access Journals (Sweden)

    Diego A. Restrepo-Tobón

    2015-12-01

    Full Text Available Investors value the special attributes of monetary assets (e.g., exchangeability, liquidity, and safety) and pay a premium for holding them in the form of a lower rate of return. The user cost of holding monetary assets can be measured approximately by the difference between the returns on illiquid risky assets and those on safer liquid assets. A more appropriate measure should adjust this difference by the differential risk of the assets in question. We investigate the impact that time non-separable preferences have on the estimation of the risk-adjusted user cost of money. Using U.K. data from 1965Q1 to 2011Q1, we estimate a habit-based asset pricing model with money in the utility function and find that the risk adjustment for risky monetary assets is negligible. Thus, researchers can dispense with risk-adjusting the user cost of money in constructing monetary aggregate indexes.
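The unadjusted user cost mentioned in this record has a standard closed form (Barnett's formula), which the paper's result suggests is adequate even for risky monetary assets:

```python
def user_cost(benchmark_rate, own_rate):
    """Barnett's (unadjusted) user cost of a monetary asset: the
    discounted interest forgone by holding the liquid asset instead of
    the benchmark illiquid asset,
        u = (R - r) / (1 + R),
    where R is the benchmark return and r the asset's own return. The
    paper finds the additional risk-adjustment term for risky monetary
    assets to be negligible, so this form typically suffices."""
    return (benchmark_rate - own_rate) / (1.0 + benchmark_rate)
```

For example, with a 5% benchmark return and a 1% own rate, the user cost is 0.04 / 1.05, roughly 3.8% per period.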

  18. A comparison of internal versus external risk-adjustment for monitoring clinical outcomes

    NARCIS (Netherlands)

    Koetsier, Antonie; de Keizer, Nicolette; Peek, Niels

    2011-01-01

    Internal and external prognostic models can be used to calculate severity of illness adjusted mortality risks. However, it is unclear what the consequences are of using an external model instead of an internal model when monitoring an institution's clinical performance. Theoretically, using an

  19. Assessing At-Risk Youth Using the Reynolds Adolescent Adjustment Screening Inventory with a Latino Population

    Science.gov (United States)

    Balkin, Richard S.; Cavazos, Javier, Jr.; Hernandez, Arthur E.; Garcia, Roberto; Dominguez, Denise L.; Valarezo, Alexandra

    2013-01-01

    Factor analyses were conducted on scores from the Reynolds Adolescent Adjustment Screening Inventory (RAASI; Reynolds, 2001) representing at-risk Latino youth. The 4-factor model of the RAASI did not exhibit a good fit. However, evidence of generalizability for Latino youth was noted. (Contains 3 tables.)

  20. Measuring Profitability Impacts of Information Technology: Use of Risk Adjusted Measures.

    Science.gov (United States)

    Singh, Anil; Harmon, Glynn

    2003-01-01

    Focuses on understanding how investments in information technology are reflected in the income statements and balance sheets of firms. Shows that the relationship between information technology investments and corporate profitability is much better explained by using risk-adjusted measures of corporate profitability than using the same measures…

  1. School Adjustment of Pupils with ADHD: Cognitive, Emotional and Temperament Risk Factors

    Science.gov (United States)

    Sanchez-Perez, Noelia; Gonzalez-Salinas, Carmen

    2013-01-01

    From different research perspectives, the cognitive and emotional characteristics associated with ADHD in children have been identified as risk factors for the development of diverse adjustment problems in the school context. Research in nonclinical population can additionally help in understanding ADHD deficits, since children with specific…

  2. Prior use of durable medical equipment as a risk adjuster for health-based capitation

    NARCIS (Netherlands)

    R.C. van Kleef (Richard); R.C.J.A. van Vliet (René)

    2010-01-01

    textabstractThis paper examines a new risk adjuster for capitation payments to Dutch health plans, based on the prior use of durable medical equipment (DME). The essence is to classify users of DME in a previous year into clinically homogeneous classes and to apply the resulting classification as a

  3. 48 CFR 215.404-71-3 - Contract type risk and working capital adjustment.

    Science.gov (United States)

    2010-10-01

    .... Cost-plus-incentive-fee (4) 1.0 0 to 2. Cost-plus-fixed-fee (4) 0.5 0 to 1. Time-and-materials... considered cost-plus-fixed-fee contracts for the purposes of assigning profit values. They shall not receive... CONTRACTING BY NEGOTIATION Contract Pricing 215.404-71-3 Contract type risk and working capital adjustment. (a...

  4. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment

    Science.gov (United States)

    2010-01-01

    ...) The bank must have a risk control unit that reports directly to senior management and is independent... management systems at least annually. (c) Market risk factors. The bank's internal model must use risk.... Section 4. Internal Models (a) General. For risk-based capital purposes, a bank subject to this appendix...

  5. Anesthesiologist- and System-Related Risk Factors for Risk-Adjusted Pediatric Anesthesia-Related Cardiac Arrest.

    Science.gov (United States)

    Zgleszewski, Steven E; Graham, Dionne A; Hickey, Paul R; Brustowicz, Robert M; Odegard, Kirsten C; Koka, Rahul; Seefelder, Christian; Navedo, Andres T; Randolph, Adrienne G

    2016-02-01

    Pediatric anesthesia-related cardiac arrest (ARCA) is an uncommon but potentially preventable adverse event. Infants and children with more severe underlying disease are at highest risk. We aimed to identify system- and anesthesiologist-related risk factors for ARCA. We analyzed a prospectively collected patient cohort data set of anesthetics administered from 2000 to 2011 to children at a large tertiary pediatric hospital. Pre-procedure systemic disease level was characterized by ASA physical status (ASA-PS). Two reviewers independently reviewed cardiac arrests and categorized their anesthesia relatedness. Factors associated with ARCA in the univariate analyses were identified for reevaluation after adjustment for patient age and ASA-PS. Cardiac arrest occurred in 142 of 276,209 anesthetics (incidence 5.1/10,000 anesthetics); 72 (2.6/10,000 anesthetics) were classified as anesthesia-related. In the univariate analyses, risk of ARCA was much higher in cardiac patients and for anesthesiologists with lower annual caseload and/or fewer annual days delivering anesthetics (all p significant). After risk adjustment for ASA-PS ≥ III and age ≤ 6 months, however, the association with lower annual days delivering anesthetics remained (P = 0.03), but the other factors were no longer significant. Case-mix explained most associations between higher risk of pediatric ARCA and anesthesiologist-related variables at our institution, but the association with fewer annual days delivering anesthetics remained. Our findings highlight the need for rigorous adjustment for patient risk factors in anesthesia patient safety studies.

  6. Dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses.

    Science.gov (United States)

    Zhang, Xiang; Loda, Justin B; Woodall, William H

    2017-07-20

For a patient who has survived a surgery, there could be several levels of recovery. Thus, it is reasonable to consider more than two outcomes when monitoring surgical outcome quality. The risk-adjusted cumulative sum (CUSUM) chart based on multiresponses has been developed for monitoring a surgical process with three or more outcomes. However, there is a significant effect of varying risk distributions on the in-control performance of the chart when constant control limits are applied. To overcome this disadvantage, we apply the dynamic probability control limits to the risk-adjusted CUSUM charts for multiresponses. The simulation results demonstrate that the in-control performance of the charts with dynamic probability control limits can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the use of dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses allows each chart to be designed for the corresponding patient sequence of a surgeon or a hospital and therefore does not require estimating or monitoring the patients' risk distribution. Copyright © 2017 John Wiley & Sons, Ltd.
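The dynamic-limit idea above lends itself to a small sketch. The following is a minimal illustration under stated assumptions, not the authors' implementation: it simplifies to a binary (Bernoulli) outcome rather than multiresponses, uses Steiner-style likelihood-ratio weights, and the patient risks `risks`, odds-ratio shift `r_a`, and false-alarm rate `alpha` are invented toy values.

```python
import math
import random

def cusum_weight(y, p, r_a=2.0):
    """Risk-adjusted CUSUM log-likelihood-ratio weight for one patient with
    predicted risk p, testing odds ratio 1 against r_a (Steiner-style)."""
    denom = 1.0 - p + r_a * p
    return math.log(r_a / denom) if y else math.log(1.0 / denom)

def dynamic_limits(risks, alpha=0.001, n_sim=2000, r_a=2.0, seed=7):
    """Simulate in-control CUSUM paths for one specific patient sequence and
    return, for each patient, the (1 - alpha) quantile of the statistic.
    These per-patient quantiles act as dynamic probability control limits."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_sim):
        s, path = 0.0, []
        for p in risks:
            y = 1 if rng.random() < p else 0      # in-control outcome
            s = max(0.0, s + cusum_weight(y, p, r_a))
            path.append(s)
        paths.append(path)
    limits = []
    for t in range(len(risks)):
        col = sorted(pth[t] for pth in paths)
        limits.append(col[int((1 - alpha) * (n_sim - 1))])
    return limits

risks = [0.05, 0.20, 0.10, 0.40, 0.15] * 10       # toy patient sequence
lims = dynamic_limits(risks)
print(len(lims), all(l >= 0 for l in lims))
```

Because the limits are built from the chart's own patient sequence, no separate estimate of the risk distribution is needed, which is the point the abstract makes.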

  7. Rational Multi-curve Models with Counterparty-risk Valuation Adjustments

    DEFF Research Database (Denmark)

    Crépey, Stéphane; Macrina, Andrea; Nguyen, Tuyet Mai

    2016-01-01

We develop a multi-curve term structure set-up in which the modelling ingredients are expressed by rational functionals of Markov processes. We calibrate to London Interbank Offer Rate swaptions data and show that a rational two-factor log-normal multi-curve model is sufficient to match market data with accuracy. We elucidate the relationship between the models developed and calibrated under a risk-neutral measure Q and their consistent equivalence class under the real-world probability measure P. The consistent P-pricing models are applied to compute the risk exposures which may be required to comply with regulatory obligations. In order to compute counterparty-risk valuation adjustments, such as credit valuation adjustment, we show how default intensity processes with rational form can be derived. We flesh out our study by applying the results to a basis swap contract.

  8. Method for environmental risk analysis (MIRA) revision 2007

    International Nuclear Information System (INIS)

    2007-04-01

OLF's instruction manual for carrying out environmental risk analyses provides a unified approach and a common framework for environmental risk assessments, based on the best information available. The manual standardizes a series of parameters, input data and partial analyses that are included in the environmental risk analysis. Environmental risk analyses carried out according to the MIRA method will thus be comparable between fields and between companies. This revision emphasizes updating the text in accordance with current practice for environmental risk analyses and prevailing regulations. Moreover, method adjustments for especially protected beach habitats have been introduced, as well as a general method for estimating environmental risk concerning fish. Emphasis has also been put on improving the ability of environmental risk analyses to contribute to better management of environmental risk in the companies.

  9. What is the empirical evidence that hospitals with higher-risk adjusted mortality rates provide poorer quality care? A systematic review of the literature

    Directory of Open Access Journals (Sweden)

    Mohammed Mohammed A

    2007-06-01

Abstract Background Despite increasing interest in and publication of risk-adjusted hospital mortality rates, the relationship with underlying quality of care remains unclear. We undertook a systematic review to ascertain the extent to which variations in risk-adjusted mortality rates were associated with differences in quality of care. Methods We identified studies in which risk-adjusted mortality and quality of care had been reported in more than one hospital. We adopted an iterative search strategy using three databases – Medline, HealthSTAR and CINAHL – from 1966, 1975 and 1982 respectively. We identified potentially relevant studies on the basis of the title or abstract. We obtained these papers and included those which met our inclusion criteria. Results From an initial yield of 6,456 papers, 36 studies met the inclusion criteria. Several of these studies considered more than one process-versus-risk-adjusted mortality relationship. In total we found 51 such relationships in a wide range of clinical conditions, using a variety of methods. A positive correlation between better quality of care and risk-adjusted mortality was found in under half the relationships (26/51; 51%), but the remainder showed no correlation (16/51; 31%) or a paradoxical correlation (9/51; 18%). Conclusion The general notion that hospitals with higher risk-adjusted mortality have poorer quality of care is neither consistent nor reliable.

  10. State infant mortality: an ecologic study to determine modifiable risks and adjusted infant mortality rates.

    Science.gov (United States)

    Paul, David A; Mackley, Amy; Locke, Robert G; Stefano, John L; Kroelinger, Charlan

    2009-05-01

To determine factors contributing to state infant mortality rates (IMR) and develop an adjusted IMR in the United States for 2001 and 2002. Ecologic study of factors contributing to state IMR. State IMRs for 2001 and 2002 were obtained from the United States linked death and birth certificate data from the National Center for Health Statistics. Factors investigated using multivariable linear regression included state racial demographics, ethnicity, state population, median income, education, teen birth rate, proportion of obesity, smoking during pregnancy, diabetes, hypertension, cesarean delivery, prenatal care, health insurance, self-report of mental illness, and number of in-vitro fertilization procedures. Final risk-adjusted IMRs were standardized and states were compared with the United States adjusted rates. Models for IMR in individual states in 2001 (r2 = 0.66, P < 0.01) and 2002 (r2 = 0.81, P < 0.01) were tested. African-American race, teen birth rate, and smoking during pregnancy remained independently associated with state infant mortality rates for 2001 and 2002. Ninety-five percent confidence intervals (CI) were calculated around the regression lines to model the expected IMR. After adjustment, some states maintained a consistent IMR; for instance, Vermont and New Hampshire remained low, while Delaware and Louisiana remained high. However, other states such as Mississippi, which have traditionally high infant mortality rates, remained within the expected 95% CI for IMR after adjustment, indicating confounding affected the initial unadjusted rates. Non-modifiable demographic variables, including the percentage of non-Hispanic African-American and Hispanic populations of the state, are major factors contributing to individual variation in state IMR. Race and ethnicity may confound or modify the IMR in states that shifted inside or outside the 95% CI following adjustment. Other factors including smoking during pregnancy and teen birth rate, which are

  11. The New York Sepsis Severity Score: Development of a Risk-Adjusted Severity Model for Sepsis.

    Science.gov (United States)

    Phillips, Gary S; Osborn, Tiffany M; Terry, Kathleen M; Gesten, Foster; Levy, Mitchell M; Lemeshow, Stanley

    2018-05-01

In accordance with Rory's Regulations, hospitals across New York State developed and implemented protocols for sepsis recognition and treatment to reduce variations in evidence-informed care and preventable mortality. The New York Department of Health sought to develop a risk assessment model for accurate and standardized hospital mortality comparisons of adult septic patients across institutions using case-mix adjustment. Retrospective evaluation of prospectively collected data. Data from 43,204 severe sepsis and septic shock patients from 179 hospitals across New York State were evaluated. Prospective data were submitted to a database from January 1, 2015, to December 31, 2015. None. Maximum likelihood logistic regression was used to estimate model coefficients used in the New York State risk model. The mortality probability was estimated using a logistic regression model. Variables to be included in the model were determined as part of the model-building process. Interactions between variables were included if they made clinical sense and if their p values were less than 0.05. Model development used a random sample of 90% of available patients and was validated using the remaining 10%. Hosmer-Lemeshow goodness of fit p values were considerably greater than 0.05, suggesting good calibration. Areas under the receiver operator curve in the developmental and validation subsets were 0.770 (95% CI, 0.765-0.775) and 0.773 (95% CI, 0.758-0.787), respectively, indicating good discrimination. Development and validation datasets had similar distributions of estimated mortality probabilities. Mortality increased with rising age, comorbidities, and lactate. The New York Sepsis Severity Score accurately estimated the probability of hospital mortality in severe sepsis and septic shock patients. It performed well with respect to calibration and discrimination. This sepsis-specific model provides an accurate, comprehensive method for standardized mortality comparison of adult

  12. An adjusted probability method for the identification of sociometric status in classrooms

    NARCIS (Netherlands)

    García Bacete, F.J.; Cillessen, A.H.N.

    2017-01-01

    Objective: The aim of this study was to test the performance of an adjusted probability method for sociometric classification proposed by García Bacete (GB) in comparison with two previous methods. Specific goals were to examine the overall agreement between methods, the behavioral correlates of

  13. Risk-adjusted antibiotic consumption in 34 public acute hospitals in Ireland, 2006 to 2014

    Science.gov (United States)

    Oza, Ajay; Donohue, Fionnuala; Johnson, Howard; Cunney, Robert

    2016-01-01

As antibiotic consumption rates between hospitals can vary depending on the characteristics of the patients treated, risk-adjustment that compensates for the patient-based variation is required to assess the impact of any stewardship measures. The aim of this study was to investigate the usefulness of patient-based administrative data variables for adjusting aggregate hospital antibiotic consumption rates. Data on total inpatient antibiotics and six broad subclasses were sourced from 34 acute hospitals from 2006 to 2014. Aggregate annual patient administration data were divided into explanatory variables, including major diagnostic categories, for each hospital. Multivariable regression models were used to identify factors affecting antibiotic consumption. The coefficient of variation of the root mean squared errors (CV-RMSE) for the total antibiotic usage model was very good (11%); however, the value for two of the models was poor (> 30%). The overall inpatient antibiotic consumption increased from 82.5 defined daily doses (DDD)/100 bed-days used in 2006 to 89.2 DDD/100 bed-days used in 2014; the increase was not significant after risk-adjustment. During the same period, consumption of carbapenems increased significantly, while usage of fluoroquinolones decreased. In conclusion, patient-based administrative data variables are useful for adjusting hospital antibiotic consumption rates, although additional variables should also be employed. PMID:27541730
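CV-RMSE as used above is simply the root mean squared error expressed as a percentage of the mean outcome. A toy sketch follows, assuming a single-predictor linear model rather than the study's multivariable one; the data values are invented.

```python
def ols_fit(x, y):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return my - slope * mx, slope

def cv_rmse(y, y_hat):
    """Coefficient of variation of the RMSE, in percent:
    100 * RMSE / mean(y). Smaller values mean a tighter model fit."""
    n = len(y)
    rmse = (sum((a - b) ** 2 for a, b in zip(y, y_hat)) / n) ** 0.5
    return 100.0 * rmse / (sum(y) / n)

# invented data: antibiotic use (DDD/100 bed-days used) vs. a case-mix index
x = [1.0, 1.2, 1.5, 1.8, 2.0, 2.4]
y = [70.0, 76.0, 83.0, 90.0, 95.0, 104.0]
a, b = ols_fit(x, y)
fitted = [a + b * xi for xi in x]
cvr = cv_rmse(y, fitted)
print(round(cvr, 2))
```

In the study's terms, a CV-RMSE around 11% would indicate that the explanatory variables track hospital-level consumption reasonably well, while values above 30% would not.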

  14. [Family characteristics, organic risk factors, psychopathological picture and premorbid adjustment of hospitalized adolescent patients].

    Science.gov (United States)

    Małkiewicz-Borkowska, M; Namysłowska, I; Siewierska, A; Puzyńska, E; Sredniawa, H; Zechowski, C; Iwanek, A; Ruszkowska, E

    1996-01-01

The relationship of family characteristics such as cohesion and adaptability to organic risk factors, developmental psychopathology, clinical picture and premorbid adjustment was assessed in a group of 100 hospitalized adolescent patients and their families. We found correlations between some organic risk factors (pathology of the neonatal period, pathology of early childhood), some indicators of developmental psychopathology (eating disorders, conduct disorders), some clinical signs (mannerism, grandiosity, hostility, suspiciousness, disturbances of content of thinking), premorbid adjustment, and the family variables described above. We think that biological variables characterizing the child (pathology of the neonatal period, pathology of early childhood) act as independent variables influencing some family characteristics. General system theory and circular thinking support these conclusions. In order to verify them, further investigations based on other methodologies are necessary, using these results as preliminary findings.

  15. Funding issues for Victorian hospitals: the risk-adjusted vision beyond casemix funding.

    Science.gov (United States)

    Antioch, K; Walsh, M

    2000-01-01

    This paper discusses casemix funding issues in Victoria impacting on teaching hospitals. For casemix payments to be acceptable, the average price and cost weights must be set at an appropriate standard. The average price is based on a normative, policy basis rather than benchmarking. The 'averaging principle' inherent in cost weights has resulted in some AN-DRG weights being too low for teaching hospitals that are key State-wide providers of high complexity services such as neurosurgery and trauma. Casemix data have been analysed using international risk adjustment methodologies to successfully negotiate with the Victorian State Government for specified grants for several high complexity AN-DRGs. A risk-adjusted capitation funding model has also been developed for cystic fibrosis patients treated by The Alfred, called an Australian Health Maintenance Organisation (AHMO). This will facilitate the development of similar models by both the Victorian and Federal governments.

  16. Screening techniques, sustainability and risk adjusted returns. : - A quantitative study on the Swedish equity funds market

    OpenAIRE

    Ögren, Tobias; Forslund, Petter

    2017-01-01

    Previous studies have primarily compared the performance of sustainable equity funds and non-sustainable equity funds. A meta-analysis over 85 different studies in the field concludes that there is no statistically significant difference in risk-adjusted returns when comparing sustainable funds and non-sustainable funds. This study is thus an extension on previous studies where the authors have chosen to test the two most common sustainability screening techniques to test if there is a differ...

  17. The Impact of Capital Structure on Economic Capital and Risk Adjusted Performance

    OpenAIRE

    Porteous, Bruce; Tapadar, Pradip

    2008-01-01

    The impact that capital structure and capital asset allocation have on financial services firm economic capital and risk adjusted performance is considered. A stochastic modelling approach is used in conjunction with banking and insurance examples. It is demonstrated that gearing up Tier 1 capital with Tier 2 capital can be in the interests of bank Tier 1 capital providers, but may not always be so for insurance Tier 1 capital providers. It is also shown that, by allocating a bank or insuranc...

  18. Does Risk-Adjusted Payment Influence Primary Care Providers' Decision on Where to Set Up Practices?

    DEFF Research Database (Denmark)

    Dietrichson, Jens; Anell, Anders; Dackehag, Margareta

Providing equal access to health care is an important objective in most health care systems. It is especially pertinent in systems like the Swedish primary care market, where providers are free to establish themselves in any part of the country. To improve equity in access to care, 15 out of 21 county...... of private primary care centers in areas with unfavorable socioeconomic and demographic characteristics. More generally, this result indicates that risk-adjusted capitation can significantly affect private providers’ establishment decisions.

  19. Improved implementation of the risk-adjusted Bernoulli CUSUM chart to monitor surgical outcome quality.

    Science.gov (United States)

    Keefe, Matthew J; Loda, Justin B; Elhabashy, Ahmad E; Woodall, William H

    2017-06-01

    The traditional implementation of the risk-adjusted Bernoulli cumulative sum (CUSUM) chart for monitoring surgical outcome quality requires waiting a pre-specified period of time after surgery before incorporating patient outcome information. We propose a simple but powerful implementation of the risk-adjusted Bernoulli CUSUM chart that incorporates outcome information as soon as it is available, rather than waiting a pre-specified period of time after surgery. A simulation study is presented that compares the performance of the traditional implementation of the risk-adjusted Bernoulli CUSUM chart to our improved implementation. We show that incorporating patient outcome information as soon as it is available leads to quicker detection of process deterioration. Deterioration of surgical performance could be detected much sooner using our proposed implementation, which could lead to the earlier identification of problems. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
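The underlying risk-adjusted Bernoulli CUSUM can be sketched compactly, with each outcome incorporated into the statistic as soon as it is known. This is an illustrative reconstruction, not the authors' code: the fixed control limit `h`, the odds-ratio shift `r_a`, and the toy patient sequence are assumptions.

```python
import math

def bernoulli_cusum(outcomes, risks, r_a=2.0, h=2.0):
    """Risk-adjusted Bernoulli CUSUM with Steiner-type weights: update the
    statistic as each patient's outcome (1 = failure) becomes available and
    return the index of the first signal against the fixed limit h, else None."""
    s = 0.0
    for t, (y, p) in enumerate(zip(outcomes, risks)):
        denom = 1.0 - p + r_a * p
        w = math.log(r_a / denom) if y else math.log(1.0 / denom)
        s = max(0.0, s + w)
        if s > h:
            return t
    return None

# toy sequence: performance deteriorates (more failures) after patient 10
risks = [0.1] * 20
outcomes = [0] * 10 + [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
signal_at = bernoulli_cusum(outcomes, risks)
print(signal_at)  # signals at patient index 14
```

The paper's improvement amounts to feeding each outcome into this update the moment it is observed rather than after a fixed post-surgery waiting period, which moves the signal earlier when performance deteriorates.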

  20. Household adjustment to flood risk: a survey of coastal residents in Texas and Florida, United States.

    Science.gov (United States)

    Brody, Samuel D; Lee, Yoonjeong; Highfield, Wesley E

    2017-07-01

    Individual households have increasingly borne responsibility for reducing the adverse impacts of flooding on their property. Little observational research has been conducted, however, at the household level to examine the major factors contributing to the selection of a particular household adjustment. This study addresses the issue by evaluating statistically the factors influencing the adoption of various household flood hazard adjustments. The results indicate that respondents with higher-value homes or longer housing tenure are more likely to adopt structural and expensive techniques. In addition, the information source and the Community Rating System (CRS) score for the jurisdiction where the household is located have a significant bearing on household adjustment. In contrast, proximity to risk zones and risk perception yield somewhat mixed results or behave counter to assumptions in the literature. The study findings provide insights that will be of value to governments and decision-makers interested in encouraging homeowners to take protective action given increasing flood risk. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  1. Suboptimal decision making by children with ADHD in the face of risk: Poor risk adjustment and delay aversion rather than general proneness to taking risks.

    Science.gov (United States)

    Sørensen, Lin; Sonuga-Barke, Edmund; Eichele, Heike; van Wageningen, Heidi; Wollschlaeger, Daniel; Plessen, Kerstin Jessica

    2017-02-01

Suboptimal decision making in the face of risk (DMR) in children with attention-deficit hyperactivity disorder (ADHD) may be mediated by deficits in a number of different neuropsychological processes. We investigated DMR in children with ADHD using the Cambridge Gambling Task (CGT) to distinguish difficulties in adjusting to changing probabilities of choice outcomes (so-called risk adjustment) from general risk proneness, and to distinguish these two processes from delay aversion (the tendency to choose the least delayed option) and impairments in the ability to reflect on choice options. Based on previous research, we predicted that suboptimal performance on this task in children with ADHD would primarily relate to problems with risk adjustment and delay aversion rather than general risk proneness. Drug-naïve children with ADHD (n = 36), 8 to 12 years, and an age-matched group of typically developing children (n = 34) performed the CGT. As predicted, children with ADHD were not more prone to making risky choices (i.e., risk proneness). However, they had difficulty adjusting to changing risk levels and were more delay aversive, with these two effects being correlated. Our findings add to the growing body of evidence that children with ADHD do not favor risk taking per se when performing gambling tasks, but rather may lack the cognitive skills or motivational style to appraise changing patterns of risk effectively. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Methods for risk estimation in nuclear energy

    Energy Technology Data Exchange (ETDEWEB)

    Gauvenet, A [CEA, 75 - Paris (France)

    1979-01-01

The author presents methods for estimating the different risks related to nuclear energy: immediate or delayed risks, individual or collective risks, risks of accidents and long-term risks. These methods have reached a high level of maturity, and their application to other industrial and human risk problems is currently under way, especially in English-speaking countries.

  3. Risk adjustment for case mix and the effect of surgeon volume on morbidity.

    Science.gov (United States)

    Maas, Matthew B; Jaff, Michael R; Rordorf, Guy A

    2013-06-01

Retrospective studies of large administrative databases have shown higher mortality for procedures performed by low-volume surgeons, but the adequacy of risk adjustment in those studies is in doubt. To determine whether the relationship between surgeon volume and outcomes is an artifact of case mix using a prospective sample of carotid endarterectomy cases. Observational cohort study from January 1, 2008, through December 31, 2010, with preoperative, immediate postoperative, and 30-day postoperative assessments acquired by independent monitors. Urban, tertiary academic medical center. All 841 patients who underwent carotid endarterectomy performed by a vascular surgeon or cerebrovascular neurosurgeon at the institution. Carotid endarterectomy without another concurrent surgery. Stroke, death, and other surgical complications occurring within 30 days of surgery along with other case data. A low-volume surgeon performed 40 or fewer cases per year. Variables used in a comparison administrative database study, as well as variables identified by our univariate analysis, were used for adjusted analyses to assess for an association between low-volume surgeons and the rate of stroke and death as well as other complications. The rate of stroke and death was 6.9% for low-volume surgeons and 2.0% for high-volume surgeons (P = .001). Complications were similarly higher (13.4% vs 7.2%, P = .008). Low-volume surgeons performed more nonelective cases. Low-volume surgeons were significantly associated with stroke and death in the unadjusted analysis as well as after adjustment with variables used in the administrative database study (odds ratio, 3.61; 95% CI, 1.70-7.67, and odds ratio, 3.68; 95% CI, 1.72-7.89, respectively).
However, adjusting for the significant disparity of American Society of Anesthesiologists Physical Status classification in case mix eliminated the effect of surgeon volume on the rate of stroke and death (odds ratio, 1.65; 95% CI, 0.59-4.64) and other

  4. Environmental Chemicals in Urine and Blood: Improving Methods for Creatinine and Lipid Adjustment

    Science.gov (United States)

    O’Brien, Katie M.; Upson, Kristen; Cook, Nancy R.; Weinberg, Clarice R.

    2015-01-01

    Background Investigators measuring exposure biomarkers in urine typically adjust for creatinine to account for dilution-dependent sample variation in urine concentrations. Similarly, it is standard to adjust for serum lipids when measuring lipophilic chemicals in serum. However, there is controversy regarding the best approach, and existing methods may not effectively correct for measurement error. Objectives We compared adjustment methods, including novel approaches, using simulated case–control data. Methods Using a directed acyclic graph framework, we defined six causal scenarios for epidemiologic studies of environmental chemicals measured in urine or serum. The scenarios include variables known to influence creatinine (e.g., age and hydration) or serum lipid levels (e.g., body mass index and recent fat intake). Over a range of true effect sizes, we analyzed each scenario using seven adjustment approaches and estimated the corresponding bias and confidence interval coverage across 1,000 simulated studies. Results For urinary biomarker measurements, our novel method, which incorporates both covariate-adjusted standardization and the inclusion of creatinine as a covariate in the regression model, had low bias and possessed 95% confidence interval coverage of nearly 95% for most simulated scenarios. For serum biomarker measurements, a similar approach involving standardization plus serum lipid level adjustment generally performed well. Conclusions To control measurement error bias caused by variations in serum lipids or by urinary diluteness, we recommend improved methods for standardizing exposure concentrations across individuals. Citation O’Brien KM, Upson K, Cook NR, Weinberg CR. 2016. Environmental chemicals in urine and blood: improving methods for creatinine and lipid adjustment. Environ Health Perspect 124:220–227; http://dx.doi.org/10.1289/ehp.1509693 PMID:26219104
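The covariate-adjusted standardization half of the approach can be sketched as follows. This is a simplified illustration, not the authors' exact procedure: it shows only the step of scaling a urinary concentration by the ratio of observed to covariate-predicted creatinine (the full method also enters creatinine as a covariate in the outcome regression), and all values are invented.

```python
def covariate_adjusted_standardization(chem, creat_obs, creat_pred):
    """Scale each urinary chemical concentration by the ratio of observed to
    covariate-predicted creatinine, removing dilution differences that are
    explained by covariates such as age and hydration."""
    return [c / (o / p) for c, o, p in zip(chem, creat_obs, creat_pred)]

# invented values: person 2 has twice the predicted creatinine
# (more concentrated urine), so the measurement is halved after standardization
chem = [10.0, 20.0]
creat_obs = [1.0, 2.0]
creat_pred = [1.0, 1.0]
std = covariate_adjusted_standardization(chem, creat_obs, creat_pred)
print(std)  # [10.0, 10.0]
```

In practice `creat_pred` would come from a regression of creatinine on the covariates in the directed acyclic graph, and the standardized concentrations would then be analyzed with creatinine included as an additional model covariate.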

  5. Risk adjustment models for short-term outcomes after surgical resection for oesophagogastric cancer.

    Science.gov (United States)

    Fischer, C; Lingsma, H; Hardwick, R; Cromwell, D A; Steyerberg, E; Groene, O

    2016-01-01

Outcomes for oesophagogastric cancer surgery are compared with the aim of benchmarking quality of care. Adjusting for patient characteristics is crucial to avoid biased comparisons between providers. The study objective was to develop a case-mix adjustment model for comparing 30- and 90-day mortality and anastomotic leakage rates after oesophagogastric cancer resections. The study reviewed existing models, considered expert opinion and examined audit data in order to select predictors that were subsequently used to develop a case-mix adjustment model for the National Oesophago-Gastric Cancer Audit, covering England and Wales. Models were developed on patients undergoing surgical resection between April 2011 and March 2013 using logistic regression. Model calibration and discrimination were quantified using a bootstrap procedure. Most existing risk models for oesophagogastric resections were methodologically weak, outdated or based on detailed laboratory data that are not generally available. In 4882 patients with oesophagogastric cancer used for model development, 30- and 90-day mortality rates were 2·3 and 4·4 per cent respectively, and 6·2 per cent of patients developed an anastomotic leak. The internally validated models, based on predictors selected from the literature, showed moderate discrimination (area under the receiver operating characteristic (ROC) curve 0·646 for 30-day mortality, 0·664 for 90-day mortality and 0·587 for anastomotic leakage) and good calibration. Based on available data, three case-mix adjustment models for postoperative outcomes in patients undergoing curative surgery for oesophagogastric cancer were developed. These models should be used for risk adjustment when assessing hospital performance in the National Health Service, and tested in other large health systems. © 2015 BJS Society Ltd Published by John Wiley & Sons Ltd.
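Bootstrap assessment of model performance can be sketched in simplified form. The study uses the bootstrap to internally validate calibration and discrimination; the illustration below does something related but simpler (a percentile bootstrap interval for the c-statistic on made-up predictions) and is a stand-in under stated assumptions, not the audit's procedure.

```python
import random

def auc(probs, outcomes):
    """C-statistic via pairwise comparison of cases and non-cases."""
    pos = [p for p, y in zip(probs, outcomes) if y]
    neg = [p for p, y in zip(probs, outcomes) if not y]
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0
               for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_ci(probs, outcomes, n_boot=500, seed=3):
    """Percentile bootstrap interval for the AUC: resample patients with
    replacement, recompute the AUC, take the 2.5% and 97.5% quantiles."""
    rng = random.Random(seed)
    n, stats = len(probs), []
    while len(stats) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        ys = [outcomes[i] for i in idx]
        if 0 < sum(ys) < n:                  # need both classes present
            stats.append(auc([probs[i] for i in idx], ys))
    stats.sort()
    return stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]

# invented predictions and outcomes
probs = [0.1, 0.2, 0.3, 0.35, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
outcomes = [0, 0, 0, 1, 0, 0, 1, 1, 0, 1]
lo_ci, hi_ci = bootstrap_ci(probs, outcomes)
print(lo_ci <= hi_ci)
```

A full internal-validation bootstrap would additionally refit the model on each resample to estimate optimism, which is omitted here for brevity.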

  6. A New Scale Factor Adjustment Method for Magnetic Force Feedback Accelerometer

    Directory of Open Access Journals (Sweden)

    Xiangqing Huang

    2017-10-01

A new and simple method to adjust the scale factor of a magnetic force feedback accelerometer is presented, which could be used in developing a rotating accelerometer gravity gradient instrument (GGI). Adjusting and matching the acceleration-to-current transfer functions of the four accelerometers automatically is one of the basic and necessary technologies for rejecting the common mode accelerations in the development of GGI. In order to adjust the scale factor of the magnetic force rebalance accelerometer, an external current is injected and combined with the normal feedback current; they are then applied together to the torque coil of the magnetic actuator. The injected current can be varied proportionally according to the external adjustment needs, and the change in the acceleration-to-current transfer function is then realized dynamically. The new adjustment method has the advantages of no extra assembly and ease of operation. Changes in the scale factor ranging from 33% smaller to 100% larger are verified experimentally by adjusting the different external coefficients. The static noise of the accelerometer is compared under conditions with and without the injected current, and the experimental results show no change in the current noise level, which further confirms the validity of the presented method.

  7. A New Scale Factor Adjustment Method for Magnetic Force Feedback Accelerometer.

    Science.gov (United States)

    Huang, Xiangqing; Deng, Zhongguang; Xie, Yafei; Li, Zhu; Fan, Ji; Tu, Liangcheng

    2017-10-27

A new and simple method to adjust the scale factor of a magnetic force feedback accelerometer is presented, which could be used in developing a rotating accelerometer gravity gradient instrument (GGI). Adjusting and matching the acceleration-to-current transfer functions of the four accelerometers automatically is one of the basic and necessary technologies for rejecting the common mode accelerations in the development of GGI. In order to adjust the scale factor of the magnetic force rebalance accelerometer, an external current is injected and combined with the normal feedback current; they are then applied together to the torque coil of the magnetic actuator. The injected current can be varied proportionally according to the external adjustment needs, and the change in the acceleration-to-current transfer function is then realized dynamically. The new adjustment method has the advantages of no extra assembly and ease of operation. Changes in the scale factor ranging from 33% smaller to 100% larger are verified experimentally by adjusting the different external coefficients. The static noise of the accelerometer is compared under conditions with and without the injected current, and the experimental results show no change in the current noise level, which further confirms the validity of the presented method.
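One reading consistent with the reported range (33% smaller to 100% larger) is that the injected current is proportional to the feedback current, i_inj = c * i_fb; this is our assumption for illustration, not the paper's stated circuit. The loop then balances at i_fb = a / (k * (1 + c)), so the acceleration-to-current scale factor is multiplied by 1/(1 + c): c = 0.5 gives a factor about 33% smaller, and c = -0.5 doubles it.

```python
def effective_scale_factor(k_nominal, c):
    """Assumed model: injected current is c times the feedback current, so the
    closed loop balances at i_fb = a / (k * (1 + c)), i.e. the nominal
    acceleration-to-current scale factor is multiplied by 1 / (1 + c)."""
    return k_nominal / (1.0 + c)

k = 1.0
print(round(effective_scale_factor(k, 0.5), 4))   # 0.6667, about 33% smaller
print(round(effective_scale_factor(k, -0.5), 4))  # 2.0, 100% larger
```

Under this model the two endpoints of the experimentally verified range correspond to injection coefficients of +0.5 and -0.5, which is what makes the reading plausible.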

  8. Analysis of methods to determine the latency of online movement adjustments

    NARCIS (Netherlands)

    Oostwoud Wijdenes, L.; Brenner, E.; Smeets, J.B.J.

    2014-01-01

    When studying online movement adjustments, one of the interesting parameters is their latency. We set out to compare three different methods of determining the latency: the threshold, confidence interval, and extrapolation methods. We simulated sets of movements with different movement times and

  9. Mate guarding in the Seychelles warbler is energetically costly and adjusted to paternity risk.

    Science.gov (United States)

    Komdeur, J

    2001-10-22

    Males may increase their fitness through extra-pair copulations (copulations outside the pair bond) that result in extra-pair fertilizations, but also risk lost paternity when they leave their own mate unguarded. The fitness costs of cuckoldry for Seychelles warblers (Acrocephalus sechellensis) are considerable because warblers have a single-egg clutch and, given the short breeding season, no time for a successful replacement clutch. Neighbouring males are the primary threat to a male's genetic paternity. Males minimize their loss of paternity by guarding their mates to prevent them from having extra-pair copulations during their fertile period. Here, I provide experimental evidence that mate-guarding behaviour is energetically costly and that the expression of this trade-off is adjusted to paternity risk (local male density). Free-living males that were induced to reduce mate guarding spent significantly more time foraging and gained significantly better body condition than control males. The larger the reduction in mate guarding, the more pronounced was the increase in foraging and body condition (accounting for food availability). An experimental increase in paternity risk resulted in an increase in mate-guarding intensity and a decrease in foraging and body condition, and vice versa. This is examined using both cross-sectional and longitudinal data. This study on the Seychelles warbler offers experimental evidence that mate guarding is energetically costly and adjusted to paternity risk.

  10. One idea of portfolio risk control for absolute return strategy risk adjustments by signals from correlation behavior

    Science.gov (United States)

    Nishiyama, N.

    2001-12-01

    The absolute return strategy provided through fund of funds (FOFs) investment schemes is a focus of the Japanese financial community. FOFs investment mainly consists of hedge fund investment, and it has two major characteristics: low correlation against the benchmark index and little impact from various external changes in the environment, given the aim of maximizing return. According to the historical track records of surviving hedge funds in this business, they maintain a stable high return and low risk. However, one must keep in mind that low risk is not equal to risk free. The failure of Long-Term Capital Management (LTCM) in the summer of 1998 was a symbolic case. The summer of 1998 exhibited a certain limitation of traditional value at risk (VaR) and the possibility that traditional VaR could be ineffectual against nonlinear types of fluctuation in the market. In this paper, I try to bring self-organized criticality (SOC) into portfolio risk control. SOC is well known as a model of decay in the natural world. I analyzed nonlinear fluctuations in the market as SOC and applied SOC to capture complicated market movements, using the threshold point of SOC and risk adjustments by scenario correlation as implicit signals. The threshold becomes the control parameter of risk exposure, used to set a downside floor and to forecast extreme nonlinear fluctuations under a certain probability. Simulation results show a synergy effect on portfolio risk control between SOC and the absolute return strategy.
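
    The threshold-as-control-parameter idea can be illustrated with a minimal sketch. This is not the paper's model; the signal, threshold value, and exposure levels are all assumptions chosen only to show the switching behaviour.

    ```python
    # Sketch: portfolio exposure is cut sharply once a correlation-behaviour
    # signal crosses a critical (SOC-style) threshold, implementing a
    # downside floor. All names and values are illustrative.

    def risk_exposure(signal: float, threshold: float,
                      normal: float = 1.0, reduced: float = 0.25) -> float:
        """Exposure multiplier: full below the threshold, reduced above it."""
        return normal if signal < threshold else reduced

    print(risk_exposure(0.4, threshold=0.7))  # -> 1.0  (normal regime)
    print(risk_exposure(0.9, threshold=0.7))  # -> 0.25 (critical regime)
    ```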

  11. Variation In Accountable Care Organization Spending And Sensitivity To Risk Adjustment: Implications For Benchmarking.

    Science.gov (United States)

    Rose, Sherri; Zaslavsky, Alan M; McWilliams, J Michael

    2016-03-01

    Spending targets (or benchmarks) for accountable care organizations (ACOs) participating in the Medicare Shared Savings Program must be set carefully to encourage program participation while achieving fiscal goals and minimizing unintended consequences, such as penalizing ACOs for serving sicker patients. Recently proposed regulatory changes include measures to make benchmarks more similar for ACOs in the same area with different historical spending levels. We found that ACOs vary widely in how their spending levels compare with those of other local providers after standard case-mix adjustments. Additionally adjusting for survey measures of patient health meaningfully reduced the variation in differences between ACO spending and local average fee-for-service spending, but substantial variation remained, which suggests that differences in care efficiency between ACOs and local non-ACO providers vary widely. Accordingly, measures to equilibrate benchmarks between high- and low-spending ACOs--such as setting benchmarks to risk-adjusted average fee-for-service spending in an area--should be implemented gradually to maintain participation by ACOs with high spending. Use of survey information also could help mitigate perverse incentives for risk selection and upcoding and limit unintended consequences of new benchmarking methodologies for ACOs serving sicker patients. Project HOPE—The People-to-People Health Foundation, Inc.

  12. Risk adjustment of health-care performance measures in a multinational register-based study: A pragmatic approach to a complicated topic

    Directory of Open Access Journals (Sweden)

    Tron Anders Moger

    2014-03-01

    Full Text Available Objectives: Health-care performance comparisons across countries are gaining popularity. In such comparisons, the risk adjustment methodology plays a key role in producing meaningful comparisons. However, comparisons may be complicated by the fact that not all participating countries are allowed to share their data across borders, meaning that only simple methods are easily used for the risk adjustment. In this study, we develop a pragmatic approach using patient-level register data from Finland, Hungary, Italy, Norway, and Sweden. Methods: Data on acute myocardial infarction patients were gathered from health-care registers in several countries. In addition to unadjusted estimates, we studied the effects of adjusting for age, gender, and a number of comorbidities. The stability of estimates for 90-day mortality and length of stay of the first hospital episode following diagnosis of acute myocardial infarction is studied graphically, using different choices of reference data. Logistic regression models are used for mortality, and negative binomial models are used for length of stay. Results: Results from the sensitivity analysis show that the various models of risk adjustment give similar results for the countries, with some exceptions for Hungary and Italy. Based on the results, in Finland and Hungary, the 90-day mortality after acute myocardial infarction is higher than in Italy, Norway, and Sweden. Conclusion: Health-care registers offer encouraging possibilities for performance measurement and enable the comparison of entire patient populations between countries. Risk adjustment methodology is affected by the availability of data, and thus the building of risk adjustment methodology must be transparent, especially when doing multinational comparative research. In that case, even basic methods of risk adjustment may still be valuable.
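
    The kind of logistic risk adjustment described can be sketched via indirect standardisation: a fitted model assigns each patient an expected 90-day mortality, and observed deaths are compared with the sum of expected deaths. The coefficients below are invented for illustration, not estimates from the study.

    ```python
    import math

    # Hedged sketch: expected 90-day mortality per AMI patient from a
    # logistic model with hypothetical coefficients, then an observed /
    # expected (standardised mortality ratio) comparison per country.

    def predicted_mortality(age: float, female: int, n_comorb: int) -> float:
        z = -6.0 + 0.06 * age + 0.1 * female + 0.35 * n_comorb  # assumed
        return 1.0 / (1.0 + math.exp(-z))

    def standardised_mortality_ratio(patients, observed_deaths):
        expected = sum(predicted_mortality(*p) for p in patients)
        return observed_deaths / expected

    patients = [(70, 1, 2), (80, 0, 3), (65, 0, 1)]  # (age, female, comorb)
    print(round(standardised_mortality_ratio(patients, 1), 2))
    ```

    Only the fitted coefficients, not patient-level data, need to cross borders in such a scheme, which is the pragmatic constraint the study works under.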

  13. A Plant Control Technology Using Reinforcement Learning Method with Automatic Reward Adjustment

    Science.gov (United States)

    Eguchi, Toru; Sekiai, Takaaki; Yamada, Akihiro; Shimizu, Satoru; Fukai, Masayuki

    A control technology using Reinforcement Learning (RL) and a Radial Basis Function (RBF) Network has been developed to reduce environmental load substances exhausted from power and industrial plants. This technology consists of a statistical model using an RBF Network, which estimates the characteristics of plants with respect to environmental load substances, and an RL agent, which learns the control logic for the plants using the statistical model. In this technology, it is necessary to design an appropriate reward function, given to the agent immediately according to operation conditions and control goals, in order to control plants flexibly. Therefore, we propose an automatic reward adjusting method of RL for plant control. This method adjusts the reward function automatically using information of the statistical model obtained in its learning process. In the simulations, it is confirmed that the proposed method can adjust the reward function adaptively for several test functions, and executes robust control of the thermal power plant considering changes in operation conditions and control goals.

  14. The adaptive problems of female teenage refugees and their behavioral adjustment methods for coping

    Directory of Open Access Journals (Sweden)

    Mhaidat F

    2016-04-01

    Full Text Available Fatin Mhaidat Department of Educational Psychology, Faculty of Educational Sciences, The Hashemite University, Zarqa, Jordan Abstract: This study aimed at identifying the levels of adaptive problems among teenage female refugees in the government schools and explored the behavioral methods that were used to cope with the problems. The sample was composed of 220 Syrian female students (seventh to first secondary grades) enrolled at government schools within the Zarqa Directorate who came to Jordan due to the war conditions in their home country. The study used the scale of adaptive problems that consists of four dimensions (depression, anger and hostility, low self-esteem, and feeling insecure) and a questionnaire of the behavioral adjustment methods for dealing with the problem of asylum. The results indicated that the Syrian teenage female refugees suffer a moderate degree of adaptation problems, and that they used positive adjustment methods more than negative ones. Keywords: adaptive problems, female teenage refugees, behavioral adjustment

  15. Impact of selected risk factors on quality-adjusted life expectancy in Denmark

    DEFF Research Database (Denmark)

    Brønnum-Hansen, Henrik; Juel, Knud; Davidsen, Michael

    2007-01-01

    AIMS: The construct quality-adjusted life years (QALYs) combines mortality and overall health status and can be used to quantify the impact of risk factors on population health. The purpose of the study was to estimate the impact of tobacco smoking, high alcohol consumption, physical inactivity...... Health Survey 2000, and Danish EQ-5D values. RESULTS: The quality-adjusted life expectancy of 25-year-olds was 10-11 QALYs shorter for heavy smokers than for those who never smoke. The difference in life expectancy was 9-10 years. Men and women with high alcohol consumption could expect to lose about 5...... and 3 QALYs, respectively. Sedentary persons could expect to have about 7 fewer QALYs than physically active persons. Obesity shortened QALYs by almost 3 for men and 6 for women. CONCLUSIONS: Smoking, high alcohol consumption, physical inactivity, and obesity strongly reduce life expectancy and health...

  16. Regression Trees Identify Relevant Interactions: Can This Improve the Predictive Performance of Risk Adjustment?

    Science.gov (United States)

    Buchner, Florian; Wasem, Jürgen; Schillo, Sonja

    2017-01-01

    Risk equalization formulas have been refined since their introduction about two decades ago. Because of the complexity and the abundance of possible interactions between the variables used, hardly any interactions are considered. A regression tree is used to systematically search for interactions, a methodologically new approach in risk equalization. Analyses are based on a data set of nearly 2.9 million individuals from a major German social health insurer. A two-step approach is applied: In the first step a regression tree is built on the basis of the learning data set. Terminal nodes characterized by more than one morbidity-group split represent interaction effects of different morbidity groups. In the second step the 'traditional' weighted least squares regression equation is expanded by adding interaction terms for all interactions detected by the tree, and regression coefficients are recalculated. The resulting risk adjustment formula shows an improvement in the adjusted R2 from 25.43% to 25.81% on the evaluation data set. Predictive ratios are calculated for subgroups affected by the interactions. The R2 improvement detected is only marginal. According to the sample-level performance measures used, omitting a considerable number of morbidity interactions entails no relevant loss in accuracy. Copyright © 2015 John Wiley & Sons, Ltd.
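
    The interaction-extraction step of the two-step approach can be sketched as follows. The morbidity group names and paths are invented for illustration; a real implementation would read them from a fitted tree.

    ```python
    from itertools import combinations

    # Sketch: each terminal node is represented by the variables split on
    # along its path; any path containing more than one morbidity group
    # yields candidate interaction terms for the risk equalization
    # regression. Group names and paths are hypothetical.

    MORBIDITY_GROUPS = {"DIABETES", "HEART_FAILURE", "RENAL"}

    paths_to_terminal_nodes = [
        ["DIABETES", "HEART_FAILURE"],           # two morbidity splits
        ["AGE_65PLUS"],                          # demographic split only
        ["DIABETES", "RENAL", "HEART_FAILURE"],  # three morbidity splits
    ]

    def interaction_terms(paths):
        terms = set()
        for path in paths:
            groups = [v for v in path if v in MORBIDITY_GROUPS]
            if len(groups) > 1:  # interaction detected by the tree
                terms.update(combinations(sorted(groups), 2))
        return sorted(terms)

    print(interaction_terms(paths_to_terminal_nodes))
    ```

    Each returned pair would become one extra product term in the weighted least squares equation before the coefficients are re-estimated.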

  17. Adjusted permutation method for multiple attribute decision making with meta-heuristic solution approaches

    Directory of Open Access Journals (Sweden)

    Hossein Karimi

    2011-04-01

    Full Text Available The permutation method of multiple attribute decision making has two significant deficiencies: high computational time and wrong priority output in some problem instances. In this paper, a novel permutation method called the adjusted permutation method (APM) is proposed to compensate for the deficiencies of the conventional permutation method. We propose Tabu search (TS) and particle swarm optimization (PSO) to find suitable solutions at a reasonable computational time for large problem instances. The proposed method is examined using some numerical examples to evaluate its performance. The preliminary results show that both approaches provide competent solutions in relatively reasonable amounts of time, while TS performs better in solving APM.
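
    For context, the classical permutation method that APM refines can be sketched with a toy decision matrix (illustrative scores and weights, not from the paper): every ranking of the alternatives is scored by the weighted evidence agreeing with it minus the evidence against it, and the best-scoring ranking wins. The exhaustive enumeration below is exactly the factorial-time step that motivates meta-heuristics such as TS and PSO.

    ```python
    from itertools import permutations

    # Hypothetical decision matrix: alternative -> criterion values
    # (higher is better), with one weight per criterion.
    scores = {
        "A1": [7, 5, 9],
        "A2": [8, 4, 6],
        "A3": [6, 8, 7],
    }
    weights = [0.4, 0.3, 0.3]

    def net_concordance(order):
        """Weighted agreement minus disagreement with ranking `order`."""
        total = 0.0
        for i, hi in enumerate(order):
            for lo in order[i + 1:]:  # hi is ranked above lo
                for w, a, b in zip(weights, scores[hi], scores[lo]):
                    total += w if a >= b else -w
        return total

    best = max(permutations(scores), key=net_concordance)
    print(best)  # -> ('A1', 'A3', 'A2')
    ```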

  18. Adjustment method for embedded metrology engine in an EM773 series microcontroller.

    Science.gov (United States)

    Blazinšek, Iztok; Kotnik, Bojan; Chowdhury, Amor; Kačič, Zdravko

    2015-09-01

    This paper presents the problems of implementation and adjustment (calibration) of a metrology engine embedded in NXP's EM773 series microcontroller. The metrology engine is used in a smart metering application to collect data about energy utilization and is controlled with the use of metrology engine adjustment (calibration) parameters. The aim of this research is to develop a method which would enable the operators to find and verify the optimum parameters which would ensure the best possible accuracy. Properly adjusted (calibrated) metrology engines can then be used as a base for variety of products used in smart and intelligent environments. This paper focuses on the problems encountered in the development, partial automatisation, implementation and verification of this method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  19. A comparison of methods to adjust for continuous covariates in the analysis of randomised trials

    Directory of Open Access Journals (Sweden)

    Brennan C. Kahan

    2016-04-01

    Full Text Available Abstract Background Although covariate adjustment in the analysis of randomised trials can be beneficial, adjustment for continuous covariates is complicated by the fact that the association between covariate and outcome must be specified. Misspecification of this association can lead to reduced power, and potentially incorrect conclusions regarding treatment efficacy. Methods We compared several methods of adjustment to determine which is best when the association between covariate and outcome is unknown. We assessed (a) dichotomisation or categorisation; (b) assuming a linear association with outcome; (c) using fractional polynomials with one (FP1) or two (FP2) polynomial terms; and (d) using restricted cubic splines with 3 or 5 knots. We evaluated each method using simulation and through a re-analysis of trial datasets. Results Methods which kept covariates as continuous typically had higher power than methods which used categorisation. Dichotomisation, categorisation, and assuming a linear association all led to large reductions in power when the true association was non-linear. FP2 models and restricted cubic splines with 3 or 5 knots performed best overall. Conclusions For the analysis of randomised trials we recommend (1) adjusting for continuous covariates even if their association with outcome is unknown; (2) keeping covariates as continuous; and (3) using fractional polynomials with two polynomial terms or restricted cubic splines with 3 to 5 knots when a linear association is in doubt.
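
    As an illustration of one recommended approach, the nonlinear basis term of a 3-knot restricted cubic spline (in Harrell's standard parameterisation, with illustrative knot placements) can be computed directly; adding it alongside the linear term lets the covariate-outcome association bend between the outer knots while staying linear in the tails.

    ```python
    # Nonlinear basis term of a restricted cubic spline with 3 knots
    # t1 < t2 < t3 (standard parameterisation; knots here are assumed).

    def rcs3_term(x, t1, t2, t3):
        p = lambda u: max(u, 0.0) ** 3  # truncated cube
        return (p(x - t1)
                - p(x - t2) * (t3 - t1) / (t3 - t2)
                + p(x - t3) * (t2 - t1) / (t3 - t2)) / (t3 - t1) ** 2

    # Beyond the last knot the spline is linear, so second differences
    # of equally spaced evaluations vanish:
    t1, t2, t3 = 30.0, 50.0, 70.0
    vals = [rcs3_term(x, t1, t2, t3) for x in (80.0, 90.0, 100.0)]
    print(vals[2] - 2 * vals[1] + vals[0])  # -> 0.0
    ```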

  20. [Risk sharing methods in middle income countries].

    Science.gov (United States)

    Inotai, András; Kaló, Zoltán

    2012-01-01

    The pricing strategy of innovative medicines is based on the therapeutic value in the largest pharmaceutical markets. The cost-effectiveness of new medicines with a value-based ex-factory price is justifiable. Due to international price referencing and parallel trade, the ex-factory price corridor of new medicines has narrowed in recent years. Middle income countries have less negotiating power to change the narrow drug pricing corridor, although their fair intention is to buy pharmaceuticals at a lower price from their scarce public resources compared to higher income countries. Therefore the reimbursement of new medicines at the prices of Western European countries may not be justifiable in Central and Eastern European countries. Confidential pricing agreements (i.e. confidential price discounts, claw-back or rebate) in lower income countries of the European Union can alleviate this problem, as prices of new medicines can be adjusted to local purchasing power without influencing the published ex-factory price and hence the accessibility of patients to these drugs in other countries. In order to control the drug budget, payers tend to apply financial risk sharing agreements for new medicines in more and more countries to shift the consequences of potential overspending to pharmaceutical manufacturers. The major paradox of financial risk-sharing schemes is that increased mortality, poor persistence of patients, reduced access to healthcare providers, and no treatment reduce pharmaceutical spending. Consequently, payers have recently started to apply outcome-based risk sharing agreements for new medicines to improve the quality of health care provision. Our paper aims to review and assess the published financial and outcome-based risk sharing methods. Introduction of outcome-based risk-sharing schemes can be a major advancement in the drug reimbursement strategy of payers in middle income countries. These schemes can help to reduce the medical uncertainty in coverage

  1. Method Based on Confidence Radius to Adjust the Location of Mobile Terminals

    DEFF Research Database (Denmark)

    García-Fernández, Juan Antonio; Jurado-Navas, Antonio; Fernández-Navarro, Mariano

    2017-01-01

    The present paper details a technique for adjusting in a smart manner the position estimates of any user equipment given by different geolocation/positioning methods in a wireless radiofrequency communication network based on different strategies (observed time difference of arrival , angle of ar...

  2. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan

    Directory of Open Access Journals (Sweden)

    Weiner Jonathan P

    2010-01-01

    Full Text Available Abstract Background Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. Methods A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those enrolled in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. Results The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Groups) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Clusters). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Conclusions Given the

  3. Singularity-sensitive gauge-based radar rainfall adjustment methods for urban hydrological applications

    Directory of Open Access Journals (Sweden)

    L.-P. Wang

    2015-09-01

    Full Text Available Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth off so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original Bayesian (non singularity-sensitive) technique and the commonly used mean field bias adjustment. This test is conducted using as case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011 and for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates. Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban

  4. Singularity-sensitive gauge-based radar rainfall adjustment methods for urban hydrological applications

    Science.gov (United States)

    Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.

    2015-09-01

    Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth off so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original Bayesian (non singularity-sensitive) technique and the commonly used mean field bias adjustment. This test is conducted using as case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011 and for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates. Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system
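
    The mean field bias adjustment used above as a baseline comparison is simple enough to sketch directly: all radar pixels are rescaled by a single factor so that radar matches the gauges on average. The values below are illustrative; real implementations work on gridded fields and must handle zero-rainfall cases.

    ```python
    # Mean field bias (MFB) adjustment: one multiplicative factor for the
    # whole radar image, computed from gauge/radar pairs at gauge sites.
    # This is the Gaussian-style smoothing baseline that the
    # singularity-sensitive method improves on. Values are illustrative.

    def mean_field_bias_adjust(radar_field, radar_at_gauges, gauge_obs):
        # Assumes sum(radar_at_gauges) > 0, i.e. radar saw some rain.
        bias = sum(gauge_obs) / sum(radar_at_gauges)
        return [pixel * bias for pixel in radar_field]

    radar_field = [0.8, 1.6, 2.4, 0.4]   # mm/h, whole radar image
    radar_at_gauges = [0.8, 2.4]         # radar pixels co-located with gauges
    gauge_obs = [1.2, 3.6]               # gauge readings at those pixels
    print(mean_field_bias_adjust(radar_field, radar_at_gauges, gauge_obs))
    ```

    Because one factor scales every pixel equally, local extremes (the "singularities") keep their relative shape but any non-Gaussian fine-scale structure that the radar already smoothed cannot be recovered, which motivates the singularity analysis.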

  5. Breeds of risk-adjusted fundamentalist strategies in an order-driven market

    Science.gov (United States)

    LiCalzi, Marco; Pellizzari, Paolo

    2006-01-01

    This paper studies an order-driven stock market where agents have heterogeneous estimates of the fundamental value of the risky asset. The agents are budget-constrained and follow a value-based trading strategy which buys or sells depending on whether the price of the asset is below or above its risk-adjusted fundamental value. This environment generates returns that are remarkably leptokurtic and fat-tailed. By extending the study over a grid of different parameters for the fundamentalist trading strategy, we exhibit the existence of monotone relationships between the bid-ask spread demanded by the agents and several statistics of the returns. We conjecture that this effect, coupled with positive dependence of the risk premium on the volatility, generates positive feedbacks that might explain volatility bursts.
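
    The value-based rule can be sketched as follows. How the risk adjustment and the demanded spread enter the decision is an assumption made for illustration (value minus a risk premium, with a symmetric half-spread band); the paper's agents are additionally budget-constrained, which is omitted here.

    ```python
    # Sketch of a fundamentalist trading rule: buy below the risk-adjusted
    # fundamental value minus half the demanded bid-ask spread, sell above
    # it plus half the spread, otherwise hold. Numbers are illustrative.

    def fundamentalist_order(price, fundamental_value, risk_premium, spread):
        adjusted_value = fundamental_value - risk_premium
        if price < adjusted_value - spread / 2:
            return "buy"
        if price > adjusted_value + spread / 2:
            return "sell"
        return "hold"

    print(fundamentalist_order(95.0, 105.0, 3.0, 4.0))   # -> buy
    print(fundamentalist_order(106.0, 105.0, 3.0, 4.0))  # -> sell
    print(fundamentalist_order(101.0, 105.0, 3.0, 4.0))  # -> hold
    ```

    Widening `spread` enlarges the hold band, which is the parameter the paper relates monotonically to the return statistics.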

  6. PACE and the Medicare+Choice risk-adjusted payment model.

    Science.gov (United States)

    Temkin-Greener, H; Meiners, M R; Gruenberg, L

    2001-01-01

    This paper investigates the impact of the Medicare principal inpatient diagnostic cost group (PIP-DCG) payment model on the Program of All-Inclusive Care for the Elderly (PACE). Currently, more than 6,000 Medicare beneficiaries who are nursing home certifiable receive care from PACE, a program poised for expansion under the Balanced Budget Act of 1997. Overall, our analysis suggests that the application of the PIP-DCG model to the PACE program would reduce Medicare payments to PACE, on average, by 38%. The PIP-DCG payment model bases its risk adjustment on inpatient diagnoses and does not capture adequately the risk of caring for a population with functional impairments.

  7. The method and program system CABEI for adjusting consistency between natural element and its isotopes data

    Energy Technology Data Exchange (ETDEWEB)

    Tingjin, Liu; Zhengjun, Sun [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    To meet the requirements of nuclear engineering, especially fusion reactors, the data in the major evaluated libraries are now given not only for the natural element but also for its isotopes. Inconsistency between element and isotope data is one of the main problems in present evaluated neutron libraries. The formulas for adjusting the data to satisfy the two kinds of consistency relationships simultaneously were derived by means of the least squares method, and the program system CABEI was developed. The program was tested by calculating the Fe data in CENDL-2.1. The results show that the adjusted values satisfy the two kinds of consistency relationships.
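
    The least-squares idea can be illustrated on a single data point: the element cross section should equal the abundance-weighted sum of its isotope cross sections, and the smallest equal-weight adjustment restoring this constraint follows from a Lagrange multiplier. This toy sketch handles one constraint and one value per nuclide, whereas CABEI adjusts full cross-section sets; all numbers are invented.

    ```python
    # Minimal least-squares consistency adjustment: given element value E
    # and isotope values s_i with abundances a_i, minimise the total
    # squared change subject to E' = sum(a_i * s_i'). The residual r is
    # distributed with denominator 1 + sum(a_i^2).

    def adjust_consistency(element, isotopes, abundances):
        r = element - sum(a * s for a, s in zip(abundances, isotopes))
        denom = 1.0 + sum(a * a for a in abundances)
        new_element = element - r / denom
        new_isotopes = [s + a * r / denom
                        for a, s in zip(abundances, isotopes)]
        return new_element, new_isotopes

    E, iso, ab = 2.00, [1.90, 2.50], [0.90, 0.10]  # barns, fractions (toy)
    E2, iso2 = adjust_consistency(E, iso, ab)
    # After adjustment the consistency relation holds:
    print(abs(E2 - sum(a * s for a, s in zip(ab, iso2))) < 1e-9)  # -> True
    ```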

  8. Reliability of risk-adjusted outcomes for profiling hospital surgical quality.

    Science.gov (United States)

    Krell, Robert W; Hozain, Ahmed; Kao, Lillian S; Dimick, Justin B

    2014-05-01

    Quality improvement platforms commonly use risk-adjusted morbidity and mortality to profile hospital performance. However, given small hospital caseloads and low event rates for some procedures, it is unclear whether these outcomes reliably reflect hospital performance. To determine the reliability of risk-adjusted morbidity and mortality for hospital performance profiling using clinical registry data. A retrospective cohort study was conducted using data from the American College of Surgeons National Surgical Quality Improvement Program, 2009. Participants included all patients (N = 55,466) who underwent colon resection, pancreatic resection, laparoscopic gastric bypass, ventral hernia repair, abdominal aortic aneurysm repair, and lower extremity bypass. Outcomes included risk-adjusted overall morbidity, severe morbidity, and mortality. We assessed reliability (0-1 scale: 0, completely unreliable; and 1, perfectly reliable) for all 3 outcomes. We also quantified the number of hospitals meeting minimum acceptable reliability thresholds (>0.70, good reliability; and >0.50, fair reliability) for each outcome. For overall morbidity, the most common outcome studied, the mean reliability depended on sample size (ie, how high the hospital caseload was) and the event rate (ie, how frequently the outcome occurred). For example, mean reliability for overall morbidity was low for abdominal aortic aneurysm repair (reliability, 0.29; sample size, 25 cases per year; and event rate, 18.3%). In contrast, mean reliability for overall morbidity was higher for colon resection (reliability, 0.61; sample size, 114 cases per year; and event rate, 26.8%). Colon resection (37.7% of hospitals), pancreatic resection (7.1% of hospitals), and laparoscopic gastric bypass (11.5% of hospitals) were the only procedures for which any hospitals met a reliability threshold of 0.70 for overall morbidity. Because severe morbidity and mortality are less frequent outcomes, their mean
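
    The reliability quantity above has a standard formulation that can be sketched directly: the share of between-hospital variance in the total variance of a hospital's observed rate, where the noise term for a proportion shrinks with caseload as p(1-p)/n. The between-hospital variance below is an assumed value, not the study's estimate.

    ```python
    # Reliability of a hospital's risk-adjusted outcome rate:
    #   reliability = var_between / (var_between + var_within / n-effects),
    # with within-hospital sampling noise p*(1-p)/caseload for a proportion.

    def outcome_reliability(p_event, caseload, between_var):
        within_var = p_event * (1.0 - p_event) / caseload
        return between_var / (between_var + within_var)

    # Low caseload, AAA-repair-like numbers (between_var assumed):
    print(round(outcome_reliability(0.183, 25, 0.002), 2))   # -> 0.25
    # High caseload, colon-resection-like numbers:
    print(round(outcome_reliability(0.268, 114, 0.002), 2))  # -> 0.54
    ```

    The sketch reproduces the paper's qualitative finding: reliability rises with caseload and event rate, so low-volume procedures rarely clear the 0.70 threshold.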

  9. Gender adjustment or stratification in discerning upper extremity musculoskeletal disorder risk?

    Science.gov (United States)

    Silverstein, Barbara; Fan, Z Joyce; Smith, Caroline K; Bao, Stephen; Howard, Ninica; Spielholz, Peregrin; Bonauto, David; Viikari-Juntura, Eira

    2009-03-01

    The aim was to explore whether "adjustment" for gender masks important exposure differences between men and women in a study of rotator cuff syndrome (RCS) and carpal tunnel syndrome (CTS) and work exposures. This cross-sectional study of 733 subjects in 12 health care and manufacturing workplaces used detailed individual health and work exposure assessment methods. Multiple logistic regression analysis was used to compare gender-stratified and gender-adjusted models. Prevalence of RCS and CTS among women was 7.1% and 11.3% respectively, and among men 7.8% and 6.4%. In adjusted (gender, age, body mass index) multivariate analyses of RCS and CTS, gender was not statistically significantly different. For RCS, upper arm flexion ≥45 degrees and forceful pinch increased the odds in the gender-adjusted model (OR 2.66, 95% CI 1.26-5.59) but primarily among women in the stratified analysis (OR 6.68, 95% CI 1.81-24.66 versus OR 1.45, 95% CI 0.53-4.00). For CTS, with wrist radial/ulnar deviation ≥4% of the time and lifting ≥4.5 kg >3% of the time, the adjusted OR was higher for women (OR 4.85, 95% CI 2.12-11.11), and in the gender-stratified analyses the odds were increased for both genders (women OR 5.18, 95% CI 1.70-15.81 and men OR 3.63, 95% CI 1.08-12.18). Gender differences in response to physical work exposures may reflect gender segregation in work and potential differences in pinch and lifting capacity. Reduction in these exposures may reduce the prevalence of upper extremity disorders for all workers.

  10. Usefulness of administrative databases for risk adjustment of adverse events in surgical patients.

    Science.gov (United States)

    Rodrigo-Rincón, Isabel; Martin-Vizcaíno, Marta P; Tirapu-León, Belén; Zabalza-López, Pedro; Abad-Vicente, Francisco J; Merino-Peralta, Asunción; Oteiza-Martínez, Fabiola

    2016-03-01

    The aim of this study was to assess the usefulness of clinical-administrative databases for the development of risk adjustment in the assessment of adverse events in surgical patients. The study was conducted at the Hospital of Navarra, a tertiary teaching hospital in northern Spain. We studied 1602 hospitalizations of surgical patients from 2008 to 2010. We analysed 40 comorbidity variables included in the National Surgical Quality Improvement Program (NSQIP) of the American College of Surgeons using 2 sources of information: the clinical and administrative database (CADB) and the data extracted from the complete clinical records (CR), which was considered the gold standard. Variables were catalogued according to compliance with the established criteria: sensitivity, positive predictive value and kappa coefficient >0.6. The average number of comorbidities per study participant was 1.6 using the CR and 0.95 based on the CADB (p<.0001). Thirteen types of comorbidities (accounting for 8% of the comorbidities detected in the CR) were not identified when the CADB was the source of information. Five of the 27 remaining comorbidities complied with the 3 established criteria; 2 pathologies fulfilled 2 criteria, whereas 11 fulfilled 1, and 9 did not fulfil any criterion. The CADB detected prevalent comorbidities such as hypertension and diabetes. However, the CADB did not provide enough information to assess the variables needed to perform the risk adjustment proposed by the NSQIP for the assessment of adverse events in surgical patients.
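
    The three agreement criteria used above (sensitivity, positive predictive value and kappa, each >0.6) can be computed from a 2x2 cross-classification of the two sources against the gold standard. A sketch with hypothetical counts for one comorbidity:

```python
def agreement_stats(tp, fp, fn, tn):
    """Sensitivity, positive predictive value and Cohen's kappa of an
    administrative source judged against the clinical-record gold standard."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    observed = (tp + tn) / n  # raw agreement
    # chance agreement from the marginal present/absent rates of each source
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, ppv, kappa

# hypothetical counts for one comorbidity across 200 admissions
sens, ppv, kappa = agreement_stats(tp=40, fp=10, fn=10, tn=140)
meets_criteria = sens > 0.6 and ppv > 0.6 and kappa > 0.6
```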

  11. Behavioural adjustment in response to increased predation risk: a study in three duck species.

    Directory of Open Access Journals (Sweden)

    Cédric Zimmer

    Predation directly triggers behavioural decisions designed to increase immediate survival. However, these behavioural modifications can have long term costs. There is therefore a trade-off between antipredator behaviours and other activities. This trade-off is generally considered between vigilance and only one other behaviour, thus neglecting potential compensations. In this study, we considered the effect of an increase in predation risk on the diurnal time-budget of three captive duck species during the wintering period. We artificially increased predation risk by disturbing two groups of 14 mallard and teals at different frequencies, and one group of 14 tufted ducks with a radio-controlled stressor. We recorded foraging, vigilance, preening and sleeping durations the week before, during and after disturbance sessions. Disturbed groups were compared to an undisturbed control group. We showed that in all three species, the increase in predation risk resulted in a decrease in foraging and preening and led to an increase in sleeping. It is worth noting that contrary to common observations, vigilance did not increase. However, ducks are known to be vigilant while sleeping. This complex behavioural adjustment therefore seems to be optimal as it may allow ducks to reduce their predation risk. Our results highlight the fact that it is necessary to encompass the whole individual time-budget when studying behavioural modifications under predation risk. Finally, we propose that studies of behavioural time-budget changes under predation risk should be included in the more general framework of the starvation-predation risk trade-off.

  12. IC layout adjustment method and tool for improving dielectric reliability at interconnects

    Energy Technology Data Exchange (ETDEWEB)

    Kahng, Andrew B.; Chan, Tuck Boon

    2018-03-20

    A method for adjusting a layout used in making an integrated circuit: one or more interconnects in the layout that are susceptible to dielectric breakdown are selected, and the selected interconnects are adjusted to increase via-to-wire spacing with respect to at least one via and one wire of those interconnects. Preferably, the selecting analyzes signal patterns of interconnects and estimates the stress ratio based on the state probability of routed signal nets in the layout. An annotated layout is provided that describes the distances by which one or more via or wire segment edges are to be shifted. Adjustments can include thinning and shifting of wire segments, and rotation of vias.

  13. Psychosocial Adjustment and Sibling Relationships in Siblings of Children with Autism Spectrum Disorder: Risk and Protective Factors

    Science.gov (United States)

    Walton, Katherine M.; Ingersoll, Brooke R.

    2015-01-01

    This study compared sibling adjustment and relationships in siblings of children with Autism Spectrum Disorder (ASD-Sibs; n = 69) and siblings of children with typical development (TD-Sibs; n = 93). ASD-Sibs and TD-Sibs demonstrated similar emotional/behavioral adjustment. Older male ASD-Sibs were at increased risk for difficulties. Sibling…

  14. Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-01-01

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes with a high-priced computational cost. Furthermore, some of the volume measurement methods based on 3D reconstruction have a low accuracy. Another method for measuring volume of objects uses Monte Carlo method. Monte Carlo method performs volume measurements using random points. Monte Carlo method only requires information regarding whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products without 3D reconstruction based on Monte Carlo method with heuristic adjustment. Five images of food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.

  15. Monte Carlo method with heuristic adjustment for irregularly shaped food product volume measurement.

    Science.gov (United States)

    Siswantoro, Joko; Prabuwono, Anton Satria; Abdullah, Azizi; Idrus, Bahari

    2014-01-01

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes with a high-priced computational cost. Furthermore, some of the volume measurement methods based on 3D reconstruction have a low accuracy. Another method for measuring volume of objects uses Monte Carlo method. Monte Carlo method performs volume measurements using random points. Monte Carlo method only requires information regarding whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products without 3D reconstruction based on Monte Carlo method with heuristic adjustment. Five images of food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
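
    The inside/outside sampling idea common to both records above can be sketched as follows. A toy sphere with a known analytic volume stands in for the binary-image inside/outside test of a real food product; the heuristic adjustment of the papers is not reproduced.

```python
import random

def monte_carlo_volume(inside, bounds, n=100_000, rng=None):
    """Estimate an object's volume from an inside/outside test alone:
    fraction of random points inside, scaled by the bounding-box volume."""
    rng = rng or random.Random(0)
    (x0, x1), (y0, y1), (z0, z1) = bounds
    box = (x1 - x0) * (y1 - y0) * (z1 - z0)
    hits = sum(
        inside((rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1)))
        for _ in range(n)
    )
    return box * hits / n

# unit sphere: true volume 4/3 * pi ~= 4.18879
est = monte_carlo_volume(lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2 <= 1.0,
                         bounds=((-1, 1), (-1, 1), (-1, 1)))
```

No geometry reconstruction is needed; accuracy improves with the number of sampled points, which is exactly the trade-off the heuristic adjustment in these papers targets.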

  16. An Adjusted Probability Method for the Identification of Sociometric Status in Classrooms

    Directory of Open Access Journals (Sweden)

    Francisco J. García Bacete

    2017-10-01

    Objective: The aim of this study was to test the performance of an adjusted probability method for sociometric classification proposed by García Bacete (GB) in comparison with two previous methods. Specific goals were to examine the overall agreement between methods, the behavioral correlates of each sociometric group, the sources of discrepant classifications between methods, the behavioral profiles of discrepant and consistent cases between methods, and age differences. Method: We compared the GB adjusted probability method with the standard score model proposed by Coie and Dodge (CD) and the probability score model proposed by Newcomb and Bukowski (NB). The GB method is an adaptation of the NB method: cutoff scores are derived from the distribution of raw liked-most and liked-least scores in each classroom instead of the fixed, absolute scores used by the NB method. The criteria for neglected status are also modified by the GB method. Participants were 569 children (45% girls) from 23 elementary school classrooms (13 in Grades 1-2, 10 in Grades 5-6). Results: We found agreement as well as differences between the three methods. The CD method yielded discrepancies in the classifications because of its dependence on z-scores and composite dimensions. The NB method was less optimal in the validation of the behavioral characteristics of the sociometric groups, because of its fixed cutoffs for identifying preferred, rejected, and controversial children, and because it does not differentiate between positive and negative nominations for neglected children. The GB method addressed some of the limitations of the other two methods. It improved the classification of neglected students, as well as of discrepant cases of the preferred, rejected, and controversial groups. Agreement between methods was higher with the oldest children. Conclusion: GB is a valid sociometric method, as evidenced by the behavior profiles of the sociometric status groups identified with this method.

  17. Critical review of methods for risk ranking of food related hazards, based on risks for human health

    DEFF Research Database (Denmark)

    van der Fels-Klerx, H. J.; van Asselt, E. D.; Raley, M.

    2018-01-01

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food and environmental science, and the risk ranking method of each study was characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years, multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking.

  18. The Effect of Adding Comorbidities to Current Centers for Disease Control and Prevention Central-Line-Associated Bloodstream Infection Risk-Adjustment Methodology.

    Science.gov (United States)

    Jackson, Sarah S; Leekha, Surbhi; Magder, Laurence S; Pineles, Lisa; Anderson, Deverick J; Trick, William E; Woeltje, Keith F; Kaye, Keith S; Stafford, Kristen; Thom, Kerri; Lowe, Timothy J; Harris, Anthony D

    2017-09-01

    BACKGROUND Risk adjustment is needed to fairly compare central-line-associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes. METHODS Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank. RESULTS Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51-0.59) for the ICU-type model and 0.64 (95% CI, 0.60-0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model. CONCLUSIONS Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals. Infect Control Hosp Epidemiol 2017;38:1019-1024.
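
    The standardized infection ratio (SIR) used for ranking reduces to observed infections divided by the model-expected count (the sum of per-patient predicted risks). A minimal sketch with hypothetical hospitals:

```python
def standardized_infection_ratio(observed_infections, predicted_risks):
    """SIR = observed count / model-expected count; <1 is better than expected."""
    expected = sum(predicted_risks)  # sum of per-patient predicted probabilities
    return observed_infections / expected

# hypothetical hospitals with equal expected burden but different observed counts
sir_a = standardized_infection_ratio(3, [0.02] * 100)  # 3 observed vs 2 expected
sir_b = standardized_infection_ratio(1, [0.02] * 100)  # 1 observed vs 2 expected
ranking = sorted([("A", sir_a), ("B", sir_b)], key=lambda h: h[1])  # best first
```

Because the expected count comes from the risk model, adding comorbidity predictors changes each hospital's denominator, which is why hospital ranks shifted when case mix was added.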

  19. Improving Risk Adjustment for Mortality After Pediatric Cardiac Surgery: The UK PRAiS2 Model.

    Science.gov (United States)

    Rogers, Libby; Brown, Katherine L; Franklin, Rodney C; Ambler, Gareth; Anderson, David; Barron, David J; Crowe, Sonya; English, Kate; Stickley, John; Tibby, Shane; Tsang, Victor; Utley, Martin; Witter, Thomas; Pagel, Christina

    2017-07-01

    Partial Risk Adjustment in Surgery (PRAiS), a risk model for 30-day mortality after children's heart surgery, has been used by the UK National Congenital Heart Disease Audit to report expected risk-adjusted survival since 2013. This study aimed to improve the model by incorporating additional comorbidity and diagnostic information. The model development dataset was all procedures performed between 2009 and 2014 in all UK and Ireland congenital cardiac centers. The outcome measure was death within each 30-day surgical episode. Model development followed an iterative process of clinical discussion and development and assessment of models using logistic regression under 25 × 5 cross-validation. Performance was measured using Akaike information criterion, the area under the receiver-operating characteristic curve (AUC), and calibration. The final model was assessed in an external 2014 to 2015 validation dataset. The development dataset comprised 21,838 30-day surgical episodes, with 539 deaths (mortality, 2.5%). The validation dataset comprised 4,207 episodes, with 97 deaths (mortality, 2.3%). The updated risk model included 15 procedural, 11 diagnostic, and 4 comorbidity groupings, and nonlinear functions of age and weight. Performance under cross-validation was: median AUC of 0.83 (range, 0.82 to 0.83), median calibration slope and intercept of 0.92 (range, 0.64 to 1.25) and -0.23 (range, -1.08 to 0.85) respectively. In the validation dataset, the AUC was 0.86 (95% confidence interval [CI], 0.82 to 0.89), and the calibration slope and intercept were 1.01 (95% CI, 0.83 to 1.18) and 0.11 (95% CI, -0.45 to 0.67), respectively, showing excellent performance. A more sophisticated PRAiS2 risk model for UK use was developed with additional comorbidity and diagnostic information, alongside age and weight as nonlinear variables.
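
    The AUC reported for PRAiS2 can be computed by pairwise comparison of predicted risks between deaths and survivors. This is a generic sketch, not the study's code; the labels and risks below are toy values.

```python
def auc(labels, scores):
    """Area under the ROC curve via pairwise comparisons: the probability
    that a randomly chosen death (label 1) is scored higher than a randomly
    chosen survivor (label 0), with ties counted as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy example: outcomes (1 = death) and model-predicted risks
example_auc = auc([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1])
```

An AUC of 0.5 means the model separates deaths from survivors no better than chance; the 0.86 reported in the external validation indicates strong discrimination.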

  20. Fitting method of pseudo-polynomial for solving nonlinear parametric adjustment

    Institute of Scientific and Technical Information of China (English)

    陶华学; 宫秀军; 郭金运

    2001-01-01

    The optimality condition of least-squares adjustment and its geometrical characteristics are presented, and the relation between the transformed surface and least squares is discussed. On this basis, a non-iterative method, called the fitting method of pseudo-polynomial, is derived in detail. The final least-squares solution can be determined with sufficient accuracy in a single step, rather than by iteratively moving an initial point. The accuracy of the solution depends wholly on the order of the Taylor series. An example verifies the correctness and validity of the method.

  1. Approaches and methods of risk assessment

    International Nuclear Information System (INIS)

    Rowe, W.D.

    1983-01-01

    The classification system of risk assessment includes the categories: 1) risk comparisons, 2) cost-effectiveness of risk reduction, 3) balancing of costs, risks and benefits against one another, and 4) metasystems. An overview of methods and systems reveals that no single method can be applied to all cases and situations. The visibility of the process and the full consideration of all aspects of judgement are, however, of foremost importance. (DG)

  2. Disease-Specific Trends of Comorbidity Coding and Implications for Risk Adjustment in Hospital Administrative Data.

    Science.gov (United States)

    Nimptsch, Ulrike

    2016-06-01

    To investigate changes in comorbidity coding after the introduction of diagnosis related groups (DRGs) based prospective payment and whether trends differ regarding specific comorbidities. Nationwide administrative data (DRG statistics) from German acute care hospitals from 2005 to 2012. Observational study to analyze trends in comorbidity coding in patients hospitalized for common primary diseases and the effects on comorbidity-related risk of in-hospital death. Comorbidity coding was operationalized by Elixhauser diagnosis groups. The analyses focused on adult patients hospitalized for the primary diseases of heart failure, stroke, and pneumonia, as well as hip fracture. When focusing on the total frequency of diagnosis groups per record, an increase in depth of coding was observed. Between-hospital variations in depth of coding were present throughout the observation period. Specific comorbidity increases were observed in 15 of the 31 diagnosis groups, and decreases in comorbidity were observed for 11 groups. In patients hospitalized for heart failure, shifts of comorbidity-related risk of in-hospital death occurred in nine diagnosis groups, of which eight were directed toward the null. Comorbidity-adjusted outcomes in longitudinal administrative data analyses may be biased by nonconstant risk over time, changes in completeness of coding, and between-hospital variations in coding. Accounting for such issues is important when the respective observation period coincides with changes in the reimbursement system or other conditions that are likely to alter clinical coding practice.

  3. Desirability of Outcome Ranking (DOOR) and Response Adjusted for Duration of Antibiotic Risk (RADAR).

    Science.gov (United States)

    Evans, Scott R; Rubin, Daniel; Follmann, Dean; Pennello, Gene; Huskins, W Charles; Powers, John H; Schoenfeld, David; Chuang-Stein, Christy; Cosgrove, Sara E; Fowler, Vance G; Lautenbach, Ebbing; Chambers, Henry F

    2015-09-01

    Clinical trials that compare strategies to optimize antibiotic use are of critical importance but are limited by competing risks that distort outcome interpretation, complexities of noninferiority trials, large sample sizes, and inadequate evaluation of benefits and harms at the patient level. The Antibacterial Resistance Leadership Group strives to overcome these challenges through innovative trial design. Response adjusted for duration of antibiotic risk (RADAR) is a novel methodology utilizing a superiority design and a 2-step process: (1) categorizing patients into an overall clinical outcome (based on benefits and harms), and (2) ranking patients with respect to a desirability of outcome ranking (DOOR). DOORs are constructed by assigning higher ranks to patients with (1) better overall clinical outcomes and (2) shorter durations of antibiotic use for similar overall clinical outcomes. DOOR distributions are compared between antibiotic use strategies. The probability that a randomly selected patient will have a better DOOR if assigned to the new strategy is estimated. DOOR/RADAR represents a new paradigm in assessing the risks and benefits of new strategies to optimize antibiotic use.
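
    The DOOR comparison can be sketched as a pairwise win probability over (outcome, duration) rankings. The patient tuples below are hypothetical, and the tuple encoding is a simplification of the published methodology:

```python
def door_probability(new, comparator):
    """Estimate P(a random patient on the new strategy has a better
    desirability-of-outcome ranking than one on the comparator);
    ties count as 1/2. Larger DOOR tuples are better."""
    wins = 0.0
    for a in new:
        for b in comparator:
            wins += 1.0 if a > b else (0.5 if a == b else 0.0)
    return wins / (len(new) * len(comparator))

# DOOR encoded as (clinical outcome level, -days of antibiotics):
# better overall outcome dominates; shorter antibiotic duration breaks ties.
new_strategy = [(3, -5), (2, -4)]
comparator = [(3, -8), (1, -10)]
p_better = door_probability(new_strategy, comparator)
```

A probability above 0.5 favors the new strategy; exactly 0.5 means the two DOOR distributions are indistinguishable.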

  4. Profile of congenital heart disease and correlation to risk adjustment for surgery; an echocardiographic study

    International Nuclear Information System (INIS)

    Akhtar, K.; Ahmed, W.

    2008-01-01

    To determine the pattern and profile of Congenital Heart Diseases (CHD) in paediatric patients (age 1 day to 18 years) presenting to a paediatric tertiary referral centre and its correlation to risk adjustment for surgery for congenital heart disease. Over a period of 6 months, 1149 cases underwent 2-D echocardiography. It was a non-probability purposive sampling. This study showed 25% of all referrals had normal hearts. A male preponderance (38%) was observed from 1 year to 5 years age group. Nineteen percent of the cases were categorized as cyanotic CHD with the remaining as acyanotic variety. Tetralogy of Fallot (TOF) represented 10%, Ventricular Septal Defects (VSD) 24%, followed by Patent Ductus Arteriosus (PDA) and Atrial Septal Defect (ASD), which comprised 6.6% and 6.5% respectively. VSD was the most common association in patients with more complex CHD (10%) followed by PDA in 3% and ASD in 1.2% of the cases. Most of the cases were category 2 in the RACHS-1 scoring system. VSD and TOF formed the major groups of cases profiled. Most of the cases recommended for surgery for congenital heart disease belonged to the risk category 2 (28.1%) followed by the risk category 1 (12.7%) of the RACHS-1 scoring system. (author)

  5. Risk adjusted surgical audit in gynaecological oncology: P-POSSUM does not predict outcome.

    Science.gov (United States)

    Das, N; Talaat, A S; Naik, R; Lopes, A D; Godfrey, K A; Hatem, M H; Edmondson, R J

    2006-12-01

    To assess the Physiological and Operative Severity Score for the enumeration of mortality and morbidity (POSSUM) and its validity for use in gynaecological oncology surgery. All patients undergoing gynaecological oncology surgery at the Northern Gynaecological Oncology Centre (NGOC), Gateshead, UK over a period of 12 months (2002-2003) were assessed prospectively. Mortality and morbidity predictions using the Portsmouth modification of the POSSUM algorithm (P-POSSUM) were compared to the actual outcomes. Performance of the model was also evaluated using the Hosmer and Lemeshow chi-square statistic (testing the goodness of fit). During this period 468 patients were assessed. The P-POSSUM over-predicted mortality for our patients: it predicted a 7% mortality rate compared to an observed rate of 2% (35 predicted deaths in comparison to 10 observed deaths), a difference that was statistically significant (H&L chi(2)=542.9, d.f. 8, p<0.001). P-POSSUM did not, therefore, accurately predict the risk of mortality for gynaecological oncology patients undergoing surgery. The P-POSSUM algorithm will require further adjustments prior to adoption for gynaecological cancer surgery as a risk adjusted surgical audit tool.
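
    The Hosmer-Lemeshow statistic used above compares observed with model-expected deaths (and survivors) within risk groups. A minimal sketch with made-up groups, assuming the groups have already been formed by predicted risk:

```python
def hosmer_lemeshow(observed, expected, group_sizes):
    """Hosmer-Lemeshow goodness-of-fit statistic over pre-formed risk groups:
    sums (O-E)^2/E for deaths and survivors in each group. Large values
    indicate poor calibration."""
    chi2 = 0.0
    for o, e, n in zip(observed, expected, group_sizes):
        chi2 += (o - e) ** 2 / e + ((n - o) - (n - e)) ** 2 / (n - e)
    return chi2

perfect = hosmer_lemeshow([2, 5], [2.0, 5.0], [100, 100])  # well calibrated
overpredicted = hosmer_lemeshow([2], [7.0], [100])         # model predicts too many deaths
```

Systematic over-prediction in every group, as with P-POSSUM here, inflates the statistic far beyond its chi-square reference distribution.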

  6. Using the Nudge and Shove Methods to Adjust Item Difficulty Values.

    Science.gov (United States)

    Royal, Kenneth D

    2015-01-01

    In any examination, it is important that a sufficient mix of items with varying degrees of difficulty be present to produce desirable psychometric properties and increase instructors' ability to make appropriate and accurate inferences about what a student knows and/or can do. The purpose of this "teaching tip" is to demonstrate how examination items can be affected by the quality of distractors, and to present a simple method for adjusting items to meet difficulty specifications.

  7. CALCULATION METHODS OF OPTIMAL ADJUSTMENT OF CONTROL SYSTEM THROUGH DISTURBANCE CHANNEL

    Directory of Open Access Journals (Sweden)

    I. M. Golinko

    2014-01-01

    When commissioning automatic control systems, much attention is paid to formulas for the optimal dynamic tuning of controllers that take the dynamics of the controlled object into account. In most cases, the known formulas are oriented toward designing the control system through the setpoint ("input-output") channel. In practically all continuous processes, however, the main task of the controllers is stabilization of the output parameters. Methods were developed for calculating controller tuning parameters that minimize the effect of disturbances; it is suggested to use the detuning factor and the maximum value of the disturbance response. Since optimization of a control system with proportional plus reset controllers on the disturbance channel is a unimodal task, the optimization algorithm is realized by the Hooke-Jeeves method. For controller optimization through the external disturbance channel, functional dependences were obtained for calculating the dynamic tuning parameters of proportional plus reset controllers from the dynamic characteristics of the controlled object. The obtained dependences improve the performance of controllers on the external disturbance channel and hence the quality of the transient processes. The calculation formulas provide high accuracy and are convenient to use. The suggested method requires no nomographs, which removes subjectivity in determining the dynamic tuning parameters of proportional plus reset controllers. The functional dependences can be used to calculate the settings of PR controllers over a wide range of controlled-object dynamics.
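
    The Hooke-Jeeves method mentioned above is a derivative-free pattern search. The sketch below is a generic textbook version on a toy quadratic objective, not the paper's controller-tuning objective:

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Derivative-free pattern search: exploratory coordinate moves around the
    current point, followed by an extrapolating pattern move; the step size
    shrinks whenever no improving move is found."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        # exploratory moves: probe +/- step along each coordinate
        base, fbase = list(x), fx
        for i in range(len(base)):
            for d in (step, -step):
                trial = list(base)
                trial[i] += d
                ft = f(trial)
                if ft < fbase:
                    base, fbase = trial, ft
                    break
        if fbase < fx:
            # pattern move: extrapolate along the improving direction
            pattern = [2.0 * b - xi for b, xi in zip(base, x)]
            x, fx = base, fbase
            fp = f(pattern)
            if fp < fx:
                x, fx = pattern, fp
        else:
            step *= shrink
            if step < tol:
                break
    return x, fx

# toy objective with a unique minimum at (1, -2)
xmin, fmin = hooke_jeeves(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [0.0, 0.0])
```

Because it needs only function evaluations, the method suits tuning objectives that are computed by simulating the closed-loop disturbance response, provided the objective is unimodal as the abstract states.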

  8. Set up of a method for the adjustment of resonance parameters on integral experiments

    International Nuclear Information System (INIS)

    Blaise, P.

    1996-01-01

    Resonance parameters for actinides play a significant role in the neutronic characteristics of all reactor types. All the major integral parameters strongly depend on the nuclear data of the isotopes in the resonance-energy regions. The author sets up a method for the adjustment of resonance parameters taking into account the self-shielding effects and restricting the cross section deconvolution problem to a limited energy region. (N.T.)

  9. A Proportional Hazards Regression Model for the Subdistribution with Covariates-adjusted Censoring Weight for Competing Risks Data

    DEFF Research Database (Denmark)

    He, Peng; Eriksson, Frank; Scheike, Thomas H.

    2016-01-01

    With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk under the assumption that the censoring distribution and the covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight function obtained by fitting the Cox model for the censoring distribution and using the predictive probability for each individual. Our simulation study shows that the covariate-adjusted weight estimator is basically unbiased when the censoring time depends on the covariates.

  10. [Do laymen understand information about hospital quality? An empirical verification using risk-adjusted mortality rates as an example].

    Science.gov (United States)

    Sander, Uwe; Kolb, Benjamin; Taheri, Fatemeh; Patzelt, Christiane; Emmert, Martin

    2017-11-01

    The effect of public reporting to improve quality in healthcare is reduced by the limited intelligibility of information about the quality of healthcare providers. This may result in worse health-related choices especially for older people and those with lower levels of education. There is, as yet, little information as to whether laymen understand the concepts behind quality comparisons and if this comprehension is correlated with hospital choices. An instrument with 20 items was developed to analyze the intelligibility of five technical terms which were used in German hospital report cards to explain risk-adjusted death rates. Two online presentations of risk-adjusted death rates for five hospitals in the style of hospital report cards were developed. An online survey of 353 volunteers tested the comprehension of the risk-adjusted mortality rates and included an experimental hospital choice. The intelligibility of five technical terms was tested: risk-adjusted, actual and expected death rate, reference range and national average. The percentages of correct answers for the five technical terms were in the range of 75.0-60.2%. Between 23.8% and 5.1% of the respondents were not able to answer the question about the technical term itself. The least comprehensible technical terms were "risk-adjusted death rate" and "reference range". The intelligibility of the 20 items that were used to test the comprehension of the risk-adjusted mortality was between 89.5% and 14.2%. The two items that proved to be least comprehensible were related to the technical terms "risk-adjusted death rate" and "reference range". For all five technical terms it was found that a better comprehension correlated significantly with better hospital choices. We found a better than average intelligibility for the technical terms "actual and expected death rate" and for "national average". The least understandable were "risk-adjusted death rate" and "reference range". Since the self

  11. Risk-adjusted capitation funding models for chronic disease in Australia: alternatives to casemix funding.

    Science.gov (United States)

    Antioch, K M; Walsh, M K

    2002-01-01

    Under Australian casemix funding arrangements that use Diagnosis-Related Groups (DRGs) the average price is policy based, not benchmarked. Cost weights are too low for State-wide chronic disease services. Risk-adjusted Capitation Funding Models (RACFM) are feasible alternatives. A RACFM was developed for public patients with cystic fibrosis treated by an Australian Health Maintenance Organization (AHMO). Adverse selection is of limited concern since patients pay solidarity contributions via Medicare levy with no premium contributions to the AHMO. Sponsors paying premium subsidies are the State of Victoria and the Federal Government. Cost per patient is the dependent variable in the multiple regression. Data on DRG 173 (cystic fibrosis) patients were assessed for heteroskedasticity, multicollinearity, structural stability and functional form. Stepwise linear regression excluded non-significant variables. Significant variables were 'emergency' (1276.9), 'outlier' (6377.1), 'complexity' (3043.5), 'procedures' (317.4) and the constant (4492.7) (R(2)=0.21, SE=3598.3, F=14.39, p<0.001); the capitation payment comprises these cost weights plus a fixed payment (the constant). The model explained 21% of the variance in cost per patient. The payment rate is adjusted by a best practice annual admission rate per patient. The model is a blended RACFM for in-patient, out-patient, Hospital In The Home, Fee-For-Service Federal payments for drugs and medical services; lump sum lung transplant payments and risk sharing through cost (loss) outlier payments. State and Federally funded home and palliative services are 'carved out'. The model, which has national application via Coordinated Care Trials and by Australian States for RACFMs, may be instructive for Germany, which plans to use Australian DRGs for casemix funding. The capitation alternative for chronic disease can improve equity, allocative efficiency and distributional justice. The use of Diagnostic Cost Groups (DCGs) is a promising alternative classification system for capitation arrangements.
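
    Using the regression coefficients reported in the abstract, a per-patient payment prediction is simple arithmetic. This is a sketch: the binary/count coding of the variables is assumed, and the admission-rate adjustment and blended payment components are simplified away.

```python
# Cost weights as reported in the abstract; the constant is the fixed payment.
COEFFS = {"emergency": 1276.9, "outlier": 6377.1, "complexity": 3043.5, "procedures": 317.4}
CONSTANT = 4492.7

def predicted_cost(emergency=0, outlier=0, complexity=0, procedures=0):
    """Per-patient predicted cost from the stepwise regression coefficients."""
    return (CONSTANT
            + COEFFS["emergency"] * emergency
            + COEFFS["outlier"] * outlier
            + COEFFS["complexity"] * complexity
            + COEFFS["procedures"] * procedures)

baseline = predicted_cost()                 # fixed payment only
two_procedures = predicted_cost(procedures=2)
```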

  12. A Cross-Section Adjustment Method for Double Heterogeneity Problem in VHTGR Analysis

    International Nuclear Information System (INIS)

    Yun, Sung Hwan; Cho, Nam Zin

    2011-01-01

    Very High Temperature Gas-Cooled Reactors (VHTGRs) draw strong interest as candidates for a Gen-IV reactor concept, in which TRISO (tristructural-isotropic) fuel is employed to enhance fuel performance. However, the TRISO fuel particles randomly dispersed in a graphite matrix induce the so-called double heterogeneity problem. For design and analysis of reactors with the double heterogeneity problem, the Monte Carlo method is widely used for its complex-geometry and continuous-energy capabilities. However, its huge computational burden, even with modern computing power, still makes whole-core analysis impractical in the reactor design procedure. To address the double heterogeneity problem with conventional lattice codes, the RPT (Reactivity-equivalent Physical Transformation) method considers a homogenized fuel region that is geometrically transformed to provide an equivalent self-shielding effect. Another method is the coupled Monte Carlo/Collision Probability method, in which the absorption and nu-fission resonance cross-section libraries in the deterministic CPM3 lattice code are modified group-wise by double heterogeneity factors determined from Monte Carlo results. In this paper, a new two-step Monte Carlo homogenization method is described as an alternative to the methods above. In the new method, a single cross-section adjustment factor is introduced to provide a self-shielding effect equivalent to that in the heterogeneous geometry for a unit cell of fuel compact. Then, the homogenized fuel compact material with the equivalent cross-section adjustment factor is used in continuous-energy Monte Carlo calculations for various types of fuel blocks (or assemblies). The procedure of cross-section adjustment is implemented in the MCNP5 code.
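    The core idea of a single cross-section adjustment factor can be illustrated with a toy one-group model: scale the homogenized absorption cross-section by a factor f until a simple infinite-medium multiplication factor reproduces the reference (heterogeneous) value. The one-group numbers below are invented; the paper's method works on continuous-energy Monte Carlo tallies, not on this toy model.

```python
# Toy illustration of solving for a single cross-section adjustment factor.
# k = nu*Sigma_f / (f*Sigma_a) is a crude infinite-medium multiplication
# factor; f is tuned so k matches a reference value k_ref.

def k_inf(nu_sigma_f, sigma_a, f):
    return nu_sigma_f / (f * sigma_a)

def find_adjustment(nu_sigma_f, sigma_a, k_ref, lo=0.5, hi=2.0, tol=1e-10):
    """Bisection for the factor f with k_inf(f) == k_ref (k decreases in f)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if k_inf(nu_sigma_f, sigma_a, mid) > k_ref:
            lo = mid  # k too high -> need more absorption -> larger f
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

f = find_adjustment(nu_sigma_f=0.0055, sigma_a=0.0050, k_ref=1.15)
print(f)  # analytically (0.0055/0.0050)/1.15, about 0.9565
```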

  13. Premorbid adjustment in individuals at ultra-high risk for developing psychosis

    DEFF Research Database (Denmark)

    Dannevang, Anders; Randers, Lasse; Gondan, Matthias

    2017-01-01

    between childhood and early adolescence. The UHR individuals had more premorbid adjustment difficulties on both the social and academic scale, and on the individual PAS scales. Conclusion: From childhood UHR individuals have lower levels of social and academic premorbid adjustment compared to healthy...... and academic scales were computed. Results: Compared to the healthy controls the UHR individuals’ social and academic premorbid adjustment declined across age periods. Social premorbid adjustment declined particularly between late adolescence and adulthood. Academic premorbid adjustment declined particularly...

  14. Risk and Protective Factors at Age 16: Psychological Adjustment in Children With a Cleft Lip and/or Palate.

    Science.gov (United States)

    Feragen, Kristin Billaud; Stock, Nicola Marie; Kvalem, Ingela Lundin

    2015-09-01

    Explore psychological functioning in adolescents with a cleft at age 16 from a broad perspective, including cognitive, emotional, behavioral, appearance-related, and psychosocial adjustment. High-risk groups were identified within each area of adjustment to investigate whether vulnerable adolescents were found across domains or whether risk was limited to specific areas of adjustment. Cross-sectional data based on psychological assessments at age 16 (N = 857). The effects of gender, cleft visibility, and the presence of an additional condition were investigated for all outcome variables. Results were compared with large national samples. Hopkins Symptom Checklist, Harter Self-Perception Scale for Adolescents, Child Experience Questionnaire, and Satisfaction With Appearance scale. The main factor influencing psychological adjustment across domains was gender, with girls in general reporting more psychological problems, as also seen in the reference groups. The presence of an additional condition also negatively affected some of the measures. No support was found for cleft visibility as a risk factor, except for dissatisfaction with appearance. Correlation analyses of risk groups point to an association between social and emotional risk, and between social risk and dissatisfaction with appearance. Associations between other domains were weak. The results point to areas of both risk and strength in adolescents born with a cleft lip and/or palate. Future research should investigate how protective factors could counteract potential risk in adolescents with a cleft.

  15. Adjustment of lifetime risks of space radiation-induced cancer by the healthy worker effect and cancer misclassification

    Directory of Open Access Journals (Sweden)

    Leif E. Peterson

    2015-12-01

    Conclusions. The typical life table approach for projecting lifetime risk of radiation-induced cancer mortality and incidence for astronauts and radiation workers can be improved by adjusting for HWE while simulating the uncertainty of input rates, input excess risk coefficients, and bias correction factors during multiple Monte Carlo realizations of the life table.

  16. Population-Adjusted Street Connectivity, Urbanicity and Risk of Obesity in the U.S

    Science.gov (United States)

    Wang, Fahui; Wen, Ming; Xu, Yanqing

    2013-01-01

    Street connectivity, defined as the number of (3-way or more) intersections per unit area, is an important index of the built environment and a proxy for neighborhood walkability. This paper examines its geographic variations across the rural-urban continuum (urbanicity), major racial-ethnic groups and various poverty levels. The population-adjusted street connectivity index is proposed as a better measure than the regular index for a large area such as a county, because population is likely concentrated in a limited part of such an area. Based on data from the Behavioral Risk Factor Surveillance System (BRFSS), this paper uses multilevel modeling to analyze the index's association with physical activity and obesity while controlling for various individual- and county-level variables. Analysis of data subsets indicates that the influences of individual and county-level variables on obesity risk vary across areas of different urbanization levels. The positive influence of street connectivity on obesity control is limited to the more, but not the most, urbanized areas. This demonstrates the value of obesogenic-environment research in different geographic settings, helps reconcile and synthesize some seemingly contradictory results reported in different studies, and suggests that effective policies must be sensitive to the diversity of demographic groups and adaptable across geographic settings. PMID:23667278
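    The difference between the regular and the population-adjusted index can be sketched with two hypothetical census tracts: weighting each tract's connectivity by its population share keeps empty land from diluting the county-level measure. The tract data below are invented for illustration.

```python
# Regular vs population-adjusted street connectivity for a hypothetical
# county made of a dense urban core plus sparsely settled land.

tracts = [
    # (3+-way intersections, area km^2, population)
    (900, 10.0, 50_000),   # dense urban core
    (100, 490.0, 5_000),   # sparsely settled remainder
]

def regular_connectivity(tracts):
    """Total intersections divided by total area."""
    return sum(n for n, _, _ in tracts) / sum(a for _, a, _ in tracts)

def population_adjusted_connectivity(tracts):
    """Tract connectivity weighted by population share."""
    total_pop = sum(p for _, _, p in tracts)
    return sum((p / total_pop) * (n / a) for n, a, p in tracts)

print(regular_connectivity(tracts))              # 1000/500 = 2.0 per km^2
print(population_adjusted_connectivity(tracts))  # ~82 per km^2
```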

  17. A STUDY ON THE RISK-ADJUSTED PERFORMANCE OF MUTUAL FUNDS INDUSTRY IN INDIA

    Directory of Open Access Journals (Sweden)

    Shivangi Agarwal

    2017-04-01

    Full Text Available Investing through mutual funds has gained interest in recent years as it offers optimal risk-adjusted returns to investors. The Indian market is no exception and has witnessed multifold growth in mutual funds over the years. As of 2016, the Indian market is crowded with over two thousand mutual fund schemes, each promising higher returns than its peers. This makes it a challenge for an ordinary investor to select the best portfolio to invest in, and critical to analyse the performance of these funds. While the historical performance of a mutual fund does not guarantee its future performance, it may give an idea of how the fund is likely to perform in different market conditions. In this research we address multiple research issues. These include measuring the performance of selected mutual fund schemes on the basis of risk and return, and comparing the performance of these selected schemes with a benchmark index to see whether each scheme outperforms or underperforms the benchmark. We also rank funds on the basis of performance and suggest strategies for investing in mutual funds; our findings therefore have significant relevance for the investing public.
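    The kind of risk-adjusted ranking the study performs can be illustrated with the Sharpe ratio, (mean return minus risk-free rate) divided by the standard deviation of returns. The abstract does not name its exact measures, so Sharpe is used here as the standard choice; the fund returns and risk-free rate below are invented.

```python
# Ranking hypothetical funds by Sharpe ratio (risk-adjusted return).
import statistics

RF = 0.06  # annual risk-free rate (assumed)

funds = {
    "Fund A": [0.12, 0.18, 0.05, 0.22, 0.10],      # higher, volatile returns
    "Fund B": [0.15, 0.16, 0.14, 0.17, 0.15],      # steady returns
    "Benchmark": [0.11, 0.13, 0.09, 0.14, 0.12],
}

def sharpe(returns):
    """Excess return per unit of volatility."""
    return (statistics.mean(returns) - RF) / statistics.stdev(returns)

# Rank funds best-first; a fund outperforms if it beats the benchmark.
for name, r in sorted(funds.items(), key=lambda kv: -sharpe(kv[1])):
    print(f"{name:10s} Sharpe = {sharpe(r):.2f}")
```

Note how Fund B, despite a lower mean return than Fund A, ranks higher once volatility is accounted for.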

  18. Risk-Adjusted Analysis of Relevant Outcome Drivers for Patients after More Than Two Kidney Transplants

    Directory of Open Access Journals (Sweden)

    Lampros Kousoulas

    2015-01-01

    Full Text Available Renal transplantation is the treatment of choice for patients suffering from end-stage renal disease, but as long-term renal allograft survival is limited, most transplant recipients will face graft loss and be considered for retransplantation. The goal of this study was to evaluate patient and graft survival of the 61 renal transplant recipients who underwent a second or subsequent renal transplantation in our institution between 1990 and 2010, and to identify risk factors related to inferior outcomes. Actuarial patient survival was 98.3%, 94.8%, and 88.2% after one, three, and five years, respectively. Actuarial graft survival was 86.8%, 80%, and 78.1% after one, three, and five years, respectively. Risk-adjusted analysis revealed that only age at the time of last transplantation had a significant influence on patient survival, whereas graft survival was influenced by multiple immunological and surgical factors, such as the number of HLA mismatches, the type of immunosuppression, the number of surgical complications, need of reoperation, primary graft nonfunction, and acute rejection episodes. In conclusion, third and subsequent renal transplantations constitute a valid therapeutic option, but inferior outcomes should be expected among elderly patients, hyperimmunized recipients, and recipients with multiple operations at the site of the last renal transplantation.

  19. Adjusting the Parameters of Metal Oxide Gapless Surge Arresters’ Equivalent Circuits Using the Harmony Search Method

    Directory of Open Access Journals (Sweden)

    Christos A. Christodoulou

    2017-12-01

    Full Text Available The appropriate circuit modeling of metal oxide gapless surge arresters is critical for insulation coordination studies. Metal oxide arresters present a dynamic behavior for fast-front surges; namely, their residual voltage depends on the peak value as well as the duration of the injected impulse current, and they should therefore not be represented by non-linear elements alone. The aim of the current work is to adjust the parameters of the most frequently used surge arrester circuit models by considering the magnitude of the residual voltage as well as the dissipated energy for given pulses. To this end, the harmony search method is implemented to adjust the parameter values of the arrester equivalent circuit models, minimizing a defined objective function that compares the simulation outcomes with the manufacturer's data and with the results obtained from previous methodologies.
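    A minimal harmony search in the spirit described above can be sketched as follows: keep a memory of candidate parameter vectors, build new candidates by mixing memory values with random ones, add pitch-adjustment noise, and replace the worst member when a candidate improves on it. The objective here is a stand-in (distance to a known target vector), not the residual-voltage/energy objective used for the arrester models, and the hmcr/par/bw settings are typical defaults, not values from the paper.

```python
# Minimal harmony search minimizing a stand-in objective over 3 parameters.
import random

random.seed(42)
TARGET = [0.8, 12.0, 150.0]                       # "manufacturer data" stand-in
BOUNDS = [(0.1, 2.0), (1.0, 50.0), (50.0, 500.0)]

def objective(x):
    return sum((a - b) ** 2 for a, b in zip(x, TARGET))

def harmony_search(iters=5000, hms=10, hmcr=0.9, par=0.3, bw=0.05):
    memory = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(BOUNDS):
            if random.random() < hmcr:            # take from harmony memory
                v = random.choice(memory)[d]
                if random.random() < par:         # pitch adjustment
                    v += random.uniform(-bw, bw) * (hi - lo)
            else:                                 # random consideration
                v = random.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        worst = max(range(hms), key=lambda i: objective(memory[i]))
        if objective(new) < objective(memory[worst]):
            memory[worst] = new
    return min(memory, key=objective)

best = harmony_search()
print([round(v, 2) for v in best])  # close to TARGET
```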

  20. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    Science.gov (United States)

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become
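    The two diagnostics discussed above can be sketched on simulated data: the c-statistic (the probability that a randomly chosen event receives a higher predicted risk than a randomly chosen non-event) and a decile calibration check comparing predicted with observed events. The data below are simulated; no NSQIP specifics are implied.

```python
# c-statistic and decile calibration on simulated predicted risks.
import random

random.seed(1)
patients = []
for _ in range(2000):
    p = random.betavariate(1, 20)           # predicted mortality risk
    y = 1 if random.random() < p else 0     # outcome drawn from that risk
    patients.append((p, y))

def c_statistic(data):
    """Fraction of event/non-event pairs ranked correctly (ties count 1/2)."""
    events = [p for p, y in data if y == 1]
    nonevents = [p for p, y in data if y == 0]
    wins = sum((e > n) + 0.5 * (e == n) for e in events for n in nonevents)
    return wins / (len(events) * len(nonevents))

def decile_calibration(data):
    """(expected events, observed events) for each risk decile."""
    data = sorted(data)
    k = len(data) // 10
    rows = []
    for d in range(10):
        chunk = data[d * k:(d + 1) * k]
        rows.append((sum(p for p, _ in chunk), sum(y for _, y in chunk)))
    return rows

print(round(c_statistic(patients), 3))
for expected, observed in decile_calibration(patients):
    print(round(expected, 1), observed)
```

A well-calibrated model shows expected and observed counts tracking each other across deciles even when case-mix restriction has pushed the c-statistic down.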

  1. Risk assessment theory, methods, and applications

    CERN Document Server

    Rausand, Marvin

    2011-01-01

    With its balanced coverage of theory and applications along with standards and regulations, Risk Assessment: Theory, Methods, and Applications serves as a comprehensive introduction to the topic. The book serves as a practical guide to current risk analysis and risk assessment, emphasizing the possibility of sudden, major accidents across various areas of practice from machinery and manufacturing processes to nuclear power plants and transportation systems. The author applies a uniform framework to the discussion of each method, setting forth clear objectives and descriptions, while also shedding light on applications, essential resources, and advantages and disadvantages. Following an introduction that provides an overview of risk assessment, the book is organized into two sections that outline key theory, methods, and applications. * Introduction to Risk Assessment defines key concepts and details the steps of a thorough risk assessment along with the necessary quantitative risk measures. Chapters outline...

  2. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  3. Quality measurement in the shunt treatment of hydrocephalus: analysis and risk adjustment of the Revision Quotient.

    Science.gov (United States)

    Piatt, Joseph H; Freibott, Christina E

    2014-07-01

    OBJECT: The Revision Quotient (RQ) has been defined as the ratio of the number of CSF shunt revisions to the number of new shunt insertions for a particular neurosurgical practice in a unit of time. The RQ has been proposed as a quality measure in the treatment of childhood hydrocephalus. The authors examined the construct validity of the RQ and explored the feasibility of risk stratification under this metric. The Kids' Inpatient Database for 1997, 2000, 2003, 2006, and 2009 was queried for admissions with diagnostic codes for hydrocephalus and procedural codes for CSF shunt insertion or revision. Revision quotients were calculated for hospitals that performed 12 or more shunt insertions annually. The univariate associations of hospital RQs with a variety of institutional descriptors were analyzed, and a generalized linear model of the RQ was constructed. There were 12,244 admissions (34%) during which new shunts were inserted, and there were 23,349 admissions (66%) for shunt revision. Three hundred thirty-four annual RQs were calculated for 152 different hospitals. Analysis of variance in hospital RQs over the 5 years of study data supports the construct validity of the metric. The following factors were incorporated into a generalized linear model that accounted for 41% of the variance of the measured RQs: degree of pediatric specialization, proportion of initial case mix in the infant age group, and proportion with neoplastic hydrocephalus. The RQ has construct validity. Risk adjustment is feasible, but the risk factors that were identified relate predominantly to patterns of patient flow through the health care system. Possible advantages of an alternative metric, the Surgical Activity Ratio, are discussed.
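    The metric itself is a simple ratio, shown here with the low-volume exclusion the authors apply (fewer than 12 insertions per year). The counts in the example are hypothetical.

```python
# The Revision Quotient for one hospital-year.

def revision_quotient(revisions, insertions, min_insertions=12):
    """RQ = shunt revisions / new shunt insertions for one hospital-year."""
    if insertions < min_insertions:
        return None  # too few insertions for a stable annual RQ
    return revisions / insertions

print(revision_quotient(revisions=46, insertions=23))  # 2.0
print(revision_quotient(revisions=5, insertions=8))    # None (excluded)
```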

  4. Public Reporting of Primary Care Clinic Quality: Accounting for Sociodemographic Factors in Risk Adjustment and Performance Comparison.

    Science.gov (United States)

    Wholey, Douglas R; Finch, Michael; Kreiger, Rob; Reeves, David

    2018-01-03

    Performance measurement and public reporting are increasingly being used to compare clinic performance. Intended consequences include quality improvement, value-based payment, and consumer choice. Unintended consequences include reducing access for riskier patients and inappropriately labeling some clinics as poor performers, resulting in tampering with stable care processes. Two analytic steps are used to maximize intended and minimize unintended consequences. First, risk adjustment is used to reduce the impact of factors outside providers' control. Second, performance categorization is used to compare clinic performance using risk-adjusted measures. This paper examines the effects of methodological choices, such as including sociodemographic factors in risk adjustment and accounting for the clustering of patients within clinics in performance categorization, on clinic performance comparison for diabetes care, vascular care, asthma, and colorectal cancer screening. The population includes all patients with commercial and public insurance served by clinics in Minnesota. Although risk adjusting for sociodemographic factors has a significant effect on quality, it does not explain much of the variation in quality. In contrast, taking into account the nesting of patients within clinics in performance categorization has a substantial effect on performance comparison.

  5. Method and apparatus for rapid adjustment of process gas inventory in gaseous diffusion cascades

    International Nuclear Information System (INIS)

    Dyer, R.H.; Fowler, A.H.; Vanstrum, P.R.

    1977-01-01

    The invention relates to an improved method and system for making relatively large and rapid adjustments in the process gas inventory of an electrically powered gaseous diffusion cascade in order to accommodate scheduled changes in the electrical power available for cascade operation. In the preferred form of the invention, the cascade is readied for a decrease in electrical input by simultaneously withdrawing substreams of the cascade B stream into respective process-gas-freezing and storage zones while decreasing the datum-pressure inputs to the positioning systems for the cascade control valves in proportion to the weight of process gas so removed. Consequently, the control valve positions are substantially unchanged by the reduction in inventory, and there is minimal disturbance of the cascade isotopic gradient. The cascade is readied for restoration of the power cut by simultaneously evaporating the solids in the freezing zones to regenerate the process gas substreams and introducing them to the cascade A stream while increasing the aforementioned datum-pressure inputs in proportion to the weight of process gas so returned. In the preferred form of the system for accomplishing these operations, heat exchangers are provided for freezing, storing, and evaporating the various substreams. Preferably, the heat exchangers are connected to use existing cascade auxiliary systems as a heat sink. A common control is employed to adjust and coordinate the necessary process gas transfers and datum-pressure adjustments

  6. Willingness to pay for a quality-adjusted life year: an evaluation of attitudes towards risk and preferences.

    Science.gov (United States)

    Martín-Fernández, Jesus; Polentinos-Castro, Elena; del Cura-González, Ma Isabel; Ariza-Cardiel, Gloria; Abraira, Victor; Gil-LaCruz, Ana Isabel; García-Pérez, Sonia

    2014-07-03

    This paper examines the Willingness to Pay (WTP) for a quality-adjusted life year (QALY) expressed by people who attended the healthcare system, as well as the association of attitude towards risk and other personal characteristics with their response. Health-state preferences, measured by EuroQol (EQ-5D-3L), were combined with WTP for recovering a perfect health state. WTP was assessed using a close-ended, iterative-bidding contingent valuation method. Data on demographic and socioeconomic characteristics, as well as usage of health services by the subjects, were collected. The attitude towards risk was evaluated by collecting risky-behavior data, by the subjects' self-evaluation, and through lottery games. Six hundred and sixty-two subjects participated and 449 reported a utility below 1. WTP/QALY ratios varied significantly when payments with personal money (mean €10,119; median €673) or through taxes (mean €28,187; median €915) were suggested. Family income, area income, higher education level, greater use of healthcare services, and the number of co-inhabitants were associated with greater WTP/QALY ratios. Age and female gender were associated with lower WTP/QALY ratios. Risk inclination was independently associated with a greater WTP/QALY when "out of pocket" payments were suggested. Clear discrepancies were demonstrated between assumptions of linearity and risk neutrality and the experimental results. WTP/QALY ratios vary noticeably based on demographic and socioeconomic characteristics of the subject, but also on their attitude towards risk. Knowing the expression of preferences by patients from this outcome measurement can be of interest for health service planning.
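    The WTP/QALY computation implied above can be sketched in its simplest form: a respondent with EQ-5D utility u < 1 who states a willingness to pay W to recover perfect health for one year values a QALY at roughly W / (1 - u). The exact formula in the study may differ (duration, discounting); this is only the simplest illustrative version with invented numbers.

```python
# Simplest WTP-per-QALY ratio from a stated WTP and an EQ-5D utility.

def wtp_per_qaly(stated_wtp, eq5d_utility):
    """WTP divided by the QALY gain of one year in full health."""
    qaly_gain = 1.0 - eq5d_utility
    if qaly_gain <= 0:
        raise ValueError("respondent already reports full health")
    return stated_wtp / qaly_gain

# A respondent with utility 0.75 willing to pay 500 euros:
print(wtp_per_qaly(stated_wtp=500, eq5d_utility=0.75))  # 2000.0 per QALY
```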

  7. Competing Risk Approach (CRA) for Estimation of Disability Adjusted Life Years (DALY's) for Female Breast Cancer in India.

    Science.gov (United States)

    Kunnavil, Radhika; Thirthahalli, Chethana; Nooyi, Shalini Chandrashekar; Shivaraj, N S; Murthy, Nandagudi Srinivasa

    2015-10-01

    Competing Risk Approach (CRA) has been used to compute burden of disease in terms of Disability Adjusted Life Years (DALYs) based on a life table for an initially disease-free cohort over time. To compute Years of Life Lost (YLL) due to premature mortality, Years of life lost due to Disability (YLD), DALYs and loss in expectation of life (LEL) using competing risk approach for female breast cancer patients for the year 2008 in India. The published data on breast cancer by age & sex, incidence & mortality for the year 2006-2008 relating to six population based cancer registries (PBCR) under Indian Council of Medical Research (ICMR), general mortality rates of 2007 in India, published in national health profile 2010; based on Sample Registration System (SRS) were utilized for computations. Three life tables were constructed by applying attrition of factors: (i) risk of death from all causes ('a'; where a is the general death rate); (ii) risk of incidence and that of death from causes other than breast cancer ('b-a+c'; where 'b' is the incidence of breast cancer and 'c' is the mortality of breast cancer); and (iii) risk of death from all other causes after excluding cancer mortality ('a-c'). Taking the differences in Total Person Years Lived (TPYL), YLD and YLL were derived along with LEL. CRA revealed that the DALYs were 40209 per 100,000 females in the life time of 0-70+ years with a LEL of 0.11 years per person. Percentage of YLL to DALYs was 28.20% in the cohort. The method of calculation of DALYs based on the CRA is simple and this will help to identify the burden of diseases using minimal information in terms of YLL, YLD, DALYs and LEL.
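    A toy version of the competing-risk life-table comparison can be sketched as follows: follow a cohort under different attrition rates and take differences in Total Person Years Lived (TPYL); the difference between the table excluding cancer deaths and the all-causes table gives the YLL component. All rates below are invented constants, whereas the study uses age-specific registry rates over ages 0-70+, and the YLD table is omitted here.

```python
# Toy competing-risk life tables: YLL as a difference in person-years lived.

def total_person_years(attrition_per_year, years=70, cohort=100_000):
    """TPYL for a cohort thinned by a constant annual attrition rate."""
    alive, tpyl = float(cohort), 0.0
    for _ in range(years):
        survivors = alive * (1.0 - attrition_per_year)
        tpyl += 0.5 * (alive + survivors)  # person-years in this interval
        alive = survivors
    return tpyl

a = 0.010    # general death rate (invented)
c = 0.0002   # breast cancer mortality (invented)

tpyl_all = total_person_years(a)        # attrition from all causes
tpyl_no_ca = total_person_years(a - c)  # all causes minus cancer deaths
yll = tpyl_no_ca - tpyl_all             # years lost to cancer deaths
print(round(yll))
```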

  8. Comparison of Methods for Adjusting Incorrect Assignments of Items to Subtests Oblique Multiple Group Method Versus Confirmatory Common Factor Method

    NARCIS (Netherlands)

    Stuive, Ilse; Kiers, Henk A.L.; Timmerman, Marieke E.

    2009-01-01

    A common question in test evaluation is whether an a priori assignment of items to subtests is supported by empirical data. If the analysis results indicate the assignment of items to subtests under study is not supported by data, the assignment is often adjusted. In this study the authors compare

  9. Research on the phase adjustment method for dispersion interferometer on HL-2A tokamak

    Science.gov (United States)

    Tongyu, WU; Wei, ZHANG; Haoxi, WANG; Yan, ZHOU; Zejie, YIN

    2018-06-01

    A synchronous demodulation system is proposed and deployed for the CO2 dispersion interferometer on HL-2A, which aims at high plasma density measurement and real-time feedback control. To ensure that the demodulator and the interferometer signal are synchronous in phase, a phase adjustment (PA) method has been developed for the demodulation system. The method takes advantage of the field programmable gate array's parallel and pipelined processing capabilities to carry out high-performance, low-latency PA. Experimental results show that the PA method is crucial to the synchronous demodulation system and reliably follows the fast change of the electron density. The system can measure the line-integrated density with a high precision of 2.0 × 10^18 m^-2.
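    Why the phase adjustment matters can be shown with a small software model of synchronous demodulation: multiplying the signal by an in-phase reference and averaging recovers the amplitude, while a phase error attenuates the output by cos(phase_error). The frequencies and amplitude below are illustrative, not values from the HL-2A system.

```python
# Software model of synchronous demodulation with a phase error.
import math

N = 1000          # samples (10 full cycles at these rates)
f = 40.0e3        # modulation frequency (assumed)
fs = 4.0e6        # sample rate (assumed)
amp = 1.0

def demodulate(phase_error):
    """Average of signal * reference, scaled to recover the amplitude."""
    acc = 0.0
    for n in range(N):
        t = n / fs
        signal = amp * math.cos(2 * math.pi * f * t)
        ref = math.cos(2 * math.pi * f * t + phase_error)
        acc += signal * ref
    return 2 * acc / N

print(round(demodulate(0.0), 3))          # ~1.0: full amplitude recovered
print(round(demodulate(math.pi / 3), 3))  # ~0.5: cos(60 deg) attenuation
```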

  10. Risk management method for small photovoltaic plants

    Directory of Open Access Journals (Sweden)

    Kirova Milena

    2016-09-01

    Full Text Available Risk management is necessary for achieving the goals of the organization. There are many methods, approaches, and instruments in the literature concerning risk management. However, these are often highly specialized and transferring them to a different field can prove difficult. Therefore, managers often face situations where they have no tools to use for risk management. This is the case with small photovoltaic plants (according to a definition by the Bulgarian State Energy and Water Regulatory Commission small applies to systems with a total installed power of 200 kWp. There are some good practices in the energy field for minimizing risks, but they offer only partial risk prevention and are not sufficient. Therefore a new risk management method needs to be introduced. Small photovoltaic plants offer plenty of advantages in comparison to the other renewable energy sources which makes risk management in their case more important. There is no classification of risks for the exploitation of small photovoltaic systems in the available literature as well as to what degree the damages from those risks could spread. This makes risk analysis and evaluation necessary for obtaining information which could aid taking decisions for improving risk management. The owner of the invested capital takes a decision regarding the degree of acceptable risk for his organization and it must be protected depending on the goals set. Investors in small photovoltaic systems need to decide to what degree the existing risks can influence the goals previously set, the payback of the investment, and what is the acceptable level of damages for the investor. The purpose of this work is to present a risk management method, which currently does not exist in the Bulgaria, so that the risks and the damages that could occur during the exploitation of small photovoltaic plants could be identified and the investment in such technology – justified.

  11. A novel method to adjust efficacy estimates for uptake of other active treatments in long-term clinical trials.

    Directory of Open Access Journals (Sweden)

    John Simes

    2010-01-01

    Full Text Available When rates of uptake of other drugs differ between treatment arms in long-term trials, the true benefit or harm of the treatment may be underestimated. Methods to allow for such contamination have often been limited by failing to preserve the randomization comparisons. In the Fenofibrate Intervention and Event Lowering in Diabetes (FIELD) study, patients were randomized to fenofibrate or placebo, but during the trial many started additional drugs, particularly statins, more so in the placebo group. The effects of fenofibrate estimated by intention-to-treat were likely to have been attenuated. We aimed to quantify this effect and to develop a method for use in other long-term trials. We applied efficacies of statins and other cardiovascular drugs from meta-analyses of randomized trials to adjust the effect of fenofibrate in a penalized Cox model. We assumed that future cardiovascular disease events were reduced by an average of 24% by statins, and 20% by a first other major cardiovascular drug. We applied these estimates to each patient who took these drugs for the period they were on them. We also adjusted the analysis by the rate of discontinuing fenofibrate. Among 4,900 placebo patients, average statin use was 16% over five years. Among 4,895 assigned fenofibrate, statin use was 8% and nonuse of fenofibrate was 10%. In placebo patients, use of cardiovascular drugs was 1% to 3% higher. Before adjustment, fenofibrate was associated with an 11% reduction in coronary events (coronary heart disease death or myocardial infarction) (P = 0.16) and an 11% reduction in cardiovascular disease events (P = 0.04). After adjustment, the effects of fenofibrate on coronary events and cardiovascular disease events were 16% (P = 0.06) and 15% (P = 0.008), respectively. This novel application of a penalized Cox model for adjustment of a trial estimate of treatment efficacy incorporates evidence-based estimates for other therapies, preserves comparisons between the
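    The direction and rough size of the contamination bias can be illustrated with a back-of-envelope calculation: if more placebo than fenofibrate patients take statins (hazard ratio 0.76, i.e. a 24% reduction), the observed hazard ratio understates the true one. The FIELD analysis does this properly inside a penalized Cox model over each patient's exposure periods; this sketch assumes constant rates and a hypothetical true effect.

```python
# Attenuation of an observed hazard ratio by differential statin drop-in.

def diluted_rate(base_rate, statin_uptake, statin_hr=0.76):
    """Average event rate when a fraction of the arm is on statins."""
    return base_rate * ((1 - statin_uptake) + statin_uptake * statin_hr)

true_hr = 0.85                                     # hypothetical true effect
placebo = diluted_rate(1.00, statin_uptake=0.16)   # 16% statin use (abstract)
feno = diluted_rate(true_hr, statin_uptake=0.08)   # 8% statin use (abstract)
observed_hr = feno / placebo
print(round(observed_hr, 3))  # > 0.85: the effect looks weaker than it is
```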

  12. Practical Methods for Information Security Risk Management

    Directory of Open Access Journals (Sweden)

    Cristian AMANCEI

    2011-01-01

    Full Text Available The purpose of this paper is to present some directions for performing risk management for information security. The article covers practical methods: a questionnaire that assesses internal control, and an evaluation based on existing controls as part of vulnerability assessment. The methods presented contain all the key elements involved in risk management: the elements proposed for the evaluation questionnaire, a list of threats, resource classification and evaluation, correlation between risks and controls, and residual risk computation.
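    The residual risk computation mentioned above can be sketched as follows: score each threat's likelihood and impact, reduce the inherent risk by the effectiveness of existing controls, and flag whatever remains above the acceptance threshold. The scales, threats, and threshold below are illustrative assumptions, not values from the paper.

```python
# Residual risk = likelihood * impact * (1 - control effectiveness).

risks = [
    # (threat, likelihood 1-5, impact 1-5, control effectiveness 0-1)
    ("phishing", 4, 4, 0.70),
    ("server room flood", 1, 5, 0.20),
    ("unpatched web server", 3, 5, 0.90),
]

ACCEPTABLE = 4.0  # management's risk acceptance threshold (assumed)

for threat, likelihood, impact, effectiveness in risks:
    inherent = likelihood * impact
    residual = inherent * (1.0 - effectiveness)
    flag = "TREAT" if residual > ACCEPTABLE else "accept"
    print(f"{threat:22s} inherent={inherent:2d} residual={residual:4.1f} {flag}")
```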

  13. A Comparative Study of CAPM and Seven Factors Risk Adjusted Return Model

    Directory of Open Access Journals (Sweden)

    Madiha Riaz Bhatti

    2014-12-01

    Full Text Available This study compares the predictive powers of two asset pricing models, the CAPM and a seven-factor risk-adjusted return model, in explaining the cross-section of stock returns in the financial sector listed at the Karachi Stock Exchange (KSE). To test the models, daily returns from January 2013 to February 2014 were taken and the excess returns of portfolios regressed on the explanatory variables. The results indicate that the models are valid and applicable in the financial market of Pakistan during the period under study, as the intercepts are not significantly different from zero. The findings consequently establish that all the explanatory variables explain stock returns in the financial sector of the KSE. In addition, the results show that adding explanatory variables to the single-factor CAPM yields reasonably high values of R2. These results provide substantial support to fund managers, investors and financial analysts in making investment decisions.
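    The single-factor CAPM test described above can be sketched as an ordinary least squares regression of excess portfolio returns on excess market returns, checking whether the intercept (alpha) differs from zero. The returns below are simulated with a known beta; no KSE data are used.

```python
# OLS estimate of CAPM alpha and beta on simulated daily returns.
import random

random.seed(0)
rf = 0.0003  # daily risk-free rate (assumed)
market = [random.gauss(0.0006, 0.01) for _ in range(250)]
# A portfolio with true beta 1.2 and zero true alpha, plus noise:
portfolio = [rf + 1.2 * (m - rf) + random.gauss(0, 0.004) for m in market]

x = [m - rf for m in market]     # excess market return
y = [p - rf for p in portfolio]  # excess portfolio return
n = len(x)
mx, my = sum(x) / n, sum(y) / n
beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
       sum((xi - mx) ** 2 for xi in x)
alpha = my - beta * mx
print(round(beta, 2), round(alpha, 5))  # beta near 1.2, alpha near 0
```

An intercept not significantly different from zero is what the study reads as model validity.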

  14. Resonant frequency detection and adjustment method for a capacitive transducer with differential transformer bridge

    International Nuclear Information System (INIS)

    Hu, M.; Bai, Y. Z.; Zhou, Z. B.; Li, Z. X.; Luo, J.

    2014-01-01

    The capacitive transducer with a differential transformer bridge is widely used in ultra-sensitive space accelerometers due to its simple structure and high resolution. In this paper, the front-end electronics of an inductive-capacitive resonant bridge transducer is analyzed. The analysis shows that the performance of this transducer depends on the AC pumping frequency operating at the resonance point of the inductive-capacitive bridge. The effect of a possible mismatch between the AC pumping frequency and the actual resonant frequency is discussed, and the theoretical analysis indicates that the output voltage noise of the front-end electronics will deteriorate by a factor of about 3 for either a 5% variation of the AC pumping frequency or a 10% variation of the tuning capacitance. A pre-scanning method to determine the actual resonant frequency is proposed, followed by adjustment of the operating frequency or of the tuning capacitance in order to maintain the expected high resolution. An experiment verifying the mismatch effect and the adjustment method is provided.

  15. Resonant frequency detection and adjustment method for a capacitive transducer with differential transformer bridge

    Energy Technology Data Exchange (ETDEWEB)

    Hu, M.; Bai, Y. Z., E-mail: abai@mail.hust.edu.cn; Zhou, Z. B., E-mail: zhouzb@mail.hust.edu.cn; Li, Z. X.; Luo, J. [MOE Key Laboratory of Fundamental Physical Quantities Measurement, School of Physics, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2014-05-15

    The capacitive transducer with a differential transformer bridge is widely used in ultra-sensitive space accelerometers due to its simple structure and high resolution. In this paper, the front-end electronics of an inductive-capacitive resonant bridge transducer is analyzed. The analysis shows that the performance of this transducer depends on the AC pumping frequency operating at the resonance point of the inductive-capacitive bridge. The effect of a possible mismatch between the AC pumping frequency and the actual resonant frequency is discussed, and the theoretical analysis indicates that the output voltage noise of the front-end electronics will deteriorate by a factor of about 3 for either a 5% variation of the AC pumping frequency or a 10% variation of the tuning capacitance. A pre-scanning method to determine the actual resonant frequency is proposed, followed by adjustment of the operating frequency or of the tuning capacitance in order to maintain the expected high resolution. An experiment verifying the mismatch effect and the adjustment method is provided.
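
    The resonance condition of the inductive-capacitive bridge follows the usual LC formula f0 = 1/(2π√(LC)), which also shows how a tuning-capacitance variation shifts the resonance. The component values below are illustrative only, not the instrument's actual parameters.

    ```python
    import math

    def resonant_frequency(L, C):
        """LC resonance: f0 = 1 / (2*pi*sqrt(L*C))."""
        return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

    # Illustrative values: 10 mH inductance, 100 pF tuning capacitance
    f0 = resonant_frequency(10e-3, 100e-12)
    # A 10% increase in tuning capacitance shifts resonance by sqrt(100/110),
    # i.e. about -4.7%, illustrating the mismatch the paper analyses
    f_shifted = resonant_frequency(10e-3, 110e-12)
    print(f0, f_shifted / f0)
    ```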

  16. Balancing the risks and benefits of drinking water disinfection: disability adjusted life-years on the scale.

    OpenAIRE

    Havelaar, A H; De Hollander, A E; Teunis, P F; Evers, E G; Van Kranen, H J; Versteegh, J F; Van Koten, J E; Slob, W

    2000-01-01

    To evaluate the applicability of disability adjusted life-years (DALYs) as a measure to compare positive and negative health effects of drinking water disinfection, we conducted a case study involving a hypothetical drinking water supply from surface water. This drinking water supply is typical in The Netherlands. We compared the reduction of the risk of infection with Cryptosporidium parvum by ozonation of water to the concomitant increase in risk of renal cell cancer arising from the produc...
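
    The DALY book-keeping used for such comparisons can be sketched as years lived with disability plus years of life lost; the case counts, disability weights and durations below are hypothetical, not the study's values.

    ```python
    def dalys(cases, disability_weight, duration_years,
              deaths=0, life_years_lost=0.0):
        """DALY = YLD + YLL, without the age weighting or time discounting
        that full burden-of-disease calculations may apply."""
        yld = cases * disability_weight * duration_years   # morbidity burden
        yll = deaths * life_years_lost                     # mortality burden
        return yld + yll

    # Hypothetical comparison: infection burden avoided by ozonation versus
    # the cancer burden added by disinfection by-products
    avoided = dalys(cases=1000, disability_weight=0.07, duration_years=0.02)
    added = dalys(cases=1, disability_weight=0.2, duration_years=5.0,
                  deaths=1, life_years_lost=10.0)
    print(avoided, added)
    ```

    Putting both effects on the same DALY scale is exactly what makes the risk-risk trade-off in the abstract comparable.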

  17. Case‐mix adjustment in non‐randomised observational evaluations: the constant risk fallacy

    OpenAIRE

    Nicholl, Jon

    2007-01-01

    Observational studies comparing groups or populations to evaluate services or interventions usually require case‐mix adjustment to account for imbalances between the groups being compared. Simulation studies have, however, shown that case‐mix adjustment can make any bias worse.

  18. Method, system and apparatus for monitoring and adjusting the quality of indoor air

    Science.gov (United States)

    Hartenstein, Steven D.; Tremblay, Paul L.; Fryer, Michael O.; Hohorst, Frederick A.

    2004-03-23

    A system, method and apparatus is provided for monitoring and adjusting the quality of indoor air. A sensor array senses an air sample from the indoor air and analyzes the air sample to obtain signatures representative of contaminants in the air sample. When the level or type of contaminant poses a threat or hazard to the occupants, the present invention takes corrective actions which may include introducing additional fresh air. The corrective actions taken are intended to promote overall health of personnel, prevent personnel from being overexposed to hazardous contaminants and minimize the cost of operating the HVAC system. The identification of the contaminants is performed by comparing the signatures provided by the sensor array with a database of known signatures. Upon identification, the system takes corrective actions based on the level of contaminant present. The present invention is capable of learning the identity of previously unknown contaminants, which increases its ability to identify contaminants in the future. Indoor air quality is assured by monitoring the contaminants not only in the indoor air, but also in the outdoor air and the air which is to be recirculated. The present invention is easily adaptable to new and existing HVAC systems. In sum, the present invention is able to monitor and adjust the quality of indoor air in real time by sensing the level and type of contaminants present in indoor air, outdoor and recirculated air, providing an intelligent decision about the quality of the air, and minimizing the cost of operating an HVAC system.
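
    The signature-matching step described in the patent can be sketched as a nearest-neighbour lookup against a database of known signatures, with distant matches treated as unknown; the contaminant names, response vectors and threshold are all hypothetical.

    ```python
    import math

    # Hypothetical signature database: sensor-array response vectors
    KNOWN = {
        "ammonia": (0.8, 0.1, 0.3),
        "formaldehyde": (0.2, 0.9, 0.4),
        "co2": (0.1, 0.2, 0.9),
    }

    def identify(sample, threshold=0.5):
        """Match a sampled signature to the closest known contaminant;
        distances above the threshold are reported as unknown (a stand-in
        for the patent's learning of new contaminants)."""
        name, dist = min(
            ((k, math.dist(sample, v)) for k, v in KNOWN.items()),
            key=lambda kv: kv[1],
        )
        return name if dist <= threshold else "unknown"

    print(identify((0.75, 0.15, 0.25)))
    ```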

  19. Measuring Risk-adjusted Customer Lifetime Value and its Impact on Relationship Marketing Strategies and Shareholder Value

    OpenAIRE

    Ryals, Lynette; Knox, Simon

    2005-01-01

    The calculations which underlie efforts to balance marketing spending on customer acquisition and customer retention are usually based on either single- period customer profitability or forecasts of customer lifetime value (CLTV). This paper argues instead for risk-adjusted CLTV, which is termed the economic value (EV) of a customer, as the means for marketing to assess both customer profitability and shareholder value gains.
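
    One common way to make CLTV risk-adjusted is to raise the discount rate by a customer-specific risk premium; the sketch below assumes that formulation, which may differ in detail from the authors' economic value (EV) calculation.

    ```python
    def economic_value(margins, base_rate, risk_premium):
        """Risk-adjusted CLTV sketch: discount forecast annual customer
        margins at a rate raised by a customer-specific risk premium."""
        r = base_rate + risk_premium
        return sum(m / (1 + r) ** t for t, m in enumerate(margins, start=1))

    # Same forecast margins, different relationship risk
    risky = economic_value([100, 100, 100], base_rate=0.08, risk_premium=0.06)
    safe = economic_value([100, 100, 100], base_rate=0.08, risk_premium=0.01)
    print(risky, safe)
    ```

    The riskier customer is worth less today for the same forecast cash flows, which is the point of moving from raw CLTV to EV.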

  20. Performance evaluation of inpatient service in Beijing: a horizontal comparison with risk adjustment based on Diagnosis Related Groups.

    Science.gov (United States)

    Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei

    2009-04-30

    The medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, without sufficient regulation. To achieve better progress, an effective tool for evaluating medical performance needs to be established. In view of this, this study attempted to develop such a tool appropriate for the Chinese context. Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: the Charge Efficiency Index (CEI), the Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), to reflect work efficiency and medical service quality respectively. Using these indicators, the inpatient services' performance was horizontally compared among hospitals. The Case-mix Index (CMI) was used to adjust the efficiency indices, producing the adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of the IMLRG differences between hospitals. Using the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, length of stay (LOS) shorter and IMLRG smaller. The performance of Hospitals P and Q was the worst due to their relatively high charge level, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source. To keep such a system running effectively, it is necessary to
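
    The adjusted indices can be sketched under one plausible reading of the abstract: each efficiency index is the hospital's value relative to the regional average, divided by its Case-mix Index so hospitals treating heavier cases are not penalised. The paper's actual definitions operate on DRG-grouped data and may differ.

    ```python
    def efficiency_indices(hosp_avg_charge, hosp_avg_los,
                           ref_avg_charge, ref_avg_los, case_mix_index):
        """Assumed formulation: Charge/Time Efficiency Index relative to the
        regional average, divided by CMI to adjust for case severity.
        Values below 1 would indicate better-than-average efficiency."""
        cei = hosp_avg_charge / ref_avg_charge
        tei = hosp_avg_los / ref_avg_los
        return cei / case_mix_index, tei / case_mix_index

    # Hypothetical hospital: 20% dearer and 12.5% longer stays than the
    # regional average, but with a 25% heavier case mix
    aCEI, aTEI = efficiency_indices(12000, 9.0, 10000, 8.0, 1.25)
    print(aCEI, aTEI)
    ```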

  1. Risk adjustment policy options for casemix funding: international lessons in financing reform.

    Science.gov (United States)

    Antioch, Kathryn M; Ellis, Randall P; Gillett, Steve; Borovnicar, Daniel; Marshall, Ric P

    2007-09-01

    This paper explores modified hospital casemix payment formulae that would refine the diagnosis-related group (DRG) system in Victoria, Australia, which already makes adjustments for teaching, severity and demographics. We estimate alternative casemix funding methods using multiple regressions for individual hospital episodes from 2001 to 2003 on 70 high-deficit DRGs, focussing on teaching hospitals where the largest deficits have occurred. Our casemix variables are diagnosis- and procedure-based severity markers, counts of diagnoses and procedures, disease types, complexity, day outliers, emergency admission and "transfers in." The results are presented for four policy options that vary according to whether all of the dollars or only some are reallocated, whether all or some hospitals are used and whether the alternatives augment or replace existing payments. While our approach identifies variables that help explain patient cost variations, hospital-level simulations suggest that the approaches explored would only reduce teaching hospital underpayment by about 10%. The implications of various policy options are discussed.

  2. [Effect of 2 methods of occlusion adjustment on occlusal balance and muscles of mastication in patient with implant restoration].

    Science.gov (United States)

    Wang, Rong; Xu, Xin

    2015-12-01

    To compare the effect of 2 methods of occlusion adjustment on occlusal balance and muscles of mastication in patients with dental implant restoration. Twenty patients, each with a single posterior edentulous space with no distal dentition, were selected and divided into 2 groups. Patients in group A underwent the original occlusion adjustment method and patients in group B underwent the occlusal plane reduction technique. Ankylos implants were implanted in the edentulous space in each patient and restored with a fixed single-unit crown. Occlusion was adjusted in each restoration accordingly. Electromyograms were conducted to determine the effect of the adjustment methods on occlusion and muscles of mastication 3 months and 6 months after initial restoration and adjustment. Data were collected and measurements for balanced occlusal measuring standards were obtained, including central occlusion force (COF) and the asymmetry index of molar occlusal force (AMOF). Balanced muscles of mastication measuring standards were also obtained, including electromyogram measurements of the muscles of mastication and the anterior bundle of the temporalis muscle at the mandibular rest position, average electromyogram measurements of the anterior bundle of the temporalis muscle at the intercuspal position (ICP), Astot, the masseter muscle asymmetry index, and the anterior temporalis asymmetry index (ASTA). Statistical analysis was performed using Student's t test with the SPSS 18.0 software package. Three months after occlusion adjustment, parameters of the original occlusion adjustment method were significantly different between group A and group B in both the balanced occlusal measuring standards and the balanced muscles of mastication measuring standards. Six months after occlusion adjustment, parameters were significantly different between group A and group B in the balanced muscles of mastication measuring standards, but there was no significant difference in the balanced occlusal measuring standards.

  3. Adjustment for smoking reduces radiation risk: fifth analysis of mortality of nuclear industry workers in Japan, 1999-2010

    Energy Technology Data Exchange (ETDEWEB)

    Kudo, S.; Ishida, J.; Yoshimoto, K.; Mizuno, S.; Ohshima, S.; Kasagi, F., E-mail: s_kudo@rea.or.jp [Institute of Radiation Epidemiology, Radiation Effects Association, 1-9-16 Kajicho, Chiyoda-ku, 101-0044 Tokyo (Japan)]

    2015-10-15

    Full text: Many cohort studies of nuclear industry workers have been carried out to determine the possible health effects of low-level radiation. In those studies, confounding factors such as age were adjusted for, to exclude the effect of differences in mortality by age when estimating radiation risk. However, there are few studies adjusting for smoking, which is known to be a strong factor affecting mortality. The Radiation Effects Association (Rea) initiated a cohort study of nuclear industry worker mortality in 1990. To examine non-radiation factors confounding the mortality risk among the radiation workers, Rea performed life-style questionnaire surveys among part of the workers in 1997 and 2003 and found a correlation between radiation dose and smoking rate. Mortality follow-up was made on 75,442 male respondents for an average of 8.3 years during the observation period 1999-2010. Estimates of excess relative risk percent (Err %) per 10 mSv were obtained using Poisson regression. The Err for all causes was statistically significant (1.05 (90 % CI 0.31 : 1.80)), but no longer significant after adjusting for smoking (0.45 (-0.24 : 1.13)). The Err for all cancers excluding leukemia was not significant (0.92 (-0.30 : 2.16)), but after adjusting for smoking, it decreased (0.36 (-0.79 : 1.50)). Thus smoking can substantially obscure a radiation risk estimate, so adjustment for smoking is important when estimating radiation risk. (Author)

  4. Adjustment for smoking reduces radiation risk: fifth analysis of mortality of nuclear industry workers in Japan, 1999-2010

    International Nuclear Information System (INIS)

    Kudo, S.; Ishida, J.; Yoshimoto, K.; Mizuno, S.; Ohshima, S.; Kasagi, F.

    2015-10-01

    Full text: Many cohort studies of nuclear industry workers have been carried out to determine the possible health effects of low-level radiation. In those studies, confounding factors such as age were adjusted for, to exclude the effect of differences in mortality by age when estimating radiation risk. However, there are few studies adjusting for smoking, which is known to be a strong factor affecting mortality. The Radiation Effects Association (Rea) initiated a cohort study of nuclear industry worker mortality in 1990. To examine non-radiation factors confounding the mortality risk among the radiation workers, Rea performed life-style questionnaire surveys among part of the workers in 1997 and 2003 and found a correlation between radiation dose and smoking rate. Mortality follow-up was made on 75,442 male respondents for an average of 8.3 years during the observation period 1999-2010. Estimates of excess relative risk percent (Err %) per 10 mSv were obtained using Poisson regression. The Err for all causes was statistically significant (1.05 (90 % CI 0.31 : 1.80)), but no longer significant after adjusting for smoking (0.45 (-0.24 : 1.13)). The Err for all cancers excluding leukemia was not significant (0.92 (-0.30 : 2.16)), but after adjusting for smoking, it decreased (0.36 (-0.79 : 1.50)). Thus smoking can substantially obscure a radiation risk estimate, so adjustment for smoking is important when estimating radiation risk. (Author)
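
    The confounding mechanism the authors describe (smoking correlated with dose inflating the apparent radiation risk) can be shown with a toy stratified example; all rates and smoker proportions below are invented.

    ```python
    def rate(smoker_fraction, r_smoker, r_nonsmoker):
        """Crude death rate of a group as a mix of smoker strata."""
        return smoker_fraction * r_smoker + (1 - smoker_fraction) * r_nonsmoker

    # Hypothetical rates per 1000 person-years: smoking raises mortality,
    # radiation (in this toy example) has no effect at all
    r_smoker, r_nonsmoker = 20.0, 10.0
    crude_exposed = rate(0.8, r_smoker, r_nonsmoker)    # more smokers among exposed
    crude_unexposed = rate(0.4, r_smoker, r_nonsmoker)

    crude_err = crude_exposed / crude_unexposed - 1.0   # spurious "radiation" risk
    adjusted_err = r_smoker / r_smoker - 1.0            # within-stratum comparison: 0
    print(crude_err, adjusted_err)
    ```

    The crude comparison shows a sizeable excess relative risk purely from the smoking imbalance, which vanishes once strata are compared like with like, the same direction of change the abstract reports after smoking adjustment.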

  5. Age-adjusted high-sensitivity troponin T cut-off value for risk stratification of pulmonary embolism.

    Science.gov (United States)

    Kaeberich, Anja; Seeber, Valerie; Jiménez, David; Kostrubiec, Maciej; Dellas, Claudia; Hasenfuß, Gerd; Giannitsis, Evangelos; Pruszczyk, Piotr; Konstantinides, Stavros; Lankeit, Mareike

    2015-05-01

    High-sensitivity troponin T (hsTnT) helps in identifying pulmonary embolism patients at low risk of an adverse outcome. In 682 normotensive pulmonary embolism patients we investigated whether an optimised hsTnT cut-off value and adjustment for age improve the identification of patients at elevated risk. Overall, 25 (3.7%) patients had an adverse 30-day outcome. The established hsTnT cut-off value of 14 pg·mL(-1) retained its high prognostic value (OR (95% CI) 16.64 (2.24-123.74); p=0.006) compared with the cut-off value of 33 pg·mL(-1) calculated by receiver operating characteristic analysis (7.14 (2.64-19.26); p<0.001). In elderly patients, an optimised cut-off value of 45 pg·mL(-1), but not the established cut-off value of 14 pg·mL(-1), predicted an adverse outcome. An age-adjusted hsTnT cut-off value (≥14 pg·mL(-1) for patients aged <75 years and ≥45 pg·mL(-1) for older patients) improved the identification of patients at elevated risk (12.4% adverse outcome). Risk assessment of normotensive pulmonary embolism patients was improved by the introduction of an age-adjusted hsTnT cut-off value. A three-step approach helped identify patients at higher risk of an adverse outcome who might benefit from advanced therapy. Copyright ©ERS 2015.
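
    The age-adjusted rule can be sketched as a two-level threshold. The age split at 75 years is an assumption inferred from the abstract's elderly subgroup and should be checked against the published paper before any use.

    ```python
    def hs_tnt_cutoff(age_years):
        """Age-adjusted hsTnT threshold sketch: 14 pg/mL below the assumed
        age split of 75 years, 45 pg/mL at or above it. Assumed rule,
        not a verified clinical algorithm."""
        return 14.0 if age_years < 75 else 45.0

    def elevated(hstnt, age_years):
        """True if the measured hsTnT meets the age-adjusted threshold."""
        return hstnt >= hs_tnt_cutoff(age_years)

    # The same measurement classifies differently across the age split
    print(elevated(20.0, 60), elevated(20.0, 80))
    ```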

  6. Behavioral Risk Factor Surveillance System (BRFSS) Age-Adjusted Prevalence Data (2011 to present)

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2011 to present. BRFSS combined land line and cell phone age-adjusted prevalence data. The BRFSS is a continuous, state-based surveillance system that collects...

  7. Adjustment technique without explicit formation of normal equations /conjugate gradient method/

    Science.gov (United States)

    Saxena, N. K.

    1974-01-01

    For a simultaneous adjustment of a large geodetic triangulation system, a semi-iterative technique is modified and used successfully. In this semi-iterative technique, known as the conjugate gradient (CG) method, the original observation equations are used, and thus the explicit formation of normal equations is avoided, saving 'huge' computer storage space in the case of triangulation systems. This method is suitable even for very poorly conditioned systems, where a solution is obtained only after more iterations. A detailed study of the CG method for its application to large geodetic triangulation systems was done that also considered constraint equations with observation equations. It was programmed and tested on systems as small as two unknowns and three equations up to those as large as 804 unknowns and 1397 equations. When real data (573 unknowns, 965 equations) from a 1858-km-long triangulation system were used, a solution vector accurate to four decimal places was obtained in 2.96 min after 1171 iterations (i.e., 2.0 times the number of unknowns).
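
    The key idea, running conjugate gradients on the least-squares problem using only products with A and its transpose so the normal matrix is never formed, survives today as the CGLS algorithm. A compact NumPy sketch on a toy overdetermined system:

    ```python
    import numpy as np

    def cgls(A, b, iters=200, tol=1e-10):
        """Conjugate gradients for min ||Ax - b|| without explicitly forming
        the normal matrix A^T A: only products with A and A^T are used."""
        x = np.zeros(A.shape[1])
        r = b - A @ x
        s = A.T @ r              # gradient of the normal equations
        p = s.copy()
        gamma = s @ s
        for _ in range(iters):
            q = A @ p
            alpha = gamma / (q @ q)
            x += alpha * p
            r -= alpha * q
            s = A.T @ r
            gamma_new = s @ s
            if gamma_new < tol:
                break
            p = s + (gamma_new / gamma) * p
            gamma = gamma_new
        return x

    # Toy overdetermined system: 3 observation equations, 2 unknowns
    A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
    b = np.array([1.0, 2.0, 2.0])
    print(cgls(A, b))
    ```

    For an n-unknown system CG converges in at most n steps in exact arithmetic, which is why the abstract's 1171 iterations for 573 unknowns (about 2.0n) signals the poor conditioning it mentions.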

  8. Adjustments of the Pesticide Risk Index Used in Environmental Policy in Flanders.

    Science.gov (United States)

    Fevery, Davina; Peeters, Bob; Lenders, Sonia; Spanoghe, Pieter

    2015-01-01

    Indicators are used to quantify the pressure of pesticides on the environment. Pesticide risk indicators typically require weighting environmental exposure by a no effect concentration. An indicator based on spread equivalents (ΣSeq) is used in environmental policy in Flanders (Belgium). The pesticide risk for aquatic life is estimated by weighting active ingredient usage by the ratio of their maximum allowable concentration and their soil half-life. Accurate estimates of total pesticide usage in the region are essential in such calculations. Up to 2012, the environmental impact of pesticides was estimated on sales figures provided by the Federal Government. Since 2013, pesticide use is calculated based on results from the Farm Accountancy Data Network (FADN). The estimation of pesticide use was supplemented with data for non-agricultural use based on sales figures of amateur use provided by industry and data obtained from public services. The Seq-indicator was modified to better reflect reality. This method was applied for the period 2009-2012 and showed differences between estimated use and sales figures of pesticides. The estimated use of pesticides based on accountancy data is more accurate compared to sales figures. This approach resulted in a better view on pesticide use and its respective environmental impact in Flanders.

  9. Adjustments of the Pesticide Risk Index Used in Environmental Policy in Flanders.

    Directory of Open Access Journals (Sweden)

    Davina Fevery

    Full Text Available Indicators are used to quantify the pressure of pesticides on the environment. Pesticide risk indicators typically require weighting environmental exposure by a no effect concentration. An indicator based on spread equivalents (ΣSeq) is used in environmental policy in Flanders (Belgium). The pesticide risk for aquatic life is estimated by weighting active ingredient usage by the ratio of their maximum allowable concentration and their soil half-life. Accurate estimates of total pesticide usage in the region are essential in such calculations. Up to 2012, the environmental impact of pesticides was estimated on sales figures provided by the Federal Government. Since 2013, pesticide use is calculated based on results from the Farm Accountancy Data Network (FADN). The estimation of pesticide use was supplemented with data for non-agricultural use based on sales figures of amateur use provided by industry and data obtained from public services. The Seq-indicator was modified to better reflect reality. This method was applied for the period 2009-2012 and showed differences between estimated use and sales figures of pesticides. The estimated use of pesticides based on accountancy data is more accurate compared to sales figures. This approach resulted in a better view on pesticide use and its respective environmental impact in Flanders.
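
    A per-ingredient spread-equivalents term, as described, weights usage by toxicity (inverse of the maximum allowable concentration) and persistence (soil half-life). The normalisation constants below are placeholders, since the Flemish indicator's exact scaling is not given in the abstract.

    ```python
    def spread_equivalents(use_kg, mac_ug_per_l, dt50_days,
                           ref_mac=1.0, ref_dt50=1.0):
        """Seq sketch for one active ingredient: usage weighted by toxicity
        (1/MAC) and persistence (soil half-life DT50). Reference constants
        are placeholders, not the official Flemish scaling."""
        return use_kg * (ref_mac / mac_ug_per_l) * (dt50_days / ref_dt50)

    # A persistent, toxic compound scores far higher per kg than a benign one
    toxic = spread_equivalents(100, mac_ug_per_l=0.1, dt50_days=60)
    benign = spread_equivalents(100, mac_ug_per_l=10.0, dt50_days=5)
    print(toxic, benign)
    ```

    Summing these terms over all active ingredients gives the ΣSeq pressure indicator, which is why accurate usage estimates (FADN rather than raw sales) matter so much.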

  10. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  11. Augmenting the Deliberative Method for Ranking Risks.

    Science.gov (United States)

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.
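
    The added step, turning an ordinal ranking into a ratio scale, can be sketched by attaching elicited magnitudes (relative to the lowest-ranked risk) and normalising; the risk names and magnitudes below are invented, not DHS data.

    ```python
    def ratio_ranking(ordinal, magnitudes):
        """Attach elicited relative magnitudes (lowest-ranked risk = 1.0)
        to an ordinal ranking and normalise so scores sum to 100, giving
        a ratio scale on which stand-out risks become visible."""
        total = sum(magnitudes.values())
        return {risk: 100.0 * magnitudes[risk] / total for risk in ordinal}

    # Hypothetical ordinal ranking plus elicited magnitudes
    ordered = ["pandemic disease", "narcotics flows", "counterfeits"]
    mags = {"pandemic disease": 10.0, "narcotics flows": 4.0, "counterfeits": 1.0}
    scores = ratio_ranking(ordered, mags)
    print(scores)
    ```

    Unlike the bare ordinal ranking, the normalised scores show how much larger the top risk is than the rest, which is what the augmentation was meant to support.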

  12. Development and Validation of Perioperative Risk-Adjustment Models for Hip Fracture Repair, Total Hip Arthroplasty, and Total Knee Arthroplasty.

    Science.gov (United States)

    Schilling, Peter L; Bozic, Kevin J

    2016-01-06

    Comparing outcomes across providers requires risk-adjustment models that account for differences in case mix. The burden of data collection from the clinical record can make risk-adjusted outcomes difficult to measure. The purpose of this study was to develop risk-adjustment models for hip fracture repair (HFR), total hip arthroplasty (THA), and total knee arthroplasty (TKA) that weigh adequacy of risk adjustment against data-collection burden. We used data from the American College of Surgeons National Surgical Quality Improvement Program to create derivation cohorts for HFR (n = 7000), THA (n = 17,336), and TKA (n = 28,661). We developed logistic regression models for each procedure using age, sex, American Society of Anesthesiologists (ASA) physical status classification, comorbidities, laboratory values, and vital signs-based comorbidities as covariates, and validated the models with use of data from 2012. The derivation models' C-statistics for mortality were 80%, 81%, 75%, and 92% and for adverse events were 68%, 68%, 60%, and 70% for HFR, THA, TKA, and combined procedure cohorts. Age, sex, and ASA classification accounted for a large share of the explained variation in mortality (50%, 58%, 70%, and 67%) and adverse events (43%, 45%, 46%, and 68%). For THA and TKA, these three variables were nearly as predictive as models utilizing all covariates. HFR model discrimination improved with the addition of comorbidities and laboratory values; among the important covariates were functional status, low albumin, high creatinine, disseminated cancer, dyspnea, and body mass index. Model performance was similar in validation cohorts. Risk-adjustment models using data from health records demonstrated good discrimination and calibration for HFR, THA, and TKA. It is possible to provide adequate risk adjustment using only the most predictive variables commonly available within the clinical record. 
    This finding helps to inform the trade-off between model performance and data-collection burden.
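
    The C-statistic the study uses for discrimination is the concordance probability, which can be computed directly from predicted risks and observed outcomes; the scores and outcomes below are hypothetical.

    ```python
    def c_statistic(scores, outcomes):
        """Concordance (C-statistic / AUC): the probability that a randomly
        chosen case with the outcome is scored higher than a randomly
        chosen case without it, counting ties as half."""
        pos = [s for s, y in zip(scores, outcomes) if y == 1]
        neg = [s for s, y in zip(scores, outcomes) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical predicted mortality risks and observed outcomes
    scores = [0.9, 0.8, 0.3, 0.2, 0.1]
    deaths = [1, 0, 1, 0, 0]
    print(c_statistic(scores, deaths))
    ```

    A value of 0.5 means no discrimination and 1.0 perfect discrimination, which puts the abstract's 75-92% mortality C-statistics in context.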

  13. Cumulative socioeconomic status risk, allostatic load, and adjustment: a prospective latent profile analysis with contextual and genetic protective factors.

    Science.gov (United States)

    Brody, Gene H; Yu, Tianyi; Chen, Yi-fu; Kogan, Steven M; Evans, Gary W; Beach, Steven R H; Windle, Michael; Simons, Ronald L; Gerrard, Meg; Gibbons, Frederick X; Philibert, Robert A

    2013-05-01

    The health disparities literature has identified a common pattern among middle-aged African Americans that includes high rates of chronic disease along with low rates of psychiatric disorders despite exposure to high levels of cumulative socioeconomic status (SES) risk. The current study was designed to test hypotheses about the developmental precursors to this pattern. Hypotheses were tested with a representative sample of 443 African American youths living in the rural South. Cumulative SES risk and protective processes were assessed at ages 11-13 years; psychological adjustment was assessed at ages 14-18 years; genotyping at the 5-HTTLPR was conducted at age 16 years; and allostatic load (AL) was assessed at age 19 years. A latent profile analysis identified 5 profiles that evinced distinct patterns of SES risk, AL, and psychological adjustment, with 2 relatively large profiles designated as focal profiles: a physical health vulnerability profile characterized by high SES risk/high AL/low adjustment problems, and a resilient profile characterized by high SES risk/low AL/low adjustment problems. The physical health vulnerability profile mirrored the pattern found in the adult health disparities literature. Multinomial logistic regression analyses indicated that carrying an s allele at the 5-HTTLPR and receiving less peer support distinguished the physical health vulnerability profile from the resilient profile. Protective parenting and planful self-regulation distinguished both focal profiles from the other 3 profiles. The results suggest the public health importance of preventive interventions that enhance coping and reduce the effects of stress across childhood and adolescence.

  14. Auditing Neonatal Intensive Care: Is PREM a Good Alternative to CRIB for Mortality Risk Adjustment in Premature Infants?

    Science.gov (United States)

    Guenther, Kilian; Vach, Werner; Kachel, Walter; Bruder, Ingo; Hentschel, Roland

    2015-01-01

    Comparing outcomes at different neonatal intensive care units (NICUs) requires adjustment for intrinsic risk. The Clinical Risk Index for Babies (CRIB) is a widely used risk model, but it has been criticized for being affected by therapeutic decisions. The Prematurity Risk Evaluation Measure (PREM) is not supposed to be prone to treatment bias, but has not yet been validated. We aimed to validate the PREM, compare its accuracy to that of the original and modified versions of the CRIB and CRIB-II, and examine the congruence of risk categorization. Very-low-birth-weight (VLBW) infants with a gestational age (GA) auditing. It could be useful to combine scores. © 2015 S. Karger AG, Basel.

  15. Critical review of methods for risk ranking of food-related hazards, based on risks for human health.

    Science.gov (United States)

    Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2018-01-22

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.

  16. Risk adjustment model of credit life insurance using a genetic algorithm

    Science.gov (United States)

    Saputra, A.; Sukono; Rusyaman, E.

    2018-03-01

In managing the risk of credit life insurance, an insurance company should understand the character of its risks in order to predict future losses. Risk characteristics can be learned from a claim distribution model. There are two standard approaches to modelling the distribution of claims over the insurance period: the collective risk model and the individual risk model. In the collective risk model, the claim that arises when a risk occurs is called an individual claim, and the accumulation of individual claims during a period of insurance is called the aggregate claim. The aggregate claim model is thus formed from the sizes and the number of individual claims. The questions addressed are how insurance risk can be measured with the premium model approach and whether this approach is appropriate for estimating potential future losses. To solve this problem, a Genetic Algorithm with Roulette Wheel Selection is used.
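The roulette-wheel selection named in this abstract can be sketched generically. This is an illustrative genetic-algorithm skeleton only, with a made-up fitness function and invented claim data; it is not the authors' actual risk adjustment model.

```python
import random

random.seed(42)

def fitness(params, claims):
    """Hypothetical fitness: reward parameter pairs (rate, loading) whose
    implied premium tracks the observed claim amounts (illustrative only)."""
    rate, loading = params
    predicted = rate * (1.0 + loading)
    error = sum((predicted - c) ** 2 for c in claims)
    return 1.0 / (1.0 + error)  # higher is better

def roulette_select(population, fitnesses):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if cumulative >= pick:
            return individual
    return population[-1]

def evolve(claims, pop_size=20, generations=50):
    """Evolve (rate, loading) pairs by roulette selection, averaging
    crossover, and Gaussian mutation; return the fittest individual."""
    population = [(random.uniform(0, 2), random.uniform(0, 1))
                  for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind, claims) for ind in population]
        next_gen = []
        for _ in range(pop_size):
            a = roulette_select(population, fits)
            b = roulette_select(population, fits)
            child = ((a[0] + b[0]) / 2 + random.gauss(0, 0.05),
                     (a[1] + b[1]) / 2 + random.gauss(0, 0.05))
            next_gen.append(child)
        population = next_gen
    fits = [fitness(ind, claims) for ind in population]
    return max(zip(population, fits), key=lambda t: t[1])[0]

claims = [1.2, 0.9, 1.1, 1.0]  # hypothetical individual claim amounts
best = evolve(claims)
print(best)
```

The selection pressure comes entirely from the fitness-proportional wheel; any real application would replace `fitness` with a likelihood or premium-error measure fitted to actual claim data.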

  17. Short term load forecasting technique based on the seasonal exponential adjustment method and the regression model

    International Nuclear Information System (INIS)

    Wu, Jie; Wang, Jianzhou; Lu, Haiyan; Dong, Yao; Lu, Xiaoxiao

    2013-01-01

Highlights: ► The seasonal and trend items of the data series are forecasted separately. ► The seasonal item in the data series is verified by Kendall τ correlation testing. ► Different regression models are applied to forecasting the trend item. ► We examine the superiority of the combined models by comparing quartile values. ► Paired-sample T tests are utilized to confirm the superiority of the combined models. - Abstract: For an energy-limited economy, it is crucial to forecast load demand accurately. This paper develops a 1-week-ahead daily load forecasting approach in which the load demand series is predicted using information from preceding days similar to the forecast day. As in many nonlinear systems, a seasonal item and a trend item coexist in load demand datasets. In this paper, the existence of the seasonal item in the load demand data series is first verified with the Kendall τ correlation test. Then, on the premise that forecasting the seasonal item and the trend item separately improves accuracy, hybrid models combining the seasonal exponential adjustment method (SEAM) with regression methods are proposed, where SEAM and the regression models are employed for the seasonal and trend item forecasts, respectively. Comparisons of quartile values as well as mean absolute percentage error values demonstrate that this technique significantly improves accuracy across the eleven different models applied to the trend item. The superior performance of this separate forecasting technique is further confirmed by paired-sample T tests
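The separate treatment of seasonal and trend items can be illustrated with a generic multiplicative seasonal-index scheme: estimate a weekly index, deseasonalize, fit a trend regression, then reseasonalize the forecast. This is a sketch of the general idea, not the authors' exact SEAM formulation, and the load figures are invented.

```python
def seasonal_indices(series, period=7):
    """Average each weekday's load relative to the overall mean."""
    overall = sum(series) / len(series)
    idx = []
    for day in range(period):
        vals = series[day::period]
        idx.append((sum(vals) / len(vals)) / overall)
    return idx

def linear_trend(series):
    """Ordinary least-squares fit of y = a + b*t for the trend item."""
    n = len(series)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(series) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, series)) / \
        sum((t - t_mean) ** 2 for t in ts)
    a = y_mean - b * t_mean
    return a, b

def forecast(series, horizon=7, period=7):
    """Forecast seasonal and trend items separately, then recombine."""
    idx = seasonal_indices(series, period)
    deseasonalized = [y / idx[t % period] for t, y in enumerate(series)]
    a, b = linear_trend(deseasonalized)
    n = len(series)
    return [(a + b * (n + h)) * idx[(n + h) % period] for h in range(horizon)]

# Two weeks of hypothetical daily load with a weekend dip and a mild upward trend.
load = [100, 102, 101, 103, 104, 90, 88,
        107, 109, 108, 110, 111, 97, 95]
print(forecast(load))
```

Any of the "eleven different" trend regressions from the paper would slot in where `linear_trend` sits; the decomposition and recombination steps stay the same.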

  18. Can parents adjust to the idea that their child is at risk for a sudden death?: Psychological impact of risk for Long QT Syndrome

    NARCIS (Netherlands)

    Hendriks, Karin S. W. H.; Grosfeld, F. J. M.; van Tintelen, J. P.; van Langen, I. M.; Wilde, A. A. M.; van den Bout, J.; ten Kroode, H. F. J.

    2005-01-01

    Can a parent adjust to the idea that its child is at risk for a sudden death? This question is raised by a diagnostic procedure in which children were tested for an inherited Long QT Syndrome (LQTS). This potentially life-threatening but treatable cardiac arrhythmia syndrome may cause sudden death,

  20. Calculations for Adjusting Endogenous Biomarker Levels During Analytical Recovery Assessments for Ligand-Binding Assay Bioanalytical Method Validation.

    Science.gov (United States)

    Marcelletti, John F; Evans, Cindy L; Saxena, Manju; Lopez, Adriana E

    2015-07-01

It is often necessary to adjust for detectable endogenous biomarker levels in spiked validation samples (VS) and in selectivity determinations during bioanalytical method validation for ligand-binding assays (LBA) with a matrix like normal human serum (NHS). Described herein are case studies of biomarker analyses using multiplex LBA which highlight the challenges associated with such adjustments when calculating percent analytical recovery (%AR). The LBA test methods were the Meso Scale Discovery V-PLEX® proinflammatory and cytokine panels with NHS as test matrix. The NHS matrix blank exhibited varied endogenous content of the 20 individual cytokines before spiking, ranging from undetectable to readily quantifiable. Addition and subtraction methods for adjusting endogenous cytokine levels in %AR calculations are both used in the bioanalytical field. The two methods were compared in %AR calculations following spiking and analysis of VS for cytokines having detectable endogenous levels in NHS. Calculations for %AR obtained by subtracting quantifiable endogenous biomarker concentrations from the respective total analytical VS values yielded reproducible and credible conclusions. The addition method, in contrast, yielded %AR conclusions that were frequently unreliable and discordant with values obtained with the subtraction adjustment method. It is shown that subtraction of assay signal attributable to matrix is a feasible alternative when endogenous biomarker levels are below the limit of quantitation but above the limit of detection. These analyses confirm that the subtraction method is preferable over the addition method for adjusting detectable endogenous biomarker levels when calculating %AR for biomarker LBA.
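The two adjustment conventions compared in this abstract reduce to simple arithmetic. A sketch, with hypothetical cytokine concentrations rather than values from the study:

```python
def ar_subtraction(measured_total, endogenous, spiked):
    """%AR by subtraction: endogenous level is subtracted from the measured
    total, and the remainder is compared with the nominal spiked amount."""
    return 100.0 * (measured_total - endogenous) / spiked

def ar_addition(measured_total, endogenous, spiked):
    """%AR by addition: endogenous level is added to the nominal spike to
    form the expected total, which is compared with the measured total."""
    return 100.0 * measured_total / (endogenous + spiked)

# Hypothetical validation sample: 5 pg/mL endogenous cytokine,
# 20 pg/mL spiked, 24 pg/mL measured total.
print(ar_subtraction(24.0, 5.0, 20.0))  # 95.0
print(ar_addition(24.0, 5.0, 20.0))     # 96.0
```

The two conventions agree here only because the endogenous level is small relative to the spike; as the endogenous fraction grows (or is itself poorly quantified), the addition method's denominator inflates and the discordance the authors report emerges.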

  1. Evaluating the Investment Benefit of Multinational Enterprises' International Projects Based on Risk Adjustment: Evidence from China

    Science.gov (United States)

    Chen, Chong

    2016-01-01

    This study examines the international risks faced by multinational enterprises to understand their impact on the evaluation of investment projects. Moreover, it establishes a 'three-dimensional' theoretical framework of risk identification to analyse the composition of international risk indicators of multinational enterprises based on the theory…

  2. Pathways of Parenting Style on Adolescents' College Adjustment, Academic Achievement, and Alcohol Risk

    Science.gov (United States)

    Kenney, Shannon R.; Lac, Andrew; Hummer, Justin F.; Grimaldi, Elizabeth M.; LaBrie, Joseph W.

    2015-01-01

    This study examined the pathways of parenting style (permissive, authoritarian, and authoritative) to alcohol consumption and consequences through the mediators of college adjustment and academic achievement (grade point average [GPA]). Participants were 289 students from a private, mid-size, West Coast university (mean age 19.01 years, 58.8%…

  3. Alternative Payment Models Should Risk-Adjust for Conversion Total Hip Arthroplasty: A Propensity Score-Matched Study.

    Science.gov (United States)

    McLawhorn, Alexander S; Schairer, William W; Schwarzkopf, Ran; Halsey, David A; Iorio, Richard; Padgett, Douglas E

    2017-12-06

    For Medicare beneficiaries, hospital reimbursement for nonrevision hip arthroplasty is anchored to either diagnosis-related group code 469 or 470. Under alternative payment models, reimbursement for care episodes is not further risk-adjusted. This study's purpose was to compare outcomes of primary total hip arthroplasty (THA) vs conversion THA to explore the rationale for risk adjustment for conversion procedures. All primary and conversion THAs from 2007 to 2014, excluding acute hip fractures and cancer patients, were identified in the National Surgical Quality Improvement Program database. Conversion and primary THA patients were matched 1:1 using propensity scores, based on preoperative covariates. Multivariable logistic regressions evaluated associations between conversion THA and 30-day outcomes. A total of 2018 conversions were matched to 2018 primaries. There were no differences in preoperative covariates. Conversions had longer operative times (148 vs 95 minutes, P reimbursement models shift toward bundled payment paradigms, conversion THA appears to be a procedure for which risk adjustment is appropriate. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Methods to estimate the genetic risk

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1989-01-01

The estimation of the radiation-induced genetic risk to human populations is based on the extrapolation of results from animal experiments. Radiation-induced mutations are stochastic events. The probability of the event depends on the dose; the degree of the damage does not. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of expected frequencies of genetic changes induced per unit dose. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The advantage of the indirect method is that not only Mendelian mutations but also other types of genetic disorders can be quantified. The disadvantages of the method are the uncertainties in determining the current incidence of genetic disorders in humans and, in addition, in estimating the genetic component of congenital anomalies, anomalies expressed later, and constitutional and degenerative diseases. Using the direct method we estimated that 20-50 dominant radiation-induced mutations would be expected in 19 000 offspring born to parents exposed in Hiroshima and Nagasaki, but only a small proportion of these mutants would have been detected with the techniques used for the population study. These methods were used to predict the genetic damage from the fallout of the reactor accident at Chernobyl in the vicinity of Southern Germany. The lack of knowledge about the interaction of chemicals with ionizing radiation and the discrepancy between the high safety standards for radiation protection and the low level of knowledge for the toxicological evaluation of chemical mutagens are emphasized. (author)
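The indirect (doubling dose) method described above amounts to scaling a baseline incidence by the ratio of dose to doubling dose, optionally weighted by the fraction of that incidence attributable to mutation. A minimal sketch with illustrative numbers, not the author's estimates:

```python
def doubling_dose_risk(baseline_incidence, dose_gy, doubling_dose_gy,
                       mutational_component=1.0):
    """Indirect (doubling dose) estimate: expected excess genetic disorders
    per live birth = baseline incidence x (dose / doubling dose) x the
    mutational component of that incidence. All inputs here are
    illustrative placeholders, not published risk coefficients."""
    return baseline_incidence * (dose_gy / doubling_dose_gy) * mutational_component

# e.g. a 1% baseline disorder incidence, a 1 Gy doubling dose, and a
# 0.01 Gy population exposure:
excess = doubling_dose_risk(0.01, 0.01, 1.0)
print(excess)
```

The direct method bypasses this scaling entirely, multiplying per-locus induced mutation rates by dose, which is why the two approaches carry different uncertainties.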

  5. Inclusion of Highest Glasgow Coma Scale Motor Component Score in Mortality Risk Adjustment for Benchmarking of Trauma Center Performance.

    Science.gov (United States)

    Gomez, David; Byrne, James P; Alali, Aziz S; Xiong, Wei; Hoeft, Chris; Neal, Melanie; Subacius, Harris; Nathens, Avery B

    2017-12-01

    The Glasgow Coma Scale (GCS) is the most widely used measure of traumatic brain injury (TBI) severity. Currently, the arrival GCS motor component (mGCS) score is used in risk-adjustment models for external benchmarking of mortality. However, there is evidence that the highest mGCS score in the first 24 hours after injury might be a better predictor of death. Our objective was to evaluate the impact of including the highest mGCS score on the performance of risk-adjustment models and subsequent external benchmarking results. Data were derived from the Trauma Quality Improvement Program analytic dataset (January 2014 through March 2015) and were limited to the severe TBI cohort (16 years or older, isolated head injury, GCS ≤8). Risk-adjustment models were created that varied in the mGCS covariates only (initial score, highest score, or both initial and highest mGCS scores). Model performance and fit, as well as external benchmarking results, were compared. There were 6,553 patients with severe TBI across 231 trauma centers included. Initial and highest mGCS scores were different in 47% of patients (n = 3,097). Model performance and fit improved when both initial and highest mGCS scores were included, as evidenced by improved C-statistic, Akaike Information Criterion, and adjusted R-squared values. Three-quarters of centers changed their adjusted odds ratio decile, 2.6% of centers changed outlier status, and 45% of centers exhibited a ≥0.5-SD change in the odds ratio of death after including highest mGCS score in the model. This study supports the concept that additional clinical information has the potential to not only improve the performance of current risk-adjustment models, but can also have a meaningful impact on external benchmarking strategies. Highest mGCS score is a good potential candidate for inclusion in additional models. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  6. New methods for fall risk prediction.

    Science.gov (United States)

    Ejupi, Andreas; Lord, Stephen R; Delbaere, Kim

    2014-09-01

Accidental falls are the leading cause of injury-related death and hospitalization in old age, with over one-third of older adults experiencing at least one fall each year. Because of limited healthcare resources, regular objective fall risk assessments are not possible in the community on a large scale. New methods for fall prediction are necessary to identify and monitor those older people at high risk of falling who would benefit from participating in falls prevention programmes. Technological advances have enabled less expensive ways to quantify physical fall risk in clinical practice and in the homes of older people. Recently, several studies have demonstrated that sensor-based fall risk assessments of postural sway, functional mobility, stepping and walking can discriminate between fallers and nonfallers. Recent research has used low-cost, portable and objective measuring instruments to assess fall risk in older people. Future use of these technologies holds promise for assessing fall risk accurately in an unobtrusive manner in clinical and daily life settings.

  7. The Use of the Kurtosis-Adjusted Cumulative Noise Exposure Metric in Evaluating the Hearing Loss Risk for Complex Noise.

    Science.gov (United States)

    Xie, Hong-Wei; Qiu, Wei; Heyer, Nicholas J; Zhang, Mei-Bian; Zhang, Peng; Zhao, Yi-Ming; Hamernik, Roger P

    2016-01-01

To test a kurtosis-adjusted cumulative noise exposure (CNE) metric for use in evaluating the risk of hearing loss among workers exposed to industrial noises. Specifically, to evaluate whether the kurtosis-adjusted CNE (1) provides a better association with observed industrial noise-induced hearing loss, and (2) provides a single metric applicable to both complex (non-Gaussian [non-G]) and continuous or steady state (Gaussian [G]) noise exposures for predicting noise-induced hearing loss (dose-response curves). Audiometric and noise exposure data were acquired from a population of screened workers (N = 341) from two steel manufacturing plants located in Zhejiang province and a textile manufacturing plant located in Henan province, China. All the subjects from the two steel manufacturing plants (N = 178) were exposed to complex noise, whereas the subjects from the textile manufacturing plant (N = 163) were exposed to a G continuous noise. Each subject was given an otologic examination to determine their pure-tone hearing threshold levels (HTLs) and had their personal 8-hr equivalent A-weighted noise exposure (LAeq) and full-shift noise kurtosis statistic (which is sensitive to the peaks and temporal characteristics of noise exposures) measured. For each subject, an unadjusted and a kurtosis-adjusted CNE index for the years worked was created. Multiple linear regression analysis controlling for age was used to determine the relationship between CNE (unadjusted and kurtosis adjusted) and the mean HTL at 3, 4, and 6 kHz (HTL346) among the complex noise-exposed group. In addition, each subject's HTLs from 0.5 to 8.0 kHz were age and sex adjusted using Annex A (ISO-1999) to determine whether they had adjusted high-frequency noise-induced hearing loss (AHFNIHL), defined as an adjusted HTL shift of 30 dB or greater at 3.0, 4.0, or 6.0 kHz in either ear. Dose-response curves for AHFNIHL were developed separately for workers exposed to G and non-G noise using both unadjusted and adjusted CNE as the exposure

  8. Flexible Multi-Objective Transmission Expansion Planning with Adjustable Risk Aversion

    Directory of Open Access Journals (Sweden)

    Jing Qiu

    2017-07-01

This paper presents a multi-objective transmission expansion planning (TEP) framework. Rather than using the conventional deterministic reliability criterion, a risk component based on the probabilistic reliability criterion is incorporated into the TEP objectives. This risk component can capture the stochastic nature of power systems, such as load and wind power output variations, component availability, and incentive-based demand response (IBDR) costs. Specifically, the formulation of risk value after risk aversion is explicitly given, and it aims to provide network planners with the flexibility to conduct risk analysis. Thus, a final expansion plan can be selected according to individual risk preferences. Moreover, the economic value of IBDR is modeled and integrated into the cost objective. In addition, a relatively new multi-objective evolutionary algorithm called MOEA/D is introduced and employed to find Pareto optimal solutions, and tradeoffs between overall cost and risk are provided. The proposed approach is numerically verified on the Garver's six-bus, IEEE 24-bus RTS and Polish 2383-bus systems. Case study results demonstrate that the proposed approach can effectively reduce cost and hedge risk in relation to increasing wind power integration.

  9. Risk Adjusted Production Efficiency of Maize Farmers in Ethiopia: Implication for Improved Maize Varieties Adoption

    Directory of Open Access Journals (Sweden)

    Sisay Diriba Lemessa

    2017-09-01

This study analyzes the technical efficiency and production risk of 862 maize farmers in the major maize producing regions of Ethiopia. It employs the stochastic frontier approach (SFA) to estimate the level of technical efficiency of smallholder farmers. The SFA uses flexible risk properties to account for production risk. Thus, maize production variability is assessed from two perspectives: production risk and technical efficiency. The study also attempts to determine the socio-economic and farm characteristics that influence the technical efficiency of maize production in the study area. The findings of the study showed the existence of both production risk and technical inefficiency in the maize production process. Input variables (amounts per hectare) such as fertilizer and labor positively influence maize output. The findings also show that farms in the study area exhibit decreasing returns to scale. Fertilizer and ox plough days reduce output risk, while labor and improved seed increase output risk. The mean technical efficiency for maize farms is 48 percent. This study concludes that production risk and technical inefficiency prevent the maize farmers from realizing their frontier output. The factors that most improve the efficiency of the maize farmers in the study area include frequency of extension contact, access to credit and use of intercropping. It was also realized that altitude and terracing in maize farms influence farmer efficiency.

  10. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in, for instance, finance paralleled its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  11. Lung cancer among coal miners, ore miners and quarrymen : smoking-adjusted risk estimates from the synergy pooled analysis of case-control studies

    NARCIS (Netherlands)

Taeger, Dirk; Pesch, Beate; Kendzia, Benjamin; Behrens, Thomas; Jöckel, Karl-Heinz; Dahmann, Dirk; Siemiatycki, Jack; Kromhout, Hans; Vermeulen, Roel; Peters, Susan; Olsson, Ann; Brüske, Irene; Wichmann, Heinz-Erich; Stücker, Isabelle; Guida, Florence; Tardón, Adonina; Merletti, Franco; Mirabelli, Dario; Richiardi, Lorenzo; Pohlabeln, Hermann; Ahrens, Wolfgang; Landi, Maria Teresa; Caporaso, Neil; Pesatori, Angela Cecilia; Mukeriya, Anush; Szeszenia-Dabrowska, Neonila; Lissowska, Jolanta; Gustavsson, Per; Field, John; Marcus, Michael W; Fabianova, Eleonora; 't Mannetje, Andrea; Pearce, Neil; Rudnai, Peter; Bencko, Vladimir; Janout, Vladimir; Dumitru, Rodica Stanescu; Foretova, Lenka; Forastiere, Francesco; McLaughlin, John; Demers, Paul; Bueno-de-Mesquita, Bas; Schüz, Joachim; Straif, Kurt; Brüning, Thomas

    2015-01-01

    OBJECTIVES: Working in mines and quarries has been associated with an elevated lung cancer risk but with inconsistent results for coal miners. This study aimed to estimate the smoking-adjusted lung cancer risk among coal miners and compare the risk pattern with lung cancer risks among ore miners and

  12. New method for assessing risks of email

    Science.gov (United States)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

E-mail technology has become one of the necessities of human life for correspondence between individuals. Given this, the important point is that the messages, the e-mail server and client, and the correspondence exchanged between different people must have acceptable security, so that people can be confident in using this technology. In the information age, many financial and non-financial transactions are done electronically and data exchange takes place via the internet, where theft and manipulation of data can impose exorbitant costs in terms of integrity, finance, politics, economics and culture. E-mail correspondence is no exception, and its security is very important. In the review that took place, no method was found that focuses on risk assessment for e-mail systems. We examine assessment methods for other systems along with their strengths and weaknesses, and then apply Convery's method for assessing network risks to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  13. The HHS-HCC Risk Adjustment Model for Individual and Small

    Data.gov (United States)

    U.S. Department of Health & Human Services — Volume 4, Issue 3 of the Medicare and Medicaid Research Review includes three articles describing the Department of Health and Human Services (HHS) developed risk...

  14. Method for environmental risk analysis (MIRA) revision 2007; Metode for miljoerettet risikoanalyse (MIRA) revisjon 2007

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-04-15

OLF's instruction manual for carrying out environmental risk analyses provides a unified approach and a common framework for environmental risk assessments, based on the best information available. The manual entails standardization of a series of parameters, input data and partial analyses that are included in the environmental risk analysis. Environmental risk analyses carried out according to the MIRA method will thus be comparable between fields and between companies. This revision emphasizes updating the text in accordance with today's practice for environmental risk analyses and prevailing regulations. Moreover, method adjustments for especially protected beach habitats have been introduced, as well as a general method for estimating environmental risk to fish. Emphasis has also been put on improving the ability of environmental risk analyses to contribute to better management of environmental risk in the companies (ml)

  15. Are Chinese consumers at risk due to exposure to metals in crayfish? A bioaccessibility-adjusted probabilistic risk assessment.

    Science.gov (United States)

    Peng, Qian; Nunes, Luís M; Greenfield, Ben K; Dang, Fei; Zhong, Huan

    2016-03-01

Freshwater crayfish, the world's third largest crustacean species, has been reported to accumulate high levels of metals, while the current knowledge of potential risk associated with crayfish consumption lags behind that of finfish. We provide the first estimate of human health risk associated with crayfish (Procambarus clarkii) consumption in China, the world's largest producer and consumer of crayfish. We performed Monte Carlo Simulation on a standard risk model parameterized with local data on metal concentrations, bioaccessibility (φ), crayfish consumption rate, and consumer body mass. Bioaccessibility of metals in crayfish was found to be variable (68-95%) and metal-specific, suggesting a potential influence of metal bioaccessibility on effective metal intake. However, sensitivity analysis suggested risk of metals via crayfish consumption was predominantly explained by consumption rate (explaining >92% of total risk estimate variability), rather than metal concentration, bioaccessibility, or body mass. Mean metal concentrations (As, Cd, Cu, Ni, Pb, Se and Zn) in surveyed crayfish samples from 12 provinces in China conformed to national safety standards. However, risk calculation of the φ-modified hazard quotient (HQ) and hazard index (HI) suggested that crayfish metals may pose a health risk for very high rate consumers, with an HI of over 24 for the highest rate consumers. Additionally, the φ-modified increased lifetime risk (ILTR) for carcinogenic effects due to the presence of As was above the acceptable level (10^(-5)) for both the median (ILTR = 2.5×10^(-5)) and the 90th percentile (ILTR = 1.8×10^(-4)), highlighting the relatively high risk of As in crayfish. Our results suggest a need to consider crayfish when assessing human dietary exposure to metals and associated health risks, especially for high crayfish-consuming populations, such as in China, USA and Sweden. Copyright © 2016 Elsevier Ltd. All rights reserved.
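The bioaccessibility-adjusted hazard quotient used in this kind of assessment is the φ-weighted estimated daily intake divided by the reference dose, with HQs summed across metals to form the hazard index. A sketch with hypothetical inputs, not the survey's values:

```python
def hazard_quotient(conc_mg_per_kg, bioaccessibility, intake_kg_per_day,
                    body_mass_kg, rfd_mg_per_kg_day):
    """Bioaccessibility-adjusted hazard quotient: the estimated daily intake
    of a metal (scaled by the bioaccessible fraction) divided by its
    reference dose. HQ > 1 flags potential non-carcinogenic risk."""
    edi = conc_mg_per_kg * bioaccessibility * intake_kg_per_day / body_mass_kg
    return edi / rfd_mg_per_kg_day

# Hypothetical inputs (NOT the paper's survey data): 0.5 mg/kg Cd in
# crayfish muscle, 80% bioaccessible, 50 g/day eaten by a 60 kg adult,
# and an assumed Cd reference dose of 0.001 mg/kg/day.
hq = hazard_quotient(0.5, 0.80, 0.050, 60.0, 0.001)
print(round(hq, 3))

# The hazard index is simply the sum of per-metal HQs:
hi = hq + hazard_quotient(0.2, 0.90, 0.050, 60.0, 0.0003)  # hypothetical As-like term
print(round(hi, 3))
```

In the paper's Monte Carlo setting, each input above becomes a sampled distribution rather than a point value, which is what lets the sensitivity analysis attribute most of the output variance to consumption rate.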

  16. A classification scheme for risk assessment methods.

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail, and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. Each cell in the table represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated we use the word 'method' in this report to refer to a 'risk assessment method', though often we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In

  17. A comparative evaluation of risk-adjustment models for benchmarking amputation-free survival after lower extremity bypass.

    Science.gov (United States)

    Simons, Jessica P; Goodney, Philip P; Flahive, Julie; Hoel, Andrew W; Hallett, John W; Kraiss, Larry W; Schanzer, Andres

    2016-04-01

    Providing patients and payers with publicly reported risk-adjusted quality metrics for the purpose of benchmarking physicians and institutions has become a national priority. Several prediction models have been developed to estimate outcomes after lower extremity revascularization for critical limb ischemia, but the optimal model to use in contemporary practice has not been defined. We sought to identify the highest-performing risk-adjustment model for amputation-free survival (AFS) at 1 year after lower extremity bypass (LEB). We used the national Society for Vascular Surgery Vascular Quality Initiative (VQI) database (2003-2012) to assess the performance of three previously validated risk-adjustment models for AFS. The Bypass versus Angioplasty in Severe Ischaemia of the Leg (BASIL), Finland National Vascular (FINNVASC) registry, and the modified Project of Ex-vivo vein graft Engineering via Transfection III (PREVENT III [mPIII]) risk scores were applied to the VQI cohort. A novel model for 1-year AFS was also derived using the VQI data set and externally validated using the PIII data set. The relative discrimination (Harrell c-index) and calibration (Hosmer-May goodness-of-fit test) of each model were compared. Among 7754 patients in the VQI who underwent LEB for critical limb ischemia, the AFS was 74% at 1 year. Each of the previously published models for AFS demonstrated similar discriminative performance: c-indices for BASIL, FINNVASC, mPIII were 0.66, 0.60, and 0.64, respectively. The novel VQI-derived model had improved discriminative ability with a c-index of 0.71 and appropriate generalizability on external validation with a c-index of 0.68. The model was well calibrated in both the VQI and PIII data sets (goodness of fit P = not significant). Currently available prediction models for AFS after LEB perform modestly when applied to national contemporary VQI data. Moreover, the performance of each model was inferior to that of the novel VQI-derived model
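The Harrell c-index used to compare these models measures pairwise concordance between predicted risk and observed outcome. The paper's models are for time-to-event amputation-free survival; the sketch below simplifies to a binary endpoint, and the scores and outcomes are invented:

```python
def c_index(risk_scores, events):
    """Concordance (c) statistic for a binary outcome: among all pairs with
    discordant outcomes, the fraction in which the higher risk score goes
    to the subject who had the event (ties count 0.5). 0.5 is chance-level
    discrimination; 1.0 is perfect."""
    concordant = ties = comparable = 0
    for i in range(len(events)):
        for j in range(len(events)):
            if events[i] == 1 and events[j] == 0:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable

# Hypothetical predicted risks and observed 1-year outcomes (1 = event):
scores = [0.9, 0.8, 0.4, 0.3, 0.2]
events = [1,   0,   1,   0,   0]
print(c_index(scores, events))
```

The full time-to-event version restricts "comparable" to pairs whose ordering is known despite censoring, but the counting logic is the same, which is why c-indices of 0.60-0.71 can be compared directly across the BASIL, FINNVASC, mPIII, and VQI-derived models.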

  18. Comparison of the performance of the CMS Hierarchical Condition Category (CMS-HCC) risk adjuster with the Charlson and Elixhauser comorbidity measures in predicting mortality.

    Science.gov (United States)

    Li, Pengxiang; Kim, Michelle M; Doshi, Jalpa A

    2010-08-20

    The Centers for Medicare and Medicaid Services (CMS) has implemented the CMS-Hierarchical Condition Category (CMS-HCC) model to risk adjust Medicare capitation payments. This study intends to assess the performance of the CMS-HCC risk adjustment method and to compare it to the Charlson and Elixhauser comorbidity measures in predicting in-hospital and six-month mortality in Medicare beneficiaries. The study used the 2005-2006 Chronic Condition Data Warehouse (CCW) 5% Medicare files. The primary study sample included all community-dwelling fee-for-service Medicare beneficiaries with a hospital admission between January 1st, 2006 and June 30th, 2006. Additionally, four disease-specific samples consisting of subgroups of patients with principal diagnoses of congestive heart failure (CHF), stroke, diabetes mellitus (DM), and acute myocardial infarction (AMI) were also selected. Four analytic files were generated for each sample by extracting inpatient and/or outpatient claims for each patient. Logistic regressions were used to compare the methods. Model performance was assessed using the c-statistic, the Akaike's information criterion (AIC), the Bayesian information criterion (BIC) and their 95% confidence intervals estimated using bootstrapping. The CMS-HCC had statistically significant higher c-statistic and lower AIC and BIC values than the Charlson and Elixhauser methods in predicting in-hospital and six-month mortality across all samples in analytic files that included claims from the index hospitalization. Exclusion of claims for the index hospitalization generally led to drops in model performance across all methods with the highest drops for the CMS-HCC method. However, the CMS-HCC still performed as well or better than the other two methods. The CMS-HCC method demonstrated better performance relative to the Charlson and Elixhauser methods in predicting in-hospital and six-month mortality. 
The CMS-HCC model is preferred over the Charlson and Elixhauser methods.
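    The comparison machinery described above can be sketched in a few lines: given each model's predicted probabilities for the same patients, the c-statistic, AIC, and BIC are computed directly. The outcome vector and the two prediction vectors below are made-up stand-ins, not CCW data, and the parameter counts are illustrative.

```python
import math
from itertools import product

def c_statistic(y, p):
    """Concordance (discrimination): fraction of (death, survivor) pairs in
    which the death received the higher predicted probability; ties count 0.5."""
    pairs = [(pi, pj) for (yi, pi), (yj, pj) in product(zip(y, p), repeat=2)
             if yi == 1 and yj == 0]
    wins = sum(1.0 if pi > pj else 0.5 if pi == pj else 0.0 for pi, pj in pairs)
    return wins / len(pairs)

def log_likelihood(y, p):
    return sum(yi * math.log(pi) + (1 - yi) * math.log(1 - pi)
               for yi, pi in zip(y, p))

def aic(y, p, k):
    """Akaike's information criterion; k = number of fitted parameters."""
    return 2 * k - 2 * log_likelihood(y, p)

def bic(y, p, k):
    """Bayesian information criterion."""
    return k * math.log(len(y)) - 2 * log_likelihood(y, p)

# Hypothetical mortality outcomes and two models' predicted probabilities
y     = [1, 0, 0, 1, 0, 1, 0, 0]
p_hcc = [0.8, 0.2, 0.1, 0.7, 0.3, 0.9, 0.2, 0.1]   # "CMS-HCC-like" model
p_chl = [0.6, 0.4, 0.3, 0.35, 0.4, 0.7, 0.3, 0.2]  # "Charlson-like" model

print(c_statistic(y, p_hcc), c_statistic(y, p_chl))  # higher is better
print(aic(y, p_hcc, k=5), aic(y, p_chl, k=2))        # lower is better
```

    In the study, 95% confidence intervals for these statistics were obtained by bootstrapping, i.e. recomputing them over resampled patient sets.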

  19. Why and when is ethnic harassment a risk for immigrant adolescents' school adjustment? understanding the processes and conditions.

    Science.gov (United States)

    Bayram Özdemir, Sevgi; Stattin, Håkan

    2014-08-01

    Ethnically harassed immigrant youth are at risk for experiencing a wide range of school adjustment problems. However, it is still unclear why and under what conditions experiencing ethnic harassment leads to school adjustment difficulties. To address this limitation in the literature, we examined two important questions. First, we investigated whether self-esteem and/or depressive symptoms would mediate the associations between ethnic harassment and poor school adjustment among immigrant youth. Second, we examined whether immigrant youths' perception of school context would play a buffering role in the pathways between ethnic harassment and school adjustment difficulties. The sample (n = 330; M age = 14.07, SD = .90; 49% girls at T1) was drawn from a longitudinal study in Sweden. The results revealed that experiencing ethnic harassment led to a decrease in immigrant youths' self-esteem over time, and that youths' expectations of academic failure increased. Further, youths' relationships with their teachers and their perceptions of school democracy moderated the mediation processes. Specifically, when youth had poor relationships with their teachers or perceived their school context as less democratic, being exposed to ethnic harassment led to a decrease in their self-esteem. In turn, they reported low school satisfaction and perceived themselves as being unsuccessful in school. Such indirect effects were not observed when youth had high positive relationships with their teachers or perceived their school as offering a democratic environment. These findings highlight the importance of understanding underlying processes and conditions in the examination of the effects of ethnic devaluation experiences in order to reach a more comprehensive understanding of immigrant youths' school adjustment.

  20. Behavioral adjustments of African herbivores to predation risk by lions: spatiotemporal variations influence habitat use.

    Science.gov (United States)

    Valeix, M; Loveridge, A J; Chamaillé-Jammes, S; Davidson, Z; Murindagomo, F; Fritz, H; Macdonald, D W

    2009-01-01

    Predators may influence their prey populations not only through direct lethal effects, but also through indirect behavioral changes. Here, we combined spatiotemporal fine-scale data from GPS radio collars on lions with habitat use information on 11 African herbivores in Hwange National Park (Zimbabwe) to test whether the risk of predation by lions influenced the distribution of herbivores in the landscape. Effects of long-term risk of predation (likelihood of lion presence calculated over four months) and short-term risk of predation (actual presence of lions in the vicinity in the preceding 24 hours) were contrasted. The long-term risk of predation by lions appeared to influence the distributions of all browsers across the landscape, but not of grazers. This result strongly suggests that browsers and grazers, which face different ecological constraints, are influenced at different spatial and temporal scales in the variation of the risk of predation by lions. The results also show that all herbivores tend to use more open habitats preferentially when lions are in their vicinity, probably an effective anti-predator behavior against such an ambush predator. Behaviorally induced effects of lions may therefore contribute significantly to structuring African herbivore communities, and hence possibly their effects on savanna ecosystems.

  1. Exploring methods for comparing the real-world effectiveness of treatments for osteoporosis: adjusted direct comparisons versus using patients as their own control.

    Science.gov (United States)

    Karlsson, Linda; Mesterton, Johan; Tepie, Maurille Feudjo; Intorcia, Michele; Overbeek, Jetty; Ström, Oskar

    2017-09-21

    Using Swedish and Dutch registry data for women initiating bisphosphonates, we evaluated two methods of comparing the real-world effectiveness of osteoporosis treatments that attempt to adjust for differences in patient baseline characteristics. Each method has advantages and disadvantages; both are potential complements to clinical trial analyses. We evaluated methods of comparing the real-world effectiveness of osteoporosis treatments that attempt to adjust for both observed and unobserved confounding. Swedish and Dutch registry data for women initiating zoledronate or oral bisphosphonates (OBPs; alendronate/risedronate) were used; the primary outcome was fracture. In adjusted direct comparisons (ADCs), regression and matching techniques were used to account for baseline differences in known risk factors for fracture (e.g., age, previous fracture, comorbidities). In an own-control analysis (OCA), for each treatment, fracture incidence in the first 90 days following treatment initiation (the baseline risk period) was compared with fracture incidence in the 1-year period starting 91 days after treatment initiation (the treatment exposure period). In total, 1196 and 149 women initiating zoledronate and 14,764 and 25,058 initiating OBPs were eligible in the Swedish and Dutch registries, respectively. Owing to the small Dutch zoledronate sample, only the Swedish data were used to compare fracture incidences between treatment groups. ADCs showed a numerically higher fracture incidence in the zoledronate than in the OBPs group (hazard ratio 1.09-1.21; not statistically significant, p > 0.05). For both treatment groups, OCA showed a higher fracture incidence in the baseline risk period than in the treatment exposure period, indicating a treatment effect. OCA showed a similar or greater effect in the zoledronate group compared with the OBPs group. ADC and OCA each possess advantages and disadvantages. Combining both methods may provide an estimate of real-world treatment effectiveness.
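    The own-control analysis lends itself to a compact illustration. The fracture times and cohort size below are invented, and censoring and variable follow-up, which a real analysis must handle, are ignored:

```python
# OCA sketch: compare fracture incidence in the 90-day baseline risk period
# with incidence in the 1-year treatment exposure period starting at day 91.
fracture_days = [40, 75, 200, 310, 500]  # hypothetical days since initiation
n_patients = 1000                        # all assumed followed >= 456 days

baseline_events = sum(1 for d in fracture_days if d <= 90)
exposure_events = sum(1 for d in fracture_days if 91 <= d <= 90 + 365)

baseline_rate = baseline_events / (n_patients * 90)    # events per person-day
exposure_rate = exposure_events / (n_patients * 365)

# A baseline rate exceeding the exposure rate is read as a treatment effect
print(baseline_rate > exposure_rate)
```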

  2. Balancing the risks and benefits of drinking water disinfection: disability adjusted life-years on the scale.

    Science.gov (United States)

    Havelaar, A H; De Hollander, A E; Teunis, P F; Evers, E G; Van Kranen, H J; Versteegh, J F; Van Koten, J E; Slob, W

    2000-04-01

    To evaluate the applicability of disability-adjusted life-years (DALYs) as a measure to compare positive and negative health effects of drinking water disinfection, we conducted a case study involving a hypothetical drinking water supply from surface water. This drinking water supply is typical in The Netherlands. We compared the reduction of the risk of infection with Cryptosporidium parvum by ozonation of water to the concomitant increase in risk of renal cell cancer arising from the production of bromate. We applied clinical, epidemiologic, and toxicologic data on morbidity and mortality to calculate the net health benefit in DALYs. We estimated the median risk of infection with C. parvum as 10^-3 per person-year. Ozonation reduces the median risk in the baseline approximately 7-fold, but bromate is produced in a concentration above current guideline levels. However, the health benefits of preventing gastroenteritis in the general population and premature death in patients with acquired immunodeficiency syndrome outweigh health losses by premature death from renal cell cancer by a factor of > 10. The net benefit is approximately 1 DALY/million person-years. The application of DALYs in principle allows us to more explicitly compare the public health risks and benefits of different management options. In practice, the application of DALYs may be hampered by the substantial degree of uncertainty, as is typical for risk assessment.
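    The DALY arithmetic behind such a comparison is simple to sketch. The function below follows the standard decomposition DALY = YLL + YLD; all case counts, disability weights, and durations are illustrative placeholders per million person-years, not the study's estimates.

```python
def dalys(cases, disability_weight, duration_yrs, deaths=0.0, yrs_lost_per_death=0.0):
    """DALY = YLD + YLL, with YLD = cases * disability weight * duration
    and YLL = deaths * years of life lost per death."""
    yld = cases * disability_weight * duration_yrs
    yll = deaths * yrs_lost_per_death
    return yld + yll

# Illustrative only: ozonation prevents gastroenteritis cases (and some deaths
# in immunocompromised patients) but adds a small renal cell cancer burden.
prevented = dalys(cases=700, disability_weight=0.07, duration_yrs=0.02,
                  deaths=0.1, yrs_lost_per_death=30)
added = dalys(cases=0.2, disability_weight=0.3, duration_yrs=5,
              deaths=0.1, yrs_lost_per_death=12)

net_benefit = prevented - added  # DALYs averted per million person-years
print(round(net_benefit, 2))
```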

  3. Male crickets adjust ejaculate quality with both risk and intensity of sperm competition.

    Science.gov (United States)

    Simmons, Leigh W; Denholm, Amy; Jackson, Chantelle; Levy, Esther; Madon, Ewa

    2007-10-22

    Sperm competition theory predicts that males should increase their expenditure on the ejaculate with increasing risk of sperm competition, but decrease their expenditure with increasing intensity. There is accumulating evidence for sperm competition theory, based on examinations of testes size and/or the numbers of sperm ejaculated. However, recent studies suggest that ejaculate quality can also be subject to selection by sperm competition. We used experimental manipulations of the risk and intensity of sperm competition in the cricket, Teleogryllus oceanicus. We found that males produced ejaculates with a greater percentage of live sperm when they had encountered a rival male prior to mating. However, when mating with a female that presented a high intensity of sperm competition, males did not respond to risk, but produced ejaculates with a reduced percentage of live sperm. Our data suggest that males exhibit a fine-tuned hierarchy of responses to these cues of sperm competition.

  4. A particle method with adjustable transport properties - the generalized consistent Boltzmann algorithm

    International Nuclear Information System (INIS)

    Garcia, A.L.; Alexander, F.J.; Alder, B.J.

    1997-01-01

    The consistent Boltzmann algorithm (CBA) for dense, hard-sphere gases is generalized to obtain the van der Waals equation of state and the corresponding exact viscosity at all densities except at the highest temperatures. A general scheme for adjusting any transport coefficients to higher values is presented

  5. A novel suture method to place and adjust peripheral nerve catheters

    DEFF Research Database (Denmark)

    Rothe, C.; Steen-Hansen, C.; Madsen, M. H.

    2015-01-01

    We have developed a peripheral nerve catheter, attached to a needle, which works like an adjustable suture. We used in-plane ultrasound guidance to place 45 catheters close to the femoral, saphenous, sciatic and distal tibial nerves in cadaver legs. We displaced catheters after their initial...

  6. Identifying Military and Combat Specific Risk Factors for Child Adjustment: Comparing High and Low Risk Military Families and Civilian Families

    Science.gov (United States)

    2016-06-01

    separation and the potentially destabilizing impact of deployment on the remaining caregiver and daily routines. The project entails the assessment of...milestones, and 2) examine the role of spousal-perceived Service Member risk on caregiver behaviors associated with parental deployment in the prediction...INTRODUCTION: There is an emerging consensus that parental combat deployment may increase risk for child development; but details on what the

  7. Mate guarding in the Seychelles warbler is energetically costly and adjusted to paternity risk

    NARCIS (Netherlands)

    Komdeur, Jan

    2001-01-01

    Males may increase their fitness through extra-pair copulations (copulations outside the pair bond) that result in extra-pair fertilizations, but also risk lost paternity when they leave their own mate unguarded. The fitness costs of cuckoldry for Seychelles warblers (Acrocephalus sechellensis) are

  8. Risk-adjusted impact of administrative costs on the distribution of terminal wealth for long-term investment.

    Science.gov (United States)

    Guillén, Montserrat; Jarner, Søren Fiig; Nielsen, Jens Perch; Pérez-Marín, Ana M

    2014-01-01

    The impact of administrative costs on the distribution of terminal wealth is approximated using a simple formula applicable to many investment situations. We show that the reduction in median returns attributable to administrative fees is usually at least twice the amount of the administrative costs charged for most investment funds, when considering a risk-adjustment correction over a reasonably long-term time horizon. The example we present covers a number of standard cases and can be applied to passive investments, mutual funds, and hedge funds. Our results show investors the potential losses they face in performance due to administrative costs.
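    The effect of an annual administrative fee on the distribution of terminal wealth can be illustrated with a toy Monte Carlo under lognormal returns. This is not the paper's approximation formula, and the drift, volatility, fee, and horizon below are arbitrary choices.

```python
import random
import statistics

def median_terminal_wealth(mu, sigma, fee, years, n_paths, seed=1):
    """Median terminal wealth of one unit invested, with lognormal annual
    gross returns and a proportional annual administrative fee."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        wealth = 1.0
        for _ in range(years):
            gross = rng.lognormvariate(mu - 0.5 * sigma ** 2, sigma)
            wealth *= gross * (1 - fee)
        finals.append(wealth)
    return statistics.median(finals)

no_fee = median_terminal_wealth(0.05, 0.15, 0.00, years=20, n_paths=2000)
with_fee = median_terminal_wealth(0.05, 0.15, 0.01, years=20, n_paths=2000)

# Cumulative fraction of median wealth lost to a 1% annual fee over 20 years
print(round(1 - with_fee / no_fee, 3))
```

    With a shared seed the fee compounds deterministically: median terminal wealth shrinks by the factor 0.99^20, roughly 18% over 20 years. The paper's point is that, once a risk-adjustment correction against a benchmark is applied, the reduction in median returns is usually at least twice the headline fee.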

  9. Ergonomic risk assessment by REBA method

    Directory of Open Access Journals (Sweden)

    A. Hassanzadeh

    2007-09-01

    Background and aims: Awkward posture has been recognized as one of the important risk factors for work-related musculoskeletal disorders (WMSDs). The current study aimed at determining the ergonomic risk level and WMSD ratio, and at exploring the contribution of working postures to WMSDs. Working postures were phased and then scored using the REBA tool from observation of the work. Methods: Workers of a home appliances manufacturing factory were assessed. To collect the required data, each part of the body was scored, and work frequency, load/force, and coupling were considered to obtain a REBA score. The Nordic Questionnaire was used to determine the WMSD ratio and its relationship with the REBA score. In total, 231 working phases were assessed and 13,761 Nordic Questionnaire items were answered. The percentage of workers in press, spot welding, grinding, cutting, assembling, and painting was 15.8, 21.6, 25.9, 34.5, and 89.9%, respectively. Workers were 18-54 years old and their average work record was 52 months. Results: REBA scores ranged from 4 to 13 in the tasks under study. REBA score = 9 was the most frequent (20%) and REBA score = 13 the least frequent (1.4%). Risk level was high in press, cutting, and painting (25.5%, 100%, and 68.2% of cases, respectively), which shows that cutting has the highest risk level. In the past 12 months, 38.5% of the workers had had problems in different parts of their body: 11.7% in the neck, 19.4% in the leg, 10.7% in the foot, 82.5% in the lower back, 87.6% in the upper back, and 7.8% in the shoulders. 10.7% of the workers had a previous illness, of which 8.7% were non-occupational and 1.9% were caused by their previous jobs. The mean REBA score and ergonomic risk level were not equal across tasks (p-value < 0.05); the action level was "necessary soon" in others. Conclusion: The risk level should be reduced, especially in cutting. The heavy workload and poor design of working height, awkward

  10. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined, based on the design of a genetic algorithm GAP and an incremental training technique adapted to the learning of series of stock market values. The GAP technique consists in a fusion of GP and GA. The GAP algorithm implements the automatic search for crisp trading rules, taking as objectives of the training both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for a period of eight years of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules with returns in the testing period far superior to those obtained applying habitual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
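    The return-versus-risk objective that drives the GAP search can be illustrated with a toy crisp rule and a penalized fitness. The moving-average rule, the price series, and the risk-aversion weight below are hypothetical, not the rules evolved for the S&P500 in the paper.

```python
import statistics

def backtest(prices, short=3, long_=5):
    """Toy crisp trading rule: hold the asset for the next period whenever
    the short moving average exceeds the long one; otherwise stay in cash."""
    rets = []
    for i in range(long_ - 1, len(prices) - 1):
        sma_short = sum(prices[i - short + 1:i + 1]) / short
        sma_long = sum(prices[i - long_ + 1:i + 1]) / long_
        in_market = sma_short > sma_long
        rets.append((prices[i + 1] - prices[i]) / prices[i] if in_market else 0.0)
    return rets

def fitness(rets, risk_aversion=0.5):
    """GAP-style objective: reward mean return, penalize volatility (risk)."""
    return statistics.mean(rets) - risk_aversion * statistics.pstdev(rets)

prices = [100, 101, 103, 102, 105, 107, 106, 109, 111, 110]
rets = backtest(prices)
print(round(fitness(rets), 4))
```

    A search procedure such as GAP would evolve the rule structure and parameters to maximize this kind of objective over an incrementally extended training window.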

  11. Secrets of Success in a Landscape of Fear: Urban Wild Boar Adjust Risk Perception and Tolerate Disturbance

    Directory of Open Access Journals (Sweden)

    Milena Stillfried

    2017-12-01

    In urban areas with a high level of human disturbance, wildlife has to adjust its behavior to deal with the so-called “landscape of fear.” This can be studied in risk perception during movement in relation to specific habitat types, whereby individuals trade off between foraging and disturbance. Due to its high behavioral plasticity and increasing occurrence in urban environments, the wild boar (Sus scrofa) is an excellent model organism to study adjustment to urbanization. With the help of GPS tracking, space use of 11 wild boar from Berlin's metropolitan region was analyzed: we aimed at understanding how animals adjust space use to deal with the landscape of fear in urban areas compared to rural areas. We compared use vs. availability with the help of generalized linear mixed models. First, we studied landscape types selected by rural vs. urban wild boar; second, we analyzed distances of wild boar locations to each of the landscape types. Finally, we mapped the resulting habitat selection probability to predict hotspots of human-wildlife conflicts. A higher tolerance to disturbance in urban wild boar was shown by a one-third shorter flight distance and by an increased re-use of areas close to the trap. Urban wild boar had a strong preference for natural landscapes such as swamp areas, green areas and deciduous forests, and areas with high primary productivity, as indicated by high NDVI (normalized difference vegetation index) values. The areas selected by urban wild boar were often located close to roads and houses. The spatial distribution maps show that a large area of Berlin would be suitable for urban wild boar but not for their rural conspecifics, the most likely reason being a different perception of anthropogenic disturbance. Wild boar therefore showed considerable behavioral plasticity suitable for adjusting to human-dominated environments in a potentially evolutionarily adaptive manner.

  12. Evaluating variation in use of definitive therapy and risk-adjusted prostate cancer mortality in England and the USA.

    Science.gov (United States)

    Sachdeva, Ashwin; van der Meulen, Jan H; Emberton, Mark; Cathcart, Paul J

    2015-02-24

    Prostate cancer mortality (PCM) in the USA is among the lowest in the world, whereas PCM in England is among the highest in Europe. This paper aims to assess the association of variation in use of definitive therapy with risk-adjusted PCM in England as compared with the USA. Design: Observational study. Setting: Cancer registry data from England and the USA. Participants: Men diagnosed with non-metastatic prostate cancer (PCa) in England and the USA between 2004 and 2008. Competing-risks survival analyses were used to estimate subhazard ratios (SHR) of PCM adjusted for age, ethnicity, year of diagnosis, Gleason score (GS) and clinical tumour (cT) stage. 222,163 men were eligible for inclusion. Compared with American patients, English patients were more likely to present at an older age (70-79 years: England 44.2%, USA 29.3%) and with higher-grade, higher-stage disease, and risk-adjusted PCM was higher in England than in the USA. This difference may be explained by less frequent use of definitive therapy in England.

  13. Risk-adjusted econometric model to estimate postoperative costs: an additional instrument for monitoring performance after major lung resection.

    Science.gov (United States)

    Brunelli, Alessandro; Salati, Michele; Refai, Majed; Xiumé, Francesco; Rocco, Gaetano; Sabbatini, Armando

    2007-09-01

    The objectives of this study were to develop a risk-adjusted model to estimate individual postoperative costs after major lung resection and to use it for internal economic audit. Variable and fixed hospital costs were collected for 679 consecutive patients who underwent major lung resection from January 2000 through October 2006 at our unit. Several preoperative variables were used to develop a risk-adjusted econometric model from all patients operated on during the period 2000 through 2003 by stepwise multiple regression analysis (validated by bootstrap). The model was then used to estimate the postoperative costs in the patients operated on during the 3 subsequent periods (years 2004, 2005, and 2006). Observed and predicted costs were then compared within each period by the Wilcoxon signed rank test. Multiple regression and bootstrap analysis yielded the following model predicting postoperative cost: 11,078 + 1340.3 x (age > 70 years) + 1927.8 x (cardiac comorbidity) - 95 x ppoFEV1%. No differences between predicted and observed costs were noted in the first 2 periods analyzed (year 2004, $6188.40 vs $6241.40, P = .3; year 2005, $6308.60 vs $6483.60, P = .4), whereas in the most recent period (2006) observed costs were significantly lower than predicted ($3457.30 vs $6162.70). This model may be used as a methodologic template for economic audit in our specialty and complement more traditional outcome measures in the assessment of performance.
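    The fitted equation quoted above can be written as a small predictor function; the coefficients are those reported in the abstract, and an audit would compare these predictions against observed costs (as done here with the Wilcoxon signed rank test).

```python
def predicted_postop_cost(age, cardiac_comorbidity, ppo_fev1_pct):
    """Estimated postoperative cost after major lung resection, per the
    reported regression: 11078 + 1340.3*(age > 70) + 1927.8*(cardiac
    comorbidity) - 95*ppoFEV1%."""
    return (11078
            + 1340.3 * (age > 70)
            + 1927.8 * bool(cardiac_comorbidity)
            - 95 * ppo_fev1_pct)

# e.g. a 75-year-old with cardiac comorbidity and a predicted
# postoperative FEV1 of 60%
print(predicted_postop_cost(75, True, 60))
```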

  14. Asymmetric adjustment

    NARCIS (Netherlands)

    2010-01-01

    A method of adjusting a signal processing parameter for a first hearing aid and a second hearing aid forming parts of a binaural hearing aid system to be worn by a user is provided. The binaural hearing aid system comprises a user specific model representing a desired asymmetry between a first ear

  15. Efficient Estimation of Sensitivities for Counterparty Credit Risk with the Finite Difference Monte Carlo Method

    NARCIS (Netherlands)

    de Graaf, C.S.L.; Kandhai, D.; Sloot, P.M.A.

    According to Basel III, financial institutions have to charge a credit valuation adjustment (CVA) to account for a possible counterparty default. Calculating this measure and its sensitivities is one of the biggest challenges in risk management. Here, we introduce an efficient method for the

  16. Efficient estimation of sensitivities for counterparty credit risk with the finite difference Monte Carlo method

    NARCIS (Netherlands)

    C.S.L. de Graaf (Kees); B.D. Kandhai; P.M.A. Sloot

    2017-01-01

    According to Basel III, financial institutions have to charge a credit valuation adjustment (CVA) to account for a possible counterparty default. Calculating this measure and its sensitivities is one of the biggest challenges in risk management. Here, we introduce an efficient method

  17. Performance of risk-adjusted control charts to monitor in-hospital mortality of intensive care unit patients: A simulation study

    NARCIS (Netherlands)

    Koetsier, Antonie; de Keizer, Nicolette F.; de Jonge, Evert; Cook, David A.; Peek, Niels

    2012-01-01

    Objectives: Increases in case-mix adjusted mortality may be indications of decreasing quality of care. Risk-adjusted control charts can be used for in-hospital mortality monitoring in intensive care units by issuing a warning signal when there are more deaths than expected. The aim of this study was
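    A common chart of this kind is the risk-adjusted CUSUM in the style of Steiner et al., which accumulates likelihood-ratio evidence that the odds of death have risen above what each patient's predicted risk implies. The sketch below uses hypothetical predicted risks and outcomes; the doubling-of-odds alternative and the signalling threshold are illustrative choices.

```python
import math

def risk_adjusted_cusum(outcomes, predicted_risks, odds_ratio=2.0, threshold=4.5):
    """Risk-adjusted CUSUM: each patient contributes the log-likelihood ratio
    of their outcome under 'odds of death multiplied by odds_ratio' versus the
    risk-adjusted baseline; a warning is issued when the running sum (floored
    at zero) crosses the threshold, after which the chart restarts."""
    s, signals = 0.0, []
    for t, (died, p) in enumerate(zip(outcomes, predicted_risks)):
        denom = 1 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if died else math.log(1 / denom)
        s = max(0.0, s + w)
        if s > threshold:
            signals.append(t)
            s = 0.0
    return signals

# Hypothetical ICU admissions: mortality matches the predicted risks early
# on, then rises in the second half of the series.
risks = [0.1, 0.2, 0.1, 0.3, 0.1] * 10
outcomes = [0, 0, 0, 1, 0] * 5 + [1, 1, 0, 1, 1] * 5
print(risk_adjusted_cusum(outcomes, risks))
```

    In a monitoring setting, each signal would prompt a case-mix review rather than an automatic conclusion of decreased quality of care.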

  18. The need for unique risk adjustment for surgical site infections at a high-volume, tertiary care center with inherent high-risk colorectal procedures.

    Science.gov (United States)

    Gorgun, E; Benlice, C; Hammel, J; Hull, T; Stocchi, L

    2017-08-01

    The aim of the present study was to create a unique risk adjustment model for surgical site infection (SSI) in patients who underwent colorectal surgery (CRS) at the Cleveland Clinic (CC) with inherent high risk factors by using a nationwide database. The American College of Surgeons National Surgical Quality Improvement Program (NSQIP) database was queried to identify patients who underwent CRS between 2005 and 2010. Initially, CC cases were identified from all NSQIP data according to case identifier and separated from the other NSQIP centers. Demographics, comorbidities, and outcomes were compared. Logistic regression analyses were used to assess the association between SSI and center-related factors. A total of 70,536 patients met the inclusion criteria and underwent CRS, 1090 patients (1.5%) at the CC and 69,446 patients (98.5%) at other centers. Male gender, work-relative value unit, diagnosis of inflammatory bowel disease, pouch formation, open surgery, steroid use, and preoperative radiotherapy rates were significantly higher in the CC cases. Overall morbidity and individual postoperative complication rates were found to be similar in the CC and other centers except for the following: organ-space SSI and sepsis rates (higher in the CC cases); and pneumonia and ventilator dependency rates (higher in the other centers). After covariate adjustment, the estimated degree of difference between the CC and other institutions with respect to organ-space SSI was reduced (OR 1.38, 95% CI 1.08-1.77). The unique risk adjustment strategy may provide center-specific comprehensive analysis, especially for hospitals that perform inherently high-risk procedures. Higher surgical complexity may be the reason for increased SSI rates in the NSQIP at tertiary care centers.

  19. A Water Hammer Protection Method for Mine Drainage System Based on Velocity Adjustment of Hydraulic Control Valve

    Directory of Open Access Journals (Sweden)

    Yanfei Kou

    2016-01-01

    Water hammer analysis is a fundamental part of the design process of pipeline systems for water distribution networks. The main characteristics of mine drainage systems are the limited space and the high cost of changing equipment and pipelines. In order to solve the valve-closing water hammer protection problem for mine drainage systems, a water hammer protection method based on velocity adjustment of the HCV (Hydraulic Control Valve) is proposed in this paper. The mathematical model of water hammer fluctuations is established based on the characteristic line method. Then, boundary conditions of water hammer control for the mine drainage system are determined and its simplex model is established. The optimal adjustment strategy is solved from the mathematical model of multistage valve-closing. Taking a mine drainage system as an example, comparisons between simulation and experimental results show that the proposed method and the optimized valve-closing strategy are effective.

  20. Is use of fall risk-increasing drugs in an elderly population associated with an increased risk of hip fracture, after adjustment for multimorbidity level

    DEFF Research Database (Denmark)

    Thorell, Kristine; Ranstad, Karin; Midlöv, Patrik

    2014-01-01

    BACKGROUND: Risk factors for hip fracture are well studied because of the negative impact on patients and the community, with mortality in the first year being almost 30% in the elderly. Age, gender and fall risk-increasing drugs, identified by the National Board of Health and Welfare in Sweden......, are well known risk factors for hip fracture, but how multimorbidity level affects the risk of hip fracture during use of fall risk-increasing drugs is to our knowledge not as well studied. This study explored the relationship between use of fall risk-increasing drugs in combination with multimorbidity...... level and risk of hip fracture in an elderly population. METHODS: Data were from Östergötland County, Sweden, and comprised the total population in the county aged 75 years and older during 2006. The odds ratio (OR) for hip fracture during use of fall risk-increasing drugs was calculated by multivariate...

  1. Introducing risk adjustment and free health plan choice in employer-based health insurance: Evidence from Germany.

    Science.gov (United States)

    Pilny, Adam; Wübker, Ansgar; Ziebarth, Nicolas R

    2017-12-01

    To equalize differences in health plan premiums due to differences in risk pools, the German legislature introduced a simple Risk Adjustment Scheme (RAS) based on age, gender and disability status in 1994. In addition, effective 1996, consumers gained the freedom to choose among hundreds of existing health plans, across employers and state-borders. This paper (a) estimates RAS pass-through rates on premiums, financial reserves, and expenditures and assesses the overall RAS impact on market price dispersion. Moreover, it (b) characterizes health plan switchers and investigates their annual and cumulative switching rates over time. Our main findings are based on representative enrollee panel data linked to administrative RAS and health plan data. We show that sickness funds with bad risk pools and high pre-RAS premiums lowered their total premiums by 42 cents per additional euro allocated by the RAS. Consequently, post-RAS, health plan prices converged but not fully. Because switchers are more likely to be white collar, young and healthy, the new consumer choice resulted in more risk segregation and the amount of money redistributed by the RAS increased over time. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Calculating disability-adjusted life years (DALY) as a measure of excess cancer risk following radiation exposure

    International Nuclear Information System (INIS)

    Shimada, K; Kai, M

    2015-01-01

    This paper proposes that the disability-adjusted life year (DALY) can be used as a measure of radiation health risk. DALY is calculated as the sum of years of life lost (YLL) and years lived with disability (YLD). This multidimensional concept can be expressed as a risk index without a probability measure, to avoid the misuse of the current radiation detriment at low doses. In this study, we calculated YLL and YLD using Japanese population data by gender. The DALY for all cancers in Japan per 1 Gy per person was 0.84 years in men and 1.34 years in women. The DALY for all cancers in the Japanese baseline was 4.8 in men and 3.5 in women. When we calculated the ICRP detriment from the same data, DALYs for the cancer sites were similar to the radiation detriment for those sites, excluding leukemia, breast and thyroid cancer. These results suggest that the ICRP detriment overestimates the weighting fraction of leukemia risk and underestimates the weighting fractions of breast and thyroid cancer. A big advantage over the ICRP detriment is that DALY can calculate the risk components for non-fatal diseases without data on lethality. This study showed that DALY is a practical tool that can compare many types of diseases encountered in public health.

  3. Epidemiology, Management, and Risk-Adjusted Mortality of ICU-Acquired Enterococcal Bacteremia

    NARCIS (Netherlands)

    Ong, David S Y; Bonten, Marc J M; Safdari, Khatera; Spitoni, Cristian; Frencken, Jos F; Witteveen, Esther; Horn, Janneke; Klein Klouwenberg, Peter M C; Cremer, Olaf L

    2015-01-01

    BACKGROUND:  Enterococcal bacteremia has been associated with high case fatality, but it remains unknown to what extent death is caused by these infections. We therefore quantified attributable mortality of intensive care unit (ICU)-acquired bacteremia caused by enterococci. METHODS:  From 2011 to

  4. Psychological methods of subjective risk estimates

    International Nuclear Information System (INIS)

    Zimolong, B.

    1980-01-01

    Reactions to situations involving risks can be divided into the following parts: perception of danger, subjective estimation of the risk, and risk-taking with respect to action. Several investigations have compared subjective estimates of risk with an objective measure of that risk. In general there was a mismatch between subjective and objective measures of risk; in particular, the objective risk involved in routine activities is most commonly underestimated. This implies, for accident prevention, that attempts must be made to induce accurate subjective risk estimates by technical and behavioural measures. (orig.)

  5. Antenatal steroids and risk of bronchopulmonary dysplasia: a lack of effect or a case of over-adjustment?

    Science.gov (United States)

    Gagliardi, Luigi; Bellù, Roberto; Rusconi, Franca; Merazzi, Daniele; Mosca, Fabio

    2007-07-01

    Although antenatal steroids reduce risk factors for bronchopulmonary dysplasia (BPD) in preterm infants, their effect on BPD is conflicting. We hypothesised that the lack of protective effect found in some studies could derive from over-adjustment during analysis, caused by controlling for factors intermediate in the causal pathway between treatment and outcome. We prospectively studied a cohort of infants 23-32 weeks gestation steroids. In univariable analysis, steroids were not significantly protective against BPD; some intermediate factors (mechanical ventilation, greater severity of illness as measured by Clinical Risk Index for Babies score, patent ductus arteriosus) were significantly positively associated with (i.e. were risk factors for) BPD (OR = 11.0, 1.55, 4.42, respectively, all P steroids (OR = 0.58, 0.92, and 0.58, respectively, all P steroid-treated infants had a lower risk of BPD (OR 0.59 [95% CI 0.36, 0.97], P = 0.036); male sex (OR = 2.08), late-onset sepsis (OR = 4.26), and birthweight (OR = 0.63 for 100 g increase) were also associated with BPD, all P effect of steroids disappeared; ventilation (OR = 3.03), increased illness severity (OR = 1.11), and patent ductus arteriosus (OR = 1.90) were significant risk factors. This study suggests that including variables that are potential mediators in the causal chain can obscure the ability to detect a protective effect of treatment. We observed such a phenomenon in our analyses of the relationship between antenatal steroids and BPD, suggesting that steroid effect is partly mediated through a reduction in the classical risk factors.

  6. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan.

    Science.gov (United States)

    Chang, Hsien-Yen; Weiner, Jonathan P

    2010-01-18

    Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Given the widespread availability of claims data and the superior explanatory
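
A short sketch of the headline performance measure used above, adjusted R-squared for a predictive expenditure model. The observed and predicted values are toy numbers, not NHI claims data.

```python
def r_squared(y, y_hat):
    """Ordinary coefficient of determination."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1 - ss_res / ss_tot

def adjusted_r_squared(y, y_hat, n_predictors):
    """Penalize R^2 for the number of predictors in the model."""
    n = len(y)
    r2 = r_squared(y, y_hat)
    return 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)

y     = [100, 250, 80, 400, 170, 90]   # observed annual expenditure
y_hat = [120, 230, 90, 380, 160, 110]  # model predictions
print(round(adjusted_r_squared(y, y_hat, n_predictors=2), 3))
```

The penalty term is what makes the comparison between the demographic model and the richer ACG/EDC models fair despite their different numbers of predictors.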

  7. [Susceptibility to strategy of the drug component of the IPHCC+RxGroups classification system in a risk-adjusted morbidity compensation scheme--a conceptional and data-supported analysis].

    Science.gov (United States)

    Behrend, C; Felder, S; Busse, R

    2007-01-01

    A report commissioned by the German Ministry of Health recommends supplementing the existing scheme for calculating risk-adjusted transfers to sickness funds with the IPHCC+RxGroups method. The method is based on inpatient diagnoses and prescribed drugs as health status measures deduced from prior use. The present study investigates a sickness fund's expected net return from gaming the drug component of the risk adjuster. The study explores three possible strategies using the RxGroups method. For the simulations, insurees are assigned to additional indications or to higher-valued RxGroups within the same indication. Then, costs and financial benefits attributable to the altered drug use are estimated and compared with the status quo. The study uses 2000 and 2001 sample data of more than 370,000 insurees of Germany's company-based sickness funds system (BKK). While upgrading increases overall costs, it can be beneficial for individual sickness funds. Their net return crucially depends on the number of sickness funds gaming the system: the more funds participate in the game, the smaller the average net return. Moreover, not participating is often even worse, which in turn points to a prisoner's dilemma. When extending the risk adjustment scheme in social health insurance, the German legislator should take into account the perverse incentives of risk adjusters such as the described prescription drug model.

  8. Method for optimum determination of adjustable parameters in the boiling water reactor core simulator using operating data on flux distribution

    International Nuclear Information System (INIS)

    Kiguchi, T.; Kawai, T.

    1975-01-01

    A method has been developed to optimally and automatically determine the adjustable parameters of the boiling water reactor three-dimensional core simulator FLARE. The steepest gradient method is adopted for the optimization. The parameters are adjusted to best fit the operating data on power distribution measured by traversing in-core probes (TIP). The average error in the calculated TIP readings, normalized by the core average, is 0.053 at rated power. A k-infinity correction term has also been derived theoretically to reduce the relatively large error in the calculated TIP readings near the tips of control rods, which is induced by the coarseness of the mesh points. By introducing this correction, the average error decreases to 0.047. The void-quality relation is recognized as a function of coolant flow rate. The relation is estimated to fit the measured distributions of TIP readings at partial power states
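
A toy sketch of the fitting idea above: steepest (gradient) descent on the squared error between "measured" and "calculated" readings. The one-parameter model is an invented stand-in, not the FLARE simulator.

```python
def calculated(scale, positions):
    """Hypothetical one-parameter model of TIP readings along the core."""
    return [scale * p for p in positions]

def sq_error(scale, positions, measured):
    return sum((c - m) ** 2
               for c, m in zip(calculated(scale, positions), measured))

def steepest_descent(scale, positions, measured, lr=0.01, steps=200, h=1e-6):
    """Adjust the parameter down the numerical gradient of the error."""
    for _ in range(steps):
        grad = (sq_error(scale + h, positions, measured)
                - sq_error(scale - h, positions, measured)) / (2 * h)
        scale -= lr * grad
    return scale

positions = [0.2, 0.5, 0.8, 1.0]          # axial probe positions
measured = [0.41, 1.02, 1.58, 2.01]       # consistent with scale near 2.0
fit = steepest_descent(1.0, positions, measured)
print(round(fit, 2))
```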

  9. System and method of adjusting the equilibrium temperature of an inductively-heated susceptor

    Science.gov (United States)

    Matsen, Marc R; Negley, Mark A; Geren, William Preston

    2015-02-24

    A system for inductively heating a workpiece may include an induction coil, at least one susceptor face sheet, and a current controller. The induction coil may be configured to conduct an alternating current and generate a magnetic field in response to the alternating current. The susceptor face sheet may be configured to have a workpiece positioned therewith. The susceptor face sheet may be formed of a ferromagnetic alloy having a Curie temperature and being inductively heatable to an equilibrium temperature approaching the Curie temperature in response to the magnetic field. The current controller may be coupled to the induction coil and may be configured to adjust the alternating current in a manner causing a change in at least one heating parameter of the susceptor face sheet.

  10. Adjusting for treatment switching in randomised controlled trials - A simulation study and a simplified two-stage method.

    Science.gov (United States)

    Latimer, Nicholas R; Abrams, K R; Lambert, P C; Crowther, M J; Wailoo, A J; Morden, J P; Akehurst, R L; Campbell, M J

    2017-04-01

    Estimates of the overall survival benefit of new cancer treatments are often confounded by treatment switching in randomised controlled trials (RCTs) - whereby patients randomised to the control group are permitted to switch onto the experimental treatment upon disease progression. In health technology assessment, estimates of the unconfounded overall survival benefit associated with the new treatment are needed. Several switching adjustment methods have been advocated in the literature, some of which have been used in health technology assessment. However, it is unclear which methods are likely to produce least bias in realistic RCT-based scenarios. We simulated RCTs in which switching, associated with patient prognosis, was permitted. Treatment effect size and time dependency, switching proportions and disease severity were varied across scenarios. We assessed the performance of alternative adjustment methods based upon bias, coverage and mean squared error, related to the estimation of true restricted mean survival in the absence of switching in the control group. We found that when the treatment effect was not time-dependent, rank preserving structural failure time models (RPSFTM) and iterative parameter estimation methods produced low levels of bias. However, in the presence of a time-dependent treatment effect, these methods produced higher levels of bias, similar to those produced by an inverse probability of censoring weights method. The inverse probability of censoring weights and structural nested models produced high levels of bias when switching proportions exceeded 85%. A simplified two-stage Weibull method produced low bias across all scenarios and, provided the treatment switching mechanism is suitable, represents an appropriate adjustment method.
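
A sketch of the counterfactual-survival idea behind the RPSFTM family of switching adjustments mentioned above, not the paper's simplified two-stage Weibull method itself. A control patient who switched has observed survival split at the switch point; the time spent on the experimental treatment is shrunk by an acceleration factor to recover the survival time that would have been seen without switching. The numbers and the value of psi are illustrative.

```python
import math

def counterfactual_time(time_pre_switch, time_post_switch, psi):
    """RPSFTM-style counterfactual: exp(psi) < 1 means the treatment
    extended life, so the treated period is shrunk accordingly."""
    return time_pre_switch + time_post_switch * math.exp(psi)

# Hypothetical switcher: 6 months on control before switching, 12 months
# after, with psi = -0.405 (acceleration factor of about 0.667).
u = counterfactual_time(6.0, 12.0, -0.405)
print(round(u, 1))
```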

  11. Development and Testing of a Decision Making Based Method to Adjust Automatically the Harrowing Intensity

    Science.gov (United States)

    Rueda-Ayala, Victor; Weis, Martin; Keller, Martina; Andújar, Dionisio; Gerhards, Roland

    2013-01-01

    Harrowing is often used to reduce weed competition, generally using a constant intensity across a whole field. The efficacy of weed harrowing in wheat and barley can be optimized, if site-specific conditions of soil, weed infestation and crop growth stage are taken into account. This study aimed to develop and test an algorithm to automatically adjust the harrowing intensity by varying the tine angle and number of passes. The field variability of crop leaf cover, weed density and soil density was acquired with geo-referenced sensors to investigate the harrowing selectivity and crop recovery. Crop leaf cover and weed density were assessed using bispectral cameras through differential images analysis. The draught force of the soil opposite to the direction of travel was measured with an electronic load cell sensor connected to a rigid tine mounted in front of the harrow. Optimal harrowing intensity levels were derived in previously implemented experiments, based on the weed control efficacy and yield gain. The assessments of crop leaf cover, weed density and soil density were combined via rules with the aforementioned optimal intensities, in a linguistic fuzzy inference system (LFIS). The system was evaluated in two field experiments that compared constant intensities with variable intensities inferred by the system. A higher weed density reduction could be achieved when the harrowing intensity was not kept constant along the cultivated plot. Varying the intensity tended to reduce the crop leaf cover, though slightly improving crop yield. A real-time intensity adjustment with this system is achievable, if the cameras are attached in the front and at the rear or sides of the harrow. PMID:23669712
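
A minimal sketch of the rule-based inference described above: mapping weed density and crop leaf cover to a harrowing intensity level. The membership functions, rule weights and the 1-5 intensity scale are invented for illustration; the paper's LFIS is considerably richer.

```python
def membership_high(value, low, high):
    """Linear fuzzy membership in [0, 1] for 'high' between low and high."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def harrowing_intensity(weed_density, crop_cover):
    # Rule 1: high weed density pushes intensity up.
    aggressive = membership_high(weed_density, 20, 200)   # plants/m^2
    # Rule 2: high crop leaf cover (sensitive canopy) pulls intensity down.
    gentle = membership_high(crop_cover, 10, 60)          # percent
    # Weighted blend on a hypothetical 1 (gentle) .. 5 (aggressive) scale.
    score = 1 + 4 * aggressive * (1 - 0.5 * gentle)
    return round(score, 1)

print(harrowing_intensity(weed_density=150, crop_cover=20))
```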

  12. Development and Testing of a Decision Making Based Method to Adjust Automatically the Harrowing Intensity

    Directory of Open Access Journals (Sweden)

    Roland Gerhards

    2013-05-01

    Full Text Available Harrowing is often used to reduce weed competition, generally using a constant intensity across a whole field. The efficacy of weed harrowing in wheat and barley can be optimized, if site-specific conditions of soil, weed infestation and crop growth stage are taken into account. This study aimed to develop and test an algorithm to automatically adjust the harrowing intensity by varying the tine angle and number of passes. The field variability of crop leaf cover, weed density and soil density was acquired with geo-referenced sensors to investigate the harrowing selectivity and crop recovery. Crop leaf cover and weed density were assessed using bispectral cameras through differential images analysis. The draught force of the soil opposite to the direction of travel was measured with an electronic load cell sensor connected to a rigid tine mounted in front of the harrow. Optimal harrowing intensity levels were derived in previously implemented experiments, based on the weed control efficacy and yield gain. The assessments of crop leaf cover, weed density and soil density were combined via rules with the aforementioned optimal intensities, in a linguistic fuzzy inference system (LFIS). The system was evaluated in two field experiments that compared constant intensities with variable intensities inferred by the system. A higher weed density reduction could be achieved when the harrowing intensity was not kept constant along the cultivated plot. Varying the intensity tended to reduce the crop leaf cover, though slightly improving crop yield. A real-time intensity adjustment with this system is achievable, if the cameras are attached in the front and at the rear or sides of the harrow.

  13. Risk-Informed SSCs Categorization: Elicitation Method of Expert's Opinion

    International Nuclear Information System (INIS)

    Hwang, Mee Jeong; Yang, Joon Eon; Kim, Kil Yoo

    2005-01-01

    The regulation of nuclear power plants has been performed in a deterministic way since they began operating. However, some SSCs identified as safety-significant by the deterministic approach turned out to be of low or no safety significance, and some SSCs identified as non-safety-significant turned out to be highly safety-significant, according to the results of PSA. Considering these risk insights, Regulatory Guide 1.174 and 10CFR50.69 were drawn up, and SSCs can now be re-categorized according to their safety significance. Therefore, study of and interest in risk-informed SSC re-categorization and treatment have continued. The objective of this regulatory initiative is to adjust the scope of equipment subject to special regulatory treatment to better focus licensee and regulatory attention and resources on equipment that has safety significance. Most current regulations define the plant equipment necessary to meet the deterministic regulatory basis as 'safety-related.' This equipment is subject to special treatment regulations. Other plant equipment is categorized as 'non-safety related' and is exempt from a select number of special treatment requirements, or subject only to a subset of those requirements. However, risk information is not a magic tool for making a decision but a supporting tool for categorizing SSCs, because only small parts of a plant are modeled in the PSA model. Thus, engineering and deterministic judgments are also used for risk-informed SSC categorization, and expert opinion elicitation is very important for it. Therefore, we need a rational method to elicit experts' opinions, and in this study we developed a systematic method of expert elicitation to categorize nuclear power plant SSCs. The current state of SSC categorization in the USA and the existing methods for expert elicitation were surveyed, and a more systematic way of eliciting and combining expert opinions was developed. To validate the developed method

  14. Drive Beam Quadrupoles for the CLIC Project: a Novel Method of Fiducialisation and a New Micrometric Adjustment System

    CERN Document Server

    AUTHOR|(SzGeCERN)411678; Duquenne, Mathieu; Sandomierski, Jacek; Sosin, Mateusz; Rude, Vivien

    2014-01-01

    This paper presents a new method of fiducialisation applied to determine the magnetic axis of the Drive Beam quadrupole of the CLIC project with respect to external alignment fiducials, with micrometric accuracy and precision. It also introduces a new micrometric adjustment system along 5 degrees of freedom, developed for the same Drive Beam quadrupole. The combination of both developments opens very interesting perspectives for a simpler and more accurate alignment of the quadrupoles.

  15. Phenotypic plasticity in anti-intraguild predator strategies: mite larvae adjust their behaviours according to vulnerability and predation risk.

    Science.gov (United States)

    Walzer, Andreas; Schausberger, Peter

    2013-05-01

    Interspecific threat-sensitivity allows prey to maximize the net benefit of antipredator strategies by adjusting the type and intensity of their response to the level of predation risk. This is well documented for classical prey-predator interactions but less so for intraguild predation (IGP). We examined threat-sensitivity in the antipredator behaviour of larvae in a predatory mite guild sharing spider mites as prey. The guild consisted of the highly vulnerable intraguild (IG) prey and weak IG predator Phytoseiulus persimilis, the moderately vulnerable IG prey and moderate IG predator Neoseiulus californicus and the minimally vulnerable IG prey and strong IG predator Amblyseius andersoni. We videotaped the behaviour of IG prey larvae of the three species in the presence of either a low-risk or a high-risk IG predator female, or in the absence of predators, and analysed time, distance, path shape and interaction parameters of predators and prey. The least vulnerable IG prey A. andersoni was insensitive to differing IGP risks, but the moderately vulnerable IG prey N. californicus and the highly vulnerable IG prey P. persimilis responded in a threat-sensitive manner. Predator presence triggered threat-sensitive behavioural changes in one out of ten measured traits in N. californicus larvae but in four traits in P. persimilis larvae. Low-risk IG predator presence induced a typical escape response in P. persimilis larvae, whereas they reduced their activity in the presence of the high-risk IG predator. We argue that interspecific threat-sensitivity may promote co-existence of IG predators and IG prey and should be common in predator guilds with a long co-evolutionary history.

  16. Extra-Margins in ACM's Adjusted NMa 'Mortgage-Rate-Calculation Method'

    NARCIS (Netherlands)

    Dijkstra, M.; Schinkel, M.P.

    2013-01-01

    We analyse the development since 2004 of our concept of extra-margins on Dutch mortgages (Dijkstra & Schinkel, 2012), based on funding cost estimations in ACM (2013), which are an update of those in NMa (2011). Neither costs related to increased mortgage-specific risks, nor the inclusion of Basel

  17. Mandatory pooling as a supplement to risk-adjusted capitation payments in a competitive health insurance market.

    Science.gov (United States)

    Van Barneveld, E M; Lamers, L M; van Vliet, R C; van de Ven, W P

    1998-07-01

    Risk-adjusted capitation payments (RACPs) to competing health insurers are an essential element of market-oriented health care reforms in many countries. RACPs based on demographic variables only are insufficient, because they leave ample room for cream skimming. However, the implementation of improved RACPs does not appear to be straightforward. A solution might be to supplement imperfect RACPs with a form of mandatory pooling that reduces the incentives for cream skimming. In a previous paper it was concluded that high-risk pooling (HRP) is a promising supplement to RACPs. The purpose of this paper is to compare HRP with two other main variants of mandatory pooling. These variants are called excess-of-loss (EOL) and proportional pooling (PP). Each variant includes ex post compensations to insurers for some members, which depend to various degrees on actually incurred costs. Therefore, these pooling variants reduce the incentives for cream skimming which are inherent in imperfect RACPs, but they also reduce the incentives for efficiency and cost containment. As a rough measure of the latter incentives we use the percentage of total costs for which an insurer is at risk. This paper analyzes which of the three main pooling variants yields the greatest reduction of incentives for cream skimming given such a percentage. The results show that HRP is the most effective of the three pooling variants.
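
A sketch of the three mandatory-pooling variants compared above, computing the share of total costs for which an insurer remains at risk. The member costs, threshold and pooling fraction are toy values.

```python
def retained_high_risk_pooling(costs, pooled_members):
    """HRP: costs of ex ante designated members are pooled entirely."""
    return sum(c for i, c in enumerate(costs) if i not in pooled_members)

def retained_excess_of_loss(costs, threshold):
    """EOL: for every member, costs above the threshold are pooled."""
    return sum(min(c, threshold) for c in costs)

def retained_proportional(costs, pooled_members, fraction):
    """PP: a fixed fraction of designated members' costs is pooled."""
    return sum(c * (1 - fraction) if i in pooled_members else c
               for i, c in enumerate(costs))

costs = [200, 500, 1500, 12000]   # annual costs of four members
at_risk_hrp = retained_high_risk_pooling(costs, pooled_members={3})
at_risk_eol = retained_excess_of_loss(costs, threshold=5000)
at_risk_pp = retained_proportional(costs, pooled_members={3}, fraction=0.8)
total = sum(costs)
print(round(at_risk_hrp / total, 2),   # share of costs the insurer bears
      round(at_risk_eol / total, 2),
      round(at_risk_pp / total, 2))
```

The printed shares are the paper's rough measure of the remaining incentive for efficiency under each variant.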

  18. Introducing conjoint analysis method into delayed lotteries studies: its validity and time stability are higher than in adjusting.

    Science.gov (United States)

    Białek, Michał; Markiewicz, Łukasz; Sawicki, Przemysław

    2015-01-01

    Delayed lotteries are much more common in everyday life than pure lotteries. Usually, we need to wait to find out the outcome of a risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied time discounting and probability discounting in isolation, using methodologies designed specifically to track changes in one parameter. The most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce a novel method for analyzing delayed lotteries, conjoint analysis, which is hypothetically more suitable for analyzing individual preferences in this area. A set of two studies compared conjoint analysis with adjusting. The results suggest that individual parameters of discounting strength estimated with conjoint analysis have higher predictive value (Studies 1 and 2) and are more stable over time (Study 2) compared to adjusting. We discuss these findings, despite the exploratory character of the reported studies, by suggesting that future research on delayed lotteries should be cross-validated using both methods.
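
A sketch of the quantity both methods try to estimate: the subjective value of a delayed lottery under hyperbolic delay and probability discounting. The functional forms are standard in the discounting literature, and the parameter values are illustrative, not taken from this study.

```python
def subjective_value(amount, delay_days, p_win, k_delay, h_odds):
    """Combine hyperbolic delay discounting with probability discounting
    expressed over odds against winning."""
    odds_against = (1 - p_win) / p_win
    delay_factor = 1 / (1 + k_delay * delay_days)
    risk_factor = 1 / (1 + h_odds * odds_against)
    return amount * delay_factor * risk_factor

# 100 units, paid in 30 days with a 50% chance of winning.
v = subjective_value(100, delay_days=30, p_win=0.5, k_delay=0.01, h_odds=1.0)
print(round(v, 1))
```

Adjusting titrates one of these parameters at a time; conjoint analysis instead infers them jointly from choices among multi-attribute options.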

  19. Introducing conjoint analysis method into delayed lotteries studies: Its validity and time stability are higher than in adjusting

    Directory of Open Access Journals (Sweden)

    Michal eBialek

    2015-01-01

    Full Text Available The delayed lotteries are much more common in everyday life than are pure lotteries. Usually, we need to wait to find out the outcome of the risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied the time discounting and probability discounting in isolation using the methodologies designed specifically to track changes in one parameter. Most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce the novel method for analyzing delayed lotteries - conjoint analysis - which hypothetically is more suitable for analyzing individual preferences in this area. A set of two studies compared the conjoint analysis with adjusting. The results suggest that individual parameters of discounting strength estimated with conjoint have higher predictive value (Study 1 & 2), and they are more stable over time (Study 2) compared to adjusting. We discuss these findings, despite the exploratory character of reported studies, by suggesting that future research on delayed lotteries should be cross-validated using both methods.

  20. Risk and protection factors in the peer context: how do other children contribute to the psychosocial adjustment of the adolescent?

    Directory of Open Access Journals (Sweden)

    Marie-Hélène Véronneau

    2014-03-01

    Full Text Available As children become adolescents, peers assume greater importance in their lives. Peer experiences can either help them thrive or negatively affect their psychosocial adjustment. In this review article, definitions for the types of peer experiences are provided, followed by an overview of common psychosocial issues encountered by adolescents. Past research pointing to risk and protection factors that emerge from peer experiences during adolescence is then discussed, along with the role of peer influences in current issues relevant to adolescent education. Research suggests that friendships with deviant peers, involvement in bullying, and the experience of rejection from the overall peer group are related to adjustment problems, whereas friendships with prosocial and academically oriented peers and social acceptance in the peer group are related to healthy development. Friendship quality, popularity among peers, and involvement in friendship cliques cannot be clearly categorized as either positive or negative influences, because they interact with other factors in shaping the development of adolescents. The promotion of social skills and positive youth leadership as an integral part of the student's learning process in school is recommended.

  1. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.
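
A toy Monte Carlo sketch of the parameter-uncertainty idea above: propagate uncertain inputs through a simple ignition criterion to estimate an ignition probability. The point-source flux model, the input distributions and the critical flux are invented stand-ins, not the deterministic thermal models used in the work.

```python
import math
import random

def ignition_probability(n_trials=100_000, seed=42):
    rng = random.Random(seed)
    ignitions = 0
    for _ in range(n_trials):
        # Uncertain inputs: heater output (kW) and separation distance (m).
        power = rng.gauss(1.5, 0.3)
        distance = rng.uniform(0.2, 1.0)
        # Crude point-source radiant flux (kW/m^2) at the furniture surface.
        flux = power / (4 * math.pi * distance ** 2)
        ignitions += flux > 0.9   # hypothetical critical flux for ignition
    return ignitions / n_trials

print(ignition_probability())
```

Repeating this for each scenario branch yields the distribution over ignition frequency described in the abstract.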

  2. Proximal Alternating Direction Method with Relaxed Proximal Parameters for the Least Squares Covariance Adjustment Problem

    Directory of Open Access Journals (Sweden)

    Minghua Xu

    2014-01-01

    Full Text Available We consider the problem of seeking a symmetric positive semidefinite matrix in a closed convex set to approximate a given matrix. This problem may arise in several areas of numerical linear algebra or come from finance industry or statistics and thus has many applications. For solving this class of matrix optimization problems, many methods have been proposed in the literature. The proximal alternating direction method is one of those methods which can be easily applied to solve these matrix optimization problems. Generally, the proximal parameters of the proximal alternating direction method are greater than zero. In this paper, we conclude that the restriction on the proximal parameters can be relaxed for solving this kind of matrix optimization problems. Numerical experiments also show that the proximal alternating direction method with the relaxed proximal parameters is convergent and generally has a better performance than the classical proximal alternating direction method.
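
A minimal sketch of the core projection used inside such methods: the nearest (in Frobenius norm) positive semidefinite approximation of a symmetric 2x2 matrix, obtained by clipping negative eigenvalues to zero. The full proximal alternating direction method wraps this step with additional constraint sets and proximal terms; the closed-form 2x2 eigendecomposition below is just to keep the example dependency-free.

```python
import math

def nearest_psd_2x2(a, b, c):
    """Project the symmetric matrix [[a, b], [b, c]] onto the PSD cone."""
    mean = (a + c) / 2
    radius = math.hypot((a - c) / 2, b)
    eig_lo, eig_hi = mean - radius, mean + radius
    lo, hi = max(eig_lo, 0.0), max(eig_hi, 0.0)   # clip negative eigenvalues
    if radius == 0:
        return [[lo, 0.0], [0.0, lo]]
    # Unit eigenvector for eig_hi: (b, eig_hi - a), unless already diagonal.
    vx, vy = b, eig_hi - a
    norm = math.hypot(vx, vy)
    if norm == 0:
        return [[max(a, 0.0), 0.0], [0.0, max(c, 0.0)]]
    vx, vy = vx / norm, vy / norm
    off = (hi - lo) * vx * vy
    return [[hi * vx * vx + lo * vy * vy, off],
            [off, hi * vy * vy + lo * vx * vx]]

# [[1, 2], [2, 1]] has eigenvalues 3 and -1; the projection keeps only
# the positive one, giving [[1.5, 1.5], [1.5, 1.5]].
print(nearest_psd_2x2(1.0, 2.0, 1.0))
```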

  3. A risk-adjusted financial model to estimate the cost of a video-assisted thoracoscopic surgery lobectomy programme.

    Science.gov (United States)

    Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas

    2016-05-01

    To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using a multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in Euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative (including surgical and anaesthetic time, overhead, disposable materials) and postoperative costs [including ward stay, high dependency unit (HDU) or intensive care unit (ICU) and variable costs associated with management of complications] were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total costs after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) 0.05) in 86% of the samples. A hypothetical patient with COPD and DLCO less than 60% would cost €4270 more than a patient without COPD and with higher DLCO values (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
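
A sketch of the bootstrap idea used in the analysis above: resample patient costs to obtain a confidence interval for the mean cost per procedure. The cost values are invented and far fewer than the study's 236 patients.

```python
import random

def bootstrap_mean_ci(costs, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(costs, k=len(costs))) / len(costs)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical per-patient total costs in euros; the long right tail mimics
# the expensive complicated cases described in the abstract.
costs = [9000, 10500, 11000, 11500, 12000, 14000, 18000, 26000]
lo, hi = bootstrap_mean_ci(costs)
print(round(lo), round(hi))
```

In the paper, the same resampling is applied to the regression coefficients rather than to a raw mean, which is how the reliably associated predictors were selected.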

  4. The Adjusted Net Asset Valuation Method – Connecting the dots between Theory and Practice

    Directory of Open Access Journals (Sweden)

    Silvia Ghiță-Mitrescu

    2016-01-01

    The goal of this paper is to present the theoretical background of this method as well as its practical application. We will first analyze the main theoretical issues regarding the corrections that need to be performed in order to transform the book value of assets and liabilities to their market value, afterwards proceeding to an example on how this method is applied to the balance sheet of a company. Finally, we will conclude on the importance of the method for a company's evaluation process.
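
A minimal sketch of the adjusted net asset method outlined above: restate the book values of assets and liabilities at market value, then take the difference. The line items and correction amounts are invented for illustration.

```python
def adjusted_net_asset_value(assets_book, liabilities_book, adjustments):
    """adjustments maps a line item to its market-minus-book correction."""
    assets_market = sum(assets_book.values()) + sum(
        adj for item, adj in adjustments.items() if item in assets_book)
    liabilities_market = sum(liabilities_book.values()) + sum(
        adj for item, adj in adjustments.items() if item in liabilities_book)
    return assets_market - liabilities_market

assets = {"real_estate": 500, "equipment": 200, "receivables": 100}
liabilities = {"bank_loans": 300, "provisions": 50}
# Real estate appreciated, some receivables are uncollectible, and a
# litigation risk raises provisions.
adjustments = {"real_estate": +150, "receivables": -20, "provisions": +30}

print(adjusted_net_asset_value(assets, liabilities, adjustments))  # 550
```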

  5. Rationale of a quick adjustment method for crystal orientation in oscillation photography

    International Nuclear Information System (INIS)

    Suh, I.H.; Suh, J.M.; Ko, T.S.

    1988-01-01

    The rationale for a convenient crystal orientation method for oscillation photography is presented. The method involves the measurement of the deviations of reflection spots from the equator. These deviations are added or subtracted to give the horizontal and vertical arc corrections. (orig.)

  6. Shaft adjuster

    Science.gov (United States)

    Harry, Herbert H.

    1989-01-01

    Apparatus and method for the adjustment and alignment of shafts in high power devices. A plurality of adjacent rotatable angled cylinders are positioned between a base and the shaft to be aligned which when rotated introduce an axial offset. The apparatus is electrically conductive and constructed of a structurally rigid material. The angled cylinders allow the shaft such as the center conductor in a pulse line machine to be offset in any desired alignment position within the range of the apparatus.

  7. An Algebraic Method of Synchronous Pulsewidth Modulation for Converters for Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Oleschuk, Valentin; Blaabjerg, Frede

    2002-01-01

    This paper describes the basic peculiarities of a new method of feedforward synchronous pulsewidth modulation (PWM) of the output voltage of converters, based on a one-stage closed-form strategy of PWM with purely algebraic control dependencies. It is applied to voltage source inverters with a continuous scheme of conventional voltage space vector modulation and with two basic variants of symmetrical discontinuous PWM. Simulations illustrate the behaviour of the proposed method and show the advantage of algebraic synchronous PWM over typical asynchronous PWM at low frequency-modulation indices.

  8. Adjustment of a rapid method for quantification of Fusarium spp. spore suspensions in plant pathology.

    Science.gov (United States)

    Caligiore-Gei, Pablo F; Valdez, Jorge G

    2015-01-01

    The use of a Neubauer chamber is a broadly employed method when cell suspensions need to be quantified. However, this technique may take a long time and requires trained personnel. Spectrophotometry has proved to be a rapid, simple and accurate method to estimate the concentration of spore suspensions of isolates of the genus Fusarium. In this work we present a linear formula to relate absorbance measurements at 530 nm with the number of microconidia/ml in a suspension. Copyright © 2014 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.
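
    The kind of linear calibration the abstract describes can be sketched as an ordinary least-squares fit; the absorbance/count pairs below are invented, not the published formula.

```python
# Hypothetical calibration pairs: absorbance at 530 nm vs counted microconidia/ml
data = [(0.10, 1.2e6), (0.20, 2.3e6), (0.30, 3.5e6), (0.40, 4.6e6), (0.50, 5.8e6)]

n = len(data)
sx = sum(a for a, _ in data)
sy = sum(c for _, c in data)
sxx = sum(a * a for a, _ in data)
sxy = sum(a * c for a, c in data)

# Closed-form least squares: concentration = slope * A530 + intercept
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def spores_per_ml(absorbance):
    """Estimate suspension concentration from an absorbance reading."""
    return slope * absorbance + intercept

print(f"{spores_per_ml(0.25):.2e} microconidia/ml")
```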

  9. Burden of typhoid fever in low-income and middle-income countries: a systematic, literature-based update with risk-factor adjustment.

    Science.gov (United States)

    Mogasale, Vittal; Maskery, Brian; Ochiai, R Leon; Lee, Jung Seok; Mogasale, Vijayalaxmi V; Ramani, Enusa; Kim, Young Eun; Park, Jin Kyung; Wierzba, Thomas F

    2014-10-01

    No access to safe water is an important risk factor for typhoid fever, yet risk-level heterogeneity is unaccounted for in previous global burden estimates. Since WHO has recommended risk-based use of typhoid polysaccharide vaccine, we revisited the burden of typhoid fever in low-income and middle-income countries (LMICs) after adjusting for water-related risk. We estimated the typhoid disease burden from studies done in LMICs based on blood-culture-confirmed incidence rates applied to the 2010 population, after correcting for operational issues related to surveillance, limitations of diagnostic tests, and water-related risk. We derived incidence estimates, correction factors, and mortality estimates from systematic literature reviews. We did scenario analyses for risk factors, diagnostic sensitivity, and case fatality rates, accounting for the uncertainty in these estimates, and compared them with previous disease burden estimates. The estimated number of typhoid fever cases in LMICs in 2010 after adjusting for water-related risk was 11·9 million (95% CI 9·9-14·7) cases with 129 000 (75 000-208 000) deaths. By comparison, the estimated risk-unadjusted burden was 20·6 million (17·5-24·2) cases and 223 000 (131 000-344 000) deaths. Scenario analyses indicated that the risk-factor adjustment and updated diagnostic test correction factor derived from systematic literature reviews were the drivers of differences between the current estimate and past estimates. The risk-adjusted typhoid fever burden estimate was more conservative than previous estimates. However, by distinguishing the risk differences, it will allow assessment of the effect at the population level and will facilitate cost-effectiveness calculations for risk-based vaccination strategies for future typhoid conjugate vaccines. Copyright © 2014 Mogasale et al. Open Access article distributed under the terms of CC BY-NC-SA.
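
    The burden arithmetic the abstract describes, crude incidence scaled by correction factors and restricted to the at-risk population, can be sketched with made-up inputs; the published estimates use country-specific incidence and literature-derived correction factors.

```python
def adjusted_burden(incidence_per_100k, population, surveillance_cf,
                    diagnostic_cf, at_risk_fraction):
    """Crude blood-culture-confirmed incidence scaled by correction factors
    for surveillance gaps and test sensitivity, then restricted to the
    population share with water-related risk (all numbers hypothetical)."""
    crude_cases = incidence_per_100k / 1e5 * population
    return crude_cases * surveillance_cf * diagnostic_cf * at_risk_fraction

# e.g. 450/100k crude incidence, 50 million people, correction factors of
# 1.2 (surveillance) and 1.5 (diagnostics), 60% of the population at risk
print(f"{adjusted_burden(450, 50e6, 1.2, 1.5, 0.6):,.0f} adjusted cases")
```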

  10. The Adaptation of Ways and Methods of Risk Minimization in Local Payment Systems in Public Transport

    Directory of Open Access Journals (Sweden)

    Avdaev Mausar Yushaevich

    2014-12-01

    Full Text Available The problems of risk management gain special relevance in the conditions of payment systems development in public passenger transport in Russia. The risk carriers as well as the sources of their occurrence are revealed; the characteristics of private risks of individual participants in the system of public passenger transport are presented. The directions of risk management in relation to the payment system in public transport are reasoned and structured. It is proved that the choice of specific ways to minimize the risks in local payment systems in public transport is conditioned by the following factors – the nature of the payment system integration in public transport areas, the temporary nature of risk components effect due to the improvement of organizational, economic and technological factors, the change of the stages of payment systems development, the evaluation of risks effects. The article reasons the possibility of using and adjusting traditional ways (risk evasion, risk compensation, decrease in risk level, risk transfer, distribution of risk between participants and the methods of risk management in the payment systems in public transport according to the stages of their development and functioning for the processing center, passenger motor transport organizations, financial center and passengers (payers. The authors justify the directions of integrating the local payment systems of public transport in the national payment system, taking into account the risks involved in the activity of its members.

  11. MANGO – Modal Analysis for Grid Operation: A Method for Damping Improvement through Operating Point Adjustment

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhenyu; Zhou, Ning; Tuffner, Francis K.; Chen, Yousu; Trudnowski, Daniel J.; Diao, Ruisheng; Fuller, Jason C.; Mittelstadt, William A.; Hauer, John F.; Dagle, Jeffery E.

    2010-10-18

    Small signal stability problems are one of the major threats to grid stability and reliability in the U.S. power grid. An undamped mode can cause large-amplitude oscillations and may result in system breakups and large-scale blackouts. There have been several incidents of system-wide oscillations. Of those incidents, the most notable is the August 10, 1996 western system breakup, a result of undamped system-wide oscillations. Significant efforts have been devoted to monitoring system oscillatory behaviors from measurements in the past 20 years. The deployment of phasor measurement units (PMU) provides high-precision, time-synchronized data needed for detecting oscillation modes. Measurement-based modal analysis, also known as ModeMeter, uses real-time phasor measurements to identify system oscillation modes and their damping. Low damping indicates potential system stability issues. Modal analysis has been demonstrated with phasor measurements to have the capability of estimating system modes from both oscillation signals and ambient data. With more and more phasor measurements available and ModeMeter techniques maturing, there is yet a need for methods to bring modal analysis from monitoring to actions. The methods should be able to associate low damping with grid operating conditions, so operators or automated operation schemes can respond when low damping is observed. The work presented in this report aims to develop such a method and establish a Modal Analysis for Grid Operation (MANGO) procedure to aid grid operation decision making to increase inter-area modal damping. The procedure can provide operation suggestions (such as increasing generation or decreasing load) for mitigating inter-area oscillations.
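
    The damping that the MANGO procedure tries to increase is usually summarized by the modal damping ratio of an identified eigenvalue. A minimal sketch, with a hypothetical inter-area mode; the 5% threshold is a common rule of thumb, not a figure from this report:

```python
import math

def damping_ratio(sigma, omega):
    """Damping ratio of an oscillatory mode with eigenvalue s = sigma + j*omega.
    Negative sigma (a decaying oscillation) gives a positive ratio."""
    return -sigma / math.sqrt(sigma ** 2 + omega ** 2)

# A hypothetical 0.48 Hz inter-area mode identified from PMU data
sigma, omega = -0.3, 2 * math.pi * 0.48
zeta = damping_ratio(sigma, omega)
print(f"mode damping ratio: {zeta:.3f}")  # below ~0.05 would warrant action
```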

  12. FEM-based Printhead Intelligent Adjusting Method for Printing Conduct Material

    Directory of Open Access Journals (Sweden)

    Liang Xiaodan

    2017-01-01

    Full Text Available Ink-jet printing of circuit boards has advantages such as non-contact manufacturing, high manufacturing accuracy and low pollution. In order to improve printing precision, finite element technology is adopted to model the piezoelectric print heads, and a new bacteria foraging algorithm with a lifecycle strategy is proposed to optimize the parameters of the driving waveforms to obtain the desired droplet characteristics. Numerical simulation results show that the algorithm performs well. Additionally, the droplet jetting simulation results and measured results confirm that the method precisely achieves the desired droplet characteristics.

  13. Analysis of Longitudinal Studies With Repeated Outcome Measures: Adjusting for Time-Dependent Confounding Using Conventional Methods.

    Science.gov (United States)

    Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn

    2018-05-01

    Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders, which may include past outcomes, affected by prior exposure, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
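
    A toy simulation of the core idea (not the paper's SCMM/GEE machinery): when the exposure depends on the prior outcome, a naive regression of outcome on exposure is biased, while conditioning on the prior outcome recovers the effect. All coefficients here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# One longitudinal step: exposure x depends on the prior outcome
# (time-dependent confounding); true exposure effect on y is 0.7.
y_prev = rng.normal(size=n)
x = 0.5 * y_prev + rng.normal(size=n)
y = 1.0 + 0.7 * x + 0.4 * y_prev + rng.normal(scale=0.1, size=n)

# Naive model ignoring the prior outcome -> biased exposure coefficient.
b_naive, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)

# Sequential conditional mean model: also condition on the prior outcome.
b_scmm, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), x, y_prev]), y,
                             rcond=None)

print(f"naive effect {b_naive[1]:.2f}, adjusted effect {b_scmm[1]:.2f}")
```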

  14. Study of spectral response of a neutron filter. Design of a method to adjust spectra

    International Nuclear Information System (INIS)

    Colomb-Dolci, F.

    1999-02-01

    The first part of this thesis describes an experimental method intended to determine a neutron spectrum in the epithermal range [1 eV-10 keV]. Based on measurements of reaction rates provided by activation foils, it gives the flux level in each energy range corresponding to each probe. This method can be used in any reactor location or in a neutron beam. It can determine spectra over eight energy groups, five of which lie in the epithermal range. The second part of this thesis presents the design study of an epithermal neutron beam, in the frame of Neutron Capture Therapy. A beam tube was specially built to test filters made of different materials. Its geometry was designed to favour epithermal neutron crossing and to cut thermal and fast neutrons. A code scheme was validated to simulate the device response with a Monte Carlo code. Measurements were made at the ISIS reactor and experimental spectra were compared to calculated ones. This validated code scheme was used to simulate different materials usable as shields in the tube. A study of these shields is presented at the end of this thesis. (author)

  15. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Limitation of risk: Protective methods. 223.11 Section 223.11 Money and Finance: Treasury Regulations Relating to Money and Finance... BUSINESS WITH THE UNITED STATES § 223.11 Limitation of risk: Protective methods. The limitation of risk...

  16. Odds per adjusted standard deviation: comparing strengths of associations for risk factors measured on different scales and across diseases and populations.

    Science.gov (United States)

    Hopper, John L

    2015-11-15

    How can the "strengths" of risk factors, in the sense of how well they discriminate cases from controls, be compared when they are measured on different scales such as continuous, binary, and integer? Risk estimates take into account other fitted and design-related factors, and that is how risk gradients are interpreted, so the presentation of risk gradients should do the same. Therefore, for each risk factor X0, I propose using appropriate regression techniques to derive from appropriate population data the best-fitting relationship between the mean of X0 and all the other covariates fitted in the model or adjusted for by design (X1, X2, … , Xn). The odds per adjusted standard deviation (OPERA) presents the risk association for X0 in terms of the change in risk per s = standard deviation of X0 adjusted for X1, X2, … , Xn, rather than the unadjusted standard deviation of X0 itself. If the increased risk is relative risk (RR)-fold over A adjusted standard deviations, then OPERA = exp[ln(RR)/A] = RR^(1/A). This unifying approach is illustrated by considering breast cancer and published risk estimates. OPERA estimates are by definition independent and can be used to compare the predictive strengths of risk factors across diseases and populations. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
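
    The OPERA definition is a one-line computation; the function below only illustrates the published formula, and the example values are invented.

```python
import math

def opera(rr, a_sds):
    """OPERA: the per-adjusted-SD odds ratio for a risk factor whose risk
    increases rr-fold across a_sds adjusted standard deviations:
    exp(ln(rr) / a_sds), i.e. rr ** (1 / a_sds)."""
    return math.exp(math.log(rr) / a_sds)

# A 4-fold risk spread over 2 adjusted SDs is equivalent to OPERA = 2 per SD
print(opera(4.0, 2.0))
```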

  17. Normalized impact factor (NIF): an adjusted method for calculating the citation rate of biomedical journals.

    Science.gov (United States)

    Owlia, P; Vasei, M; Goliaei, B; Nassiri, I

    2011-04-01

    The interests in journal impact factor (JIF) in scientific communities have grown over the last decades. The JIFs are used to evaluate journals quality and the papers published therein. JIF is a discipline specific measure and the comparison between the JIF dedicated to different disciplines is inadequate, unless a normalization process is performed. In this study, normalized impact factor (NIF) was introduced as a relatively simple method enabling the JIFs to be used when evaluating the quality of journals and research works in different disciplines. The NIF index was established based on the multiplication of JIF by a constant factor. The constants were calculated for all 54 disciplines of biomedical field during 2005, 2006, 2007, 2008 and 2009 years. Also, ranking of 393 journals in different biomedical disciplines according to the NIF and JIF were compared to illustrate how the NIF index can be used for the evaluation of publications in different disciplines. The findings prove that the use of the NIF enhances the equality in assessing the quality of research works produced by researchers who work in different disciplines. Copyright © 2010 Elsevier Inc. All rights reserved.
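
    A minimal sketch of discipline normalization. The choice of constant here, mapping each discipline's mean JIF onto a common reference value, is our assumption for illustration, not the paper's published constants, and the JIF values are invented.

```python
# Hypothetical journal impact factors (JIFs) in two disciplines.
disciplines = {
    "microbiology": [5.1, 3.2, 2.8, 1.9],
    "biomedical engineering": [3.0, 2.2, 1.4, 1.1],
}

# One multiplicative constant per discipline (NIF = JIF * constant); chosen
# here so that every discipline's mean JIF maps to a common reference.
REFERENCE = 3.0
constants = {d: REFERENCE / (sum(jifs) / len(jifs))
             for d, jifs in disciplines.items()}

def nif(jif, discipline):
    """Normalized impact factor: JIF multiplied by the discipline constant."""
    return jif * constants[discipline]

# After normalization, rankings across disciplines share a common scale.
for d, jifs in disciplines.items():
    print(d, [round(nif(j, d), 2) for j in jifs])
```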

  18. Method and apparatus for rapid adjustment of process gas inventory in gaseous diffusion cascades

    International Nuclear Information System (INIS)

    1980-01-01

    A method is specified for the operation of a gaseous diffusion cascade wherein electrically driven compressors circulate a process gas through a plurality of serially connected gaseous diffusion stages to establish first and second countercurrently flowing cascade streams of process gas, one of the streams being at a relatively low pressure and enriched in a component of the process gas and the other being at a higher pressure and depleted in the same, and wherein automatic control systems maintain the stage process gas pressures by positioning process gas flow control valve openings at values which are functions of the difference between reference-signal inputs to the systems, and signal inputs proportional to the process gas pressures in the gaseous diffusion stages associated with the systems, the cascade process gas inventory being altered, while the cascade is operating, by simultaneously directing into separate process-gas freezing zones a plurality of substreams derived from one of the first and second streams at different points along the lengths thereof to solidify approximately equal weights of process gas in the zone while reducing the reference-signal inputs to maintain the positions of the control valves substantially unchanged despite the removal of process gas inventory via the substreams. (author)

  19. The relationship between the C-statistic of a risk-adjustment model and the accuracy of hospital report cards: a Monte Carlo Study.

    Science.gov (United States)

    Austin, Peter C; Reeves, Mathew J

    2013-03-01

    Hospital report cards, in which outcomes following the provision of medical or surgical care are compared across health care providers, are being published with increasing frequency. Essential to the production of these reports is risk-adjustment, which allows investigators to account for differences in the distribution of patient illness severity across different hospitals. Logistic regression models are frequently used for risk adjustment in hospital report cards. Many applied researchers use the c-statistic (equivalent to the area under the receiver operating characteristic curve) of the logistic regression model as a measure of the credibility and accuracy of hospital report cards. To determine the relationship between the c-statistic of a risk-adjustment model and the accuracy of hospital report cards. Monte Carlo simulations were used to examine this issue. We examined the influence of 3 factors on the accuracy of hospital report cards: the c-statistic of the logistic regression model used for risk adjustment, the number of hospitals, and the number of patients treated at each hospital. The parameters used to generate the simulated datasets came from analyses of patients hospitalized with a diagnosis of acute myocardial infarction in Ontario, Canada. The c-statistic of the risk-adjustment model had, at most, a very modest impact on the accuracy of hospital report cards, whereas the number of patients treated at each hospital had a much greater impact. The c-statistic of a risk-adjustment model should not be used to assess the accuracy of a hospital report card.
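
    The c-statistic under discussion can be computed directly as a rank probability, equivalent to the area under the ROC curve. A small self-contained sketch with toy model output:

```python
def c_statistic(predictions):
    """predictions: (predicted probability, observed outcome 0/1) pairs.
    Returns the probability that a randomly chosen case receives a higher
    prediction than a randomly chosen non-case (ties count one half)."""
    cases = [p for p, y in predictions if y == 1]
    controls = [p for p, y in predictions if y == 0]
    wins = sum(1.0 if c > k else 0.5 if c == k else 0.0
               for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

# Toy risk-adjustment model output: (predicted mortality risk, died?)
pairs = [(0.9, 1), (0.8, 1), (0.7, 0), (0.6, 1), (0.4, 0)]
print(c_statistic(pairs))
```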

  20. SPECIFIC METHOD OF RISK ASSESSMENT IN TOURISM ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Andreea ARMEAN

    2014-12-01

    Full Text Available The objective of this paper is to present an innovative method of risk assessment for tourism businesses. Its contribution to the literature lies in the novelty of the method along the following lines: it is an ante-factum, not a post-factum, assessment; risk assessment is based on perception rather than on results; and it is based on risks specific to tourism enterprises, not on overall risks. The study follows a desk-research methodology and consists in generating its own method of risk assessment based on ideas summarized from the literature studied. The target population is tourism enterprises in Romania. The data necessary for applying this method will result from administering a questionnaire about risk perception to the top-level management of tourism enterprises. The results of this study will help identify and measure the risks specific to tourism enterprises. The applicability of the results lies in improving risk management in these enterprises.

  1. The impact of aortic manipulation on neurologic outcomes after coronary artery bypass surgery: a risk-adjusted study.

    Science.gov (United States)

    Kapetanakis, Emmanouil I; Stamou, Sotiris C; Dullum, Mercedes K C; Hill, Peter C; Haile, Elizabeth; Boyce, Steven W; Bafi, Ammar S; Petro, Kathleen R; Corso, Paul J

    2004-11-01

    Cerebral embolization of atherosclerotic plaque debris caused by aortic manipulation during conventional coronary artery bypass grafting (CABG) is a major mechanism of postoperative cerebrovascular accidents (CVA). Off-pump CABG (OPCABG) reduces stroke rates by minimizing aortic manipulation. Consequently, the effect of different levels of aortic manipulation on neurologic outcomes after CABG surgery was examined. From January 1998 to June 2002, 7,272 patients underwent isolated CABG surgery through three levels of aortic manipulation: full plus tangential (side-biting) aortic clamp application (on-pump surgery; n = 4,269), only tangential aortic clamp application (OPCABG surgery; n = 2,527) or an "aortic no-touch" technique (OPCABG surgery; n = 476). A risk-adjusted logistic regression analysis was performed to establish the likelihood of postoperative stroke with each technique. Preoperative risk factors for stroke from the literature, and those found significant in a univariable model were used. A significant association for postoperative stroke correspondingly increasing with the extent of aortic manipulation was demonstrated by the univariable analysis (CVA incidence respectively increasing from 0.8% to 1.6% to a maximum of 2.2%, p < 0.01). In the logistic regression model, patients who had a full and a tangential aortic clamp applied were 1.8 times more likely to have a stroke versus those without any aortic manipulation (95% confidence interval: 1.15 to 2.74, p < 0.01) and 1.7 times more likely to develop a postoperative stroke than those with only a tangential aortic clamp applied (95% confidence interval: 1.11 to 2.48, p < 0.01). Aortic manipulation during CABG is a contributing mechanism for postoperative stroke. The incidence of postoperative stroke increases with increased levels of aortic manipulation.
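
    To see what an adjusted odds ratio such as the reported 1.8 means in absolute terms, one can convert a baseline probability to odds, apply the odds ratio, and convert back. The baseline risk used below is hypothetical, chosen only to be near the low end of the incidences quoted in the abstract.

```python
# Published adjusted odds ratio applied to a hypothetical baseline risk;
# the 0.008 baseline is illustrative, not a figure from the study.
odds_ratio = 1.8
p_baseline = 0.008                       # e.g. stroke risk with minimal manipulation

odds = p_baseline / (1 - p_baseline)     # probability -> odds
odds_exposed = odds * odds_ratio         # apply the adjusted odds ratio
p_exposed = odds_exposed / (1 + odds_exposed)  # odds -> probability

print(f"risk rises from {p_baseline:.1%} to {p_exposed:.2%}")
```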

  2. Adjusting survival time estimates to account for treatment switching in randomized controlled trials--an economic evaluation context: methods, limitations, and recommendations.

    Science.gov (United States)

    Latimer, Nicholas R; Abrams, Keith R; Lambert, Paul C; Crowther, Michael J; Wailoo, Allan J; Morden, James P; Akehurst, Ron L; Campbell, Michael J

    2014-04-01

    Treatment switching commonly occurs in clinical trials of novel interventions in the advanced or metastatic cancer setting. However, methods to adjust for switching have been used inconsistently and potentially inappropriately in health technology assessments (HTAs). We present recommendations on the use of methods to adjust survival estimates in the presence of treatment switching in the context of economic evaluations. We provide background on the treatment switching issue and summarize methods used to adjust for it in HTAs. We discuss the assumptions and limitations associated with adjustment methods and draw on results of a simulation study to make recommendations on their use. We demonstrate that methods used to adjust for treatment switching have important limitations and often produce bias in realistic scenarios. We present an analysis framework that aims to increase the probability that suitable adjustment methods can be identified on a case-by-case basis. We recommend that the characteristics of clinical trials, and the treatment switching mechanism observed within them, should be considered alongside the key assumptions of the adjustment methods. Key assumptions include the "no unmeasured confounders" assumption associated with the inverse probability of censoring weights (IPCW) method and the "common treatment effect" assumption associated with the rank preserving structural failure time model (RPSFTM). The limitations associated with switching adjustment methods such as the RPSFTM and IPCW mean that they are appropriate in different scenarios. In some scenarios, both methods may be prone to bias; "2-stage" methods should be considered, and intention-to-treat analyses may sometimes produce the least bias. The data requirements of adjustment methods also have important implications for clinical trialists.
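
    The RPSFTM mentioned above reconstructs a counterfactual untreated survival time from its accelerated-failure-time identity; a minimal sketch, with made-up patient times:

```python
import math

def counterfactual_time(t_off_treatment, t_on_treatment, psi):
    """RPSFTM counterfactual untreated time U = T_off + exp(psi) * T_on,
    valid under the common-treatment-effect assumption; psi < 0 means the
    treatment extended the observed time spent on it."""
    return t_off_treatment + math.exp(psi) * t_on_treatment

# A control patient who switched at 2 years and died at 5 (3 years on
# treatment): with acceleration factor exp(psi) = 0.5, the counterfactual
# untreated survival time is 2 + 0.5 * 3 = 3.5 years.
print(counterfactual_time(2.0, 3.0, math.log(0.5)))
```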

  3. Qualitative Analysis of Chang'e-1 γ-ray Spectrometer Spectra Using Noise Adjusted Singular Value Decomposition Method

    International Nuclear Information System (INIS)

    Yang Jia; Ge Liangquan; Xiong Shengqing

    2010-01-01

    Because of the spectral shape features of Chang'e-1 γ-ray spectrometer (CE1-GRS) data, it is difficult to determine elemental compositions on the lunar surface. To address this problem, this paper proposes using the noise-adjusted singular value decomposition (NASVD) method to extract orthogonal spectral components from CE1-GRS data. The peak signals in the spectra of the lower-order layers corresponding to the observed spectrum of each lunar region are then analyzed. Elemental compositions of each lunar region can be determined based on whether the energy of each peak signal equals the energy of the characteristic gamma-ray line emissions of specific elements. The result shows that a number of elements, such as U, Th, K, Fe, Ti, Si, O, Al, Mg, Ca and Na, are qualitatively determined by this method. (authors)
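
    A generic NASVD sketch (not the authors' implementation): Poisson counting noise is approximately whitened by scaling each channel with the square root of the mean spectrum, the SVD is truncated to the strongest components, and the scaling is undone. The synthetic rank-1 "spectra" below are invented for the demonstration.

```python
import numpy as np

def nasvd_denoise(spectra, k):
    """Noise-adjusted SVD: whiten Poisson noise via the mean spectrum,
    keep the k strongest components, then undo the scaling.
    spectra: array of shape (n_spectra, n_channels)."""
    mean = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean, 1e-12))
    u, s, vt = np.linalg.svd(spectra / scale, full_matrices=False)
    s[k:] = 0.0                      # truncate higher-order (noise) components
    return (u * s) @ vt * scale

rng = np.random.default_rng(0)
true = np.outer(np.linspace(1.0, 2.0, 50), np.full(64, 100.0))  # rank-1 truth
noisy = rng.poisson(true).astype(float)                          # counting noise
denoised = nasvd_denoise(noisy, k=1)
```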

  4. Methods of assessing nuclear power plant risks

    International Nuclear Information System (INIS)

    Skvarka, P.; Kovacz, Z.

    1985-01-01

    The concept of safety evaluation is based on safety criteria: standards or prescribed qualitative values of the parameters and indices used in designing nuclear power plants, incorporating demands on the quality of the plant's equipment and operation, its siting, and the technical means for achieving nuclear safety. The concepts of basic and optimal risk values are presented. Factors indispensable for evaluating nuclear power plant risk are summarized, and the present worldwide trend towards probability-based evaluation is discussed. (J.C.)

  5. External adjustment of unmeasured confounders in a case-control study of benzodiazepine use and cancer risk

    DEFF Research Database (Denmark)

    Thygesen, Lau Caspar; Pottegård, Anton; Ersbøll, Annette Kjaer

    2017-01-01

    AIMS: Previous studies have reported diverging results on the association between benzodiazepine use and cancer risk. METHODS: We investigated this association in a matched case-control study including incident cancer cases during 2002-2009 in the Danish Cancer Registry (n = 94 923) and age... Two propensity scores (PSs) were used: the error-prone PS using register-based confounders, and the calibrated PS based on both register- and survey-based confounders, retrieved from the Health Interview Survey. RESULTS: Register-based data showed that cancer cases had more diagnoses, higher comorbidity score and more co... % confidence interval 1.00-1.19) and for smoking-related cancers from 1.20 to 1.10 (95% confidence interval 1.00-1.21). CONCLUSION: We conclude that the increased risk observed in the solely register-based study could partly be attributed to unmeasured confounding.

  6. Method ranks competing projects by priorities, risk

    International Nuclear Information System (INIS)

    Moeckel, D.R.

    1993-01-01

    A practical, objective guide for ranking projects based on risk-based priorities has been developed by Sun Pipe Line Co. The deliberately simple system guides decisions on how to allocate scarce company resources because all managers employ the same criteria in weighing potential risks to the company against benefits. Managers at all levels must continuously comply with an ever-growing number of legislative and regulatory requirements while trying to run their businesses effectively. The system is primarily designed for use as a compliance oversight and tracking process to document, categorize, and follow up on work concerning various issues or projects. That is, the system consists of an electronic database which is updated periodically and is used by various levels of management to monitor the progress of health, safety, environmental and compliance-related projects. The criteria used in determining a risk factor and assigning a priority have also been adapted and found useful for evaluating other types of projects. The process enables management to better define the potential risks and/or loss of benefits that are being accepted when a project is excluded from an immediate work plan or budget. In times of financial austerity, it is extremely important that the right decisions are made at the right time.

  7. Application of adjusted subpixel method (ASM) in HRCT measurements of the bronchi in bronchial asthma patients and healthy individuals

    International Nuclear Information System (INIS)

    Mincewicz, Grzegorz; Rumiński, Jacek; Krzykowski, Grzegorz

    2012-01-01

    Background: Recently, we described a model system which included corrections of high-resolution computed tomography (HRCT) bronchial measurements based on the adjusted subpixel method (ASM). Objective: To verify the clinical application of ASM by comparing bronchial measurements obtained by means of the traditional eye-driven method, subpixel method alone and ASM in a group comprised of bronchial asthma patients and healthy individuals. Methods: The study included 30 bronchial asthma patients and the control group comprised of 20 volunteers with no symptoms of asthma. The lowest internal and external diameters of the bronchial cross-sections (ID and ED) and their derivative parameters were determined in HRCT scans using: (1) traditional eye-driven method, (2) subpixel technique, and (3) ASM. Results: In the case of the eye-driven method, lower ID values along with lower bronchial lumen area and its percentage ratio to total bronchial area were basic parameters that differed between asthma patients and healthy controls. In the case of the subpixel method and ASM, both groups were not significantly different in terms of ID. Significant differences were observed in values of ED and total bronchial area with both parameters being significantly higher in asthma patients. Compared to ASM, the eye-driven method overstated the values of ID and ED by about 30% and 10% respectively, while understating bronchial wall thickness by about 18%. Conclusions: Results obtained in this study suggest that the traditional eye-driven method of HRCT-based measurement of bronchial tree components probably overstates the degree of bronchial patency in asthma patients.

  8. Methods to Quantify Uncertainty in Human Health Risk Assessment

    National Research Council Canada - National Science Library

    Aurelius, Lea

    1998-01-01

    ...) and other health professionals, such as the Bioenvironmental Engineer, to identify the appropriate use of probabilistic techniques for a site, and the methods by which probabilistic risk assessment...

  9. A Method for Accounting for Risk in Lending

    National Research Council Canada - National Science Library

    Kobylski, Gerald

    1997-01-01

    ..., or decreased to increase competitiveness? Many lending institutions, specifically furniture retailers, do not use scientific methods for determining their risk of payment defaults on loans to customers...

  10. Characterization of the CALIBAN Critical Assembly Neutron Spectra using Several Adjustment Methods Based on Activation Foils Measurement

    Science.gov (United States)

    Casoli, Pierre; Grégoire, Gilles; Rousseau, Guillaume; Jacquet, Xavier; Authier, Nicolas

    2016-02-01

    CALIBAN is a metallic critical assembly operated by the Criticality, Neutron Science and Measurement Department at the French CEA Center of Valduc. The reactor is extensively used for benchmark experiments dedicated to the evaluation of nuclear data, for electronics hardening and to study the effect of neutrons on various materials. CALIBAN's irradiation characteristics, and especially its central cavity neutron spectrum, therefore have to be evaluated very accurately. In order to strengthen our knowledge of this spectrum, several adjustment methods based on activation foil measurements have been studied in the laboratory over the past few years. First, two codes included in the UMG package were tested and compared: MAXED and GRAVEL. More recently, the CALIBAN cavity spectrum has been studied using CALMAR, a new adjustment tool currently under development at the CEA Center of Cadarache. The article discusses and compares the results and the quality of spectrum reconstruction obtained with the UMG codes and with the CALMAR software, from a set of activation measurements carried out in the CALIBAN irradiation cavity.

  11. Characterization of the CALIBAN Critical Assembly Neutron Spectra using Several Adjustment Methods Based on Activation Foils Measurement

    Directory of Open Access Journals (Sweden)

    Casoli Pierre

    2016-01-01

    Full Text Available CALIBAN is a metallic critical assembly operated by the Criticality, Neutron Science and Measurement Department at the French CEA Center of Valduc. The reactor is extensively used for benchmark experiments dedicated to the evaluation of nuclear data, for electronics hardening and to study the effect of neutrons on various materials. CALIBAN's irradiation characteristics, and especially its central cavity neutron spectrum, therefore have to be evaluated very accurately. In order to strengthen our knowledge of this spectrum, several adjustment methods based on activation foil measurements have been studied in the laboratory over the past few years. First, two codes included in the UMG package were tested and compared: MAXED and GRAVEL. More recently, the CALIBAN cavity spectrum has been studied using CALMAR, a new adjustment tool currently under development at the CEA Center of Cadarache. The article discusses and compares the results and the quality of spectrum reconstruction obtained with the UMG codes and with the CALMAR software, from a set of activation measurements carried out in the CALIBAN irradiation cavity.
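    The foil-activation adjustment idea behind codes such as GRAVEL can be illustrated generically. The sketch below is a toy, hedged implementation of a GRAVEL/SAND-II-style multiplicative update with an invented response matrix and spectrum; it is not the CALIBAN data or the actual UMG code.

    ```python
    import numpy as np

    # Toy few-channel spectrum adjustment in the spirit of GRAVEL/SAND-II:
    # iteratively rescale a guess spectrum phi so that the computed foil
    # activities C = R @ phi approach the measured activities M.
    # The response matrix R and the spectra are invented for illustration.
    R = np.array([[0.90, 0.50, 0.20, 0.10, 0.05],
                  [0.10, 0.40, 0.80, 0.40, 0.10],
                  [0.05, 0.10, 0.30, 0.70, 0.90]])
    phi_true = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    M = R @ phi_true                      # "measured" activities

    phi = np.full(5, 2.0)                 # flat prior guess spectrum
    for _ in range(500):
        C = R @ phi                       # computed activities
        W = R * phi / C[:, None]          # fractional contribution weights
        log_update = (W * np.log(M / C)[:, None]).sum(axis=0) / W.sum(axis=0)
        phi *= np.exp(log_update)         # multiplicative group-wise update

    residual = np.max(np.abs(R @ phi / M - 1.0))
    ```

    Because the system is underdetermined (fewer foils than energy groups), the iteration converges to a spectrum consistent with the measurements that stays close to the prior guess, which is why the quality of the prior matters in practice.
    
    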

  12. Methods of assessment and management of enterprise risks

    Directory of Open Access Journals (Sweden)

    I. A. Kiseleva

    2017-01-01

    Full Text Available The article is devoted to a pressing topic of our time: the management of business risks. An integral part of professional risk management is identifying the nature of the object of management in the economic sphere. Since the domestic theory of risk management is still under development, the problem of a clear, comprehensive definition of risk is now of particular relevance. The article discusses the basic concepts of risk management, examines its components in business activities, presents a system and principles of risk management, and reviews the basic types of risk in business. An organisational and economic mechanism for enterprise risk assessment is proposed, along with practical advice on risk management. Entrepreneurship without risk does not exist. With the development of a market economy, each entrepreneur determines the methods that will work, and all of them entail entrepreneurial risks. The level of threats in the market today is above the level of potential profits. It is concluded that it is impossible to increase revenue without increasing risk, or to reduce risk without reducing income. The narrower the range of the probability distribution of expected returns around its mean value, the lower the risk associated with the operation. Avoiding risk in business is almost impossible, but the risk can be reduced, and how much depends on how professionally and correctly the entrepreneur operates and what strategy he chooses to reduce the occurrence of risk.

  13. Application of adjusted subpixel method (ASM) in HRCT measurements of the bronchi in bronchial asthma patients and healthy individuals.

    Science.gov (United States)

    Mincewicz, Grzegorz; Rumiński, Jacek; Krzykowski, Grzegorz

    2012-02-01

    Recently, we described a model system which included corrections of high-resolution computed tomography (HRCT) bronchial measurements based on the adjusted subpixel method (ASM). To verify the clinical application of ASM, bronchial measurements obtained by means of the traditional eye-driven method, the subpixel method alone and ASM were compared in a group of bronchial asthma patients and healthy individuals. The study included 30 bronchial asthma patients and a control group of 20 volunteers with no symptoms of asthma. The lowest internal and external diameters of the bronchial cross-sections (ID and ED) and their derivative parameters were determined in HRCT scans using: (1) the traditional eye-driven method, (2) the subpixel technique, and (3) ASM. With the eye-driven method, lower ID values along with a lower bronchial lumen area and its percentage ratio to total bronchial area were the basic parameters that differed between asthma patients and healthy controls. With the subpixel method and ASM, the two groups did not differ significantly in terms of ID. Significant differences were observed in the values of ED and total bronchial area, with both parameters being significantly higher in asthma patients. Compared to ASM, the eye-driven method overstated the values of ID and ED by about 30% and 10% respectively, while understating bronchial wall thickness by about 18%. The results obtained in this study suggest that the traditional eye-driven method of HRCT-based measurement of bronchial tree components probably overstates the degree of bronchial patency in asthma patients. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
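    The core idea of subpixel measurement, locating an edge between pixel samples rather than at the nearest whole pixel, can be illustrated generically. The sketch below is a minimal, hedged illustration using linear interpolation across a threshold crossing; it is not the authors' ASM algorithm.

    ```python
    def subpixel_crossing(profile, threshold):
        """Locate where an intensity profile first crosses `threshold`,
        with sub-pixel precision via linear interpolation between samples."""
        for i in range(len(profile) - 1):
            lo, hi = profile[i], profile[i + 1]
            if (lo - threshold) * (hi - threshold) < 0:  # sign change -> crossing
                return i + (threshold - lo) / (hi - lo)
        return None

    # Intensity profile across a (hypothetical) bronchial wall edge:
    edge = subpixel_crossing([10, 10, 10, 60, 100, 100], threshold=50)
    # edge == 2.8: the 50% level sits 0.8 of the way from pixel 2 to pixel 3
    ```

    An eye-driven reading would place the edge at pixel 2 or 3; on sub-millimetre structures such as bronchial walls, that whole-pixel rounding is exactly the kind of systematic bias the abstract describes.
    
    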

  14. Risk adjustment models for interhospital comparison of CS rates using Robson's ten group classification system and other socio-demographic and clinical variables.

    Science.gov (United States)

    Colais, Paola; Fantini, Maria P; Fusco, Danilo; Carretta, Elisa; Stivanello, Elisa; Lenzi, Jacopo; Pieri, Giulia; Perucci, Carlo A

    2012-06-21

    Caesarean section (CS) rate is a quality of health care indicator frequently used at national and international level. The aim of this study was to assess whether adjustment for Robson's Ten Group Classification System (TGCS), and clinical and socio-demographic variables of the mother and the fetus, is necessary for inter-hospital comparisons of CS rates. The study population includes 64,423 deliveries in Emilia-Romagna between January 1, 2003 and December 31, 2004, classified according to the TGCS. Poisson regression was used to estimate crude and adjusted hospital relative risks of CS compared to a reference category. Analyses were carried out in the overall population and separately according to the Robson groups (groups I, II, III, IV and V-X combined). Adjusted relative risks (RR) of CS were estimated using two risk-adjustment models: the first (M1) including the TGCS group as the only adjustment factor; the second (M2) including in addition demographic and clinical confounders identified using a stepwise selection procedure. Percentage variations between crude and adjusted RRs by hospital were calculated to evaluate the confounding effect of covariates. The percentage variations from crude to adjusted RR proved to be similar in the M1 and M2 models. However, stratified analyses by Robson's classification groups showed that residual confounding for clinical and demographic variables was present in groups I (nulliparous, single, cephalic, ≥37 weeks, spontaneous labour) and III (multiparous, excluding previous CS, single, cephalic, ≥37 weeks, spontaneous labour), and to a minor extent in groups II (nulliparous, single, cephalic, ≥37 weeks, induced or CS before labour) and IV (multiparous, excluding previous CS, single, cephalic, ≥37 weeks, induced or CS before labour). The TGCS classification is useful for inter-hospital comparison of CS rates, but
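    Estimating hospital relative risks from a log-linear Poisson model, as in the study above, can be sketched self-contained. The data below are invented aggregated counts (two hospitals, with deliveries entering as an offset), and the fit uses a hand-rolled Newton iteration rather than a statistics package.

    ```python
    import numpy as np

    # Invented aggregated data: CS counts per hospital, with the number of
    # deliveries entering as an offset so that exp(beta[1]) is the relative
    # risk of hospital B versus the reference hospital A.
    X = np.array([[1.0, 0.0],    # hospital A (reference)
                  [1.0, 1.0]])   # hospital B
    deliveries = np.array([100.0, 100.0])
    y = np.array([20.0, 30.0])   # CS counts (crude rates 0.20 and 0.30)
    offset = np.log(deliveries)

    beta = np.zeros(2)
    for _ in range(25):          # Newton iterations on the Poisson score
        mu = np.exp(X @ beta + offset)
        beta += np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))

    relative_risk = np.exp(beta[1])
    # relative_risk ≈ 1.5 (= 0.30 / 0.20)
    ```

    Adjustment factors such as the TGCS group would enter as additional columns of X; the adjusted RR is then the exponentiated hospital coefficient with those columns held fixed.
    
    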

  15. Assessing risk of draft survey by AHP method

    Science.gov (United States)

    Xu, Guangcheng; Zhao, Kuimin; Zuo, Zhaoying; Liu, Gang; Jian, Binguo; Lin, Yan; Fan, Yukun; Wang, Fei

    2018-04-01

    This paper assesses the risks in the draft survey of a vessel floating in seawater by using the analytic hierarchy process (AHP). On this basis, the paper establishes a draft survey risk index covering draft reading, ballast water, fresh water, the calculation process and other factors. The paper then demonstrates the proposed risk assessment method on a concrete example.
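    The AHP step, deriving priority weights from a pairwise-comparison matrix and checking its consistency, can be sketched as follows. The 3×3 matrix for three draft-survey risk factors is invented for illustration; it is not the paper's data.

    ```python
    import numpy as np

    # Minimal AHP sketch. A[i, j] expresses how much more important factor i
    # is than factor j on Saaty's 1-9 scale; factors here are hypothetical
    # (draft reading, ballast water, fresh water).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    # Priority weights = principal eigenvector of A, via power iteration.
    w = np.ones(3) / 3
    for _ in range(100):
        w = A @ w
        w /= w.sum()

    # Consistency check: a consistency ratio CR < 0.1 is conventionally OK.
    lam_max = float(np.mean((A @ w) / w))
    CI = (lam_max - 3) / (3 - 1)
    CR = CI / 0.58                 # 0.58 = Saaty's random index for n = 3
    ```

    The resulting weights rank the factors, and the weighted sum of factor scores gives the overall risk index; an inconsistent matrix (CR ≥ 0.1) would signal that the pairwise judgements should be revisited.
    
    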

  16. Empirical comparison of four baseline covariate adjustment methods in analysis of continuous outcomes in randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhang S

    2014-07-01

    Full Text Available Shiyuan Zhang,1 James Paul,2 Manyat Nantha-Aree,2 Norman Buckley,2 Uswa Shahzad,2 Ji Cheng,2 Justin DeBeer,5 Mitchell Winemaker,5 David Wismer,5 Dinshaw Punthakee,5 Victoria Avram,5 Lehana Thabane1–4; 1Department of Clinical Epidemiology and Biostatistics, 2Department of Anesthesia, McMaster University, Hamilton, ON, Canada; 3Biostatistics Unit/Centre for Evaluation of Medicines, St Joseph's Healthcare - Hamilton, Hamilton, ON, Canada; 4Population Health Research Institute, Hamilton Health Science/McMaster University, 5Department of Surgery, Division of Orthopaedics, McMaster University, Hamilton, ON, Canada. Background: Although seemingly straightforward, the statistical comparison of a continuous variable in a randomized controlled trial that has both a pre- and posttreatment score presents an interesting challenge for trialists. We present here an empirical application of four statistical methods (posttreatment scores with analysis of variance, analysis of covariance, change in scores, and percent change in scores), using data from a randomized controlled trial of postoperative pain in patients following total joint arthroplasty (the Morphine COnsumption in Joint Replacement Patients, With and Without GaBapentin Treatment, a RandomIzed ControlLEd Study [MOBILE] trial). Methods: Analysis of covariance (ANCOVA) was used to adjust for baseline measures and to provide an unbiased estimate of the mean group difference of the 1-year postoperative knee flexion scores in knee arthroplasty patients. Robustness tests were done by comparing ANCOVA with three comparative methods: the posttreatment scores, change in scores, and percentage change from baseline. Results: All four methods showed a similar direction of effect; however, ANCOVA (-3.9; 95% confidence interval [CI]: -9.5, 1.6; P=0.15) and the posttreatment score method (-4.3; 95% CI: -9.8, 1.2; P=0.12) provided the highest precision of estimate compared with the change score (-3.0; 95% CI: -9.9, 3.8; P=0
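    Why ANCOVA and change scores can disagree is easiest to see on invented, noise-free numbers with deliberate baseline imbalance. The sketch below is didactic only; it is not the MOBILE trial data.

    ```python
    import numpy as np

    # Toy illustration of baseline-covariate adjustment: the same data
    # analysed by ANCOVA (post ~ baseline + group) and by change scores.
    # The true group effect is built in as +3 and the baseline slope as 0.5,
    # with deliberate baseline imbalance between the two groups.
    pre   = np.array([10., 12., 14., 16., 20., 22., 24., 26.])
    group = np.array([ 0.,  0.,  0.,  0.,  1.,  1.,  1.,  1.])
    post  = 2.0 + 0.5 * pre + 3.0 * group          # noise-free for clarity

    # ANCOVA via least squares on [intercept, baseline, group]
    X = np.column_stack([np.ones_like(pre), pre, group])
    coef, *_ = np.linalg.lstsq(X, post, rcond=None)
    ancova_effect = coef[2]                        # recovers 3.0 exactly

    # Change-score estimate: difference in mean (post - pre) between groups
    change = post - pre
    change_effect = change[group == 1].mean() - change[group == 0].mean()
    # = 3 - 0.5 * (baseline imbalance of 10) = -2
    ```

    With a baseline slope different from 1, the change-score estimate absorbs the baseline imbalance, while ANCOVA conditions on it; in a properly randomized trial the two agree in expectation, which is why the MOBILE analyses found a similar direction of effect across methods.
    
    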

  17. Realistic PIC modelling of laser-plasma interaction: a direct implicit method with adjustable damping and high order weight functions

    International Nuclear Information System (INIS)

    Drouin, M.

    2009-11-01

    This research thesis proposes a new formulation of the relativistic direct implicit method, based on the weak formulation of the wave equation, which is solved by means of a Newton algorithm. The first part of this thesis deals with the properties of explicit particle-in-cell (PIC) methods: properties and limitations of an explicit PIC code, linear analysis of a numerical plasma, the numerical heating phenomenon, the interest of a higher order interpolation function, and a presentation of two applications in high-density relativistic laser-plasma interaction. The second and main part of this report deals with adapting the direct implicit method to laser-plasma interaction: a presentation of the state of the art, the formulation of the direct implicit method, and the resolution of the wave equation. The third part concerns various numerical and physical validations of the ELIXIRS code: laser wave propagation in vacuum, a demonstration of the adjustable damping which is a characteristic of the proposed algorithm, the influence of space-time discretization on energy conservation, the expansion of a thermal plasma into vacuum, two cases of beam-plasma instability in the relativistic regime, and finally a case of overcritical laser-plasma interaction.

  18. Companies Credit Risk Assessment Methods for Investment Decision Making

    Directory of Open Access Journals (Sweden)

    Dovilė Peškauskaitė

    2017-06-01

    Full Text Available As banks have tightened lending requirements, companies look for alternative sources of external funding. One such source is a bond issue. Unfortunately, corporate bond issues as a source of funding are rare in Lithuania. This occurs because companies face a lack of information and investors fear taking on credit risk. Credit risk is defined as a borrower's failure to meet its obligations. Investors, in order to avoid credit risk, have to assess the state of the companies. The goal of the article is to determine the most informative methods of credit risk assessment. The article summarizes corporate lending sources, analyzes the causes of corporate default and reviews credit risk assessment methods. The study, based on a SWOT analysis, shows that investors, before making an investment decision, should evaluate both the business risk, using the qualitative CAMPARI method, and the financial risk, using financial ratio analysis.

  19. Compensatory Postural Adjustments in an Oculus Virtual Reality Environment and the Risk of Falling in Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Miguel F. Gago

    2016-06-01

    Full Text Available Background/Aims: Alzheimer's disease (AD) patients have an impaired ability to quickly reweight central sensory dependence in response to unexpected body perturbations. Herein, we aim to study provoked compensatory postural adjustments (CPAs) in a conflicting sensory paradigm with unpredictable visual displacements using virtual reality goggles. Methods: We used kinematic time-frequency analyses of two frequency bands: a low-frequency band (LB; 0.3-1.5 Hz; mechanical strategy) and a high-frequency band (HB; 1.5-3.5 Hz; cognitive strategy). We enrolled 19 healthy subjects (controls) and 21 AD patients, divided according to their previous history of falls. Results: The AD faller group presented higher-power LB CPAs, reflecting their worse inherent postural stability. The AD patients had a time lag in their HB CPA reaction. Conclusion: The slower reaction by CPA in AD may be a reflection of different cognitive resources including body schema self-perception, visual motion, depth perception, or a different state of fear and/or anxiety.

  20. Risk-adjusted survival for adults following in-hospital cardiac arrest by day of week and time of day: observational cohort study

    Science.gov (United States)

    Robinson, Emily J; Power, Geraldine S; Nolan, Jerry; Soar, Jasmeet; Spearpoint, Ken; Gwinnutt, Carl; Rowan, Kathryn M

    2016-01-01

    Background Internationally, hospital survival is lower for patients admitted at weekends and at night. Data from the UK National Cardiac Arrest Audit (NCAA) indicate that crude hospital survival was worse after in-hospital cardiac arrest (IHCA) at night versus day, and at weekends versus weekdays, despite similar frequency of events. Objective To describe IHCA demographics during three day/time periods—weekday daytime (Monday to Friday, 08:00 to 19:59), weekend daytime (Saturday and Sunday, 08:00 to 19:59) and night-time (Monday to Sunday, 20:00 to 07:59)—and to compare the associated rates of return of spontaneous circulation (ROSC) for >20 min (ROSC>20 min) and survival to hospital discharge, adjusted for risk using previously developed NCAA risk models. To consider whether any observed difference could be attributed to differences in the case mix of patients resident in hospital and/or the administered care. Methods We performed a prospectively defined analysis of NCAA data from 27 700 patients aged ≥16 years receiving chest compressions and/or defibrillation and attended by a hospital-based resuscitation team in response to a resuscitation (2222) call in 146 UK acute hospitals. Results Risk-adjusted outcomes (OR (95% CI)) were worse for weekend daytime (ROSC>20 min 0.88 (0.81 to 0.95); hospital survival 0.72 (0.64 to 0.80)), and night-time (ROSC>20 min 0.72 (0.68 to 0.76); hospital survival 0.58 (0.54 to 0.63)) compared with weekday daytime. The effects were stronger for non-shockable than shockable rhythms, but there was no significant interaction between day/time of arrest and age, or day/time of arrest and arrest location. While many daytime IHCAs involved procedures, restricting the analyses to IHCAs in medical admissions with an arrest location of ward produced results that are broadly in line with the primary analyses. Conclusions IHCAs attended by the hospital-based resuscitation team during nights and weekends have substantially worse outcomes than during

  1. The evolution of credit risk: phenomena, methods and management

    OpenAIRE

    George A. Christodoulakis

    2007-01-01

    This paper summarizes the proceedings of a conference at the Bank of Greece on credit risk. The papers presented focused on innovations in risk management methods which contribute to systemic financial stability, calculation of capital adequacy in financial institutions as well as the validation of credit rating methods in the context of Basel II.

  2. Convexity Adjustments

    DEFF Research Database (Denmark)

    M. Gaspar, Raquel; Murgoci, Agatha

    2010-01-01

    A convexity adjustment (or convexity correction) in fixed income markets arises when one uses prices of standard (plain vanilla) products plus an adjustment to price nonstandard products. We explain the basic and appealing idea behind the use of convexity adjustments and focus on the situations...
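    A concrete, standard instance from the fixed-income textbook literature (not necessarily the situation treated in this paper) is the futures-forward convexity adjustment under a Ho-Lee-type short-rate model:

    ```latex
    % Textbook example (an assumption for illustration, not from this paper):
    % under a Ho-Lee-type short-rate model with volatility \sigma, the forward
    % rate is the futures rate minus a convexity adjustment:
    \[
      f_{\text{forward}}(t_1, t_2) \;\approx\; f_{\text{futures}}(t_1, t_2)
        \;-\; \tfrac{1}{2}\,\sigma^{2}\, t_1 t_2 ,
    \]
    % where t_1 is the futures maturity and t_2 the maturity of the underlying
    % rate. E.g. \sigma = 1.2\%/yr, t_1 = 4, t_2 = 4.25 gives an adjustment of
    % 0.5 \times 0.012^2 \times 4 \times 4.25 \approx 0.00122, about 12 basis points.
    ```

    The adjustment grows quadratically in maturity, which is why it matters for long-dated contracts even when it is negligible at the short end.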

  3. An efficient method to generate a perturbed parameter ensemble of a fully coupled AOGCM without flux-adjustment

    Directory of Open Access Journals (Sweden)

    P. J. Irvine

    2013-09-01

    Full Text Available We present a simple method to generate a perturbed parameter ensemble (PPE) of a fully-coupled atmosphere-ocean general circulation model (AOGCM), HadCM3, without requiring flux-adjustment. The aim was to produce an ensemble that samples parametric uncertainty in some key variables and gives a plausible representation of the climate. Six atmospheric parameters, a sea-ice parameter and an ocean parameter were jointly perturbed within a reasonable range to generate an initial group of 200 members. To screen out implausible ensemble members, 20 yr pre-industrial control simulations were run, and members whose temperature responses to the parameter perturbations were projected to be outside the range of 13.6 ± 2 °C, i.e. near to the observed pre-industrial global mean, were discarded. Twenty-one members, including the standard unperturbed model, were accepted, covering almost the entire span of the eight parameters, challenging the argument that without flux-adjustment parameter ranges would be unduly restricted. This ensemble was used in two experiments: an 800 yr pre-industrial control and a 150 yr quadrupled-CO2 simulation. The behaviour of the PPE for the pre-industrial control compared well to ERA-40 reanalysis data and the CMIP3 ensemble for a number of surface and atmospheric column variables, with the exception of a few members in the Tropics. However, we find that members of the PPE with low values of the entrainment rate coefficient show very large increases in upper tropospheric and stratospheric water vapour concentrations in response to elevated CO2, and one member showed an implausible nonlinear climate response, and as such these will be excluded from future experiments with this ensemble. The outcome of this study is a PPE of a fully-coupled AOGCM which samples parametric uncertainty, and a simple methodology which would be applicable to other GCMs.
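    The sample-then-screen strategy can be sketched with a toy linear "emulator" standing in for HadCM3. Everything numeric below (member count aside) is invented for illustration: the sensitivities, the parameter scaling, and the linear response are assumptions, not the paper's model.

    ```python
    import numpy as np

    # Toy sketch of sample-then-screen: jointly perturb model parameters,
    # project a pre-industrial global-mean temperature for each member, and
    # keep only members inside the plausibility band 13.6 +/- 2 deg C.
    rng = np.random.default_rng(0)
    n_members, n_params = 200, 8

    # Parameters scaled to [-1, 1] around the standard configuration.
    theta = rng.uniform(-1.0, 1.0, size=(n_members, n_params))

    # Invented linear sensitivities (deg C per unit perturbation).
    sensitivity = np.array([1.5, -1.0, 0.8, -0.6, 0.5, 0.4, -0.3, 0.2])
    projected_T = 13.6 + theta @ sensitivity

    accepted = theta[np.abs(projected_T - 13.6) <= 2.0]
    acceptance_rate = len(accepted) / n_members
    ```

    In the actual study the "projection" came from short 20 yr control runs of the full AOGCM rather than an emulator; the screening logic, however, is the same one-line filter.
    
    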

  4. Impact of urine concentration adjustment method on associations between urine metals and estimated glomerular filtration rates (eGFR) in adolescents

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Virginia M., E-mail: vweaver@jhsph.edu [Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD (United States); Johns Hopkins University School of Medicine, Baltimore, MD (United States); Welch Center for Prevention, Epidemiology, and Clinical Research, Johns Hopkins Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD (United States); Vargas, Gonzalo García [Faculty of Medicine, University of Juárez of Durango State, Durango (Mexico); Secretaría de Salud del Estado de Coahuila, Coahuila, México (Mexico); Silbergeld, Ellen K. [Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD (United States); Rothenberg, Stephen J. [Instituto Nacional de Salud Publica, Centro de Investigacion en Salud Poblacional, Cuernavaca, Morelos (Mexico); Fadrowski, Jeffrey J. [Johns Hopkins University School of Medicine, Baltimore, MD (United States); Welch Center for Prevention, Epidemiology, and Clinical Research, Johns Hopkins Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD (United States); Rubio-Andrade, Marisela [Faculty of Medicine, University of Juárez of Durango State, Durango (Mexico); Parsons, Patrick J. [Laboratory of Inorganic and Nuclear Chemistry, Wadsworth Center, New York State Department of Health, Albany, NY (United States); Department of Environmental Health Sciences, School of Public Health, University at Albany, Albany, NY (United States); Steuerwald, Amy J. [Laboratory of Inorganic and Nuclear Chemistry, Wadsworth Center, New York State Department of Health, Albany, NY (United States); and others

    2014-07-15

    Positive associations between urine toxicant levels and measures of glomerular filtration rate (GFR) have been reported recently in a range of populations. The explanation for these associations, in a direction opposite that of traditional nephrotoxicity, is uncertain. Variation in associations by urine concentration adjustment approach has also been observed. Associations of urine cadmium, thallium and uranium in models of serum creatinine- and cystatin-C-based estimated GFR (eGFR) were examined using multiple linear regression in a cross-sectional study of adolescents residing near a lead smelter complex. Urine concentration adjustment approaches compared included urine creatinine, urine osmolality and no adjustment. Median age, blood lead and urine cadmium, thallium and uranium were 13.9 years, 4.0 μg/dL, 0.22, 0.27 and 0.04 g/g creatinine, respectively, in 512 adolescents. Urine cadmium and thallium were positively associated with serum creatinine-based eGFR only when urine creatinine was used to adjust for urine concentration (β coefficient=3.1 mL/min/1.73 m²; 95% confidence interval=1.4, 4.8 per each doubling of urine cadmium). Weaker positive associations, also only with urine creatinine adjustment, were observed between these metals and serum cystatin-C-based eGFR and between urine uranium and serum creatinine-based eGFR. Additional research using non-creatinine-based methods of adjustment for urine concentration is necessary. - Highlights: • Positive associations between urine metals and creatinine-based eGFR are unexpected. • Optimal approach to urine concentration adjustment for urine biomarkers uncertain. • We compared urine concentration adjustment methods. • Positive associations observed only with urine creatinine adjustment. • Additional research using non-creatinine-based methods of adjustment needed.
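    The sensitivity of such associations to the adjustment approach can be demonstrated on simulated data. Everything below is invented: a latent dilution factor drives both the metal and creatinine, and the outcome is generated independently of the metal, so the three adjustment choices estimate different coefficients from the same data.

    ```python
    import numpy as np

    # Toy comparison of urine-concentration adjustment approaches:
    # (a) no adjustment, (b) dividing the metal by urine creatinine,
    # (c) entering creatinine as a separate covariate.
    rng = np.random.default_rng(42)
    n = 512
    hydration = rng.normal(0.0, 1.0, n)                    # latent urine dilution
    log_creatinine = 0.5 * hydration + rng.normal(0, 0.2, n)
    log_metal = 0.5 * hydration + rng.normal(0, 0.3, n)    # no true renal effect
    egfr = 100 + rng.normal(0, 10, n)                      # outcome, independent

    def ols(y, *cols):
        """Least-squares fit of y on an intercept plus the given columns;
        returns the coefficient of the first column."""
        X = np.column_stack([np.ones(len(y))] + list(cols))
        return np.linalg.lstsq(X, y, rcond=None)[0][1]

    beta_unadjusted = ols(egfr, log_metal)
    beta_ratio = ols(egfr, log_metal - log_creatinine)     # creatinine ratio
    beta_covariate = ols(egfr, log_metal, log_creatinine)  # creatinine as covariate
    ```

    The ratio approach changes the exposure variable itself (it subtracts log-creatinine), whereas the covariate approach conditions on it; when creatinine is related to the outcome through pathways other than dilution, the two can diverge systematically, which is one proposed explanation for the pattern in the abstract.
    
    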

  5. Impact of urine concentration adjustment method on associations between urine metals and estimated glomerular filtration rates (eGFR) in adolescents

    International Nuclear Information System (INIS)

    Weaver, Virginia M.; Vargas, Gonzalo García; Silbergeld, Ellen K.; Rothenberg, Stephen J.; Fadrowski, Jeffrey J.; Rubio-Andrade, Marisela; Parsons, Patrick J.; Steuerwald, Amy J.

    2014-01-01

    Positive associations between urine toxicant levels and measures of glomerular filtration rate (GFR) have been reported recently in a range of populations. The explanation for these associations, in a direction opposite that of traditional nephrotoxicity, is uncertain. Variation in associations by urine concentration adjustment approach has also been observed. Associations of urine cadmium, thallium and uranium in models of serum creatinine- and cystatin-C-based estimated GFR (eGFR) were examined using multiple linear regression in a cross-sectional study of adolescents residing near a lead smelter complex. Urine concentration adjustment approaches compared included urine creatinine, urine osmolality and no adjustment. Median age, blood lead and urine cadmium, thallium and uranium were 13.9 years, 4.0 μg/dL, 0.22, 0.27 and 0.04 g/g creatinine, respectively, in 512 adolescents. Urine cadmium and thallium were positively associated with serum creatinine-based eGFR only when urine creatinine was used to adjust for urine concentration (β coefficient=3.1 mL/min/1.73 m²; 95% confidence interval=1.4, 4.8 per each doubling of urine cadmium). Weaker positive associations, also only with urine creatinine adjustment, were observed between these metals and serum cystatin-C-based eGFR and between urine uranium and serum creatinine-based eGFR. Additional research using non-creatinine-based methods of adjustment for urine concentration is necessary. - Highlights: • Positive associations between urine metals and creatinine-based eGFR are unexpected. • Optimal approach to urine concentration adjustment for urine biomarkers uncertain. • We compared urine concentration adjustment methods. • Positive associations observed only with urine creatinine adjustment. • Additional research using non-creatinine-based methods of adjustment needed.

  6. A comparison of two sleep spindle detection methods based on all night averages: individually adjusted versus fixed frequencies

    Directory of Open Access Journals (Sweden)

    Péter Przemyslaw Ujma

    2015-02-01

    Full Text Available Sleep spindles are frequently studied for their relationship with state and trait cognitive variables, and they are thought to play an important role in sleep-related memory consolidation. Due to their frequent occurrence in NREM sleep, the detection of sleep spindles is only feasible using automatic algorithms, of which a large number is available. We compared subject averages of the spindle parameters computed by a fixed frequency (11-13 Hz for slow spindles, 13-15 Hz for fast spindles) automatic detection algorithm and the individual adjustment method (IAM), which uses individual frequency bands for sleep spindle detection. Fast spindle duration and amplitude are strongly correlated in the two algorithms, but there is little overlap in fast spindle density and slow spindle parameters in general. The agreement between fixed and manually determined sleep spindle frequencies is limited, especially in case of slow spindles. This is the most likely reason for the poor agreement between the two detection methods in case of slow spindle parameters. Our results suggest that while various algorithms may reliably detect fast spindles, a more sophisticated algorithm primed to individual spindle frequencies is necessary for the detection of slow spindles as well as individual variations in the number of spindles in general.
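    The difference between fixed and individually adjusted bands can be sketched with a spectral peak search on a synthetic signal. The sampling rate, band edges and the 14.2 Hz "spindle" below are illustrative choices, not the paper's parameters or the IAM algorithm itself.

    ```python
    import numpy as np

    # Sketch of fixed-band versus individually adjusted spindle bands:
    # locate a subject's spindle peak by a spectral search, then centre an
    # individual band on it; compare with a fixed 13-15 Hz fast-spindle band.
    fs = 128.0
    t = np.arange(0, 10, 1 / fs)                  # 10 s of signal
    signal = np.sin(2 * np.pi * 14.2 * t)         # synthetic fast-spindle rhythm

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    # Search for the individual spindle peak in the 9-16 Hz sigma range.
    sigma = (freqs >= 9) & (freqs <= 16)
    peak_freq = freqs[sigma][np.argmax(spectrum[sigma])]

    individual_band = (peak_freq - 1.0, peak_freq + 1.0)   # centred on the peak
    fixed_fast_band = (13.0, 15.0)                          # fixed definition
    in_fixed = fixed_fast_band[0] <= peak_freq <= fixed_fast_band[1]
    ```

    A subject whose true spindle peak sits near a fixed band edge (say 13.1 Hz) would have much of their spindle power split across, or fall outside, the fixed bands, which is one mechanism behind the poor agreement the abstract reports for slow spindles.
    
    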

  7. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, December 2016

    International Nuclear Information System (INIS)

    Cabellos, Oscar; ); PELLONI, Sandro; Ivanov, Evgeny; Sobes, Vladimir; Fukushima, M.; Yokoyama, Kenji; Palmiotti, Giuseppe; Kodeli, Ivo

    2016-12-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the eighth Subgroup 39 meeting, held at the OECD NEA, Boulogne-Billancourt, France, on 1-2 December 2016. It comprises all the available presentations (slides) given by the participants: A - Presentations: Welcome and actions review (Oscar CABELLOS); B - Methods: - Detailed comparison of Progressive Incremental Adjustment (PIA) sequence results involving adjustments of spectral indices and coolant density effects on the basis of the SG33 benchmark (Sandro PELLONI); - ND assessment alternatives: Validation matrix vs XS adjustment (Evgeny IVANOV); - Implementation of Resonance Parameter Sensitivity Coefficients Calculation in CE TSUNAMI-3D (Vladimir SOBES); C - Experiment analysis, sensitivity calculations and benchmarks: - Benchmark tests of ENDF/B-VIII.0 beta 1 using sodium void reactivity worth of FCA-XXVII-1 assembly (M. FUKUSHIMA, Kenji YOKOYAMA); D - Adjustments: - Cross-section adjustment based on JENDL-4.0 using new experiments on the basis of the SG33 benchmark (Kenji YOKOYAMA); - Comparison of adjustment trends with the Cielo evaluation (Sandro PELLONI); - Expanded adjustment in support of CIELO initiative (Giuseppe PALMIOTTI); - First preliminary results of the adjustment exercise using ASPIS Fe88 and SNEAK-7A/7B k_eff and β_eff benchmarks (Ivo KODELI); E - Future actions, deliverables: - Discussion on future of SG39 and possible new subgroup (Giuseppe PALMIOTTI); - WPEC sub-group proposal: Investigation of Covariance Data in

  8. DORIAN, Bayes Method Plant Age Risk Analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    2002-01-01

    1 - Description of program or function: DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized 'aging models' (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities. 2 - Method of solution: DORIAN carries out a Bayesian analysis of failure data and a prior distribution on a time-dependent failure rate to obtain a posterior distribution on the failure rate. The form of the time-dependent failure rate is arbitrary, because DORIAN approximates it by a step function, constant within specified time intervals. Similarly, the parameters may have any prior distribution, because DORIAN uses a discrete distribution to approximate this. Likewise, the database file produced by DORIAN approximates the entire range of possible failure rates or outage durations by means of a discrete probability distribution containing no more than 20 distinct values with their probabilities. 3 - Restrictions on the complexity of the problem: The prior distribution is discrete with up to 25 values. Up to 60 times are accommodated in the discrete time history.
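    The core update, a discrete Bayesian comparison of hypothesized aging models under Poisson failure likelihoods, can be sketched as follows. The two candidate trend models, the exposure and the yearly failure counts are invented for illustration; this is the general technique, not DORIAN's actual code.

    ```python
    import math

    # Discrete Bayesian comparison of two hypothesized aging models for a
    # component failure rate: constant versus linearly increasing. Yearly
    # failure counts are scored with Poisson likelihoods and the prior model
    # probabilities are updated to posteriors.
    exposure = 100.0                               # component-years per year
    counts = [0, 1, 1, 1, 1, 2, 2, 2, 2, 2]        # observed failures, years 0-9

    def rate_constant(t):
        return 0.014                               # overall mean rate (no aging)

    def rate_increasing(t):
        return 0.005 + 0.002 * t                   # hypothesized aging trend

    def log_likelihood(rate_fn):
        ll = 0.0
        for t, k in enumerate(counts):
            mu = rate_fn(t) * exposure             # expected failures in year t
            ll += k * math.log(mu) - mu - math.lgamma(k + 1)
        return ll

    prior = {"constant": 0.5, "increasing": 0.5}
    log_post = {m: math.log(prior[m]) + log_likelihood(f)
                for m, f in [("constant", rate_constant),
                             ("increasing", rate_increasing)]}
    norm = math.log(sum(math.exp(v) for v in log_post.values()))
    posterior = {m: math.exp(v - norm) for m, v in log_post.items()}
    ```

    With these (rising) counts the posterior shifts toward the increasing model; DORIAN additionally discretizes the rate parameters themselves, which is what its 25-value prior and 20-value output distributions refer to.
    
    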

  9. Applying Multi-Criteria Analysis Methods for Fire Risk Assessment

    Directory of Open Access Journals (Sweden)

    Pushkina Julia

    2015-11-01

    Full Text Available The aim of this paper is to demonstrate the application of multi-criteria analysis methods for optimising the fire risk identification and assessment process. The object of this research is fire risk and risk assessment. The subject of the research is the application of the analytic hierarchy process for modelling and assessing the influence of various fire risk factors. The results of the research conducted by the authors can be used by insurance companies to perform a detailed assessment of fire risks at a facility and to calculate a risk surcharge to an insurance premium; by state supervisory institutions to determine whether the condition of a facility complies with regulatory requirements; and by real estate owners and investors to carry out actions that reduce fire risks and minimise possible losses.
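
The analytic hierarchy process step — deriving factor weights from a pairwise comparison matrix — can be sketched as below. The fire risk factors and the judgement matrix are hypothetical, not taken from the paper; the geometric-mean method is a standard approximation of the principal eigenvector.

```python
import math

factors = ["ignition sources", "fire load", "detection/suppression"]
# A[i][j] = how much more important factor i is than factor j (Saaty 1-9 scale).
A = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

n = len(A)
# Geometric mean of each row, normalised, approximates the principal eigenvector.
gm = [math.prod(row) ** (1 / n) for row in A]
weights = [g / sum(gm) for g in gm]

# Consistency check: lambda_max from A·w, CI = (lambda_max - n)/(n - 1),
# CR = CI / RI (random index RI = 0.58 for n = 3); CR < 0.10 is acceptable.
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

for f, w in zip(factors, weights):
    print(f"{f:22s} weight = {w:.3f}")
print(f"consistency ratio CR = {CR:.3f}")
```

A risk surcharge could then be computed as a weighted sum of factor scores using these weights; the consistency ratio guards against contradictory pairwise judgements.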

  10. Risk assessment methods for life cycle costing in buildings

    Directory of Open Access Journals (Sweden)

    Oduyemi Olufolahan

    2016-01-01

    Originality/value. This paper contributes new perspectives aimed at assessing the current level of awareness, usage and advocated benefits of risk assessment methods in LCC, and adds to the limited empirical studies on risk assessment for corporate occupants and decision makers.

  11. Psychosocial Adjustment over a Two-Year Period in Children Referred for Learning Problems: Risk, Resilience, and Adaptation.

    Science.gov (United States)

    Sorensen, Lisa G.; Forbes, Peter W.; Bernstein, Jane H.; Weiler, Michael D.; Mitchell, William M.; Waber, Deborah P.

    2003-01-01

    A 2-year study evaluated the relationship among psychosocial adjustment, changes in academic skills, and contextual factors in 100 children (ages 7-11) with learning problems. Contextual variables were significantly associated with psychosocial adaptation, including the effectiveness of the clinical assessment, extent of academic support, and the…

  12. Parental Dysphoria and Children's Adjustment: Marital Conflict Styles, Children's Emotional Security, and Parenting as Mediators of Risk

    Science.gov (United States)

    Du Rocher Schudlich, Tina D.; Cummings, E. Mark

    2007-01-01

    Dimensions of martial conflict, children's emotional security regarding interparental conflict, and parenting style were examined as mediators between parental dysphoria and child adjustment. A community sample of 262 children, ages 8-16, participated with their parents. Behavioral observations were made of parents' interactions during marital…

  13. DEVELOPMENT OF A RISK SCREENING METHOD FOR CREDITED OPERATOR ACTIONS

    International Nuclear Information System (INIS)

    HIGGINS, J.C.; O'HARA, J.M.; LEWIS, P.M.; PERSENSKY, J.; BONGARRA, J.

    2002-01-01

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors aspects of proposed license amendments that impact human actions credited in a plant's safety analysis. The staff is committed to a graded approach to these reviews that focuses resources on the most risk-important changes. Therefore, a risk-informed screening method was developed based on an adaptation of existing guidance for risk-informed regulation and human factors. The method uses both quantitative and qualitative information to divide amendment requests into different levels of review. The method was evaluated using a variety of tests. This paper summarizes the development of the methodology and the evaluations that were performed to verify its usefulness.

  14. OPERATIONAL RISK IN INTERNATIONAL BUSINESS: TAXONOMY AND ASSESSMENT METHODS

    Directory of Open Access Journals (Sweden)

    Marinoiu Ana Maria

    2009-05-01

    Full Text Available The paper aims at presenting the classifications and assessment methods for operational risk according to international regulations (i.e. Basel II), in the context of its importance as a managerial tool for international business. Considering the growin

  15. The barriers to and enablers of providing reasonably adjusted health services to people with intellectual disabilities in acute hospitals: evidence from a mixed-methods study.

    Science.gov (United States)

    Tuffrey-Wijne, Irene; Goulding, Lucy; Giatras, Nikoletta; Abraham, Elisabeth; Gillard, Steve; White, Sarah; Edwards, Christine; Hollins, Sheila

    2014-04-16

    To identify the factors that promote and compromise the implementation of reasonably adjusted healthcare services for patients with intellectual disabilities in acute National Health Service (NHS) hospitals. A mixed-methods study involving interviews, questionnaires and participant observation (July 2011-March 2013). Six acute NHS hospital trusts in England. Reasonable adjustments for people with intellectual disabilities were identified through the literature. Data were collected on implementation and staff understanding of these adjustments. Data collected included staff questionnaires (n=990), staff interviews (n=68), interviews with adults with intellectual disabilities (n=33), questionnaires (n=88) and interviews (n=37) with carers of patients with intellectual disabilities, and expert panel discussions (n=42). Hospital strategies that supported implementation of reasonable adjustments did not reliably translate into consistent provision of such adjustments. Good practice often depended on the knowledge, understanding and flexibility of individual staff and teams, leading to the delivery of reasonable adjustments being haphazard throughout the organisation. Major barriers included: lack of effective systems for identifying and flagging patients with intellectual disabilities, lack of staff understanding of the reasonable adjustments that may be needed, lack of clear lines of responsibility and accountability for implementing reasonable adjustments, and lack of allocation of additional funding and resources. Key enablers were the Intellectual Disability Liaison Nurse and the ward manager. The evidence suggests that ward culture, staff attitudes and staff knowledge are crucial in ensuring that hospital services are accessible to vulnerable patients. The authors suggest that flagging the need for specific reasonable adjustments, rather than the vulnerable condition itself, may address some of the barriers. Further research is recommended that describes and

  16. Methodical Approaches to Risk Management in a Regional Commercial Bank

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Altukhova

    2016-03-01

    Full Text Available The article presents the results of research on the methodological and information infrastructure for integrated risk management in a regional commercial bank. Within a study of general development trends in the regional banking services market, the risks most significant for a regional bank are identified. The analysis is carried out using a stress-testing technique developed at the Plekhanov Russian University of Economics, based on dynamic economic and mathematical modelling supported by information technologies. The resulting combination of methodological and instrumental tools makes it possible to carry out a dynamic scenario analysis of a commercial bank's activity, identifying potential risks and supporting a financial management strategy that reduces those risks and mitigates the consequences of their realisation. During a computer test, the tool shows the predicted dynamics of the key indicators of a regional commercial bank as they change under the influence of exogenous regulatory measures and the bank-management instruments applied to reduce risk, allowing adjustments to be introduced into the bank's long-term management strategy. As the result of the analysis, a universal management model for the main risks of a regional commercial bank is created within three alternative scenarios. A software product is developed that allows students to acquire practical banking skills and supports the methodological regulation of the organisational procedures of risk management in a regional commercial bank. The software product may also be used in professional development programmes and for obtaining forecast data in a regional commercial bank's risk management system.

  17. Business risks, functions, methods of assessment and ways to reduce risk

    Directory of Open Access Journals (Sweden)

    A.V. Mihalchuk

    2015-06-01

    Full Text Available To succeed in a market economy, entrepreneurs have to take bold actions, and this increases risk. The article describes the concepts of entrepreneurship and business risk, along with the positive and negative aspects of the functions of risk in business. Risk therefore needs to be assessed properly and managed in order to achieve the most effective results in the market. Under market conditions, the problem of assessing and accounting for risk acquires independent theoretical and practical significance as an important component of management theory and practice. Risk is a key element of business activity. Risk situations can lead both to adverse effects (losses, lost profits) and to positive results for a company in the form of increased profit. The article characterises methods of risk assessment and reduction, and provides formulae and examples that can be used to assess risk in an enterprise. Based on an analysis of established risk assessment methods, a number of rules are proposed to reduce business risk.

  18. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    International Nuclear Information System (INIS)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical releases are described.

  19. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations; this is quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve these limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only apply to risk analysis at ground level, but may also be extended to aerial, submarine, or space risk analyses in the near future.
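
The post-processing idea can be sketched in arithmetic form: individual risk at a grid point is the sum over accident scenarios of (event frequency × probability of fatality at that point, which in the paper would come from CFD consequence fields). The tiny 2×2×2 grid, frequencies, and fatality fields below are invented for illustration.

```python
# Each scenario: (frequency per year, fatality-probability field over the grid).
# Scenario 1 is directional (lethal mainly at x == 0); scenario 2 is uniform.
scenarios = [
    (1e-4, {(x, y, z): 0.9 if x == 0 else 0.1
            for x in range(2) for y in range(2) for z in range(2)}),
    (5e-5, {(x, y, z): 0.5
            for x in range(2) for y in range(2) for z in range(2)}),
]

def individual_risk(point):
    # IR(p) = sum over scenarios of frequency * P(fatality at p)
    return sum(freq * pfat[point] for freq, pfat in scenarios)

for pt in sorted(scenarios[0][1]):
    print(pt, f"{individual_risk(pt):.2e}")

# A 3D iso-surface (e.g. IR = 1e-6 per year) is then the set of grid points
# where individual_risk crosses that level.
```

On a real CFD mesh the same pointwise sum is evaluated at every cell, and a marching-cubes style contouring step extracts the iso-surfaces.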

  20. Risk and dose assessment methods in gamma knife QA

    International Nuclear Information System (INIS)

    Banks, W.W.; Jones, E.D.; Rathbun, P.

    1992-10-01

    Traditional methods used in assessing risk in nuclear power plants may be inappropriate for assessing medical radiation risks. The typical philosophy used in assessing nuclear reactor risks is machine-dominated, with only secondary attention paid to the human component, and only after critical machine failure events have been identified. In assessing the risk of a misadministered radiation dose to patients, the primary source of failures seems to stem overwhelmingly from the actions of people, and only secondarily from machine failure modes. In essence, certain medical misadministrations are dominated by human events, not machine failures. Radiological medical devices such as the Leksell Gamma Knife are very simple in design, have few moving parts, and are relatively free from the risks of wear when compared with a nuclear power plant. Since there are major technical differences between a gamma knife and a nuclear power plant, one must select a risk assessment method which is sensitive to these system differences and tailored to the unique medical aspects of the phenomena under study. These differences also generate major shifts in the philosophy and assumptions which drive the risk assessment method (machine-centered vs. person-centered). These basic differences prompted us to develop a person-centered approach to risk assessment that reflects the underlying philosophical and technological differences, has the necessary resolution in its metrics, and is highly reliable (repeatable). The risk approach chosen by the Livermore investigative team has been called the 'Relative Risk Profile Method' and has been described in detail by Banks and Paramore (1983).

  1. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, November 2013

    International Nuclear Information System (INIS)

    De Saint Jean, C.; Dupont, E.; ); Dyrda, J.; Hursin, M.; Pelloni, S.; Ishikawa, M.; Ivanov, E.; Ivanova, T.; Kim, D.H.; Ee, Y.O.; Kodeli, I.; Leal, L.; Leichtle, D.; Palmiotti, G.; Salvatores, M.; Pronyaev, V.; Simakov, S.; )

    2013-11-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the first formal Subgroup 39 meeting held at the NEA, Issy-les-Moulineaux, France, on 28-29 November 2013. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Recent data adjustments performances and trends: 1 - Recommendations from ADJ2010 adjustment (M. Ishikawa); 2 - Feedback on CIELO isotopes from ENDF/B-VII.0 adjustment (G. Palmiotti); 3 - Sensitivity and uncertainty results on FLATTOP-Pu (I. Kodeli); 4 - SG33 benchmark: Comparative adjustment results (S. Pelloni) 5 - Integral benchmarks for data assimilation: selection of a consistent set and establishment of integral correlations (E. Ivanov); 6 - PROTEUS experimental data (M. Hursin); 7 - Additional information on High Conversion Light Water Reactor (HCLWR aka FDWR-II) experiments (14 January 2014); 8 - Data assimilation of benchmark experiments for homogenous thermal/epithermal uranium systems (J. Dyrda); B - Methodology issues: 1 - Adjustment methodology issues (G. Palmiotti); 2 - Marginalisation, methodology issues and nuclear data parameter adjustment (C. De Saint Jean); 3 - Nuclear data parameter adjustment (G. Palmiotti). A list of issues and actions conclude the document

  2. The predictive value of an adjusted COPD assessment test score on the risk of respiratory-related hospitalizations in severe COPD patients.

    Science.gov (United States)

    Barton, Christopher A; Bassett, Katherine L; Buckman, Julie; Effing, Tanja W; Frith, Peter A; van der Palen, Job; Sloots, Joanne M

    2017-02-01

    We evaluated whether a chronic obstructive pulmonary disease (COPD) assessment test (CAT) with adjusted weights for the CAT items could better predict future respiratory-related hospitalizations than the original CAT. Two focus groups (respiratory nurses and physicians) generated two adjusted CAT algorithms. Two multivariate logistic regression models for infrequent (≤1/year) versus frequent (>1/year) future respiratory-related hospitalizations were defined: one with the adjusted CAT score that correlated best with future hospitalizations and one with the original CAT score. Patient characteristics related to future hospitalizations (p ≤ 0.2) were also entered. Eighty-two COPD patients were included. The CAT algorithm derived from the nurse focus group was a borderline significant predictor of hospitalization risk (odds ratio (OR): 1.07; 95% confidence interval (CI): 1.00-1.14; p = 0.050) in a model that also included hospitalization frequency in the previous year (OR: 3.98; 95% CI: 1.30-12.16; p = 0.016) and anticholinergic risk score (OR: 3.08; 95% CI: 0.87-10.89; p = 0.081). Presence of ischemic heart disease and/or heart failure appeared 'protective' (OR: 0.17; 95% CI: 0.05-0.62; p = 0.007). The original CAT score was not significantly associated with hospitalization risk. In conclusion, as a predictor of respiratory-related hospitalizations, an adjusted CAT score was marginally significant (although the original CAT score was not). 'Previous respiratory-related hospitalizations' was the strongest factor in this equation.
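
Odds ratios with Wald confidence intervals of the kind reported above are derived from logistic regression coefficients. As a sketch, the coefficient and standard error below are hypothetical values chosen to land near the OR 1.07 (95% CI 1.00-1.14) quoted in the abstract; they are not taken from the study.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Wald-type odds ratio and 95% CI from a logistic regression coefficient."""
    return (math.exp(beta),          # point estimate: OR = exp(beta)
            math.exp(beta - z * se), # lower limit
            math.exp(beta + z * se)) # upper limit

# Hypothetical: log-odds increase of 0.0677 per adjusted-CAT point, SE 0.0345.
or_, lo, hi = odds_ratio_ci(0.0677, 0.0345)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

A CI whose lower limit sits at 1.00, as here, corresponds to the borderline p ≈ 0.05 reported for the nurse-derived algorithm.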

  3. Risk, Conflict, Mothers' Parenting, and Children's Adjustment in Low-Income, Mexican Immigrant, and Mexican American Families.

    Science.gov (United States)

    Dumka, Larry E.; Roosa, Mark W.; Jackson, Kristina M.

    1997-01-01

    Reports on a test of a risk-stress process model. Examines the influence of mothers' supportive parenting and inconsistent discipline practices on risk factors and family conflict as these affect children's conduct disorder and depression. Tests on 121 families indicate that mothers' supportive parenting partially mediated family conflict effects…

  4. Environmental factors and social adjustment as predictors of a first psychosis in subjects at ultra high risk

    NARCIS (Netherlands)

    Dragt, Sara; Nieman, Dorien H.; Veltman, Doede; Becker, Hiske E.; van de Fliert, Reinaud; de Haan, Lieuwe; Linszen, Don H.

    2011-01-01

    BACKGROUND: The onset of schizophrenia is associated with genetic, symptomatic, social and environmental risk factors. The aim of the present study was to determine which environmental factors may contribute to a prediction of a first psychotic episode in subjects at Ultra High Risk (UHR) for

  5. Refined estimates of local recurrence risks by DCIS score adjusting for clinicopathological features: a combined analysis of ECOG-ACRIN E5194 and Ontario DCIS cohort studies.

    Science.gov (United States)

    Rakovitch, E; Gray, R; Baehner, F L; Sutradhar, R; Crager, M; Gu, S; Nofech-Mozes, S; Badve, S S; Hanna, W; Hughes, L L; Wood, W C; Davidson, N E; Paszat, L; Shak, S; Sparano, J A; Solin, L J

    2018-06-01

    Better tools are needed to estimate local recurrence (LR) risk after breast-conserving surgery (BCS) for DCIS. The DCIS score (DS) was validated as a predictor of LR after BCS in E5194 and the Ontario DCIS cohort (ODC). We combined data from E5194 and ODC, adjusting for clinicopathological factors, to provide refined estimates of the 10-year risk of LR after treatment by BCS alone. Patients with positive margins or multifocality were excluded. Identical Cox regression models were fit for each study. Patient-specific meta-analysis was used to calculate precision-weighted estimates of 10-year LR risk by DS, age, tumor size and year of diagnosis. The combined cohort includes 773 patients. The DS, age at diagnosis, tumor size and year of diagnosis provided independent prognostic information on the 10-year LR risk (p ≤ 0.009). Hazard ratios from the E5194 and ODC cohorts were similar for the DS (2.48, 1.95 per 50 units), tumor size ≤ 1 versus > 1-2.5 cm (1.45, 1.47) and age ≥ 50 versus < 50 years. Combining the DS with tumor size and age at diagnosis identified more patients with low or high (> 15%) 10-year LR risk after BCS alone than utilization of the DS alone or clinicopathological factors alone. The combined analysis provides refined estimates of 10-year LR risk after BCS for DCIS. Adding information on tumor size and age at diagnosis to the DS, adjusting for year of diagnosis, provides improved LR risk estimates to guide treatment decision making.

  6. COX-2 rs689466, rs5275, and rs20417 polymorphisms and risk of head and neck squamous cell carcinoma: a meta-analysis of adjusted and unadjusted data

    International Nuclear Information System (INIS)

    Leng, Wei-Dong; Wen, Xiu-Jie; Kwong, Joey S. W.; Huang, Wei; Chen, Jian-Gang; Zeng, Xian-Tao

    2016-01-01

    Numerous case–control studies have been performed to investigate the association between three cyclooxygenase-2 (COX-2) polymorphisms (rs20417 (−765G > C), rs689466 (−1195G > A), and rs5275 (8473 T > C)) and the risk of head and neck squamous cell carcinoma (HNSCC). However, the results were inconsistent. Therefore, we conducted this meta-analysis to investigate the association. We searched PubMed, Embase, and Web of Science up to January 20, 2015 (last updated on May 12, 2016). Two independent reviewers extracted the data. Odds ratios (ORs) with their 95 % confidence intervals (CIs) were used to assess the association. All statistical analyses were performed using the Review Manager (RevMan) 5.2 software. Finally, eight case–control studies were included in this meta-analysis. For unadjusted data, an association with increased risk was observed in three genetic models for the COX-2 rs689466 polymorphism; however, the COX-2 rs5275 and rs20417 polymorphisms were not related to HNSCC risk in this study. The pooled results from adjusted data all revealed no significant association between these three polymorphisms and the risk of HNSCC. Similar results were found in the subgroup analyses, based on both unadjusted and adjusted data. Current results suggest that COX-2 rs689466, rs5275, and rs20417 polymorphisms are not associated with HNSCC. Further large and well-designed studies are necessary to validate this association.
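
Pooling odds ratios across case-control studies is typically done by inverse-variance weighting of log odds ratios. Below is a minimal fixed-effect sketch with made-up 2×2 tables; these are not the study's data, and RevMan offers random-effects and Mantel-Haenszel alternatives not shown here.

```python
import math

# (cases exposed, cases unexposed, controls exposed, controls unexposed)
tables = [
    (60, 140, 45, 155),
    (35, 115, 25, 125),
]

log_ors, weights = [], []
for a, b, c, d in tables:
    log_or = math.log((a * d) / (b * c))   # log odds ratio of the 2x2 table
    var = 1/a + 1/b + 1/c + 1/d            # Woolf variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)                # inverse-variance weight

# Fixed-effect pooled estimate: precision-weighted mean of the log ORs.
pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.2f}-{math.exp(pooled + 1.96*se):.2f})")
```

The pooled log OR always lies between the study-level estimates, pulled toward the larger (more precise) study.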

  7. Infections and risk-adjusted length of stay and hospital mortality in Polish Neonatology Intensive Care Units

    Directory of Open Access Journals (Sweden)

    A. Różańska

    2015-06-01

    Conclusions: The general condition of VLBW infants significantly increases both their risk of mortality and their LOS; in contrast, the presence of infection significantly prolonged LOS only.

  8. Hierarchic Analysis Method to Evaluate Rock Burst Risk

    Directory of Open Access Journals (Sweden)

    Ming Ji

    2015-01-01

    Full Text Available In order to reasonably evaluate the risk of rock bursts in mines, the factors influencing rock bursts and the existing grading criteria for rock burst risk were studied. By building an analytic hierarchy process model, the natural, technological, and management factors that influence rock bursts were analyzed, determining the degree of each factor's influence (i.e., its weight) and a comprehensive index. The grade of rock burst risk was then assessed. The results showed that the assessment level generated by the model accurately reflected the actual degree of rock burst risk in mines. The model improved the operability and practicability of the existing evaluation criteria and also enhanced the accuracy and scientific rigour of rock burst risk assessment.

  9. A method for minimum risk portfolio optimization under hybrid uncertainty

    Science.gov (United States)

    Egorova, Yu E.; Yazenin, A. V.

    2018-03-01

    In this paper, we investigate a minimum risk portfolio model under hybrid uncertainty when the profitability of financial assets is described by fuzzy random variables. According to Feng, the variance of a portfolio is defined as a crisp value. To aggregate fuzzy information the weakest (drastic) t-norm is used. We construct an equivalent stochastic problem of the minimum risk portfolio model and specify the stochastic penalty method for solving it.
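
The crisp two-asset special case shows the objective being minimised: with return variances s1, s2 and covariance c, portfolio variance V(w) = w²s1 + (1-w)²s2 + 2w(1-w)c is minimised at w* = (s2 - c)/(s1 + s2 - 2c). The fuzzy-random layer and drastic t-norm aggregation of the paper are not reproduced here, and the numbers are illustrative.

```python
# Variances and covariance of the two assets' returns (illustrative values).
s1, s2, c = 0.04, 0.09, 0.01

# Closed-form minimum-variance weight on asset 1 (weight on asset 2 is 1 - w).
w = (s2 - c) / (s1 + s2 - 2 * c)

# Portfolio variance at the optimum.
V = w**2 * s1 + (1 - w)**2 * s2 + 2 * w * (1 - w) * c
print(f"weight on asset 1 = {w:.3f}, portfolio variance = {V:.4f}")
```

Note that the optimal portfolio variance falls below the variance of either asset on its own, which is the diversification effect the minimum risk model exploits; the paper replaces these crisp variances with possibilistic characteristics of fuzzy random returns.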

  10. Towards risk-based structural integrity methods for PWRs

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Lloyd, R.B.

    1992-01-01

    This paper describes the development of risk-based structural integrity assurance methods and their application to Pressurized Water Reactor (PWR) plant. In-service inspection is introduced as a way of reducing the failure probability of high risk sites and the latter are identified using reliability analysis; the extent and interval of inspection can also be optimized. The methodology is illustrated by reference to the aspect of reliability of weldments in PWR systems. (author)

  11. Covariate-adjusted measures of discrimination for survival data

    DEFF Research Database (Denmark)

    White, Ian R; Rapsomaniki, Eleni; Frikke-Schmidt, Ruth

    2015-01-01

    by the study design (e.g. age and sex) influence discrimination and can make it difficult to compare model discrimination between studies. Although covariate adjustment is a standard procedure for quantifying disease-risk factor associations, there are no covariate adjustment methods for discrimination statistics in censored survival data. OBJECTIVE: To develop extensions of the C-index and D-index that describe the prognostic ability of a model adjusted for one or more covariate(s). METHOD: We define a covariate-adjusted C-index and D-index for censored survival data, propose several estimators, and investigate their performance in simulation studies and in data from a large individual participant data meta-analysis, the Emerging Risk Factors Collaboration. RESULTS: The proposed methods perform well in simulations. In the Emerging Risk Factors Collaboration data, the age-adjusted C-index and D-index were…
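
The unadjusted C-index that the paper extends can be sketched for censored data as Harrell's concordance over usable pairs: a pair is usable only when the earlier time is an observed event, and it is concordant when the model assigns the higher risk score to the subject who failed earlier. The four subjects below are invented for illustration.

```python
subjects = [
    # (follow-up time, event observed?, model risk score)
    (2.0, True,  0.9),
    (3.5, True,  0.7),
    (4.0, False, 0.8),   # censored: usable only as the later member of a pair
    (5.0, True,  0.2),
]

def c_index(data):
    concordant = comparable = 0.0
    for i in range(len(data)):
        for j in range(len(data)):
            ti, ei, ri = data[i]
            tj, _, rj = data[j]
            if ei and ti < tj:       # usable pair: i failed before j's time
                comparable += 1
                if ri > rj:
                    concordant += 1  # higher score failed earlier: concordant
                elif ri == rj:
                    concordant += 0.5
    return concordant / comparable

print(f"C-index = {c_index(subjects):.3f}")
```

The covariate-adjusted versions proposed in the paper effectively compute such concordance within strata of the adjustment covariate (e.g. age) rather than over all pairs.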

  12. A typology of interpartner conflict and maternal parenting practices in high-risk families: examining spillover and compensatory models and implications for child adjustment.

    Science.gov (United States)

    Sturge-Apple, Melissa L; Davies, Patrick T; Cicchetti, Dante; Fittoria, Michael G

    2014-11-01

    The present study incorporates a person-based approach to identify spillover and compartmentalization patterns of interpartner conflict and maternal parenting practices in an ethnically diverse sample of 192 2-year-old children and their mothers who had experienced higher levels of socioeconomic risk. In addition, we tested whether sociocontextual variables were differentially predictive of these profiles and examined how interpartner-parenting profiles were associated with children's physiological and psychological adjustment over time. As expected, latent class analyses extracted three primary profiles of functioning: adequate functioning, spillover, and compartmentalizing families. Furthermore, interpartner-parenting profiles were differentially associated with both sociocontextual predictors and children's adjustment trajectories. The findings highlight the developmental utility of incorporating person-based approaches to models of interpartner conflict and maternal parenting practices.

  13. Methods and models used in comparative risk studies

    International Nuclear Information System (INIS)

    Devooght, J.

    1983-01-01

    Comparative risk studies make use of a large number of methods and models based upon a set of incompletely formulated assumptions or value judgements. Owing to the multidimensionality of risks and benefits, the economic and social context may notably influence the final result. Five classes of models are briefly reviewed: accounting of fluxes of effluents, radiation and energy; transport models and health effects; systems reliability and Bayesian analysis; economic analysis of reliability and cost-risk-benefit analysis; and decision theory in the presence of uncertainty and multiple objectives. The purpose and prospects of comparative studies are assessed in view of probable diminishing returns for large generic comparisons.

  14. Assessment of Methods for Estimating Risk to Birds from ...

    Science.gov (United States)

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16

  15. [Legal and methodical aspects of occupational risk management].

    Science.gov (United States)

    2011-01-01

    Legal and methodical aspects of occupational risk management (ORM) are considered, taking into account new official documents. The introduction of the notions of risk and risk management into the Labor Code reflects a change in the forms of occupational health and safety. The role of hygienists and occupational medicine professionals in workplace conditions certification (WCC) and periodic medical examinations (PME) is strengthened. ORM could be improved by introducing a prognosis and causation block based on IT technologies that could link the WCC and PME systems, thus improving the effectiveness of prophylaxis.

  16. A comparison of radiological risk assessment methods for environmental restoration

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Peterson, J.M.

    1993-01-01

    Evaluation of risks to human health from exposure to ionizing radiation at radioactively contaminated sites is an integral part of the decision-making process for determining the need for remediation and selecting remedial actions that may be required. At sites regulated under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a target risk range of 10⁻⁴ to 10⁻⁶ incremental cancer incidence over a lifetime is specified by the US Environmental Protection Agency (EPA) as generally acceptable, based on the reasonable maximum exposure to any individual under current and future land use scenarios. Two primary methods currently being used in conducting radiological risk assessments at CERCLA sites are compared in this analysis. Under the first method, the radiation dose equivalent (i.e., Sv or rem) to the receptors of interest over the appropriate period of exposure is estimated and multiplied by a risk factor (cancer risk/Sv). Alternatively, incremental cancer risk can be estimated by combining the EPA's cancer slope factors (previously termed potency factors) for radionuclides with estimates of radionuclide intake by ingestion and inhalation, as well as radionuclide concentrations in soil that contribute to external dose. The comparison of the two methods has demonstrated that resulting estimates of lifetime incremental cancer risk under these different methods may differ significantly, even when all other exposure assumptions are held constant, with the magnitude of the discrepancy depending upon the dominant radionuclides and exposure pathways for the site. The basis for these discrepancies, the advantages and disadvantages of each method, and the significance of the discrepant results for environmental restoration decisions are presented.
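
The two calculation routes being compared reduce to simple arithmetic. The sketch below uses illustrative, hypothetical doses, intakes, and factors (not EPA's actual slope factors) purely to show how the two methods can produce diverging risk estimates from the same exposure scenario.

```python
# Method 1: lifetime dose equivalent × nominal risk factor (cancers per Sv).
dose_sv = 0.02                  # committed effective dose over the exposure period
risk_factor = 5e-2              # hypothetical lifetime cancer risk per Sv
risk_method1 = dose_sv * risk_factor

# Method 2: pathway-specific intakes × radionuclide slope factors (risk per Bq).
intakes_bq = {"ingestion": 2.0e6, "inhalation": 1.0e4}
slope_factors = {"ingestion": 2.0e-10, "inhalation": 1.0e-8}  # hypothetical
risk_method2 = sum(intakes_bq[p] * slope_factors[p] for p in intakes_bq)

print(f"method 1 (dose x risk factor): {risk_method1:.1e}")
print(f"method 2 (slope factors):      {risk_method2:.1e}")
print(f"ratio of the two estimates:    {risk_method1 / risk_method2:.1f}")
```

Even a factor-of-two gap like the one shown here can move an estimate across the 10⁻⁴ to 10⁻⁶ CERCLA target range boundary, which is why the choice of method matters for remediation decisions.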

  17. The Impact of Disability and Social Determinants of Health on Condition-Specific Readmissions beyond Medicare Risk Adjustments: A Cohort Study.

    Science.gov (United States)

    Meddings, Jennifer; Reichert, Heidi; Smith, Shawna N; Iwashyna, Theodore J; Langa, Kenneth M; Hofer, Timothy P; McMahon, Laurence F

    2017-01-01

    Readmission rates after pneumonia, heart failure, and acute myocardial infarction hospitalizations are risk-adjusted for age, gender, and medical comorbidities and used to penalize hospitals. To assess the impact of disability and social determinants of health on condition-specific readmissions beyond current risk adjustment. Retrospective cohort study of Medicare patients using 1) linked Health and Retirement Study-Medicare claims data (HRS-CMS) and 2) Healthcare Cost and Utilization Project State Inpatient Databases (Florida, Washington) linked with ZIP Code-level measures from the Census American Community Survey (ACS-HCUP). Multilevel logistic regression models assessed the impact of disability and selected social determinants of health on readmission beyond current risk adjustment. Outcomes measured were readmissions ≤30 days after hospitalizations for pneumonia, heart failure, or acute myocardial infarction. HRS-CMS models included disability measures (activities of daily living [ADL] limitations, cognitive impairment, nursing home residence, home healthcare use) and social determinants of health (spouse, children, wealth, Medicaid, race). ACS-HCUP model measures were ZIP Code-percentage of residents ≥65 years of age with ADL difficulty, spouse, income, Medicaid, and patient-level and hospital-level race. For pneumonia, ≥3 ADL difficulties (OR 1.61, CI 1.079-2.391) and prior home healthcare needs (OR 1.68, CI 1.204-2.355) increased readmission in HRS-CMS models (N = 1631); ADL difficulties (OR 1.20, CI 1.063-1.352) and 'other' race (OR 1.14, CI 1.001-1.301) increased readmission in ACS-HCUP models (N = 27,297). For heart failure, children (OR 0.66, CI 0.437-0.984) and wealth (OR 0.53, CI 0.349-0.787) lowered readmission in HRS-CMS models (N = 2068), while black (OR 1.17, CI 1.056-1.292) and 'other' race (OR 1.14, CI 1.036-1.260) increased readmission in ACS-HCUP models (N = 37,612). For acute myocardial infarction, nursing home status
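    The readmission models above report adjusted odds ratios with confidence intervals. The conversion from a fitted logistic-regression coefficient to an odds ratio with a Wald interval is a small standard calculation; this is a generic sketch, not the authors' code.

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error into
    an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se))

# A coefficient of 0.0 corresponds to OR = 1.0 (no effect); the interval
# width is driven by the standard error.
or_null, lo, hi = odds_ratio_with_ci(0.0, 0.1)
```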

  18. The bystander effect model of Brenner and Sachs fitted to lung cancer data in 11 cohorts of underground miners, and equivalence of fit of a linear relative risk model with adjustment for attained age and age at exposure

    International Nuclear Information System (INIS)

    Little, M P

    2004-01-01

    Bystander effects following exposure to α-particles have been observed in many experimental systems, and imply that linearly extrapolating low dose risks from high dose data might materially underestimate risk. Brenner and Sachs (2002 Int. J. Radiat. Biol. 78 593-604; 2003 Health Phys. 85 103-8) have recently proposed a model of the bystander effect which they use to explain the inverse dose rate effect observed for lung cancer in underground miners exposed to radon daughters. In this paper we fit the model of the bystander effect proposed by Brenner and Sachs to 11 cohorts of underground miners, taking account of the covariance structure of the data and the period of latency between the development of the first pre-malignant cell and clinically overt cancer. We also fitted a simple linear relative risk model, with adjustment for age at exposure and attained age. The methods that we use for fitting both models are different from those used by Brenner and Sachs, in particular taking account of the covariance structure, which they did not, and omitting certain unjustifiable adjustments to the miner data. The fit of the original model of Brenner and Sachs (with 0 y period of latency) is generally poor, although it is much improved by assuming a 5 or 6 y period of latency from the first appearance of a pre-malignant cell to cancer. The fit of this latter model is equivalent to that of a linear relative risk model with adjustment for age at exposure and attained age. In particular, both models are capable of describing the observed inverse dose rate effect in this data set
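    A linear relative-risk model with log-linear modification by age at exposure and attained age, of the general form fitted above, can be sketched as follows. The parameter names, reference ages, and defaults are illustrative assumptions, not the fitted values from the miner analysis.

```python
import math

def relative_risk(exposure, beta, age_at_exposure, attained_age,
                  gamma_exp=0.0, gamma_att=0.0, ref_exp=30.0, ref_att=60.0):
    """Linear relative risk in cumulative exposure, modified log-linearly by
    age at exposure and attained age:
    RR = 1 + beta * w * exp(gamma_exp*(e - e0) + gamma_att*(a - a0))."""
    modifier = math.exp(gamma_exp * (age_at_exposure - ref_exp)
                        + gamma_att * (attained_age - ref_att))
    return 1.0 + beta * exposure * modifier
```

With both gamma parameters at zero the model reduces to a plain linear relative-risk model, which is the unadjusted baseline the adjustment terms improve upon.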

  19. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Full Text Available Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low- and middle-income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) participatory mapping; 2) quantitative interviews; 3) sex work venue field observation; 4) time-location-activity diaries; 5) in-depth interviews about daily activity spaces. We found that the mixed methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high-risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource-constrained contexts provides new opportunities for informing public health interventions.

  20. Risk Assessment of Healthcare Waste by Preliminary Hazard Analysis Method

    Directory of Open Access Journals (Sweden)

    Pouran Morovati

    2017-09-01

    Full Text Available Introduction and purpose: Improper management of healthcare waste (HCW) can pose considerable risks to human health and the environment and cause serious problems in developing countries such as Iran. In this study, we sought to determine the hazards of HCW in the public hospitals affiliated to Abadan School of Medicine using the preliminary hazard analysis (PHA) method. Methods: In this descriptive and analytic study, health risk assessment of HCW in the government hospitals affiliated to Abadan School of Medicine (4 public hospitals) was carried out using PHA in the summer of 2016. Results: We noted the high risk of sharps and infectious wastes. Considering the dual risk of injury and disease transmission, sharps were classified in the very high-risk group, while pharmaceutical, chemical, and radioactive wastes were classified in the medium-risk group. Sharps posed the highest risk, and pharmaceutical and chemical wastes the lowest. Among the various stages of waste management, the waste treatment stage was the most hazardous in all the studied hospitals. Conclusion: To diminish the risks associated with healthcare waste management in the studied hospitals, adequate training of healthcare workers and care providers, provision of suitable personal protective and transportation equipment, and supervision by the environmental health manager of each hospital should be considered by the authorities.

  1. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    Science.gov (United States)

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz the 1990 Nobel Memorial Prize in Economics. A typical approach to measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. Portfolio risk, on the other hand, is usually measured by volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility has been the major measure of risk owing to its simplicity and its validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attacks of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of extreme-risk analysis through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of possible portfolio returns. The article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model
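    The lower-tail conditional expectation used above as the extreme-risk measure f(4) can be sketched directly from a sample of portfolio returns. The partitioning point (worst 5% of outcomes here) is an illustrative choice, not the article's calibration.

```python
def f4(returns, tail_fraction=0.05):
    """Expected portfolio return conditional on falling in the lower tail:
    the mean of the worst `tail_fraction` of sampled returns."""
    ordered = sorted(returns)
    k = max(1, int(len(ordered) * tail_fraction))  # tail size, at least one point
    return sum(ordered[:k]) / k
```

In a multiobjective formulation one would then trade off maximizing expected return against this tail measure, for example with a genetic algorithm as the article does with Evolver.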

  2. Analysis of risk assessment methods for goods trucking

    Directory of Open Access Journals (Sweden)

    Yunyazova A.O.

    2018-04-01

    Full Text Available The article considers risk assessment models applicable to cargo transportation, used to forecast possible damage in the form of financial and material costs and to reduce the probability of its occurrence. An analysis of risk by the "Criterion. Event. Rule" method is presented. This method is based on collecting information by various means, assigning scores to the identified risks, ranking them, and drawing up an analysis report. It can be carried out as a fully manual process of collecting information and performing calculations, or it can be automated from data collection through delivery of the finished results (although in that case some nuances that could significantly influence the outcome of the analysis may be ignored). The expert method is of particular importance, since it relies directly on human experience; here the human factor plays a special role. The information collected and the scores assigned to risk groups depend on the extent to which the experts agree on the issue: the smaller the spread in the experts' estimates, the more accurate and useful the results will be.

  3. Cumulative Socioeconomic Status Risk, Allostatic Load, and Adjustment: A Prospective Latent Profile Analysis with Contextual and Genetic Protective Factors

    Science.gov (United States)

    Brody, Gene H.; Yu, Tianyi; Chen, Yi-Fu; Kogan, Steven M.; Evans, Gary W.; Beach, Steven R. H.; Windle, Michael; Simons, Ronald L.; Gerrard, Meg; Gibbons, Frederick X.; Philibert, Robert A.

    2013-01-01

    The health disparities literature has identified a common pattern among middle-aged African Americans that includes high rates of chronic disease along with low rates of psychiatric disorders despite exposure to high levels of cumulative socioeconomic status (SES) risk. The current study was designed to test hypotheses about the developmental…

  4. Acculturation and Adjustment in Latino Adolescents: How Cultural Risk Factors and Assets Influence Multiple Domains of Adolescent Mental Health

    Science.gov (United States)

    Smokowski, Paul; Buchanan, Rachel L.; Bacallao, Martica L.

    2009-01-01

    The purpose of this study was to examine the relationships among risk factors, cultural assets, and Latino adolescent mental health outcomes. We extend past research by using a longitudinal design and evaluating direct and moderated acculturation effects across a range of internalizing, externalizing, and academic engagement outcomes. The sample…

  5. Development of environmental risk assessment framework using index method

    International Nuclear Information System (INIS)

    Ali, M.W.; Wu, Y.

    2000-01-01

    This paper presents a newly developed framework for assessing the risk from events which are considered to be major accidents to the environment according to the classifications of the United Kingdom Department of the Environment (DoE). The application of an environmental risk assessment framework using the newly developed index method is demonstrated by means of a case study. The framework makes use of the Environmental Hazard Index (EHI) method of the United Kingdom AEA Technology for releases to rivers, but improves it by taking account of toxic dose rather than concentration; taking account of long-term effects, including persistence and bioaccumulation, not just short-term effects; extending the method to all aspects of the environment, not just rivers; and allowing account to be taken of design changes to mitigate the risk. The development of the framework has also led to a revision of the tolerability criteria proposed earlier for use with it; remaining weaknesses are identified and further work is recommended to improve this newly proposed environmental risk assessment framework. From the study, it is recommended that the framework be applied to a wide range of other case studies in order to further improve it. The framework should be modified to maintain consistency when the DoE revises its definitions of major accidents to the environment. Ease of use of the framework (and any other environmental framework) would be aided by the compilation of databases for environmental toxicity, river data and available consequence models. Further work could also be done to suggest methods of mitigating the risk and including them as numerical factors within the method. (author)

  6. Cancer risks, risk-cost-benefit analyses, and the scientific method

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1995-01-01

    Two main changes in risk analysis are increasingly beginning to influence the manner in which, in the perception of scientists, low-dose modeling of radiation carcinogenesis is supposed to be done. In the past, efforts to model radiation risks have been carried out under the banner of scientific endeavors. On closer inspection, however, it has become obvious that these efforts were not guided by the scientific method and that a change in approach is needed. We realize increasingly that risk analysis is not done in a vacuum and that any action taken due to the result of the analysis not only has a benefit in the form of a risk reduction but leads inevitably to an increase in cost and an increase in the risks of persons effecting the benefit. Thus, a risk-cost-benefit analysis should be done and show a clear-cut net benefit before a remedial action is taken

  7. Corporate Cash Holdings and Adjustment Behaviour in Chinese Firms: An Empirical Analysis Using Generalized Method of Moments

    Directory of Open Access Journals (Sweden)

    Ajid ur Rehman

    2016-05-01

    Full Text Available This study is intended to identify the motives for cash holding in Chinese firms and the theories associated with these motives. The study is unique because it not only estimates the adjustment speed of corporate cash holdings but also discusses several firm-specific factors that affect cash holdings in Chinese firms, with special reference to Chinese SOEs and NSOEs. An extensive panel data set comprising 1,632 A-share listed Chinese firms over the period from 2001 to 2013 is used for the analysis. The study reports a lower adjustment coefficient for Chinese firms compared to firms in developed nations, and finds that the target level of cash holdings in Chinese firms is better explained by the trade-off and pecking order theories. To cope with issues of endogeneity and serial correlation, the study applies GMM and a random effects model with an added AR (autoregressive) term.
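    The adjustment coefficient discussed above comes from a partial-adjustment model, in which each period a firm closes a fraction λ of the gap between its current and target cash ratio. The study estimates λ by GMM; this sketch only illustrates the mechanics, with made-up numbers.

```python
def partial_adjustment_path(initial, target, speed, periods):
    """Cash-ratio path when a fraction `speed` (lambda) of the gap to the
    target is closed each period: c_t = c_{t-1} + speed * (target - c_{t-1})."""
    path = [initial]
    for _ in range(periods):
        path.append(path[-1] + speed * (target - path[-1]))
    return path

# A lower lambda, as reported for Chinese firms, means slower convergence
# toward the target cash ratio.
slow = partial_adjustment_path(0.10, 0.20, 0.2, 5)
fast = partial_adjustment_path(0.10, 0.20, 0.6, 5)
```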

  8. INTERIM REPORT IMPROVED METHODS FOR INCORPORATING RISK IN DECISION MAKING

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, M. J.; Fraley, D. W.; Denning, R. S.

    1980-08-01

    This paper reports observations and preliminary investigations in the first phase of a research program covering methodologies for making safety-related decisions. The objective has been to gain insight into NRC perceptions of the value of formal decision methods, their possible applications, and how risk is, or may be, incorporated in decision making. The perception of formal decision making techniques, held by various decision makers, and what may be done to improve them, were explored through interviews with NRC staff. An initial survey of decision making methods, an assessment of the applicability of formal methods vis-a-vis the available information, and a review of methods of incorporating risk and uncertainty have also been conducted.

  9. ANALYSIS METHODS OF BANKRUPTCY RISK IN ROMANIAN ENERGY MINING INDUSTRY

    Directory of Open Access Journals (Sweden)

    CORICI MARIAN CATALIN

    2016-12-01

    Full Text Available The study is an analysis of bankruptcy risk and an assessment of the economic performance of an entity in the energy mining industry from the southwest region. It assesses the risk of bankruptcy using the score method and indicators which reflect the results obtained and elements of the balance sheet of an organization involved in mining and energy that contributes to the stability of the national energy system. The analysis undertaken focuses on the application of business organization models that allow a comprehensive assessment of the risk of bankruptcy and can serve as an instrument for its forecasting. The study highlights the development of bankruptcy risk within the organization through the Altman model and the Conan-Holder model, in order to present a well-rounded picture of the organization's ability to ensure business continuity
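    The Altman model mentioned above combines five balance-sheet ratios into a single Z-score. A sketch using the classic 1968 coefficients for publicly traded manufacturing firms follows; the Conan-Holder model is an analogous weighted sum with different ratios and weights, not shown here.

```python
def altman_z(x1, x2, x3, x4, x5):
    """Classic Altman Z-score. Inputs: x1 = working capital / total assets,
    x2 = retained earnings / TA, x3 = EBIT / TA,
    x4 = market value of equity / total liabilities, x5 = sales / TA."""
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Conventional interpretation bands for the original model."""
    if z > 2.99:
        return "safe"
    if z < 1.81:
        return "distress"
    return "grey"

z_example = altman_z(0.2, 0.1, 0.15, 0.8, 1.1)  # hypothetical ratios
```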

  10. Methods for estimating risks to nuclear power plants from shipping

    International Nuclear Information System (INIS)

    Walker, D.H.; Hartman, M.G.; Robbins, T.R.

    1975-01-01

    Nuclear power plants sited on land near shipping lanes or offshore can be exposed to potential risks if there is nearby ship or barge traffic which involves the transport of hazardous cargo. Methods that have been developed for estimating the degree of risk are summarized. Of concern are any accidents which could lead to a release or spill of the hazardous cargo, or to an explosion. A probability of occurrence of the order of 10⁻⁷ per year is a general guideline which has been used to judge whether or not the risk from hazards created by accidents is acceptable. This guideline has been followed in the risk assessment discussed in this paper. 19 references

  11. Screening-Level Ecological Risk Assessment Methods, Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Mirenda, Richard J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2012-08-16

    This document provides guidance for screening-level assessments of potential adverse impacts to ecological resources from release of environmental contaminants at the Los Alamos National Laboratory (LANL or the Laboratory). The methods presented are based on two objectives, namely: to provide a basis for reaching consensus with regulators, managers, and other interested parties on how to conduct screening-level ecological risk investigations at the Laboratory; and to provide guidance for ecological risk assessors under the Environmental Programs (EP) Directorate. This guidance promotes consistency, rigor, and defensibility in ecological screening investigations and in reporting those investigation results. The purpose of the screening assessment is to provide information to risk managers so that informed risk-management decisions can be made. This document provides examples of recommendations and possible risk-management strategies.

  12. Comparison of methods for prioritizing risk in radiation oncology

    International Nuclear Information System (INIS)

    Biazotto, Bruna; Tokarski, Marcio

    2016-01-01

    Proactive risk management tools, such as Failure Mode and Effect Analysis (FMEA), were imported from engineering and have been widely used in radiation oncology. An important step in this process is risk prioritization, and there are many methods for performing it. This paper compares the risk prioritization of the computerized planning phase in interstitial implants with high-dose-rate brachytherapy performed with Health Care Failure Mode and Effect Analysis (HFMEA) and with FMEA following the guidelines of Task Group 100 (TG 100) of the American Association of Physicists in Medicine. Of the 33 possible failure modes of this process, 21 require more attention when evaluated by HFMEA and 22 when evaluated by FMEA TG 100. Despite the high agreement between the methods, the execution of HFMEA was simpler. (author)
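    Both HFMEA and TG 100-style FMEA rank failure modes by a product of ordinal scores; in the FMEA case the risk priority number is RPN = occurrence × severity × (lack of) detectability. A sketch of that prioritization step follows; the failure-mode names, score scales, and threshold are invented for illustration.

```python
def rpn(occurrence, severity, detectability):
    """FMEA risk priority number: product of three ordinal scores (e.g. 1-10)."""
    return occurrence * severity * detectability

def prioritize(failure_modes, threshold):
    """Return (name, RPN) pairs whose RPN meets the attention threshold,
    worst first. `failure_modes` is a list of (name, O, S, D) tuples."""
    scored = [(name, rpn(o, s, d)) for name, o, s, d in failure_modes]
    return sorted([fm for fm in scored if fm[1] >= threshold],
                  key=lambda fm: fm[1], reverse=True)

# Hypothetical brachytherapy-planning failure modes
modes = [("wrong source position", 4, 9, 6),
         ("plan not exported", 2, 7, 3),
         ("dwell time typo", 3, 8, 5)]
flagged = prioritize(modes, threshold=100)
```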

  13. Development of fire risk assessment method caused by earthquake

    International Nuclear Information System (INIS)

    Mitomo, Nobuo; Matsukura, Hiroshi; Matsuoka, Takeshi; Suzuki, Kazutaka

    2000-01-01

    The purpose of this research is to establish an assessment method for the risk of multiple fires caused by an earthquake, within the framework of PSA. In order to establish this method, we have defined four tasks and in 1999 started a five-year research project. The results will be useful not only for nuclear power plants but also for chemical plants, traffic systems, etc. (author)

  14. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2014

    International Nuclear Information System (INIS)

    Aliberti, G.; Archier, P.; Dunn, M.; Dupont, E.; Hill, I.; ); Garcia, A.; Hursin, M.; Pelloni, S.; Ivanova, T.; Kodeli, I.; Palmiotti, G.; Salvatores, M.; Touran, N.; Wenming, Wang; Yokoyama, K.

    2014-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the second Subgroup meeting, held at the NEA, Issy-les-Moulineaux, France, on 13 May 2014. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Welcome: Review of actions (M. Salvatores); B - Inter-comparison of sensitivity coefficients: 1 - Sensitivity Computation with Monte Carlo Methods (T. Ivanova); 2 - Sensitivity analysis of FLATTOP-Pu (I. Kodeli); 3 - Sensitivity coefficients by means of SERPENT-2 (S. Pelloni); 4 - Demonstration - Database for ICSBEP (DICE) and Database and Analysis Tool for IRPhE (IDAT) (I. Hill); C - Specific new experiments: 1 - PROTEUS FDWR-II (HCLWR) program summary (M. Hursin); 2 - STEK and SEG Experiments (M. Salvatores); 3 - Experiments related to ²³⁵U, ²³⁸U, ⁵⁶Fe and ²³Na (G. Palmiotti); 4 - Validation of Iron Cross Sections against ASPIS Experiments (JEF/DOC-420) (I. Kodeli); 5 - Benchmark analysis of Iron Cross-sections (EFFDOC-1221) (I. Kodeli); 6 - Integral Beta-effective Measurements (K. Yokoyama on behalf of M. Ishikawa); D - Adjustment results: 1 - Impacts of Covariance Data and Interpretation of Adjustment Trends of ADJ2010 (K. Yokoyama); 2 - Revised Recommendations from ADJ2010 Adjustment (K. Yokoyama); 3 - Comparisons and Discussions on Adjustment trends from JEFF (CEA) (P. Archier); 4 - Feedback on CIELO Isotopes from ENDF/B-VII.0 Adjustment (G. Palmiotti); 5 - Demonstration - Plot comparisons of participants' results (E

  15. Social and Emotional Adjustment of Siblings of Children with Autism

    Science.gov (United States)

    Pilowsky, Tammy; Yirmiya, Nurit; Doppelt, Osnat; Gross-Tsur, Varda; Shalev, Ruth S.

    2004-01-01

    Background: Social and emotional adjustment of siblings of children with autism was examined, to explore their risk or resilience to effects of genetic liability and environmental factors involved in having a sibling with autism. Method: Social-emotional adjustment, behavior problems, socialization skills, and siblings' relationships were compared…

  16. A New Method for Spatial Health Risk Assessment of Pollutants

    Directory of Open Access Journals (Sweden)

    Mohamad Sakizadeh*

    2017-03-01

    Full Text Available Background: The extent of contaminated land exposed to the health risk of environmental pollutants is a matter of debate. In this study, a new method was developed to estimate the area that is exposed to higher than normal levels of Cr, Mn, and V. Methods: Overall, 170 soil samples were collected from the upper 10 cm of soil in an arid area in the central part of Iran, in Semnan Province. The values of Cr, Mn, and V were determined by the ICP-OES technique. A geostatistical method known as sequential Gaussian co-simulation was applied to assess the spatial risk of these toxic elements. Results: The moderate spatial dependence of Cr indicates the contribution of both intrinsic and extrinsic factors to the levels of this heavy metal in the study area, whereas Mn and V can be attributed to intrinsic factors (such as lithology). There was no significant influence of agricultural practices on the Cr values in the region. The contaminated area for manganese, produced by the risk-curve-on-surface method, was larger than for chromium and vanadium. Conclusion: The risk curves produced in this study can be adopted in similar studies to help managers estimate the total area that requires cleanup action.
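    Sequential Gaussian co-simulation itself requires a geostatistics library, but the final step described above, turning a set of equally probable simulated maps into per-cell exceedance probabilities and a contaminated-area estimate, can be sketched with plain Python. The grid, threshold, and cutoff below are illustrative assumptions.

```python
def exceedance_probability(realizations, threshold):
    """Per-cell probability that the simulated concentration exceeds the
    threshold, across equally probable realizations (lists of cell values)."""
    n = len(realizations)
    cells = len(realizations[0])
    return [sum(r[c] > threshold for r in realizations) / n
            for c in range(cells)]

def contaminated_area(prob_map, cell_area, prob_cutoff=0.5):
    """Total area of cells whose exceedance probability passes the cutoff."""
    return cell_area * sum(p >= prob_cutoff for p in prob_map)

# Four hypothetical realizations of a two-cell map
sims = [[1.0, 5.0], [2.0, 6.0], [3.0, 7.0], [4.0, 8.0]]
probs = exceedance_probability(sims, threshold=4.0)
area = contaminated_area(probs, cell_area=100.0)  # m^2 per cell (hypothetical)
```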

  17. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  18. FDG PET/CT diagnostic criteria may need adjustment based on MRI to estimate the presurgical risk of extrapelvic infiltration in patients with uterine endometrial cancer

    Energy Technology Data Exchange (ETDEWEB)

    Sudo, Satoko; Sakuragi, Noriaki [Hokkaido University Graduate School of Medicine, Department of Gynecology, Sapporo (Japan); Hattori, Naoya; Manabe, Osamu; Hirata, Kenji; Tamaki, Nagara [Hokkaido University Graduate School of Medicine, Department of Nuclear Medicine, Kitaku, Sapporo (Japan); Kato, Fumi; Mimura, Rie; Magota, Keiichi; Sugimori, Hiroyuki [Hokkaido University Graduate School of Medicine, Department of Diagnostic and Interventional Radiology, Sapporo (Japan)

    2015-04-01

    The staging of endometrial cancer requires surgery which carries the risk of morbidity. FDG PET/CT combined with anatomical imaging may reduce the number of unnecessary lymphadenectomies by demonstrating the risk of extrapelvic infiltration. The purpose of this study was to optimize FDG PET/CT diagnostic criteria for risk assessment in endometrial cancer after first-line risk triage with MRI. The study population comprised 37 patients who underwent curative surgery for the treatment of endometrial cancer. First, the risk of extrapelvic infiltration was triaged using MRI. Second, multiple glucose metabolic profiles of the primary lesion were assessed with FDG PET/CT, and these were correlated with the histopathological risk of extrapelvic infiltration including lymphovascular space invasion (LVSI) and high-grade malignancy (grades 2 and 3). The results of histological correlation were used to adjust FDG PET/CT diagnostic criteria. Presurgical assessment using MRI was positive for deep (>50 %) myometrial invasion in 17 patients. The optimal FDG PET/CT diagnostic criteria vary depending on the results of MRI. Specifically, SUVmax (≥16.0) was used to indicate LVSI risk with an overall diagnostic accuracy of 88.2 % in patients with MRI findings showing myometrial invasion. High-grade malignancy did not correlate with any of metabolic profiles in this patient group. In the remaining patients without myometrial invasion, lesion glycolysis (LG) or metabolic volume were better indicators of LVSI than SUVmax with the same diagnostic accuracy of 80.0 %. In addition, LG (≥26.9) predicted high-grade malignancy with an accuracy of 72.2 %. Using the optimized cut-off criteria for LVSI, glucose metabolic profiling of primary lesions correctly predicted lymph node metastasis with an accuracy of 73.0 %, which was comparable with the accuracy of visual assessment for lymph node metastasis using MRI and FDG PET/CT. FDG PET/CT diagnostic criteria may need adjustment based on the

  19. Identifying the contents of a type 1 diabetes outpatient care program based on the self-adjustment of insulin using the Delphi method.

    Science.gov (United States)

    Kubota, Mutsuko; Shindo, Yukari; Kawaharada, Mariko

    2014-10-01

    The objective of this study is to identify the items necessary for an outpatient care program based on the self-adjustment of insulin for type 1 diabetes patients. Two surveys based on the Delphi method were conducted. The survey participants were 41 certified diabetes nurses in Japan. An outpatient care program based on the self-adjustment of insulin was developed from the pertinent published work and expert opinions. There were a total of 87 survey items in the questionnaire, which was developed from the care program mentioned earlier, covering matters such as the establishment of prerequisites and a cooperative relationship, the basics of blood glucose pattern management, learning and practice sessions for the self-adjustment of insulin, the implementation of the self-adjustment of insulin, and feedback. Approval of an item was defined as agreement by at least 70% of participants. Participants agreed on all of the items in the first survey. Four new items were added to make a total of 91 items for the second survey, and participants agreed on the inclusion of 84 of them. The items necessary for a type 1 diabetes outpatient care program based on the self-adjustment of insulin were subsequently selected. This care program received fairly strong approval from certified diabetes nurses; however, it will be necessary to evaluate the program further in conjunction with intervention studies in the future. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.
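    The 70% consensus rule used above is straightforward to operationalize. A sketch of one Delphi round's item screening follows; the item names and vote counts are invented for illustration.

```python
def retained_items(votes, threshold=0.70):
    """Keep items approved by at least `threshold` of panelists.
    `votes` maps item name -> list of True/False approvals."""
    return [item for item, v in votes.items()
            if sum(v) / len(v) >= threshold]

# Hypothetical approvals from a 10-person panel
round_votes = {
    "set prerequisites": [True] * 9 + [False],                 # 90% approval
    "blood glucose pattern basics": [True] * 7 + [False] * 3,  # 70% approval
    "extra telehealth module": [True] * 5 + [False] * 5,       # 50% approval
}
kept = retained_items(round_votes)
```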

  20. Effect of Laparoscopic Adjustable Gastric Banding on Metabolic Syndrome and Its Risk Factors in Morbidly Obese Adolescents

    Directory of Open Access Journals (Sweden)

    Rushika Conroy

    2011-01-01

    Full Text Available We examined the effect of laparoscopic adjustable gastric banding (LAGB on weight loss, inflammatory markers, and components of the Metabolic Syndrome (MeS in morbidly obese adolescents and determined if those with MeS lose less weight post-LAGB than those without. Data from 14–18 yr adolescents were obtained at baseline, 6 and 12 months following LAGB. Significant weight loss and improvements in MeS components were observed 6 months and one year following LAGB. The incidence of MeS declined 56.8% after 6 months and 69.6% after 12 months. There was no significant difference in amount of weight lost post-LAGB between those with and without MeS at either timepoint. Correlations between change in weight parameters and components of MeS in those with and without MeS at baseline were examined and found to vary by diagnostic category. LAGB is effective for short-term improvement in weight, inflammatory markers, and components of MeS in morbidly obese adolescents.

  1. The Problem with the Low-Tax Backlash: Rethinking Corporate Tax Policies to Adjust for Uneven Reputational Risks

    Directory of Open Access Journals (Sweden)

    Jack M. Mintz

    2015-05-01

    Full Text Available When a major corporation is found to be paying little or no taxes, public backlash and media furor over the issue may ensue. Some governments may well be just fine with it, while others, like the U.S., may take steps to ensure companies pay more tax. Sometimes, companies being in a non-taxpaying position properly reflects appropriate tax policy. That explanation, however, does not sell lattés, which is why in 2012, after the British public grew outraged over the discovery that Starbucks was paying no corporate taxes in the U.K., the coffee retailer actually volunteered to just write a cheque to the government. The reputational damage to Starbucks’ brand, the company calculated, was not worth the money it was saving in avoiding taxes, even if it was doing so perfectly legally. The fear of this kind of reputational damage can foil the very taxation policies that governments design specifically as a means to tax corporations fairly, efficiently and competitively. It may be good tax policy to allow corporations various deductions, or the ability to carry forward or carry back losses, but it can be politically vexatious. U.S. President Barack Obama demonstrated that explicitly when he suggested that certain American companies using so-called tax inversions to relocate their headquarters to low-tax jurisdictions were failing in their “economic patriotism.” Yet more multinationals than ever are legally and quite appropriately using tax strategies to minimize their taxes in various jurisdictions to the point where they are paying little to no corporate tax. For some corporations, the risk of public backlash is greater than it is for others: Starbucks and Facebook, being consumer-facing companies with a great deal of brand goodwill, have a lot more at risk than do Pfizer and Oracle. This risk makes the playing field for taxation less level, jeopardizing the fundamental tax principle of horizontal equity — that those of similar means should pay similar taxes.

  2. The relationship between effectiveness and costs measured by a risk-adjusted case-mix system: multicentre study of Catalonian population data bases

    Directory of Open Access Journals (Sweden)

    Flor-Serra Ferran

    2009-06-01

    Full Text Available Abstract Background The main objective of this study is to measure the relationship between morbidity, direct health care costs and the degree of clinical effectiveness (resolution) of health centres and health professionals by the retrospective application of Adjusted Clinical Groups in a Spanish population setting. The secondary objectives are to determine the factors that explain inadequate correlations and the opinion of health professionals on these instruments. Methods/Design We will carry out a multi-centre, retrospective study using patient records from 15 primary health care centres and population data bases. The main measurements will be: general variables (age and sex, centre, service [family medicine, paediatrics], and medical unit), dependent variables (mean number of visits, episodes and direct costs), co-morbidity (Johns Hopkins University Adjusted Clinical Groups Case-Mix System) and effectiveness. The totality of centres/patients will be considered as the standard for comparison. The efficiency index for visits, tests (laboratory, radiology, others), referrals, pharmaceutical prescriptions and total will be calculated as the ratio: observed variables/variables expected by indirect standardization. The model of cost/patient/year will differentiate fixed/semi-fixed (visits) costs from the variable costs for each patient attended/year (N = 350,000 inhabitants). The mean relative weights of the cost of care will be obtained. The effectiveness will be measured using a set of 50 indicators of process, efficiency and/or health results, and an adjusted synthetic index will be constructed (method: percentile 50). The correlation between the efficiency (relative weights) and synthetic (by centre and physician) indices will be established using the coefficient of determination. The opinion/degree of acceptance of physicians (N = 1,000) will be measured using a structured questionnaire including various dimensions. Statistical analysis: multiple regression.

  3. Use of the neutron moderation method for adjusting concrete mix proportions based on the total moisture of the aggregates

    International Nuclear Information System (INIS)

    Howland, J.; Morejon, D.; Simeon, G.; Gracia, R.; Desdin, L.; O'Reilly, V.

    1997-01-01

    The neutron moderation method was applied to the rapid determination of the moisture content of fine and coarse aggregates. The measured moisture values were used to adjust the concrete mix proportions according to the total moisture of the aggregates. The results obtained indicate that this adjustment method yields higher compressive strength values and also reduces dispersion in concrete production. The method would permit a considerable saving of cement in comparison with the traditional method. (author) [es

  4. Risk-adjustment models for heart failure patients' 30-day mortality and readmission rates: the incremental value of clinical data abstracted from medical charts beyond hospital discharge record.

    Science.gov (United States)

    Lenzi, Jacopo; Avaldi, Vera Maria; Hernandez-Boussard, Tina; Descovich, Carlo; Castaldini, Ilaria; Urbinati, Stefano; Di Pasquale, Giuseppe; Rucci, Paola; Fantini, Maria Pia

    2016-09-06

    Hospital discharge records (HDRs) are routinely used to assess outcomes of care and to compare hospital performance for heart failure. The advantages of using clinical data from medical charts to improve risk-adjustment models remain controversial. The aim of the present study was to evaluate the additional contribution of clinical variables to HDR-based 30-day mortality and readmission models in patients with heart failure. This retrospective observational study included all patients residing in the Local Healthcare Authority of Bologna (about 1 million inhabitants) who were discharged in 2012 from one of three hospitals in the area with a diagnosis of heart failure. For each study outcome, we compared the discrimination of the two risk-adjustment models (i.e., HDR-only model and HDR-clinical model) through the area under the ROC curve (AUC). A total of 1145 and 1025 patients were included in the mortality and readmission analyses, respectively. Adding clinical data significantly improved the discrimination of the mortality model (AUC = 0.84 vs. 0.73, p < 0.001), but not the discrimination of the readmission model (AUC = 0.65 vs. 0.63, p = 0.08). We identified clinical variables that significantly improved the discrimination of the HDR-only model for 30-day mortality following heart failure. By contrast, clinical variables made little contribution to the discrimination of the HDR-only model for 30-day readmission.
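The ROC-based comparison described above can be illustrated with a hand-rolled AUC using the rank (Mann-Whitney) identity; the outcome labels and predicted risks below are hypothetical illustrations, not data from the study.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a randomly chosen positive outranks a negative,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical 30-day mortality labels and predicted risks for the same
# patients from two risk-adjustment models
labels   = [1, 0, 1, 0, 0, 1, 0, 0]          # 1 = died within 30 days
hdr_only = [0.6, 0.5, 0.4, 0.3, 0.55, 0.7, 0.2, 0.35]
hdr_clin = [0.8, 0.3, 0.7, 0.2, 0.40, 0.9, 0.1, 0.25]

print(auc(hdr_only, labels))   # discrimination of the HDR-only model
print(auc(hdr_clin, labels))   # discrimination of the HDR + clinical model
```

A formal comparison of two correlated AUCs, as in the study, would additionally require a significance test (e.g., DeLong's method); the sketch only computes the two discrimination values.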

  5. Surgeon length of service and risk-adjusted outcomes: linked observational analysis of the UK National Adult Cardiac Surgery Audit Registry and General Medical Council Register.

    Science.gov (United States)

    Hickey, Graeme L; Grant, Stuart W; Freemantle, Nick; Cunningham, David; Munsch, Christopher M; Livesey, Steven A; Roxburgh, James; Buchan, Iain; Bridgewater, Ben

    2014-09-01

    To explore the relationship between in-hospital mortality following adult cardiac surgery and the time since primary clinical qualification for the responsible consultant cardiac surgeon (a proxy for experience). Retrospective analysis of prospectively collected national registry data over a 10-year period using mixed-effects multiple logistic regression modelling. Surgeon experience was defined as the time between the date of surgery and award of primary clinical qualification. UK National Health Service hospitals performing cardiac surgery between January 2003 and December 2012. All patients undergoing coronary artery bypass grafts and/or valve surgery under the care of a consultant cardiac surgeon. All-cause in-hospital mortality. A total of 292,973 operations performed by 273 consultant surgeons (with lengths of service from 11.2 to 42.0 years) were included. Crude mortality increased approximately linearly until 33 years service, before decreasing. After adjusting for case-mix and year of surgery, there remained a statistically significant (p=0.002) association between length of service and in-hospital mortality (odds ratio 1.013; 95% CI 1.005-1.021 for each year of 'experience'). Consultant cardiac surgeons take on increasingly complex surgery as they gain experience. With this progression, the incidence of adverse outcomes is expected to increase, as is demonstrated in this study. After adjusting for case-mix using the EuroSCORE, we observed an increased risk of mortality in patients operated on by longer serving surgeons. This finding may reflect under-adjustment for risk, unmeasured confounding or a real association. Further research into outcomes over the time course of surgeon's careers is required. © The Royal Society of Medicine.

  6. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  7. Salary adjustments

    CERN Multimedia

    HR Department

    2008-01-01

    In accordance with decisions taken by the Finance Committee and Council in December 2007, salaries are adjusted with effect from 1 January 2008. Scale of basic salaries and scale of stipends paid to fellows (Annex R A 5 and R A 6 respectively): increased by 0.71% with effect from 1 January 2008. As a result of the stability of the Geneva consumer price index, the following elements do not increase: a) Family Allowance, Child Allowance and Infant Allowance (Annex R A 3). b) Reimbursement of education fees: maximum amounts of reimbursement (Annex R A 4.01) for the academic year 2007/2008. Related adjustments will be implemented, wherever applicable, to Paid Associates and Students. As in the past, the actual percentage increase of each salary position may vary, due to the application of a constant step value and the rounding effects. Human Resources Department Tel. 73566

  8. Salary adjustments

    CERN Multimedia

    HR Department

    2008-01-01

    In accordance with decisions taken by the Finance Committee and Council in December 2007, salaries are adjusted with effect from 1 January 2008. Scale of basic salaries and scale of stipends paid to fellows (Annex R A 5 and R A 6 respectively): increased by 0.71% with effect from 1 January 2008. As a result of the stability of the Geneva consumer price index, the following elements do not increase: a)\tFamily Allowance, Child Allowance and Infant Allowance (Annex R A 3); b)\tReimbursement of education fees: maximum amounts of reimbursement (Annex R A 4.01) for the academic year 2007/2008. Related adjustments will be applied, wherever applicable, to Paid Associates and Students. As in the past, the actual percentage increase of each salary position may vary, due to the application of a constant step value and rounding effects. Human Resources Department Tel. 73566

  9. Adjustable collimator

    International Nuclear Information System (INIS)

    Carlson, R.W.; Covic, J.; Leininger, G.

    1981-01-01

    In a rotating fan beam tomographic scanner there is included an adjustable collimator and shutter assembly. The assembly includes a fan angle collimation cylinder having a plurality of different length slots through which the beam may pass for adjusting the fan angle of the beam. It also includes a beam thickness cylinder having a plurality of slots of different widths for adjusting the thickness of the beam. Further, some of the slots have filter materials mounted therein so that the operator may select from a plurality of filters. Also disclosed is a servo motor system which allows the operator to select the desired fan angle, beam thickness and filter from a remote location. An additional feature is a failsafe shutter assembly which includes a spring biased shutter cylinder mounted in the collimation cylinders. The servo motor control circuit checks several system conditions before the shutter is rendered openable. Further, the circuit cuts off the radiation if the shutter fails to open or close properly. A still further feature is a reference radiation intensity monitor which includes a tuning-fork shaped light conducting element having a scintillation crystal mounted on each tine. The monitor is placed adjacent the collimator between it and the source with the pair of crystals to either side of the fan beam

  10. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    Science.gov (United States)

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and include all discharges for patients younger than 18 years. The harmonized model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17,945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  11. Labor productivity adjustment factors. A method for estimating labor construction costs associated with physical modifications to nuclear power plants

    International Nuclear Information System (INIS)

    Riordan, B.J.

    1986-03-01

    This report develops quantitative labor productivity adjustment factors for the performance of regulatory impact analyses (RIAs). These factors will allow analysts to modify ''new construction'' labor costs to account for changes in labor productivity due to differing work environments at operating reactors and at reactors with construction in progress. The technique developed in this paper relies on the Energy Economic Data Base (EEDB) for baseline estimates of the direct labor hours and/or labor costs required to perform specific tasks in a new construction environment. The labor productivity cost factors adjust for constraining conditions such as working in a radiation environment, poor access, congestion and interference, etc., which typically occur on construction tasks at operating reactors and can occur under certain circumstances at reactors under construction. While the results do not portray all aspects of labor productivity, they encompass the major work place conditions generally discernible by the NRC analysts and assign values that appear to be reasonable within the context of industry experience. 18 refs

  12. [Factors affecting in-hospital mortality in patients with sepsis: Development of a risk-adjusted model based on administrative data from German hospitals].

    Science.gov (United States)

    König, Volker; Kolzter, Olaf; Albuszies, Gerd; Thölen, Frank

    2018-05-01

    Inpatient administrative data from hospitals is already used nationally and internationally in many areas of internal and public quality assurance in healthcare. For sepsis as the principal condition, only a few published approaches are available for Germany. The aim of this investigation is to identify factors influencing hospital mortality by employing appropriate analytical methods in order to improve the internal quality management of sepsis. The analysis was based on data from 754,727 DRG cases of the CLINOTEL hospital network billed in 2015. The network at that time included 45 hospitals of all supply levels with the exception of university hospitals (range of beds: 100 to 1,172 per hospital). Cases of sepsis were identified via the ICD codes of their principal diagnosis. Multiple logistic regression analysis was used to determine the factors influencing in-hospital lethality for this population. The model was developed using sociodemographic and other potential variables that could be derived from the DRG data set, and taking into account current literature data. The model obtained was validated with inpatient administrative data of 2016 (51 hospitals, 850,776 DRG cases). Following the definition of the inclusion criteria, 5,608 cases of sepsis (2016: 6,384 cases) were identified in 2015. A total of 12 significant and, over both years, stable factors were identified, including age, severity of sepsis, reason for hospital admission and various comorbidities. The AUC value of the model, as a measure of predictability, is above 0.8 (H-L test p>0.05, R² value = 0.27), which is an excellent result. The CLINOTEL model of risk adjustment for in-hospital lethality can be used to determine the mortality probability of patients with sepsis as principal diagnosis with a very high degree of accuracy, taking into account the case mix. Further studies are needed to confirm whether the model presented here will prove its value in the internal quality assurance of hospitals.

  13. Evaluation Method of Collision Risk by Using True Motion

    Directory of Open Access Journals (Sweden)

    Hayama Imazu

    2017-03-01

    Full Text Available It is necessary to develop useful applications that exploit big data such as AIS for the safety and efficiency of ship operation. AIS is a very useful system for collecting target information, but this information is not yet used effectively, and the method of evaluating collision risk is one of the obstacles. Usually the collision risk of a ship is evaluated by the value of the Closest Point of Approach (CPA), which is related to relative motion, so it becomes difficult to find a safe passage in congested waters. Here, the Line of Predicted Collision (LOPC) and the Obstacle Zone by Target (OZT) are introduced for the evaluation of collision risk. These values are related to true motion and make dangerous places visible, which makes it easier to find a safe passage in congested waters.
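The conventional CPA evaluation that the abstract contrasts with can be sketched as follows: given own-ship and target position and velocity, the distance (DCPA) and time (TCPA) at the closest point of approach follow directly from relative motion. The encounter values below are hypothetical.

```python
import math

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Distance and time at the closest point of approach from relative motion.
    Positions in nautical miles, velocities in knots -> (DCPA nm, TCPA h)."""
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]   # relative position
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]   # relative velocity
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                        # no relative motion: range never changes
        return math.hypot(rx, ry), 0.0
    tcpa = -(rx * vx + ry * vy) / v2     # time at which range is minimal
    dcpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return dcpa, tcpa

# Hypothetical encounter: target 5 nm due north, both ships closing head-on
# at 5 kn each (10 kn relative speed)
dcpa, tcpa = cpa((0, 0), (0, 5), (0, 5), (0, -5))
print(dcpa, tcpa)   # 0.0 nm, 0.5 h
```

A negative TCPA would mean the closest approach already lies in the past, which is one reason purely relative-motion measures can be hard to use for route planning in congested waters.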

  14. Equivalence of ten different discounted cash flow valuation methods

    OpenAIRE

    Fernandez, Pablo

    2004-01-01

    This paper shows that ten methods of company valuation using discounted cash flows (WACC; equity cash flow; capital cash flow; adjusted present value; residual income; EVA; business's risk-adjusted equity cash flow; business's risk-adjusted free cash flow; risk-free-adjusted equity cash flow; and risk-free-adjusted free cash flow) always give the same value when identical assumptions are used. This result is logical, since all the methods analyze the same reality using the same assumptions; t...
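The claimed equivalence can be checked numerically in the simplest setting: a perpetual free cash flow with constant debt, tax shields discounted at the cost of debt, and a levered cost of equity consistent with those same assumptions (Ke = Ku + (Ku - Kd)(1 - T)D/E for perpetuities). The figures are illustrative, not taken from the paper.

```python
# Unlevered after-tax free cash flow (perpetuity) and hypothetical parameters
fcf, ku, kd, tax, debt = 100.0, 0.10, 0.05, 0.30, 300.0

# APV: unlevered value plus the value of the debt tax shield (perpetual debt,
# shield Kd*T*D discounted at Kd gives T*D)
v_unlevered = fcf / ku
v_apv = v_unlevered + tax * debt

# WACC: levered cost of equity consistent with the same assumptions
equity = v_apv - debt
ke = ku + (ku - kd) * (1 - tax) * debt / equity
wacc = ke * equity / v_apv + kd * (1 - tax) * debt / v_apv
v_wacc = fcf / wacc

print(v_apv, v_wacc)   # both ≈ 1090
```

Both routes give the same firm value because they price the same cash flows under the same assumptions; discrepancies in practice come from mixing inconsistent discount-rate formulas, which is exactly the paper's point.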

  15. The adjusted Global AntiphosPholipid Syndrome Score (aGAPSS) for risk stratification in young APS patients with acute myocardial infarction.

    Science.gov (United States)

    Radin, M; Schreiber, K; Costanzo, P; Cecchi, I; Roccatello, D; Baldovino, S; Bazzan, M; Cuadrado, M J; Sciascia, S

    2017-08-01

    Young adults with acute myocardial infarction are a critical group to examine for the purpose of risk factor stratification and modification. In this study we aimed to assess the clinical utility of the adjusted Global AntiphosPholipid Syndrome Score (aGAPSS) for the risk stratification of acute myocardial infarction in a cohort of young patients with antiphospholipid syndrome (APS). The analysis included 83 consecutive APS patients (≤50 years old) who presented with arterial or venous thromboembolic events. Data on cardiovascular risk factors and antiphospholipid antibody (aPL) positivity were retrospectively collected. The aGAPSS was calculated by adding the points corresponding to the risk factors, based on a linear transformation derived from the β-regression coefficient, as follows: 3 for hyperlipidaemia, 1 for arterial hypertension, 5 for aCL IgG/IgM, 4 for anti-β2 glycoprotein I IgG/IgM and 4 for LA. Higher aGAPSS values were observed in patients with acute myocardial infarction when compared to the others [mean aGAPSS 11.9 (S.D. 4.15, range 4-18) vs. mean aGAPSS 9.2 (S.D. 5.1, range 1-17)], and when compared to patients with a history of peripheral or cerebrovascular arterial thrombotic events [mean aGAPSS 11.9 (S.D. 4.15, range 4-18) vs. mean aGAPSS 6.7 (S.D. 5.7, range 1-17); T test: P<0.005]. The aGAPSS is based upon a quantitative score and could aid in risk stratifying APS patients younger than 50 years for the likelihood of developing coronary thrombotic events, and may guide pharmacological treatment for high-risk patients. Copyright © 2017 Elsevier B.V. All rights reserved.
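The scoring rule quoted in the abstract reduces to a small weighted sum; a minimal sketch using the weights listed above (the dictionary keys are illustrative labels, not terminology from the paper):

```python
# aGAPSS point weights as listed in the abstract
AGAPSS_WEIGHTS = {
    "hyperlipidaemia": 3,
    "arterial_hypertension": 1,
    "aCL_IgG_IgM": 5,
    "anti_b2GPI_IgG_IgM": 4,
    "lupus_anticoagulant": 4,
}

def agapss(factors):
    """Sum the weights of the risk factors present (a set of keys above)."""
    return sum(AGAPSS_WEIGHTS[f] for f in factors)

# A hypothetical patient with hyperlipidaemia and triple aPL positivity
print(agapss({"hyperlipidaemia", "aCL_IgG_IgM",
              "anti_b2GPI_IgG_IgM", "lupus_anticoagulant"}))   # 16
```

With all five factors present the score is 17, which matches the upper end of the ranges reported for most groups in the abstract.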

  16. Sickness presence, sick leave and adjustment latitude

    Directory of Open Access Journals (Sweden)

    Joachim Gerich

    2014-10-01

    Full Text Available Objectives: Previous research on the association between adjustment latitude (defined as the opportunity to adjust work efforts in case of illness) and sickness absence and sickness presence has produced inconsistent results. In particular, low adjustment latitude has been identified as both a risk factor and a deterrent of sick leave. The present study uses an alternative analytical strategy with the aim of joining these results together. Material and Methods: Using a cross-sectional design, a random sample of employees covered by the Upper Austrian Sickness Fund (N = 930) was analyzed. Logistic and ordinary least square (OLS) regression models were used to examine the association between adjustment latitude and days of sickness absence, sickness presence, and an estimator for the individual sickness absence and sickness presence propensity. Results: A high level of adjustment latitude was found to be associated with a reduced number of days of sickness absence and sickness presence, but an elevated propensity for sickness absence. Conclusions: Employees with high adjustment latitude experience fewer days of health complaints associated with lower rates of sick leave and sickness presence compared to those with low adjustment latitude. In case of illness, however, high adjustment latitude is associated with a higher probability of taking sick leave rather than sickness presence.

  17. The comparison of cardiovascular risk scores using two methods of substituting missing risk factor data in patient medical records

    Directory of Open Access Journals (Sweden)

    Andrew Dalton

    2011-07-01

    Conclusions A simple method of substituting missing risk factor data can produce reliable estimates of CVD risk scores. Targeted screening for high CVD risk, using pre-existing electronic medical record data, does not require multiple imputation methods in risk estimation.
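The conclusions refer to a simple substitution method without specifying it in this excerpt; one common choice, sketched here under that assumption, is to replace each missing risk factor with a fixed cohort value (e.g., the cohort mean). All record names and values below are hypothetical.

```python
def fill_missing(records, defaults):
    """Replace None entries with a fixed substitute value per risk factor."""
    return [
        {k: (defaults[k] if v is None else v) for k, v in rec.items()}
        for rec in records
    ]

# Hypothetical patient records with missing blood pressure / cholesterol values
records = [
    {"age": 62, "sbp": 150, "chol": None},
    {"age": 55, "sbp": None, "chol": 5.2},
]
defaults = {"age": 60, "sbp": 135, "chol": 5.0}   # e.g. cohort means

print(fill_missing(records, defaults))
```

The completed records can then be fed to any CVD risk equation; the study's point is that, for targeted screening, this single-value substitution performed well enough that multiple imputation was not required.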

  18. Methods of currency risk management in foreign trade

    Directory of Open Access Journals (Sweden)

    V.V. Ksendzuk

    2016-03-01

    Full Text Available The development of a country's market economy is closely connected with international economic relations. National business entities are therefore actively involved in foreign trade, and their positive results not only influence the status and income of owners but also form the economic potential of the country. The survey describes the main indicators of foreign trade and the impact of export and import transactions on the economic development of Ukraine, particularly on the gross domestic income of the country. Taking into account the negative trends in foreign currency exchange rates, the article considers the types of currency risks that accompany international transactions and identifies the limits of the usefulness of currency risk management methods. The methods of currency risk management are also systematized, the benefits of their use for the enterprise are considered, and the status and readiness of Ukraine's financial market to ensure appropriate conditions for the functioning of currency risk management in domestic enterprises are analyzed.

  19. Methods for measuring risk-aversion: problems and solutions

    International Nuclear Information System (INIS)

    Thomas, P J

    2013-01-01

    Risk-aversion is a fundamental parameter determining how humans act when required to operate in situations of risk. Its general applicability has been discussed in a companion presentation, and this paper examines methods that have been used in the past to measure it and their attendant problems. It needs to be borne in mind that risk-aversion varies with the size of the possible loss, growing strongly as the possible loss becomes comparable with the decision maker's assets. Hence measuring risk-aversion when the potential loss or gain is small will produce values close to the risk-neutral value of zero, irrespective of who the decision maker is. It will also be shown how the generally accepted practice of basing a measurement on the results of a three-term Taylor series will estimate a limiting value, minimum or maximum, rather than the value utilised in the decision. A solution is to match the correct utility function to the results instead

  20. Methods for measuring risk-aversion: problems and solutions

    Science.gov (United States)

    Thomas, P. J.

    2013-09-01

    Risk-aversion is a fundamental parameter determining how humans act when required to operate in situations of risk. Its general applicability has been discussed in a companion presentation, and this paper examines methods that have been used in the past to measure it and their attendant problems. It needs to be borne in mind that risk-aversion varies with the size of the possible loss, growing strongly as the possible loss becomes comparable with the decision maker's assets. Hence measuring risk-aversion when the potential loss or gain is small will produce values close to the risk-neutral value of zero, irrespective of who the decision maker is. It will also be shown how the generally accepted practice of basing a measurement on the results of a three-term Taylor series will estimate a limiting value, minimum or maximum, rather than the value utilised in the decision. A solution is to match the correct utility function to the results instead.

  1. RESEARCH ON RISK CLASSIFICATION METHOD OF ASSEMBLY OCCUPANCIES

    Directory of Open Access Journals (Sweden)

    Hao Yu

    2017-10-01

    Full Text Available Because of the dense population and mobility characteristics of crowds, accidents in assembly occupancies generally trigger a chain reaction, bringing heavy casualties and property loss and resulting in disastrous consequences. Given limited safety regulation resources, building a risk classification system for assembly occupancies is important for “scientific prediction and hierarchical control”. In this paper, a software tool with a graphical user interface is designed using MATLAB GUI to analyze and calculate the risks of stampede accidents caused by crowds gathered in video footage. A velocity extraction method based on a cross-correlation algorithm is adopted, and risk characteristic parameters such as the velocity variance are also applied. In this way, real-time analysis and early warning of stampede accident risks in time and space can be achieved. The algorithm is applied to surveillance video of the stampede in Shanghai and its feasibility is demonstrated. Empirical research shows that the assembly occupancies risk rating model built in this paper is effective, simple and practical, applies to government safety regulation and organizational safety management, and can improve the safety situation of assembly occupancies effectively.
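The cross-correlation idea behind the velocity extraction can be sketched in one dimension: the displacement between two frames is the lag that maximizes their cross-correlation, and dividing that lag by the frame interval gives a velocity estimate. The signals below are synthetic, not data from the surveillance footage.

```python
def best_lag(a, b, max_lag):
    """Lag of b relative to a that maximizes their cross-correlation."""
    def corr(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=corr)

# Frame 2 is frame 1 shifted right by 3 pixels; with a known frame interval,
# lag / interval gives the crowd's apparent velocity along this axis.
frame1 = [0, 0, 1, 4, 1, 0, 0, 0, 0, 0]
frame2 = [0, 0, 0, 0, 0, 1, 4, 1, 0, 0]
print(best_lag(frame1, frame2, 5))   # 3
```

A 2-D implementation works the same way on image patches (typically via FFT-based correlation for speed), and the variance of the resulting velocity field is the kind of risk characteristic parameter the paper mentions.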

  2. Improved outcome for children with acute lymphoblastic leukemia after risk-adjusted intensive therapy: a single institution experience

    International Nuclear Information System (INIS)

    Al-Nasser, A.; El-Solh, H.; Al-Mahr, M.

    2008-01-01

    Because of the need for more comprehensive information on the least toxic and most effective forms of therapy for children with acute lymphoblastic leukemia (ALL), we reviewed our experience in the treatment of children with ALL at King Faisal Specialist Hospital and Research Centre (KFSHRC) and King Fahd National Center for Children's Cancer and Research (KFNCCCR) over a period of 18 years with a focus on patient characteristics and outcome. During the period 1981 to 1998, records of children with ALL were retrospectively reviewed with respect to clinical presentation, laboratory findings, risk factors, stratification, therapy and outcome. The protocols used in treatment included 4 local protocols (KFSH 81, 84, 87 and 90) and subsequently Children's Cancer Group (CCG) protocols, and these were grouped as Era 1 (1981-1992) and Era 2 (1993-1998). Of 509 children with ALL treated during this period, 316 were treated using local protocols and 193 using CCG protocols. Drugs used in Era 1 included a 4-drug induction using etoposide (VP-16) instead of L-asparaginase. Consolidation was based on high dose methotrexate (MTX) 1 g/m2 and maintenance was based on oral mercaptopurine (6-MP) and MTX with periodic pulses using intravenous teniposide (VM-26), Ara-C, L-asparaginase, adriamycin, prednisone, VP-16 and cyclophosphamide. International protocols were introduced in Era 2, which was also marked by intensification of early treatment, a wider selection of cytoreductive agents, and the alternating use of non-cross-resistant pairs of drugs during the post-remission period. The end-of-induction remission rate improved from 90% in Era 1 to 95% in Era 2, which was of borderline statistical significance (P=0.49). The 5-year event-free survival (EFS) improved from 30.6% in Era 1 to 64.2% in Era 2 (P<.001). Improvement in outcome was achieved without any significant increase in morbidity or mortality, due to improvement in both systemic therapy and supportive care. The most important

  3. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  4. Simulation of a method for determining one-dimensional {sup 137}Cs distribution using multiple gamma spectroscopic measurements with an adjustable cylindrical collimator and center shield

    Energy Technology Data Exchange (ETDEWEB)

    Whetstone, Z.D.; Dewey, S.C. [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States); Kearfott, K.J., E-mail: kearfott@umich.ed [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States)

    2011-05-15

    With multiple in situ gamma spectroscopic measurements obtained with an adjustable cylindrical collimator and a circular shield, the arbitrary one-dimensional distribution of radioactive material can be determined. The detector responses are theoretically calculated, field measurements obtained, and a system of equations relating detector response to measurement geometry and activity distribution solved to estimate the distribution. This paper demonstrates the method by simulating multiple scenarios and providing analysis of the system conditioning.

  5. Managing commodity risks in highway contracts : quantifying premiums, accounting for correlations among risk factors, and designing optimal price-adjustment contracts.

    Science.gov (United States)

    2011-09-01

    It is a well-known fact that macro-economic conditions, such as prices of commodities (e.g. oil, : cement and steel) affect the cost of construction projects. In a volatile market environment, highway : agencies often pass such risk to contractors us...

  6. Studies of cancer risk among Chernobyl liquidators: materials and methods

    International Nuclear Information System (INIS)

    Kesminiene, A.; Cardis, E.; Tenet, V.; Ivanov, V.K.; Kurtinaitis, J.; Malakhova, I.; Stengrevics, A.; Tekkel, M.

    2002-01-01

    The current paper presents the methods and design of two case-control studies among Chernobyl liquidators - one of leukaemia and non-Hodgkin lymphoma, the other of thyroid cancer risk - carried out in Belarus, Estonia, Latvia, Lithuania and Russia. The specific objective of these studies is to estimate the radiation induced risk of these diseases among liquidators of the Chernobyl accident, and, in particular, to study the effect of exposure protraction and radiation type on the risk of radiation induced cancer in the low-to-medium- (0-500 mSv) radiation dose range. The study population consists of the approximately 10,000 Baltic, 40,000 Belarus and 51,000 Russian liquidators who worked in the 30 km zone in 1986-1987, and who were registered in the Chernobyl registry of these countries. The studies included cases diagnosed in 1993-1998 for all countries but Belarus, where the study period was extended until 2000. Four controls were selected in each country from the national cohort for each case, matched on age, gender and region of residence. Information on study subjects was obtained through face-to-face interview using a standardised questionnaire with questions on demographic factors, time, place and conditions of work as a liquidator and potential risk and confounding factors for the tumours of interest. Overall, after giving their consent, 136 cases and 595 controls were included in the studies. A method of analytical dose reconstruction has been developed, validated and applied to the estimation of doses and related uncertainties for all the subjects in the study. Dose-response analyses are underway and the results are likely to have important implications for assessing the adequacy of existing protection standards, which are based on risk estimates derived from analyses of the mortality of atomic bomb survivors and other high dose studies. (author)

  7. Malnutrition Increases With Obesity and Is a Stronger Independent Risk Factor for Postoperative Complications: A Propensity-Adjusted Analysis of Total Hip Arthroplasty Patients.

    Science.gov (United States)

    Fu, Michael C; D'Ambrosia, Christopher; McLawhorn, Alexander S; Schairer, William W; Padgett, Douglas E; Cross, Michael B

    2016-11-01

    Obesity is frequently associated with complications after total hip arthroplasty (THA) and is often concomitant with malnutrition. The purpose of this study was to investigate the independent morbidity risk of malnutrition relative to obesity. The National Surgical Quality Improvement Program from 2005 to 2013 was queried for elective primary THA cases. Malnutrition was defined by a low preoperative serum albumin level, and propensity-adjusted multivariable logistic regression was used to assess the associations of obesity and malnutrition with 30-day outcomes. A total of 40,653 THA cases were identified, of which 20,210 (49.7%) had preoperative albumin measurements. Propensity score adjustment successfully reduced potential selection bias, with P > .05 for differences between those with and without albumin data. Malnutrition incidence increased from 2.8% in obese I to 5.7% in obese III patients. With multivariable propensity-adjusted logistic regression, malnutrition was a more robust predictor than any obesity class for any postoperative complication(s) (odds ratio [OR] 1.61, 95% confidence interval [CI] 1.25-2.08), major complications (OR 1.63, 95% CI 1.21-2.19), respiratory complications (OR 2.35, 95% CI 1.27-4.37), blood transfusions (OR 1.71, 95% CI 1.44-2.03), and extended length of stay (OR 1.35, 95% CI 1.14-1.59). Malnutrition incidence increased significantly from obese I to obese III patients and was a stronger and more consistent predictor than obesity of complications after THA. Copyright © 2016 Elsevier Inc. All rights reserved.
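    The record above relies on propensity adjustment to separate the effect of one risk factor from a correlated one. As a rough illustration of the underlying idea (not the study's actual model; the variable names, coefficients, and sample sizes below are invented), a propensity score can be fitted by logistic regression and used as an inverse-probability weight:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: exposure and outcome both depend on one confounder.
n = 2000
confounder = rng.normal(size=n)                      # e.g. a standardised covariate
p_exposed = 1 / (1 + np.exp(-(0.8 * confounder)))    # exposure depends on confounder
exposed = rng.binomial(1, p_exposed)
p_event = 1 / (1 + np.exp(-(-2.0 + 0.5 * exposed + 0.7 * confounder)))
event = rng.binomial(1, p_event)

# Fit P(exposed | confounder) with a few Newton-Raphson steps of
# logistic regression (the propensity model).
X = np.column_stack([np.ones(n), confounder])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (exposed - p))
ps = 1 / (1 + np.exp(-X @ beta))

# Inverse-probability weights: 1/ps for exposed, 1/(1-ps) for unexposed.
w = np.where(exposed == 1, 1 / ps, 1 / (1 - ps))
crude_rr = event[exposed == 1].mean() / event[exposed == 0].mean()
adj_rr = (np.average(event[exposed == 1], weights=w[exposed == 1])
          / np.average(event[exposed == 0], weights=w[exposed == 0]))
print(f"crude risk ratio    {crude_rr:.2f}")
print(f"adjusted risk ratio {adj_rr:.2f}")
```

    Because the confounder pushes the crude comparison upward, the weighted (adjusted) risk ratio moves closer to the true exposure effect.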

  8. High-dose chemotherapy for patients with high-risk breast cancer: a clinical and economic assessment using a quality-adjusted survival analysis.

    Science.gov (United States)

    Marino, Patricia; Roché, Henri; Moatti, Jean-Paul

    2008-04-01

    The benefit of high-dose chemotherapy (HDC) has not been clearly demonstrated. It may offer disease-free survival improvement at the expense of major toxicity and increasing cost. We evaluated the trade-offs between toxicity, relapse, and costs using a quality-adjusted time without symptoms or toxicity (Q-TWiST) analysis. The analysis was conducted in the context of a randomized trial (PEGASE 01) evaluating the benefit of HDC for 314 patients with high-risk breast cancer. A Q-TWiST analysis was first performed to compare HDC with standard chemotherapy. We then used the results of this Q-TWiST analysis to inform a cost per quality-adjusted life-year (QALY) comparison between treatments. Q-TWiST durations were in favor of HDC, whatever the weighting coefficients used for the analysis. This benefit was significant when the weighting coefficient related to the time spent after relapse was low; when that coefficient was high (above 0.78), HDC offered no benefit. For intermediate values, the results depended on the weighting coefficient attributed to the toxicity period. The incremental cost per QALY ranged from 12,691 euro/QALY to 26,439 euro/QALY, according to the coefficients used to weight toxicity and relapse. The benefits of HDC outweigh the burdens of treatment for a wide range of utility coefficients. Economic impact is not a barrier to HDC diffusion in this situation. Nevertheless, no significant benefit was demonstrated for a certain range of utility values.
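    The Q-TWiST trade-off described above is simple to state numerically: each arm's mean time in toxicity (TOX), time without symptoms or toxicity (TWiST), and time after relapse (REL) are combined with utility weights uTOX and uREL, and the weights are swept in a threshold analysis. A minimal sketch (the durations below are invented, not the PEGASE 01 data):

```python
def q_twist(tox, twist, rel, u_tox, u_rel):
    """Quality-adjusted survival: utility-weighted sum of the three health states."""
    return u_tox * tox + twist + u_rel * rel

# Invented mean state durations (months) for two hypothetical arms.
hdc      = dict(tox=4.0, twist=40.0, rel=6.0)    # more toxicity, less time after relapse
standard = dict(tox=1.5, twist=36.0, rel=16.0)   # less toxicity, longer post-relapse time

# Sweep the utility weights, as a threshold (sensitivity) analysis does.
for u_tox in (0.25, 0.5, 0.75):
    for u_rel in (0.25, 0.5, 0.75):
        gain = q_twist(**hdc, u_tox=u_tox, u_rel=u_rel) - \
               q_twist(**standard, u_tox=u_tox, u_rel=u_rel)
        print(f"u_tox={u_tox:.2f} u_rel={u_rel:.2f} -> Q-TWiST gain {gain:+.2f} months")
```

    With these invented durations the gain is 4 + 2.5*u_tox - 10*u_rel months, so which arm "wins" flips with the weight given to time after relapse, mirroring the threshold behaviour reported in the abstract.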

  9. A Robust and Fast Method to Compute Shallow States without Adjustable Parameters: Simulations for a Silicon-Based Qubit

    Science.gov (United States)

    Debernardi, Alberto; Fanciulli, Marco

    Within the framework of the envelope function approximation we have computed - without adjustable parameters and with a reduced computational effort due to analytical expression of relevant Hamiltonian terms - the energy levels of the shallow P impurity in silicon and the hyperfine and superhyperfine splitting of the ground state. We have studied the dependence of these quantities on the applied external electric field along the [001] direction. Our results reproduce correctly the experimental splitting of the impurity ground states detected at zero electric field and provide reliable predictions for values of the field where experimental data are lacking. Further, we have studied the effect of confinement of a shallow state of a P atom at the center of a spherical Si-nanocrystal embedded in a SiO2 matrix. In our simulations the valley-orbit interaction of a realistically screened Coulomb potential and of the core potential are included exactly, within the numerical accuracy due to the use of a finite basis set, while band-anisotropy effects are taken into account within the effective-mass approximation.

  10. Adjustment Criterion and Algorithm in Adjustment Model with Uncertainty

    Directory of Open Access Journals (Sweden)

    SONG Yingchun

    2015-02-01

    Full Text Available Uncertainty often exists in the process of obtaining measurement data, which affects the reliability of parameter estimation. This paper establishes a new adjustment model in which uncertainty is incorporated into the function model as a parameter. A new adjustment criterion and its iterative algorithm are given based on the uncertainty propagation law in the residual error, in which the maximum possible uncertainty is minimized. This paper also analyzes, with examples, the different adjustment criteria and the features of the optimal solutions of least-squares adjustment, uncertainty adjustment and total least-squares adjustment. Existing error theory is thus extended with a new method for processing observational data that contain uncertainty.
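    Of the baselines the abstract compares against, total least squares (which, unlike ordinary least squares, allows errors in both the design matrix and the observations) has a compact closed-form solution via the SVD. A generic sketch, unrelated to the paper's specific uncertainty criterion:

```python
import numpy as np

def tls_fit(x, y):
    """Fit y ~ a*x + b by total least squares: the parameters come from the
    right singular vector of [x, 1, y] with the smallest singular value."""
    A = np.column_stack([x, np.ones_like(x), y])
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]                       # direction of least variance
    a, b = -v[0] / v[2], -v[1] / v[2]
    return a, b

# On exact line data TLS recovers the line, whatever the sign of v.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
a, b = tls_fit(x, y)
print(a, b)                          # -> 2.0 1.0 (up to rounding)
```

    The null-space vector satisfies a*x + b - y = 0, which is why the slope and intercept fall out of the last singular vector.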

  11. Differences in case-mix can influence the comparison of standardised mortality ratios even with optimal risk adjustment: an analysis of data from paediatric intensive care.

    Science.gov (United States)

    Manktelow, Bradley N; Evans, T Alun; Draper, Elizabeth S

    2014-09-01

    The publication of clinical outcomes for consultant surgeons in 10 specialties within the NHS has, along with national clinical audits, highlighted the importance of measuring and reporting outcomes with the aim of monitoring quality of care. Such information is vital to be able to identify good and poor practice and to inform patient choice. The need to adequately adjust outcomes for differences in case-mix has long been recognised as being necessary to provide 'like-for-like' comparisons between providers. However, directly comparing values of the standardised mortality ratio (SMR) between different healthcare providers can be misleading even when the risk-adjustment perfectly quantifies the risk of a poor outcome in the reference population. An example is shown from paediatric intensive care. Using observed case-mix differences for 33 paediatric intensive care units (PICUs) in the UK and Ireland for 2009-2011, SMRs were calculated under four different scenarios where, in each scenario, all of the PICUs were performing identically for each patient type. Each scenario represented a clinically plausible difference in outcome from the reference population. Despite the fact that the outcome for any patient was the same no matter which PICU they were to be admitted to, differences between the units were seen when compared using the SMR: scenario 1, 1.07-1.21; scenario 2, 1.00-1.14; scenario 3, 1.04-1.13; scenario 4, 1.00-1.09. Even if two healthcare providers are performing equally for each type of patient, if their patient populations differ in case-mix their SMRs will not necessarily take the same value. Clinical teams and commissioners must always keep in mind this weakness of the SMR when making decisions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
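    The core point of this record can be reproduced in a few lines: two units with identical per-stratum mortality, but different case-mix, get different SMRs whenever their performance differs from the reference population. The rates below are invented for illustration:

```python
# Reference (risk-model) death probabilities by patient type.
ref_risk = {"low": 0.10, "high": 0.30}
# Both units perform identically: better than the reference on low-risk
# patients, identical on high-risk ones.
unit_risk = {"low": 0.05, "high": 0.30}

def smr(case_mix):
    """Standardised mortality ratio = observed deaths / expected deaths."""
    observed = sum(n * unit_risk[t] for t, n in case_mix.items())
    expected = sum(n * ref_risk[t] for t, n in case_mix.items())
    return observed / expected

smr_a = smr({"low": 800, "high": 200})   # mostly low-risk admissions
smr_b = smr({"low": 200, "high": 800})   # mostly high-risk admissions
print(f"unit A SMR = {smr_a:.3f}")       # -> 0.714
print(f"unit B SMR = {smr_b:.3f}")       # -> 0.962
```

    Every individual patient has the same outcome probability in both units, yet the SMRs differ purely because of case-mix, which is exactly the weakness the paper demonstrates.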

  12. Adjustment of a direct method for the determination of human body burden of Pu-239 by X-ray detection of U-235

    International Nuclear Information System (INIS)

    Boulay, P.

    1968-04-01

    The use of Pu-239 on a larger scale poses the problem of measuring contamination by aerosols at lung level. A method of direct measurement of the Pu-239 lung burden is possible thanks to the use of a large-area-window proportional counter. A counter of this design has been built especially for this purpose. The adjustment of the apparatus provides sufficient sensitivity to detect contamination at the maximum permissible body burden level. In addition, a method for individual 'internal calibration' with a plutonium mock-up, protactinium-233, is reported. (author) [fr

  13. On performing of interference technique based on self-adjusting Zernike filters (SA-AVT method) to investigate flows and validate 3D flow numerical simulations

    Science.gov (United States)

    Pavlov, Al. A.; Shevchenko, A. M.; Khotyanovsky, D. V.; Pavlov, A. A.; Shmakov, A. S.; Golubev, M. P.

    2017-10-01

    We present a method for and results of determination of the field of integral density in the structure of flow corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of flow was performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density on the path of probing radiation in one direction of 3D flow transillumination in the region of Mach interaction of shock waves were obtained for the first time.

  14. Survey and evaluation of aging risk assessment methods and applications

    International Nuclear Information System (INIS)

    Sanzo, D.; Kvam, P.; Apostolakis, G.; Wu, J.; Milici, T.; Ghoniem, N.; Guarro, S.

    1994-11-01

    The US Nuclear Regulatory Commission initiated the nuclear power plant aging research program about 6 years ago to gather information about nuclear power plant aging. Since then, this program has collected a significant amount of information, largely qualitative, on plant aging and its potential effects on plant safety. However, this body of knowledge has not yet been integrated into formalisms that can be used effectively and systematically to assess plant risk resulting from aging, although models for assessing the effect of increasing failure rates on core damage frequency have been proposed. This report surveys the work on the aging of systems, structures, and components (SSCs) of nuclear power plants, as well as the associated databases. We take a critical look at the need to revise probabilistic risk assessments (PRAs) so that they include the contribution to risk from plant aging, at the adequacy of existing methods for evaluating this contribution, and at the adequacy of the data that have been used in these evaluation methods. We identify a preliminary framework for integrating the aging of SSCs into the PRA, including the identification of the data necessary for such an integration.

  15. Methodical treatment of dependent failures in risk analyses

    International Nuclear Information System (INIS)

    Hennings, W.; Mertens, J.

    1987-06-01

    In this report the state of the art regarding dependent failures is compiled and commented on. Among others, the following recommendations are inferred: the term 'common mode failures' should be restricted to failures of redundant, similar components; the generic term is 'dependent failures', with the subsets 'causal failures' and 'common cause failures'. In risk studies, dependent failures should be covered as far as possible by 'explicit methods'. Nevertheless, a residual contribution remains uncovered, which should be accounted for by sensitivity analyses using 'implicit methods'. For this, the homogeneous Marshall-Olkin model is recommended. Because the available reports on operating experience only record 'common mode failures' systematically, it is recommended to additionally apply other methods, e.g. to carry out a 'precursor study'. (orig.) [de
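    In the homogeneous Marshall-Olkin model recommended above, two redundant components fail from independent shocks (rate lam_i each) and a common shock (rate lam_c), with joint survival function S(t1, t2) = exp(-lam_i*t1 - lam_i*t2 - lam_c*max(t1, t2)). A sketch of how even a small common-cause rate dominates the probability that both components fail within a mission time (the rates are invented):

```python
import math

def p_both_fail(t, lam_i, lam_c):
    """P(T1 <= t and T2 <= t) under the homogeneous Marshall-Olkin
    bivariate exponential, via inclusion-exclusion."""
    s1 = math.exp(-(lam_i + lam_c) * t)          # marginal P(T1 > t)
    s_both = math.exp(-(2 * lam_i + lam_c) * t)  # joint P(T1 > t, T2 > t)
    return 1 - 2 * s1 + s_both

t = 1000.0          # mission time, hours
lam_i = 1e-4        # independent failure rate per component, 1/h
print(p_both_fail(t, lam_i, lam_c=0.0))     # purely independent failures
print(p_both_fail(t, lam_i, lam_c=1e-5))    # with a small common-cause rate
```

    With lam_c = 0 the result reduces to the familiar independent product (1 - exp(-lam_i*t))**2; a common-cause rate one tenth of the independent rate roughly doubles the probability that the redundant pair fails, which is why implicit common-cause treatment matters in sensitivity analyses.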

  16. City-Level Adult Stroke Prevalence in Relation to Remote Sensing Derived PM2.5 Adjusting for Unhealthy Behaviors and Medical Risk Factors

    Science.gov (United States)

    Hu, Z.

    2018-04-01

    This research explores the use of a PM2.5 grid derived from remote sensing for assessing the effect of long-term exposure to PM2.5 (ambient air pollution of particulate matter with an aerodynamic diameter of 2.5 μm or less) on stroke, adjusting for unhealthy behaviors and medical risk factors. Health data was obtained from the newly published CDC "500 Cities Project" which provides city- and census tract-level small area estimates for chronic disease risk factors, and clinical preventive service use for the largest 500 cities in the United States. PM2.5 data was acquired from the "The Global Annual PM2.5 Grids from MODIS, MISR and SeaWiFS Aerosol Optical Depth (AOD), V1 (1998-2012)" datasets. Average PM2.5 was calculated for each city using a GIS zonal statistics function. Map data visualization and pattern comparison, univariate linear regression, and a multivariate linear regression model fitted using a generalized linear model via penalized maximum likelihood found that long-term exposure to ambient PM2.5 may increase the risk of stroke. Increasing physical activity, reducing smoking and body weight, getting enough sleep, and controlling diseases such as blood pressure, coronary heart disease, diabetes, and cholesterol, may mitigate the effect. PM2.5 grids derived from moderate resolution satellite remote sensing imagery may offer a unique opportunity to fill the data gap due to limited ground monitoring at broader scales. The evidence of raised stroke prevalence risk in high PM2.5 areas would support targeting of policy interventions on such areas to reduce pollution levels and protect human health.

  17. CITY-LEVEL ADULT STROKE PREVALENCE IN RELATION TO REMOTE SENSING DERIVED PM2.5 ADJUSTING FOR UNHEALTHY BEHAVIORS AND MEDICAL RISK FACTORS

    Directory of Open Access Journals (Sweden)

    Z. Hu

    2018-04-01

    Full Text Available This research explores the use of a PM2.5 grid derived from remote sensing for assessing the effect of long-term exposure to PM2.5 (ambient air pollution of particulate matter with an aerodynamic diameter of 2.5 μm or less) on stroke, adjusting for unhealthy behaviors and medical risk factors. Health data was obtained from the newly published CDC “500 Cities Project” which provides city- and census tract-level small area estimates for chronic disease risk factors, and clinical preventive service use for the largest 500 cities in the United States. PM2.5 data was acquired from the “The Global Annual PM2.5 Grids from MODIS, MISR and SeaWiFS Aerosol Optical Depth (AOD), V1 (1998–2012)” datasets. Average PM2.5 was calculated for each city using a GIS zonal statistics function. Map data visualization and pattern comparison, univariate linear regression, and a multivariate linear regression model fitted using a generalized linear model via penalized maximum likelihood found that long-term exposure to ambient PM2.5 may increase the risk of stroke. Increasing physical activity, reducing smoking and body weight, getting enough sleep, and controlling diseases such as blood pressure, coronary heart disease, diabetes, and cholesterol, may mitigate the effect. PM2.5 grids derived from moderate resolution satellite remote sensing imagery may offer a unique opportunity to fill the data gap due to limited ground monitoring at broader scales. The evidence of raised stroke prevalence risk in high PM2.5 areas would support targeting of policy interventions on such areas to reduce pollution levels and protect human health.

  18. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2016

    International Nuclear Information System (INIS)

    Herman, Michal Wladyslaw; Cabellos De Francisco, Oscar; Beck, Bret; Ignatyuk, Anatoly V.; Palmiotti, Giuseppe; Grudzevich, Oleg T.; Salvatores, Massimo; Chadwick, Mark; Pelloni, Sandro; Diez De La Obra, Carlos Javier; Wu, Haicheng; Sobes, Vladimir; Rearden, Bradley T.; Yokoyama, Kenji; Hursin, Mathieu; Penttila, Heikki; Kodeli, Ivan-Alexander; Plevnik, Lucijan; Plompen, Arjan; Gabrielli, Fabrizio; Leal, Luiz Carlos; Aufiero, Manuele; Fiorito, Luca; Hummel, Andrew; Siefman, Daniel; Leconte, Pierre

    2016-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. WPEC subgroup 40-CIELO (Collaborative International Evaluated Library Organization) provides a new working paradigm to facilitate evaluated nuclear reaction data advances. It brings together experts from across the international nuclear reaction data community to identify and document discrepancies among existing evaluated data libraries, measured data, and model calculation interpretations, and aims to make progress in reconciling these discrepancies to create more accurate ENDF-formatted files. SG40-CIELO focusses on 6 important isotopes: ¹H, ¹⁶O, ⁵⁶Fe, ²³⁵,²³⁸U, ²³⁹Pu. This document is the proceedings of the seventh formal Subgroup 39 meeting and of the Joint SG39+SG40 Session held at the NEA, OECD Conference Center, Paris, France on 10-11 May 2016. It comprises a Summary Record of the meeting, and all the available presentations (slides) given by the participants: A - Welcome and actions review (Oscar CABELLOS); B - Methods: - XGPT: uncertainty propagation and data assimilation from continuous energy covariance matrix and resonance parameters covariances (Manuele AUFIERO); - Optimal experiment utilization (REWINDing PIA) (G. Palmiotti); C - Experiment analysis, sensitivity calculations and benchmarks: - Tripoli-4 analysis of SEG experiments (Andrew HUMMEL); - Tripoli-4 analysis of BERENICE experiments (P. DUFAY, Cyrille DE SAINT JEAN); - Preparation of sensitivities of k-eff, beta-eff and shielding benchmarks for adjustment exercise (Ivo KODELI); - SA and

  19. Cooking Methods for Red Meats and Risk of Type 2 Diabetes: A Prospective Study of U.S. Women.

    Science.gov (United States)

    Liu, Gang; Zong, Geng; Hu, Frank B; Willett, Walter C; Eisenberg, David M; Sun, Qi

    2017-08-01

    This study examined different cooking methods for red meats in relation to type 2 diabetes (T2D) risk among U.S. women who consumed red meats regularly (≥2 servings/week). We monitored 59,033 women (1986-2012) aged 30-55 years and free of diabetes, cardiovascular disease, and cancer at baseline when information on frequency of different cooking methods for red meats, including broiling, barbequing, roasting, pan-frying, and stewing/boiling, was collected. During 1.24 million person-years of follow-up, we documented 6,206 incident cases of T2D. After multivariate adjustment including red meat cooking methods, total red meat and processed red meat intake were both associated with a monotonically increased T2D risk (both P for trend significant), and the associations persisted when the cooking methods were further mutually adjusted. Independent of total red meat consumption, high-temperature and/or open-flame cooking methods for red meats, especially broiling and barbequing, may further increase diabetes risk among regular meat eaters. © 2017 by the American Diabetes Association.

  20. Internal dosimetry hazard and risk assessments: methods and applications

    International Nuclear Information System (INIS)

    Roberts, G.A.

    2006-01-01

    Routine internal dose exposures are typically (in the UK nuclear industry) lower than external dose exposures; however, the costs of internal dosimetry monitoring programmes can be significantly greater than those for external dosimetry. For this reason, decisions on when to apply routine monitoring programmes, and on the nature of these programmes, can be more critical than for external dosimetry programmes. This paper describes various methods for performing hazard and risk assessments which are being developed by RWE NUKEM Limited Approved Dosimetry Services to provide an indication of when routine internal dosimetry monitoring should be considered. (author)

  1. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, November 2014

    International Nuclear Information System (INIS)

    Aufiero, Manuele; Ivanov, Evgeny; Hoefer, Axel; Yokoyama, Kenji; Da Cruz, Dirceu Ferreira; KODELI, Ivan-Alexander; Hursin, Mathieu; Pelloni, Sandro; Palmiotti, Giuseppe; Salvatores, Massimo; Barnes, Andrew; Cabellos De Francisco, Oscar; ); Ivanova, Tatiana; )

    2014-11-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the third formal Subgroup meeting held at the NEA, Issy-les-Moulineaux, France, on 27-28 November 2014. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Sensitivity methods: 1 - Perturbation/sensitivity calculations with Serpent (M. Aufiero); 2 - Comparison of deterministic and Monte Carlo sensitivity analysis of SNEAK-7A and FLATTOP-Pu Benchmarks (I. Kodeli); B - Integral experiments: 1 - PROTEUS experiments: selected experiments sensitivity profiles and availability (M. Hursin, M. Salvatores - PROTEUS Experiments, HCLWR configurations); 2 - SINBAD Benchmark Database and FNS/JAEA Liquid Oxygen TOF Experiment Analysis (I. Kodeli); 3 - STEK experiment: Opportunity for Validation of Fission Products Nuclear Data (D. Da Cruz); 4 - SEG (tailored adjoint flux shapes) (M. Salvatores - comments); 5 - IPPE transmission experiments (Fe, ²³⁸U) (T. Ivanova); 6 - RPI semi-integral (Fe, ²³⁸U) (G. Palmiotti - comments); 7 - New experiments, e.g. in connection with the new NSC Expert Group on 'Improvement of Integral Experiments Data for Minor Actinide Management' (G. Palmiotti - Some comments from the Expert Group); 8 - Additional PSI adjustment studies accounting for nonlinearity (S. Pelloni); 9 - Adjustment methodology issues (G. Palmiotti); C - Am-241 and fission product issues: 1 - Am-241 validation for criticality-safety calculations (A. Barnes - Visio

  2. Application of adjustment calculus in the nodeless Trefftz method for a problem of two-dimensional temperature field of the boiling liquid flowing in a minichannel

    Directory of Open Access Journals (Sweden)

    Hożejowska Sylwia

    2014-03-01

    Full Text Available The paper presents an application of the nodeless Trefftz method to calculate the temperature of the heating foil and the insulating glass pane during continuous flow of a refrigerant along a vertical minichannel. Numerical computations refer to an experiment in which the refrigerant (FC-72) enters a rectangular minichannel under controlled pressure and temperature. Initially its temperature is below the boiling point. During the flow it is heated by a heating foil. Thermosensitive liquid crystals make it possible to obtain the two-dimensional temperature field in the foil. Since the nodeless Trefftz method performs very well on such problems, it was chosen as the numerical method to approximate the two-dimensional temperature distribution in the protecting glass and the heating foil. Because the temperature of the refrigerant is known, it was also possible to evaluate the heat transfer coefficient at the foil-refrigerant interface. To improve the numerical results further, the nodeless Trefftz method was combined with adjustment calculus. Adjustment calculus made it possible to smooth the measurements and to decrease the measurement errors. As with the measurement errors, the error of the heat transfer coefficient decreased.

  3. A Method and a Model for Describing Competence and Adjustment: A Preschool Version of the Classroom Behavior Inventory.

    Science.gov (United States)

    Schaefer, Earl S.; Edgerton, Marianna D.

    A preschool version of the Classroom Behavior Inventory which provides a method for collecting valid data on a child's classroom behavior from day care and preschool teachers, was developed to complement the earlier form which was developed and validated for elementary school populations. The new version was tested with a pilot group of twenty-two…

  4. Using Case-Mix Adjustment Methods To Measure the Effectiveness of Substance Abuse Treatment: Three Examples Using Client Employment Outcomes.

    Science.gov (United States)

    Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.

    This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…

  5. Risk-adjusted survival for adults following in-hospital cardiac arrest by day of week and time of day: observational cohort study.

    Science.gov (United States)

    Robinson, Emily J; Smith, Gary B; Power, Geraldine S; Harrison, David A; Nolan, Jerry; Soar, Jasmeet; Spearpoint, Ken; Gwinnutt, Carl; Rowan, Kathryn M

    2016-11-01

    Internationally, hospital survival is lower for patients admitted at weekends and at night. Data from the UK National Cardiac Arrest Audit (NCAA) indicate that crude hospital survival was worse after in-hospital cardiac arrest (IHCA) at night versus day, and at weekends versus weekdays, despite similar frequency of events. To describe IHCA demographics during three day/time periods-weekday daytime (Monday to Friday, 08:00 to 19:59), weekend daytime (Saturday and Sunday, 08:00 to 19:59) and night-time (Monday to Sunday, 20:00 to 07:59)-and to compare the associated rates of return of spontaneous circulation (ROSC) for >20 min (ROSC>20 min) and survival to hospital discharge, adjusted for risk using previously developed NCAA risk models. To consider whether any observed difference could be attributed to differences in the case mix of patients resident in hospital and/or the administered care. We performed a prospectively defined analysis of NCAA data from 27 700 patients aged ≥16 years receiving chest compressions and/or defibrillation and attended by a hospital-based resuscitation team in response to a resuscitation (2222) call in 146 UK acute hospitals. Risk-adjusted outcomes (OR (95% CI)) were significantly worse during weekend daytime (ROSC>20 min 0.88 (0.81 to 0.95); hospital survival 0.72 (0.64 to 0.80)), and night-time (ROSC>20 min 0.72 (0.68 to 0.76); hospital survival 0.58 (0.54 to 0.63)) compared with weekday daytime. The effects were stronger for non-shockable than shockable rhythms, but there was no significant interaction between day/time of arrest and age, or day/time of arrest and arrest location. While many daytime IHCAs involved procedures, restricting the analyses to IHCAs in medical admissions with an arrest location of ward produced results that are broadly in line with the primary analyses. IHCAs attended by the hospital-based resuscitation team during nights and weekends have substantially worse outcomes than during weekday daytimes. Organisational or care differences at

  6. Ergonomic lumbar risk analysis of construction workers by NIOSH method

    Directory of Open Access Journals (Sweden)

    Cinara Caetano Pereira

    2015-09-01

    Full Text Available Work in construction involves tasks directly connected with the manual transport of materials. One of the body segments under greatest demand in work with these characteristics is the lumbar spine. The aim of this study was to analyze the level of lumbar risk of construction workers during the handling of materials. The sample was composed of 74 construction workers. The research tools were: the NIOSH method for lumbar risk verification, expressed by the recommended weight limit (WPR) and the lifting index (IL); the visual analogue scale (VAS) for the evaluation of pain intensity; the Corlett diagram for the mapping of pain; and the Borg scale for the subjective perception of the intensity of physical exertion. The study identified a recommended weight limit of 8.707 kg for the handling of cement sacks and 8.194 kg for the loading of the wheelbarrows used. These limits are about six times lower than the actual weights handled during the activities, which revolve around 50 kg for the sacks and an average of 49.72 kg for the wheelbarrow loads. The working configurations found in this study pose a high ergonomic risk for the lumbar region; measures such as the reconfiguration of workplaces and the use of auxiliary devices for lifting, transport and unloading are fundamental, in addition to the need for reflection on the current logistics that lead producers to supply cement in 50 kg sacks.
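
The revised NIOSH lifting equation behind the recommended weight limit and lifting-index figures above can be sketched in a few lines. The multiplier formulas below are the standard metric ones; the task geometry and the frequency/coupling multipliers are hypothetical stand-ins, not values taken from this study:

```python
def recommended_weight_limit(h_cm, v_cm, d_cm, angle_deg, fm, cm):
    """Revised NIOSH lifting equation (metric): RWL = LC*HM*VM*DM*AM*FM*CM.

    fm (frequency multiplier) and cm (coupling multiplier) are normally
    table lookups; here they are passed in directly as assumed values.
    """
    lc = 23.0                            # load constant, kg
    hm = min(1.0, 25.0 / h_cm)           # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)  # vertical multiplier
    dm = 0.82 + 4.5 / d_cm               # distance multiplier
    am = 1.0 - 0.0032 * angle_deg        # asymmetric multiplier
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    # LI > 1 signals elevated lumbar risk; LI > 3 is commonly treated
    # as a high-risk lift.
    return load_kg / rwl_kg

# Hypothetical cement-sack lift; the geometry is assumed, not from the study.
rwl = recommended_weight_limit(h_cm=40, v_cm=30, d_cm=60, angle_deg=30,
                               fm=0.85, cm=0.90)
li = lifting_index(50.0, rwl)
print(round(rwl, 2), round(li, 2))
```

With these assumed inputs the lifting index for a 50 kg sack comes out around 6.5, the same order as the roughly six-fold exceedance the study reports.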

  7. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2015

    International Nuclear Information System (INIS)

    Wang, Wenming; Yokoyama, Kenji; Kim, Do Heon; Kodeli, Ivan-Alexander; Hursin, Mathieu; Pelloni, Sandro; Palmiotti, Giuseppe; Salvatores, Massimo; Touran, Nicholas; Cabellos De Francisco, Oscar

    2015-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the fourth Subgroup meeting, held at the NEA, Issy-les-Moulineaux, France, on 19-20 May 2015. It comprises a Summary Record of the meeting, two papers on deliverables and all the available presentations (slides) given by the participants: 1 - Status of Deliverables: '1. Methodology' (K. Yokoyama); 2 - Status of Deliverables: '2. Comments on covariance data' (K. Yokoyama); 3 - PROTEUS HCLWR Experiments (M. Hursin); 4 - Preliminary UQ Efforts for TWR Design (N. Touran); 5 - Potential use of beta-eff and other benchmark for adjustment (I. Kodeli); 6 - k-eff uncertainties for a simple case of 241Am using different codes and evaluated files (I. Kodeli); 7 - k-eff uncertainties for a simple case of 241Am using TSUNAMI (O. Cabellos); 8 - REWIND: Ranking Experiments by Weighting to Improve Nuclear Data (G. Palmiotti); 9 - Recent analysis on NUDUNA/MOCABA applications to reactor physics parameters (E. Castro); 10 - INL exploratory study for SEG (A. Hummel); 11 - The Development of Nuclear Data Adjustment Code at CNDC (H. Wu); 12 - SG39 Perspectives (M. Salvatores). A list of issues and actions concludes the document.

  8. RISK EVALUATION OF PIN JIG WORK UNIT IN SHIPBUILDING BY USING FUZZY AHP METHOD

    Directory of Open Access Journals (Sweden)

    Murat Ozkok

    2015-03-01

    Full Text Available The shipbuilding industry encompasses many different industry branches, so various kinds of work accidents occur. These work accidents often cause serious injuries and even deaths, so preventing or minimizing them is crucial. In order to reduce work accidents in shipyards, the most hazardous activities need to be determined; shipyard management must then work to remove these hazard sources. In this study, the pin jig work unit, where curved parts are mounted on adjustable pin jigs, was considered. At first, the work activities and operations of the pin jig work station were identified and classified as main and sub risk criteria. Then, pairwise comparison scales were built and these risk criteria were evaluated by experts who have been working for a shipyard located in Turkey. Based on the experts' evaluations, the risk weights of the activities carried out at the pin jig work unit were determined using the fuzzy AHP method. The aim is for shipyard management to take precautions for the risky operations at the pin jig work unit before failures happen.
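
The weight-derivation step of AHP can be illustrated with a minimal crisp sketch. Full fuzzy AHP replaces the crisp judgments below with triangular fuzzy numbers before deriving weights; the pairwise matrix and the three hazard names here are hypothetical, not taken from the study:

```python
from math import prod

def ahp_weights(pairwise):
    """AHP priority weights via the geometric-mean (row) method.

    pairwise[i][j] is the judged importance of criterion i over j on
    Saaty's 1-9 scale; the matrix must be reciprocal:
    pairwise[j][i] == 1 / pairwise[i][j].
    """
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical judgments over three pin jig hazards (illustrative only):
# manual handling vs. working at height vs. hot work.
m = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(m)
print([round(x, 3) for x in w])
```

The resulting weights sum to one and rank the hazards in the order implied by the judgments; a consistency-ratio check would normally follow before the weights are used.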

  9. Valuing Drinking Water Risk Reductions Using the Contingent Valuation Method: A Methodological Study of Risks from THM and Giardia (1986)

    Science.gov (United States)

    This study develops contingent valuation methods for measuring the benefits of mortality and morbidity drinking water risk reductions. The major effort was devoted to developing and testing a survey instrument to value low-level risk reductions.

  10. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    Science.gov (United States)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing whether a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High-value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome.
For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or

  11. Bundle Adjustment-Based Stability Analysis Method with a Case Study of a Dual Fluoroscopy Imaging System

    Science.gov (United States)

    Al-Durgham, K.; Lichti, D. D.; Detchev, I.; Kuntze, G.; Ronsky, J. L.

    2018-05-01

    A fundamental task in photogrammetry is the temporal stability analysis of a camera/imaging-system's calibration parameters. This is essential to validate the repeatability of the parameters' estimation, to detect any behavioural changes in the camera/imaging system and to ensure precise photogrammetric products. Many stability analysis methods exist in the photogrammetric literature; each one has different methodological bases, and advantages and disadvantages. This paper presents a simple and rigorous stability analysis method that can be straightforwardly implemented for a single camera or an imaging system with multiple cameras. The basic collinearity model is used to capture differences between two calibration datasets, and to establish the stability analysis methodology. Geometric simulation is used as a tool to derive image and object space scenarios. Experiments were performed on real calibration datasets from a dual fluoroscopy (DF; X-ray-based) imaging system. The calibration data consisted of hundreds of images and thousands of image observations from six temporal points over a two-day period for a precise evaluation of the DF system stability. The stability of the DF system was found to be within a range of 0.01 to 0.66 mm in terms of 3D coordinate root-mean-square error (RMSE) for the single-camera analysis, and 0.07 to 0.19 mm for the dual-camera analysis. To the authors' best knowledge, this work is the first to address the topic of DF stability analysis.

  12. Noninvasive Non-Dose Method for Risk Stratification of Breast Diseases

    Directory of Open Access Journals (Sweden)

    I. A. Apollonova

    2014-01-01

    Full Text Available The article concerns a relevant issue: the development of a noninvasive method for screening diagnostics and risk stratification of breast diseases. The developed method and its embodiment use both the analysis of onco-epidemiologic tests and iridoglyphical research. Widely used onco-epidemiologic tests only reflect the patient's subjective perception of her own life history and sickness; therefore, to confirm the revealed factors, modern objective and safe methods are necessary. Iridoglyphical research may be considered one of those methods, since it allows changes in iris zones to be revealed in real time. As these zones are functionally linked with internal organs and systems, in this case the mammary glands, changes in iris zones may be used to assess risk groups for mammary gland disorders. The article presents results of research conducted using a prototype of the hardware-software complex providing screening diagnostics and risk stratification of mammary gland disorders. Research was conducted using verified materials provided by the Biomedical Engineering Faculty and the Scientific Biometry Research and Development Centre of Bauman Moscow State Technical University, the City of Moscow's GUZ Clinical and Diagnostic Centre N°4 of the Western Administrative District and the First Mammology (Breast Care) Centre of the Russian Federation's Ministry of Health and Social Development. The information obtained from the onco-epidemiological tests and iridoglyphical research was used to develop a procedure of quantitative diagnostics aimed at assessing mammary gland cancer risk groups. The procedure is based on Bayes' conditional probability. The task of quantitative diagnostics may be formally divided into the differential assessment of three states. The first, D1, is the norm, which corresponds to the population group with a lack of risk factors or changes of the mammary glands. The second, D2, is the population group
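
The Bayes step the record describes can be sketched as a naive posterior update over the diagnostic states. Only D1 and D2 are defined in the excerpt, so the third state and all probability numbers below are illustrative assumptions:

```python
def bayes_posterior(priors, likelihoods):
    """Posterior P(D_k | findings) proportional to P(D_k) * product of
    P(finding_i | D_k), assuming conditionally independent findings."""
    unnorm = []
    for prior, lks in zip(priors, likelihoods):
        p = prior
        for lk in lks:
            p *= lk
        unnorm.append(p)
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Three states: D1 = norm, D2 = risk group, and an assumed third state D3
# (changes present). Findings: a positive onco-epidemiologic test and an
# iris-zone change. All probabilities are illustrative assumptions.
priors = [0.80, 0.15, 0.05]
likelihoods = [
    [0.10, 0.05],  # P(test+ | D1), P(iris+ | D1)
    [0.60, 0.40],  # P(test+ | D2), P(iris+ | D2)
    [0.90, 0.80],  # P(test+ | D3), P(iris+ | D3)
]
post = bayes_posterior(priors, likelihoods)
print([round(p, 3) for p in post])
```

With both findings positive, the posterior mass shifts away from the norm state even though the prior strongly favoured it.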

  13. Alternative Testing Methods for Predicting Health Risk from Environmental Exposures

    Directory of Open Access Journals (Sweden)

    Annamaria Colacci

    2014-08-01

    Full Text Available Alternative methods to animal testing are considered promising tools to support the prediction of toxicological risks from environmental exposure. Among the alternative testing methods, the cell transformation assay (CTA) appears to be one of the most appropriate approaches to predict the carcinogenic properties of single chemicals, complex mixtures and environmental pollutants. The BALB/c 3T3 CTA shows a good degree of concordance with the in vivo rodent carcinogenesis tests. Whole-genome transcriptomic profiling is performed to identify genes that are transcriptionally regulated by different kinds of exposures. Its use in cell models representative of target organs may help in understanding the mode of action and predicting the risk for human health. Aiming to associate environmental exposure with adverse health outcomes, we used an integrated approach including the 3T3 CTA and transcriptomics on target cells, in order to evaluate the effects of airborne particulate matter (PM) on complex toxicological endpoints. Organic extracts obtained from PM2.5 and PM1 samples were evaluated in the 3T3 CTA in order to identify effects possibly associated with different aerodynamic diameters or airborne chemical components. The effects of the PM2.5 extracts on human health were assessed by using whole-genome 44 K oligo-microarray slides. Statistical analysis by GeneSpring GX identified genes whose expression was modulated in response to the cell treatment. Then, modulated genes were associated with pathways, biological processes and diseases through an extensive biological analysis. Data derived from in vitro methods and omics techniques could be valuable for monitoring exposure to toxicants, understanding modes of action via exposure-associated gene expression patterns and highlighting the role of genes in key events related to adversity.

  14. Adjoint Methods for Adjusting Three-Dimensional Atmosphere and Surface Properties to Fit Multi-Angle Multi-Pixel Polarimetric Measurements

    Science.gov (United States)

    Martin, William G.; Cairns, Brian; Bal, Guillaume

    2014-01-01

    This paper derives an efficient procedure for using the three-dimensional (3D) vector radiative transfer equation (VRTE) to adjust atmosphere and surface properties and improve their fit with multi-angle/multi-pixel radiometric and polarimetric measurements of scattered sunlight. The proposed adjoint method uses the 3D VRTE to compute the measurement misfit function and the adjoint 3D VRTE to compute its gradient with respect to all unknown parameters. In the remote sensing problems of interest, the scalar-valued misfit function quantifies agreement with data as a function of atmosphere and surface properties, and its gradient guides the search through this parameter space. Remote sensing of the atmosphere and surface in a three-dimensional region may require thousands of unknown parameters and millions of data points. Many approaches would require calls to the 3D VRTE solver in proportion to the number of unknown parameters or measurements. To avoid this issue of scale, we focus on computing the gradient of the misfit function as an alternative to the Jacobian of the measurement operator. The resulting adjoint method provides a way to adjust 3D atmosphere and surface properties with only two calls to the 3D VRTE solver for each spectral channel, regardless of the number of retrieval parameters, measurement view angles or pixels. This gives a procedure for adjusting atmosphere and surface parameters that will scale to the large problems of 3D remote sensing. For certain types of multi-angle/multi-pixel polarimetric measurements, this encourages the development of a new class of three-dimensional retrieval algorithms with more flexible parametrizations of spatial heterogeneity, less reliance on data screening procedures, and improved coverage in terms of the resolved physical processes in the Earth's atmosphere.
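
The scaling argument above, two solver calls per gradient regardless of the number of parameters, can be seen on a toy linear model standing in for the 3D VRTE. For a misfit J(p) = (1/2)||Ap - d||^2, one forward application and one adjoint (transpose) application yield the entire gradient A^T(Ap - d); the operator and data below are made up for illustration:

```python
def matvec(a, x):
    # Forward application of the measurement operator A.
    return [sum(a[i][j] * x[j] for j in range(len(x))) for i in range(len(a))]

def matvec_t(a, y):
    # Adjoint (transpose) application of A.
    n = len(a[0])
    return [sum(a[i][j] * y[i] for i in range(len(a))) for j in range(n)]

def misfit(a, p, d):
    r = [f - di for f, di in zip(matvec(a, p), d)]
    return 0.5 * sum(ri * ri for ri in r)

def misfit_gradient(a, p, d):
    """One forward pass plus one adjoint pass gives the full gradient,
    no matter how many parameters p holds."""
    r = [f - di for f, di in zip(matvec(a, p), d)]  # forward solve
    return matvec_t(a, r)                           # adjoint solve

a = [[2.0, 0.0, 1.0], [1.0, 3.0, 0.0]]  # toy linear measurement operator
d = [1.0, 2.0]                          # data
p = [0.5, 0.5, 0.5]                     # three retrieval parameters

g = misfit_gradient(a, p, d)

# Sanity check of one gradient component against a finite difference.
eps = 1e-6
p_shift = p[:]
p_shift[0] += eps
fd = (misfit(a, p_shift, d) - misfit(a, p, d)) / eps
print(g, round(fd, 4))
```

A Jacobian-based approach would instead need one extra solve per parameter; the adjoint route keeps the count at two however long `p` becomes.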

  15. A method of adjusting SUV for injection-acquisition time differences in ¹⁸F-FDG PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Laffon, Eric [Hopital du Haut Leveque, CHU de Bordeaux, Pessac (France); Centre de Recherche Cardio-Thoracique, Bordeaux (France); Hopital du Haut-Leveque, Service de Medecine Nucleaire, Pessac (France); Clermont, Henri de [Hopital du Haut Leveque, CHU de Bordeaux, Pessac (France); Marthan, Roger [Hopital du Haut Leveque, CHU de Bordeaux, Pessac (France); Centre de Recherche Cardio-Thoracique, Bordeaux (France)

    2011-11-15

    A time normalisation method of tumour SUVs in ¹⁸F-FDG PET imaging is proposed that has been verified in lung cancer patients. A two-compartment model analysis showed that, when SUV is not corrected for ¹⁸F physical decay (SUV_uncorr), its value is within 5% of its peak value (t = 79 min) between 55 and 110 min after injection, in each individual patient. In 10 patients, each with 1 or more malignant lesions (n = 15), two PET acquisitions were performed within this time delay, and the maximal SUV of each lesion, both corrected and uncorrected, was assessed. No significant difference was found between the two uncorrected SUVs, whereas there was a significant difference between the two corrected ones: mean differences were 0.04 ± 0.22 and 3.24 ± 0.75 g·ml⁻¹, respectively (95% confidence intervals). Therefore, a simple normalisation of decay-corrected SUV for time differences after injection is proposed: SUV_N = 1.66*SUV_uncorr, where the factor 1.66 arises from decay correction at t = 79 min. When ¹⁸F-FDG PET imaging is performed within the range 55-110 min after injection, a simple SUV normalisation for time differences after injection has been verified in patients with lung cancer, with a ±2.5% relative measurement uncertainty. (orig.)
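
The record's factor of 1.66 is just the ¹⁸F decay correction evaluated at the peak time t = 79 min. A sketch, assuming the standard ¹⁸F half-life of about 109.8 min (a value supplied here, not stated in the record); the computed factor lands near 1.65, within rounding of the 1.66 the authors use:

```python
F18_HALF_LIFE_MIN = 109.8  # assumed 18F physical half-life, minutes

def decay_correction_factor(t_min, half_life_min=F18_HALF_LIFE_MIN):
    """Factor converting an uncorrected SUV into one decay-corrected back
    to injection time: exp(lambda * t) = 2 ** (t / T_half)."""
    return 2.0 ** (t_min / half_life_min)

def normalised_suv(suv_uncorrected, t_peak_min=79.0):
    # SUV_N = factor * SUV_uncorr, with the factor fixed at the peak
    # time t = 79 min reported in the record.
    return decay_correction_factor(t_peak_min) * suv_uncorrected

factor = decay_correction_factor(79.0)
print(round(factor, 3))  # close to the record's 1.66, up to rounding of t and T_half
```

Fixing the factor at t = 79 min is what makes the normalisation insensitive to the exact acquisition time inside the 55-110 min window, since SUV_uncorr is nearly flat there.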

  16. Improving Results of Elective Abdominal Aortic Aneurysm Repair at a Low-Volume Hospital by Risk-Adjusted Selection of Treatment in the Endovascular Era

    International Nuclear Information System (INIS)

    Wibmer, Andreas; Meyer, Bernhard; Albrecht, Thomas; Buhr, Heinz-Johannes; Kruschewski, Martin

    2009-01-01

    open repair was reduced from 8.5% to 3.7% (p = 0.414). In conclusion, by risk-adjusted selection of treatment and frequent application of EVAR, it is possible to improve perioperative outcome of elective AAA repair at a low-volume hospital. Mortality figures are similar to those of recent trials at high-volume centers, as reported in the literature.

  17. Environmental risk comparisons with internal methods of UST leak detection

    International Nuclear Information System (INIS)

    Durgin, P.B.

    1993-01-01

    The past five years have seen a variety of advances in how leaks can be detected from within underground storage tanks. Any leak-detection approach employed within a storage tank must be conducted at specific time intervals and meet certain leak-rate criteria according to federal and state regulations. Nevertheless, the potential environmental consequences of leak-detection approaches differ widely. Internal, volumetric UST monitoring techniques have developed over time, including: (1) inventory control with stick measurements, (2) precision tank testing, (3) automatic tank gauging (ATG), (4) statistical inventory reconciliation (SIR), and (5) statistical techniques with automatic tank gauging. An ATG offers the advantage of precise data, but measured over only a brief period. On the other hand, stick data have less precision, but when combined with SIR over extended periods they too can detect low leak rates. Graphs demonstrate the comparable amounts of fuel that can leak out of a tank before being detected by these techniques. The results indicate that annual tank testing has the greatest potential for large volumes of fuel leaking without detection, while new statistical approaches with an ATG have the least potential. The environmental implications of the volumes of fuel leaked prior to detection are site specific. For example, if a storage tank is surrounded by a high water table and sits in a sole-source aquifer, even small leaks may cause problems. The user must also consider regulatory risks. The level of environmental and regulatory risk should influence selection of the UST leak-detection method.

  18. Equivalence of ten different methods for valuing companies by cash flow discounting.

    OpenAIRE

    Fernandez, Pablo

    2003-01-01

    This paper shows that ten methods of company valuation using cash flow discounting (WACC; equity cash flow; capital cash flow; adjusted present value; residual income; EVA; business's risk-adjusted equity cash flow; business's risk-adjusted free cash flow; risk-free-adjusted equity cash flow; and risk-free-adjusted free cash flow) always give the same value when identical assumptions are used. This result is logical, since all the methods analyze the same reality based upon the same assumptions.
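
The claimed equivalence can be checked numerically in a simple perpetuity setting: constant free cash flow, fixed debt, the tax shield discounted at Kd, and the Modigliani-Miller cost-of-equity formula. All input numbers below are hypothetical; only the internal consistency matters:

```python
def apv_value(fcf, ku, debt, tax):
    # APV = unlevered value + value of the tax shield; for perpetual
    # fixed debt with the shield discounted at Kd, VTS = tax * debt.
    return fcf / ku + tax * debt

def wacc_value(fcf, ku, kd, debt, tax):
    # Derive equity value and Ke consistently, then discount FCF at WACC.
    v = apv_value(fcf, ku, debt, tax)             # consistent total value
    e = v - debt
    ke = ku + (ku - kd) * (1 - tax) * debt / e    # MM cost of equity
    wacc = (e * ke + debt * kd * (1 - tax)) / v
    return fcf / wacc

fcf, ku, kd, debt, tax = 100.0, 0.10, 0.05, 400.0, 0.30
v_apv = apv_value(fcf, ku, debt, tax)
v_wacc = wacc_value(fcf, ku, kd, debt, tax)
print(round(v_apv, 6), round(v_wacc, 6))
```

The match is exact by algebra, not coincidence: WACC*V = Ku*(V - tax*debt) = FCF, so discounting FCF at the WACC reproduces the APV value, which is the paper's point that consistent assumptions force all the methods to agree.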

  19. The significance of amlodipine on autonomic nervous system adjustment (ANSA method: A new approach in the treatment of hypertension

    Directory of Open Access Journals (Sweden)

    Milovanović Branislav

    2009-01-01

    Full Text Available Introduction. Cardiovascular autonomic modulation is altered in patients with essential hypertension. Objective. To evaluate acute and long-term effects of amlodipine on cardiovascular autonomic function and haemodynamic status in patients with mild essential hypertension. Methods. Ninety patients (43 male; mean age 52.12±10.7 years) with mild hypertension were tested before, 30 minutes after the first 5 mg oral dose of amlodipine and three weeks after monotherapy with amlodipine. A comprehensive study protocol was followed, including finger blood pressure variability (BPV) and heart rate variability (HRV) beat-to-beat analysis with impedance cardiography, ECG with software-based short-term HRV and nonlinear analysis, 24-hour Holter ECG monitoring with QT and HRV analysis, 24-hour blood pressure (BP) monitoring with systolic and diastolic BPV analysis, cardiovascular autonomic reflex tests, the cold pressor test and a mental stress test. The patients were also divided into sympathetic and parasympathetic groups, depending on the predominance in short-term spectral analysis of sympathovagal balance according to low-frequency and high-frequency values. Results. We confirmed a significant systolic and diastolic BP reduction, and a reduction of pulse pressure during day, night and early morning hours. A reduction of supraventricular and ventricular ectopic beats during the night was also achieved with therapy, but without statistical significance. The increase of sympathetic activity in the early phase of amlodipine therapy was not statistically significant, and persistence of sympathetic predominance after a few weeks of therapy was detected based on the results of short-term spectral HRV analysis. All time-domain parameters of long-term HRV analysis were decreased, as was low frequency among the spectral parameters. Amlodipine reduced baroreflex sensitivity after three weeks of therapy, but increased it immediately after the administration of the first dose. Conclusion.
The results

  20. Risk management for whales

    OpenAIRE

    Cont, R; Wagalath, L

    2016-01-01

    We propose a framework for modeling portfolio risk which integrates market risk with the liquidation costs that may arise in stress scenarios. Our model provides a systematic method for computing liquidation-adjusted risk measures for a portfolio. Calculation of liquidation-adjusted VaR (LVaR) for sample portfolios reveals a substantial impact of liquidation costs on portfolio risk for portfolios with large concentrated positions.
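
A minimal sketch of the idea, not Cont and Wagalath's actual model: start from a plain Gaussian VaR and add a liquidation cost with a linear price-impact term, which makes the cost quadratic in position size. All volatility and impact parameters below are made up for illustration:

```python
def gaussian_var(position, sigma, z=2.33):
    """Plain one-day 99% VaR for a single position under normal returns."""
    return z * sigma * abs(position)

def liquidation_cost(position, daily_volume, impact=0.5, spread=0.001):
    """Stylised unwind cost: half-spread plus linear price impact, so the
    total cost grows quadratically with position size."""
    q = abs(position)
    return spread * q + impact * q * q / daily_volume

def lvar(position, sigma, daily_volume):
    # Liquidation-adjusted VaR = market VaR + expected liquidation cost.
    return gaussian_var(position, sigma) + liquidation_cost(position, daily_volume)

small = lvar(1e6, 0.02, daily_volume=1e8)   # modest position
large = lvar(5e7, 0.02, daily_volume=1e8)   # concentrated "whale" position
print(round(small), round(large))
```

With these assumed numbers the liquidation add-on is about 11% of LVaR for the small position but roughly 84% for the concentrated one, the qualitative effect the abstract describes.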

  1. Risk-adjusted morbidity in teaching hospitals correlates with reported levels of communication and collaboration on surgical teams but not with scale measures of teamwork climate, safety climate, or working conditions.

    Science.gov (United States)

    Davenport, Daniel L; Henderson, William G; Mosca, Cecilia L; Khuri, Shukri F; Mentzer, Robert M

    2007-12-01

    Since the Institute of Medicine patient safety reports, a number of survey-based measures of organizational climate safety factors (OCSFs) have been developed. The goal of this study was to measure the impact of OCSFs on risk-adjusted surgical morbidity and mortality. Surveys were administered to staff on general/vascular surgery services during a year. Surveys included multi-item scales measuring OCSFs. Additionally, perceived levels of communication and collaboration with coworkers were assessed. The National Surgical Quality Improvement Program was used to assess risk-adjusted morbidity and mortality. Correlations between outcomes and OCSFs were calculated, and between outcomes and communication/collaboration with attending and resident doctors, nurses, and other providers. Fifty-two sites participated in the survey: 44 Veterans Affairs and 8 academic medical centers. A total of 6,083 surveys were returned, for a response rate of 52%. The OCSF measures of teamwork climate, safety climate, working conditions, recognition of stress effects, job satisfaction, and burnout demonstrated internal validity but did not correlate with risk-adjusted outcomes. Reported levels of communication/collaboration with attending and resident doctors correlated with risk-adjusted morbidity. Survey-based teamwork, safety climate, and working conditions scales are not confirmed to measure organizational factors that influence risk-adjusted surgical outcomes. Reported communication/collaboration with attending and resident doctors on surgical services influenced patient morbidity. This suggests the importance of doctors' coordination and decision-making roles on surgical teams in providing high-quality and safe care. We propose risk-adjusted morbidity as an effective measure of surgical patient safety.

  2. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  3. Risk-Informed SSCs Categorization: Elicitation Method of Expert's Opinion

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Mee Jeong; Yang, Joon Eon; Kim, Kil Yoo

    2005-01-15

    Regulation has been performed in a deterministic way since nuclear power plants began operating. However, some SSCs identified as safety-significant by the deterministic approach turned out to be of low or no safety significance, and some SSCs identified as non-safety-significant turned out to be highly safety-significant according to the results of PSA. Considering these risk insights, Regulatory Guide 1.174 and 10CFR50.69 were drawn up, and we can re-categorize SSCs according to their safety significance. Therefore, study of and interest in risk-informed SSC re-categorization and treatment have continued. The objective of this regulatory initiative is to adjust the scope of equipment subject to special regulatory treatment to better focus licensee and regulatory attention and resources on equipment that has safety significance. Most current regulations define the plant equipment necessary to meet the deterministic regulatory basis as 'safety-related.' This equipment is subject to special treatment regulations. Other plant equipment is categorized as 'non-safety-related' and is not subject to a select number of special treatment requirements or a subset of those requirements. However, risk information is not a magic tool for making a decision but a supporting tool for categorizing SSCs, because only small parts of a plant are modeled in the PSA model. Thus, engineering and deterministic judgments are also used for risk-informed SSC categorization, and expert opinion elicitation is very important for it. Therefore, we need a rational method to elicit experts' opinions, and in this study, we developed a systematic method of expert elicitation to categorize nuclear power plants' SSCs. The current status of SSC categorization in the USA and the existing methods for expert elicitation were surveyed, and a more systematic way of eliciting and combining expert opinions was developed. To

  4. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, December 2015

    International Nuclear Information System (INIS)

    Cabellos, Oscar; De Saint Jean, Cyrille; Hursin, Mathieu; Pelloni, Sandro; Ivanov, Evgeny; Kodeli, Ivan; Leconte, Pierre; Palmiotti, Giuseppe; Salvatores, Massimo; Sobes, Vladimir; Yokoyama, Kenji

    2015-12-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the fifth formal Subgroup 39 meeting held at the Institute Curie, Paris, France, on 4 December 2015. It comprises a Summary Record of the meeting, and all the available presentations (slides) given by the participants: A - Sensitivity methods: - 1: Short update on deliverables (K. Yokoyama); - 2: Does one shot Bayesian is equivalent to successive update? Bayesian inference: some matrix linear algebra (C. De Saint Jean); - 3: Progress in Methodology (G. Palmiotti); - SG39-3: Use of PIA approach. Possible application to neutron propagation experiments (S. Pelloni); - 4: Update on sensitivity coefficient methods (E. Ivanov); - 5: Stress test for U-235 fission (H. Wu); - 6: Methods and approaches development at ORNL for providing feedback from integral benchmark experiments for improvement of nuclear data files (V. Sobes); B - Integral experiments: - 7a: Update on SEG analysis (G. Palmiotti); - 7b:Status of MANTRA (G. Palmiotti); - 7c: Possible new experiments at NRAD (G. Palmiotti); - 8: B-eff experiments (I. Kodeli); - 9: On going CEA activities related to dedicated integral experiments for nuclear date validation in the Fast energy range (P. Leconte); - 10: PROTEUS Experiments: an update (M. Hursin); - 11: Short updates on neutron propagation experiments, STEK, CIELO status (O. Cabellos)

  5. Mainstreaming Disaster Risk Management for Finance: Application of Real Options Method for Disaster Risk Sensitive Project

    Directory of Open Access Journals (Sweden)

    KUSDHIANTO SETIAWAN

    Full Text Available This paper discusses the application of real options analysis to a project that was under construction when it was affected by a natural disaster. The use of this analytical method has become a way of thinking in decision making that should be taught to business school students. The case in this paper is based on an MBA thesis at the University of Gadjah Mada that was intended as a showcase for applying real options to real business problems. It illustrates one strategy for mainstreaming disaster risk management in the business school, one that also answers the needs of businesses in a disaster-prone country.

  6. FIFRA Peer Review: Proposed Risk Assessment Methods Process

    Science.gov (United States)

    From September 11-14, 2012, EPA participated in a Federal Insecticide, Fungicide and Rodenticide Act Scientific Advisory Panel (SAP) meeting on a proposed pollinator risk assessment framework for determining the potential risks of pesticides to honey bees.

  7. The Heating Curve Adjustment Method

    NARCIS (Netherlands)

    Kornaat, W.; Peitsman, H.C.

    1995-01-01

    In apartment buildings with a collective heating system usually a weather compensator is used for controlling the heat delivery to the various apartments. With this weather compensator the supply water temperature to the apartments is regulated depending on the outside air temperature. With

  8. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Science.gov (United States)

    2013-05-01

    ... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... influence exposures, dose-response or risk/hazard posed by environmental contaminant exposures, and methods... who wish to receive further information about submitting information on methods for cumulative risk...

  9. Set up of a method for the adjustment of resonance parameters on integral experiments; Mise au point d`une methode d`ajustement des parametres de resonance sur des experiences integrales

    Energy Technology Data Exchange (ETDEWEB)

    Blaise, P.

    1996-12-18

    Resonance parameters for actinides play a significant role in the neutronic characteristics of all reactor types. All the major integral parameters strongly depend on the nuclear data of the isotopes in the resonance-energy regions. The author sets up a method for the adjustment of resonance parameters taking into account the self-shielding effects and restricting the cross section deconvolution problem to a limited energy region. (N.T.).

  10. A GRAMMATICAL ADJUSTMENT ANALYSIS OF STATISTICAL MACHINE TRANSLATION METHOD USED BY GOOGLE TRANSLATE COMPARED TO HUMAN TRANSLATION IN TRANSLATING ENGLISH TEXT TO INDONESIAN

    Directory of Open Access Journals (Sweden)

    Eko Pujianto

    2017-04-01

    Full Text Available Google Translate is a program which provides a fast, free and effortless translating service. This service uses a unique method to translate. The system is called "Statistical Machine Translation" (SMT), the newest method in automatic translation. Machine translation (MT) is an area drawing on many different subjects of study and techniques from linguistics, computer science, artificial intelligence (AI), translation theory, and statistics. SMT works by using statistical methods and mathematics to process the training data. The training data is corpus-based: a compilation of sentences and words of the source and target languages (SL and TL) from translations done by humans. By using this method, Google lets its machine discover the rules for itself. It does this by analyzing millions of documents that have already been translated by human translators and then generates the result based on the corpus/training data. However, questions arise when the results of the automatic translation prove to be unreliable to some extent. This paper questions the dependability of Google Translate in comparison with the grammatical adjustment that naturally characterizes human translators' specific advantage. The attempt is manifested through the analysis of the TL of some texts translated by the SMT. It is expected that by using the sample of TL produced by SMT we can learn the potential flaws of the translation. If such flaws exist, the partial or more substantial undependability of SMT may open more windows to the debate of whether this service can suffice the users' needs.

  11. The predictive value of an adjusted COPD assessment test score on the risk of respiratory-related hospitalizations in severe COPD patients.

    NARCIS (Netherlands)

    Sloots, Joanne M; Barton, Christopher A; Buckman, Julie; Bassett, Katherine L.; van der Palen, Job; Frith, Peter A.; Effing, Tanja

    2017-01-01

    We evaluated whether a chronic obstructive pulmonary disease (COPD) assessment test (CAT) with adjusted weights for the CAT items could better predict future respiratory-related hospitalizations than the original CAT. Two focus groups (respiratory nurses and physicians) generated two adjusted CAT

  12. MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Gabriela Ižaríková

    2015-12-01

    Full Text Available The article is an example of using the simulation software @Risk, designed for simulation in a Microsoft Excel spreadsheet, and demonstrates the possibility of its usage as a universal method of solving problems. Simulation means experimenting with computer models based on a real production process in order to optimize the production processes or the system. A simulation model allows performing a number of experiments, analysing them, evaluating, optimizing and afterwards applying the results to the real system. A simulation model in general represents the modelled system by using mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance investment costs) and random inputs (for instance demand), which are transformed by the model into outputs (for instance the mean value of profit). In a simulation experiment, the controlled inputs are chosen at the beginning and the random (stochastic) inputs are generated randomly. Simulations belong to the quantitative tools which can be used as a support for decision making.
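    The controlled-input/random-input transformation described in the abstract can be sketched in a few lines of plain Python without @Risk itself. The investment, margin and demand figures below are illustrative assumptions, not values from the article:

```python
import random
import statistics

def simulate_profit(n_trials=10_000, seed=1):
    """Monte Carlo sketch: transform a controlled input (investment cost)
    and a random input (demand) into an output (profit)."""
    rng = random.Random(seed)
    investment = 50_000.0       # controlled input, chosen by the analyst
    unit_margin = 12.0          # controlled input: profit per unit sold
    profits = []
    for _ in range(n_trials):
        demand = rng.gauss(5_000, 800)        # random (stochastic) input
        profits.append(demand * unit_margin - investment)
    # Summarize the simulated output distribution
    return statistics.mean(profits), statistics.stdev(profits)

mean_profit, sd_profit = simulate_profit()
```

    Repeating the experiment with different controlled inputs (e.g. a larger investment) is how such a model supports the optimization and decision making the abstract refers to.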

  13. Application of the Method Risk Matrix to Radiotherapy. Main Principles

    International Nuclear Information System (INIS)

    2012-08-01

    The published fundamental principles of safety, and the basic international safety standards for ionizing radiation, contain requirements for the protection of patients undergoing medical exposure. In accordance with these requirements, and fulfilling its responsibility to provide for the application of these rules, the IAEA has been working intensively on the prevention of accidental exposures in radiotherapy. This has resulted in a series of technical reports on the lessons learned from the investigation of very serious events, and also in teaching materials shared in regional courses and accessible on the website for the protection of patients. The lessons learned are necessary but not sufficient, as we continue to receive information about new types of accidental exposures, and not all may have been published. We need a more proactive approach that tries to find out in advance, in a systematic, comprehensive and structured manner, what other errors may happen, in order to prevent or detect them early. Among these approaches is the 'risk matrix' method, which by its relative simplicity can be applied to all radiotherapy services.
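    As a minimal sketch of the idea, a risk matrix combines an occurrence-frequency rank and a severity rank into a qualitative risk level. The rank scale and grade boundaries below are illustrative assumptions, not the IAEA's published values:

```python
def risk_level(frequency, severity):
    """Map frequency and severity ranks (1 = low .. 4 = high)
    to a qualitative risk level via their product."""
    score = frequency * severity
    if score >= 12:
        return "very high"
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example: a rare (1) but catastrophic (4) failure mode
level = risk_level(1, 4)  # "medium"
```

    Enumerating every failure mode of a treatment process through such a grid is what makes the approach proactive rather than reactive.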

  14. Risk factors for the undermined coal bed mining method

    Energy Technology Data Exchange (ETDEWEB)

    Arad, V. [Petrosani Univ., Petrosani (Romania). Dept. of Mining Engineering; Arad, S. [Petrosani Univ., Petrosani (Romania). Dept of Electrical Engineering

    2009-07-01

    The Romanian mining industry has been in a serious decline and is undergoing ample restructuring. Analyses of reliability and risk are most important during the early stages of a project, in guiding the decision as to whether or not to proceed and in helping to establish design criteria. A technical accident occurred in 2008 at the Petrila coal mine involving an explosion during the exploitation of a coal seam. Over time a series of technical accidents, such as explosions and ignitions of methane gas, roof blowing phenomena or self-ignition of coal and hazard combustions, have occurred. This paper presented an analysis of the factors that led to this accident as well as an analysis of factors related to the mining method. Specifically, the paper discussed the geomechanical characteristics of rocks and coal; the geodynamic phenomenon from working face 431; the spontaneous combustion phenomenon; gas accumulation; and the pressure and the height of the undermined coal bed. It was concluded that for the specific conditions encountered in the Petrila colliery, the undermined bed height should be between 5 and 7 metres, depending on the geomechanical characteristics of the coal and surrounding rocks. 8 refs., 1 tab., 3 figs.

  15. ESTIMATING RISK ON THE CAPITAL MARKET WITH VaR METHOD

    Directory of Open Access Journals (Sweden)

    Sinisa Bogdan

    2015-06-01

    Full Text Available The two basic questions that every investor tries to answer before investing concern expected return and risk. Risk and return are generally considered positively correlated: as risk grows, a higher return is expected to compensate for it. The quantification of risk in the capital market has been a topical issue since securities first appeared. Together with estimated future returns, it represents the starting point of any investment. This study describes the history of the emergence of VaR methods and their usefulness in assessing the risks of financial assets. Three main Value at Risk (VaR) methodologies are described and explained in detail: the historical method, the parametric method and the Monte Carlo method. After the theoretical review of VaR methods, the risk of liquid stocks and a portfolio from the Croatian capital market is estimated with the historical and parametric VaR methods, after which the results are compared and explained.
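    The two methods the study applies can be sketched as follows. The return series here is synthetic (not Croatian market data), the confidence level is fixed at 95%, and the parametric method carries its usual normality assumption:

```python
import random
import statistics

def historical_var(returns, alpha=0.95):
    """Historical VaR: the loss at the (1 - alpha) empirical quantile."""
    ordered = sorted(returns)
    cut = int((1 - alpha) * len(ordered))
    return -ordered[cut]

def parametric_var(returns):
    """Parametric (variance-covariance) VaR at 95%, assuming normal returns."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    z = 1.6449  # standard-normal 95th percentile
    return -(mu - z * sigma)

# Synthetic daily returns: small positive drift, 2% daily volatility
rng = random.Random(7)
daily_returns = [rng.gauss(0.0005, 0.02) for _ in range(1_000)]
hv = historical_var(daily_returns)
pv = parametric_var(daily_returns)
```

    On near-normal data the two estimates agree closely; they diverge on fat-tailed return series, which is precisely why the paper compares them.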

  16. Methods of Economic Valuation of The Health Risks Associated with Nanomaterials

    Science.gov (United States)

    Shalhevet, S.; Haruvy, N.

    The worldwide market for nanomaterials is growing rapidly, but relatively little is still known about the potential risks associated with these materials. The potential health hazards associated with exposure to nanomaterials may lead in the future to increased health costs as well as increased economic costs to the companies involved, as happened in the past in the case of asbestos. Therefore, it is important to make an initial estimate of the potential costs associated with these health hazards, and to prepare ahead with appropriate health insurance for individuals and financial insurance for companies. While several studies have examined the environmental and health hazards of different nanomaterials by performing life cycle impact assessments, so far these studies have concentrated on the cost of production and did not estimate the economic impact of the health hazards. This paper discusses methods of evaluating the economic impact of potential health hazards on the public. The proposed method is based on using life cycle impact assessment studies of nanomaterials to estimate the DALYs (Disability Adjusted Life Years) associated with the increased probability of these health hazards. The economic valuation of DALYs can be carried out based on the income lost and the costs of medical treatment. The total expected increase in cost depends on the increase in the statistical probability of each disease.
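    The valuation the abstract describes is simple arithmetic and can be sketched directly. The income and treatment-cost figures below are placeholders for illustration, not estimates from the paper:

```python
def dalys(years_of_life_lost, years_lived_with_disability):
    """DALY = YLL + YLD, the standard decomposition."""
    return years_of_life_lost + years_lived_with_disability

def economic_cost(daly, annual_income, annual_treatment_cost):
    """Value each DALY-year as income forgone plus medical treatment cost,
    as the abstract proposes."""
    return daly * (annual_income + annual_treatment_cost)

# Illustrative figures only: 2 years of life lost plus 0.5 years lived with
# disability, valued at 30,000 income and 5,000 treatment cost per year.
cost = economic_cost(dalys(2.0, 0.5), 30_000, 5_000)
```

    Multiplying this per-case cost by the increased statistical probability of each disease gives the expected cost increase the abstract refers to.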

  17. Application of risk-based inspection methods for cryogenic equipment

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Risk-based Inspection (RBI) is widely applied across the world as part of Pressure Equipment Integrity Management, especially in the oil and gas industry, to generally reduce costs compared with time-based approaches and assist in assigning resources to the most critical equipment. One of the challenges in RBI is to apply it for low temperature and cryogenic applications, as there are usually no degradation mechanisms by which to determine a suitable probability of failure in the overall risk assessment. However, the assumptions used for other degradation mechanisms can be adopted to determine, qualitatively and semi-quantitatively, a consequence of failure within the risk assessment. This can assist in providing a consistent basis for the assumptions used in ensuring adequate process safety barriers and determining suitable sizing of relief devices. This presentation will discuss risk-based inspection in the context of cryogenic safety, as well as present some of the considerations for the risk assessme...

  18. Use of Modern Methods of Credit Portfolio Risk Management in Commercial Banks of Russian Federation

    Directory of Open Access Journals (Sweden)

    Dmitrii S. Melnyk

    2013-01-01

    Full Text Available The article deals with the structure and factors of credit portfolio risk, analyses existing models of portfolio risk assessment, develops recommendations on the implementation of adapted risk management methods, and proposes ways of optimizing the approach to credit risk minimization in the Russian banking system.

  19. Development of innovative methods for risk assessment in high-rise construction based on clustering of risk factors

    Science.gov (United States)

    Okolelova, Ella; Shibaeva, Marina; Shalnev, Oleg

    2018-03-01

    The article analyses risks in high-rise construction in terms of investment value, taking account of the maximum probable loss in case of a risk event. The authors scrutinized the risks of high-rise construction in regions with various geographic, climatic and socio-economic conditions that may influence the project environment. Risk classification is presented in general terms, including aggregated characteristics of risks common to many regions. Cluster analysis tools, which allow considering generalized groups of risk depending on their qualitative and quantitative features, were used in order to model the influence of the risk factors on the implementation of the investment project. For convenience of further calculations, each type of risk is assigned a separate code with the number of the cluster and the subtype of risk. This approach and the coding of risk factors make it possible to build a risk matrix, which greatly facilitates the task of determining the degree of impact of risks. The authors clarified and expanded the concept of price risk, defined as the expected value of the event, which extends the capabilities of the model, allowing estimation of an interval for the probability of occurrence and also the use of other probabilistic methods of calculation.

  20. Chemical Mixtures Health Risk Assessment of Environmental Contaminants: Concepts, Methods, Applications

    Science.gov (United States)

    This problems-based, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks from multic...

  2. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    International Nuclear Information System (INIS)

    Kenley, C. Robert; Collins, John W.; Beck, John M.; Heydt, Harold J.; Garcia, Chad B.

    2001-01-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
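    The AHP step named in the abstract rests on extracting priority weights from a pairwise comparison matrix as its normalized principal eigenvector. A minimal power-iteration sketch (the comparison values are illustrative, not from the NGNP analysis):

```python
def ahp_weights(matrix, iters=200):
    """Priority weights = normalized principal eigenvector of the
    pairwise comparison matrix, computed by power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]  # renormalize each iteration
    return w

# Three hypothetical risk categories: A moderately outranks B (3),
# strongly outranks C (5); reciprocals fill the lower triangle.
comparisons = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(comparisons)
```

    The resulting weights rank the categories and can then be multiplied into probability-consequence scores, in the spirit of the synthesis the paper describes.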

  3. The art of alternative risk transfer methods of insurance

    Directory of Open Access Journals (Sweden)

    Athenia Bongani Sibindi

    2015-11-01

    Full Text Available The very basis of insurance is risk assumption; hence it is the business of insurance to give risk protection. The notion that all 'risk is risk' and hence should be treated as such has become the driving force on the risk landscape. Insurance companies have no room to be selective, as there are competitive threats posed by other financial players who are waiting in the wings to invade the market segment. There has been an emergence of new risks, such as cyber and terrorism risks, as well as liability risks. The insurance cycles have made traditional insurance cover expensive. In this article we sought to interrogate whether Alternative Risk Transfer (ART) techniques represent a cost-effective way of balancing insurability and the bottom line by analysing global trends. On the basis of the research findings it can be concluded that ART solutions are indeed a must-buy for both corporates and insurance companies, as the organisations using them achieve financial efficiency. The present study also demonstrates that there is a paradigm shift in insurance from indemnity to value enhancement. Lastly, the study reveals that ART solutions are here to stay and are not a fad. Insurance companies cannot afford the luxury of missing any further opportunities, such as happened with Y2K, which proved to be a free lunch.

  4. Unexploded Ordnance: A Critical Review of Risk Assessment Methods

    National Research Council Canada - National Science Library

    MacDonald, Jacqueline

    2004-01-01

    .... While civilian fatalities from UXO explosions on U.S. soil have been rare, the risk of such accidents could increase substantially as more closed bases are transferred from military to civilian control...

  5. New Tools and Methods for Assessing Risk-Management Strategies

    National Research Council Canada - National Science Library

    Vendlinski, Terry P; Munro, Allen; Chung, Gregory K; De la Cruz, Girlie C; Pizzini, Quentin A; Bewley, William L; Stuart, Gale; Baker, Eva L

    2004-01-01

    .... The Decision Analysis Tool (DAT) allowed subjects to use Expected Value and Multi-attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed us to monitor the process subjects used...

  6. A Roadmap of Risk Diagnostic Methods: Developing an Integrated View of Risk Identification and Analysis Techniques

    National Research Council Canada - National Science Library

    Williams, Ray; Ambrose, Kate; Bentrem, Laura

    2004-01-01

    ...), which is envisioned to be a comprehensive reference tool for risk identification and analysis (RI AND A) techniques. Program Managers (PMs) responsible for developing or acquiring software-intensive systems typically identify risks in different ways...

  7. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork, and it was tested in vulnerability assessment activities on real production systems and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.

  8. Risk management for engineering projects procedures, methods and tools

    CERN Document Server

    Munier, Nolberto

    2014-01-01

    Many people see risk in engineering projects as an imprecise and nebulous problem - something that exists, is feared and is impossible to deal with. Nothing could be further from the truth. While risk is certainly ubiquitous, sometimes difficult to detect, and cannot always be completely avoided, it can generally be mitigated, reduced or prevented through timely analysis and action.   This book covers the entire process of risk management by providing methodologies for determining the sources of project risk, and once threats have been identified, managing them through:   ·         identification and assessment (probability, relative importance, variables, risk breakdown structure, etc.) ·         implementation of measures for their prevention, reduction or mitigation ·         evaluation of impacts and quantification of risks ·         establishment of control measures   It also considers sensitivity analysis to determine the influence of uncertain parameters values ...

  9. Research on the Risk Early Warning Method of Material Supplier Performance in Power Industry

    Science.gov (United States)

    Chen, Peng; Zhang, Xi

    2018-01-01

    Early warning of supplier performance risk is still at an initial stage domestically, and research on early warning mechanisms to identify, analyze and prevent performance risk is scarce. In this paper, a new method aimed at material supplier performance risk in the power industry is proposed. First, a set of risk early-warning indexes is established; then the ECM method is used to classify the indexes into different risk grades. Next, the Crock Ford risk quantization model is improved by considering three indicators, namely the stability of the power system, economic losses and successful bid ratio, to form a predictive risk grade; ultimately the short-board-effect principle is used to form the ultimate risk grade, so as to truly reflect supplier performance risk. Finally, an empirical analysis of supplier performance is made, and countermeasures and prevention strategies for the different risks are put forward.

  10. A review of cyber security risk assessment methods for SCADA systems

    OpenAIRE

    Cherdantseva, Yulia; Burnap, Peter; Blyth, Andrew; Eden, Peter; Jones, Kevin; Soulsby, Hugh; Stoddart, Kristan

    2016-01-01

    This paper reviews the state of the art in cyber security risk assessment of Supervisory Control and Data Acquisition (SCADA) systems. We select and in-detail examine twenty-four risk assessment methods developed for or applied in the context of a SCADA system. We describe the essence of the methods and then analyse them in terms of aim; application domain; the stages of risk management addressed; key risk management concepts covered; impact measurement; sources of probabilistic data; evaluat...

  11. Identification of Outliers in Grace Data for Indo-Gangetic Plain Using Various Methods (Z-Score, Modified Z-score and Adjusted Boxplot) and Its Removal

    Science.gov (United States)

    Srivastava, S.

    2015-12-01

    Gravity Recovery and Climate Experiment (GRACE) data are widely used for hydrological studies of large-scale basins (≥100,000 sq km). GRACE data (Stokes coefficients or equivalent water height) used for hydrological studies are not direct observations but result from high-level processing of raw data from the GRACE mission. Different partner agencies like CSR, GFZ and JPL implement their own methodology, and their processing methods are independent of each other. The primary sources of error in GRACE data are measurement and modeling errors and the processing strategies of these agencies. Because of the different processing methods, the final data from the partner agencies are inconsistent with each other at some epochs. GRACE data provide spatio-temporal variations in the Earth's gravity, which are mainly attributed to seasonal fluctuations in water stored on the Earth's surface and subsurface. During the quantification of errors/uncertainties, several high positive and negative peaks were observed which do not correspond to any hydrological process but may emanate from a combination of primary error sources, or from other geophysical processes (e.g. earthquakes, landslides, etc.) resulting in redistribution of the Earth's mass. Such peaks can be considered outliers for hydrological studies. In this work, an algorithm has been designed to extract outliers from the GRACE data for the Indo-Gangetic plain which considers the seasonal variations and the trend in the data. Different outlier detection methods have been used, such as the Z-score, modified Z-score and adjusted boxplot. For verification, assimilated hydrological (GLDAS) and hydro-meteorological data are used as the reference. The results have shown that the consistency amongst all data sets improved significantly after the removal of outliers.
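    Of the three detectors named, the modified Z-score is the simplest to sketch. It is robust to the very outliers it hunts because it uses the median and the median absolute deviation (MAD) rather than the mean and standard deviation; the 0.6745 constant and the 3.5 threshold follow Iglewicz and Hoaglin's convention. The sample series is synthetic, not GRACE data:

```python
import statistics

def modified_z_scores(values):
    """Modified Z-score: 0.6745 * (x - median) / MAD."""
    med = statistics.median(values)
    mad = statistics.median([abs(x - med) for x in values])
    return [0.6745 * (x - med) / mad for x in values]

def flag_outliers(values, threshold=3.5):
    """Flag points whose |modified Z-score| exceeds the threshold."""
    return [x for x, z in zip(values, modified_z_scores(values))
            if abs(z) > threshold]

# A seasonal-looking series with one spurious peak
series = [10.2, 11.0, 9.4, 10.1, 12.0, 10.5, 95.0]
flagged = flag_outliers(series)  # only the 95.0 peak is flagged
```

    A production version for GRACE time series would first remove the seasonal cycle and trend, as the abstract describes, before scoring the residuals.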

  12. Optimization method to determine mass transfer variables in a PWR crud deposition risk assessment tool

    International Nuclear Information System (INIS)

    Do, Chuong; Hussey, Dennis; Wells, Daniel M.; Epperson, Kenny

    2016-01-01

    A numerical optimization method was implemented to determine several mass transfer coefficients in a crud-induced power shift risk assessment code. The approach was to utilize a multilevel strategy that targets different model parameters: it first changes the major-order variables, the mass transfer inputs, then calibrates the minor-order variables, the crud source terms, according to available plant data. In this manner, the mass transfer inputs are effectively simplified as 'dependent' on the crud source terms. Two optimization studies were performed using DAKOTA, a design and analysis toolkit; the difference between the runs was the number of model runs using BOA allowed for adjusting the crud source terms, thereby reducing the uncertainty in the calibration. The result of the first case showed that the current best-estimate values for the mass transfer coefficients, which were derived from first-principles analysis, can be considered an optimized set. When the run limit of BOA was increased for the second case, an improvement in the prediction was obtained, with the results deviating slightly from the best-estimate values. (author)

  13. Dietary patterns derived with multiple methods from food diaries and breast cancer risk in the UK Dietary Cohort Consortium

    Science.gov (United States)

    Pot, Gerda K; Stephen, Alison M; Dahm, Christina C; Key, Timothy J; Cairns, Benjamin J; Burley, Victoria J; Cade, Janet E; Greenwood, Darren C; Keogh, Ruth H; Bhaniani, Amit; McTaggart, Alison; Lentjes, Marleen AH; Mishra, Gita; Brunner, Eric J; Khaw, Kay Tee

    2015-01-01

    Background/Objectives: In spite of several studies relating dietary patterns to breast cancer risk, evidence so far remains inconsistent. This study aimed to investigate associations of dietary patterns derived with three different methods with breast cancer risk. Subjects/Methods: The Mediterranean Diet Score (MDS), principal components analyses (PCA) and reduced rank regression (RRR) were used to derive dietary patterns in a case-control study of 610 breast cancer cases and 1891 matched controls within 4 UK cohort studies. Dietary intakes were collected prospectively using 4- to 7-day food diaries and the resulting food consumption data were grouped into 42 food groups. Conditional logistic regression models were used to estimate odds ratios (ORs) for associations between pattern scores and breast cancer risk, adjusting for relevant covariates. A separate model was fitted for post-menopausal women only. Results: The MDS was not associated with breast cancer risk (OR comparing 1st tertile with 3rd: 1.20 (95% CI 0.92; 1.56)), nor was the first PCA-derived dietary pattern, explaining 2.7% of the variation of diet and characterized by cheese, crisps and savoury snacks, legumes, nuts and seeds (OR 1.18 (95% CI 0.91; 1.53)). The first RRR-derived pattern, a 'high-alcohol' pattern, was associated with a higher risk of breast cancer (OR 1.27; 95% CI 1.00; 1.62), which was most pronounced in post-menopausal women (OR 1.46 (95% CI 1.08; 1.98)). Conclusions: A 'high-alcohol' dietary pattern derived with RRR was associated with an increased breast cancer risk; no evidence of associations of other dietary patterns with breast cancer risk was observed in this study. PMID:25052230

  14. A field study comparing two methods of transportation risk assessment

    International Nuclear Information System (INIS)

    Harmon, M.F.; Brey, R.R.; Gesell, T.F.; Oberg, S.G.

    1996-01-01

    RADTRAN 4 is a computer code used for assessing risks associated with the transportation of nuclear materials. The code employs the common modeling practice of using default values for input variables to simplify the modeling of complex scenarios, thus producing conservative final risk determinations. To better address local public concerns it is of interest to quantify the introduced conservatism by taking a site-specific approach to radiation risk assessment. With RISKIND, incident-free and accident-condition doses were calculated for two suburban population groups using both default input parameters and site-specific values to describe the population demographics of regions in Pocatello, Idaho, along the I-15 corridor. The use of site-specific parameters resulted in incident-free doses ranging from the same order of magnitude to one order of magnitude less than the doses calculated with default input parameters. Correcting accident-condition doses for the age distribution of the populations and employing site-specific weather data resulted in doses 1.1 times lower than those estimated using default input parameters. Dose-risks calculated with RISKIND for the two population groups using site-specific data were of the same order of magnitude as the risk calculated using RADTRAN 4 for the suburban population described in DOE/EIS-0203-D. This study revealed in one specific application that the use of default and site-specific parameters resulted in comparable dose estimates. If this tendency were to hold generally true over other environments and model variables, then risk assessors might prefer to select codes on the basis of criteria such as (1) the number of variables to select from; (2) the ability to calculate consequences directly; and (3) outputs geared to addressing public concerns

  15. Adjustment of nitrogen fertilization to the needs of plants and limitations posed by the risk of nitrate accumulation and pollution of the soil and subsoil

    Energy Technology Data Exchange (ETDEWEB)

    Muller, J C

    1980-01-01

    In chalky Champagne, the nitrogen balance is studied to adjust availability to plant requirements. For this, it is necessary to know some parameters whose measurement is obtained progressively: plant exports, nitrogen transformations in terms of transport processes in the soil system, kinetics of mineralization of soil organic nitrogen, plant residues and agricultural waste waters. Lysimeters with the crop rotation of Champagne (wheat, sugar beet, potatoes...) are used to measure nitrogen losses and to follow the transport of nitrates by means of soil solution samplers. Comparisons between field results, lysimeter results and laboratory experiments are used to adjust an experimental model. Two examples show: 1) the nitrogen fertilizer requirement for wheat; 2) the possibility of maximum application rates for agricultural waste waters.

  16. [Risk stratification of patients with diabetes mellitus undergoing coronary artery bypass grafting--a comparison of statistical methods].

    Science.gov (United States)

    Arnrich, B; Albert, A; Walter, J

    2006-01-01

    Among the coronary bypass patients in our Datamart database, we found a prevalence of 29.6% of diagnosed diabetics. 5.2% of the patients without a diagnosis of diabetes mellitus and with a fasting plasma glucose level > 125 mg/dl were defined as undiagnosed diabetics. The objective of this paper was to compare univariate methods and techniques for risk stratification to determine whether undiagnosed diabetes is per se a risk factor for increased ventilation time and length of ICU stay, and for an increased prevalence of resuscitation, reintubation and 30-day mortality in diabetics undergoing heart surgery. Univariate comparisons reveal that undiagnosed diabetics needed resuscitation significantly more often and had an increased ventilation time, while their length of ICU stay was significantly reduced. The significantly different distribution between the diabetic groups of 11 of the 32 attributes examined demands the use of methods for risk stratification. Both risk-adjusted methods, regression and matching, confirm that undiagnosed diabetics had an increased ventilation time and an increased prevalence of resuscitation, while the length of ICU stay was not significantly reduced. A homogeneous distribution of the patient characteristics in the two diabetic groups could be achieved through a statistical matching method using the propensity score. In contrast to the regression analysis, a significantly increased prevalence of reintubation in undiagnosed diabetics was found. Based on the example of undiagnosed diabetics in heart surgery, the presented study reveals the necessity and the possibilities of techniques for risk stratification in retrospective analyses, and shows how the potential of data collected in daily clinical practice can be used effectively.
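    The matching step used in such risk stratification can be sketched in two stages: fit logistic-regression propensity scores, then pair each treated (here, undiagnosed-diabetic) case with the nearest control. The covariate data below are synthetic, and the plain gradient-descent fit is a deliberately bare-bones stand-in for a proper estimator:

```python
import math

def fit_propensity(X, treated, lr=0.1, epochs=500):
    """Logistic regression by gradient descent; returns P(treated | x)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, treated):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - ti
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return [1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for xi in X]

def match_pairs(scores, treated):
    """Greedy 1:1 nearest-neighbour matching on the propensity score."""
    t_idx = [i for i, t in enumerate(treated) if t == 1]
    c_idx = [i for i, t in enumerate(treated) if t == 0]
    pairs = []
    for i in t_idx:
        if not c_idx:
            break
        j = min(c_idx, key=lambda k: abs(scores[k] - scores[i]))
        pairs.append((i, j))
        c_idx.remove(j)  # match without replacement
    return pairs

# Synthetic single-covariate data: higher values more likely treated
X = [[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]]
treated = [0, 0, 0, 1, 1, 1]
scores = fit_propensity(X, treated)
pairs = match_pairs(scores, treated)
```

    Comparing outcomes only within the matched pairs is what yields the homogeneous covariate distribution the abstract reports.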

  17. Beyond discrimination: A comparison of calibration methods and clinical usefulness of predictive models of readmission risk.

    Science.gov (United States)

    Walsh, Colin G; Sharman, Kavya; Hripcsak, George

    2017-12-01

    Prior to implementing predictive models in novel settings, analyses of calibration and clinical usefulness remain as important as discrimination, but they are not frequently discussed. Calibration is a model's reflection of actual outcome prevalence in its predictions. Clinical usefulness refers to the utilities, costs, and harms of using a predictive model in practice. A decision analytic approach to calibrating and selecting an optimal intervention threshold may help maximize the impact of readmission risk and other preventive interventions. To select a pragmatic means of calibrating predictive models that requires a minimum amount of validation data and that performs well in practice. To evaluate the impact of miscalibration on utility and cost via clinical usefulness analyses. Observational, retrospective cohort study with electronic health record data from 120,000 inpatient admissions at an urban, academic center in Manhattan. The primary outcome was thirty-day readmission for three causes: all-cause, congestive heart failure, and chronic coronary atherosclerotic disease. Predictive modeling was performed via L1-regularized logistic regression. Calibration methods were compared including Platt Scaling, Logistic Calibration, and Prevalence Adjustment. Performance of predictive modeling and calibration was assessed via discrimination (c-statistic), calibration (Spiegelhalter Z-statistic, Root Mean Square Error [RMSE] of binned predictions, Sanders and Murphy Resolutions of the Brier Score, Calibration Slope and Intercept), and clinical usefulness (utility terms represented as costs). The amount of validation data necessary to apply each calibration algorithm was also assessed. C-statistics by diagnosis ranged from 0.7 for all-cause readmission to 0.86 (0.78-0.93) for congestive heart failure. Logistic Calibration and Platt Scaling performed best and this difference required analyzing multiple metrics of calibration simultaneously, in particular Calibration
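A minimal version of Platt Scaling, one of the calibration methods compared in this record, can be sketched as below. The gradient-descent fit and the toy scores are illustrative assumptions; Platt's original procedure uses a regularized Newton-style solver with smoothed targets, and nothing here reproduces the paper's models.

```python
import math

def platt_fit(scores, labels, lr=0.1, epochs=5000):
    """Fit Platt's sigmoid P(y=1|s) = 1 / (1 + exp(A*s + B)) by gradient
    descent on the log loss over a validation set of (score, label) pairs."""
    A, B = 0.0, 0.0
    n = len(scores)
    for _ in range(epochs):
        gA = gB = 0.0
        for s, y in zip(scores, labels):
            p = 1.0 / (1.0 + math.exp(A * s + B))
            # p = sigmoid(-(A*s + B)), so d(logloss)/dA = (p - y) * (-s)
            gA += (p - y) * (-s)
            gB += (p - y) * (-1.0)
        A -= lr * gA / n
        B -= lr * gB / n
    return A, B

def platt_apply(score, A, B):
    """Map a raw model score to a calibrated probability."""
    return 1.0 / (1.0 + math.exp(A * score + B))
```

Because only two parameters are fitted, this kind of recalibration needs far less validation data than refitting the underlying model, which is the practical appeal the abstract points to.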

  18. Reciprocal Influences Between Maternal Parenting and Child Adjustment in a High-risk Population: A Five-Year Cross-Lagged Analysis of Bidirectional Effects

    Science.gov (United States)

    Barbot, Baptiste; Crossman, Elizabeth; Hunter, Scott R.; Grigorenko, Elena L.; Luthar, Suniya S.

    2014-01-01

    This study examines longitudinally the bidirectional influences between maternal parenting (behaviors and parenting stress) and mothers' perceptions of their children's adjustment in a multivariate approach. Data were gathered from 361 low-income mothers (many with psychiatric diagnoses) reporting on their parenting behavior, parenting stress and their child's adjustment in a two-wave longitudinal study over 5 years. Measurement models were developed to derive four broad parenting constructs (Involvement, Control, Rejection, and Stress) and three child adjustment constructs (Internalizing problems, Externalizing problems, and Social competence). After measurement invariance of these constructs was confirmed across relevant groups and over time, both measurement models were integrated into a single cross-lagged regression analysis of latent constructs. Multiple reciprocal influences were observed between parenting and perceived child adjustment over time: externalizing and internalizing problems in children were predicted by baseline maternal parenting behaviors, while child social competence was found to reduce parental stress and increase parental involvement and appropriate monitoring. These findings on the motherhood experience are discussed in light of recent research efforts to understand mother-child bidirectional influences and their potential for practical applications. PMID:25089759

  19. A Best Evidence Synthesis of Literacy Instruction on the Social Adjustment of Students with or At-Risk for Behavior Disorders

    Science.gov (United States)

    Nelson, J. Ron; Lane, Kathleen L.; Benner, Gregory J.; Kim, Ockjean

    2011-01-01

    The findings of a best-evidence synthesis of the collateral effects of literacy instruction on the social adjustment of students are reported. The goal of the synthesis was to extend the work of Wanzek, Vaughn, Kim, and Cavanaugh (2006) by (a) reviewing treatment-outcome studies conducted using group design methodology; (b) focusing on a more defined set…

  20. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    Science.gov (United States)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and increased project costs, this research presents a methodology for dynamic risk assessment and management of rockbursts in D&B (drill and blast) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is laid out, and methods associated with each step of the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst, an estimation method for assessing the potential consequences of rockburst, an evaluation method that uses a new quantitative index combining occurrence probability and consequences to determine the level of rockburst risk, and a dynamic updating procedure. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified on cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (totalling 11.6 km of D&B tunnels).
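The core idea of combining an occurrence probability with consequences into a single risk level can be illustrated with a toy index. The 1-4 grading and the class boundaries below are invented for the example and are not the paper's calibration.

```python
def rockburst_risk_index(p_level, c_level):
    """Toy quantitative risk index: occurrence-probability level times
    consequence level (each graded 1-4 here), mapped to a risk class.
    Both the grading and the class cut-offs are illustrative assumptions."""
    index = p_level * c_level
    if index <= 2:
        return index, "low"
    if index <= 6:
        return index, "moderate"
    if index <= 9:
        return index, "high"
    return index, "extremely high"
```

Recomputing such an index as microseismic warnings update the probability term is what makes the assessment "dynamic" in the sense the abstract describes.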

  1. Methods for comparative risk assessment of different energy sources

    International Nuclear Information System (INIS)

    1992-10-01

    The environmental and health aspects of different energy systems, particularly those associated with the generation of electricity, are emerging as significant issues for policy formulation and implementation. This, together with the growing need of many countries to define their energy programmes for the next century, has provided the basis for a renewed interest in the comparative risk assessment of different energy sources (fossil, nuclear, renewables). This document is the outcome of a Specialists Meeting on the procedural and methodological issues associated with comparative health and environmental risks of different energy sources. After an introductory chapter outlining the issues under consideration the papers presented at the Meeting, which have been indexed separately, are given. Refs, figs and tabs

  2. Method and system for dynamic probabilistic risk assessment

    Science.gov (United States)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  3. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    International Nuclear Information System (INIS)

    Teuschler, Linda K.

    2007-01-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures
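The additive joint toxic action named in point (1) is commonly operationalized as a Hazard Index under a dose-additivity assumption. The sketch below is a generic illustration; the exposure and reference-dose numbers in the example are hypothetical.

```python
def hazard_index(exposures, reference_doses):
    """Hazard Index under dose additivity: the sum of hazard quotients
    (exposure / reference dose) over all mixture components, with both
    quantities in the same units (e.g. mg/kg bw/day). An HI above 1
    flags potential concern for the mixture as a whole."""
    return sum(e / rfd for e, rfd in zip(exposures, reference_doses))
```

Note how two components that are each individually below their reference dose (hazard quotients of 0.5 each) still sum to an HI of 1.0, which is precisely the kind of joint-action effect that motivates mixtures risk assessment.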

  4. Current methods in risk assessment of genotoxic chemicals.

    Science.gov (United States)

    Cartus, Alexander; Schrenk, Dieter

    2017-08-01

    Chemical contaminants and residues are undesired chemicals occurring in consumer products such as food and drugs, at the workplace and in the environment, i.e. in air, soil and water. These compounds can be detected even at very low concentrations and lead frequently to considerable concerns among consumers and in the media. Thus it is a major challenge for modern toxicology to provide transparent and versatile tools for the risk assessment of such compounds in particular with respect to human health. Well-known examples of toxic contaminants are dioxins or mercury (in the environment), mycotoxins (from infections by molds) or acrylamide (from thermal treatment of food). The process of toxicological risk assessment of such chemicals is based on i) the knowledge of their contents in food, air, water etc., ii) the routes and extent of exposure of humans, iii) the toxicological properties of the compound, and, iv) its mode(s) of action. In this process quantitative dose-response relationships, usually in experimental animals, are of outstanding importance. For a successful risk assessment, in particular of genotoxic chemicals, several conditions and models such as the Margin of Exposure (MoE) approach or the Threshold of Toxicological Concern (TTC) concept exist, which will be discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
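The Margin of Exposure (MoE) approach mentioned above reduces to a simple ratio. The numbers in the example are hypothetical; the 10,000 benchmark cited in the docstring is the conventional EFSA screening value for substances that are both genotoxic and carcinogenic.

```python
def margin_of_exposure(reference_point, exposure):
    """MoE = toxicological reference point (e.g. a BMDL10 from animal
    dose-response data, in mg/kg bw/day) divided by the estimated human
    exposure in the same units. Larger values indicate lower concern;
    an MoE of 10,000 or more is conventionally read as low concern for
    genotoxic and carcinogenic compounds."""
    if exposure <= 0:
        raise ValueError("exposure must be positive")
    return reference_point / exposure
```

The quantitative dose-response relationships the abstract emphasizes enter through the reference point: the MoE is only as defensible as the BMDL10 (or similar point of departure) fed into it.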

  5. Soil plate bioassay: an effective method to determine ecotoxicological risks.

    Science.gov (United States)

    Boluda, R; Roca-Pérez, L; Marimón, L

    2011-06-01

    Heavy metals have become one of the most serious anthropogenic stressors for plants and other living organisms. Having efficient and feasible bioassays available to assess the ecotoxicological risks deriving from soil pollution is necessary. This work determines pollution by Cd, Co, Cr, Cu, Ni, Pb, V and Zn in two soils used for growing rice from the Albufera Natural Park in Valencia (Spain). Both were submitted to a different degree of anthropic activity, and their ecotoxicological risk was assessed by four ecotoxicity tests to compare their effectiveness: Microtox test, Zucconi test, pot bioassay (PB) and soil plate bioassay (SPB). The sensitivity of three plant species (barley, cress and lettuce) was also assessed. The results reveal a different degree of effectiveness and level of inhibition in the target organisms' growth depending on the test applied, to such an extent that the one-way analysis of variance showed significant differences only for the plate bioassay results, with considerable inhibition of root and shoot elongation in seedlings. Of the three plant species selected, lettuce was the most sensitive species to toxic effects, followed by cress and barley. Finally, the results also indicate that the SPB is an efficient, simple and economic alternative to other ecotoxicological assays to assess toxicity risks deriving from soil pollution. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Influence of the method of optimizing adjustments of ARV-SD on attainable degree of system stability. Vliyaniye metoda optimizatsii nastroyek ARV-SD na dostizhimuyu stepen ustoychivosti sistemy

    Energy Technology Data Exchange (ETDEWEB)

    Gruzdev, I.A.; Trudospekova, G.Kh.

    1983-01-01

    The efficiency of the methods of successive and simultaneous optimization of the settings of ARV-SD (strong-action automatic excitation regulators) at several power plants (PP) is examined. It is shown that, using the method of simultaneous optimization on an idealized model of a complex electric power system (EPS), absolute controllability of the degree of stability can be attained.

  7. Development of a non-expert risk assessment method for hand-arm related tasks (HARM)

    NARCIS (Netherlands)

    Douwes, M.; Kraker, H. de

    2014-01-01

    To support health and safety practitioners in their risk assessment obligations, the Hand Arm Risk Assessment Method (HARM) was developed. This tool can be used by any type of company to assess the risk of developing arm, neck or shoulder symptoms (pain) resulting from light manual tasks.

  8. A Method of Fire Scenarios Identification in a Consolidated Fire Risk Analysis

    International Nuclear Information System (INIS)

    Lim, Ho Gon; Han, Sang Hoon; Yang, Joon Eon

    2010-01-01

    Conventional fire PSA considers only two kinds of fire scenarios: fire without propagation, and fire with a single propagation to a neighboring compartment. Recently, a consolidated fire risk analysis using a single fault tree (FT) was developed. However, fire scenario identification in the new method remained similar to the conventional fire analysis method. The present study develops a new method of fire scenario identification for a consolidated fire risk analysis. An equation for fire propagation is developed to identify fire scenarios, and a method for mapping fire scenarios onto the internal-events risk model is discussed. Finally, an algorithm for an automatic program is suggested

  9. Comprehensive safeguards evaluation methods and societal risk analysis

    International Nuclear Information System (INIS)

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures

  10. Risk-averse formulations and methods for a virtual power plant

    KAUST Repository

    Lima, Ricardo M.; Conejo, Antonio J.; Langodan, Sabique; Hoteit, Ibrahim; Knio, Omar M.

    2017-01-01

    In this paper we address the optimal operation of a virtual power plant using stochastic programming. We consider one risk-neutral and two risk-averse formulations that rely on the conditional value at risk. To handle large-scale problems, we implement two decomposition methods with variants using single- and multiple-cuts. We propose the utilization of wind ensembles obtained from the European Centre for Medium Range Weather Forecasts (ECMWF) to quantify the uncertainty of the wind forecast. We present detailed results relative to the computational performance of the risk-averse formulations, the decomposition methods, and risk management and sensitivities analysis as a function of the number of scenarios and risk parameters. The implementation of the two decomposition methods relies on the parallel solution of subproblems, which turns out to be paramount for computational efficiency. The results show that one of the two decomposition methods is the most efficient.
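The conditional value at risk underlying the two risk-averse formulations can be computed directly for a discrete scenario set. The scenario costs and probabilities below are made up for illustration; in the paper, scenarios derive from ECMWF wind ensembles and CVaR enters the stochastic program as an objective term rather than a post-hoc statistic.

```python
def cvar(costs, probs, alpha=0.95):
    """Conditional value at risk of a discrete cost distribution:
    the expected cost within the worst (1 - alpha) probability tail.
    Assumes probs are nonnegative and sum to 1."""
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    tail = 1.0 - alpha
    acc, total = 0.0, 0.0
    for i in reversed(order):  # walk down from the worst (highest) cost
        take = min(probs[i], tail - acc)
        total += take * costs[i]
        acc += take
        if acc >= tail - 1e-12:
            break
    return total / tail
```

Raising alpha concentrates the measure on ever-worse scenarios, which is how the risk parameter trades expected profit against protection in the formulations compared here.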

  12. Risk Evaluation on UHV Power Transmission Construction Project Based on AHP and FCE Method

    OpenAIRE

    Huiru Zhao; Sen Guo

    2014-01-01

    Ultra high voltage (UHV) power transmission construction project is a high-tech power grid construction project which faces many risks and uncertainty. Identifying the risk of UHV power transmission construction project can help mitigate the risk loss and promote the smooth construction. The risk evaluation on “Zhejiang-Fuzhou” UHV power transmission construction project was performed based on analytic hierarchy process (AHP) and fuzzy comprehensive evaluation (FCE) method in this paper. Afte...
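The AHP step of such an evaluation can be sketched with the geometric-mean approximation of priority weights and Saaty's consistency ratio. The 3x3 pairwise comparison matrix in the test is invented for illustration and is not the project's actual expert judgements.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric mean of each
    row of the pairwise comparison matrix, normalized to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(pairwise, weights):
    """Saaty's consistency ratio CR = CI / RI, where CI = (lambda_max - n)
    / (n - 1); CR < 0.1 is the usual acceptability threshold."""
    n = len(pairwise)
    lam = sum(
        sum(pairwise[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    random_index = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    if random_index == 0.0:
        return 0.0
    return (lam - n) / (n - 1) / random_index
```

The resulting weights feed the fuzzy comprehensive evaluation (FCE) stage, where graded membership scores per risk factor are aggregated with these weights into an overall risk judgement.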

  13. Adjustment of nursing home quality indicators

    Directory of Open Access Journals (Sweden)

    Hirdes John P

    2010-04-01

    Abstract Background This manuscript describes a method for adjustment of nursing home quality indicators (QIs) defined using the Centers for Medicare & Medicaid Services (CMS) nursing home resident assessment system, the Minimum Data Set (MDS). QIs are intended to characterize the quality of care delivered in a facility. Threats to the validity of the measurement of presumed quality of care include baseline resident health and functional status, pattern of comorbidities, and facility case mix. The goal of obtaining a valid facility-level estimate of true quality of care should include adjustment for resident- and facility-level sources of variability. Methods We present a practical and efficient method to achieve risk adjustment using restriction and indirect and direct standardization. We present information on validity by comparing QIs estimated with the new algorithm to one currently used by CMS. Results More than half of the new QIs achieved a "Moderate" validation level. Conclusions Given the comprehensive approach and the positive findings to date, research using the new quality indicators is warranted to provide further evidence of their validity and utility and to encourage their use in quality improvement activities.
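Indirect standardization, one of the adjustment tools named in the Methods, can be sketched as follows. The strata, rates, and counts in the example are hypothetical and do not come from the MDS.

```python
def indirectly_standardized_rate(observed_events, stratum_rates,
                                 stratum_counts, reference_rate):
    """Indirect standardization of a facility-level quality indicator:
    expected events are obtained by applying reference (e.g. national)
    stratum-specific rates to the facility's own case mix; the adjusted
    rate is the observed/expected ratio times the overall reference rate."""
    expected = sum(r * n for r, n in zip(stratum_rates, stratum_counts))
    ratio = observed_events / expected
    return ratio, ratio * reference_rate
```

A ratio above 1 means the facility saw more adverse events than its case mix would predict, which is the facility-level signal the QI adjustment is trying to isolate.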

  14. Perceived Risks Associated with Contraceptive Method Use among ...

    African Journals Online (AJOL)

    AJRH Managing Editor

    Communication Programs, Baltimore, Maryland; Center for Interdisciplinary Inquiry and Innovation in Sexual and Reproductive Health .... sterilization making up the remaining method mix .... substandard material and could therefore break or.

  15. Review of methods for modelling forest fire risk and hazard

    African Journals Online (AJOL)

    user

    -Leal et al., 2006). Stolle and Lambin (2003) noted that flammable fuel depends on ... advantages over conventional fire detection and fire monitoring methods because of its repetitive and consistent coverage over large areas of land (Martin et ...

  16. Delayed heart rate recovery after exercise as a risk factor of incident type 2 diabetes mellitus after adjusting for glycometabolic parameters in men.

    Science.gov (United States)

    Yu, Tae Yang; Jee, Jae Hwan; Bae, Ji Cheol; Hong, Won-Jung; Jin, Sang-Man; Kim, Jae Hyeon; Lee, Moon-Kyu

    2016-10-15

    Some studies have reported that delayed heart rate recovery (HRR) after exercise is associated with incident type 2 diabetes mellitus (T2DM). This study aimed to investigate the longitudinal association of delayed HRR following a graded exercise treadmill test (GTX) with the development of T2DM including glucose-associated parameters as an adjusting factor in healthy Korean men. Analyses including fasting plasma glucose, HOMA-IR, HOMA-β, and HbA1c as confounding factors and known confounders were performed. HRR was calculated as peak heart rate minus heart rate after a 1-min rest (HRR 1). Cox proportional hazards model was used to quantify the independent association between HRR and incident T2DM. During 9082 person-years of follow-up between 2006 and 2012, there were 180 (10.1%) incident cases of T2DM. After adjustment for age, BMI, systolic BP, diastolic BP, smoking status, peak heart rate, peak oxygen uptake, TG, LDL-C, HDL-C, fasting plasma glucose, HOMA-IR, HOMA-β, and HbA1c, the hazard ratios (HRs) [95% confidence interval (CI)] of incident T2DM comparing the second and third tertiles to the first tertile of HRR 1 were 0.867 (0.609-1.235) and 0.624 (0.426-0.915), respectively (p for trend=0.017). As a continuous variable, in the fully-adjusted model, the HR (95% CI) of incident T2DM associated with each 1 beat increase in HRR 1 was 0.980 (0.960-1.000) (p=0.048). This study demonstrated that delayed HRR after exercise predicts incident T2DM in men, even after adjusting for fasting glucose, HOMA-IR, HOMA-β, and HbA1c. However, only HRR 1 had clinical significance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
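The reported per-beat hazard ratio of 0.980 scales multiplicatively in a Cox model, so the implied hazard ratio for a k-beat difference in HRR 1 is the per-beat HR raised to the k-th power. The helper below is a generic illustration of that relationship, assuming proportional hazards and linearity on the log-hazard scale.

```python
def scaled_hazard_ratio(hr_per_unit, delta):
    """In a Cox proportional hazards model with a continuous covariate,
    the hazard ratio for a delta-unit increase equals the per-unit
    hazard ratio raised to the power delta, i.e. exp(beta * delta)."""
    return hr_per_unit ** delta
```

For example, a 10-beat faster recovery implies roughly 0.98**10, about an 18% lower hazard of incident T2DM under the study's fully adjusted model.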

  17. Risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions.

    Science.gov (United States)

    Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song

    2017-11-01

    Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from lo